How to get RepositoryTool.jar in Apache ACE?

The Apache ACE documentation refers to a RepositoryTool.jar that can be used to manage the repository, but I could not find this tool in the Apache ACE distribution. Where can I download it?

The page you're referring to is part of the old site (the new one is located at http://ace.apache.org), and refers to tooling you probably shouldn't be using anymore: it was used before there were other ways to interact with the repository, mainly for development purposes.
Depending on your needs, you can use the repository in a number of ways:
If you need to programmatically read and write the repositories (remember that they're only XML), use the HTTP API available for that.
You can do the same thing from code; see Repository and its implementations.
If you want to edit 'meaningful' ACE data (such as linking distributions and targets), use the Client REST API. This is probably the option you want.
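To make the REST option a little more concrete, here is a rough Java sketch of talking to the Client REST API over plain HTTP. The host, port, and resource path are illustrative placeholders; check the Client REST API documentation for the actual resources (workspaces, artifacts, distributions, targets):

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class AceClientRestSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL: the real resource paths are described in the
            // Client REST API documentation.
            URL url = new URL("http://localhost:8080/client/work");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST"); // e.g. open a new workspace
            System.out.println("HTTP " + conn.getResponseCode());
            try (InputStream in = conn.getInputStream()) {
                System.out.println(new String(in.readAllBytes()));
            }
        }
    }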

Related

Chrome manifest v3 - is there a viable workaround to use Google's File Picker in a Chrome extension?

My searches have turned up nothing concrete. My extension uses Google's file picker to allow the user to browse their sheets and choose a desired file to write some data to, which Manifest V3 breaks because of some GAPI limitations. Unless I've missed something obvious, there does not seem to be a simple workaround or migration path for this; it just seems to be disallowed.
I'm not asking if there's a way to do something that they intend to not be possible (even though I doubt such a thing would exist with Google), but I'm optimistically hoping that maybe there is some hacky/annoying workaround that still fits within their rules. If I absolutely have to just allow them to set a sheet URL manually I will... I'm just trying to avoid it.
Any tips or suggestions would be appreciated.
You may have to test it yourself to make sure there are no weird behaviors, but Google has some recommendations regarding this in their migration guide:
In Manifest V3, all of your extension's logic must be included in the extension. You can no longer load and execute a remotely hosted file. A number of alternative approaches are available, depending on your use case and the reason for remote hosting. Here are approaches to consider:
Configuration-driven features and logic
In this approach, your extension loads a remote configuration (for example a JSON file) at runtime and caches the configuration locally. The extension then uses this cached configuration to decide which features to enable.
Externalize logic with a remote service
Consider migrating application logic from the extension to a remote web service that your extension can call. (Essentially a form of message passing.) This provides you the ability to keep code private and change the code on demand while avoiding the extra overhead of resubmitting to the Chrome Web Store.
Bundle third-party libraries
If you are using a popular framework like React or Bootstrap, you can download the minified files, add them to your project and import them locally.
For your case, option #3 seems like the easiest. Looking at the Google Picker API documentation it only uses two relatively small script files, https://apis.google.com/js/api.js and https://accounts.google.com/gsi/client. You could try to bundle these in your Chrome extension and call the methods locally.

Create network file share from an API

I am exploring the possibilities of exposing an EMC Documentum folder, and the files/folders within, as a network file share.
The reason is so we can enable another application to read and write files to what it thinks is a standard UNC path, but really the repository is in Documentum.
Documentum doesn't seem to offer this itself; however, it does expose an API.
A few thoughts here were a bespoke 'driver' for Samba, or possibly something using WebDAV, but I haven't really investigated these yet, so both may be unviable.
Basically, how can I wrap an API up to look like a network drive?
I'll keep exploring this myself, but hopefully someone can provide some leads here too?
Update: using FUSE for Linux.
Documentum "folder" as you see it is not something like Windows folder. It is a database record of object with its related properties. Nothing else.
Documentum "documents" are somehow more related to Windows documents but still are only database record of objects with related properties and specific content stored somewhere in storage. Storage can be something like:
a file share on a Windows / Linux OS
a specialized storage solution like Centera
a specialized cloud storage solution
So what Documentum calls a folder is not what you are picturing. Your requirement can still be achieved in some way, that's for sure.
For example, you could integrate a Windows folder with Documentum using the Spring Integration (SI) framework on the Windows side, and on the Documentum side implement listeners that hook into SI, plus BOF (Business Object Framework) services to process the events coming from SI. This is just one of the options.
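As a hedged sketch of what the Spring Integration side might look like in Java: a file inbound adapter polls the shared Windows folder and hands each new file to your own import code. DocumentumImportService here is a hypothetical stand-in for whatever DFC/BOF service you implement:

    import java.io.File;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.config.EnableIntegration;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.file.dsl.Files;

    @Configuration
    @EnableIntegration
    public class FolderToDocumentumFlow {

        @Bean
        public IntegrationFlow importFlow(DocumentumImportService importer) {
            return IntegrationFlows
                    // poll the watched folder every 5 seconds for new files
                    .from(Files.inboundAdapter(new File("C:/share/inbox")),
                          c -> c.poller(p -> p.fixedDelay(5000)))
                    .handle(File.class, (file, headers) -> {
                        importer.importIntoRepository(file); // hypothetical Documentum call
                        return null; // one-way flow, nothing to reply with
                    })
                    .get();
        }
    }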
Technically it is possible to create an interface to a Documentum repository using any standard (SMB, CIFS, WebDAV, IMAP, ...) that can represent a document.
The fun part, which is also the hard part, is mapping Documentum functionality onto your chosen standard.
For example: back in 2013 I wrote a basic proof-of-concept WebDAV interface to a Documentum repository. I used the Milton WebDAV Java library (http://milton.io).
With a WebDav interface, the Documentum Repository was exposed to a Windows computer as a drive using Add Network Location.
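For a sense of the shape such an adapter takes, here is a heavily simplified Milton-style sketch in Java. Milton asks a ResourceFactory to resolve each incoming WebDAV path, and you answer with resources backed by repository objects; DocumentumClient, DocumentumFolderResource, and DocumentumFileResource are hypothetical stand-ins for your own DFC-backed code:

    import io.milton.http.ResourceFactory;
    import io.milton.resource.Resource;

    public class DocumentumResourceFactory implements ResourceFactory {

        private final DocumentumClient dctm = new DocumentumClient(); // hypothetical DFC wrapper

        @Override
        public Resource getResource(String host, String path) {
            if (dctm.isFolder(path)) {
                // a CollectionResource that lists the folder's children
                return new DocumentumFolderResource(dctm, path);
            }
            // a GetableResource that streams the document's content
            return new DocumentumFileResource(dctm, path);
        }
    }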
We identified that we can use FUSE on Linux.

Is it possible to remote access and parse git revision history?

I have a use case where I need to be able to inspect Git repositories as part of a web service, and the average repo will be very large (1GB+) because they're used for video game projects. I need to do simple actions such as listing the revision history.
Right now I'm implementing it via API calls to the remote Git hosting services (GitHub, Bitbucket, etc.). This works okay; however, there are some great Git projects, like GitVersion, that only work with real Git repos because they use libGit2sharp, and I cannot easily write a workaround for that.
I feel like this is a long shot, but I was wondering if anyone has discussed or begun work on an implementation of libGit2sharp that works with the major Git hosts via their APIs. Obviously not all actions available in libGit2 will work through an API interface, but at least most read-only actions should.
If this is an entirely new feature request - I'd like to get the opinion of someone with knowledge of the libGit2sharp codebase about how difficult such a feature request would be to implement.
Git only specifies the network protocol for fetching, pushing and creating an archive. Nothing else can be done via the Git protocol (and providers will likely disable the archive so they can leverage their existing caching solutions).
This feature would be out of scope and impossible as Git does not provide a way to perform these tasks.
Once you try to go beyond what Git itself provides, you're out of the Git world and into each provider's API. Replicating Git operations and git commands on top of each provider's API is a whole project unto itself, and one likely to run into those providers' API limits, since in-depth analysis of repositories is not generally why they provide these services.
Not to mention that looking up each necessary object over HTTP would be extremely slow, and you'd likely gain nothing over grabbing a gigabyte or two from the network.
But if all you need is the answer to a few questions the APIs themselves can easily provide (say, the latest commit and its relationship to different branches), and you do need the logic in GitVersion, then you're probably better off making its history analysis pluggable so you can feed in the data from your API lookups.
I'm not familiar with how GitVersion makes its decisions, but if it wants to look at the repositories themselves rather than just the references and their relationships to each other and to the tags, and you do need it rather than just replicating some of its logic, I would recommend downloading the repositories and performing all the analysis there. It's a much more efficient use of time to rent a bit of disk space from some provider than to try to fit each provider's API into some idealised version of a git command, where you then still need to figure out the edge cases of both the command and the API you're using.
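If you do go the download-and-analyse route, a bare clone plus a log walk is enough for things like listing revision history. Here is a minimal sketch using JGit, a pure-Java Git implementation; the URL and local path are placeholders:

    import java.io.File;

    import org.eclipse.jgit.api.Git;
    import org.eclipse.jgit.revwalk.RevCommit;

    public class ListHistory {
        public static void main(String[] args) throws Exception {
            // A bare clone keeps disk usage down when you only need the
            // history, not a working tree.
            try (Git git = Git.cloneRepository()
                    .setURI("https://example.com/big-game-repo.git")
                    .setDirectory(new File("/tmp/big-game-repo.git"))
                    .setBare(true)
                    .call()) {
                // walk the commit history starting from HEAD
                for (RevCommit commit : git.log().call()) {
                    System.out.println(commit.getName() + " " + commit.getShortMessage());
                }
            }
        }
    }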

Eclipse RCP Target Platform: Updates & Backing Up

I've just created an Eclipse target definition/platform for my application, opting to use software sites (rather than local files/installations), as recommended in the tutorial I followed and a later best-practices post by the same author.
The software sites are all external sites (eclipse, sourceforge etc.)
Everything seems to be working well, though I have two concerns:
If a component is updated (by the software provider), will it also be updated automatically in the target definition file?
Is it possible to take a backup of the target platform, so that it can be set up (for example) on a computer without an internet connection, or used in the event that a remote site becomes unavailable?
You can create a mirror of an Eclipse p2 repository. It's quite common to do this inside an organisation so that there's a copy of the repository that's quick to access and isn't dependent on some third party continuing to host it. There's a guide on the Eclipse Wiki.
As far as I'm aware, your Target Definition can only reflect what's in the p2 repository it's pointing at. If the developer replaces a package with a newer version, it'll pick that up. If you need greater control over that, then selectively mirroring the content is probably the way to go.
From that wiki page, it looks like by default it won't delete content in your mirror (even if it's deleted in the remote) unless you specify -writeMode clean.
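For reference, the wiki guide drives the mirroring through the two p2 mirror applications from the Eclipse launcher, along the lines of the following; the URLs and paths are placeholders, and the guide has the full option list:

    eclipse -nosplash \
      -application org.eclipse.equinox.p2.metadata.repository.mirrorApplication \
      -source http://example.com/updatesite \
      -destination file:/opt/p2-mirror

    eclipse -nosplash \
      -application org.eclipse.equinox.p2.artifact.repository.mirrorApplication \
      -source http://example.com/updatesite \
      -destination file:/opt/p2-mirror \
      -writeMode clean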

URL shortening (tinyURL, Bit.ly) application for internal deployment (open source or commercial)

I'm looking for the equivalent of a URL shortening service such as http://bit.ly/ for an internal deployment in our organisation. Anyone know of any open source projects (especially Java ones) or commercial products which I can install internally rather than using an external service?
Thanks!
Shorty: http://get-shorty.com/
There are several other URL shorteners as well, but most of them are PHP/MySQL.
I don't know if a Java one exists.
http://monkeytooth.net/2010/12/htaccess-php-how-to-wordpress-slugs/
tells you the core basics of how to achieve the concept with PHP and .htaccess; building it up from there would be on your own, but it's not all that hard a concept to build on if you know PHP/MySQL. That said, you're not likely to find anything built directly in JavaScript, because you need some type of server-side script to communicate with a database somewhere holding all your short URL identifiers, and JavaScript (to my knowledge) doesn't support database connectivity directly. You can, however, use any form of AJAX to communicate with a server-side script and then do what you want with the result in JavaScript.
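If you'd rather roll your own in Java, the core of the concept is tiny: a table mapping short codes to long URLs, plus a handler that issues a redirect. Here is a self-contained sketch using the JDK's built-in HTTP server, with an in-memory map standing in for the database:

    import java.net.InetSocketAddress;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    import com.sun.net.httpserver.HttpServer;

    public class TinyShortener {
        // In a real deployment this mapping would live in a database.
        private static final Map<String, String> LINKS = new ConcurrentHashMap<>();

        public static void main(String[] args) throws Exception {
            LINKS.put("abc123", "https://example.com/some/very/long/path");

            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/", exchange -> {
                // the short code is the path minus the leading slash
                String code = exchange.getRequestURI().getPath().substring(1);
                String target = LINKS.get(code);
                if (target != null) {
                    // permanent redirect to the stored long URL
                    exchange.getResponseHeaders().add("Location", target);
                    exchange.sendResponseHeaders(301, -1);
                } else {
                    exchange.sendResponseHeaders(404, -1);
                }
                exchange.close();
            });
            server.start();
        }
    }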