Nexus 3 upgrade changed URLs for Maven repos - Sonatype

I ran the migration utility to upgrade our Nexus 2 (2.14) to Nexus 3.40-02.
The Nexus 3 migration tool brought over all the content from our 2.14 instance but subtly changed the URLs of all our Maven repos.
For instance
https://nexus.foo.net/content/repositories/releases
became
https://nexus.foo.net/repository/releases
If I go into the Nexus 3 UI with the admin account and browse to the repository's settings, I can see the URL but not change it.
This seems really dangerous, since the Nexus URLs are encoded in POMs by hundreds or more end users consuming the jars. Why would the migration tool change the URL like this? Also, I can find nothing in the documentation about why the URL field is visible under "Settings" but cannot be changed to fix it.
Does anyone have any ideas about what went wrong?
Thanks.

The URI pattern changes after the upgrade. However, you can activate a switch to enable the old URI pattern.
"By default, Nexus Repository Manager 2 uses a different URL pattern to expose repositories and repository groups than Nexus Repository Manager 3. While automated tools and CI can be reconfigured to utilize the new patterns, it is possible to change a configuration on the Nexus Repository Manager end to allow your upgrade to use the old pattern as well. This can be done in $data-dir/nexus3/etc/nexus.properties by adding:"
org.sonatype.nexus.repository.httpbridge.internal.HttpBridgeModule.legacy=true
https://help.sonatype.com/display/NXRM3/Upgrade+Procedures#UpgradeProcedures-ConfiguringLegacyURLPaths
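For example, on a typical install the change would look something like this (the exact data directory path and the restart command below are assumptions; adjust them to your installation):
# $data-dir/nexus3/etc/nexus.properties
org.sonatype.nexus.repository.httpbridge.internal.HttpBridgeModule.legacy=true
# restart Nexus afterwards so the property is picked up, e.g.
sudo systemctl restart nexus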

Can not access a new repository with svn

I have been looking for a solution to my SVN problem but have not yet found one. We have been using svn for a number of years without problems but I have been unsuccessful adding a new project as of late.
SVN is installed on a central computer we use as a server running Windows 7. We have TortoiseSVN installed on our clients and on the server. To create new projects in the past we would log onto the server and execute svnadmin create [drive]:/archive/new project. We would then create the trunk, tags, and branches folders using the repo-browser. Once that was done we could use TortoiseSVN to import the code on our local machines to create the archive.
Now when I create a new project archive the client computers return the error: "Could not open the requested SVN filesystem". The repo-browser says the same thing. I can perform all of the usual SVN activities from the client computers on all of the existing repositories, just not on any new ones. Also, if I use the repo-browser on the server it works.
What I have done so far is uninstall Subversion and TortoiseSVN from the server, reinstall TortoiseSVN 1.9.4 along with the command line tools, and recreate the svn service. I also updated TortoiseSVN to 1.9.4 so there shouldn't be any version conflicts, but it still does not work. Since everything works as long as I am on the server, I suspect the problem lies in the network access configuration, but I don't know what would be different from when it was working.
Also note that when I try to browse the archive with Firefox I can navigate down into the project trees of the older projects but not any new ones. Firefox displays:
<D:error>
<C:error/><m:human-readable errcode="160043">Could not open the requested SVN filesystem</m:human-readable>
</D:error>
Any help will be greatly appreciated.
PS: Access to the repositories on the server is through Apache 2.2
1. Using file-type access to a repository over the LAN is always The Bad Idea (tm).
2. Apart from the above, the source of your problem is that the repository storage format changed between versions, and older versions cannot directly read repositories created by newer versions: your clients' SVN is older than the server's (and, worse, only knows the old repository format).
Check the SVN version on the client hosts (I suspect they are pre-1.6) and
update to a version compatible with the server's (1.7+ for 1.9.*),
OR
add a real network layer (svnserve is an easy and lightweight choice) for accessing the repositories (don't use file:/// anymore); in this case old clients can communicate with fresh repositories,
OR
run svnadmin create with the additional option --compatible-version and the correct version number as its argument.
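For example (a sketch only; the version number is a placeholder for the oldest client version you need to support):
svnadmin create --compatible-version 1.6 [drive]:/archive/newproject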
This is a permissions & ownership problem. The user Apache runs as can no longer read the filesystem tree created by the user used for the remote login. Ask your local admin "WTF?" and fix the errors.
How to overcome SVN — could not open the requested SVN file system
SVN Error: Could not open the requested SVN filesystem
Could not open the requested SVN filesystem on windows7 (start from answer HERE!!!)

Trying to quickly resurrect an old Maven built project

First day on a project and first day with Maven and I've already wasted a lot of time trying to get it to build.
It appears the issue is that this old project has config, POMs, etc. that have many broken URLs embedded in them, i.e. the stack traces Maven generates show lots of broken URLs when it tries to download project dependencies.
I have been given only the project source which includes Maven config files. I have not been supplied with existing Maven repositories, project dependent libraries or any build environment, etc.
I have been hacking away at these files but I don't get very far with each build attempt.
Am I doing something fundamentally wrong or is this Maven config really stuck in 2008?
Update:
My POM really was stuck in 2008, i.e. by virtue of versioning, it is a snapshot in time while the rest of the Java world moves on.
Some of the dependencies were no longer in any repositories; most of those were defunct projects, so I've ceased to use them. I had to rewrite the entire POM, and I had to spend a lot of time tweaking versions to ensure compatibility between dependencies and between plugins. After much battling, some plugins just wouldn't coexist, clobbering each other.
All in all, it was many, many hours effort...too many for this project with only one developer, and I believe I only now know enough to be dangerous.
The good ol' IDE build system would have been a better choice in this instance.
ftr's advice (in the comments section) is right: Maven can't download certain dependencies, but that doesn't necessarily mean that those dependencies don't exist anymore. It could just be that the extra-repos section of the Maven configuration is now missing certain repositories, and/or there's some other connection issue (like bad proxy config - which may lead to you being able to access certain repos but not others).
I've been in a similar situation, and found out that while initially Maven reported errors when trying to download about 80% of the dependencies, after various tweaks on Maven's config I ended up making it download all of the dependencies (well except one which was really just a custom jar somebody did and which was fetched directly from the local file system, but that's besides the point).
Here's what I'd do:
Of all the dependencies that Maven says it can't download, try to spot 2 or 3 which are "well known" (for example, if it says it can't download Servlet or some Spring library, write down the exact URLs it's trying to contact for those).
Manually check if those URLs are indeed accessible (via browser). If so, make sure that the dependencies exist for the version Maven is looking for. Maybe they have been updated since the project was created, and the old version is no longer kept. In this case, 90% of the time the solution is to simply update Maven's POM to point to the new version.
If manually checking the dependency's URL shows you that in fact the dependency exists, for the version Maven is looking for, make sure there's no proxy or some other form of internet connection "extra config" which is done for your browser, but not for Maven. If that's the case, just update Maven's config with all those extra params (proxy, proxy authentication, etc).
If the dependency URL doesn't exist at all, try googling to see if that dependency doesn't now exist on some other repo. For example, many of the JBoss dependencies (like Hibernate, etc.) changed repo location somewhere around 2007-2009. If that's the case, just add the new repo to Maven's repo list (and remove the old one if it no longer exists); a rough settings.xml sketch covering steps 3 and 4 follows this list.
Finally, the good old shameful way to fix this is to go to a colleague who has (or had) something to do with your project at some point, and copy his local Maven repo to your machine :)
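As a rough sketch of steps 3 and 4, the extra proxy and repository configuration could go into ~/.m2/settings.xml along these lines (the host, port, IDs and repository URL are made-up placeholders, not values from the original project):
<settings>
  <proxies>
    <proxy>
      <!-- hypothetical corporate proxy the browser uses but Maven did not know about -->
      <id>corp-proxy</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>proxy.example.com</host>
      <port>8080</port>
    </proxy>
  </proxies>
  <profiles>
    <profile>
      <id>extra-repos</id>
      <repositories>
        <repository>
          <!-- example of a repository that moved; the URL is a placeholder -->
          <id>relocated-releases</id>
          <url>https://repository.example.org/releases/</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>extra-repos</activeProfile>
  </activeProfiles>
</settings>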

Using GIT or SVN in XCode 3/4 without server

Ok, perhaps I'm trying to accomplish something not doable.
I am a single developer (not part of team).
I'm trying to get some kind of versioning system going. I had used CVS with XCode 3, but XCode 4 no longer has that as an option. I've heard that SVN and Git are better alternatives anyway.
Basically, I've wasted more than half a day trying to get XCode to work with SVN / Git out of the box. I do not have a server running, and would rather not expose my project on a server.
It doesn't make sense for me to have a separate user just to run the Git/SVN Servers, either.
I'm just trying to have version control using either one, in the simplest possible way.
I've tried to add a repo, using a local file path (/Volumes/AAA/BBB/Repo) where I manually created the "Repo" directory. I've set the type as Subversion (and also tried Git). XCode says "Host is reachable", but the Commit functionality is not there (disabled). I can't import my working directory.
I just don't get it - must I have a server running in order to have SVN/Git, or can XCode just do it through the command line? I'd much prefer it being done over the command line, since a server is complete overkill. Or am I missing something? Maybe I'm putting the wrong settings into XCode?
This isn't strictly an XCode 4 issue; I had the same issue with XCode 3, but at least it had the CVS option - now it's gone.
With Git you don't need a central server or even a central repository unless you have multiple people on the project. SVN requires you to have a central repo & server running all the time, but with Git you can simply git init a new repo and start using it. If you don't have a central repo you will never use push, pull, or fetch.
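For a single developer this can be as simple as the following, run once in the project directory (the path is a placeholder; as the quote below notes, Xcode 4 can also create the repository for you when you create a new project):
cd /path/to/MyProject
git init
git add .
git commit -m "Initial import"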
Xcode's help mentions the following:
Choose Git or Subversion
Xcode supports two SCM systems: Subversion (often abbreviated svn) and Git. Subversion is always server-based and the server is normally on a remote machine, though it is possible to install one locally. Git can be used purely as a local repository, or you can install a Git server on a remote machine to share files among team members. The Xcode 4 installer installs the Git and Subversion tools when you select System Tools. If you are working alone, it’s generally easiest to use Git, as you don’t need to set up a server. In fact, Xcode can automatically set up a Git repository for you when you create a new project (see “Create a Git Repository For Your New Project”). For a group project, the choice of Subversion or Git is usually a matter of taste and prior experience. In so far as is possible, Xcode provides a consistent user interface and workflow for users of either Subversion or Git.
So the official advice is that in your case, Git is the easiest solution. I'm now in the same position as you described and will be trying Git as advised.
Previously, when working for a small company, we used a dedicated leftover MacMini as an SVN server; this was quite easy to set up and worked like a charm for many years. Be aware, though, that the SVN integration of Xcode 3 was better than that of Xcode 4, so I ended up using Xcode 4 for development and basic SVN usage, together with Xcode 3 for SVN stuff that Xcode 4 wouldn't do anymore.

Archivable, replicable releases when building with Maven: is there a right way?

We have a largish standalone (i.e. not Java EE) commercial Java project (10,000+ classes, four or five SVN repositories, ten or twenty third-party libraries) that's in the process of switching over to Maven. Unfortunately only one engineer (in a team of a dozen or so distributed across three countries) has any prior Maven experience, so we're kind of figuring it out as we go.
In the old Ant way of doing things, we'd:
check out source code from three or four repositories
compile it all into a single monolithic JAR
release that (as part of a ZIP file with library JARs, an installer, various config files, etc.)
check the JAR into SVN so we had a record of what the customers had actually got.
Now, we've got a Maven repository full of artifacts, and a build process that depends on Maven having access to that repository. So if we need to replicate what we actually shipped to a customer, we need to do a build against a Maven repository that has all the proper versions of everything. This is doable, I guess, if in (some version of) the (SVN-controlled) POM files we set all the dependencies to released versions?
But it gives our release engineer the creepy-crawlies, because there doesn't seem to be any way:
to make sure that somebody doesn't clobber the copy of foo-api-1.2.3.jar on the WebDAV server by mistake (the WebDAV server has access control, but that wouldn't stop a buggy build script)
to detect it if they did
to recover afterwards
His idea is, for release builds, to use a local file system as the repository rather than the WebDAV server, and put that local repository under SVN control.
Our one Maven-experienced engineer doesn't like that -- I guess because he doesn't like putting binaries under version control? -- and suggests that maybe the professional version of the Nexus server can solve the clobbering or clobber-tracking/recovery problem.
Personally, I'm not happy (sorry, Sonatype readers) with shelling out money for a non-free build system when we haven't even seen any benefit from the free version yet, and there's no guarantee it will actually solve the problem.
So our choices seem to be:
WebDAV server
Pros: only one server, also accessible by devs, ...?
Cons: easy clobbering, no clobber-tracking/recovery
Local file system
Pros: can be placed under revision control
Cons: only works with the distribution script
Frankly, both of these seem like hacks to me, and I have to wonder if there isn't a better way to do this.
So: Is there a right thing to do here?
I'm not sure I get everything, but I would:
Use the maven-release-plugin (which automates the release process, i.e. executes all the steps documented in release:prepare); see the sketch at the end of this answer.
Use WebDAV with an anonymous read-only and authenticated write policy (so only the release engineer can actually deploy released artifacts to the corporate repo).
There is no need to put generated artifacts under version control (if you have the POMs under version control). I don't see the benefit of using the local file system instead of WebDAV (it is not providing more security; you can secure WebDAV as well). I don't see what the commercial version of Nexus would solve here.
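As a sketch, a release with the maven-release-plugin then boils down to two commands (assuming the POM's scm and distributionManagement sections already point at your SVN repo and your corporate repo respectively):
mvn release:prepare    # bumps versions, commits, and tags the release in SCM
mvn release:perform    # checks out the tag, builds it, and deploys the artifacts to the release repo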
Nexus has a setting which prevents you from clobbering an already released artefact in a release repository.
For a team of about a dozen, the free version of Nexus should be enough.

Should we use Nexus or Artifactory for a Maven Repo? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
We are using Maven for a large build process (> 100 modules). We have been storing our external dependencies in source control, and using that to update a local repo.
However, we are ready to graduate to a local repo that can cache central so that we don't have to proactively download all 3rd parties (but we can still have a local repo to pull from). In addition we want to publish our internal build artifacts from a nightly build so that developers don't have to build the world.
We are considering Nexus and Artifactory. What are the reasons for preferring one over the other? Are there others we should be considering?
I'm sure that if you only talk about storing binaries from "mvn deploy" both will do fine.
We use Artifactory very extensively, with all upgrades along the way. Lots of projects, numerous snapshots deployed and external repos proxied. Not a single problem. I find it hard to explain how other people experience issues with its DB, indexing or anything else. Nothing like that ever happened to us. Also, Artifactory allows you to store the data on disk and use a DB only for storing metadata; it is quite flexible (see more here).
What makes these applications very different is their approach towards integration with other build tools and technologies. Nexus and Sonatype are pretty much locked onto Maven and m2eclipse. They ignore everything else and only recently started to work on their own proprietary Hudson integration (see their Maven 3 webinar).
EDIT: This is not true anymore; as of 2017, Nexus provides much broader support for other build tools. End of edit
Artifactory provides an awesome Hudson, TeamCity and Bamboo integration, and Gradle / Ivy support. So while Nexus gives you nothing once you step out of Sonatype "comfort zone" (Maven, m2eclipse), Artifactory embraces and collaborates with all major build tools.
In fact, being able to deploy build artifacts from Hudson when a job has finished, and not by "mvn deploy", is a huge difference: the Artifactory Hudson plugin makes an atomic-like deploy of all artifacts at once, only when the build job has finished successfully. "mvn deploy" runs after each module and can deploy a partial set of artifacts if the build job fails in the middle. Deploying from Maven on module completion, rather than from the build server on job completion, is really a bad thing to do.
As you see, Artifactory thinks "outside the box" while Nexus thinks "inside the box" and only cares about Maven and Maven artifacts.
Something else that makes Artifactory more accessible is their cloud-based Artifactory Online solution. For about $80 a month you have your own Artifactory instance, no need to dedicate any server for it.
Artifactory has a simple and straightforward REST API; I don't know how it works for Nexus.
Edit: Nexus also has a REST API that you can use just as easily.
To summarize, for basic storage of Maven artifacts I think both are fine. But while Nexus stops there being strictly a "Maven repository manager", Artifactory goes on and on, being a general "Binaries storage" for binaries of any kind, from any build tool and CI server.
I don't know about Artifactory but here are my reasons for using Nexus:
Dead simple install (and since 1.2, dead simple upgrade, too)
Very good web UI
Easy to maintain, almost no administrative overhead
Provides you with RSS feeds of recently installed artifacts, broken artifacts and errors
It can group several repositories so you can mirror several sources but need only one or two entries in your settings.xml (a sketch follows this list)
Deploying from Maven works out of the box (no need for WebDAV hacks, etc).
it's free
You can redirect access paths (i.e. some broken pom.xml requires "a.b.c" from "xxx"). Instead of patching the POM, you can fix the bug in Nexus and redirect the request to the place where the artifact really is.
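For illustration, a single mirror entry in settings.xml pointing at such a repository group might look like this (the host name and group path are placeholders; the exact URL pattern depends on your repository manager and version, as discussed at the top of this page):
<mirrors>
  <mirror>
    <id>nexus</id>
    <!-- route all repository requests through the group -->
    <mirrorOf>*</mirrorOf>
    <url>https://nexus.example.com/content/groups/public/</url>
  </mirror>
</mirrors>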
Artifactory supports both file-system and database storage backends. Storage is checksum based and identical binaries are stored only once, no matter how many times they appear in the repo, which makes Artifactory more efficient storage-wise. Move and copy are also very cheap because of this architecture (in Nexus there's no REST for move/copy - you have to move stuff on the file system, then run corrective actions on the repo to let it know content has changed).
Another important differentiator is Artifactory has unique integration with Hudson and TeamCity for capturing information about deployed artifacts, resolved dependencies and environment data associated with build runs, which provides full build traceability.
Artifactory stores the artifacts in a database, which means that if something goes wrong, all your artifacts are gone. Nexus uses a flat file for your precious artifacts so you don't have to worry about them all getting lost.
If you need the "Pro" features of either (e.g. staging repos, artifact promotion, NuGet), then you need to consider the different pricing models, which are displayed on their websites.
http://www.jfrog.com/home/v_pricing
http://www.sonatype.com/nexus/purchase
In summary:
Artifactory Pro
you pay per server
you can pay more for increased service hours
Nexus Pro
you pay per seat, i.e. how many developers downloading artifacts
support service is Mon-Fri 0800-2000 ET only, no matter what you pay
No matter how many users you have, Nexus Pro offers a support service that's broadly equivalent to Artifactory's $7,450/year "Silver Value Pack".
$7,450/year will buy you approximately 67 Nexus Pro seats (1-50 @ $108, the rest @ $120).
On price and support alone then, Nexus Pro makes sense until you get to 67 users, at which point Artifactory becomes the cheaper option.
If you're doing all the support in-house, however, that magic point is about 23 users (Artifactory's most basic support offering is $2,750/year).
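For what it's worth, the rough arithmetic behind the 67-seat break-even, using the per-seat prices above:
50 x $108 + 17 x $120 = $5,400 + $2,040 = $7,440, which is roughly the $7,450/year of the Silver Value Pack.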
I did some research recently about Artifactory 2 and Nexus 1.3. I'll list here the main differences I found:
Artifactory stores metadata, and optionally files, in a DB; Nexus writes directly to the file system. There are pros and cons to each approach: a DB supports transactions, while files stored on the FS can be accessed directly.
Artifactory has higher system requirements especially for disk space.
The most complete comparison: http://binary-repositories-comparison.github.io/
You should use Artifactory.
Its latest version was a real jump.
You can back up your repositories incrementally, which means you can keep all your artifacts saved and maintained.
It has an easy-to-use web UI
and is really easy to set up.
I enjoyed it a lot.
Check out its new version 2.0.
From a learner's point of view I note some specific differences between the two.
Sonatype's .war deployment is not supported on the JBoss application server at this time, although it does run under Tomcat.
Sonatype does not offer me an Amazon Machine Image (AMI), at present, that I could quickly stand up and test.
An Artifactory AMI is provided by Bitnami and takes only a few minutes to stand up and a few more minutes to configure, maybe several tens of minutes depending on what you're trying to achieve.
Artifactory offer a SaaS version of Artifactory in the cloud so you can focus on getting things done rather than infrastructure.
I've no experience with Nexus but I've found Artifactory very intuitive and easy to configure, at least initially.
Added - I do note that the Artifactory User Guide, which may be OK for a seasoned pro, is a bit light on in-depth explanations. For instance, starting out, one unzips and then adds a repository, say RedHat's JBoss EAP Enterprise repo. All goes fine, but then when I tried to view the artifacts that were imported, Artifactory reported zero artifacts. No errors or warnings, so I'm now looking for an explanation. Is this normal or not? A simple explanation in the doco could quickly point one in the right direction. Being a good contributor, I'm adding these comments to the project for the benefit of other starters.
All politics/religion aside, licensing makes a difference for some organizations.
Nexus was GPL, then AGPLv3, and is now Eclipse Public License (EPL).
Artifactory was Apache licensed and is LGPLv3 licensed as of version 2.1 of the product.
You may also want to consider Archiva, just for comparison's sake. It's Apache 2.0 licensed.
I see that Nexus usage is growing, while Artifactory usage is generally staying flat.
The picture is taken from here: http://blog.sonatype.com/2014/11/42000-nexus-repository-managers-and-growing/
There is also a feature-matrix comparison: http://docs.codehaus.org/display/MAVENUSER/Maven+Repository+Manager+Feature+Matrix
Both Artifactory and Nexus have more or less similar feature sets, but Artifactory's LDAP support makes it more attractive over Nexus. Though Nexus also has LDAP support, it's only in the paid version :-(
Hmmm... my experience with Artifactory is awful... but I'm a relative newbie, so take it with a grain of salt. My overall complaint is that jar files recently uploaded to Artifactory do not seem to get indexed right away - as in, for hours - and there does not seem to be a good way to force it. I've tried various things that appeared as if they should have worked, but didn't. I have been working with m2eclipse, adding dependencies to a project that I'm converting from Ant. When I try to add a jar that I have just added to Artifactory, I expect it to show up as a choice in the selector, but it does not.
A coworker told me that they had installed Nexus and so far they like it... but I can't vouch for it yet. I'm about to install it on a Linux box as soon as IT can find me one.