How do I look up a MavenProject? - maven-2

How do I programmatically construct a MavenProject instance (not the current project) given its groupId, artifactId, version, etc?
UPDATE: I'm trying to create a patch for http://jira.codehaus.org/browse/MDEP-322. I believe the maven-dependency-plugin depends on Maven 2.x so I can't use any Maven 3.x APIs.

How you would go about doing this depends on whether you want to construct the project from an artifact in your local repository or from a pom file on your hard drive. Either way, you'll need to get a ProjectBuilder, which you can do like so in a Mojo:
/** @component role = "org.apache.maven.project.ProjectBuilder" */
protected ProjectBuilder m_projectBuilder;
If you want to build from an artifact in your local repo, you'll also need:
/** @parameter expression="${localRepository}" */
protected ArtifactRepository m_localRepository;
Once you have that, you can construct a MavenProject from an artifact from your local repository:
// Construct the artifact representation
Artifact artifact = new DefaultArtifact(groupId, artifactId, version, scope, type,
        classifier, new DefaultArtifactHandler());
// Resolve it against the local repository
artifact = m_localRepository.find(artifact);
// Create a project building request
ProjectBuildingRequest request = new DefaultProjectBuildingRequest();
// Build the project and get the result
MavenProject project = m_projectBuilder.build(artifact, request).getProject();
Or from a pom file:
File pomFile = new File("path/to/pom.xml");
ProjectBuildingRequest request = new DefaultProjectBuildingRequest();
MavenProject project = m_projectBuilder.build(pomFile, request).getProject();
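To show how these pieces fit together, here is a minimal, self-contained sketch of a Mojo that looks up another project from the local repository. It uses only the API already shown above (note that ProjectBuilder is the Maven 3.x API); the goal name, class name, and coordinates are placeholders, not anything from the dependency plugin itself:
import org.apache.maven.artifact.Artifact;
import org.apache.maven.artifact.DefaultArtifact;
import org.apache.maven.artifact.handler.DefaultArtifactHandler;
import org.apache.maven.artifact.repository.ArtifactRepository;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.project.DefaultProjectBuildingRequest;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.ProjectBuilder;
import org.apache.maven.project.ProjectBuildingException;
import org.apache.maven.project.ProjectBuildingRequest;

/**
 * Illustrative sketch only: looks up a MavenProject other than the current one.
 * @goal lookup-project
 */
public class LookupProjectMojo extends AbstractMojo
{
    /** @component role = "org.apache.maven.project.ProjectBuilder" */
    protected ProjectBuilder m_projectBuilder;

    /** @parameter expression="${localRepository}" */
    protected ArtifactRepository m_localRepository;

    public void execute() throws MojoExecutionException
    {
        try
        {
            // Coordinates of the project we want to look up (placeholder values)
            Artifact artifact = new DefaultArtifact("com.example", "example-lib", "1.0",
                    "compile", "jar", null, new DefaultArtifactHandler());
            // Resolve the artifact against the local repository
            artifact = m_localRepository.find(artifact);
            // Build the MavenProject from the resolved artifact
            ProjectBuildingRequest request = new DefaultProjectBuildingRequest();
            MavenProject project = m_projectBuilder.build(artifact, request).getProject();
            getLog().info("Looked up project: " + project.getId());
        }
        catch (ProjectBuildingException e)
        {
            throw new MojoExecutionException("Could not build MavenProject", e);
        }
    }
}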

Related

Modify GroovyDSL classpath to include 3rd party libraries

I'm trying to create a GroovyDSL script which references some external libraries. Here's my script:
import com.github.javaparser.ast.Node
import org.reflections.Reflections
def ctx = context(
        ctype: 'groovy.util.ObjectGraphBuilder',
        paths: ['com/example/scripts/.*'],
        filetypes: ["groovy"]
)
Map<String, Class> candidateClasses = new Reflections(Node.packageName).getSubTypesOf(Node)
        .collectEntries { Class type -> [(type.simpleName.uncapitalize()): type] }
contributor(ctx) {
    candidateClasses.each { String methodName, Class type ->
        method name: methodName, params: [props: "java.util.Map", closure: "groovy.lang.Closure"], type: type.name
    }
}
Trying to enable it in IntelliJ, I'm getting:
startup failed: transformDslSyntax.gdsl: 1: unable to resolve class com.github.javaparser.ast.Node
@ line 1, column 1.
import com.github.javaparser.ast.Node
Now, I have the proper external dependencies declared in pom.xml, and the rest of the code that depends on them works just fine. I've also put the script inside a source folder (which some other answers here suggested might be relevant).
I have seen some GDSL examples reference IntelliJ types like PsiClass, which tells me the classpath for GDSL files seems to be different from the project classpath. Is there any way to make sure project dependencies are appended to that classpath?
I also tried using @Grape, only to get this error. Adding Apache Ivy as a dependency doesn't help, because again, project dependencies don't seem to influence the GDSL classpath.
After a bit more digging, I found that it is pretty easy to modify the IDE's classpath itself.
All you need to do is drop the dependency jar into the lib subfolder of the IntelliJ installation directory and reference that jar inside classpath.txt.
Initially I added the jars my GDSL depends on directly, but then I realized I could simply add a dependency on Apache Ivy to classpath.txt instead, and @Grab annotations would start working.
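Purely as an illustration of that workflow (the jar name and install path below are placeholders, and where classpath.txt lives can differ between IntelliJ versions, so check your own installation):
# copy the library (here: Apache Ivy) into the IDE's lib folder
cp ivy-2.x.y.jar <intellij-install-dir>/lib/
# then add a matching line for ivy-2.x.y.jar to classpath.txt and restart the IDE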

How do I create a Gradle project that depends on the JS from a Kotlin project?

I am using Kotlin to share logic between my back end (Java Spring Web) and front end. Getting the Java back end to call the Kotlin logic is easy: I made both part of the same Gradle Multiproject build and have the server project depend on the Kotlin project.
Now I need to get the generated JavaScript out of Kotlin and into a format where the server can serve it. Looking through the Gradle output jar for the server, it only has the JVM jar and not the JS jar.
I had a similar problem in this GitHub project with a Spring Boot backend and a Kotlin/JS frontend + common code.
If everything is in the same repo in multiple Gradle subprojects, you can use the following in the server subproject to bundle the produced JS as resources into your server's jar:
tasks.processResources {
    // make sure we build the frontend before creating the jar
    dependsOn(":frontend:assemble")
    // package the frontend app within the jar as static
    val frontendBuildDir = project(":frontend").buildDir
    val frontendDist = frontendBuildDir.toPath().resolve("distributions")
    from(frontendDist) {
        include("**/*")
        into("static")
    }
}
It's not ideal (I don't like inter-project dependencies like this), but it does the job.
Spring Boot then automatically serves the static files from this place in the jar (/static).

Change directory to SBT project in IntelliJ with SBT console

I've just installed the SBT plugin for IntelliJ and successfully imported my project. The SBT console in IntelliJ shows up as expected; however, I can't use it because of the layout of my project. The whole problem is that my SBT Play! project is not a top-level directory. Instead, I have a Maven parent pom with several child modules, amongst which my SBT application is placed. This is what it looks like:
MyProject (parent pom)
-> submodule1 (JAR)
-> submodule2 (JAR)
-> webapp (SBT Play! webapp module)
There is no problem running the Play! application from the Linux CLI by first changing directory to MyProject/webapp and executing SBT from there. However, I don't see any option to set the root dir for the SBT console in IntelliJ. I have the entire project imported into the workspace, so the default project root directory is MyProject, which is obviously not treated as an SBT project.
Is there any way to change "working directory" for IntelliJ SBT plugin?
I had the same issue; here's how I got it going:
Start by adding your root as a new module. In your case it will be MyProject. I was careful to add this as a blank module, but if you already have something like a pom file (which begs the question as to why you want to use SBT), then you might be okay letting IntelliJ import it for you.
Next, add a Scala build file to your root project directory to map the sub-projects. A great explanation of how to set one of these up can be found on the Scala wiki.
import sbt._
import Keys._
object HelloBuild extends Build {
  lazy val root = Project(id = "MyProject",
                          base = file(".")) aggregate(submodule1, submodule2, webapp)

  lazy val submodule1 = Project(id = "submodule1",
                                base = file("submodule1"))

  lazy val submodule2 = Project(id = "submodule2",
                                base = file("submodule2"))

  lazy val webapp = Project(id = "webapp",
                            base = file("webapp"))
}
Start your SBT and you should now be able to switch between projects. If you have it up and running, make sure you use the reload command to apply the changes.
You can list the projects SBT recognizes as modules with the projects command. Switch projects by using project [projectName]. So to switch to submodule2, just type project submodule2.
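Put together, a console session against the layout above looks roughly like this (the sbt prompt is shown as >):
> reload
> projects
> project webapp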

Gradle script to move artifacts between Maven repos

I'm working on a Gradle script to copy an artifact from one Maven repo to another. I was trying to hack it by declaring the artifact as a dependency and then setting that as an archive.
I've tried using the configuration.files() method, but I haven't been able to build a dependency object that it will accept.
dependencies {
    compile group: artGroup, name: artName, version: artVersion
}
artifacts {
    archives configurations.default.files(
        /* I have not been able to build an argument this method accepts */
    )
}
uploadArchives {
    repositories {
        mavenDeployer {
            repository(url: 'file:../../../repo')
        }
    }
}
We already did this in another environment (copying files from a remote repo to a local one), and it looks like you have a few misconceptions about the Gradle DSL.
First, the artifacts { archives {} } block is used to ADD deployable artifacts to the archives configuration. You cannot use it to do something with a configuration's resolved files.
Second, you cannot upload what you resolved "as-is". Uploading is for artifacts produced by the build or added manually (they have a special type).
For us the solution was to create a new Gradle task, "copyArtifacts", that actually copies all the files of the resolved configuration into the local folder.
Hope this helps.
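For reference, here is a minimal sketch of what such a copyArtifacts task could look like, using the compile configuration declared in the question; the target folder is a placeholder, and note that this produces a flat directory of jars rather than a Maven repository layout:
task copyArtifacts(type: Copy) {
    // copy every file of the resolved 'compile' configuration
    // (the declared artifact plus its transitive dependencies) into a local folder
    from configurations.compile
    into file('../../../repo')
}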

Is there a way to get Maven to install javadocs for all dependencies to my local repo?

By running mvn dependency:sources I can force Maven to resolve all dependencies in my project, download their sources, and install them into my local repo.
Is there something that does the same thing with my dependencies' JavaDocs? I.e. grab them from upstream repos and install them into my local repo.
There is a way to do it with the eclipse:eclipse mojo using the downloadJavadocs parameter.
mvn eclipse:eclipse -DdownloadJavadocs
And if you don't use eclipse, just do
mvn eclipse:clean
afterwards.
It's a hack, I know, but it works.
Actually, dependency:sources pretends to be configurable through the classifier and type parameters, so for a moment I thought you could do this:
mvn dependency:sources -Dclassifier=javadoc -Dtype=jar
but I tried it and it didn't work. Then I checked the source code and found this:
private static final String SOURCE_TYPE = "java-source";
private static final String SOURCE_CLASSIFIER = "sources";
// ...
public void execute()
    throws MojoExecutionException
{
    // parameters are overwritten with constant values
    this.classifier = SOURCE_CLASSIFIER;
    this.type = SOURCE_TYPE;
I have now submitted a bug report concerning this.
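As an aside: newer versions of the maven-dependency-plugin expose a classifier parameter on the resolve goal, so the following may do what the question asks without the eclipse:eclipse detour (worth verifying against the plugin version you have):
mvn dependency:resolve -Dclassifier=javadoc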