I have a Maven 2 repository from which I'm trying to fetch a snapshot artifact with an appended timestamp. Unsurprisingly, I can retrieve it fine when building with Maven 2, but when building with simple-build-tool (sbt), which I much prefer, I can't pull it down.
I can see from this question about snapshots in Ivy that it is possible to configure Ivy to get snapshot artifacts, but I don't know how to tell sbt to do it.
The relevant bits of my current configuration:
val snapshotsName = "Snapshots Repository"
val snapshotsUrl = new java.net.URL("http://host:port/path/to/root")
val snapshotsPattern = "[organisation]/[module]/[revision]/[artifact]-[revision](-[classifier]).[ext]"
val snapshots = Resolver.url(snapshotsName, snapshotsUrl)(Patterns(snapshotsPattern))
Credentials(Path.userHome / ".ivy2" / ".credentials", log)
Update: After some more tinkering, it looks like I can get it to point at the correct artifact URL with the following pattern:
val snapshotsPattern = "[organisation]/[module]/[revision]-SNAPSHOT/[artifact]-[revision](-[timestamp]).[ext]"
With that I still need to specify the timestamp as an extra attribute in the dependency
val dep = "group" % "artifact" % "0.0.1" extra("timestamp" -> "20101202.195418-3")
but it does pull the artifact. However, it does NOT pull the artifact's dependencies, so it seems I've still got something wrong.
Alright, I got this sorted out, but it wasn't actually an sbt problem; it was 100% user error.
The Nexus I was using required authentication. I didn't have the authentication credentials set up correctly (see my authentication question), and because it wasn't properly authenticating, it wasn't finding the metadata POM files, so sbt printed an error message saying it was failing. I incorrectly assumed it was authenticating but not resolving, so I started messing with the patterns, as evidenced in the question above.
However, now that I have authentication set up correctly, I have changed back to a regular repository declaration like so:
val snapshotsRepo = "Snapshots Repository" at "http://host:port/path/to/snapshots/root/"
and everything works. Artifacts are retrieved and dependencies resolved.
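For completeness, a minimal sketch of the whole working sbt 0.7-style project definition, assuming the placeholder host/port/path and the ~/.ivy2/.credentials file from above, and assuming the dependency is declared with its normal -SNAPSHOT revision:

import sbt._

class MyProject(info: ProjectInfo) extends DefaultProject(info) {
  // Nexus credentials, loaded from ~/.ivy2/.credentials
  Credentials(Path.userHome / ".ivy2" / ".credentials", log)

  // Plain Maven2-layout resolver; no custom Ivy pattern needed
  val snapshotsRepo = "Snapshots Repository" at "http://host:port/path/to/snapshots/root/"

  // A normal -SNAPSHOT revision now resolves, timestamped artifacts included
  val dep = "group" % "artifact" % "0.0.1-SNAPSHOT"
}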
I'm probably trying to do something strange, since this doesn't seem like a common question (or maybe I'm asking it all wrong); I was expecting this to be straightforward.
Basically, what I am looking for is a way to do the same as the following, except by using the Gradle openapi-generator plugin:
openapi-generator generate -i www.example.com/openapi-doc -g spring
What I have tried is the following (and the associated errors):
inputSpec.set("www.example.com/openapi-doc") --> Cannot convert URL {} to a file
inputSpec.set(URL("www.example.com/openapi-doc").readText()) --> specified for property 'inputSpec' does not exist
The actual code looks something like this:
tasks.register<GenerateTask>("generateClient") {
    validateSpec
    generatorName.set("spring")
    library.set("spring-cloud")
    // inputSpec.set("$openapiSpecDir/client/openapi.json") // <-- I am currently using a file, which I don't want to do
    inputSpec.set("https://www.example.com/openapi-doc")
    outputDir.set(generatedClientDir)
    apiPackage.set("org.example.api")
    modelPackage.set("org.example.model")
    skipOverwrite.set(false)
    templateDir.set("$rootDir/src/main/resources/openapi/templates/client")
    configOptions.put("java8", "false")
    configOptions.put("serializationLibrary", "jackson")
    configOptions.put("dateLibrary", "java8")
}
Assuming you're using the OpenAPI Generator Gradle plugin: at the time of writing this answer, getting the inputSpec from a URL is not supported. For Maven, however, this has been implemented (Issue #2241, closed with PR #3826), so chances are good that a feature request would get the Gradle plugin on par with its Maven counterpart.
That being said, you may want to look into the Gradle Download Task plugin, which lets you download files from a URL. The downloaded file can then be fed into the OpenAPI generator.
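A rough sketch of how the two plugins might be wired together in build.gradle.kts, assuming the de.undercouch.download plugin; the plugin versions, URL, and output paths are placeholders:

import de.undercouch.gradle.tasks.download.Download
import org.openapitools.generator.gradle.plugin.tasks.GenerateTask

plugins {
    id("org.openapi.generator") version "6.6.0"   // placeholder version
    id("de.undercouch.download") version "5.4.0"  // placeholder version
}

// Download the spec to a local file first...
val downloadOpenApiSpec by tasks.registering(Download::class) {
    src("https://www.example.com/openapi-doc")
    dest(layout.buildDirectory.file("openapi/openapi.json").get().asFile)
    overwrite(true)
}

// ...then point the generator at the downloaded file
tasks.register<GenerateTask>("generateClient") {
    dependsOn(downloadOpenApiSpec)
    generatorName.set("spring")
    library.set("spring-cloud")
    inputSpec.set(layout.buildDirectory.file("openapi/openapi.json").get().asFile.absolutePath)
    outputDir.set(layout.buildDirectory.dir("generated/client").get().asFile.absolutePath)
    // ...remaining options as in the question
}

Because the generate task depends on the download task, the generator always sees a fresh local copy of the spec.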
I built an Electron app and I am now looking at how to distribute it.
I went with electron-builder to handle packaging etc.
For a bit of context: as a web developer, I am used to continuously deploying web apps to a web server, but I have a hard time figuring out how to distribute an app packaged with Electron.
In electron-builder docs there is a brief mention about testing auto-update:
"Note that in order to develop/test UI/UX of updating without packaging the application you need to have a file named dev-app-update.yml in the root of your project, which matches your publish setting from electron-builder config (but in YAML format)"
But, it's rather vague...
So I actually have two questions:
1. How do I actually test the auto-update flow?
Do I need to actually publish a new version to trigger an update locally? That seems pretty unclear; it would be like developing against the production server.
2. Is it possible to have a fallback for unsigned code?
I don't yet have any certificate for code signing, so the OS/app will block the auto-update. But I'd still want to tell the user that an update is available so they can go and download the app manually. Can I do that? (Going back to point 1, I'd like to be able to test this flow.)
I've just finished dealing with this. I also wanted to test against a non-production server and avoid having to package my app each time I iterated. To test downloads I had to sign my app, which slowed things down. But it sounds like you just need to check for updates, which I think you can do as follows...
I created a dummy GitHub repo, then created a file dev-app-update.yml containing:
owner: <user or organization name>
repo: dev-auto-update-testing
provider: github
The path where this file is expected to be defaults to a place you can't access. Thankfully, you can override it like so:
const path = require('path');
const { app } = require('electron');
const { autoUpdater } = require('electron-updater');

const isDev = !app.isPackaged; // or use the electron-is-dev package

if (isDev) {
  // Useful for some dev/debugging tasks, but downloads can
  // not be validated because the dev app is not signed
  autoUpdater.updateConfigPath = path.join(__dirname, 'dev-app-update.yml');
}
...that should be enough for your case -- since you don't need downloads.
If not, here are some other tips:
you can change the repo setting in your electron-builder config to point at your dummy repo, then package your app. This gives you a packaged production build that points at your dummy repo -- this is how I did my download testing (though I have a cert, and signed my app)
you should be calling autoUpdater's checkForUpdates(), but if checkForUpdatesAndNotify() gives you a useful OS notification then you should be able to set autoUpdater.autoDownload to false and end up with what you need
Lastly, it sounds like you could skip autoUpdater entirely, since you won't be using the download feature anyway. Instead you could use GitHub's releases API, assuming you use GitHub to host your releases; if not, your host should have something similar. Use that to check for updates, then tell the user from within your app (you could present them with a clickable URL too). If you want OS notifications, Electron has a module for that -- see the sketch below.
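A hypothetical sketch of that approach in the Electron main process; the OWNER/REPO coordinates and the bare version comparison are placeholders, and it assumes an Electron version whose Node runtime provides a global fetch:

const { app, Notification, shell } = require('electron');

async function checkForUpdateViaGitHub() {
  // Hypothetical repo coordinates -- substitute your own
  const res = await fetch('https://api.github.com/repos/OWNER/REPO/releases/latest');
  const release = await res.json();
  const latest = release.tag_name.replace(/^v/, '');

  if (latest !== app.getVersion()) {
    // OS notification; clicking it opens the release page for a manual download
    const notification = new Notification({
      title: 'Update available',
      body: `Version ${latest} is available. Click to download it.`,
    });
    notification.on('click', () => shell.openExternal(release.html_url));
    notification.show();
  }
}

app.whenReady().then(checkForUpdateViaGitHub);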
We're using electron-updater with GitHub as a provider for auto-updates. Unfortunately, it breaks a lot, and the electron-builder team doesn't support these issues well (1, 2, 3) (from my own experience; you can find more examples on GitHub).
One way to test updates in dev mode:
1. Create a build of your app with an arbitrarily high version number
2. Create a public repo and publish the above build
3. Create a dev-app-update.yml next to your main entry point and configure it for the repo above
4. In your main entry point:
import { autoUpdater } from "electron-updater";
...
if (process.env.NODE_ENV === "development") {
  // Customize the test by toggling these lines
  // autoUpdater.autoDownload = false;
  // autoUpdater.autoInstallOnAppQuit = false;
  autoUpdater.checkForUpdates();
}
Then when running yarn dev you should see something like:
Checking for update
...
Found version 100.0.0 (url: <>.exe)
Downloading update from <>.exe
updaterCacheDirName is not specified in app-update.yml Was app build using at least electron-builder 20.34.0?
updater cache dir: C:\Users\<>\AppData\Local\Electron
New version 100.0.0 has been downloaded to C:\Users\<>\AppData\Local\Electron\pending\<>.exe
And it should install when you close the dev app.
This should give you some certainty, but we still ran into issues in production. If you want to be sure, play through the full update flow with a test repo, using packaged production apps just as you would with the live one.
TL;DR: Why does everything run fine when started via IntelliJ, and why is it broken when calling java -jar app.jar? And how do I fix this?
Alright, I have some issues with a backend I am trying to dockerize. I have an application created with Spring Boot (1.4.2.RELEASE) following the Spring OAuth (2.0.12.RELEASE) guide on their page. I followed the Gradle version, since I prefer Gradle over Maven, and I am using Kotlin instead of Java. Everything is fine: I start my backend with the static front end via IntelliJ, I can log in via Facebook (and Google and GitHub), I receive a nice Principal which holds all the information I need, and I can modify Spring Security to authorize and permit endpoints. So far so good.
Now for the bad part: when I run either ./gradlew clean build app:bootRun or ./gradlew clean build app:jar and start the jar via java -jar (like I will do in my Docker container), my backend comes up and my static front end pops up. Now I want to log in via Facebook: I end up on the Facebook login page, I enter my credentials, and... nothing!
I end up back on my homepage, not logged in, with no log messages that mean anything to me, just silence. The last thing I see in the log is: Getting user info from: https://graph.facebook.com/me
That URL gives me the following in my browser:
{
  "error": {
    "message": "An active access token must be used to query information about the current user.",
    "type": "OAuthException",
    "code": 2500,
    "fbtrace_id": "GV/58H5f4fJ"
  }
}
When the app is started via IntelliJ, the same URL gives me credential details. Obviously something is going wrong, but I have no clue what, especially since a run from IntelliJ works fine. There is some difference between how the jar is started and how IntelliJ's run config works, but I have no clue where to search for what. I could post trace logging or all my Gradle files, but perhaps that's too much info to put in one question. I will definitely update this question if someone needs more details :)
The structure outline of this project is as follows:
root:
- api: is going to be open-sourced later; contains REST definitions and DTOs.
- core: contains the meat. Its Gradle file includes spring-boot-starter, -web, -security, spring-security-oauth2, and some Jackson stuff.
- rest: contains versioned REST service implementations.
- app: contains Angular webjars amongst others, the front end, and my `@SpringBootApplication`, `@EnableOAuth2Client`, and the impl of `WebSecurityConfigurerAdapter`.
Why does everything run fine when started via IntelliJ, and why is it broken when using bootRun or the jar artefact? And how do I fix this?
I found it. The problem was not related to multi-module Gradle, Spring Boot, or OAuth2. In fact it was due to Gradle's source set configuration, where Java sources are supposed to live in the Java source set folder, and Kotlin sources in the Kotlin source set folder:
sourceSets {
    main.java.srcDirs += 'src/main/java'
    main.kotlin.srcDirs += 'src/main/kotlin'
}
As Will Humphreys stated in his comment above, IntelliJ takes all source sets and runs the app. However, when building the jar via Gradle, these source sets are stricter. I had a Java file in my Kotlin source set, which is no problem for IntelliJ, but the jar created by Gradle honours the source sets as defined in the build.gradle file, which are stricter.
I found my missing bean issue with the code below:
import java.util.Arrays;

import org.springframework.boot.CommandLineRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;

@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
    return args -> {
        System.out.println("Let's inspect the beans provided by Spring Boot:");
        String[] beanNames = ctx.getBeanDefinitionNames();
        Arrays.sort(beanNames);
        for (String beanName : beanNames) {
            System.out.println(beanName);
        }
    };
}
The bean I was missing was called AuthenticationController, which is a @RestController and kinda crucial for my authentication code.
I'm using the scripted test framework to test an sbt plugin. As part of that test it needs to download artifacts from a private artifact store, but scripted seems to ignore the credentials in the ~/.sbt/0.13 directory. How can I get my test to use those credentials?
It would be good if I didn't have to hard-code any paths.
I assume you have followed the official Publishing document and have a .credentials file containing realm, host, user, password:
realm=Sonatype Nexus Repository Manager
host=nexus.scala-tools.org
user=admin
password=admin123
Then, in the build.sbt of your test, add:
credentials += Credentials(BuildPaths.defaultVersionedGlobalBase(sbtVersion.value) / "credentials")
This is my hack/workaround for the problem.
In scripted.sbt I add the following code.
credentials ++= {
  val out = credentials.value.map {
    case c: FileCredentials   => s"""Credentials(new java.io.File("${c.path.getAbsolutePath}"))"""
    case c: DirectCredentials => s"""Credentials("${c.realm}", "${c.host}", "${c.userName}", "${c.passwd}")"""
  }.mkString("credentials ++= Seq(", ", ", ")")
  // Write a credentials.sbt into every scripted test project
  sbtTestDirectory.value.listFiles.flatMap(_.listFiles).foreach(f => IO.writeLines(f / "credentials.sbt", Seq(out)))
  List()
}
This takes the existing sbt credentials and writes a credentials.sbt into each of the scripted tests, e.g. sbt-test/test-group/test-name/credentials.sbt. These are then used by the scripted tests.
My sbt knowledge is somewhat limited so this is a bit ugly.
When I try to run a test using the Apache LDAP API, I get the following error. I set up a Maven project, and my pom.xml has many dependencies for the Apache Directory server and API artifacts. My code (an example I copied and pasted, just to get up and running so that I can explore) all builds fine. However, when I run it (as a JUnit test), I get the following...
Can anyone help me? Maybe even just provide an example of where the Apache LDAP API is being used successfully, along with a pom.xml with the correct dependencies? (The Apache LDAP API documentation seems to be out of date.)
I am currently starting the test with the embedded Apache Directory server, via the following:
@RunWith(FrameworkRunner.class)
@CreateLdapServer(transports =
    {
        @CreateTransport(protocol = "LDAP"),
        @CreateTransport(protocol = "LDAPS") })
// disable changelog, for more info see DIRSERVER-1528
@CreateDS(enableChangeLog = false, name = "PasswordPolicyTest")
public class PasswordPolicyIT extends AbstractLdapTestUnit
{ .......etc }
So an alternative approach is to tailor some of the tests to just connect to a local Directory Server instance that I have running on my machine. I assume that this would stop the error messages I am getting below. Again, if anyone could provide a code snippet there, it would be useful.
Many Thanks
> 2013-06-20 16:05:10 ERROR FrameworkRunner:287 - Problem locating LDIF
> file in schema repository Multiple copies of resource named
> 'schema/ou=schema/cn=apachemeta/ou=matchingrules/m-oid=1.3.6.1.4.1.18060.0.4.0.1.3.ldif'
> located on classpath at urls
> jar:file:/Users/rk/.m2/repository/org/apache/directory/api/api-ldap-client-all/1.0.0-M17/api-ldap-client-all-1.0.0-M17.jar!/schema/ou%3dschema/cn%3dapachemeta/ou%3dmatchingrules/m-oid%3d1.3.6.1.4.1.18060.0.4.0.1.3.ldif
> jar:file:/Users/rk/.m2/repository/org/apache/directory/shared/shared-ldap-schema-data/1.0.0-M7/shared-ldap-schema-data-1.0.0-M7.jar!/schema/ou%3dschema/cn%3dapachemeta/ou%3dmatchingrules/m-oid%3d1.3.6.1.4.1.18060.0.4.0.1.3.ldif
> jar:file:/Users/rk/.m2/repository/org/apache/directory/server/apacheds-all/2.0.0-M12/apacheds-all-2.0.0-M12.jar!/schema/ou%3dschema/cn%3dapachemeta/ou%3dmatchingrules/m-oid%3d1.3.6.1.4.1.18060.0.4.0.1.3.ldif
You need to exclude the shared-ldap-schema-data dependency from apacheds-all. Take a look at this comment.
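For illustration, a sketch of that exclusion in the pom.xml; the coordinates are taken from the jar paths in the error log above, so adjust the version to whatever you actually use:

<dependency>
    <groupId>org.apache.directory.server</groupId>
    <artifactId>apacheds-all</artifactId>
    <version>2.0.0-M12</version>
    <exclusions>
        <!-- Duplicate copy of the schema LDIFs; also provided by api-ldap-client-all -->
        <exclusion>
            <groupId>org.apache.directory.shared</groupId>
            <artifactId>shared-ldap-schema-data</artifactId>
        </exclusion>
    </exclusions>
</dependency>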