I have a bunch of Swaggerized services. These services all talk to (call) each other. The services, of course, have API versions. Ultimately our goal is not to make incompatible changes to the REST APIs, but given that we're in the early stages (nothing in production), we are quite prepared to "correct mistakes".
I have not found any good treatise on how to organize the services with regard to Maven projects and Swagger Codegen.
It seems to me that each service wants the following artifacts:
A WAR file to deploy the service
A Swagger Codegen-generated client API in a JAR file that would be imported by consumers of the API.
We would also want the client API JAR in our own Maven repository so it can be brought in as a Maven dependency.
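A consumer would then just declare something like this (coordinates are illustrative):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>my-service-client</artifactId>
    <version>1.0.0</version>
</dependency>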
So my questions are:
Is this how it is typically done?
KEY QUESTION: Is there a way for me to generate a client from the Swagger-annotated resource classes? I'm looking to avoid having to DEPLOY the service in order to get a Swagger specification file.
Would we generate the client JAR as a SECOND artifact of the service WAR Maven project, or...
Should we have a separate Maven project for the JAR file that imports stuff from the service (remember, that's a WAR file, not a JAR file)? I suppose it might scan the Swaggerized resource classes to produce a Swagger specification file and then generate the client from that? (A sketch of this approach appears after the additional points below.)
Something else entirely?
Additional points:
Right now, we only have Java clients, but we might have other clients in the future
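For what it's worth, here is a rough sketch of the build-time setup I have in mind for the key question, assuming the kongchen swagger-maven-plugin (which scans the annotated classes during the build, so nothing has to be deployed) feeding the swagger-codegen-maven-plugin. Versions, package names, and paths below are illustrative only:

<!-- Step 1 (sketch): generate swagger.json from the annotated resource classes at build time -->
<plugin>
    <groupId>com.github.kongchen</groupId>
    <artifactId>swagger-maven-plugin</artifactId>
    <version>3.1.8</version> <!-- illustrative version -->
    <configuration>
        <apiSources>
            <apiSource>
                <locations>
                    <location>com.example.myservice.resources</location> <!-- placeholder package -->
                </locations>
                <swaggerDirectory>${project.build.directory}/swagger</swaggerDirectory>
            </apiSource>
        </apiSources>
    </configuration>
    <executions>
        <execution>
            <phase>compile</phase>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<!-- Step 2 (sketch): generate the Java client from that spec, no deployment involved -->
<plugin>
    <groupId>io.swagger</groupId>
    <artifactId>swagger-codegen-maven-plugin</artifactId>
    <version>2.4.8</version> <!-- illustrative version -->
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputSpec>${project.build.directory}/swagger/swagger.json</inputSpec>
                <language>java</language>
                <output>${project.build.directory}/generated-client</output>
            </configuration>
        </execution>
    </executions>
</plugin>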
Related
In JBoss 4.2.3 we could configure items in
[jboss_server]/deploy/jboss-web.deployer/conf/web.xml
which would be adopted by all applications deployed. We've used this to configure context params, servlets, and default tag files.
We have dozens of apps deployed in WAR files, and this is a very handy tool.
How is this accomplished in JBoss 7.1.1? I've googled and searched but can't seem to find the solution.
You could try web fragments (part of Servlet API 3.x). You'll be able to apply the same set of filters, mappings, listeners, and variables to each web app's context using one META-INF/web-fragment.xml file (inside some WEB-INF/lib/my-common-context.jar, so it'd be easily managed as a simple dependency).
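As a rough illustration of what such a jar could carry (all names below are placeholders, not tested against 7.1.1):

<!-- META-INF/web-fragment.xml inside WEB-INF/lib/my-common-context.jar;
     merged by the container into each app's effective web.xml (Servlet 3.0) -->
<web-fragment xmlns="http://java.sun.com/xml/ns/javaee" version="3.0">
    <name>my-common-context</name>
    <context-param>
        <param-name>sharedSetting</param-name>
        <param-value>sharedValue</param-value>
    </context-param>
    <listener>
        <listener-class>com.example.CommonContextListener</listener-class>
    </listener>
</web-fragment>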
I have a web application developed using Spring 3. Some functions of the web app need to be exposed as web services as well.
The web app is deployed in Tomcat as a .war file.
I have gone through the Axis2 and Spring integration guide at http://axis.apache.org/axis2/java/core/docs/spring.html. What I am unclear about is how the final structure looks. I need clarification on the points below:
1) What should be the directory structure of my final app for "With ServletContext" as well as "Without ServletContext"?
2) Should the .aar file also be placed in the WEB-INF/lib directory? If so, how does Axis2 recognize it as a service, given that it enforces a specific directory structure: an .aar file containing a META-INF folder with services.xml, and the classes at the same level as the META-INF folder?
I am not sure if I am going wrong in getting the whole picture. Any guidelines or a good tutorial would be very helpful.
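For reference, the services.xml from the linked page looks roughly like this, if I'm reading it correctly (the bean and operation names are just the tutorial's examples):

<!-- META-INF/services.xml inside the .aar; ties the Axis2 service
     to a Spring bean via the ServletContext-based object supplier -->
<service name="SpringAwareService">
    <description>simple spring example</description>
    <parameter name="ServiceObjectSupplier">org.apache.axis2.extensions.spring.receivers.SpringServletContextObjectSupplier</parameter>
    <parameter name="SpringBeanName">springAwareService</parameter>
    <operation name="getValue">
        <messageReceiver class="org.apache.axis2.receivers.RawXMLINOutMessageReceiver"/>
    </operation>
</service>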
I'm trying to convince the higher-ups at my workplace to migrate to Apache Ivy. I've managed to get a few sandbox projects working using Ivy to power the build, and now I have a green light to put together a migration proposal.
We all agree on one thing: we don't want to trust JARs that are located in public directories! I know, I know, a bit paranoid, yes. But we'd like to have a setup where we pull a JAR from a trusted source (either downloading it from the open source project itself, or most likely, gulp, a public repo), and use it for some time before we "certify" it (give it our blessing as a safe artifact to use).
Then we want to have a common repository for all JARs used by our many projects.
My original thinking was to place this repository in version control (we have an SVN server). But I wasn't sure what best practices dictate. It might make more sense to put our JARs on a file server and FTP to them in the Ivy script.
Either way, SVN (HTTPS) or FTP, all of our servers are authenticated. So, a small number of questions:
Where should we be publishing all of our "certified" JARs (everything from `log4j` to any homegrown JARs we produce)? What do best practices dictate?
The "ivyrep" resolver-type does not take username or passwd atrributes. If our "JAR server" (FTP, SVN, etc.) is authenticated, how do I configure the Ivy scripts to login?
I must echo Brian's recommendation to use a repository manager like Nexus. It's a lot less work in the long run. You'll also discover that the professional version of Nexus enables you to create approval processes around repositories which you plan to use in your build. See the procurement suite functionality.
If, on the other hand, you are determined to build your own repository, then Ivy has the tools for the job. You need to become very familiar with the Ivy settings file and how it declares and uses resolvers.
If the repository is accessible via HTTPS, the url resolver should be able to access it. The resolver will assume that each version of an artifact is in a different directory, and you'll need to specify the URL pattern that Ivy will use when accessing the repository:
<url name="two-patterns-example">
    <ivy pattern="http://ivyrep.mycompany.com/[module]/[revision]/ivy-[revision].xml" />
    <artifact pattern="http://ivyrep.mycompany.com/[module]/[revision]/[artifact]-[revision].[ext]" />
</url>
The pattern is fully flexible with respect to how you store the artifacts.
Authentication is also handled in the settings file using the credentials tag.
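For example (host, realm, and credentials are placeholders):

<!-- in ivysettings.xml -->
<credentials host="ivyrep.mycompany.com"
             realm="My Company Realm"
             username="builduser"
             passwd="secret"/>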
Finally, the FTP protocol is also supported. It's hard to find in the documentation, but it's supported by the vfs resolver.
I think that's enough information on an option I don't recommend :-) Having said that, I once created an FTP-based repository for managing releases to clients. It's useful to have a tool this powerful :-)
Why not use something like Sonatype's Nexus? I've seen it used for Maven, and I believe it'll work for Ivy.
You can set it up to download from remote repositories into (say) a 'test' repository. You can then evaluate those .jars, and if they're good, upload them into an 'approved' repository for general consumption. There's some authentication surrounding this, but you'd have to evaluate that in greater depth. Certainly you can restrict the uploading into repositories via a username/password pair.
We developed a web service client application using the WSDLs from their URL locations.
I don't want the web service client to fetch and validate the actual WSDLs every time, so I wanted to download the WSDLs into the local project.
Is there any way I can download the WSDLs using Maven, similar to copying resources?
You can use the wagon plugin to download the files to your project. See its usage page.
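A minimal sketch, assuming the org.codehaus.mojo wagon-maven-plugin and an illustrative WSDL location (version, URL, and paths are placeholders):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>wagon-maven-plugin</artifactId>
    <version>2.0.2</version> <!-- illustrative version -->
    <executions>
        <execution>
            <id>download-wsdl</id>
            <phase>generate-resources</phase>
            <goals>
                <goal>download-single</goal>
            </goals>
            <configuration>
                <url>http://example.com/services</url> <!-- placeholder server URL -->
                <fromFile>MyService?wsdl</fromFile> <!-- placeholder remote path -->
                <toFile>${project.basedir}/src/main/resources/wsdl/MyService.wsdl</toFile>
            </configuration>
        </execution>
    </executions>
</plugin>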
Two cases. If you own the service code, and it's something like Apache CXF, there's a command-line tool like java2wsdl or java2ws. In the case of CXF, there's even a Maven plugin for the job already.
If the service is 'out there' somewhere, you could use the Linux wget or curl commands to fetch the WSDL from the ?wsdl URL.
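In the CXF case, the plugin configuration is roughly like this, if memory serves (version and class name are placeholders):

<plugin>
    <groupId>org.apache.cxf</groupId>
    <artifactId>cxf-java2ws-plugin</artifactId>
    <version>3.3.6</version> <!-- illustrative version -->
    <executions>
        <execution>
            <id>generate-wsdl</id>
            <phase>process-classes</phase>
            <goals>
                <goal>java2ws</goal>
            </goals>
            <configuration>
                <className>com.example.MyServiceImpl</className> <!-- placeholder service class -->
                <genWsdl>true</genWsdl>
            </configuration>
        </execution>
    </executions>
</plugin>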
I would like to host a Maven repository for a framework we're working on and its dependencies. Can I just deploy my artifacts to my FTP host using mvn deploy, or should I deploy manually and/or set some things up before being able to deploy artifacts? I only have FTP access to the server I want to host the Maven repo on.
The online repository I want to use is not hosted by me. As I said, I only have FTP access, so if possible I would like to use that FTP space as a Maven repository. The tools mentioned seem to work when you have full control over the host machine, or at least more than just FTP access, since you need to configure the local directories where the repositories will be placed. Is this possible?
You might want to have a look at Nexus, a Maven repository manager. We've replaced our local Maven repository with a Nexus-based one and find it tremendously useful.
I've successfully used Archiva as my repository for several years; see http://archiva.apache.org/. It's easy to administer and allows you to configure as many repositories as you need (SNAPSHOT, internal, external, etc.).
According to the book "Better Builds with Maven", the most common type of repository is HTTP; this paragraph describes what I think you need:
This chapter will assume the repositories are running from http://localhost:8081/ and that artifacts are deployed to the repositories using the file system. However, it is possible to use a repository on another server with any combination of supported protocols including http, ftp, scp, sftp and more. For more information, refer to Chapter 3.
A Maven 2 repository is simply a specific directory structure, so once you get the transport and server specifications right for the repository and deployment portion of your POMs, it should be completely transparent to your users.
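If the ftp transport works for your setup (a later answer disputes write support), the wiring would look roughly like this; the host, repository id, and version are placeholders, and the matching username/password go in a <server> entry with the same id in settings.xml:

<!-- in pom.xml: enable the ftp:// transport for deployment -->
<build>
    <extensions>
        <extension>
            <groupId>org.apache.maven.wagon</groupId>
            <artifactId>wagon-ftp</artifactId>
            <version>3.5.3</version> <!-- illustrative version -->
        </extension>
    </extensions>
</build>

<!-- in pom.xml: where mvn deploy publishes artifacts -->
<distributionManagement>
    <repository>
        <id>ftp-repo</id>
        <url>ftp://ftp.example.com/public_html/maven2</url>
    </repository>
</distributionManagement>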
You can even use Dropbox. All that you need is a public address to access the files generated with mvn deploy, with any of the protocols in the accepted answer.
I guess there are more services that can work in the same way, but I'm not certain about the URL schemes that alternatives to Dropbox may use.
https://maven.apache.org/wagon/wagon-providers/wagon-ftp/ will tell you that you can use FTP to read from an existing repository, but not to create a new one. I don't think it is impossible in principle, but no one has cared to write all the fiddly code to do the directory management via FTP.