API versioning in Anypoint API Manager - Mule

I'm looking for a way to deal with API versions in Anypoint API Manager. At the moment it is possible to create a new version of an API, but it is not possible to distinguish between different OTAP environments. In my situation it can happen that a test environment has a newer API version than production. Does anyone recognize this issue, and how did you solve it?

Currently, there are no environment promotion capabilities per se in Anypoint Platform. Having said that, there are a number of things you can do that help in that regard, for example:
- You can export an API from Organization A and import it into Organization B.
- You can define different sub-organizations to reflect your OTAP environment structure.
In general, it is not necessarily a problem that a QA, Stage or UAT environment has a newer API version than Production, as long as you plan to implement that version there.
Just my 2c, Nahuel.

As far as I understand, if we are going to create multiple API versions then RAML versioning should be followed. I can share what I do:
- Development RAML version = 0.1.0
- First major release = 1.0.0
- For a minor change, bump to 1.0.1.
- For a major, contract-breaking change, move to 2.0.0.
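For reference, that version number typically lives in the header of the RAML definition itself. A minimal RAML 1.0 sketch, with a placeholder API name and URL:

```raml
#%RAML 1.0
title: Customer API        # hypothetical API name
version: 1.0.0             # full semantic version of the contract
# Exposing only the major version in the URL keeps 1.0.x changes non-breaking
# for consumers, while 2.0.0 would move to a new /v2 base path.
baseUri: https://api.example.com/customers/v1
```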

Related

Patching a MuleSoft application

We have MuleSoft applications and they are deployed on the Mule runtime. We need recommendations on patching MuleSoft applications. The patch can be either to the Mule runtime itself or to the application.
MuleSoft's recommendation on patching a Mule runtime is available at https://support.mulesoft.com/s/article/How-to-apply-patches-to-Mule-4-x.
There, the recommendation for patching the Mule runtime is to replace the jars/plug-ins. But with this approach, how can we track which patch version has been applied?
What is the recommended way to patch an application that is deployed on the Mule runtime?
Any help/recommendation on this is appreciated. Thanks.
You can look at infrastructure automation tools, like Puppet, Chef and many others, that install your runtime with the correct patches automatically, so that your runtime always uses the correct dependencies and the setup is repeatable. Which tool to use depends on your organisation.
Or, just as with your code, you can version-control your runtime or install scripts in Git.
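Building on that, one simple way to know which patch level a given runtime is at is to keep a manifest of its jars under version control next to those install scripts. A rough PowerShell sketch; MULE_HOME, the output file name and the assumption that patches arrive as jar replacements are all placeholders to adapt:

```powershell
# Sketch: write a manifest of every jar in the runtime so the applied patch level
# shows up as a plain diff in version control. $env:MULE_HOME and the file name are assumptions.
$muleHome = $env:MULE_HOME

Get-ChildItem -Path $muleHome -Recurse -Filter *.jar |
    Sort-Object FullName |
    ForEach-Object {
        # one line per jar: SHA-256 hash plus path relative to the runtime home
        "{0}  {1}" -f (Get-FileHash $_.FullName -Algorithm SHA256).Hash,
                      $_.FullName.Substring($muleHome.Length)
    } |
    Set-Content -Path "runtime-manifest.txt"

# Commit runtime-manifest.txt after each patch, e.g.
#   git add runtime-manifest.txt
#   git commit -m "Runtime patch level after applying <patch id>"
```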
The Mule runtime will log its own version and the versions of each plugin and applied patch whenever it starts your app, so check the log from the last time you started it and you'll see the current version and patch level.
So trust the recommendations in the help document you cited. As Ryan says, you can use any DevOps tools favored in your organization. (If you don't have a favorite, Maven integrates very nicely with Anypoint Studio and can help with this.)

Hortonworks vs Apache projects

I want to know the difference between installing Hortonworks HDP and installing the components directly from the Apache projects. One thing I can think of is that Hortonworks probably aligns the packages so that the version of each component is compatible with the others in the suite, whereas if I get them directly from the Apache projects I may have to handle version compatibility myself. Is that correct? Is there any other difference involved, ignoring the support subscription aspect?
Thanks.
There are a lot of differences between "roll your own" and using a distribution. Some of the most obvious include:
- All of the various components and versions have been tested and built to work together; incompatibility between versions (e.g. Hive, Hadoop, Spark, etc.) can be a painful problem to sort through on your own.
- Most distribution providers, including Hortonworks, will bring patches from unstable releases into stable releases, so even for the "same" version (e.g. Hive 1.2.1) you're getting a better release than vanilla; these can include both bug fixes and "safe" feature changes.
- Most distribution providers, including Hortonworks, provide some flavor of centralized platform management. I'm a big fan of Ambari (the one that comes with HDP), for example; it makes configuration and monitoring significantly easier than coordinating a vanilla install.
I would strongly recommend against trying to deploy vanilla, unless it's just for learning and playing. The HDP community edition is free (in both senses) and a major improvement over doing it yourself. My last deployment of HDP was entirely based on the community edition.

Why is Akka.Persistence still a beta release? Is it stable?

Why is Akka.Persistence still a beta release on NuGet? Does that imply it is still not stable and not good for use in production applications?
In Akka.NET, in order to get out of prerelease, a package must meet multiple criteria, such as:
- Having a full test suite up and running. In the case of clustered plugins, this also includes multi-node tests.
- Having a fixed API. There are dedicated API approval tests ensuring that no public API has been accidentally changed.
- Having a battery of performance tests. While many of the plugins are ready and usually fast without them, stress tests are needed to check that merged pull requests haven't introduced any performance penalties.
- Having all documentation written and published.
While this is a lot, not all of these are necessary to make a plugin functional. In the case of Akka.Persistence there are minor changes pending (like the deprecation of PersistentView in favor of persistence queries), but the plugin itself is production-ready and already used as such. However, the maturity of the persistence backend plugins used underneath may vary.
Akka.Persistence is stable now. You can install it by running the following command in the Package Manager Console:
Install-Package Akka.Persistence

WindowsAzure.AD.Graph.2013_04_05 - How long is this valid?

I have an MVC app which uses Azure AD. It works very well using the WindowsAzure.AD.Graph.2013_04_05 helper project that Microsoft made available. This project is now outdated, and looking at the new NuGet package, moving between the two requires code changes.
I have two questions. First, how long can I use the old one before I find myself locked out of my own app? Second, has anyone migrated between them?
What I have is very simple; it's just an auth filter which checks whether a user is in one or more groups.
You should have at least a year from the announcement of a REST version being deprecated; a REST version is not the same as a NuGet package version, so you'll need to understand the underlying version in use.
Please refer to this Microsoft support policy on Azure-related REST APIs and libraries: http://support.microsoft.com/gp/azure-cloud-lifecycle-faq

Using NuGet in a development environment - best practices / how to

Trying to figure out the best way to use NuGet in a development environment to manage our own libraries.
We want to standardize on the NuGet way of doing things for our third-party libs, but would also like to use NuGet to manage our internal utility libraries. For developers consuming the in-house libs this is great and everyone's happy. However, for devs actively working on the utility lib it seems to be more problematic: their previous process of build lib, build main app, F5 and go is now slowed down with publishing and updating, and potentially lots of packages, not to mention the moaning about the additional process!
We use TDD on the internal libs, but everyone needs to be able to debug and modify the libs along with the main app. I have seen Phil Haack's demo on debug packages in 1.3 and read David Ebbo's blog, but that fits a different scenario.
So what is the best process for dev/debug cycles? If we use NuGet, do we just accept the existing constraints? Is there a hybrid practice people are using, and maybe 1.3 gets closer to automating all this? Or do we just avoid NuGet for internal packages, which would be a real shame?
Loving NuGet, maybe wanting way too much from the little guy; feedback appreciated.
Thanks
I'd suggest you use separate network shares or feeds (similar to what myget.org supports in the cloud) for different scenarios.
You could imagine creating a CI share, a QA share, a Releases share, ...
For instance, make people working on the referenced library do CI builds that drop CI packages on the CI repository, and have them picked up by other projects (which just need to do a simple update; this could be automated through PowerShell in a pre-build step: check for a new version and, if there is one, update).
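To make that concrete, here's a rough sketch of such a pre-build step, assuming a reasonably recent nuget.exe on the PATH; the feed path, solution name and package id are placeholders for your own setup:

```powershell
# Pre-build sketch: pull the newest CI build of an internal utility package.
# The feed path, solution file and package id below are placeholders -- use your own.
$ciFeed  = "\\build\nuget-ci"
$package = "Company.Utilities"

# 'nuget update' only acts when the feed has something newer, so it is cheap to run on every build.
& nuget.exe update "MyApp.sln" -Id $package -Source $ciFeed
```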
Just make sure that when products release their milestones, they also release with released dependencies (this could be as simple as switching feeds; releases will always have a higher version number than CI builds).
Hope that helps!
Cheers,
Xavier
If you're working on the source code for the lib and the main app at the same time, I'd say NuGet is probably not a good solution. I think it'll only work in situations where you work with a "stable" version of the library that doesn't need to change frequently during the development of your main app.
That said, is it possible the development on your library could be done in isolation? You already mention you're doing TDD on the lib, so why can't that work be done first, then the lib built and deployed, and then the main app work done?