Does OpenDaylight still have a UI? (sdn)

I am trying to get to the web interface of the OpenDaylight controller (Magnesium version),
but I cannot install the features odl-restconf, odl-l2switch-switch, odl-mdsal-apidocs and odl-dlux-all.
Is a web interface still available in the ODL controller?

This question comes up periodically. The DLUX GUI project, as well as
the spanning-tree l2switch project, are no longer maintained or part of
an OpenDaylight release. You can try an older version of OpenDaylight. Or,
if you are the developer type, you can try to bring DLUX and/or l2switch
up to speed with the current ODL project and possibly get them to build and
install.
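For reference, on an older release that still shipped DLUX, the usual steps looked roughly like the sketch below; the exact feature names and the GUI URL vary by release, so treat this as an illustration rather than a recipe.

# from the unpacked distribution of an older OpenDaylight release that still contains DLUX
./bin/karaf
# then, inside the Karaf console:
feature:install odl-restconf odl-l2switch-switch odl-mdsal-apidocs odl-dlux-all
# the GUI was typically served at http://<controller-ip>:8181/index.html afterwards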


Any side effects for Chocolatey if software updates itself and one tries to update through choco again?

Some programs installed by Chocolatey have the ability to self-update; examples are VS Code and PyCharm. Will there be any side effects if one uses choco to upgrade the software after the software has already done a self-update?
I am using Windows 10, Python 3.8 and the Anaconda distribution.
If by "side effects" you mean try to download when they shouldn't, or download and revert to outdated files that they shouldn't, the answer is no. They don't just retrieve the packages, they manage them.
handling dependencies during installs and upgrades is the main purpose for package mgrs. to exist. They manage stuff like that so developers do not need to. Otherwise, building any type of project with current methods and standards would be extremely challenging so you don't have to.
Chocolatey, like most package mgrs., runs a 2 step process as a result of the upgrade command. First it checks for newer versions, If there are none, it does nothing. The convention for nearly all automatic upgrade processes is the same. Because of this, either one only retrieves when it needs to.
Companies like IntelliJ and Microsoft usually have relatively robust Deployment schemes, regardless of the method. If a package manager was going to break (or breaking) their automatic update, they would either fix the compatability issue or choose to forgoo the install method. So long as you verify your sources before downloading anything from anywhere, you should be ok.
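To illustrate that check from the command line, here is a minimal sketch; the package id pycharm-community is just the common community one and may differ in your setup.

# list packages whose installed version is older than the newest one Chocolatey knows about
choco outdated
# upgrade a single package; if it is already current, Chocolatey reports that and changes nothing
choco upgrade pycharm-community -y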
Side Effects
Side effects should not normally occur, but it depends on whether the application successfully overwrites its files, which is usually the case.
If PyCharm executes its self-update and the Chocolatey package manager updates the PyCharm application at the same time, errors may occur and the files may not be overwritten successfully.
1. Solution - Deactivate Self-Update
You can manually disable the self-update option integrated in the application.
If the application does not allow you to disable the self-update option, try disabling the update service via the Services application.
2. Solution - Deactivate the Update Service via Services
The Services application allows you to configure the services installed by the respective application.
It is strongly recommended that you create a restore point before making any changes to services.
If you make a mistake that cripples your computer, you can use the restore point to perform a system restore and undo the changes.
If you have disabled the wrong service and lost access to the computer, try booting into safe mode to change the service back.
Configure the update service via the Services application and set the status to Disabled.
Example: Deactivating the Brave Update Service
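A hedged command-line equivalent of that example, assuming PowerShell run as Administrator; "BraveSoftwareUpdateService" is only an illustrative name, so look up the real service name on the service's Properties page first.

# stop the updater if it is currently running (ignore errors if it is not)
Stop-Service -Name "BraveSoftwareUpdateService" -ErrorAction SilentlyContinue
# prevent it from starting again
Set-Service -Name "BraveSoftwareUpdateService" -StartupType Disabled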

How to create a project from a template within YouTrack with external Hub integration?

I am experimenting with the latest Hub and YouTrack on a Linux machine.
I installed the latest versions (2019.2 and 2019.1 respectively) and enabled the Hub integration in YouTrack (not using HTTPS for the moment; plain old HTTP is used).
What happened is this:
1. When I try to create a project I am always switched to Hub (is this correct? I did not find anything in the JetBrains docs).
2. If I create a project from the Hub interface and then click on the left panel to add a "YouTrack service", I am only offered the option to create "default, scrum and kanban" projects, which are the standard ones provided by JetBrains. However, if I had already created a project and saved it as a template, that project is not offered to me as a base for the new one.
3. If I use YouTrack with the internal Hub, everything works as expected and the template projects are available as a starting point for new projects.
This happens as well with older versions (2018.4) of Hub and YouTrack.
1. It seems correct; with an external Hub installed, projects are always created from there.
2-3. There is a known issue with that, which as far as I recall is a bug in Hub yet to be fixed: https://youtrack.jetbrains.com/issue/JPS-9928

Jaeger standalone without docker

I cannot find any information on whether Jaeger can be executed without Docker.
Does a standalone jar exist, or will there be a release in the future for Jaeger like Zipkin has?
The Downloads page (https://www.jaegertracing.io/download/) lists both the Docker images and the raw binaries built for various platforms (Linux, macOS, Windows). You can also build binaries from source.
Just to add to Yuri's answer, you can also download the source from GitHub (GitHub - Jaeger). This is useful for diagnosing issues, or just getting a better understanding of how it all works.
I have run both the released binaries and custom builds on both Windows and Linux servers without issues. For Windows I would recommend running it as a service using NSSM (NSSM details).
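As a rough sketch of both points, the commands below show running the standalone binary on Linux and wrapping it as a Windows service with NSSM; the version number and the paths are illustrative.

# Linux/macOS: download from https://www.jaegertracing.io/download/, unpack and run
tar xzf jaeger-1.x.x-linux-amd64.tar.gz
cd jaeger-1.x.x-linux-amd64
./jaeger-all-in-one          # the UI is then served on http://localhost:16686
# Windows: register the all-in-one binary as a service with NSSM
nssm install Jaeger C:\jaeger\jaeger-all-in-one.exe
nssm start Jaeger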

MonoDroid use Beta, MonoTouch use Release?

We have enterprise licenses for both MonoDroid and MonoTouch from Xamarin.
Our MonoTouch is production so we can't install Beta versions, but we are learning/testing MonoDroid and wish to install the Beta version.
How can we have this configuration when the update channel in MonoDevelop allows for only one mode (Release/Beta/Alpha)?
You can switch between channels, and you do not have to install everything MonoDevelop offers you. This means that you can:
Switch to stable channel and install MonoTouch updates (and ignore anything else)
Switch to beta channel and install Mono for Android updates (and ignore anything else)
This is of course not optimal, since MonoDevelop will not offer updates for the channel you don't currently have selected, and you'll also be offered updates you may not want. The first problem can be solved by switching channels periodically (for instance once a week).

How to automate development environment setup? [closed]

Every time a new developer joins the team or the computer a developer is using changes, the developer needs to do a lot of work to set up the local development environment to make the current project work. As a Scrum team we are trying to automate everything, including deployment and tests, so what I am asking is: is there a tool or a practice to make local development environment setup automated?
For example, to set up my environment I first had to install Eclipse, then SVN, Apache, Tomcat, MySQL and PHP. After that I populated the DB and had to make minor changes in the various configuration files, etc. Is there a way to reduce this labor to one click?
There are several options, and sometimes a combination of these is useful:
automated installation
disk imaging
virtualization
source code control
Details on the various options:
Automated Installation
Tools for automating installation and configuration of a workstation's various services, tools and config files:
Puppet has a learning curve but is powerful. You define classes of machines (development box, web server, etc.) and it then does what is necessary to install, configure, and keep the box in the proper state. You asked for one-click, but Puppet by default is zero-click, as it checks your machine periodically to make sure it is still configured as desired. It will detect when a file or mode has been changed, and fix the problem. I currently use this to maintain a handful of RedHat Linux boxes, though it's capable of handling thousands. (Does not support Windows as of 2009-05-08).
Cfengine is another one. I've seen this used successfully at a shop with 70 engineers using RedHat Linux. Its limitations were part of the reason for Puppet.
SmartFrog is another tool for configuring hosts. It does support Windows.
Shell scripts. RightScale has examples of how to configure an Amazon EC2 image using shell scripts.
Install packages. On a Unix box it's possible to do this entirely with packages, and on Windows msi may be an option. For example, RubyWorks provides you with a full Ruby on Rails stack, all by installing one package that in turn installs other packages via dependencies.
Disk Images
Then of course there are also disk imaging tools for storing an image of a configured host such that it can be restored to another host. As with virtualization, this is especially nice for test boxes, since it's easy to restore things to a clean slate. Keeping things continuously up-to-date is still an issue--is it worth making new images just to propagate a configuration file change?
Virtualization is another option, for example making copies of a Xen, VirtualPC, or VMWare image to create new hosts. This is especially useful with test boxes, as no matter what mess a test creates, you can easily restore to a clean, known state. As with disk imaging tools, keeping hosts up-to-date requires more manual steps and vigilance than if an automated install/config tool is used.
Source Code Control
Once you've got the necessary tools installed/configured, then doing builds should be a matter of checking out what's needed from a source code repository and building it.
Currently I use a combination of the above to automate the process as follows:
Start with a barebones OS install on a VMWare guest
Run a shell script to install Puppet and retrieve its configs from source code control (a rough sketch of such a script follows after this list)
Puppet to install tools/components/configs
Check out files from source code control to build and deploy our web application
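As an illustration of that bootstrap step, the script might look something like the following; the repository URL and paths are made up, and the package commands depend on your distribution.

#!/bin/sh
# bootstrap a fresh RedHat-style VM: install Puppet, pull its manifests, apply them
yum install -y puppet
svn checkout http://svn.example.com/puppet /etc/puppet    # manifests kept in source control
puppet apply /etc/puppet/manifests/site.pp                # bring the box to the desired state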
I stumbled across this question and was very surprised that no one has mentioned Vagrant yet.
As Pete TerMaat and others have mentioned, virtualization is a great way to manage and automate development environments. Vagrant basically takes the pain away from setting up these virtual boxes.
Within minutes you can have a completely fresh copy of your favourite Linux distro up and running, and provisioned exactly the same way your production server is.
No more fighting with OSX or Windows to get PHP, MySQL, etc. installed. All software lives and runs inside the virtual machine. You can even SSH in with vagrant ssh. If you make a mistake or break something, just vagrant destroy it, and vagrant up to start over fresh.
Vagrant automatically creates a synced folder to your local file system, meaning you don't need to develop within the virtual machine (ie. using Vim). Use whatever your editor of choice is.
I now create a new "Vagrant box" for almost every project I do. All my settings are saved into the project repository, so it's easy to bring on another team member. They simply have to pull the repo, and run vagrant up, and they are literally ready to go.
This also makes it much easier to handle projects that have different software requirements. Maybe you have some projects that rely on PHP 5.3, but some newer ones that run PHP 5.4. Just install the version you want for that project.
Check it out!
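The day-to-day workflow is only a handful of commands; here is a minimal sketch, assuming the publicly available hashicorp/bionic64 base box.

vagrant init hashicorp/bionic64   # writes a Vagrantfile into the current directory
vagrant up                        # downloads the box on first run and boots the VM
vagrant ssh                       # open a shell inside the running machine
vagrant destroy -f                # throw the machine away and start over fresh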
One important point is to set up your projects in source control such that you can immediately build, deploy and run after checkout.
That means you should also check in helper infrastructure, such as Makefiles, Ant buildfiles etc., and settings for the tools, such as IDE project files.
That should take care of the setup hassle for individual projects.
For the basic machine setup, you could use a standard image. Another option is to use your platform's tools to automate installation. Under Linux, you could create a meta-package that depends on all the packages you need. Under Windows, a similar thing should be possible using MSI or the like.
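On a Debian/Ubuntu system, one lightweight way to build such a meta-package is the equivs tool; this is only a sketch, and the package name, dependency list and resulting filename are made up (the exact filename depends on the version and architecture you set in the control file).

sudo apt-get install equivs
equivs-control acmecorp-dev-env
# edit the generated file: set "Package: acmecorp-dev-env" and
# "Depends: subversion, apache2, tomcat9, mysql-server, php"
equivs-build acmecorp-dev-env
sudo dpkg -i acmecorp-dev-env_1.0_all.deb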
Edit:
Ideally, instead of checking in helper infrastructure, you check in the information that allows the build to generate the helper infrastructure. This is the approach taken by e.g. the GNU build system (autotools etc.), or by Maven. This is even more elegant, because you can (theoretically) generate infrastructure for any (supported) build environment, thus you are not bound to e.g. one specific IDE, and settings in the helper infrastructure (paths etc.) don't need to duplicate the main project settings.
However, this is also a more complex approach, so if you can't get it to work, I believe checking in stuff like IDE files directly is acceptable.
I like to use Virtual PC or VMware to virtualize the development environment. This provides a standard "dev environment" that can be shared among developers. You don't have to worry about software that the user could add to their system that may conflict with your development environment. It also provides me a way to work on two projects where the development environments can't both be on one system (using two different versions of a core technology).
Use puppet to configure both your development and production environment. Using a top-notch automation system is the only way to scale your ops.
There's always the option of using virtual machines (see e.g. VMWare Player). Create one environment and copy it over for each new employee with minimal configuration needed.
At a prior place we had everything (and I mean EVERYTHING) in SCM (ClearCase, then SVN). When a new developer came in, they installed ClearCase/SVN and sucked down the repository. This also handles the case when you need to update a particular lib/tool, as you can just have the dev teams update their environment.
We used two repo's for this so code and tools/config lived in separate places.
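In practice the new-hire step then boils down to two checkouts, roughly like this sketch (repository URLs and paths are made up):

svn checkout http://svn.example.com/code/trunk  ~/work/code    # the project sources
svn checkout http://svn.example.com/tools/trunk ~/work/tools   # compilers, libs, config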
I highly recommend Blueprint from DevStructure. It's open-source and your use case is actually the exact reason we originally wrote the software. Our goals have somewhat changed, but it still is the perfect tool for what you are describing. In short, you can create reusable server configs - dead simple configuration management. I hope this helps!
https://github.com/devstructure/blueprint (Blueprint # Github)
I've been thinking about this myself. There are some other technologies that you could throw into the mix. Here's what I'm currently setting up:
PXE based pre-seeded installation images (Debian Squeeze). You can start up a bare-metal machine (or new virtual appliance) and select the image from the PXE boot menu. This has the major advantage of being able to install your environment on physical machines (in addition to virtual appliances).
Someone already mentioned Puppet. I use CFEngine but it's a similar deal. Essentially your configuration is documented and centralized in policy files which are continually enforced by an agent on the client.
If you don't want a rigid environment (i.e. developers may choose a combination of tool-sets), you can roll your own .deb packages so new devs can type sudo apt-get install acmecorp-eclipse-env or sudo apt-get install acmecorp-intellij-env, for example.
Slightly off-topic, but if you run a Debian based environment (i.e. Ubuntu), consider installing apt-cacher (package proxy). In addition to saving bandwidth, it will make your installations much faster (since packages are cached on your local network).
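Pointing client machines at such a cache is a one-line config change; here is a sketch, where the hostname is made up and 3142 is apt-cacher's default port.

# on each developer machine, route apt traffic through the local cache
echo 'Acquire::http::Proxy "http://apt-cacher.example.internal:3142";' | sudo tee /etc/apt/apt.conf.d/01proxy
sudo apt-get update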
If you're using OSX and working with Rails, I'd suggest either:
https://github.com/platform45/let-there-be-light
https://github.com/thoughtbot/laptop
If you use machines in a standard configuration, you can image the disk with a fresh perfectly configured install -- that's a very popular approach in many corporations (and not just for developers, either). If you need separately configured OS's, you can tar-bz2 all the added and changed files once a configured OS is turned into your desired setup, and just untar it as root to make your desired environment from scratch.
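A hedged sketch of that tar-based approach; the directory list is purely illustrative, so capture whatever you actually added or changed.

# on the configured machine: archive the added and changed files
tar cjf dev-setup.tar.bz2 /etc /opt/tools /usr/local
# on a fresh install: unpack it as root over the base system
sudo tar xjf dev-setup.tar.bz2 -C /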
If you're using a Linux flavor, you've probably got a package management system: think .rpm for Fedora/Red Hat, or .deb for Ubuntu/Debian. Many of the things you describe already have packages available: SVN, Eclipse, etc. You could roll your own packages for company-specific software, create a repository (perhaps only available on the local network), and then your setup could be reduced to a single bash script which would add the company repo to /etc/apt/sources.list (Debian/Ubuntu) and then call a command like,
/home/newhire$ apt-get update && apt-get install some complete package list
You could then use Buildbot to automate regular builds for company packages that change often.
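A sketch of what that one-shot setup script could contain; the repository URL and meta-package name are made up.

#!/bin/sh
# add the internal repo, refresh the index, pull in the whole dev stack
echo 'deb http://apt.example.internal/debian stable main' | sudo tee -a /etc/apt/sources.list
sudo apt-get update
sudo apt-get install -y acmecorp-dev-env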
Try out DevScript at http://nsnihalsahu.github.io/devscript .
It's one command, like:
devscript lamp or devscript laravel or devscript django. Setup takes around a few minutes, depending on the speed of your internet connection.