I am shifting a Rails 3 app from Heroku to Engine Yard, and I want to know how to start (or restart) Sidekiq on Engine Yard on each deployment. To check that Sidekiq is working, I currently SSH into the EY instance and start Sidekiq manually. I want this process to be handled by the EY deployment script (config/deploy.yml). I am used to Mina deployments and EC2.
Engine Yard provides two ways to customize your environment: custom Chef recipes and deploy hooks.
For Sidekiq you will want to use both: a custom Chef recipe to configure and run Sidekiq, and a deploy hook to restart Sidekiq each time you deploy new code.
Engine Yard provides a pre-made example custom Chef recipe for Sidekiq at http://github.com/engineyard/ey-cloud-recipes/tree/master/cookbooks/sidekiq. The documentation on the example recipe also shows exactly what to use for your deploy hook.
To use the custom recipe, you will first need to install the Engine Yard gem locally (gem install engineyard). Then make a copy of their example recipes repository using git clone git@github.com:engineyard/ey-cloud-recipes.git.
Once you've cloned the repository, you will need to add require_recipe 'sidekiq' to ./ey-cloud-recipes/cookbooks/main/recipes/default.rb, then modify the sidekiq recipe as described in the documentation.
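For reference, the change to the main recipe is just one line; something like this (anything else already in your default.rb stays as it is):

# ey-cloud-recipes/cookbooks/main/recipes/default.rb
# Pull the sidekiq cookbook into the main recipe so it runs on the environment
require_recipe 'sidekiq'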
Once everything is complete, you can run ey recipes upload and then ey recipes apply to apply the recipes to your environment. You may need to specify some command-line options, depending on whether the EY gem can guess which application and environment you are applying the recipes to. The output from the ey command should provide you with the information you need.
After you've applied the recipes, you will want to create the deploy hooks inside the git repository where your application resides. Create a 'deploy' directory in the root of your repository and add the after_restart.rb deploy hook as described in the Sidekiq Chef recipe's documentation.
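The recipe's documentation is the authoritative source for that hook's contents, but as a rough sketch it tends to look something like the following (the monit group name is an assumption here; match it to whatever the sidekiq recipe configures for your app):

# deploy/after_restart.rb
# Restart the monit-managed Sidekiq workers once the new code is live.
on_app_servers_and_utilities do
  # "<appname>_sidekiq" is a typical monit group name; adjust it if your
  # recipe sets up a different one.
  sudo "monit restart all -g #{config.app}_sidekiq"
end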
Re-deploy and you should be good to go.
If you run into any problems, please open a ticket with Engine Yard support and we will be happy to assist you.
Related
I am deploying a static site with AWS CDK. This works but as the site has grown, the deployments are failing due to
No space left on device
I am looking for solutions to this problem. One suggestion I have seen is to deploy within a docker container.
How can I do this in CDK and are there any other solutions?
I would advise that you use cdk-pipelines to manage your deployment; that's the best way forward.
But if you have to use a Docker container, I have done something similar (in Jenkins).
Steps...
Create a Dockerfile in your project; this will be your custom build environment, and it should look something like this:
# Base image with Node.js, which the CDK CLI needs
FROM node:14.17.3-slim
# Account/region values that the CDK app reads during synth
ENV CDK_DEFAULT_ACCOUNT=1234 \
    CDK_DEFAULT_REGION=ap-southeast-2
# TypeScript compiler, needed to build the CDK app
RUN npm install -g typescript
Make sure your pipeline installs any npm packages you need
'Build' your project: npx cdk synth
'Deploy' your project: npx cdk deploy --require-approval never
Lastly, you'll need a way to authenticate with AWS so that Bitbucket Pipelines, and specifically the Docker container, can 'talk' to CloudFormation.
But like I said, cdk-pipelines is the best solution, and there are good tutorials for it.
I'd like to use Travis to push a static HTML/JavaScript website to an Amazon S3 bucket on each commit to master. Is there any way to configure my .travis.yml so it doesn't try to run any sort of build process? Just a deploy?
It seems like this is mainly controlled by the language setting which defaults to Ruby, so Ruby is being (unnecessarily) installed on each build.
I don't know how the Ruby box works (I use the Java box for my work); that being said, I think the Travis CI boxes have their base language already installed, so you aren't really installing Ruby unnecessarily each time.
If you want, there supposedly is an undocumented option language: generic.
That way you can just run the required bash commands to deploy your code to Amazon S3.
I cloned universal-starter (the webpack version) and have it up and running on my local machine using npm start and npm run watch, per the instructions.
Now I'm stuck after npm run build and attempting to deploy to Azure (and Google Cloud) via the GitHub integration; I can't figure out how to set up either to work.
Anyone have a recipe on how to get the webpack bundled files to fire up on an external host with express.js? Do I need to run commands via a CI integration? The files in /dist don't seem to stand on their own.
At Netlify you can connect your git repo and tell them what build commands you want them to use. If you specify the "dist" directory, then they will deploy anything that gets in there (after they have compiled your application).
Edit: the lowest tier is free.
Edit2: I am not associated with Netlify. I just used them in my latest deploy, and found the process extremely easy.
Note: This has changed dramatically since Angular 2. While I've now moved on to SSR, Docker, and all kinds of other things, the simplest answer was to:
1) Production build
ng build --prod
2) Transfer the files to a static web host (i.e., I used awscli to connect to an S3 bucket when it was just a static site; I now use SSR, so I need a Node server like Express)
3) Serve the files (there are some complexities around the redirects to index.html required for errors and 404s, and of course setting the status for both redirects to 200)
4) Put something in front of it for performance/SSL/etc.; nginx or a CDN would make sense.
I want to be able to automate Jenkins server installation using a script.
I want, given a Jenkins release version and a list of {(plugin, version)} pairs, to run a script that will deploy a new Jenkins server for me and start it using Jetty or Tomcat.
It sounds like a common thing to do (e.g., needing to replicate a Jenkins master environment or create a clean one). Do you know what the best practice is in this case?
Searching Google only gives me examples of how to deploy products with Jenkins but I want to actually deploy Jenkins.
Thanks!
This may require some additional setup at the beginning, but it could save you time in the long run. You could use a product called Puppet (puppetlabs.com) to automatically trigger the script when you want. I'm basically using it to trigger build-outs of my development environments. As I find new things that need to be modified, I simply update my Puppet modules and don't need to work out again what's needed to recreate the environments for the next go-round.
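If it helps, here is a rough Ruby sketch of the kind of script that Puppet (or anything else) could trigger, driven by a Jenkins version and a {plugin => version} list. The version numbers, target folder, and download URL patterns are assumptions for illustration; check the current Jenkins mirrors before relying on them.

# fetch_jenkins.rb -- illustrative sketch only
require 'open-uri'
require 'fileutils'

JENKINS_VERSION = '2.401.3'            # example LTS version, not prescriptive
PLUGINS = { 'git' => '4.11.0' }        # example {plugin => version} list
DEST = File.expand_path('jenkins_seed')

FileUtils.mkdir_p(File.join(DEST, 'plugins'))

def download(url, path)
  # Stream a remote file to disk.
  URI.open(url) { |remote| File.binwrite(path, remote.read) }
  puts "fetched #{path}"
end

# Jenkins core WAR -- this is what you would then deploy to Jetty or Tomcat.
download("https://get.jenkins.io/war-stable/#{JENKINS_VERSION}/jenkins.war",
         File.join(DEST, 'jenkins.war'))

# Pinned plugins; the .hpi files go into JENKINS_HOME/plugins before startup.
PLUGINS.each do |name, version|
  download("https://updates.jenkins.io/download/plugins/#{name}/#{version}/#{name}.hpi",
           File.join(DEST, 'plugins', "#{name}.hpi"))
end

From there, Puppet can copy the WAR into Tomcat's webapps directory (or run it standalone with java -jar jenkins.war) and drop the plugins into JENKINS_HOME.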
It's hard to formulate the question, actually, so I'll just explain the situation.
I'm working on an application that consists of multiple sub-applications. The main app just provides a navigation bar and some basic functionality, like configuration of users and permissions, while the sub-applications provide the actual functionality.
Now, this is a Rails 2 application and the sub-applications get embedded in frames; it's not a really nice design and is pretty complex to set up.
Fortunately we have Engines now and that would be the saner solution for this application.
Until now everything has lived in Subversion and can be updated at once; shared code uses externals. We would like to move to git while we're restructuring and refactoring anyway.
I've been searching the web the past few days about Bundler, git submodules, and git subtrees, but I haven't found a good description of how to properly manage a large project which consists of multiple Engines/Gems when you are developing on all of them at the same time.
In particular I would like to be able to:
use Bundler to manage dependencies
not install our own Gems and Engines into the global gem path, but keep them relative to the main app, as a git repository
have our own Gems and Engines setup as git repository (maybe with Bundler's local path override)
an easy way to fetch all dependencies (bundle install) which pulls the latest version of our own Gems and Engines; if that's not possible, then one command to git pull all of our own Gems and Engines (maybe a Rake task?)
make it easy for new developers to set up the entire development environment fast (git clone the app, bundle install dependencies including all of our own Gems and Engines, locally)
deploy with Capistrano, easily
What I already thought about:
including everything in one repository: this seems to defeat the purpose of separate Gems/Engines for me, and I also think it wouldn't allow us to manage the main app's dependencies on our Engines via Bundler
using submodules: I've read too many posts about why they're bad, and with our number of developers it's only a matter of time until somebody commits a submodule pointer to a commit that only exists in their local repo
the git subtree utility: seems quite complex to me
So, does anybody have a similar setup, and how do you manage it to make updating and committing changes as easy as possible? Where do you put the Engine/Gem code that the application depends on?
TL;DR: How do you manage a large Rails project which consists of multiple Engines and Gems?
We have a similar (but probably less complex) case at my company. Here is what we do (for now), and what could work for you too:
Put your Rails app in its own git repository. The various gems each get their own repository as well (while it is possible to do otherwise, the "one gem = one git repository" rule will make your life easier).
Then in your Rails app Gemfile, you have several options
The default should be to refer each gem to its git repository (so that Bundler will load them from there).
When working locally on some of the gems and the Rails app, either change the Gemfile to use the local path (http://gembundler.com/v1.2/gemfile.html) or, better, override the path locally (see: http://gembundler.com/v1.2/git.html). Be careful that those two options are different: the first one uses the path, the second the local git repository (so a new uncommitted change will be visible to the first, but not to the second). A sketch of both options follows.
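Here is what those two options can look like in the main app's Gemfile (the gem name, URL, and paths are made up for illustration):

# Gemfile of the main Rails app

# Normal case: load our own engine from its git repository
gem 'billing_engine', git: 'git@git.example.com:ourteam/billing_engine.git',
                      branch: 'master'

# Option 1 (local path): swap the line above for a relative path while hacking
# gem 'billing_engine', path: '../billing_engine'

# Option 2 (local git override): keep the git line, and point Bundler at a
# local clone without editing the Gemfile (the branch: option above is
# required for this to work):
#   bundle config local.billing_engine ../billing_engine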
For updating all your gems easily, I would create a small .sh script (just to launch the various clone or update operations, plus bundle install so that everything comes out clean) and commit it with the main app; a rough sketch of that idea follows below. I would also agree on a "standard folder organization" within the team (i.e., everyone uses a base folder of their choice, with folders under it for the Rails app and each gem) to make the procedure easier.
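Since the question mentions a Rake task, here is that idea sketched as one instead of a shell script. The repo name/URL and the sibling-folder layout are assumptions:

# lib/tasks/workspace.rake in the main Rails app -- illustrative only
# Assumes each in-house gem/engine is cloned next to the app,
# e.g. ~/work/main_app, ~/work/billing_engine, ...
ENGINE_REPOS = {
  'billing_engine' => 'git@git.example.com:ourteam/billing_engine.git'
}

namespace :workspace do
  desc 'Clone or update all in-house gems/engines, then bundle install'
  task :update do
    base = File.expand_path('..', Rails.root)
    ENGINE_REPOS.each do |name, url|
      dir = File.join(base, name)
      if Dir.exist?(dir)
        sh "git -C #{dir} pull --ff-only"   # update an existing clone
      else
        sh "git clone #{url} #{dir}"        # first-time checkout
      end
    end
    sh 'bundle install'
  end
end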
I hope this can help or get you ideas (your question is quite complex and manifold, so I'm not 100 % sure this is what you are looking for).
How to manage your Gem dependencies?
Bundler via Gemfile.
How to manage your Engines?
Bundler via Gemfile.
Architect your Engines as Gems and provide their git repo location in your Gemfile. If you want examples, check out how to include the https://github.com/radar/forem gem in your Gemfile.
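For instance, the Gemfile entry for forem (or for one of your own engines, with a made-up name and URL here) would look roughly like this:

# Gemfile
gem 'forem', git: 'https://github.com/radar/forem.git'
# Your own engine, hosted wherever your team keeps its repositories:
# gem 'admin_engine', git: 'git@git.example.com:ourteam/admin_engine.git'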
Also, this helped me learn Rails Engines, http://edgeguides.rubyonrails.org/engines.html.
Are you coming from Java Land?
Rails does have a learning curve, but not like the Java mountain cliff drop off.