Can I deploy OpenERP on Heroku? - odoo

Just wondering, can I deploy OpenERP (Odoo) on Heroku and use Postgres as its DBMS? Has anybody done this before?
Looking forward to your responses.

Two years late, but it is now possible. Shameless plug:
https://github.com/odooku/odooku
As sepulchered said, file storage is one of the first problems.
This can be solved by using S3 as a fallback in combination with a large /tmp cache on Heroku.
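Roughly, that fallback looks like the sketch below; it assumes boto3, a bucket of your own, and helper names of my choosing rather than Odoo's actual filestore hooks.

import os
import boto3

BUCKET = "my-odoo-filestore"        # hypothetical bucket name
CACHE_DIR = "/tmp/filestore-cache"  # Heroku's ephemeral disk

s3 = boto3.client("s3")

def read_attachment(key):
    # Prefer the local /tmp cache, fall back to S3 on a cache miss.
    cached = os.path.join(CACHE_DIR, key)
    if not os.path.exists(cached):
        os.makedirs(os.path.dirname(cached), exist_ok=True)
        s3.download_file(BUCKET, key, cached)
    with open(cached, "rb") as fh:
        return fh.read()

def write_attachment(key, data):
    # Write through: keep the cached copy and push to S3 so the file
    # survives a dyno restart.
    cached = os.path.join(CACHE_DIR, key)
    os.makedirs(os.path.dirname(cached), exist_ok=True)
    with open(cached, "wb") as fh:
        fh.write(data)
    s3.put_object(Bucket=BUCKET, Key=key, Body=data)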
Second problem: database permissions. For now I've patched Odoo to work with a single database. You can also use AWS RDS with Heroku, which solves the single-database problem completely.
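If you stay on Heroku Postgres, the single database you get is described by the DATABASE_URL environment variable. A minimal sketch of mapping it onto Odoo-style db_* settings (check the exact option names for your Odoo version) could look like this:

import os
from urllib.parse import urlparse

# e.g. postgres://user:password@host:5432/dbname
url = urlparse(os.environ["DATABASE_URL"])

db_config = {
    "db_host": url.hostname,
    "db_port": url.port or 5432,
    "db_user": url.username,
    "db_password": url.password,
    "db_name": url.path.lstrip("/"),  # the one database you are locked to
}
print(db_config)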
Third problem: long polling runs on a secondary port. However, Odoo can be run in "gevent mode", which is also currently being patched for better compatibility with Heroku's timeouts.
Fourth problem: Heroku's Python buildpack is insufficient for compiling Odoo's dependencies. This was easily fixed with a custom buildpack.
Hope this helps anyone in the future.

Well, actually no, but maybe.
Here is why:
OpenERP requires access to the filesystem, and Heroku (as far as I know) doesn't provide persistent storage.
The PostgreSQL add-on for Heroku applications doesn't give you the ability to create databases (and OpenERP creates one database for each company instance).
But I think you can install it on Heroku by collecting its dependencies in a requirements.txt and providing that.
Then you'll have to do something about file storage; I think it's possible to add a feature to OpenERP (as it's open source) for storing files on a remote server (cloud storage, etc.).
And last, you'll have to provide a PostgreSQL server with permission to create databases (I think there are cloud solutions for that).
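If you're not sure whether the Postgres role your provider gives you may create databases, a quick check along these lines will tell you (a sketch assuming psycopg2 and a connection string in DATABASE_URL):

import os
import psycopg2

conn = psycopg2.connect(os.environ["DATABASE_URL"])
with conn.cursor() as cur:
    # Ask Postgres whether the current role has the CREATEDB attribute.
    cur.execute("SELECT rolcreatedb FROM pg_roles WHERE rolname = current_user")
    can_create = cur.fetchone()[0]
conn.close()

print("CREATEDB allowed" if can_create else "single-database setup only")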
PS: OpenERP is not intended to be installed on cloud platforms; the easiest way to deploy it is on some sort of server (e.g. a VPS) where you control the filesystem and the database server.
Hope it helps somehow.

Related

Liferay Cloud IDE, multiple developers working on the same Liferay server

We want to start working with Liferay, but the server is too heavy and the developers' computers don't have enough RAM. We want to centralize the server instance.
In other words, we want to build a development server where all developers can connect and develop directly in their web browsers, compile, view the result, and push the code to a Git repository.
I found some good cloud IDEs like Eclipse Che and a good Maven archetype for Liferay projects, so I can build the project with Maven. But now I want to know whether it is possible to configure Liferay so that every developer can work without disturbing the others. And if so, how?
The developers could share the same database and use different ports. Maybe the server could generate temporary URLs like some online cloud editors do.
I found the post Liferay With Multiple Server Instances, but I don't think it's the best way because it creates one server per project, which I think is too heavy.
If necessary, we have Kubernetes in our IS.
Liferay's Tomcat bundle is configured by default to take a maximum of 2.5G for the process, but it can run with far less - the default was only recently bumped up, because many people never change it and then wonder why production systems run out of memory. For one concurrent user (the sole developer) on a machine, I'd guess the previous default of 1G of heap space is enough. Are you saying that's too much for your developers' machines?
Having many developers on a shared server poses one problem: yes, you can deploy different code from different machines, but how about setting a breakpoint? Can you connect with multiple debuggers? If something fails, how do you know whose recent deployment caused the failure?
Sharing a server is an integration technique, not a development technique. If your developers don't have enough memory available to run their own Liferay server next to their IDE, it's a lot cheaper to upgrade their machines than to slow them down when everybody is accessing the same server and nobody can properly debug. You pay for the memory once, but you pay your waiting developers by the hour.
Is it possible to share one server? Sure it is.
Is it possible to share one server without troubling each other? I doubt it.
When you say you think it's too heavy: what are you basing that assumption on? What do the actual developer machines look like, and what keeps you from investing in the extra memory?
It's trivial to share some infrastructure - i.e. have all of them connect to the same database server (and give everyone their own schema). But just the extra effort and setup might require you to pay the developers by the hour as much as you'd otherwise pay for a couple of memory chips.
And yet another option is: run Liferay on a remote server, but keep one instance per developer. This way you don't need the local memory, but can have the memory in the cloud. Calculate whether you pay more for remote cloud machines than for local memory - that decision is up to you.

How to rapidly publish a web role cloud service, uploading only binaries and avoiding a full restart of the VM?

Possible ways to accomplish it:
Creating a dedicated WCF service for this purpose (currently my favorite option)
Using the REST API?
Azure PowerShell?
Explanation:
Publishing a web-role cloud service takes about 10 minutes. That's much too long during development - I try to do as much as I can offline, unit-test-style and modular, but it's impossible to avoid development cycles against the VM altogether.
Apparently, the long time is mostly a result of the machine being completely restarted, so I'm trying to find an automated solution, like uploading and installing only the binaries.
What is the best way to accomplish it?
What do you think? Would it cut at least 50% off the publishing time?
Do you expect any critical problems?
The solutions proposed below are definitely against best practices and should NEVER-EVER be used in a production environment.
If your objective is to quickly test your changes in your development environment, there are two ways you can go about it.
Enable RDP and copy your files manually: you could enable Remote Desktop on your web role and copy your modified binaries or other files directly into the appropriate folders on the VM.
Use Web Deploy: this only works for the web roles in your project, but you could enable Web Deploy on your web roles and use it for faster deployments. Please see this link for more details on how to use this feature: https://msdn.microsoft.com/en-us/library/azure/ff683672.aspx.

Is there a general way to get difficulty from any cryptocurrency coin?

Is there a way to get the difficulty for any coin, even if there isn't a blockchain site like http://blockchain.info/ (they have an API)? I need to access it programmatically, and I want to get it from the source, so ripping it from a site that already lists them all isn't an option. I'm using an Ubuntu VPS, so RAM and especially disk space are limited; hence I can't have a lot of blockchains installed on it.
Most if not all coins have daemons you can use. You can run the daemon on the server and make requests to it. They should all be similar to the PHP one.
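For the bitcoind-style daemons this usually means a JSON-RPC call. A minimal sketch in Python, assuming the coin's daemon supports the usual getdifficulty method and that the port and credentials below are replaced with the ones from your coin's .conf file:

import requests

RPC_URL = "http://127.0.0.1:8332"      # placeholder RPC port
RPC_AUTH = ("rpcuser", "rpcpassword")  # placeholder credentials from the .conf

payload = {
    "jsonrpc": "1.0",
    "id": "difficulty-check",
    "method": "getdifficulty",
    "params": [],
}

resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH)
resp.raise_for_status()
print(resp.json()["result"])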

Hosting a Rails Application on Linode

I'm planning to host a Rails application on Linode, but I'm still unsure about the requirements and the deployment process. I'm only getting the 512 plan since I'm expecting relatively small traffic for the site.
My question is, do I need a repository such as GitHub to store my code? I'm also a bit concerned about how long it takes to set up the server and the deployment process. I've browsed through the Linode library, but I'm not entirely clear on how to deploy Rails apps. I'm planning to use nginx as my server and Passenger for deployment. Does anyone know where I can learn to deploy Rails applications on a Linode machine? A step-by-step tutorial with detailed explanations would be great. Thanks!
I've deployed a couple of simple applications on Linode and found their documentation to be excellent. In particular they have step-by-step tutorials tailored to specific environments. For example, in my case (like you) I wanted to use nginx, and I was using Ubuntu 10.04, so I followed this guide:
http://library.linode.com/frameworks/ruby-on-rails-nginx/ubuntu-10.04-lucid
If it's your first time setting up on a VPS there will be some hurdles certainly, but I found the experience to be very rewarding.
Regarding hosting your code, you have a number of options, but keep in mind that this is really a separate issue from deploying your app. You deploy your app on Linode, but you don't have to host your code there, although you certainly can.
In general terms, if you're okay with making your code open, then GitHub is certainly a good choice. If you want to keep the code private but still have access to it online (rather than just on one computer), you can take advantage of your Linode machine and host your code there.
If you will have a number of other people contributing to the codebase, you might consider setting up gitosis or gitolite, which make this easy to do. Alternatively, if you will be the main contributor to the codebase, you can set up a simpler configuration over HTTP, explained here: http://dev.bazingaweb.fr/2011/02/23/how-to-set-up-git-over-http.html
Linode also has documentation on setting up a remote git repository: https://library.linode.com/linux-tools/version-control/git
If you're choosing between gitosis and gitolite, I'd go with gitolite, since gitosis appears to have been abandoned and is no longer actively maintained.
Other references on deploying on Linode:
http://infinite-sushi.com/2011/01/deploying-a-rails-app-to-a-linode-box/
http://blog.chris-spencer.co.uk/from-zero-to-git-deployment-on-linode
Ryan Bates has a great videocast on deploying Rails apps to... Linode! Today's your lucky day :) Grab some popcorn and enjoy: http://railscasts.com/episodes/335-deploying-to-a-vps
You don't need a GitHub account to deploy on Linode. The deploy process happens between your local machine and the Linode servers, usually by means of the Capistrano gem.
This tutorial from Smashing Magazine is pretty good. http://coding.smashingmagazine.com/2011/06/28/setup-a-ubuntu-vps-for-hosting-ruby-on-rails-applications-2/
A good script for installing nginx, PostgreSQL, Postfix, Node.js, and rbenv, and for adding a deployer user - refer to this link: https://medrails.wordpress.com/?blogsub=confirming#subscribe-blog
Thanks

Redis server requirements for BlueDomino hosting

I have a Python/Redis service running on my desk that I want to move to my Blue-Domino-hosted site. I've got Python available on the server, but not Redis. They don't give me root access to my Debian VM, so I can't git-clone, extract, and install it myself from a Unix prompt.
Their tech support might do the install for me, but they need me to point them to the server requirements, which I don't see on the Redis download page.
I could probably FTP binaries to the site if they were available, but that's dicey.
Has anyone dealt with this?
Installing Redis from source is actually quite easy. It doesn't have any dependencies, so just download the tarball, unpack it, and follow the install instructions. I'm always wary of doing that sort of thing, but with Redis it really was a breeze. If you don't dare do it yourself, their tech support should be able to.
If it is an Intel/AMD server, you can compile Redis somewhere else (a 32-bit version, for example) and upload it as a binary. Then start it with Python. I did this myself a couple of weeks ago.
For the port you will need to use something over 1000; I don't recommend using the default port. Remember to change the loglevel too. Daemonize works well as non-root as well.
Some servers block all external ports, so you will not be able to connect to Redis from outside, but this is only a problem if you connect from a different machine. From the same machine it should be OK, since it's "internal".
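Roughly, starting the uploaded binary from Python and talking to it could look like the sketch below, assuming the redis-py package; the paths, port, and logfile are placeholders.

import subprocess
import time
import redis

# Launch the binary compiled elsewhere: non-default port, quiet loglevel,
# daemonized, all without root.
subprocess.run([
    "./redis-server",
    "--port", "6380",
    "--daemonize", "yes",
    "--loglevel", "warning",
    "--logfile", "/home/me/redis.log",
], check=True)

time.sleep(1)  # give the daemon a moment to bind the port

r = redis.Redis(host="127.0.0.1", port=6380)
r.set("hello", "world")
print(r.get("hello"))  # b'world'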
However, I am unsure how the hosting administrator will react when he sees the process running :) I personally would kill it immediately.
There is another option as well - check out a service like Redis4you.com. But their free account is small; you will probably need to spend some money for more RAM.
Is your hosting provider looking for a minimum set of system requirements for running Redis? This is indeed not listed on the Redis website, probably because there aren't many exotic requirements. It also depends a lot on your use case. Basically, what you need to run Redis is:
Operating system: Unix-like; Linux is recommended (one reason to favor Linux I've heard is the performance of its TCP/IP stack).
Tools: GCC, make, (git).
Memory: lots (no, seriously, this depends on your use case, but because Redis keeps everything in memory you need at least more RAM than the size of your dataset - see the sketch after this list).
Disk: disk access for making snapshots.
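As a quick way to size that, you can ask a running instance how much RAM your dataset actually takes; a small sketch assuming redis-py and a locally reachable instance:

import redis

r = redis.Redis(host="127.0.0.1", port=6379)
info = r.info("memory")  # the INFO memory section

print("keys:", r.dbsize())
print("used_memory_human:", info["used_memory_human"])
print("used_memory_peak_human:", info["used_memory_peak_human"])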
The problem seems to be dealing with something non-traditional in my BlueDomino hosting. Since this project is a new venture, I think the best course for me is to rent a small Linux VM from Rackspace and forget about the BD hosting.