Difference between deb and src - odoo

I use the deb file for production and the source for development. Is this the correct way to do things?
I think that the deb might have certain optimizations (.pyo or .pyc files) for the production environment.
But since I have to move my custom modules to production one at a time, I find it increasingly difficult.
The actual addons path is here
(1) /usr/share/pyshared/openerp/addons
But the init.d points to
(2) /usr/lib/pymodules/python2.7/openerp/addons
In some modules the __init__.py is in (1) (e.g. web_rpc),
and for some it's in (2) (e.g. hr).
What is the actual difference between
http://nightly.openerp.com/6.1/nightly/src/
and
http://nightly.openerp.com/6.1/nightly/deb/

I haven't tried the deb files, because we use the Ubuntu all-in-one script from openerpappliance.com. It downloads the source from Launchpad and then runs the deployment scripts for you. It will also do updates after you've installed.
We're very happy with the 5.0 version, but we haven't tried the 6.1 version, yet.

What you can do with 6.1 is give multiple addons paths in your config file, comma-separated. Alternatively, you can keep your customized modules wherever you want and just put a link (shortcut) to them in the addons folder. This will give you flexibility.
Thank you.
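For example (the paths below are illustrative stand-ins, not from a real install), the config entry takes a comma-separated list, and the symlink approach looks like this:

```shell
# The config file accepts several addons directories, comma-separated, e.g.:
#   [options]
#   addons_path = /usr/share/pyshared/openerp/addons,/opt/custom_addons
#
# Alternatively, symlink a custom module into the stock addons directory.
# Demonstrated here with temporary stand-in directories:
addons=$(mktemp -d)          # stands in for the stock addons path
custom=$(mktemp -d)          # stands in for where you keep custom modules
mkdir "$custom/my_module"
ln -s "$custom/my_module" "$addons/my_module"
ls -l "$addons/my_module"
```

Either way the server sees the custom module as if it lived in the stock addons folder.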

Related

No detect the external library (Phpoffice) in yii2

Two computers are working on the SAME repository. The first computer detects the library and works well, but the second does not detect it and shows "Error 'PhpOffice\Phpspreadsheet\Reader\Xlsx' not found".
The library also exists in vendor.
composer.json and composer.lock are also the same on both computers.
One thing: because of .gitignore, I used Yii's composer mechanism on the first computer, but on the second computer (the one with the error) I added the library manually.
If you want to use a Composer package, you absolutely need to install it using Composer. This ensures that the autoloader is generated properly and your class can be found by PHP.
Copying library files into the vendor directory is not enough to install it. During installation, Composer creates an autoload script with information on how to find all classes it installed. If you just copy library files, Composer will not even know they exist and will not be able to load any class from them.
If you cannot use Composer on server/computer A, you should install all dependencies on a different computer (B) and copy the entire vendor directory to server/computer A. The autoload definitions live in vendor, so it should work if you copy the whole directory.
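As a sketch of the fix on the broken computer (run from the project root, and assuming Composer is available there):

```shell
rm -rf vendor               # drop the manually copied files
composer install            # reinstall from composer.lock and regenerate vendor/autoload.php
composer dump-autoload -o   # optional: rebuild the optimized class map
```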

How do I avoid absolute pathnames in my code when using Git?

Up till this point in my programming career, I have mostly worked on small projects, editing the PHP/Perl/Python files directly on the Linux dev host, then copying the files to the same directory on a live server to push them into production.
I'm trying to set it up now by learning how to use Git and also how to properly write code with Git in mind. I want to learn the best practices for this scenario:
Here's a project called "BigFun", which also uses some libraries with tasks that are common to many projects.
/home/projects/BigFun/bin/script1.pl
/home/projects/BigFun/bin/script2.pl
/home/projects/BigFun/bin/script3.pl
/home/projects/BigFun/lib/FunMaker.pm
/home/projects/common/lib/Utilities.pm
/home/projects/common/lib/Format.pm
If this program is not managed by Git, and if I know that the files will never be moved around, I could do something like this:
#!/usr/bin/perl
use warnings;
use strict;
use lib '/home/projects/BigFun/lib';
use FunMaker;
use lib '/home/projects/common/lib';
use Utilities;
But once this is managed by Git, anyone will be able to check these files out and put them in their own development space. The hardcoded paths will not work anymore if your development root is "C:\My Projects\BigFun".
So, some questions:
I can probably assume that the BigFun "lib" directory will always be relative to the "bin" directory. So maybe I could change the use lib line to use lib '../lib'; . Is that the right solution, though?
It seems likely, though, that this example code I've listed would be split up in to two repositories - one for BigFun, and the other as a "common" repo containing some tools that are used by many projects. When that happens, it seems to me that the BigFun code would have no way of knowing where to find the "common" libraries. /home/projects/common/lib is not at all guaranteed to work, and nor would ../../common/lib. What's the normal Git way to handle something like this?
I'm working my way through the "Pro Git" book, but I haven't (yet) found anything to answer these questions. Thanks to anyone who can point me in the right direction!
Your question is not about Git; it's about collaboration.
Absolute paths force all users of your piece of software to use the same directory layout, and that's unacceptable. No decent software does that. Avoiding absolute paths is the norm, regardless of what version control system you use, or whether you use one at all.
How to make your software work using strictly relative paths and never absolute paths? That depends on the software/framework/language.
For relative paths to make sense, you need to consider the question: relative from where? Here are some ideas for the anchor from which relative paths could be resolved:
current working directory
user home directory
package installation directory
framework installation directory
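In Perl, for example, the script's own directory is a common anchor: the core FindBin module resolves it at runtime, so a checkout works from any root. A minimal sketch, assuming the bin/lib layout from the question:

```perl
#!/usr/bin/perl
use warnings;
use strict;
use FindBin qw($Bin);    # $Bin = absolute directory containing this script
use lib "$Bin/../lib";   # e.g. .../BigFun/bin/../lib, wherever the checkout lives
print "added $Bin/../lib to \@INC\n";
```

This removes the hardcoded /home/projects prefix without requiring any packaging yet.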
Every language typically has some packaging mechanism. The goal of packaging is that developers in the language can create a standard package, the contents of which are organized in such a way that the standard tools of the language can install it: adding the software to the system-wide libraries of the language, to custom user libraries, or to a specified library location. Once the software is installed from a standard package, it becomes ready to use just like any other installed software.
In your example, use warnings; and use strict; work without any setup because these libraries are installed with the system. The system finds their location relative to the installation directory of Perl, roughly speaking.
So what you need to do is:
Figure out how to package a Perl library
Figure out how to install a Perl package
Once your FunMaker and Utilities are installed as standard Perl packages, you will be able to simplify your script as:
#!/usr/bin/perl
use warnings;
use strict;
use FunMaker;
use Utilities;
You will of course have to document the dependencies of the script (FunMaker, Utilities), as well as how to install them (especially the location where these packages will be hosted).
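For example, with the common cpanfile convention the script's dependencies are declared in one place (the module names here are the question's own; where they are hosted still needs documenting separately):

```
requires 'FunMaker';
requires 'Utilities';
```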

Can somebody please explain OpenERP 7's installation on Ubuntu in order to have several different versions on the same machine

I have a few doubts on setting up several copies of OpenERP 7 on Ubuntu 14.04.
E.g.
If I have extracted all of these versions into my /opt/openerp/,
/opt/openerp/server [old unpatched version]
/opt/openerp/server_231025 [Old unpatched version of openerp 7]
/opt/openerp/server_231303 [latest patched openerp 7 version]
/opt/openerp/odoo_8 [Odoo version]
Now how do I proceed from here.
1. Do I have to copy each of their openerp-server.conf files into /etc/ and rename each one to make them distinct from one another?
2. If I just want to autostart and autostop the first two and not the other two, how do I do it?
3. Why do we put openerp-server.conf in /etc/ even though it is already there under /opt/openerp/server/install/openerp-server.conf?
4. Is it compulsory to put openerp-server.init from /server/install/ (or a modified version of it) into the /etc/init.d/ folder even though we do not want the openerp-server service to autostart and autostop? Is that what the init.d folder does, help autostart and autostop application services? Or is this step necessary to do a sudo service openerp-server start/stop/restart?
5. And what server does OpenERP use, gunicorn or a custom WebDAV-based server?
6. Where exactly in the config file do we mention the role we created for the OpenERP server to use?
A detailed explanation would be really helpful and greatly appreciated.
Thanks a ton in advance.
Please also take a look at my other questions; any answers there are even more greatly appreciated, with more kudos points.
Regards,
Vyas Senthil
We do it by not using the packages; instead we just extract the tar.gz files into the directories we want, e.g. /opt/rel_1, /opt/rel_2. Each directory also includes its configuration file.
We then have one start script in /etc/init.d per instance and if required, one virtual environment per instance. You need to set up the start scripts yourself this way but they are pretty simple. As long as you use consistent path names inside the install directories it is pretty much a copy/paste exercise.
Whether to auto-start or not is up to you, via the standard tools on Ubuntu (update-rc.d).
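A sketch of the autostart half, assuming you created init scripts named openerp-rel_1 through openerp-rel_3 in /etc/init.d (the names are hypothetical):

```shell
sudo update-rc.d openerp-rel_1 defaults   # autostart/autostop this instance at boot/shutdown
sudo update-rc.d openerp-rel_2 defaults
sudo update-rc.d openerp-rel_3 disable    # keep the script for manual "service ... start", no autostart
```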
Assuming a recent OpenERP (6.1 or greater), OpenERP/Odoo has werkzeug baked in, but for production you really want to use a WSGI setup such as Nginx/Gunicorn or Apache/mod_wsgi. I find nginx/gunicorn pretty simple, but I don't really have any Apache experience so can't comment. In Odoo 8 they seem to have included a multi-process option and gevent, but I have yet to see any documentation on this.
Where you put the config file is up to you; just refer to it with the --config switch when you start OpenERP and it will work.
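For example, each instance just points at its own file (the paths follow the /opt/rel_N layout and are illustrative):

```shell
/opt/rel_1/openerp-server --config=/opt/rel_1/openerp-server.conf
/opt/rel_2/openerp-server --config=/opt/rel_2/openerp-server.conf
```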

Dropbox selective syncing - pattern matching?

I'm using Dropbox on a daily basis and put my programming projects in there.
It works great, but now that I have many projects my node_modules directories are putting
a strain on Dropbox: its syncing process becomes slow and eats up CPU time.
Is there any way to do a selective sync based on directory name or a mask pattern?
Would be nice to have to a .gitignore equivalent to configure.
Any 3rd party software for that task?
There is a way to selectively sync but I don't believe it has any advanced rules like you're describing:
https://www.dropbox.com/help/175/en
Two ways to resolve this problem:
1. Put node_modules one level above the project directory in the file tree. For example:
Project dir: c:/prj/myProjWrapper/myProj
Put package.json in c:/prj/myProjWrapper and run npm install there; Node.js will find the modules by searching parent directories.
2. Windows and Linux only, not for Mac! In the project dir create a .ds_store folder (Dropbox does not sync it). Put package.json into it and run npm install there. You must set NODE_PATH=./.ds_store/node_modules;. when starting Node.js.

What is stored in Packages/User directory?

How to save/restore Sublime Text 2 configs/plugins to migrate to another computer? states that, to back up a Sublime Text 2 installation, a user should preserve the ~/Packages/User directory (from the user's local data folder on whatever OS they're using).
However, http://andrew.hedges.name/blog/2012/01/19/sublime-text-2-more-sublime-with-a-drop-of-dropbox and most other walkthroughs for using Dropbox to sync Sublime's settings specify three directories: ~/Packages, ~/Installed Packages and ~/Pristine Packages.
What is the functional difference between backing up just ~/Packages/User, and the other 3 directories?
I think that Packages/User is the one in which you are supposed to put settings (according to Sublime's official and unofficial documentation). However, some people put them in the other folders from time to time.
The Dropbox advice may be a hedge against poor practice.
From here:
Installed Packages is:
You will find this directory in the data directory. It contains a copy of every sublime-package installed. Used to restore Packages.
These are the packages installed as sublime-packages. I don't think package control uses this, but if you install something as a sublime-package maybe you want to keep it?
Pristine Packages is:
You will find this directory in the data directory. It contains a copy of every shipped and core package. Used to restore Packages.
So essentially a list of .sublime-package files used to restore if you break something.
Packages is:
The packages used by Sublime Text, either installed as part of sublime, or the plugins.
User is:
The user directory is your personal directory, containing configurations, additional snippets, etc.
Below are my personal views on what to save, so feel free to ignore it if you would like.
I would have to agree with the post saying just save the User directory, as Package Control will grab all of the plugins in the list if they aren't already installed. I didn't see this mentioned in that post, but you can also add repositories (by specifying a URL) to Package Control, which allows you to install Packages outside of those submitted to Package Control, but still hosted somewhere. One of the arguments I can see to saving the Packages directory completely is if you are using plugins that aren't hosted anywhere (though these could probably be moved to the Packages directory without any problems).
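Adding a repository is done in Package Control's user settings file (Packages/User/Package Control.sublime-settings); the URL below is a placeholder, not a real plugin:

```
{
    "repositories": ["https://github.com/your-name/your-plugin"]
}
```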
The Installed Packages and Pristine Packages are used to restore packages, so I wouldn't think these would be needed, but I'm sure there is some use case where it is.
Anyway, I realize I got a bit off topic at the end there, but I hope everything before that helps clarify.