Odoo v10 HR payroll community version

Hello guys, I am working on Odoo v10 community version. I have installed my own module for HR payroll and leave management, with a database containing records. My question is: I want to install my module on another machine. How can I achieve that? Are the database entries stored in the same module folder?
I hope someone can clear my doubt.
For more clarity, these are the steps I'm planning to do:
1) Install the payroll module on my system.
2) Edit all the entries as I want.
3) Copy this payroll module to another machine.

Do you want a set of default records that always installs with your module?
If so, you want to load these into a data file that will make these a permanent part of your module.
You can read about data files in the documentation and/or review how Odoo does this in the core code.
If you only need to get the records into another database, you can Export from the first database and Import into the next.
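In practice a data file is just an XML or CSV file listed under the data key of the module manifest, so the records ship with the module itself. A minimal sketch of an Odoo 10 manifest, where the module name and the data file name are assumptions:

```python
# __manifest__.py -- minimal sketch of an Odoo 10 module manifest.
# Module name and file names below are assumptions, not your actual module.
{
    'name': 'My HR Payroll Customizations',
    'version': '10.0.1.0.0',
    'depends': ['hr'],
    # Files listed under 'data' are loaded every time the module is
    # installed or upgraded, so these records travel with the module.
    'data': [
        'data/hr_payroll_data.xml',   # e.g. default salary rules/structures
    ],
    'installable': True,
}
```

Anything declared there is loaded on every install and upgrade of the module, so the records become part of the module rather than living only in your original database.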

You have several options; choose the one that best fits your situation:
If you want to pass information from one database to another, you can use Odoo's export option:
1) Go to the tree view of the model you want to export and select the record(s).
2) Expand the More option and click on Export.
3) Select the fields you want to export and save the CSV file that will be used to import.
To import, go to the tree view, click the Import button that appears next to the Create option, and follow the steps indicated.
If you want the data in your modules to be created every time they are installed, you must create data.xml files.
You can also export the database directly; visit this page to see the steps to do it from Odoo:
Create-backup
I hope I've helped.
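If the screen-by-screen export gets tedious, another option is Odoo's external XML-RPC API, which can read records from one database and create them in another. A rough sketch, where the URLs, database names, credentials, model and fields are all placeholders:

```python
import xmlrpc.client

# Connection details are placeholders -- adjust them for your own servers.
SRC = {'url': 'http://localhost:8069', 'db': 'payroll_src', 'user': 'admin', 'pwd': 'admin'}
DST = {'url': 'http://otherhost:8069', 'db': 'payroll_dst', 'user': 'admin', 'pwd': 'admin'}

def connect(cfg):
    """Authenticate and return (uid, models proxy) for one Odoo instance."""
    common = xmlrpc.client.ServerProxy(cfg['url'] + '/xmlrpc/2/common')
    uid = common.authenticate(cfg['db'], cfg['user'], cfg['pwd'], {})
    models = xmlrpc.client.ServerProxy(cfg['url'] + '/xmlrpc/2/object')
    return uid, models

src_uid, src_models = connect(SRC)
dst_uid, dst_models = connect(DST)

# Example: copy a couple of plain fields of one model between databases.
MODEL = 'hr.contract'
FIELDS = ['name', 'wage']  # only fields that exist on both databases

records = src_models.execute_kw(SRC['db'], src_uid, SRC['pwd'],
                                MODEL, 'search_read', [[]], {'fields': FIELDS})

for rec in records:
    vals = {f: rec[f] for f in FIELDS}
    dst_models.execute_kw(DST['db'], dst_uid, DST['pwd'], MODEL, 'create', [vals])
```

Note that relational fields such as many2one would need to be remapped on the target database; the CSV Export/Import route handles that for you through External IDs.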

Related

Prestashop 1.7: Can I use another database just for executing a side script?

I have created a Prestashop website with a database N°1. For some reason, I want to write a script that I'll execute a few times and that will simply fill a database N°2 with some data. This script has nothing to do with my Prestashop website, but it is part of the same project and therefore must be hosted on the same server, and moreover must be present within the Prestashop website's files (under a module directory I've made).
My question is: is it possible to use this database N°2 (only for this script)? Can I use Prestashop's Db.php class to do it?
Don't hesitate to ask for more information if I'm not clear.
So if I am reading your post correctly, you want to have two databases.
For example "prestashop_n1" and "prestashop_n2".
prestashop_n1 contains all the required Prestashop tables.
prestashop_n2 contains data extracted from prestashop_n1.
In order to extract data from prestashop_n1 into prestashop_n2, you could write a module.
In this module you can create a database connection to the prestashop_n2 database.
You can use PHP to copy data from prestashop_n1 to prestashop_n2.
Alternatively, you can add a cron job to execute your script for copying data x times a day.
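Inside Prestashop you would write this copy logic in PHP (for example with its Db class or a plain PDO connection), but the idea is the same in any language. A sketch of the copy step in Python with pymysql, where all connection details, tables and columns are assumptions:

```python
import pymysql  # the same logic can be written in PHP inside your module

# Both connections are assumptions -- replace with your real credentials.
src = pymysql.connect(host='localhost', user='ps', password='secret', database='prestashop_n1')
dst = pymysql.connect(host='localhost', user='ps', password='secret', database='prestashop_n2')

with src.cursor() as c_src, dst.cursor() as c_dst:
    # Hypothetical extraction: pull a few order columns from the shop database.
    c_src.execute("SELECT id_order, total_paid, date_add FROM ps_orders")
    rows = c_src.fetchall()

    # Push them into a reporting table in the second database.
    c_dst.executemany(
        "INSERT INTO order_stats (id_order, total_paid, date_add) VALUES (%s, %s, %s)",
        rows,
    )

dst.commit()
src.close()
dst.close()
```

The cron-job variant simply runs the same script on a schedule instead of on demand.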

Oracle repository

I have a repository connected with the integrated source control in Oracle. I did a full export, without any data from the tables of my database, as separate directories, and it looks fine. The problem is that every time I want to change something I have to export the stored procedure or table SQL code and then upload it to the repository, which is hard because at the end of the day I'm not sure how many changes I made and I may forget some of them. A full export without data could be the solution, but I don't have the time to wait 20-25 minutes for the export at the end of every day. Is there any way to export only the changes made on the current day, or after the last export? Or maybe directly export the SQL code on each compilation inside Oracle management studio? The database is not on my computer; it is located on a server that I'm connected to.
Here is how my Git folder looks, split into separate folders.
You need to work the other way around. To change a package for example, open the corresponding source code file from your Git repository in your development tool (SQL Developer, PL/SQL Developer etc), make your changes, test, save the file, check in with ticket number and comment. As a rule you should not edit stored code directly in the database. (PL/SQL Developer has a checkbox "Allow editing of database source", which I generally leave unchecked. Probably other tools have something similar.)
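That said, if you still want to script the "export only what changed today" idea from the question, you can query USER_OBJECTS for a recent LAST_DDL_TIME and pull the source with DBMS_METADATA.GET_DDL. A rough sketch with cx_Oracle, where the connection string and output folder are assumptions:

```python
import pathlib
import cx_Oracle  # connection details below are placeholders

conn = cx_Oracle.connect("scott", "tiger", "dbserver:1521/orclpdb")
cur = conn.cursor()
out_dir = pathlib.Path("exported_ddl")
out_dir.mkdir(exist_ok=True)

# Objects in my schema whose DDL changed since midnight today.
cur.execute("""
    SELECT object_type, object_name
      FROM user_objects
     WHERE last_ddl_time >= TRUNC(SYSDATE)
       AND object_type IN ('TABLE', 'PROCEDURE', 'FUNCTION', 'PACKAGE', 'PACKAGE BODY')
""")

for object_type, object_name in cur.fetchall():
    # DBMS_METADATA expects 'PACKAGE_BODY' rather than 'PACKAGE BODY'.
    meta_type = object_type.replace(' ', '_')
    ddl_cur = conn.cursor()
    ddl_cur.execute(
        "SELECT DBMS_METADATA.GET_DDL(:t, :n) FROM dual",
        t=meta_type, n=object_name,
    )
    ddl = ddl_cur.fetchone()[0].read()  # GET_DDL returns a CLOB
    (out_dir / f"{object_name}.sql").write_text(ddl)

conn.close()
```

But treat that as a stopgap: the Git-first workflow above is the one that keeps the repository authoritative.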

importing CSV from SAP R/3 to SQL database for reporting purpose

I want to import CSV files and invoices from an SAP R/3 system into a SQL database. The database will be used for reporting purposes only. Please tell me what the best possible way would be, which database to use, and anything else that will be relevant to me in this context. I am a novice, so please help. Thanks. :)
If you are routinely importing CSV files, then I recommend getting them comma delimited (or whatever delimiter you choose) and going the route of making an SSIS package with a corresponding SQL Agent Job that runs daily, checks for the file, and loads it if it finds it.
Info on SSIS package creation:
http://smallbusiness.chron.com/import-csv-ssis-46849.html
If this is a one-time load, then I would recommend just using the Import/Export Wizard built into SQL Server.
https://msdn.microsoft.com/en-us/library/ms140052.aspx
The Import/Export Wizard is pretty easy to use, too. Right-click the database > Tasks > Import Data. This will launch the wizard and walk you through the one-time import.
Adding Microsoft's official SSIS guide as well:
https://msdn.microsoft.com/en-us/library/ms169917.aspx
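If you ever need an occasional ad-hoc load outside of SSIS, a small script can also push a CSV into SQL Server. A minimal sketch with pyodbc, where the server, database, table and CSV layout are all assumptions:

```python
import csv
import pyodbc

# All names here (server, database, table, columns) are assumptions.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SapReporting;Trusted_Connection=yes;"
)
cur = conn.cursor()
cur.fast_executemany = True  # speeds up bulk parameterized inserts

with open("invoices.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    # Assumed columns: invoice number, customer, amount.
    rows = [(r[0], r[1], float(r[2])) for r in reader]

cur.executemany(
    "INSERT INTO dbo.Invoices (InvoiceNo, Customer, Amount) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
conn.close()
```

For anything recurring, though, the SSIS package plus SQL Agent Job above is the more maintainable route.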

How to import selective tables using SSIS based on a custom (e.g. XML) file

I have about 1200 tables in my Oracle database and need to import them into a SQL Server database. But I would like to configure the import in such a way that, for any given import, I can select the tables that need to be imported.
So I have a custom XML file listing all the tables, with a flag for each table indicating whether that table is to be imported or not. I have also created a package that imports all the tables, and I would like to modify it to check at runtime, against the XML file, whether a table should be imported.
I was thinking of implementing something like what is given here, but I don't want to do that for this many tables and I also don't know whether it will do the job.
How can I get around this? Can I use an SSIS configuration file for this (not sure, though)? Is there any way I can read the XML at runtime and import tables based on the XML file (or any other file with key-value pairs)?
Any help in any form would be greatly appreciated.
It might seem like a lot of work, but this is how I'd approach it:
1) Create one package for each table that needs to be imported - so 1200 packages.
2) Store the package names in a metadata table along with a flag column indicating whether that package needs to be executed or not.
3) Create a parent package.
4) Add an Execute SQL Task in the parent package, with a SQL command like select PackageName from metadataTable where Flag = 1 to retrieve the list of packages that need to be executed.
5) Map the result set to an object variable.
6) Add a Foreach Loop container.
7) Add an Execute Package Task inside the Foreach Loop container, and parameterize the package name property.
This whole setup reads the packages that need to be executed and runs them one after the other.
If you like this approach, check out Andy Leonard's SSIS framework.
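Since you want the selection driven by your XML file, you can refresh the Flag column of that metadata table from the file just before the parent package runs. A small sketch, assuming a hypothetical layout like <tables><table name="EMP" import="true"/></tables> and child packages named after their tables:

```python
import xml.etree.ElementTree as ET

# Hypothetical config file: <tables><table name="EMP" import="true"/>...</tables>
tree = ET.parse("tables_to_import.xml")

selected = [
    t.attrib["name"]
    for t in tree.getroot().findall("table")
    if t.attrib.get("import", "false").lower() == "true"
]

# Emit the UPDATE statements that flip the Flag column in the metadata table;
# these could just as well be executed directly against SQL Server.
print("UPDATE metadataTable SET Flag = 0;")
for name in selected:
    # Assumes each child package is named after its table, e.g. Load_EMP.dtsx.
    print(f"UPDATE metadataTable SET Flag = 1 WHERE PackageName = 'Load_{name}.dtsx';")
```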
Samuel Vanga has a solid approach. The only thing I would look at doing is using something to programmatically generate those 1200 packages.
Depending on your familiarity with the SSIS object model and general .NET development, I'd investigate EzAPI if you enjoy coding.
Otherwise, look at BIML and the package generation feature of BIDSHelper. You do not need to buy a license for Mist to create your BIML script; you can browse the existing scripts on BIMLScript and probably solve most of your needs. Copy, paste, generate.

Magento: Manually Restore Truncated "core_config_data" Table

I've just installed a local copy of Magento. For some reason, importing the sample data overwrites the admin_users table and totally empties the core_config_data table. Restoring my admin user access is easy, but restoring the configuration is impossible for me at the moment (yes, it's my fault - I forgot to back up my database before importing).
Is there anybody here who knows the keys that belong in the configuration, or better, can you share a copy of the Magento configuration table?
Thanks in advance.
Ouch. This is why the test data instructions tell you to import the sample data first and then do the Magento install.
Given the quantity of data in the core_config_data table, it will be easier to reinstall Magento to recover it, as its contents are created by the install scripts in each module and vary from version to version.
There is the possibility that you can create the skeleton contents for the core_config_data table from here (magereverse.com) by using the appropriate sections out of the structure dump.
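If you do the reinstall into a separate throwaway database, you can then copy just the regenerated core_config_data table back into your broken install with mysqldump. A rough sketch, where the database names and credentials are assumptions:

```python
import subprocess

# Database names and credentials are assumptions -- adjust to your setup.
FRESH_DB = "magento_fresh"    # clean install whose core_config_data we want
BROKEN_DB = "magento_local"   # the install with the emptied table
MYSQL_ARGS = ["-u", "root", "-psecret"]

# Dump only the regenerated table from the fresh install...
with open("core_config_data.sql", "w") as dump:
    subprocess.run(["mysqldump", *MYSQL_ARGS, FRESH_DB, "core_config_data"],
                   stdout=dump, check=True)

# ...and load it into the broken database (the dump drops and recreates the table).
with open("core_config_data.sql") as dump:
    subprocess.run(["mysql", *MYSQL_ARGS, BROKEN_DB], stdin=dump, check=True)
```

Keep in mind the copied rows will carry the fresh install's values (base URLs and the like), so review them in the admin afterwards.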