I have a database with data in Odoo version 14, and a new database without demo data in version 15. I need to transfer the data from v14 to v15.
Any help is appreciated!
You have a couple of options to do that:
You can use OCA's OpenUpgrade library for that: https://github.com/OCA/OpenUpgrade/tree/15.0
Or make your own script with the external API: https://www.odoo.com/documentation/15.0/developer/misc/api/odoo.html#logging-in
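The second option can be sketched with Odoo's XML-RPC external API. A minimal sketch, assuming a local server, database name, and credentials that are all placeholders (the `fetch_partners` helper is mine, not part of Odoo):

```python
import xmlrpc.client

def endpoint(url, service):
    """Build the XML-RPC endpoint URL for an Odoo service ('common' or 'object')."""
    return "{}/xmlrpc/2/{}".format(url, service)

def fetch_partners(url, db, username, password, limit=10):
    """Log in to an Odoo server and read a few partner records.

    All connection details are placeholders -- substitute your own.
    """
    common = xmlrpc.client.ServerProxy(endpoint(url, "common"))
    uid = common.authenticate(db, username, password, {})
    models = xmlrpc.client.ServerProxy(endpoint(url, "object"))
    return models.execute_kw(
        db, uid, password,
        "res.partner", "search_read",
        [[]], {"fields": ["name", "email"], "limit": limit})

# Usage, against a running v14 server (then write the records into the v15
# server with the same pattern and the 'create' method):
# partners = fetch_partners("http://localhost:8069", "v14_db", "admin", "admin")
```

The same login-then-`execute_kw` pattern is repeated against the v15 server to insert whatever the script read from v14.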
I recently upgraded to version 2016.3 from 2016.2. To be specific, I am currently using:
IntelliJ IDEA 2016.3
Build #IU-163.7743.44, built on November 17, 2016
I am using the DB2 (LUW) driver provided by the IDE in the example below but I have tried to use my own drivers and still get the same results.
After I upgraded, if I try to copy a timestamp from the Results pane of the Database Console tool window, I do not get the full precision. I was able to copy the full timestamp in the previous version.
For example, my results pane shows something like this:
And this is what it looks like when I paste it here after copying it from the results pane: 2017-04-12 10:42:11
The only workaround I have found is to cast the timestamp to a CHAR and then copy it from the results pane. This works, but it is a pain, especially since most of my queries end up being SELECT *.
Pasting: 2017-04-12-10.42.11.193944
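For what it's worth, the CHAR-cast value does carry the full fractional seconds. A small Python sketch (the format string is my assumption about DB2's CHAR layout, which the pasted value above follows) that parses it back into a timestamp:

```python
from datetime import datetime

# DB2's CHAR(timestamp) layout: date and time joined by '-', dots between
# time components, six fractional-second digits.
DB2_CHAR_FORMAT = "%Y-%m-%d-%H.%M.%S.%f"

def parse_db2_timestamp(text):
    """Parse a DB2 CHAR-formatted timestamp string into a datetime."""
    return datetime.strptime(text, DB2_CHAR_FORMAT)

ts = parse_db2_timestamp("2017-04-12-10.42.11.193944")
```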
Anybody have any ideas on how to fix this? Workarounds?
It is a bug. It will be fixed in the IntelliJ IDEA 2017.1.2 update. Sorry for the inconvenience.
I have various databases for different versions of Odoo ERP. I've spent a lot of time trying to find a way to solve this problem, but I wasn't successful.
How can I figure out the version of Odoo that was used to create the database initially?
In Odoo 10, 11 (probably other versions too):
SELECT latest_version FROM ir_module_module WHERE name = 'base'
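The latest_version value for base is a dotted string whose first two components give the Odoo series. A small sketch of extracting it (the helper name and sample value are mine, for illustration):

```python
def odoo_series(latest_version):
    """Return the Odoo series (e.g. '14.0') from base's latest_version string."""
    return ".".join(latest_version.split(".")[:2])

# e.g. a value read from ir_module_module via the SQL above
series = odoo_series("14.0.1.3")
```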
Why don't you consider the owner of the database? Try this:
Create a different user for each Odoo version, e.g. user7 for Odoo 7, user8 for Odoo 8, etc. Switch to that user using sudo su - odoo7 -s /bin/bash, then create the database; the owner will be that user, so you can identify each database.
I am trying to migrate from Access 2003 to 2016. When I am importing my objects, everything is fine. It is only in the process of importing 3 tables that I get this error:
System resource exceeded
They are big tables too.
There is no hotfix for Access 2016. The total table count is around 100 tables.
Any help would be much appreciated.
Found solution here:
https://answers.microsoft.com/en-us/msoffice/forum/all/ms-access-2016-system-resource-exceeded/df80f64a-f233-467e-89df-f05a8d58bc77
In short:
In Task Manager's Processes tab, find msaccess, right-click, and select the Set affinity... option. I had 6 CPUs ticked (0 to 5). I un-ticked them all and ticked only CPU 0.
Since there is currently no hotfix for the 2016 version, you have to migrate to either 2010 or 2013 first. Then you can try migrating to 2016.
Please check this link:
https://social.technet.microsoft.com/Forums/en-US/aedecca8-aa7d-417f-9f03-6e63e36f0c5d/access-2016-system-resources-exceeded?forum=Office2016setupdeploy&prof=required
Not sure if this will help you, but I've managed to resolve this error by wrapping fields referenced in the WHERE clause with Nz, e.g.
instead of
WHERE ReportDate = Date
use
WHERE Nz(ReportDate,ReportDate) = Date
It's strange, but it seems to work for me. I've found the issue is often related to indexed fields, so I always apply Nz to those fields first.
I have a data table in R with 1.5M rows that I want to export to an MS SQL database table.
I know I can do it this way:
dbWriteTable(conn,"benefit_custom.Trial_set",trial_set )
But it's very slow.
The other option I've tried is writing to a flat file and then creating an SSIS package to transfer it to the db. That part is not a problem, but the issue is that I have both string and numeric data in my data table, and when R writes to the file, everything becomes varchar and is enclosed within quotes.
FileLocation <- "\\Benefit_Analysis_Input.dat"
FileName <- paste(bcpWorkspace, FileLocation, sep = "")
write.table(trial_set, file = FileName, append = FALSE, sep = "\t", col.names = TRUE, row.names = FALSE)
The 1st method preserves the data types like I want, but the performance is very bad. Does anyone have anything else I can try?
So I guess the data types can't be preserved if I'm writing to a flat file, and I have to choose the data types when importing the flat file into the db.
Answering your question: the fastest seems to be rsqlserver
As of now I know about:
rsqlserver: Using System.Data.SqlClient drivers, only on win OS
RSQLServer: Using java drivers to SQLserver from any OS using RJDBC
RODBC: using ODBC drivers, easy setup only on win OS
Still, Microsoft SQL Server looks to be quite poorly supported from the R session perspective.
Here is an interesting benchmark by the rsqlserver project: https://github.com/agstudy/rsqlserver/wiki/benchmarking
Also important to note about rsqlserver: a Linux version using mono is planned.
Finally, my recent presentation on Data Warehousing with R covers DBI, RJDBC, RODBC examples.
I think #rhealitycheck is on the right track - I would use the SQL Import and Export Data wizard to generate an SSIS package. I would save it and customize it later, for example adding an upstream Execute Process Task to call R and write the text file out.
The performance and flexibility of this solution is hard to beat.
In IntelliJ IDEA 11.0.1 I connected Data Source to Oracle database.
When I open an *.sql file and type
SELECT * FROM
I see a list of available tables in code completion. Table columns of the selected table are also available in the WHERE part of the statement.
But when I type
INSERT INTO
no tables are available in code completion.
The same is true for the SQL console.
Looks like a known bug. It's already fixed in the IDEA 12 EAP and will also be fixed in IDEA 11.1.4 (not released at the time of writing this answer).