SAP DBTech JDBC: [2]: general error: Remote export failed: while exporting a table into a CSV file (HANA)

I am getting the error below while exporting a table into CSV in HANA Studio:
SAP DBTech JDBC: [2]: general error: Remote export failed: export size exceeds 20% of available memory, please use server-local export.
My table has 186 million records.
Please let me know how to resolve this issue and how to run a server-local export.

Pallavi, there is an export limitation applied to HANA users. You can check it in the settings of your HANA Studio. That is one thing you need to sort out, but with as many rows as you mentioned above, I would suggest slicing the extract into multiple smaller extracts.
Here is the SAP administration documentation that will guide you to the particular setting I am referring to:
https://help.sap.com/viewer/6b94445c94ae495c83a19646e7c3fd56/2.0.03/en-US/c06b0a63bb5710148bb5e18dfd71c237.html
Look for the query limit and "exported data to file" settings.
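As for running the server-local export the error message asks for: a sketch of the SQL-console equivalent, assuming a schema MY_SCHEMA and table MY_TABLE (placeholder names) and a writable directory on the HANA host:

```sql
-- Runs on the HANA server itself; the target directory must exist
-- on the server's file system and be writable by the HANA service.
EXPORT "MY_SCHEMA"."MY_TABLE" AS CSV
  INTO '/tmp/my_table_export'
  WITH REPLACE THREADS 4;
```

Note that the files land on the server, not on your client machine, so you would then fetch them via OS-level access or ask an administrator to retrieve them.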

Related

Data migration from cocroachdb to postgresql

I am trying to migrate my CockroachDB data into PostgreSQL:
I have dumps of CockroachDB data in .sql format, like booking.sql, etc.
I have tried several ways to solve this problem:
I tried a direct import of the dump file using psql, but since the dump file came from CockroachDB it fails with syntax errors.
My second plan was to restore the dump file back into a CockroachDB cluster and run pg_dump from there, but I am not able to restore the database in CockroachDB:
ERROR: failed to open backup storage location: unsupported storage scheme: "" - refer to docs to find supported storage schemes
I tried again with CockroachDB's IMPORT statement, but with no luck.
With my limited knowledge I also searched Google and YouTube, but given the sparse documentation I didn't find anything useful.
Any help will be appreciated. Thank you.
For exporting data from CockroachDB there are some limitations: in newer versions you can't export your data directly to SQL.
The first way of exporting is the cockroach dump command, but it was deprecated in version 20.2, so if you are using a newer version this won't work:
cockroach dump <database> <table> <table...> <flags>
Sample:
cockroach dump startrek --insecure --user=maxroach > backup.sql
In newer versions you can export your data into CSV files using SQL commands such as EXPORT:
EXPORT DATABASE bank INTO 's3://{BUCKET NAME}/{PATH}?AWS_ACCESS_KEY_ID={KEYID}&AWS_SECRET_ACCESS_KEY={SECRET ACCESS KEY}' AS OF SYSTEM TIME '-10s';
To export to a local node:
EXPORT DATABASE bank INTO ('nodelocal://1/{PATH}');
Another alternative is to use a database client such as DBeaver.
You can download and install DBeaver from https://dbeaver.io/download/.
After adding the connection, you can export the database via Right-click on the database > Tools > Backup.
The fastest and easiest way of exporting is a database tool like DBeaver.
I hope this answer is helpful.
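Once you have CSV files out of CockroachDB, one possible path into PostgreSQL is psql's \copy; a minimal sketch, assuming a hypothetical booking.csv whose columns and types you would adapt to your actual data:

```sql
-- Recreate the table in PostgreSQL first (these column names and
-- types are illustrative assumptions), then load the exported CSV.
CREATE TABLE booking (
    id       BIGINT PRIMARY KEY,
    guest    TEXT,
    checkin  DATE
);

-- Run inside psql; \copy reads the file from the client side,
-- so no server file access is needed.
\copy booking FROM 'booking.csv' WITH (FORMAT csv, HEADER true);
```

This sidesteps the dump-file dialect problem entirely, since CSV is neutral between the two databases.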

Error trying to export a SQL Azure database to a BACPAC file

I back up around 100 databases daily to BACPAC files using AzureRM for Windows PowerShell.
For some reason, 20 of these databases started throwing a strange error:
Could not export schema and data from database. One or more errors occurred. One or more errors occurred. One or more errors occurred. One or more errors occurred. One or more errors occurred. Failed to convert parameter value from a Int16 to a DateTime. Invalid cast from 'Int16' to 'DateTime'.
This issue started about a week ago, always with the same 20 databases. I tried performing the backup with the Az module instead of AzureRM, and also through the Azure Portal, but the same error is shown.
I think it's a bug in the Azure cmdlets, because Int16 isn't a data type in SQL Azure.
Please help; I need to back up all the databases daily.
Please check that your source and destination tables have the same data types.
It sounds like you might have a column set to Int16 in the source and DateTime on the server.
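To narrow down which column is triggering the cast, you could list the candidate columns on each side and compare; a sketch against the standard INFORMATION_SCHEMA views (Int16 maps to SQL Server's smallint):

```sql
-- List every smallint and datetime column so the source and
-- destination schemas can be compared side by side.
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('smallint', 'datetime', 'datetime2')
ORDER BY TABLE_NAME, COLUMN_NAME;
```

Running this on one of the 20 failing databases and on a working one may reveal the schema drift.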

Talend Open Studio: Load input files into database

I have an empty SQLite database. Next to that, I have 6 input files (delimited, Excel, JSON, XML).
Now, all I want to do is load the input files into the empty database.
I tried to connect one input file to the DB and just run it. That didn't work (the DB doesn't have anything in it; I suspect that is the problem).
Then I tried to connect an input file to a tMap, define the table there, define the schema, and connect the tMap to the DB (tSQLiteOutput).
When I tried to run it, I receive the following error:
Starting job ProductDemo_Load at 16:46 15/11/2015.
[statistics] connecting to socket on port 3843
[statistics] connected
Exception in component tSQLiteOutput_1
java.sql.SQLException: no such table:
at org.sqlite.DB.throwex(DB.java:288)
at org.sqlite.NativeDB.prepare(Native Method)
at org.sqlite.DB.prepare(DB.java:114)
at org.sqlite.PrepStmt.<init>(PrepStmt.java:37)
at org.sqlite.Conn.prepareStatement(Conn.java:231)
at org.sqlite.Conn.prepareStatement(Conn.java:224)
at org.sqlite.Conn.prepareStatement(Conn.java:213)
at workshop_test.productdemo_load_0_1.ProductDemo_Load.tFileInputExcel_1Process(ProductDemo_Load.java:751)
at workshop_test.productdemo_load_0_1.ProductDemo_Load.runJobInTOS(ProductDemo_Load.java:1672)
at workshop_test.productdemo_load_0_1.ProductDemo_Load.main(ProductDemo_Load.java:1529)
[statistics] disconnected
Job ProductDemo_Load ended at 16:46 15/11/2015. [exit code=1]
I see there is something wrong with the import, but what exactly?
What should I do in order to successfully load the data from the input files into the database?
I followed the exact steps from this little tutorial:
Talend Job: load data into database.
Most Talend output components have a "Create table if not exists" option. Did you check this in your tSQLiteOutput? The error suggests that when Talend inserts data into the empty database it cannot find your table, because it does not exist. So you need to tell Talend to create the table first.
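Alternatively, you can create the target table yourself before running the job; a minimal SQLite sketch, where the table and column names are placeholders you would replace with the schema defined in your tMap:

```sql
-- Must match the schema defined in the output component exactly,
-- otherwise the insert will still fail.
CREATE TABLE IF NOT EXISTS product (
    id    INTEGER PRIMARY KEY,
    name  TEXT,
    price REAL
);
```

Either approach resolves the "no such table" exception, since SQLite will not create tables implicitly on insert.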

SQL Server Data Import / Export Wizard Truncation Error

I'm trying to load a large file into a SQL Server table. I know that two of the columns are > 50 characters wide so on the 'Advanced' tab in the Import/Export Wizard, I specify the width as 115 and 75 respectively. I then run the rest of the job and get the following error:
Is there another place I need to let the Wizard know about the change in length?
Check your database. There is a chance that the table was created by a previous unsuccessful wizard run. Delete that table, and the wizard should re-run successfully without warnings.
Try letting the Import Wizard create a new table for you. Then you can use regular SQL to move the data into its permanent table.
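If you go that route, the move from the wizard-created staging table to the permanent one is a plain INSERT ... SELECT; a sketch with illustrative table and column names:

```sql
-- Copy rows from the table the wizard created into the permanent
-- table, whose columns already have the wider lengths (115 / 75).
INSERT INTO dbo.PermanentTable (col_a, col_b)
SELECT col_a, col_b
FROM dbo.WizardStaging;

-- Optionally drop the staging table afterwards.
DROP TABLE dbo.WizardStaging;
```

This avoids fighting the wizard's column-length metadata entirely.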

Can You Import an Excel File to a MySQL Database using phpMyAdmin?

Can you import an Excel file into a MySQL database using phpMyAdmin? I am looking to buy a database that contains data on all colleges and universities in the US. The file is in Excel format. Can this be imported via phpMyAdmin?
Here is the site where I am going to buy the database from if this is possible: http://www.data-lists.com/universities-colleges-database.html?gclid=CPGXkN6H6aECFeQc5wodBFPRIg
You can download a sample of the database that has 10 entries. I have tried importing it into phpMyAdmin, but this is the error I am getting:
There is a chance that you may have found a bug in the SQL parser. Please examine your query closely, and check that the quotes are correct and not mis-matched. Other possible failure causes may be that you are uploading a file with binary outside of a quoted text area. You can also try your query on the MySQL command line interface. The MySQL server error output below, if there is any, may also help you in diagnosing the problem. If you still have problems or if the parser fails where the command line interface succeeds, please reduce your SQL query input to the single query that causes problems, and submit a bug report with the data chunk in the CUT section below:
----BEGIN CUT---- eNrtXAtsHMd5nt177R0FipQoiXqvICkiJd7xnnwJBnWiqFdIiubRjyQO5NVxSV51vGPuIYltVNNt
CgS1azhSIre20sBNXLRp2jSOm9iIW8stYLcA3RZtmodjN3Zgp3FdtLWbokVg+/rPzO7evnkXyYna
zhyHd/PvzP/N/PPP7Mw+vtGpqVNTQ+JITByJi2OjE0NiTIyLqYFkNBnKTMOR9lAoczuI95yYGRLL
H8svSqWyXIrkc2cii/OLYiyWiiXEeDQ6EIa/+IAY6xuK9Q+lEh8W8/LCoLgnNL6UuXVsSExFopH+
And then at the bottom of the error it says:
1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'ÐÏࡱá' at line 1
Any help is greatly appreciated.
It looks like you are trying to import a binary file.
Try exporting a sheet or a range of the Excel file as a CSV file. It must be a rectangle with the same structure on every row.
I've downloaded the sample data and it looks pretty organized. Just save the Excel file as CSV (File -> Save As), then import it using phpMyAdmin. I think it will work perfectly fine.
Good luck! :-)
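If phpMyAdmin's import form still gives you trouble, the equivalent raw MySQL statement for a CSV file is LOAD DATA; a sketch, where the file name, table name, and line endings are assumptions about what Excel's "Save As CSV" produces:

```sql
-- Assumes the CSV has a header row, comma separators, and
-- double-quoted text fields, as Excel typically writes them.
LOAD DATA LOCAL INFILE 'universities.csv'
INTO TABLE universities
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
```

The target table must exist first, with columns in the same order as the CSV.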