Large file export to PostgreSQL - sql

I need to load a 50 GB file of INSERT statements into a table in PostgreSQL so that I can measure how long the inserts take, but I can't find any way to load a file that big. Can someone help me?

If the file you have contains syntactically valid SQL (like INSERT statements), this is very straightforward using the command-line psql client that comes with a Postgres installation:
psql DATABASE_NAME < FILE_NAME.sql
You may also want to replace DATABASE_NAME with a connection string like postgres://user:pass@localhost/database_name.
This causes your shell to read the given file and pass it to psql's stdin, which makes psql execute the commands against the database it is connected to.
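Since the original goal is to measure how long the inserts take, one option (a hedged sketch; DATABASE_NAME and FILE_NAME are placeholders as above) is to wrap the same command in the shell's time builtin:
time psql DATABASE_NAME < FILE_NAME.sql
Alternatively, in an interactive psql session you can run \timing on and then \i FILE_NAME.sql to have the elapsed time reported for each statement.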

Related

Is there an alternative way to import data into Postgres than using psql?

I am in a strict corporate environment and don't have access to Postgres' psql. Therefore I can't do what's shown e.g. in the SO question Convert SQLITE SQL dump file to POSTGRESQL. However, I can generate the SQLite dump file (.sql). The resulting dump.sql file is 1.3 GB.
What would be the best way to import this data into Postgres? I also have DBeaver and can connect to both databases simultaneously, but unfortunately I can't do INSERT from SELECT.
I think the term for that is 'absurd', not 'strict'.
DBeaver has an 'execute script' feature. But who knows, maybe it will be blocked.
EnterpriseDB offers binary downloads. If you unzip those to a local drive you might be able to execute psql from the bin subdirectory.
If you can install psycopg2 or pg8000 for Python, you should be able to connect to the database and then loop over the dump file, sending each line to the database with cur.execute(line). It might take some fiddling if the dump file has any multi-line commands, but the example you linked to doesn't show any of those.
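A minimal sketch of that loop, assuming psycopg2 is available, that the dump contains one complete statement per line, and that the connection details below are placeholders:
import psycopg2

# placeholder connection parameters - adjust to your environment
conn = psycopg2.connect(host="localhost", dbname="target_db", user="me", password="secret")
cur = conn.cursor()

with open("dump.sql") as f:
    for line in f:
        line = line.strip()
        # skip blank lines and SQL comments; assumes one statement per line
        if not line or line.startswith("--"):
            continue
        cur.execute(line)

conn.commit()
cur.close()
conn.close()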

Why can I not import data from a CSV file into the table?

I have an interesting challenge. I am learning how to use the COPY function in SQL. I need to import data from a .CSV file into a table on a PostgreSQL server. But every time I try to do this I get this message:
ERROR: could not open file "/Users/olenaskoryk/Desktop/us_counties_2010.csv" for reading: Permission denied
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
SQL state: 42501
My query is:
COPY us_counties_2010
FROM '/Users/olenaskoryk/Desktop/us_counties_2010.csv'
WITH (FORMAT CSV, HEADER);
As you can see, I am working on a Mac.
I assume that PostgreSQL is not running locally on your computer. That's why the server can't read your local file.
You may want to do this through a psql session using the \copy command.
$ psql your_db_connection_url
psql (10.5)
Type "help" for help.
db=# \copy us_counties_2010 FROM '/Users/olenaskoryk/Desktop/us_counties_2010.csv' WITH (FORMAT CSV, HEADER);
Psql's \copy command uses COPY FROM STDIN under the hood, passing the contents of the local file through standard input to the server, circumventing the limitation of the server not being able to read the local file.
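If you prefer a one-liner from the shell, the same thing can be done non-interactively (a hedged sketch; the connection URL is a placeholder):
psql your_db_connection_url -c "\copy us_counties_2010 FROM '/Users/olenaskoryk/Desktop/us_counties_2010.csv' WITH (FORMAT CSV, HEADER)"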
The HINT is relevant. The SQL server is not running as you, and does not have the right to read your file.
Give it permissions (which may require giving permissions to the containing directories), or put it in a universally readable place (like perhaps a temp directory).
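For example (a hedged sketch for macOS/Linux; the account the server runs as and the exact paths will vary), you could make the file world-readable and the containing directories traversable:
chmod o+r /Users/olenaskoryk/Desktop/us_counties_2010.csv
chmod o+x /Users/olenaskoryk /Users/olenaskoryk/Desktop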

Import/Copy CSV file into PostgreSQL | File not on local server

I am trying to import/copy my CSV file to PostgreSQL, but I am encountering these errors: I don't have import/write permissions for the file. Will stdin help, and how? The Postgres docs provide no examples. I was then asked to do a bulk insert instead, but since there are too many columns with mixed data types, I am not sure how to proceed with that.
Command to copy the csv file:
COPY sales.sales_tickets
FROM 'C:/Users/Nandini/Downloads/AIG_Sales_Tickets.csv'
DELIMITER ',' CSV;
ERROR: must be superuser to COPY to or from a file
Hint: Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone.
1 statement failed.
The command to do a bulk insert is too time-consuming:
insert into sales.sales_tickets values (1,'2',3,'4','5',6,7,8,'9','10','11');
Please suggest. Thank you.
From the PostgreSQL documentation on COPY:
COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
and
Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. Similarly, the command specified with PROGRAM is executed directly by the server, not by the client application, must be executable by the PostgreSQL user. COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
You're trying to use the COPY command while violating two of its requirements:
You're trying to execute the COPY command from a non-super user.
You're trying to read a file on your client machine, and have it copied to the server.
This won't work. If you need to perform such a COPY, you need to:
Copy the CSV file to the server, to a directory that can be read by the (system) user running the PostgreSQL server process.
Execute the COPY command from a superuser account (see the example below).
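For example (a hedged sketch; /tmp on the database server is only a placeholder destination):
scp 'C:/Users/Nandini/Downloads/AIG_Sales_Tickets.csv' user@db-server:/tmp/AIG_Sales_Tickets.csv
and then, connected as a superuser:
COPY sales.sales_tickets FROM '/tmp/AIG_Sales_Tickets.csv' DELIMITER ',' CSV;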
Alternative
If you can't do some of these, you can always use a tool such as pgAdmin 4 and use its Import/Export functionality.
See also How to import CSV file data into a PostgreSQL table?
You are an ideal case to use \copy, not COPY. Note that \copy is a psql meta-command, so it has to go on a single line:
\copy sales.sales_tickets FROM 'C:/Users/Nandini/Downloads/AIG_Sales_Tickets.csv' DELIMITER ',' CSV

Different command lines used to extract tables from an SQL file into one that is usable by MySQL

What is the difference between these two command lines used to extract tables from a database into one that can be used by MySQL?
C:> mysql -u user -p PASS database_name < ms.sql
And
mysql> source ms.sql ;
I used to do this with the former, and the database created contained all the information, but this time it didn't work; the second worked fine.
Also, the MySQL documentation shows an example of setting the default character set for the first case, but I found no example for the second case on the MySQL site. I am thankful for any help.
Both of these commands can be referred to as batch commands. I am pointing out the differences between them below.
First Command
mysql -u user -p PASS database_name < ms.sql
The above command does two things at once: it logs in to MySQL and passes the script file for execution using the OS I/O redirection operator '<'.
After executing this command it displays the SQL results of the script and returns to the command prompt (i.e., it exits the MySQL prompt).
It is necessary to put a USE db_name command at the beginning of the file if the database name is not given on the command line.
This way is useful when you want to execute a big script without logging in to MySQL, and it is the most commonly used approach.
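As for setting the default character set with this form, the mysql client accepts a command-line option for it (a hedged sketch; utf8 is only an example value):
mysql --default-character-set=utf8 -u user -p database_name < ms.sql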
Second Command
mysql> source ms.sql;
The second form is a mysql client command that executes the script contained in the given .sql file.
It is used when you are already at the MySQL prompt. After executing the script it returns to the MySQL prompt.
You can also use the shorthand \. for source, e.g. mysql> \. ms.sql
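To address the character-set question with this form, one hedged option is to set the connection character set at the prompt before sourcing the file (again, utf8 is only an example value):
mysql> SET NAMES utf8;
mysql> source ms.sql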
For more information, please refer to the MySQL reference manual: https://dev.mysql.com/doc/refman/5.7/en/mysql-batch-commands.html

Export records from SQL Server 2005 express edition

I have a little problem. My friend has a database with over 10 tables, and each table has 90-100 records.
I can't find a way to export the records from his tables (to put into an SQL file something like this: INSERT INTO .... VALUES ... for each existing record) so that I can import them into my database.
How can I do that?
I tried: right click on a table -> Script Table as -> INSERT TO -> File ...
but it only generates the INSERT statement itself, not the data.
Is there a solution, or is this feature only available in the commercial version?
UPDATE
You can use the bcp command from the command prompt like this:
For export: bcp ADatabase.dbo.OneTable out d:\test\OneTable.bcp -c -Usa -Ppassword
For import: bcp ADatabase.dbo.OneTable in d:\test\OneTable.bcp -c -Usa -Ppassword
The out command creates a BCP file containing the records of the specified table; you can then import that BCP file into another database with the in command.
If you use a remote database, then:
bcp ADatabaseRemote.dbo.OneTableRemote out d:\test\OneTableRemote.bcp -Slocalhost\SQLExpress -Usa -Ppassword
Instead of localhost\SQLExpress, you can use localhost or another server name...
Probably the simplest way to do this would be to run a SELECT statement that outputs to a file. Then you can import that data into your database.
For simple moves, I have also done a copy/paste manually. Sometimes it is better to use Excel as a staging platform before pasting it into the new database. You may need to create a temporary table in your new database that matches up exactly with the data you are pasting over. For example, I usually don't put a PK on the temp table at first and make the PK field just an INT. That way the copy will go smoother.
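For instance (a hedged sketch; the table and column names are placeholders), such a staging table might deliberately be kept loose, with no primary key and forgiving column types:
CREATE TABLE staging_import (
    id INT NULL,              -- plain INT instead of an IDENTITY/primary key column
    name VARCHAR(255) NULL,
    amount VARCHAR(50) NULL   -- keep awkward types as text until the paste succeeds
);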
In the corporate world, you would use SSIS to move this data around.
A couple of ways you could do this: One, select everything from each table and save the results as a CSV or delimited file (you can do this from SQL Server Management Studio). You can also script the tables as CREATE statements and copy the scripts over to the new database, assuming it is also a SQL Server. Then, for the import, use the LOAD DATA INFILE statement. You may have to google the syntax for SQL Server, but I know this works in MySQL and Oracle; I haven't tried it in SQL Server yet.
LOAD DATA INFILE 'myfile'
INTO TABLE stuff
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SET id = NULL;
Or, if you are going to another SQL Server, use the SQL Server Import and Export Wizard.
http://msdn.microsoft.com/en-us/library/ms141209.aspx