How do I transfer a PostgreSQL function from a 9.3 DB to another computer with the same version?

I have a working function and want to just transfer it to another newly installed Postgres DB.
I have checked around and am only seeing command-line options, which seems kind of dry for such a common task.
Is there a module import / export option?
I have exported the function into a .sql file from the database, so just looking for an import option.
I know very little about Postgres :)

How did you export it? If it was exported as SQL into a file named file_name.sql, you would just run:
psql -f file_name.sql
With whatever additional parameters are needed to connect to the correct server and database.
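For example, a minimal invocation with connection parameters (the host, user, and database names here are placeholders):
# connect to the target server and database, then run the exported file
psql -h newhost -U myuser -d mydb -f file_name.sql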

Related

Data migration from CockroachDB to PostgreSQL

I am trying to migrate my CockroachDB into PostgreSQL:
I have dumps of CockroachDB data in .sql format, like booking.sql etc.
I tried many ways to solve this problem:
I tried a direct import of the dump file using psql, but since the dump file came from CockroachDB it produced syntax errors.
My second plan was to restore the dump file back into a CockroachDB system and try running pg_dump from there, but I am not able to restore the database in CockroachDB:
ERROR: failed to open backup storage location: unsupported storage scheme: "" - refer to docs to find supported storage schemes
I tried again with CockroachDB's IMPORT statement, but to no avail.
With my limited knowledge I also searched Google and YouTube, but given the sparse documentation I didn't find anything useful.
Any help will be appreciated. Thank you.
For exporting data from CockroachDB there are some limitations: you can't export your data into SQL directly in new versions.
The first way of exporting is the cockroach dump command, but it has been deprecated since version 20.2, so if you are using a newer version this won't work:
cockroach dump <database> <table> <table...> <flags>
sample:
cockroach dump startrek --insecure --user=maxroach > backup.sql
In newer versions, you can export your data into CSV files using SQL commands like EXPORT:
EXPORT DATABASE bank INTO 's3://{BUCKET NAME}/{PATH}?AWS_ACCESS_KEY_ID={KEYID}&AWS_SECRET_ACCESS_KEY={SECRET ACCESS KEY}' AS OF SYSTEM TIME '-10s';
To export to local node storage:
EXPORT DATABASE bank INTO ('nodelocal://1/{PATH}');
An alternative way of exporting is to use a database client such as DBeaver.
You can download and install DBeaver from https://dbeaver.io/download/.
After adding the connection, you can export the database via: right-click on the DB > Tools > Backup.
Using a database tool like DBeaver is the fastest and easiest way of exporting.
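Once you have CSV files, a minimal sketch of loading one into PostgreSQL from psql (this assumes a booking table with a matching structure already exists in the target database):
\copy booking FROM 'booking.csv' WITH (FORMAT csv)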
I hope this answer is helpful.

Is there an alternative way to import data into Postgres than using psql?

I am in a strict corporate environment and don't have access to Postgres' psql, so I can't do what's shown e.g. in the SO question Convert SQLITE SQL dump file to POSTGRESQL. However, I can generate the SQLite dump file (.sql). The resulting dump.sql file is 1.3 GB.
What would be the best way to import this data into Postgres? I also have DBeaver and can connect to both databases simultaneously, but unfortunately can't do INSERT from SELECT.
I think the term for that is 'absurd', not 'strict'.
DBeaver has an 'execute script' feature. But who knows, maybe it will be blocked.
EnterpriseDB offers binary downloads. If you unzip those to a local drive you might be able to execute psql from the bin subdirectory.
If you can install psycopg2 or pg8000 for Python, you should be able to connect to the database and then loop over the dump file, sending each line to the database with cur.execute(line). It might take some fiddling if the dump file has any multi-line commands, but the example you linked to doesn't show any of those.
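A minimal sketch of that loop with psycopg2 (the connection parameters are placeholders; it assumes one complete statement per line, as in the linked example):

import psycopg2

# placeholder connection details
conn = psycopg2.connect(host="localhost", dbname="target_db",
                        user="myuser", password="secret")
conn.autocommit = True
cur = conn.cursor()
with open("dump.sql") as f:
    for line in f:
        stmt = line.strip()
        if not stmt or stmt.startswith("--"):
            continue  # skip blank lines and SQL comments
        cur.execute(stmt)  # send each statement to the database
cur.close()
conn.close()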

Running psql with pgAdmin4

I am trying to load a CSV file in pgAdmin4 with COPY.
However, it does not work.
Permission is denied, and the error suggests using \copy from psql.
I tried to replace COPY with \copy, but that did not work either.
I guess psql must be run another way. I saw an example there with \copy that was run from a shell file (.sh), but I'm using Windows.
How do I run a psql request?
Thank you.
Try the Import/Export option in pgAdmin4 to import your CSV data into the table.
Ref: https://www.pgadmin.org/docs/pgadmin4/dev/import_export_data.html
Answering you on my own question page: yes, psql is the SQL Shell program, and you can start it from the PostgreSQL folder (the same place from which you start pgAdmin). You do not need psql if you use pgAdmin.
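For reference, a minimal \copy invocation once you're in psql (the table name and file path are placeholders; forward slashes work in Windows paths):
\copy my_table FROM 'C:/data/my_file.csv' WITH (FORMAT csv, HEADER)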

Best way to export huge data from Oracle DB to Oracle DB

This is my first post on Stack Overflow, and I have to say that I really like this website!
For a project I need to export and then re-import some huge Oracle tables from one DB to another (around 100 million rows and 30 columns).
My idea is to export the table to a flat file and then re-import it into another empty table, given that the schema already exists.
I'm using PL/SQL Developer and/or SQL*Plus to make my operations.
I've tested SQL*Loader, which seems to do a good job but is really slow in my opinion: about 30 seconds to import a CSV file with 1 million rows and 30 columns.
Which solution would you suggest? Is SQL*Loader the best tool? Do better tools already exist?
Is CSV the best format in terms of size and processing time?
Thanks a lot in advance.
Use Oracle Data Pump, aka expdp and impdp. See Overview of Oracle Data Pump, Examples of Using Data Pump Export, and Examples of Using Data Pump Import.
There really is no need to program this on your own; there is no way you can outperform expdp/impdp. Don't forget there is also an impdp option to use a network_link: in that case you skip the dmp file entirely and import directly into the target database. This can be done using impdp from the command line, but also via dbms_datapump in PL/SQL. See the PL/SQL Packages and Types Reference for the documentation.
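A minimal sketch of the export/import round trip (the credentials, connect strings, directory object, and table name are placeholders):
# dump one table from the source database
expdp myuser/mypass@sourcedb tables=BIG_TABLE directory=DATA_PUMP_DIR dumpfile=big_table.dmp logfile=exp.log
# load it into the target database
impdp myuser/mypass@targetdb tables=BIG_TABLE directory=DATA_PUMP_DIR dumpfile=big_table.dmp logfile=imp.log
Or, skipping the dump file via a database link (the link SOURCE_LINK is assumed to already exist on the target):
impdp myuser/mypass@targetdb tables=BIG_TABLE network_link=SOURCE_LINK logfile=imp.log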
You can use one of the following options:
SQL*Loader (which you might already be trying out).
Traditional data export and import (exp / imp commands).
Oracle Data Pump (expdp / impdp).
Also, if you need to do this regularly, you can schedule it using Oracle Scheduler or a shell script.

Export DB with PostgreSQL's PgAdmin-III

How can I export a PostgreSQL DB into SQL that can be executed in another pgAdmin?
Exporting as a backup file doesn't work when there's a difference in versions.
Exporting as a SQL file does not execute when run on a different pgAdmin.
I tried exporting a DB with pgAdmin III, but when I tried to execute the SQL in another pgAdmin it threw errors in the SQL; when I tried to "restore" a backup file, it said there's a difference in versions so it can't do the import/restore.
So is there a "safe" way to export a DB into standard SQL that can be executed plainly in the pgAdmin SQL editor, regardless of which version it is?
Don't try to use PgAdmin-III for this. Use pg_dump and pg_restore directly if possible.
Use the version of pg_dump from the destination server to dump the origin server. So if you're going from (say) 8.4 to 9.2, you'd use 9.2's pg_dump to create a dump. If you create a -Fc custom format dump (recommended) you can use pg_restore to apply it to the new database server. If you made a regular SQL dump you can apply it with psql.
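A minimal sketch of that workflow (the host, user, and database names are placeholders), using the destination's newer pg_dump against the origin server:
# dump the origin database in custom format
pg_dump -h oldhost -U myuser -Fc -f mydb.dump mydb
# restore into the destination server
pg_restore -h newhost -U myuser -d mydb mydb.dump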
See the manual on upgrading your PostgreSQL cluster.
Now, if you're trying to downgrade, that's a whole separate mess.
You'll have a hard time creating an SQL dump that'll work in any version of PostgreSQL. Say you created a VIEW that uses a WITH query. That won't work when restored to PostgreSQL 8.3 because it didn't support WITH. There are tons of other examples. If you must support old PostgreSQL versions, do your development on the oldest version you still support and then export dumps of it for newer versions to load. You cannot sanely develop on a new version and export for old versions, it won't work well if at all.
More troubling, developing on an old version won't always give you code that works on the new version either. Occasionally new keywords are added when support for new specification features is introduced. Sometimes issues are fixed in ways that affect user code. For example, if you were to develop on the (ancient and unsupported) 8.2, you'd have lots of problems with implicit casts to text on 8.3 and above.
Your best bet is to test on all supported versions. Consider setting up automated testing using something like Jenkins CI. Yes, that's a pain, but it's the price for software that improves over time. If Pg maintained perfect backward and forward compatibility it'd never improve.
Export/Import with pg_dump and psql
1. Set PGPASSWORD
export PGPASSWORD='123123123';
2. Export the DB with pg_dump
pg_dump -h <<host>> -U <<username>> <<dbname>> > /opt/db.out
/opt/db.out is the dump path; you can specify your own.
3. Then set PGPASSWORD again for your other host. If the host or the password is the same, this is not required.
4. Import the DB on your other host
psql -h <<host>> -U <<username>> -d <<dbname>> -f /opt/db.out
If the username is different, find and replace it with your local username in the db.out file, and make sure only the username is replaced, not data.
If you still want to use pgAdmin, see the procedure below.
Export DB with PGAdmin:
Select DB and click Export.
File Options
Name the DB file for your local directory
Select Format - Plain
Ignore Dump Options #1
Dump Options #2
Check Use Insert Commands
Objects
Uncheck any tables you don't want
Import DB with PGAdmin:
Create New DB.
With the DB still selected, click Menu -> Plugins -> PSQL Console
Type the following command to import the DB:
\i /path/to/db.sql
If you want to export schema and data separately:
Export Schema
File Options
Name the schema file at your local directory
Select Format - Plain
Dump Options #1
Check Only Schema
Check Blobs (By default checked)
Export Data
File Options
Name the data file at your local directory
Select Format - Plain
Dump Options #1
Check Only Data
Check Blobs (By default checked)
Dump Options #2
Check Use Insert Commands
Check Verbose messages (By default checked)
Note: export/import takes time depending on DB size, and pgAdmin adds some extra overhead.