Database duplication - SQL

I have a database in phpMyAdmin. I am trying to export my tables in SQL format, but the export file is generated with nothing in it. When I export to CSV or JSON format instead, the data does get saved; however, when I then import the CSV or JSON file, the import fails without giving any error. I can't understand why.
I want to duplicate my complete database from one system to another. How can I do so?
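A common way to duplicate a MySQL/MariaDB database between systems is to dump it from the command line instead of through phpMyAdmin. A minimal sketch, assuming shell access on both machines and a database named `mydb` (a placeholder):

```shell
# On the source system: dump schema and data, including stored
# routines and triggers, into a single SQL file.
# --databases adds CREATE DATABASE / USE statements, so the dump
# recreates the database itself on import.
mysqldump -u root -p --databases mydb --routines --triggers > mydb_dump.sql

# Copy mydb_dump.sql to the target system (scp, USB stick, etc.),
# then replay it there:
mysql -u root -p < mydb_dump.sql
```

If the phpMyAdmin SQL export keeps coming out empty, a command-line dump also sidesteps the PHP upload/timeout limits that often silently truncate browser-based exports.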

Related

Django Docker Postgresql database import export

I have a problem with a PostgreSQL database. I just want to be able to export and import the database as a .sql file. Using the command line I can produce a .sql dump, but when I try to import it back, the database reports syntax errors in the SQL file. I could try to fix the syntax errors by hand, but I don't want to solve the problem that way.
I need an exported SQL file without any errors that I can import without any issue. What could the problem be?
PostgreSQL version: postgres:14.6
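Since this is a Docker setup, the dump and restore can both run inside the container. A sketch, assuming the Postgres service/container is called `db` and both the user and database are `postgres` (all placeholders):

```shell
# Dump from inside the running container. --clean --if-exists makes
# the dump drop objects before recreating them, so re-imports are
# repeatable.
docker exec -t db pg_dump -U postgres -d postgres --clean --if-exists > dump.sql

# Restore: feed the plain-text dump to psql inside the container.
# ON_ERROR_STOP makes psql abort at the first real error instead of
# printing a cascade of follow-on "syntax errors".
docker exec -i db psql -U postgres -d postgres -v ON_ERROR_STOP=1 < dump.sql
```

One frequent cause of "syntax errors" on import is a version mismatch: a dump produced by a newer `pg_dump` can contain settings an older server does not understand, so it is worth checking that the `pg_dump` client and the 14.6 server versions match.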

Restoring DB into pgAdmin4 does not accept my backup file format (.sql)

I have exported a copy of my local Postgres DB from PhpPgMyAdmin, and the resulting file is a fairly normal .sql file which contains both the structure and the data of the DB.
Then I tried to import this structure and data into a remote DB connected via PgAdmin4, but I am getting this message:
pg_restore: error: input file appears to be a text format dump. Please
use psql.
I tried to look for an online file converter to change the format from .sql to .psql, but I couldn't find any, and in any case pgAdmin4 should accept .sql files, which confuses me.
Any ideas?
Issue solved.
Under "Tools", select "Query Tool" and run your .sql file there.
For the query to run, I also had to change the OWNER of the DB to the user of the DB into which I wanted to import the data.
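The error itself explains the situation: `pg_restore` only reads the non-text dump formats (custom, directory, tar), while a plain `.sql` file is restored with `psql`. A sketch, with host, user, and database names as placeholders:

```shell
# A plain-text .sql dump is replayed with psql, not pg_restore.
psql -h remote.example.com -U myuser -d mydb -f backup.sql

# pg_restore is only for the non-text formats, e.g. a custom-format
# dump created with:  pg_dump -Fc -f backup.dump mydb
pg_restore -h remote.example.com -U myuser -d mydb backup.dump
```

No file conversion is needed; the `.sql`/`.psql` extension is irrelevant, only the dump format matters.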

Exporting data in CSV file format from SQL?

I am trying to export a data table from SQL Server to the .CSV file format, but I could not proceed any further after a warning, and I don't know how to fix it. I need to get past this for my job.
Write a query to retrieve the data you want to export.
Right-click on the result grid and choose "Save Results As..." to save it as a CSV file.
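If the SSMS route keeps failing, the same export can be scripted with the `bcp` utility that ships with SQL Server. A sketch, with the query, server, and credentials as placeholders:

```shell
# Export a query result to a CSV file with bcp.
#   queryout  = export the result of an ad-hoc query
#   -c        = character (text) mode
#   -t,       = use a comma as the field terminator
bcp "SELECT id, name FROM mydb.dbo.customers" queryout customers.csv \
    -c -t, -S localhost -U sa -P 'YourPassword'
```

Note that `bcp` does not emit a header row; if the consumer needs one, prepend it to the file afterwards.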

Uploading a CSV in PhpMyAdmin

I am trying to upload a .csv file in phpMyAdmin, without success.
First way: selecting the current database, directly importing the zipped file, selecting CSV under "Format of imported file", and , as the delimiter.
I get no error, but when the upload completes nothing happens and the page remains white and empty.
Second way: I created a table with the same number of columns (43) as the CSV file, then imported the CSV file using the same configuration as before.
I get an error saying that the number of fields in the imported CSV is not valid, even though it is.
What am I doing wrong?
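Before blaming phpMyAdmin, it is worth verifying that every row of the file really has the same field count: a single stray comma or an unquoted delimiter in one row is enough to trigger exactly this "invalid number of fields" error. A quick check with awk, using a tiny sample file for illustration (the real file would have 43 columns):

```shell
# Build a small sample CSV; the last row is deliberately short.
printf 'a,b,c\n1,2,3\n4,5\n' > sample.csv

# Compare each row's field count with the header's; mismatching rows
# are the ones phpMyAdmin rejects.
awk -F',' 'NR==1 {n=NF} NF!=n {print "line " NR " has " NF " fields"}' sample.csv
# prints: line 3 has 2 fields
```

Caveat: this naive split does not honour quoted fields that themselves contain commas, so a clean result from awk on a file with quoted text fields is not conclusive.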

BigQuery export table to csv file

I'm trying to export a BigQuery table from the UI to Google Cloud Storage but I'm getting this error:
Errors:
Table gs://mybucket/delta.csv.gz too large to be exported to a single file. Specify a uri including a * to shard export. (error code: invalid)
When trying to export after running a query, I got:
Download Unavailable This result set contains too many rows to download. Please use "Save as Table" and then export the resulting table.
Finally found how to do it: you must use "*" in the blob name, and BigQuery will create as many files as needed.
It's weird that I can import a large file (~GB) but can't export a large file :(
BigQuery can export up to 1 GB of data per file.
For exports larger than 1 GB, BigQuery supports sharding the output across multiple files.
See "Single wildcard URI" and "Multiple wildcard URIs" in Exporting data into one or more files.
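The same sharded export can be done from the command line with the `bq` tool; the dataset, table, and bucket names below are placeholders. The `*` in the destination URI is replaced by a sequence number for each output shard:

```shell
# Export a table to Cloud Storage as sharded, gzip-compressed CSV.
# The wildcard expands to delta-000000000000.csv.gz,
# delta-000000000001.csv.gz, and so on.
bq extract \
    --destination_format CSV \
    --compression GZIP \
    'mydataset.mytable' \
    'gs://mybucket/delta-*.csv.gz'
```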