How can I run .SQL files from other .SQL files using MonetDB?
I went through the whole documentation but had no success finding it.
For instance, in MySQL I could do something like:
parent.sql
use mydatabase;
source child1.sql
source child2.sql
child1.sql
SELECT * from Products;
child2.sql
SELECT * from Orders;
How can I do that (or something similar) using MonetDB?
To dump the SQL database, start the MonetDB SQL Client program and type the command
\>...\databasedump.sql
\D
\>
The path after \> should be an absolute path name (i.e. start with a drive letter) and point to a safe location. By default the database is located in the demo folder inside dbfarm under %APPDATA%\MonetDB5 (i.e. %APPDATA%\MonetDB5\dbfarm\demo). After the dump has been made, the database can be removed.
Restoring the SQL database can be done using the MonetDB SQL Client program with the following command
\<...\databasedump.sql
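To tie this back to the original question: the same \< meta-command reads any .sql file into the current MonetDB SQL Client (mclient) session, which is the closest equivalent to MySQL's source. A minimal sketch with hypothetical paths (whether \< lines can themselves be nested inside another script may depend on the client version):
sql> \<C:\scripts\child1.sql
sql> \<C:\scripts\child2.sql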
Source:
https://www.monetdb.org/Documentation/UserGuide/DumpRestore
Related
I work in a company, and I need to export the result of one SQL query to CSV every month. I need to save this file in a folder on the company's server. I work with Oracle (SQL Developer).
Is this possible?
Do you have any ideas or a way to resolve my problem?
"Every month" leads to a scheduled job - use DBMS_SCHEDULER package to create it (or, if you're on older database versions, see DBMS_JOB).
"CSV file" leads to usage of a stored procedure and UTL_FILE package.
At the end, you'd have a job which periodically calls the stored procedure which - using UTL_FILE - creates a CSV file in a directory on the database server.
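A minimal sketch of that setup, assuming a directory object named EXPORT_DIR already exists on the database server and using hypothetical table and column names:
CREATE OR REPLACE PROCEDURE export_monthly_csv AS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- EXPORT_DIR is an assumed Oracle directory object pointing to a server folder
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'monthly_export.csv', 'w');
  FOR r IN (SELECT order_id, amount FROM orders) LOOP  -- hypothetical query
    UTL_FILE.PUT_LINE(l_file, r.order_id || ',' || r.amount);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'MONTHLY_CSV_EXPORT',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'EXPORT_MONTHLY_CSV',
    repeat_interval => 'FREQ=MONTHLY; BYMONTHDAY=1',
    enabled         => TRUE);
END;
/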
You can configure your Oracle server (NOT the client) to write files with the UTL_FILE tool.
If you want something quicker, you can do it with the SPOOL command in a .sql script.
There are a few caveats, like having to redirect the output to > /dev/null, for example, but it is easier.
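A rough sketch of the SPOOL approach, run through SQL*Plus and typically scheduled with cron (the file paths, query, and connection details are placeholders):
REM export.sql -- invoked e.g. as: sqlplus -s scott/tiger@mydb @export.sql > /dev/null
SET PAGESIZE 0
SET FEEDBACK OFF
SET HEADING OFF
SPOOL /u01/exports/monthly_export.csv
SELECT order_id || ',' || amount FROM orders;
SPOOL OFF
EXIT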
I am trying to import/copy my CSV file into PostgreSQL. However, I am encountering the errors below. I don't have import/write permissions to the file. Will stdin help, and how? The Postgres docs provide no examples. I was therefore asked to do a bulk insert instead, but since there are too many columns with mixed data types, I am not sure how to proceed with that either.
Command to copy the csv file:
COPY sales.sales_tickets
FROM 'C:/Users/Nandini/Downloads/AIG_Sales_Tickets.csv'
DELIMITER ',' CSV;
ERROR: must be superuser to COPY to or from a file
Hint: Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone.
1 statement failed.
The bulk-insert alternative takes too much time:
insert into sales.sales_ticket values (1,'2',3,'4','5',6,7,8,'9','10','11');
Please suggest. Thank you.
From the PostgreSQL documentation on COPY:
COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
and
Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. Similarly, the command specified with PROGRAM is executed directly by the server, not by the client application, must be executable by the PostgreSQL user. COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
You're trying to use the COPY command while violating two of its requirements:
You're trying to execute the COPY command from a non-super user.
You're trying to read a file on your client machine, and have it copied to the server.
This won't work. If you need to perform such a COPY, you need to:
Copy the CSV file to the server; to a directory that can be read by the (system) user executing the PostgreSQL server process.
Execute the COPY command from a superuser account.
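For example, a sketch of the server-side variant, assuming the file has been copied to a path readable by the PostgreSQL service account (the path below is hypothetical) and that you are connected as a superuser:
COPY sales.sales_tickets
FROM '/var/lib/postgresql/import/AIG_Sales_Tickets.csv'
DELIMITER ',' CSV;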
Alternative
If you can't do some of these, you can always use a tool such as pgAdmin 4 and use its Import/Export functionality.
See also How to import CSV file data into a PostgreSQL table?
Your case is an ideal one for psql's \copy, not COPY. Note that \copy is a psql meta-command, so the whole command goes on a single line:
\copy sales.sales_tickets FROM 'C:/Users/Nandini/Downloads/AIG_Sales_Tickets.csv' DELIMITER ',' CSV
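If you prefer a one-liner from the shell, the same thing can be run non-interactively through psql (the database name is a placeholder); since \copy reads the file on the client side, no superuser rights are needed:
psql -d mydatabase -c "\copy sales.sales_tickets FROM 'C:/Users/Nandini/Downloads/AIG_Sales_Tickets.csv' DELIMITER ',' CSV"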
I have created a table in Hive called "sample" and loaded a CSV file "sample.txt" into it.
Now I need that data from "sample" in my local file /opt/zxy/sample.txt.
How can I do that?
Hortonworks' Sandbox lets you do it through its HCatalog menu. Otherwise, the syntax is
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/c' SELECT a.* FROM b
as per the Hive language manual.
Since your intention is just to copy the entire file from HDFS to your local FS, I would not suggest doing it through a Hive query, for the following reasons:
It'll start a MapReduce job, which will take more time than a plain copy.
It'll create file(s) with different names (000000_0, 000001_0 and so on), which will require you to rename the file manually afterwards.
You might face problems opening these files, as they have no extension. Your OS would be unable to choose an application to open them on its own. In such a case you either have to rename the file or manually select an application to open it.
To avoid these problems you could use the HDFS get command:
bin/hadoop fs -get /user/hive/warehouse/sample/sample.txt /opt/zxy/sample.txt
Simple and easy. But if you need to copy only some selected data, then you have to use a Hive query.
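For instance, a sketch that exports only selected rows (the filter column is hypothetical):
INSERT OVERWRITE LOCAL DIRECTORY '/opt/zxy/sample_subset'
SELECT * FROM sample WHERE id > 100;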
HTH
I usually run my query directly through Hive on the command line for this kind of thing, and pipe it into the local file like so:
hive -e 'select * from sample' > /opt/zxy/sample.txt
Hope that helps.
Readers who are accessing Hive from Windows OS can check out this script on Github.
It's a Python+paramiko script that extracts Hive data to local Windows OS file-system.
I have a .sql file and I am trying to import it into SQL Server 2008. What is the proper way to do this?
If your file is a large file, 50MB+, then I recommend you use sqlcmd, the command line utility that comes bundled with SQL Server. It is easy to use and it handles large files well. I tried it yesterday with a 22GB file using the following command:
sqlcmd -S SERVERNAME\INSTANCE_NAME -i C:\path\mysqlfile.sql -o C:\path\output_file.txt
The command above assumes that your server name is SERVERNAME, that your SQL Server installation uses the instance name INSTANCE_NAME, and that Windows authentication is the default auth method. After execution, output_file.txt will contain something like the following:
...
(1 rows affected)
Processed 100 total records
(1 rows affected)
Processed 200 total records
(1 rows affected)
Processed 300 total records
...
Use readfileonline.com if you need to see the contents of huge files.
UPDATE
This link provides more command line options and details such as username and password:
https://dba.stackexchange.com/questions/44101/importing-sql-server-database-from-a-sql-file
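For example, a sketch using SQL Server authentication instead of Windows auth (the user name, password, and database name are placeholders):
sqlcmd -S SERVERNAME\INSTANCE_NAME -U myuser -P mypassword -d mydatabase -i C:\path\mysqlfile.sql -o C:\path\output_file.txt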
If you are talking about an actual database (an .mdf file), you would attach it.
.sql files are typically run using SQL Server Management Studio. They are basically saved SQL statements, so they could be anything. You don't "import" them; more precisely, you "execute" them, even though the script may indeed insert data.
Also, to expand on Jamie F's answer, don't run a SQL file against your database unless you know what it is doing. SQL scripts can be as dangerous as unchecked executables.
Start SQL Server Management Studio
Connect to your database
File > Open > File and pick your file
Execute it
Try this process -
Open the Query Analyzer
Start --> Programs --> MS SQL Server --> Query Analyzer
Once it is open, connect to the database that you wish to run the script on.
Next, open the SQL file using the File --> Open option and select the .sql file.
Once it is open, you can execute the file by pressing F5.
In order to import your .sql file, try the following steps:
Start SQL Server Management Studio
Connect to your Database
Open the Query Editor
Drag and Drop your .sql File into the editor
Execute the import
A .sql file is a set of commands that can be executed against the SQL server.
Sometimes the .sql file will specify the database, other times you may need to specify this.
You should talk to your DBA or whoever is responsible for maintaining your databases. They will probably want to give the file a quick look. .sql files can do a lot of harm, even inadvertently.
See the other answers if you want to plunge ahead.
Get the names of the server and database in SSMS:
Run the following command in PowerShell or CMD:
sqlcmd -S "[SERVER NAME]" -d [DATABASE NAME] -i .\[SCRIPT].sql
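For example, with hypothetical server and database names, the filled-in command might look like this:
sqlcmd -S "MYSERVER\SQLEXPRESS" -d MyDatabase -i .\CreateTables.sql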
There is no such thing as importing in MS SQL Server, but I understand what you mean, and it is simpler than that: whenever you have a something.sql file, you can just double-click it and it will open directly in SQL Server Management Studio.
I have two backup files
1) is named 'backup.sql' with a bunch of SQL defining TABLES
2) is named 'backup' with a bunch of encoded data, which I believe are the ROWS
I need to restore these TABLES + ROWS, but all I am able to figure out is how to restore the tables.
Any tips on dealing with these files? It's the first time I've ever dealt with SQL Server.
The backup process would not create a file with actual SQL statements, it would create a binary file. So #1 is not a backup file (it's probably a script someone saved to re-create the schema).
I would try to use SQL Server Management Studio to restore the second file and see what happens. I don't think it will allow you to restore an invalid file, but I would take some basic precautions like backing up the system first.
What is the extension of the 'backup' file? Is the filename backup.bak? If you have a backup file created by SQL Server, then it 'should' contain the logic to create both the tables and restore the data, but it could depend on how the backup was created.
---Edit
It is possible for a .SQL file to contain data values as well as the logic to create the tables/columns for a database. I used to run backups of a MySQL database this way a long time ago... it is just not seen very often with SQL Server, since it has built-in backup/restore functionality.
It seems unlikely they would export all the rows from all tables into a CSV file, and given you said the data looks encoded, that makes me think it's your actual backup file.
Try this: save a copy of the "backup" file, rename it to backup.bak, and run this from SQL Server Management Studio:
restore filelistonly from disk='C:\backup.bak'
(assuming your file is saved on the root of the C: drive)
Any results/errors?
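If that command returns a list of logical file names, a full restore might look roughly like the following sketch (the database name and the logical/physical file names are placeholders you would take from the FILELISTONLY output):
restore database MyDatabase
from disk='C:\backup.bak'
with move 'MyDatabase_Data' to 'C:\Data\MyDatabase.mdf',
     move 'MyDatabase_Log' to 'C:\Data\MyDatabase_log.ldf'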