Export automatically - SQL

I work at a company, and I need to export the result of one SQL query to CSV every month. I need to save this file in a folder on the company's server. I work with Oracle (SQL Developer).
Is this possible?
Do you have any ideas or a way to solve my problem?

"Every month" leads to a scheduled job - use DBMS_SCHEDULER package to create it (or, if you're on older database versions, see DBMS_JOB).
"CSV file" leads to usage of a stored procedure and UTL_FILE package.
At the end, you'd have a job which periodically calls the stored procedure which - using UTL_FILE - creates a CSV file in a directory on the database server.
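Put together, a minimal sketch might look like this (the directory object EXPORT_DIR, the table, and the columns are placeholders; the DIRECTORY object must be created by a DBA and point at a folder on the database server):

CREATE OR REPLACE PROCEDURE export_to_csv IS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- EXPORT_DIR is an Oracle DIRECTORY object mapped to a server folder
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'monthly_report.csv', 'W');
  FOR r IN (SELECT id, name FROM my_table) LOOP
    UTL_FILE.PUT_LINE(l_file, r.id || ',' || r.name);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'MONTHLY_CSV_EXPORT',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'EXPORT_TO_CSV',
    repeat_interval => 'FREQ=MONTHLY; BYMONTHDAY=1',
    enabled         => TRUE);
END;
/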

You can configure your Oracle server (NOT the client) to write the file with the UTL_FILE package.
If you want something quicker, you can do it with the SPOOL command in a .sql script.
There are some caveats - for example, you may want to redirect terminal output to /dev/null - but it is easier.
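A minimal sketch of such a script, run from SQL*Plus (the path, table, and columns are placeholders):

SET HEADING OFF
SET FEEDBACK OFF
SET TERMOUT OFF
SPOOL /path/on/server/report.csv
SELECT id || ',' || name FROM my_table;
SPOOL OFF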


Import a text file into a table in Db2

The text file contains data like:
1,'name',34
2,'name1',23
If you have Access Client Solutions, you can use the File Transfer function to upload the file.
You can also upload directly from Excel if the file is open:
https://www.ibm.com/support/pages/transferring-data-excel-using-access-client-solutions
When the Db2-server runs on Linux/Unix/Windows, you can CALL a stored procedure to do import or load.
BUT, the file to be imported or loaded must already be on the Db2-server, or on a file-system that the Db2-server process can read. So any filenames are relative to the Db2-server (not to your workstation, unless of course the Db2-server runs on your workstation).
If the target table already exists, the connected-userid needs suitable permissions on that table. If the target table does not exist, you need to create it first.
Also the userid needs execute permission on the stored procedure that does the work.
So there are three steps:
1) Copy the file to be imported (or loaded) to a location that the Db2-server can read.
2) Call the ADMIN_CMD stored procedure with parameters telling it what to do - in this case, to import a file.
3) Examine the result-set of the stored procedure to see what happened. If the import or load failed, you need to run the SQL listed in the MSG_RETRIEVAL column of the result-set to see why it failed (assuming you used the MESSAGES ON SERVER option to import).
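For step 2, a minimal sketch of the call (the file path and table name are placeholders; MESSAGES ON SERVER enables the diagnostics described in step 3):

CALL SYSPROC.ADMIN_CMD(
  'IMPORT FROM /tmp/data.txt OF DEL MESSAGES ON SERVER INSERT INTO mytable');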
See the Db2 documentation online for import or load.
There are also many examples here on stackoverflow.
So do your research and learn.
On Db2 11.5 you can use an EXTERNAL TABLE to import a text file into Db2.
Use the REMOTESOURCE YES option if the file is on your client machine and not in a directory visible to the database server.
https://www.ibm.com/support/knowledgecenter/en/SSEPGG_11.5.0/com.ibm.db2.luw.sql.ref.doc/doc/r_create_ext_table.html?pos=2
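A transient external-table sketch that would match the sample rows above (the file path and column definitions are assumptions):

INSERT INTO mytable
SELECT * FROM EXTERNAL '/home/me/data.txt'
  (id INT, name VARCHAR(20), age INT)
  USING (DELIMITER ',' REMOTESOURCE YES);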

How to execute SQL queries from text files

I am using Aqua Data Studio 7.0.39 for my database work.
I have 20 SQL files (all containing SQL statements, obviously).
I want to execute all of them rather than copy-pasting the contents of each one.
Is there any way in Aqua to do such a thing?
Note: I am using Sybase.
Thank you!
I'm not sure how to do this in Aqua either, but it's very simple to create a batch/PowerShell script to execute .sql files.
You can use the SAP/Sybase isql utility to execute the files, and just create a loop to cover all the files you wish to execute.
Check my answer here for more information:
Running bulk of SQL Scripts in Sybase through batch
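For example, a minimal Windows batch sketch (the server name and credentials are placeholders; isql's -i flag reads a script file and -o writes its output):

@echo off
REM run every .sql file in the current folder through isql
for %%f in (*.sql) do (
  isql -S MYSERVER -U myuser -P mypassword -i "%%f" -o "%%f.out"
)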
In the latest versions of ADS there is an integrated shell named FluidShell where you can achieve what you are looking for. See an overview here: https://www.aquaclusters.com/app/home/project/public/aquadatastudio/wikibook/Documentation15/page/246/FluidShell
The command you are looking for is source
source
NAME
source - execute commands or SQL statements from a file
SYNOPSIS
source [OPTION...] FILE [ARGUMENT...]
source [OPTION...]
DESCRIPTION
Read and execute commands or SQL statements from FILE in the current shell environment.
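So, once your scripts sit in one folder, each could be run from FluidShell with something like (the path is a placeholder):

source /path/to/scripts/script01.sql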
I have not used AquaFold before, so I can't tell you exactly. However, I have tackled a similar problem once before.
I once created a PowerShell script. It opened an ODBC connection to my database and then executed stored procedures in a loop until the end of a file.
I suggest having a text document with each line being the name of a stored procedure to run. Then, in your PowerShell script, read in a line from the file and concatenate it into the call that executes the stored procedure. After each execution completes, you can delete the line from the text file and then read the next line until the end of the file is reached.
Hope this helps. If I have some time this morning I will try and do a working example for you and post it.
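A rough sketch of that approach (the DSN, credentials, and file name are hypothetical):

# procs.txt lists one stored-procedure name per line
$conn = New-Object System.Data.Odbc.OdbcConnection("DSN=MyDsn;UID=myuser;PWD=mypassword")
$conn.Open()
foreach ($proc in Get-Content procs.txt) {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "exec $proc"
    [void]$cmd.ExecuteNonQuery()
}
$conn.Close()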

Schedule Oracle Reports and export to XML

I have a report that I need to run every day at 00:00 and export all the information from the table to a specific location with a specific name.
Example:
select * from my_table
where date between SYSTIMESTAMP -2 and SYSTIMESTAMP -1
and I need to export this to a file named date.xml.
Is this possible from Oracle SQL Developer or do I need other tools?
You didn't mention your Oracle version, so I'll assume 10 or 11.
To schedule your process you just have to create a job and schedule it. The job has to run your script (which can be a function or a stored procedure).
Here is the documentation:
http://docs.oracle.com/cd/B28359_01/server.111/b28310/scheduse.htm#i1033533
To write to a file you can use the spool command in SQL. Here you can find the documentation: http://docs.oracle.com/cd/B19306_01/server.102/b14357/ch12043.htm
It's really simple to use.
spool /path/filename
your query here
spool off
Obviously, the machine from which you run the script must have write permission on the location where you're going to write the file (I mention this because I often forget to check).
Creating an XML file is slightly more complex and a little long to explain here, but there is a nice post on the Oracle community that explains it with a simple, practical example: https://community.oracle.com/thread/714758?start=0&tstart=0
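As a rough sketch of one way to do it, using the built-in DBMS_XMLGEN package and writing the result with UTL_FILE (the EXPORT_DIR directory object, table, and date column are placeholders):

DECLARE
  l_xml  CLOB;
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- my_table and date_col stand in for your table and date column
  l_xml := DBMS_XMLGEN.GETXML(
    'SELECT * FROM my_table WHERE date_col BETWEEN SYSTIMESTAMP - 2 AND SYSTIMESTAMP - 1');
  -- file name carries the date, e.g. 20240101.xml
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', TO_CHAR(SYSDATE, 'YYYYMMDD') || '.xml', 'W');
  UTL_FILE.PUT_LINE(l_file, l_xml);  -- for results over 32k, write the CLOB in chunks
  UTL_FILE.FCLOSE(l_file);
END;
/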
If you do not want to use a job in Oracle, you can write a .sql file with the connection commands, the spool command, and your query, and schedule it on the server machine as a simple sqlplus command.
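On a Unix server, that could be a cron entry like this (the credentials, connect string, and paths are placeholders):

# crontab entry: run every day at 00:00
0 0 * * * sqlplus -s myuser/mypassword@MYDB @/home/me/export_report.sql > /dev/null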

How to Export data to Excel in SQL Server using SQL Jobs

I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) located in a shared folder on my network. The situation is as follows:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that contains the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio. I'm clueless about how to solve this situation. First, I'm not sure how to export the data using jobs, because every way I tried had issues with the OLEDB connection. The 'sp_makewebtask' procedure is also not available in SQL 2008. And I'm also confused about how to dynamically generate the names of the files.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads records from that temporary table and writes them to a file.
3) Create a SQL job that executes step 1 and step 2 sequentially.
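For step 2, one common sketch is to shell out to the bcp utility (xp_cmdshell must be enabled; the share path and table are placeholders), with a timestamp in the file name to keep each 2-minute export distinct:

DECLARE @file VARCHAR(200);
DECLARE @cmd  VARCHAR(500);
-- build a timestamped file name, e.g. data_20240101_120000.xls
SET @file = '\\myshare\exports\data_'
          + CONVERT(VARCHAR(8), GETDATE(), 112) + '_'
          + REPLACE(CONVERT(VARCHAR(8), GETDATE(), 108), ':', '') + '.xls';
-- bcp writes delimited text (not a native Excel workbook), but Excel opens it;
-- -c = character mode, -T = Windows authentication
SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout "' + @file + '" -c -T';
EXEC master..xp_cmdshell @cmd;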
I found a better way. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using a SQL Server Agent job. This is a neater and cleaner solution, as I found.

How do I import a sql data file into SQL Server?

I have a .sql file and I am trying to import it into SQL Server 2008. What is the proper way to do this?
If your file is a large file, 50 MB+, then I recommend you use sqlcmd, the command-line utility that comes bundled with SQL Server. It is easy to use and it handles large files well. I tried it yesterday with a 22 GB file using the following command:
sqlcmd -S SERVERNAME\INSTANCE_NAME -i C:\path\mysqlfile.sql -o C:\path\output_file.txt
The command above assumes that your server name is SERVERNAME, that your SQL Server installation uses the instance name INSTANCE_NAME, and that Windows authentication is the default auth method. After execution, output_file.txt will contain something like the following:
...
(1 rows affected)
Processed 100 total records
(1 rows affected)
Processed 200 total records
(1 rows affected)
Processed 300 total records
...
Use readfileonline.com if you need to see the contents of huge files.
UPDATE
This link provides more command line options and details such as username and password:
https://dba.stackexchange.com/questions/44101/importing-sql-server-database-from-a-sql-file
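For instance, a variant of the command above using SQL Server authentication instead of Windows authentication (the username and password are placeholders):

sqlcmd -S SERVERNAME\INSTANCE_NAME -U myusername -P mypassword -i C:\path\mysqlfile.sql -o C:\path\output_file.txt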
If you are talking about an actual database (an .mdf file), you would attach it.
.sql files are typically run using SQL Server Management Studio. They are basically saved SQL statements, so they could be anything. You don't "import" them; more precisely, you "execute" them, even though the script may indeed insert data.
Also, to expand on Jamie F's answer, don't run a SQL file against your database unless you know what it is doing. SQL scripts can be as dangerous as unchecked .exe files.
Start SQL Server Management Studio
Connect to your database
File > Open > File and pick your file
Execute it
Try this process -
Open the Query Analyzer
Start --> Programs --> MS SQL Server --> Query Analyzer
Once opened, connect to the database that you wish to run the script on.
Next, open the SQL file using the File --> Open option. Select the .sql file.
Once it is open, you can execute the file by pressing F5.
In order to import your .sql file, try the following steps:
Start SQL Server Management Studio
Connect to your Database
Open the Query Editor
Drag and Drop your .sql File into the editor
Execute the import
A .sql file is a set of commands that can be executed against the SQL server.
Sometimes the .sql file will specify the database, other times you may need to specify this.
You should talk to your DBA or whoever is responsible for maintaining your databases. They will probably want to give the file a quick look. .sql files can do a lot of harm, even inadvertently.
See the other answers if you want to plunge ahead.
Get the names of the server and database in SSMS:
Run the following command in PowerShell or CMD:
sqlcmd -S "[SERVER NAME]" -d [DATABASE NAME] -i .\[SCRIPT].sql
There is no such thing as "importing" in MS SQL Server, but I understand what you mean. It is very simple: whenever you have a something.sql file, just double-click it and it will open directly in SQL Server Management Studio, where you can execute it.