How can I most simply automate an Oracle query in Windows?

I need to run the same damn Oracle query daily (and export the output, which I can do from the .sql file itself). I'd like to automate it with Windows Task Scheduler, but the scheduler only opens the script; it doesn't run it.
Is this feasible or is there an easier way?

Your description isn't very detailed (we don't know how exactly you did that), but here's a suggestion:
create a .bat script (let's call it a.bat) which will call your .sql script (let's call it a.sql). Only one line in it:
sqlplus scott/tiger@orcl @a.sql
a.sql is ... well, whatever your query is, for example:
prompt Number of rows in EMP table
select count(*) from emp;
Create a job in Task Scheduler which will start a program (a.bat).
I've just tried it (using that simple example I posted above) and got the result (saying that there are 14 rows in the table), so I hope you'll get it as well.
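If Task Scheduler keeps opening the file instead of running it, make sure the task's action points at a.bat itself, not at a.sql. You can also create the task from the command line with schtasks; a minimal sketch (the task name, path and start time are assumptions, adjust them to your setup):
schtasks /create /tn "DailyOracleQuery" /tr "C:\scripts\a.bat" /sc daily /st 06:00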

Sequential Teradata Queries

I have a collection of SQL queries that need to run in a specific order using Teradata. How can this be done?
I've considered writing an application in another language (like Python or C++) to call each query sequentially, but I'm unsure how to get live data there from Teradata. I also want to keep the queries as separate SQL files (as they are currently).
The goal is to minimize the need for human interaction, i.e. I want to hit "Run" and let it take care of the rest.
BTEQ scripts are your go-to solution.
Put each query, or at least each logical block of several statements, into a single BTEQ script.
Then create a script that calls BTEQ with the needed settings (i.e. the TD logon command), and have that script be called from a batch file with parameters, like this:
start /wait C:\Teradata\BTEQ.bat Script_1.txt
start /wait C:\Teradata\BTEQ.bat Script_2.txt
start /wait C:\Teradata\BTEQ.bat Script_3.txt
pause
Then you can create several batch files, split into logical blocks, and have them executed at will or on a schedule.
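For reference, a minimal sketch of the two pieces (the TDPID, credentials, object names and log layout are assumptions). Script_1.txt would be a BTEQ script along these lines:
.LOGON tdprod/your_user,your_password;
SELECT COUNT(*) FROM your_db.your_table;
.LOGOFF;
.QUIT;
and BTEQ.bat just feeds whichever script it is given into the bteq executable, capturing output and errors into a per-script log:
bteq < %1 > %1.log 2>&1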

How to automatically update a SQL database

So I'm working with a professor who wants me to create a SQL database containing information from CSV files (from the New York City Department of Transportation). I've written a program that takes the CSV file and converts it into the appropriate SQL commands. So my question is: how do I automate this so that every 5 minutes or so a program downloads the new CSV file, runs it through my CSV-to-SQL command program, and then enters the output into the terminal (which is what I use to interface with my SQL database)? Is there a specific language I should look into? I've seen people talk about cron.
cron is a reference to making a scheduled task under Unix; the Windows equivalent is to set up a task using Task Scheduler.
I don't know that there's a pure SQL answer to your problem--probably the way I'd approach it is by writing a simple Import program in the language of your choice, compiling it down to an .EXE, then setting up a Task Scheduler command to run the program every 5 minutes. (Alternately, you could leave the app up all the time and simply let a timer execute every 5 minutes to trigger the import).
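That Task Scheduler setup can also be created from the command line. A minimal sketch (the task name and program path are made up; /sc minute /mo 5 gives the 5-minute interval):
schtasks /create /tn "CsvImport" /tr "C:\tools\import.exe" /sc minute /mo 5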
The simplest thing you could do is loop your script.
If you're running PHP you could do something like:
$running = true;
while ($running)
{
    // Your code that downloads the CSV, converts it,
    // and saves it into SQL goes here.
    $running = $some_error ? false : true; // stop looping once an error occurs
    sleep(300); // sleep() takes seconds, so 300 = 5 minutes (5000 would be ~83 minutes)
}
I don't know what you're using, but mind the logic, not the language.
What you can do is use SQL Server Integration Services (SSIS). It's basically a workflow package built into SQL Server that will let you handle tasks like this. You should be able to import the CSV into a temp table and then run your appropriate queries against it (see the sketch below).
This is Azure specific but working against hosted SQL should be similar.
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/import-data-from-excel-to-sql
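Outside of SSIS, the same temp-table idea also works in plain T-SQL. A minimal sketch (the file path, columns and delimiters are assumptions about your CSV, not anything the link above prescribes):
CREATE TABLE #dot_staging (segment_id INT, speed FLOAT, recorded_at DATETIME);
BULK INSERT #dot_staging
FROM 'C:\data\dot_feed.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
-- now run your queries against #dot_staging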

SQL Server job to execute query from the output CSV file of first step

This is my first job-creation task as a SQL DBA. The first step of the job runs a query and sends the output to a .CSV file. As a last step, I need the job to execute the query from that .CSV file (the output of the first step).
I have Googled all possible combinations but no luck.
Your question got lost somehow ...
Your last two comments make it a little clearer.
If I understand it correctly, you create a SQL script which restores all the logins, roles and users, their rights etc. into a newly created db.
If this generated script runs in a query window, you can easily execute it with EXECUTE (https://msdn.microsoft.com/de-de/library/ms188332(v=sql.120).aspx)
Another approach could be SQLCMD (http://blog.sqlauthority.com/2013/04/10/sql-server-enable-sqlcmd-mode-in-ssms-sql-in-sixty-seconds-048/)
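If the first step really writes runnable T-SQL into that .CSV file, one way to execute it from a later job step is to read the file back in and EXECUTE it. A minimal sketch (the path is an assumption, and the SQL Server service account needs read access to it):
DECLARE @sql NVARCHAR(MAX);
SELECT @sql = BulkColumn
FROM OPENROWSET(BULK 'C:\jobs\step1_output.csv', SINGLE_CLOB) AS f;
EXECUTE (@sql);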
If you need further help, please come back with more details: What does your "CSV" look like? What have you tried so far?

PostgreSQL and queue commands

I would like to know if there is a way to queue my queries. I am doing some basic text matching in psql, and each query (which is saved in a different script) takes about 6 hours to run. I was wondering if there is a way to queue my scripts?
For example:
my database is called: data
my scripts are called: cancer, heart, death
and I am doing the following:
data=# \i cancer
data=# \i heart
data=# \i death
But I have to come back every so often to check whether it is still running, which doesn't seem very efficient.
I am new to PostgreSQL, so I appreciate any help.
This is the easiest/fastest solution I can think of, but it should work for your case ;)
When using psql from the command line, you can start it with
-f filename
where filename is a SQL script. It will run the queries and send the output to stdout. You can also redirect this to a file. Just put your queries into that SQL file and you've got your own queuing.
Assuming you run Linux, you could use screen to keep your session open when logging off for the night.
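Put together, a plain shell script can run the three back to back and capture the output. A sketch (it assumes the scripts are the files cancer, heart and death in the current directory, and a database named data):
psql -d data -f cancer > cancer.log 2>&1
psql -d data -f heart > heart.log 2>&1
psql -d data -f death > death.log 2>&1
Each psql call only starts once the previous one has finished, so the scripts are effectively queued; run the whole thing inside screen and you can log off while it works.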
The easiest solution was to create a separate SQL file which runs through the commands sequentially.
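That is, a master script that pulls the others in with \i, which psql then executes in order. A sketch (assuming the same script names; call the file run_all.sql):
\i cancer
\i heart
\i death
Start it once with psql -d data -f run_all.sql and come back when it's done.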

Schedule Oracle Reports and export to XML

I have a report that I need to run every day @ 00:00 and export all the information from the table to a specific location with a specific name.
Example:
select * from my_table
where date between SYSTIMESTAMP -2 and SYSTIMESTAMP -1
and to export this to the file date.xml.
Is this possible from Oracle SQL Developer or do I need other tools?
No Oracle version was given, so I'll assume 10 or 11.
To schedule your process, you just have to create a job. The job has to run your script (which can be a function or a stored procedure).
Here is the documentation:
http://docs.oracle.com/cd/B28359_01/server.111/b28310/scheduse.htm#i1033533
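A minimal sketch of such a job via DBMS_SCHEDULER (DAILY_XML_EXPORT and MY_EXPORT_PROC are placeholder names; point job_action at whatever procedure performs your export):
BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name        => 'DAILY_XML_EXPORT',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'MY_EXPORT_PROC',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=0; BYMINUTE=0',
    enabled         => TRUE);
END;
/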
To write to a file you can use the spool command in SQL*Plus. Here you can find the documentation: http://docs.oracle.com/cd/B19306_01/server.102/b14357/ch12043.htm
It's really simple to use.
spool /path/filename
your query here
spool off
Obviously, the machine from which you run the script must have write permissions on the machine where you're going to write the file (I mention this because I often forget to check).
Creating an XML file is slightly more complex and a little too long to explain here, but there is a nice post on the Oracle community that explains it with a simple, practical example: https://community.oracle.com/thread/714758?start=0&tstart=0
If you do not want to use an Oracle job, you can write a .sql file containing the connection commands, the spool command and your query, and schedule it on the server machine as a simple sqlplus command.
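Tying the pieces together, a sketch of such a file (call it export.sql; the credentials, path and the run_date column are assumptions, since date itself is a reserved word in Oracle and your real column presumably has another name):
set long 1000000
set heading off
set feedback off
spool /exports/date.xml
select dbms_xmlgen.getxml(
  'select * from my_table where run_date between systimestamp - 2 and systimestamp - 1')
from dual;
spool off
exit
The OS scheduler can then launch it daily at 00:00 with:
sqlplus -s scott/tiger@orcl @export.sql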