So I'm working with a professor who wants me to create a SQL database containing information from CSV files (from the New York City Department of Transportation). I've written a program that takes the CSV file and converts it into the appropriate SQL commands. My question is: how do I automate this so that every 5 minutes or so a program downloads the new CSV file, runs it through my CSV-to-SQL program, and then feeds the output into the terminal (which is what I use to interface with my SQL database)? Is there a specific language I should look into? I've seen people talk about cron.
cron refers to setting up a scheduled task under Unix; the Windows equivalent is to set up a task in Task Scheduler.
I don't know that there's a pure SQL answer to your problem. The way I'd probably approach it is by writing a simple import program in the language of your choice, compiling it to an .EXE, and then setting up Task Scheduler to run that program every 5 minutes. (Alternatively, you could leave the app running all the time and simply let a timer fire every 5 minutes to trigger the import.)
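If the machine doing the import is a Unix/Linux box, the cron side is a single crontab entry. As a rough sketch (the script path and log location are hypothetical), something like this runs an import script every five minutes and appends its output to a log:
*/5 * * * * /home/user/nyc_dot_import.sh >> /home/user/nyc_dot_import.log 2>&1
The script itself would just download the CSV, run it through your CSV-to-SQL converter, and pipe the resulting commands into your database's command-line client.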
The simplest thing you could do is loop your script.
If you're running PHP you could do something like:
$running = true;
while ($running)
{
    // Download the new CSV, convert it, and load it into SQL here.
    // Stop looping if the import reported an error.
    $running = !$some_error;
    // sleep() takes seconds, so wait 300 seconds (5 minutes), not 5000.
    sleep(300);
}
I don't know what you're using, but mind the logic rather than the language.
What you can do is use SQL Server Integration Services (SSIS). It's basically a workflow package built into SQL Server that lets you handle tasks like this. You should be able to import the CSV into a temp table and then run your appropriate queries against it.
This link is Azure-specific, but working against hosted SQL Server should be similar.
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/import-data-from-excel-to-sql
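If you just need the CSV loaded into a staging table without a full SSIS package, plain T-SQL can do that part too. A minimal sketch, assuming a hypothetical staging table and file path (adjust the terminators to match the actual CSV):
-- Load the downloaded CSV into a staging table (table name and path are hypothetical).
BULK INSERT dbo.NycDotStaging
FROM 'C:\data\nyc_dot_latest.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column separator in the CSV
    ROWTERMINATOR   = '\n',  -- one record per line
    FIRSTROW        = 2      -- skip the header row
);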
I need to run the same damn Oracle query daily (and export the output, which I can do from the .sql file itself). I'd like to automate this using Windows Task Scheduler, but it only opens the script; it doesn't run it.
Is this feasible or is there an easier way?
Your description doesn't tell us much (we don't know exactly how you did that), but here's a suggestion:
Create a .bat script (let's call it a.bat) which will call your .sql script (let's call it a.sql). It needs only one line:
sqlplus scott/tiger@orcl @a.sql
a.sql is ... well, whatever you want it to be, for example:
prompt Number of rows in EMP table
select count(*) from emp;
-- let sqlplus terminate so control returns to the batch file
exit
Create a job in Task Scheduler which will start a program (a.bat).
I've just tried it (using that simple example I posted above) and got the result (saying that there are 14 rows in the table), so I hope you'll get it as well.
I have a collection of SQL queries that need to run in a specific order using Teradata. How can this be done?
I've considered writing an application in some other language (like Python or C++) to call each query sequentially, but I'm not sure how to get live data from Teradata there. I also want to keep the queries as separate SQL files (as they are currently).
The goal is to minimize the need for human interaction, i.e. I want to hit "Run" and let it take care of the rest.
BTEQ scripts are your go-to solution.
Have each query, or at least logical blocks of several statements, in a single BTEQ script.
Then create a script that calls BTEQ with the needed settings (i.e. the TD logon command), and have that script called from a batch file with parameters like this:
start /wait C:\Teradata\BTEQ.bat Script_1.txt
start /wait C:\Teradata\BTEQ.bat Script_2.txt
start /wait C:\Teradata\BTEQ.bat Script_3.txt
pause
Then you can create several batch files, split into logical blocks, and have them executed at will or on a schedule.
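As a hedged sketch of what one of those scripts (say Script_1.txt) might contain, with placeholder logon details and object names:
.LOGON tdpid/your_user,your_password;
/* first logical block of queries (table names are placeholders) */
SELECT COUNT(*) FROM my_database.my_table;
/* stop with a non-zero return code if the previous statement failed */
.IF ERRORCODE <> 0 THEN .QUIT 1;
.LOGOFF;
.QUIT 0;
The non-zero .QUIT code is what lets the calling batch file (or a scheduler) detect that a block failed.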
I would like to execute the SQL files generated by Service Builder, but the problem is that these files contain types like LONG, VARCHAR, etc.
Some of these types don't exist in PostgreSQL (for example, LONG would be BIGINT there).
Is there a simple way to convert the SQL files' structures so that they can be run on PostgreSQL?
Execute ant build-db on the plugin and you will find an sql folder with various vendor-specific scripts.
Daniele is right; using the build-db task is obviously correct and is the right way to do it.
But... I remember a similar situation some time ago: I had only the Liferay pseudo-SQL file and needed to create the proper DDL. I managed to do it in the following way:
You need to have Liferay running on your desktop (or on the machine where the source SQL file is), as this operation requires the portal Spring context to be fully wired.
Go to Configuration -> Server Administration -> Script
Change language to groovy
Run the following script:
import com.liferay.portal.kernel.dao.db.DB
import com.liferay.portal.kernel.dao.db.DBFactoryUtil
DB db = DBFactoryUtil.getDB(DB.TYPE_POSTGRESQL)
db.buildSQLFile("/path/to/folder/with/your/sql", "filename")
The first parameter is obviously the path and the second is the filename without the .sql extension. The file on disk must have the proper extension, i.e. it must be called filename.sql.
This will produce a tables folder next to your filename.sql, containing a single tables-postgresql.sql file with your Postgres DDL.
As far as I remember, Service Builder uses the same method to generate database-specific code.
I have a report that I need to run every day at 00:00 and export all the information from the table to a specific location with a specific name.
Example:
select * from my_table
where date between SYSTIMESTAMP -2 and SYSTIMESTAMP -1
and I want to export the result to a file named date.xml.
Is this possible from Oracle SQL Developer or do I need other tools?
No Oracle version was given, so I assume 10g or 11g.
To schedule your process you just have to create a job and schedule it. The job has to run your script (which can be a function or a stored procedure).
Here is the documentation:
http://docs.oracle.com/cd/B28359_01/server.111/b28310/scheduse.htm#i1033533
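As a hedged sketch (the job name and procedure name below are hypothetical, not from the question), a DBMS_SCHEDULER job that runs a stored procedure every day at midnight could look like this:
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'DAILY_REPORT_EXPORT',   -- hypothetical job name
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'EXPORT_MY_TABLE',       -- your export procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=0; BYMINUTE=0; BYSECOND=0',
    enabled         => TRUE);
END;
/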
To write to a file you can use the SPOOL command in SQL*Plus. You can find the documentation here: http://docs.oracle.com/cd/B19306_01/server.102/b14357/ch12043.htm
It's really simple to use.
spool /path/to/filename
your query here
spool off
Obviously, the machine from which you run the script must have write permission on the location where you're going to write the file (I mention this because I often forget to check).
Creating the XML is slightly more complex and a little long to explain here, but there is a nice post on the Oracle community that explains it with a simple, practical example: https://community.oracle.com/thread/714758?start=0&tstart=0
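As a hedged shortcut for simple cases (the output path is a placeholder), Oracle's DBMS_XMLGEN package can turn a query result straight into XML, which you can then spool to date.xml:
spool /path/date.xml
select dbms_xmlgen.getxml('select * from my_table') from dual;
spool off
For the real report you would put the WHERE clause from the question inside the quoted query string.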
If you do not want to use a job in Oracle, you can write a .sql file containing the connection commands, the spool command and your query, and schedule it as a simple sqlplus command on the server machine.
I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) that will be located in a shared folder on my network. The situation is like this:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that will contain the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio, and I'm clueless how to solve this. First, I'm not sure how to export the data using jobs, because every approach I tried had issues with the OLEDB connection. The sp_makewebtask procedure is also not available in SQL Server 2008. And I'm also confused about how to dynamically generate the names of the files.
Any reference or solution will be helpful.
Follow the steps given below :
1) Create a stored procedure that creates a temporary table and inserts the records into it.
2) Create a stored procedure that reads the records from that temporary table and writes them to a file. You can use this link: clickhere
3) Create a SQL Server Agent job that executes step 1 and step 2 sequentially (a rough sketch of the first two steps follows below).
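A minimal sketch of what steps 1 and 2 might look like; every object name, server name and path below is hypothetical, the export uses bcp through xp_cmdshell (which must be enabled), and it produces a delimited text file rather than a true .xlsx, so treat it as a starting point only:
-- Step 1: stage the data.
CREATE PROCEDURE dbo.StageExportData
AS
BEGIN
    IF OBJECT_ID('dbo.ExportStaging') IS NOT NULL
        DROP TABLE dbo.ExportStaging;
    SELECT *
    INTO dbo.ExportStaging
    FROM dbo.MySourceTable;   -- the table you need to export
END;
GO

-- Step 2: write the staged rows to a file with a timestamped name on the share.
CREATE PROCEDURE dbo.WriteExportFile
AS
BEGIN
    DECLARE @file NVARCHAR(260) =
        '\\myserver\share\export_' + CONVERT(VARCHAR(8), GETDATE(), 112) + '_' +
        REPLACE(CONVERT(VARCHAR(8), GETDATE(), 108), ':', '') + '.csv';
    DECLARE @cmd NVARCHAR(4000) =
        'bcp "SELECT * FROM MyDb.dbo.ExportStaging" queryout "' + @file + '" -c -T -S myserver';
    EXEC master..xp_cmdshell @cmd;   -- requires xp_cmdshell to be enabled
END;
GO
The SQL Server Agent job then gets two T-SQL steps: EXEC dbo.StageExportData; followed by EXEC dbo.WriteExportFile;.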
I found a better way. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, and then deployed that package using a SQL Server Agent job. This is a neater and cleaner solution, as I found.