I'm new to SQL, coming from an object-oriented programming background.
I've created the following query, which successfully imports a .csv file into a table through SQL Server Management Studio. How would I go about importing multiple files? This would be quite straightforward in an object-oriented language, but I've heard that in SQL you have to read directories through cmd?
The working code is as follows:
--Cihan's import for Holdings--
INSERT INTO Holdings1
SELECT * FROM OPENROWSET(
    'MSDASQL',
    'Driver={Microsoft Access Text Driver (*.txt, *.csv)};DBQ=C:\Share\DataUploads\FundHoldings;',
    'SELECT * FROM holdings.csv')
How would I go about looping over all the files in a directory and importing their data? We receive over 120 sheets a month, and I would like to import them with the approach above. Alternatively, can anyone recommend a better way of doing this?
Do you have access to SSIS (SQL Server Integration Services)?
If so, you can set up an ETL task to import all the files in a folder using a Foreach Loop Container that contains your Data Flow Task and File System Task.
Edit - in response to the comment: the solution in this thread should let you do what you need using just T-SQL: loop through files in folder
Basically: load all the file names into a temp table, create a WHILE loop that pulls a new file name from the folder on each pass, perform your data manipulation, then move on to the next file name until all are complete.
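Here is a minimal sketch of that pattern, reusing the folder and OPENROWSET call from the question. Note that xp_dirtree is undocumented, so treat this as a starting point rather than a finished solution:

DECLARE @folder nvarchar(260) = N'C:\Share\DataUploads\FundHoldings';
DECLARE @files TABLE (id int IDENTITY(1,1), name nvarchar(260), depth int, isfile int);

-- xp_dirtree lists the folder contents: depth 1, files included
INSERT INTO @files (name, depth, isfile)
EXEC master.sys.xp_dirtree @folder, 1, 1;

-- keep only the .csv files
DELETE FROM @files WHERE isfile = 0 OR name NOT LIKE N'%.csv';

DECLARE @name nvarchar(260), @sql nvarchar(max);
WHILE EXISTS (SELECT 1 FROM @files)
BEGIN
    SELECT TOP (1) @name = name FROM @files ORDER BY id;

    -- the question's OPENROWSET call, with the current file name swapped in
    SET @sql = N'INSERT INTO Holdings1
        SELECT * FROM OPENROWSET(''MSDASQL'',
            ''Driver={Microsoft Access Text Driver (*.txt, *.csv)};DBQ=' + @folder + N';'',
            ''SELECT * FROM ' + @name + N''')';
    EXEC sys.sp_executesql @sql;

    DELETE FROM @files WHERE name = @name;
END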
I have an SSIS package created using SSDT and running as a job on SQL Server 2014.
This SSIS package retrieves an Excel file (.xlsx) from a specific folder and exports its content into a specific table in my SQL Server database. The package runs fine.
My Data Flow is in the following sequence:
Import Excel file from folder
Apply a Conditional Split to split data with today's date
Export the data into the SQL Server table in the database
Here is my problem:
I will now have 4 additional Excel files in that folder, and they will need to be exported into that same SQL Server table.
So what is the best way to achieve this (assuming all of the following are possible solutions)?
Write 4 additional SSIS packages from scratch?
Use “Save As” on the existing package with a new name (4 times) and modify the file name to be retrieved?
Modify my existing SSIS package to accommodate the additional 4 Excel files?
Any help would be appreciated.
Assuming the 4 Excel files have the same structure and go to the same table, you'll want to use a Foreach Loop Container to process each file in the folder.
SentryOne has a good example of looping through each file in a folder and archiving it. I imagine it can be adapted to your use case.
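In rough outline (the variable name here is an assumption): point a Foreach Loop Container with a Foreach File Enumerator at the folder, map the current file name to a variable such as User::FileName, and then put a property expression on the Excel connection manager so each iteration reads the next file:

ExcelFilePath (property expression on the Excel connection manager):
    @[User::FileName]

Your existing Conditional Split and destination then stay exactly as they are inside the Data Flow Task.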
I want to transfer SQL query results to a new CSV file each time, because I have placed my SQL query inside a loop that will export the query results to a CSV file on each iteration. I'm using MS SQL Server 2012 and I don't want to use a GUI option.
SQL Server is not really designed to import and export files. You can use the bulk copy program (bcp), but I don't think it works from T-SQL code (for looping). You can use OPENROWSET, but you need to set a special flag that opens up your surface area of attack, which some do not want to do.
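That said, one common workaround for driving bcp from T-SQL is shelling out through xp_cmdshell, which is disabled by default; enabling it is exactly the surface-area trade-off mentioned above. A sketch, with illustrative database, table, and path names:

-- export a query's results to CSV by calling bcp from T-SQL
DECLARE @cmd varchar(4000) =
    'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout "C:\Export\results.csv" -c -t, -T -S .';
EXEC master..xp_cmdshell @cmd;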
The answer is SSIS (or a tool like Talend). It ships with SQL Server and is designed by MS as the go-to tool for importing to and exporting from SQL Server. In fact, if you right-click on the database and choose Tasks and then Export Data, the wizard eventually creates and executes an SSIS package.
I recommend you reconsider a GUI option.
PS: Another answer suggested using Save Results As. I have heard of problems with that method, including issues with delimiters and text-qualified fields.
There are multiple ways to attain this. You can export the result set using BCP, using Import/Export, or using Ctrl+Shift+S (which switches the result set output to Save As). Hope this helps.
I'm trying to execute multiple SQL files and export the results to Excel files.
Until now, I used the Foreach Loop Container with the Execute SQL Task, and it's running well...
I think I should use a Foreach Loop Container with a Data Flow Task, but I can't do multiple exports within it.
Thank you in advance for your help,
For multiple exports, you can use a variable with an expression for your Excel file name. Increment it on each loop iteration and use it in your File System Task to create the file.
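A sketch of such an expression for a file-name variable, assuming hypothetical variables User::OutputFolder (a folder path) and User::LoopIndex (an int you increment each pass); note that "\\" is an escaped backslash in SSIS expression strings:

@[User::OutputFolder] + "\\Export_" + (DT_WSTR, 10)@[User::LoopIndex] + ".xlsx"

The File System Task can then copy a template workbook to that path before the Data Flow Task writes into it.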
I am using SQL Developer. I need to export SQL result sets to Excel sheets. I know how to export them manually.
Currently I am using a batch file, which in turn runs multiple SQL script files. At the end there is one SQL script which contains multiple SELECT statements. I now need to export these results to Excel sheets from the batch file run itself.
Batch file name: Mytest.SQL. It contains multiple script files, as below:
##test1.sql;
##test2.sql;
##test3.sql;
##test4.sql;
The last script, test4.sql, contains multiple SELECT statements whose results need to be exported into multiple Excel files. Please suggest a solution.
Months ago I found this solution: a ready-to-use package that unloads query results into an .xlsx file, with formatting support. Here is the link, which also includes a description of how to use the package.
pl/sql package to unload as xlsx
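If the linked package is the widely circulated as_xlsx package (an assumption on my part; check the post itself for the exact API), basic usage looks roughly like this:

BEGIN
  as_xlsx.query2sheet('SELECT * FROM employees');  -- run a query into a worksheet
  as_xlsx.save('EXPORT_DIR', 'results.xlsx');      -- EXPORT_DIR is an Oracle directory object
END;
/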
Hope this helps.
I used to do this with a VB script, i.e. I ran the queries from Excel:
' Connect to Oracle through the legacy OO4O in-process server
Set OraSession = CreateObject("OracleInProcServer.XOraSession")
Set OraDatabase = OraSession.OpenDatabase(sDB$, sUSERID$, 0&)
' Run the query and get a dynaset (result set) to read from
Set OraDynaSet = OraDatabase.CreateDynaset(QueryText, 0&)
From this macro you can call a script and write the data into Excel in a loop.
Do you need a specific .xls file, or will a .csv file do?
If you want a CSV, you can spool to a file with SQL.
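A minimal spool sketch, runnable from SQL Developer's script engine (F5) or SQLcl; the path and query are placeholders, and older SQL*Plus versions would need manual COLSEP formatting instead of SET SQLFORMAT:

SET SQLFORMAT csv
SPOOL C:\export\results.csv
SELECT * FROM employees;
SPOOL OFF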
For .xls files, you can't do it easily; you'll probably have to go through another programming language such as Java or C# with a library for building your report (e.g. Apache POI for Java).
I am rewriting a program based on the old FoxBASE database, which consists of .dbf files. I need a tool that can read these files and help transfer the data to PostgreSQL. Do you know of any tool of this type?
pgdbf.sourceforge.net - it has worked for every DBF file I've fed it. Quoting the site description:
PgDBF is a program for converting XBase databases - particularly
FoxPro tables with memo files - into a format that PostgreSQL can
directly import. It's a compact C project with no dependencies other
than standard Unix libraries.
If you are looking for something to run on Windows and this doesn't compile directly, you could use Cygwin (www.cygwin.com) to build and run pgdbf.
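Typical usage, per the project's documentation, is to pipe pgdbf's SQL output straight into psql (names here are placeholders):

# convert one table and load it into an existing database
pgdbf /path/to/table.dbf | psql mydatabase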
As part of the migration path you could use Python and my dbf module. A very simple script to convert a dbf file to csv would be:
import sys
import dbf

# writes a .csv file named after the given .dbf table
dbf.export(sys.argv[1])
which will create a .csv file with the same name as the dbf file. If you put that code into a script named dbf2csv.py, you can then call it as
python dbf2csv.py dbfname
Hopefully there are some handy tools to get the csv file into PostgreSQL.
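One built-in route is PostgreSQL's own COPY; a sketch, assuming a table with matching columns already exists and adjusting HEADER to whether the csv includes a header row:

-- server-side load; psql's \copy is the client-side equivalent
COPY holdings FROM '/path/to/dbfname.csv' WITH (FORMAT csv, HEADER true);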