Import updating CSV into SQL Server

I'm looking for a simple solution (I'm a beginner with SQL) for importing data from a .csv file into my SQL database.
A third-party program updates the .csv file every 30 seconds, and I want to get that updated information into my SQL DB. I tried the Import and Export Wizard, but it didn't work because the .csv file was in use by the third-party program.
Getting the information into the SQL DB doesn't need to happen in real time; it could simply pull in all the information when I open a saved SQL query file.
Thank you!

OPENROWSET is the simplest option if you can get it working in your environment for CSV files. I have seen a lot of issues with it depending on the OS and whether the installed version of MS Office (whose data provider it relies on) is 32-bit or 64-bit.
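For example, a minimal sketch using the ACE text driver; this assumes the Microsoft ACE OLE DB provider is installed, that ad hoc distributed queries are enabled, and an illustrative file path:

-- Enable ad hoc distributed queries once (requires elevated permissions).
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

-- Read the CSV: the folder goes in the connection string and the
-- file name acts as the "table". Path and file name are illustrative.
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Text;Database=C:\Data\;HDR=YES;FMT=Delimited',
    'SELECT * FROM feed.csv');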
With a bit more work, though, you'll be all set: create a small SSIS package to import that CSV into a table, then execute the package from a SQL Agent job at the desired interval. Later, if you need more complex insert/update logic, you can always modify the package.

This is a case of the producer-consumer problem, where one process is writing data and another one is reading it.
Whatever you do, you need to set up some kind of lock on this file so that each process can check whether the file is available for reading or writing. If the Import and Export Wizard had concurrency issues, other processes probably will too.
Another option is for the writer to always create a new file, and for the reader to always read from the newest one and delete it after processing.
One more thing you'll have to take care of is reading from the same file multiple times. You need some way to mark the records that have already been read so they aren't inserted twice (see the sketch below).
All of the above is needed only if this has to be a fully automated, unattended process.
If not, you can just manually create a copy of the CSV file and then use the Import and Export Wizard to import the data.
Here is another resource you can check out for importing CSV into SQL Server:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
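Combining the BULK INSERT approach from that link with the de-duplication point above, here is a minimal sketch; the table names, columns, key, and file path are all illustrative assumptions:

-- Load the copied CSV into a staging table, then insert only unseen rows.
CREATE TABLE dbo.FeedStaging (Id int, Reading decimal(10,2), ReadAt datetime);

BULK INSERT dbo.FeedStaging
FROM 'C:\Data\feed_copy.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Assumes Id is a natural key identifying each record in the feed.
INSERT INTO dbo.Feed (Id, Reading, ReadAt)
SELECT s.Id, s.Reading, s.ReadAt
FROM dbo.FeedStaging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Feed AS f WHERE f.Id = s.Id);

TRUNCATE TABLE dbo.FeedStaging;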

Related

Importing CSV from SAP R/3 to SQL database for reporting purposes

I want to import CSV files and invoices from an SAP R/3 system into a SQL database. The database will be used for reporting purposes only. Please tell me the best possible way to do this, which database to use, and anything else relevant in this context. I am a novice, so please help. Thanks :)
If you are routinely importing CSV files, then I recommend getting them comma-delimited (or whatever delimiter you choose) and going the route of building an SSIS package with a corresponding SQL Agent job that runs daily, checks for the file, and runs the package if it finds it (a job sketch follows the link below).
Info on SSIS package creation:
http://smallbusiness.chron.com/import-csv-ssis-46849.html
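For reference, creating such a job in T-SQL might look roughly like this; the job name, package path, and schedule are illustrative assumptions:

USE msdb;
-- Create the job and a step that runs the SSIS package.
EXEC dbo.sp_add_job @job_name = N'Daily CSV Import';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Daily CSV Import',
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\Packages\ImportCsv.dtsx"';
-- Run once a day at 06:00:00.
EXEC dbo.sp_add_schedule
    @schedule_name = N'Daily at 6am',
    @freq_type = 4, @freq_interval = 1,
    @active_start_time = 060000;
EXEC dbo.sp_attach_schedule
    @job_name = N'Daily CSV Import',
    @schedule_name = N'Daily at 6am';
EXEC dbo.sp_add_jobserver @job_name = N'Daily CSV Import';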
If this is a one-time load, then I would recommend just using the Import and Export Wizard built into SQL Server.
https://msdn.microsoft.com/en-us/library/ms140052.aspx
The wizard is pretty easy to use, too: right-click the database > Tasks > Import Data. This launches the wizard and walks you through the one-time import.
Adding Microsoft's official SSIS guide as well:
https://msdn.microsoft.com/en-us/library/ms169917.aspx

.SQL export from PHPMyAdmin to Excel or CSV

I inherited some old records for a company I volunteer for. One of the old files is a SQL dump from their old webpage, and I would like to get the data from one of its tables into Excel for their use.
-- MySQL dump 10.11
The dump drops the table if it exists, creates the table new, and then inserts all of the data.
Is there some easy way I can get this data into Excel on my PC? I don't have SQL Server or anything like that loaded... I assumed there was some easy way to get a CSV or Excel file out of it, but I have failed to find one that doesn't require first loading the dump into some SQL server.
Unfortunately, I don't think there is any way to export a dump file directly into an Excel or .CSV file. The reason is that the dump file is actually a collection of SQL statements (the CREATE TABLE and INSERT statements you describe) rather than the data itself in tabular form. Database servers produce dumps this way to prevent the whole list of problems that can occur when you try to manipulate raw data manually.
Lucky for you, MySQL offers a free version of their server. You can find it here: http://dev.mysql.com/downloads/
I think you are best off downloading this and restoring your file as a new database. This has the added benefit of giving you complete control over the data from that point on. Exporting to Excel would be easy at that point; however, you may find it a lot more fulfilling to continue using MySQL Server.
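Once the dump is restored (for example, with something like mysql newdb < dump.sql from a shell), one way to get CSV out of MySQL is SELECT ... INTO OUTFILE; the path and table name below are illustrative:

-- Write a table out as a quoted, comma-separated file.
-- The output path and table name are illustrative.
SELECT *
FROM members
INTO OUTFILE '/tmp/members.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';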
Hope this helped.

Programmatic Export with Indeterminate Table Structure from SQL Server 2008 using T-SQL

I have searched for an answer to this, and one seems not to exist.
Problem:
A website is querying a database and is unable to return results (as an export to Excel) in a timely fashion, primarily due to result-set size. I'd like to set up a background process to 'ping' for waiting queries and execute them one by one, dumping the data into a location to be downloaded from. The 'pinging' task can be handled in a whole host of ways. My original ideal solution was a trigger (alternatively, a SQL Server Agent task) that exported the data to the filesystem. But I have run into an issue: I don't know how to set up an amorphous output to the filesystem with a simple T-SQL statement.
SSIS is apparently the standard solution to this. I don't know enough about SSIS to know whether it will handle what I want it to do, but I have been told the queries are too numerous and too varied in output for that to be a feasible solution.
xp_cmdshell can be run to do a BCP export. This works fine, but apparently opens a security hole.
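For illustration, a BCP export via xp_cmdshell might look like this; the database, table, and output path are illustrative, and xp_cmdshell must be enabled, which is exactly the security trade-off mentioned above:

-- Illustrative only: export a query's results to a CSV file with bcp.
DECLARE @cmd varchar(4000) =
    'bcp "SELECT * FROM MyDb.dbo.ReportData" queryout "C:\Exports\report.csv" -c -t, -T -S .';
EXEC master..xp_cmdshell @cmd;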
Previous solutions:
A solution I used years ago, DTS passing data straight to the operating system, seems to have been removed in SQL Server 2008/2012. I also used to be able to use sp_makewebtask to export data directly to the filesystem, but I can no longer do that either.
Current solution:
I am writing a PowerShell script tied to some SQL tables and stored procedures to manage execution. This seems like a non-ideal solution; I'm curious whether I have missed something. Is there an easy way to set up SSIS to export data without a fixed structure? A way to create an Excel file on the fly and fill it with data?
The answer seems to be No.
You can export to CSV instead of Excel (Excel opens CSV files easily), but CSV files don't carry any formatting. To cope with varying result shapes, you can set up SSIS (or BCP in a scheduled task) to export a single column into the CSV file that already contains the commas and the text delimiters, so Excel will still present the data in separate columns.
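As a rough sketch of that single-column trick (table and column names are hypothetical):

-- Build each CSV line as one value, quotes and commas included,
-- so the export needs no fixed column structure.
SELECT '"' + REPLACE(CustomerName, '"', '""') + '",'
     + CONVERT(varchar(10), OrderDate, 120) + ','
     + CAST(Amount AS varchar(20)) AS CsvLine
FROM dbo.Orders;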

Is there a MacOS ODBC driver that reads SQL-command text files?

I've been searching without luck for a MacOS iODBC driver that can read saved .SQL files exported in Microsoft SQL Server format. Does one exist?
We've got a large pile of research data stored in one app that can export as Excel spreadsheets or SQL files (e.g., a text file full of SQL CREATE TABLE and INSERT statements). We need to import this data into another app (Stata 9) that runs under MacOS and can import Excel files, its own format, or from an ODBC source. So, I need an ODBC driver that can read plain SQL files as its source. We don't need a driver that actually talks to an MSSQL database, because there is no actual database here; just a plain .SQL file with MSSQL-style commands in it.
Unfortunately, the default MacOS install seems to come with no ODBC drivers whatsoever, not even one for reading flat files or SQLite databases.
The current workflow for moving this data (exporting it from DatStat as an Excel spreadsheet, opening that spreadsheet and fixing it by hand to conform to Stata's needs, then saving and reimporting into Stata) is ridiculously labor-intensive and also loses a lot of important metadata like variable descriptions and annotations.
I think the best thing to do here is load the data from DatStat into a database and then load it from there into Stata. First, export your data from DatStat to a .sql file. I'm not familiar with DatStat, but if you can do this in bulk or via the command line, that would be best. (You can access your OS's terminal from Stata with the -shell- command.) After you have a .sql file, say foo.sql, you can use the following Stata code to send it to a database and then import the table into Stata:
odbc sqlfile("foo.sql"), dsn("DataSourceName")
odbc load, exec("SELECT * FROM CustomerTable") dsn("DataSourceName")
You could even issue a final command to cleanup the tables in the database if you don't think you'll use this database again and you don't want it taking up space. Use something like:
odbc exec("DROP TABLE CustomerTable")
Yes, this will probably be slow if your dataset is large, but it could be nice once your data is in the database because you can query parts of it at a time instead of importing the whole thing.
Lastly, you mentioned that no ODBC driver for MS SQL Server exists for Mac. If that is the case, you may want to install one of the open-source database systems like MySQL or PostgreSQL. I'm not a Mac user, but drivers for these must exist for the Mac.
Good luck!

Can a SQL 2005 SSIS package be scheduled?

I have a data dump that I manually initiate, and now that it's working well I want to automate it. I have a system that exports data into Excel, which I ultimately want to import into a SQL table.
I have an SSIS package that I used for the import and saved for re-use later. I just ran it manually and it works well. Now I would like to have it run when invoked by a file watcher, a schedule, or something similar, so that all I need to do is overwrite the Excel file and the import is triggered automatically.
Any ideas on how to make this happen?
SQL Server does its scheduling with SQL Agent, so try creating a schedule there to do what you want.
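If you also want the file-watcher behavior, one low-tech option is a frequently scheduled job step that checks for the file before kicking off the import. Note that xp_fileexist is an undocumented procedure, and the path and job name here are illustrative:

-- Only start the import job when the dropped Excel file is present.
DECLARE @exists int;
EXEC master.dbo.xp_fileexist 'C:\Drops\export.xls', @exists OUTPUT;
IF @exists = 1
    EXEC msdb.dbo.sp_start_job @job_name = N'Import Excel Dump';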