I have a table in SQL Server into which I need to insert data on a regular basis. Each day I perform the same manual import task, which is tedious, so I need your help. Is it possible to send data from a CSV file to an existing SQL Server table without the manual procedure?
Or could I use Python to create a script that sends data from the CSV file to SQL Server automatically at a fixed time?
First, create a Python script that reads the CSV file and inserts the data into SQL Server. Then create a cron job on your server that runs the script regularly. This might be a possible solution to your problem.
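A minimal sketch of such a script, assuming pyodbc is installed and using placeholder names for the server, database, CSV file, and target table (dbo.DailyData with three columns):

import csv
import pyodbc

# Placeholder connection details; substitute your own server, database, and auth.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

with open("daily.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    for row in reader:
        # dbo.DailyData and its columns are placeholders for your actual table.
        cursor.execute(
            "INSERT INTO dbo.DailyData (col1, col2, col3) VALUES (?, ?, ?)",
            row[0], row[1], row[2],
        )

conn.commit()
conn.close()

A crontab entry along the lines of 0 6 * * * /usr/bin/python3 /path/to/import_csv.py would then run the script every morning at 06:00.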
I have an Excel file. I want to convert it into a text file. Is it possible in SQL Server?
As you said, you get the Excel files on a daily basis; why don't you create an SSIS package that takes all the files from the location where they are dumped and converts them through a Data Flow Task (DFT)? You can then schedule this package with SQL Server Agent to run on a regular basis.
P.S. I am assuming you're familiar with SSIS.
My problem statement is that I have a CSV blob and I need to import that blob into a SQL table. Is there a utility to do that?
One approach I was considering: first copy the blob to the on-premises SQL Server machine using the AzCopy utility, then import the file into the SQL table using the bcp utility. Is this the right approach? Ideally, though, I am looking for a one-step solution to copy a blob into a SQL table.
Regarding your question about the availability of a utility that will import data from blob storage into SQL Server: AFAIK there's none. You would need to write one.
Your approach seems OK to me, though you may want to write a batch file or something similar to automate the whole process. The batch file would first download the blob to your computer and then run the BCP utility to import the CSV into SQL Server. Alternatives to a batch file are:
Do the whole thing in PowerShell.
Write some C# code that uses the storage client library to download the blob and, once it is downloaded, start the BCP process from your code.
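If you'd rather script it than compile C#, the same download-then-bcp flow can be sketched in Python; this assumes the azure-storage-blob package and placeholder names for the connection string, container, blob, server, and target table:

import subprocess
from azure.storage.blob import BlobClient

# Placeholder connection string and file name; substitute your own.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
LOCAL_PATH = "data.csv"

# Download the CSV blob to a local file.
blob = BlobClient.from_connection_string(
    CONN_STR, container_name="mycontainer", blob_name="data.csv"
)
with open(LOCAL_PATH, "wb") as f:
    f.write(blob.download_blob().readall())

# Bulk load it with bcp: character mode, comma-delimited, skip the header row.
subprocess.run(
    ["bcp", "mydb.dbo.MyTable", "in", LOCAL_PATH,
     "-S", "myserver", "-T", "-c", "-t,", "-F", "2"],
    check=True,
)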
To pull a blob file into an Azure SQL Server, you can use this example syntax (this actually works, I use it):
BULK INSERT MyTable
FROM 'container/folder/folder/file'
WITH (DATA_SOURCE = 'ds_blob', BATCHSIZE = 10000, FIRSTROW = 2);
MyTable has to have columns identical to the file's (or it can be a view over a table that yields identical columns)
In this example, ds_blob is an external data source which needs to be created beforehand (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql)
The external data source needs to use a database scoped credential, which holds a SAS key that you generate beforehand from blob storage (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql)
The only downside to this method is that you have to know the filename beforehand; there's no way to enumerate the blobs from inside SQL Server.
I get around this by running PowerShell inside Azure Automation that enumerates the blobs and writes them into a queue table beforehand.
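If PowerShell isn't a requirement, that enumeration step could equally be done in Python; a rough sketch, assuming the azure-storage-blob and pyodbc packages and a hypothetical queue table dbo.BlobQueue(blob_name):

import pyodbc
from azure.storage.blob import ContainerClient

# Placeholder connection details; substitute your own.
container = ContainerClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
    container_name="mycontainer",
)
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Write every blob name into the queue table, to be fed to BULK INSERT later.
for blob in container.list_blobs():
    cursor.execute("INSERT INTO dbo.BlobQueue (blob_name) VALUES (?)", blob.name)

conn.commit()
conn.close()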
I have a BCP process to move data from one server to another server, but it takes two trips: one to a .dat file, and one to the destination server. Is there any way to send all of the data directly to the destination server?
I'm trying to improve the speed of this process.
Assuming you're using SQL Server 2005+, use SSIS: BCP writes to a file, but SSIS can go directly from one connection to another. Here are a few articles on how to bulk load data in SSIS:
Optimizing Bulk Import Performance
http://msdn.microsoft.com/en-us/library/ms190421(v=sql.105).aspx
The Data Loading Performance Guide
http://technet.microsoft.com/en-us/library/dd425070(SQL.100).aspx
We Loaded 1TB in 30 Minutes with SSIS, and So Can You
http://msdn.microsoft.com/en-us/library/dd537533(v=sql.100).aspx
I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) that will be located in a shared folder on my network. The situation is like this:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that contains the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio, and I'm clueless about how to solve this. First, I'm not sure how to export the data using jobs, because every approach I tried had issues with the OLEDB connection. 'sp_makewebtask' is also not available in SQL Server 2008. And I'm also unsure how to dynamically generate the file names.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads the records from that temporary table and writes them to a file. You can use this link: clickhere
3) Create a SQL job that executes steps 1 and 2 sequentially.
I found a better way. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using a SQL Server Agent job. This is a neater and cleaner solution, as I found.
Say the source data comes in Excel format; below is how I import it.
1. Convert to CSV format via MS Excel.
2. Roughly find bad rows/columns by inspecting the file.
3. Back up the table that needs to be updated, in SQL Query Analyzer.
4. Truncate the table (this may require dropping foreign key constraints as well).
5. Import the data from the revised CSV file in SQL Server Enterprise Manager.
6. If there's an error such as duplicate columns, check the original CSV and remove them.
I was wondering how to make this procedure more efficient at every step. I have some ideas, but they're not complete.
For steps 2 & 6: use scripts that automatically check the file and print out all the bad row/column data, so it's easier to remove all the errors at once (see the sketch after this list).
For steps 3 & 5: is there any way to automatically update the table without manually going through the import steps?
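For the checking script in steps 2 & 6, here's a minimal sketch in Python; it assumes a comma-delimited file, a known expected field count, and (hypothetically) that column 0 is the key used to spot duplicates:

import csv

EXPECTED_FIELDS = 5  # assumed; set this to match the target table
seen_keys = set()

with open("data.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)
    # Flag duplicate column names in the header.
    if len(header) != len(set(header)):
        print("Duplicate column names in header:", header)
    for line_no, row in enumerate(reader, start=2):
        # Flag rows with the wrong number of fields.
        if len(row) != EXPECTED_FIELDS:
            print(f"Line {line_no}: expected {EXPECTED_FIELDS} fields, got {len(row)}")
            continue
        # Flag duplicate key values (column 0 assumed to be the key).
        if row[0] in seen_keys:
            print(f"Line {line_no}: duplicate key {row[0]!r}")
        seen_keys.add(row[0])

Every problem row prints with its line number, so all the errors can be fixed in one pass over the original file.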
Could the community advise, please? Thanks.
I believe in SQL Server 2000 you still have DTS (Data Transformation Services) as part of Enterprise Manager. Using that, you should be able to create a workflow that does all of these steps in sequence. I believe it can natively import Excel as well. You can run everything from SQL queries to VBScript, so there's pretty much nothing you can't do.
I used to use it for these kinds of bucket-brigade jobs all the time.