Transfer data from local server to online server - sql

I am using SQL Server 2008 R2 on my local PC. My database has one table with approximately 98,000 rows. I want to transfer that data directly to the online server's database. I tried generating a script of that table, but when I run the script it gives me an insufficient-memory error. Please help: how can I do this? Thanks.

There are a variety of strategies you can employ here. Here are a few off the top of my head...
Got some .NET programming up your sleeve? Try the SqlBulkCopy class.
Export the data to a transferable format, e.g. a CSV file, and then use BULK INSERT to load it on the remote server (see the sketch after this list).
Try using OPENROWSET to copy from the local server to the remote one (Stack Overflow example; also sketched below).
If you've got the full leverage of SSIS, there's an SSIS example here.
A bit Heath Robinson, but why not dump the data out to CSV and, with some Excel skills, build the individual statements yourself? Example here using INSERT INTO and UNION.
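For illustration, here is a rough, untested sketch of the BULK INSERT and OPENROWSET routes mentioned above; all table names, file paths, server names and credentials are placeholders you would swap for your own:

-- On the online server, after copying the exported CSV there:
BULK INSERT dbo.MyTable
FROM 'C:\transfer\MyTable.csv'
WITH (
    FIELDTERMINATOR = ',',    -- column delimiter used in the CSV
    ROWTERMINATOR   = '\n',   -- row delimiter
    FIRSTROW        = 2,      -- skip the header row
    BATCHSIZE       = 10000,  -- commit in batches so ~98,000 rows don't strain memory/log
    TABLOCK
);

-- Or, pushed straight from the local server to the remote one with OPENROWSET
-- ('Ad Hoc Distributed Queries' must be enabled on the local server):
INSERT INTO OPENROWSET(
    'SQLNCLI',
    'Server=remote.example.com;Database=TargetDb;Uid=someuser;Pwd=somepassword;',
    'SELECT Col1, Col2 FROM dbo.MyTable')
SELECT Col1, Col2 FROM dbo.MyTable;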
HTH

Related

Send data from CSV file to SQL Server automatically?

I have a table in SQL Server where I need to insert data on a regular basis. Each day I perform the same task of importing the data manually, which is tedious, so I need your help. Is it possible to send data from a CSV file to an existing SQL Server table without the manual procedure?
Or could I use Python to create a script that sends the data from the CSV file to SQL Server automatically at a fixed time?
First, create a Python script that reads the CSV file and inserts the data into SQL Server. Then create a cron job on your server that runs this script regularly. This could be a possible solution for your problem.

copy blob data into on-premise sql table

My problem statement is that I have a CSV blob and I need to import that blob into a SQL table. Is there a utility to do that?
One approach I was thinking of is to first copy the blob to the on-premises SQL Server using the AzCopy utility, and then import that file into the SQL table using the bcp utility. Is this the right approach? Ideally I am looking for a one-step solution to copy the blob into the SQL table.
Regarding your question about the availability of a utility which will import data from blob storage to a SQL Server, AFAIK there's none. You would need to write one.
Your approach seems OK to me, though you may want to write a batch file or something similar to automate the whole process. In this batch file, you would first download the file to your computer and then run the BCP utility to import the CSV into SQL Server. Other alternatives to writing a batch file are:
Do the whole thing in PowerShell.
Write some C# code that uses the storage client library to download the blob and, once it is downloaded, starts the BCP process from your code.
To pull a blob file into an Azure SQL Server, you can use this example syntax (this actually works, I use it):
BULK INSERT MyTable
FROM 'container/folder/folder/file'
WITH (DATA_SOURCE = 'ds_blob', BATCHSIZE = 10000, FIRSTROW = 2);
MyTable has to have identical columns (or it can be a view against a table that yields identical columns)
In this example, ds_blob is an external data source which needs to be created beforehand (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql)
The external data source needs to use a database scoped credential, which uses a SAS key that you need to generate beforehand from blob storage (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql)
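If it helps, here is a rough sketch of creating those two objects; every name, URL and the SAS token below are placeholders for your own values:

-- A database master key must exist before a scoped credential can be created
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'SomeStrongPassword!1';

-- The SAS token goes in SECRET, without the leading '?'
CREATE DATABASE SCOPED CREDENTIAL blob_cred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = 'sv=...&ss=b&srt=co&sp=rl&sig=...';

-- The external data source referenced as DATA_SOURCE in the BULK INSERT above
CREATE EXTERNAL DATA SOURCE ds_blob
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://mystorageaccount.blob.core.windows.net',
      CREDENTIAL = blob_cred);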
The only downside to this method is that you have to know the filename beforehand - there's no way to enumerate the files from inside SQL Server.
I get around this by running PowerShell inside Azure Automation that enumerates the blobs and writes their names into a queue table beforehand.

R + Sql Server Script output to csv using R

I am trying to run a process that rolls over several time periods using data that sits in a SQL Server 2008 R2 database. If it were smaller, I'd pull it all into R and just subset based on date. However, the data is ~15GB, so I need to be able to generate CSV files directly from SQL Server and then read them into R (I've found the R-SQL connectors, RODBC etc., to be too slow to move large amounts of data).
It seems that sqlcmd or bcp are the only options but I wanted to check before going in that direction.
Any suggestions would be appreciated.
I use the RODBC package a lot and it works like a charm, though a large query using sqlQuery can take some time. Once the query is done you will get a data frame that you can easily export to CSV via write.csv. It will work without any issues.

How to Export data to Excel in SQL Server using SQL Jobs

I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) that will be located in a shared folder on my network. Now the situation is like this:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that will contain the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio, and I'm not sure how to solve this. First, I don't know how to export the data using jobs, because every approach I tried had issues with the OLEDB connection. 'sp_makewebtask' is also not available in SQL 2008. And I'm also not sure how to dynamically generate the names of the files.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads records from that temporary table and writes them to a file. You can use this link: clickhere (a rough sketch also follows these steps).
3) Create a SQL Agent job that executes step 1 and step 2 sequentially.
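As a rough illustration of step 2 (and of generating a new file name on each run), here is a sketch that shells out to bcp via xp_cmdshell. The server, share, database and table names are placeholders, xp_cmdshell has to be enabled, and bcp writes delimited text rather than a true .xls, although Excel will open it:

-- Note: a real #temp table would not be visible to bcp's separate connection,
-- so use a permanent staging table filled by the step-1 procedure.
DECLARE @fileName varchar(200);
DECLARE @cmd varchar(1000);

-- Build a file name such as \\myshare\exports\Export_2013-01-01_1200.xls
SET @fileName = '\\myshare\exports\Export_'
    + REPLACE(REPLACE(CONVERT(varchar(16), GETDATE(), 120), ':', ''), ' ', '_')
    + '.xls';

-- Export the staging table with bcp (trusted connection assumed)
SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.ExportStaging" queryout "'
    + @fileName + '" -c -T -S MYSERVER';

EXEC master..xp_cmdshell @cmd;

You can then schedule the SQL Agent job that runs steps 1 and 2 to fire every 2 minutes.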
I found a better way. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using a SQL Server Agent job. This is a neater and cleaner solution, as I found.

SQL Server 2000, how to automate importing data from Excel

Say the source data comes in Excel format; below is how I import the data.
1. Convert to CSV format via MS Excel
2. Roughly find bad rows/columns by inspection
3. Back up the table that needs to be updated in SQL Query Analyzer
4. Truncate the table (may need to drop foreign key constraints as well)
5. Import the data from the revised CSV file in SQL Server Enterprise Manager
6. If there's an error such as duplicate columns, check the original CSV and remove them
I was wondering how to make this procedure more efficient at every step. I have some ideas but they're not complete.
For steps 2 & 6, use scripts that check automatically and print out all the bad row/column data, so it's easier to remove all the errors at once.
For steps 3 & 5, is there any way to automatically update the table without manually going through the import steps?
Could the community advise, please? Thanks.
I believe in SQL 2000 you still have DTS (Data Transformation Services) as part of Enterprise Manager. Using that, you should be able to create a workflow that does all of these steps in sequence. I believe it can natively import Excel as well. You can run everything from SQL queries to VBScript, so there's pretty much nothing you can't do.
I used to use it for these kinds of bucket-brigade jobs all the time.
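If you'd rather script steps 3 to 5 directly (whether inside a DTS package or as a plain query), a rough, untested T-SQL sketch would look something like this; the table and file names are placeholders:

-- Step 3: keep a copy of the current data
SELECT * INTO dbo.MyTable_backup FROM dbo.MyTable;

-- Step 4: clear the table
-- (if another table references MyTable via a foreign key, drop that constraint
--  first and re-create it afterwards, or use DELETE instead of TRUNCATE)
TRUNCATE TABLE dbo.MyTable;

-- Step 5: load the cleaned-up CSV
BULK INSERT dbo.MyTable
FROM 'C:\import\revised.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);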