Maximum size of .SQL file or variable for SQL Server?

I have a SQL file/SQL string which is about 20 MB, and SQL Server simply cannot accept it. Is there a limit on the maximum size of the .SQL file or variable used to query or insert data into SQL Server?
By "variable" I mean a variable passed to SQL Server through a programming language or an ETL tool.

You can use SQLCMD, but I just ran into a 2 GB file size limit with that command-line tool, even though I had a GO after every statement. I get an 'Incorrect syntax' error once the 2 GB boundary is crossed.
After some searching, I found this link:
https://connect.microsoft.com/SQLServer/feedback/details/272757/sqlcmd-exe-should-not-have-a-limit-to-the-size-of-file-it-will-process
The linked page says that every character after 2 GB is ignored, which would explain my 'Incorrect syntax' error.

Yep, I've seen this before. There is no size limit on .sql files; it's more about what kind of logic is executed from within the file. If you have a ton of quick inserts into a small table, e.g. INSERT INTO myTable (column1) VALUES (1), then you can run thousands of these within one .sql file, whereas if you're applying heavy logic in addition to your inserts/deletes, you'll run into these problems. The size of the file isn't as important as what's in the file.
When we came across these in the past, we ran the .sql files from SQLCMD. Very easy to do. You could also create a StreamReader in C# or VB to read the .sql file and build a query to execute.
SQLCMD: SQLCMD -S [Servername] -E -i [SQL Script]
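(Here -S names the server, -E uses a trusted Windows authentication connection, and -i points at the input script file.)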
Was that clear enough? If you post an example of what you're trying to do then I could write some sample code for you.
When I first experienced this problem, the only solution I found was to split the .sql file into smaller ones. That didn't work for our situation, but SQLCMD did. We later implemented a utility that read these large files and executed them with some quick C# programming and a StreamReader.

The size of the SQL file should be limited only by the memory available on your PC/workstation. However, if you don't want to use osql and/or third-party tools, there is a solution for this within SSMS itself. It's called SQLCMD Mode, and it enables you to run a SQL file by referencing it, rather than actually opening it in the editor.
Basically, all you have to do is:
In your Query menu select SQLCMD Mode
Look up the path to your called script (large SQL file)
Open up a New Query (or use an existing one) and write this code on a new line:
:r D:\PathToMyLargeFile\MyLargeFile.sql
Run that (calling) script
If you need to use a variable in your called script, you have to declare it in a calling script. Then your calling script should look like this:
:setvar myVariable "My variable content"
:r D:\PathToMyLargeFile\MyLargeFile.sql
Let's say your called script uses the variable for content that should be inserted into rows. Then it should look something like this...
INSERT INTO MyTable (MyColumn)
SELECT '$(myVariable)'

Pranav was kind of on the right track in referencing the Maximum Capacity Specifications for SQL Server article; however, the applicable limit to executing queries is:
Length of a string containing SQL statements (batch size) [1]: 65,536 * network packet size
[1] Network packet size is the size of the tabular data stream (TDS) packets used to communicate between applications and the relational Database Engine. The default packet size is 4 KB, and it is controlled by the network packet size configuration option.
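With the default 4 KB packet size, that works out to 65,536 * 4 KB = 256 MB per batch, so a 20 MB script fits comfortably; note the limit applies per batch (the text between GO separators), not per file.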
Additionally, I have seen problems with large numbers of SQL statements executing in SQL Server Management Studio. (See SQL Server does not finish execution of a large batch of SQL statements for a related problem.) Try adding SET NOCOUNT ON to your SQL to prevent sending unnecessary data. Also, when doing large numbers of INSERT statements, try breaking them into batches using the GO batch separator.
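For illustration, a minimal sketch of such a batched insert script (myTable and column1 are the placeholder names from the earlier answer):
SET NOCOUNT ON -- suppress the per-statement "rows affected" messages
GO
INSERT INTO myTable (column1) VALUES (1)
INSERT INTO myTable (column1) VALUES (2)
GO -- GO ends the batch, so no single batch grows past the batch-size limit
INSERT INTO myTable (column1) VALUES (3)
INSERT INTO myTable (column1) VALUES (4)
GO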

I think your concern comes from trying to open your file in SSMS. Within SSMS, opening a 20 MB file would likely be problematic -- no different from trying to open the same file in Notepad or most text editors.
For the record, for other posters: I don't think the question has anything to do with SQL column, table, object, or database sizes! It's simply a problem with using the IDE.
If the file is pure data to be imported, with NO SQL commands, try bulk import; a BULK INSERT sketch follows below.
If the file is SQL commands, you're going to need an editor that can handle large files, like VEDIT (http://www.vedit.com/). It won't be able to execute the SQL, though; you must do that from the command line using sqlcmd, as noted above.
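As a minimal sketch of the bulk-import route (the path, table name, and delimiters are assumptions about your data file):
BULK INSERT dbo.MyTable
FROM 'D:\PathToMyLargeFile\MyData.csv' -- hypothetical comma-delimited data file
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')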

I came across this article on MSDN, which specifies the "Maximum Capacity Specifications for SQL Server". Going through it, I was able to find:
For SQL Server 2012, 2008 R2, and 2005:
Maximum file size (data): 16 terabytes
Maximum bytes per varchar(max), varbinary(max), xml, text, or image column: 2^31-1 bytes (~2 GB)
For more details on Maximum Capacity Specifications for SQL Server, refer:
For SQL Server 2012:
http://msdn.microsoft.com/en-us/library/ms143432(v=sql.110).aspx
For SQL Server 2008 R2:
http://msdn.microsoft.com/en-us/library/ms143432(v=sql.105).aspx
For SQL Server 2005:
http://msdn.microsoft.com/en-us/library/ms143432(v=sql.90).aspx
For SQL Server 2000, I am not sure, since MSDN seems to have removed the related documentation.

It is not clear from your question what the SQL file contains. The solution I suggest below is only applicable if the SQL file you refer to contains only INSERT statements.
The fastest way to insert large amounts of data into SQL Server is to use its bulk copy functionality (the BCP utility).
If you have SQL Server Management Studio, then you should also have the bcp utility; look in C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Binn (or equivalent).
If you want to use the BCP utility, you will need to create a file that contains the data; it can be comma-delimited. Refer to the bcp documentation for what the file should look like.
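As a hedged example (the server, database, table, and file names are placeholders), a comma-delimited import with bcp looks roughly like this:
bcp MyDatabase.dbo.MyTable in "D:\PathToMyLargeFile\MyData.csv" -S [Servername] -T -c -t,
Here -T uses a trusted connection, -c uses character mode, and -t, sets the comma as the field terminator.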

For the Maximum Capacity Specifications for SQL Server, you can check here:
http://msdn.microsoft.com/en-us/library/ms143432(v=sql.120).aspx
If you ask "Is there a limit on the maximum size of the .SQL file or variable which is used to query or insert data into SQL Server?", I will say yes, there is a limit for each variable. If you want to upload a file with a big size, I recommend converting your file to varbinary, or increasing the maximum upload size in your web system.
Here is an example: http://msdn.microsoft.com/en-us/library/aa479405.aspx

Related

Generated script of schema and table data not running in MS SQL Server sqlcmd

I have a very large table script, several GB in size, that I cannot open in the Query Editor (obviously) due to memory/size limits.
I am trying to run it on the DB server from the command prompt using sqlcmd:
I am 100% sure the path and script name are correct (marked out for privacy reasons). I then used the following two queries to get the DBServer\SQLInstance:
SELECT @@servername
SELECT @@servicename
What am I missing? It appears it has not done anything, with the 1> prompt just sitting there. Do I need to do anything else?
I'm pretty sure the Windows command-line pipeline is just choking on your previous command.
I think your best chance is doing this from Python using pymssql (https://pypi.org/project/pymssql/), provided the SQL instance has the memory to handle the data stream.

Most efficient way to output SQL table to XML file

I have a server that needs to process queries and tables from an SQL database and dump them in XML format to disk. This needs to be a scheduled task.
Currently I am using BCP via a scheduled batch file > SQL script > xp_cmdshell > bcp, but this error is troubling me in the log files:
SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Server Native Client 10.0][SQL Server]Warning: Server data (172885 bytes) exceeds host-file field length (65535 bytes) for field (1). Use prefix length, termination string, or a larger host-file field size. Truncation cannot occur
I have found no solution online yet. I do not quite understand what the 'host-file field' is referring to; the original table has no column with a value as large as 172885 bytes. The output files are very large, and so far it seems as though the data is all being written, but there seems to be some garbage at the end of all the XML files.
Performance is important but reliability is the most important for me in this situation.
I have tried recreating the error locally but have been unsuccessful in doing so. The server runs Windows Server 2008 r2.
Any help or explanation/analysis of the error and its meaning, as well as a recommendation of a simple scheduled solution to dump the SQL tables/queries to XML files, would be appreciated.
You should check out the FOR XML PATH syntax introduced in SQL Server 2005:
SQL Server: simple example of creating XML file with T-SQL
What's new in FOR XML in SQL Server 2005
With this, you can easily create fairly nifty XML output, including hierarchies, attributes, and more.
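As a minimal sketch (MyTable and MyColumn are placeholder names), a query like this returns the table as an XML fragment you can then write to disk:
SELECT MyColumn AS Name
FROM MyTable
FOR XML PATH('Item'), ROOT('Items')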

How to do a search and replace of a part of a string in all columns in all tables of an SQL database

Is it possible to search and replace all occurrences of a string in all columns in all tables of a database? I use Microsoft SQL Server.
Not easily, though I can think of two ways to do it:
Write a series of stored procedures that identify all varchar and text columns of all tables, and generate individual UPDATE statements for each column of each table, of the form UPDATE foo SET BAR = REPLACE(BAR, 'foobar', 'quux'). This will probably involve a lot of queries against the system tables, and a lot of experimentation; Microsoft doesn't go out of its way to document this stuff. (A sketch of this approach follows after this list.)
Export the entire database to a single text file, do a search/replace on that, and then re-import the entire database. Given that you're using MS SQL Server, this is actually the easier approach. Microsoft created the Microsoft SQL Server Database Publishing Wizard for other reasons, but it makes a fine tool for exporting all of the tables of a SQL Server database as a text file containing pure SQL DDL and DML. Run the tool to export all of the tables for a database, edit the resulting file as you need, and then feed the file back to sqlcmd to recreate the database.
Given a choice, I'd use the second method, as long as the DPW works with your version of SQL Server. The last time I used the tool, it met my needs (MS SQL Server 2000 / 2005) but it had some quirks when working with database Roles.
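As a rough sketch of the first approach (the search/replace strings are the placeholders from above, and text/ntext columns would need a CAST first), the UPDATE statements can be generated from the INFORMATION_SCHEMA views, then reviewed and executed:
SELECT 'UPDATE [' + c.TABLE_SCHEMA + '].[' + c.TABLE_NAME + '] SET [' + c.COLUMN_NAME + '] = REPLACE([' + c.COLUMN_NAME + '], ''foobar'', ''quux'')'
FROM INFORMATION_SCHEMA.COLUMNS c
JOIN INFORMATION_SCHEMA.TABLES t
  ON t.TABLE_SCHEMA = c.TABLE_SCHEMA AND t.TABLE_NAME = c.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE' -- skip views
  AND c.DATA_TYPE IN ('varchar', 'nvarchar', 'char', 'nchar')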
In MySQL, you can do it very easily like this:
update [table_name] set [field_name] = replace([field_name],'[string_to_find]','[string_to_replace]');
I have personally tested this successfully on a production server.
Example:
update users set vct_filesneeded = replace(vct_filesneeded,'.avi','.ai');
Ref: http://www.mediacollege.com/computer/database/mysql/find-replace.html
A good starting point for writing such a query is the "Search all columns in all the tables in a database for a specific value" stored procedure. The full code is at the link (not trivial, but copy/paste it and use it, it just works).
From there on it's relatively trivial to amend the code to do a replace of the found values.

SSIS and MySQL - Table Name Delimiter Issue

I am trying to insert rows into a MySQL database from an Access database using SQL Server 2008 SSIS.
TITLE: Microsoft SQL Server Management Studio
------------------------------
ERROR [42000] [MySQL][ODBC 5.1 Driver][mysqld-5.0.51a-community-nt]You have
an error in your SQL syntax; check the manual that corresponds to your MySQL
server version for the right syntax to use near '"orders"' at line 1
The problem is with the delimiters. I am using the 5.1 ODBC driver, and I can connect to MySql and select a table from the ADO.Net destination data source.
The MySql tables all show up delimited with double-quotes in the SSIS package editor:
"shipto addresses"
Removing the double quotes from the "Use a table or view" text box on the ADO.NET Destination Editor or replacing them with something else does not work if there is a space in the table name.
When SSIS puts the Insert query together, it retains the double quotes and adds single quotes.
The error above is shown when I click on "Preview" in the editor, and a similar error is thrown when I run the package (albeit then from the actual insert statement).
I don't seem to have control over this behavior. Any suggestions? Other package types where I can hand-code the SQL don't have this problem.
Sorry InnerJoin, I had to take the accepted answer away from you. I found a workaround here:
The solution is to reuse the connection for all tasks, and to turn ANSI quotes on for the connection before you do any inserts, with an Execute Sql task that runs the following:
set sql_mode = 'STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES'
Try using square brackets around the table names. That may help.
EDIT: If you can, I would create views (with no spaces) based on the Access tables, and use those to export. Even if it means building another Access database with linked tables, I think this is your best bet.
I've always struggled with using SSIS with MySQL directly. Even after installing the ODBC drivers, they just don't play well in data flows. I've always ended up creating linked ODBC connections between SQL Server and MySQL, and then relying on linked-server queries to bring over data. Instead of using an SSIS data flow task, I use an Execute SQL command, usually in the form of a stored procedure that executes an OPENQUERY.
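As a hedged sketch of that pattern (MYSQL_LINKED is a hypothetical linked server name, and the tables are placeholders), the insert goes through OPENQUERY like this:
INSERT INTO OPENQUERY(MYSQL_LINKED, 'SELECT col1, col2 FROM orders')
SELECT col1, col2 FROM dbo.StagingOrders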
One solution is to load the data into a SQL Server database and use it as a staging environment before you load it into the MySQL database. I regularly move data between SQL Server 2008 and MySQL, and in the past I used to regularly move data between Access and SQL Server.
Another possible solution is to transform the incoming Access data before it loads into the MySQL database. That may give you a chance to clean up the column names and the actual data that's going through to MySQL.
Let me know if either of these work for you.
You can locate the configuration setting file my.ini at <<Drive>>:\ProgramData\MySQL\MySQL Server 5.6\my.ini and add "ANSI_QUOTES" to sql-mode.
e.g: sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES". This should solve the issue while previewing in the SSIS editor.

How do I make a script in SQL Management Studio 2005?

I have a table in an MS SQL Server db. I want to create a script that will put the table and all its records into another db. So I right-click the table in Management Studio and select Script Table as > CREATE To > New Query Editor Window... but all I get is the table structure.
How exactly do I get the values too?
This is one of the things I really like about the MySQL tools, and that SQL Server is certainly missing out of the box. You can use a script to do it, however.
You might also want to consider using something like Red-Gate SQL Compare and Red-Gate SQL Data Compare. They aren't cheap tools, priced at $395 each (for the standard editions), but there are 14 day free trials available for download, and they make copying schema and data from one SQL Server to another very easy.
If both are on the same machine (or on different machines with the servers linked), you can create the table with the script you generate automatically, and then copy the data like this:
INSERT INTO [destinationdb].[dbo].[destinationtable] SELECT *
FROM [originaldb].[dbo].[originaltable]
(Prepend [servername] to the database name if you'll be using linked servers)
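For example, with a hypothetical linked server named MYSERVER:
INSERT INTO [MYSERVER].[destinationdb].[dbo].[destinationtable]
SELECT * FROM [originaldb].[dbo].[originaltable]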
Another option is to enable xp_cmdshell (do it with care; it relaxes security constraints) and use the bcp command-line utility from Management Studio to create copies you can then import into the other database/server. You can do that directly from the shell as well, and in that case you do not need to enable xp_cmdshell, of course.
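As a sketch of that option (paths and names are placeholders; -n keeps bcp's native format for a later bcp in):
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
EXEC xp_cmdshell 'bcp originaldb.dbo.originaltable out "C:\temp\originaltable.dat" -S [Servername] -T -n';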
It doesn't really create a "SQL script", but it does the job:
select the database in the object explorer
right click
select import/export data
follow the wizard
at the end of the process you can save the "integration service package" to reuse it
later you can modify the details by opening the .dtsx
(it will take care of security, and won't cost a penny more; it seems we have to compete with the other answers :) )
hope it helps.