I have to export a table from a SQL Server instance. The table contains a column with large text content, with lengths of up to 100,000 characters.
When I use Excel as the export destination, I find that the text is capped and truncated to 32,765 characters.
Is there an export format that preserves the length?
Note:
I will eventually be importing this data into another SQL Server
The destination SQL Server is in another network, so linked servers and other local options are not feasible
I don't have access to the actual server, so generating a backup is difficult
As documented in Excel's specifications and limits, the maximum number of characters that can be stored in a single Excel cell is 32,767; hence your data is being truncated.
You might be better off exporting to a CSV. Note, however, that quoted CSV files aren't supported by bcp/BULK INSERT until SQL Server 2019 (currently in preview). You can use a character sequence like || to denote the field delimiter, but if you have any line breaks you'll need to choose a different row delimiter too. SSIS, and other ETL tools, do support quoted CSV files, so you could use something like that instead.
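As a minimal sketch of the || idea on the importing side, assuming a hypothetical file C:\export\long_text.txt and a placeholder table dbo.ImportedText:

BULK INSERT dbo.ImportedText
FROM 'C:\export\long_text.txt'
WITH (
    FIELDTERMINATOR = '||',  -- a sequence unlikely to appear inside the text values
    ROWTERMINATOR = '\n'     -- pick something else if the text itself contains line breaks
);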
Otherwise, if you need to export such long values and want to use Excel as much as you can (which I actually personally don't recommend due to those awful ACE drivers), I would suggest exporting the (n)varchar(MAX) values to something else, like a text file, and naming each file with the value of your Primary Key included. Then, when you import the data back you can retrieve the (n)varchar(MAX) value again from each individual file.
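As a rough sketch of that idea, assuming xp_cmdshell is available and that MyDb.dbo.SourceTable, Id, and LongText are placeholder names, you could loop over the primary keys and write one file per row with bcp queryout:

DECLARE @id INT, @cmd VARCHAR(4000);

DECLARE id_cursor CURSOR FAST_FORWARD FOR
    SELECT Id FROM dbo.SourceTable;

OPEN id_cursor;
FETCH NEXT FROM id_cursor INTO @id;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- One file per primary key value, e.g. C:\export\42.txt
    SET @cmd = 'bcp "SELECT LongText FROM MyDb.dbo.SourceTable WHERE Id = '
             + CAST(@id AS VARCHAR(10))
             + '" queryout "C:\export\'
             + CAST(@id AS VARCHAR(10)) + '.txt" -c -T -S .';
    EXEC master..xp_cmdshell @cmd, no_output;

    FETCH NEXT FROM id_cursor INTO @id;
END

CLOSE id_cursor;
DEALLOCATE id_cursor;

On the destination side, something like OPENROWSET(BULK ..., SINGLE_CLOB) or a small client program can read each file back and update the matching row by its key.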
A .sql script is the best format for a SQL table. It is the native format for a SQL table, so you don't have to convert the export.
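For example, the Generate Scripts task in SSMS (right-click the database, Tasks > Generate Scripts, with "Types of data to script" set to "Schema and data") produces a .sql file of INSERT statements along these lines, where the table and column names below are just placeholders:

INSERT INTO dbo.SourceTable (Id, LongText)
VALUES (1, N'...the full text value is embedded here, with no 32,767-character cap...');

Running that script on the destination server recreates the rows without truncation.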
First, a grumble: MS builds SQL Server Studio AND Excel, but can't make one save in the standard format of the other?
OK, I'm a data analyst, but not allowed to change/mod either the data or structures directly. So full READ, but no WRITE.
I'm trying to do a dump so I can do some of this analysis offline, as I have no remote access either.
So one VARCHAR2 column in this table is for comments on the purchase of the asset being described/tracked. Of course, there are commas. The only export types built into SQL Server Studio are .csv and .txt, and .csv just turns into a mess when 'comma' is included as a delimiter.
So after an hour or so of screwing around with this (including reading a thread on methods for excluding the one column from a SELECT while still exporting the other 221 columns in the table without having to write them all out manually; fun reading, impressive, but it means I'd have to figure out which of those methods actually works, and then still export the one column separately and insert it into the Excel file separately), I am throwing this problem on the pile at StackOverflow.
Someone else must have worked around this frustration of exporting to .csv when commas are embedded in the 'comment' text.
Any help would be appreciated.
Why don't you simply select all the data in the SSMS results window, then copy it and paste it into a blank Excel file?
It should copy and paste all the data in the correct format, keeping the comma-containing values in a single column.
Try that.
So if you replace the ' with some special character, you can export it:
SELECT
    REPLACE(columnName, '''', '`')
FROM [Table]
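Once the data is loaded on the destination server, you would presumably reverse the substitution; ImportedTable here is a placeholder name:

UPDATE ImportedTable
SET columnName = REPLACE(columnName, '`', '''');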
Another option, if you use Management Studio, is the SQL Server Import and Export Wizard:
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/start-the-sql-server-import-and-export-wizard
I want to move the data from a spreadsheet into a database. The program I am using is called SQLWorkbenchJ. I am kind of lost and don't really know where to start. Are there any tips or pointers that might send me in the right direction?
SQL Workbench/J provides the WbImport command for loading a text file into a DB table. So if you save your spreadsheet file in CSV (comma-separated values) format, you can then load it into a table using this command.
Here is an example that loads the text file CLASSIFICATION_CODE.csv, which has , as the field delimiter and ^ as the quoting character, into the CLASSIFICATION_CODE DB table.
WbImport -type=text
-file='C:\dev\CLASSIFICATION_CODE.csv'
-delimiter=,
-table=CLASSIFICATION_CODE
-quoteChar=^
-badfile='C:\dev\rejected'
-continueOnError=true
-multiLine=true
-emptyStringIsNull=false;
You might not need all the parameters of the example. Refer to the documentation to find the ones you need.
If the data in your spreadsheet are heterogeneous (e.g. your spreadsheet has two sheets), then split them into two files in order to store them in separate DB tables.
I have generated an Excel file from an SSIS package successfully.
But every column has an extra ' (quote) mark. Why is that?
My source SQL table is like below:
Name price address
ashu 123 pune
jkl 34 UK
In my SQL table I made every column the varchar(50) datatype.
In the Excel connection manager, when it creates the table,
the Excel destination takes every column as the same varchar(50) datatype.
And in the Data Flow I have used a Data Conversion transformation to prevent the Unicode conversion error.
Please advise what I need to change to get clean columns in the Excel file.
You could create a template Excel file in which you have specified all the column types (changed from General to Text) and headers you will need. Store it in a /Template directory and have the SSIS package copy it over to where you will need it.
In your SSIS package:
Use Script Component to copy Excel Template file into directory of choice.
Programmatically change its name and store the whole filepath in a variable that will be used in your corresponding Data Flow Task.
Use Expression Builder for your Excel Connection Manager. Set the ExcelFilePath to be retrieved from your variable.
The single quote or apostrophe is a way of entering data in Excel and ensuring it is treated as text, so that numbers with leading zeros or fractions are not interpreted by Excel as numerics or dates.
A NJ zip code, for instance 07456, would be interpreted as 7456, but by entering it as '07456 it keeps its leading zero (note that the numbers in your example are left-aligned, as text is).
I guess SSIS is adding the quotes because your data is of VARCHAR type.
First, define the field types for your Excel destination in SSIS; any non-text fields will then format properly without the '. Then add a Derived Column transformation between your source and destination, and use a REPLACE expression for any text columns.
It should be:
REPLACE(Column1, "'", "")
This caused me major problems! So I completed the following:
You can change the Excel version to 'Microsoft Excel 4.0' within the Excel Connection Manager in your SSIS package.
Then, within Excel, go to Options > Trust Center > Trust Center Settings > File Block Settings and untick the 'Open' checkbox for 'Excel 4 workbooks' and 'sheets'.
It is a particular problem when using the Excel destination, at least with older versions of SSIS anyway. To answer the why question, there is this in the Microsoft documentation:
The following behaviors of the Jet provider that is included with the Excel driver can lead to unexpected results when saving data to an Excel destination.
Saving text data. When the Excel driver saves text data values to an Excel destination, the driver precedes the text in each cell with the single quote character (') to ensure that the saved values will be interpreted as text values. If you have or develop other applications that read or process the saved data, you may need to include special handling for the single quote character that precedes each text value.
Taken from https://learn.microsoft.com/en-us/previous-versions/sql/sql-server-2008-r2/ms137643(v=sql.105)
What's the easiest way to export data to Excel from SQL Server 2000?
I want to do this from commands I can type into query analyzer.
I want the column names to appear in row 1.
In Query Analyzer, go to the Tools -> Options menu. On the Results tab, choose to send your output to a CSV file and select the "Print column headers" option. The CSV will open in Excel and you can then save it as a .XLS/.XLSX
Manual copy and paste is the only way to do exactly what you're asking. Query Analyzer can include the column names when you copy the results, but I think you may have to enable that somewhere in the options first (it's been a while since I used it).
Other alternatives are:
Write your own script or program to convert a result set into a .CSV or .XLS file
Use a DTS package to export to Excel
Use bcp.exe (but it doesn't include column names, so you have to kludge it)
Use a linked server to a blank Excel sheet and INSERT the data
Generally speaking, you cannot export data from MSSQL to a flat file using pure TSQL, because TSQL cannot manipulate anything outside the database (using a linked server is sort of cheating). So you usually need to use some sort of client application anyway, whether it's bcp.exe, dtswiz.exe or your own program.
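For completeness, here is a sketch of the "linked server to a blank Excel sheet" option listed above, assuming ad hoc distributed queries are enabled, the Jet OLE DB provider is installed, and the file, table, and column names are placeholders. The workbook must already exist, with the column headers in row 1 of Sheet1:

-- Append rows from the database into an existing Excel sheet
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\export\results.xls;HDR=YES',
    'SELECT Col1, Col2, Col3 FROM [Sheet1$]')
SELECT Col1, Col2, Col3
FROM dbo.SourceTable;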
And as a final comment, MSSQL 2000 is no longer supported (unless your company has an extended maintenance agreement) so you may want to look into upgrading at some point.
I have a CSV file whose field delimiter is ,. My CSV files are very big, and I need to import them into a SQL Server table. The process must be automated; it is not a one-time job.
So I use BULK INSERT to load these CSV files. But today I received a CSV file that has a row like this:
1,12312312,HOME ,"House, Gregory",P,NULL,NULL,NULL,NULL
The problem is that BULK INSERT splits this row, specifically the field "House, Gregory",
into two fields, '"House' and ' Gregory"'.
Is there some way to make BULK INSERT understand that the double quotes override the behaviour of the comma?
When I open this CSV with Excel, it sees the field correctly as 'House, Gregory'.
You need to preprocess your file; look at this answer:
SQL Server Bulk insert of CSV file with inconsistent quotes
If every row in the file has the double quotes, you can specify ," and ", as the column separators around that column using format files (a sketch follows below).
If not, get the file changed, or you'll have to write some clever pre-processing routine somewhere.
The file format needs to be consistent for any of the SQL Server tools to work.
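To make the format-file idea concrete, here is a rough sketch for the nine-column sample row, assuming SQL Server 2005+ (hence the 9.0 header), placeholder column names and lengths, and that the fourth field is quoted on every row. The terminators around the fourth column include the quote characters, so the quotes are consumed as part of the delimiters rather than loaded as data:

9.0
9
1  SQLCHAR  0  12   ","      1  Col1   ""
2  SQLCHAR  0  12   ","      2  Col2   ""
3  SQLCHAR  0  50   ",\""    3  Col3   ""
4  SQLCHAR  0  200  "\","    4  Col4   ""
5  SQLCHAR  0  4    ","      5  Col5   ""
6  SQLCHAR  0  50   ","      6  Col6   ""
7  SQLCHAR  0  50   ","      7  Col7   ""
8  SQLCHAR  0  50   ","      8  Col8   ""
9  SQLCHAR  0  50   "\r\n"   9  Col9   ""

The load itself would then reference the format file:

BULK INSERT dbo.TargetTable
FROM 'C:\data\input.csv'
WITH (FORMATFILE = 'C:\data\input.fmt');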
Since you are referring to SQL Server, I assume you have Access available as well (a Microsoft-friendly environment). If you do have Access, I recommend you use its Import Wizard. It is much smarter than the import wizard of SQL Server (even in version 2014), and smarter than the BULK INSERT SQL command as well.
It has a widget where you can define the text separator to be ", and it also has no problem with string length because it uses the Access data type Text.
If you are satisfied with the results in Access, you can later import them into SQL Server seamlessly.
The best way to move the data from Access to SQL Server is the SQL Server Migration Assistant, available here