Script to import an ever-changing Excel CSV file into SQL Server Express 2008

I have looked at this site and several others, and the closest I could find was Dealing with a changing Excel table structure to import to a database table.
But I'm actually looking for a script that identifies the column names and types from the CSV file and creates the table from them.
The actual import script is working well, but it only caters for fixed headers, while my files have anywhere between 160 and 170 columns at any given time.
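For illustration, a minimal sketch of the kind of script being asked for, in Python with pyodbc; the file path, table name, DSN, and the NVARCHAR(255) default column type are all placeholder assumptions, not from the question:

import csv
import pyodbc  # assumed driver; any DB-API connection works the same way

CSV_PATH = r"C:\data\import.csv"   # placeholder path
TABLE = "dbo.ImportedCsv"          # placeholder table name

# Read only the header row to discover the current column set.
with open(CSV_PATH, newline="") as f:
    headers = next(csv.reader(f))

# Build a CREATE TABLE with one NVARCHAR column per header. Real type
# inference would sample data rows; NVARCHAR(255) is just a safe default.
cols = ",\n    ".join("[{}] NVARCHAR(255)".format(h.strip()) for h in headers)
create_sql = "CREATE TABLE {} (\n    {}\n)".format(TABLE, cols)

conn = pyodbc.connect("DSN=SqlExpress;Trusted_Connection=yes")  # placeholder DSN
conn.cursor().execute(create_sql)
conn.commit()

The existing fixed-header import script can then be pointed at the freshly created table.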

Related

MSSQL DB Multi-Table Export to Excel Issue

I need to export a subset of DB data tables into a single file. I tried using SSMS to export to an Excel workbook, where each table would have its own worksheet. Sounds ideal! However, many of our table names are > 31 characters in length, so table names are being truncated by Excel (which apparently has a non-configurable worksheet naming convention) as the worksheets are created. This won't work for us, and it's far too late to change the table names. Therefore, it seems an Excel workbook is out.
I am thinking there must be an existing tool with a wizard that would allow us to select a handful of tables and store them in a hierarchical way, in a single file, such that the table names are properly preserved. Perhaps an export to JSON?
Any suggestions?
I could not find a product, or open source tool, that would properly save Azure SQL tables whose table names could be > 31 characters. I wrote a Python script and saved the tables in JSON format (output snippet below; a sketch of the script follows it).
{
  "artifacts": [{
    "tablename": "[dbo].[Config_Entity_Metadata_Sample_Table]",
    "contents": "{\"schema\":{\"fields\":[{\"name\":\"index\",\"type\":\"integer\"}, ...]}}"
  }]
}
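The "contents" shape above matches what pandas produces with orient="table", so a sketch of such a script might look like this (the table list, DSN, and output file name are placeholders, not from the answer):

import json
import pandas as pd
import pyodbc  # assumed driver; any SQLAlchemy/DB-API connection would do

TABLES = ["[dbo].[Config_Entity_Metadata_Sample_Table]"]      # placeholder list
conn = pyodbc.connect("DSN=AzureSql;Trusted_Connection=yes")  # placeholder

artifacts = []
for name in TABLES:
    df = pd.read_sql("SELECT * FROM {}".format(name), conn)
    # orient="table" embeds the schema (column names and types) alongside the
    # data, which is what produces the {"schema": {"fields": [...]}} shape.
    artifacts.append({"tablename": name, "contents": df.to_json(orient="table")})

with open("export.json", "w") as f:
    json.dump({"artifacts": artifacts}, f)

Since the full table name, brackets and all, is preserved as an ordinary JSON string, the 31-character worksheet limit never comes into play.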

Exporting SQL Server table containing a large text column

I have to export a table from SQL Server; the table contains a column holding large text content, with the length of the text going up to 100,000 characters.
When I use Excel as an export destination, I find that the length of this text is capped and truncated to 32,765 characters.
Is there an export format that preserves the length?
Note:
I will eventually be importing this data into another SQL Server
The destination SQL Server is in another network, so linked servers and other local options are not feasible
I don't have access to the actual server, so generating a backup is difficult
As documented in Excel's specifications and limits, the maximum number of characters that can be stored in a single Excel cell is 32,767; hence your data is being truncated.
You might be better off exporting to a CSV; however, note that quote-delimited CSV files aren't supported by bcp/BULK INSERT until SQL Server 2019 (currently in preview). You can use a character sequence like || to denote a field delimiter, but if you have any line breaks you'll need to choose a different row delimiter too. SSIS and other ETL tools do, however, support quote-delimited CSV files, so you can use something like that.
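As an illustration of that approach, a bcp export with a multi-character field delimiter and a non-default row delimiter might look like the following (the server, table, and file names are placeholders):

bcp dbo.MyTable out C:\export\MyTable.dat -S MyServer -T -c -t "||" -r "##\n"

With a row delimiter like ##\n, line breaks embedded inside the long text values no longer terminate a row.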
Otherwise, if you need to export such long values and want to use Excel as much as you can (which I personally don't recommend, due to those awful ACE drivers), I would suggest exporting the (n)varchar(MAX) values to something else, like a text file, naming each file with the value of your primary key. Then, when you import the data, you can retrieve the (n)varchar(MAX) value again from each individual file.
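A minimal sketch of that per-file export, assuming Python with pyodbc (the DSN, table, and column names are placeholders):

import pyodbc  # assumed driver

conn = pyodbc.connect("DSN=SourceServer;Trusted_Connection=yes")  # placeholder
cur = conn.cursor()

# One file per row, named after the primary key, so no value is ever subject
# to Excel's 32,767-character cell limit.
for pk, big_text in cur.execute("SELECT Id, BigColumn FROM dbo.MyTable"):
    with open("bigcolumn_{}.txt".format(pk), "w", encoding="utf-8") as f:
        f.write(big_text or "")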
.sql is the best format for a SQL table. It is the native format for a SQL table, so you don't have to convert anything on export.

How to export VARCHAR2 data that includes commas(!) to Excel from MSSQLMgmtStudio(2012)

First, a grumble: MS builds SQL Server Studio AND Excel, but can't make one save in the standard format of the other?
OK, I'm a data analyst, but not allowed to change/mod either the data or structures directly. So full READ, but no WRITE.
I'm trying to do a dump so I can do some of this analysis offline, as I have no remote access either.
So one VARCHAR2 column in this table is for comments on the purchase of the asset being described/tracked. Of course, there are commas. The only export types built into SQL Server Studio are .csv and .txt, and .csv just turns into a mess when the data itself contains the comma delimiter.
So after an hour or so of screwing around with this (including reading a thread on methods for excluding the one column from a SELECT while still exporting the other 221 columns in the table without writing them all out manually; fun reading, and impressive, but it means I'd have to figure out which method actually works, and then still export the one column separately and insert it into the Excel file by hand), I am throwing this problem on the pile at Stack Overflow.
Someone else must have worked around this frustration of the .csv export format versus the commas embedded in the 'comment' text.
Any help would be appreciated.
Why don't you simply select all the data in the SSMS results window, then copy and paste it into a blank Excel file?
It should paste all the data in the correct format, keeping comma-containing values in a single column.
Try that.
So if you replace the ' with some special character, you can export it:
SELECT REPLACE(columnName, '''', '`')
FROM Table
Another solution, if you use Management Studio, is the Import and Export Wizard:
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/start-the-sql-server-import-and-export-wizard
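For what it's worth, any CSV writer that quotes fields handles embedded commas without character substitution; a minimal sketch with Python's csv module and pyodbc (the DSN and table name are placeholders):

import csv
import pyodbc  # assumed driver

conn = pyodbc.connect("DSN=SqlServer2012;Trusted_Connection=yes")  # placeholder
cur = conn.execute("SELECT * FROM dbo.Assets")  # placeholder table

with open("assets.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, quoting=csv.QUOTE_MINIMAL)  # quotes fields containing commas
    writer.writerow(col[0] for col in cur.description)  # header row from cursor metadata
    writer.writerows(cur)

Excel opens a quoted CSV like this with the comment text intact in a single column.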

Is there an easier way to import this data into a sql database?

I'm trying to import a bunch of ACH files and make one big SQL table. An ACH file is a text file with transaction information arranged in columns. The problem is that I need to add a date column. Currently the date is only contained in the file name and header, and there are about 3,000 files, each with a different date.
I have basic knowledge of SQL commands and how to query a database, but I just started learning about importing data for this project. The only tool I have found is the "Import and Export Data" program that ships with SQL Server 2012. It allows me to import a text file and turn it into a table.
The problem is that I have to import the text file and create the table. Then I add a column for the date and do
UPDATE table
SET date = 'date'
and then I can combine tables with an INSERT command. Then do it all again, 3,000 times.
Is there a better way?
Write a program that can open all files in a directory.
Extend that program to parse ACH files.
Extend that program to get the date from the file name.
Extend that program to write to the database.
I'd say it's 3 hours of work; a sketch of such a program follows.
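A rough sketch, assuming Python with pyodbc; the directory, file-name date convention, destination table, and the ACH field positions shown are illustrative assumptions, not from the question:

import re
from datetime import datetime
from pathlib import Path
import pyodbc  # assumed driver

conn = pyodbc.connect("DSN=SqlServer2012;Trusted_Connection=yes")  # placeholder
cur = conn.cursor()

# Assumed file-name convention: the date is embedded as YYYYMMDD, e.g. ach_20120315.txt.
for path in Path(r"C:\ach_files").glob("*.txt"):  # placeholder directory
    file_date = datetime.strptime(re.search(r"\d{8}", path.name).group(), "%Y%m%d").date()
    for line in path.read_text().splitlines():
        # ACH files are fixed-width 94-character records; a leading '6' marks an
        # entry detail record. The slices below are illustrative, not a full parser.
        if line[:1] == "6":
            account = line[12:29].strip()      # receiving account number
            amount = int(line[29:39]) / 100.0  # amount, stored in cents
            cur.execute(
                "INSERT INTO dbo.AchEntries (FileDate, Account, Amount) VALUES (?, ?, ?)",
                file_date, account, amount,
            )
conn.commit()

One pass over the directory replaces the 3,000 manual import/UPDATE/INSERT rounds.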

BCP utility to create a format file, to import Excel data to SQL Server 2008 for BULK insertion

I am trying to import Excel 2003 data into a SQL table on SQL Server 2008.
I tried to add a linked server, but have met with little success.
Now I am trying to check whether there's a way to use the BCP utility to do a BULK INSERT or a bulk operation with OPENROWSET, using a format file to get the Excel mapping.
First of all, how can I create a format file for a table that has differently named columns than the Excel spreadsheet's columns?
Next, how do I use this format file to import data from, say, a file at C:\Folder1\Excel1.xls
into table Table1?
Thank you.
There are some examples here that demonstrate what the data file should look like (CSV) and what the format file should look like. Unless you need to do this a lot, I'd just hand-craft the format file, save the Excel data to CSV, then try using bcp or OPENROWSET.
The format file specifies the column names for the destination. The data file doesn't have column headings, so you don't need to worry about the Excel (source) columns being named differently.
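For illustration, a hand-crafted non-XML format file for a three-column Table1 might look like the following ("10.0" is the format-file version for SQL Server 2008; the column names and lengths here are made up):

10.0
3
1   SQLCHAR   0   100   ","      1   Name      SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   100   ","      2   Category  SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   12    "\r\n"   3   Amount    ""

With the Excel sheet saved as C:\Folder1\Excel1.csv, the import would then be along the lines of bcp dbo.Table1 in C:\Folder1\Excel1.csv -f C:\Folder1\Excel1.fmt -S .\SQLEXPRESS -T, or BULK INSERT Table1 FROM 'C:\Folder1\Excel1.csv' WITH (FORMATFILE = 'C:\Folder1\Excel1.fmt').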
If you need to do more mapping, etc., then create an SSIS package. You can use the data import wizard to get you started, then save it as an SSIS package, and edit it to your heart's content.
If it's a one-off, I'd use the SQL data import wizard, from right-clicking the database in Management Studio. If you just have a few rows to import from Excel, I typically open a query to Edit Top 200 Rows, edit the query to match the columns I have in Excel, then copy and paste the rows from Excel into SQL Management Studio. It doesn't handle errors very well, but it's quick.