Padding SSIS input source columns to avoid truncation errors? - sql

First post. In SSIS I am using an ODBC Source, and the database (or ODBC driver) doesn't appear to report column metadata correctly for any of the tables in the database for varchar type columns. Therefore, each time I import a table, I get truncation errors on all the varchar fields. Is there any way to set the size of these fields besides doing it ONE AT A TIME in the advanced editor? When importing a flat file source it lets you select a padding % for string fields. Does something like this exist for OLE or ODBC sources? If not, is there any way I can override the column length to, say, force them all to be VARCHAR(1000)?

I have never experienced SQL Server providing the wrong metadata over an ODBC connection, and it is unlikely you have a ghost in the machine. The metadata of the columns can be set in the ODBC source via the Advanced Editor, and I am willing to bet that is where the difference is. To confirm this:
Right-click the ODBC source and select Show Advanced Editor
Click on the Input and Output Properties tab
Expand ODBC Source Output
Expand both External Columns and Output Columns
Inspect each column pair and verify that the metadata matches
Correct any mismatches in the metadata
Let me know if that works. If it does not, please provide the data and the SQL query you are using.

The VARCHAR field width must be set to the maximum incoming field width. I know the default field width is 50. Regardless, each field must be set. I previously worked on a project with a large number of columns in the input files. My solution was to store the metadata for the columns in a database table, and then I built a C# application to read in that metadata, modify the *.dtsx file, and set the metadata on all columns. This is the best solution that I am aware of to automate the task; a rough sketch of the idea follows.
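For illustration, here is a minimal sketch of that kind of tool, under stated assumptions: the package uses the SSIS 2012+ .dtsx format, where string columns in the data flow carry dataType="str" or "wstr" plus a length attribute, and every string column is simply widened to a fixed width. The file name and the width of 1000 are placeholders; verify the element and attribute names against your own package XML before relying on this.

using System;
using System.Xml;

class DtsxColumnWidener
{
    static void Main(string[] args)
    {
        // Path to the package to patch (placeholder).
        string path = args.Length > 0 ? args[0] : "Package.dtsx";

        XmlDocument doc = new XmlDocument();
        doc.Load(path);

        // External and output columns in the data flow both carry a
        // "length" attribute for string types (DT_STR / DT_WSTR).
        foreach (XmlElement col in doc.SelectNodes("//*[@dataType and @length]"))
        {
            string dataType = col.GetAttribute("dataType");
            if (dataType == "str" || dataType == "wstr")
            {
                col.SetAttribute("length", "1000"); // assumed target width
            }
        }

        doc.Save(path);
        Console.WriteLine("String column lengths updated in " + path);
    }
}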
Unfortunately, I don't have much experience with pulling data through ODBC. Are you pulling from an Access database? Or, what are you pulling from?

Related

Is it possible to override the Excel Data Type through SSIS?

I've tried finding a solution for my issue but alas the problem continues. I've got an Excel Data Destination which I am trying to map in SSIS. [Please note the issue is with the way SSIS identifies the data type of the Excel input. The scenario is OLE DB Source > Data Conversion > Excel Destination, so please don't tell me to do a Data Conversion or to use the Input and Output Properties method, because it doesn't work; it just converts back to what SSIS "thinks" it's meant to be the instant I click out of the properties window.] I'm trying to create a new Excel document through SSIS by mapping the template to my data source from the OLE DB Source.
Now when I do it with example data in the Excel Destination, it works fine, because SSIS registers that the value in the workbook is NTEXT [which is what I want]. However, the instant I apply the expression to use a blank template [with just headers, no example data], it converts the data type in my template to NVARCHAR(255), which is wrong, and my package fails when I execute it due to the incompatible data type.
I've tried converting the data type within the Excel workbook to a TEXT format, but it doesn't matter, because when you pull it into the Data Flow, SSIS overwrites it and identifies that column as NVARCHAR(255). Even when I give up, comply, and change the input data to NVARCHAR(255) out of sheer annoyance, it still doesn't work: the package fails and gives me an error message that it truncates my column field [-_-"]. I can't win.
I'll probably try to use a SQL command to force it to identify the column as NTEXT in the Excel Destination Editor, or find some other way of forcing SSIS to identify the column as NTEXT, but is there another way I am not aware of? I feel this is quite a well-known issue and there should be a plausible solution. Any assistance will be appreciated. Thank you.
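No answer is recorded here, but one workaround in the spirit of the asker's SQL command idea, offered as a hedged sketch: the Jet/ACE provider maps the LONGTEXT column type to an Excel memo column, which SSIS then surfaces as DT_NTEXT. Creating the destination sheet with an explicit CREATE TABLE statement (for example via the New button in the Excel Destination editor; the column name below is illustrative) can therefore pin the type instead of letting the driver guess:

CREATE TABLE [Sheet1] (
    [LongDescription] LONGTEXT
)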

Exporting SQL Server table containing a large text column

I have to export a table from SQL Server. The table contains a column that holds large text content, with the length of the text going up to 100,000 characters.
When I use Excel as an export destination, I find that the length of this text is capped and truncated to 32,765 characters.
Is there an export format that preserves the length?
Note:
I will eventually be importing this data into another SQL Server
The destination SQL Server is in another network, so linked servers and other local options are not feasible
I don't have access to the actual server, so generating a backup is difficult
As documented in the Excel specifications and limits, the maximum number of characters that can be stored in a single Excel cell is 32,767, which is why your data is being truncated.
You might be better off exporting to a CSV. Note, however, that quote-qualified CSV files aren't supported by bcp/BULK INSERT until SQL Server 2019 (currently in preview). You can use a character sequence like || to denote a field delimiter, but if you have any line breaks you'll need to choose a different row delimiter too. SSIS and other ETL tools, however, do support quote-qualified CSV files, so you can use something like that.
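As a hedged sketch of the delimiter approach (server, database, table, and column names are placeholders; -T uses a trusted connection, -w writes Unicode, -t sets the field terminator, and -r the row terminator):

bcp "SELECT Id, BigText FROM dbo.MyTable" queryout "C:\export\mytable.txt" -S MyServer -d MyDb -T -w -t "||" -r "\n"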
Otherwise, if you need to export such long values and want to use Excel as much as you can (which I actually personally don't recommend due to those awful ACE drivers), I would suggest exporting the (n)varchar(MAX) values to something else, like a text file, and naming each file with the value of your Primary Key included. Then, when you import the data back you can retrieve the (n)varchar(MAX) value again from each individual file.
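A sketch of that per-row export, again with placeholder names and an illustrative Id of 42: run one bcp queryout per row (scripted from your key list), baking the primary key into the file name so the value can be matched back on import. -c writes plain character data and does not truncate (n)varchar(MAX) values.

bcp "SELECT BigText FROM dbo.MyTable WHERE Id = 42" queryout "C:\export\row_42.txt" -S MyServer -d MyDb -T -c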
A .sql script is the best format for a SQL table. It is the native format for a SQL table, so you don't have to convert the export.
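For illustration only (the names and the elided value are hypothetical), such a script is just a series of INSERT statements, and long values are emitted in full rather than truncated:

INSERT INTO dbo.MyTable (Id, BigText)
VALUES (1, N'...the full 100,000-character value appears inline here...');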

Program to update the database table from the parameter with the excel sheet from select option in ABAP

I will come directly to the question.
I have 2 parameters: a file name and a table name. The requirement is to upload the data from the Excel sheet into the database table entered in the other parameter, at runtime. There should be no hardcoding of field names, and the program should be flexible enough to suit any table. Please help.
I can think of two possible approaches:
Dynamic code generation -- write a program which writes a program
Use dynamic type tools
For 1, try googling for examples.
For 2, see https://wiki.scn.sap.com/wiki/display/Snippets/Example+-+create+a+dynamic+internal+table - this wiki shows a way (it may be overkill, as it creates the type from scratch, whereas any table in your SAP system is already a defined type in the Data Dictionary).
You can easily reference a parameterised table in Open SQL, e.g. MODIFY (p_tab) ...
Perhaps you could do a generic SPLIT of each line read in from the file by the delimiter into a table of fields - you can then use ASSIGN COMPONENT to match the fields you have read in to the fields in your internal type (see the sketch after this list).
If you are doing this, I think a whitelist of allowed tables would be wise - and authority checks. Otherwise someone could upload into SAP standard tables with no authorisation.
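A minimal ABAP sketch of that approach, under stated assumptions: the file is a tab-delimited text export of the Excel sheet (not native .xlsx), the field order in the file matches the Data Dictionary order of the table, and error handling plus the whitelist/authority check are left as comments:

REPORT zdyn_upload.

PARAMETERS: p_file TYPE string,
            p_tab  TYPE tabname.

DATA: lt_lines TYPE TABLE OF string,
      lr_tab   TYPE REF TO data,
      lr_row   TYPE REF TO data.

FIELD-SYMBOLS: <lt_data> TYPE STANDARD TABLE,
               <ls_row>  TYPE any,
               <lv_comp> TYPE any.

START-OF-SELECTION.
  " TODO: whitelist check on p_tab and AUTHORITY-CHECK here.

  " Build an internal table and work area typed at runtime from the DDIC.
  CREATE DATA lr_tab TYPE STANDARD TABLE OF (p_tab).
  ASSIGN lr_tab->* TO <lt_data>.
  CREATE DATA lr_row TYPE (p_tab).
  ASSIGN lr_row->* TO <ls_row>.

  " Read the tab-delimited file as plain text lines.
  cl_gui_frontend_services=>gui_upload(
    EXPORTING filename = p_file
    CHANGING  data_tab = lt_lines ).

  LOOP AT lt_lines INTO DATA(lv_line).
    SPLIT lv_line AT cl_abap_char_utilities=>horizontal_tab
      INTO TABLE DATA(lt_fields).
    CLEAR <ls_row>.
    LOOP AT lt_fields INTO DATA(lv_field).
      " Match field N of the file to component N of the row type.
      ASSIGN COMPONENT sy-tabix OF STRUCTURE <ls_row> TO <lv_comp>.
      IF sy-subrc = 0.
        <lv_comp> = lv_field.
      ENDIF.
    ENDLOOP.
    APPEND <ls_row> TO <lt_data>.
  ENDLOOP.

  " Dynamic Open SQL: insert/update the rows in the named table.
  MODIFY (p_tab) FROM TABLE <lt_data>.
  COMMIT WORK.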

Insert file into Access table

I have a table named Reports which has 3 fields ID (auto number), filename (string field), theFile (attachment field).
What I want is to run a SQL query and insert a PDF file into the attachment field (theFile).
Let's say the PDF file is located in the C: drive (C:\report1.pdf). I have tried the SQL query below, but it is not working. I know it's not good practice to store files in a database, but I just want to try it out:
CurrentDb.Execute "INSERT INTO Reports (filename, theFile) VALUES ('report1', 'C:\report1.pdf')"
It's standard practice to store files in a database. Access certainly supports it, but not through SQL. You'll have to use DAO, as detailed at http://msdn.microsoft.com/en-us/library/office/bb258184%28v=office.12%29.aspx
"File" is not appropriate SQL data type supported in Access, available data types.
That is correct, Derek; if you try to run a SQL statement like that, you will get an error message of one type or another every time. I spent a fair amount of time researching this subject for my own DB, and from what I understand there are a number of options/alternatives; however, having an attachment column type and using SQL to insert a file is not an option with Access' current capabilities.
It is not bad practice to store files in a database, it is actually standard practice; however, it IS best practice to not store files in an ACCESS db. There are a few reasons for this which you can research on your own, but perhaps most notably, Access has a file size limit of 2GB, so if you store files in it you can run out of space quickly and then things get even more complicated.
Here are your options:
Change your column data type to OLE object and use some kind of stream reader to convert the files to binary data, then use a SQL statement to load them into your DB
Use the built in Access user interface for working directly with tables/attachments
Establish a DAO db connection and use DAO's LoadFromFile method on the attachment field (see the sketch after this list)
Just store links to the files in the Access DB
The 4th option is the preferred method. It's very simple and you won't have to worry about complex code or the 2GB storage limit.
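For completeness, a minimal DAO sketch of option 3, using the table and field names from the question. Attachment fields are edited through a child recordset whose FileData field does the actual loading; this is a sketch, not production code:

Dim rs As DAO.Recordset2
Dim rsAttach As DAO.Recordset2

Set rs = CurrentDb.OpenRecordset("Reports")
rs.AddNew
rs!filename = "report1"
' The attachment field exposes its files as a child recordset.
Set rsAttach = rs!theFile.Value
rsAttach.AddNew
rsAttach!FileData.LoadFromFile "C:\report1.pdf"
rsAttach.Update
rs.Update
rs.Close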

Display blob content using Eclipse database explorer

I'm connected, using Eclipse database development view (standard of Eclipse Indigo), to an Oracle DB in which, for a particular record (that I already know), I want to view one column content in "text" form (although the column contains BLOB data).
When I simply do a
select MYBLOBCOLUMN from MYTABLE where ID='myid'
the SQL results view only shows an execution log, but no data. So, how can I see that BLOB content?
The BLOB datatype was invented to make it possible to transfer "custom" objects from one database to another. The database itself has no idea how to interpret and display the data stored in a BLOB field.
It can be an image, an application, video, audio or anything else. If you have stored normal text in a BLOB field, your database program has no idea that it is regular text.
If you store text in a database, you are better off using a (n)varchar or memo data type.
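That said, if the BLOB in this case really does hold plain text, you can coax a readable prefix out of it in the SQL results view. A hedged sketch, assuming Oracle and text stored in the database character set (DBMS_LOB.SUBSTR on a BLOB returns RAW, and 2000 bytes keeps the result within SQL's RAW limit):

select utl_raw.cast_to_varchar2(dbms_lob.substr(MYBLOBCOLUMN, 2000, 1)) as blob_as_text
from MYTABLE
where ID = 'myid';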