SQL Server 2008 SSIS: Importing date field incorrectly (from Oracle)

The date fields in the CSV file I must import are in the format DD-MMM-YY, but SQL Server is importing them as YY-MM-DD, so 01-FEB-13 is imported as 2001-02-13.
How can I change the format SQL uses to import the dates?
I don't have access to the original Oracle database, only the flat file exported from it. So everything I do pretty much has to be done in SQL.

Changing the date format that SQL Server uses by default would require mucking around with the Windows server culture settings. Not a good idea, especially if this is the only file where you're having this issue.
I would use a Script Transformation and the .NET Framework's DateTime.ParseExact method, which lets you completely control the expected format. Start by configuring the Flat File Connection Manager that you're using to read the CSV files so that the columns with the DD-MMM-YY dates have their DataType set to string or Unicode string rather than date, database date or any other date-specific type:
In your Data Flow, place a Script Transformation between your source and destination components, like this:
Select each of the DD-MMM-YY date columns as inputs to the Script Transformation:
And create an output column with a DataType of date corresponding to each input column:
The Script Transformation code will look like this (using C#):
using System;
using System.Data;
using System.Globalization; // needed for CultureInfo.InvariantCulture
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void IncomingRows_ProcessInputRow(IncomingRowsBuffer Row)
    {
        // The exact format of the incoming text dates.
        string incomingFormat = "dd-MMM-yy";

        // Parse each string input column into its corresponding date output column.
        Row.ConvertedCreateDate = DateTime.ParseExact(
            Row.CreateDate,
            incomingFormat,
            CultureInfo.InvariantCulture);
        Row.ConvertedLastUpdateDate = DateTime.ParseExact(
            Row.LastUpdateDate,
            incomingFormat,
            CultureInfo.InvariantCulture);
    }
}
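If some incoming values might be blank or malformed, DateTime.TryParseExact avoids failing the whole data flow. A minimal sketch of that variant inside the same method (setting the output to NULL on failure is an assumption, not part of the original answer, and requires the output column to be nullable):

DateTime parsed;
if (DateTime.TryParseExact(
        Row.CreateDate,
        incomingFormat,
        CultureInfo.InvariantCulture,
        DateTimeStyles.None,
        out parsed))
{
    Row.ConvertedCreateDate = parsed;
}
else
{
    // SSIS generates a <Column>_IsNull setter for each nullable output column.
    Row.ConvertedCreateDate_IsNull = true;
}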

You can use the to_char() function in Oracle to format your dates so they are exported in the format you want. Hopefully Gordon's comment about storing as text referred to a staging table; for a variety of reasons, your first import should be into a staging table anyway.
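If whoever produces the export can re-run it, something like this on the Oracle side yields unambiguous dates (the table and column names here are illustrative):

SELECT to_char(create_date, 'YYYY-MM-DD HH24:MI:SS') AS create_date
FROM source_table;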

Related

How can I export an entire SQL Server table to CSV, when the table contains an Image column?

I have a SQL Server table with the following design:
I need to export this entire table to a CSV file, with the data looking like this in plain-text format:
When I use the Import/Export wizard in SSMS, I get the following results in the CSV:
As you can see, this doesn't contain the actual binary data from the Image columns.
Instead, it just says System.Byte[] in plain-text, in each row.
In the past I've copied and pasted the text directly from the SSMS results window into Notepad to manually produce the CSV.
However, this table is so long, and the images so large, that my computer runs out of memory if I try that now.
There must be a method to export the binary data into a CSV, in 0x1234ABCD format (hex).
How could I accomplish this?
If you are a developer, you can create a small C# app using Base64 encoding for your binary data; it converts any binary data into a readable string. The code is as follows:
Encode
// Note: this overload encodes a text string; for raw binary data you would
// pass the bytes directly to Convert.ToBase64String (see the sketch below).
public static string Base64Encode(string plainText)
{
    var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(plainText);
    return System.Convert.ToBase64String(plainTextBytes);
}
Decode
public static string Base64Decode(string base64EncodedData)
{
    var base64EncodedBytes = System.Convert.FromBase64String(base64EncodedData);
    return System.Text.Encoding.UTF8.GetString(base64EncodedBytes);
}
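Since an Image column holds raw bytes rather than UTF-8 text, a byte[] pair is closer to this use case; a minimal sketch (the method names are illustrative):

public static string Base64EncodeBytes(byte[] binaryData)
{
    // No text-encoding step: the bytes go straight to Base64.
    return System.Convert.ToBase64String(binaryData);
}

public static byte[] Base64DecodeBytes(string base64EncodedData)
{
    return System.Convert.FromBase64String(base64EncodedData);
}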
Following the advice in Dan Guzman's comment, I created a new view with the following query:
CREATE VIEW vwPhotosMen AS
SELECT SS,
       -- style 1 renders the binary value as 0x... hex text
       CONVERT(varchar(max), CONVERT(varbinary(MAX), PhotoMen, 1), 1) AS PhotoMen,
       CONVERT(varchar(max), CONVERT(varbinary(MAX), Thumbnail, 1), 1) AS Thumbnail
FROM dbo.PhotosMen
The SSMS Export Wizard was then able to export the data correctly from this view, to a plain-text CSV file.

Converting U.S. date format to Indian date format using ASP.NET MVC Core 2.0

I am trying to convert the US date format to the Indian date format, like (dd/MM/yyyy hh:mm tt).
When I run the code on my local machine it works.
When we publish and fetch values from the server, it shows the US date format (MM/dd/yyyy).
How do I do the internal conversion? What strings do I need to specify in appsettings.json?
public static DateTime ConvertIndianDateFormat(DateTime usTime)
{
    // Look up the time zones by their Windows IDs.
    TimeZoneInfo usEasternZone = TimeZoneInfo.FindSystemTimeZoneById("US Eastern Standard Time");
    TimeZoneInfo indianZone = TimeZoneInfo.FindSystemTimeZoneById("India Standard Time");

    // Both conversions treat usTime as UTC; the Eastern result is never used.
    DateTime usEasternTime = TimeZoneInfo.ConvertTimeFromUtc(usTime, usEasternZone);
    DateTime indianTime = TimeZoneInfo.ConvertTimeFromUtc(usTime, indianZone);
    return indianTime;
}
This is probably because you are using something like DateTime.Now in C#, or GETDATE() if you are on SQL Server. It is not an issue with appsettings.json or anything similar. Those functions return the machine's local date and time, which is why the time is correct locally on your PC and wrong once you upload to a server.
So make sure that the time is correct. If you are uploading to servers in another country, you will probably get a different time and/or format.
How you proceed depends on your needs:
Is the time correct?
Then simply reformat it, or store it in a specific format using:
DateTime.Now.ToString("dd/MM/yyyy hh:mm tt") // capital MM is months; lowercase mm would be minutes
Do you want to serve multiple clients in multiple regions/countries?
Then you should store the time as UTC and convert it based on each client's locale. For example, if the server is in the USA, someone in the UK would otherwise see a time different from their own, which would be confusing.
DateTime.UtcNow
Generally, your problem could be large or small depending on your needs.
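A minimal sketch of the store-as-UTC approach (the time-zone ID and display format are assumptions for an Indian audience):

using System;
using System.Globalization;

class UtcToIndianExample
{
    static void Main()
    {
        // Store and transport the timestamp as UTC...
        DateTime savedUtc = DateTime.UtcNow;

        // ...and convert to the viewer's zone only for display.
        TimeZoneInfo ist = TimeZoneInfo.FindSystemTimeZoneById("India Standard Time");
        DateTime indianTime = TimeZoneInfo.ConvertTimeFromUtc(savedUtc, ist);

        // e.g. 14/05/2019 07:00 PM
        Console.WriteLine(indianTime.ToString("dd/MM/yyyy hh:mm tt", CultureInfo.InvariantCulture));
    }
}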

Pentaho Kettle: convert date to Unix timestamp

I'm trying to parse a date string "2019-05-14 13:30:00" into a Unix timestamp.
In plain JavaScript this works, but in Kettle's JavaScript step I am not able to return the numeric value 1557833442.
The line of code is this:
const tests = new Date("2019-05-14 13:30:00").getTime() / 1000;
It looks like the Date() constructor doesn't like the format you are using.
If you want the current date, use a Get System Info step; it has a number of useful date options.
If you are converting an incoming field, use the Select Values step to change the metadata, using a format string that matches your string field's format.
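If you prefer to stay in JavaScript, Kettle's Modified Java Script Value step ships a str2date() helper that takes an explicit format; a sketch, assuming the incoming string field is named date_str:

// Parse with an explicit format, then convert milliseconds to Unix seconds.
var d = str2date(date_str, "yyyy-MM-dd HH:mm:ss");
var unix_ts = d.getTime() / 1000;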

Error Converting DateTime using format

I get the following error when trying to import a CSV file.
Error Converting '2007/01/02' to type: 'DateTime'. Using the format: 'yyyy/MM/dd'
I have set the class like this:
[FieldConverter(ConverterKind.Date, "yyyy/MM/dd")]
public DateTime PriceDate;
Any idea why that could be, since the format matches - it is the second of Jan 2007?
When I change the date format to 2007.01.02, then FileHelpers parses it perfectly.
I use V 3.1.5.0
Thanks
Try changing the mask to:
[FieldConverter(ConverterKind.Date, "yyyy/M/d")]
public DateTime PriceDate;
See the 'Converters' tab of the FileHelpers documentation to learn more about the handful of converters available, such as numeric and date types.
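For context, a complete FileHelpers record class using that converter might look like this (the class name and the extra Price field are illustrative):

[DelimitedRecord(",")]
public class PriceRecord
{
    [FieldConverter(ConverterKind.Date, "yyyy/M/d")]
    public DateTime PriceDate;

    public decimal Price;
}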

SQL Bulk import from CSV

I need to import a large CSV file into SQL Server. I'm using this:
BULK INSERT CSVTest
FROM 'c:\csvfile.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO
The problem is that all my fields are surrounded by quotes (" "), so a row actually looks like:
"1","","2","","sometimes with comma , inside", ""
Can I somehow bulk import them and tell SQL to use the quotes as field delimiters?
Edit: The problem with using '","' as the delimiter, as the suggested examples do, is this: most examples import the data including the first " in the first column and the last " in the last column, then strip them out afterwards. Alas, my first (and last) column is datetime, and SQL Server will not allow a "20080902 to be imported as datetime.
From what I've been reading around, I think a FORMATFILE is the way to go, but the documentation (including MSDN) is terribly unhelpful.
Try FIELDTERMINATOR='","'.
Here is a great link to help with the first and last quote; look at how he uses SUBSTRING in the stored procedure:
http://www.sqlteam.com/article/using-bulk-insert-to-load-a-text-file
Another hack which I sometimes use, is to open the CSV in Excel, then write your sql statement into a cell at the end of each row.
For example:
=concatenate("insert into myTable (columnA,columnB) values ('",a1,"','",b1,"'")")
A fill-down can populate this into every row for you. Then just copy and paste the output into a new query window.
It's old-school, but if you only need to do imports once in a while it saves you messing around with reading all the obscure documentation on the 'proper' way to do it.
Try OPENROWSET. It can be used to import Excel files, and Excel can open CSV files, so you only need to figure out the correct connection string, for example:
Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=c:\txtFilesFolder\;Extensions=asc,csv,tab,txt;
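A sketch of that approach (the folder and file names are illustrative, and the Microsoft Text ODBC driver must be installed on the server):

SELECT *
FROM OPENROWSET(
    'MSDASQL',
    'Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=c:\txtFilesFolder\;',
    'SELECT * FROM csvfile.csv');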
I know this isn't a real solution, but I use a dummy table for the import, with nvarchar set for everything. Then I do an insert which strips out the " characters and does the conversions. It isn't pretty but it does the job.
I'd say use FileHelpers; it's an open-source library.
Do you need to do this programmatically, or is it a one-time shot?
In Enterprise Manager, right-clicking and choosing Import Data lets you select your delimiter.
You have to watch out with BCP/BULK INSERT, because neither handles quoting well if it is not consistent, even with format files (even XML format files don't offer the option), and even with dummy ["] characters at the beginning and end and [","] as the separator. Technically, CSV files do not need ["] characters if there are no embedded [,] characters.
It is for this reason that comma-delimited files are sometimes referred to as comedy-limited files.
OPENROWSET will require Excel on the server and could be problematic in 64-bit environments; I know using Excel via Jet in 64-bit is problematic.
SSIS is really your best bet if the file is likely to vary from your expectations in the future.
You can try this approach, which is quite neat: it removes the unwanted quote characters after the import.
For example, if your data looks like this: "Kelly","Reynold","kelly#reynold.com"
Bulk insert test1
from 'c:\1.txt' with (
    fieldterminator = '","',
    rowterminator = '\n')

update test1
set name = substring(name, 2, len(name))
where name like '"%'

update test1
set email = substring(email, 1, len(email) - 1)
where email like '%"'
First you need to import the CSV file into a DataTable.
Then you can insert the rows in bulk using SqlBulkCopy:
using System;
using System.Data;
using System.Data.SqlClient;

namespace SqlBulkInsertExample
{
    class Program
    {
        static void Main(string[] args)
        {
            DataTable prodSalesData = new DataTable("ProductSalesData");

            // Create Column 1: SaleDate
            DataColumn dateColumn = new DataColumn();
            dateColumn.DataType = Type.GetType("System.DateTime");
            dateColumn.ColumnName = "SaleDate";

            // Create Column 2: ProductName
            DataColumn productNameColumn = new DataColumn();
            productNameColumn.ColumnName = "ProductName";

            // Create Column 3: TotalSales
            DataColumn totalSalesColumn = new DataColumn();
            totalSalesColumn.DataType = Type.GetType("System.Int32");
            totalSalesColumn.ColumnName = "TotalSales";

            // Add the columns to the ProductSalesData DataTable
            prodSalesData.Columns.Add(dateColumn);
            prodSalesData.Columns.Add(productNameColumn);
            prodSalesData.Columns.Add(totalSalesColumn);

            // Let's populate the datatable with our stats.
            // You can add as many rows as you want here!

            // Create a new row
            DataRow dailyProductSalesRow = prodSalesData.NewRow();
            dailyProductSalesRow["SaleDate"] = DateTime.Now.Date;
            dailyProductSalesRow["ProductName"] = "Nike";
            dailyProductSalesRow["TotalSales"] = 10;

            // Add the row to the ProductSalesData DataTable
            prodSalesData.Rows.Add(dailyProductSalesRow);

            // Copy the DataTable to SQL Server using SqlBulkCopy
            using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=dbProduct;Integrated Security=SSPI;Connection Timeout=60;Min Pool Size=2;Max Pool Size=20;"))
            {
                dbConnection.Open();
                using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
                {
                    s.DestinationTableName = prodSalesData.TableName;
                    foreach (var column in prodSalesData.Columns)
                        s.ColumnMappings.Add(column.ToString(), column.ToString());
                    s.WriteToServer(prodSalesData);
                }
            }
        }
    }
}
This is an old question, so I'm writing this to help anyone who stumbles upon it.
SQL Server 2017 introduces the FIELDQUOTE parameter, which is intended for this exact use case.
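Applied to the original example, it would look something like this (FORMAT = 'CSV', also added in SQL Server 2017, enables RFC 4180-style parsing):

BULK INSERT CSVTest
FROM 'c:\csvfile.txt'
WITH
(
    FORMAT = 'CSV',
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);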
Yup, K Richard is right: FIELDTERMINATOR = '","'
See http://www.sqlteam.com/article/using-bulk-insert-to-load-a-text-file for more info.
You could also use DTS or SSIS.
Do you have control over the input format? Pipes (|) and tabs (\t) usually make for better field terminators.
If you figure out how to get the file parsed into a DataTable, I'd suggest the SqlBulkCopy class for inserting it into SQL Server.