SQL Server Import and Export Wizard: data conversion failed

I am trying to import data from a flat file. My table has 3 columns:
id - int - auto_increment - primary key
profileID - int
taken - bit
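A minimal T-SQL sketch of that table (the table name dbo.Profiles is assumed, since the question doesn't give one; auto_increment is MySQL wording, and the SQL Server equivalent is IDENTITY):
CREATE TABLE dbo.Profiles (              -- hypothetical name
    id INT IDENTITY(1,1) PRIMARY KEY,    -- auto-incrementing primary key
    profileID INT,
    taken BIT
);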
This is what my flat file looks like:
profileID
79518
27835
26853
15052
11165
67092
57399
There are 10,000 rows, and I keep getting this error when I try to import:
Data Flow Task 1: Data conversion failed. The data conversion for column "profileID" returned status value 6 and status text "Conversion failed because the data value overflowed the specified type.".
I set the data type for profileID to single-byte signed integer [DT_I1] in the flat file source.
Please Help!
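For context: DT_I1 is a single signed byte and holds only -128 through 127, so a value like 79518 overflows it; the table's int column matches the four-byte signed integer [DT_I4] instead. A minimal T-SQL sketch of the same overflow, using tinyint (SQL Server's one-byte type) as a stand-in for the wizard's DT_I1:
-- int is 4 bytes, like DT_I4: the sample values fit
DECLARE @ok int = 79518;
-- tinyint is 1 byte: the same value raises
-- "Arithmetic overflow error for data type tinyint, value = 79518."
DECLARE @bad tinyint = 79518;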

Related

Trying to import a CSV file to a table in SQL

I have 4 CSV files, each having 500,000 rows. I am trying to import the CSV data into my Exasol database, but there is an error with the date column, and I have a problem with the first, unwanted column in the files.
Here is an example CSV file:
unnamed:0 , time, lat, lon, nobs_cloud_day
0, 2006-03-30, 24.125, -119.375, 22.0
1, 2006-03-30, 24.125, -119.125, 25.0
The table I created to import the CSV into is:
CREATE TABLE cloud_coverage_CONUS (
index_cloud DECIMAL(10,0)
,"time" DATE -- PRIMARY KEY
,lat DECIMAL(10,6)
,lon DECIMAL(10,6)
,nobs_cloud_day DECIMAL (3,1)
)
The command to import is:
IMPORT INTO cloud_coverage_CONUS FROM LOCAL CSV FILE 'D:\uni\BI\project 1\AOL_DB_ANALYSIS_TASK1\datasets\cloud\cfc_us_part0.csv';
But I get this error:
SQL Error [42636]: java.sql.SQLException: ETL-3050: [Column=0 Row=0] [Transformation of value='Unnamed: 0' failed - invalid character value for cast; Value: 'Unnamed: 0'] (Session: 1750854753345597339) while executing '/* add path to the 4 csv files, that are in the cloud database folder*/ IMPORT INTO cloud_coverage_CONUS FROM CSV AT 'https://27.1.0.10:59205' FILE 'e12a96a6-a98f-4c0a-963a-e5dad7319fd5' ;'; 04509 java.sql.SQLException: java.net.SocketException: Connection reset by peer: socket write error
Alternatively, I use this table (without the first column):
CREATE TABLE cloud_coverage_CONUS (
"time" DATE -- PRIMARY KEY
,lat DECIMAL(10,6)
,lon DECIMAL(10,6)
,nobs_cloud_day DECIMAL (3,1)
)
And use this import code:
IMPORT INTO cloud_coverage_CONUS FROM LOCAL CSV FILE 'D:\uni\BI\project 1\AOL_DB_ANALYSIS_TASK1\datasets\cloud\cfc_us_part0.csv'(2 FORMAT='YYYY-MM-DD', 3 .. 5);
But I still get this error:
SQL Error [42636]: java.sql.SQLException: ETL-3052: [Column=0 Row=0] [Transformation of value='time' failed - invalid value for YYYY format token; Value: 'time' Format: 'YYYY-MM-DD'] (Session: 1750854753345597339) while executing '/* add path to the 4 csv files, that are in the cloud database folder*/ IMPORT INTO cloud_coverage_CONUS FROM CSV AT 'https://27.1.0.10:60350' FILE '22c64219-cd10-4c35-9e81-018d20146222' (2 FORMAT='YYYY-MM-DD', 3 .. 5);'; 04509 java.sql.SQLException: java.net.SocketException: Connection reset by peer: socket write error
(I actually do want to ignore the first column in the files.)
How can I solve this issue?
Solution:
IMPORT INTO cloud_coverage_CONUS FROM LOCAL CSV FILE 'D:\uni\BI\project 1\AOL_DB_ANALYSIS_TASK1\datasets\cloud\cfc_us_part0.csv' (2 .. 5) ROW SEPARATOR = 'CRLF' COLUMN SEPARATOR = ',' SKIP = 1;
I did not realise that MySQL is different from Exasol.
Looking at the first error message, a few things stand out. First we see this:
[Column=0 Row=0]
This tells us the problem is with the very first value in the file. This brings us to the next thing, where the message even tells us what value was read:
Transformation of value='Unnamed: 0' failed
So it's failing to convert Unnamed: 0. You also provided the table definition, where we see the first column in the table is a decimal type.
This makes sense. Unnamed: 0 is not a decimal. For this to work, the CSV data MUST align with the data types for the columns in the table.
But we also see this looks like a header row. Assuming everything else matches, we can fix it by telling the database to skip this first row. I'm not familiar with Exasol, but according to the documentation I believe the correct code will look like this:
IMPORT INTO cloud_coverage_CONUS
FROM LOCAL CSV FILE 'D:\uni\BI\project 1\AOL_DB_ANALYSIS_TASK1\datasets\cloud\cfc_us_part0.csv'
(2 FORMAT='YYYY-MM-DD', 3 .. 5)
ROW SEPARATOR = 'CRLF'
COLUMN SEPARATOR = ','
SKIP = 1;
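As a follow-up sanity check (a sketch; the 500,000-row count comes from the question), something like this confirms the load and the parsed dates:
SELECT COUNT(*) AS rows_loaded FROM cloud_coverage_CONUS;  -- expect 500,000 per file
SELECT MIN("time") AS first_day, MAX("time") AS last_day
FROM cloud_coverage_CONUS;                                 -- dates should come back as real DATEs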

Table will not load into BigQuery

I've tried loading a table into BigQuery with no success. The error messages I keep getting are below. I've tried both entering my schema manually and letting Google auto-detect it, and neither works.
Here are my error messages:
Error while reading data, error message: CSV table references column position 11, but line starting at position:606 contains only 1 columns.
Error while reading data, error message: CSV processing encountered too many errors, giving up. Rows: 0; errors: 1; max bad: 0; error percent: 0
And here is my schema:
Product_Type - String
Product_Name - String
Size - String
Manufacturer - String
SKU - String
NDC - String
Price - Float
UOM - String
Alt_UOM_Price - Float
Alt_UOM - String
Net_Price - Float
NEt_UOM - String
Try enabling the Jagged rows option when importing.
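If you load with SQL rather than through the console, the same option can be set in a LOAD DATA statement; a sketch, where the dataset, table, and bucket path are placeholders:
LOAD DATA INTO mydataset.products
FROM FILES (
  format = 'CSV',
  uris = ['gs://my-bucket/products.csv'],
  skip_leading_rows = 1,
  allow_jagged_rows = TRUE  -- pad rows that end early with NULLs instead of failing
);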

Failed to transfer data from GCS to BigQuery table

I need help with the Data Transfer Service (DTS).
After creating a table "allorders" with an auto-detected schema, I created a data transfer service. But when I run the DTS I get an error; see the job below. The quantity field type is definitely set to integer, and all the data in that field are whole numbers.
Job bqts_602c3b1a-0000-24db-ba34-30fd38139ad0 (table allorders) failed
with error INVALID_ARGUMENT: Error while reading data, error message:
Could not parse 'quantity' as INT64 for field quantity (position 14)
starting at location 0 with message 'Unable to parse'; JobID:
956421367065:bqts_602c3b1a-0000-24db-ba34-30fd38139ad0
When I recreated the table and set all fields to type string, it worked fine; see the job below.
Job bqts_607cef13-0000-2791-8888-001a114b79a8 (table allorders)
completed successfully. Number of records: 56017, with errors: 0.
Try to find the unparseable values in the table with all string fields:
SELECT *
FROM dataset.table
WHERE value IS NOT NULL
  AND SAFE_CAST(value AS INT64) IS NULL;
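Note that the failing value in the first error is the literal string 'quantity' at location 0, which suggests the header row itself is being read as data; skipping the header row in the transfer configuration may fix the original integer schema. Otherwise, once the bad rows are identified, the all-string table can be cast into a typed one; a sketch, with placeholder names:
CREATE OR REPLACE TABLE dataset.allorders_typed AS
SELECT
  SAFE_CAST(quantity AS INT64) AS quantity,  -- NULL where the value is unparseable
  * EXCEPT (quantity)
FROM dataset.allorders;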

Insert new timestamp value into the acc table in Kamailio

I want to add a new column to the acc table. I created a new column in the acc table of type timestamp and named it ring_time. On every call I put the ring time into a $dlg_var like this:
$dlg_var(ringtime) = $Ts;
Then I add an extra column in the config like this:
modparam("acc", "log_extra", "src_user=$fU;src_domain=$fd;src_ip=$si;" "dst_ouser=$tU;dst_user=$rU;dst_domain=$rd;ring_time=$dlg_var(ringtime)")
but when I try to test it, I always get:
db_mysql [km_dbase.c:122]: db_mysql_submit_query(): driver error on query: Incorrect datetime value: '1591361996' for column kamailio.acc.ring_time at row 1 (1292)
Jun 5 17:29:59 kamailio /usr/sbin/kamailio[22901]: ERROR: {2 102 INVITE 105a0f4a3d99a0a5558355e54b43f4e1#192.168.1.121:5060} <core> [db_query.c:244]: db_do_insert_cmd(): error while submitting query
Jun 5 17:29:59 kamailio /usr/sbin/kamailio[22901]: ERROR: {2 102 INVITE 105a0f4a3d99a0a5558355e54b43f4e1#192.168.1.121:5060} acc [acc.c:477]: acc_db_request(): failed to insert into database
Sounds like an error with the SQL INSERT query. If I had to guess, I'd say you're being caught out by the date format in the SQL table not matching the date format you're pushing to it.
I don't know the structure of your database, but there's a simple trick I use for debugging SQL queries when I can't see the query being run:
Start up Wireshark/tcpdump on the machine, capture all SQL traffic (MySQL is port 3306), and replicate the error.
From the packet capture you'll be able to see the query Kamailio's database engine ran.
Looking at the error "db_mysql [km_dbase.c:122]: db_mysql_submit_query(): driver error on query: Incorrect datetime value: '1591361996' for column kamailio.acc.ring_time at row 1 (1292)", the value '1591361996' looks like a Unix epoch from $dlg_var(ringtime). The "Incorrect datetime value" part of the error suggests the database is trying to store the value in a datetime column, so there is a data type mismatch. Double-check; you may need to either convert ringtime to a datetime or change the database column to a type that will take an epoch.
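A minimal MySQL sketch of both options (the acc table and ring_time column come from the question; everything else is a placeholder):
-- Option 1: change the column to a type that stores the raw epoch
ALTER TABLE acc MODIFY ring_time BIGINT;
-- Option 2: keep the datetime column and convert the epoch on the way in
SELECT FROM_UNIXTIME(1591361996);  -- renders the epoch as a DATETIME in the session time zone
On the Kamailio side, if your version provides the formatted-time pseudo-variable $Tf, storing that instead of the epoch $Ts may also avoid the mismatch.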

BULK INSERT error

I am trying to BULK INSERT from a .csv file and I get the following error:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 23 (AR).
Msg 4864, Level 16, State 1, Line 4
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 3, column 23 (AR).
When I open the CSV file in Microsoft Excel, row 2 column 23 is just the number '0'.
And if I manually insert the number 0 into the column AR in my database table, it accepts it without any problems. I do not understand why this happens. Any help?
I assume your code looks something like this:
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
{
    // Create a reader somehow
    IDataReader reader = new ... // <- Your problem will be here
    bulkCopy.WriteToServer(reader);
}
In your reader you need to read the file according to its type and encoding.
Based on your file type, you need to select the correct encoding from
System.Text.Encoding
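Since the question uses T-SQL BULK INSERT directly, the type/codepage mismatch can also be attacked in the statement itself. A sketch, assuming a comma-delimited file with a header row; the path, table name, and code page are placeholders to match your file:
BULK INSERT dbo.MyTable
FROM 'C:\data\myfile.csv'
WITH (
    FIRSTROW = 2,             -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\r\n',
    CODEPAGE = '65001'        -- UTF-8 on newer SQL Server versions; match the file's real encoding
);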