Query contains parameters but import file contains different values [importing CSV to Teradata SQL]

I am using Teradata SQL Assistant to import a CSV file. I clicked Import to activate the import operation, then typed the following:
insert into databasename.tablename values(?,?,?,...)
I made sure to specify the database name as well as the table name, and I used 13 question-mark placeholders, one for each of the 13 columns in my CSV file.
It gives me the following error:
Query contains 13 parameters but Import file contains 1 data values
I have no idea what the issue is.

The default delimiter used by your SQL Assistant doesn't match the one used in the CSV, so it doesn't recognise all the columns.
In SQL Assistant, go to Tools >> Options >> Export/Import and choose the delimiter that matches the one in your CSV.
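For example, if the file is semicolon-delimited while the import option is still set to comma, SQL Assistant reads each row as one single field, which is exactly why the error reports 1 data value against 13 parameters. A minimal sketch (the database/table names come from the question; the sample row is hypothetical):
-- CSV row, semicolon-delimited; with ',' as the delimiter it parses as ONE field:
-- 101;2017-01-05;...
-- After setting the delimiter to ';', all 13 fields map onto:
insert into databasename.tablename values(?,?,?,?,?,?,?,?,?,?,?,?,?)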

Related

BigQuery CSV Import: Allow Jagged Rows

I'm trying to import a CSV to a BigQuery table from the user interface. My import job fails with the message:
Too many errors encountered. (error code: invalid)
gs://foo/bar.csv: CSV table references column position 15, but line starting at position:29350998 contains only 15 columns. (error code: invalid)
I'm assuming this means the importer doesn't like null fields in source data without an actual null string. Is there a way to make the UI allow jagged rows on import? Or, if not, what CLI command should I use to import my CSV file to a table this way?
The UI has an Allow jagged rows checkbox that you can select. Did you try that? It's part of the Options for the Create Table wizard.
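If you want the command-line route instead, bq load exposes the same option as a flag. A minimal sketch, assuming the destination table is mydataset.mytable (only the bucket path comes from the question) and that a schema is supplied separately or auto-detected:
bq load --source_format=CSV --allow_jagged_rows mydataset.mytable gs://foo/bar.csv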

importing excel table into database

I have the following table in xlsx format which I would like to import into my MySQL database:
The table is pretty complicated and I only want the records after '1)HEADING'.
I have been looking at PHP libraries to import into SQL, but they only seem to handle simple Excel files.
There are two ways to do this:
First method:
1) Export it to a text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
2) Use MySQL's LOAD DATA capability (a sketch follows these steps). See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
3) Look halfway down that page; it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
4) Check your data. Sometimes quoting or escaping causes problems, and you need to adjust your source or your import command; or it may just be easier to post-process via SQL.
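A minimal LOAD DATA sketch, assuming the sheet was saved as /tmp/data.txt (tab-delimited), a matching table named records already exists, and one header line should be skipped (the file and table names are placeholders):
LOAD DATA LOCAL INFILE '/tmp/data.txt'
INTO TABLE records
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;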
Second method:
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.

Skip Columns During Teradata Table Import From CSV Using SQL Assistant

I have a CSV file with data I need to import to a Teradata table, but it has a useless column that I would like to exclude from the import. The useless column is the first column, so the CSV rows are set up like:
'UselessData','Data','Data','Data'
Typically, I would import using SQL Assistant by choosing File -> Import Data from the menu and using the basic query:
INSERT INTO TableName VALUES (?,?,?,?)
But this will also pull in the useless data from Column 1. Is there a way to make the import take only certain columns, or to send the useless column to NULL?
AFAIK you can't do that with SQL Assistant.
Possible workarounds:
Switch to Teradata Studio or TPT for loading (either will also load faster)
Load all columns into a Volatile Table first (and don't forget to increase the Maximum Batch size for simple Imports in Tools -> Options -> Import), then Insert/Select only the wanted columns into the target table, as sketched below.
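A sketch of that volatile-table workaround; the staging table, column names, and types are hypothetical, and the four placeholders mirror the four CSV fields from the question:
CREATE VOLATILE TABLE stage_import
( useless_col VARCHAR(100)
, col1 VARCHAR(100)
, col2 VARCHAR(100)
, col3 VARCHAR(100)
) ON COMMIT PRESERVE ROWS;

-- File -> Import Data, loading all four CSV fields into the stage:
INSERT INTO stage_import VALUES (?,?,?,?);

-- Then keep only the columns you need:
INSERT INTO TableName (col1, col2, col3)
SELECT col1, col2, col3 FROM stage_import;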

Only import specific data columns - Comma Delimited List

I used the following command to import data from a text file; however, I need to find a way of selecting specific columns within the text file. The following links were suggested to me, but I'm struggling to understand whether I need to replace my current SQL with the examples on MSDN:
BULK INSERT T2 FROM 'c:\Temp\Data.txt' WITH (FIELDTERMINATOR = ',')
http://msdn.microsoft.com/en-us/library/ms179250.aspx
http://msdn.microsoft.com/en-us/library/ms187908.aspx
I have the following fields held within a text file, separated by commas. The data rows are also comma-separated, which lets me use the above code to import it all.
Date,Time,Order,Item,Delivery Slot,Delivery Time
Is there a way to only import Date, Time, Item and Delivery Time into an SQL database table?
Use a Format File for your BULK INSERT. You can specify which fields are imported through this file definition.
EDIT: example from MSDN.
BULK INSERT bulktest..t_float
FROM 'C:\t_float-c.dat' WITH (FORMATFILE='C:\t_floatformat-c-xml.xml');
GO
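To actually drop fields, a non-XML format file can map each unwanted data field to server column 0. A hedged sketch, assuming the target table T2 has the four columns Date, Time, Item and DeliveryTime, that the field widths (12/20) are guesses, and that the first line (the bcp format version, here 12.0) matches your SQL Server release; saved as c:\Temp\Data.fmt:
12.0
6
1  SQLCHAR  0  12  ","     1  Date          ""
2  SQLCHAR  0  12  ","     2  Time          ""
3  SQLCHAR  0  20  ","     0  OrderNo       ""
4  SQLCHAR  0  20  ","     3  Item          ""
5  SQLCHAR  0  20  ","     0  DeliverySlot  ""
6  SQLCHAR  0  20  "\r\n"  4  DeliveryTime  ""
BULK INSERT T2 FROM 'c:\Temp\Data.txt' WITH (FORMATFILE = 'c:\Temp\Data.fmt');
The sixth value on each field line is the server column order; 0 tells BULK INSERT to discard that data field.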

Oracle SQL Dump file extracting parts to sql/another dump file

I have an Oracle DB dump file, and now I only need parts of the tables that are included in it. Does anyone know how I can extract those parts into a separate dump file (or SQL)?
I thought about using the import utility, importing from the full-export dump file into a dump file with just the needed parts, something like this, but I don't know whether it's possible this way:
import user/pw directory=fullexport_dump dumpfile=part.dmp logfile=import.log status=30
No, it's not possible. You can only limit rows while exporting, using the query parameter:
exp ..... query="where id=10"
You may search further in the Oracle Documentation.
So, import the whole table and create a new table with only the required parts:
create table NEEDEDPARTS as select * from FULLEXPORT where id=10
Or, import the whole table and re-export it with the query parameter.
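A sketch of that round trip with the classic exp/imp utilities; user/pw, the table name, and the file names are placeholders:
imp user/pw file=fullexport.dmp tables=FULLEXPORT
exp user/pw file=part.dmp tables=FULLEXPORT query="where id=10"
Note that exp's query parameter works only with conventional-path exports; the newer Data Pump utility (expdp) offers an equivalent QUERY parameter.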