Is a Multi-Statement Request more performant than multiple separate requests in Teradata?
I have a mainframe job that launches a BTEQ script that is actually a Multi-Statement Request, as in the example below:
insert into table (col1, col2, col3) values (val1,val2,val3)
; insert into table (col1, col2, col3) values (val4,val5,val6)
; insert into table (col1, col2, col3) values (val7,val8,val9);
My question is: should I keep this one job for the Multi-Statement Request, or separate it into multiple jobs, one per insert? Which way is more performant?
Thanks in advance.
If you are using BTEQ you can do a batch/bulk insert operation using the .REPEAT/PACK command. An example:
.set sessions 5
.logon ...
.import vartext ',' file = \\your\file\path\somefile.csv;
.repeat * pack 100
using (val1 varchar(11), val2 varchar(20), val3 varchar(10))
insert into table (col1, col2, col3)
values (:val1, :val2, :val3);
(Note that with a VARTEXT import, every field in the USING clause must be declared VARCHAR, and the USING variables are referenced with a leading colon; Teradata converts them implicitly when inserting into non-character columns.)
Even better is using a proper load utility like FastLoad or TPT, but short of that, the more inserts you can cram into a single request, the better off you are.
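If you can go the utility route, a standalone FastLoad job along these lines is the usual pattern. This is only a sketch: the logon string, database/table names, error-table names, and file path are placeholders rather than values from your job, and FastLoad requires the target table to be empty.
LOGON tdpid/youruser,yourpassword;

SET RECORD VARTEXT ",";

DEFINE val1 (VARCHAR(11)),
       val2 (VARCHAR(20)),
       val3 (VARCHAR(10))
FILE = \\your\file\path\somefile.csv;

BEGIN LOADING yourdb.yourtable ERRORFILES yourdb.yourtable_err1, yourdb.yourtable_err2;

INSERT INTO yourdb.yourtable (col1, col2, col3)
VALUES (:val1, :val2, :val3);

END LOADING;
LOGOFF;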
I'm facing the below error in CrateDB after upgrading from version 4.1.8 to 4.2.7:
error during main processing: SQLActionException[UnsupportedFeatureException: Unknown function: to_object_array(db_name.tbl_name."object_array_type_col_name")]
error : {
"message": "SQLActionException[UnsupportedFeatureException: Unknown function: to_object_array(db_name.tbl_name."object_array_type_col_name")]",
"code": 4004
}
I'm trying to move data from one table to another in CrateDB using INSERT INTO with a subquery (SELECT) statement. The existing table has columns with data types OBJECT(DYNAMIC) and ARRAY(OBJECT(DYNAMIC)), and I'm creating a temp table with the original schema of the existing table.
Because the original table has column_policy = 'dynamic' at the table level, a couple of columns were added dynamically, with the same data types OBJECT(DYNAMIC) and ARRAY(OBJECT(DYNAMIC)).
Below is the full SQL query which I'm using to move the data. It works fine on Crate version 4.1.8 and raises the above exception on version 4.2.7.
INSERT INTO temp_tbl (col1, col2_object, col3_object_array, col4, col5, dynamic_col6_object, dynamic_col6_object_array)
(SELECT
    col1,
    to_object(col2_object),
    to_object_array(col3_object_array),
    col4,
    col5,
    to_object(dynamic_col6_object),
    to_object_array(dynamic_col6_object_array)
FROM original_tbl);
UPDATE 1:
As pointed out by @proddata, I did try CAST but am facing the below error:
error : {
"message": "SQLActionException[SQLParseException: The type 'object' of the insert source 'object_col_name' is not convertible to the type 'object' of target column 'object_col_name']",
"code": 4000
}
to_object_array() is an internal / undocumented CrateDB function, which is hidden from 4.2 onwards.
Could you try to use <column> :: <type> or cast(<column> AS <type>) instead?
e.g.
SELECT
[] :: ARRAY(OBJECT(DYNAMIC)),
cast([] AS ARRAY(OBJECT(DYNAMIC)))
also see https://crate.io/docs/crate/reference/en/4.6/general/ddl/data-types.html#cast
INSERT INTO temp_tbl (col1, col2_object, col3_object_array, col4, col5, dynamic_col6_object, dynamic_col6_object_array)
(SELECT
col1,
col2_object :: OBJECT,
col3_object_array :: ARRAY(OBJECT),
col4,
col5,
dynamic_col6_object :: OBJECT(DYNAMIC),
dynamic_col6_object_array :: ARRAY(OBJECT(DYNAMIC))
FROM original_tbl);
Edit:
With some CrateDB versions (probably ranging between 4.2.x and 4.5.1) there was a bug that prevented the INSERT of objects from another table if the object column in the target table has object properties that aren't a superset of the source object column's properties. e.g.:
More complete example ...
cr> CREATE TABLE dynamic_objects (
col1 TEXT
) WITH (column_policy = 'dynamic');
-- CREATE OK, 1 row affected (1.393 sec)
cr> INSERT INTO dynamic_objects (col1, obj_dyn, obj_arr_dyn) VALUES
('Hello', {a = 1}, [{x = 1},{y = 1}]);
-- INSERT OK, 1 row affected (0.216 sec)
cr> CREATE TABLE dynamic_objects_copy (
col1 TEXT
) WITH (column_policy = 'dynamic');
-- CREATE OK, 1 row affected (1.342 sec)
cr> INSERT INTO dynamic_objects_copy (col1, obj_dyn, obj_arr_dyn) VALUES
('Hello', {b = 1}, [{u = 1},{v = 1}]);
-- INSERT OK, 1 row affected (0.140 sec)
With version 4.2.7 the following query fails:
INSERT INTO dynamic_objects_copy (col1, obj_dyn, obj_arr_dyn)
SELECT col1, obj_dyn, obj_arr_dyn FROM dynamic_objects;
Tested with 4.2.7 (workaround for bug crate#11386):
INSERT INTO dynamic_objects_copy (col1, obj_dyn, obj_arr_dyn)
SELECT col1, obj_dyn::TEXT::OBJECT, obj_arr_dyn::ARRAY(TEXT)::ARRAY(OBJECT) FROM dynamic_objects;
If the columns already exist:
INSERT INTO dynamic_objects_copy (col1, obj_dyn, obj_arr_dyn)
SELECT col1, obj_dyn::TEXT, obj_arr_dyn::ARRAY(TEXT) FROM dynamic_objects;
Tested with 4.6.3 (works)
INSERT INTO dynamic_objects_copy (col1, obj_dyn, obj_arr_dyn)
SELECT col1, obj_dyn, obj_arr_dyn FROM dynamic_objects;
SELECT column_name, data_type FROM information_schema.columns
WHERE table_name = 'dynamic_objects_copy' AND column_name NOT LIKE '%[%';
+-------------+--------------+
| column_name | data_type |
+-------------+--------------+
| obj_arr_dyn | object_array |
| col1 | text |
| obj_dyn | object |
+-------------+--------------+
I am using SQL Server 2008, and I am trying to insert a string into a table, but only part of the string is inserted. I have checked whether it is a SQL injection risk. How do I solve or avoid this problem?
insert into tble (col1, col2, col3)
values (23, 34, "out of 8 works, 5 works are completed");
Only "out of 8 works" is inserted; ", 5 works are completed" was skipped.
The double quotes will only work if QUOTED_IDENTIFIER is turned off. If you are worried about SQL injection, then don't concatenate the string into the INSERT - parameterise it from the application. The string could also be truncated because col3 is not defined long enough - check that as well.
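A minimal sketch of both points, reusing the table and column names from the question and assuming col3 is an nvarchar:
-- With QUOTED_IDENTIFIER OFF, double quotes delimit strings rather than identifiers
SET QUOTED_IDENTIFIER OFF;
INSERT INTO tble (col1, col2, col3)
VALUES (23, 34, "out of 8 works, 5 works are completed");

-- Safer: keep QUOTED_IDENTIFIER ON and parameterise, e.g. via sp_executesql
SET QUOTED_IDENTIFIER ON;
EXEC sp_executesql
    N'INSERT INTO tble (col1, col2, col3) VALUES (@a, @b, @c)',
    N'@a int, @b int, @c nvarchar(100)',
    @a = 23, @b = 34, @c = N'out of 8 works, 5 works are completed';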
Try this:
insert into table(col1,col2,col3) values (23,34,'out of 8 works, 5 works are completed');
Change the size of col3 in your table (based on your string size).
You can change it by using the following query:
alter table tble
alter column col3 nvarchar(100) [null | not null]
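If you want to confirm the current length before altering, a quick check against the system catalog (assuming the table name tble from the question) is:
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'tble' AND COLUMN_NAME = 'col3';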
I have a table of 3.3 million records and don't want to copy the entire thing from dev to prod (prod is on a client-controlled machine and I can't get the linked server working correctly).
I only want to copy 300 or so of these records. How do I generate the 300 insert statements?
My select SQL that I want the inserts for is:
select * from data where ID > 9000;
I want a query that will print out all the INSERTS so that I can copy and run it on the production box.
I see you tagged your post SQL-Server-2005; that's too bad, because version 2008 has a wizard tool for that.
You could build the insert statements out of concatenated strings.
If field1 is a string and field2 numeric:
select 'insert into data (field1, field2) values(''' + field1 + ''', ' + cast(field2 as varchar(20)) + ');' from data where ID > 9000;
Obviously that can be time-consuming if you have lots of columns, considering that the string values need quotes. You may have to convert the numeric columns using CAST too.
That should give you a list of insert statements, like this:
insert into data (field1, field2) values('A', 10);
insert into data (field1, field2) values('B', 20);
insert into data (field1, field2) values('C', 30);
Maybe that's not the most elegant way to do this, but it works.
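One refinement worth considering, sketched here under the same assumed schema: real data often contains embedded single quotes and NULLs, which the naive concatenation above would break on.
-- Escape embedded single quotes, and emit an unquoted NULL for NULL values
select 'insert into data (field1, field2) values ('
     + coalesce('''' + replace(field1, '''', '''''') + '''', 'NULL') + ', '
     + coalesce(cast(field2 as varchar(20)), 'NULL') + ');'
from data
where ID > 9000;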
I have situations where I need to write multiple rows of the same values to set up some tables. Say I have to add 120 rows with two columns populated. I am looking for a shortcut, instead of having the INSERT line repeated n times. How do I do this?
In SQL Server Management Studio, you can use the "GO" keyword with a parameter:
INSERT INTO YourTable(col1, col2, ...., colN)
VALUES(1, 'test', ....., 25)
GO 120
But that works only in Mgmt Studio (it's not a proper T-SQL command - it's a Mgmt Studio command word).
Marc
How about
Insert Table( colsnames )
Select Top 120 @value1, @Value2, etc.
From AnyTableWithMoreThan120Rows
Just make sure the types of the values in the @value list match the colNames list.
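A concrete instance of this trick (the declared values are made up for illustration): master..spt_values is a SQL Server system table with well over 120 rows, so it is often used as the throwaway row source.
DECLARE @value1 int, @value2 varchar(20);
SELECT @value1 = 1, @value2 = 'test';

INSERT INTO YourTable (col1, col2)
SELECT TOP 120 @value1, @value2
FROM master..spt_values;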
What about:
insert into tbl1
(col1,col2)
(select top 120 @value1,@value2 from tbl2)
If you're on SQL Server 2008, you can use the new row-constructor syntax to insert multiple rows into a table in a single query:
insert into tbl1
(col1,col2)
values
(@value1,@value2),(@value1,@value2),.....(@value1,@value2)
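For completeness, a concrete, runnable instance of the row-constructor form (table and values made up for illustration); note that a single VALUES clause is limited to 1000 row constructors in SQL Server.
insert into tbl1 (col1, col2)
values ('row 1', 1),
       ('row 2', 2),
       ('row 3', 3);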
Put the values in an unused table for safekeeping. From there you can insert from this table into the tables you need to set up.
Create an Excel spreadsheet with your data.
Import the spreadsheet into SQL Server.
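One way to do that import directly from T-SQL is OPENROWSET with the Excel OLE DB provider; this is a sketch only, and it assumes the ACE provider is installed, ad hoc distributed queries are enabled, and the file path and sheet name shown exist:
SELECT * INTO StagingTable
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\data\setup_rows.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');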
You can even try something like this (just an example):
declare #tbl table(col1 varchar(20),col2 varchar(20))
; with generateRows_cte as
(
select
1 as MyRows
union all
select
MyRows+1
from generateRows_cte
where MyRows < 120
)
insert into #tbl(col1,col2)
select
'col1' + CAST(MyRows as varchar),'col2' + CAST(MyRows as varchar)
from generateRows_cte OPTION (MAXRECURSION 0)
select * from #tbl
Note: why not try a bulk insert into SQL Server from a dataset? I didn't notice at first that you have a front end (VB) too!
I'm using OPENROWSET(BULK ...) to insert the contents of a file into my table. The problem is that I also need to specify the value of another column in the same INSERT statement.
I have something like this:
INSERT INTO MyTable
SELECT *
FROM OPENROWSET(BULK 'c:\foo.bin', SINGLE_BLOB)
I'm sure there's a way to also specify the value of a different column here, but I don't know how.
Found it; it was in the link posted by astandar, but under example D:
INSERT INTO MyTable (col1, col2)
SELECT 'foo' AS col1, *
FROM OPENROWSET(BULK N'c:\foo.bin', SINGLE_BLOB) AS col2