I want to insert a random value from another table into each row of my bulk insert, like this:
INSERT INTO insertedtable
SELECT s.a,
s.b,
s.c,
s.d,
s.e,
(SELECT *
FROM (SELECT pruchasecode
FROM pruchtable
ORDER BY dbms_random.value)
WHERE rownum < 2)
FROM seltable s
But it inserts the same value into every row.
How can I insert a different random value for each row in a bulk insert statement?
I thought of a loop, but I want to keep the execution cost low.
Thanks.
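One workaround sometimes suggested for this (untested sketch; Oracle evaluates an uncorrelated scalar subquery once and caches the result, so referencing the outer row forces re-evaluation):

```sql
-- Hedged sketch: correlating the scalar subquery on the outer row's ROWID
-- defeats Oracle's scalar subquery caching, so a fresh pruchasecode can be
-- picked for each row of seltable. Untested; behavior may vary by version.
INSERT INTO insertedtable
SELECT s.a, s.b, s.c, s.d, s.e,
       (SELECT pruchasecode
          FROM (SELECT pruchasecode
                  FROM pruchtable
                 ORDER BY dbms_random.value)
         WHERE ROWNUM = 1
           AND s.ROWID = s.ROWID)  -- correlation only to block caching
  FROM seltable s;
```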
Setup:
CREATE TABLE tbl (group_id integer, id integer)
INSERT INTO tbl VALUES (1, 1)
INSERT INTO tbl VALUES (1, 2)
INSERT INTO tbl VALUES (1, 3)
INSERT INTO tbl VALUES (2, 1)
INSERT INTO tbl VALUES (2, 2)
INSERT INTO tbl VALUES (3, 1)
...
INSERT INTO tbl VALUES (999, 999999)
Query #1
SELECT * FROM tbl WHERE group_id > 1 AND group_id < 10 AND id > 100
Query #2
SELECT * FROM tbl WHERE id > 100 AND group_id > 1 AND group_id < 10
Is there any performance difference between query #1 vs query #2?
More specifically, is there any advantage or disadvantage querying a "lower resolution" column (eg group_id) before a "higher resolution" column (eg id)?
(Using Postgresql)
If you look at the EXPLAIN output you will find that the order of your WHERE clauses has no impact on the query plan, execution, or performance. From a PostgreSQL standpoint your queries are functionally equivalent.
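You can check this yourself; PostgreSQL normalizes the predicates, so both orderings produce the same plan (sketch; actual plan and costs depend on your data and indexes):

```sql
-- Run both orderings and compare the output: the plan is identical,
-- e.g. a Seq Scan with
--   Filter: ((group_id > 1) AND (group_id < 10) AND (id > 100))
-- regardless of the order the predicates were written in.
EXPLAIN ANALYZE SELECT * FROM tbl WHERE group_id > 1 AND group_id < 10 AND id > 100;
EXPLAIN ANALYZE SELECT * FROM tbl WHERE id > 100 AND group_id > 1 AND group_id < 10;
```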
I need to insert a row into one table and use this row's id to insert two more rows into a different table within one transaction. I've tried this
begin;
insert into table default values returning table.id as C;
insert into table1(table1_id, column1) values (C, 1);
insert into table1(table1_id, column1) values (C, 2);
commit;
But it doesn't work. How can I fix it?
You need a CTE, and you don't need BEGIN/COMMIT to do it in one transaction; a single statement already executes as one transaction:
WITH inserted AS (
    INSERT INTO ... RETURNING id
)
INSERT INTO other_table (id)
SELECT id
FROM inserted;
Edit:
To insert two rows into a single table using that id, you could do that two ways:
two separate INSERT statements, one in the CTE and one in the "main" part
a single INSERT which joins on a list of values; a row will be inserted for each of those values.
With these tables as the setup:
CREATE TEMP TABLE t1 (id INTEGER);
CREATE TEMP TABLE t2 (id INTEGER, t TEXT);
Method 1:
WITH inserted1 AS (
    INSERT INTO t1
    SELECT 9
    RETURNING id
), inserted2 AS (
    INSERT INTO t2
    SELECT id, 'some val'
    FROM inserted1
    RETURNING id
)
INSERT INTO t2
SELECT id, 'other val'
FROM inserted1;
Method 2:
WITH inserted AS (
    INSERT INTO t1
    SELECT 4
    RETURNING id
)
INSERT INTO t2
SELECT id, v
FROM inserted
CROSS JOIN (
    VALUES
        ('val1'),
        ('val2')
) vals(v);
If you run either, then check t2, you'll see it will contain the expected values.
Please find the below query (SQL Server; SCOPE_IDENTITY() returns the last identity value generated in the current scope):
insert into table1(columnName)values('stack2');
insert into table_2 values(SCOPE_IDENTITY(),'val1','val2');
I have a table I want to insert into based on two other tables.
In the table I'm inserting into, I need to find the MAX value and add 1 each time, to generate a new ID for each of the 2000 rows I'm inserting.
I tried something like
MAX(column_name) + 1
But it didn't work. I CANNOT make the column an IDENTITY and ideally the increment by one should happen in the INSERT INTO ... SELECT ... statement.
Many Thanks!
You can declare a variable with the last value from the table and use it in the insert statement, like this:
DECLARE @Id INT
SET @Id = (SELECT TOP 1 Id FROM YourTable ORDER BY Id DESC)
INSERT INTO YourTable VALUES (@Id + 1, Value, Value)
If it's MySQL, you could do something like this:
insert into yourtable
select @rownum := @rownum + 1 as colname, t.*
from yourtable t, (SELECT @rownum := 2000) r
The example to generate the row number was taken from here.
If it's PostgreSQL, you could use:
insert into yourtable
select t.*, ((row_number() over ()) + 2000) from yourtable t
Please note the column order in the select is different between the two queries, so you may need to adjust your insert statement accordingly.
Use a sequence, that's what they are for.
create sequence table_id_sequence;
Then adjust the sequence to the current max value:
select setval('table_id_sequence', (select max(id_column) from the_table));
The above only needs to be done once.
After the sequence is set up, always use that for any subsequent inserts:
insert into the_table (id_column, column_2, column_3)
select nextval('table_id_sequence'), column_2, column_3
from some_other_table;
If you will never have any concurrent inserts into that table (but only then), you can get away with using max() + 1:
insert into the_table (id_column, column_2, column_3)
select row_number() over () + mx.id, column_2, column_3
from some_other_table
cross join (
    select max(id_column) from the_table
) as mx(id);
But again: the above is NOT safe for concurrent inserts.
The sequence solution will also perform better, especially as the target table grows in size.
How do I select data from a table, populate a grid view with it, and then insert that data into another table at the same time?
I am using an event, OnTextchanged.
OnTextChanged:
1.
select snum, itemname, desc
from item_tbl
2.
insert into detail
values snum, item, desc
left join item_tbl
where snum(of text box) = item_tbl.snum
Use the OUTPUT clause to return the inserted rows; it lets you insert and select the rows at the same time.
INSERT INTO Insert_table
    (col1, col2, ...)
OUTPUT inserted.col1,
       inserted.col2,
       ...
SELECT col1,
       col2,
       ...
FROM select_table
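For comparison, OUTPUT is SQL Server syntax; on PostgreSQL the same insert-and-return-rows pattern uses RETURNING (sketch; the column names are assumed from the example above):

```sql
-- PostgreSQL equivalent: RETURNING hands back the rows just inserted.
INSERT INTO Insert_table (col1, col2)
SELECT col1, col2
FROM select_table
RETURNING col1, col2;
```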
I have this already working:
INSERT INTO TermsFinal
(old_classification, count, new_classification, old_term,new_term)
SELECT old_classification , Count(seed) AS count , new_classification, old_term, new_term FROM TermsTemp
GROUP BY old_classification
ORDER BY count DESC
There is one more field in the TermsFinal called SOURCE_TABLE which TermsTemp does not have.
I would like to populate that field too. I already have the $source_table value. I tried this, but it did not work.
INSERT INTO TermsFinal
(SOURCE_TABLE,old_classification, count, new_classification, old_term,new_term)
'{$SOURCE_TABLE}', SELECT old_classification , Count(seed) AS count , new_classification, old_term, new_term FROM TermsTemp_TEMP
GROUP BY old_classification
ORDER BY count DESC
How do you add that value into the SOURCE_TABLE field of the TermsFinal while executing the insert into statement in one go?
And the other puzzling thing to me here: how come my first SQL INSERT INTO works without the SQL keyword VALUES? This page http://www.w3schools.com/sql/sql_insert.asp teaches that the VALUES part is needed!
You can put a string (or any other type of) constant into a select, for example
select 'string' as const_str, field1 from table1
will return 2 columns; the first will have the text "string" for all rows. In your case you can do:
INSERT INTO TermsFinal
(SOURCE_TABLE,old_classification, count, new_classification, old_term,new_term)
SELECT '{$SOURCE_TABLE}', old_classification , Count(seed) AS count , new_classification, old_term, new_term FROM TermsTemp_TEMP
GROUP BY old_classification
ORDER BY count DESC
There are several ways to insert data into a table
One row at a time. This is where you need the values keyword.
Insert into TableA (Col1, Col2, ...) values (@Val1, @Val2, ...)
You can get the id using select @@identity (at least in MS SQL), if you have auto identity
on. This is useful when you need the ID for your next insert.
From select (what you're doing)
Insert into tableA (Col1, Col2)
Select Val1, Val2 from TableB
or insert a hard coded value and values from two separate tables
Insert into tableA (Col1, Col2, Col3, Col4)
Select 'hard coded value', b.Val1, b.Val2, c.Val1
from TableB b join TableC c on b.TableCID = c.ID
Now you can insert more than one row at once.
Select into
This method ends up creating a new table from a query, and is great for quick backups of a table, or part of a table.
select * into TermsFinal_backup from TermsFinal where ...
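Note that SELECT ... INTO as a table-creating statement is SQL Server syntax; PostgreSQL accepts it too but its documentation recommends CREATE TABLE ... AS instead, which is also what MySQL uses (sketch, same elided predicate as above):

```sql
-- Portable equivalent of SELECT ... INTO for quick backups.
CREATE TABLE TermsFinal_backup AS
SELECT * FROM TermsFinal WHERE ...;
```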