I want to create a table which should contain a maximum of 5 rows; whenever there is a 6th insert operation, the last row must be deleted.
I want a maximum of 5 rows, and I want to do this in an SQLite database on Android.
Please suggest a query which is simple.
You could do a row count in your application. If it returns five or more rows, you delete the last row before inserting the new one.
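A rough sketch of that approach in plain SQL (my_table, id and value are placeholder names; swap MIN for MAX if "last row" means the newest one):

-- 1) count the existing rows
SELECT COUNT(*) FROM my_table;

-- 2) if the count is already 5, remove the oldest row first
DELETE FROM my_table WHERE id = (SELECT MIN(id) FROM my_table);

-- 3) then insert the new row
INSERT INTO my_table (value) VALUES ('new value');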
UPDATE:
I did a bit of testing on a table which I named test_table, and added this trigger:
DELIMITER $$
CREATE TRIGGER `schema_name`.`trigger_name` BEFORE INSERT ON `database_name`.`test_table`
FOR EACH ROW
BEGIN
  IF (SELECT COUNT(*) FROM test_table) > 4 /* count records to see if it exceeds your limit */
  THEN
    DELETE FROM test_table WHERE id = (SELECT MAX(id) FROM test_table LIMIT 1); /* delete last row */
  END IF;
END$$
DELIMITER ;
When I try to insert a sixth row I'm getting this error:
ERROR 1442: 1442: Can't update table 'test_table' in stored function/trigger because it is already used by statement which invoked this stored function/trigger.
I looked it up and found this answer: https://stackoverflow.com/a/21117071/1355562 - which led me to http://dev.mysql.com/doc/refman/5.6/en/stored-program-restrictions.html.
The section "Restrictions for Stored Functions", point 5, says "... cannot modify a table that is already being used..."
Looks like you have to do this in your application, bro ;)
Oops... just noticed you're working with SQLite. Maybe it's different there; this was for MySQL. Sorry about that...
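That said, as far as I know SQLite does not have this restriction, so a trigger roughly like the following may work there (untested sketch; id and test_table are assumptions, and use MAX(id) instead if you want to drop the newest row):

CREATE TRIGGER keep_five_rows AFTER INSERT ON test_table
WHEN (SELECT COUNT(*) FROM test_table) > 5
BEGIN
  -- drop the oldest row so that only 5 remain
  DELETE FROM test_table WHERE id = (SELECT MIN(id) FROM test_table);
END;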
I have an ORDS-enabled schema which accepts bulk JSON records, splits them one by one, and inserts them one by one into the UEM table.
I've tried to create a trigger which fetches the last inserted row's ID and uses that value to insert into another table. The problem is that the trigger below only fetches and inserts the last inserted row's ID, and it only does one insert.
To be more specific:
ORDS gets a bulk JSON payload which consists of 4 records.
The POST handler starts a procedure which splits these 4 records by line break and immediately inserts them into the CLOB columns of the UEM table as 4 separate rows by using "connect by level". There is also the ID column, which is automatically created and incremented.
In parallel I would also like to get the ID of these rows and use it in another table insert. I've created the compound trigger below, but this trigger only retrieves the ID of the last record and inserts only one row.
Why do you think it behaves like this? In the end, the procedure "inserted" 4 records.
CREATE OR REPLACE TRIGGER TEST_TRIGGER5
FOR INSERT ON UEM
COMPOUND TRIGGER
  lastid NUMBER;

  AFTER STATEMENT IS
  BEGIN
    SELECT MAX(ID) INTO lastid FROM UEM;
    INSERT INTO SPRINT1 (tenantid, usersessionid, newuser, totalerrorcount, userid)
    VALUES ('deneme', 'testsessionid', 'yes', lastid, 'asdasfqwqwe');
  END AFTER STATEMENT;
END TEST_TRIGGER5;
inserts them into the CLOB columns of the UEM table as 4 separate rows by using "connect by level".
You have 1 INSERT statement that is inserting 4 rows.
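To illustrate (just a sketch, since the procedure's SQL isn't shown in the question; the column name is made up), a CONNECT BY LEVEL insert like this produces four rows but is still a single statement:

INSERT INTO UEM (payload_clob)
SELECT 'record ' || LEVEL
FROM dual
CONNECT BY LEVEL <= 4;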
In parallel I would also like to get the ID of these rows and use it in another table insert. I've created the compound trigger below, but this trigger only retrieves the ID of the last record and inserts only one row.
Why do you think it behaves like this? In the end, the procedure "inserted" 4 records.
It may have inserted 4 ROWS but there was only 1 STATEMENT and you are using an AFTER STATEMENT trigger. If you want it to run for each row then you need to use a row-level trigger.
CREATE OR REPLACE TRIGGER TEST_TRIGGER5
AFTER INSERT ON UEM
FOR EACH ROW
BEGIN
INSERT INTO SPRINT1 (tenantid, usersessionid, newuser, totalerrorcount, userid)
VALUES ('deneme', 'testsessionid', 'yes', :NEW.id, 'asdasfqwqwe');
END TEST_TRIGGER5;
/
db<>fiddle here
Why? Because it is a statement-level trigger. If you wanted it to fire for each row, you'd - obviously - use a row-level trigger, which has the FOR EACH ROW clause.
I have the following table
Data --Table name
ID -- Identity column
PCode -- Postal Code
I created the following trigger:
CREATE TRIGGER Trig
ON Data
FOR INSERT
AS
BEGIN
Select * from inserted
END
And inserted the following values
INSERT INTO Data VALUES (125)
INSERT INTO Data VALUES (126)
INSERT INTO Data VALUES (127)
It shows only the row from the current insert each time, but I was expecting something like this:
After the 1st insertion, the trigger is executed -> one row is shown in the inserted table.
After the 2nd insertion, the trigger is executed -> two rows are shown in the inserted table.
After the 3rd insertion, the trigger is executed -> three rows are shown in the inserted table.
According to MSDN, all the inserted rows are in this table.
How can I access the inserted table so that I can see all the expected rows together, not separately?
You cannot. From the Use the inserted and deleted Tables article on microsoft.com, you can read:
The inserted table stores copies of the affected rows during INSERT and UPDATE statements.
That means that the inserted table will only contain rows for the current INSERT or UPDATE statement.
If you do want to see all rows for several such INSERT or UPDATE statements, you will have to store these rows in a table you created yourself.
There are 2 tables available in a trigger: inserted and deleted. Each update on table XXX is actually a delete of row X from XXX followed by an insert of row X into XXX, so the inserted table inside the trigger is a copy of what got inserted. You can do a lot with a trigger, but triggers are dangerous.
For example, on a performance gig I found a huge stored procedure being run by a trigger; we dropped it and the database came back online. As another example, if you write a login-auditing trigger incorrectly, you can take the server down.
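As a small illustration of those two tables during an UPDATE (a sketch against the Data table from this question; the trigger name is made up):

CREATE TRIGGER Trig_Update
ON Data
FOR UPDATE
AS
BEGIN
    -- deleted holds the old values and inserted the new ones,
    -- but only for the rows touched by the current statement
    SELECT d.ID, d.PCode AS OldPCode, i.PCode AS NewPCode
    FROM deleted d
    JOIN inserted i ON i.ID = d.ID
END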
As TT mentioned, if you want to see all the inserted records, you need to change your trigger to something like this:
CREATE TRIGGER Trig
ON Data
FOR INSERT
AS
BEGIN
    -- SELECT ... INTO creates "tablename" on the fly and fails if it already exists,
    -- so for repeated inserts create the table once and use INSERT INTO ... SELECT instead
    SELECT *
    INTO "tablename"
    FROM inserted
END
I did some research about SQL batch inserts. Let's say I have 100k items to be inserted, and I set the batch size to 100.
If the ID column is not marked as Identity, then that bulk insert will work.
But I found a quite interesting (theoretical so far) problem, and I need some opinions:
The problem is: if, say, 5 users are doing bulk inserts at the same time, how can the ID column value be provided safely? I can't just take the table row count + 1, because that way all 5 users would end up with duplicate IDs and the bulk insert operation would fail.
You can use a SEQUENCE as a unique ID generator, or try a TRIGGER ON INSERT to get a unique ID.
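For example, with a sequence every writer draws its IDs from the same generator, so concurrent bulk inserts cannot collide. A sketch in SQL Server syntax (the table, sequence and staging names are assumptions):

CREATE SEQUENCE ItemIdSeq START WITH 1 INCREMENT BY 1;

INSERT INTO Items (ID, Name)
SELECT NEXT VALUE FOR ItemIdSeq, Name
FROM StagingItems;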
EDIT
With MySQL you can build a trigger for every row:
DELIMITER $$
CREATE TRIGGER adresse_trigger_insert_check
BEFORE INSERT ON adresse
FOR EACH ROW BEGIN
  IF NEW.land IS NULL THEN
    SET NEW.land := 'XY';
  END IF;
END$$
DELIMITER ;
Should I Use IDENTITY or Not?
Sometimes Another Approach Works Better
I have a requirement that a trigger should fire when any row is inserted into or deleted from table FAB, which contains num as a unique value; depending on that num value, another table should be updated.
e.g.
FAB table

num    code     trs
10     A2393    80
20     B3445    780

Reel table

reelnr    num    use    flag
340345    10     500    1
When num 10 gets deleted from the FAB table (or any new num gets inserted), the trigger should fire, check the Reel table for the row that contains that num value, and give the reelnr.
How do I proceed with this?
You can use the Inserted and Deleted tables in SQL Server.
These two tables are special tables that are only available inside the scope of a trigger.
If you try to use these tables outside the scope of a trigger, you will get an error.
Inserted: this table keeps track of any new records inserted into the table.
Suppose six rows are inserted into your table; this table will then contain all six inserted rows.
Deleted: this table tracks the records deleted from your table.
The rows removed by the last DELETE statement are available here.
For Insert:
CREATE TRIGGER TR_AUDIT_Insert
ON Reel_table
FOR INSERT
AS
BEGIN
    INSERT INTO Reel_table (reelnr, num, [use], flag)
    SELECT
        reelnr,
        num,
        [use],
        flag
    FROM
        INSERTED
END
For Delete:
CREATE TRIGGER TR_AUDIT_Delete
ON Product
FOR DELETE
AS
BEGIN
    INSERT INTO Reel_table (reelnr, num, [use], flag)
    SELECT
        reelnr,
        num,
        [use],
        flag
    FROM
        DELETED
END
Note:
I don't know where the reelnr, use and flag values are coming from, so please modify this as per your need.
This is the format of the triggers that we normally use.
You can also do this with a single trigger; I don't know what your exact requirement is.
If you want to achieve it with only a single trigger, you can refer to this link:
Refer
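For completeness, a single trigger covering both operations could look roughly like this (a sketch; the table and column names follow the question, and what you do with the matched reelnr values is up to you):

CREATE TRIGGER TR_FAB_InsertDelete
ON FAB
FOR INSERT, DELETE
AS
BEGIN
    -- reelnr values for num values just inserted into FAB
    SELECT r.reelnr FROM Reel_table r JOIN inserted i ON i.num = r.num;

    -- reelnr values for num values just deleted from FAB
    SELECT r.reelnr FROM Reel_table r JOIN deleted d ON d.num = r.num;
END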
I have written a trigger which transfers a record from a table members_new to members_old. The function of the trigger is to insert a record into members_old after an insert into members_new. So suppose a record is inserted into members_new like:
nMmbID    nMmbName    nMmbAdd
1         Abhi        Bangalore
This record will get inserted into members_old, which has the same structure as the source table.
My trigger looks like this:
create trigger add_new_record
after insert on members_new
for each row
  INSERT INTO `test`.`members_old`
    (`nMmbID`, `nMmbName`, `nMmbAdd`)
  SELECT
    `members_new`.`nMmbID`,
    `members_new`.`nMmbName`,
    `members_new`.`nMmbAdd`
  FROM `test`.`members_new`
  -- read only the last record from members_new to stop duplication in members_old
  -- and reduce the chances of any error
  WHERE nMmbID = (SELECT MAX(nMmbID) FROM `test`.`members_new`);
This trigger is working for now, but my concern is what will happen if multiple insertions happen at the same instant.
Will it reduce the performance?
Will I ever face a deadlock in any case, since my members_old has FKs?
If there is a better solution for this situation, please shed some light on it.
From the manual:
You can refer to columns in the subject table (the table associated with the trigger) by using the aliases OLD and NEW. OLD.col_name refers to a column of an existing row before it is updated or deleted. NEW.col_name refers to the column of a new row to be inserted or an existing row after it is updated.
create trigger add_new_record
after insert on members_new
for each row
  INSERT INTO `test`.`members_old`
  SET
    `nMmbID`   = NEW.nMmbID,
    `nMmbName` = NEW.nMmbName,
    `nMmbAdd`  = NEW.nMmbAdd;
And you will have no problem with deadlocks or whatever. It should also be much faster, because you don't have to read the max value first (which is also unsafe and might lead to compromised data). Read about isolation levels and transactions if you're interested in why.