I'm trying to copy the last inserted row from a table into a csv file using a trigger.
CREATE TRIGGER new_tbc_order
AFTER INSERT
ON trig_test
FOR EACH ROW
EXECUTE PROCEDURE write_last_tbc_order();
CREATE OR REPLACE FUNCTION write_last_tbc_order()
RETURNS TRIGGER
LANGUAGE plpgsql
as $$
BEGIN
EXECUTE 'COPY (
select i.id, i.paid, i.no_items FROM (SELECT NEW.*) AS i
) TO ''/Users/fred/Desktop/last_tbc_order.csv'' csv;' USING NEW;
RETURN NEW;
END; $$
I've tried this in various incarnations, with and without EXECUTE, but I'm still getting this error:
ERROR: missing FROM-clause entry for table "new"
LINE 1: ...opy (select i.id, i.paid, i.no_items FROM (SELECT NEW.*) AS ...
^
I just cannot get it to access the NEW data.
Where am I going wrong?
The only way I could get this to work is something like this:
CREATE OR REPLACE FUNCTION write_last_tbc_order()
RETURNS TRIGGER
LANGUAGE plpgsql
as $$
BEGIN
EXECUTE 'COPY (
select id, paid, no_items FROM trig_test WHERE id = ' || NEW.id ||
') TO ''/Users/fred/Desktop/last_tbc_order.csv'' csv;';
RETURN NEW;
END; $$
COPY does not seem to 'see' the NEW record. Be aware that the above will fail if the Postgres server does not have permissions on /Users/fred/Desktop/, because COPY runs as the server user. Personally I think a better solution is to write to an audit table and periodically harvest the records from there, as sketched below.
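For illustration, here is a minimal sketch of that audit-table approach, assuming the id, paid and no_items columns from the question (the audit table name and the column types are assumptions):
CREATE TABLE trig_test_audit (
    id          integer,
    paid        boolean,      -- assumed type; match trig_test
    no_items    integer,      -- assumed type; match trig_test
    inserted_at timestamptz DEFAULT now()
);

CREATE OR REPLACE FUNCTION audit_tbc_order()
RETURNS TRIGGER
LANGUAGE plpgsql
AS $$
BEGIN
    -- a plain INSERT sees NEW directly; no dynamic SQL or quoting needed
    INSERT INTO trig_test_audit (id, paid, no_items)
    VALUES (NEW.id, NEW.paid, NEW.no_items);
    RETURN NEW;
END; $$;

CREATE TRIGGER audit_tbc_order
AFTER INSERT ON trig_test
FOR EACH ROW
EXECUTE PROCEDURE audit_tbc_order();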
Adrian's answer inspired me to experiment more with the NEW record where it was actually available.
The actual answer, as usual, turned out to be simple-ish:
CREATE OR REPLACE FUNCTION write_last_tbc_order()
RETURNS TRIGGER
AS
$$
BEGIN
EXECUTE 'copy (select '''||NEW.id||''','''||NEW.paid||''','''||NEW.no_items||''')
to ''/Users/shaun/Desktop/last_tbc_order.csv'' csv;' USING NEW;
RETURN NEW;
END;
$$
LANGUAGE plpgsql;
This keeps the NEW record outside of the COPY statement and builds the statement dynamically before the EXECUTE.
It still required the USING NEW part at the end to make it work correctly.
It also required all the quoting shown above.
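As a possible refinement (not the code from the answer above): on PostgreSQL 9.1 and later, format() with %L placeholders can do this quoting for you, avoiding most of the manual '' doubling. A sketch:
CREATE OR REPLACE FUNCTION write_last_tbc_order()
RETURNS TRIGGER
LANGUAGE plpgsql
AS $$
BEGIN
    -- %L quotes each value as an SQL literal, including the target path
    EXECUTE format(
        'COPY (SELECT %L, %L, %L) TO %L csv',
        NEW.id, NEW.paid, NEW.no_items,
        '/Users/shaun/Desktop/last_tbc_order.csv');
    RETURN NEW;
END; $$;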
UPDATE:
The above code does require almost everything to be running as a Postgres superuser (as mentioned by several commenters), which is not ideal.
To get around this, you can create a PlPython function as follows:
CREATE OR REPLACE FUNCTION write_last_tbc_order()
RETURNS TRIGGER AS
'
import os
# str() every value, and use the no_items column name from the table above
row = str(TD["new"]["id"]) + "," + str(TD["new"]["paid"]) + "," + str(TD["new"]["no_items"])
path = "/tmp/db-data/last_order.csv"
with open(path, "w") as o:
    os.chmod(path, 0o644)
    o.write(row)
return None
'
LANGUAGE plpythonu;
This function must be created as a Superuser, but it can be used in triggers created by standard users, and those triggers also work when inserts are fired by standard users.
The directory the file is written to must be writable by the postgres user, with every part of the path accessible, so I created a subdirectory of /tmp with 777 permissions.
I also needed to add the os.chmod 644 call to make the file readable by other users on the system; without it, the file is created with permissions of 600.
FURTHER NOTES:
Apache doesn't like you to set up a virtual directory inside the /tmp directory, so in the end I had to create another one called /tmp-wh specifically for this purpose.
I'm trying to create users with the prefix sit from a list of the greek alphabet. Like sit-alpha, sit-beta, so on.
This is what I have so far:
BEGIN
LIST:= 'alpha, beta, gamma';
FOR u IN LIST
LOOP
EXECUTE IMMEDIATE 'CREATE USER SIT-' || TO_CHAR (U)||' IDENTIFIED BY
CLERK'||TO_CHAR (U) ;
EXECUTE IMMEDIATE 'GRANT CONNECT, RESOURCE TO SIT-'||TO_CHAR(U);
END LOOP;
END;
But it says "PLS-00201: identifier 'LIST' must be declared". How do I do this properly?
Your code is mostly correct. The error "PLS-00201: identifier 'LIST' must be declared" means that the LIST variable you use is never declared; this error occurs when a variable has not been declared or initialized correctly. You must declare LIST in a DECLARE section at the beginning of your block. You also need to split the comma-separated list of values into individual elements, for example with the REGEXP_SUBSTR function, so you can loop through them. Like this:
DECLARE
  LIST VARCHAR2(200) := 'alpha, beta, gamma';
BEGIN
  FOR u IN (SELECT TRIM(REGEXP_SUBSTR(LIST, '[^,]+', 1, LEVEL)) AS val
            FROM dual
            CONNECT BY LEVEL <= REGEXP_COUNT(LIST, ',') + 1)
  LOOP
    -- the hyphen is not legal in an unquoted identifier, so the
    -- username must be wrapped in double quotes
    EXECUTE IMMEDIATE 'CREATE USER "SIT-' || u.val || '" IDENTIFIED BY CLERK' || u.val;
    EXECUTE IMMEDIATE 'GRANT CONNECT, RESOURCE TO "SIT-' || u.val || '"';
  END LOOP;
END;
Here's an explanation of the code:
The first line of the code is a DECLARE command that indicates the beginning of a variable declaration section. This section declares all the variables and constants that will be used in the rest of the code.
The second line declares a LIST variable of type VARCHAR2 with a maximum length of 200 characters. This variable contains a comma-separated list of the user names you wish to create.
The third line is the BEGIN command that indicates the start of the executable code block.
The fourth line begins a FOR loop that traverses each item in the list of usernames using a SELECT query.
The SELECT query uses the regexp_substr function to retrieve each comma-separated item in the list and returns it as val. The trim function removes any stray spaces around each element. The CONNECT BY clause generates the rows necessary for the FOR loop to traverse each element in the list (see the standalone example after this list).
The FOR loop uses the variable u to store each element in the list at each iteration.
The two EXECUTE IMMEDIATE commands inside the FOR loop create a new user using the username SIT- followed by the value of val, and then add CONNECT and RESOURCE privileges to that user.
Finally, the last line of code is the END command, which indicates the end of the executable code block.
In summary, this code extracts each item from the list of user names, creates a new user with a name that includes that item, and then adds CONNECT and RESOURCE privileges to that user. This code uses Oracle functions to extract each item from the list and dynamically create the users.
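To see what the splitting query produces on its own, run just the SELECT; the expected output is shown as comments:
SELECT TRIM(REGEXP_SUBSTR('alpha, beta, gamma', '[^,]+', 1, LEVEL)) AS val
FROM dual
CONNECT BY LEVEL <= REGEXP_COUNT('alpha, beta, gamma', ',') + 1;

-- VAL
-- -----
-- alpha
-- beta
-- gamma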
Total newbie in AL here; I'm experimenting with automated testing in Dynamics BC for a new deployment. As a very basic test, I'd like to simply create a new Item record in the Cronus test database and validate each field. I'm running into trouble when I try to select the value for an enum field. Here's the code I'm using:
Procedure AddGoodItem()
// [Given] Good item data
var
recItem: Record Item Temporary;
Begin
recItem."Description" := 'zzzz';
recItem.validate("Description");
recItem.Type := recItem.Type::Inventory;
recItem.Validate(Type);
recItem."Base Unit of Measure" := 'EA';
recItem.Validate("Base Unit of Measure");
recItem."Item Category Code" := 'FURNITURE';
recItem.validate("Item Category Code");
End;
When I run this in Cronus, I get the error:
You cannot change the Type field on Item because at least one Item Journal Line exists for this item.
If I comment the Type lines out, the process runs successfully.
Given that this is a temporary record, it shouldn't have any Item Journal Lines, should it? What am I missing?
The code in the OnValidate trigger still runs, even if you have marked the Record variable as temporary. In addition, temporary is not inherited by the underlying variables, meaning any Record variables used in the OnValidate trigger are not temporary (unless marked as such).
There are two options:
Leave out the line recItem.Validate(Type); if the code run by the OnValidate trigger is not relevant in this case (see the sketch after this list).
Replace the line recItem.Validate(Type); with your own clone of the code from the OnValidate trigger and then remove the unneeded parts.
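For illustration, a minimal sketch of option 1 - the test from the question with the recItem.Validate(Type); line removed (the direct assignment still sets the field on the temporary record):
procedure AddGoodItem()
var
    recItem: Record Item temporary;
begin
    recItem.Description := 'zzzz';
    recItem.Validate(Description);
    // Direct assignment only; skipping Validate(Type) avoids the
    // Item Journal Line check in the OnValidate trigger
    recItem.Type := recItem.Type::Inventory;
    recItem."Base Unit of Measure" := 'EA';
    recItem.Validate("Base Unit of Measure");
    recItem."Item Category Code" := 'FURNITURE';
    recItem.Validate("Item Category Code");
end;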
Every time I try to run my database script I reach the last few lines that have my 2 triggers and my script stops after compiling the first trigger.
These are the 2 triggers I have; the script compiles "Player Round Trigger" and then stops, never reaching my second trigger, "Handicap Trigger":
--
-- Player Round Trigger
--
CREATE TRIGGER playerRoundUpdateAudit BEFORE UPDATE ON PlayerRound
FOR EACH row BEGIN
INSERT INTO PlayerRoundAudit(old_data_PlayerID, old_data_RoundID, old_data_Holenumber, old_data_holeScore,
new_data_PlayerID, new_data_RoundID, new_data_Holenumber, new_data_holeScore, tbl_name)
VALUES (OLD.playerID, OLD.roundID, OLD.holeNumber, OLD.holeScore, NEW.playerID, NEW.roundID, NEW.holeNumber, NEW.holeScore, "PlayerRound");
END;
/
--
-- Handicap Trigger
--
CREATE TRIGGER handicapUpdateAudit BEFORE UPDATE ON Handicap
FOR EACH row BEGIN
INSERT INTO HandicapAudit(old_data_handicapID, old_data_playerID, old_data_handicapDate, old_data_handicapScore,
new_data_handicapID, new_data_playerID, new_data_handicapDate, new_data_handicapScore, tbl_name)
VALUES (OLD.handicapID, OLD.playerID, OLD.handicapDate, OLD.handicapScore, NEW.handicapID, NEW.playerID, NEW.handicapDate, NEW.handicapScore, "Handicap");
END;
/
I'm running the script in Oracle SQL Developer version 4.1.2.20 (the newest one atm)
Actually, the first trigger compiles with errors, and that breaks the script.
You can do an experiment: change the header of the first trigger to CREATE OR REPLACE TRIGGER ....,
then in SQL Developer click inside the first trigger to move the cursor into its code, and press Ctrl-Enter - this executes the single statement the cursor is placed in (here, the CREATE statement of the first trigger).
Then examine the "Compiler log" window - you will see the error messages there.
There are two problems with this trigger:
you are using OLD.column_name and NEW.column_name, which is wrong. You need to use :OLD.column_name and :NEW.column_name, with a colon as a prefix;
you are using double quotes instead of single quotes here: "PlayerRound". Oracle doesn't interpret this as a string, but as an identifier (of a variable, a column name etc.). Use 'PlayerRound' in single quotes instead.
Change the first trigger as below and it should compile:
set define off
CREATE or replace TRIGGER playerRoundUpdateAudit BEFORE UPDATE ON PlayerRound
FOR EACH row BEGIN
INSERT INTO PlayerRoundAudit(old_data_PlayerID, old_data_RoundID,
old_data_Holenumber, old_data_holeScore,
new_data_PlayerID, new_data_RoundID, new_data_Holenumber,
new_data_holeScore, tbl_name)
VALUES (:OLD.playerID, :OLD.roundID, :OLD.holeNumber, :OLD.holeScore,
:NEW.playerID, :NEW.roundID, :NEW.holeNumber, :NEW.holeScore, 'PlayerRound');
end;
/
You also need to correct the second trigger, because it contains similar errors.
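Applying the same two fixes (colon prefixes and a single-quoted string literal), the corrected second trigger would look like this:
CREATE or replace TRIGGER handicapUpdateAudit BEFORE UPDATE ON Handicap
FOR EACH row BEGIN
  INSERT INTO HandicapAudit(old_data_handicapID, old_data_playerID,
                            old_data_handicapDate, old_data_handicapScore,
                            new_data_handicapID, new_data_playerID,
                            new_data_handicapDate, new_data_handicapScore, tbl_name)
  VALUES (:OLD.handicapID, :OLD.playerID, :OLD.handicapDate, :OLD.handicapScore,
          :NEW.handicapID, :NEW.playerID, :NEW.handicapDate, :NEW.handicapScore, 'Handicap');
end;
/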
Remark: put SET DEFINE OFF at the beginning of the script to turn off variable substitution; otherwise SQL Developer will prompt you to enter a value for each :NEW and :OLD.
I'm not so familiar with Oracle SQL Developer, but is there an option to run the script with "Execute as script"? This feature is available in TOAD...
I found this code somewhere to add a calc field to a TADOTable in Delphi ...
.....
procedure TfrmMain.ABSTable1CalcFields(DataSet: TDataSet);
begin
with ABSTable1 do
FieldByName('cost').AsFloat := FieldByName('price').AsFloat *
FieldByName('quantity').AsInteger;
// add new field cost as Price * quantity !!!!
end;
end.
Inside my app I create a TADOQuery at run time, like:
try
Fquery.sql.clear;
Fquery.sql.AddStrings(Amemo.lines);
Fquery.Open;
.....
finally
end;
How do I add similar calc fields to my query, derived from the first code fragment?
I think the only way you can easily do this is by creating a set of persistent TFields in the IDE (or do the equivalent creation of them in code before you open the dataset). Otherwise, when you call Open on the dataset, IIRC that will call BindFields and that - unless the dataset already has a set of TFields - will create a set of dynamic TFields which will last as long as the dataset is open, but will not include any calculated fields.
By the time BindFields has been called, it's too late to add any more, so you have to set them up beforehand or not at all.
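For illustration, a rough sketch of the in-code variant (PrepareQueryFields is a hypothetical helper; it assumes the SQL and connection are already assigned, since FieldDefs.Update has to talk to the database, and it must run before Fquery.Open):
procedure TfrmMain.PrepareQueryFields(AQuery: TADOQuery);
var
  i: integer;
  CostField: TFloatField;
begin
  // create a persistent TField for every column the SQL returns
  AQuery.FieldDefs.Update;
  for i:= 0 to AQuery.FieldDefs.Count - 1 do
    AQuery.FieldDefs[i].CreateField(AQuery);
  // then append the calculated field
  CostField:= TFloatField.Create(AQuery);
  CostField.FieldName:= 'cost';
  CostField.FieldKind:= fkCalculated;
  CostField.DataSet:= AQuery;
  // reuse the OnCalcFields handler from the first fragment,
  // adapted to reference AQuery instead of ABSTable1
  AQuery.OnCalcFields:= ABSTable1CalcFields;
end;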
I have a scenario in which I have to export around 500,000 records from a SQL table to be used in a Delphi application. The data is to be loaded into a packed record. Is there a method by which I can use BCP to write a data file, similar to writing the records to a file?
As of now I am loading the data using this pseudocode:
// Assign the data file generated from BCP to the TextFile object.
AssignFile(losDataFile, loslFileName);
Reset(losDataFile);
while not Eof(losDataFile) do
begin
// Read from the data file until we encounter the End of File
ReadLn(losDataFile, loslDataString);
// Use the string list comma text to strip the fields
loclTempSlist.CommaText := loslDataString;
// Load the record from the items of the string list.
DummyRec.Name := loclTempSList[0];
DummyRec.Mapped := loclTempSList[1] = 'Y';
end;
For convenience i have listed the type of Dummy rec below
TDummyRec = packed record
Name : string[255];
Mapped : Boolean;
end;
So, my question is: instead of exporting the data to a text file, will it be possible to export the data to binary so that I can read from the file directly using the record type?
like
loclFileStream := TFileStream.Create('xxxxxx.dat', fmOpenRead or fmShareDenyNone);
while loclFileStream.Position < loclFileStream.Size do
begin
// Read from the binary file
loclFileStream.Read(losDummyData, SizeOf(TDummyRec));
// -------- Do whatever I want.
end;
I don't have much experience with BCP. Please help me with this.
In your record, string[255] declares a fixed-size Ansi string (a so-called shortstring). This type is deprecated and should not be used in your code: most access to it triggers a hidden conversion to string, and saving it directly with a TFileStream (even if it will work) is an awful waste of space, since each record stores 256 bytes for Name regardless of its actual length. So it is not the best option, IMHO.
My advice is to use a dynamic array for your storage, then serialize / unserialize it with our Open Source classes (they work from Delphi 5 up to XE2). You'll then be able to use a native string in the record:
TDummyRec = packed record
Name : string; // native Delphi string (no shortstring)
Mapped : Boolean;
end;
Edit after OP's comment:
BCP is just a command-line tool meant to export a lot of rows into a SQL table. So IMHO BCP is not the good candidate for your purpose.
You seem to need to import a lot of rows from a SQL table.
In this case:
Using shortstring will in any case be a waste of memory, so you'll run out of memory faster than with a plain string;
You can try our Open Source classes to retrieve the data rows one by one, then populate your records from them: see the SynDB classes - they are lighter than ADO. You'll then be able to retrieve the record data one by one and use our record serialization functions to create the binary content - or try a dedicated, faster engine like our SynBigTable;
There are some articles about directly using the OleDB feature behind BCP from Delphi code here - it is in French, but you can use Google to translate it - and here for fast bulk copy; full source code included.
You want to read a SQL table into a record; I have no idea why you are working with the archaic AssignFile.
You should really use a TADOQuery (or a suitable variant) for your database.
Put a sensible SQL-query in it; something like:
SELECT field1, field2, field3 FROM tablename WHERE .....
When in doubt you can use:
SELECT * FROM tablename
Which will select all fields from the table.
The following code will walk through all the records and all the fields, save each value as a variant, and write them all to a FileStream.
function NewFile(Filename: string): TFileStream;
begin
  // fmCreate (rather than fmOpenWrite) also creates the file when it does not exist yet
  Result:= TFileStream.Create(Filename, fmCreate);
end;
function SaveQueryToFileStream(AFile: TFileStream; AQuery: TADOQuery): boolean;
const
Success = true;
Failure = false;
UniqueFilePrefix = 'MyCustomFileTypeId';
BufSize = 4096;
var
Value: variant;
Writer: TWriter;
FieldCount: integer;
c: integer;
RowCount: integer;
begin
Result:= Success;
try
if not(AQuery.Active) then AQuery.Open;
FieldCount:= AQuery.Fields.Count;
Writer:= TWriter.Create(AFile, BufSize);
try
Writer.WriteString(UniqueFilePrefix);
//Write the record info first
Writer.WriteInteger(FieldCount);
//Write the number of rows
RowCount:= AQuery.RecordCount;
Writer.WriteInteger(RowCount);
AQuery.First;
while not(AQuery.eof) do begin
for c:= 0 to FieldCount -1 do begin
Value:= AQuery.Fields[c].Value;
Writer.WriteVariant(Value);
end; {for c}
AQuery.Next;
end; {while}
except
Result:= failure;
end;
finally
Writer.Free;
end;
end;
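For completeness, a matching reader would mirror the writer: verify the prefix, read the two counts, then read FieldCount variants per row. A rough sketch (LoadQueryDataFromFileStream is a hypothetical name; uses Classes and SysUtils):
procedure LoadQueryDataFromFileStream(AFile: TFileStream);
var
  Reader: TReader;
  FieldCount, RowCount: integer;
  r, c: integer;
  Value: variant;
begin
  Reader:= TReader.Create(AFile, 4096);
  try
    if Reader.ReadString <> 'MyCustomFileTypeId' then
      raise Exception.Create('Unexpected file format');
    FieldCount:= Reader.ReadInteger;
    RowCount:= Reader.ReadInteger;
    for r:= 0 to RowCount - 1 do
      for c:= 0 to FieldCount - 1 do
      begin
        Value:= Reader.ReadVariant; // one stored field value
        // ... copy Value into your record or grid here
      end;
  finally
    Reader.Free;
  end;
end;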