I'm trying to get glucose measurements from a glucose device (Contour One Plus) over Bluetooth LE, using the TBluetoothLE component in Delphi. I can:
- connect to the device
- discover GATT services
- discover GATT characteristics
Now I'm trying to enable notifications for the Glucose Measurement characteristic and indications for the Record Access Control Point characteristic.
I wrote two procedures for this:
notifications for Glucose Measurement:
procedure TForm6.enableGlucoseMeasurementNotification(
  const ACharacteristic: TBluetoothGattCharacteristic);
var
  ADescriptor: TBluetoothGattDescriptor;
  AValues: TBytes;
begin
  BluetoothLE1.DiscoveredDevices[0].SetCharacteristicNotification(FGlucoseMeasurementGattCharact, True);
  ADescriptor := FGlucoseMeasurementGattCharact.Descriptors[0]; // is [0] really the right one...?
  SetLength(AValues, 2);
  AValues[0] := $01; // ENABLE_NOTIFICATION, per the CCCD value format
  AValues[1] := $00;
  ADescriptor.SetValue(AValues);
  BluetoothLE1.DiscoveredDevices[0].WriteDescriptor(ADescriptor);
end;
indications for Record Access Control Point:
procedure TForm6.enableRecordAccessControlPointIndication(
  const ACharacteristic: TBluetoothGattCharacteristic);
var
  ADescriptor: TBluetoothGattDescriptor;
  AValues: TBytes;
begin
  BluetoothLE1.DiscoveredDevices[0].SetCharacteristicNotification(FRecordAccessControlPoint, True);
  ADescriptor := FRecordAccessControlPoint.Descriptors[0];
  SetLength(AValues, 2);
  AValues[0] := $02; // ENABLE_INDICATION, per the CCCD value format
  AValues[1] := $00;
  ADescriptor.SetValue(AValues);
  BluetoothLE1.DiscoveredDevices[0].WriteDescriptor(ADescriptor);
end;
I based this on what I saw in nRF Connect, but there are a few things I don't understand.
My question is: where should I run these two procedures? In OnServicesDiscovered?
Are my procedures correct?
And what is the next step to get values from the RACP? Should I just write a byte value (for example 0x01) to the RACP?
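For reference, the Glucose Profile drives the RACP with an op code followed by an operator: writing $01 (Report Stored Records) with operator $01 (All Records) asks the meter to send every stored measurement as notifications on the Glucose Measurement characteristic, finishing with an RACP indication. A minimal sketch of that write, reusing the fields from the question (and assuming both CCCD writes above have already completed):

procedure TForm6.requestAllStoredRecords;
var
  AValues: TBytes;
begin
  SetLength(AValues, 2);
  AValues[0] := $01; // op code: Report Stored Records
  AValues[1] := $01; // operator: All Records
  FRecordAccessControlPoint.Value := AValues;
  // Measurements then arrive through the TBluetoothLE notification events;
  // the RACP indication at the end reports the overall status.
  BluetoothLE1.DiscoveredDevices[0].WriteCharacteristic(FRecordAccessControlPoint);
end;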
My program keeps making new requests to a database while it's running. Now I am wondering whether I should factor out my database connection, since at the moment the code creates a new one for every query and destroys it afterwards.
procedure TMainMenu.btnInsertPersonClick(Sender: TObject);
var
  vDatabase: TADOConnection;
  vQuery: TADOQuery;
begin
  // Created with a nil owner, since the components are freed manually below
  vDatabase := TADOConnection.Create(nil);
  vQuery := TADOQuery.Create(nil);
  try
    vDatabase.ConnectionString := 'SECRET_STRING';
    vQuery.Connection := vDatabase;
    // DO QUERY STUFF HERE
  finally
    vDatabase.Close;
    vQuery.Free;
    vDatabase.Free;
  end;
end;
Maybe it's not a good idea to factor out the connection. That's why I'm happy about any help or suggestions. Cheers!
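For what it's worth, the common pattern is to open one TADOConnection for the whole application and let short-lived queries reuse it, since establishing the ADO connection is the expensive part. A minimal sketch, assuming a dedicated unit (the unit and function names are made up for illustration):

unit uSharedDb;

interface

uses
  Data.Win.ADODB;

// Returns the application-wide connection, creating it on first use
function SharedConnection: TADOConnection;

implementation

var
  FConnection: TADOConnection;

function SharedConnection: TADOConnection;
begin
  if FConnection = nil then
  begin
    FConnection := TADOConnection.Create(nil);
    FConnection.ConnectionString := 'SECRET_STRING';
    FConnection.LoginPrompt := False;
    FConnection.Open; // opened once, reused by every query
  end;
  Result := FConnection;
end;

initialization

finalization
  FConnection.Free;

end.

The button handler then only creates the cheap TADOQuery per call:

vQuery := TADOQuery.Create(nil);
try
  vQuery.Connection := SharedConnection; // reuse, don't reconnect
  // DO QUERY STUFF HERE
finally
  vQuery.Free;
end;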
Env: Oracle 12c
I'm looking at using Oracle DBMS_PIPE within a table trigger that will be used by many users. The trigger fires only on an update of STATUS, as shown below:
CREATE OR REPLACE TRIGGER MY_TRG
AFTER UPDATE OF STATUS ON "MY_TABLE"
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
DECLARE
  v_status INTEGER;
BEGIN
  IF :OLD.status = 'ERROR' AND (:NEW.status = 'OK' OR :NEW.status = 'ERROR') THEN
    DBMS_PIPE.PACK_MESSAGE(:OLD.id_key);
    DBMS_PIPE.PACK_MESSAGE(:NEW.status);
    v_status := DBMS_PIPE.SEND_MESSAGE('MY_PIPE');
    IF v_status != 0 THEN
      raise_application_error(num => -20002, msg => 'error message on trigger!');
    END IF;
  END IF;
END;
The receiving call is initiated from an Oracle APEX page process, which can be submitted concurrently by multiple users (RECEIVE_MESSAGE is a function, so its status result must be captured):
v_status := DBMS_PIPE.receive_message(pipename => 'MY_PIPE', timeout => 10);
My question is: for each user here, do I need to ensure that the pipe name is specific to that user, so that they only see their own messages within their pipe, or can the single 'MY_PIPE' pipe name handle all transactions for multiple users?
If each user needs their own designated pipe name, how would I do this, given that the SEND_MESSAGE('USER_1_PIPE') call is issued from a table trigger, and my receive_message procedure will be unaware of this 'USER_1_PIPE' name?
My pipe creation looks like this:
v_res := DBMS_PIPE.create_pipe(pipename => 'MY_PIPE', private => TRUE);
I assume I need to tag each user with their own private pipe name - is this correct?
Private pipes are private to the username that created them. If you have multiple people logging on with the same user account, then they are all going to be able to see that pipe.
But perhaps a larger issue is that pipes are not transactional. The moment that trigger fires, the message gets put in the pipe, even if that transaction later rolls back, fails, or otherwise never finally updates the status. Moreover, the pipe message is sent BEFORE the transaction commits, so another session (receiving that pipe message) will not be able to see the changes until the commit occurs, which can lead to timing inconsistencies.
Perhaps AQ (Advanced Queuing) is an alternative you might want to consider. Messages on a queue by default are transactional, so the message on the queue is then nicely bound to whether your changes to STATUS actually succeed.
The calling application just listens on a queue rather than on a pipe message.
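A minimal sketch of the AQ alternative (the queue names are made up; the calls are the standard DBMS_AQADM/DBMS_AQ API, shown with a RAW payload for brevity):

-- One-time setup: messages on this queue commit and roll back with the transaction
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(queue_table        => 'status_qt',
                                queue_payload_type => 'RAW');
  DBMS_AQADM.CREATE_QUEUE(queue_name  => 'status_q',
                          queue_table => 'status_qt');
  DBMS_AQADM.START_QUEUE(queue_name => 'status_q');
END;
/

-- In the trigger body, enqueue instead of DBMS_PIPE.SEND_MESSAGE
DECLARE
  l_enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msgid     RAW(16);
BEGIN
  DBMS_AQ.ENQUEUE(queue_name         => 'status_q',
                  enqueue_options    => l_enq_opts,
                  message_properties => l_msg_props,
                  payload            => UTL_RAW.CAST_TO_RAW(:OLD.id_key || ':' || :NEW.status),
                  msgid              => l_msgid);
END;

The APEX page process would then call DBMS_AQ.DEQUEUE with a wait timeout instead of DBMS_PIPE.RECEIVE_MESSAGE.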
How can I use FireDAC LocalSQL with TFDMemTable? Is there any working example available?
Following the Embarcadero DocWiki, I set up a local connection (using the SQLite driver) and a LocalSQL component, and connected some FireDAC memory tables to it. Then I connected an FDQuery and tried to query the memory tables. But the query always returns "table xyz not known", even if I set an explicit dataset name for the memory table in the LocalSQL dataset collection.
I suspect I am missing something fundamental that is not covered in the Embarcadero docs. If anyone has ever got this up and running, I would be grateful for some tips.
Here is some code I wrote for an answer here a while ago, which is a self-contained example of using LocalSQL, tested in D10.2. It should suffice to get you going. I seem to recall that the key to getting it working was a comment somewhere in the EMBA docs that FD's LocalSQL is based on SQLite, as you've noted.
procedure TForm3.CopyData2;
begin
  DataSource2.DataSet := FDQuery1;
  FDConnection1.DriverName := 'SQLite';
  FDConnection1.Connected := True;

  // Register the memory table with the LocalSQL engine, then activate it
  FDLocalSQL1.Connection := FDConnection1;
  FDLocalSQL1.DataSets.Add(FDMemTable1);
  FDLocalSQL1.Active := True;

  // The dataset is visible to the SQL under its component name
  FDQuery1.SQL.Text := 'select * from FDMemTable1 order by ID limit 5';
  FDQuery1.Active := True;

  FDMemTable1.Close;
  FDMemTable1.Data := FDQuery1.Data;
end;
procedure TForm3.FormCreate(Sender: TObject);
var
  i: Integer;
  MS: TMemoryStream;
begin
  FDMemTable1.CreateDataSet;
  for i := 1 to 10 do
    FDMemTable1.InsertRecord([i, 'Row:' + IntToStr(i), 10000 - i]);
  FDMemTable1.First;
  // Following is to try to reproduce a problem loading from a stream
  // noted by the OP, but it works fine
  MS := TMemoryStream.Create;
  try
    FDMemTable1.SaveToStream(MS, sfBinary);
    MS.Position := 0;
    FDMemTable1.LoadFromStream(MS, sfBinary);
  finally
    MS.Free;
  end;
end;
As you can see, you can refer in the SQL to an existing FireDAC dataset simply by using its component name.
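If you want the dataset to appear in SQL under a different name than the component name (which you mention trying), the item in the DataSets collection exposes a Name property. This is my reading of the DocWiki's Local SQL page; treat the exact collection access below as an assumption:

// Register the table, then give the collection item an explicit SQL name
FDLocalSQL1.DataSets.Add(FDMemTable1);
FDLocalSQL1.DataSets.Items[0].Name := 'MyData'; // assumed Items[] accessor
FDQuery1.SQL.Text := 'select * from MyData order by ID limit 5';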
I'm developing an application (for Win32 and WinCE) in Lazarus, using the sqldb components for data access.
The remote database is PostgreSQL (but I see the same behaviour with a local SQLite database).
The connection to PostgreSQL works perfectly, but when I open any query (even a very simple SELECT) the database session goes into a transaction: "idle in transaction".
var
  PGConnection: TPQConnection;
  PGTransaction: TSQLTransaction;
  myQuery: TSQLQuery;
begin
  PGConnection := TPQConnection.Create(self);
  PGTransaction := TSQLTransaction.Create(self);
  myQuery := TSQLQuery.Create(self);
  try
    PGConnection.HostName := '192.168.1.2';
    PGConnection.DatabaseName := 'testdb';
    PGConnection.UserName := 'test';
    PGConnection.Password := 'test';
    PGConnection.Transaction := PGTransaction;
    PGConnection.Open;
    myQuery.DataBase := PGConnection;
    myQuery.SQL.Add('SELECT 1 AS value');
    myQuery.Open; // <- this starts a transaction
    ShowMessage(myQuery.FieldByName('value').AsString); // <- db: "idle in transaction"
    myQuery.Close; // <- db: "idle in transaction"
    PGConnection.Close;
  finally
    myQuery.Free;
    PGConnection.Free;
    PGTransaction.Free;
  end;
end;
OK, maybe sqldb just works this way: every query on the database starts a transaction, so the developer must Commit or Rollback after each interrogation. But then there is another problem: when I commit the transaction, sqldb closes the query and I can't access the retrieved values:
var
  PGConnection: TPQConnection;
  PGTransaction: TSQLTransaction;
  myQuery: TSQLQuery;
begin
  PGConnection := TPQConnection.Create(self);
  PGTransaction := TSQLTransaction.Create(self);
  myQuery := TSQLQuery.Create(self);
  try
    PGConnection.HostName := '192.168.1.2';
    PGConnection.DatabaseName := 'testdb';
    PGConnection.UserName := 'test';
    PGConnection.Password := 'test';
    PGConnection.Transaction := PGTransaction;
    PGConnection.Open;
    myQuery.DataBase := PGConnection;
    myQuery.SQL.Add('SELECT 1 AS value');
    myQuery.Open; // <- this starts a transaction
    PGConnection.Transaction.Active := False; // <- closes myQuery too
    ShowMessage(myQuery.FieldByName('value').AsString); // <- Error: Field "value" not found
    myQuery.Close;
    PGConnection.Close;
  finally
    myQuery.Free;
    PGConnection.Free;
    PGTransaction.Free;
  end;
end;
This behaviour is a bit annoying: I can't use the TSQLQuery dataset with a DBGrid (since I do not want my database to stay in a transaction for too long), so I need to move the selected data into memory tables.
Is this a bug, have I made a mistake, or is this normal operation? Is there a way to open a SELECT query and use its results without keeping a transaction open?
This is currently normal behaviour.
I have planned an 'offline' mode where the transaction is closed but the data is kept open.
What you can currently do is save the data to a file (using the SaveToFile method), disconnect, and load the data again from the file (using the LoadFromFile method); see the sketch below.
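A minimal sketch of that workaround, reusing the question's variables (SaveToFile/LoadFromFile come from TBufDataset, which TSQLQuery descends from; loading into a separate TBufDataset for the grid is my assumption, not part of the answer):

var
  GridData: TBufDataset; // TBufDataset lives in the BufDataset unit
begin
  myQuery.Open;                         // starts the transaction
  myQuery.SaveToFile('result.dat');     // snapshot the fetched rows
  PGTransaction.Active := False;        // end the transaction (closes myQuery)

  GridData := TBufDataset.Create(nil);  // transaction-free copy for the DBGrid
  GridData.LoadFromFile('result.dat');
  DataSource1.DataSet := GridData;      // DataSource1 is assumed, for illustration
end;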
Consider a trigger that sets an id column to the next value of an Oracle sequence on insert. Now I want to extend it with some extra debugging actions, but these should be active only for certain connections/sessions (i.e. debugging ones).
Pseudo-Code on client side:
Open connection to oracle database
set mydebugflag=yes for that connection only
insert some stuff
close connection
Pseudo-Code on server-side (inside the trigger):
set id = someseq.next_val
if mydebugflag=yes then { do_some_extra_sanity_checks(); diagnostics();}
else: finished
How can I implement such logic in an Oracle database?
Which Oracle features should I use for this?
There are a couple of ways to do this.
One is to create a package, create a package global variable (which will have session scope) in the package, and then have your client set the package variable and your trigger read it. Something like
CREATE OR REPLACE PACKAGE pkg_debug_mode
AS
  PROCEDURE set_debug_mode( p_debug_mode IN NUMBER );

  FUNCTION get_debug_mode
    RETURN NUMBER;

  DEBUG_MODE_ON  CONSTANT NUMBER := 1;
  DEBUG_MODE_OFF CONSTANT NUMBER := 2;
END;

CREATE OR REPLACE PACKAGE BODY pkg_debug_mode
AS
  g_debug_mode NUMBER := DEBUG_MODE_ON;

  PROCEDURE set_debug_mode( p_debug_mode IN NUMBER )
  AS
  BEGIN
    g_debug_mode := p_debug_mode;
  END;

  FUNCTION get_debug_mode
    RETURN NUMBER
  IS
  BEGIN
    RETURN g_debug_mode;
  END;
END;
The client calls pkg_debug_mode.set_debug_mode to set the debug mode and the trigger calls pkg_debug_mode.get_debug_mode to determine the current debug mode for the session.
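A usage sketch (the trigger fragment reuses the placeholder procedure from the question's pseudo-code):

-- Client side: turn debugging on for this session only
BEGIN
  pkg_debug_mode.set_debug_mode(pkg_debug_mode.DEBUG_MODE_ON);
END;
/

-- Inside the trigger body
IF pkg_debug_mode.get_debug_mode = pkg_debug_mode.DEBUG_MODE_ON THEN
  do_some_extra_sanity_checks;
END IF;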
Another option is to use an application context, which the trigger can read directly via SYS_CONTEXT. First create a context that can only be set by the package:
create or replace context my_ctx using pkg_debug_mode;
CREATE OR REPLACE PACKAGE BODY pkg_debug_mode
AS
  PROCEDURE set_debug_mode( p_debug_mode IN NUMBER )
  AS
  BEGIN
    dbms_session.set_context( 'MY_CTX', 'DEBUG_MODE', p_debug_mode );
  END;

  FUNCTION get_debug_mode
    RETURN NUMBER
  IS
  BEGIN
    RETURN SYS_CONTEXT( 'MY_CTX', 'DEBUG_MODE' );
  END;
END;
Your trigger can either call the get_debug_mode function or it can directly reference the context by putting the SYS_CONTEXT call in the trigger.
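For instance, a sketch of that second variant inside the trigger (someseq, do_some_extra_sanity_checks and diagnostics are placeholders taken from the question's pseudo-code):

CREATE OR REPLACE TRIGGER my_table_id_trg
BEFORE INSERT ON my_table
FOR EACH ROW
BEGIN
  :NEW.id := someseq.NEXTVAL;
  -- SYS_CONTEXT returns VARCHAR2, so compare against '1' (DEBUG_MODE_ON)
  IF SYS_CONTEXT('MY_CTX', 'DEBUG_MODE') = '1' THEN
    do_some_extra_sanity_checks;
    diagnostics;
  END IF;
END;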