Insert new timestamp value into acc table in Kamailio - SQL

I want to add a new column to the acc table. I created a new column in the acc table of type timestamp and named it ring_time. On every call I store the ring time in a $dlg_var like this:
$dlg_var(ringtime) = $Ts;
Then I add an extra column in the config like this:
modparam("acc", "log_extra", "src_user=$fU;src_domain=$fd;src_ip=$si;" "dst_ouser=$tU;dst_user=$rU;dst_domain=$rd;ring_time=$dlg_var(ringtime)")
but when I try to test it, I always get:
db_mysql [km_dbase.c:122]: db_mysql_submit_query(): driver error on query: Incorrect datetime value: '1591361996' for column kamailio.acc.ring_time at row 1 (1292)
Jun 5 17:29:59 kamailio /usr/sbin/kamailio[22901]: ERROR: {2 102 INVITE 105a0f4a3d99a0a5558355e54b43f4e1#192.168.1.121:5060} <core> [db_query.c:244]: db_do_insert_cmd(): error while submitting query
Jun 5 17:29:59 kamailio /usr/sbin/kamailio[22901]: ERROR: {2 102 INVITE 105a0f4a3d99a0a5558355e54b43f4e1#192.168.1.121:5060} acc [acc.c:477]: acc_db_request(): failed to insert into database

Sounds like an error with the SQL INSERT query; if I had to guess, I'd say you're being caught out by the date format in the SQL table not matching the date format you're pushing to it.
I don't know the structure of your database, but there's a simple trick I use for debugging SQL queries when I can't see the query being run:
Start up Wireshark/tcpdump on the machine, packet-capture all SQL traffic (MySQL is port 3306), and replicate the error.
From the packet capture you'll be able to see the query Kamailio's database engine ran.
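Alternatively, if you have access to the MySQL server itself, its general query log will show every statement exactly as received, without any packet capture. A minimal sketch (the log file path is an assumption; adjust to your setup):
-- Enable the general query log at runtime; remember to switch it off afterwards
SET GLOBAL general_log_file = '/var/log/mysql/general.log';  -- assumed path
SET GLOBAL general_log = 'ON';
-- ... reproduce the failing INSERT from Kamailio, inspect the log, then:
SET GLOBAL general_log = 'OFF';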

Given the error "db_mysql [km_dbase.c:122]: db_mysql_submit_query(): driver error on query: Incorrect datetime value: '1591361996' for column kamailio.acc.ring_time at row 1 (1292)", the '1591361996' looks like a Unix epoch, which is what $Ts (and therefore $dlg_var(ringtime)) holds. The "Incorrect datetime value" part of the error indicates the database is trying to store that value in a datetime-style column, so it is a data type mismatch. Double-check, and you may need to either convert ringtime to a datetime before inserting, or change the database column to a type that will take an epoch.
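Not verified against your schema, but a hedged sketch of the two fixes on the MySQL side (the kamailio.acc names come from your error message):
-- Option 1: change the column so it accepts the raw epoch integer from $Ts
ALTER TABLE kamailio.acc MODIFY COLUMN ring_time INT UNSIGNED;
-- Option 2: keep ring_time as DATETIME/TIMESTAMP and convert the epoch instead;
-- for example, you can check what the failing value maps to:
SELECT FROM_UNIXTIME(1591361996);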

Related

timestamp vs TIMESTAMP_NTZ in Snowflake SQL

I am using an SQL script to parse JSON into a Snowflake table using dbt.
One of the columns contains this datetime value: '2022-02-09T20:28:59+0000'.
What's the correct way to define the ISO datetime's data type in Snowflake?
I tried date, timestamp and TIMESTAMP_NTZ like this in my dbt SQL script:
JSON_DATA:",my_date"::TIMESTAMP_NTZ AS MY_DATE
but clearly these aren't the correct one, because later on when I test it in Snowflake with select *, I get this error:
SQL Error [100040] [22007]: Date '2022-02-09T20:28:59+0000' is not recognized
or
SQL Error [100035] [22007]: Timestamp '2022-02-13T03:32:55+0100' is not recognized
so I need to know which Snowflake time/date data type suits this best.
EDIT:
This is what I am trying now.
SELECT
JSON_DATA:"date_transmission" AS DATE_TRANSMISSION
, TO_TIMESTAMP(DATE_TRANSMISSION:text, 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM') AS DATE_TRANSMISSION_TS_UTC
, JSON_DATA:"authorizerClientId"::text AS AUTHORIZER_CLIENT_ID
, JSON_DATA:"apiPath"::text API_PATH
, MASTERCLIENT_ID
, META_FILENAME
, META_LOAD_TS_UTC
, META_FILE_TS_UTC
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
I get this error:
000939 (22023): SQL compilation error: error line 6 at position 4
10:21:46 too many arguments for function [TO_TIMESTAMP(GET(DATE_TRANSMISSION, 'text'), 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM')] expected 1, g
However, if I comment out the first two lines (the ones related to the timestamp types), the other two work perfectly fine. What's the correct syntax for parsing JSON with TO_TIMESTAMP?
Note that JSON_DATA:"apiPath"::text API_PATH gives the correct value for it in my Snowflake tables.
I did some testing and it seems you have two options.
You can either get rid of the +0000 at the end: left(column_date, len(column_date)-5)
or try_to_timestamp with format
try_to_timestamp('2022-02-09T20:28:59+0000','YYYY-MM-DD"T"HH24:MI:SS+TZHTZM')
TZH and TZM are TimeZone Offset Hours and Minutes
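For instance, here is a minimal sketch of both options against your sample value (the raw_ts column name is made up):
SELECT
  TRY_TO_TIMESTAMP(LEFT(raw_ts, LEN(raw_ts) - 5)) AS ts_offset_stripped  -- relies on Snowflake auto-detecting the ISO format once the offset is gone
, TRY_TO_TIMESTAMP(raw_ts, 'YYYY-MM-DD"T"HH24:MI:SS+TZHTZM') AS ts_with_format
FROM (SELECT '2022-02-09T20:28:59+0000' AS raw_ts);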
So there are two main points here.
First, when getting data from JSON to pass to any of the timestamp functions: those functions want a ::TEXT argument, but values pulled from JSON are still ::VARIANT, so they need to be cast. This is the cause of the error you quote:
(22023): SQL compilation error: error line 6 at position 4
10:21:46 too many arguments for function [TO_TIMESTAMP(GET(DATE_TRANSMISSION, 'text'), 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM')] expected 1, g
Also, your SQL is wrong there; it should have been:
TO_TIMESTAMP(DATE_TRANSMISSION::text,
Second, how you handle the timezone format. As others have noted (and as I did on your last question), you have to decide whether to ignore the timezone values or read them. I forgot about the TZHTZM formatting. Given you have timezone data, you should use TO_TIMESTAMP_TZ / TRY_TO_TIMESTAMP_TZ to make sure the time zone data is kept, given your second example shows +0100.
Putting those together (assuming you didn't want an extra date_transmission as a variant in your data):
SELECT
TO_TIMESTAMP_TZ(JSON_DATA:"date_transmission"::text, 'YYYY-MM-DDTHH24:MI:SS+TZHTZM') AS DATE_TRANSMISSION_TS_UTC
, JSON_DATA:"authorizerClientId"::text AS AUTHORIZER_CLIENT_ID
, JSON_DATA:"apiPath"::text AS API_PATH
, MASTERCLIENT_ID
, META_FILENAME
, META_LOAD_TS_UTC
, META_FILE_TS_UTC
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
You should use timestamp (not date, which does not store the time information), but the format you are using is probably not auto-detected. You can specify the input format as YYYY-MM-DD"T"HH24:MI:SSTZHTZM. The auto-detected one has a : between TZH and TZM.
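If you'd rather not repeat the format string in every call, one possibility is to set it at the session level via Snowflake's TIMESTAMP_INPUT_FORMAT parameter; a sketch (untested):
-- Make the ISO-with-offset format the session default, then plain casts work
ALTER SESSION SET TIMESTAMP_INPUT_FORMAT = 'YYYY-MM-DD"T"HH24:MI:SSTZHTZM';
SELECT '2022-02-09T20:28:59+0000'::TIMESTAMP_TZ;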

Failed to transfer data from GCS to BigQuery table

Need help with DTS (the BigQuery Data Transfer Service).
After creating a table "allorders" with an autodetected schema, I created a data transfer. But when I ran the DTS I got an error; see the job below. The quantity field type is definitely set to INTEGER, and all the data in that field are whole numbers.
Job bqts_602c3b1a-0000-24db-ba34-30fd38139ad0 (table allorders) failed
with error INVALID_ARGUMENT: Error while reading data, error message:
Could not parse 'quantity' as INT64 for field quantity (position 14)
starting at location 0 with message 'Unable to parse'; JobID:
956421367065:bqts_602c3b1a-0000-24db-ba34-30fd38139ad0
When I recreated the table and set all fields to type STRING, it worked fine. See the job below:
Job bqts_607cef13-0000-2791-8888-001a114b79a8 (table allorders)
completed successfully. Number of records: 56017, with errors: 0.
Try to find the unparseable values in the table with the all-string fields:
SELECT *
FROM dataset.allorders
WHERE SAFE_CAST(quantity AS INT64) IS NULL;
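Once the offending rows are found and cleaned up, a hedged sketch for materializing a typed table from the all-string staging table, using BigQuery's SELECT * REPLACE (names other than allorders/quantity are assumptions):
-- Hypothetical: build a typed copy, coercing quantity and keeping only rows that parse
CREATE OR REPLACE TABLE dataset.allorders_typed AS
SELECT * REPLACE (SAFE_CAST(quantity AS INT64) AS quantity)
FROM dataset.allorders
WHERE quantity IS NULL OR SAFE_CAST(quantity AS INT64) IS NOT NULL;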

Sybase Database Error: Invalid data conversion

I need help with this query. I'm doing some calculations based on some results, but I'm receiving the following error:
Sybase Database Error: Invalid data conversion.
The query is as follows:
SELECT
DC.DIM_DATE.DATE_ID,
DC.DIM_E_RAN_UCELL.RBS_ID,
(SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestDchPsIntRabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumFachPsIntRabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestPsHsAdchRabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestPsEulRabEstablish))/720 AS '3G_DATA_ERLANG',
(SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestCs12Establish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestAmr12200RabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestAmr7950RabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestAmr5900RabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestAmr4750RabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestAmrWbRabEstablish)+SUM(DC.DC_E_RAN_UCELL_RAW.pmSumBestAmrNbMmRabEstablish))/720 AS '3G_SPEECH_ERLANG',
(100*(SUM(DC.DC_E_RAN_UCELL_RAW.pmTotNoRrcConnectReqCsSucc)/(1+SUM(DC.DC_E_RAN_UCELL_RAW.pmTotNoRrcConnectReqCs)-SUM(DC.DC_E_RAN_UCELL_RAW.pmNoLoadSharingRrcConnCs))*(SUM(DC.DC_E_RAN_UCELL_RAW.pmNoRabEstablishSuccessSpeech)+SUM(DC.DC_E_RAN_UCELL_RAW.pmNoRabEstablishSuccessCs64)+SUM(DC.DC_E_RAN_UCELL_RAW.pmRabEstablishEcSuccess))/(SUM(DC.DC_E_RAN_UCELL_RAW.pmRabEstablishEcAttempt)+SUM(DC.DC_E_RAN_UCELL_RAW.pmNoRabEstablishAttemptSpeech)+SUM(DC.DC_E_RAN_UCELL_RAW.pmNoRabEstablishAttemptCs64)-SUM(DC.DC_E_RAN_UCELL_RAW.pmNoDirRetryAtt)))) AS '3G_CSSR_CS'
FROM
DC.DIM_DATE,
DC.DIM_TIME,
DC.DIM_E_RAN_UCELL,
DC.DC_E_RAN_UCELL_RAW
WHERE
(DC.DC_E_RAN_UCELL_RAW.HOUR_ID=DC.DIM_TIME.HOUR_ID and DC.DC_E_RAN_UCELL_RAW.MIN_ID=DC.DIM_TIME.MIN_ID)
AND (DC.DC_E_RAN_UCELL_RAW.DATE_ID=DC.DIM_DATE.DATE_ID)
AND (DC.DC_E_RAN_UCELL_RAW.OSS_ID=DC.DIM_E_RAN_UCELL.OSS_ID)
AND (DC.DC_E_RAN_UCELL_RAW.RNC=DC.DIM_E_RAN_UCELL.RNC_ID)
AND (DC.DC_E_RAN_UCELL_RAW.UtranCell=DC.DIM_E_RAN_UCELL.UCELL_ID)
AND
(
DC.DIM_DATE.DATE_ID IN ('2017-08-14')
AND
DC.DIM_E_RAN_UCELL.RBS_ID IN ('DN1U0441')
)
GROUP BY
DC.DIM_DATE.DATE_ID,
DC.DIM_E_RAN_UCELL.RBS_ID
The problem is with the last row of the SELECT statement; without it, the result is this:
DATE_ID RBS_ID 3G_DATA_ERLANG 3G_SPEECH_ERLANG
8/14/2017 DN1U0441 421.8541 33.5055
When it is included, I get this error:
Lookup Error - Sybase Database Error: Invalid data conversion
Any help? It is a Sybase ASE database.
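One pattern worth checking in expressions like the 3G_CSSR_CS one: Sybase ASE performs integer arithmetic on integer columns, so large SUMs can overflow and the implicit conversion in the final expression can fail. A hedged sketch of the usual workaround, casting before the arithmetic (simplified to one ratio; untested against your schema):
-- Hypothetical: force numeric arithmetic inside the ratio
SELECT 100.0 * (CAST(SUM(pmTotNoRrcConnectReqCsSucc) AS NUMERIC(20, 6))
       / (1 + SUM(pmTotNoRrcConnectReqCs) - SUM(pmNoLoadSharingRrcConnCs)))
FROM DC.DC_E_RAN_UCELL_RAW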

How to fix error "Conversion failed when converting datetime from character string" in SQL Query?

The error that I found in the log is the one below.
'Illuminate\Database\QueryException' with message 'SQLSTATE[22007]:
[Microsoft][ODBC Driver 11 for SQL Server][SQL Server]Conversion
failed when converting date and/or time from character string. (SQL:
SELECT COUNT(*) AS aggregate
FROM [mytable]
WHERE [mytable].[deleted_at] IS NULL
AND [created_at] BETWEEN '2015-09-30T00:00:00' AND '2015-09-30T23:59:59'
AND ((SELECT COUNT(*) FROM [mytable_translation]
WHERE [mytable_translation].[item_id] = [mytable].[id]) >= 1)
)'
in
wwwroot\myproject\vendor\laravel\framework\src\Illuminate\Database\Connection.php:625
In the database, the data type is datetime and NOT NULL.
Based on marc_s's answer I tried to change the format that I'm sending to the database. So I tried without the T, as in [created_at] BETWEEN '2015-09-30 00:00:00' AND '2015-09-30 23:59:59'.
Locally I'm using MySQL, and the code works just fine. If I test the query above in a SQL Server client, both versions (with and without the T) work too.
How can I fix this problem without making any changes to the database itself?
The PHP/Laravel code:
$items = $items->whereBetween($key, ["'".$value_aux."T00:00:00'", "'".$value_aux."T23:59:59'"]);
With #lad2025's help, I got it to work.
Based on his comments on my question, I changed the format I was passing on the code side (Laravel/PHP in this case). In reality, I "removed" the format itself and just assigned the fields to variables before passing them to the query; this way, I let the database decide the format it wants.
Instead of
$itens = $itens->whereBetween($key, ["'".$value_aux."T00:00:00'", "'".$value_aux."T23:59:59'"]);
I changed the code to this:
$sta = $value_aux."T00:00:00";
$end = $value_aux."T23:59:59";
$itens = $itens->whereBetween($key, [$sta, $end]);
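For reference, the likely difference at the SQL level: with the hand-built quotes, the bound values carried literal apostrophes (e.g. '2015-09-30T00:00:00' including the quote characters), which SQL Server cannot convert to datetime. Without them, the query it receives is effectively:
-- What SQL Server should receive once the extra quotes are gone
SELECT COUNT(*) AS aggregate
FROM [mytable]
WHERE [mytable].[deleted_at] IS NULL
AND [created_at] BETWEEN '2015-09-30T00:00:00' AND '2015-09-30T23:59:59';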

Update date within a table, PostgreSQL

So I'm having trouble understanding how to change the date in an UPDATE in Postgres. What I have currently, which is giving a syntax error, is
UPDATE works_locations SET (wrl_startdate = '2014-09-07', wrl_enddate = '2015-02-06')
with a few conditions determining which rows I should specifically change. However, Postgres is giving me an error. How do I successfully change the date in Postgres, even if the start date is around two years prior to this entry?
I don't have Postgres installed so I can't test this, but try removing the parentheses in your SET clause so that it looks like this:
UPDATE works_locations SET wrl_startdate = '2014-09-07', wrl_enddate = '2015-02-06'
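Worth noting: without a WHERE clause, that statement updates every row in works_locations. A small sketch with a hypothetical key column (wrl_id is an assumption; use whatever conditions identify your rows):
UPDATE works_locations
SET wrl_startdate = '2014-09-07',
    wrl_enddate = '2015-02-06'
WHERE wrl_id = 42;  -- hypothetical key column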