How to find time zone set in Panasonic EXIF - camera

The Panasonic Lumix GX8 doesn't have a GPS, so generally it doesn't know where it is. However, the time can be set manually, along with a time zone. But I can't figure out whether that information is available in the EXIF data.
TimeStamp shows UTC time, and all the other time-related EXIF data is in local time (whatever time is set on the camera). So the camera is storing the offset, but I want to get it directly.
In Ruby
require 'mini_exiftool'
photo = MiniExiftool.new(fn) # fn is a filename
timeZone = (photo.CreateDate - photo.TimeStamp)/3600
puts "timeZone: #{timeZone}"
works as expected. But can the offset be found directly via EXIF?
exiftool.org has
Panasonic TimeInfo Tags
Index1   Tag Name            Writable   Values / Notes
0        PanasonicDateTime   undef[8]
but PanasonicDateTime is null for the GX8.
exiftool -G1 -a -s -time:all "file.rw2"
[System] FileModifyDate : 2021:12:05 18:40:45-08:00
[System] FileAccessDate : 2021:12:05 19:17:28-08:00
[System] FileInodeChangeDate : 2021:12:05 18:40:45-08:00
[IFD0] ModifyDate : 2021:11:25 15:31:34
[ExifIFD] DateTimeOriginal : 2021:11:25 15:31:34
[ExifIFD] CreateDate : 2021:11:25 15:31:34
[Panasonic] TimeStamp : 2021:11:25 22:31:34
[ExifIFD] SubSecTime : 277
[ExifIFD] SubSecTimeOriginal : 277
[ExifIFD] SubSecTimeDigitized : 277
[GPS] GPSTimeStamp : 22:31:34
[GPS] GPSDateStamp : 2021:11:25
[ExifIFD] DateTimeOriginal : 2021:11:25 15:31:34
[ExifIFD] CreateDate : 2021:11:25 15:31:34
[ExifIFD] SubSecTimeOriginal : 277
[ExifIFD] SubSecTimeDigitized : 277
[GPS] GPSTimeStamp : 22:31:34
[GPS] GPSDateStamp : 2021:11:25
[Composite] SubSecCreateDate : 2021:11:25 15:31:34.277
[Composite] SubSecDateTimeOriginal : 2021:11:25 15:31:34.277
[Composite] SubSecModifyDate : 2021:11:25 15:31:34.277
[Composite] GPSDateTime : 2021:11:25 22:31:34Z
[Composite] GPSDateTime : 2021:11:25 22:31:34Z

EXIF supports UTC offsets:
9010.H "OffsetTime" (to augment 132.H "DateTime")
9011.H "OffsetTimeOriginal" (to augment 9003.H "DateTimeOriginal")
9012.H "OffsetTimeDigitized" (to augment 9004.H "DateTimeDigitized")
Source: CIPA's Exif standard 2.31 from 2016, page 49 of the printed numbering (page 54 of the PDF).
Your output already shows other time-augmentation fields: SubSecTime, SubSecTimeOriginal and SubSecTimeDigitized are likewise optional and add sub-second precision. It is up to the EXIF writer to turn the camera's time-zone setting into a UTC offset and write these additional fields. If the Panasonic firmware or software is not writing them into the official EXIF fields, the remaining option is to raise it with their support.
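If the offset tags have been written, they can be read directly; otherwise the offset can still be derived from the local DateTimeOriginal and the UTC GPSDateTime, just as the Ruby snippet above does. A minimal sketch in Python that shells out to exiftool (tag names as in the output above; whether OffsetTimeOriginal exists depends on the firmware):
import json
import subprocess
from datetime import datetime

def utc_offset(path):
    # Ask exiftool for the relevant tags as JSON.
    result = subprocess.run(
        ["exiftool", "-json", "-OffsetTimeOriginal", "-DateTimeOriginal", "-GPSDateTime", path],
        capture_output=True, text=True, check=True,
    )
    tags = json.loads(result.stdout)[0]

    # Preferred: the official EXIF 2.31 tag, a string such as "-08:00".
    if "OffsetTimeOriginal" in tags:
        return tags["OffsetTimeOriginal"]

    # Fallback: local capture time minus UTC time, in hours,
    # mirroring the Ruby snippet above.
    local = datetime.strptime(tags["DateTimeOriginal"], "%Y:%m:%d %H:%M:%S")
    utc = datetime.strptime(tags["GPSDateTime"].rstrip("Z"), "%Y:%m:%d %H:%M:%S")
    return (local - utc).total_seconds() / 3600

print(utc_offset("file.rw2"))  # -7.0 for the sample output above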
See also: Why don't Exif tags contain time zone information?

Related

Extract a date from a string in Robot Framework

${my-string} "Fête : Anniversaire
Emplacement : Paris
Date : 08/12/2021
Prix : texte"
I need to extract the date from this text, but I can't come up with a solution using Robot Framework. I tried:
${REGULAR EXPRESSION} \d{2}\/\d{2}\/\d{4}
Get Regexp Matches ${my-string} ${REGULAR EXPRESSION}
But it gave me null as a result. Any solution for that, please?
I think because the backslash is treated as an escape character in Python/Robot Framework, you need an additional backslash on each occasion you use it in your regex,
e.g. \\d{2}\\/\\d{2}\\/\\d{4}
So something like this should work:
*** Settings ***
Library    String

*** Test Cases ***
Check Date Regex
    ${date}    Set Variable    Fête : Anniversaire Emplacement : Paris Date : 08/12/2021 Prix : texte
    ${regex}    Set Variable    \\d{2}\\/\\d{2}\\/\\d{4}
    ${matches}    Get Regexp Matches    ${date}    ${regex}
    ${match}    Set Variable    ${NONE}
    IF    ${matches}
        ${match}    Set Variable    ${matches}[0]
    END
    Log To Console    ${match}
With output:
08/12/2021
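For comparison, the same pattern used directly in Python (with no Robot Framework escaping layer in between) needs only single backslashes; a quick illustration:
import re

text = "Fête : Anniversaire Emplacement : Paris Date : 08/12/2021 Prix : texte"

# In plain Python the pattern needs no doubled backslashes.
match = re.search(r"\d{2}/\d{2}/\d{4}", text)
print(match.group(0) if match else None)  # 08/12/2021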

How to convert String to Timestamp in Kafka Connect using transforms and insert into Postgres using the Confluent JDBC sink connector?

I am using confluent-6.0.1. Below is my kafka-connect-sink.properties file:
name=enba-sink-postgres
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
connection.url=jdbc:postgresql://IP:PORT/DB
connection.user=USERNAME
connection.password=PASSWORD
tasks.max=1
topics=postgresInsert
insert.mode=INSERT
table.name.format=schema."tableName"
auto.create=false
key.converter.schema.registry.url=http://localhost:8081
key.converter.schemas.enable=false
value.converter.schemas.enable=false
config.action.reload=restart
value.converter.schema.registry.url=http://localhost:8081
errors.tolerance=all
errors.log.enable=true
errors.log.include.messages=true
print.key=true
# Transforms
transforms=TimestampConverter
transforms.TimestampConverter.type=org.apache.kafka.connect.transforms.TimestampConverter$Value
transforms.TimestampConverter.format=yyyy-MM-dd HH:mm:ss
transforms.TimestampConverter.target.type=Timestamp
transforms.TimestampConverter.target.field=DATE_TIME
I am using Avro data and the schema is:
{\"type\":\"record\",\"name\":\"log\",\"namespace\":\"transform.name.space\",\"fields\":[{\"name\":\"TRANSACTION_ID\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"MSISDN\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"TRIGGER_NAME\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"W_ID\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"STEP\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"REWARD_ID\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"CAM_ID\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"STATUS\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"COMMENTS\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"CCR_JSON\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"},{\"name\":\"DATE_TIME\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"},\"avro.java.string\":\"String\"}]}
The DATE_TIME column in Postgres is of type timestamp, and from Avro I tried sending the date both as a String and as a long.
DATE_TIME = 2022-12-15 14:38:02
The issue is: if I don't use the transform, I get this error:
ERROR: column "DATE_TIME" is of type timestamp with time zone but expression is of type character varying
And if I use the transform as configured above, the error is:
[2021-02-06 21:47:41,897] ERROR Error encountered in task enba-sink-postgres-0. Executing stage 'TRANSFORMATION' with class 'org.apache.kafka.connect.transforms.TimestampConverter$Value', where consumed record is {topic='enba', partition=0, offset=69, timestamp=1612628261605, timestampType=CreateTime}. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
org.apache.kafka.connect.errors.ConnectException: Schema Schema{com.package.kafkaconnect.Enbalog:STRUCT} does not correspond to a known timestamp type format
I got it working using:
# Transforms
transforms= timestamp
transforms.timestamp.type= org.apache.kafka.connect.transforms.TimestampConverter$Value
transforms.timestamp.target.type= Timestamp
transforms.timestamp.field= DATE_TIME
transforms.timestamp.format= yyyy-MM-dd HH:mm:ss
For some reason transforms=TimestampConverter was not working. (Likely because the first attempt used target.field; TimestampConverter's property is field, so no field was configured and the SMT tried to convert the whole record value, which is what the "does not correspond to a known timestamp type format" error points at.)

Numbers in Xtext not accepted

I have the following rule:
terminal MIDI_VALUE:
( '0'..'9') |
( '1'..'9' '0'..'9') |
('1' '0'..'1' '0'..'9') |
('1' '2' '0'..'7');
This rule is meant to read values from [0..127].
However, it does not accept values from [1..16], while 0 and 17 to 127 are accepted.
When I hover over the error I get:
mismatched input: '16' expecting RULE_MIDI_VALUE.
How can I fix this?
2nd Example
This example is maybe even more trivial:
DmxDelayTimeSubCommand:
'DelayTime' time=Time;
Time:
time=INT type=('ms' | 's' );
The input
AllFrontBlue AllGroupsAll Mode loop DelayTime 255 ms;
shows an error on 255; hovering over it gives:
Mismatched input '255' expecting RULE_INT
even though INT is a predefined terminal:
terminal INT returns ecore::EInt: ('0'..'9')+;
I get this error for all values below 256 (all values from [0..255]).
The terminal rules MIDI_VALUE and INT overlap with each other. The lexer assigns each token to exactly one terminal rule without looking at the parser context, so a number can only ever be lexed as one of them, never both.
Possible solutions:
use INT plus a validator (for all of them)
use a datatype rule like MIDI_CHANNEL: INT (no terminal keyword) plus a value converter
use terminal rules that don't overlap and datatype rules MIDI_CHANNEL: TERMINAL1|TERMINAL2|...

Query documents using Pymongo Datetime

When I run the query below in MongoDB Compass, I get the documents I want:
{"save_date" : { "$gte" : new Date("2019-02-25T08:01:59"),"$lte":new Date("2019-02-25T08:02:59")}}
But when I use PyMongo, I get 0 documents. I have tried the two versions below and neither of them gives me the data I want. Any idea what is missing?
import datetime
from pymongo import MongoClient

connection = MongoClient("With all the connection parameters")
start = datetime.datetime.strptime(start_date, '%m-%d-%Y %H:%M:%S')
end = datetime.datetime.strptime(end_date, '%m-%d-%Y %H:%M:%S')
connection.database.collection.find({ "save_date": {"$gte" : start, "$lte": end}}).count()
returned 0 documents
The version below returned 0 documents as well:
connection.database.collection.find({ "save_date": {"$gte" : datetime.datetime(2019,2,25,8,1,59), "$lte": datetime.datetime(2019,2,25,8,2,59)}}).count()
I found the issue. I was passing the dates as local date and time, and there were no documents in that range, hence 0 documents returned. Now I convert my local date to a UTC date before passing it to PyMongo, which works and returns my documents:
import time

local_start = "2019-02-25 08:01:59"
start = time.strftime("%Y-%m-%d %H:%M:%S",
                      time.gmtime(time.mktime(time.strptime(local_start,
                                                            "%Y-%m-%d %H:%M:%S"))))

What does " Cannot read tablet : Incompatible types" mean?

We ran this query on BigQuery:
SELECT DateTime, Source, MachineName, LogLevel, Identifier, Message, Exception
FROM TABLE_DATE_RANGE(XXXX.EventLog_, TIMESTAMP(Current_Date()), TIMESTAMP(Current_Date()))
Where source like 'Sync' and (MachineName like 'WEBNEW' or Identifier like 'WEBNEW')
Order by DateTime desc
LIMIT 100;
It gave us:
Error: Cannot read tablet : Incompatible types. 'DateTime' : TYPE_MESSAGE 'DateTime' : TYPE_INT64
Job ID: red-road-574:job_t5gM9MysBFi20PFZ88kgTO8ygvQ
When we removed "Order by DateTime desc", the query ran fine.
We wonder why, and how to fix it.
This was a transient issue in BigQuery; everything should be working normally now.