I am trying to extract GPS metadata from hex following these tutorials, but cannot understand why, at the end, the latitude and longitude have length 24 and values 42 and 73:
http://itbrigadeinc.com/post/2012/03/06/Anatomy-of-a-JPG-image.aspx
http://www.itbrigadeinc.com/post/2012/03/16/Seeing-the-EXIF-data-for-a-JPG-image.aspx
I found the latitude and longitude tags (00 02 00 05 00 00 00 03 00 00 02 42) and (00 04 00 05 00 00 00 03 00 00 02 5A). As I understood it, if the count is 3, then the values of both of them should follow in the last 4 bytes of each tag, but 02 42 and 02 5A are not "42" and "73"...
Could someone explain to me what is wrong?
Please don't recommend any tools; I need to do this manually.
You also need to consider the size of each value. The count is three, but each value is larger than one byte. Therefore the values won't fit in the four bytes, and those four bytes represent an offset to the value instead.
GPS data is usually stored as three rational numbers, where each rational number is two 32-bit integers (numerator, denominator). Therefore you have three values for latitude, but each is 8 bytes. The 24 bytes won't fit within the 12-byte tag entry, so they are stored somewhere else in the file, and the four bytes you're seeing are a pointer to them. You need to look into the spec to find out what that pointer is relative to, as it's probably not the start of the file (in EXIF, offsets are measured from the start of the TIFF header).
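To make the arithmetic concrete, here is a minimal Python sketch of that lookup. It assumes big-endian ("MM") byte order and offsets measured from the start of the TIFF header, as in standard EXIF; the function name read_gps_coordinate and the byte string tiff (starting at that header) are hypothetical:

    import struct

    def read_gps_coordinate(tiff, entry_offset):
        # Each IFD entry is 12 bytes: tag (2), type (2), count (4), value/offset (4).
        tag, typ, count = struct.unpack_from(">HHI", tiff, entry_offset)
        # Type 5 = RATIONAL: two unsigned 32-bit ints (numerator, denominator).
        # count = 3 means 3 * 8 = 24 bytes, too big for the 4-byte value field,
        # so that field holds an offset to the data (here 0x242), not the data.
        assert typ == 5 and count == 3
        value_offset = struct.unpack_from(">I", tiff, entry_offset + 8)[0]
        parts = []
        for i in range(count):
            num, den = struct.unpack_from(">II", tiff, value_offset + 8 * i)
            parts.append(num / den)
        return parts  # [degrees, minutes, seconds]

In other words, 00 00 02 42 is not the value 42; it is the offset 0x242, and the degrees value 42 only appears once you read the first rational stored there (likewise 73 at offset 0x25A for the longitude). Notice the two offsets differ by exactly 0x18 = 24 bytes, the size of the latitude's data, which is consistent with this reading.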
Check out my metadata extractor libraries (in Java and C#) for reference.
Apparently that 24-byte value uses the data type PropertyTagTypeRational:
https://msdn.microsoft.com/en-us/library/ms534414(v=vs.85).aspx
Specifies that the value data member is an array of pairs of unsigned long integers. Each pair represents a fraction; the first integer is the numerator and the second integer is the denominator.
Mostly taken from: Getting GPS data from an image's EXIF in C#
This bit of Python code might also offer a good hint at how to decode the data: http://eran.sandler.co.il/2011/05/20/extract-gps-latitude-and-longitude-data-from-exif-using-python-imaging-library-pil/
Related
Running a search in phpMyAdmin for an IP address to unblock from a WordPress plug-in, I get this on one of the tables:
Warning: #1300 Invalid utf8 character string: '\x8B\x08\x00\x00\x00\x00\x00\x00\x03\x14\xD6y8\x15\xEF\x17\x0...'
Warning: #1300 Invalid utf8 character string: '\x8B\x08\x00\x00\x00\x00\x00\x00\x03\x00\x1E\x80\xE1\x7Fa:2:{...'
I tried to search for part of the strings, but cannot find where they are in the db.
These look suspicious to me; I've had some SQL injection compromises in the past and I fear that's what this may indicate.
How do I track down where these strings actually are in the db if I can't find them with the phpMyAdmin search?
Thank you.
Those look like gzip headers which are missing their leading \x1f. I expect it's there but not part of the warning because \x1f is a valid UTF-8 character but \x8b is not.
1F | 2-byte magic number of a gzip file
8B |
08 | compression method (08 = deflate)
00 | 1-byte header flags (00 = it's probably compressed text)
00 | 4-byte timestamp
00 |
00 |
00 |
00 | extra flags
03 | operating system (03 = Unix)
After that, data begins.
Something is trying to read gzipped text as UTF-8.
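A quick sanity check along those lines, as a Python sketch (the bytes shown in the warning are truncated, so this only verifies the header rather than decompressing anything):

    import gzip

    blob = b"\x8b\x08\x00\x00\x00\x00\x00\x00\x03"  # start of the first warning string
    data = b"\x1f" + blob                           # restore the missing leading byte

    print(data[:2] == b"\x1f\x8b")  # True -> gzip magic number
    print(data[2] == 0x08)          # True -> deflate compression method
    # With the complete column contents you could recover the original text:
    # text = gzip.decompress(full_bytes).decode("utf-8")

Here full_bytes is a placeholder for the untruncated value from the database.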
I am trying to interface with my internet-connected gas fire. The manufacturer has told me that I can communicate with it on UDP port 3300.
He says I can send a packet with the information "SEARCH_FOR_FIRES" to the local subnet address to receive a response.
The packets should be composed of 15 bytes, as follows:
Byte 1: StartByte(0x47 'G')
Byte 2: Command ID
Byte 3: DataSize
Byte 4-13: Data
Byte 14: CRC
Byte 15: End Byte (0x46 'F')
They give 0x473100000000000000000000003146 as an example. 31 is the command ID for the "SEARCH_FOR_FIRES" command.
The only problem is I have no idea how to create these packets... I'm using the Windows version of Packet Sender, and it gives me the option of inputting ASCII or HEX values. So far I have:
HEX: 47 31 00 03 01 46
ASCII: G1\00\03\01F
Neither of them seems to work, and I don't know how to find the HEX equivalent of 0x473100000000000000000000003146.
Can someone help?
Well, that may sound weird, but the hex equivalent of 0x473100000000000000000000003146 is... 0x473100000000000000000000003146 itself :) The "0x" prefix stands for hex representation and is followed by hex digits, so you need to pass "47 31 00 00 00 00 00 00 00 00 00 00 00 31 46" to Packet Sender.
By the way, do you know what to expect from the device on a successful send? Should the device give some noticeable indication that it processed the "SEARCH_FOR_FIRES" command? It's possible that the device will silently send a report back in a UDP packet, so you may need to set up network capturing (e.g. Wireshark) to see and analyze the response.
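If Packet Sender keeps misbehaving, a minimal Python sketch like the following sends the same 15 bytes. It assumes the device listens on UDP port 3300 and that the broadcast address 255.255.255.255 reaches your local subnet; adjust it for your network:

    import socket

    packet = bytes.fromhex("473100000000000000000000003146")  # the documented example

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(5)
    sock.sendto(packet, ("255.255.255.255", 3300))

    try:
        data, addr = sock.recvfrom(1024)  # wait for a reply from the fire
        print(addr, data.hex())
    except socket.timeout:
        print("no response")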
I'm learning about endianness, and I read that Intel processors are usually little-endian. I'm on an Intel Mac and thought I'd try it for myself to see it in action. I define a uint32_t and then try to print out the 4 bytes as they are ordered in memory.
uint32_t integer = 1027;
uint8_t * bytes = (uint8_t*)&integer;
printf("%04x\n", integer);
printf("%x%x%x%x\n", bytes[0], bytes[1], bytes[2], bytes[3]);
Output:
0403
3400
I expected to see the bytes either in reverse order (3040) or unchanged, but the output is neither. What am I missing?
I'm actually compiling it as Objective-C using Xcode, if that makes any difference.
Because data is stored in units of bytes (8 bits) on today's typical computers.
On machines that use little endian, the first byte is 03, the second byte is 04, and the third and fourth bytes are 00.
In your second line of output, the first digit 3 represents the first byte (0x03 with its leading zero dropped) and the digit 4 represents the second byte (0x04). To show each byte with 2 digits, specify a field width in the format string, like
printf("%02x%02x%02x%02x\n", bytes[0], bytes[1], bytes[2], bytes[3]);
That is endianness.
There are two different approaches for storing data in memory. Little endian and big endian.
In big endian the most significant byte is stored first.
In little endian the least significant byte is stored first.
Your system is little endian, as the data is stored as
03 04 00 00
On a big endian system, it would have been
00 00 04 03
For printing, use %02x to get the leading zeros printed.
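If it helps to see both orders side by side, this small Python sketch packs the same value both ways (struct lets you choose the byte order explicitly instead of inheriting the host's):

    import struct

    value = 1027  # 0x00000403
    print(struct.pack("<I", value).hex())  # little-endian: 03040000
    print(struct.pack(">I", value).hex())  # big-endian:    00000403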
I'm writing a cookie authentication library that replicates that of an existing system. I'm able to create authentication tokens that work. However, testing with a token of known value created by the existing system, I encountered the following puzzle.
The original encoded string purports to be base64url encoded. And, in fact, using any of several base64url code modules and online tools, the decoded value is the expected result.
However, base64url encoding the decoded value (again using any of several tools) doesn't reproduce the original string. Both encoded strings decode to the expected results, so apparently both representations are valid.
How? What's the difference?
How can I replicate the original encoded results?
original encoded string: YWRtaW46NTVGRDZDRUE6vtRbQoEXD9O6R4MYd8ro2o6Rzrc
my base64url decode: admin:55FD6CEA:[encrypted hash]
Encoding doesn't match original but the decoded strings match.
my base64url encode: YWRtaW46NTVGRDZDRUE677-977-9W0Lvv70XD9O6R--_vRh377-977-92o7vv73Otw
my base64url decode: admin:55FD6CEA:[encrypted hash]
(Sorry, SSE won't let me show the unicode representation of the hash. I assure you, they do match.)
This string:
YWRtaW46NTVGRDZDRUE6vtRbQoEXD9O6R4MYd8ro2o6Rzrc
is not exactly valid Base64. Valid Base64 consists of a sequence of characters among uppercase letters, lowercase letters, digits, '/' and '+'; it must also have a length which is a multiple of 4; one or two final '=' signs may appear as padding so that the length is indeed a multiple of 4. This string contains only Base64-valid characters, but only 47 of them, and 47 is not a multiple of 4. With an extra '=' sign at the end, it becomes valid Base64.
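In Python, for example, one line of padding arithmetic makes it decode (a sketch using the standard base64 module):

    import base64

    s = "YWRtaW46NTVGRDZDRUE6vtRbQoEXD9O6R4MYd8ro2o6Rzrc"
    padded = s + "=" * (-len(s) % 4)       # 47 chars -> one '=' brings it to 48
    print(base64.b64decode(padded)[:15])   # b'admin:55FD6CEA:'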
That string:
YWRtaW46NTVGRDZDRUE677-977-9W0Lvv70XD9O6R--_vRh377-977-92o7vv73Otw
is not valid Base64. It contains several '-' and one '_' sign, neither of which should appear in a Base64 string. If some tool is decoding that string into the "same" result as the previous string, then the tool is not implementing Base64 at all, but something else (and weird).
I suppose that your strings got garbled at some point through some copy&paste mishap, maybe related to a bad interpretation of bytes as characters. This is the important point: bytes are NOT characters.
It so happens that, traditionally, computers got into the habit of using so-called "code pages", which were direct mappings of characters onto bytes, with each character encoded as exactly one byte. Thus came into existence tools (such as Windows' notepad.exe) that purport to do the inverse, i.e. show the contents of a file (nominally, some bytes) as their character counterparts. This, however, fails when the bytes are not "printable characters": while a code page such as Windows-1252 maps each character to a byte value, there can be byte values that are not the mapping of any printable character. It began to fail even more when people finally realized that there are only 256 possible byte values, and a lot more possible characters, especially when considering Chinese.
Unicode is an evolving standard that maps characters to code points (i.e. numbers), with a bit more than 100000 currently defined. Some encoding rules (there are several of them, the most frequent being UTF-8) then encode the characters into bytes. Crucially, one character can be encoded over several bytes.
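For example, in Python, one accented Latin letter already needs two bytes in UTF-8, and a CJK character needs three:

    print("é".encode("utf-8"))   # b'\xc3\xa9'
    print("漢".encode("utf-8"))  # b'\xe6\xbc\xa2'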
In any case, a hash value (or whatever you call an "encrypted hash", which is probably a confusion, because hashing and encrypting are two distinct things) is a sequence of bytes, not characters, and thus is never guaranteed to be the encoding of a sequence of characters in any code page.
Armed with this knowledge, you may try to put some order into your strings and your question.
Edit: thanks to @marfarma for pointing out the URL-safe Base64 encoding, where the '+' and '/' characters are replaced by '-' and '_'. This makes the situation clearer. When adding the needed '=' signs, the first string decodes to:
00000000 61 64 6d 69 6e 3a 35 35 46 44 36 43 45 41 3a be |admin:55FD6CEA:.|
00000010 d4 5b 42 81 17 0f d3 ba 47 83 18 77 ca e8 da 8e |.[B.....G..w....|
00000020 91 ce b7 |...|
while the second becomes:
00000000 61 64 6d 69 6e 3a 35 35 46 44 36 43 45 41 3a ef |admin:55FD6CEA:.|
00000010 bf bd ef bf bd 5b 42 ef bf bd 17 0f d3 ba 47 ef |.....[B.......G.|
00000020 bf bd 18 77 ef bf bd ef bf bd da 8e ef bf bd ce |...w............|
00000030 b7 |.|
We now see what happened: the first string was decoded to bytes, but someone then fed those bytes into some display system or editor that expected UTF-8. Some of the bytes were not the valid UTF-8 encoding of anything, so they were replaced with the Unicode code point U+FFFD REPLACEMENT CHARACTER. The characters were then reencoded as UTF-8, each U+FFFD yielding the three-byte sequence EF BF BD.
Therefore, the hash value was badly mangled, but every invalid byte was rendered as the same replacement glyph, and what was put in its place renders identically. Hence no visible difference on the screen, even though the underlying bytes differ.
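You can reproduce the whole mangling in a few lines of Python (a sketch; errors="replace" mimics what a lenient display layer does with invalid UTF-8, substituting U+FFFD):

    import base64

    s = "YWRtaW46NTVGRDZDRUE6vtRbQoEXD9O6R4MYd8ro2o6Rzrc"
    raw = base64.urlsafe_b64decode(s + "=")  # pad to a multiple of 4

    # Decode as UTF-8 the way a lenient viewer would, then re-encode:
    mangled = raw.decode("utf-8", errors="replace").encode("utf-8")
    print(base64.urlsafe_b64encode(mangled).rstrip(b"=").decode())
    # YWRtaW46NTVGRDZDRUE677-977-9W0Lvv70XD9O6R--_vRh377-977-92o7vv73Otw

The output matches the second string, confirming the round trip through a UTF-8 display layer.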
I am trying to perform external authentication with a smart card. I got the 8-byte challenge from the card, and now I need to generate the cryptogram over those 8 bytes.
But I don't know how to perform that cryptogram operation (the smart card toolkit turns the 8 bytes into 72 bytes).
The following commands are generated by the toolkit:
command: 00 A4 04 00 0C A0 00 00 02 43 00 13 00 00 00 01 04   (SELECT by AID)
command: 00 22 41 A4 06 83 01 01 95 01 80                     (MANAGE SECURITY ENVIRONMENT)
command: 80 84 00 00 08          response: (8-byte challenge)
command: 80 82 00 00 48          (72 bytes of data)
Can anybody explain the steps to follow to turn the 8-byte challenge into those 72 bytes?
Conversion is not exactly the right term. You need to apply the cryptographic algorithm with the correct key to the received challenge. I assume, that an External Authenticate command is performed, but the strange data field length allows no assumption on the algorithm used. Possibly an external challenge is also provided in the command and session keys are established. Since the assumed Get Challenge command and the External Authenticate command have a class byte indicating a proprietary command, ISO 7816-4 won't help here and you need to refer to the card specification. To get knowledge of the key you probably have to sign a non-disclosure agreement with the card issuer.