How to get the signed number representation of a binary string - Kotlin

Is there a method to convert a binary string into a signed integer? For example: convert the string "11110000" into -16.

I got a NumberFormatException when trying to handle the negative numbers with Integer.parseInt. The following works for both negative and positive bit patterns; you can try this one:
System.out.println(Integer.parseUnsignedInt("11111111111111111111111111110111", 2));
Output : -9
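If you are writing Kotlin rather than calling the Java API, the standard library's unsigned parsers (available since Kotlin 1.5) do the same job. A minimal sketch: parse the bit pattern as an unsigned type, then reinterpret it as the signed type of the same width:
fun main() {
    // 8-bit pattern: parse as unsigned, then reinterpret the bits as a signed Byte
    println("11110000".toUByte(radix = 2).toByte())                         // -16
    // 32-bit pattern: same idea with UInt/Int
    println("11111111111111111111111111110111".toUInt(radix = 2).toInt())  // -9
}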

Related

Getting public key from DNSKEY RR public key field using Python

I am trying to parse and validate DNSSEC responses without using any DNS specific libraries. I am able to get the hexstring representation of a RSA key from the public key field value present in the DNSKEY RRs. According to RFC 8017, the RSA public key is represented with the ASN.1 type RSAPublicKey format which has a modulus and exponent. However, it doesn't specify anything more.
The hex string (same as in Wireshark) is:
"03010001ac1f62b7f85d62c550211fd70ddbbca7326cde13dca235f26f76a5dd5872db601d775ecd189955ed96e749fd8e8e6af3e133e8a8eb8b8afc25730c6318f949de9436fde6ea280b5ccbc09a43ee357617905690fdc09cda06bc5ad3bcd1bc4e43de9a4769ff83453e96a74642b23daabae00398539cfca56b04c200776c4841724cb09674b519eb7e3506a3e08e4f96b5a733425a1c55eecb1613552c022b246b27141652d907cdbc6e30b5f3341a1ba5dfbb503edddbd01e85f1c4206642cfb312e14f2772fe8b66143ba847382e95fb86ba215342ae9cca803655bccadef1123e06f3cf1626840e11200b1acda118c50805c6eacfd271d930b93f2e332d9521"
I saw other similar posts and tried to follow the solutions. Most of them get the key from a PEM file, binary data, or a base64-encoded form. When I try to convert the hex to those forms and use the solution, I get errors like 'RSA key format not supported', etc.
Is there any way I can get the public key from the hex? I would really appreciate any input! Thanks!
I finally managed to find the solution.
According to RFC 3110 section 2, we can split the given value into exponent length, exponent, and modulus. I split them as specified and converted the hexadecimal to integers. The relevant text from the RFC is below:
The structure of the algorithm specific portion
of the RDATA part of such RRs is as shown below.
Field            Size
-----            ----
exponent length  1 or 3 octets (see text)
exponent         as specified by length field
modulus          remaining space
For interoperability, the exponent and modulus are each limited to
4096 bits in length. The public key exponent is a variable length
unsigned integer. Its length in octets is represented as one octet
if it is in the range of 1 to 255 and by a zero octet followed by a
two octet unsigned length if it is longer than 255 bytes. The public
key modulus field is a multiprecision unsigned integer. The length
of the modulus can be determined from the RDLENGTH and the preceding
RDATA fields including the exponent.
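The question uses Python, but the split itself is language-agnostic. Here is a rough Kotlin sketch of the RFC 3110 layout described above (the function name is my own, and it assumes a well-formed hex string):
import java.math.BigInteger

// Splits the RSA public-key hex of a DNSKEY RR (RFC 3110) into exponent and modulus.
fun splitRsaDnskey(hex: String): Pair<BigInteger, BigInteger> {
    val first = hex.substring(0, 2).toInt(16)
    // Exponent length is one octet, or a zero octet followed by a two-octet length
    val (expLen, offset) = if (first != 0) first to 2 else hex.substring(2, 6).toInt(16) to 6
    val exponent = BigInteger(hex.substring(offset, offset + expLen * 2), 16)
    val modulus = BigInteger(hex.substring(offset + expLen * 2), 16)
    return exponent to modulus
}
For the hex string in the question, the first octet is 0x03, so the exponent is the next three octets (0x010001 = 65537) and everything that follows is the modulus.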

LabVIEW converting a hex-formatted string to ASCII

In LabVIEW I am trying to convert a hex string to ASCII format. For example, for a hex string like 09124E4F21CD0024FFFFFFFFFFFFFFFF, the ASCII version of this is NO!Í followed by what are basically a bunch of illegible symbols. I tried using the LabVIEW functions for converting a hex string to a number but they didn't work. How would I convert the hex-formatted string to its ASCII form?
Hexadecimal String to Number works fine, but only for a hex string that represents a number that can be stored as a numeric data type:
If the input string represents a number outside the range of the
representation of number, number is set to the maximum value for that
data type.
Your example input is 128 bits long whereas the longest integer data type in current LabVIEW is 64 bits.
You can use this function, but you need to convert the input one byte at a time:
Create a While Loop and add a shift register. Initialise the shift register with your input string.
Inside the loop, wire the string from the shift register to the string input of a Search/Split String function
Wire a numeric constant of 2 to the offset input - i.e. split the string into the first two characters, and the rest
Wire the match + rest of string output to the right-hand shift register terminal
Wire the substring before match output to a Hexadecimal String to Number function
Wire the default input of this function to a numeric constant with value 0 and type U8
Wire the output of this function to the right-hand side of the While loop and make the terminal indexing (via right-click)
Use an Empty String/Path? function to exit the While loop when the string being passed back into the shift register is empty.
The output from the indexing terminal you created will now be a U8 (byte) array containing the data converted from the input string. If you want it in string form you can convert it using Byte Array to String.
This assumes that your input string is always a multiple of 2 characters in length. If you need it to handle an input such as "3F2" you'll need to check for this and do something to the input (I'll let you figure out what) before passing it into your loop.
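LabVIEW is graphical, so there is nothing to paste here, but the loop described above is easy to mirror in a text language. A Kotlin sketch of the same two-characters-at-a-time conversion (assuming an even-length, well-formed hex string; ISO-8859-1 keeps each byte as one character, which is why 0xCD shows up as Í):
fun hexToAscii(hex: String): String {
    val bytes = hex.chunked(2)                 // "09", "12", "4E", "4F", ...
        .map { it.toInt(16).toByte() }         // parse each pair as one U8
        .toByteArray()
    return String(bytes, Charsets.ISO_8859_1)  // one byte per character
}

fun main() {
    println(hexToAscii("09124E4F21CD0024FFFFFFFFFFFFFFFF"))
}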

Bro convert hex string to int

I am using Bro to read bytes directly off the payload of a packet.
I have a string value "\x10" and I want to get the decimal value out of that.
I know that Bro supports printing hex as decimal directly:
print 0x10;
Question is, how do I convert that string similarly to its integer version?
The best you can do is strip off the "\x" portion, and run it through 2 BIFs:
bytestring_to_count(hexstr_to_bytestring("10"));
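Outside of Bro, the same trick is just "strip the \x, parse the remaining digits as base 16". For comparison, a purely illustrative Kotlin equivalent (not Bro code):
fun main() {
    // The literal text "\x10": strip the prefix, then parse the hex digits
    println("\\x10".removePrefix("\\x").toInt(16))   // 16
}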

How to get a SHA-256 hashed string in Erlang?

I'm trying to hash a string with SHA-256 in Erlang, but I could not manage to get a string back. crypto:hash(sha256, somestring) gives some binary; how can I get a string?
The binary has to be decoded to an integer, and then printed in hexadecimal form:
1> io_lib:format("~64.16.0b", [binary:decode_unsigned(crypto:hash(sha256,
"somenewstring"))]).
"abf8a5e4f99c89cabb25b4bfde8a1db5478da09bcbf4f1d9cdf90b7b5321e43c"
binary:decode_unsigned/1 decodes the whole binary as one large big-endian unsigned integer. An alternative would be to pattern match the binary into an integer:
2> <<Integer:256>> = crypto:hash(sha256, "somenewstring").
<<171,248,165,228,249,156,137,202,187,37,180,191,222,138,
29,181,71,141,160,155,203,244,241,217,205,249,11,123,83,
...>>
3> Integer.
77784820141105809005227607825327585337759244421257967593474620236757179950140
4> io_lib:format("~64.16.0b", [Integer]).
"abf8a5e4f99c89cabb25b4bfde8a1db5478da09bcbf4f1d9cdf90b7b5321e43c"
(Note that <<Integer:256>> is equivalent to <<Integer:256/big-unsigned-integer>> since those are the default flags).
An alternative way to get the SHA-256 hash as a string in Erlang:
[ element(C+1, {$0,$1,$2,$3,$4,$5,$6,$7,$8,$9,$A,$B,$C,$D,$E,$F}) || <<C:4>> <= crypto:hash(sha256,"somenewstring")]

Hexadecimal numbers vs. hexadecimal encoding (with base64 as well)

Encoding with hexadecimal numbers seems to be different from using hexadecimals to represent numbers. For example, the hex number 0x40 to me should be equal to 64, or BA_{64}, but when I put it through this hex-to-base64 converter, I get the output QA==, which to me is equal to some number times 64. Why is this?
Also, when I check the integer value of the hex string deadbeef I get 3735928559, but when I check it in other places I get: 222 173 190 239. Why is this?
Addendum: So I guess it is because it is easier to break the number into bit chunks than treat it as a whole number when encoding? That is pretty confusing to me but I guess I get it.
You may wish to read this:
http://en.wikipedia.org/wiki/Base64
In summary, base64 is a specific encoding: it maps each 6-bit group of the input bytes to one character of a 64-character alphabet, so the characters stand for different values than their ASCII codes (for example, 'A' means 0, not 65).
For the second part, one source is treating the entire string as a 32-bit integer, and the other is dividing it into bytes and giving the value of each byte.
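Both points are easy to check in code. A short Kotlin sketch (using the JDK Base64 class) that reproduces the numbers above:
import java.util.Base64

fun main() {
    // Base64 encodes bytes, not numeric values: the single byte 0x40 ('@') becomes "QA=="
    println(Base64.getEncoder().encodeToString(byteArrayOf(0x40)))   // QA==

    // "deadbeef" read as one 32-bit unsigned integer...
    println("deadbeef".toLong(16))                                   // 3735928559

    // ...versus the same hex split into its four bytes
    println("deadbeef".chunked(2).map { it.toInt(16) })              // [222, 173, 190, 239]
}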