I am unable to understand the following code, can anyone please clear it up for me? - solidity

Particularly the usage of keccak256 and the return statement used here.
function isTokenTransferOK(address currentOwner, address newOwner)
    public
    pure
    returns (bool ok)
{
    // Check an arbitrary condition to see if transfer should proceed
    return keccak256(abi.encodePacked(currentOwner, newOwner))[0] == 0x7f;
}

abi.encodePacked() essentially concatenates data. So the call here forms a sequence of 40 bytes, where the first 20 bytes are the address currentOwner and the second 20 bytes are the address newOwner.
keccak256() is a hashing function closely related to SHA-3 (Keccak won the SHA-3 competition, but Ethereum's keccak256 keeps the original Keccak padding, so its output differs from NIST SHA3-256). It's used here to calculate a hash of the output of abi.encodePacked().
[0] retrieves the first element of an array. In this case, it's the first byte of the hash calculated above.
== 0x7f is true if and only if that byte is the hexadecimal value 0x7f.
So the function hashes the current and new owner addresses and returns true if the first byte of that hash is 0x7f. Otherwise it returns false.
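The whole check can be mimicked in Python with only the standard library. Note that hashlib.sha3_256 is NIST SHA-3, whose padding differs from Ethereum's keccak256, so the digests will not match Solidity's; the concatenate-hash-index logic, however, is identical. The function and variable names below are made up for illustration:

```python
import hashlib

def is_token_transfer_ok(current_owner: bytes, new_owner: bytes) -> bool:
    # abi.encodePacked(addr1, addr2) is plain concatenation: 20 + 20 = 40 bytes
    packed = current_owner + new_owner
    # NOTE: sha3_256 is NIST SHA-3, not Ethereum's keccak256 (padding differs),
    # so this digest won't equal Solidity's; the indexing logic is the same.
    digest = hashlib.sha3_256(packed).digest()  # 32-byte hash
    return digest[0] == 0x7F                    # true iff the first byte is 0x7f

current = bytes.fromhex("11" * 20)  # two made-up 20-byte "addresses"
new = bytes.fromhex("22" * 20)
print(is_token_transfer_ok(current, new))
```

Since the first byte of a hash is effectively uniform, the condition holds for roughly 1 in 256 address pairs.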

Solid user input check - Newbie Learning C

So I'm learning C and got this exercise to do with functions. Not gonna ask how to do it.
So I created this function:
int menu(void) {
    char user;
    do {
        printf("Choise: ");
        user = getchar();
        if (user == '1') {
            /* code and no return here */
        }
        else if (user == '2') {
            /* code */
            return 2;
        }
        else if (user == '3') {
            /* code */
            return 3;
        }
    } while (user != '3');
    return 0;
}
I've got other control flows too, like isdigit(user) and !isdigit(user).
The problem here is that if the user inputs "fw" (for example), the program prints "Choise: " two times, or more if there's more input, of course.
I tried several other control flows, and I tried changing the variable user to an array and forcing user[0] == ... && user[1] == '\n', but to no avail.
What I'm trying to do is: if the user doesn't enter one of the 3 options, the program should stop reading after the 1st input and wait for another input.
I've already checked several questions on StackOverflow, but they don't answer my question :/
I'm all ears for advice! Thanks in advance!
The underlying cause here is that getchar() in C reads one single character at a time; it's not like, for example, input() in Python, which returns an entire line. When the user types "fw" and presses Enter, three characters ('f', 'w' and '\n') land in the input buffer, and each loop iteration consumes only one of them, so the prompt is printed once per leftover character. The usual technique in C to read a string of more than one character is to read into a buffer: declare, for instance, char a[100]; and fill it with fgets(a, sizeof a, stdin); (or scanf("%99s", a);). The characters are stored in a contiguous block of memory, terminated by a null character that marks the end of the string.
To avoid the risk of seg faults when you don't know the length in advance, reserve as much memory as you need to hold the longest string you expect: char *a = malloc(n + 1);, with n being the number of characters you want to set memory aside for (plus one for the terminating null character).
Apologies for not posting in a while, I've been travelling. I thought about the problem again. getchar() does return the character as an int (its character code, e.g. the ASCII value), but that is not what breaks the test: in C, '1' is itself an integer character constant with the value 49, so if (user == '1') and if (user == 49) are exactly equivalent. The real problem is the leftover input: after reading one character, the rest of the line (including the newline) is still in stdin's buffer, and each pass of the loop consumes one more of those characters. Discard the remainder of the line after each read, for example with
int c;
while ((c = getchar()) != '\n' && c != EOF);
and the prompt will appear only once per line of input. (Also declare user as int rather than char, so it can hold EOF.)

Why is my custom block going twice into general_work() function in GNU Radio?

I am creating a custom block "Combine" that gets 20 bytes of data from the first input. The value of the first input specifies the number of bytes to be read from the second input, which are then read and written to the output file.
Whenever I execute the flowgraph, the printing shows that the code goes into the general_work function twice. It reads the correct data the first time; the second time it reads bogus values and writes this incorrect data to the output sink.
I am using the following signatures for the input:
Combine_impl::Combine_impl()
    : gr::block("Combine",
                gr::io_signature::make(2, 2, sizeof(unsigned char)),
                gr::io_signature::make(1, 1, sizeof(unsigned char)))
{}
I think my problem is with the forecast function and the usage of the consume_each function. I have tried doing this in forecast, but it still goes into the general_work function twice and writes incorrect data to the output file.
ninput_items_required[0] = 20;
ninput_items_required[1] = 7; //because the first input has a value of 7 and will read 7 bytes of data from the second input
Can someone please help me determine what exactly is going wrong here? Also, how is the consume_each() function supposed to be used here?
I modified the forecast function to take the exact number of items that were utilized from each input:
void Combine_impl::forecast(int noutput_items, gr_vector_int &ninput_items_required)
{
    ninput_items_required[0] = 20;
    ninput_items_required[1] = 7;
}
Instead of using consume_each() which specifies the number of bytes consumed in all of the inputs, I used the consume() function which specifies this number separately for each input:
consume(0, 20); //20 bytes of input at index 0 were consumed
consume(1, 7); //7 bytes of input at index 1 were consumed
And instead of returning noutput_items from the general_work function, I return the following. It specifies exactly the number of items produced, which is different from noutput_items.
return (20 + 7);
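Outside the GNU Radio scheduler, the framing logic this block implements can be sketched in plain Python. The combine name and the one-byte count encoding are assumptions for illustration (the question only says "the value of the first input" determines the payload length); the returned pair of consumed counts mirrors the separate consume(0, ...) and consume(1, ...) calls above:

```python
def combine(in0: bytes, in1: bytes):
    """Consume a 20-byte header from the first input, then `count` payload
    bytes from the second input. Returns (output, consumed0, consumed1),
    mirroring consume(0, 20) and consume(1, count)."""
    header = in0[:20]
    count = header[0]          # assumption: count encoded in the first header byte
    payload = in1[:count]      # read `count` bytes from the second input
    return payload, 20, count

# header whose first byte says "7 payload bytes", plus a payload stream
out, c0, c1 = combine(bytes([7]) + bytes(19), b"ABCDEFGHIJ")
print(out, c0, c1)  # b'ABCDEFG' 20 7
```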

Why can't DataInputStream read a char correctly?

I tried to write a String to a stream and then read a single char back through DataInputStream, but I get an error.
I expected readChar() to return 'q', but this method:
assertEquals('q', DataInputStream("q".byteInputStream(Charsets.UTF_8)).readChar())
Throws exception:
java.io.EOFException
at java.io.DataInputStream.readChar(DataInputStream.java:365)
Please have a look at DataInput.readChar() which states:
Reads two input bytes and returns a char value. Let a be the first byte read and b be the second byte. The value returned is:
(char)((a << 8) | (b & 0xff))
This method is suitable for reading bytes written by the writeChar method of interface DataOutput.
The last sentence is basically also the solution. If you write the data using writeChar, reading works as expected, i.e. the following will give you a succeeding test case:
assertEquals('q', DataInputStream(ByteArrayOutputStream().apply {
    DataOutputStream(this).use {
        it.writeChars("q")
    }
}.toByteArray().inputStream())
    .readChar())
The following, even though not mentioned in the interface, may also work out:
assertEquals('q', DataInputStream("q".byteInputStream(Charsets.UTF_16BE)).readChar())

OTP Google Acc. Compatible

I'm working on an OTP implementation compatible with Google Authenticator.
So far, I've been using
- RFC 2104 (http://www.ietf.org/rfc/rfc2104.txt),
- RFC 4226 (http://www.ietf.org/rfc/rfc4226.txt),
- RFC 6238 (https://www.rfc-editor.org/rfc/rfc6238),
and following this scheme:
Pseudocode for Time OTP (http://en.wikipedia.org/wiki/Google_Authenticator#Pseudocode_for_Time_OTP)
function GoogleAuthenticatorCode(string secret)
key := base32decode(secret)
message := floor(current Unix time / 30)
hash := HMAC-SHA1(key, message)
offset := value of last nibble of hash
truncatedHash := hash[offset..offset+3] //4 bytes starting at the offset
Set the first bit of truncatedHash to zero //remove the most significant bit
code := truncatedHash mod 1000000
pad code with 0 until length of code is 6
return code
Until "hash := HMAC-SHA1(key, message)" everything is OK. I checked the result multiple times against other HMAC-SHA1 converters. (Well, I think so.)
But then I think something must go wrong, because I'm obviously not getting the same code as my google-authenticator app (Android). (At least it's still a 6-digit value.)
The part I'm not quite sure I understand well is:
offset := value of last nibble of hash
truncatedHash := hash[offset..offset+3] //4 bytes starting at the offset
Set the first bit of truncatedHash to zero //remove the most significant bit
Could someone give me a more detailed explanation on this ?
Thanks,
My guess would be that you may be taking the value of offset incorrectly.
The statement
value of last nibble of hash
is pretty vague if you don't have a proper definition of bit and byte ordering.
The quoted Wikipedia page links to a number of implementations; I think this Java implementation is something to check your code against:
byte[] hash = ...
// Dynamically truncate the hash
// OffsetBits are the low order bits of the last byte of the hash
int offset = hash[hash.length - 1] & 0xF;
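The full dynamic-truncation step can be sketched with only the Python standard library; the hotp name is made up, but the logic follows RFC 4226 section 5.3, and the expected outputs are the RFC 4226 Appendix D test vectors (secret "12345678901234567890"). For TOTP, the counter would be floor(current Unix time / 30):

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()  # 20-byte HMAC-SHA1
    offset = digest[-1] & 0x0F                             # low nibble of the LAST byte
    # 4 bytes starting at the offset, most significant bit cleared
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)          # zero-pad to 6 digits

# RFC 4226 Appendix D test vectors
print(hotp(b"12345678901234567890", 0))  # 755224
print(hotp(b"12345678901234567890", 1))  # 287082
```

If your code disagrees with these vectors, the offset (wrong byte or wrong nibble) and the counter's byte order are the usual suspects.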

How do I limit BitConverter.GetBytes() to return only a certain amount of bytes using VB.NET?

I do:
Dim BytArr() as Byte = BitConverter.GetBytes(1234)
Since an Integer is 32 bits, it returns an array of 4 bytes.
I want to be able to control it to return only like two bytes. Maybe only three bytes. Are there any built-in functions to control it?
I don't want to rely on using shifting >> 8 >> 16 >> 24 >> 32, etc..
I also don't want to rely on type casting the data in GetBytes() to a specific datatype.
It is not that GetBytes defaults to 32 bits; GetBytes returns an array of the size required to hold the data type. If you pass a Long, you will get 8 elements in your array.
The best way to control this is indeed casting the data you pass in. Otherwise you could truncate some of the number.
That being said, you could do something like this:
Dim BytArr() As Byte = BitConverter.GetBytes(1234)
Array.Resize(BytArr, 2)
(Note that Array.Resize is a Sub taking the array ByRef, so it can't be used on the right-hand side of an assignment.)
But if the value you passed in exceeded what could be stored in 2 bytes (in this case) then you will have some very broken code.
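For comparison, the same truncation can be sketched in Python, where int.to_bytes makes both the little-endian layout and the overflow check explicit (on a little-endian machine, BitConverter.GetBytes(1234) yields the same four bytes as below):

```python
value = 1234                            # 0x04D2
four = value.to_bytes(4, "little")      # b'\xd2\x04\x00\x00' - like GetBytes(Integer)
two = value.to_bytes(2, "little")       # b'\xd2\x04' - fits, since 1234 < 65536
print(four[:2] == two)                  # True: truncating keeps the low-order bytes

# The "very broken code" case the answer warns about: a value too big
# for 2 bytes raises OverflowError instead of silently losing bits.
try:
    (70000).to_bytes(2, "little")
except OverflowError:
    print("70000 does not fit in 2 bytes")
```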