I tried to read a single char via DataInputStream from a stream built from a String, but I get an error.
I expected readChar() to return 'q', but this call:
assertEquals('q', DataInputStream("q".byteInputStream(Charsets.UTF_8)).readChar())
Throws exception:
java.io.EOFException
at java.io.DataInputStream.readChar(DataInputStream.java:365)
Please have a look at DataInput.readChar() which states:
Reads two input bytes and returns a char value. Let a be the first byte read and b be the second byte. The value returned is:
(char)((a << 8) | (b & 0xff))
This method is suitable for reading bytes written by the writeChar method of interface DataOutput.
The last sentence is basically also the solution. If you write the data using writeChar, reading works as expected, i.e. the following gives you a passing test case:
assertEquals('q', DataInputStream(ByteArrayOutputStream().apply {
    DataOutputStream(this).use {
        it.writeChars("q")
    }
}.toByteArray().inputStream())
    .readChar())
The following, even though not mentioned in the interface, may also work, since UTF-16BE encodes 'q' as the two bytes 0x00 0x71, which is exactly what readChar() expects ((0x00 << 8) | (0x71 & 0xff) == 'q'):
assertEquals('q', DataInputStream("q".byteInputStream(Charsets.UTF_16BE)).readChar())
If we have reducedTxBytes in String form, how can we sign it via appkit? I found that we can use ctx.parseReducedTransaction, however I'm getting this error:
requirement failed: Not enough bytes in the buffer
I'm doing:
{
  (Address.create(tx._1),
   client.getContext.parseReducedTransaction(tx._2.getBytes))
}
Where tx = (String, String) representing (WalletAddress, ReducedTxBytes)
For a sample tx like this:
3QIB_1JG5qTPntBexsiCikbQ_VE1uhtoBG5TyWttASKOkosAAAADg2FcoBCWIDoj4kbTFy2D_mMHmGvdRZRldiDJ0oWk0Dd85dhG5-wa8gfLDadilJjVz2_6zXtNKNCNzZXYicp3OvZjSvkKhjvNecY0PlaLGyzdPOvyMDYaEu87_luDcOdgA4CJegAIzQPULJZ0Jd77AX3zyVjLnTHR3SrYxe50EA_yE0yvp85wP62ZNAABBRTAhD0QBQQABAAONhACBKALCM0Ceb5mfvncu6xVoGKVzocLBwKb_NstzijZWfKBWxb4F5jqAtGSo5qMx6cBcwBzARABAgQC0ZaDAwGTo4zHsqVzAAABk8KypXMBAHRzAnMDgwEIze6sk7GlcwStmTQAAMC71dUEAAjNA9QslnQl3vsBffPJWMudMdHdKtjF7nQQD_ITTK-nznA_rZk0AwABAb_ckOgBAgEAzQPULJZ0Jd77AX3zyVjLnTHR3SrYxe50EA_yE0yvp85wP51P8Gw=
While trying to debug, I found that the reason it failed is the decoding. As it's a string that was encoded from a byte array, it has to be decoded the same way.
Here is the way it was encoded:
reducedTx = Base64.getUrlEncoder.encodeToString(reducedTx.toBytes)
Where reducedTx is a ReducedTransaction
Answer below
The problem happens due to a failure to decode the string back to the right bytes. Because it was encoded via Base64.getUrlEncoder, it has to be decoded the same way. Therefore the following works:
val result = (Address.create(tx._1),
  client.getContext.parseReducedTransaction(
    Base64.getUrlDecoder.decode(tx._2)))
Note that we're using Base64.getUrlDecoder rather than getBytes.
A 2012 answer at StackOverflow (“How do I read a binary file in a Windows Store app”) suggests this method of reading byte data from a StorageFile in a Windows Store app:
IBuffer buffer = await FileIO.ReadBufferAsync(theStorageFile);
byte[] bytes = buffer.ToArray();
That looks simple enough. As I am working in cppwinrt, I have translated it to the following, within the same IAsyncAction that produced a vector of StorageFiles. First I obtain a StorageFile from the VectorView using theFilesVector.GetAt(index);
//Then this line compiles without error:
IBuffer buffer = co_await FileIO::ReadBufferAsync(theStorageFile);
//But I can’t find a way to make the buffer call work.
byte[] bytes = buffer.ToArray();
“byte[]” can’t work, to begin with, so I change that to byte*, but then
the error is “class ‘winrt::Windows::Storage::Streams::IBuffer’ has no member ‘ToArray’”
And indeed Intellisense lists no such member for IBuffer. Yet IBuffer was specified as the return type for ReadBufferAsync. It appears the above sample code cannot function as it stands.
In the documentation for FileIO I find it recommended to use DataReader to read from the buffer, which in cppwinrt should look like
DataReader dataReader = DataReader::FromBuffer(buffer);
That compiles. It should then be possible to read bytes with the following DataReader method, which is fortunately supplied in the UWP docs in cppwinrt form:
void ReadBytes(Byte[] value) const;
However, that does not compile because the type Byte is not recognized in cppwinrt. If I create a byte array instead:
byte* fileBytes = new byte(buffer.Length());
that is not accepted. The error is
‘No suitable constructor exists to convert from “byte*” to “winrt::array_view<uint8_t>”’
uint8_t is of course a byte, so let’s try
uint8_t fileBytes = new uint8_t(buffer.Length());
That is wrong - clearly we really need to create a winrt::array_view. Yet a 2015 Reddit post says that array_view “died” and I’m not sure how to declare one, or if it will help. That original one-line method for reading bytes from a buffer is looking so beautiful in retrospect. This is a long post, but can anyone suggest the best current method for simply reading raw bytes from a StorageFile reference in cppwinrt? It would be so fine if there were simply GetFileBytes() and GetFileBytesAsync() methods on StorageFile.
---Update: here's a step forward. I found a comment from Kenny Kerr last year explaining that array_view should not be declared directly, but that std::vector or std::array can be used instead. And that is accepted as an argument for the ReadBytes method of DataReader:
std::vector<unsigned char> fileBytes;
dataReader.ReadBytes(fileBytes);
Only trouble now is that the std::vector is receiving no bytes, even though the size of the referenced file is correctly returned in buffer.Length() as 167,513 bytes. That seems to suggest the buffer is good, so I'm not sure why the ReadBytes method applied to that buffer would produce no data.
Update #2: Kenny suggests reserving space in the vector, which is something I had tried, this way:
m_file_bytes.reserve(buffer.Length());
But it didn't make a difference. Here is a sample of the code as it now stands, using DataReader.
buffer = co_await FileIO::ReadBufferAsync(nextFile);
dataReader = DataReader::FromBuffer(buffer);
//The following line may be needed, but crashes
//co_await dataReader.LoadAsync(buffer.Length());
if (buffer.Length())
{
    m_file_bytes.reserve(buffer.Length());
    dataReader.ReadBytes(m_file_bytes);
}
The crash, btw, is
throw hresult_error(result, hresult_error::from_abi);
Is it confirmed, then, that the original 2012 solution quoted above cannot work in today's world? But of course there must be some way to read bytes from a file, so I'm just missing something that may be obvious to another.
Final (I think) update: Kenny's suggestion that the vector needs a size has hit the mark. If the vector is first prepared with m_file_bytes.assign(buffer.Length(),0) then it does get filled with file data. Now my only worry is that I don't really understand the way IAsyncAction is working and maybe could have trouble looping this asynchronously, but we'll see.
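For reference, the fix described above looks like this in context (a sketch using the names from the code sample earlier in the question):
buffer = co_await FileIO::ReadBufferAsync(nextFile);
dataReader = DataReader::FromBuffer(buffer);
if (buffer.Length())
{
    // assign() gives the vector a real size, not just capacity,
    // so the array_view conversion sees buffer.Length() writable bytes
    m_file_bytes.assign(buffer.Length(), 0);
    dataReader.ReadBytes(m_file_bytes);
}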
The array_view bridges the gap between Windows APIs and C++ array types. In this example, the ReadBytes method expects the caller to provide some array that it can copy bytes into. The array_view forwards a pointer to the caller's array as well as its size. In this case, you're passing an empty vector. Try resizing the vector before calling ReadBytes.
When you know how many bytes to expect (in this case 2 bytes), this worked for me:
std::vector<unsigned char> fileBytes;
fileBytes.resize(2);
DataReader reader = DataReader::FromBuffer(buffer);
reader.ReadBytes(fileBytes);
cout << fileBytes[0] << endl;
cout << fileBytes[1] << endl;
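The distinction that tripped things up earlier: reserve() only changes a vector's capacity, while resize() and assign() change its size, and it is the size that the array_view conversion reports to ReadBytes. A minimal sketch, using the buffer from above:
std::vector<unsigned char> fileBytes;
fileBytes.reserve(buffer.Length()); // capacity grows, but size() == 0: ReadBytes would copy nothing
fileBytes.resize(buffer.Length());  // size() == buffer.Length(): ReadBytes can now fill every byte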
I was told that I should be using while(fin) instead of while(!fin.eof()) when reading a file.
What exactly is the difference?
Edit: I do know that while(fin) actually checks the stream object's state: when a read fails, the stream evaluates to false, the loop breaks, and this covers both the eof and fail flags.
But my course teacher says that fin.eof() is better so I need to understand the fundamental operation that's going on here.
Which one is the right practice?
Note: This is not a duplicate, I need assistance in Turbo C++ and with binary files.
I'm basically trying to read a file using a class object.
First of all, I am assuming fin is your fstream object. In that case your teacher would not have told you to use while(fin.eof()) for reading from a file; she would have told you to use while(!fin.eof()).
Let me explain. eof() is a member of the fstream class which returns true or false depending on whether the end of file (EOF) of the file you are reading has been reached. So while eof() returns false, the end of file has not been reached and the loop continues to execute; once eof() returns true, the end of the file has been reached and the loop exits.
A while(fin) loop works because fin, used as a condition, evaluates the stream's state flags, which are set to a failed state when an operation like read, write, or open fails. Thus the loop runs as long as the read inside the loop keeps succeeding.
Personally I would not suggest either of them.
I would suggest
//assume a class abc
abc ob;
while (fin.read((char*)&ob, sizeof(ob)))
{
}
Or
while (fin.getline(parameters))
{
}
This loop performs the read inside the loop condition, and if nothing was read because the end of file was reached, the loop exits.
The problem with while(!fin.eof()) is that eof() only reports what has already happened: it returns true once a read has attempted to go past the last byte of the file and failed. It cannot predict that the next read will fail, so the loop runs one extra iteration in which the read fails and your variables keep their previous values.
This may appear to work when you are reading words or lines from a text file, but when you are reading successive records of a class from a file, this method fails.
Consider
class abc
{
} a;
fstream fin("file");
while (!fin.eof())
{
    fin.read((char*)&a, sizeof(a));
    a.display(); // display is a member function which displays the info
}
This displays the last record twice. After the last record has been read, the file pointer sits just past the last byte, but the eof flag is not set yet because nothing has tried to read beyond it. So the loop is entered again; this time read() fails and sets the eof flag, but the body still executes, and since a still holds the previous record's values, it is displayed a second time.
One good method is to do something like this:
while ( instream.read(...) && !instream.eof() ) { //Reading a binary file
Statement1;
Statement2;
}
or in case of a text file:
while ( instream.get(ch) && !instream.eof() ) { //To read a single character (ch declared as char)
Statement1;
Statement2;
}
Here, the read happens inside the while loop's condition statement, and then the value of the eof flag is tested.
This doesn't produce the undesired output, because we are checking the status of the actual I/O operation and eof together. You may also check the fail flag.
I would like to point out that, as @RetiredNinja notes, we may check only the I/O operation.
That is:
while ( instream.read(...) ) { //Reading a binary file
Statement1;
Statement2;
}
A quick and easy workaround that worked for me to avoid any problems when using eof is to check for it after the first read, not as the condition of the while loop itself. Something like this:
while (true) // no conditions
{
filein >> word; // an example read; it could be any kind of file-reading instruction
if (filein.eof()) break; // break the while loop if eof was reached
// the rest of the code
}
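Equivalently, the read itself can serve as the loop condition, which is the same idea as the read-in-condition loops shown above (a sketch; word is just an illustrative variable):
std::string word;
while (filein >> word) // stops on eof or on a failed read, no explicit break needed
{
    // the rest of the code
}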
I am using MsgSendv, and the server replies with MsgReply like this:
char desc_buf_out[MAX_CHARS_IN_A_LINE];
MsgReply(rcvid, EOK, desc_buf_out, sizeof(desc_buf_out));
My client looks like this:
iov_t *iovrcv = calloc(1, sizeof(iov_t));
char rcv[1024] = {0};
if (MsgSendv(server_coid, iovin_render, 3, iovrcv, 1) == -1)
{
    printf("error sending message to server\n");
    fprintf(stderr,
            "%s: %s\n",
            __func__,
            strerror(errno));
    return EXIT_FAILURE;
}
SETIOV(iovrcv + 0, rcv, sizeof(rcv));
printf("iovrcv=%s\n", rcv);
But I get nothing in my rcv buffer.
Can you tell me why, and what is the correct way of doing this so that I receive my data correctly? I expect to receive a string.
You are using the iovrcv uninitialized (well, ok, it's initialized with zeros via calloc, but it's not initialized to point to anything).
An iov_t is a pair of values, a pointer and a length.
It's given to the MsgSendv() function to tell it where the data should go. By leaving it uninitialized, you're telling MsgSendv() that the pointer is zero and the length is zero -- not a whole lot of data! :-)
Move your SETIOV to above the MsgSendv() function.
Also, be sure to initialize the iovin_render (which you show as having three parts, that is, three pairs of ptr/length values).
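Putting that together, a corrected client would look roughly like this (a sketch based on the question's code; it assumes iovin_render has already been filled in with SETIOV for its three parts):
iov_t iovrcv[1];
char rcv[1024] = {0};
// Fill in the receive IOV *before* the call, so MsgSendv()
// knows where the reply data should be written.
SETIOV(iovrcv + 0, rcv, sizeof(rcv));
if (MsgSendv(server_coid, iovin_render, 3, iovrcv, 1) == -1)
{
    fprintf(stderr, "%s: %s\n", __func__, strerror(errno));
    return EXIT_FAILURE;
}
printf("iovrcv=%s\n", rcv);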
Hello, I have a bizarre problem with sprintf. Here's my code:
void draw_number(int number, int height, int xpos, int ypos) {
    char string_buffer[5]; // 5000 is the maximum score, hence 4 characters plus null character equals 5
    printf("Number - %i\n", number);
    sprintf(string_buffer, "%i", number); // Get string
    printf("String - %s\n", string_buffer);
    int y_down = ypos + height;
    for (int x = 0; x < 5; x++) {
        char character = string_buffer[x];
        if (character == NULL) { // Blank characters occur at the end of the number from sprintf. Testing with NULL works
            break;
        }
        int x_left = xpos + height * x;
        int x_right = x_left + height;
        GLfloat vertices[] = {x_left, ypos, x_right, ypos, x_left, y_down, x_right, y_down};
        rectangle2d(vertices, number_textures[atoi(strcat(&character, "\0"))], full_texture_texcoords);
    }
}
With the printf calls there, the numbers print successfully and are drawn as expected. When I take them away I can't view the output and compare, of course, but the numbers aren't rendered correctly. I assume sprintf breaks somehow.
This also happens with NSLog: adding NSLog calls anywhere in the program can either break or fix the function.
What on earth is going on?
This is using Objective-C with the iOS 4 SDK.
Thank you for any answer.
Well, this bit of code is definitely odd:
char character = string_buffer[x];
...
... strcat(&character,"\0") ...
Originally I was thinking that, depending on when there happens to be a NUL terminator on the stack, this will clobber some piece of memory, and could be causing your problems. However, since you're appending the empty string, I don't think it will have any effect.
Perhaps the contents of the stack actually contain numbers that atoi is interpreting? Either way, I suggest you fix that and see if it solves your issue.
As to how to fix it, Georg Fritzsche beat me to it.
With strcat(&character, "\0") you are trying to use a single character as a character array. This will probably result in atoi() returning completely different values from what you're expecting (as you have no null-termination), or simply crash.
To fix the original approach, you could use a proper zero-terminated string:
char number[] = { string_buffer[x], '\0' };
// ...
... number_textures[atoi(number)] ...
But even easier would be to simply use the following:
... number_textures[character - '0'] ...
Don't use NULL to compare against a character; use '\0', since it's a character you're looking for. Also, your code comment sounds surprised, but of course a '\0' occurs at the end of the string: that is how C terminates strings.
If your number is ever larger than 9999, you will have a buffer overflow, which can cause unpredictable effects.
When you have that kind of problem, instantly think stack or heap corruption. You should dynamically allocate your buffer with enough size; having it at a fixed size is begging for this kind of trouble. Because you don't check that the number is within the max, any other bug that pushed it above the max would trigger exactly this problem here.
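A minimal sketch of one way to guard that formatting step (assuming snprintf is available; a 12-byte buffer is just large enough for any 32-bit int, sign and terminator included):
char string_buffer[12]; // fits "-2147483648" plus the terminating '\0'
int written = snprintf(string_buffer, sizeof(string_buffer), "%i", number);
if (written < 0 || written >= (int)sizeof(string_buffer)) {
    return; // formatting failed or would have been truncated
}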