Ergo - Unable to Parse ReducedTransaction, getting Not enough bytes in buffer - ergoscript

If we have reducedTxBytes in String form, how can we sign it via appkit? I found that we can use
ctx.parseReducedTransaction
, however I'm getting this error:
requirement failed: Not enough bytes in the buffer
I'm doing:
{
  (Address.create(tx._1),
   client.getContext.parseReducedTransaction(tx._2.getBytes))
}
Here tx is a (String, String) pair representing (WalletAddress, ReducedTxBytes).
For a sample tx like this:
3QIB_1JG5qTPntBexsiCikbQ_VE1uhtoBG5TyWttASKOkosAAAADg2FcoBCWIDoj4kbTFy2D_mMHmGvdRZRldiDJ0oWk0Dd85dhG5-wa8gfLDadilJjVz2_6zXtNKNCNzZXYicp3OvZjSvkKhjvNecY0PlaLGyzdPOvyMDYaEu87_luDcOdgA4CJegAIzQPULJZ0Jd77AX3zyVjLnTHR3SrYxe50EA_yE0yvp85wP62ZNAABBRTAhD0QBQQABAAONhACBKALCM0Ceb5mfvncu6xVoGKVzocLBwKb_NstzijZWfKBWxb4F5jqAtGSo5qMx6cBcwBzARABAgQC0ZaDAwGTo4zHsqVzAAABk8KypXMBAHRzAnMDgwEIze6sk7GlcwStmTQAAMC71dUEAAjNA9QslnQl3vsBffPJWMudMdHdKtjF7nQQD_ITTK-nznA_rZk0AwABAb_ckOgBAgEAzQPULJZ0Jd77AX3zyVjLnTHR3SrYxe50EA_yE0yvp85wP51P8Gw=
While trying to debug, I found that it failed because of decoding: since the string was encoded from a byte array, it has to be decoded the same way.
Here is the way it was encoded:
reducedTx = Base64.getUrlEncoder.encodeToString(reducedTx.toBytes)
Where reducedTx is a ReducedTransaction
Answer below

The problem happens due to a failure to decode the string back to the right bytes. Because it was encoded via Base64.getUrlEncoder, it has to be decoded the same way. Therefore the line below works:
val result = (Address.create(tx._1),
  client.getContext.parseReducedTransaction(
    Base64.getUrlDecoder.decode(tx._2)))
Note that we're using Base64.getUrlDecoder rather than getBytes.

Related

Why can't DataInputStream read a char correctly?

I tried to write a String and then read a single char back from a DataInputStream, but I got an error.
I expected readChar() to return 'q', but the call:
assertEquals('q', DataInputStream("q".byteInputStream(Charsets.UTF_8)).readChar())
throws an exception:
java.io.EOFException
at java.io.DataInputStream.readChar(DataInputStream.java:365)
Please have a look at DataInput.readChar(), which states:
Reads two input bytes and returns a char value. Let a be the first byte read and b be the second byte. The value returned is:
(char)((a << 8) | (b & 0xff))
This method is suitable for reading bytes written by the writeChar method of interface DataOutput.
The last sentence is basically also the solution. If you write the data using writeChar, reading works as expected, i.e. the following will give you a passing test case:
assertEquals('q', DataInputStream(ByteArrayOutputStream().apply {
    DataOutputStream(this).use {
        it.writeChars("q")
    }
}.toByteArray().inputStream())
    .readChar())
The following, even though not mentioned in the interface, may also work out, because UTF-16BE stores 'q' as exactly the big-endian two-byte sequence that readChar expects:
assertEquals('q', DataInputStream("q".byteInputStream(Charsets.UTF_16BE)).readChar())

How to read byte data from a StorageFile in cppwinrt?

A 2012 answer at StackOverflow (“How do I read a binary file in a Windows Store app”) suggests this method of reading byte data from a StorageFile in a Windows Store app:
IBuffer buffer = await FileIO.ReadBufferAsync(theStorageFile);
byte[] bytes = buffer.ToArray();
That looks simple enough. As I am working in cppwinrt I have translated that to the following, within the same IAsyncAction that produced a vector of StorageFiles. First I obtain a StorageFile from the VectorView using theFilesVector.GetAt(index);
//Then this line compiles without error:
IBuffer buffer = co_await FileIO::ReadBufferAsync(theStorageFile);
//But I can’t find a way to make the buffer call work.
byte[] bytes = buffer.ToArray();
“byte[]” can’t work, to begin with, so I change that to byte*, but then
the error is “class ‘winrt::Windows::Storage::Streams::IBuffer’ has no member ‘ToArray’”
And indeed Intellisense lists no such member for IBuffer. Yet IBuffer was specified as the return type for ReadBufferAsync. It appears the above sample code cannot function as it stands.
In the documentation for FileIO I find it recommended to use DataReader to read from the buffer, which in cppwinrt should look like
DataReader dataReader = DataReader::FromBuffer(buffer);
That compiles. It should then be possible to read bytes with the following DataReader method, which is fortunately supplied in the UWP docs in cppwinrt form:
void ReadBytes(Byte[] value) const;
However, that does not compile because the type Byte is not recognized in cppwinrt. If I create a byte array instead:
byte* fileBytes = new byte(buffer.Length());
that is not accepted. The error is
‘No suitable constructor exists to convert from “byte*” to “winrt::arrayView::<uint8_t>”’
uint8_t is of course a byte, so let’s try
uint8_t fileBytes = new uint8_t(buffer.Length());
That is wrong - clearly we really need to create a winrt::array_view. Yet a 2015 Reddit post says that array_view “died” and I’m not sure how to declare one, or if it will help. That original one-line method for reading bytes from a buffer is looking so beautiful in retrospect. This is a long post, but can anyone suggest the best current method for simply reading raw bytes from a StorageFile reference in cppwinrt? It would be so fine if there were simply GetFileBytes() and GetFileBytesAsync() methods on StorageFile.
---Update: here's a step forward. I found a comment from Kenny Kerr last year explaining that array_view should not be declared directly, but that std::vector or std::array can be used instead. And that is accepted as an argument for the ReadBytes method of DataReader:
std::vector<unsigned char> fileBytes;
dataReader.ReadBytes(fileBytes);
Only trouble now is that the std::vector is receiving no bytes, even though the size of the referenced file is correctly returned in buffer.Length() as 167,513 bytes. That seems to suggest the buffer is good, so I'm not sure why the ReadBytes method applied to that buffer would produce no data.
Update #2: Kenny suggests reserving space in the vector, which is something I had tried, this way:
m_file_bytes.reserve(buffer.Length());
But it didn't make a difference. Here is a sample of the code as it now stands, using DataReader.
buffer = co_await FileIO::ReadBufferAsync(nextFile);
dataReader = DataReader::FromBuffer(buffer);
// The following line may be needed, but crashes
// co_await dataReader.LoadAsync(buffer.Length());
if (buffer.Length())
{
    m_file_bytes.reserve(buffer.Length());
    dataReader.ReadBytes(m_file_bytes);
}
The crash, btw, is
throw hresult_error(result, hresult_error::from_abi);
Is it confirmed, then, that the original 2012 solution quoted above cannot work in today's world? But of course there must be some way to read bytes from a file, so I'm just missing something that may be obvious to another.
Final (I think) update: Kenny's suggestion that the vector needs a size has hit the mark. If the vector is first prepared with m_file_bytes.assign(buffer.Length(),0) then it does get filled with file data. Now my only worry is that I don't really understand the way IAsyncAction is working and maybe could have trouble looping this asynchronously, but we'll see.
The array_view bridges the gap between Windows APIs and C++ array types. In this example, the ReadBytes method expects the caller to provide some array that it can copy bytes into. The array_view forwards a pointer to the caller's array as well as its size. In this case, you're passing an empty vector. Try resizing the vector before calling ReadBytes.
When you know how many bytes to expect (in this case 2 bytes), this worked for me:
std::vector<unsigned char> fileBytes;
fileBytes.resize(2);
DataReader reader = DataReader::FromBuffer(buffer);
reader.ReadBytes(fileBytes);
cout << fileBytes[0] << endl;
cout << fileBytes[1] << endl;
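To generalize beyond a known 2-byte payload, size the vector from buffer.Length() before the read. Here is a minimal sketch of a helper along those lines; the function name is mine, and the blocking .get() is only to keep the sketch short (inside a coroutine you would co_await the operation instead):

#include <winrt/Windows.Storage.h>
#include <winrt/Windows.Storage.Streams.h>
#include <cstdint>
#include <vector>

using namespace winrt;
using namespace Windows::Storage;
using namespace Windows::Storage::Streams;

std::vector<uint8_t> ReadAllBytes(StorageFile const& file)
{
    // Pull the whole file into an IBuffer.
    IBuffer buffer = FileIO::ReadBufferAsync(file).get();
    // A reader created with FromBuffer needs no LoadAsync; the data is
    // already in memory (calling LoadAsync here is what crashes above).
    DataReader reader = DataReader::FromBuffer(buffer);
    // Size the vector first: ReadBytes only fills the storage of the
    // array_view it is handed, it never grows it.
    std::vector<uint8_t> bytes(buffer.Length());
    reader.ReadBytes(bytes);
    return bytes;
}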

Convert audio doubles to bytes

I am dealing with raw PCM audio data (the audio data of a PCM file without the header).
This data is provided to me in the form of a vector of double.
I would like to pass this data to another function, and this function expects the audio data in the form of a byte vector.
I tried
Dim nBytes() As Byte = nDoubles.SelectMany(Function(d) BitConverter.GetBytes(d)).ToArray()
but that wouldn't give the expected results.
I guess I have to deal with the conversion manually, but I am unsure how this should be done.
Can anybody help?
Thank you.
Since the required format for the other function is 16-bit, 48 kHz, which is the same as your source data, it's a simple case of converting the source to an array of Short, then serializing this as a Byte array.
The problem with the code you suggest for this is that the first step is missed, so it basically serializes the Double array. However, you can re-use this for the second step. So, you can do something like:
Dim nShorts() As Short = New Short(nDoubles.Length - 1) {}
For i = 0 To nDoubles.Length - 1
    nShorts(i) = Convert.ToInt16(nDoubles(i))
Next
Dim nBytes() As Byte = nShorts.SelectMany(Function(s) BitConverter.GetBytes(s)).ToArray()
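For readers working outside VB.NET, here is the same two-step conversion sketched in C++ for illustration (the helper name is mine, and like the answer above it assumes the doubles already hold values in the 16-bit sample range):

#include <cmath>
#include <cstdint>
#include <cstring>
#include <vector>

std::vector<uint8_t> DoublesToPcm16Bytes(std::vector<double> const& samples)
{
    std::vector<uint8_t> bytes(samples.size() * sizeof(int16_t));
    for (std::size_t i = 0; i < samples.size(); ++i)
    {
        // Step 1: round each double to a 16-bit sample (roughly Convert.ToInt16).
        int16_t s = static_cast<int16_t>(std::lround(samples[i]));
        // Step 2: serialize in native byte order, as BitConverter.GetBytes does.
        std::memcpy(&bytes[i * sizeof(int16_t)], &s, sizeof s);
    }
    return bytes;
}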

X509_NAME - how to get the actual buffer?

I am adding openssl to my application. I can send and receive data, but so far it is not encrypted and I am not checking the certificates.
I get the server's certificate with:
X509 *gCert = NULL;
X509_NAME *gCertName = NULL;
gCert = SSL_get_peer_certificate(sslConnection.ssl);
gCertName = X509_NAME_new();
gCertName = X509_get_subject_name(gCert);
certNameLen = strlen(gCertName);
memcpy(write_buffer, gCertName, certNameLen);
Then I write write_buffer to the SSL socket but what I receive in the other end is just gibberish.
How do I use X509_NAME? strlen does not work on it, it seems; I get a length of 4, which I think is the pointer size, not the buffer size...
And what encoding is X509_NAME? It does not seem to be UTF-8...
I know sending downstream works, if I just put 0xAA or 0XBB in the buffer it is received correctly on the other end.
You are fairly close. However, X509_get_subject_name returns an X509_NAME structure, which you need to convert into a human-readable string (or use 'as-is' for detailed and safer comparisons, since parsing/comparing plain ASCII is a bit risky: a clever adversary could put things like 'C=' in the actual text).
The easiest of the lot is
X509 *gCert = SSL_get_peer_certificate(sslConnection.ssl);
// if NULL - error out
char *buf = X509_NAME_oneline(X509_get_subject_name(gCert), 0, 0);
// if NULL - free the certificate and error out
printf("Client\t: %s\n", buf);
OPENSSL_free(buf);
X509_free(gCert);
which nets you a one-line approximation as a C \0-terminated string. Note that the _oneline functions are deprecated; see the man pages of OpenSSL for more elaborate ways to get a text rendition which is UTF-8 and multi-line safe (a sketch of one such approach follows below). But the above is fine for simple things like pure-ASCII western-charset domain names without alternatives/variants.
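For the non-deprecated route, here is a minimal sketch using X509_NAME_print_ex with a memory BIO; error handling is abbreviated, and cert, mem and data are my own names (sslConnection.ssl and write_buffer come from the question):

#include <string.h>
#include <openssl/bio.h>
#include <openssl/ssl.h>
#include <openssl/x509.h>

X509 *cert = SSL_get_peer_certificate(sslConnection.ssl);
// if NULL - error out
BIO *mem = BIO_new(BIO_s_mem());
// XN_FLAG_ONELINE escapes control characters and converts the name to UTF-8
if (X509_NAME_print_ex(mem, X509_get_subject_name(cert), 0, XN_FLAG_ONELINE) >= 0)
{
    char *data = NULL;
    long len = BIO_get_mem_data(mem, &data); // data is NOT NUL-terminated
    memcpy(write_buffer, data, len);         // send exactly len bytes downstream
}
BIO_free(mem);
X509_free(cert);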

How to read variable length data from an asynchronous tcp socket?

I'm using CocoaAsyncSocket for an iOS project. I'm trying to read VarInts through an asynchronous interface. The problem is that, unlike something such as a String where I can prefix a length, I don't know the length of a varint beforehand. It needs to be processed one byte at a time, but since each read operation is asynchronous, other read calls may have been queued in between.
I considered reading into a buffer then processing it, say reading 5 bytes (the max length for a varint-32), and pushing extra bytes back, but that may hang unnecessarily if the varint is only 4 bytes and I'm waiting for a 5th byte to be available.
How can I do this? Also, I cannot change the protocol on the other end, to use fixed size ints.
Here's a snippet of code as Josh requested
- (void)readByte:(void (^)(int8_t))onComplete {
    NSUInteger size = 1;
    // Take a unique tag for this read.
    int32_t tag = OSAtomicAdd32(1, &_nextTag);
    dispatch_async(self.dispatchQueue, ^{
        // Stash the completion handler under the tag so the delegate
        // callback can look it up when the read finishes.
        [self.onCompleteHandlers setObject:(^void (NSData* data) {
            int8_t x = 0;
            [data getBytes:&x length:size];
            onComplete(x);
        }) forKey:[NSNumber numberWithInteger:((NSInteger) tag)]];
        [self.socket readDataToLength:size withTimeout:-1 tag:tag];
    });
}
A callback is saved in a dictionary, which is used in the delegate method socket:didReadData:withTag:.
Suppose I'm reading a VarInt byte by byte:
execute read first byte for varint
don't know if we need to read another byte for a varint or not; that depends on the result of the first read
(possible) read another byte for something else
read second byte for varint, but now it's actually the 3rd byte being read
I can imagine using a flag to indicate whether or not I'm in a multipart-read, and a queue to hold reads that should be executed after the multipart-read, and I've started writing it but it's quite messy. Just wondering if there is a standard/recommended/better way to approach this problem.
in short there are 4 ways to know how much to read from a socket...
1. read some format that you can infer the length from, like the Content-Length header... this only works if the whole request can be put together before the body is sent.
2. read until some pattern: like \r\n\r\n at the end of the headers.
3. read until some timeout... after you get no bytes for n seconds, you flush the buffers and close the connection.
4. read until the server closes the connection... this actually used to be pretty common.
these each have problems, and in your case I would probably lean toward borrowing some existing protocol.
of course there is overhead to doing it that way, and you may find that you don't want to use any of that application level stuff and your requests may be like:
client>"doMath(2+5)\0"
server>"(7)\0"
but it is hard to answer your general question specifically.
edit:
So I looked into the varint base-128 issue a little more, and I think really only a timeout or the server closing the connection will work if you are writing these right at the TCP level, which is horrible...
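That said, the byte-at-a-time reads the question already performs can be packaged as a small state machine that keeps the partial value between read callbacks, which is essentially the 'flag' the asker describes. A minimal C++ sketch (class and method names are mine; Objective-C++ can use it directly):

#include <cstdint>
#include <optional>

class VarintDecoder
{
    uint32_t value_ = 0;
    int shift_ = 0;

public:
    // Feed one byte as it arrives from the socket. Returns the decoded
    // value once the final byte (high bit clear) is seen, or std::nullopt
    // when another single-byte read must be issued first.
    // (Overflow checks for malformed input are omitted in this sketch.)
    std::optional<uint32_t> Feed(uint8_t b)
    {
        value_ |= static_cast<uint32_t>(b & 0x7F) << shift_;
        shift_ += 7;
        if (b & 0x80)
            return std::nullopt; // continuation bit set: more bytes follow
        uint32_t result = value_;
        value_ = 0;              // reset so the decoder can be reused
        shift_ = 0;
        return result;
    }
};

Each time a one-byte read completes, call Feed; while it returns std::nullopt, queue exactly one more one-byte read before releasing any other queued reads. That serializes the multipart read the question is worried about.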