Is there a way to create an sjcl.js bignum (bn) from an integer string?

sjcl.js provides codecs for reading in hex strings and utf8Strings, but it doesn't provide a codec for reading in base-10 integer strings. I'm trying to read in private keys for ECC point generation that come from another program (going from codec to bitArray, then from bitArray to bn). This seems like an odd oversight, which leads me to believe it's in there somewhere and I'm just missing it.
Thanks,
PeterT
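If I read the sjcl source right, sjcl.bn's constructor accepts hex strings, so one workaround is to convert the base-10 string to hex first; another is to build the bn digit by digit, bn = bn.mul(10).add(digit). Here is a sketch of that digit-by-digit base conversion in C, producing a big-endian byte array you could then load as a bitArray (the function name and buffer sizes are illustrative, not part of sjcl):

```c
#include <string.h>

/* Parse a base-10 integer string into a big-endian byte array by the
 * classic multiply-by-10-and-add loop -- the same loop one could run on
 * top of sjcl.bn's mul/add methods. Returns 0 on success, -1 on a bad
 * digit or overflow of the output buffer. */
int parse_decimal(const char *dec, unsigned char *out, int cap) {
    memset(out, 0, (size_t)cap);
    for (const char *p = dec; *p; ++p) {
        int carry = *p - '0';
        if (carry < 0 || carry > 9) return -1;   /* not a decimal digit */
        for (int i = cap - 1; i >= 0; --i) {     /* out = out * 10 + digit */
            int v = out[i] * 10 + carry;
            out[i] = (unsigned char)(v & 0xFF);
            carry = v >> 8;
        }
        if (carry) return -1;                    /* number too big for cap */
    }
    return 0;
}

/* Usage: parse_decimal("123456789", buf, 8) leaves buf ending in
 * 0x07 0x5B 0xCD 0x15, since 123456789 == 0x075BCD15. */
```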

Related

Incorrect result of sha256 of a string in a Everscale Solidity smart contract

I would like to compute the sha256 of a string in a Free TON-Solidity contract. I do it by storing the string in a TvmBuilder and then passing it as a TvmSlice to sha256(), but the result is not correct (it doesn't match the one computed by sha256sum in my shell). Any idea why?
Is the TvmBuilder adding some bits that are passed in the slice ?
Yes, as far as I understand, TvmBuilder is a kind of TL-B scheme serializer.
The sha256() function in the Free TON Solidity API only takes a TvmBuilder as input; you can't compute the hash of a raw string directly.
Hashing an arbitrary string means hashing its BOC (bag of cells), because a BOC is the only structure the TVM can understand.
I think you may want to build a BOC out of this string. The builder builds cells, and a cell's layout is made of slices plus refs; this results in a tree structure of slices mixed with refs, which resolve in the blockchain state.
Your approach should work for small strings just as it works for the whole blockchain state; that's the only way the TVM understands data.
So the hash of a string is the hash of a cell, which carries proofs for its underlying cells.
That's how I understand it now; I hope that helps.
And if your string is shorter than 127 bytes, you can pass bytes instead and hash the bytes packed into a single cell.
The Telegram group #freeton_smartcontracts is where clever smart-contract folks can clarify this; I'm a self-learner, not really a hands-on smart-contract pro.
https://github.com/move-ton/ton-client-py/blob/b06b333e6f5582aa1888121cca80424b614e092c/tonclient/abi.py#L49
Maybe this, or the Rust core SDK, will help you.

WebRtcNs_Process input buffer changed from int16* to float*

I have been using an earlier version of the webrtc code. Today, I fetched the latest code and it broke my build :-(. It appears that WebRtcNs_Process now takes a "float" type buffer instead of an "int16" type buffer. There may be a good reason to do so. However, this also seems to have broken the chain of operations.
Typically, you first call WebRtcNs_Process and feed the output of this method to WebRtcAecm_Process. In the latest version, the output of WebRtcNs_Process is a float buffer, but the input to WebRtcAecm_Process is an int16 buffer. It seems I now have to write extra code to convert the float buffer to an int16 buffer.
Also, on most platforms, the output from the mic is an int16 buffer. There is additional code I have to write to convert these int16 values to float so that I can pass them to WebRtcNs_Process.
I am wondering if I missed something. Regards.
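For what it's worth, the widening and narrowing shims are tiny. A minimal sketch, assuming the float path keeps the same ±32768 sample scale as the old int16 path (worth verifying against the current source), with saturation on the way back down:

```c
#include <stddef.h>
#include <stdint.h>

/* Widen int16 PCM samples to float for the float-based WebRtcNs_Process. */
void s16_to_float(const int16_t *in, float *out, size_t n) {
    for (size_t i = 0; i < n; ++i)
        out[i] = (float)in[i];
}

/* Narrow float samples back to int16 (e.g. before WebRtcAecm_Process),
 * clamping so out-of-range samples saturate instead of wrapping. */
void float_to_s16(const float *in, int16_t *out, size_t n) {
    for (size_t i = 0; i < n; ++i) {
        float v = in[i];
        if (v > 32767.0f)  v = 32767.0f;
        if (v < -32768.0f) v = -32768.0f;
        out[i] = (int16_t)v;
    }
}
```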

How to implement a CRC algorithm to check data integrity in a .NET WCF application

I am trying to implement a 32-bit CRC algorithm with an initial seed of 0x0 to check an XML string's integrity in a .NET WCF application.
I am returning an XML string and the fixed-length hexadecimal string generated from the value of the CRC. How can I implement a CRC algorithm for an XML string and convert the CRC to a hexadecimal string?
zlib provides a crc32() function for this purpose. A little bit of googling will turn up many other implementations in source code.
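For reference, the bitwise form of the CRC-32 that zlib's crc32() computes (reflected polynomial 0xEDB88320) fits in a few lines of C, and the same loop ports directly to C#. Note that this standard CRC-32 pre-inverts the register and post-inverts the result; calling zlib's crc32() with an initial crc of 0 corresponds to exactly this conditioning. The function names below are my own:

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Bitwise CRC-32 (IEEE 802.3), matching what zlib's crc32() computes. */
uint32_t crc32_ieee(const unsigned char *data, size_t n) {
    uint32_t crc = 0xFFFFFFFFu;              /* pre-conditioning */
    for (size_t i = 0; i < n; ++i) {
        crc ^= data[i];
        for (int b = 0; b < 8; ++b)          /* one bit at a time, LSB first */
            crc = (crc & 1u) ? (crc >> 1) ^ 0xEDB88320u : (crc >> 1);
    }
    return ~crc;                              /* post-conditioning */
}

/* Render the CRC as a fixed-length 8-digit uppercase hex string. */
void crc32_hex(uint32_t crc, char out[9]) {
    snprintf(out, 9, "%08X", crc);
}
```

The standard check value is crc32_ieee over the bytes of "123456789", which yields 0xCBF43926; in C# the equivalent of the formatting step is crc.ToString("X8").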

Converting in Smalltalk VisualWorks 7.9.1

I need to convert a ByteString to a Float32 (Exactly a 32-bit big-endian IEEE 754 floating point number). The ByteString is a part of an open sound control stream, received via UDP client.
I've spent a lot of time researching, so I'd be grateful if someone handy with Smalltalk could give me a solution.
Thanks in advance.
Since you seem to be receiving binary data, and not a decimal number in formatted ASCII, I would not recommend calling it a ByteString, but rather a ByteArray; Strings are an abstraction for containing characters, not bits.
In the case of VisualWorks, there is a class called UninterpretedBytes specialized in storing raw data (bits, or rather bytes) for later interpretation.
This class has all the messages you need to interpret the bytes, for example #floatAt:bigEndian:
| yourBinaryStream buffer nextFloat |
yourBinaryStream := ... insert some code to create your stream here...
buffer := UninterpretedBytes from: (yourBinaryStream next: 4).
nextFloat := buffer floatAt: 1 bigEndian: true
In Pharo Smalltalk you can do:
(Float readFrom: '4.2') asIEEE32BitWord
readFrom: just reads a float from a string, and then you convert it to IEEE 754...
In VisualWorks you need to use the superclass method readFrom: as implemented in class Number.
First create a readstream on the string, for example:
Number readFrom: '192843.887' readStream
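For comparison outside Smalltalk, here is what #floatAt:bigEndian: is doing under the hood, as a C sketch (assumes the host uses IEEE 754 floats, which virtually all do; the function name is illustrative):

```c
#include <stdint.h>
#include <string.h>

/* Reassemble four big-endian bytes into a 32-bit IEEE 754 float,
 * e.g. the payload of an OSC float argument. */
float float_from_be(const unsigned char b[4]) {
    uint32_t bits = ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16)
                  | ((uint32_t)b[2] << 8)  |  (uint32_t)b[3];
    float f;
    memcpy(&f, &bits, sizeof f);   /* bit-for-bit reinterpretation */
    return f;
}

/* Usage: the bytes {0x40, 0x49, 0x0F, 0xDB} decode to the float
 * closest to pi, and {0x3F, 0x80, 0x00, 0x00} decode to 1.0f. */
```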

Converting large byte array to numeric string using Objective C

I am facing an issue in which I need to convert a byte array, roughly 20 to 28 bytes in size, into a printable number string. In Java I can do this easily using the BigInteger class. In Objective-C, the biggest type is long long int, which is only 64 bits. Please guide me on an approach to solve the problem. I am not good with the C language, but some examples would be highly helpful. I must admit, I would have the same problem in Java if the BigInteger class were not there.
Check this source:
http://www.santsys.com/code/display?folder=Objective-C&file=BigInt.h
EDIT:
Check the BigInt class, a big integer implementation in Objective-C, so you do not need to drop down to plain C to solve your problem; its methods are similar to those of Java's BigInteger.
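The core of any such BigInteger-style toString is long division of the byte array by 10, collecting one decimal digit per pass. A self-contained C sketch, usable verbatim from Objective-C (the function name and the 64-byte working limit are my own):

```c
#include <string.h>

/* Convert a big-endian byte array (up to 64 bytes) into a base-10
 * string by repeated long division by 10 -- the same idea behind
 * Java's BigInteger.toString(). */
void bytes_to_decimal(const unsigned char *in, size_t n,
                      char *out, size_t cap) {
    unsigned char work[64];
    char rev[160];                 /* digits, least significant first */
    size_t len = 0;
    if (n > sizeof work || cap == 0) { if (cap) out[0] = '\0'; return; }
    memcpy(work, in, n);
    int nonzero = 1;
    while (nonzero) {
        int rem = 0;
        nonzero = 0;
        for (size_t i = 0; i < n; ++i) {   /* divide the whole array by 10 */
            int v = rem * 256 + work[i];
            work[i] = (unsigned char)(v / 10);
            rem = v % 10;
            if (work[i]) nonzero = 1;
        }
        rev[len++] = (char)('0' + rem);    /* rem is the next digit */
    }
    size_t i;
    for (i = 0; i < len && i + 1 < cap; ++i)   /* reverse into out */
        out[i] = rev[len - 1 - i];
    out[i] = '\0';
}
```

For 20 to 28 input bytes the result is at most about 68 digits, well inside the buffers above; wrapping the result in an NSString is then just stringWithUTF8String:.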