My application runs on Windows CE 6.0 using the Compact Framework and issues remote commands to a device over RS-232. The commands are sent as bytes with specific hex values, e.g. 0x22 0x28 0x00 0x01 as a command sequence, one byte at a time. The hex values for each command sequence are stored internally in a string, e.g. "22,28,00,01". I'm sending the bytes with the following code:
Dim i As Integer
Dim SendString() As String
Dim SendByte As String
DutCommand = "22,0A,00,02,E7,83" 'Sample command string
SendString = Split(DutCommand, ",") 'Split the string into hex pairs
For i = 0 To UBound(SendString) 'Send each byte after encoding
    SendByte = Chr(CInt("&H" & SendString(i)))
    CommPort.Write(SendByte)
Next
SendByte appears to be encoded properly even for values greater than 0x7F, but the last two bytes of the sequence (0xE7 and 0x83) are transmitted as 0x3F, the ASCII code for "?", because each is greater than 0x7F.
Am I missing a setting on the comm port to handle the encoding? Is there a simple way to send byte values greater than 0x7F?
You simply forgot to convert the hex values to bytes. It needs to look like this:
For i = 0 To UBound(SendString) 'Send each byte after encoding
    Dim b = Byte.Parse(SendString(i), Globalization.NumberStyles.HexNumber)
    CommPort.BaseStream.WriteByte(b)
Next
The non-stringy way is:
Dim DutCommand As Byte() = {&H22, &H0A, &H00, &H02, &HE7, &H83}
CommPort.Write(DutCommand, 0, DutCommand.Length)
I am assuming that you are using SerialPort.Write.
If so, notice what the documentation says:
By default, SerialPort uses ASCIIEncoding to encode the characters. ASCIIEncoding encodes all characters greater than 127 as (char)63 or '?'. To support additional characters in that range, set Encoding to UTF8Encoding, UTF32Encoding, or UnicodeEncoding.
Seems like the solution is pretty clear. You'll need to set the CommPort.Encoding property to the desired value.
See SerialPort.Encoding for more info.
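As a hedged sketch (assuming CommPort is a System.IO.Ports.SerialPort), one encoding that round-trips every byte value 0x00-0xFF unchanged is Latin-1, code page 28591:

```vb
' Latin-1 maps characters 0-255 one-to-one onto bytes 0-255,
' so values above 0x7F are no longer replaced by "?" (0x3F).
CommPort.Encoding = System.Text.Encoding.GetEncoding(28591)
CommPort.Write(ChrW(&HE7) & ChrW(&H83)) 'transmitted as 0xE7 0x83
```

Note that the UTF encodings the documentation mentions will not emit single bytes for characters above 0x7F, so for a raw binary protocol Latin-1 (or writing bytes directly) is usually the safer choice.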
As per the documentation for SerialPort.Write:
By default, SerialPort uses ASCIIEncoding to encode the characters. ASCIIEncoding encodes all characters greater than 127 as (char)63 or '?'. To support additional characters in that range, set Encoding to UTF8Encoding, UTF32Encoding, or UnicodeEncoding.
You could also consider using the Write overload that actually just writes the raw bytes.
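For illustration, a minimal sketch of that overload (the byte values are the sample command from the question, and CommPort is assumed to be a SerialPort):

```vb
Dim payload As Byte() = {&H22, &H28, &H0, &H1}
CommPort.Write(payload, 0, payload.Length) 'no text encoding involved
```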
Related
I am trying to convert a string to hex and then to Base64. The code runs, but the Base64 value I get is not matching the expected value (VS 2008 / .NET 3.5 and VS 2019 / .NET 4.6).
This is my code:
' Hex string
Dim QrCodeHex As String = "010c426f6273205265636f726473020f3331303132323339333530303030330314323032322d30342d32355431353a33303a30305a0407313030302e303005063135302e3030"
Dim QrCodeBase64En As String = System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(QrCodeHex))
I am getting this result, which is wrong:
MDEwYzQyNmY2MjczMjA1MjY1NjM2ZjcyNjQ3MzAyMGYzMzMxMzAzMTMyMzIzMzM5MzMzNTMwMzAzMDMwMzMwMzE0MzIzMDMyMzIyZDMwMzQyZDMyMzU1NDMxMzUzYTMzMzAzYTMwMzA1YTA0MDczMTMwMzAzMDJlMzAzMDA1MDYzMTM1MzAyZTMwMzA=
The correct result is:
AQxCb2JzIFJlY29yZHMCDzMxMDEyMjM5MzUwMDAwMwMUMjAyMi0wNC0yNVQxNTozMDowMFoEBzEwMDAuMDAFBjE1MC4wMA==
How do I get the correct result?
System.Text.Encoding.UTF8.GetBytes converts a (regular) string to a byte array. However, in your case, you don't have a regular string ("Bobs Records...") but a hexadecimal representation of the byte array ("010c426f62..."). So you need to convert that hex representation to a byte array first:
' Imports System.Numerics, System.Globalization, System.Linq
Dim QrCodeHex As String = "010c426f6273205265636f726473020f3331303132323339333530303030330314323032322d30342d32355431353a33303a30305a0407313030302e303005063135302e3030"
' Hex to bytes
Dim bytes As Byte() = BigInteger.
    Parse(QrCodeHex, NumberStyles.AllowHexSpecifier).
    ToByteArray().Reverse().ToArray()
Dim QrCodeBase64En As String = Convert.ToBase64String(bytes)
Console.WriteLine(QrCodeBase64En)
Note: I just used BigInteger for the conversion since it was the most compact way to do it without relying on .NET 5+ features. (The Reverse is required because ToByteArray outputs the number as little-endian.) See this question for alternatives: How can I convert a hex string to a byte array?
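If you'd rather not use BigInteger (which also has a quirk worth knowing: a hex string whose first digit is 8-F parses as a negative number), a plain loop works on any .NET version. A sketch, assuming an even-length hex string:

```vb
' Convert each pair of hex digits to one byte.
Dim hex As String = "010C426F"
Dim bytes((hex.Length \ 2) - 1) As Byte
For i As Integer = 0 To bytes.Length - 1
    bytes(i) = Convert.ToByte(hex.Substring(i * 2, 2), 16)
Next
' bytes now holds &H01, &H0C, &H42, &H6F
```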
I have a web service API in VB.NET that accepts strings, but I cannot control the data coming into this API. I sometimes receive characters in between words in this format (–, Á, •ï€, ââ€ï€, etc.). Is there a way for me to handle these, or convert these characters to their correct symbols, before saving to the database?
I know the best solution would be to go after the source where the characters get malformed, but I'll keep that as plan B.
My code already uses UTF-8 as the encoding, but what if a client that uses my API messes up and inadvertently sends a malformed character through the API? Can I clean that string and convert the malformed characters to the correct symbols?
If you only want to accept ASCII characters, you could remove non-ASCII characters by encoding and decoding the string - the default ASCII encoding uses "?" as a substitute for unrecognized characters, so you probably want to override that:
' Using System.Text
Dim input As String = "âh€eÁlâl€o¢wïo€râlâd€ï€"
Dim ascii As Encoding = Encoding.GetEncoding(
"us-ascii",
New EncoderReplacementFallback(" "),
New DecoderReplacementFallback(" ")
)
Dim bytes() As Byte = ascii.GetBytes(input)
Dim output As String = ascii.GetString(bytes)
Output:
h e l l o w o r l d
The replacement given to the En/DecoderReplacementFallback can be empty if you just want to drop the non-ASCII characters.
You could use a different encoding than ASCII if you want to accept more characters - but I would imagine that most of the characters you listed are valid in most European character sets.
Your question is a bit vague, but here is something I think you could potentially do:
Sub Main()
    Dim allowedValues = "abcdefghijklmnopqrstuvwxyz".ToCharArray()
    Dim someGoodSomeBad = "###$##okay##$##"
    Dim onlyGood = New String(someGoodSomeBad.ToCharArray().Where(Function(x) allowedValues.Contains(x)).ToArray)
    Console.WriteLine(onlyGood)
End Sub
The first line defines the valid characters; in my example I chose lowercase alphabetic characters, but you could add more letters, numbers, and symbols. Basically, you are creating a whitelist of acceptable characters that you, the developer, maintain.
The next line stands in for output from your API that contains some good and some bad characters.
The next part is simpler than it looks: I convert the string to an array of characters, keep only the characters that appear in my whitelist using a lambda, and convert the result back to an array so I can construct a new String from it.
I then get a "good" string, where "good" is defined by the whitelist.
The bigger question, though, is WHY your Web API is receiving garbled data in the first place. It should be receiving well-formed JSON or XML that can then be parsed and strongly typed into models. What I have shown above is more of a band-aid than a real fix for the underlying problem, and it will have MANY holes.
I have a function that generates a random Base64 String
Public Shared Function GenerateSalt() As String
    Dim rng As RNGCryptoServiceProvider = New RNGCryptoServiceProvider
    Dim buff(94) As Byte
    rng.GetBytes(buff)
    Return Convert.ToBase64String(buff)
End Function
This will always return a 128-character string. I then take that string, divide it into four substrings, and merge them back into one big string called MasterSalt, like so:
MasterSalt = (Salt.Substring(1,32)) + "©" + (Salt.Substring(32,32)) + "©" + etc...
I am doing this because I then put all of the pieces into an array with Split(MasterSalt, "©").
My concern is that I am not overly confident in the stability of using "©" as the delimiter that defines where the string should be split. I have to use something that will never appear in the randomly generated Base64 string, and I would like it to be something found on a standard keyboard if possible. So, to be clear, my question is: is there a character on a standard keyboard that will never appear in a randomly generated Base64 string?
Base64 uses 64 characters, each encoding 6 bits of the content at a time as values 0-63:
A-Z (0-25)
a-z (26-51)
0-9 (52-61)
+ (62)
/ (63)
...and it uses = as filler at the end if required.
Any other character is available for you to use as a delimiter, for example space, period, or minus.
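As a quick sketch, using "-" as the delimiter:

```vb
' "-" never appears in Base64 output, so Split recovers the parts exactly.
Dim salt As String = Convert.ToBase64String(New Byte() {1, 2, 3}) '"AQID"
Dim masterSalt As String = salt & "-" & salt
Dim parts() As String = masterSalt.Split("-"c)
'parts(0) and parts(1) are both "AQID"
```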
I'm basically trying to achieve this: how to get the peers from a torrent tracker.
I'm stuck here:
Not only that, you have to send the actual value of the hash as a GET parameter. "76a36f1d11c72eb5663eeb4cf31e351321efa3a3" is a hexadecimal representation of the hash, but the tracker protocol specifies that you need to send the value of the hash itself (a byte string). So you have to first decode the hexadecimal representation and then URL-encode it, e.g. in Python:
urllib.urlencode([('info_hash', '76a36f1d11c72eb5663eeb4cf31e351321efa3a3'.decode('hex'))]) == 'info_hash=v%A3o%1D%11%C7.%B5f%3E%EBL%F3%1E5%13%21%EF%A3%A3'
I have researched quite a lot, but due to my limited coding skills I can't manage to do the same thing in VB.NET. Could anyone please enlighten me?
I need to do the same thing :
Conversion from hexadecimal representation to the bytestring value of the hash.
Thanks in advance
I was surprised there was not an easier way to turn a hex string into a byte array, but I didn't find one quickly, so here is the hard way:
Dim hex As String = "76a36f1d11c72eb5663eeb4cf31e351321efa3a3"

'Prepend a leading zero if needed
If hex.Length Mod 2 <> 0 Then
    hex = "0" & hex
End If

Dim bytes As Byte() = New Byte((hex.Length \ 2) - 1) {}
For byteNum As Int32 = 0 To bytes.Length - 1
    bytes(byteNum) = Byte.Parse(hex.Substring(byteNum * 2, 2),
                                Globalization.NumberStyles.AllowHexSpecifier)
Next

'Convert to an ANSI string and escape
Dim final As String = Uri.EscapeDataString(
    System.Text.Encoding.Default.GetString(bytes))
I've been trying to trace down a bug for hours now and it has come down to this:
Dim length As Integer = 300
Dim buffer() As Byte = binaryReader.ReadBytes(length)
Dim text As String = System.Text.Encoding.UTF8.GetString(buffer, 0, buffer.Length)
The problem is that the buffer contains 300 bytes, but the length of the string 'text' is now 285. When I convert it back to bytes, the length is 521 bytes... What is going on?
The same code in a normal WinForms app works perfectly. The data being read by the BinaryReader is a UTF-8 encoded string. Any ideas why Silverlight is playing funny buggers?
I bet your stream contains some characters that require more than one byte. UTF8 uses a single byte when possible, but uses more bytes when the character is outside the ASCII range.
This explains why your buffer is longer than the string (300 vs 285).
Example:
string: "testä" (length = 5; the last character takes 2 bytes)
bytes: 0x74 | 0x65 | 0x73 | 0x74 | 0xc3 0xa4 (length = 6)
As to why it becomes even longer when you convert the text back to bytes, my best guess (also looking at the 521 size you get) is that you are using Encoding.Unicode instead of Encoding.UTF8 to perform the conversion. Unicode always uses two bytes for each character.
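A small sketch illustrating both effects (the string is an arbitrary example):

```vb
Dim text As String = "testä" '5 characters
Dim utf8Len As Integer = System.Text.Encoding.UTF8.GetByteCount(text) '6 bytes
Dim utf16Len As Integer = System.Text.Encoding.Unicode.GetByteCount(text) '10 bytes
' Decoding 300 UTF-8 bytes that contain multi-byte sequences can
' therefore yield only 285 characters, and re-encoding those 285
' characters with a different encoding changes the byte count again.
```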
(By the way, this obviously has nothing to do with Silverlight. You are probably testing the code with two different strings in WinForms vs. Silverlight. No worries, we've all made mistakes like that. :-))