How to convert a handle string to a std::string - c++-cli

I am trying to convert a handle string to a normal string. I thought the method I was using was working, but when I look in the debugger it appears that half of my string has been chopped off on the line that creates the chars variable. Any idea why, and what the proper way to convert a handle string to a normal string would be?
std::string convert(String^ s) {
    const char* chars = (const char*)(System::Runtime::InteropServices::Marshal::
        StringToHGlobalAnsi(s)).ToPointer();
    string myNewString = std::string(chars);
    return myNewString;
}

It's probably the debugger that's cutting off the display of the string. You didn't mention how long the string is, but the debugger can't display arbitrarily long values, so it truncates the display at some point.
To verify this, try printing myNewString to the console, or to the debugger via Debug::WriteLine or OutputDebugString.
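For example, a quick check outside the watch window might look like this; a minimal sketch, assuming the convert function from the question is in scope (the Windows header is only needed for OutputDebugStringA):
#include <iostream>
#include <string>
#include <windows.h>   // OutputDebugStringA

// Print the converted string somewhere that is not truncated the way the
// debugger's variable view can be.
void dumpConverted(System::String^ s) {
    std::string myNewString = convert(s);
    std::cout << myNewString << std::endl;            // full string on the console
    OutputDebugStringA(myNewString.c_str());          // full string in the Output window
    System::Diagnostics::Debug::WriteLine(gcnew System::String(myNewString.c_str()));
}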
However, there is a significant issue in your code: After allocating memory with StringToHGlobalAnsi, you must free it using FreeHGlobal.
If you want to continue using StringToHGlobalAnsi, I'd fix it up like this:
std::string convert(String^ s) {
    // Assumes: using namespace System; using namespace System::Runtime::InteropServices;
    IntPtr ptr = Marshal::StringToHGlobalAnsi(s);
    std::string myNewString((const char*)ptr.ToPointer());
    Marshal::FreeHGlobal(ptr);
    return myNewString;
}
However, it's probably easier to use marshal_as from the marshaling library (include msclr/marshal_cppstd.h); it takes care of allocation and cleanup for you.
std::string output = marshal_as<std::string>(managedString);
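Put together, a minimal sketch of the marshal_as version (assuming the project is compiled with /clr):
#include <msclr/marshal_cppstd.h>
#include <string>

// marshal_as handles the temporary native buffer and frees it for you.
std::string convert(System::String^ managedString) {
    return msclr::interop::marshal_as<std::string>(managedString);
}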

Related

Correct way to convert array<unsigned char>^ to std::string

I want to know the correct way to convert a managed array<unsigned char>^ to an unmanaged std::string. What I do right now is this:
array<unsigned char>^ const content = GetArray();
auto enc = System::Text::Encoding::ASCII;
auto const source = enc->GetString(content);
std::string s = msclr::interop::marshal_as<std::string>(source);
Is there a way to marshal the content in one step to a std::string without converting to a String^?
I tried:
array<unsigned char>^ const content = GetArray();
std::string s = msclr::interop::marshal_as<std::string>(content);
but this gave me following errors:
Error C4996 'msclr::interop::error_reporting_helper<_To_Type,cli::array<unsigned char,1> ^,false>::marshal_as':
This conversion is not supported by the library or the header file needed for this conversion is not included.
Please refer to the documentation on 'How to: Extend the Marshaling Library' for adding your own marshaling method.
Error C2065 '_This_conversion_is_not_supported': undeclared identifier
If the array is of plain bytes, then it's already ASCII encoded (or whatever narrow characters you're using). Converting to a managed UTF-16 String^ is an unnecessary detour.
Just construct a narrow-character string directly from the byte array: pass a pointer to the first byte and the length.
array<unsigned char>^ const content = GetArray();
pin_ptr<unsigned char> contentPtr = &content[0];
// std::string takes char*, not unsigned char*, so cast the pinned pointer explicitly
unsigned char* raw = contentPtr;
std::string s(reinterpret_cast<const char*>(raw), content->Length);
I'm not at a compiler right now, there may be trivial syntax errors.
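One caveat worth adding, not part of the original answer: &content[0] throws on an empty array, so a length check keeps the conversion safe. A minimal sketch, with toNativeString as an illustrative name:
#include <string>

// Same pin_ptr approach, but tolerant of a null or empty array.
// GetArray() stands in for whatever produces the managed bytes, as in the question.
std::string toNativeString(array<unsigned char>^ content) {
    if (content == nullptr || content->Length == 0)
        return std::string();                         // &content[0] would throw here
    pin_ptr<unsigned char> pinned = &content[0];      // pin so the GC cannot move the data
    unsigned char* raw = pinned;
    return std::string(reinterpret_cast<const char*>(raw), content->Length);
}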

How can I import this DLL

I have a "CLock.dll" have some functions
For example: This is document for a function
__int16 __stdcall dv_get_auth_code(unsigned char* auth);
Function
To gain authorization code of setup card.
Parameters
auth:[out] Return authorization code, 6 characters.
Return
Succeed then return 0.
I need to call this DLL in my WinForms application. I tried:
[DllImport("CLock.dll",CharSet = CharSet.Ansi, CallingConvention = CallingConvention.StdCall)]
public static extern int dv_get_auth_code([Out]StringBuilder auth);
and in Main() my code is:
StringBuilder sb = new StringBuilder();
int result = dv_get_auth_code(sb);
But it's not working. What should I do? Thank you and have a nice day!
There are two mistakes in the code as presented. The return type is wrong, and no buffer is allocated.
The return type is a 16-bit type; in C# that is short:
[DllImport("Clock.dll")]
public static extern short dv_get_auth_code(StringBuilder auth);
Then to call the function you need to allocate a buffer. The documentation above says the code is 6 characters, so the capacity needs to hold at least that (a little headroom for a terminating null does not hurt):
StringBuilder sb = new StringBuilder(bufferLength);   // large enough for the 6-character code
short result = dv_get_auth_code(sb);
It is always wise for such an API to also take the length of the buffer as a parameter, so the function can make sure it does not overrun it.
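For illustration only, here is a hypothetical sketch of what such a native-side signature could look like; it is not the real CLock.dll implementation, and dv_get_auth_code_safe is an invented name:
#include <cstring>

// Hypothetical DLL-side function that also receives the buffer size,
// so it can refuse to write past the caller's buffer.
extern "C" __declspec(dllexport) __int16 __stdcall
dv_get_auth_code_safe(unsigned char* auth, int authSize) {
    const char code[] = "123456";                     // placeholder 6-character code
    if (auth == nullptr || authSize < (int)sizeof(code))
        return -1;                                    // report "buffer too small"
    std::memcpy(auth, code, sizeof(code));            // copies the terminating null too
    return 0;                                         // 0 means success, per the documentation
}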

C/C++ DLL: Converting a const uint8_t to a String

I haven't seen C++ code in more than 10 years, and now I need to develop a very small DLL that uses the Ping class (System::Net::NetworkInformation) to ping some remoteAddress.
The argument in which I receive the remoteAddress is a FREObject, which then needs to be transformed into a const uint8_t *. This part is mandatory and I can't change anything about it.
The problem I'm having is that I have to pass a String^ to the Ping class, not a const uint8_t *, and I have no clue how to convert my const uint8_t * to a String^. Do you have any ideas?
Next is part of my code:
// argv[ARG_IP_ADDRESS_ARGUMENT] holds the remoteAddress value.
uint32_t nativeCharArrayLength = 0;
const uint8_t * nativeCharArray = NULL;
FREResult status = FREGetObjectAsUTF8(argv[ARG_IP_ADDRESS_ARGUMENT], &nativeCharArrayLength, &nativeCharArray);
Basically, the FREGetObjectAsUTF8 function fills the nativeCharArray array with the value of argv[ARG_IP_ADDRESS_ARGUMENT] and returns the array's length in nativeCharArrayLength. Also, the string uses UTF-8 encoding and terminates with the null character.
My next problem would be to convert a String^ back to a const uint8_t *. If you can help with this as well I would really appreciate it.
As I said before, none of this is changeable, and I have no idea how to change nativeCharArray to a String^. Any advice will help.
PS: The purpose of this DLL is to use it as an ANE (AIR Native Extension) for my Adobe AIR app.
You'll need UTF8Encoding to convert the bytes to characters. It has overloads that take pointers; you'll want to take advantage of those. First count the number of characters in the converted string, then allocate an array to hold them, then turn that array into a System::String. Like this:
auto converter = gcnew System::Text::UTF8Encoding;
auto chars = converter->GetCharCount((Byte*)nativeCharArray, nativeCharArrayLength-1);
auto buffer = gcnew array<Char>(chars);
pin_ptr<Char> pbuffer = &buffer[0];
converter->GetChars((Byte*)nativeCharArray, nativeCharArrayLength-1, pbuffer, chars);
String^ result = gcnew String(buffer);
Note that the -1 on nativeCharArrayLength compensates for the zero terminator being included in the value.
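The question also asks about the reverse direction (String^ back to a const uint8_t *); the same UTF8Encoding class handles that via GetBytes. A minimal sketch, not part of the original answer, with toUtf8Bytes and example as illustrative names:
#include <cstdint>

// String^ -> UTF-8 bytes. GetBytes does not append a null terminator, so one
// extra zero element is reserved here to mirror the null-terminated input.
array<System::Byte>^ toUtf8Bytes(System::String^ s) {
    auto converter = gcnew System::Text::UTF8Encoding;
    array<System::Byte>^ utf8 = converter->GetBytes(s);
    array<System::Byte>^ result = gcnew array<System::Byte>(utf8->Length + 1);
    utf8->CopyTo(result, 0);                          // trailing element stays 0
    return result;
}

// Usage: pin the managed array while native code reads it; the raw pointer is
// only valid while 'pinned' is in scope.
void example(System::String^ s) {
    array<System::Byte>^ bytes = toUtf8Bytes(s);
    pin_ptr<System::Byte> pinned = &bytes[0];
    System::Byte* raw = pinned;
    const uint8_t* nativeBytes = reinterpret_cast<const uint8_t*>(raw);
    // ... pass nativeBytes to the native API here ...
}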

Rust - How to migrate '\uXXXX' to new bytes string

I'm wondering if there is a way to translate the following old Rust code:
bytes!("a\u2028t")
into the current language. It seems bytes! was deprecated in favour of b"", but I don't see a way to translate \u2028 into a byte string literal.
If you want a true byte string equivalent, you'll need to find the UTF-8 encoding of U+2028, e.g. via:
fn main() {
    for b in "\u{2028}".as_bytes().iter() { print!("\\x{:x}", *b) }
}
which prints \xe2\x80\xa8 (i.e. in pre-encoded form), so b"a\xe2\x80\xa8t" should work. The above also hints at another method: you can often use "a\u{2028}t".as_bytes(), although this will not work in static contexts.

A program that will let the user input a string and then output each letter on a new line

import java.util.Scanner;

public class Program3_5
{
    public static void main (String[] args)
    {
        Scanner scan = new Scanner(System.in);
        String input = new String();
        System.out.println("Please enter a string: ");
        input = scan.next();
        int length;
        length = input.length;
        input.substring();
        System.out.println(charAt(0));
        while (length)
        {
            System.out.println(charAt(0 + 1));
        }
    }
}
I am getting an error stating "cannot find symbol - variable length".
I have tried numerous things, yet I am having trouble getting it to work. I'm new to Java! Thanks in advance.
For example, if the user were to input: Hello There
the output would print each letter on a separate line.
String#length() is a method, not a field, of String. You need to call the method. In Java, methods are called (or "invoked") using parentheses. So, change
length = input.length;
// to
length = input.length();
Anticipating the next compile error you see:
while (length)
won't compile in Java because length is an int, but the condition part of a while must be a boolean. I'm guessing you want to continue as long as the string is not empty, so change the while condition to be
while (length > 0)
Other problems you'll need to solve to get your code to compile:
String#substring() requires integer arguments, and charAt must be called on the string (input.charAt(...)), not on its own.
Also, the code will compile with String input = new String(); but the explicit construction is completely unnecessary. In Java, you almost never need to new a string; use string literals instead.