I have C# code that encodes a string, and I am trying to write a corresponding routine in Objective-C.
The C# code is as follows:
// C# code
public static string Encode(Guid guid)
{
    string encoded = Convert.ToBase64String(guid.ToByteArray());
    encoded = encoded.Replace("/", "_").Replace("+", "-");
    return encoded.Substring(0, 22);
}
I have written this code in Objective-C:
- (NSString *)encode:(NSString *)inId
{
    NSString *uniqueId = inId;
    // convert user id into data
    NSData *userIdData = [uniqueId dataUsingEncoding:NSUTF16StringEncoding];
    // convert the userId's data into a base64-encoded string
    NSString *base64String = [Base64 encode:userIdData];
    //NSString *base64String = [userIdData encodeBase64ForData];
    NSString *encodedId = [[NSString alloc] initWithString:base64String];
    // replace "/" characters in the base64 string with "_"
    encodedId = [encodedId stringByReplacingOccurrencesOfString:@"/" withString:@"_"];
    // replace "+" characters in the base64 string with "-"
    encodedId = [encodedId stringByReplacingOccurrencesOfString:@"+" withString:@"-"];
    // take the first 22 characters
    encodedId = [encodedId substringToIndex:22];
    NSLog(@"Base 64 encoded = %@", encodedId);
    return encodedId;
}
I am calling this function from viewDidLoad:
NSString *encodedStr = [self encode:@"a8f9f344-d14e-4541-a8e7-0f5936e42254"]; // string to encode
NSLog(@"Encoded String %@", encodedStr);
This code is not giving me the correct result.
For example, for the string a8f9f344-d14e-4541-a8e7-0f5936e42254
it should give the result RPP5qE7RQUWo5w9ZNuQiVA.
Thanks.
Your problem is that guid.ToByteArray() and [uniqueId dataUsingEncoding:NSUTF16StringEncoding] do not do the same thing. As far as I can tell from the documentation, the former removes the hyphens and treats the rest as the hex ASCII representation of 16 bytes. The latter just turns each character into UTF-16 (actually, it is UTF-16 already) and puts it into an NSData.
You need to write some code in Objective-C to take an ASCII hex string and convert it into bytes.
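For illustration, here is a minimal sketch of that conversion (the method name guidBytesFromString: is my own, not a Foundation API). One subtlety: .NET's Guid.ToByteArray() stores the first three groups little-endian, so their bytes must be reversed after hex parsing to reproduce your expected output:
- (NSData *)guidBytesFromString:(NSString *)guidString
{
    // strip the hyphens, leaving 32 hex digits
    NSString *hex = [guidString stringByReplacingOccurrencesOfString:@"-" withString:@""];
    uint8_t bytes[16];
    for (int i = 0; i < 16; i++) {
        unsigned int value = 0;
        NSString *byteHex = [hex substringWithRange:NSMakeRange(i * 2, 2)];
        [[NSScanner scannerWithString:byteHex] scanHexInt:&value];
        bytes[i] = (uint8_t)value;
    }
    // reverse the 4-2-2 byte groups to match .NET's Guid.ToByteArray() layout
    uint8_t ordered[16] = {
        bytes[3], bytes[2], bytes[1], bytes[0],
        bytes[5], bytes[4],
        bytes[7], bytes[6],
        bytes[8], bytes[9], bytes[10], bytes[11],
        bytes[12], bytes[13], bytes[14], bytes[15]
    };
    return [NSData dataWithBytes:ordered length:16];
}
Base64-encoding the result of this method, then applying your replace and substringToIndex: steps, should give RPP5qE7RQUWo5w9ZNuQiVA for your example GUID.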
I'd like to know whether calling stringEncodingForData:encodingOptions:convertedString:usedLossyConversion: can return NSUTF16StringEncoding, NSUTF32StringEncoding, or any of their variants.
I'm asking because of this documentation note on cStringUsingEncoding::
Special Considerations
UTF-16 and UTF-32 are not considered to be C string encodings, and should not be used with this method—the results of passing NSUTF16StringEncoding, NSUTF32StringEncoding, or any of their variants are undefined.
So I understand that creating a C string with UTF-16 or UTF-32 is unsupported, but I'm not sure whether attempting string encoding detection with stringEncodingForData:encodingOptions:convertedString:usedLossyConversion: may return UTF-16 or UTF-32.
An example scenario (adapted from SSZipArchive.m) might be:
// name is a null-terminated C string built with `fread` from stdio.h:
char *name = (char *)malloc(size_name + 1);
size_t read = fread(name, 1, size_name + 1, file);
name[size_name] = '\0';
// dataName is the data object of name
NSData *dataName = [NSData dataWithBytes:(const void *)name length:sizeof(unsigned char) * size_name];
// stringName is the string object of dataName
NSString *stringName = nil;
NSStringEncoding encoding = [NSString stringEncodingForData:dataName encodingOptions:nil convertedString:&stringName usedLossyConversion:nil];
In the above code, can encoding be NSUTF16StringEncoding, NSUTF32StringEncoding or any of their variants?
Platforms: macOS 10.10+, iOS 8.0+, watchOS 2.0+, tvOS 9.0+.
Yes, if the string is encoded using one of those encodings. The notes about C strings are specific to C strings. An NSString is not a C string, and the method you're describing doesn't work on C strings; it works on arbitrary data that may be encoded in a wide variety of ways.
As an example:
#import <Foundation/Foundation.h>

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSData *data = [@"test" dataUsingEncoding:NSUTF16StringEncoding];
        NSStringEncoding encoding = [NSString stringEncodingForData:data
                                                    encodingOptions:nil
                                                    convertedString:nil
                                                usedLossyConversion:nil];
        NSLog(@"%lu == %lu", (unsigned long)encoding,
              (unsigned long)NSUTF16StringEncoding);
    }
    return 0;
}
// Output: 10 == 10
This said, in your specific example, if name is really what it says it is, "a null-terminated C string," then it could never be UTF-16, because C strings cannot be encoded in UTF-16. C strings are \0 terminated, and \0 is a very common character in UTF-16. Without seeing more code, however, I would not gamble on whether that comment is accurate.
If your real question here is "given an arbitrary C-string-safe encoding, is it possible that stringEncodingForData: will return a non-C-string-safe encoding," then the answer is "yes, it could, and it's definitely not promised that it won't, even if it doesn't today." If you need to prevent that, I recommend using NSStringEncodingDetectionSuggestedEncodingsKey and ...UseOnlySuggestedEncodingsKey to force it to be an encoding you can handle. (You could also use ...DisallowedEncodingsKey to prevent specific multi-byte encodings, but that wouldn't be as robust.)
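As a sketch of that option (assuming UTF-8 is the only encoding you can handle; adjust the suggested list to taste):
NSDictionary *options = @{
    NSStringEncodingDetectionSuggestedEncodingsKey : @[ @(NSUTF8StringEncoding) ],
    NSStringEncodingDetectionUseOnlySuggestedEncodingsKey : @YES
};
NSString *converted = nil;
NSStringEncoding encoding = [NSString stringEncodingForData:dataName
                                            encodingOptions:options
                                            convertedString:&converted
                                        usedLossyConversion:NULL];
// With the use-only key set, a result of 0 means nothing in your list fit.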
I have an NSData object. I need to convert its bytes to a string and send as JSON. description returns hex and is unreliable (according to various SO posters). So I'm looking at code like this:
NSUInteger len = [imageData length];
Byte *byteData = (Byte *)malloc(len);
[imageData getBytes:byteData length:len];
How do I then send byteData as JSON? I want to send the raw bytes.
CODE:
NSString *jsonBase64 = [imageData base64EncodedString];
NSLog(@"BASE 64 FINGERPRINT: %@", jsonBase64);
NSData *b64 = [NSData dataFromBase64String:jsonBase64];
NSLog(@"Equal: %d", [imageData isEqualToData:b64]);
NSLog(@"b64: %@", b64);
NSLog(@"original: %@", imageData);
NSString *decoded = [[NSString alloc] initWithData:b64 encoding:NSUTF8StringEncoding];
NSLog(@"decoded: %@", decoded);
I get values for everything except the last line, decoded.
That would indicate to me that the raw bytes are not valid UTF-8?
The reason the string is considered 'unreliable' in previous Stack Overflow posts is that those posts were creating strings from NSData objects whose bytes are not NULL-terminated:
NSString *jsonString = [NSString stringWithUTF8String:[nsDataObj bytes]];
// This is unreliable because [nsDataObj bytes] is not guaranteed to be NULL-terminated
Whereas the example below should give you your desired results, because it passes the length explicitly and therefore needs no terminator:
NSString *jsonString = [[NSString alloc] initWithBytes:[nsDataObj bytes] length:[nsDataObj length] encoding:NSUTF8StringEncoding];
You were on the right track and hopefully this is able to help you solve your current problem. Best of luck!
~ EDIT ~
Make sure you are creating your NSData object from an image like so:
NSData *imageData = UIImagePNGRepresentation(yourImage);
Have you tried using something like this:
@implementation NSData (Base64)
- (NSString *)base64EncodedString
{
    return [self base64EncodedStringWithWrapWidth:0];
}
This will turn your NSData into a base64 string, and on the other side you just need to decode it.
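As an aside: assuming you can target iOS 7 / OS X 10.9 or later, Foundation has base64 support built in, so a category like the one below isn't strictly needed (myData stands in for whatever NSData you want to send):
NSString *b64 = [myData base64EncodedStringWithOptions:0];
NSData *roundTrip = [[NSData alloc] initWithBase64EncodedString:b64 options:0];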
EDIT: @Lucas said you can do something like this:
NSString *myString = [[NSString alloc] initWithData:myData encoding:NSUTF8StringEncoding];
but I had some problems with this method because of special characters, and that is why I started using base64 strings for communication.
EDIT 3: Try this base64EncodedString method:
@implementation NSData (Base64)

- (NSString *)base64EncodedString
{
    return [self base64EncodedStringWithWrapWidth:0];
}

//Helper Method
- (NSString *)base64EncodedStringWithWrapWidth:(NSUInteger)wrapWidth
{
    //ensure wrapWidth is a multiple of 4
    wrapWidth = (wrapWidth / 4) * 4;

    const char lookup[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

    long long inputLength = [self length];
    const unsigned char *inputBytes = [self bytes];

    long long maxOutputLength = (inputLength / 3 + 1) * 4;
    maxOutputLength += wrapWidth ? (maxOutputLength / wrapWidth) * 2 : 0;
    unsigned char *outputBytes = (unsigned char *)malloc((NSUInteger)maxOutputLength);

    long long i;
    long long outputLength = 0;
    for (i = 0; i < inputLength - 2; i += 3)
    {
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0xFC) >> 2];
        outputBytes[outputLength++] = lookup[((inputBytes[i] & 0x03) << 4) | ((inputBytes[i + 1] & 0xF0) >> 4)];
        outputBytes[outputLength++] = lookup[((inputBytes[i + 1] & 0x0F) << 2) | ((inputBytes[i + 2] & 0xC0) >> 6)];
        outputBytes[outputLength++] = lookup[inputBytes[i + 2] & 0x3F];

        //add line break
        if (wrapWidth && (outputLength + 2) % (wrapWidth + 2) == 0)
        {
            outputBytes[outputLength++] = '\r';
            outputBytes[outputLength++] = '\n';
        }
    }

    //handle left-over data
    if (i == inputLength - 2)
    {
        // = terminator
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0xFC) >> 2];
        outputBytes[outputLength++] = lookup[((inputBytes[i] & 0x03) << 4) | ((inputBytes[i + 1] & 0xF0) >> 4)];
        outputBytes[outputLength++] = lookup[(inputBytes[i + 1] & 0x0F) << 2];
        outputBytes[outputLength++] = '=';
    }
    else if (i == inputLength - 1)
    {
        // == terminator
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0xFC) >> 2];
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0x03) << 4];
        outputBytes[outputLength++] = '=';
        outputBytes[outputLength++] = '=';
    }

    if (outputLength >= 4)
    {
        //truncate data to match actual output length
        outputBytes = realloc(outputBytes, (NSUInteger)outputLength);
        return [[NSString alloc] initWithBytesNoCopy:outputBytes
                                              length:(NSUInteger)outputLength
                                            encoding:NSASCIIStringEncoding
                                        freeWhenDone:YES];
    }
    else if (outputBytes)
    {
        free(outputBytes);
    }
    return nil;
}

@end
Null termination is not the only problem when converting from NSData to NSString.
NSString is not designed to hold arbitrary binary data. It expects an encoding.
If your NSData contains an invalid UTF-8 sequence, initializing the NSString will fail.
The documentation isn't completely clear on this point, but for initWithData:encoding: it says:
Returns nil if the initialization fails for some reason (for example if data does not represent valid data for encoding).
Also: The JSON specification defines a string as a sequence of Unicode characters.
That means even if you're able to get your raw data into a JSON string, parsing could fail on the receiving end if the code performs UTF-8 validation.
If you don't want to use Base64, take a look at the answers here.
All code in this answer is pseudo-code fragments; you need to convert the algorithms into Objective-C or another language yourself.
Your question raises many questions... You start with:
I have an NSData object. I need to convert its bytes to a string and send as JSON. description returns hex and is unreliable (according to various SO posters).
This appears to suggest you wish to encode the bytes as a string, ready to decode them back to bytes at the other end. If this is the case you have a number of choices, such as Base-64 encoding etc. If you want something simple you can just encode each byte as its two-character hex value, pseudo-code outline:
NSMutableString *encodedString = @"".mutableCopy;
foreach aByte in byteData
    [encodedString appendFormat:@"%02x", aByte];
The format %02x means two hexadecimal digits with zero padding. This results in a string which can be sent as JSON and decoded easily at the other end. The size over the wire will probably be twice the byte length, as UTF-8 is the recommended encoding for JSON over the wire.
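For reference, a runnable Objective-C version of that pseudo-code might look like this (byteData here is stand-in sample data):
NSData *byteData = [@"example" dataUsingEncoding:NSUTF8StringEncoding]; // stand-in data
const unsigned char *bytes = byteData.bytes;
NSMutableString *encodedString = [NSMutableString stringWithCapacity:byteData.length * 2];
for (NSUInteger i = 0; i < byteData.length; i++) {
    [encodedString appendFormat:@"%02x", bytes[i]];
}
// encodedString is now @"6578616d706c65"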
However, in response to one of the answers you write:
But I need absolutely the raw bits.
What do you mean by this? Is your receiver going to interpret the JSON string it gets as a sequence of raw bytes? If so you have a number of problems to address. JSON strings are a subset of JavaScript strings and are stored as UCS-2 or UTF-16, that is, they are sequences of 16-bit values, not 8-bit values. If you encode each byte into a character in a string then it will be represented using 16 bits; if your receiver can access the byte stream, it has to skip every other byte. Of course, if your receiver accesses the string a character at a time, each 16-bit character can be truncated back to an 8-bit byte.

Now you might think that with this approach each 8-bit byte can just be output as a character as part of a string, but that won't work. While all values 1-255 are valid Unicode character code points, and JavaScript/JSON allows NULs (0 value) in strings, not all those values are printable; you cannot put a double quote " into a string without escaping it; and the escape character is \, which must itself be escaped. All these will need to be encoded into the string. You'd end up with something like:
NSMutableString *encodedString = @"".mutableCopy;
foreach aByte in byteData
    if (isprint(aByte) && aByte != '"' && aByte != '\\')
        [encodedString appendFormat:@"%c", aByte];
    otherwise
        [encodedString appendFormat:@"\\u00%02x", aByte]; // JSON unicode escape sequence
This will produce a string which, when parsed by a JSON decoder, will give you one character (16 bits) for each byte, the top 8 bits being zero. However, if you pass this string to a JSON encoder it will encode the unicode escape sequences, which are already encoded... So you really need to send this string over the wire yourself to avoid this.
Confused? Getting complicated? Well, why are you trying to send binary byte data as a string? You never say what your high-level goal is or what, if anything, is known about the byte data (e.g. does it represent characters in some encoding?).
If this is really just an array of bytes then why not send it as a JSON array of numbers? A byte is just a number in the range 0-255. To do this you would use code along the lines of:
NSMutableArray *encodedBytes = [NSMutableArray new];
foreach aByte in byteData
    [encodedBytes addObject:@(aByte)]; // add aByte as an NSNumber object
Now pass encodedBytes to NSJSONSerialization and it will send a JSON array of numbers over the wire; the receiver reverses the process, packing each byte back into a byte buffer, and you have your bytes back.
This method avoids all issues of valid strings, encodings and escapes.
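A concrete sketch of that approach (byteData again being stand-in sample data):
NSData *byteData = [@"example" dataUsingEncoding:NSUTF8StringEncoding]; // stand-in data
const unsigned char *bytes = byteData.bytes;
NSMutableArray *encodedBytes = [NSMutableArray arrayWithCapacity:byteData.length];
for (NSUInteger i = 0; i < byteData.length; i++) {
    [encodedBytes addObject:@(bytes[i])];
}
NSError *error = nil;
NSData *json = [NSJSONSerialization dataWithJSONObject:encodedBytes options:0 error:&error];
// json now contains [101,120,97,109,112,108,101], ready to send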
HTH
I have tried to implement an ASCII-to-character converter.
The code is
[NSString stringWithFormat:@"%c", ascii]
It works fine for ASCII values up to 127.
Above 127 it shows Apple's special characters.
From here I found that
There are several different variations of the 8-bit ASCII table.
I need
ISO 8859-1, also called ISO Latin-1
while converting. How can I convert these values to ISO 8859-1 rather than Apple's special characters?
[NSString stringWithCString:asciiString encoding:NSISOLatin1StringEncoding];
where asciiString is your ASCII character followed by a null byte.
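For example (a minimal sketch; 0xE9 is 'é' in ISO 8859-1):
char asciiString[2] = { (char)0xE9, '\0' };
NSString *s = [NSString stringWithCString:asciiString encoding:NSISOLatin1StringEncoding];
NSLog(@"%@", s); // prints é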
I ran into the same character encoding problem and wrote an encoding library. This library doesn't support ISO Latin-1, but I think you can adapt it using NSISOLatin1StringEncoding.
The internal code is below:
+ (NSString *)encodedStringWithContentsOfURL:(NSURL *)url
{
    // Get the web page HTML
    NSData *data = [NSData dataWithContentsOfURL:url];

    // candidate encodings, tried in order
    NSStringEncoding enc_arr[] = {
        NSUTF8StringEncoding,        // UTF-8
        NSShiftJISStringEncoding,    // Shift_JIS
        NSJapaneseEUCStringEncoding, // EUC-JP
        NSISO2022JPStringEncoding,   // JIS
        NSUnicodeStringEncoding,     // Unicode
        NSASCIIStringEncoding        // ASCII
    };
    NSString *data_str = nil;
    int max = sizeof(enc_arr) / sizeof(enc_arr[0]);
    for (int i = 0; i < max; i++) {
        data_str = [[NSString alloc] initWithData:data encoding:enc_arr[i]];
        if (data_str != nil) {
            break;
        }
    }
    return data_str;
}
You can download this library from https://github.com/weed/p120801_CharacterEncodingLibrary
I hope this helps you.
How do I convert a char to an NSString in Objective-C?
Not a null-terminated C string, just a simple char c = 'a'.
You can use stringWithFormat:, passing in a format of %c to represent a character, like this:
char c = 'a';
NSString *s = [NSString stringWithFormat:@"%c", c];
You can make a C-string out of one character like this:
char cs[2] = {c, 0}; //c is the character to convert
NSString *s = [[NSString alloc] initWithCString:cs encoding: SomeEncoding];
Alternatively, if the character is known to be an ASCII character (i. e. Latin letter, number, or a punctuation sign), here's another way:
unichar uc = (unichar)c; //Just extend to 16 bits
NSString *s = [NSString stringWithCharacters:&uc length:1];
The latter snippet will surely fail (not crash, but produce a wrong string) with national characters; for those, a simple extension to 16 bits is not a correct conversion to Unicode. That's why the encoding parameter is needed.
Also note that the two snippets above produce strings with different deallocation requirements: the latter makes an autoreleased string, the former makes a string that needs a release call.
I'm storing large unicode characters (0x10000+) as long types which eventually need to be converted to NSStrings. Smaller unicode characters can be created as a unichar, and an NSString can be created using
[NSString stringWithCharacters:(const unichar *)characters length:(NSUInteger)length]
So, I imagine the best way to get an NSString from the unicode long value would be to first get a unichar* from the long value. Any idea on how I might go about doing this?
Is there any reason you are storing the values as longs? For Unicode storage you only need to store the values as UInt32, which would then make it easy to interpret the data as UTF-32 by doing something like this:
int numberOfChars = 3;
UInt32 *yourStringBuffer = malloc(sizeof(UInt32) * numberOfChars);
yourStringBuffer[0] = 0x2F8DB; //杞
yourStringBuffer[1] = 0x2318;  //⌘
yourStringBuffer[2] = 0x263A;  //☺

NSData *stringData = [NSData dataWithBytes:yourStringBuffer length:sizeof(UInt32) * numberOfChars];

//set the encoding according to the current byte order
NSStringEncoding encoding;
if (CFByteOrderGetCurrent() == CFByteOrderBigEndian)
    encoding = NSUTF32BigEndianStringEncoding;
else
    encoding = NSUTF32LittleEndianStringEncoding;

NSString *string = [[NSString alloc] initWithData:stringData encoding:encoding];
free(yourStringBuffer);

NSLog(@"%@", string);
//output: 杞⌘☺
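If you only need a single code point above 0xFFFF, an alternative sketch (assuming a valid code point in the range 0x10000-0x10FFFF) is to build the UTF-16 surrogate pair yourself:
UInt32 cp = 0x2F8DB; // example code point
unichar pair[2];
pair[0] = (unichar)(0xD800 + ((cp - 0x10000) >> 10));   // high surrogate
pair[1] = (unichar)(0xDC00 + ((cp - 0x10000) & 0x3FF)); // low surrogate
NSString *s = [NSString stringWithCharacters:pair length:2];
NSLog(@"%@", s); // 杞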