Appending null characters to regular characters - objective-c

I want to concatenate null characters with regular characters like so:
NSString *message = @"ABC";
NSUInteger length = [message length];
char char4 = length;
char char3 = length >> 8;
char char2 = length >> 16;
char char1 = length >> 24;
message = [NSString stringWithFormat:@"%c%c%c%c%@", char4, char3, char2, char1, message];
The problem is that the string stays at length 4 (and looks like ¿ABC). How can I edit this code so that the null characters are also appended to the string?
In reply to Mark's comment
What I am trying to do here is prepend the length of the string to the beginning of the string, not in numerical format, but in the form of ASCII characters (almost like a base-256 number). I would use the format @"%04c%@", length, message; the problem is that the resulting string would be 000¿ABC, and the digit zero has an ASCII value (48 in decimal, to be exact), so that defeats the purpose. The ASCII character that has decimal value 0 is the null character (\0), so I have to use that instead of 0. It is necessary that I have those leading null characters.
For the purposes of what I'm trying to accomplish, the following code works
NSUInteger length = [message length];
char char4 = length;
char char3 = length >> 8;
char char2 = length >> 16;
char char1 = length >> 24;
if (char4 == '\0')
    message = [NSString stringWithFormat:@"\0%@", message];
else
    message = [NSString stringWithFormat:@"%c%@", char4, message];
if (char3 == '\0')
    message = [NSString stringWithFormat:@"\0%@", message];
else
    message = [NSString stringWithFormat:@"%c%@", char3, message];
if (char2 == '\0')
    message = [NSString stringWithFormat:@"\0%@", message];
else
    message = [NSString stringWithFormat:@"%c%@", char2, message];
if (char1 == '\0')
    message = [NSString stringWithFormat:@"\0%@", message];
else
    message = [NSString stringWithFormat:@"%c%@", char1, message];
But, if anybody can contribute something shorter or more intuitive, that'd be great.

Don't do this. It's a bad idea. For example, NSString is conceptually built on 16-bit Unicode characters, but you're trying to prepend bytes. Note that there's nothing guaranteeing that NSString's internal representation is UTF-16 or any other specific encoding. In any case, when the string is written out it has to be converted to whatever encoding and that is unlikely to preserve your length prefixing in a way you can predict.
Use an NSData with the length in the first 4 (or maybe 8 would be better for a 64-bit length) bytes followed by the string in a particular encoding. I recommend UTF-8.
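A minimal sketch of that approach (assuming a 4-byte big-endian length prefix; note it frames the byte count of the UTF-8 data, not the character count, which is what a receiver reading raw bytes needs):
NSData *payload = [message dataUsingEncoding:NSUTF8StringEncoding];
uint32_t prefix = CFSwapInt32HostToBig((uint32_t)[payload length]); // big-endian on the wire
NSMutableData *framed = [NSMutableData dataWithBytes:&prefix length:sizeof(prefix)];
[framed appendData:payload]; // 4 length bytes followed by the UTF-8 string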

Related

Converting an audio file to char and sending to another device to listen to

I'm new to Objective-C and I'm not managing to convert an audio file to char exactly the way I need. The conversion ignores sequences of zeros (0), which deforms the data structure.
My code is as follows:
-(NSString *) dataToHex:(NSData *)data {
    const unsigned char *dbytes = (unsigned char *)[data bytes];
    NSMutableString *hexStr = [NSMutableString stringWithCapacity:[data length]/**2*/];
    int i;
    for (i = 0; i < [data length]; i++) {
        [hexStr appendFormat:@"%x", dbytes[i]];
    }
    return [NSString stringWithString:hexStr];
}
Thank you very much.
If I understand your problem correctly, the issue is with your format %x: it produces a hexadecimal text representation using just enough digits to represent the value, without any leading zeroes.
For example, the value 32 will produce the text 20, while the value 12 produces c, only one character long.
If you wish to convert to hex representation and then back again, each of your byte values needs to produce the same number of characters; otherwise you can't know where the boundaries are between each byte's representation.
To do this you can use the format %02x, which means always produce two characters, padding with zeroes as required. For example, with this format 12 will produce 0c.
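Applied to the dataToHex: method above, the fix is a sketch like this (the *2 capacity hint is optional but matches the two output characters per byte):
-(NSString *) dataToHex:(NSData *)data {
    const unsigned char *dbytes = (unsigned char *)[data bytes];
    NSMutableString *hexStr = [NSMutableString stringWithCapacity:[data length] * 2];
    for (NSUInteger i = 0; i < [data length]; i++) {
        [hexStr appendFormat:@"%02x", dbytes[i]]; // always two digits, zero-padded
    }
    return [NSString stringWithString:hexStr];
}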
HTH

Convert NSData byte array to string?

I have an NSData object. I need to convert its bytes to a string and send as JSON. description returns hex and is unreliable (according to various SO posters). So I'm looking at code like this:
NSUInteger len = [imageData length];
Byte *byteData = (Byte*)malloc(len);
[imageData getBytes:byteData length:len];
How do I then send byteData as JSON? I want to send the raw bytes.
CODE:
NSString *jsonBase64 = [imageData base64EncodedString];
NSLog(@"BASE 64 FINGERPRINT: %@", jsonBase64);
NSData *b64 = [NSData dataFromBase64String:jsonBase64];
NSLog(@"Equal: %d", [imageData isEqualToData:b64]);
NSLog(@"b64: %@", b64);
NSLog(@"original: %@", imageData);
NSString *decoded = [[NSString alloc] initWithData:b64 encoding:NSUTF8StringEncoding];
NSLog(@"decoded: %@", decoded);
I get values for everything except the last line, decoded.
Which would indicate to me that the raw bytes are not in NSUTF8StringEncoding?
The reason the string is considered 'unreliable' in previous Stack Overflow posts is that they too were attempting to use NSData objects whose ending bytes aren't properly terminated with NULL:
NSString *jsonString = [NSString stringWithUTF8String:[nsDataObj bytes]];
// This is unreliable because it may result in NULL string values
Whereas the example below should give you your desired results because the NSData byte string will terminate correctly:
NSString *jsonString = [[NSString alloc] initWithBytes:[nsDataObj bytes] length:[nsDataObj length] encoding: NSUTF8StringEncoding];
You were on the right track and hopefully this is able to help you solve your current problem. Best of luck!
~ EDIT ~
Make sure you are declaring your NSData Object from an image like so:
NSData *imageData = UIImagePNGRepresentation(yourImage);
Have you tried using something like this:
@implementation NSData (Base64)
- (NSString *)base64EncodedString
{
    return [self base64EncodedStringWithWrapWidth:0];
}
This will turn your NSData into a base64 string, and on the other side you just need to decode it.
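(As an aside, and assuming you can target iOS 7 / OS X 10.9 or later: Foundation ships this built in, so no category is needed.)
NSString *jsonBase64 = [imageData base64EncodedStringWithOptions:0];
NSData *roundTripped = [[NSData alloc] initWithBase64EncodedString:jsonBase64 options:0];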
EDIT: @Lucas said you can do something like this:
NSString *myString = [[NSString alloc] initWithData:myData encoding:NSUTF8StringEncoding];
but I had some problems with this method because of some special characters, and because of that I started using base64 strings for communication.
EDIT 3: Try this base64EncodedString method:
@implementation NSData (Base64)
- (NSString *)base64EncodedString
{
    return [self base64EncodedStringWithWrapWidth:0];
}
//Helper Method
- (NSString *)base64EncodedStringWithWrapWidth:(NSUInteger)wrapWidth
{
    //ensure wrapWidth is a multiple of 4
    wrapWidth = (wrapWidth / 4) * 4;
    const char lookup[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    long long inputLength = [self length];
    const unsigned char *inputBytes = [self bytes];
    long long maxOutputLength = (inputLength / 3 + 1) * 4;
    maxOutputLength += wrapWidth ? (maxOutputLength / wrapWidth) * 2 : 0;
    unsigned char *outputBytes = (unsigned char *)malloc((NSUInteger)maxOutputLength);
    long long i;
    long long outputLength = 0;
    for (i = 0; i < inputLength - 2; i += 3)
    {
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0xFC) >> 2];
        outputBytes[outputLength++] = lookup[((inputBytes[i] & 0x03) << 4) | ((inputBytes[i + 1] & 0xF0) >> 4)];
        outputBytes[outputLength++] = lookup[((inputBytes[i + 1] & 0x0F) << 2) | ((inputBytes[i + 2] & 0xC0) >> 6)];
        outputBytes[outputLength++] = lookup[inputBytes[i + 2] & 0x3F];
        //add line break
        if (wrapWidth && (outputLength + 2) % (wrapWidth + 2) == 0)
        {
            outputBytes[outputLength++] = '\r';
            outputBytes[outputLength++] = '\n';
        }
    }
    //handle left-over data
    if (i == inputLength - 2)
    {
        // = terminator
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0xFC) >> 2];
        outputBytes[outputLength++] = lookup[((inputBytes[i] & 0x03) << 4) | ((inputBytes[i + 1] & 0xF0) >> 4)];
        outputBytes[outputLength++] = lookup[(inputBytes[i + 1] & 0x0F) << 2];
        outputBytes[outputLength++] = '=';
    }
    else if (i == inputLength - 1)
    {
        // == terminator
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0xFC) >> 2];
        outputBytes[outputLength++] = lookup[(inputBytes[i] & 0x03) << 4];
        outputBytes[outputLength++] = '=';
        outputBytes[outputLength++] = '=';
    }
    if (outputLength >= 4)
    {
        //truncate data to match actual output length
        outputBytes = realloc(outputBytes, (NSUInteger)outputLength);
        return [[NSString alloc] initWithBytesNoCopy:outputBytes
                                              length:(NSUInteger)outputLength
                                            encoding:NSASCIIStringEncoding
                                        freeWhenDone:YES];
    }
    else if (outputBytes)
    {
        free(outputBytes);
    }
    return nil;
}
@end
Null termination is not the only problem when converting from NSData to NSString.
NSString is not designed to hold arbitrary binary data. It expects an encoding.
If your NSData contains an invalid UTF-8 sequence, initializing the NSString will fail.
The documentation isn't completely clear on this point, but for initWithData it says:
Returns nil if the initialization fails for some reason (for example, if data does not represent valid data for encoding).
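Applied to the code in the question, a minimal sketch of the check (the nil test is the point):
NSString *decoded = [[NSString alloc] initWithData:imageData encoding:NSUTF8StringEncoding];
if (decoded == nil) {
    // imageData is not a valid UTF-8 byte sequence; fall back to base64 or similar
}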
Also: The JSON specification defines a string as a sequence of Unicode characters.
That means even if you're able to get your raw data into a JSON string, parsing could fail on the receiving end if the code performs UTF-8 validation.
If you don't want to use Base64, take a look at the answers here.
All code in this answer consists of pseudo-code fragments; you need to convert the algorithms into Objective-C or another language yourself.
Your question raises many questions... You start with:
I have an NSData object. I need to convert its bytes to a string and send as JSON. description returns hex and is unreliable (according to various SO posters).
This appears to suggest you wish to encode the bytes as a string, ready to decode them back to bytes the other end. If this is the case you have a number of choices, such as Base-64 encoding etc. If you want something simple you can just encode each byte as its two character hex value, pseudo code outline:
NSMutableString *encodedString = @"".mutableCopy;
foreach aByte in byteData
    [encodedString appendFormat:@"%02x", aByte];
The format %02x means two hexadecimal digits with zero padding. This results in a string which can be sent as JSON and decoded easily at the other end. The size over the wire will probably be twice the original byte length, as UTF-8 is the recommended encoding for JSON over the wire.
However in response to one of the answer you write:
But I need absolutely the raw bits.
What do you mean by this? Is your receiver going to interpret the JSON string it gets as a sequence of raw bytes? If so, you have a number of problems to address. JSON strings are a subset of JavaScript strings and are stored as UCS-2 or UTF-16; that is, they are sequences of 16-bit values, not 8-bit values. If you encode each byte as a character in a string then it will be represented using 16 bits, and if your receiver can access the byte stream it has to skip every other byte. Of course, if your receiver accesses the string a character at a time, each 16-bit character can be truncated back to an 8-bit byte.
Now you might think that if you take this approach then each 8-bit byte can just be output as a character as part of a string, but that won't work. While all values 1-255 are valid Unicode character code points, and JavaScript/JSON allows NULs (0 value) in strings, not all those values are printable; you cannot put a double quote " into a string without escaping it; and the escape character is \. All of these will need to be encoded into the string. You'd end up with something like:
NSMutableString *encodedString = @"".mutableCopy;
foreach aByte in byteData
    if (isprint(aByte) && aByte != '"' && aByte != '\\')
        [encodedString appendFormat:@"%c", aByte];
    otherwise
        [encodedString appendFormat:@"\\u00%02x", aByte]; // JSON unicode escape sequence
This will produce a string which, when parsed by a JSON decoder, will give you one character (16 bits) for each byte, the top 8 bits being zero. However, if you pass this string to a JSON encoder it will encode the Unicode escape sequences, which are already encoded... So you really need to send this string over the wire yourself to avoid this...
Confused? Is this getting complicated? Well, why are you trying to send binary byte data as a string? You never say what your high-level goal is or what, if anything, is known about the byte data (e.g. does it represent characters in some encoding?).
If this is really just an array of bytes then why not send it as a JSON array of numbers; a byte is just a number in the range 0-255. To do this you would use code along the lines of:
NSMutableArray *encodedBytes = [NSMutableArray new];
foreach aByte in byteData
    [encodedBytes addObject:@(aByte)]; // add aByte as an NSNumber object
Now pass encodedBytes to NSJSONSerialization and it will send a JSON array of numbers over the wire; the receiver reverses the process, packing each byte back into a byte buffer, and you have your bytes back.
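Rendered as real Objective-C, that outline might look like this (a sketch; byteData stands in for the NSData being sent):
const unsigned char *bytes = [byteData bytes];
NSMutableArray *encodedBytes = [NSMutableArray arrayWithCapacity:[byteData length]];
for (NSUInteger i = 0; i < [byteData length]; i++) {
    [encodedBytes addObject:@(bytes[i])]; // each byte becomes an NSNumber in 0-255
}
NSData *json = [NSJSONSerialization dataWithJSONObject:encodedBytes options:0 error:NULL];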
This method avoids all issues of valid strings, encodings and escapes.
HTH

In Objective-C, how to print out N spaces? (using stringWithCharacters)

The following attempts to print out N spaces (12 in the example):
NSLog(#"hello%#world", [NSString stringWithCharacters:" " length:12]);
const unichar arrayChars[] = {' '};
NSLog(#"hello%#world", [NSString stringWithCharacters:arrayChars length:12]);
const unichar oneChar = ' ';
NSLog(#"hello%#world", [NSString stringWithCharacters:&oneChar length:12]);
But they all print out weird things such as hello ÔÅÓñüÔÅ®Óñü®ÓüÅ®ÓñüÔ®ÓüÔÅ®world... I thought a "char array" was the same as a "string", and the same as a "pointer to a character"? The API spec says it takes a "C array of Unicode characters" (by Unicode, does it mean UTF-8? If so, it should be compatible with ASCII)... How do I make this work, and why don't those 3 ways work?
You can use %*s to specify the width.
NSLog(#"Hello%*sWorld", 12, "");
Reference:
A field width, or precision, or both, may be indicated by an asterisk
( '*' ). In this case an argument of type int supplies the field width
or precision. Applications shall ensure that arguments specifying
field width, or precision, or both appear in that order before the
argument, if any, to be converted.
This will get you what you want:
NSLog(#"hello%#world", [#"" stringByPaddingToLength:12 withString:#" " startingAtIndex:0]);
I think the issue you have is you are misinterpreting what +(NSString *)stringWithCharacters:length: is supposed to do. It's not supposed to repeat the characters, but instead copy them from the array into a string.
So in your case you only have a single ' ' in the array, meaning the other 11 characters will be taken from whatever follows arrayChars in memory.
If you want to print out a pattern of n spaces, the easiest way to do that would be to use -(NSString *)stringByPaddingToLength:withString:startingAtIndex:, i.e. creating something like this.
NSString *formatString = @"Hello%@World";
NSString *paddingString = [[NSString string] stringByPaddingToLength:n withString:@" " startingAtIndex:0];
NSLog(formatString, paddingString);
This is probably the fastest method:
NSString *spacesWithLength(int nSpaces)
{
    char UTF8Arr[nSpaces + 1];
    memset(UTF8Arr, ' ', nSpaces * sizeof(*UTF8Arr));
    UTF8Arr[nSpaces] = '\0';
    return [NSString stringWithUTF8String:UTF8Arr];
}
The reason your current code isn't working is that +stringWithCharacters: expects an array containing (at least) 12 characters, while your array is only 1 character long ({' '}). So, to fix it, you must create a real buffer for your array (in this case we use a char array, not a unichar array, because we can easily memset a char array, but not a unichar array).
The method I provided above is probably the fastest that is possible with a dynamic length. If you are willing to use GCC extensions, and you have a fixed size array of spaces you need, you can do this:
NSString *spacesWithLength7()
{
    unichar characters[] = { [0 ... 7] = ' ' };
    return [NSString stringWithCharacters:characters length:7];
}
Unfortunately, that extension doesn't work with variables, so it must be a constant.
Through the magic of GCC extensions and preprocessor macros, I give you.... THE REPEATENATOR! Simply pass in a string (or a char), and it will do the rest! Buy now, costs you only $19.95, operators are standing by! (Based on the idea suggested by @JeremyL)
// step 1: determine if char is a char or string, or NSString.
// step 2: repeat that char or string
// step 3: return that as a NSString
#define repeat(inp, cnt) __rep_func__(@encode(typeof(inp)), inp, cnt)
// arg list: (char *typ, int / char * / NSString *input, int n)
static inline NSString *__rep_func__(char *typ, ...)
{
    const char *str = NULL;
    char charBuf[2] = {'\0', '\0'}; // keeps a single-char string alive for the whole function
    int n;
    va_list args;
    va_start(args, typ);
    if (typ[0] == 'i') { // a char literal is promoted to int by varargs
        charBuf[0] = (char)va_arg(args, int);
        str = charBuf;
    } else if (typ[0] == '@') { // an NSString
        str = [va_arg(args, id) UTF8String];
    } else { // a plain C string
        str = va_arg(args, const char *);
    }
    n = va_arg(args, int);
    va_end(args);
    int len = (int)strlen(str);
    char outbuf[(len * n) + 1];
    // now copy the content
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < len; j++) {
            outbuf[(i * len) + j] = str[j];
        }
    }
    outbuf[(len * n)] = '\0';
    return [NSString stringWithUTF8String:outbuf];
}
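Hypothetical usage (assuming the macro and function above are in scope):
NSLog(@"%@", repeat(' ', 12)); // twelve spaces
NSLog(@"%@", repeat("ab", 3)); // ababab
NSLog(@"%@", repeat(@"-", 5)); // -----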
The stringWithCharacters:length: method makes an NSString (or an instance of a subclass of NSString) using the first length characters in the C array. It does not cycle through the given array of characters repeating it until it reaches the requested length.
The output you are seeing is the area of memory 12 Unicode characters long starting at the location of the single-character array you passed.
This should work; the key is to give stringWithCharacters: a buffer that really contains 12 characters:
unichar spaces[12];
for (int i = 0; i < 12; i++) spaces[i] = ' ';
NSLog(@"hello%@world", [NSString stringWithCharacters:spaces length:12]);

is it possible to convert NSString into unichar

I have an NSString object and want to change it into a unichar.
int decimal = [[temp substringFromIndex:2] intValue]; // decimal = 12298
NSString *hex = [NSString stringWithFormat:@"0x%x", decimal]; // hex = 0x300a
NSString *chineseChar = [NSString stringWithFormat:@"%C", hex];
// This statement logs a different Chinese char every time I run this code
NSLog(@"%@", chineseChar);
When I see the log, it gives a different character every time I run my code.
Am I missing something...?
The %C format specifier takes a 16-bit Unicode character (unichar) as input, not an NSString. You're passing in an NSString, which is getting reinterpreted as an integer character; since the string can be stored at a different address in memory each time you run, you get that address as an integer, which is why you get a different Chinese character every time you run your code.
Just pass in the character as an integer:
unichar decimal = 12298;
NSString *charStr = [NSString stringWithFormat:@"%C", decimal];
// charStr is now a string containing the single character U+300A,
// LEFT DOUBLE ANGLE BRACKET
How about -[NSString characterAtIndex:]? It wants a character index and returns a unichar.
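For example (a minimal sketch, assuming the string is non-empty):
NSString *temp = @"《ABC"; // first character is U+300A
unichar c = [temp characterAtIndex:0]; // c == 0x300A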

Converting long value to unichar* in objective-c

I'm storing large Unicode characters (0x10000+) as long types, which eventually need to be converted to NSStrings. Smaller Unicode characters can be created as a unichar, and an NSString can be created using
[NSString stringWithCharacters:(const unichar *)characters length:(NSUInteger)length]
So, I imagine the best way to get an NSString from the Unicode long value would be to first get a unichar * from the long value. Any idea on how I might go about doing this?
Is there any reason you are storing the values as longs? For Unicode storage you only need to store the values as UInt32, which would then make it easy to interpret the data as UTF-32 by doing something like this:
int numberOfChars = 3;
UInt32 *yourStringBuffer = malloc(sizeof(UInt32) * numberOfChars);
yourStringBuffer[0] = 0x2F8DB; // 杞
yourStringBuffer[1] = 0x2318;  // ⌘
yourStringBuffer[2] = 0x263A;  // ☺
NSData *stringData = [NSData dataWithBytes:yourStringBuffer length:sizeof(UInt32) * numberOfChars];
// set the encoding according to the current byte order
NSStringEncoding encoding;
if (CFByteOrderGetCurrent() == CFByteOrderBigEndian)
    encoding = NSUTF32BigEndianStringEncoding;
else
    encoding = NSUTF32LittleEndianStringEncoding;
NSString *string = [[NSString alloc] initWithData:stringData encoding:encoding];
free(yourStringBuffer);
NSLog(@"%@", string);
//output: 杞⌘☺
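As an aside, if you do want an actual unichar pair for a single code point (the literal question), Core Foundation provides the surrogate-pair math. A sketch, assuming the value fits in UTF32Char:
UTF32Char longChar = 0x2F8DB; // a code point outside the BMP
unichar pair[2];
NSString *string;
if (CFStringGetSurrogatePairForLongCharacter(longChar, pair)) {
    string = [NSString stringWithCharacters:pair length:2]; // needs two UTF-16 units
} else {
    unichar single = (unichar)longChar; // fits in one UTF-16 unit
    string = [NSString stringWithCharacters:&single length:1];
}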