Signed byte array to UIImage - Objective-C

I am trying to display a picture from a byte-array produced by a web service. Printing out a description it looks like this:
("-119",80,78,71,13,10,26,10,0,0,0,13,3 ... )
From the header it is clear that it's a PNG encoded as signed integers. It is an __NSCFArray containing __NSCFNumber elements.
My code in Objective-C (based on much googling):
NSData *data = [NSData dataWithBytes:(const void *)myImageArray length:[myImageArray count]];
UIImage *arrayImage = [UIImage imageWithData:data];
I receive a null UIImage pointer.
I also tried converting it to unsigned NSNumbers first and then passing that to NSData, though perhaps I did not do this correctly. What am I doing wrong?

You cannot simply cast an NSArray of NSNumber into binary data. Both NSArray and NSNumber are objects; they have their own headers and internal structure that is not the same as the original string of bytes. You'll need to convert it byte-by-byte with something along these lines:
NSArray *bytes = @[@1, @2, @3];
NSMutableData *data = [NSMutableData dataWithLength:bytes.count];
for (NSUInteger i = 0; i < bytes.count; i++) {
    char value = [bytes[i] charValue];
    [data replaceBytesInRange:NSMakeRange(i, 1) withBytes:&value];
}
On Apple platforms, char is a signed 8-bit integer (int8_t), which appears to be the kind of data you're working with. It is often used to mean "an ASCII character," but in C it is also commonly used to mean "byte."
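Applied to the question's array (a minimal sketch, assuming myImageArray really is an NSArray of NSNumber holding signed byte values as described):

NSMutableData *imageData = [NSMutableData dataWithLength:myImageArray.count];
for (NSUInteger i = 0; i < myImageArray.count; i++) {
    char value = [myImageArray[i] charValue];                        // signed 8-bit value
    [imageData replaceBytesInRange:NSMakeRange(i, 1) withBytes:&value];
}
UIImage *arrayImage = [UIImage imageWithData:imageData];             // nil if the bytes aren't a valid image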

Related

char* to NSData issue

I am trying to write a function that takes char* as an input parameter and will serialize it into JSON.
I am running into an issue converting the input parameter, options, to NSData. I used the following line of code:
NSData *data = [NSData dataWithBytes:options length:sizeof(options)];
This did not work. A different set of code did work:
NSString* stringFromChar = [[NSString alloc] initWithUTF8String:options];
NSData * data = [stringFromChar dataUsingEncoding:NSUTF8StringEncoding];
I am curious why it was necessary to convert from char* to an NSString and then to NSData, and why I could not do that directly. Is there a way to convert char* to NSData without this intermediary step? Thanks.
As the comments indicated, sizeof(options) where options is a char * will produce the size of the pointer, not the length of the string. Also pointed out in comments, strlen(options) counts characters up to the first 0x0, which is what you want...
NSData *data = [NSData dataWithBytes:options length:strlen(options)];
// options must be null-terminated
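To make the difference concrete, a small sketch (the options value here is invented for illustration):

const char *options = "hello";            // hypothetical null-terminated input
size_t wrong = sizeof(options);           // size of the pointer: 8 on a 64-bit platform
size_t right = strlen(options);           // 5: bytes before the terminating NUL
NSData *data = [NSData dataWithBytes:options length:right];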

Does NSJSONSerialization deserialize numbers as NSDecimalNumber?

Take the following piece of code:
NSError *error;
NSString *myJSONString = @"{ \"foo\" : 0.1}";
NSData *jsonData = [myJSONString dataUsingEncoding:NSUTF8StringEncoding];
NSDictionary *results = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&error];
My question is, is results[@"foo"] an NSDecimalNumber, or something with finite binary precision like a double or float? Basically, I have an application that requires the lossless accuracy that comes with an NSDecimalNumber, and need to ensure that the JSON deserialization doesn't result in rounding because of doubles/floats etcetera.
E.g. if it was interpreted as a float, I'd run into problems like this with precision:
float baz = 0.1;
NSLog(#"baz: %.20f", baz);
// prints baz: 0.10000000149011611938
I've tried interpreting foo as an NSDecimalNumber and printing the result:
NSDecimalNumber *fooAsDecimal = results[@"foo"];
NSLog(@"fooAsDecimal: %@", [fooAsDecimal stringValue]);
// prints fooAsDecimal: 0.1
But then I found that calling stringValue on an NSDecimalNumber doesn't print all significant digits anyway, e.g...
NSDecimalNumber *barDecimal = [NSDecimalNumber decimalNumberWithString:@"0.1000000000000000000000000000000000000000000011"];
NSLog(@"barDecimal: %@", barDecimal);
// prints barDecimal: 0.1
...so printing fooAsDecimal doesn't tell me whether results[@"foo"] was at some point rounded to finite precision by the JSON parser or not.
To be clear, I realise I could use a string rather than a number in the JSON representation to store the value of foo, i.e. "0.1" instead of 0.1, and then use [NSDecimalNumber decimalNumberWithString:results[@"foo"]]. But what I'm interested in is how the NSJSONSerialization class deserializes JSON numbers, so I know whether this is really necessary or not.
NSJSONSerialization (and JSONSerialization in Swift) follows this general pattern:
1. If a number has only an integer part (no decimal point or exponent), attempt to parse it as a long long. If that doesn't overflow, return an NSNumber wrapping the long long.
2. Otherwise, attempt to parse it as a double with strtod_l. If that doesn't overflow, return an NSNumber wrapping the double.
3. In all other cases, fall back to NSDecimalNumber, which supports a much larger range of values, specifically a mantissa of up to 38 digits and an exponent between -128 and 127.
If you look at other examples people have posted you can see that when the value exceeds the range or precision of a double you get an NSDecimalNumber back.
The short answer is that you should not serialize to JSON if you require NSDecimalNumber levels of precision. JSON has only one number format: double, which has inferior precision to NSDecimalNumber.
The long answer, which is of academic interest only (because the short answer is also the right answer), is "not necessarily." NSJSONSerialization does sometimes deserialize as NSDecimalNumber, but this behavior is undocumented, and I have not determined the set of circumstances under which it does. For instance:
BOOL boolYes = YES;
int16_t int16 = 12345;
int32_t int32 = 2134567890;
uint32_t uint32 = 3124141341;
unsigned long long ull = 312414134131241413ull;
double dlrep = 1.5;
double dlmayrep = 1.1234567891011127;
float fl = 3124134134678.13;
double dl = 13421331.72348729 * 1000000000000000000000000000000000000000000000000000.0;
long long negLong = -632414314135135234;
unsigned long long unrepresentable = 10765432100123456789ull;
dict[#"bool"] = #(boolYes);
dict[#"int16"] = #(int16);
dict[#"int32"] = #(int32);
dict[#"dlrep"] = #(dlrep);
dict[#"dlmayrep"] = #(dlmayrep);
dict[#"fl"] = #(fl);
dict[#"dl"] = #(dl);
dict[#"uint32"] = #(uint32);
dict[#"ull"] = #(ull);
dict[#"negLong"] = #(negLong);
dict[#"unrepresentable"] = #(unrepresentable);
NSData *data = [NSJSONSerialization dataWithJSONObject:dict options:NSJSONWritingPrettyPrinted error:nil];
NSDictionary *dict_back = (NSDictionary *)[NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingMutableContainers error:nil];
and in the debugger:
(lldb) po [dict_back[#"bool"] class]
__NSCFBoolean
(lldb) po [dict_back[#"int16"] class]
__NSCFNumber
(lldb) po [dict_back[#"int32"] class]
__NSCFNumber
(lldb) po [dict_back[#"ull"] class]
__NSCFNumber
(lldb) po [dict_back[#"fl"] class]
NSDecimalNumber
(lldb) po [dict_back[#"dl"] class]
NSDecimalNumber
(lldb) po [dict_back[#"dlrep"] class]
__NSCFNumber
(lldb) po [dict_back[#"dlmayrep"] class]
__NSCFNumber
(lldb) po [dict_back[#"negLong"] class]
__NSCFNumber
(lldb) po [dict_back[#"unrepresentable"] class]
NSDecimalNumber
So make of that what you will. You should definitely not assume that if you serialize an NSDecimalNumber to JSON that you will get an NSDecimalNumber back out.
But, again, you should not store NSDecimalNumbers in JSON.
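If an exact decimal value must survive the round trip, the string representation already mentioned in the question is the safer route; a minimal sketch:

NSString *json = @"{ \"foo\" : \"0.1\" }";                        // value carried as a JSON string
NSData *jsonData = [json dataUsingEncoding:NSUTF8StringEncoding];
NSDictionary *obj = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:NULL];
NSDecimalNumber *foo = [NSDecimalNumber decimalNumberWithString:obj[@"foo"]];   // exactly 0.1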
I had the same problem, except I'm using Swift 3. I made a patched version of the JSONSerialization class that parses all numbers as Decimal values. It can only parse/deserialize JSON; it does not have any serialization code. It's based on Apple's open-source reimplementation of Foundation in Swift.
To answer the question in the title: No, it doesn't, it creates NSNumber objects. You can easily test this:
NSArray *a = @[[NSDecimalNumber decimalNumberWithString:@"0.1"]];
NSData *data = [NSJSONSerialization dataWithJSONObject:a options:0 error:NULL];
a = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
NSLog(@"%@", [a[0] class]);
will print __NSCFNumber.
You can convert that NSNumber object to an NSDecimalNumber with [NSDecimalNumber decimalNumberWithDecimal:[number decimalValue]], but according to the docs for decimalValue
The value returned isn’t guaranteed to be exact for float and double values.
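In other words, the conversion is easy to write, but it cannot restore precision that the parser may already have discarded; a small sketch:

NSNumber *parsed = results[@"foo"];   // whatever NSJSONSerialization returned
NSDecimalNumber *asDecimal = [NSDecimalNumber decimalNumberWithDecimal:[parsed decimalValue]];
// If 'parsed' is backed by a double, 'asDecimal' still reflects the double's rounding.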

Converting NSData bytes to NSString

I am trying to create a 16-byte (and later 32-byte) initialization vector in Objective-C (Mac OS). I took some code on how to create random bytes and modified it to 16 bytes, but I am having some difficulty with this. Logging the NSData dumps the hex, but the NSString log gives (null), and a C-string NSLog gives the wrong number of characters (not reproduced exactly in the dump here).
Here is my terminal output:
2012-01-07 14:29:07.705 Test3Test[4633:80f] iv hex <48ea262d efd8f5f5 f8021126 fd74c9fd>
2012-01-07 14:29:07.710 Test3Test[4633:80f] IV string: (null)
2012-01-07 14:29:07.711 Test3Test[4633:80f] IV char string t^Q¶�^��^A
Here is the main program:
int main (int argc, const char * argv[])
{
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
    //NSString *iv_string = [NSString stringWithCString:iv encoding:NSUTF8StringEncoding];
    testclass *obj = [testclass alloc];
    NSData *iv_data = [obj createRandomNSData];
    //[iv_string dataUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"iv hex %@", iv_data);
    //NSString *iv_string = [[NSString alloc] initWithBytes:[iv_data bytes] length:16 encoding:NSUTF8StringE$
    NSString *iv_string = [[NSString alloc] initWithData:iv_data encoding:NSUTF8StringEncoding];
    NSLog(@"IV string: %@", iv_string);
    NSLog(@"IV char string %.*s", [iv_data bytes]);
    return 0;
}
(I left in the above some commented code that I tried and did not work also).
Below is my random number generator, taken from a Stack Overflow example:
@implementation testclass
- (NSData *)createRandomNSData
{
    int twentyMb = 16;
    NSMutableData *theData = [NSMutableData dataWithCapacity:twentyMb];
    for (unsigned int i = 0; i < twentyMb / 4; ++i)
    {
        u_int32_t randomBits = arc4random();
        [theData appendBytes:(void *)&randomBits length:4];
    }
    NSData *data = [NSData dataWithData:theData];
    [theData dealloc];
    return data;
}
@end
I am really quite clueless as to what could be the problem here. If I have data as bytes, it should convert to a string or not necessarily? I have looked over the relevant examples here on stackoverflow, but none of them have worked in this situation.
Thanks,
Elijah
An arbitrary byte sequence may not be legal UTF-8, so the conversion can fail and return nil. As @Joachim Isaksson notes, there is seldom reason to convert binary data to a string this way. If you need to store random data as a string, use an encoding scheme like Base64, serialize the NSData to a plist, or take a similar approach. You cannot simply use a C string either, since a NUL byte (0x00) is legal inside a random byte sequence but terminates a C string.
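For example, a minimal Base64 sketch (base64EncodedStringWithOptions: and initWithBase64EncodedString:options: assume OS X 10.9 / iOS 7 or later; on older systems you would need a third-party Base64 category):

NSString *encoded = [iv_data base64EncodedStringWithOptions:0];                     // safe to log or store
NSLog(@"IV as Base64: %@", encoded);
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:encoded options:0];   // round-trips exactly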
You do not need to build your own random byte creator on Mac or iOS. There's one built-in called SecRandomCopyBytes(). For example (from Properly encrypting with AES with CommonCrypto):
+ (NSData *)randomDataOfLength:(size_t)length {
    NSMutableData *data = [NSMutableData dataWithLength:length];
    int result = SecRandomCopyBytes(kSecRandomDefault,
                                    length,
                                    data.mutableBytes);
    NSAssert(result == 0, @"Unable to generate random bytes: %d",
             errno);
    return data;
}
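Usage for the 16-byte IV from the question might look like this (the class name is hypothetical; it's whatever class or category you attach the method to):

NSData *iv_data = [RandomDataHelper randomDataOfLength:16];   // hypothetical class name
NSLog(@"iv hex %@", iv_data);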
When converting NSData to NSString using a UTF-8 encoding, you won't necessarily end up with the same number of characters as bytes, since not all binary values are valid encodings of characters. I'd say using a string for binary data is a recipe for problems.
What is the use of the string? NSData is exactly the datatype you want for storing binary data to begin with.

Sending hexadecimal data to devices (Converting NSString to hexadecimal data)

I'm trying to send hexadecimal data via WiFi.
The code is something like this:
NSString *abc = @"0x1b 0x50";
NSData *data = [[[NSData alloc] initWithData:[abc dataUsingEncoding:NSASCIIStringEncoding]] autorelease];
[outputStream write:[data bytes] maxLength:[data length]];
Instead of sending the hexadecimal data, it's sending it in text format.
I tried with NSUTF8StringEncoding, but it's the same. I'm using it with the NSStream class.
You're not getting what you expect with NSString *abc = @"0x1b 0x50". It's almost the same as having NSString *abc = @"cat dog 123 0x0x0x"; just a bunch of words separated by spaces. So when you create your NSData object, you're just initializing it with a string of characters, not a series of actual numbers.
If you can get your numbers into an NSArray, this question/answer should help you: How to convert NSArray to NSData?
The data that you probably want to send is simply 0x1b50, which is the decimal number 6992 (assuming big-endian), and fits into two bytes. This is not the same as a string (which could contain anything) even if it happens to contain some human-readable representation of those numbers.
I'm assuming you want to send this as binary data, and if so one way would be to simply send a buffer formed by a single UInt16 instead of a string. I'm not very familiar with the relevant APIs, but look to see if you can populate the NSData with an integer, perhaps something like:
UInt16 i = 0x1b50; // Or = 6992
NSData *data = [[NSData alloc] initWithBytes:&i length:sizeof(i)];
[outputStream write:[data bytes] maxLength:[data length]];
Again, I'm not fluent with Objective C, but this should be the general approach to sending the number 0x1b50 as binary data.
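If the bytes really do arrive as a text string like @"0x1b 0x50", one hedged alternative (names and parsing details here are illustrative, not from the original code) is to scan each space-separated token as hex before writing:

NSString *abc = @"0x1b 0x50";
NSMutableData *data = [NSMutableData data];
for (NSString *token in [abc componentsSeparatedByString:@" "]) {
    unsigned int value = 0;
    NSScanner *scanner = [NSScanner scannerWithString:token];
    if ([scanner scanHexInt:&value]) {          // scanHexInt: accepts an optional "0x" prefix
        uint8_t byte = (uint8_t)value;
        [data appendBytes:&byte length:1];      // appends 0x1b, then 0x50
    }
}
[outputStream write:(const uint8_t *)[data bytes] maxLength:[data length]];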

Why does this NSString created from an NSData object fail to show it has contents?

Why does the following code produce the logging at the bottom?
Here is the anomaly: my second NSLog should print chrStr but produces nothing (empty), which is verified by this debug command:
(gdb) po chrStr
object returns empty description
However, the third NSLog, where I re-convert the NSString back to an NSData object, DOES display the data, the same value as in the first NSLog, as it should. This would indicate to me that chrStr must have actual contents. But it seems not to be so from the NSLog or the po command. Why?
NSString *login;
NSString *pass;
// Purpose: NSString *loginString = [NSString stringWithFormat:@"\000%@\000%@", login, pass];
login = #"Loginname"; // text string1
pass = #"Password"; // text string2
// convert text strings to data objects
NSData *subData1 = [login dataUsingEncoding:NSUTF8StringEncoding];
NSData *subData2 = [pass dataUsingEncoding:NSUTF8StringEncoding];
// embed a NULL into new NSData object
NSMutableData *data = [NSMutableData data];
unsigned char zeroByte = 0;
[data appendBytes:&zeroByte length:1];
// append string1, NULL, string2 to data object
[data appendData:subData1];
[data appendBytes:&zeroByte length:1];
[data appendData:subData2];
NSLog(#"1.NSData: %#", data); // print data object
// create a character string from data object
NSString *chrStr = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
NSLog(#"2.NSString: %#", chrStr); // print character string
// create data object from string object
NSData *chrData = [chrStr dataUsingEncoding:NSUTF8StringEncoding];
NSLog(#"3.NSDATA: %#", chrData); // print data object
Produces:
[1071:207] 1.NSData: 004c6f67 696e6e61 6d650050 61737377 6f7264
[1071:207] 2.NSString:
[1071:207] 3.NSDATA: 004c6f67 696e6e61 6d650050 61737377 6f7264
This is a real mystery to me. If chrStr were empty, then the third NSData could not display its contents, but it does!
What am I trying to accomplish ? Well, check my very first comment line: // purpose:
That line, when uncommented, produces a warning even though it actually works, so I was trying to do it another way that allowed me a clean compile. If you see a better way to accomplish that objective, I'm all eyes and ears. But please don't dwell on why that @"\000%@\000%@" format string is necessary; start out accepting that it is. Thanks.
In C (and therefore Objective-C), a NUL byte is used to mark the end of a string. When you create the string object, it takes all of the data you have given it without parsing, which is why you can convert it back to data successfully. However, when you display the string, the system reads it up to the first NUL byte, which here is the very first byte. Therefore, the string contains the data, but any function that reads byte by byte instead of using the string's reported length will think it is empty. When you work with non-displayable characters, prefer data objects over string objects whenever possible.
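A quick way to convince yourself that chrStr is not actually empty (a small sketch; the expected numbers assume the 19-byte data shown above) is to inspect its length and first character rather than printing it:

NSLog(@"chrStr length: %lu", (unsigned long)[chrStr length]);   // 19, not 0
NSLog(@"first unichar: %d", [chrStr characterAtIndex:0]);       // 0, the embedded NUL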