Populate an NSImageView box from NSString data - objective-c

I am very new to Objective C and I've been searching Google for a number of hours trying to find a solution.
I have an NSString which looks like
273350/364D4D002A00041EB8F1E0CEF1E0CCF1E0CCF1E0CCF1E0CCF etc etc
which refers to a TIFF image (I guess in some sort of raw string format). I want to populate an NSImageView with this data.
This is what I've attempted so far:
NSData *picdata = [NSData dataWithBytes:[albumArtStr UTF8String] length:[albumArtStr length]];
NSImage *myPicture = [[NSImage alloc] initWithData:picdata];
[_albumArtCell setImage:myPicture];
Where "albumArtCell" is the NSImageView

That data looks like a hex-encoded image with a length in front of it, not an unencoded TIFF, which is a tagged binary format. You probably need to strip the number before the slash, decode the rest of the string from hex digits into an NSData, and then call [[NSImage alloc] initWithData:] with that decoded data.
You will need to decode it to binary before handing it to NSImage, as it only understands the raw binary form of TIFF.
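A minimal sketch of that decoding (this assumes the format really is "length/hex-digits"; albumArtStr and _albumArtCell are the names from the question):
NSRange slash = [albumArtStr rangeOfString:@"/"]; // assumes the slash is present
NSString *hex = [albumArtStr substringFromIndex:NSMaxRange(slash)];
NSMutableData *picdata = [NSMutableData dataWithCapacity:hex.length / 2];
for (NSUInteger i = 0; i + 1 < hex.length; i += 2) {
    // scan each pair of hex digits into one byte
    unsigned int byte = 0;
    NSScanner *scanner = [NSScanner scannerWithString:[hex substringWithRange:NSMakeRange(i, 2)]];
    if (![scanner scanHexInt:&byte]) break; // stop at the first non-hex character
    uint8_t b = (uint8_t)byte;
    [picdata appendBytes:&b length:1];
}
NSImage *myPicture = [[NSImage alloc] initWithData:picdata];
[_albumArtCell setImage:myPicture];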

I believe the problem is that [albumArtStr length] returns the number of Unicode characters, not the number of bytes.
So your NSData is probably not set up to be the right size, and so doesn't contain what NSImage needs to decode the image properly.
Try this instead to create an NSData from an NSString instance:
NSData* picData = [albumArtStr dataUsingEncoding: NSUTF8StringEncoding];

Related

Signed byte array to UIImage

I am trying to display a picture from a byte-array produced by a web service. Printing out a description it looks like this:
("-119",80,78,71,13,10,26,10,0,0,0,13,3 ... )
From the header it is clear that it's a PNG encoded as signed integers. It is an __NSCFArray of __NSCFNumber elements.
My code in Objective-C (based on much googling):
NSData *data = [NSData dataWithBytes:(const void *)myImageArray length:[myImageArray count]];
UIImage *arrayImage = [UIImage imageWithData:data];
I receive a null UIImage pointer.
I also tried converting the values to unsigned NSNumbers first and then passing them to NSData, though perhaps I did not do this correctly. What am I doing wrong?
You cannot simply cast an NSArray of NSNumber into binary data. Both NSArray and NSNumber are objects; they have their own headers and internal structure that is not the same as the original string of bytes. You'll need to convert it byte-by-byte with something along these lines:
NSArray *bytes = @[@1, @2, @3];
NSMutableData *data = [NSMutableData dataWithLength:bytes.count];
for (NSUInteger i = 0; i < bytes.count; i++) {
    char value = [bytes[i] charValue];
    [data replaceBytesInRange:NSMakeRange(i, 1) withBytes:&value];
}
char is a signed int8_t, which appears to be the kind of data you're working with. It is often used to mean "an ASCII character," but in C it is commonly also used to mean "byte."
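Applied to the question (a hypothetical sketch; myImageArray is assumed to be the NSArray of NSNumbers returned by the web service):
NSMutableData *data = [NSMutableData dataWithLength:myImageArray.count];
for (NSUInteger i = 0; i < myImageArray.count; i++) {
    char value = [myImageArray[i] charValue]; // -119 becomes 0x89, the first byte of the PNG signature
    [data replaceBytesInRange:NSMakeRange(i, 1) withBytes:&value];
}
UIImage *arrayImage = [UIImage imageWithData:data]; // non-nil if the bytes form a valid PNG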

iOS: ZBar SDK unicode characters

When scanning QR codes with ZBar, the string resulting from the process does not display Unicode characters properly. The word Márti, encoded as a QR code by any free-to-use QR code generator (like http://qrcode.kaywa.com), comes out as Mテ。rti.
In other SO questions (1, 2) it was suggested to embed a BOM at the start of the resulting string, but doing this:
NSString *qrString = [NSString stringWithFormat:@"\xEF\xBB\xBF%@", symbol.data];
or this:
NSString *qrString = [[NSString alloc] initWithFormat:@"\357\273\277%@", symbol.data];
resulted in the same, flawed result with the Asian character. symbol.data is the resulting NSString provided by ZBar.
UPDATE: Based on dda's answer, the solution was the following:
NSString *qrString = symbol.data;
//look for misinterpreted acute characters and convert them to UTF-8
if ([qrString canBeConvertedToEncoding:NSShiftJISStringEncoding]) {
    qrString = [NSString stringWithCString:[symbol.data cStringUsingEncoding:NSShiftJISStringEncoding] encoding:NSUTF8StringEncoding];
}
According to the Wikipedia page about QR codes, the encoding for binary data (which is what Márti would use) is ISO 8859-1, so it could be a Unicode-encoding problem. But seeing a Japanese character there, it could also be a QR-encoding issue: maybe the text, being non-ASCII, is encoded by default as Shift JIS X 0208 (i.e. kanji/kana).
I could create QR codes for "Márti" and "日本語" (Japanese) with the following libraries:
iOS-QR-Code-Encoder
QR-Code-Encoder-for-Objective-C.
You can read those QR codes with ZBar.
iOS-QR-Code-Encoder:
NSString *originalString = @"Márti"; // or @"日本語" (Japanese)
NSString *data = [[NSString alloc] initWithFormat:@"\357\273\277%@", originalString];
UIImage *qrcodeImage = [QRCodeGenerator qrImageForString:data imageSize:imageView.bounds.size.width];
QR-Code-Encoder-for-Objective-C:
NSString *originalString = @"Márti"; // or @"日本語" (Japanese)
NSString *data = [[NSString alloc] initWithFormat:@"\357\273\277%@", originalString];
//first encode the string into a matrix of bools, TRUE for black dot and FALSE for white. Let the encoder decide the error correction level and version
DataMatrix* qrMatrix = [QREncoder encodeWithECLevel:QR_ECLEVEL_AUTO version:QR_VERSION_AUTO string:data];
//then render the matrix
UIImage* qrcodeImage = [QREncoder renderDataMatrix:qrMatrix imageDimension:qrcodeImageDimension];
Just a word of caution: the solution as-is will exclude use in Japan and the scanning of QR codes with actual kanji encoded inside. In fact it will probably create problems for any QR code containing Unicode text that canBeConvertedToEncoding:NSShiftJISStringEncoding.
A more universal solution is to insert the BOM characters before the QR code is encoded, to force UTF-8 (i.e. before the code is created). ZBar was never the problem here; the issue is rooted in the creation of the QR code.

Emojis displaying as boxes

I am making an app that posts content from a UITextField, to a PHP script, and store in a database to read later. I am having one heck of a time working with emojis.
Using JSON, I will get a response like this:
content = "\Uf44d";
id = 104;
time = 1350359055;
In this instance, I used an emoji. But when I do [dictionary objectForKey:@"content"], a box just appears.
I'm thinking I need to convert to UTF-8. But I'm not completely sure. Please help!
U+F44D is a Unicode character in the "Private Use Area" U+E000..U+F8FF, so that is probably not the character you want to display.
On the other hand, U+1F44D is the "THUMBS UP SIGN", so it could be that your Web service does not create a correct JSON response for Unicode characters greater than 0xFFFF.
According to the JSON RFC, characters that are not part of the "Basic Multilingual Plane" can be escaped using a UTF-16 surrogate pair. For the U+1F44D character the JSON Unicode escape sequence would be "\ud83d\udc4d".
The following code shows that it works in general:
const char *s = "{ \"content\": \"\\ud83d\\udc4d\", \"id\": 104, \"time\": 1350359055 }";
NSData *jsonData = [NSData dataWithBytes:s length:strlen(s)];
NSError *error;
NSDictionary *jsonDict = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&error];
self.label.text = [jsonDict objectForKey:@"content"];
This displays the "THUMBS UP SIGN" correctly in the label.
But you don't have to escape characters, the Web service could also just send the UTF-8 sequence.
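As an aside, if you ever need to build such a character in code yourself, one sketch (my addition, reusing the U+1F44D code point from above) is to let Foundation convert it from UTF-32:
uint32_t codepoint = 0x1F44D; // THUMBS UP SIGN
NSString *thumbsUp = [[NSString alloc] initWithBytes:&codepoint length:sizeof(codepoint) encoding:NSUTF32LittleEndianStringEncoding];
// internally this is stored as the UTF-16 surrogate pair \ud83d\udc4d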
I hope this helps. I'm using this, and 0xe106 is my emoji code.
Just replace your content line with the line below:
content = [NSString stringWithFormat:@"%C", 0xe106];

Copy a part of NSData byte array to another NSData type

I have an original NSData type which contains let's say 100 bytes. I want to get 2 other NSData types. The first containing the first 20 bytes of the 100, and the second one containing the other 80.
They should be copied from the original NSData. Sorry if I wasn't so clear, but I'm pretty new with Objective-C.
You can use NSData's -(NSData *)subdataWithRange:(NSRange)range; to do that.
From your example, here is some code:
// original data in myData
NSData *d1 = [myData subdataWithRange:NSMakeRange(0, 20)];
NSData *d2 = [myData subdataWithRange:NSMakeRange(20, 80)];
Of course, the ranges are hard-coded here; you will probably have to compute them to make this work in your actual code.
Swift 3
let subdata1 = data?.subdata(in: 0..<20)
let subdata2 = data?.subdata(in: 20..<100) // the upper bound is an end index, not a length
Since this question is at the very top of Google search results, I want to add another Objective-C example:
NSData *mainData = ...; // your actual data
NSData *fPart = [mainData subdataWithRange:NSMakeRange(0, 20)];
NSData *sPart = [mainData subdataWithRange:NSMakeRange(20, 80)];
Instead of 80 you can use something dynamic, like a value computed from the data's length.
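For example, a sketch that computes the second range from the data's length (assuming mainData holds at least 20 bytes):
NSUInteger headerLength = 20; // hypothetical split point
NSData *fPart = [mainData subdataWithRange:NSMakeRange(0, headerLength)];
NSData *sPart = [mainData subdataWithRange:NSMakeRange(headerLength, mainData.length - headerLength)];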

Sending hexadecimal data to devices (Converting NSString to hexadecimal data)

I'm trying to send hexadecimal data via WiFi.
The code is something like this:
NSString *abc = @"0x1b 0x50";
NSData *data = [[[NSData alloc] initWithData:[abc dataUsingEncoding:NSASCIIStringEncoding]] autorelease];
[outputStream write:[data bytes] maxLength:[data length]];
Instead of sending the hexadecimal data, it's sending it in text format.
I tried with NSUTF8StringEncoding, but it's the same. I'm using it with the NSStream class.
You're not getting what you expect with NSString *abc = @"0x1b 0x50". It's almost the same as having NSString *abc = @"cat dog 123 0x0x0x": just a bunch of words separated by spaces. So when you create your NSData object, you're initializing it with a string of characters, not a series of actual numbers.
If you can get your numbers into an NSArray, this question/answer should help you: How to convert NSArray to NSData?
The data that you probably want to send is simply 0x1b50, which is the decimal number 6992 (assuming big-endian), and fits into two bytes. This is not the same as a string (which could contain anything) even if it happens to contain some human-readable representation of those numbers.
I'm assuming you want to send this as binary data, and if so one way would be to simply send a buffer formed by a single UInt16 instead of a string. I'm not very familiar with the relevant APIs, but look to see if you can populate the NSData with an integer, perhaps something like:
UInt16 i = 0x1b50; // or = 6992
NSData *data = [[NSData alloc] initWithBytes:&i length:sizeof(i)]; // note: bytes land in host byte order
[outputStream write:[data bytes] maxLength:[data length]];
Again, I'm not fluent with Objective C, but this should be the general approach to sending the number 0x1b50 as binary data.
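If the device expects the bytes in a specific order (for a two-byte command it usually does), a sketch that side-steps endianness entirely is to spell out each byte; outputStream is assumed to be an open NSOutputStream as in the question:
const uint8_t bytes[] = { 0x1B, 0x50 }; // the exact two bytes, in the exact order
NSData *data = [NSData dataWithBytes:bytes length:sizeof(bytes)];
[outputStream write:[data bytes] maxLength:[data length]];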