NSImage at app start - objective-c

I have an image view object. I can get it to load images when a button is pressed, but I want to know if there is a way to get it to load at the start of the app.
The code I'm using is:
NSURL *lURL = nil;
NSData *lData = nil;
NSImage *lImage = nil;
lURL = [[NSURL alloc] initWithString:@"urltoimage"];
lData = [lURL resourceDataUsingCache:YES];
lImage = [[NSImage allocWithZone:[self zone]] initWithData:lData];
[imageView setImage:lImage];
Any help is appreciated.

Use the -[UIViewController viewDidLoad] method, which is called exactly once when the controller's view is first loaded into memory. (Since the question uses NSImage, i.e. AppKit, the equivalents there are -[NSViewController viewDidLoad] or the app delegate's -applicationDidFinishLaunching:.)
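A minimal sketch of that approach, assuming an imageView outlet and a placeholder URL string (urltoimage stands in for the real address); [NSData dataWithContentsOfURL:] is used here as a simpler stand-in for the deprecated -resourceDataUsingCache: from the question:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Load the image once, as soon as the controller's view has loaded.
    // Note: this fetch is synchronous; for a remote URL you would normally move it off the main thread.
    NSURL *lURL = [[NSURL alloc] initWithString:@"urltoimage"];
    NSData *lData = [NSData dataWithContentsOfURL:lURL];
    NSImage *lImage = [[NSImage alloc] initWithData:lData];
    [imageView setImage:lImage];
}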


How to show polylines added to a source in Mapbox iOS?

I'm trying to show the polylines taken from a GeoJSON and added to the same source, but with poor results.
NSString *jsonString = @"{\"type\": \"FeatureCollection\",\"features\": [{\"type\": \"Feature\",\"properties\": {},\"geometry\": {\"type\": \"LineString\",\"coordinates\": [[4.873809814453125,52.3755991766591],[4.882049560546875,52.339534544106435],[4.94659423828125,52.34708539110632],[4.94659423828125,52.376437538867776],[5.009765625,52.370568669179654]]}},{\"type\": \"Feature\",\"properties\": {},\"geometry\": {\"type\": \"LineString\",\"coordinates\": [[4.73785400390625,52.32694693334544],[4.882049560546875,52.32778621884898],[4.872436523437499,52.29420237796669],[4.9713134765625,52.340373590787394]]}}]}";
NSData *jsonData = [jsonString dataUsingEncoding:NSUTF8StringEncoding];
MGLShapeCollectionFeature *shapeCollectionFeature = (MGLShapeCollectionFeature *)[MGLShape shapeWithData:jsonData encoding:NSUTF8StringEncoding error:NULL];
MGLMultiPolyline *polylines = [MGLMultiPolyline multiPolylineWithPolylines:shapeCollectionFeature.shapes];
MGLShapeSource *source = [[MGLShapeSource alloc] initWithIdentifier:@"transit" shape:polylines options:nil];
[self.mapView.style addSource:source];
MGLLineStyleLayer *lineLayer = [[MGLLineStyleLayer alloc] initWithIdentifier:@"layer" source:source];
[self.mapView.style addLayer:lineLayer];
I logged the source object and the two polylines are inside it. But why are they not shown? What am I doing wrong?
I'm using the Mapbox SDK 3.7.6 for iOS.
Are you using the -[MGLMapViewDelegate mapView:didFinishLoadingStyle:] method to ensure that your map has fully initialized before you add the source and layer? If not, you are likely running into a race condition where the data you add is immediately overwritten as the style loads.
If you modify your code to ensure that the source and layer aren't added prematurely, I would expect your issue to resolve.
- (void)mapView:(MGLMapView *)mapView didFinishLoadingStyle:(MGLStyle *)style {
MGLShapeSource *source = [[MGLShapeSource alloc] initWithIdentifier:@"transit" shape:polylines options:nil];
[self.mapView.style addSource:source];
MGLLineStyleLayer *lineLayer = [[MGLLineStyleLayer alloc] initWithIdentifier:@"layer" source:source];
[self.mapView.style addLayer:lineLayer];
}
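Note that polylines from the question's first snippet isn't in scope inside this callback, so the GeoJSON parsing has to move in with it; a fuller sketch assembled only from the code already shown above:
- (void)mapView:(MGLMapView *)mapView didFinishLoadingStyle:(MGLStyle *)style {
    // jsonString is the GeoJSON string defined in the question.
    NSData *jsonData = [jsonString dataUsingEncoding:NSUTF8StringEncoding];
    MGLShapeCollectionFeature *features = (MGLShapeCollectionFeature *)[MGLShape shapeWithData:jsonData encoding:NSUTF8StringEncoding error:NULL];
    MGLMultiPolyline *polylines = [MGLMultiPolyline multiPolylineWithPolylines:features.shapes];

    MGLShapeSource *source = [[MGLShapeSource alloc] initWithIdentifier:@"transit" shape:polylines options:nil];
    [style addSource:source];

    MGLLineStyleLayer *lineLayer = [[MGLLineStyleLayer alloc] initWithIdentifier:@"layer" source:source];
    [style addLayer:lineLayer];
}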
⚠️ Disclaimer: I currently work at Mapbox ⚠️

how do I pass image data to a native module in React Native?

In my class that implements the RCTBridgeModule protocol in Xcode, I am trying to write an RCT_EXPORT_METHOD that I can expose to React Native code to consume image data. Currently, I can write an image to disk in React Native and then pass the path through to the native method, but I'm wondering if there is a better technique for passing the image data directly, for better performance.
So instead of this:
RCT_EXPORT_METHOD(scanImage:(NSString *)path) {
UIImage *sampleImage = [[UIImage alloc] initWithContentsOfFile:path];
[self processImage:sampleImage];
}
Something more like this:
RCT_EXPORT_METHOD(scanImage:(NSData *)imageData) {
[self processImageWithData: imageData];
}
You can use the [RCTConvert UIImage:icon] method from #import <React/RCTConvert.h>.
You need to specify the "scheme" as "data".
For more details, look at the source code below:
https://github.com/facebook/react-native/blob/master/React/Base/RCTConvert.m#L762
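A minimal sketch of that suggestion, assuming a hypothetical module class named ImageScanner and the asker's own processImage: helper; declaring the parameter as UIImage * lets the bridge run the incoming value through +[RCTConvert UIImage:], so the JavaScript side can pass an image source such as {uri: 'data:image/png;base64,...'}:
#import <UIKit/UIKit.h>
#import <React/RCTBridgeModule.h>
#import <React/RCTConvert.h>

@interface ImageScanner : NSObject <RCTBridgeModule>
@end

@implementation ImageScanner

RCT_EXPORT_MODULE();

// The bridge converts the argument with +[RCTConvert UIImage:] before calling this,
// so a data: URI arrives here already decoded into a UIImage.
RCT_EXPORT_METHOD(scanImage:(UIImage *)image)
{
    [self processImage:image];
}

// Stand-in for the asker's existing image processing.
- (void)processImage:(UIImage *)image
{
    // ...
}

@end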
For me, this works:
NSURL *url = [NSURL URLWithString:[imagePath stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]]];
UIImage *image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:url]];
imagePath is an NSString that comes from the React Native bridge (using react-native-image-picker's response.uri).

Memory issues with ARC on iOS and Mac

I am trying to mirror the screen of my Mac to an iPhone. I have this method in the Mac app delegate to capture the screen into a base64 string.
-(NSString*)baseString{
CGImageRef screen = CGDisplayCreateImage(displays[0]);
CGFloat w = CGImageGetWidth(screen);
CGFloat h = CGImageGetHeight(screen);
NSImage * image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w,h}];
[image lockFocus];
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, w, h)];
[bitmapRep setCompression:NSTIFFCompressionJPEG factor:.3];
[image unlockFocus];
NSData *imageData = [bitmapRep representationUsingType:NSJPEGFileType properties:_options];
NSString *base64String = [imageData base64EncodedStringWithOptions:0];
image = nil;
bitmapRep = nil;
imageData = nil;
return base64String;
}
After that I send it to the iPhone and present it in a UIImageView.
The delay between screenshots is 40 milliseconds. Everything works as expected as long as there is enough memory. After a minute of streaming, the Mac starts swapping and uses 6 GB of RAM. The iOS app's memory usage also grows linearly; by the time the iOS app reaches 90 MB of RAM, the Mac is at 6 GB.
Even if I stop streaming, the memory is not released.
I'm using ARC in both projects. Would it make any difference if I migrated to manual reference counting?
I also tried an @autoreleasepool {...} block, but it didn't help.
Any ideas?
EDIT
My iOS code is here
NSString *message = [NSString stringWithFormat:@"data:image/png;base64,%@", base64];
NSURL *url = [NSURL URLWithString:message];
NSData *imageData = [NSData dataWithContentsOfURL:url];
UIImage *ret = [UIImage imageWithData:imageData];
self.image.image = ret;
You have a serious memory leak. The docs for CGDisplayCreateImage clearly state:
The caller is responsible for releasing the image created by calling CGImageRelease.
Update your code with a call to:
CGImageRelease(screen);
I'd add that just after creating the NSImage.
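Applied to the method in the question, the release slots in right after the NSImage has been created from the CGImage; a sketch of just the affected lines (everything else stays as it was):
CGImageRef screen = CGDisplayCreateImage(displays[0]);
CGFloat w = CGImageGetWidth(screen);
CGFloat h = CGImageGetHeight(screen);
NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w,h}];
CGImageRelease(screen); // balances CGDisplayCreateImage; ARC does not manage CGImageRef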
We can't help with your iOS memory leaks since you didn't post your iOS code, but I see a big memory leak in your Mac code.
You are calling a Core Foundation function, CGDisplayCreateImage. Core Foundation objects are not managed by ARC. If a Core Foundation function has "Create" (or "Copy") in its name, it follows the "create rule" and you are responsible for releasing the returned CF object when you are done with it.
Some CF objects have special release calls. For those that don't, just call CFRelease. CGImageRef has a special release call, CGImageRelease().
You need a corresponding call to CGImageRelease(screen), probably after the call to initWithCGImage.

Displaying a photo from url

I want to get a photo from my homepage and display it. And it (kind of) works, but sometimes it takes at least 10 seconds to load the next scene because of something that happens here. Here is what I do:
NSString *myURL = [PICURL stringByAppendingString:[[[[levelConfig objectForKey:category] objectForKey:lSet] objectForKey:levelString] objectForKey:@"pic"]];
UIImage *dYKPic = [UIImage imageWithData: [NSData dataWithContentsOfURL: [NSURL URLWithString:myURL]]];
if(dYKPic == nil){
NSString *defaultURL = @"http://www.exampleHP.com/exampleFolder/default.jpg";
dYKPic = [UIImage imageWithData: [NSData dataWithContentsOfURL: [NSURL URLWithString:defaultURL]]];
}
CCTexture2D *tex = [[CCTexture2D alloc] initWithImage:dYKPic];
CCSprite *image = [CCSprite spriteWithTexture:tex];
image.anchorPoint = ccp(0,0);
image.position = ccp(32,216);
[self addChild:image z:2];
So, it takes 10 seconds, and additionally the default.jpg is loaded even though the picture exists, but only in the cases where it takes that long... In 70% of the cases it works perfectly normally... So what is wrong? And where do I release tex? Immediately after adding the child?
It has to load the picture; that's the issue. You either need to load and cache it, store it, or preload it before you need it.
One final option is to load it asynchronously and just update your view when it's finished.
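A minimal sketch of the asynchronous option, reusing the names from the question; whether [tex release] is needed depends on whether the project uses ARC (the question's code looks like manual reference counting, and the sprite retains its texture):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Download off the main thread so the scene change isn't blocked.
    NSData *picData = [NSData dataWithContentsOfURL:[NSURL URLWithString:myURL]];
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImage *dYKPic = picData ? [UIImage imageWithData:picData] : nil;
        if (dYKPic == nil) {
            return; // or fall back to the default.jpg fetch from the question here
        }
        CCTexture2D *tex = [[CCTexture2D alloc] initWithImage:dYKPic];
        CCSprite *image = [CCSprite spriteWithTexture:tex];
        image.anchorPoint = ccp(0, 0);
        image.position = ccp(32, 216);
        [self addChild:image z:2];
        [tex release]; // safe once the sprite owns the texture (manual reference counting only)
    });
});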

Xcode Objective-C - when I call 2 methods, only the last called method runs and the first one is skipped

-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
[self NextHeading]; // this plays an mp3 file
[self NextHeadingMeaning]; // this plays an mp3 file
}
Only the [self NextHeadingMeaning] method seems to be called; the NextHeading method is missed each time.
-(IBAction) NextHeading{
[audio stop];
NSString *Filename = [[NSString alloc] initWithFormat:@"CH%@S%@", Heading, Meaning];
Filepath = [[NSBundle mainBundle] pathForResource:Filename ofType:@"mp3"];
audio = [[AVAudioPlayer alloc]initWithContentsOfURL:[NSURL fileURLWithPath:Filepath] error:NULL];
audio.delegate = self;
[audio play];
[Filename autorelease];
}
-(IBAction) NextHeadingMeaning {
[audio stop];
NSString *Filename = [[NSString alloc] initWithFormat:@"CH%@S%@", Chapter, Meaning];
Filepath = [[NSBundle mainBundle] pathForResource:Filename ofType:@"mp3"];
audio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:Filepath] error:NULL];
audio.delegate = self;
[audio play];
[Filename autorelease];
}
Why is this happening and how can I resolve it?
Please advise; thanks in advance.
You used a single ivar (audio) as the player. When you send the NextHeading and NextHeadingMeaning messages, audio is first initialized with your first mp3 file (which can take a moment if the file is big); then, an instant later, before the first player has really started (or just after it has started), the next message stops it and re-initializes audio with the second mp3 file. Once that second initialization finishes, audio plays the second file. That is why you think NextHeading is skipped.
To solve this, you can use an NSMutableArray ivar (e.g. audioPlayers), create a local player in both NextHeading and NextHeadingMeaning, and push it onto audioPlayers.
And I think it is better to preload the sound files if you can. :)
EDIT:
There's a playAtTime: method instead of play; you can delay the second audio player's start with it, like this:
[audioPlayer playAtTime:(audioPlayer.deviceCurrentTime + delay)];
delay is in seconds (NSTimeInterval).
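A minimal sketch combining those two suggestions, assuming two separate player properties (headingPlayer and meaningPlayer are illustrative names, and the file names are placeholders), so the second player no longer tears down the first:
- (void)playHeadingThenMeaning {
    NSString *headingPath = [[NSBundle mainBundle] pathForResource:@"heading" ofType:@"mp3"];
    NSString *meaningPath = [[NSBundle mainBundle] pathForResource:@"meaning" ofType:@"mp3"];
    self.headingPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:headingPath] error:NULL];
    self.meaningPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:meaningPath] error:NULL];
    [self.headingPlayer prepareToPlay];
    [self.meaningPlayer prepareToPlay];

    [self.headingPlayer play];
    // Schedule the second clip to start once the first clip's duration has elapsed.
    NSTimeInterval startTime = self.meaningPlayer.deviceCurrentTime + self.headingPlayer.duration;
    [self.meaningPlayer playAtTime:startTime];
}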
There is no way the first call is skipped; put a breakpoint in it or output something with NSLog() and you'll see. The most probable cause is that the first method doesn't do what you expect, and that could be for various reasons, for example a condition or a specific timeout.
Edit:
After looking at your code, it seems that you're missing some basic stuff like variable naming, variable scope, and so on. To simply make your code run, just fix the NSString *Filename... line in the second method and it will probably work. A better choice would be to visit Start Developing iOS Apps Today and follow that roadmap.