I have developed an audio player app for iOS in Objective-C and C. It plays audio correctly in most situations, but when I switch the app to the background and then bring it back, it no longer plays audio. Another problem: when I start playback with earphones plugged in and then unplug them, playback also stops.
The init function is:
- (instancetype)init {
self = [super init];
if (self) {
sysnLock = [[NSLock alloc] init];
if (_audioDescription.mSampleRate <= 0) {
_audioDescription.mSampleRate = _sampleRates;
_audioDescription.mFormatID = kAudioFormatLinearPCM;
_audioDescription.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
_audioDescription.mChannelsPerFrame = _channels;
_audioDescription.mFramesPerPacket = 1;
_audioDescription.mBitsPerChannel = 16;
_audioDescription.mBytesPerFrame = (_audioDescription.mBitsPerChannel / 8) * _audioDescription.mChannelsPerFrame;
_audioDescription.mBytesPerPacket = _audioDescription.mBytesPerFrame * _audioDescription.mFramesPerPacket;
}
AudioQueueNewOutput(&_audioDescription, AudioPlayerCallback, (__bridge void *_Nullable)(self), nil, 0, 0,
&audioQueue);
AudioQueueSetParameter(audioQueue, kAudioQueueParam_Volume, 100.0);
for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
audioQueueBufferUsed[i] = false;
osState = AudioQueueAllocateBuffer(audioQueue, MIN_SIZE_PER_FRAME, &audioQueueBuffers[i]);
}
osState = AudioQueueStart(audioQueue, NULL);
}
return self;
}
Here, AudioPlayerCallback is a callback function that resets the corresponding audioQueueBufferUsed entry.
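For reference, a minimal sketch of what such a callback could look like (hedged; the class name MyAudioPlayer and the helper markBufferFree: are hypothetical, not from the original code):
static void AudioPlayerCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer) {
    // inUserData is the (__bridge void *)self passed to AudioQueueNewOutput above.
    MyAudioPlayer *player = (__bridge MyAudioPlayer *)inUserData;
    [player markBufferFree:inBuffer];
}
- (void)markBufferFree:(AudioQueueBufferRef)buffer {
    [sysnLock lock];
    for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
        if (audioQueueBuffers[i] == buffer) {
            // The buffer has finished playing; mark its slot as reusable for startPlay:.
            audioQueueBufferUsed[i] = false;
            break;
        }
    }
    [sysnLock unlock];
}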
And the audio play function is:
- (void)startPlay:(AudioData *)audioData {
uint8_t *c_data = get_data(audioData);
size_t c_data_size = get_data_size(audioData);
[sysnLock lock];
int i = 0;
while (true) {
if (!audioQueueBufferUsed[i]) {
audioQueueBufferUsed[i] = true;
break;
} else {
i++;
if (i >= QUEUE_BUFFER_SIZE) {
i = 0;
}
}
}
audioQueueBuffers[i]->mAudioDataByteSize = (unsigned int)c_data_size;
memcpy(audioQueueBuffers[i]->mAudioData, c_data, c_data_size);
AudioQueueEnqueueBuffer(audioQueue, audioQueueBuffers[i], 0, NULL);
OSStatus status = AudioQueueStart(audioQueue, NULL);
[sysnLock unlock];
}
Here, get_data and get_data_size return the uint8_t buffer to play and its size.
You will need to set up NSNotificationCenter to observe the following notifications:
AVAudioSessionInterruptionNotification and AVAudioSessionRouteChangeNotification
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(interruption:) name:AVAudioSessionInterruptionNotification
object:nil];
- (void)interruption:(NSNotification *)notiz {
// handle stuff when audio interrupts
}
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(routeChanged:) name:AVAudioSessionRouteChangeNotification object:nil];
- (void)routeChanged:(NSNotification *)notiz {
// handle stuff when route is changed, aka headphone jack in/out
}
In classic coding style, don't forget to remove the observers in your dealloc method.
[[NSNotificationCenter defaultCenter] removeObserver:self name:... object:nil];
This lets you detect when these disruptive events happen and react accordingly by recreating, starting, stopping, or re-buffering your AudioQueue as needed.
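For example, the two handlers above might do something along these lines (a rough sketch only; it assumes audioQueue is the queue created in the question's init and that AVFoundation is imported, and your exact recovery may differ):
- (void)interruption:(NSNotification *)notiz {
    // Sketch: pause when the interruption begins, reactivate the session and restart when it ends.
    NSUInteger type = [notiz.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        AudioQueuePause(audioQueue);
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
        AudioQueueStart(audioQueue, NULL);
    }
}
- (void)routeChanged:(NSNotification *)notiz {
    // Sketch: when the headphones are unplugged the old route disappears; restart the
    // queue so playback continues on the built-in speaker.
    NSUInteger reason = [notiz.userInfo[AVAudioSessionRouteChangeReasonKey] unsignedIntegerValue];
    if (reason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        AudioQueueStart(audioQueue, NULL);
    }
}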
Another way is to observe property changes through the Audio Queue API with...
AudioQueueAddPropertyListener(AudioQueueRef inAQ, AudioQueuePropertyID inID, AudioQueuePropertyListenerProc inProc, void * inUserData);
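As a rough sketch of that approach, listening for kAudioQueueProperty_IsRunning could look like this; the listener body is only an illustration of where you would react with your own recovery logic:
static void MyIsRunningListener(void *inUserData, AudioQueueRef inAQ, AudioQueuePropertyID inID) {
    // Fires whenever the queue transitions between running and not running.
    UInt32 isRunning = 0;
    UInt32 size = sizeof(isRunning);
    AudioQueueGetProperty(inAQ, kAudioQueueProperty_IsRunning, &isRunning, &size);
    if (!isRunning) {
        // The queue stopped (for example after an interruption); decide here
        // whether to restart it or tear it down and rebuild it.
    }
}
// Registered once, e.g. right after AudioQueueNewOutput:
AudioQueueAddPropertyListener(audioQueue, kAudioQueueProperty_IsRunning, MyIsRunningListener, (__bridge void *)self);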
I have a Metal-based application that uses AVFoundation for movie playback and seeking. To start, I am only processing .mov files and nothing else, and the app in question will not process any other format. While it has worked reliably in the past, I recently received feedback from some M1 users that they only see black frames in the app, regardless of where they set the seek bar.
I have performed the following troubleshooting in my attempts to find the root cause of this black-texture bug:
Verified that the video being processed is .mov movie file type.
Verified that the CVPixelBufferRef object returned from AVPlayerItemVideoOutput's -copyPixelBufferForItemTime: is valid, i.e. not nil.
Verified that the MTLTexture created from the CVPixelBufferRef is also valid, i.e. also not nil.
Converted the MTLTexture to a bitmap and saved it as a .JPEG image to the user's disk.
The last part is probably the most important step here, as the saved images are also all black (for users experiencing the bug) when viewed in Finder, which led me to assume that I might be using AVFoundation somewhat incorrectly. I hope my opening post is not too long in regards to code; below are the steps I am performing in order to process videos to be rendered using Metal, for your reference.
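(For reference, a Core Image based dump along the lines below is one way to implement that last step; this is only a simplified sketch with a placeholder destination URL and flip handling, not the exact code from the app.)
#import <CoreImage/CoreImage.h>
// Illustrative debugging helper: writes the contents of an MTLTexture to a JPEG on disk.
// Assumes Core Image can read the texture's pixel format and the texture is shader-readable.
static void DumpTextureAsJPEG(id<MTLTexture> texture, id<MTLDevice> device, NSURL *destinationURL)
{
    CIImage *image = [CIImage imageWithMTLTexture:texture options:nil];
    // Metal textures have a top-left origin while CIImage is bottom-left, so flip vertically.
    image = [image imageByApplyingOrientation:4];
    CIContext *context = [CIContext contextWithMTLDevice:device];
    CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
    NSError *error = nil;
    [context writeJPEGRepresentationOfImage:image
                                      toURL:destinationURL
                                 colorSpace:srgb
                                    options:@{}
                                      error:&error];
    CGColorSpaceRelease(srgb);
}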
Inside my reader class, I have the following properties:
VideoReaderClass.m
@property (retain) AVPlayer *vidPlayer;
@property (retain) AVPlayerItem *vidPlayerItem;
@property (retain) AVPlayerItemVideoOutput *playerItemVideoOutput;
@property (retain) AVMutableComposition *videoMutableComp;
@property (assign) AVPlayerItemStatus playerItemStatus;
// The frame duration of the composition, as well as the media being processed.
@property (assign) CMTime frameDuration;
// Tracks the time to insert a new footage to
@property (assign) CMTime startInsertTime;
// Weak reference to a delegate responsible for processing seek:
@property (weak) id<VideoReaderRenderer> vidReaderRenderer;
A method called before reading starts. Handles observer cleanup as well.
- (void)initializeReadMediaPipeline
{
[_lock lock];
_startInsertTime = kCMTimeZero;
_playerItemStatus = AVPlayerItemStatusUnknown;
if(_videoMutableComp)
{
[_videoMutableComp release];
}
_videoMutableComp = [AVMutableComposition composition];
[_videoMutableComp retain];
if(_vidPlayer)
{
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"status"
context:MyAddAVPlayerItemKVOContext];
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"playbackBufferFull"
context:MyAVPlayerItemBufferFullKVOContext];
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"playbackBufferEmpty"
context:MyAVPlayerItemBufferEmptyKVOContext];
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"playbackLikelyToKeepUp"
context:MyAVPlayerItemBufferKeepUpKVOContext];
[_vidPlayer release];
_vidPlayer = nil;
}
[_lock unlock];
}
This class has a public method, called on a background queue, that is invoked when the app should begin processing a video or a set of videos.
- (BOOL)addMediaAtLocation:(NSURL *)location
{
BOOL result = NO;
[_lock lock];
NSDictionary* optsInfo =
@{
AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)
};
AVURLAsset* assetURL = [AVURLAsset URLAssetWithURL:location options:optsInfo];
AVAssetTrack* assetTrack = [assetURL tracksWithMediaType:AVMediaTypeVideo].firstObject;
NSError* error = nil;
[_videoMutableComp insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
ofAsset:assetURL
atTime:_startInsertTime
error:&error];
if(!error)
{
[_vidPlayerItem addObserver:self
forKeyPath:@"status"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAddAVPlayerItemKVOContext];
[_vidPlayerItem addObserver:self
forKeyPath:@"playbackBufferFull"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAVPlayerItemBufferFullKVOContext];
[_vidPlayerItem addObserver:self
forKeyPath:@"playbackBufferEmpty"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAVPlayerItemBufferEmptyKVOContext];
[_vidPlayerItem addObserver:self
forKeyPath:@"playbackLikelyToKeepUp"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAVPlayerItemBufferKeepUpKVOContext];
[_vidPlayer replaceCurrentItemWithPlayerItem:_vidPlayerItem];
}
_vidPlayer = [[AVPlayer alloc] init];
if(_playerItemVideoOutput)
{
[_playerItemVideoOutput release];
_playerItemVideoOutput = nil;
}
[_lock unlock];
return result;
}
Called externally by our view controller when seeking is needed.
- (void)seekToFrame:(CMTime)time
{
if(_vidReaderRenderer)
{
if(_vidPlayerItem && _playerItemVideoOutput)
{
[_vidPlayerItem seekToTime:time
toleranceBefore:kCMTimeZero
toleranceAfter:kCMTimeZero
completionHandler:^(BOOL finished){
if(finished)
{
CVPixelBufferRef p_buffer = [_playerItemVideoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
if(p_buffer)
{
[_vidReaderRenderer seekOperationFinished:p_buffer];
}
}
}];
}
}
}
Lastly, this is where I handle the KVO notification when the AVPlayerItem's status changes to ReadyToPlay.
- (void)observeValueForKeyPath:(NSString *)keyPath
ofObject:(id)object
change:(NSDictionary<NSKeyValueChangeKey,id> *)change
context:(void *)context
{
// ... Checking for right contexts here, omitting for this example.
if([keyPath isEqualToString:@"status"])
{
AVPlayerItemStatus status = AVPlayerItemStatusUnknown;
// Get the status change from the change dictionary
NSNumber *statusNumber = change[NSKeyValueChangeNewKey];
if([statusNumber isKindOfClass:[NSNumber class]])
{
status = statusNumber.integerValue;
}
// Switch over the status
switch(status)
{
case AVPlayerItemStatusReadyToPlay:
{
// Ready to Play
if(_vidPlayerItem)
{
[_lock lock];
NSDictionary* attribs =
@{
(NSString*)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_64RGBAHalf)
};
_playerItemVideoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attribs];
[_vidPlayerItem addOutput:_playerItemVideoOutput];
[_vidPlayer setRate:0.0];
_playerItemStatus = status;
[_lock unlock];
// "Wake up" the AVPlayer/AVPlayerItem here.
[self seekToFrame:kCMTimeZero];
}
break;
}
}
}
}
The code listing below is from the class that also acts as the delegate for a custom protocol called VideoReaderRenderer, which handles the seekToTime: completion result as well as converting the pixel buffer to an MTLTexture:
RendererDelegate.m
At some point in its initialization and before performing any seek operations, I instantiate the CVMetalTextureCacheRef.
CVReturn ret_val = CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, _mtlDevice, nil, &_metalTextureCache);
The method that my VideoReader class calls inside seekToTime:'s completion block:
-(void)seekOperationFinished:(CVPixelBufferRef)pixelBuffer
{
CVMetalTextureRef mtl_tex = NULL;
size_t w = CVPixelBufferGetWidth(pixelBuffer);
size_t h = CVPixelBufferGetHeight(pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
CVReturn ret_val = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
_metalTextureCache,
pixelBuffer,
nil,
MTLPixelFormatRGBA16Float,
w,
h,
0,
&mtl_tex);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
if(ret_val != kCVReturnSuccess)
{
if(mtl_tex != NULL)
{
CVBufferRelease(mtl_tex);
CVPixelBufferRelease(pixelBuffer);
}
return;
}
id<MTLTexture> inputTexture = CVMetalTextureGetTexture(mtl_tex);
if(!inputTexture)
return;
NSSize texSize = NSMakeSize(inputTexture.width, inputTexture.height);
_viewPortSize = (simd_uint2){(uint)texSize.width, (uint)texSize.height};
// Create the texture here.
[_textureLock lock];
if(NSEqualSizes(_projectSize, texSize))
{
if(!_inputFrameTex)
{
MTLTextureDescriptor* texDescriptor = [MTLTextureDescriptor new];
texDescriptor.width = texSize.width;
texDescriptor.height = texSize.height;
texDescriptor.pixelFormat = MTLPixelFormatRGBA16Float;
texDescriptor.usage = MTLTextureUsageShaderWrite | MTLTextureUsageShaderRead;
_inputFrameTex = [_mtlDevice newTextureWithDescriptor:texDescriptor];
}
id<MTLCommandBuffer> commandBuffer = [_drawCopyCommandQueue commandBuffer];
id<MTLBlitCommandEncoder> blitEncoder = [commandBuffer blitCommandEncoder];
[blitEncoder copyFromTexture:inputTexture
sourceSlice:0
sourceLevel:0
sourceOrigin:MTLOriginMake(0, 0, 0)
sourceSize:MTLSizeMake(inputTexture.width, inputTexture.height, 1)
toTexture:_inputFrameTex
destinationSlice:0
destinationLevel:0
destinationOrigin:MTLOriginMake(0, 0, 0)];
[blitEncoder endEncoding];
[commandBuffer commit];
[commandBuffer waitUntilCompleted];
}
[_textureLock unlock];
CVBufferRelease(mtl_tex);
CVPixelBufferRelease(pixelBuffer);
// Added "trouble-shooting" code to save the contents of the MTLTexture as a JPEG onto the user's disk.
if(_inputFrameTex)
{
// ... Save contents of texture to disk as JPEG.
}
}
Once again, my apologies for the rather long post. Additional details: the rendering is displayed on a custom NSView backed by a CAMetalLayer, whose draw calls are driven by a CVDisplayLink for background rendering, though I don't think that is the source of seekToTime: returning black frames. Can anyone shed some light on my situation? Thank you very much in advance.
I have an Xcode 5/Cocoa program that clicks the left mouse button a specified number of times at a specified interval. That part works fine. The problem occurs when I want to stop the while loop prematurely.
I'm using a listener to detect any key press while the program is running, setting a stopnow variable, and checking that variable in the while loop. But the while loop doesn't see the change in the variable until it finishes.
Also, I change a counter in the title bar of the window to display the count of clicks done, and that doesn't get updated either until the loop finishes.
I do get the NSLog message when I press a key.
I'm very confused.
My code is here:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
// Insert code here to initialize your application
[[self myWindow] setLevel:NSFloatingWindowLevel];
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask handler:^(NSEvent *event) {
keychar = (unichar) event.characters;
[NSApp activateIgnoringOtherApps:YES];
stopnow = 1;
NSLog(@"Key Pressed = x%x (%x) (%x)",keychar,(keychar&0x7f00),((keychar&0xff00)>>8));
}];
}
- (IBAction)setClickPoint:(NSButton *)sender {
sleep(5);
CGEventRef ourEvent = CGEventCreate(NULL);
cgPoint = CGEventGetLocation(ourEvent);
myPoint = [NSString stringWithFormat:@" (%5.0f,%5.0f)", cgPoint.x, cgPoint.y];
myNewTitle = [mytitle stringByAppendingString:myPoint];
[[self myWindow] setTitle:myNewTitle];
}
- (IBAction)strtButton:(NSButton *)sender {
NSLog(@"Entered strtButton");
numClicks = [_nClicks intValue];
numWait = [_nWait floatValue];
i = 0;
while (i < numClicks || numClicks == 0) {
i++;
myTotal = [NSString stringWithFormat:@" %i of %i", i, numClicks];
myNewTitle = [mytitle stringByAppendingString:myPoint];
myNewTitle = [myNewTitle stringByAppendingString:myTotal];
[[self myWindow] setTitle:myNewTitle];
CGWarpMouseCursorPosition(cgPoint);
CGEventRef down = CGEventCreateMouseEvent(0, kCGEventLeftMouseDown,cgPoint, 0);
CGEventPost(kCGSessionEventTap, down);
CFRelease(down);
CGEventRef up = CGEventCreateMouseEvent(0, kCGEventLeftMouseUp,cgPoint, 0);
CGEventPost(kCGSessionEventTap, up);
CFRelease(up);
NSLog(@"stopnow = %i", stopnow);
if (stopnow == 1) {
stopnow = 0;
break;
}
usleep((unsigned int)(numWait * 1000000.0));
}
}
A Cocoa/Cocoa Touch app is an event-based environment, so you cannot have long-running loops on the main thread, because they stop the handling and delivery of events.
When your loop finishes, the UI is able to update the bits you are seeing, as it can now deliver the events.
You will need to do this work on a background thread, or something similar.
OK, here is what works: use dispatch_async onto a global queue for the main loop, and dispatch_async onto the main queue for the code that updates the title.
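A rough sketch of that structure, adapted from the strtButton: loop in the question (ivar capture and cancellation are simplified here, and the unsynchronized read of stopnow is good enough for a sketch):
- (IBAction)strtButton:(NSButton *)sender {
    numClicks = [_nClicks intValue];
    numWait = [_nWait floatValue];
    stopnow = 0;
    // Run the click loop off the main thread so the run loop keeps delivering events.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        int i = 0;
        while ((i < numClicks || numClicks == 0) && stopnow == 0) {
            i++;
            NSString *newTitle = [NSString stringWithFormat:@"%@%@ %i of %i", mytitle, myPoint, i, numClicks];
            // UI work must go back to the main queue.
            dispatch_async(dispatch_get_main_queue(), ^{
                [[self myWindow] setTitle:newTitle];
            });
            CGWarpMouseCursorPosition(cgPoint);
            CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown, cgPoint, kCGMouseButtonLeft);
            CGEventPost(kCGSessionEventTap, down);
            CFRelease(down);
            CGEventRef up = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseUp, cgPoint, kCGMouseButtonLeft);
            CGEventPost(kCGSessionEventTap, up);
            CFRelease(up);
            usleep((unsigned int)(numWait * 1000000.0));
        }
    });
}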
I've just started with the Sparrow framework and have been following "The Big Sparrow Tutorial" by Gamua themselves. I'm on the first part of the tutorial, using AppScaffold 1.3, but when I try to compile my basic code it hangs at the loading screen and gives me a SIGABRT error.
I put an exception breakpoint, and it stopped here, in GameController.m (seen at bottom) of the AppScaffold:
mGame = [[Game alloc] initWithWidth:gameWidth height:gameHeight];
This was also my only output:
2012-07-30 07:19:54.787 AppScaffold[1682:10a03] -[Game initWithWidth:height:]: unrecognized selector sent to instance 0x7553980
(lldb)
I am using the stock AppScaffold, the only thing I changed was the Game.m.
This is my Game.m:
@interface Game : SPSprite
@end
@implementation Game
{
@private
SPImage *mBackground;
SPImage *mBasket;
NSMutableArray *mEggs;
}
- (id)init
{
if((self = [super init]))
{
//load the background image first, add it to the display tree
//and keep it for later use
mBackground = [[SPImage alloc] initWithContentsOfFile:@"background.png"];
[self addChild:mBackground];
//load the image of the basket, add it to the display tree
//and keep it for later use
mBasket = [[SPImage alloc] initWithContentsOfFile:@"basket.png"];
[self addChild:mBasket];
//create a list that will hold the eggs,
//which we will add and remove repeatedly during the game
mEggs = [[NSMutableArray alloc] init];
}
return self;
}
- (void)dealloc
{
[mBackground release];
[mBasket release];
[mEggs release];
[super dealloc];
}
@end
I've tried my best to use my basic troubleshooting tactics, but I'm very new to Obj-C and Sparrow and could use a hand :)
Thanks
EDIT: I've added the GameController.m contents here for clarity:
//
// GameController.m
// AppScaffold
//
#import <OpenGLES/ES1/gl.h>
#import "GameController.h"
@interface GameController ()
- (UIInterfaceOrientation)initialInterfaceOrientation;
@end
@implementation GameController
- (id)initWithWidth:(float)width height:(float)height
{
if ((self = [super initWithWidth:width height:height]))
{
float gameWidth = width;
float gameHeight = height;
// if we start up in landscape mode, width and height are swapped.
UIInterfaceOrientation orientation = [self initialInterfaceOrientation];
if (UIInterfaceOrientationIsLandscape(orientation)) SP_SWAP(gameWidth, gameHeight, float);
mGame = [[Game alloc] initWithWidth:gameWidth height:gameHeight];
mGame.pivotX = gameWidth / 2;
mGame.pivotY = gameHeight / 2;
mGame.x = width / 2;
mGame.y = height / 2;
[self rotateToInterfaceOrientation:orientation animationTime:0];
[self addChild:mGame];
}
return self;
}
- (void)dealloc
{
[mGame release];
[super dealloc];
}
- (UIInterfaceOrientation)initialInterfaceOrientation
{
// In an iPhone app, the 'statusBarOrientation' has the correct value on Startup;
// unfortunately, that's not the case for an iPad app (for whatever reason). Thus, we read the
// value from the app's plist file.
NSDictionary *bundleInfo = [[NSBundle mainBundle] infoDictionary];
NSString *initialOrientation = [bundleInfo objectForKey:@"UIInterfaceOrientation"];
if (initialOrientation)
{
if ([initialOrientation isEqualToString:@"UIInterfaceOrientationPortrait"])
return UIInterfaceOrientationPortrait;
else if ([initialOrientation isEqualToString:@"UIInterfaceOrientationPortraitUpsideDown"])
return UIInterfaceOrientationPortraitUpsideDown;
else if ([initialOrientation isEqualToString:@"UIInterfaceOrientationLandscapeLeft"])
return UIInterfaceOrientationLandscapeLeft;
else
return UIInterfaceOrientationLandscapeRight;
}
else
{
return [[UIApplication sharedApplication] statusBarOrientation];
}
}
- (void)rotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
animationTime:(double)animationTime
{
float angles[] = {0.0f, 0.0f, -PI, PI_HALF, -PI_HALF};
float oldAngle = mGame.rotation;
float newAngle = angles[(int)interfaceOrientation];
// make sure that rotation is always carried out via the minimal angle
while (oldAngle - newAngle > PI) newAngle += TWO_PI;
while (oldAngle - newAngle < -PI) newAngle -= TWO_PI;
// rotate game
if (animationTime)
{
SPTween *tween = [SPTween tweenWithTarget:mGame time:animationTime
transition:SP_TRANSITION_EASE_IN_OUT];
[tween animateProperty:@"rotation" targetValue:newAngle];
[[SPStage mainStage].juggler removeObjectsWithTarget:mGame];
[[SPStage mainStage].juggler addObject:tween];
}
else
{
mGame.rotation = newAngle;
}
// inform all display objects about the new game size
BOOL isPortrait = UIInterfaceOrientationIsPortrait(interfaceOrientation);
float newWidth = isPortrait ? MIN(mGame.gameWidth, mGame.gameHeight) :
MAX(mGame.gameWidth, mGame.gameHeight);
float newHeight = isPortrait ? MAX(mGame.gameWidth, mGame.gameHeight) :
MIN(mGame.gameWidth, mGame.gameHeight);
if (newWidth != mGame.gameWidth)
{
mGame.gameWidth = newWidth;
mGame.gameHeight = newHeight;
SPEvent *resizeEvent = [[SPResizeEvent alloc] initWithType:SP_EVENT_TYPE_RESIZE
width:newWidth height:newHeight animationTime:animationTime];
[mGame broadcastEvent:resizeEvent];
[resizeEvent release];
}
}
// Enable this method for the simplest possible universal app support: it will display a black
// border around the iPhone (640x960) game when it is started on the iPad (768x1024); no need to
// modify any coordinates.
/*
- (void)render:(SPRenderSupport *)support
{
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
{
glEnable(GL_SCISSOR_TEST);
glScissor(64, 32, 640, 960);
[super render:support];
glDisable(GL_SCISSOR_TEST);
}
else
[super render:support];
}
*/
@end
Here is my Xcode project: http://cl.ly/2e3g02260N47
You are calling an
initWithWidth:height:
method, while none is defined in your class.
From your edit, it seems that the initWithWidth method is declared in the class GameController, not in Game.
So it seems that the initWithWidth:height: method you are calling is declared and defined in GameController, not in Game.
This explains both why you get the SIGABRT and the errors when compiling.
The fix is calling
mGame = [[Game alloc] init];
from GameController's initWithWidth:height: method:
- (id)initWithWidth:(float)width height:(float)height
{
if ((self = [super initWithWidth:width height:height]))
{
float gameWidth = width;
float gameHeight = height;
// if we start up in landscape mode, width and height are swapped.
UIInterfaceOrientation orientation = [self initialInterfaceOrientation];
if (UIInterfaceOrientationIsLandscape(orientation)) SP_SWAP(gameWidth, gameHeight, float);
mGame = [[Game alloc] init];
mGame.pivotX = gameWidth / 2;
mGame.pivotY = gameHeight / 2;
mGame.x = width / 2;
mGame.y = height / 2;
[self rotateToInterfaceOrientation:orientation animationTime:0];
[self addChild:mGame];
}
return self;
}
The tutorial was very old and was incompatible with the latest scaffold;
I did this:
- (id)init
{
if((self = [super init]))
{
when I should've done this:
- (id)initWithWidth:(float)width height:(float)height
{
if ((self = [super initWithWidth:width height:height]))
{
thanks, though sergio!
(There are much better sparrow tutorials and I'm even making my own video tutorials :P)
In my app I have an array of CALayers that I animate along a bezier path. When I close and reopen the app, my layers are not animating and are not in the same position as before closing the app. I have implemented two methods, pauseLayers and resumeLayers, that work when I trigger them with two buttons inside my app, but they don't work after closing the app. The code is the following:
- (void)pauseLayers{
for(int y=0; y<=end;y++)
{
CFTimeInterval pausedTime = [car[y] convertTime:CACurrentMediaTime() fromLayer:nil];
car[y].speed = 0.0;
car[y].timeOffset = pausedTime;
standardUserDefaults[y] = [NSUserDefaults standardUserDefaults];
if (standardUserDefaults[y]) {
[standardUserDefaults[y] setDouble:pausedTime forKey:@"pausedTime"];
[standardUserDefaults[y] synchronize];
}
NSLog(@"saving positions");
}
}
-(void)resumeLayers
{
for(int y=0; y<=end;y++)
{
standardUserDefaults[y] = [NSUserDefaults standardUserDefaults];
car[y].timeOffset = [standardUserDefaults[y] doubleForKey:@"pausedTime"];
CFTimeInterval pausedTime = [car[y] timeOffset];
car[y].speed = 1.0;
car[y].timeOffset = 0.0;
car[y].beginTime = 0.0;
CFTimeInterval timeSincePause = [car[y] convertTime:CACurrentMediaTime() fromLayer:nil] - pausedTime;
car[y].beginTime = timeSincePause;
}
}
- (void)applicationDidEnterBackground:(UIApplication *)application {
mosquitosViewController *mvc = [[mosquitosViewController alloc] init];
[mvc pauseLayers];
}
The problem with what you are trying to do above is that you are creating a completely new instance of your view controller, which is not the one that was showing onscreen. That's why nothing happens when you send the pauseLayers message.
What you should do is register to receive notifications for when your app goes to and comes from the background and call the appropriate methods (pauseLayers and resumeLayers) when that notification arrives.
You should add the following code somewhere in your mosquitosViewController implementation (I usually do so in viewDidLoad):
// Register for notification that app did enter background
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(pauseLayers)
name:UIApplicationDidEnterBackgroundNotification
object:[UIApplication sharedApplication]];
// Register for notification that app did enter foreground
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(resumeLayers)
name:UIApplicationWillEnterForegroundNotification
object:[UIApplication sharedApplication]];
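And, to balance those registrations, remove the observer when the view controller goes away; a minimal sketch, assuming manual reference counting as in the rest of the question's code:
- (void)dealloc {
    // Stop receiving the background/foreground notifications registered above.
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [super dealloc]; // drop this line under ARC
}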
I wrote a Swift 4 extension based on @cclogg's and @Matej Bukovinski's answers from this thread. All you need is to call layer.makeAnimationsPersistent().
Full Gist here: CALayer+AnimationPlayback.swift, CALayer+PersistentAnimations.swift
Core part:
public extension CALayer {
static private var persistentHelperKey = "CALayer.LayerPersistentHelper"
public func makeAnimationsPersistent() {
var object = objc_getAssociatedObject(self, &CALayer.persistentHelperKey)
if object == nil {
object = LayerPersistentHelper(with: self)
let nonatomic = objc_AssociationPolicy.OBJC_ASSOCIATION_RETAIN_NONATOMIC
objc_setAssociatedObject(self, &CALayer.persistentHelperKey, object, nonatomic)
}
}
}
public class LayerPersistentHelper {
private var persistentAnimations: [String: CAAnimation] = [:]
private var persistentSpeed: Float = 0.0
private weak var layer: CALayer?
public init(with layer: CALayer) {
self.layer = layer
addNotificationObservers()
}
deinit {
removeNotificationObservers()
}
}
private extension LayerPersistentHelper {
func addNotificationObservers() {
let center = NotificationCenter.default
let enterForeground = NSNotification.Name.UIApplicationWillEnterForeground
let enterBackground = NSNotification.Name.UIApplicationDidEnterBackground
center.addObserver(self, selector: #selector(didBecomeActive), name: enterForeground, object: nil)
center.addObserver(self, selector: #selector(willResignActive), name: enterBackground, object: nil)
}
func removeNotificationObservers() {
NotificationCenter.default.removeObserver(self)
}
func persistAnimations(with keys: [String]?) {
guard let layer = self.layer else { return }
keys?.forEach { (key) in
if let animation = layer.animation(forKey: key) {
persistentAnimations[key] = animation
}
}
}
func restoreAnimations(with keys: [String]?) {
guard let layer = self.layer else { return }
keys?.forEach { (key) in
if let animation = persistentAnimations[key] {
layer.add(animation, forKey: key)
}
}
}
}
@objc extension LayerPersistentHelper {
func didBecomeActive() {
guard let layer = self.layer else { return }
restoreAnimations(with: Array(persistentAnimations.keys))
persistentAnimations.removeAll()
if persistentSpeed == 1.0 { // if layer was playing before background, resume it
layer.resumeAnimations()
}
}
func willResignActive() {
guard let layer = self.layer else { return }
persistentSpeed = layer.speed
layer.speed = 1.0 // in case layer was paused from outside, set speed to 1.0 to get all animations
persistAnimations(with: layer.animationKeys())
layer.speed = persistentSpeed // restore original speed
layer.pauseAnimations()
}
}
- (void)applicationDidEnterBackground:(UIApplication *)application
{
NSLog(@"1");
mosquitosViewController *mvc = [[mosquitosViewController alloc] init];
[mvc pauseLayers];
/*
Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
*/
}
See my answer to this post for details on how to restart an animation after multitasking:
Restoring animation where it left off when app resumes from background
So I've been trying to use CGPostMouseEvent and CGEventPostToPSN to send a mouse click to a Mac game, and unfortunately I have been very unsuccessful.
I was hoping someone may be able to help me think of this differently, or realize what I'm missing. Google hasn't been much help.
My guess is that it's because I'm trying to send a click event to a game window (openGL), vs. a normal window.
Here is another example of what I'm trying to send:
CGEventRef CGEvent;
NSEvent *customEvent;
NSPoint location;
location.x = 746;
location.y = 509;
customEvent = [NSEvent mouseEventWithType: NSLeftMouseDown
location: location
modifierFlags: NSLeftMouseDownMask
timestamp: time(NULL)
windowNumber: windowID
context: NULL
eventNumber: 0
clickCount: 1
pressure: 0];
CGEvent = [customEvent CGEvent];
CGEventPostToPSN(&psn, CGEvent);
Interestingly enough, I can move the mouse fine (CGDisplayMoveCursorToPoint(kCGDirectMainDisplay, clickPt);), I just can't send any clicks :/
Any help would be greatly appreciated.
Edit: Here is what is strange, once I move the mouse using CGDisplayMoveCursorToPoint, I actually have to physically move my mouse up or down a hair before I can even click, which is odd. The game doesn't accept any input unless I move it up/down (and the pointer then changes).
Thanks!
Well, what you are trying to build is a "bot" or "robot", which basically sends commands in an orderly fashion to a game; it plays for you while you are AFK. This is great for games that force you to play to harvest minerals, commodities, or whatever gives you money to advance in the game, which is really kind of boring. I have successfully done this for a popular game, although I cannot mention which game, as it breaks the user agreements that all these types of games have against "bots". So beware of what you are doing, as it may break your user agreement for many MMPGs. But I post this here because the Mac has fewer bots available (none that I have been able to find through research, versus the PC, where I have found many), so, to level the playing field, here is the code. I recommend compiling it as a command-line tool and driving the macro from AppleScript, where the logic will reside for how to mimic the game's mouse clicks, movements, and key presses; basically your AI.
1. First you need to run a class that will get the PSN ("process serial number") of the game, which every game has; basically, the process it is running as. You can find the name of the process in the Mac utility called Activity Monitor. This can also be done easily in AppleScript.
Once you have the name, this class will locate the process and give you back its PSN.
#import <Cocoa/Cocoa.h>
#include <Carbon/Carbon.h>
#include <stdio.h>
@interface gamePSN : NSObject
{
ProcessSerialNumber gamePSN;
ProcessInfoRec gameProcessInfo;
pid_t gameUnixPID;
}
- (ProcessSerialNumber) gamePSN;
- (ProcessInfoRec) gameProcessInfo;
- (pid_t) gameUnixPID;
- (void) getPSN;
@end
@implementation gamePSN
- (ProcessSerialNumber) gamePSN { return gamePSN; }
- (ProcessInfoRec) gameProcessInfo { return gameProcessInfo; }
- (pid_t) gameUnixPID; { return gameUnixPID; }
- (void) getPSN
{
auto OSErr osErr = noErr;
auto OSErr otherErr = noErr;
auto ProcessSerialNumber process;
auto ProcessInfoRec procInfo;
auto Str255 procName;
auto FSSpec appFSSpec;
auto char cstrProcName[34];
auto char one ='G'; // FIRST CHARACTER OF GAME PROCESS NAME. THESE NEED TO BE CHANGED AS I PUT IN FAKES
auto char two ='A'; // SECOND CHARACTER OF GAME PROCESS NAME THESE NEED TO BE CHANGED AS I PUT IN FAKES
auto char three = 'M'; // THIRD CHARACTER OF GAME PROCESS NAME THESE NEED TO BE CHANGED AS I PUT IN FAKES
auto unsigned int size;
process.highLongOfPSN = kNoProcess;
process.lowLongOfPSN = kNoProcess;
procInfo.processInfoLength = sizeof(ProcessInfoRec);
procInfo.processName = procName;
procInfo.processAppSpec = &appFSSpec;
while (procNotFound != (osErr = GetNextProcess(&process))) {
if (noErr == (osErr = GetProcessInformation(&process, &procInfo))) {
size = (unsigned int) procName[0];
memcpy(cstrProcName, procName + 1, size);
cstrProcName[size] = '\0';
// NEEDS TO MATCH THE SIGNATURE OF THE GAME..FIRST THREE LETTERS
// IF YOU CANT FIND IT WITH THE ACTIVITY MONITOR UTILITY OF APPLE MAC OS
// THEN RUN THIS SAME CLASS WITH AN NSLOG AND IT WILL LIST ALL YOUR RUNNING PROCESSES.
if ( (((char *) &procInfo.processSignature)[0]==one) &&
(((char *) &procInfo.processSignature)[1]==two) &&
(((char *) &procInfo.processSignature)[2]==three) &&
(((char *) &procInfo.processSignature)[3]==two))
{
gamePSN = process;
otherErr = GetProcessInformation(&gamePSN, &gameProcessInfo);
otherErr = GetProcessPID(&process, &gameUnixPID);
}
}
}
}
@end
Once you have this process number, it is easy to send key events as well as mouse events. Here are the mouse click events to send.
// mouseClicks.h
// ClickDep
// Created by AnonymousPlayer on 9/9/11.
#import <Foundation/Foundation.h>
@interface mouseClicks : NSObject
- (void) PostMouseEvent:(CGMouseButton) button eventType:(CGEventType) type fromPoint:(const CGPoint) point;
- (void) LeftClick:(const CGPoint) point;
- (void) RightClick:(const CGPoint) point;
- (void) doubleLeftClick:(const CGPoint) point;
- (void) doubleRightClick:(const CGPoint) point;
@end
// mouseClicks.m
// ClickDep
// Created by AnonymousPlayer on 9/9/11.
#import "mouseClicks.h"
@implementation mouseClicks
- (id)init
{
self = [super init];
if (self) {
// Initialization code here if you need any.
}
return self;
}
- (void) PostMouseEvent:(CGMouseButton) button eventType:(CGEventType) type fromPoint:(const CGPoint) point;
{
CGEventRef theEvent = CGEventCreateMouseEvent(NULL, type, point, button);
CGEventSetType(theEvent, type);
CGEventPost(kCGHIDEventTap, theEvent);
CFRelease(theEvent);
}
- (void) LeftClick:(const CGPoint) point;
{
[self PostMouseEvent:kCGMouseButtonLeft eventType:kCGEventMouseMoved fromPoint:point];
NSLog(@"Click!");
[self PostMouseEvent:kCGMouseButtonLeft eventType:kCGEventLeftMouseDown fromPoint:point];
sleep(2);
[self PostMouseEvent:kCGMouseButtonLeft eventType:kCGEventLeftMouseUp fromPoint:point];
}
- (void) RightClick:(const CGPoint) point;
{
[self PostMouseEvent:kCGMouseButtonRight eventType:kCGEventMouseMoved fromPoint:point];
NSLog(@"Click Right");
[self PostMouseEvent:kCGMouseButtonRight eventType: kCGEventRightMouseDown fromPoint:point];
sleep(2);
[self PostMouseEvent:kCGMouseButtonRight eventType: kCGEventRightMouseUp fromPoint:point];
}
- (void) doubleLeftClick:(const CGPoint) point;
{
[self PostMouseEvent:kCGMouseButtonRight eventType:kCGEventMouseMoved fromPoint:point];
CGEventRef theEvent = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown, point, kCGMouseButtonLeft);
CGEventPost(kCGHIDEventTap, theEvent);
sleep(2);
CGEventSetType(theEvent, kCGEventLeftMouseUp);
CGEventPost(kCGHIDEventTap, theEvent);
CGEventSetIntegerValueField(theEvent, kCGMouseEventClickState, 2);
CGEventSetType(theEvent, kCGEventLeftMouseDown);
CGEventPost(kCGHIDEventTap, theEvent);
sleep(2);
CGEventSetType(theEvent, kCGEventLeftMouseUp);
CGEventPost(kCGHIDEventTap, theEvent);
CFRelease(theEvent);
}
- (void) doubleRightClick:(const CGPoint) point;
{
[self PostMouseEvent:kCGMouseButtonRight eventType:kCGEventMouseMoved fromPoint:point];
CGEventRef theEvent = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown, point, kCGMouseButtonRight);
CGEventPost(kCGHIDEventTap, theEvent);
sleep(2);
CGEventSetType(theEvent, kCGEventRightMouseUp);
CGEventPost(kCGHIDEventTap, theEvent);
CGEventSetIntegerValueField(theEvent, kCGMouseEventClickState, 2);
CGEventSetType(theEvent, kCGEventRightMouseDown);
CGEventPost(kCGHIDEventTap, theEvent);
sleep(2);
CGEventSetType(theEvent, kCGEventRightMouseUp);
CGEventPost(kCGHIDEventTap, theEvent);
CFRelease(theEvent);
}
@end
You may need to play with the sleep, which is the time interval between pressing the mouse button and releasing it. I have found that using 1 second sometimes does not do it; putting in 2 seconds makes it work all the time.
So your main would do the following.
int main(int argc, char *argv[])
{
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
NSUserDefaults *args = [NSUserDefaults standardUserDefaults];
// Grabs command line arguments -x, -y, -clicks, -button
// and 1 for the click count and interval, 0 for button ie left.
int x = [args integerForKey:@"x"];
int y = [args integerForKey:@"y"];
int clicks = [args integerForKey:@"clicks"];
int button = [args integerForKey:@"button"];
//int interval= [args integerForKey:@"interval"];
int resultcode;
// PUT DEFAULT VALUES HERE WHEN SENT WITH EMPTY VALUES
/*if (x==0) {
x= 1728+66;
y= 89+80;
clicks=2;
button=0;
}
*/
// The data structure CGPoint represents a point in a two-dimensional
// coordinate system. Here, X and Y distance from upper left, in pixels.
CGPoint pt;
pt.x = x;
pt.y = y;
// Check CGEventPostToPSN Posts a Quartz event into the event stream for a specific application.
// only added the front lines plus changed null in Create Events to kCGHIDEventTap
gamePSN *gameData = [[gamePSN alloc] init];
[gameData getPSN];
ProcessSerialNumber psn = [gameData gamePSN];
resultcode = SetFrontProcess(&psn);
mouseClicks *mouseEvent =[[mouseClicks alloc] init];
if (button == 0)
{
if (clicks==1) {
[mouseEvent LeftClick:pt];
} else {
[mouseEvent doubleLeftClick:pt];
}
}
if (button == 1)
{
if (clicks==1) {
[mouseEvent RightClick:pt];
} else {
[mouseEvent doubleRightClick:pt];
}
}
[gameData release];
[mouseEvent release];
[pool drain];
return 0;
}
Hope this is helpful. Remember, you can execute this from the Terminal or from AppleScript by sending the following command.
do shell script "/...Path to Compiled Program.../ClickDep" & " -x " & someX & " -y " & someY & " -clicks 1 -button 1"
HAPPY GAMING!!!!