According to:
Can I use AVFoundation to stream downloaded video frames into an OpenGL ES texture?
It's possible to get the frames from remote media. However, I've been trying the suggestion, and the documentation about the use of AVPlayerItemVideoOutput is not very clear. It seems to have a delegate method, outputMediaDataWillChange:, which receives a pointer to the AVPlayerItemVideoOutput instance.
Maybe I'm making a wrong assumption, but is this delegate method called every time the data is about to change? Is it the right place to get the CVPixelBuffer?
The outputMediaDataWillChange: method will only be called after you register for it with requestNotificationOfMediaDataChangeWithAdvanceInterval:, usually when you pause your app or the like.
You can access the pixel buffer in your display link hook. Look for hasNewPixelBufferForItemTime: and copyPixelBufferForItemTime:itemTimeForDisplay: in Apple's sample code (it's for OS X, but it's basically the same on iOS).
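A minimal sketch of that hook on iOS (names like _videoOutput and the display link setup are assumptions, not from the sample):

- (void)displayLinkDidFire:(CADisplayLink *)link
{
    CMTime itemTime = [_videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([_videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer =
            [_videoOutput copyPixelBufferForItemTime:itemTime
                                  itemTimeForDisplay:NULL];
        if (pixelBuffer) {
            // Upload to your GL texture here, e.g. via CVOpenGLESTextureCache.
            CVBufferRelease(pixelBuffer);   // copy... returns a +1 reference
        }
    }
}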
Probably not. You will need to update the texture on the same thread where your GL does all its work (or on some other thread with a shared context), not on the thread where you get the delegate callback that the media data has been updated. You could set a boolean to true in this callback to notify the GL thread that the buffer is ready and it should collect it. Alternatively, you could push a "target selector pair" to be performed on the GL thread to collect the data (a system like performSelectorOnMainThread:), but then you should ask yourself whether such a pair already exists on the stack, in case the media update is changing data faster than your GL is refreshing. In any case, if you use that delegate and don't handle it correctly, it will either not update the texture at all or block your GL thread.
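A sketch of that flag hand-off (the property and helper are hypothetical):

// Delegate callback, possibly on a non-GL thread: just raise a flag.
- (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender
{
    self.newFrameAvailable = YES;   // atomic BOOL property (assumed)
}

// In the GL thread's render loop: collect the buffer when flagged.
if (self.newFrameAvailable) {
    self.newFrameAvailable = NO;
    [self updateTextureFromLatestPixelBuffer];  // hypothetical helper
}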
I think you should use the FFmpeg library, as it can connect to any streaming server and get the picture as raw data. After that you can do anything you like with that picture.
Lately I have been playing around with the Bluetooth framework and built up enough knowledge to start building an application. The only problem is that all the examples I found, and all the practice I have done, consist of putting the Core Bluetooth code inside the same file as the UIView the user is interacting with.
I would like my future application to have multiple views, with the BLE scan happening in the background. I initially thought about creating an object with a name like bleDeviceFinder and passing this object to each view. However, after thinking about it I realised that if I want something to happen in the current view, didDiscoverPeripheral needs direct access to the UIView objects it is supposed to affect.
I know it is probably a stupid question, but what would be the best way to do this? I was thinking maybe to post a notification and subscribe every view to it; is this a good solution?
A quasi-singleton BTLEManager that you pass around in the app. It sends NSNotifications for events like discovery, and your view controllers observe those notifications. The truth (i.e. the list of discovered devices) stays in the BTLEManager. Once a view controller has received such a notification, it asks the BTLEManager for the current list of devices and then changes your views accordingly. The views should never talk to the BTLEManager directly.
That's how I would do it.
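Here is a rough sketch of that pattern. The class, property, and notification names are made up for illustration; the singleton boilerplate and error handling are omitted.

// BTLEManager.h
#import <CoreBluetooth/CoreBluetooth.h>

@interface BTLEManager : NSObject <CBCentralManagerDelegate>
+ (instancetype)sharedManager;   // the quasi-singleton accessor
@property (nonatomic, readonly) NSArray *discoveredPeripherals;
@end

// BTLEManager.m
static NSString * const BTLEManagerDidDiscoverDeviceNotification =
    @"BTLEManagerDidDiscoverDeviceNotification";

- (void)centralManager:(CBCentralManager *)central
 didDiscoverPeripheral:(CBPeripheral *)peripheral
     advertisementData:(NSDictionary *)advertisementData
                  RSSI:(NSNumber *)RSSI
{
    // The truth stays here: record the device, then tell the world.
    [self.mutableDiscoveredPeripherals addObject:peripheral]; // assumed backing store
    [[NSNotificationCenter defaultCenter]
        postNotificationName:BTLEManagerDidDiscoverDeviceNotification
                      object:self];
}

// In any view controller:
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(deviceListChanged:)
           name:BTLEManagerDidDiscoverDeviceNotification
         object:nil];

- (void)deviceListChanged:(NSNotification *)note
{
    NSArray *devices = [BTLEManager sharedManager].discoveredPeripherals;
    // ...update your views from the list...
}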
I have an app where I allow the user to drag/drop some data from an NSView, with a custom drag image under the cursor.
I just updated to the Xcode 6 beta, and now my drag/drop code won't compile. This is because Apple has deprecated the following NSView method:
dragImage:at:offset:event:pasteboard:source:slideBack:
Fine, but what am I supposed to use instead? All the Apple documentation I've found still merrily recommends the deprecated method.
There is also a "dragFile:" method, but my NSView represents AV data, and it's unsuitable to write a large new file to disk every time the user begins a drag. The user may well abort the drag, possibly multiple times in rapid succession.
What is the new way to initiate a drag operation with a custom icon?
Thanks to Kyle Sluder on Apple's Dev forums for alerting me to this. It turns out the replacement is
-[NSView beginDraggingSessionWithItems:event:source:]
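A minimal sketch of its use from inside an NSView subclass (myPasteboardWriter and myDragImage stand in for your own data and custom icon):

- (void)mouseDragged:(NSEvent *)event
{
    // Wrap your data in an NSDraggingItem; the pasteboard writer can be
    // any object conforming to NSPasteboardWriting (e.g. an NSString).
    NSDraggingItem *item =
        [[NSDraggingItem alloc] initWithPasteboardWriter:myPasteboardWriter];
    [item setDraggingFrame:self.bounds contents:myDragImage]; // custom drag icon
    [self beginDraggingSessionWithItems:@[item] event:event source:self];
}

// The drag source (here, the view itself) adopts NSDraggingSource:
- (NSDragOperation)draggingSession:(NSDraggingSession *)session
    sourceOperationMaskForDraggingContext:(NSDraggingContext)context
{
    return NSDragOperationCopy;
}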
Here is some background information, otherwise skip ahead to the question in bold. I am building an app and I would like it to have access to the remote control/lock screen events. The tricky part is that this app does not play audio itself; it controls the audio of another device nearby. The communication between devices is not a problem when the app is in the foreground. As I just found out, an app does not assume control of the remote controls until it has played audio with a playback audio session, and was the last to do so. This presents a problem because, like I said, the app controls ANOTHER device's audio and has no need to play its own.
My first inclination is to have the app play a silent clip every time it is opened in order to assume control of the remote controls. The fact that I have to do this makes me wonder if I am even going to be allowed to do it by Apple or if there is another way to achieve this without fooling the system with fake audio clips.
QUESTION(S): Will Apple approve an app that plays a silent audio clip in order to assume control of the remote/lock screen controls for the purpose of controlling another device's audio? Is there any way of assuming control of the remote controls without an audio session?
P.S. I would prefer to have this functionality on iOS 4.0 and up.
P.P.S. I have seen this similar question, and it has gotten me brainstorming, but the answer provided is not specific to what I need to know.
NOTE: As of iOS 7.1, you should be using MPRemoteCommandCenter instead of the answer below.
You attach handlers to the various system-provided MPRemoteCommand objects exposed as properties of [MPRemoteCommandCenter sharedCommandCenter].
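For example, a minimal sketch (the handler selector names are assumptions):

MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];
[center.playCommand addTarget:self action:@selector(didReceivePlayCommand:)];
[center.pauseCommand addTarget:self action:@selector(didReceivePauseCommand:)];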
I'm keeping the rest of this around for historical reference, but the following is not guaranteed to work on recent iOS versions. In fact, it just might not.
You definitely do need an audio player, but not necessarily an explicit session, to take control of the remote control events. (An AVAudioSession is implicit in any app that plays audio.) I spent a decent amount of time playing with this to confirm it.
I've seen a lot of confusion on the internet about where to set up the remoteControlReceivedWithEvent: method and about various approaches to the responder chain. I know this method works on iOS 6 and iOS 7; other methods have not. Don't waste your time handling remote control events in the app delegate (where they used to work) or in a view controller, which may go away during the lifecycle of your app.
I made a demo project to show how to do this.
Here's a quick rundown of what has to happen:
You need to create a subclass of UIApplication. When the documentation says UIResponder, it means UIApplication, since your application class is a subclass of UIResponder. In this subclass, you're going to implement the remoteControlReceivedWithEvent: and canBecomeFirstResponder methods. You want to return YES from canBecomeFirstResponder. In the remote control method, you'll probably want to notify your audio player that something's changed.
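Something like this sketch (RCApplication is the example name used below; how you notify your player is up to you, and the notification here is just an assumption):

// RCApplication.m
@implementation RCApplication

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Forward the event so your player object can react to the
        // play/pause/next/previous subtypes.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"RCRemoteControlEventReceived"
                          object:event];
    }
}

@end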
You need to tell iOS to use your custom class to run the app, instead of the default UIApplication. To do so, open main.m and change this:
return UIApplicationMain(argc, argv, nil, NSStringFromClass([RCAppDelegate class]));
to look like this:
return UIApplicationMain(argc, argv, NSStringFromClass([RCApplication class]), NSStringFromClass([RCAppDelegate class]));
In my case RCApplication is the name of my custom class. Use the name of your subclass instead. Don't forget to #import the appropriate header.
OPTIONAL: You should configure an audio session. It's not required, but if you don't, audio won't play if the phone is muted. I do this in the demo app's delegate, but do so where appropriate.
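A minimal sketch of that setup (error handling trimmed):

NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                       error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];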
Play something. Until you do, the remote controls will ignore your app. I just took an AVPlayer and gave it the URL of a streaming site that I expect to be up. If you find that it fails, put your own URL in there and play with it to your heart's content.
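For example (the URL is a placeholder, not a real stream):

AVPlayer *player =
    [AVPlayer playerWithURL:[NSURL URLWithString:@"http://example.com/stream.mp3"]];
[player play]; // the remote controls only notice your app after playback starts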
This example has a little bit more code in there to log out remote events, but it's not all that complicated. I just define and pass around some string constants.
I bet that a silent looping MP3 file would help you work towards your goal.
Moshe's solution worked great for me! However, one issue I noticed is that when you paused the audio, the media controls would go away and you wouldn't be able to play it again without going back into the app. If you set the media info on the lock screen when you play the audio, this won't happen:
NSDictionary *mediaInfo = @{MPMediaItemPropertyTitle: @"My Title",
                            MPMediaItemPropertyAlbumTitle: @"My Album Name",
                            MPMediaItemPropertyPlaybackDuration: [NSNumber numberWithFloat:0.30f]};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];
I have tried this project on both Android and iOS with little success. There is a good chance that this stuff is just over my head. However, I figured I would post my question here as a last effort.
I'm trying to figure out when a device is rotated or flipped. My app should know when it does a 180 or 360, or when the device is flipped vertically.
In an attempt to understand the way it's supposed to work, I downloaded two example projects: AccelerometerGraph and CoreMotionTeapot. With these, and a mix of other things I figured out, I have been trying this:
motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = 0.01;
motionManager.deviceMotionUpdateInterval = 0.01;
[motionManager startDeviceMotionUpdates];

if (motionManager.gyroAvailable) {
    motionManager.gyroUpdateInterval = 1.0/60.0;
    motionManager.deviceMotionUpdateInterval = 0.01;
    [motionManager startGyroUpdatesToQueue:[NSOperationQueue currentQueue]
                               withHandler:^(CMGyroData *gyroData, NSError *error)
    {
        CMRotationRate rotate = gyroData.rotationRate;
        NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
    }];
} else {
    NSLog(@"No gyroscope on device.");
}
But I do not know how to derive the information I need (horizontal and vertical rotations) from these three values (x, y, z).
What you're attempting is not trivial, but is certainly possible. This video should be very helpful in understanding the capabilities of the device and how to get closer to your goal:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k
While he's talking about Android, the same concepts apply to the iPhone.
From Apple's documentation, the CMMotionManager Class Reference (sorry, lots of reading; I've bolded some sentences for quick skimming):
After creating an instance of CMMotionManager, an application can use it to receive four types of motion: raw accelerometer data, raw gyroscope data, raw magnetometer data, and processed device-motion data (which includes accelerometer, rotation-rate, and attitude measurements). The processed device-motion data provided by Core Motion’s sensor fusion algorithms gives the device’s attitude, rotation rate, calibrated magnetic fields, the direction of gravity, and the acceleration the user is imparting to the device.
Important An application should create only a single instance of the CMMotionManager class. Multiple instances of this class can affect the rate at which data is received from the accelerometer and gyroscope.
An application can take one of two approaches when receiving motion data, by handling it at specified update intervals or periodically sampling the motion data. With both of these approaches, the application should call the appropriate stop method (stopAccelerometerUpdates, stopGyroUpdates, stopMagnetometerUpdates, and stopDeviceMotionUpdates) when it has finished processing accelerometer, rotation-rate, magnetometer, or device-motion data.
Handling Motion Updates at Specified Intervals
To receive motion data at specific intervals, the application calls a “start” method that takes an operation queue (instance of NSOperationQueue) and a block handler of a specific type for processing those updates. The motion data is passed into the block handler. The frequency of updates is determined by the value of an “interval” property.
Accelerometer. Set the accelerometerUpdateInterval property to specify an update interval. Call the startAccelerometerUpdatesToQueue:withHandler: method, passing in a block of type CMAccelerometerHandler. Accelerometer data is passed into the block as CMAccelerometerData objects.
Gyroscope. Set the gyroUpdateInterval property to specify an update interval. Call the startGyroUpdatesToQueue:withHandler: method, passing in a block of type CMGyroHandler. Rotation-rate data is passed into the block as CMGyroData objects.
Magnetometer. Set the magnetometerUpdateInterval property to specify an update interval. Call the startMagnetometerUpdatesToQueue:withHandler: method, passing a block of type CMMagnetometerHandler. Magnetic-field data is passed into the block as CMMagnetometerData objects.
Device motion. Set the deviceMotionUpdateInterval property to specify an update interval. Call either the startDeviceMotionUpdatesUsingReferenceFrame:toQueue:withHandler: or the startDeviceMotionUpdatesToQueue:withHandler: method, passing in a block of type CMDeviceMotionHandler. With the former method (new in iOS 5.0), you can specify a reference frame to be used for the attitude estimates. Device-motion data is passed into the block as CMDeviceMotion objects.
Periodic Sampling of Motion Data
To handle motion data by periodic sampling, the application calls a "start" method taking no arguments and periodically accesses the motion data held by a property for a given type of motion data. This is the recommended approach for applications such as games. Handling accelerometer data in a block introduces additional overhead, and most game applications are interested only in the latest sample of motion data when they render a frame.
Accelerometer. Call startAccelerometerUpdates to begin updates and periodically access CMAccelerometerData objects by reading the accelerometerData property.
Gyroscope. Call startGyroUpdates to begin updates and periodically access CMGyroData objects by reading the gyroData property.
Magnetometer. Call startMagnetometerUpdates to begin updates and periodically access CMMagnetometerData objects by reading the magnetometerData property.
Device motion. Call the startDeviceMotionUpdatesUsingReferenceFrame: or startDeviceMotionUpdates method to begin updates and periodically access CMDeviceMotion objects by reading the deviceMotion property. The startDeviceMotionUpdatesUsingReferenceFrame: method (new in iOS 5.0) lets you specify a reference frame to be used for the attitude estimates.
About gathering the data:
@property(readonly) CMGyroData *gyroData
Discussion
If no gyroscope data is available, the value of this property is nil. An application that is receiving gyroscope data after calling startGyroUpdates periodically checks the value of this property and processes the gyroscope data.
So you should have something like
gyroData.rotationRate.x
gyroData.rotationRate.y
gyroData.rotationRate.z
By storing these values and comparing them periodically, you should be able to see if the device flipped around an axis, etc.
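For instance, here is a rough sketch that integrates the z-axis rotation rate over time to detect a full turn (motionManager is the instance from the question; thresholds and gyro drift handling are left out):

__block double accumulatedZ = 0;          // radians turned about z so far
__block NSTimeInterval lastTimestamp = 0;
[motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                           withHandler:^(CMGyroData *gyroData, NSError *error)
{
    if (lastTimestamp > 0) {
        NSTimeInterval dt = gyroData.timestamp - lastTimestamp;
        accumulatedZ += gyroData.rotationRate.z * dt;  // integrate rad/s over dt
        if (fabs(accumulatedZ) >= 2 * M_PI) {
            NSLog(@"Device did a 360 about the z axis");
            accumulatedZ = 0;
        }
    }
    lastTimestamp = gyroData.timestamp;
}];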
It all depends on the iPhone's position. Say the phone gets flipped 360 degrees around the y axis: the compass won't change, because it will still be pointing the same way during the flip. And that's not all. My hint is to log the accelerometer, compare the data you've collected with the movement made, and then identify the stages of the trick and make a list of stages for each trick.
Then again, maybe what you're looking for is just the device orientation. You should look at the UIDevice Class Reference, in particular the
– beginGeneratingDeviceOrientationNotifications
– endGeneratingDeviceOrientationNotifications
methods.
and use it like this:
[UIDevice currentDevice].orientation
In return, you'll get one of these possible values:
typedef enum {
    UIDeviceOrientationUnknown,
    UIDeviceOrientationPortrait,
    UIDeviceOrientationPortraitUpsideDown,
    UIDeviceOrientationLandscapeLeft,
    UIDeviceOrientationLandscapeRight,
    UIDeviceOrientationFaceUp,
    UIDeviceOrientationFaceDown
} UIDeviceOrientation;
So you'll be able to check if it's in portrait (up or down) or landscape (left or right) and if it has been flipped.
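A small sketch of how you might listen for those changes (the selector name is my own):

[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(orientationChanged:)
           name:UIDeviceOrientationDidChangeNotification
         object:nil];

- (void)orientationChanged:(NSNotification *)note
{
    UIDeviceOrientation orientation = [UIDevice currentDevice].orientation;
    if (UIDeviceOrientationIsLandscape(orientation)) {
        // rotated to landscape (left or right)
    } else if (orientation == UIDeviceOrientationFaceDown) {
        // flipped face down
    }
}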
You'll also be able to implement the following methods:
- willRotateToInterfaceOrientation:duration:
- didRotateFromInterfaceOrientation:
You can look in this link to check how you can implement the methods.
I am building an application where images from the user are all taken together and saved in an NSMutableArray.
As soon as the first image starts coming in, I need to upload the images to the server one by one, even though they are taken together.
I am using [NSThread detachNewThreadSelector:@selector(uploading:) toTarget:self withObject:imagearray]; to upload the images one by one. I need to show a progress view to the user as the images are uploaded one by one.
How do I notify after one image has been uploaded?
Or is there another approach that would be more useful for this than NSThread + NSNotification?
I suggest using something similar to the "delegate" paradigm, but thinking in terms of threads instead of objects. So the uploading thread delegates to the main thread, as it is the one that makes user interface changes.
For example, the uploading thread can send messages for partial upload progress:
[self performSelectorOnMainThread:@selector(uploadProgression:)
                       withObject:foo
                    waitUntilDone:NO];
or for each completed upload:
[self performSelectorOnMainThread:@selector(uploadDidEnd:)
                       withObject:foo
                    waitUntilDone:YES];
I assume you don't have to stop uploading to update partial progress in the progress view, but you do need to wait when an upload ends, so that you don't duplicate uploading threads by launching a new upload.
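Putting it together, the uploading thread could look roughly like this (uploadImage: and the selector names are assumptions, not a fixed API):

- (void)uploading:(NSArray *)images
{
    @autoreleasepool {                       // detached threads need their own pool
        for (UIImage *image in images) {
            [self uploadImage:image];        // hypothetical synchronous upload
            // Let the main thread update the progress view, and wait for it
            // so uploads stay strictly one by one.
            [self performSelectorOnMainThread:@selector(uploadDidEnd:)
                                   withObject:image
                                waitUntilDone:YES];
        }
    }
}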
You should use notifications when you don't know how many listeners are out there and you just post a notification about something. In your case you probably have only one view controller, so there is no need for notifications: just create a protocol for the delegate and implement it in your view controller. If you need to update your UI, you should also invoke all delegate methods using performSelectorOnMainThread:.
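A minimal sketch of such a protocol (the names are illustrative):

@protocol ImageUploaderDelegate <NSObject>
- (void)uploaderDidFinishUpload:(NSDictionary *)info; // e.g. index and total count
@end

// Invoke it from the upload thread via the main thread:
[(id)self.delegate performSelectorOnMainThread:@selector(uploaderDidFinishUpload:)
                                    withObject:info
                                 waitUntilDone:NO];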