Using a UISlider to change volume - cocoa-touch

What's the best way to use a UISlider to change the volume of the iPhone in an app?
I've tried the approach Bill suggested (Using a UISlider to change volume), but nothing appears on the screen when I use this code:
// Requires MediaPlayer.framework and #import <MediaPlayer/MediaPlayer.h>
MPVolumeView *volumeView = [[[MPVolumeView alloc] initWithFrame:
                             CGRectMake(0, 0, 215, 22)] autorelease];
volumeView.center = CGPointMake(150, 375);
[volumeView sizeToFit];
[self.view addSubview:volumeView];

I've just found a very easy way. Instead of coding, just place it in your XIB.
Open the XIB where you want to place the slider
Add a UIView to your view
Change the class identity from UIView to MPVolumeView
Change backgroundColor to clear
Voila!
PS: Tested it on a device with iPhone OS 3.0. As lostInTransit stated before, it won't work on the simulator.

Using a UISlider to change volume
^ I tried that, lostInTransit, but it is not changing the ringer volume. The slider comes up on the iPhone screen but doesn't change anything, even using the demo project provided.

I am using this same technique, however I am noticing the following behavior:
On an iPod touch G2:
The MPVolumeView slider works and fully responds to the rocker switch from app startup time.
On an iPhone Gen 1 and 3G:
The MPVolumeView slider doesn't start changing the volume until media playback has occurred via the AudioQueue APIs, and will stop changing the volume when AudioSessionSetActive(false) is called. Even more bizarre: if you hit the rocker switch on the iPhone while using the AudioQueue APIs during playback then the MPVolumeView slider will work for the rest of the lifetime of the app.
The code that keeps the audio session active tends to help ensure that the rocker switch and the slider stay in sync, but it doesn't eliminate the issues I am having with MPVolumeView on an iPhone.
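That snippet isn't reproduced above, but the gist is to bring up and keep active a media-playback audio session. Here is a minimal sketch using the C-based AudioSession API from the iPhone OS 3.0 era (my own reconstruction, not the poster's literal code):
#import <AudioToolbox/AudioToolbox.h>

// Sketch: activating a kAudioSessionCategory_MediaPlayback session is what
// lets the MPVolumeView slider track the hardware rocker switch.
static void ActivateMediaPlaybackSession(void)
{
    AudioSessionInitialize(NULL, NULL, NULL, NULL); // once, at startup
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);
    AudioSessionSetActive(true);
}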
All of my experience here is under iPhone OS 3.0 with the iPhone 3.0 SDK.
I hope this helps narrow down your problem somewhat.
My next tactic was to start doing things with kAudioSessionProperty_AudioRoute and try alternate values for kAudioSessionProperty_AudioCategory to see what happens.
Side note:
Thanks to everyone for the IB techniques for creating and managing the MPVolumeView. I was using code to create and place it, and I keep forgetting about the technique where you set the class of a UIView in IB.
Side note 2:
In working with the MPVolumeView I discovered that if you set the audio category to anything other than kAudioSessionCategory_MediaPlayback, the MPVolumeView will display the text "iPhone..." when the iPhone is set to silent mode.

Use an MPVolumeView to display a slider that allows the user to modify the system volume.

If you want to be able to change the iPhone volume, MPVolumeView is the only option. Try the example at the link below.
http://www.stormyprods.com/blogger/2008/09/proper-usage-of-mpvolumeview-class.html

Related

Different background images for landscape and portrait views

I am developing a universal app where I have given a background image for a UIView using the following code:
UIImageView *imgView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"backImage.png"]];
[self.view addSubview:imgView];
[self.view sendSubviewToBack:imgView];
I have created the following images and added them to the project:
backImage.png
backImage@2x.png
backImage-Portrait~ipad.png
backImage-Portrait@2x~ipad.png
backImage-Landscape~ipad.png
backImage-Landscape@2x~ipad.png
But it does not seem to take the correct images.
It works fine when I run the app on iPhone (it picks up backImage.png and backImage@2x.png correctly), but when I run it on iPad it does not use the iPad images; it still uses the iPhone images.
Can anyone please tell me what is going wrong?
Thanks in advance.
You need to detect in your code what type of device is being used, and depending on that, choose the appropriate image.
See here: iOS detect if user is on an iPad
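As a sketch of that idea (UI_USER_INTERFACE_IDIOM and the orientation check are standard UIKit; the base names just mirror the files above, minus the modifiers that imageNamed: resolves on its own; this assumes you are inside a UIViewController):
NSString *baseName;
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
    // Pick the orientation-specific iPad image by hand
    BOOL landscape = UIInterfaceOrientationIsLandscape(self.interfaceOrientation);
    baseName = landscape ? @"backImage-Landscape" : @"backImage-Portrait";
} else {
    baseName = @"backImage";
}
// imageNamed: still resolves the @2x and ~ipad suffixes automatically
UIImageView *imgView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:baseName]];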
I may be wrong, but I don't think UIImage knows whether to load the 'portrait' or 'landscape' version of an image, much less changes them dynamically when you rotate your interface. I think you're expecting a lot more magic than there is.
Those naming conventions work for the app launch images, but I think that's the extent of it?

Take a screenshot of a UIView whose subviews are camera sessions

I'm building an app where I need to take a screenshot of a view whose subviews are camera sessions (AVFoundation sessions). I've tried this code:
CGRect rect = [self.containerView bounds];
UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
// renderInContext: walks the layer tree, but the AV capture layers come out black
[self.containerView.layer renderInContext:context];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This effectively gets me a UIImage with the views, except that the camera sessions are black:
I've tried the private method UIGetScreenImage(), which works perfectly, but as Apple doesn't allow it, I can't use it. I've also tried the approach in Apple's docs, with the same result. I've tracked the problem down to the AVFoundation sessions using layers. How can I achieve this? The app has a container view with two views which are stopped camera sessions.
If using iOS 7, it's fairly simple and you could do something like this from a UIViewController:
UIView *snapshotView = [self.view snapshotViewAfterScreenUpdates:YES];
You can also check this link for more options: iOS: what's the fastest, most performant way to make a screenshot programmatically?
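If you need an actual UIImage rather than a snapshot view, the iOS 7 companion API drawViewHierarchyInRect:afterScreenUpdates: can render the hierarchy (including AV content) into a bitmap context; a short sketch:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, YES, 0.0f);
// Unlike renderInContext:, this draws what is actually on screen
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();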
For iOS 6 and earlier, I could only find the following Apple Technical Q&A: [How do I take a screenshot of my app that contains both UIKit and Camera elements?] Its two steps:
1. Capture the contents of your camera view.
2. Draw that captured camera content yourself into the graphics context in which you are rendering your UIKit elements (similar to what you did in your code). A rough sketch follows.
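Here is one way those two steps could look, assuming a hypothetical self.stillImageOutput (an AVCaptureStillImageOutput already attached to your session); this is my sketch, not the Q&A's literal code:
AVCaptureConnection *connection =
    [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                   completionHandler:
    ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        // Step 1: grab the camera content as a UIImage
        NSData *jpegData =
            [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *cameraImage = [UIImage imageWithData:jpegData];

        // Step 2: draw the camera frame first, then the UIKit hierarchy on top
        UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, YES, 0.0f);
        [cameraImage drawInRect:self.containerView.bounds];
        [self.containerView.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        // ... use composite here ...
    }];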
I too am currently looking for a solution to this problem!
I'm out at the moment so I can't test what I have found, but take a look at these links:
Screenshots - A Legal Way To Get Screenshots seems like it's on the right track. Here is the Example Project (and here is the initial post).
When I manage to get it to work I will definitely update this answer!

iOS UIImageView smooth Image presentation

One of the ways to improve the user experience in iOS when showing images is to download them asynchronously, without blocking the main thread, and only then show them.
But I want to add something to this:
1. Initially, when there is no image, show a spinner while the async download starts.
2. After the download, cache the image on the local iOS disk for later use.
3. After the download, populate the image part of the UIImageView.
4. Don't just plonk the image into the view for the user; slowly fade it in (i.e. from alpha 0.0 to 1.0).
I have been using SDWebImage for some time now. It works well but does not satisfy my 1st requirement (the spinner) or my 4th (the fade).
Is there any help out there to satisfy all this?
Three20 (http://www.three20.info) has a TTImageView class that satisfies 2-3; you can subclass it and override setImage: to create the fade animation there (or just modify TTImageView.m directly).
The spinner is easy as well: when you modify TTImageView you can add a TTActivityView on top and remove it in setImage:.
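If you'd rather not subclass, the spinner and the fade are a few lines of plain UIKit (iOS 4+). A sketch, with imageView and downloadedImage standing in for your own objects:
// Show a spinner while the download is in flight
UIActivityIndicatorView *spinner = [[[UIActivityIndicatorView alloc]
    initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray] autorelease];
spinner.center = imageView.center;
[spinner startAnimating];
[imageView.superview addSubview:spinner];

// ... later, in your download-completion callback ...
[spinner removeFromSuperview];
imageView.alpha = 0.0;
imageView.image = downloadedImage;
[UIView animateWithDuration:0.3 animations:^{
    imageView.alpha = 1.0; // fade the image in
}];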

AVPlayerLayer - ReProgramming the Wheel?

I'm currently using an AVPlayer, along with an AVPlayerLayer to play back some video. While playing back the video, I've registered for time updates every 30th of a second during the video. This is used to draw a graph of the acceleration at that point in the video, and have it update along with the video. The graph is using the CMTime from the video, so if I skip to a different portion of the video, the graph immediately represents that point in time in the video with no extra work.
Anywho, as far as I'm aware, if I want to get an interface similar to what the MediaPlayer framework offers, I'm going to have to do that myself.
What I'm wondering is: is there a way to use my AVPlayer with the MediaPlayer framework? (Not that I can see.) Or is there a way to register for incremental time updates with the MediaPlayer framework?
My code, if anyone is interested, follows:
[moviePlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 30)
                                          queue:dispatch_queue_create("eventQueue", NULL)
                                     usingBlock:^(CMTime time) {
    // Map the current playback time to an index into the acceleration data
    loopCount = (int)(CMTimeGetSeconds(time) * 30);
    if (loopCount < [dataPointArray count]) {
        // Redraw the graph layer on the main thread
        dispatch_sync(dispatch_get_main_queue(), ^{
            [graphLayer setNeedsDisplay];
        });
    }
}];
Thanks!
If you're talking about the window chrome displayed by MPMoviePlayer then I'm afraid you are looking at creating this UI yourself.
AFAIK there is no way of achieving the timing behaviour you need using the MediaPlayer framework, which is very much a simple "play some media" framework. You're doing the right thing by using AVFoundation.
Which leaves you needing to create the UI yourself. My suggestion would be to start with a XIB file to create the general layout; toolbar at the top with a done button, a large view that represents a custom playback view (using your AVPlayerLayer) and a separate view to contain your controls.
You'll need to write some custom controller code to automatically show/hide the playback controls and toolbar as needed if you want to simulate the MPMoviePlayer UI; a rough sketch of the show/hide part follows.
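Something like this is usually enough (controlsView and toolbar are hypothetical outlets on your controller, not anything from MediaPlayer):
// Fade the chrome in or out on tap, MPMoviePlayer-style
- (void)toggleChrome
{
    CGFloat target = (self.controlsView.alpha > 0.0) ? 0.0 : 1.0;
    [UIView animateWithDuration:0.25 animations:^{
        self.controlsView.alpha = target;
        self.toolbar.alpha = target;
    }];
}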
You can use https://bitbucket.org/brentsimmons/ngmovieplayer as a starting point (though it may not have existed at the time you asked).
From the project page: "Replicates much of the behavior of MPMoviePlayerViewController -- but uses AVFoundation."
You might want to look at the AVSynchronizedLayer class. I don't think there's a lot in the official programming guide; you can find bits of info here and there: subfurther, Otter Software.
In O'Reilly's Programming iOS 4 (or 5) there's also a short reference on how to make a square move/stop along a line in sync with the animation.
Another demo (not a lot of code) is shown during WWDC 2011 session Working with Media in AV Foundation.
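As a hedged sketch of how AVSynchronizedLayer ties a Core Animation timeline to the player item (the marker layer is made up; the key detail is AVCoreAnimationBeginTimeAtZero, which anchors the animation to t = 0 on the item's timeline so it scrubs and pauses with the video):
// A marker that slides across the graph in lock-step with playback
AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:moviePlayer.currentItem];
syncLayer.frame = graphLayer.bounds;

CABasicAnimation *slide = [CABasicAnimation animationWithKeyPath:@"position.x"];
slide.fromValue = [NSNumber numberWithFloat:0.0f];
slide.toValue = [NSNumber numberWithFloat:graphLayer.bounds.size.width];
slide.duration = CMTimeGetSeconds(moviePlayer.currentItem.asset.duration);
slide.beginTime = AVCoreAnimationBeginTimeAtZero; // item time, not wall-clock time
slide.removedOnCompletion = NO;

CALayer *marker = [CALayer layer];
marker.bounds = CGRectMake(0, 0, 2, graphLayer.bounds.size.height);
[marker addAnimation:slide forKey:@"slide"];
[syncLayer addSublayer:marker];
[graphLayer addSublayer:syncLayer];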

Cocoa, Flicker with animation when overlaying a NSWindow on a QTMovieView

In a project that I am currently working on I have a transparent NSWindow overlaid on a QTMovieView. At certain points I slide a custom view into this child window with animation so that it is displayed over the movie for a short period of time. The only odd behavior is that the animation is smooth on a MacBook Pro, but on a MacBook (same OS X version) there is significant flicker. The flicker only occurs on the portion of the window that has the actual QTMovie behind it.
Has anyone seen this behavior before or found a way to work around it?
The older MacBooks don't have real video hardware and use shared memory, so it's probably an issue with a slow video card trying to update at 30 fps. Have you tried smaller movies to see if the issue goes away?
You may be better off with a pipeline like the one in the QTCoreVideo101 sample code from Apple. That would be a bit more work, since you'd have to take care of the animation yourself, but you would get ultimate control over what is being drawn.