Slow down playback speed? - objective-c

How can I slow down an audio file (for playback only) on Mac OS X, but preserve good quality? I tried using QTKit to slow down audio but the quality is bad.
Edit: I'm using this code:
QTMovie *audio = [[QTMovie alloc] initWithFile:mediaClipURL error:&error];
// ... (error handling)
[audio setRate:0.5];

As "markratledge" guessed, I also suspect you want "speed adjustment without pitch bending." It's pretty straightforward to do without third-party code. You can set the QTMovieRateChangesPreservePitchAttribute attribute and just adjust the movie's rate:
QTMovie *movie = [[QTMovie alloc] initWithURL:mediaClipURL error:nil];
if (movie)
{
    // Tell QTKit to preserve the pitch when the playback rate changes
    [movie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieRateChangesPreservePitchAttribute];
    [movie setRate:0.5];
}
// ...
Note: The further away from 1.0 you are, the more distortion you're going to hear. There's really no way around this. Samples have to be repeated when playing slowly at the same pitch, and samples have to be cut very short when playing fast at the same pitch. It's a fact of audio processing: the more extreme the effect, the more distortion you'll eventually have.

The audio editor Audacity (http://audacity.sourceforge.net/) has an effect that increases or decreases tempo without changing pitch, and it is open source, so it might be a good place to look for applicable code.

Related

iPhone iOS6 how to return Image IO type "resident dirty memory" to the OS?

I"m looking at the WWDC 2010 video which deals with advanced memory analysis( session 311):
At around 45:00 into the video, the performance engineer discusses what to do with "Resident Dirty memory" that your app has loaded in RAM. The engineer suggests that in response to memory warnings, your app should clear this. The engineer pastes in his custom class "flush" method into didReceiveMemoryWarning and everything is fine, but the code does not really offer any examples of HOW the memory is to be freed.
The question that I have is - how to do I flush large chunks of dirty memory used by "Image IO"? :
Here's around 74 mb of memory just sitting around dirty ( for close to 6 minutes now), waiting for someone to return it to iOS6. Nothing is happening to it. Since it does not go away on its own, I need to know how to return it to iOS.
These blocks appear to originate from code like this and (maybe other image related operations).
UIImage *screenshot = nil;
@autoreleasepool {
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
        UIGraphicsBeginImageContextWithOptions(iPhoneRetinaIconSize, NO, [UIScreen mainScreen].scale);
    else
        UIGraphicsBeginImageContext(iPhoneRetinaIconSize);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
The issue is that there's a lot of memory sitting around, loaded in RAM, unable to be returned to the operating system until the app crashes.
For webview-related dirty memory, I found that this may work:
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    [[NSURLCache sharedURLCache] removeAllCachedResponses];
    [[NSURLCache sharedURLCache] setDiskCapacity:0];
    [[NSURLCache sharedURLCache] setMemoryCapacity:0];
    // Dispose of any resources that can be recreated.
}
Is there an equivalent for UIImage, CALayer, or UIGraphics?
I am far from an expert in these issues, but based on the tests I conducted with the code you provided, I'd say you just have to release the UIImages created in these blocks of code.
As far as I understand, the chunks of memory labeled Image IO or CG raster data are really just the underlying data of your images (UIImage being a UIKit wrapper on top of that data). So to release the memory, release the image.
I tested this by creating a bunch of UIImages using your code and then simulating a memory warning, which released all of the created images.
Releasing my UIImages (at ~00:08 in my Instruments trace) removed the big CG raster data chunk from resident memory.
Because completely removing an image from your UI may not be the best solution for the user experience, maybe you could downsize your largest images when receiving a memory warning; a lower resolution results in a smaller memory footprint. Another idea (again, this depends on what your images are used for) could be to dump the images to disk and load them again later, when needed. A rough sketch of both ideas is below.
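For instance, a minimal sketch of a memory-warning handler (self.largeImage is a hypothetical property standing in for however you keep a strong reference to the image):
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];

    // Option 1: drop the reference entirely. Under ARC, nilling the last
    // strong reference releases the UIImage and its backing raster data.
    // self.largeImage = nil;

    // Option 2: replace the image with a smaller rendition so something
    // stays on screen while the resident footprint shrinks.
    if (self.largeImage)
    {
        CGSize smallSize = CGSizeMake(self.largeImage.size.width / 4.0,
                                      self.largeImage.size.height / 4.0);
        UIGraphicsBeginImageContextWithOptions(smallSize, NO, 0.0);
        [self.largeImage drawInRect:CGRectMake(0, 0, smallSize.width, smallSize.height)];
        self.largeImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}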
Hope that helps.

Problems creating lots of thumbnails using AVAssetImageGenerator with ARC

I've looked through every list I can find for ideas on this but have been stuck for a couple of days now so here's hoping...
I'm trying to create a lot of video thumbnails (100 or so) at once using AVAssetImageGenerator.
While testing I'm always using the same movie files, but the process (seemingly) randomly succeeds or fails, plus I now get all the movies loaded into physical memory at once, which I don't want. I'm obviously doing something wrong but cannot figure out what - and under ARC I can't manually manage the memory.
Background;
It's an AVFoundation video player using ARC (Xcode 4.2.1, OSX 10.7.2, MBP 2.4GHz i5, 4GB RAM).
It has a playlist style interface allowing you to drag folders of movie files to it. It recurses through the paths it's given and plucks out the valid movies and adds them to the list.
Movies are represented by a Movie class which has an AVAsset member variable (movieAsset) initialised with the file path. So far so good - everything works, movies can be played, only the movie being played is in physical memory.
I've added a -createThumbnail method to the Movie class which is called in the Movie's init method (code snippet below).
With the addition of this code I'm getting a few behaviours I can't eradicate - none of which occur if I don't call the -createThumbnail code below. Any ideas where I'm going wrong?
Every movie added to the playlist is now being loaded into physical memory immediately, so the app's memory footprint has gone way up (without the thumbnail code it is 40MB for 100 movies; with thumbnails (NSImages at 32x18 pixels) it is 750MB for the same 100 movies).
Looking at Activity Monitor->Open Files and Ports, all the movie files are listed even after thumbnail creation has finished. This didn't occur before - only the movie being played was listed.
Thumbnail creation completely locks up my machine until it's complete, even though I'm calling AVAssetImageGenerator within an asynchronous block (CPU usage never gets above 35%). Could this be a disk access problem from trying to read 100 movies at once?
Thumbnail creation is very erratic. Sometimes all thumbnails are created, but often a random 30-70% are not. Maybe also a disk access problem?
I'm very new to OOP and Obj-C so have probably made a newbie's mistake - but I just can't track it down...
Also worth noting that the "Error creating thumbnail" and "Error finding duration" NSLogs are never called...
-(void)createThumbnail{
    NSArray *keys = [NSArray arrayWithObject:@"duration"];
    [movieAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
        NSError *error = nil;
        AVKeyValueStatus valueStatus = [movieAsset statusOfValueForKey:@"duration" error:&error];
        switch (valueStatus) {
            case AVKeyValueStatusLoaded:
                if ([movieAsset tracksWithMediaCharacteristic:AVMediaTypeVideo]) {
                    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:movieAsset];
                    Float64 movieDuration = CMTimeGetSeconds([movieAsset duration]);
                    CMTime middleFrame = CMTimeMakeWithSeconds(movieDuration/2.0, 600);
                    CGImageRef imageForThumbnail = [imageGenerator copyCGImageAtTime:middleFrame actualTime:NULL error:NULL];
                    if (imageForThumbnail != NULL) {
                        NSSize sizeOption = NSMakeSize(32, 18);
                        self.thumbnail = [[NSImage alloc] initWithCGImage:imageForThumbnail size:sizeOption];
                        NSLog(@"Thumbnail created for %@", [self filenameString]);
                    }
                    else {
                        NSLog(@"-----Error creating thumbnail for %@", [self filenameString]);
                    }
                    CGImageRelease(imageForThumbnail);
                }
                break;
            case AVKeyValueStatusFailed:
                NSLog(@"Error finding duration");
                break;
            case AVKeyValueStatusCancelled:
                NSLog(@"Cancelled finding duration");
                break;
        }
    }];
}
(Note: I've been using the same few folders of movie files to develop the app for the last month or so. They're all local, valid files that play successfully in the app. Some of the dropped folders contain over a hundred movie files nested within various subfolders.)
Many thanks if anyone can help.
Chas.
I had some weird issues when using AVAssetImageGenerator in this way as well (on iOS, at least). It seems to me to be somewhat broken when used in combination with the blocks/GCD API. Rather than kicking everything off asynchronously at once, try making a single queue/loop that operates on one background thread and walk through the movies that way - see the sketch below. This should also help keep your memory usage down.
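Something along these lines might work - a minimal sketch, assuming a movies array on the controller and reusing the Movie class, movieAsset, and thumbnail property from your code (the queue name is arbitrary):
// Generate thumbnails one movie at a time on a single serial background
// queue, so only one asset is open and being read at any moment.
dispatch_queue_t thumbnailQueue = dispatch_queue_create("com.example.thumbnails", DISPATCH_QUEUE_SERIAL);

dispatch_async(thumbnailQueue, ^{
    for (Movie *movie in self.movies) {
        @autoreleasepool {
            AVAsset *asset = movie.movieAsset;
            AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];

            Float64 duration = CMTimeGetSeconds([asset duration]);
            CMTime middleFrame = CMTimeMakeWithSeconds(duration / 2.0, 600);

            NSError *error = nil;
            CGImageRef cgImage = [generator copyCGImageAtTime:middleFrame actualTime:NULL error:&error];
            if (cgImage != NULL) {
                NSImage *thumb = [[NSImage alloc] initWithCGImage:cgImage size:NSMakeSize(32, 18)];
                CGImageRelease(cgImage);

                // Hand the result back to the main thread for UI/model updates
                dispatch_async(dispatch_get_main_queue(), ^{
                    movie.thumbnail = thumb;
                });
            }
        }
    }
});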

Frame synchronization with AVPlayer

I'm having an issue syncing external content in a CALayer with an AVPlayer at high precision.
My first thought was to lay out an array of frames (equal to the number of frames in the video) within a CAKeyframeAnimation and sync with an AVSynchronizedLayer. However, upon stepping through the video frame-by-frame, it appears that AVPlayer and Core Animation redraw on different cycles, as there is a slight (but noticeable) delay between them before they sync up.
Short of processing and displaying through Core Video, is there a way to accurately sync with an AVPlayer on the frame level?
Update: February 5, 2012
So far the best way I've found to do this is to pre-render through AVAssetExportSession coupled with AVVideoCompositionCoreAnimationTool and a CAKeyframeAnimation.
I'm still very interested in learning of any real-time ways to do this, however.
What do you mean by 'high precision?'
Although the docs claim that AVAssetReader is not designed for real-time usage, in practice I have had no problems reading video in real time using it (cf. https://stackoverflow.com/a/4216161/42961). The returned frames come with a presentation timestamp, which you can fetch using CMSampleBufferGetPresentationTimeStamp.
You'll want one part of the project to be the 'master' timekeeper here. Assuming your CALayer animation is quick to compute and doesn't involve potentially blocking things like disk access, I'd use that as the master time source. When you need to draw content (e.g. in the draw selector of your UIView subclass), read currentTime from the CALayer animation, step through the AVAssetReader's video frames using copyNextSampleBuffer until CMSampleBufferGetPresentationTimeStamp returns a value >= currentTime, draw that frame, and then draw the CALayer animation content over the top.
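A rough sketch of the frame-stepping part, assuming you already have an AVAssetReaderTrackOutput in a readerOutput property, keep the most recent sample buffer in a lastSampleBuffer property (declared assign, since it's a CF type), and have some hypothetical drawSampleBuffer: helper that actually renders it:
// Advance the reader until the next frame's presentation timestamp catches
// up with the animation clock, then draw that frame.
- (void)drawVideoFrameForAnimationTime:(CFTimeInterval)currentTime
{
    while (self.lastSampleBuffer == NULL ||
           CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(self.lastSampleBuffer)) < currentTime)
    {
        CMSampleBufferRef nextBuffer = [self.readerOutput copyNextSampleBuffer];
        if (nextBuffer == NULL) {
            break;  // reader finished or failed; keep the last frame we have
        }
        if (self.lastSampleBuffer != NULL) {
            CFRelease(self.lastSampleBuffer);
        }
        self.lastSampleBuffer = nextBuffer;
    }

    if (self.lastSampleBuffer != NULL) {
        [self drawSampleBuffer:self.lastSampleBuffer];  // hypothetical drawing helper
    }
}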
If your player is using an AVURLAsset, did you load it with the precise duration flag set? I.e. something like:
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:aUrl options:options];

How to speed up saving a UIImagePickerController image from the camera to the filesystem via UIImagePNGRepresentation()?

I'm making an application that lets users take a photo and shows it both as a thumbnail and in a photo viewer.
I have an NSManagedObject class called Photo, and Photo has a method that takes a UIImage, converts it to PNG using UIImagePNGRepresentation(), and saves it to the filesystem.
After this operation, it resizes the image to thumbnail size and saves that as well.
The problem is that UIImagePNGRepresentation() and the image resizing seem to be really slow, and I don't know if this is the right way to do it.
Please tell me if anyone knows a better way to accomplish what I want to do.
Thank you in advance.
Depending on the image resolution, UIImagePNGRepresentation can indeed be quite slow, as can any writing to the file system.
You should always execute these types of operations in an asynchronous queue. Even if the performance seems good enough for your application when testing, you should still do it on an async queue -- you never know what other processes the device might have going on that might slow the save down once your app is in the hands of users.
Newer versions of iOS make saving asynchronously really, really easy using NSBlockOperation and NSOperationQueue (which are built on top of Grand Central Dispatch). The steps are:
Create an NSBlockOperation which saves the image
In the block operation's completion block, read the image from disk & display it. The only caveat here is that you must use the main queue to display the image: all UI operations must occur on the main thread.
Add the block operation to an operation queue and watch it go!
That's it. And here's the code:
// Create a block operation with our saves
NSBlockOperation* saveOp = [NSBlockOperation blockOperationWithBlock: ^{
    [UIImagePNGRepresentation(image) writeToFile:file atomically:YES];
    [UIImagePNGRepresentation(thumbImage) writeToFile:thumbfile atomically:YES];
}];

// Use the completion block to update our UI from the main queue
[saveOp setCompletionBlock:^{
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        UIImage *image = [UIImage imageWithContentsOfFile:thumbfile];
        // TODO: Assign image to imageview
    }];
}];

// Kick off the operation, sit back, and relax. Go answer some stackoverflow
// questions or something.
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[queue addOperation:saveOp];
Once you are comfortable with this code pattern, you will find yourself using it a lot. It's incredibly useful when generating large datasets, long operations on load, etc. Essentially, any operation that makes your UI laggy in the least is a good candidate for this code. Just remember, you can't do anything to the UI while you aren't in the main queue and everything else is cake.
Yes, it does take time on iPhone 4, where the image size is around 6 MB. The solution is to execute UIImagePNGRepresentation() in a background thread, using performSelectorInBackground:withObject:, so that your UI thread does not freeze.
It will probably be much faster to do the resizing before converting to PNG.
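For example, a minimal sketch of resizing first (fullImage, thumbSize, and thumbPath are placeholder names, not from the question):
// Scale the UIImage down first, then encode only the small version as PNG.
CGSize thumbSize = CGSizeMake(90.0, 90.0);

UIGraphicsBeginImageContextWithOptions(thumbSize, NO, 0.0);
[fullImage drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)];
UIImage *thumbImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Encoding the small image is much cheaper than encoding the full-size photo.
NSData *thumbData = UIImagePNGRepresentation(thumbImage);
[thumbData writeToFile:thumbPath atomically:YES];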
Try UIImageJPEGRepresentation with a medium compression quality. If the bottleneck is I/O then this may prove faster, as the file size will generally be smaller than a PNG.
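For instance (image and path are placeholder names):
// 0.6 is a medium compression quality; the resulting JPEG is usually much
// smaller than the PNG equivalent, so there is less data to write to disk.
NSData *jpegData = UIImageJPEGRepresentation(image, 0.6);
[jpegData writeToFile:path atomically:YES];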
Use Instruments to check whether UIImagePNGRepresentation is the slow part or whether it is writing the data out to the filesystem which is slow.

How to seek within an audio track using AVAssetReader?

I'm familiar with how to stream audio data from the iPod library using AVAssetReader, but I'm at a loss as to how to seek within the track, e.g. start playback at the halfway point. Starting from the beginning and then sequentially getting successive samples is easy, but surely there must be a way to have random access?
AVAssetReader has a property, timeRange, which determines the time range of the asset from which media data will be read.
@property(nonatomic) CMTimeRange timeRange
The intersection of the value of this property and CMTimeRangeMake(kCMTimeZero, asset.duration) determines the time range of the asset from which media data will be read.
The default value is CMTimeRangeMake(kCMTimeZero, kCMTimePositiveInfinity). You cannot change the value of this property after reading has started.
So, if you want to seek to the middle of the track, you'd create a CMTimeRange that starts at asset.duration/2 and runs to the end of the asset, and set that as the timeRange on the AVAssetReader before you start reading.
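A minimal sketch, assuming asset and a freshly created assetReader (an AVAssetReader for that asset) already exist:
// Read from the halfway point to the end of the asset. The timeRange must be
// set before calling startReading.
CMTime halfway = CMTimeMultiplyByRatio(asset.duration, 1, 2);
CMTime remaining = CMTimeSubtract(asset.duration, halfway);
assetReader.timeRange = CMTimeRangeMake(halfway, remaining);

[assetReader startReading];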
AVAssetReader is amazingly slow when seeking. If you try to recreate an AVAssetReader to seek while the user is dragging a slider, your app will bring iOS to its knees.
Instead, you should use an AVAssetReader for fast-forward-only access to video frames, and also use an AVPlayerItem and AVPlayerItemVideoOutput when the user wants to seek with a slider.
It would be nice if Apple combined AVAssetReader and AVPlayerItem / AVPlayerItemVideoOutput into a new class that was performant and was able to seek quickly.
Be aware that AVPlayerItemVideoOutput will not give back pixel buffers unless there is an AVPlayer attached to the AVPlayerItem. This is obviously a strange implementation detail, but it is what it is.
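A rough sketch of that setup (the pixel format choice and the variable names are assumptions, and the frame-pulling part would normally live in a display link callback):
// Attach a video output to the player item. Pixel buffers are only vended
// once the item is associated with an AVPlayer.
NSDictionary *pixelAttributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixelAttributes];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
[playerItem addOutput:videoOutput];

AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];  // required, even if you never call -play

// Later, pull the frame for the current time:
CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
    CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    // ... render the pixel buffer or hand it to an AVAssetWriter ...
    if (pixelBuffer) {
        CVBufferRelease(pixelBuffer);
    }
}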
If you are using AVPlayer and AVPlayerLayer, then you can simply use the seek methods on AVPlayer itself. The above details are only important if you are doing custom rendering with the pixel buffers and/or need to send the pixel buffers to an AVAssetWriter.
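For completeness, a frame-accurate seek on AVPlayer itself looks something like this (player is assumed to exist; the zero tolerances trade speed for precision):
CMTime target = CMTimeMakeWithSeconds(30.0, 600);  // e.g. jump to the 30-second mark
[player seekToTime:target
   toleranceBefore:kCMTimeZero
    toleranceAfter:kCMTimeZero
 completionHandler:^(BOOL finished) {
     // finished is NO if the seek was interrupted by a newer seek request
 }];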