Is there a way to determine the duration of a video currently set on a WKInterfaceInlineMovie? I need to implement a circular progress bar displaying the current progress.
I have a URL for the file, initially downloaded from the network. It plays well, but I haven't found any way to determine its length (oddly, I haven't found any questions asking about this either).
Of course, I can ask the backend server to send this info, but I'd like to avoid such complications if possible.
OK, it seems I worked around the WKInterfaceInlineMovie API limitation with the help of AVFoundation and Core Media.
I create an AVAsset object using the movie URL from the shared folder (AVAsset(url:)). Then I read the asset's duration property, which is a CMTime.
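A minimal Objective-C sketch of the same idea (movieURL here stands for the file URL from the shared folder, the same one handed to the movie control):

#import <AVFoundation/AVFoundation.h>

// movieURL is assumed to be the same file URL set on the WKInterfaceInlineMovie.
AVAsset *asset = [AVAsset assetWithURL:movieURL];
CMTime duration = asset.duration;              // duration is a CMTime
Float64 seconds = CMTimeGetSeconds(duration);  // convert to seconds for the progress bar
NSLog(@"Movie length: %.2f seconds", seconds);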
Actually, I was very surprised to find out that it works. I'm still testing it, because it seems too good to be true and I'm expecting to run into some pitfalls. I'll update the answer if I find anything else.
I've started looking into how I can use NAudio to play sounds through different output devices.
Now I would like to use NAudio for the following use case:
I would like to play a ringback tone when a call arrives on a softphone. The ringback audio (a WAV file of 3-5 seconds) should be played repeatedly until the call is accepted or the caller hangs up.
I have found two ways of doing it:
The following entry explains how this could be solved by playing a file in a loop:
NAudio looping an audio file
Another entry explains how to work this out by using a timer to play the file repeatedly:
NAudio - Play an audiofile, wait for 2 seconds, play the audio file again
The question is: which way is the better one to go for? Does it make sense to restart playback frequently for such a short file?
Another question that arises is whether, in the looping case, there is a way to make sure that playback has really stopped; it makes no sense for the ringback to keep playing from the loudspeaker when the person is already talking to the caller.
Thank you very much for your support!
Uzay
I'd recommend using the looping solution in this scenario. It avoids the need to keep closing and opening the soundcard. Stopping works exactly the same whether you are looping or not.
I have been attempting to change which audio device my computer sends sound to. My end goal is to create a program that can make my laptop output to its built-in speakers even when headphones are plugged into the headphone jack.
I stumbled across this project, but the methods it uses (specifically AudioHardwareSetProperty) are deprecated. It also just doesn't work (it will say it changed the output device, but sound will still go to my headphones).
CoreAudio seems very poorly documented, and I could not find ANY code online that did not use that function. I would go with the deprecated function if it did what I wanted, but it doesn't. I'm unsure whether it's broken or just doesn't do what I think it does, but that doesn't really matter in the end.
I attempted to look at the comments on AudioHardwareSetProperty, but all I found was this in the discussion section:
Note that the value of the property should not be considered changed until the HAL has called the listeners as many properties values are changed asynchronously. Also note that the same functionality is provided by the function AudioObjectGetPropertyData().
This is obviously not true, since I know for a fact that AudioObjectGetPropertyData is used for getting information about one specific audio device.
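For reference, this is roughly how that getter is used; it reads the current default output device from the HAL's system object (the symmetric setter, AudioObjectSetPropertyData, is the non-deprecated counterpart of AudioHardwareSetProperty):

#import <CoreAudio/CoreAudio.h>

// Sketch: read the current default output device via the HAL's system object.
AudioObjectPropertyAddress address = {
    kAudioHardwarePropertyDefaultOutputDevice,
    kAudioObjectPropertyScopeGlobal,
    kAudioObjectPropertyElementMaster
};
AudioDeviceID deviceID = kAudioObjectUnknown;
UInt32 dataSize = sizeof(deviceID);
OSStatus status = AudioObjectGetPropertyData(kAudioObjectSystemObject,
                                             &address, 0, NULL,
                                             &dataSize, &deviceID);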
Is what I am trying to do possible with CoreAudio?
I'm developing a Mac app and I need to check whether iTunes (11.0) is shuffling my music. To check that, I'm using iTunes.h and the following code:
if ([iTunes.currentPlaylist shuffle]) {
    NSLog(@"yes");
} else {
    NSLog(@"no");
}
Even though iTunes is shuffling, it always outputs "no".
Any ideas why this is happening, or am I checking it the wrong way?
This much I know: in iTunes 11 some things changed. One of them is that the "shuffle" option is now playlist-independent...
Good luck with that.
I reported a bug about a month ago, like tons of other developers.
Didn't hear anything, and probably won't.
Like DigiMonk wrote, it's a change in iTunes 11, but Apple hasn't updated the scripting API.
A workaround for the time being might be to listen for NSDistributedNotifications and check whether the attributes of the currently playing track match the previous or next one. It's not the cleanest solution, but it should work if what's being shuffled is an album or an artist: just check whether the track numbers run in sequence, whether the artist name stays the same, and so on.
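A sketch of that idea, assuming iTunes' usual "com.apple.iTunes.playerInfo" distributed notification and its typical userInfo keys ("Track Number", "Album"):

// Register somewhere in your controller, e.g. in -applicationDidFinishLaunching:
[[NSDistributedNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(iTunesTrackChanged:)
           name:@"com.apple.iTunes.playerInfo"
         object:nil];

- (void)iTunesTrackChanged:(NSNotification *)notification {
    NSDictionary *info = [notification userInfo];
    // If "Track Number" stops increasing sequentially within the same
    // "Album", the playback order was probably shuffled.
    NSNumber *trackNumber = info[@"Track Number"];
    NSString *album = info[@"Album"];
    NSLog(@"Now playing track %@ of %@", trackNumber, album);
}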
I'm working on a hobby project which I'm slowly updating in my spare time to help learn some new things. One stumbling block I've come across is working with Core Data on a separate thread. I've read Apple's documentation about Core Data concurrency, and everything seemed straightforward enough, so I began updating my project to load data on a background thread, as I don't want to lock up the UI while things are loading.
The project works fine if the Core Data object is loaded on the main thread. It crashes if I switch to background loading.
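For context, the pattern I'm following is roughly the thread-confinement approach from those docs: one NSManagedObjectContext per thread, sharing a single persistent store coordinator (the entity name "Item" and the coordinator variable below are placeholders for my actual setup):

// Launched via -[NSThread detachNewThreadSelector:toTarget:withObject:].
- (void)loadInBackground {
    @autoreleasepool {
        // Each thread gets its own context on the shared coordinator.
        NSManagedObjectContext *context = [[NSManagedObjectContext alloc] init];
        [context setPersistentStoreCoordinator:coordinator];

        NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Item"];
        NSError *error = nil;
        NSArray *results = [context executeFetchRequest:request error:&error];
        // Hand results back to the main thread via object IDs,
        // not the managed objects themselves.
    }
}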
At this stage, I can verify that:
The NSManagedObject loads on the thread and I can access its properties.
Outputting the data to stdout works fine and looks correct.
A binary comparison of the data object loaded on the main thread and the data loaded on the background thread proves they are identical.
The actual problem occurs when I call a category method on NSData. I can verify the NSData object is fine when it's loaded on the background thread; it's only when I call a function to do some work on it afterwards that I get a problem. The problem is an EXC_BAD_ACCESS, which usually means the address of an object is wrong, but that doesn't quite make sense here.
I'm probably just getting something obvious or simple wrong - but I just can't see the forest for the trees.
If you think you can offer any advice on this, as it's driving me crazy, you can find the code here:
Edit (post-answer): removed the URL as the project no longer exists.
OK, I've finally found out what the problem was. The decompression method was exceeding the stack size of the thread, causing a weird and random EXC_BAD_ACCESS to be fired.
I would have expected the debugger to produce a more direct clue in this case.
So: a genuine 'stack overflow' problem, solved.
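For anyone hitting the same thing: if you create the thread yourself, NSThread lets you raise the stack size before starting it. A sketch (the 4 MB figure and the loadInBackground selector are just examples; stackSize must be a multiple of 4 KB):

// Give the worker thread a larger stack before starting it.
NSThread *worker = [[NSThread alloc] initWithTarget:self
                                           selector:@selector(loadInBackground)
                                             object:nil];
worker.stackSize = 4 * 1024 * 1024;   // must be a multiple of 4 KB
[worker start];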
I know this has been asked before, but the only answers I have found use UIImages. I need to make a 25 fps video from an NSArray of NSImages in Objective-C. Could somebody give me a link to the documentation dealing with this (if there is any), or tell me how I can do it?
NOTE: I will also need to know which frameworks to use if there is no documentation on this. And, before you ask, I have done lots of searches for the documentation.
You can do this with QTKit by creating a movie and adding images as frames. See the "Creating a Single Frame-Grabbing Application" section. Step 13 specifically demonstrates how to add an image and have it last a specific duration in the movie (it should probably last more than a single frame for, say, stop-motion work).
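A minimal sketch of that approach, following the pattern from that guide (the frames array and the output path are placeholders; codecHighQuality comes from the QuickTime headers):

#import <QTKit/QTKit.h>

NSError *error = nil;
QTMovie *movie = [[QTMovie alloc] initToWritableFile:@"/tmp/output.mov"
                                               error:&error];

// Codec attributes as used in Apple's frame-grabbing example.
NSDictionary *attributes = @{ QTAddImageCodecType : @"mp4v",
                              QTAddImageCodecQuality : [NSNumber numberWithLong:codecHighQuality] };

// 25 fps: each image lasts 1/25 of a second.
QTTime frameDuration = QTMakeTime(1, 25);

for (NSImage *image in frames) {
    [movie addImage:image
        forDuration:frameDuration
     withAttributes:attributes];
}
[movie updateMovieFile];   // flush the new frames to disk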