Find a similar voice. How and where? - voice-recognition

I have an mp3 file of an a cappella song (isolated vocals).
I want to find all the audio files on the internet where the voice is similar (not necessarily the same, just similar) to the one in my file (for example: David Bowie: https://youtu.be/FFrGGtKqCi4 ).
I want to search among videos on YouTube and among audio files on the Internet.
How to find audio files with a similar voice?
P.S. I have heard that audio files can be converted to spectrograms (images generated from the audio).
Maybe, to achieve my goal, I need to look for audio files with a similar spectrogram? (Where can I do that?)
(Something like the way reverse image search works in Google Images.)
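There is no ready-made "reverse voice search" comparable to Google Images, but the idea behind one is exactly what you describe: turn each recording into a compact feature representation (a spectrogram, or better, a speaker embedding from a speaker-verification model) and compare those representations with a distance measure. As a minimal sketch of just the comparison step, assuming Python with librosa and two files you have already downloaded yourself (the file names are made up):

    # Compare the rough "voice character" of two local audio files.
    # This does not search the internet; it only shows how audio can be
    # reduced to features and compared numerically.
    import numpy as np
    import librosa

    def voice_features(path):
        # librosa loads to mono and resamples by default
        y, sr = librosa.load(path)
        # MFCCs summarize the spectral envelope, which carries much of the
        # timbre of a voice; averaging over time gives one vector per file
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
        return mfcc.mean(axis=1)

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    query = voice_features("my_acapella.mp3")         # your isolated vocals
    candidate = voice_features("some_candidate.mp3")  # a downloaded candidate
    print("similarity:", cosine_similarity(query, candidate))

Averaged MFCCs are a crude stand-in; a dedicated speaker-embedding model (the kind used for speaker verification) will separate "similar voice" from "similar song" far better. Either way, searching YouTube or the wider internet still means crawling and downloading candidates yourself, or finding a service that has already indexed them.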

Related

How to determine number of voices in one recorded file in cocoa?

I want to extract information such as how many distinct voices (i.e. speakers) there are in one recorded file.
I know about the NSSpeechRecognizer class for recognising speech, but I am not able to find out the number of voices in one recorded file.
Please provide some suggestions.
Thanks,
Yogesh Arora
Counting the number of voices requires segmenting the audio file and diarising the speakers, which is by no means a simple task. Cocoa doesn't provide an API for that, at least yet. There are some open-source libraries that might help. You might want to check the ALIZE toolkit, for example.
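For a rough picture of what segmenting and diarising involves, here is a sketch using pyannote.audio, a Python toolkit (a different route than ALIZE, and only an illustration; the pretrained pipeline name and the access-token requirement are assumptions about how it is currently distributed and may differ between versions):

    # Count the voices in a recording by running speaker diarization.
    from pyannote.audio import Pipeline

    pipeline = Pipeline.from_pretrained(
        "pyannote/speaker-diarization",     # pretrained diarization pipeline
        use_auth_token="YOUR_HF_TOKEN",     # placeholder token
    )

    diarization = pipeline("recorded_file.wav")  # placeholder file name

    # Each track is a time span labelled with an anonymous speaker id
    for segment, _, speaker in diarization.itertracks(yield_label=True):
        print(f"{segment.start:.1f}s - {segment.end:.1f}s: {speaker}")

    print("number of voices:", len(diarization.labels()))

Cocoa still offers nothing equivalent, so in practice you would call out to a toolkit like this (or to ALIZE) from your application.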

Embedded ASS subtitle track in streamed video?

We'd like to build a small specialized clone of the ill-fated popcorn-time project, that is to say a node-webkit frontend for peerflix. The videos we'd like to play are mkv files that have embedded ASS subtitle tracks, and we can't seem to get the embedded subtitles to show up: while VLC nicely shows them, html5 video players in webkit-based things don't, not even in Google Chrome (so it's not a matter of Chromium's reduced codec support).
Now, I'm a bit out of my depth here - I don't really know much about these things - but it seems to me the media engine underneath webkit simply ignores the ASS subtitle track. Is it because it's ASS? Is it a matter of codecs somehow? Or is it, after all, an HTML5 thing? The HTML5 video "living standard" mentions that "captions can be provided, either embedded in the video stream or as external files using the track element" - so the feature is at least planned, but I do realize that implementation is lacking. However, given that node-webkit uses ffmpeg as the underlying engine, it seems strange to me that the subtitles are not picked up at all.
Could someone more knowledgeable please advise us on the problem? Also, is there anything we could do about it?
Extracting the subtitles beforehand is not an option, though I have been playing with the idea of extracting the subtitles on the fly and feeding that stream back to the player (a rough sketch of this approach appears after the answers below). I had some modest success with this, and it looks like it could be done with some effort, but I'm really out of my depth here, and the whole idea is pretty contrived anyway.
However, I find it improbable that nobody has run into this problem before, hence this question: is there any way to show embedded (ASS) subtitle tracks in a streamed video in node-webkit?
Not sure if this will help, but according to this page node-webkit doesn't ship with codecs for patented media formats. They do have a few suggestions on that page, one of which is to compile your own node-webkit.
You could try using Popcorn Time's ffmpegsumo file, which is what I used when I needed mp3 support and Chrome's version didn't work. I don't know whether it supports the ASS subtitle format, though (considering its use, I would think it has to).
Note: I would have left this as a comment, but unfortunately I don't have commenting privileges yet. A couple of upvotes sure would be nice ;)
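Coming back to the on-the-fly extraction idea from the question: one workable variant is to pull the embedded ASS track out of the MKV with ffmpeg and convert it to WebVTT, which HTML5 <track> elements do understand (the ASS styling and positioning are lost in the conversion). A sketch in Python that shells out to ffmpeg - the file names and the choice of the first subtitle stream are assumptions:

    # Extract the first embedded subtitle stream and convert it to WebVTT.
    import subprocess

    def extract_subtitles(mkv_path, vtt_path):
        subprocess.run(
            [
                "ffmpeg",
                "-i", mkv_path,
                "-map", "0:s:0",   # first subtitle stream in the file
                vtt_path,          # the .vtt extension selects the WebVTT muxer
            ],
            check=True,
        )

    extract_subtitles("movie.mkv", "movie.vtt")
    # The result can then be wired into the HTML5 player as an external track:
    # <video src="..."><track kind="subtitles" src="movie.vtt" default></video>

With a streamed or partial file this only works once enough of the MKV is available for ffmpeg to read the subtitle track, so it sidesteps rather than solves the question of native embedded-track support.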

iOS Localization of large files like movies

I am creating a new iOS project where I have to include a video in my app. We need to access this video in offline mode as well - so I need to include it as a video file in my project.
The question - what is the best practice for localizing such a large file? The app will be in at least 7 languages, and I cannot decide: include the video in 7 languages, which would dramatically increase the size of the app, or include it only in English and localize everything else? Perhaps someone knows - if my phone language is, for example, Spanish and I download a localizable app, does the app include the videos for all languages or only for the selected one?
Any answer will be appreciated, and thank you in advance.
You could do something nifty by bundling the video and the audio separately. You could then play the correct audio track for each localization. The same could be done with subtitles.
The most optimized way would be to not bundle any video in the application at all and just let users download the video for the current localization from within the app. That's what I think.
A single video can have multiple audio streams, so you can simply create one video with multiple audio streams. Audio is generally much smaller than video, so it should not hurt as much. The user can then select the audio language [or the program can preselect one based on some preference] for playback.
Almost all container formats can contain more than one audio stream. [Almost all; definitely MP4, which is what I guess you are using.]
EDIT: You could possibly also split this into two files: one with the more common half of the languages and one with the less common half, to shave off some size.
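For the multiple-audio-streams suggestion, the muxing itself can be done with ffmpeg without re-encoding anything. A sketch in Python that shells out to ffmpeg - the file names, the number of dubs, and the language tags are placeholders:

    # Mux one video track plus several language audio tracks into a single MP4,
    # tagging each audio stream with its language so a player can pick one.
    import subprocess

    subprocess.run(
        [
            "ffmpeg",
            "-i", "movie_en.mp4",   # original video + English audio
            "-i", "audio_es.m4a",   # Spanish dub
            "-i", "audio_fr.m4a",   # French dub
            "-map", "0:v", "-map", "0:a", "-map", "1:a", "-map", "2:a",
            "-c", "copy",           # remux only, no re-encoding
            "-metadata:s:a:0", "language=eng",
            "-metadata:s:a:1", "language=spa",
            "-metadata:s:a:2", "language=fra",
            "movie_multilang.mp4",
        ],
        check=True,
    )

On iOS the bundled result can then be played with AVPlayer, which exposes the audio tracks through its media selection API so you can switch to the stream matching the device language.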

Is there a video for writing to a plist in iOS?

I'm not saying I'm lazy (I am), but I want to read what is in the plist, and then, when the user does something, append that info to the existing plist and write it back so that the info is not lost.
Now that's the type of thing I would think there'd be a video on.
Thus, my question is twofold:
a) Where's a good video (iTunes, Stanford Univ, iOS App Programming Guide) on writing to a plist using file handlers? (gulp)
b) how can I do a search and restrict the media to videos (so I can find videos on these subjects in the future)?
I'm especially looking for how to handle the notorious first time, and how to make sure I don't overwrite good data by wrongly thinking it's empty.
There are plenty of such tutorials on YouTube. Have a look at this one:
iPhone Programming - Reading/Writing .plist Files
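Since the real worry here is the read-append-write cycle (and especially the notorious first run, when there is nothing to read yet), here is a tiny sketch of that logic. It uses Python's plistlib purely to illustrate the flow; on iOS the same steps map onto NSArray/NSDictionary plist reading and writing or PropertyListSerialization. The file name and keys are made up:

    # Safe read -> append -> write-back flow for a property list.
    import os
    import plistlib

    PLIST_PATH = "UserEvents.plist"   # hypothetical file name

    def load_events():
        # The notorious first time: if the file does not exist yet,
        # start from an empty list instead of failing or clobbering data.
        if not os.path.exists(PLIST_PATH):
            return []
        with open(PLIST_PATH, "rb") as f:
            return plistlib.load(f)

    def append_event(event):
        events = load_events()            # read what is already there
        events.append(event)              # add the new info
        tmp_path = PLIST_PATH + ".tmp"
        with open(tmp_path, "wb") as f:
            plistlib.dump(events, f)      # write the merged data
        os.replace(tmp_path, PLIST_PATH)  # swap in one step so a crash
                                          # mid-write can't destroy old data

    append_event({"action": "tapped_button", "count": 1})

The write-to-a-temporary-file-then-swap step gives the same guarantee you get on iOS from writeToFile:atomically: with atomically set to YES.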

Providing data as needed for QTMovie

I understand that if I want to provide a QTMovie with data from an arbitrary source as it is needed, I probably have to deal with the QTDataReference class, but unfortunately my experience with anything similar is limited to an audio data callback with Audio Buffers on the iPhone.
If I initialized the QTDataReference with some NSMutableData, how would I know when it needs more data and furthermore, how would I "clear" already played movie data and provide it again when the user seeks back (I want them to be able to)?
Basically the movie data I want to provide would in the end come from a set of files (which are really just one movie file split up), which become available sequentially during playback. This part is crucial.
Anybody who gets me going in the right direction can get beta access to the first Mac OS X Usenet movie streamer ;)
You likely can't do this using QTKit. Start with the QuickTime Overview and go from there.
The first idea that occurs to me is creating a QuickTime movie that directs QuickTime to look to the files you expect to download (or even to look directly to their remote URLs) for its media, then siccing QuickTime Player/QTMovieView on it and hoping that QuickTime's support for progressive download patches over any rough spots.