Accessing iSight programmatically? - objective-c

Is it possible to access the iSight camera on a MacBook programmatically? By this I mean I would like to be able to just grab still frames from the iSight camera on command and then do something with them. If so, is it only accessible using Objective-C, or could other languages be used as well?

You should check out the QTKit Capture documentation.
On Leopard, you can get at all of it over the RubyCocoa bridge:
require 'osx/cocoa'
OSX.require_framework("/System/Library/Frameworks/QTKit.framework")

# List every capture device the system knows about (the built-in iSight shows up here)
OSX::QTCaptureDevice.inputDevices.each do |device|
  puts device.localizedDisplayName
end
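If you would rather do it from Objective-C, the same QTKit capture classes are available there. A rough sketch (the FrameGrabber class name is mine, and memory management plus thread safety are trimmed for brevity):

#import <QTKit/QTKit.h>

// Hypothetical helper class for grabbing frames from the default camera.
@interface FrameGrabber : NSObject {
    QTCaptureSession *session;
    CVImageBufferRef  latestFrame;
}
- (BOOL)start;
@end

@implementation FrameGrabber

- (BOOL)start
{
    NSError *error = nil;

    // The built-in iSight is the default video input device.
    QTCaptureDevice *device =
        [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    if (![device open:&error]) return NO;

    session = [[QTCaptureSession alloc] init];
    QTCaptureDeviceInput *input =
        [[QTCaptureDeviceInput alloc] initWithDevice:device];
    if (![session addInput:input error:&error]) return NO;

    // Decompressed output hands us raw frames via the delegate callback below.
    QTCaptureDecompressedVideoOutput *output =
        [[QTCaptureDecompressedVideoOutput alloc] init];
    [output setDelegate:self];
    if (![session addOutput:output error:&error]) return NO;

    [session startRunning];
    return YES;
}

// Called for every frame (on a background thread); keep the most recent one.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CVBufferRetain(videoFrame);
    CVBufferRelease(latestFrame);
    latestFrame = videoFrame;
}

@end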

I don't have a Mac here, but there is some documentation here:
http://developer.apple.com/documentation/Hardware/Conceptual/iSightProgGuide/01introduction/chapter_1_section_1.html
It looks like you have to go through the QuickTime API. There is supposed to be a sample project called "MungGrab" which, according to this thread, could be worth a look.

If you poke around Apple's mailing lists you can find some code to do it in Java as well. Here's a simple example suitable for capturing individual frames, and here's a more complicated one that's fast enough to display live video.

There's a command line utility called isightcapture that does more or less what you want to do. You could probably get the code from the developer (his e-mail address is in the readme you get when you download the utility).

One thing that hasn't been mentioned so far is the IKPictureTaker, which is part of Image Kit. It brings up the standard OS-provided panel for taking pictures, though, with all of the filter functionality and so on included, so I'm not sure if that's what you want.
I suppose you can use it from other languages as well, given that there are things like Cocoa bridges, but I have no experience with them.
Googling also turned up another question on Stack Overflow that seems to address this issue.
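For reference, a minimal Objective-C sketch of driving that panel (the PictureController class and selector names are mine; IKPictureTaker and outputImage come from Image Kit):

#import <Cocoa/Cocoa.h>
#import <Quartz/Quartz.h>   // Image Kit (IKPictureTaker) lives in the Quartz framework

// Hypothetical controller that pops up the picture-taker panel.
@interface PictureController : NSObject
- (void)takePicture;
@end

@implementation PictureController

- (void)takePicture
{
    IKPictureTaker *taker = [IKPictureTaker pictureTaker];
    [taker beginPictureTakerWithDelegate:self
                          didEndSelector:@selector(pictureTakerDidEnd:returnCode:contextInfo:)
                             contextInfo:NULL];
}

- (void)pictureTakerDidEnd:(IKPictureTaker *)taker
                returnCode:(NSInteger)returnCode
               contextInfo:(void *)contextInfo
{
    if (returnCode == NSOKButton) {
        NSImage *image = [taker outputImage];   // the picture the user just took/picked
        NSLog(@"got image of size %@", NSStringFromSize([image size]));
    }
}

@end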

Aside from Objective-C, you can use the PyObjC or RubyCocoa bindings to access it as well. If you're not picky about the language, I'd say use Ruby, as PyObjC is very poorly documented (even the official Apple page on it refers to the old version, not the one that came with OS X Leopard).
Quartz Composer is probably the easiest way to access it, and .quartz files can be embedded in applications pretty easily (with the data piped out to Objective-C or the like).
Also, I suppose there should be an example or two of this in /Developer/Examples/.

From a related question that specifically asked for a Pythonic solution: give motmot's camiface library from Andrew Straw a try. It works with FireWire cameras, but it also works with the iSight, which is what you are looking for.
From the tutorial:
import motmot.cam_iface.cam_iface_ctypes as cam_iface
import numpy as np

mode_num = 0     # capture mode (resolution/frame rate) to use
device_num = 0   # first camera found
num_buffers = 32

cam = cam_iface.Camera(device_num, num_buffers, mode_num)
cam.start_camera()
frame = np.asarray(cam.grab_next_frame_blocking())
print 'grabbed frame with shape %s' % (frame.shape,)

Related

Is there a way to use text-to-speech directly in BB10

I want to generate speech from text in my BB10 app to give audio feedback to the user.
(The screen reader from the accessibility features is not sufficient.)
Has anybody already implemented text-to-speech successfully?
There are countless open source projects that do this on PC platforms. You may have your best luck in fitting them to your needs. – Josh C
Any library you would recommend? It should have C or C++ interface and must work offline (no server based solution) and it should not occupy too much memory. – thowa
I had to check to make sure it was written in C++, which it is. It is called eSpeak. I heard about it nearly 7 years ago when I was looking for a speech synthesizer powerful and robust enough to sound like a human. I believe it was eSpeak, and back then it was a complicated task to get it to produce realistic-sounding speech.
http://sourceforge.net/projects/espeak/files/
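To give an idea of the C interface, here is a minimal sketch; the exact header path and playback options may differ depending on how eSpeak is built for your platform:

#include <string.h>
#include <espeak/speak_lib.h>   /* header location may differ on your system */

int main(void)
{
    const char *text = "Hello from eSpeak";

    /* Initialize for direct audio playback; returns the sample rate,
       or a negative value on failure. */
    if (espeak_Initialize(AUDIO_OUTPUT_PLAYBACK, 0, NULL, 0) < 0)
        return 1;

    /* Speak the whole string; espeakCHARS_AUTO lets eSpeak detect the encoding. */
    espeak_Synth(text, strlen(text) + 1, 0, POS_CHARACTER, 0,
                 espeakCHARS_AUTO, NULL, NULL);

    espeak_Synchronize();   /* block until playback has finished */
    espeak_Terminate();
    return 0;
}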
The one below looks promising as well; however, it is written in Java.
http://mary.dfki.de/Download/openmary-open-source-emotional-text-to-speech-synthesis-system-released
Found here https://github.com/marytts/marytts

Capture mac screen

What is the best way to record the Mac screen with Cocoa? I know there are many examples in the Apple developer reference library. SonOfGrab explains how to capture the screen with Quartz, but also notes that it isn't fast enough to grab many frames per second. OpenGLScreenSnapshot gets the same results but isn't fast either. OpenGLScreenCapture seems to be the best way to do it, but Xcode gives me many errors because it's written for 10.4 and requires old QuickTime calls; I think those were moved into QTKit, but I can't find a way to convert the project. Could anyone point me to a site where someone has converted the project, or tell me if there are other ways to do it? Thanks in advance.
OpenGL would be the way to go. You should still be able to use the OpenGLScreenCapture sample if your architecture is set to 32-bit. (QuickTime is not available in 64-bit.)
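A very rough sketch of the front-buffer read those OpenGL samples are built around (32-bit only, CGLSetFullScreen is deprecated, and on 10.6 and later a single CGDisplayCreateImage() call is a simpler alternative):

#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>
#include <ApplicationServices/ApplicationServices.h>
#include <stdlib.h>

// Grab the main display's current contents via a full-screen CGL context.
// Caller frees the returned buffer; rows come back bottom-up in BGRA order.
static void *CopyMainDisplayPixels(size_t *outWidth, size_t *outHeight)
{
    CGDirectDisplayID display = CGMainDisplayID();
    size_t width  = CGDisplayPixelsWide(display);
    size_t height = CGDisplayPixelsHigh(display);

    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAFullScreen,
        kCGLPFADisplayMask,
        (CGLPixelFormatAttribute)CGDisplayIDToOpenGLDisplayMask(display),
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pixelFormat = NULL;
    GLint numFormats = 0;
    CGLChoosePixelFormat(attribs, &pixelFormat, &numFormats);

    CGLContextObj context = NULL;
    CGLCreateContext(pixelFormat, NULL, &context);
    CGLDestroyPixelFormat(pixelFormat);
    CGLSetCurrentContext(context);
    CGLSetFullScreen(context);              // attach the context to the whole display

    void *pixels = malloc(width * height * 4);
    glReadBuffer(GL_FRONT);                 // read what is currently on screen
    glReadPixels(0, 0, (GLsizei)width, (GLsizei)height,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);

    CGLSetCurrentContext(NULL);
    CGLClearDrawable(context);
    CGLDestroyContext(context);

    *outWidth = width;
    *outHeight = height;
    return pixels;
}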
-Ken

How to get and parse JSON using Objective-C?

Is it possible to get and parse JSON using Objective-C, then manipulate it within the Cocoa framework for the iPhone/iPad? I'm specifically looking to do this for a couple of public APIs out there.
See here: how to do json parsing in iphone
Basically, you should look into the TouchJSON library (with CJSONDeserializer and CJSONSerializer).
I used json-framework on some previous projects, and it worked really well.
EDIT: I read your post a bit too fast. I've used it in a Mac app before but not targeting the iPhone/iPad. I think it should work, but I have no experience with that. Maybe someone else can confirm?
It's not only possible, it's dirt simple if you use one of the many existing open source projects dedicated to this task. I recommend trying yajl-objc, which offers a streaming parser, but json-framework is a good one too. They're very similar.
I'd stay away from TouchJSON, since it gave me trouble a while back with special characters (line breaks) in strings.
However, I'll join the choir recommending json-framework. Since I switched to that from TouchJSON everything's been running smoothly.
Regarding how to integrate the API in your project, they're equally simple to include and use.
As a side note, I'm just now testing out JSONKit, since it's supposed to be much faster than both TouchJSON and json-framework. However, I can't vouch for its stability yet. The reviews of it are good, though.
If you're developing an application that is only iOS 5.0 or later, you can use NSJSONSerialization.
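A minimal sketch of that route (the URL is just a placeholder; a real app would fetch the data asynchronously):

// Fetch and parse JSON with the built-in parser (iOS 5+ / OS X 10.7+).
NSURL *url = [NSURL URLWithString:@"https://api.example.com/data.json"]; // placeholder URL
NSData *data = [NSData dataWithContentsOfURL:url];

NSError *error = nil;
id json = [NSJSONSerialization JSONObjectWithData:data
                                          options:0
                                            error:&error];
if (!json) {
    NSLog(@"JSON parse failed: %@", error);
} else if ([json isKindOfClass:[NSDictionary class]]) {
    NSLog(@"top-level keys: %@", [(NSDictionary *)json allKeys]);
}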

sample mac Firefox Plugins?

I'm trying to re-write an old image-viewing plugin for the Mac. The old version uses QuickDraw (I said it was old) and resources (really, really old), and so it doesn't work in Firefox 3.6 (which is why I'm re-writing it).
I know some Objective-C, and so I figure I'm going to re-write this in that, using new-fangled Mac routines and nibs, etc. However, I don't know how to start. I've got the BasicPlugin example that comes with the Mozilla source, so I know how to create a plugin with entry points, etc. However, I don't know how to create the nib, or how to interface Obj-C with the entry points, etc.
Does anyone know of a more advanced sample for mac than BasicPlugin.bundle? (Preferably simple enough that I can just look at it and understand it...)
Thanks.
Sadly, I don't really know of any good "intermediate" example. However, integrating Obj-C isn't that difficult, so here's a short overview of what needs to be done.
You can use Obj-C and C/C++ sources in the same project; it's just advisable to keep them separated to some extent. This can, for example, be done by keeping the source files with the entry points and other NPAPI-interfacing code as plain C or C++ files and forwarding calls into the plugin from there.
Opaque pointers help to keep a clean separation; see e.g. here.
The main changes to your plugin involve switching to different drawing and event models. These have to be negotiated in NPP_New(); here is an example for the drawing model. When using Cocoa, and to support 64-bit environments, you need to use the Cocoa event model.
To draw UI elements you should be able to create an NSGraphicsContext from the CGContextRef and then draw an NSView into that context. See also the details provided in this post and its follow-ups.
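As a rough illustration of that negotiation step (the NegotiateModels helper name is mine; the constants and the browser function table come from the NPAPI headers, as in the BasicPlugin sample):

#include "npapi.h"
#include "npfunctions.h"

// Called from NPP_New: ask the hosting browser for the Core Graphics drawing
// model and the Cocoa event model, failing if either is unsupported.
static NPError NegotiateModels(NPNetscapeFuncs *browser, NPP instance)
{
    NPBool supportsCoreGraphics = FALSE;
    browser->getvalue(instance, NPNVsupportsCoreGraphicsBool, &supportsCoreGraphics);
    if (!supportsCoreGraphics)
        return NPERR_INCOMPATIBLE_VERSION_ERROR;
    browser->setvalue(instance, NPPVpluginDrawingModel,
                      (void *)NPDrawingModelCoreGraphics);

    NPBool supportsCocoaEvents = FALSE;
    browser->getvalue(instance, NPNVsupportsCocoaBool, &supportsCocoaEvents);
    if (!supportsCocoaEvents)
        return NPERR_INCOMPATIBLE_VERSION_ERROR;
    browser->setvalue(instance, NPPVpluginEventModel,
                      (void *)NPEventModelCocoa);

    return NPERR_NO_ERROR;
}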

Google Wave extension for Programmers and their Code

Sorry if this is well known but Googling for my answer only came up with links about making Google Wave gadgets.
My question is: are there any Google Wave gadgets that allow for better collaborative code editing? I mean, I can set the font to fixed width, etc., but are there any gadgets designed for it?
Responses shouldn't include anything about Git or SVN. I use those when I want to use those. This is about Google Wave!
Here is a huge list of robots available for Wave: http://www.chaaps.com/huge-list-of-125-google-wave-robots-add-bots-and-enjoy-wave.html
Maybe there is one in there?
I don't know how well it works, but I found an extension called CodeBot.
http://aaron.oirt.rutgers.edu/myapp/root/gadget/codeGadget
-- it's a work in progress. Let me know if you want the source code.
I will package it and release it, or something like it, for the next WHIFF release.