I was trying to implement a long-exposure timer for the Sony Alpha 7 using the Remote Camera API. Is it possible to set the shutter speed to BULB using the API? And is it possible to control the exposure time from within an app while the shutter is set to BULB?
I hope Sony will implement these features; otherwise the API is of little use for many scenarios.
Here is an interesting article about exactly this demand: http://blog.programmableweb.com/2013/09/10/new-sony-camera-remote-api-leaves-developers-wanting-more/
I would be very happy to know if the features will be implemented in the future.
Knowing this, we could already start developing apps; otherwise it makes no sense...
Sorry, at this time those features are not available in the API. We will let you know if they are implemented in the future.
I just bought a Sony A7 and am blown away by the incredible pictures it takes, but now I would like to interact with and automate this camera using the Sony Remote Camera API. I consider myself a maker and would like to do some fun stuff: add a laser trigger with an Arduino, do some computer-controlled light painting, and shoot some long-term (on the order of weeks) time-lapse photography. One reason I purchased this Sony camera over other models from famous brands such as Canon, Nikon, or Samsung is the ingenious Sony Remote Camera API. However, after reading through the API reference, it seems that many of the features cannot be accessed. Is this true? Does anyone know a workaround?
Specifically, I am interested in changing many of the manual settings that you can change through the menu system on the camera, such as ISO, shutter speed, and aperture. I am also interested in taking HDR images in a time-lapse manner, and it would be nice to change this setting through the API as well. If anyone knows: why wasn't the API opened up to the whole menu system in the first place?
Finally, if any employee of Sony is reading this I would like to make this plea: PLEASE PLEASE PLEASE keep supporting the Remote Camera API and improve upon an already amazing idea! I think the more control you offer to makers and developers, the more popular your cameras will become. You could create a cult following if you manage to capture the imagination of makers across the world and get just one cool project to go viral on the internet. Using HTTP and POST commands is super awesome, because it is OS-agnostic and makes communication a breeze. Did I mention that it's awesome?! Sony's cameras will integrate nicely into the internet of things.
I think the Remote Camera API strategy is better than the strategies of Sony's competitors. Nikon and Canon have nothing comparable. The closest thing is Samsung gluing Android onto the Galaxy NX, but that is a completely unnecessary cost, since most people already own a smartphone; all that needs to exist is a link that lets the camera talk to the phone, like the Sony API. Sony gets it. Please don't abandon this direction or the Remote Camera API, because I love where it is heading.
Thanks!
New API features for the Lens Style Cameras DSC-QX100 and DSC-QX10 will be expanded during the spring of 2014. The shutter speed functionality, white balance, ISO settings and more will be included! Check out the official announcement here: https://developer.sony.com/2014/02/24/new-cameras-now-support-camera-remote-api-beta-new-api-features-coming-this-spring-to-selected-cameras/
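In the meantime, for anyone itching to experiment: the call pattern is just JSON-RPC over HTTP POST. Below is a minimal sketch in C with libcurl. The access-point IP is a typical default rather than a guarantee (the authoritative endpoint URL comes from the device description XML discovered via SSDP), and you should check getAvailableApiList on your firmware before relying on any method, including the upcoming shutter-speed one.

```c
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    /* Typical access-point address for Sony cameras; the authoritative
       endpoint URL comes from the device description XML found via SSDP. */
    const char *url = "http://192.168.122.1:8080/sony/camera";

    /* JSON-RPC request body; actTakePicture is part of the current beta.
       Confirm availability with getAvailableApiList first. */
    const char *body =
        "{\"method\":\"actTakePicture\",\"params\":[],"
        "\"id\":1,\"version\":\"1.0\"}";

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (curl != NULL) {
        struct curl_slist *hdrs =
            curl_slist_append(NULL, "Content-Type: application/json");
        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body);
        /* The JSON response is written to stdout by default. */
        CURLcode res = curl_easy_perform(curl);
        if (res != CURLE_OK)
            fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));
        curl_slist_free_all(hdrs);
        curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
}
```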
Thanks a lot for your valuable feedback. It's great to hear that the APIs are being used, and we are looking forward to seeing nice implementations!
Peter
I want to read how much 3G data every app uses. Is this possible in iOS 5.x? And in iOS 4.x? My goal is, for example:
Maps consumed 3 MB from your data plan
Mail consumed 420 kB from your data plan
etc, etc. Is this possible?
EDIT:
I just found an app doing that: Data Man Pro
EDIT 2:
I'm starting a bounty. Extra points go to the answer that makes this clear. I know it is possible (see the screenshot from Data Man Pro), and I'm sure the solution is limited. But what is the solution, and how do you implement it?
These are just hints, not a solution. I have thought about this many times but never really started implementing the whole thing.
First of all, you can calculate the transferred bytes by querying the network interfaces; take a look at this SO answer for code and a nice explanation of network interfaces on iOS (there is also a rough sketch after this list);
use sysctl or similar system functions to detect which apps are currently running (by "running" I mean the process state is set to RUNNING, as the ps or top commands show on OS X; I have never tried this and just suppose it to be possible on iOS, hoping there are no problems for an app running as an unprivileged user), so you can deduce which apps are active and save the traffic stats for them. Obviously, given that applications can run in the background, it is hard to determine exactly which app is transferring data;
it could also be possible to retrieve information about network activity per process/app, as nettop does on OS X Lion; unfortunately nettop uses the private NetworkStatistics.framework, so you can't dig anything out of its implementation;
take time into account: timestamp your samples so you can correlate traffic deltas with the set of apps that were running.
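To make hints 1 and 2 concrete, here is a rough sketch in plain C (callable from an iOS app, since Objective-C is a superset of C). Note the assumptions: the pdp_ip* naming of cellular interfaces is a convention rather than a documented API, the per-interface counters reset on reboot (so you must poll and diff), and the KERN_PROC_ALL query worked from sandboxed apps in that era but has been restricted in later iOS releases.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <sys/sysctl.h>
#include <ifaddrs.h>
#include <net/if.h>
#include <net/if_var.h>   /* struct if_data */

/* Hint 1: sum bytes in/out over the cellular (pdp_ip*) interfaces. */
static int cellular_bytes(uint64_t *in_bytes, uint64_t *out_bytes)
{
    struct ifaddrs *addrs, *ifa;
    if (getifaddrs(&addrs) != 0)
        return -1;
    *in_bytes = *out_bytes = 0;
    for (ifa = addrs; ifa != NULL; ifa = ifa->ifa_next) {
        /* Link-level entries carry the traffic counters in ifa_data. */
        if (ifa->ifa_addr == NULL || ifa->ifa_addr->sa_family != AF_LINK)
            continue;
        if (strncmp(ifa->ifa_name, "pdp_ip", 6) != 0)
            continue;
        const struct if_data *st = (const struct if_data *)ifa->ifa_data;
        if (st != NULL) {
            *in_bytes  += st->ifi_ibytes;
            *out_bytes += st->ifi_obytes;
        }
    }
    freeifaddrs(addrs);
    return 0;
}

/* Hint 2: list running processes via sysctl, ps/top style. */
static int list_processes(void)
{
    int mib[4] = { CTL_KERN, KERN_PROC, KERN_PROC_ALL, 0 };
    size_t size = 0;
    if (sysctl(mib, 4, NULL, &size, NULL, 0) != 0)    /* ask for buffer size */
        return -1;
    struct kinfo_proc *procs = malloc(size);
    if (procs == NULL)
        return -1;
    if (sysctl(mib, 4, procs, &size, NULL, 0) != 0) { /* fetch process table */
        free(procs);
        return -1;
    }
    size_t count = size / sizeof(struct kinfo_proc);
    for (size_t i = 0; i < count; i++)
        printf("%d\t%s\n", procs[i].kp_proc.p_pid, procs[i].kp_proc.p_comm);
    free(procs);
    return 0;
}

int main(void)
{
    uint64_t in_bytes, out_bytes;
    if (cellular_bytes(&in_bytes, &out_bytes) == 0)
        printf("cellular: %llu bytes in, %llu bytes out\n",
               (unsigned long long)in_bytes, (unsigned long long)out_bytes);
    return list_processes();
}
```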
My 2 cents
No. All applications in iOS are sandboxed, meaning you cannot access anything outside of the application, so I do not believe this is possible. Nor do I believe data traffic is recorded at this level on the device; otherwise Apple would have exposed it in either the network page or the usage page in Settings.app.
Besides that, not everybody has a "data plan". In Sweden, for example, it's common for data traffic to be free of charge, with no limit on either volume or speed.
I am currently working on an application where I need to find out the user's heart rate. I have found plenty of applications that do this, but I have not been able to find a single private or public API that supports it.
Is there any framework available that could help? I was also wondering whether the UIAccelerometer class could be useful here, and what level of accuracy it can achieve.
How could this feature be implemented: by putting a finger on the iPhone camera, by holding the microphone to the jaw or wrist, or in some other way?
Is there any way to detect changes in blood circulation and derive the heart rate from them, or from UIAccelerometer? Any API or sample code? Thank you.
There is no API for detecting heart rate; these apps do it in a variety of ways.
Some use the accelerometer to measure how the device shakes with each pulse. Others use the camera, with the flash on, and detect blood moving through the finger by measuring the changes in light level.
Various DSP techniques can be used to discern very low-level periodic signals from a long enough set of samples taken at an appropriate sample rate (accelerometer readings or reflected light color).
Some of the advanced math functions in the Accelerate framework API can be used as building blocks for these various DSP techniques. An explanation would require several chapters of a Digital Signal Processing textbook, so that might be a good place to start.
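To make the idea concrete, here is a toy sketch of one such technique: estimating beats per minute by autocorrelating a brightness signal. It assumes you have already reduced each camera frame to a single mean-brightness sample at a fixed frame rate; a real implementation would detrend and band-pass filter the signal first, and the inner products below are exactly the kind of work you could hand off to Accelerate's vDSP routines.

```c
#include <math.h>

/* Toy pulse estimator: given per-frame mean brightness x[0..n-1] sampled
   at fps frames/second, find the lag with the strongest autocorrelation
   inside a plausible heart-rate band (40-200 bpm) and convert it to bpm.
   Returns a negative value if there are not enough samples. */
double estimate_bpm(const double *x, int n, double fps)
{
    int min_lag = (int)(fps * 60.0 / 200.0);  /* 200 bpm upper bound */
    int max_lag = (int)(fps * 60.0 / 40.0);   /*  40 bpm lower bound */
    if (min_lag < 1)
        min_lag = 1;
    if (max_lag >= n)
        return -1.0;

    /* Remove the mean so steady illumination doesn't dominate. */
    double mean = 0.0;
    for (int i = 0; i < n; i++)
        mean += x[i];
    mean /= n;

    /* Pick the lag whose autocorrelation is largest. */
    int best_lag = min_lag;
    double best = -INFINITY;
    for (int lag = min_lag; lag <= max_lag; lag++) {
        double acc = 0.0;
        for (int i = 0; i + lag < n; i++)
            acc += (x[i] - mean) * (x[i + lag] - mean);
        if (acc > best) {
            best = acc;
            best_lag = lag;
        }
    }
    return 60.0 * fps / best_lag;
}
```

For example, 20 seconds of video at 30 fps gives n = 600 samples, which comfortably covers the 9-45 frame lag range that the 40-200 bpm band implies.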
I'm working on a pretty complicated app right now, but I just got a really good niche-market idea for an AR game for iPhone. I would love to do some preliminary research on whether it is worth the effort. I have only a few days (about 4) in which to code this. Is this a realistic timeline for what I'm trying to accomplish?
While I'm pretty familiar with the CMDeviceMotion, and can get location updates from GPS, there are 4 features that I think may take a colossal amount of work:
1) Working with the camera in real time to draw augmented-reality controls. Are there any good tutorials on overlaying a view on top of a live camera feed?
2) Making the app work when GPS reception is spotty. Some apps seem to keep updating the location from the accelerometer/gyroscope, starting from the last known fix. Where would I start on this front?
3) The networking component. I'm very new to multiplayer games. I have a website that can run PHP. Should I abandon my networking idea until I get a dedicated server? Or is there some way I can run this peer-to-peer over 3G without a base station?
4) Google Maps integration for fast updates. Does this take a lot of effort?
I'm sorry if any of these questions are too broad or vague. I'm very excited about this idea, but would like to know what I'm dealing with before spending time on the app only to realize I'm facing a monumental task!
I think you are dealing with a monumental task (especially the multiplayer part, where you'll encounter issues like lag/timing).
For the augmented reality part of your project, you can take a look at the mixare augmented reality engine. It's free and open-source software, and the code is available on GitHub: https://github.com/mixare/
Be aware that if you base your code upon mixare, you'll have to release your app under the same GPLv3 license as mixare.
Good luck with your project!
HTH,
Daniele
In a nutshell, Fast Dormancy allows the RRC state machine to go to IDLE (CELL_PCH) from CELL_DCH without waiting for the inactivity timer to expire. Is there any OS (Android, Windows Phone, iOS, etc.) that exposes APIs through which we can invoke fast dormancy on 3G devices? Any pointers appreciated.
EDIT: Does any OS expose APIs to switch off the 3G radio or to switch radio states (DCH, FACH, IDLE, etc.)?
I'm not sure I understood your question correctly (I'm not familiar with the actual 3G technology), but the BlackBerry API (since 4.2.1) does have the following method:
Requests that the radios belonging to the provided Wireless Access Families be powered off.
http://www.blackberry.com/developers/docs/6.0.0api/net/rim/device/api/system/Radio.html#deactivateWAFs(int)
Constants used with the above:
http://www.blackberry.com/developers/docs/6.0.0api/net/rim/device/api/system/RadioInfo.html#WAF_3GPP
Not sure if this is what you actually meant.
It seems that BlackBerry has also exposed fast dormancy since API 4.0.0:
http://www.blackberry.com/developers/docs/5.0.0api/net/rim/device/api/io/IOProperties.html#CDMA_SET_FAST_DORMANCY_FLAG
and
http://www.blackberry.com/developers/docs/4.0.2api/net/rim/device/api/io/IOProperties.html
The oFono stack used by MeeGo seems to have Fast Dormancy settings (and radio toggling) in its radio settings API, but I can't really tell at what level those would be available to users. The API doc is in their git repo:
http://meego.gitorious.org/meego-cellular/ofono/blobs/5639c653979e324e0b3a195ec3fab07fc2bd3a05/doc/radio-settings-api.txt
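For what it's worth, toggling that setting from code would be a plain D-Bus property call. Here is a rough sketch using GLib's GDBus; the FastDormancy property name is the one from the linked radio-settings-api.txt, while the modem object path (and whether a given modem actually permits writing the property) is an assumption:

```c
#include <gio/gio.h>

/* Sketch: enable oFono's FastDormancy property over the system bus.
   Build: gcc fd.c $(pkg-config --cflags --libs gio-2.0)
   The modem object path below varies per device; list modems with
   org.ofono.Manager.GetModems to find the real one. */
int main(void)
{
    GError *err = NULL;
    GDBusConnection *bus = g_bus_get_sync(G_BUS_TYPE_SYSTEM, NULL, &err);
    if (bus == NULL) {
        g_printerr("bus: %s\n", err->message);
        return 1;
    }

    GVariant *reply = g_dbus_connection_call_sync(
        bus,
        "org.ofono",
        "/ril_0",                     /* assumed modem path */
        "org.ofono.RadioSettings",
        "SetProperty",
        g_variant_new("(sv)", "FastDormancy", g_variant_new_boolean(TRUE)),
        NULL, G_DBUS_CALL_FLAGS_NONE, -1, NULL, &err);
    if (reply == NULL) {
        g_printerr("SetProperty: %s\n", err->message);
        g_object_unref(bus);
        return 1;
    }

    g_variant_unref(reply);
    g_object_unref(bus);
    return 0;
}
```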
I've read that NCFD (network-controlled fast dormancy) has been blamed for spotty 3G performance on iOS devices in some cases, so I'm not sure that programmatically playing with it at the application level is such a good idea, especially since you'd be making assumptions about the entire platform's network stack requirements.