I have developed many apps on Android, and in many of them I have used the Broadcast Receiver concept, for example to know when the battery is low.
I would like to use the same concept when developing for Windows Phone, but I could not find a Broadcast Receiver there.
I want my application to know whenever the user takes a new photo with the camera, and to keep track of it.
What do you suggest?
There’s no broadcast receiver concept in Windows Phone.
IMO, this is by design.
I know two good reasons why I wouldn't want that on my phone: privacy (almost no one reads the permissions list while installing apps) and battery life.
You can read those photos while your app is running (caching the file list and modified times for faster updates).
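That polling approach boils down to a snapshot-and-diff loop. Here's a minimal, platform-neutral sketch of the caching logic in Python (the real Windows Phone media-library calls would differ; the folder path and polling interval are illustrative assumptions):

```python
import os
import time

def snapshot(folder):
    """Map each photo file to its last-modified time."""
    return {
        name: os.path.getmtime(os.path.join(folder, name))
        for name in os.listdir(folder)
    }

def watch_photos(folder, interval=5.0):
    """Poll the folder and report files that are new or changed."""
    cache = snapshot(folder)
    while True:
        time.sleep(interval)
        current = snapshot(folder)
        for name, mtime in current.items():
            if name not in cache:
                print("new photo:", name)
            elif mtime != cache[name]:
                print("modified photo:", name)
        cache = current

# watch_photos("/path/to/camera-roll")  # hypothetical path
```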
And/or, you might want to create a lens extension, here’s a link for WP8.
I am looking for a wireless communication technology for exchanging data between devices via sound at ultrasonic frequencies. It is possible to communicate between two mobile devices this way. I want to communicate between a mobile device and an embedded device. Is that possible? Does any device work with such a protocol?
Of course it is possible. Back in the 1970s my TV remote control used ultrasound to change the channel and turn the TV off. The control was somewhat rudimentary; IIRC a short press changed the channel up and a long press turned the TV off. It worked quite reliably for these functions.
Providing more functionality would require a more complicated modulation scheme which, as has been said in another answer, would be prone to interference from other sound sources. This probably explains why infra-red communication signals are used in more modern remote control systems.
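To give a feel for what such a modulation scheme involves, here is a minimal sketch in Python that encodes bits as two-tone ultrasonic FSK and writes them to a WAV file. The carrier frequencies, bit rate, and sample rate are illustrative assumptions, not values from any real protocol:

```python
import math
import struct
import wave

SAMPLE_RATE = 96000            # high enough to represent ~20 kHz carriers
BIT_DURATION = 0.01            # 10 ms per bit => 100 bit/s (assumed)
FREQ_0, FREQ_1 = 19000, 20000  # ultrasonic carrier pair (assumed)

def fsk_samples(bits):
    """Generate 16-bit PCM samples encoding bits as two-tone FSK."""
    samples = []
    n = int(SAMPLE_RATE * BIT_DURATION)
    for bit in bits:
        freq = FREQ_1 if bit else FREQ_0
        for i in range(n):
            v = math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            samples.append(int(v * 32767 * 0.5))  # half amplitude
    return samples

def write_wav(path, samples):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

write_wav("fsk.wav", fsk_samples([1, 0, 1, 1, 0, 0, 1, 0]))
```

Decoding on the receiver side (filtering, symbol timing, error handling) is where the interference problems mentioned above come in.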
It is possible - why shouldn't it be? Smartphones are just embedded computers too. I imagine getting CE/FCC/etc. certifications with such an embedded device will not be easy. And production testing ...
But is it feasible? Probably not. Power consumption is a lot higher than with any RF link, it's more susceptible to noise (quite literally), and the required components (microphone + speaker) are bigger than RF components (antenna).
And then there's a whole bunch of other things you need to keep in mind when working with ultrasound, starting with the plastic design of the embedded device. But also things like the effect of ultrasound on people and their pets etc.
I know this is not directly programming-related, but is there a way to purposely limit the signal strength on a test device to determine how your app performs under weak-signal conditions?
I have an app that streams video and audio to a server, and I need to test how it performs in low-signal areas. Any suggestions?
One realistic way to do it is to put the device in a weak Faraday cage. You can make one, or buy a bag or other pre-manufactured cage that protects against radio transmissions. As long as it's not too strong, it should weaken but not completely block the signal.
You can use software like Network Link Conditioner on OS X or NetLimiter on Windows. They have options for bandwidth limiting and even packet loss, with presets for typical situations plus the ability to create your own. You can simply create a Wi-Fi network on your machine and connect to it from the device you want to test.
Please note that iOS has Network Link Conditioner built in (you can find it under the Developer menu in Settings), while Android may have something similar on a rooted device (I've never tried it, though).
If you run your app in a simulator, many have options for emulating poor signal conditions.
There is at least one open source project whose aim is to simulate different network conditions for exactly the type of testing you are describing:
https://github.com/facebook/augmented-traffic-control
This can also work on a cellular network, but that would most likely require your own base stations. That is possible via other open source projects (e.g. http://openbsc.osmocom.org/trac/), but is likely not necessary, as you can probably simulate the same effect with the Wi-Fi test setup.
I want to embed a video stream into my web page, which is part of our own cloud based software. The video should be low-latency (like video conferencing), and it would be preferable, but not required, for it to include audio. I am comfortable serving streaming binary data from the server-side, and embedding it into the page using HTML5 video.
What I am not comfortable with is capturing the video data in the first place. The client does not already have a solution in place, and is looking to us for assistance. The video would be routed through our server equipment, rather than an embedded piece that connects directly to the video source.
It is a known quantity for us to use a USB or built-in camera on the computer. What I would like more information about is stand-alone cameras.
Some models of cameras have their own API documentation (example). From what I am reading, a manufacturer typically has their own API, repeated across many or all of their models, and each manufacturer's API differs. However, I have only done surface reading and hope to learn from someone who has already researched this, or perhaps has first-hand experience.
Do stand-alone cameras generally include an API? (Wouldn't this be a common requirement, so that security software can use multiple lines of cameras?) If not an API, how is the data retrieved from the on-board web server? Is it usually Flash-based? Perhaps there is a reusable video stream I could capture there? Or does the stream format vary widely?
What would I run into when trying to get the server-side to capture that data?
How does latency on a stand-alone device compare with a USB camera solution?
Do you have tips on picking out a stand-alone camera that would be a good fit for streaming through a server?
I am experienced at using JavaScript (both HTML5 and Node.JS), Perl and Java.
Each camera manufacturer has their own take on this in terms of access points; generally you should be able to ask for a snapshot or an MJPEG stream, but it can vary. Take a look at this entry on CodeProject; it tackles two common methodologies. Here's another one targeted at Foscam specifically.
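As an illustration of the MJPEG approach: many IP cameras serve the stream as HTTP multipart/x-mixed-replace, where each part is a JPEG frame. Below is a minimal sketch in Python using the requests library; the camera URL is hypothetical, and real cameras differ in paths, authentication, and boundary format, so the code scans for the JPEG start/end markers instead of parsing the multipart boundary:

```python
import requests  # third-party: pip install requests

def mjpeg_frames(url):
    """Yield complete JPEG frames from an MJPEG-over-HTTP stream.

    Scans for the JPEG start (FFD8) and end (FFD9) markers rather
    than parsing the multipart boundary, which varies by camera.
    """
    buf = b""
    with requests.get(url, stream=True, timeout=10) as resp:
        resp.raise_for_status()
        for chunk in resp.iter_content(chunk_size=4096):
            buf += chunk
            start = buf.find(b"\xff\xd8")
            end = buf.find(b"\xff\xd9", start + 2)
            if start != -1 and end != -1:
                yield buf[start:end + 2]
                buf = buf[end + 2:]

# Hypothetical URL; consult your camera's API docs for the real path.
# for frame in mjpeg_frames("http://192.168.1.10/videostream.cgi"):
#     print("got frame of", len(frame), "bytes")
```

Once you can pull frames like this server-side, re-serving them to the browser is the part you are already comfortable with.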
Get a good NAS; I suggest Synology. Check out their long list of supported IP web cams. You can connect them with a hub or a router or whatever you wish. It's not a "computer" as in a "tower", but it does many computer jobs, and it can stay on while your computer is off or away, doing things like video feeds, torrents, backups, etc.
I'm not an expert on all the features, so I don't know how to get it to broadcast without recording, but even if it does record, at least it's separate. Synology is a popular brand and there are a lot of authorized and unauthorized plugins for it. Check them out and see if one suits you.
I want to read how much 3G data each app uses. Is this possible in iOS 5.x? And in iOS 4.x? My goal is, for example:
Maps consumed 3 MB from your data plan
Mail consumed 420 kB from your data plan
etc., etc. Is this possible?
EDIT:
I just found an app doing that: Data Man Pro
EDIT 2:
I'm starting a bounty. Extra points go to the answer that makes this clear. I know it is possible (see the screenshot from Data Man Pro) and I'm sure the solution is limited. But what is the solution, and how does one implement it?
These are just hints, not a solution. I have thought about this many times, but never really started implementing the whole thing:
First of all, you can calculate transferred bytes by querying the network interfaces; take a look at this SO answer for code and a nice explanation of network interfaces on iOS (see the sketch after this list for the general idea);
Use sysctl or similar system functions to detect which apps are currently running (and by running I mean the process state is set to RUNNING, as the ps or top commands report on OS X; I have never tried this, I just suppose it to be possible on iOS, hoping there are no problems with the app running as an unprivileged user), so you can deduce which apps are running and save the traffic stats for those apps. Obviously, given the possibility of applications running in the background, it is hard to determine which app is transferring data.
It may also be possible to retrieve information about network activity per process/app, as nettop does on OS X Lion; unfortunately, nettop uses the private NetworkStatistics.framework, so you can't dig anything out of its implementation;
Take time into account;
My 2 cents
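To make the first hint concrete, here is a sketch of the counter-diffing idea in Python using the psutil library (which runs on desktop OSes; on iOS you would read the same kind of per-interface counters via getifaddrs, as described in the linked SO answer). The sampling interval is an illustrative choice:

```python
import time
import psutil  # third-party: pip install psutil

def sample():
    """Per-interface cumulative byte counters since boot."""
    return {
        nic: (io.bytes_sent, io.bytes_recv)
        for nic, io in psutil.net_io_counters(pernic=True).items()
    }

def report_traffic(interval=5.0):
    """Print bytes transferred per interface over each interval."""
    before = sample()
    while True:
        time.sleep(interval)
        after = sample()
        for nic, (sent, recv) in after.items():
            old_sent, old_recv = before.get(nic, (sent, recv))
            print(f"{nic}: sent {sent - old_sent} B, recv {recv - old_recv} B")
        before = after

# report_traffic()  # on iOS, the cellular interface is typically pdp_ip0
```

Note this gives per-interface totals only; attributing those bytes to individual apps is the hard part the hints above describe.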
No. All applications in iOS are sandboxed, meaning you cannot access anything outside of your own application, so I do not believe this is possible. Nor do I believe data traffic is recorded at this level on the device; otherwise Apple would have exposed it on either the network page or the usage page in Settings.app.
Besides that, not everybody has a "data plan". E.g. in Sweden it is common for data traffic to be free of charge, without limit on either size or speed.
I'm looking into writing an app that runs as a background process and detects when another app (say, Safari) is playing audio. I can use NSWorkspace to get the process IDs of the currently running applications, but I'm at a loss when it comes to detecting what those processes are doing. I assume that there is a way to listen in on a process and detect what public messages the objects are sending. I apologize for my ignorance on the subject.
Has anyone attempted anything like this or are aware of any resources that can help?
I don't think that your "answer" is an answer at all...
and there IS an answer (which is not "42")
Your best bet for doing this would be to write a pass-through audio output device, much like Soundflower, actually. Your audio output device would then load the actual (physical) audio output device and pass the audio data along to it directly (after first having a look at the audio stream, of course!). Then you only need to convince your users to configure your audio device as the default output device, so that the majority of applications which play sound will use it automatically. And voila...
Your audio processing function will probably just do a quick RMS on the buffer before passing it along to the actual output device. When the audio power crosses a certain threshold (probably something like -54 dB with Apple audio hardware), you know that some app is making sound.
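For reference, the RMS check described above amounts to only a few lines. Here's a minimal sketch in Python (the real implementation would live in the device's render callback in C; the -54 dB figure is the threshold suggested above):

```python
import math

SILENCE_THRESHOLD_DB = -54.0  # threshold suggested in the answer above

def is_audible(samples):
    """Return True if a buffer of float samples (-1.0..1.0) crosses the threshold.

    Computes RMS power and converts it to dBFS.
    """
    if not samples:
        return False
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0.0:
        return False
    db = 20.0 * math.log10(rms)
    return db > SILENCE_THRESHOLD_DB

# Example: a near-silent buffer vs. a half-scale square-ish buffer
print(is_audible([0.0001] * 512))     # False: about -80 dBFS
print(is_audible([0.5, -0.5] * 256))  # True: about -6 dBFS
```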
SoundFlower is an open-source project that allows Mac OS X applications to pass audio to each other. It almost certainly does something similar to what you describe.
I've been informed in another thread that while this is possible, it is an extremely advanced technique and not recommended. It would involve using Application Enhancer (APE) and is considered not a 'nice' thing to do. Looks like that app idea is destined for the big recycling bin in the sky :)