I have looked all through the Nest developers site, and I can't find anything related to the camera. I can find plenty of great information on the thermostat and the smoke / CO alarm, but nothing for the camera.
In particular, I'm looking for how to get the video URL, how to get/set video resolution and frame rate, and how to get/set the zoom level (if applicable). Thanks.
As a short update to this question: I believe Nest made an API for the camera available today: https://developer.nest.com/documentation/api-reference/overview#cameras
There are no Nest Cam APIs at the moment. However, we're hearing feedback asking for them. I'd love to hear what you would use them for.
I believe that the Dropcam (Nest camera) API is still in Beta. You can sign up for the beta here.
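Once you have access, a minimal sketch of reading camera metadata over the REST API linked in the update above might look like the following. The base URL and field names here are taken from that reference page; everything else (the token, the error handling) is placeholder scaffolding, not a confirmed implementation.

```swift
import Foundation

// Sketch: list cameras via the Works with Nest REST API linked above.
// The endpoint and field names follow that reference page; accessToken
// is a placeholder you would obtain through the OAuth flow.
func fetchCameras(accessToken: String) {
    var request = URLRequest(url: URL(string: "https://developer-api.nest.com/devices/cameras")!)
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, error in
        guard let data = data,
              let cameras = (try? JSONSerialization.jsonObject(with: data)) as? [String: [String: Any]]
        else {
            print("Request failed: \(error?.localizedDescription ?? "could not parse response")")
            return
        }
        for (id, camera) in cameras {
            // Per the reference, "web_url" points at the live view and
            // "is_streaming" says whether the camera is currently on.
            print(id, camera["web_url"] ?? "-", camera["is_streaming"] ?? "-")
        }
    }.resume()
}
```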
I am trying to make an existing website accessible, but I'm struggling with a banner that has a video background streamed from Vimeo. I found an identical issue in the SO thread Making a video element with no sound accessible: the poster stated that axe failed the video because it had no captions, the same situation I face.
The accepted answer does not solve that issue, though. Setting aria-hidden on a parent element has no effect on axe's report. I need to make the site accessible but also to pass an audit. Does anyone know a way to do this? It seems to me that WCAG is not well thought out here: it has no video equivalent of the alt="" or role="presentation" that images get when they are purely decorative and convey no meaningful information.
Ignore axe or any other automated testing tool for the moment, because they often have errors in their reporting. Axe-like tools are not the standard for measuring accessibility. WCAG is. Using a human and the Web Content Accessibility Guidelines is the sure way to know whether you are conformant.
In this case, you have two guidelines that apply.
1.2.1 Audio-only and Video-only (Prerecorded)
2.2.2 Pause, Stop, Hide
For the first, it says:
Either an alternative for time-based media or an audio track is provided that presents equivalent information for prerecorded video-only content.
So two key points here are:
You can either have an alternative for time-based media (which is often a transcript, and in this case you could have a transcript that describes what is going on in the video) or you can have an audio track that describes the same thing.
You only need this alternative if the video has meaning. The guideline says the alternative must provide "equivalent information". So what's the purpose or meaning of the video? Does it provide information or is it decorative? If the video were removed, would critical information be lost? If it's purely decorative and doesn't provide anything, then the "equivalent information" would essentially be "nothing". The video could be hidden from assistive technology. But you'd want to talk to your designers about the purpose or meaning of the video.
The second guideline, 2.2.2, says that if there's moving content (usually an animation, but a video counts as moving information), then the user needs a way to stop it. That's normally done with a pause button on the video, but it can be done in lots of other ways.
In summary, you need to decide the purpose of the video and whether it conveys information, and if so, decide what an "equivalent" experience would be, and also provide a way to pause the video.
I have an MKMapView in my iPhone app.
First, I want to show street names on it.
Second, I want to give the user the chance to look for places with a UISearchViewController.
Are there any tutorials for these two points, and which service do you think is best for finding places?
You need to explore and understand Apple's MapKit framework (https://developer.apple.com/library/ios/documentation/MapKit/Reference/MapKit_Framework_Reference/_index.html). An MKMapView shows street names by default in its standard map type, so your first point mostly comes for free. For the second, you can use MKAnnotations to show things on the map and MKLocalSearch to get points of interest around the user's location. The documentation is super clear and has examples of how to get latitude/longitude, drop pins, pinpoint the user's location, add info balloons (MKAnnotations), etc. Also, take a look at the UISearchViewController documentation to see if it is the best fit for your use case.
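To give you a feel for the MKLocalSearch flow mentioned above, here is a minimal sketch (in Swift, assuming an existing mapView in your view controller): search near whatever the map is showing and drop a pin for each hit.

```swift
import MapKit

// Sketch: run a natural-language search biased to the visible map region
// and add an annotation (pin) for each result.
func search(for query: String, on mapView: MKMapView) {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = query   // e.g. "coffee"
    request.region = mapView.region        // bias results to the visible area

    MKLocalSearch(request: request).start { response, error in
        guard let response = response else {
            print("Search failed: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        for item in response.mapItems {
            let pin = MKPointAnnotation()
            pin.coordinate = item.placemark.coordinate
            pin.title = item.name
            mapView.addAnnotation(pin)     // completion runs on the main thread
        }
    }
}
```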
Then I would suggest taking a look at the Google Places Search API (https://developers.google.com/places/documentation/search) or Foursquare's API (https://developer.foursquare.com/docs/). In the end, it all comes down to taking the info that comes back in a JSON response from one of those APIs and showing it in a list or on the map with balloons.
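For the Google Places route, that JSON response is just an HTTP call away. A sketch of the Nearby Search request from the linked Places documentation (YOUR_KEY is a placeholder and the coordinates are made up):

```swift
import Foundation

// Sketch: Places Nearby Search, i.e. find cafes within 500 m of a point.
// Endpoint and parameters are from the linked Places docs.
let url = URL(string: "https://maps.googleapis.com/maps/api/place/nearbysearch/json" +
              "?location=37.7749,-122.4194&radius=500&type=cafe&key=YOUR_KEY")!

URLSession.shared.dataTask(with: url) { data, _, _ in
    guard let data = data,
          let json = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any],
          let results = json["results"] as? [[String: Any]] else { return }
    // Each result carries a "name" plus a "geometry.location" you can
    // turn into a map annotation: the "JSON onto the map" flow above.
    for place in results {
        print(place["name"] ?? "-")
    }
}.resume()
```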
If your data source ends up being Google Places... I would suggest using Google Maps for iOS instead of the native MapKit.
As for tutorials, there are probably hundreds of them on the Interwebz... but I suggest you start with Apple's MapKit documentation, understand that first, and then try to mix in other data sources.
Good luck and happy holidays!
I want to capture the total number of bytes sent/received through all network interfaces, and through particular interfaces, on the iPhone. Google didn't turn up any similar code. Can anyone help me with Objective-C code for this?
This is not accessible information, and you would violate the private framework rules if you tried accessing this on the device.
So I'm afraid it is not possible.
I don't think you can do it. As the iOS Technology Overview mentions, the functionality you describe sits at the Core Services level and isn't found in the official documentation. There may be private APIs for it, but Apple will definitely prevent you from using them.
Sorry about that.
I have been trawling through everything I can find on the Google GData YouTube API, and it's doing nothing but hurting my brain. Google doesn't make it easy to learn, and there seems to be little content about it online (Objective-C, iOS, Mac).
I just want to achieve one of the simplest tasks: searching YouTube and getting thumbnails, descriptions, titles, etc. for the search results. If you can help with this I would appreciate it; I am unsure of where to start.
Alternatively if you can recommend some good resources for learning the API I'd appreciate it, thanks.
The YouTubeSample app would be the most straightforward example. You would need to first read and understand the API library documentation.
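If the library itself is the sticking point, it can help to see that the data underneath is just an HTTP feed. As an illustration only (this is not the YouTubeSample approach), here is a sketch against the current Data API v3 search endpoint; the endpoint and response fields are from the v3 docs, and YOUR_KEY is a placeholder:

```swift
import Foundation

// Sketch: search YouTube over plain HTTP (Data API v3) and pull out the
// title, description, and thumbnail URL for each result.
struct SearchResponse: Decodable {
    struct Item: Decodable { let snippet: Snippet }
    struct Snippet: Decodable {
        let title: String
        let description: String
        let thumbnails: Thumbnails
    }
    struct Thumbnails: Decodable {
        let defaultThumb: Thumb
        enum CodingKeys: String, CodingKey { case defaultThumb = "default" }
    }
    struct Thumb: Decodable { let url: String }
    let items: [Item]
}

func searchYouTube(for query: String) {
    var components = URLComponents(string: "https://www.googleapis.com/youtube/v3/search")!
    components.queryItems = [
        URLQueryItem(name: "part", value: "snippet"),
        URLQueryItem(name: "q", value: query),
        URLQueryItem(name: "key", value: "YOUR_KEY"),
    ]
    URLSession.shared.dataTask(with: components.url!) { data, _, _ in
        guard let data = data,
              let response = try? JSONDecoder().decode(SearchResponse.self, from: data) else { return }
        for item in response.items {
            print(item.snippet.title, item.snippet.thumbnails.defaultThumb.url)
        }
    }.resume()
}
```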
I've been scouring both this site and the net in general for an example Cocoa app that uses QTKit or Audio Queue and actually works.
Unfortunately, I can't find anything that fits the above description.
All I want to do is get a simple audio recording app so I can learn how it works!
Please, I have put a lot of time into this already, so don't point me to Apple Dev. It is just too dense for my simple brain.
And yes, there is a dupe or two here, but none of them actually produced a satisfactory outcome.
I am desperate! I feel like this should be WAY easier. I am starting to worry about getting deeply into Cocoa because the developer documentation is really not good.
Help!
It's not reasonable to ask people not to point to the documentation, especially when there is a step-by-step tutorial for creating a simple recording app with QTKit therein. It's titled "Creating a Simple Capture and Recording Application" with about ten steps (with code). If that's not enough, the Sample Code section gives you "MyRecorder," which is a ready-to-go media recorder using QTKit.
It's far easier to get help if you a) don't limit people by telling them not to refer you to a resource, and b) start with some standard resource and explain what it is that's confusing you or that's not working for you, so we have a starting point from which to offer help.
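For a sanity check on how small a recorder can be once you're past the docs: QTKit has since been superseded by AVFoundation, where the minimal version is genuinely short. This is a sketch of that newer route, not the code from the QTKit tutorial or MyRecorder:

```swift
import AVFoundation

// Sketch: a minimal audio recorder via AVFoundation (QTKit's successor),
// recording AAC into a temporary .m4a file. On modern systems you also
// need microphone permission before this will capture anything.
func makeRecorder() throws -> AVAudioRecorder {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("recording.m4a")
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),  // AAC in an .m4a container
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1,
    ]
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.prepareToRecord()
    return recorder
}

// Usage, wired to whatever record/stop buttons you have:
// let recorder = try makeRecorder()
// recorder.record()
// ...
// recorder.stop()
```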