I am writing a Boxee App and I want to list all albums for a picasa account.
The problem I'm facing is that I have no idea how to list albums and photos in the Boxee app.
The application itself should be fairly simple:
First, a main screen where all your albums are listed; then, when you click an album, you see all your photos in a grid of, say, 4x5 photos.
When you click a photo, you go into a view where one photo is displayed on the whole screen, and back and forward buttons let users go back and forth in that gallery.
I have written a simple wrapper around the gdata photos Python API, which I was aiming to use so that hopefully all the heavy lifting will be done by the gdata API.
Does anybody have some good links to example applications or tutorials to one or more of the features I want in the application?
Boxee uses an XML based approach for describing an application's interface. You'll need one XML for each screen of your application and you'll connect them together using the API.
You would build these XML screens using various controls defined by the XML API. Basically, a control (a button, a list, a label, etc.) is described as an XML node with attributes and child nodes. You can check a list of all the available controls here: http://developer.boxee.tv/UI_Controls
You can use the Python API to control various properties of the UI elements you coded in your XML files. For example you could fill a list with photos taken from a server, you could change the label on a button, load another screen and much more. Here are the Python API specs: http://developer.boxee.tv/Python_API
Make sure you read through the Boxee dev pages, and also remember that Boxee originated from the XBMC project, so most of the documentation regarding XBMC skinning (http://wiki.xbmc.org/?title=Skinning_XBMC) also applies to Boxee.
Another thing that might help you is looking at other apps. Find an app that is somewhat similar to what you want to do, locate it in Boxee's app folder and peek at the code there.
The problem
So, I am developing a React Native application and I am facing the challenge of selecting multiple images from the user's gallery, just like apps such as WhatsApp, Telegram, Twitter and even Reddit do. With that in mind, I tried to use launchImageLibraryAsync from expo-image-picker but, as specified in their documentation, multiple selection of images is only supported on the web.
What I have thought of
So, based on several searches, it seemed like I had to build my own "gallery". To do this, so far I've tried to use @react-native-community/cameraroll and expo-media-library, but both of them require that we pass the first property to the getPhotos (for @react-native-community/cameraroll) or getAssetsAsync (for expo-media-library) functions, which defines how many items to fetch first. This is a problem because I do not want to fetch, say, 20 items and then, when the user reaches the end of the list, fetch 20 more items. I need something like this (this example is from Telegram). You can see that the app never stops me from scrolling; it goes all the way through my entire gallery.
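(For reference, the cursor-based paging the question refers to looks roughly like this with expo-media-library; the page size and the loadAllPhotos helper name are illustrative, only getAssetsAsync and its first/after/endCursor options come from the library.)

```typescript
// Sketch of the cursor-based paging that getAssetsAsync exposes. The page size
// and the loadAllPhotos helper name are illustrative; first/after/endCursor
// are the library's own options/fields.
import * as MediaLibrary from 'expo-media-library';

async function loadAllPhotos(): Promise<MediaLibrary.Asset[]> {
  const { granted } = await MediaLibrary.requestPermissionsAsync();
  if (!granted) return [];

  const photos: MediaLibrary.Asset[] = [];
  let cursor: string | undefined;
  let hasNextPage = true;

  // Each call returns one page plus an endCursor; passing it back as `after`
  // continues where the previous page stopped.
  while (hasNextPage) {
    const page = await MediaLibrary.getAssetsAsync({
      first: 100,
      after: cursor,
      mediaType: MediaLibrary.MediaType.photo,
      sortBy: MediaLibrary.SortBy.creationTime,
    });
    photos.push(...page.assets);
    cursor = page.endCursor;
    hasNextPage = page.hasNextPage;
  }
  return photos;
}
```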
What would also be nice
If you use Reddit mobile, you can see that you can also select multiple photos through several apps like Google Photos, Files, Google Drive and so on.
This would be even nicer because I wouldn't need to implement a custom-made picker. Do you guys know how I could implement this?
OBS: I am using Expo with Bare Workflow, so I can use just about any package. I also opened a discussion at the Expo repo about it. You can check it here: https://github.com/expo/expo/discussions/15210.
Thank you in advance :)
I have been using a package called expo-images-picker; check it out. It has functionality similar to what you need, and it works on mobile as well.
Link here
We generate quick links in our iOS app that are supposed to point to specific content within the app. When a quick link is shared via a messaging app that supports preview snippets, we want the snippet to display custom content depending on the parameters passed when our iOS app generates a link.
For example, a user wants to share an audio clip; the app generates a link which is then posted in a messaging app or on social media. We want the preview snippet to reflect a specific title/subtitle and image related to that audio.
We use a custom domain name for Branch links if that matters.
What is the right way to achieve this?
You can use Link Preview to achieve this functionality. It enables the link to display content as a preview card in Facebook, Twitter, Pinterest, iMessage, etc. This card can contain a title, description and image (which you append to the link as OG tags such as $og_title, $og_description, and $og_image_url).
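As a rough illustration (a sketch only, not Branch's API as such): the OG parameters are just key/value pairs attached to the link data when the app generates the link, whatever Branch SDK you use. The variable name and example values below are placeholders.

```typescript
// Sketch of the per-link OG parameters Branch recognizes. Attach this data
// when the app generates the link with whichever Branch SDK you use; the
// variable name and example values are placeholders, not Branch API.
const audioShareData = {
  // Rendered in the preview card by messaging and social apps:
  $og_title: 'Episode 12 - Morning Run Mix',
  $og_description: 'Listen to this audio inside the app.',
  $og_image_url: 'https://example.com/artwork/episode-12.jpg',
  // Your own routing parameters so the app opens the right content:
  contentType: 'audio',
  audioId: '12345',
};

console.log(JSON.stringify(audioShareData, null, 2));
```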
How do I create custom templates in an iOS app containing a UIImageView, UITextView, and many other views, so that the user can select any one template and start editing it?
There is a well-known library that's floating around for this kind of usage - iOS BoilerPlate
It is intended to provide a base of code to start with
It is not intended to be a framework
It is intended to be modified and extended by the developer to fit their needs
It includes solid third-party libraries if needed to not reinvent the wheel
What it includes -
HTTP requests and an image cache (both in-memory and disk-based)
UITableViews and UITableViewCells: fast scrolling, async images, pull-down-to-refresh, swipeable cells,...
A built-in browser so your users don't leave your application when they browse to a certain URL
Maps and locations: directions between two points, autocomplete a location, etc.
If I wanted to create a mobile app that allows the user to take pictures with their phone, record audio notes and record video, how would I do that?
I was browsing through the Sencha Touch 2 API, and while I see documentation on video and audio files, it seems like it just provides a way for me to access files stored on the phone - not actual triggers to record or take pictures.
Am I missing something?
How would I do what I want?
In order for Sencha Touch to have access to your phone's capabilities, you need to use a product like PhoneGap.
Unless there is an HTML5 API for doing those sorts of things, I don't think you can do that. I know that on PhoneGap there are native extensions added to that platform for access to things like the microphone, camera, etc. I don't know if Sencha Touch has added any of those sorts of extensions that would let you do this.
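For context, the PhoneGap (Cordova) calls being referred to look roughly like this; it's a sketch that assumes the camera and media-capture plugins are installed, and the callbacks are illustrative only.

```typescript
// Rough sketch of the PhoneGap/Cordova plugin calls for the three cases.
// Assumes the camera and media-capture plugins are installed; the globals
// only exist at runtime inside the container, so they are typed loosely here.
const nav = navigator as any;
declare const Camera: any; // provided by the camera plugin

// Take a picture with the device camera.
nav.camera.getPicture(
  (fileUri: string) => console.log('photo saved at', fileUri),
  (message: string) => console.error('camera failed:', message),
  { quality: 50, destinationType: Camera.DestinationType.FILE_URI }
);

// Record an audio note.
nav.device.capture.captureAudio(
  (files: unknown[]) => console.log('captured audio', files),
  (error: unknown) => console.error('audio capture failed', error),
  { limit: 1 }
);

// Record a video clip.
nav.device.capture.captureVideo(
  (files: unknown[]) => console.log('captured video', files),
  (error: unknown) => console.error('video capture failed', error),
  { limit: 1 }
);
```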
Just thinking outside the box here, but you might be able to put the Sencha JavaScript into a WebView from within an Android Java process. Then the Java code could expose an object in its process as an extension point to the JavaScript engine for access to the camera, microphone, and whatnot.
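A sketch of what the JavaScript side of such a bridge might look like; the NativeBridge name and its methods are hypothetical, and on Android the Java process would inject the object with WebView.addJavascriptInterface.

```typescript
// Hypothetical bridge object injected by the host Android app via
// WebView.addJavascriptInterface; the name and method list are invented,
// the real contract is whatever your Java code exposes.
interface NativeBridge {
  takePicture(): void;
  startAudioRecording(): void;
  startVideoRecording(): void;
}

const bridge = (window as any).NativeBridge as NativeBridge | undefined;

if (bridge) {
  bridge.takePicture(); // handled by the camera code on the Java side
} else {
  console.warn('NativeBridge not injected - not running inside the WebView host?');
}
```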
I'm searching for some code samples for my Flash application. The app is based on screen flows, i.e. each screen is defined with some visual elements and some buttons which have async events associated with them (basically consuming web services). The buttons also provide functionality to go back and forth between screens and to jump to any screen. I want to define all of the visual elements and the functionality of the events in an XML file, so the model, view and controller are all in the XML. Does any framework, like PureMVC, allow this? Where can I find some examples of this kind of functionality?
There's a Flash implementation of Hierarchical State Machines here, which mentions XML and flow control:
http://code.google.com/p/troyworks/
(Note: there's a very slick screencasting application called ScreenFlow which may lay claim to that particular term these days...)
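(The question is about Flash/ActionScript, but as a language-neutral sketch of the screen-flow idea - screens and their button transitions declared as data, the way the XML file would describe them, with a small controller walking between them - here it is in TypeScript. Every name below is invented for illustration.)

```typescript
// Minimal screen-flow controller: screens and their button transitions are
// declared as data (what the XML file would describe) and a small controller
// walks between them. Every name here is invented for illustration.
interface ScreenDef {
  id: string;
  buttons: Record<string, string>; // button label -> target screen id
}

const flow: ScreenDef[] = [
  { id: 'login', buttons: { Next: 'catalog' } },
  { id: 'catalog', buttons: { Back: 'login', Buy: 'checkout' } },
  { id: 'checkout', buttons: { Back: 'catalog', Done: 'login' } },
];

class ScreenFlow {
  constructor(private screens: Map<string, ScreenDef>, private current: string) {}

  press(button: string): void {
    const target = this.screens.get(this.current)?.buttons[button];
    if (!target) throw new Error(`No "${button}" transition on ${this.current}`);
    this.current = target;
    console.log(`now showing: ${this.current}`);
  }
}

const controller = new ScreenFlow(
  new Map(flow.map(s => [s.id, s] as [string, ScreenDef])),
  'login'
);
controller.press('Next'); // now showing: catalog
controller.press('Buy');  // now showing: checkout
```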