Callback or real-time communication for Social Tables API - social-tables

I am evaluating the Social Tables API to see if there's any way to get notified when data on the Social Tables side changes so that we can sync the data in real time. I can't find anything on callbacks or long-running operations. Does that mean polling is the only option?

There is no real-time API for Social Tables, so you're correct: polling is the only option for keeping data in sync.
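For illustration, a minimal polling loop might look like this (TypeScript; the endpoint path, `updated_since` parameter, and response shape are assumptions for the sketch, not the documented Social Tables API):

```typescript
// Minimal polling loop. The endpoint path, query parameter, and response
// shape below are assumptions for the sketch, not the documented API.
const API_BASE = "https://api.socialtables.com"; // assumed base URL
const POLL_INTERVAL_MS = 30_000;

let lastSync = new Date(0).toISOString();

async function pollOnce(token: string): Promise<void> {
  const res = await fetch(
    `${API_BASE}/events?updated_since=${encodeURIComponent(lastSync)}`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  if (!res.ok) throw new Error(`Poll failed: ${res.status}`);
  const changes: unknown[] = await res.json();
  lastSync = new Date().toISOString();
  // Apply `changes` to your local copy here.
  console.log(`Fetched ${changes.length} changed records`);
}

// Repeat forever, tolerating transient failures.
setInterval(
  () => pollOnce(process.env.ST_TOKEN ?? "").catch(console.error),
  POLL_INTERVAL_MS
);
```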

Related

Design: Is it correct to push records to the database first and then pick them from the database and push them to RabbitMQ?

We are an e-commerce website with many listings. The listings can be created/updated/deleted from various clients, i.e. desktop, mobile site, and app.
We need to push all of this updated information to some third-party APIs: whenever a listing is created, updated, or deleted, we need to push the complete listing information through a third-party API. We are using RabbitMQ for this since we expect a high volume of record updates.
We have two choices:
Push from all endpoints (desktop/mobile site/app) a message like (listingId, action on listing, i.e. CREATE/UPDATE/DELETE) to RabbitMQ. A consumer then dequeues these messages and calls the appropriate API.
Implement a trigger on the listings table, i.e. on create, insert an entry into a staging table with columns (listingId, database action executed, i.e. CREATE/UPDATE/DELETE). Then create a job to read from this table every 10 seconds and push the entries to RabbitMQ.
Which is the better approach?
I think an HTTP-based API might be the best solution. You can implement a gateway which includes security (OAuth2/SAML), rate limiting, etc. Internally you can use RabbitMQ: the gateway publishes the updates to RabbitMQ, and you have subscribers which write the data to your master database and other subscribers that publish the data to your third-party APIs.
The added benefit of an HTTP gateway, beyond the extra security controls available, is that you could change your mind about RabbitMQ in the future without impacting your desktop and mobile apps, which would be difficult to update in their entirety.
Having worked with databases for most of my career, I tend to avoid triggers, especially if you expect large volumes of updates. I have experienced both performance and reliability problems with triggers in the past.
It is worth noting that RabbitMQ can deliver duplicate messages: there is no exactly-once delivery guarantee. So you will need to implement your own deduplication or, preferably, make all actions idempotent.
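As a rough sketch of that deduplication idea (TypeScript with the amqplib package; the queue name and message shape are made up for illustration):

```typescript
import amqp from "amqplib";

const QUEUE = "listing-updates"; // illustrative queue name

async function main() {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertQueue(QUEUE, { durable: true });

  // Publisher: stamp each message with a stable ID (listingId + version)
  // so duplicates can be recognized downstream.
  const update = { listingId: 42, action: "UPDATE", version: 7 };
  ch.sendToQueue(QUEUE, Buffer.from(JSON.stringify(update)), {
    persistent: true,
    messageId: `${update.listingId}:${update.version}`,
  });

  // Consumer: skip messages whose ID has already been processed. The
  // in-memory Set is for illustration only; in production use a persistent
  // store (e.g. a unique-keyed table) shared by all consumers.
  const seen = new Set<string>();
  await ch.consume(QUEUE, (msg) => {
    if (!msg) return;
    const id = msg.properties.messageId as string;
    if (!seen.has(id)) {
      seen.add(id);
      // Call the third-party API here; ideally the call itself is idempotent.
      console.log("processing", JSON.parse(msg.content.toString()));
    }
    ch.ack(msg); // ack duplicates too, so they leave the queue
  });
}

main().catch(console.error);
```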

Pushing near real time position updates to a number of clients

I'm currently in the early stages of designing a cross-platform mobile app (iOS, Android, WP8). At its core, the application will allow proximity-based interaction between an unknown number of client devices.
Could someone recommend a framework or technology that would allow me to push server-generated, location-based proximity alerts to the clients, and that scales well? Could push notifications (cross-platform using PushSharp) be a viable option for this, or do push notifications always go hand in hand with some kind of toast notification on the device?
Update: a delay of 30 seconds or even a minute should do.
Push notifications have way too much latency to give you anything near real time. Also, push notifications are not guaranteed to reach a device.
You really need to think about how time-constrained the location updates need to be. You should also test how fast the GPS on various devices updates; I have tested on a Nexus 4, and it is not near real time either. So throughout your entire application, whether server side or client side, there will be things undermining your real-time wishes.
However, IF you can live with a delay of a couple of seconds up to maybe 30 seconds or more, push notifications may work well for you, and they also scale quite nicely.
EDIT:
"Do push notifications always go hand in hand with some kind of toast notification on the device?"
There is a concept called raw notifications, which allows you to send arbitrary information with the notification. However, personally I would just notify the client that updates are ready on a server, from which it can then fetch all the information the application needs. This is because, as I said, push notifications are not guaranteed to ever reach the device, but also because you are limited in how much information you can embed in a notification.
So my suggestion for a flow would be:
Client A updates its location and sends it to a web service.
The web service receives the info from Client A, determines which other clients need information about Client A, and pushes a notification telling them to update their info.
Client B receives a push notification telling it to refresh its data from the web service and does so.
That would work well while the application is in the background. When it is in the foreground, I would simply poll the server every second or so, while still receiving notifications that force the client to update. A sketch of this flow follows.
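A rough sketch of the notify-then-fetch flow (TypeScript with Express; `sendPush` is a stand-in for whatever push provider you wrap, e.g. via PushSharp, and the nearness check is a placeholder):

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Stand-in for a real push provider (APNs/GCM/WNS, e.g. via PushSharp); just logs here.
async function sendPush(deviceToken: string, payload: object): Promise<void> {
  console.log(`push -> ${deviceToken}`, payload);
}

// In-memory state for the sketch; a real service would use a proper store.
const locations = new Map<string, { lat: number; lon: number; token: string }>();

// Crude placeholder for a real geo-distance check.
function isNearby(aLat: number, aLon: number, bLat: number, bLon: number): boolean {
  return Math.hypot(aLat - bLat, aLon - bLon) < 0.01;
}

// Client A posts its location; nearby clients get a content-free "refresh" push,
// because pushes may never arrive and are size-limited.
app.post("/location", async (req, res) => {
  const { clientId, lat, lon, token } = req.body;
  locations.set(clientId, { lat, lon, token });
  for (const [otherId, other] of locations) {
    if (otherId !== clientId && isNearby(lat, lon, other.lat, other.lon)) {
      await sendPush(other.token, { refresh: true });
    }
  }
  res.sendStatus(204);
});

// Clients pull the authoritative state here after a push (or while polling in the foreground).
app.get("/nearby/:clientId", (req, res) => {
  res.json([...locations.entries()].filter(([id]) => id !== req.params.clientId));
});

app.listen(3000);
```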
For the "scales well" part, you might also want to take a look at Windows Azure Service Bus Notification Hubs. It's currently in preview and supports only Windows 8 Store apps and iOS but support for Android and Windows Phone is on the way.

Strategies for designing REST APIs for all types of client devices

The question is targeted more at server-side development.
When writing a REST API, I want to write it in such a way that it can be consumed by both desktop and mobile applications.
I can see two possible approaches:
Each API supports pagination, and the responsibility for how much data to fetch in one go is delegated to the client. So mobile apps will ask for fewer records per request and desktop applications will ask for more.
Separate APIs for mobile devices, hosted separately. The front-end web server checks the user agent (i.e. the source the request is coming from) and, if it's a mobile device, re-routes the request to the server hosting the mobile APIs.
Interested to know more strategies around this.
Appreciate your inputs.
I would suggest a bit of both (1) and (2); here's how.
Instead of building a whole new API for mobile, have adapters for all the supported devices, i.e. have a layer on top of your REST API implementation which instructs the underlying service to return content suitable for the selected device.
Coming to pagination, you can parameterize it as an input to the adapter layer mentioned above.
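A rough illustration of that adapter idea (TypeScript; the device profiles and field lists are invented for the sketch):

```typescript
// One REST implementation, with a thin adapter shaping the response per device class.
type Listing = { id: number; title: string; description: string; images: string[] };

// Invented per-device profiles: page size plus which fields to render.
const profiles: Record<"mobile" | "desktop", { pageSize: number; fields: readonly (keyof Listing)[] }> = {
  mobile: { pageSize: 10, fields: ["id", "title"] },
  desktop: { pageSize: 50, fields: ["id", "title", "description", "images"] },
};

function adapt(listings: Listing[], device: "mobile" | "desktop") {
  const { pageSize, fields } = profiles[device];
  // Same underlying data; the adapter only trims page size and fields.
  return listings.slice(0, pageSize).map((l) =>
    Object.fromEntries(fields.map((f) => [f, l[f]]))
  );
}
```

The underlying service stays the same; only the adapter decides how much and which content each device class gets, and the page size can equally be taken from a request parameter.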
I would recommend something closer to option (1). If the main difference between the clients will be the amount of data they request at a time, it seems trivial to add some kind of query parameter or HTTP header to the REST API indicating how many records to return, for instance.
Relying on checking the User-Agent header may require you to maintain a list of known client user agents and match against them, which would add maintenance cost on top of running a separate API.
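To illustrate the query-parameter approach, a sketch (TypeScript with Express; the parameter names and limits are just examples):

```typescript
import express from "express";

const app = express();

// One API for every client: the client chooses the page size,
// and the server clamps it to a sane maximum.
app.get("/listings", (req, res) => {
  const limit = Math.min(Number(req.query.limit) || 20, 100); // default 20, cap at 100
  const offset = Number(req.query.offset) || 0;
  res.json(fetchListings(offset, limit));
});

// Placeholder data access; a real implementation would query the database.
function fetchListings(offset: number, limit: number): object[] {
  return Array.from({ length: limit }, (_, i) => ({ id: offset + i }));
}

app.listen(3000);
```

A mobile client might request /listings?limit=10 while a desktop client requests /listings?limit=50, all against the same endpoint.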

Shared data between iOS devices

I'm developing an internal iOS Cocoa app in which multiple devices need to connect to and read/write a shared data store. It will be similar to an inventory application.
Would this be best done using a server-side SQLite database or some other kind of server-side data store? Or is there a method I don't know of by which multiple devices can share data?
Any help is appreciated. Thanks.
A third-party service possibly worth checking out is Parse.
With Parse, you can add a scalable and powerful backend in minutes and launch a full-featured app in record time without ever worrying about server management. We offer push notifications, social integration, data storage, and the ability to add rich custom logic to your app’s backend with Cloud Code.
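For a flavor of what that looks like, a minimal sketch using Parse's JavaScript SDK (class and field names are invented; an iOS app would use the equivalent Objective-C SDK):

```typescript
import Parse from "parse/node";

// Placeholder credentials; class and field names are invented for the sketch.
Parse.initialize("YOUR_APP_ID", "YOUR_JS_KEY");

async function demo() {
  // Any device can write to the shared "InventoryItem" class...
  const InventoryItem = Parse.Object.extend("InventoryItem");
  const item = new InventoryItem();
  item.set("name", "widget");
  item.set("count", 3);
  await item.save();

  // ...and any other device can read it back.
  const query = new Parse.Query(InventoryItem);
  query.equalTo("name", "widget");
  const results = await query.find();
  console.log(`${results.length} matching item(s)`);
}

demo().catch(console.error);
```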
Depending upon the complexity, and assuming that all devices can be connected to the same iCloud account, you could utilize iCloud for this.
http://www.raywenderlich.com/6015/beginning-icloud-in-ios-5-tutorial-part-1

Performance testing Twitter Streaming API consumer

I have a service that consumes Twitter posts in real time using the Twitter Streaming API.
I have built a background process which connects to the stream and pushes tweets into Redis. It is built with Node.js.
What I need to do is figure out the maximum number of tweets this process can consume; I need to performance-test this setup.
What is the best way to test this?
I need to know:
how many tweets it can handle before it falls over
what happens when the process can't handle any more tweets
Another reason I would want to do this is to work out whether it's worth using Node.js at all.
I would prefer to write it with EventMachine instead.
Since you're inherently limited by the frequency and volume of tweets coming from the Twitter Streaming API, what you're actually interested in benchmarking is the I/O performance of your background process with respect to Redis.
Mock the tweets: either generate pseudo-tweets or collect a significant sample of actual tweets, and use that data set in your benchmarking. With the data set in hand, you can write a precise benchmark against it. For example, you could push the entire data set at once into your tweet event-handling logic, or simulate peaks and valleys of activity.
The point being: when benchmarking, identify and isolate the variable you care about (number of tweets), use a standardized sample, and mock away inconsistent outside behavior (API limits, a variable tweets/sec rate).
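A minimal sketch of such a benchmark (TypeScript with the node-redis client; the tweet shape and volume are made up):

```typescript
import { createClient } from "redis";

// Generate pseudo-tweets shaped roughly like the real thing (shape is made up here).
function fakeTweet(i: number): string {
  return JSON.stringify({ id: i, text: `pseudo tweet ${i}`, created_at: Date.now() });
}

async function benchmark(total: number): Promise<void> {
  const redis = createClient(); // defaults to localhost:6379
  await redis.connect();

  const start = Date.now();
  for (let i = 0; i < total; i++) {
    // Same operation the real stream consumer performs per tweet.
    await redis.lPush("tweets", fakeTweet(i));
  }
  const seconds = (Date.now() - start) / 1000;
  console.log(`${total} tweets in ${seconds.toFixed(1)}s = ${(total / seconds).toFixed(0)} tweets/sec`);

  await redis.quit();
}

benchmark(100_000).catch(console.error);
```

Replacing the single loop with timed bursts lets you simulate the peaks and valleys mentioned above, and watching memory and queue depth while it runs answers the "what happens when it falls over" question.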
I would suggest creating a custom client simulating the Twitter Streaming API. The client can generate tweets for your application to consume. You can use a load-testing tool that supports custom scripts to run this client from distributed machines to generate the desired load. While the tweets are being generated, you can monitor the health of the system to measure the impact of tweet throughput on your application.