I am writing an app that uses GCM. I am not trying to compete with WhatsApp; it should just be a sort of bulletin board shared by everyone who uses the app.
The app opens with a menu of buttons, one of which takes the user to the messages activity.
As long as you stay on this activity, the messages are there and everything works as it should be.
But if I go back to the menu activity and then return to the messages activity, all the messages are gone and nothing is shown.
I am using a ListActivity for the messages, and onCreate recreates the list from scratch each time the activity is opened, which wipes out the current messages.
My question is: how can I return to the messages activity without losing them? I want it to behave like WhatsApp, where the messages are always there when you open the app.
I thought of saving the messages list to a file and reading it back (say, the last 40-50 messages) every time onCreate is called. Is this a good solution? Is there a solution that does not involve saving the list to a file?
Here is some of my code inside the messages activity:
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_msg_main);
Intent in = getIntent();
username = in.getStringExtra("username");
password = in.getStringExtra("password");
nickname = in.getStringExtra("nickname");
messages = new ArrayList<Message>();
txtNewMsg = (EditText) this.findViewById(R.id.text);
adapter = new MessagesAdapter(this, messages);
setListAdapter(adapter);
registerReceiver(mHandleMessageReceiver, new IntentFilter(DISPLAY_MESSAGE_ACTION));
}
private final BroadcastReceiver mHandleMessageReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent) {
String newMessage = intent.getExtras().getString(EXTRA_MESSAGE);
// Waking up mobile if it is sleeping
WakeLocker.acquire(getApplicationContext());
SimpleDateFormat sdf = new SimpleDateFormat("dd/MM/yyyy HH:mm:ss");
String currentDateandTime = sdf.format(new Date());
Spanned newMessageDate = Html.fromHtml("<small><i><font color=\"red\">" + currentDateandTime + "</font></i></small>" + " " + "<small><i><font color=\"blue\">" + nickname + "</font></i></small>" + "<br />" + newMessage);
addNewMessage(new Message(newMessageDate, true));
WakeLocker.release();
}
};
@Override
protected void onDestroy() {
if (mRegisterTask != null) {
mRegisterTask.cancel(true);
}
try {
unregisterReceiver(mHandleMessageReceiver);
GCMRegistrar.onDestroy(this);
} catch (Exception e) {
Log.e("UnRegister Receiver Error", "> " + e.getMessage());
}
super.onDestroy();
}
void addNewMessage(Message m)
{
messages.add(m);
adapter.notifyDataSetChanged();
getListView().setSelection(messages.size()-1);
}
The code includes onCreate, the BroadcastReceiver (GCM receiver), onDestroy, and addNewMessage, which appends the new message to the messages list.
Thanks for any help!
AJ
Create a database with a messages table and a users table to store the message and user details.
Whenever a new message arrives, store it in the database, and when you open the app load a limited number of messages from the table. If the app is already open when a message arrives, first store it in your database, then add it to your list object and refresh your ListView.
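For illustration only, a minimal SQLite sketch along these lines could back the list; the class, table, and column names below are assumptions, and the messages are flattened to plain strings to keep it short:
import java.util.ArrayList;
import java.util.List;
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
public class MessageDbHelper extends SQLiteOpenHelper {
    private static final String DB_NAME = "messages.db";
    private static final int DB_VERSION = 1;
    public MessageDbHelper(Context context) {
        super(context, DB_NAME, null, DB_VERSION);
    }
    @Override
    public void onCreate(SQLiteDatabase db) {
        // One row per message, ordered by arrival time.
        db.execSQL("CREATE TABLE messages (_id INTEGER PRIMARY KEY AUTOINCREMENT, "
                + "body TEXT NOT NULL, created_at INTEGER NOT NULL)");
    }
    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS messages");
        onCreate(db);
    }
    // Store one message; call this from the GCM receiver when a message arrives.
    public void insertMessage(String body) {
        ContentValues values = new ContentValues();
        values.put("body", body);
        values.put("created_at", System.currentTimeMillis());
        getWritableDatabase().insert("messages", null, values);
    }
    // Load the most recent messages to repopulate the adapter in onCreate.
    public List<String> loadRecentMessages(int limit) {
        List<String> result = new ArrayList<String>();
        Cursor cursor = getReadableDatabase().query("messages",
                new String[] { "body" }, null, null, null, null,
                "created_at DESC", String.valueOf(limit));
        try {
            while (cursor.moveToNext()) {
                result.add(0, cursor.getString(0)); // reverse into chronological order
            }
        } finally {
            cursor.close();
        }
        return result;
    }
}
With something like this, onCreate would call loadRecentMessages(50) and feed the rows to the adapter before registering the receiver, and addNewMessage would also call insertMessage so new messages survive the next time the activity is recreated.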
Related
QuickBlox is not delivering GCM notifications to subscribed devices. I tried sending a notification message from the Admin Panel too, but it isn't delivered to the device, yet the Admin Panel shows it as "sent". And no history is available.
What could be the reason for this, and how can I fix it?
Also, how can I view the GCM notifications that were sent?
The admin panel is not good enough - it sometimes lies by saying 'sent successfully' when the notification never reaches the other end.
Try sending the GCM notification using the code snippet provided in the documentation. That always works.
public void sendMessageOnClick() {
// Send Push: create QuickBlox Push Notification Event
QBEvent qbEvent = new QBEvent();
qbEvent.setNotificationType(QBNotificationType.PUSH);
qbEvent.setEnvironment(QBEnvironment.DEVELOPMENT);
// generic push - will be delivered to all platforms (Android, iOS, WP, Blackberry..)
qbEvent.setMessage("how are you doing");
StringifyArrayList<Integer> userIds = new StringifyArrayList<Integer>();
userIds.add(6132691);
qbEvent.setUserIds(userIds);
QBMessages.createEvent(qbEvent, new QBEntityCallbackImpl<QBEvent>() {
@Override
public void onSuccess(QBEvent qbEvent, Bundle bundle) {
}
@Override
public void onError(List<String> strings) {
// errors
}
});
}
// Or the below:
private void sendPushNotifications(){
// recipients
StringifyArrayList<Integer> userIds = new StringifyArrayList<Integer>();
userIds.add(6114793);
//userIds.add(960);
QBEvent event = new QBEvent();
event.setUserIds(userIds);
event.setEnvironment(QBEnvironment.DEVELOPMENT);
event.setNotificationType(QBNotificationType.PUSH);
event.setPushType(QBPushType.GCM);
HashMap<String, String> data = new HashMap<String, String>();
data.put("data.message", "Hello");
data.put("data.type", "welcome message");
event.setMessage(data);
QBMessages.createEvent(event, new QBEntityCallbackImpl<QBEvent>() {
@Override
public void onSuccess(QBEvent qbEvent, Bundle args) {
// sent
}
@Override
public void onError(List<String> errors) {
}
});
}
I've searched the web and the Parse docs and asked many people, but no one can show me how to do it.
I have an RSS app that loads articles into a UITableView.
When I send a push it opens the app itself, but not the article I want it to (well, obviously, since I don't know how to code that).
Can anyone please give me ideas on how to do it?
(A code sample would be useful as well.)
First of all you have to implement your own receiver class instead of the default Parse push receiver, and declare it in AndroidManifest.xml as follows:
<receiver android:name="net.blabla.notification.PushNotifHandler" android:exported="false">
<intent-filter>
<action android:name="net.bla.PUSH_MESSAGE" />
</intent-filter>
</receiver>
In your PushNotifHandler.java class you put the parameters onto the Intent you are going to fire, as follows:
public class PushNotifHandler extends BroadcastReceiver{
private static final String TAG = PushNotifHandler.class.getSimpleName();
private static int nextNotifID = (int)(System.currentTimeMillis()/1000);
private static final long VIBRATION_DURATION = 500;
@Override
public void onReceive(Context context, Intent intent) {
try {
String action = intent.getAction();
Intent resultIntent = new Intent(context, ToBeOpenedActivity.class);
JSONObject jsonData = new JSONObject(intent.getExtras().getString("com.parse.Data"));
fillNotificationData(jsonData, action, resultIntent);
String title = jsonData.getString("messageTitle") + "";
String message = jsonData.getString("messageText") + "";
TaskStackBuilder stackBuilder = TaskStackBuilder.from(context);
stackBuilder.addParentStack(ToBeOpenedActivity.class);
stackBuilder.addNextIntent(resultIntent);
PendingIntent resultPendingIntent =
stackBuilder.getPendingIntent(
0,
PendingIntent.FLAG_UPDATE_CURRENT
);
Notification notification;
NotificationCompat.Builder builder = new NotificationCompat.Builder(context).
setSmallIcon(R.drawable.icon).
setContentTitle(title).
setContentText(message).
setAutoCancel(true);
builder.setContentIntent(resultPendingIntent);
notification = builder.getNotification();
NotificationManager mNotificationManager = (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
mNotificationManager.notify(TAG, nextNotifID++, notification);
vibratePhone(context);
} catch (Exception e) {
Log.d(TAG, "Exception: " + e.getMessage());
}
}
private void vibratePhone(Context context) {
Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
vibrator.vibrate(VIBRATION_DURATION);
}
private void fillNotificationData(JSONObject json, String action, Intent resultIntent)
throws JSONException {
Log.d(TAG, "ACTION : " + action);
resultIntent.putExtra("paramString", json.getString("paramFromServer"));
}
}
With key "com.parse.Data" you will get parameters sent from your server code as json format.After getting paramString and paramBoolean parameters from this json, you will put these parameters into new Intent you hav created as seen in fillNotificationData method.
Other parts of onReceive method creates a local notification and vibrates the device.
Finally on your activity class onResume() method you will check intent parameters to realize if you are opening the app from push notification or not.
@Override
public void onResume() {
super.onResume();
checkPushNotificationCase(getIntent());
}
private void checkPushNotificationCase(Intent intent) {
Bundle extraParameters = intent.getExtras();
Log.d("checking push notifications intent extras : " + extraParameters);
if (extraParameters != null) {
if(extraParameters.containsKey("paramString")) {
// doSomething
}
}
}
I hope you have asked this question for Android :))
I am new to UCMA and I am learning as I go through examples. I am trying to build two Lync clients, A and B, with the following scenario:
A calls B
B answers
A plays audio
B records it using Recorder.
I am stuck trying to record the call at B. For B it is an incoming call. I need to attach the AudioVideoFlow to the recorder, but I am not sure how to do it. I would appreciate any help.
Apologies for the unformatted code; I am not sure how to format it properly, though I tried.
Thanks.
Kris
Client B Code:
Accepts an incoming call
Records the media received in the incoming call. *** This is the part I am having trouble with
using System;
using System.Threading;
using Microsoft.Rtc.Collaboration;
using Microsoft.Rtc.Collaboration.AudioVideo;
using Microsoft.Rtc.Signaling;
using Microsoft.Rtc.Collaboration.Lync;
namespace Microsoft.Rtc.Collaboration.LyncUAS
{
public class LyncUAS
{
#region Locals
private LyncUASConfigurationHelper _helper;
private UserEndpoint _userEndpoint;
private AudioVideoCall _audioVideoCall;
private AudioVideoFlow _audioVideoFlow;
private Conversation _incomingConversation;
//Wait handles are only present to keep things synchronous and easy to read.
private AutoResetEvent _autoResetEvent = new AutoResetEvent(false);
private EventHandler<AudioVideoFlowConfigurationRequestedEventArgs> _audioVideoFlowConfigurationRequestedEventHandler;
private EventHandler<MediaFlowStateChangedEventArgs> _audioVideoFlowStateChangedEventHandler;
private AutoResetEvent _waitForAudioVideoCallEstablishCompleted = new AutoResetEvent(false);
private AutoResetEvent _waitForAudioVideoFlowStateChangedToActiveCompleted = new AutoResetEvent(false);
private AutoResetEvent _waitForPrepareSourceCompleted = new AutoResetEvent(false);
#endregion
#region Methods
/// <summary>
/// Instantiate and run the DeclineIncomingCall quickstart.
/// </summary>
/// <param name="args">unused</param>
public static void Main(string[] args)
{
LyncUAS lyncUAS = new LyncUAS();
lyncUAS.Run();
}
private void Run()
{
string filename = "received.wma";
_helper = new LyncUASConfigurationHelper();
// Create a user endpoint, using the network credential object
// defined above.
_userEndpoint = _helper.CreateEstablishedUserEndpoint("Lync UAS" /*endpointFriendlyName*/);
_userEndpoint.RegisterForIncomingCall<AudioVideoCall>(On_AudioVideoCall_Received);
Console.WriteLine("Waiting for incoming call...");
_autoResetEvent.WaitOne();
Console.WriteLine("came after call is connected");
//start recording for audio.
Recorder recorder = new Recorder();
recorder.StateChanged += new EventHandler<RecorderStateChangedEventArgs>(recorder_StateChanged);
recorder.VoiceActivityChanged += new EventHandler<VoiceActivityChangedEventArgs>(recorder_VoiceActivityChanged);
//**********This is the issue, currently _audioVideoFlow is null, it is not attached to the flow
//So this will fail, how to attach _audioVideoFlow to an incoming call ?? HELP !!!
// recorder.AttachFlow(_audioVideoFlow); ------------> HELP!
WmaFileSink sink = new WmaFileSink(filename);
recorder.SetSink(sink);
recorder.Start();
Console.WriteLine("Started Recording ...");
_autoResetEvent.WaitOne();
recorder.Stop();
Console.WriteLine("Stopped Recording ...");
recorder.DetachFlow();
Console.WriteLine("Exiting");
Thread.Sleep(2000);
}
private void audioVideoFlow_StateChanged(object sender, MediaFlowStateChangedEventArgs e)
{
Console.WriteLine("Flow state changed from " + e.PreviousState + " to " + e.State);
//When flow is active, media operations can begin
if (e.State == MediaFlowState.Active)
{
// Flow-related media operations normally begin here.
_waitForAudioVideoFlowStateChangedToActiveCompleted.Set();
}
// call sample event handler
if (_audioVideoFlowStateChangedEventHandler != null)
{
_audioVideoFlowStateChangedEventHandler(sender, e);
}
}
void recorder_VoiceActivityChanged(object sender, VoiceActivityChangedEventArgs e)
{
Console.WriteLine("Recorder detected " + (e.IsVoice ? "voice" : "silence") + " at " + e.TimeStamp);
}
void recorder_StateChanged(object sender, RecorderStateChangedEventArgs e)
{
Console.WriteLine("Recorder state changed from " + e.PreviousState + " to " + e.State);
}
void On_AudioVideoCall_Received(object sender, CallReceivedEventArgs<AudioVideoCall> e)
{
//Type checking was done by the platform; no risk of this being any
// type other than the type expected.
_audioVideoCall = e.Call;
// Call: StateChanged: Only hooked up for logging, to show the call
// state transitions.
_audioVideoCall.StateChanged += new
EventHandler<CallStateChangedEventArgs>(_audioVideoCall_StateChanged);
_incomingConversation = new Conversation(_userEndpoint);
Console.WriteLine("Call Received! From: " + e.RemoteParticipant.Uri + " Toast is: " +e.ToastMessage.Message);
_audioVideoCall.BeginAccept(
ar =>
{
try {
_audioVideoCall.EndAccept(ar);
Console.WriteLine("Call must be connected at this point. "+_audioVideoCall.State);
_autoResetEvent.Set();
} catch (RealTimeException ex) { Console.WriteLine(ex); }
}, null);
}
//Just to record the state transitions in the console.
void _audioVideoCall_StateChanged(object sender, CallStateChangedEventArgs e)
{
Console.WriteLine("Call has changed state. The previous call state was: " + e.PreviousState +
" and the current state is: " + e.State);
if (e.State == CallState.Terminated)
{
Console.WriteLine("Shutting down");
_autoResetEvent.Set();
_helper.ShutdownPlatform();
}
}
#endregion
}
}
I think I have figured out what's not quite right here.
Your Code
// Create a user endpoint, using the network credential object
// defined above.
_userEndpoint = _helper.CreateEstablishedUserEndpoint("Lync UAS" /*endpointFriendlyName*/);
_userEndpoint.RegisterForIncomingCall<AudioVideoCall>(On_AudioVideoCall_Received);
Console.WriteLine("Waiting for incoming call...");
_autoResetEvent.WaitOne();
Console.WriteLine("came after call is connected");
//start recording for audio.
Recorder recorder = new Recorder();
recorder.StateChanged += new EventHandler<RecorderStateChangedEventArgs>(recorder_StateChanged);
recorder.VoiceActivityChanged += new EventHandler<VoiceActivityChangedEventArgs>(recorder_VoiceActivityChanged);
//**********This is the issue, currently _audioVideoFlow is null, it is not attached to the flow //So this will fail, how to attach _audioVideoFlow to an incoming call ?? HELP !!!
// recorder.AttachFlow(_audioVideoFlow); ------------> HELP!
Looking good so far. I'm assuming you're establishing and such in your CreateEstablishedUserEndpoint method, but I'm not seeing where you're getting the value for _audioVideoFlow.
I'm guessing you might be doing it elsewhere, but on the off chance that's actually where you're running into problems, here's that bit:
Simplest pattern to get AVFlow
public static void RegisterForIncomingCall(LocalEndpoint localEndpoint)
{
localEndpoint.RegisterForIncomingCall
<AudioVideoCall>(IncomingCallDelegate);
}
private static void IncomingCallDelegate(object sender, CallReceivedEventArgs<AudioVideoCall> e)
{
e.Call.AudioVideoFlowConfigurationRequested += IncomingCallOnAudioVideoFlowConfigurationRequested;
}
private static void IncomingCallOnAudioVideoFlowConfigurationRequested(object sender, AudioVideoFlowConfigurationRequestedEventArgs e)
{
AudioVideoFlow audioVideoFlow = e.Flow; // <--- There's your flow, gentleman.
}
Now, instead of registering for your incoming call, just call RegisterForIncomingCall(_userEndpoint);.
Your AVFlow will be hanging off e.Flow above; you could then pass that into your recorder with recorder.AttachFlow(e.Flow), or simply assign the flow to a field in your class, wait on your AutoResetEvent, and do the rest of the setup where you're doing it now.
Obviously this is a pretty naive implementation. A lot can go wrong in those few lines of code (exception handling/static event handler memory leak comes immediately to mind); don't forget to wire up events related to status changes on the conversation/call and endpoints, as well as any of the recovery related items.
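Putting those pieces together, a rough sketch of the B side could look like the following; the field names and the RecordIncomingCall method are my own, the lambda StateChanged handler is just shorthand, and error handling is omitted:
// Capture the flow when the platform asks you to configure it,
// wait for it to become Active, then attach the recorder.
private AudioVideoFlow _incomingFlow;
private readonly AutoResetEvent _waitForFlowActive = new AutoResetEvent(false);
private void IncomingCallOnAudioVideoFlowConfigurationRequested(
    object sender, AudioVideoFlowConfigurationRequestedEventArgs e)
{
    _incomingFlow = e.Flow;
    _incomingFlow.StateChanged += (s, args) =>
    {
        // Media operations (such as starting the recorder) are only valid once the flow is Active.
        if (args.State == MediaFlowState.Active)
        {
            _waitForFlowActive.Set();
        }
    };
}
private void RecordIncomingCall(string filename)
{
    _waitForFlowActive.WaitOne();
    Recorder recorder = new Recorder();
    recorder.SetSink(new WmaFileSink(filename));
    recorder.AttachFlow(_incomingFlow);   // the flow captured above
    recorder.Start();
}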
While creating a GCM client application, AsyncTask is giving compilation errors.
In onCreate we call registerBackground, which checks whether a GCM instance is available and creates one if not.
But AsyncTask is giving the error: "AsyncTask cannot be resolved to a type".
private void registerBackground() {
new AsyncTask() {
protected String doInBackground(Void... params) {
String msg = "";
try {
if (gcm == null) {
gcm = GoogleCloudMessaging.getInstance(context);
}
regid = gcm.register(SENDER_ID);
msg = "Device registered, registration id=" + regid;
// You should send the registration ID to your server over HTTP,
// so it can use GCM/HTTP or CCS to send messages to your app.
// For this demo: we don't need to send it because the device
// will send upstream messages to a server that echo back the message
// using the 'from' address in the message.
// Save the regid - no need to register again.
setRegistrationId(context, regid);
} catch (IOException ex) {
msg = "Error :" + ex.getMessage();
}
return msg;
}
protected void onPostExecute(String msg) {
mDisplay.append(msg + "\n");
}
}.execute(null, null, null);
}
As already observed by AlexBcn, and according to the AsyncTask documentation, you need to pass three type parameters to AsyncTask. Because doInBackground returns its result message as a String, you would declare it as AsyncTask<Void, Void, String>.
So the correct code snippet for the GCM client is:
private void registerInBackground() {
new AsyncTask<Void, Void, String>() {
@Override
protected String doInBackground(Void... params) {
String msg = "";
try {
if (gcm == null) {
gcm = GoogleCloudMessaging.getInstance(context);
}
regid = gcm.register(SENDER_ID);
msg = "Device registered, registration ID=" + regid;
// You should send the registration ID to your server over HTTP, so it
// can use GCM/HTTP or CCS to send messages to your app.
// For this demo: we don't need to send it because the device will send
// upstream messages to a server that echo back the message using the
// 'from' address in the message.
// Persist the regID - no need to register again.
storeRegistrationId(context, regid);
} catch (IOException ex) {
msg = "Error :" + ex.getMessage();
// If there is an error, don't just keep trying to register.
// Require the user to click a button again, or perform
// exponential back-off.
}
return msg;
}
}.execute(null, null, null);
}
This is because of the type parameters you pass to AsyncTask.
For further help: I recently uploaded a fully functional GCM Java client to my GitHub account:
GCM Android Client
It has both the server and the client implementation.
I am using the solution from the ServiceStack Re-usability use case project.
To this solution I have added a new console app which contains the code below.
In the original Re-usability use-case project, when an EmailMessage is published it is handled by a subscriber that sends an email (i.e. SMessageService.Any(EmailMessage request)).
When I run the console app, meaning I now have two applications subscribing to EmailMessage, only the new console app receives the message.
I have the following:
My console app is:
class Program
{
static void Main(string[] args)
{
var subscriberHost = new SubscriberHost();
subscriberHost.Init();
Console.WriteLine("Waiting of publishing
to happen on EmailMessage as we are subscribing to it...");
Console.ReadLine();
}
}
public class SubscriberHost : AppHostHttpListenerBase
{
private RedisMqServer mqHost;
public SubscriberHost()
:base("Subscriber console",typeof(EmailMessageEventHandler).Assembly)
{
}
public override void Configure(Container container)
{
var redisFactory = new PooledRedisClientManager("localhost:6379");
mqHost = new RedisMqServer(redisFactory, retryCount:2);
mqHost.RegisterHandler<EmailMessage>((message) =>
{
var emailMessage = message.GetBody();
Console.WriteLine(emailMessage.To);
Console.WriteLine(emailMessage.Subject);
Console.WriteLine(emailMessage.Body);
return new SMessageReceipt {
Type = "not used",
To = "test",
From = "Reusability",
RefId = "1,"
};
});
// mqHost.RegisterHandler<EmailMessage>(ServiceController.ExecuteMessage);
mqHost.Start();
}
}
I was expecting both subscribers to receive the EmailMessage but only the new console app is receiving it. Why isn't the other subscriber receiving the message?
The client code that does the publishing has not been modified.
What I have shown above uses Redis MQ; for the multiple-subscriber scenario I was testing, I need Redis Pub/Sub.
With MQ, one subscriber takes the message off the queue to process it. Once it is processed, that is it.
With Pub/Sub, there can be many subscribers, and each receives its own copy of the message.
I hope this helps others.
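For reference, the Pub/Sub side with ServiceStack.Redis looks roughly like the sketch below; the channel name "EmailMessage" is arbitrary, and the payload is passed as a plain string rather than a typed DTO:
using System;
using ServiceStack.Redis;
class RedisPubSubSketch
{
    // Each process that runs Subscribe() gets its own copy of every published message.
    static void Subscribe()
    {
        using (var redis = new RedisClient("localhost", 6379))
        using (var subscription = redis.CreateSubscription())
        {
            subscription.OnMessage = (channel, msg) =>
                Console.WriteLine("Received on {0}: {1}", channel, msg);
            // Blocks the calling thread and delivers messages as they are published.
            subscription.SubscribeToChannels("EmailMessage");
        }
    }
    static void Publish(string payload)
    {
        using (var redis = new RedisClient("localhost", 6379))
        {
            // Unlike an MQ queue, this fans the payload out to all current subscribers.
            redis.PublishMessage("EmailMessage", payload);
        }
    }
}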