Receive all connected players in a Google Play Services real-time multiplayer game

Is there a way to get a list of the IDs of all the players that are connected and near the user playing my real-time multiplayer game?
I already tried the Games.Players.loadConnectedPlayers() method, but I don't know how to use its PendingResult.
I want to use public RoomConfig.Builder addPlayersToInvite(ArrayList<String> playerIds)
so that the user plays only with specific players.

You can use result callback as follows:
Games.Players.loadConnectedPlayers(mHelper.getApiClient(), false).setResultCallback(
        new ResultCallback<Players.LoadPlayersResult>() {
            @Override
            public void onResult(Players.LoadPlayersResult result) {
                // result.getPlayers() holds the buffer of connected players
            }
        });
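Inside onResult you can then pull the player IDs out of the returned buffer and hand them to addPlayersToInvite. A rough sketch of that (mRoomUpdateListener and the rest of the room setup are assumptions, not part of the original answer):
@Override
public void onResult(Players.LoadPlayersResult result) {
    ArrayList<String> playerIds = new ArrayList<String>();
    PlayerBuffer players = result.getPlayers();
    for (Player p : players) {
        playerIds.add(p.getPlayerId());
    }
    players.release(); // release the buffer once you have copied the IDs

    // Build a room that invites only these specific players
    RoomConfig.Builder builder = RoomConfig.builder(mRoomUpdateListener); // assumed RoomUpdateListener
    builder.addPlayersToInvite(playerIds);
    Games.RealTimeMultiplayer.create(mHelper.getApiClient(), builder.build());
}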


Xamarin.Forms Communication Between Two Pages Within Same App on Different Devices

Technologies, frameworks and devices I'm using:
Framework: Xamarin.Forms
IDE: Visual Studio 2022
Physical Device (smartphone): Zebra TC26 (Android 10)
Physical Device (smartwatch): Samsung Galaxy Watch4 (Android 11)
Problem definition
Currently I have a test Xamarin.Forms project that consists of two different UIs (XAML files):
User Interface 1: HomePage.XAML - This screen should be displayed on the smartphone
User Interface 2: WatchScreen.XAML - This screen should be displayed on the smartwatch
With the code below I make sure HomePage.xaml is shown on the smartphone and WatchScreen.xaml on the smartwatch:
Page homePage = new NavigationPage(new HomePage());

// BuildVersionCodes.R is a reference to Android version 11 (mostly now used by Wear OS 3.x)
if (Build.VERSION.SdkInt == BuildVersionCodes.R)
{
    // SM-R870 is a reference to the Samsung Galaxy Watch4
    // Note: This is needed to ensure the UI is specific to the UI of a smartwatch
    if (Build.Model == "SM-R870")
    {
        Page watchScreen = new NavigationPage(new WatchScreen());
        MainPage = watchScreen;
    }
}
else
{
    MainPage = homePage;
}
Now I want to make these pages, running on different devices, communicate with each other. Both HomePage.xaml and WatchScreen.xaml live in the shared Xamarin.Forms project.
I want them to communicate by sending a message of some kind. A Xamarin.Forms solution also contains a native project, and in the native Xamarin.Android project's MainActivity.cs I try to retrieve the button that exists in the shared project (the button lives in WatchScreen.xaml, and WatchScreen.xaml.cs has a method that returns it).
Method in WatchScreen.xaml.cs that gives button back:
public Button GetSendButtonFromWearableUI() => btnSendMessage;
In MainActivity.cs I get this method by using:
Button button = (App.Current.MainPage.Navigation.NavigationStack.LastOrDefault() as WatchScreen)
.GetSendButtonFromWearableUI();
Whenever I click on the button by doing this:
button.Clicked += delegate
{
    SendData();
};
Some data should be sent from MainActivity.cs, caught by HomePage.xaml and displayed on it. I tried several approaches, but I didn't succeed in achieving what needs to happen, so I'm wondering if you could help me out with this; it would be much appreciated.
In the meantime I've been investigating this issue and came up with a solution. Follow the steps below to get the same result. To make this solution work I've combined Google's Wearable Data Layer API with Microsoft's MessagingCenter.
Also, the example below only shows communication from the smartwatch to the smartphone. To reverse the process, put the send button on the HomePage instead of the smartwatch screen and make sure to subscribe to the correct messages.
One last note: keep in mind that the Google code used below is deprecated, but it still works...
References used to make this work:
Syncing Data Between Wearable and Handheld Devices Using Xamarin in Android
Installed dependencies on the Xamarin.Android project within Xamarin.Forms project:
Xamarin.Android.Support.v4
Xamarin.GooglePlayServices.Base
Xamarin.GooglePlayServices.Wearable
MessageKeys.cs
This class is used to declare message keys that are being used to send and receive messages between devices.
public class MessageKeys
{
    public const string Smartwatch = "Smartwatch";
    public const string Smartphone = "Smartphone";
}
Xamarin.Forms (Base project) - App.xaml.cs
In the App.xaml.cs, as pointed out earlier, I'm making sure the wearable UI displays WatchScreen.xaml and any other devices display regular Android UI -> HomePage.xaml.
Xamarin.Forms (Base project) - WatchScreen.xaml.cs
Send message from Wearable device to Android smartphone.
private void btnSendMessage_Clicked(object sender, EventArgs e)
{
    MessagingCenter.Send(Xamarin.Forms.Application.Current, MessageKeys.Smartwatch);
}
Xamarin.Forms (Base project) - HomePage.xaml.cs
public HomePage()
{
    InitializeComponent();

    MessagingCenter.Subscribe<Xamarin.Forms.Application>(Xamarin.Forms.Application.Current, MessageKeys.Smartphone, (sender) =>
    {
        DisplayAlert("Message", "Wearable message received!", "OK");
    });
}
Xamarin.Forms (Native Android Project) - MainActivity.cs
Within MainActivity.cs I implement the following interfaces:
public class MainActivity : WearableActivity, DataClient.IOnDataChangedListener,
GoogleApiClient.IConnectionCallbacks, GoogleApiClient.IOnConnectionFailedListener
Variables:
private GoogleApiClient client;
const string syncPath = "/[project name]/[subdirectory for watch]";
Internal class 'MessageReceiver' for receiving broadcast messages:
[BroadcastReceiver]
public class MessageReceiver : BroadcastReceiver
{
    MainActivity main;

    public MessageReceiver() { }

    public MessageReceiver(MainActivity owner) { this.main = owner; }

    public override void OnReceive(Context context, Intent intent)
    {
        main.ProcessMessage(intent);
    }
}
Registering the receiver (to receive through the Wearable Data Layer API), creating the Google API client, and subscribing to the smartwatch message (to retrieve messages through MessagingCenter):
protected override void OnCreate(Bundle bundle)
{
    IntentFilter filter = new IntentFilter(Intent.ActionSend);
    MessageReceiver receiver = new MessageReceiver(this);
    LocalBroadcastManager.GetInstance(this).RegisterReceiver(receiver, filter);

    client = new GoogleApiClient.Builder(this, this, this)
        .AddApi(WearableClass.Api)
        .Build();

    MessagingCenter.Subscribe<Xamarin.Forms.Application>(Xamarin.Forms.Application.Current, MessageKeys.Smartwatch, (sender) =>
    {
        SendData();
    });
}
ProcessMessage method: sends received message from wearable to smartphone
public void ProcessMessage(Intent intent)
{
    // For now I'm not sending the payload...
    string message = intent.GetStringExtra("WearMessage");
    MessagingCenter.Send(Xamarin.Forms.Application.Current, MessageKeys.Smartphone);
}
SendData(), OnStart(), OnStop(), OnDataChanged() (I didn't do anything with this part, because it is for receiving messages from outside the project and I don't need it for now), OnConnected(), OnConnectionSuspended(), OnConnectionFailed():
See the reference for the code that was used, since it is exactly the same. P.S.: one thing in SendData has been changed: if you want to keep sending data, remove 'client.Disconnect()' from the finally block after the try and catch.
Xamarin.Forms (Native Android Project) - WearableService inherits from WearableListenerService:
WearableService is a new class created within the native project. Also for this part, see the reference, because the exact same code is used within my project.
To give an overall overview of what's happening, I visualized the flow in a diagram (the example shows how communication works from smartwatch to smartphone).
If you want to communicate from smartphone to smartwatch, mirror the same setup in the other direction, as noted above.
That's it guys. Now you will receive messages within the same application using the Wearable Data Layer API and MessagingCenter. Instead of having separate projects, we just use separate UIs to make this happen...

Agora many to one live streaming

I have a requirement in which different users will stream video from their cameras to a server, and there will be a dashboard on which an admin can view all the streams in real time, something like how surveillance works. I think video broadcasting can help, but the documentation says it enables live streaming one-to-many and many-to-many; there is no mention of the many-to-one case. How can I achieve this?
The use-case you have described would be implemented the same way as a many-to-many broadcast.
For your use-case you would have all of the camera streams join the channel as broadcasters, and the "surveillance" user would join as an audience member. The audience member subscribes to all the remote streams without having to broadcast a stream of their own.
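As a rough illustration of that setup (the engine initialization, token and channel name here are assumptions, not from the original answer), each camera device joins the channel as a broadcaster and the dashboard joins as audience:
// On each camera device (publishes its stream)
rtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
rtcEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
rtcEngine.joinChannel(token, "surveillance-channel", null, 0); // channel name is an example

// On the admin dashboard (views all remote streams, publishes nothing)
rtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
rtcEngine.setClientRole(Constants.CLIENT_ROLE_AUDIENCE);
rtcEngine.joinChannel(token, "surveillance-channel", null, 0);
The dashboard then renders each remote user it learns about through callbacks such as onUserJoined / onFirstRemoteVideoDecoded, which gives you the many-to-one view.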
[Update]
With Agora's SDK you can use an external video source, you would just have to manage it yourself. If you are using a custom video source then you don't need to use RTMP.
IVideoFrameConsumer mConsumer;
boolean mHasStarted;

// Create an IVideoSource instance (io.agora.rtc.mediaio.IVideoSource).
IVideoSource source = new IVideoSource() {
    @Override
    public int getBufferType() {
        // Get the current frame type.
        // The SDK uses different methods to process different frame types.
        // If you want to switch to another buffer type, create another instance.
        // There are three video frame types: BYTE_BUFFER(1); BYTE_ARRAY(2); TEXTURE(3)
        return MediaIO.BufferType.BYTE_ARRAY.intValue();
    }

    @Override
    public boolean onInitialize(IVideoFrameConsumer consumer) {
        // The consumer is created by the SDK.
        // Save it for the lifecycle of the video source.
        mConsumer = consumer;
        return true;
    }

    @Override
    public boolean onStart() {
        mHasStarted = true;
        return true;
    }

    @Override
    public void onStop() {
        mHasStarted = false;
    }

    @Override
    public void onDispose() {
        // Release the consumer.
        mConsumer = null;
    }
};

// Set the custom video source as the input video stream.
rtcEngine.setVideoSource(source);

// After receiving the video frame data, use the consumer to send the data.
// Choose different methods according to the frame type.
// For example, here the frame type is byte array, i.e. NV21.
if (mHasStarted && mConsumer != null) {
    mConsumer.consumeByteArrayFrame(data, AgoraVideoFrame.FORMAT_NV21, width, height, rotation, timestamp);
}
full guide: https://docs.agora.io/en/Video/custom_video_android?platform=Android

ACTIVATE/DEACTIVATE events don't work for a mobile game app using starling framework

So I'm developing a mobile game using the Starling framework, and I want the game to pause when I hit the home/back button on my phone, and of course to resume when going back to the game. I did some research and tried the following:
this.addEventListener(FlashEvent.DEACTIVATE, stopGame);
this.addEventListener(FlashEvent.ACTIVATE, continueGame);

private function continueGame(event:FlashEvent):void
{
    ...
}

private function stopGame(event:FlashEvent):void
{
    ...
}
I had to add a new class called FlashEvent that extends flash.events.Event, because I use the Starling Event and the Flash Event in the same class, and when I use flash.events.Event directly I get this error:
Error: Access of undefined property flash
And the same thing happens for starling.events.Event.
So I used the code above and tried it out on my phone, but when I hit back/home, the game keeps going and the music keeps playing.
My question is: what is the correct way to handle the activate/deactivate events in an AIR mobile app?
Use this in your main startup class.
(Note that in this example 'app:Main' is the class on which I call the Starling start method.)
Note that you should fully qualify the event classes:
starling.events.Event.XXX
flash.events.Event.XXX
_mStarling.addEventListener(starling.events.Event.ROOT_CREATED,
    function onRootCreated(event:Object, app:Main):void
    {
        _mStarling.removeEventListener(starling.events.Event.ROOT_CREATED, onRootCreated);
        app.start(assets);
        _mStarling.start();

        NativeApplication.nativeApplication.addEventListener(
            flash.events.Event.ACTIVATE, function (e:*):void {
                _mStarling.start();
                try {
                    // optionally call some other methods
                } catch(e:Error) {
                }
            });

        NativeApplication.nativeApplication.addEventListener(
            flash.events.Event.DEACTIVATE, function (e:*):void {
                try {
                    // optionally call some other methods before stopping
                } catch(e:Error) {
                }
                _mStarling.stop();
            });
    });
My code
public function Main()
{
    stage.addEventListener(flash.events.Event.DEACTIVATE, onDeactivate);
    stage.addEventListener(flash.events.Event.ACTIVATE, onActivate);
}
And you set a paused boolean and check for it at the top of your game loop:
if ( paused ) return;
If you have animations, use a juggler; if you aren't calling advanceTime on the juggler, it is paused.

Google Play Services - Sending A Rematch Request

After a multiplayer game is over, is there any possibility of sending a rematch request to the same participants?
Note that this is not provided by the Google API, I am wondering if people have any ideas on implementing such a system.
Thanks,
Rajat
If you just want a solution for inviting all the same people from the previous game, this works for the player that initiated the game:
When you get back the Intent from selecting the players, just save it:
// Handle the result of the "Select players UI" we launched when the user clicked the
// "Invite friends" button. We react by creating a room with those players.
private void handleSelectPlayersResult(int response, Intent data) {
    if (response != Activity.RESULT_OK) {
        Log.w(TAG, "*** select players UI cancelled, " + response);
        Gdx.app.postRunnable(new Runnable() {
            @Override
            public void run() {
                mGHInterface.onBackedOut();
            }
        });
        return;
    }

    Log.d(TAG, "Select players UI succeeded.");
    previousMatch = data;
    // get the invitee list ...(etc, etc)
Then, whatever way you want to activate the next bit of code works:
@Override
public void sendOutRematch() {
    handleSelectPlayersResult(Activity.RESULT_OK, previousMatch);
}
Now, for the players that were invited to the room, you need to look at the Room object on their side, get the invited player IDs from that object, and save that list of players so it can be used in the room creation process, with something like this to grab the IDs:
ArrayList<String> playerIDs = new ArrayList<String>();
for (Participant p : room.getParticipants()) {
    playerIDs.add(p.getPlayer().getPlayerId());
}
Then use that list in your room creation process.
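For illustration, the rematch room could then be created from that saved list roughly like this (getApiClient() and mRoomUpdateListener are assumptions taken from a typical games-helper setup, not from the original answer):
// Optionally skip your own ID so you don't invite yourself,
// since room.getParticipants() also includes the current player
String myId = Games.Players.getCurrentPlayerId(getApiClient());
ArrayList<String> invitees = new ArrayList<String>();
for (String id : playerIDs) {
    if (!id.equals(myId)) {
        invitees.add(id);
    }
}

RoomConfig.Builder builder = RoomConfig.builder(mRoomUpdateListener); // assumed RoomUpdateListener
builder.addPlayersToInvite(invitees);
Games.RealTimeMultiplayer.create(getApiClient(), builder.build());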

How to create AsyncCallback call for Presenter Widget in GWTP?

OK, we often see people use AsyncCallback for the client to call methods on the server. That's easy, and even easier if we use the GWTP platform.
My question is: how do we create an AsyncCallback for a presenter widget in GWTP? There is no server involved.
For example, I want to create a ConfirmationPresenter which has 2 buttons (OK & Cancel). When the user clicks OK, the system should go to the onSuccess method of the AsyncCallback.
private AsyncCallback<ConfirmResult> confirmCallback = new AsyncCallback<ConfirmResult>() {
    @Override
    public void onFailure(Throwable caught) {
        // TODO Auto-generated method stub
    }

    @Override
    public void onSuccess(ConfirmResult result) {
        // do something here
    }
};
To call the above we can do this:
Confirmation action = new Confirmation();
String msg = "pls click ok to confirm";
action.set(msg);
dispatchAsync.execute(action, confirmCallback);
I just know the basic structure of an AsyncCallback, but I don't know how to create one for this case. I can only create it with Eclipse, but that generates it for a server call.
If you can provide a very simple example based on the GWTP platform, that would be great. Other examples on the internet were not based on GWTP and were too complicated.
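For what it's worth, AsyncCallback is just an interface, so a presenter widget can hold on to the callback it is given and invoke onSuccess/onFailure itself when the OK or Cancel button is clicked; no server or dispatchAsync is needed. A rough sketch of that idea (the field and method names here are hypothetical, not GWTP API):
public class ConfirmationPresenter {

    private AsyncCallback<ConfirmResult> pendingCallback; // hypothetical field

    // Callers hand the callback to the presenter instead of going through dispatchAsync
    public void confirm(String message, AsyncCallback<ConfirmResult> callback) {
        this.pendingCallback = callback;
        // ... show the message and the OK/Cancel buttons in the view ...
    }

    // Wire this to the OK button's ClickHandler in the view
    void onOkClicked() {
        if (pendingCallback != null) {
            pendingCallback.onSuccess(new ConfirmResult()); // ConfirmResult is the type from the question
        }
    }

    // Wire this to the Cancel button's ClickHandler in the view
    void onCancelClicked() {
        if (pendingCallback != null) {
            pendingCallback.onFailure(new Throwable("cancelled"));
        }
    }
}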