Agora many-to-one live streaming - agora.io

I have a requirement in which different users will stream video from their cameras to a server, and there will be a dashboard on which an admin can view all the streams in real time, something like how surveillance works. I think video broadcasting can help, but the documentation says it enables live streaming one-to-many and many-to-many; there is no mention of the many-to-one case. How can I achieve this?

The use case you have described would be implemented the same way as a many-to-many broadcast.
For your use case you would have all of the camera streams join the channel as broadcasters, and the "surveillance" user would join as audience. The audience member subscribes to all the remote streams without having to broadcast a stream of their own.
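As a minimal sketch (assuming the Agora Android SDK v3.x; the appId, context, handler, and token below are placeholders from your own Agora setup), the only difference between a camera device and the dashboard is the client role:
import io.agora.rtc.Constants;
import io.agora.rtc.RtcEngine;

// Minimal sketch; appId, context, handler, and token are placeholders.
RtcEngine rtcEngine = RtcEngine.create(context, appId, handler);
rtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);

// On each camera device: publish the local stream.
rtcEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);

// On the admin dashboard: subscribe only, publish nothing.
// rtcEngine.setClientRole(Constants.CLIENT_ROLE_AUDIENCE);

rtcEngine.enableVideo();
rtcEngine.joinChannel(token, "surveillance-channel", null, 0);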
[Update]
With Agora's SDK you can use an external video source; you just have to manage it yourself. If you are using a custom video source, you don't need to use RTMP.
IVideoFrameConsumer mConsumer;
boolean mHasStarted;

// Create a VideoSource instance.
VideoSource source = new VideoSource() {
    @Override
    public int getBufferType() {
        // Return the frame type this source produces.
        // The SDK uses different methods to process different frame types.
        // If you want to switch to another buffer type, create another instance.
        // There are three video frame types: BYTE_BUFFER(1), BYTE_ARRAY(2), TEXTURE(3).
        return MediaIO.BufferType.BYTE_ARRAY.intValue();
    }

    @Override
    public boolean onInitialize(IVideoFrameConsumer consumer) {
        // The consumer is created by the SDK.
        // Save it for the lifecycle of the VideoSource.
        mConsumer = consumer;
        return true;
    }

    @Override
    public boolean onStart() {
        mHasStarted = true;
        return true;
    }

    @Override
    public void onStop() {
        mHasStarted = false;
    }

    @Override
    public void onDispose() {
        // Release the consumer.
        mConsumer = null;
    }
};

// Switch the input video stream to the VideoSource instance.
rtcEngine.setVideoSource(source);

// After receiving the video frame data, use the consumer to send the data.
// Choose different methods according to the frame type.
// For example, if the current frame type is a byte array, i.e. NV21:
if (mHasStarted && mConsumer != null) {
    mConsumer.consumeByteArrayFrame(data, AgoraVideoFrame.FORMAT_NV21, width, height, rotation, timestamp);
}
Full guide: https://docs.agora.io/en/Video/custom_video_android?platform=Android

Related

Can the Flutter native side use an EventChannel to transfer MAP data?

I am using the Flutter platform channel "EventChannel" on the Windows platform.
I know this is not mentioned in the official platform channel documentation, but I found some examples that work on Windows, and I have tested them.
Currently I can transfer a single value with the code below:
#include <flutter/event_channel.h>
#include <flutter/standard_method_codec.h>
#include <thread>

// Forward declarations for the handlers used below.
std::unique_ptr<flutter::StreamHandlerError<flutter::EncodableValue>> on_listen(
    const flutter::EncodableValue* arguments,
    std::unique_ptr<flutter::EventSink<flutter::EncodableValue>>&& events);
std::unique_ptr<flutter::StreamHandlerError<flutter::EncodableValue>> on_cancel(
    const flutter::EncodableValue* arguments);
void sentBrowsingEvent(std::unique_ptr<flutter::EventSink<flutter::EncodableValue>>&& events);

void initEventChannel(flutter::FlutterEngine* flutter_instance) {
    const static std::string event_channel_name("getFromWinsBrowsingDevice");
    const flutter::StandardMethodCodec& codec = flutter::StandardMethodCodec::GetInstance();
    flutter::EventChannel<flutter::EncodableValue> event_channel(
        flutter_instance->messenger(), event_channel_name, &codec);
    event_channel.SetStreamHandler(
        std::make_unique<flutter::StreamHandlerFunctions<flutter::EncodableValue>>(on_listen, on_cancel));
}

std::unique_ptr<flutter::StreamHandlerError<flutter::EncodableValue>> on_listen(
    const flutter::EncodableValue* arguments,
    std::unique_ptr<flutter::EventSink<flutter::EncodableValue>>&& events) {
    std::thread BrowsingThread(sentBrowsingEvent, std::move(events));
    BrowsingThread.detach();
    return nullptr;
}

std::unique_ptr<flutter::StreamHandlerError<flutter::EncodableValue>> on_cancel(
    const flutter::EncodableValue* arguments) {
    return nullptr;
}

void sentBrowsingEvent(std::unique_ptr<flutter::EventSink<flutter::EncodableValue>>&& events) {
    // Browsing_Check_routine() fills BrowsingDeviceMap to feed the event back to the UI.
    while (1) {
        Browsing_Check_routine();
        events->Success(flutter::EncodableValue(BrowsingDeviceMap)); // This fails when sending the whole MAP
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}
I would like some help: how should I fix this error so that I can pass the MAP data to the Flutter side?
events->Success(flutter::EncodableValue(BrowsingDeviceMap)); // This fails when sending the whole MAP
Thank you!
Edit:
I realized my map is a plain C++ std::map; to pass a map through EncodableValue, the variable must be declared as an EncodableMap.
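To make that concrete, here is a minimal sketch (the keys and values are made-up examples) of building a flutter::EncodableMap, which is an alias for std::map<flutter::EncodableValue, flutter::EncodableValue>, and sending it through the sink:
#include <flutter/encodable_value.h>

// Build an EncodableMap (keys/values here are made-up examples) and send it.
flutter::EncodableMap device_map;
device_map[flutter::EncodableValue("name")] = flutter::EncodableValue("Device-1");
device_map[flutter::EncodableValue("id")] = flutter::EncodableValue(42);
device_map[flutter::EncodableValue("online")] = flutter::EncodableValue(true);

events->Success(flutter::EncodableValue(device_map));
On the Dart side the stream then delivers this as a Map you can cast (e.g. Map<Object?, Object?>).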

Spring Cloud Stream deserialization error handling for Batch processing

I have a question about handling deserialization exceptions in Spring Cloud Stream while processing batches (i.e. batch-mode: true).
Per the documentation here, https://docs.spring.io/spring-kafka/docs/2.5.12.RELEASE/reference/html/#error-handling-deserializer (looking at the implementation of FailedFooProvider), it looks like this function should return a subclass of the original message.
Is the intent here that a list of both Foos and BadFoos will end up at the original @StreamListener method, and that it will then be up to the code (i.e. me) to sort them out and handle them separately? I suspect this is the case, as I've read that the automated DLQ sending isn't desirable for batch error handling, since it would resubmit the whole batch.
And if this is the case, what if there is more than one message type received by the app via different @StreamListeners, say Foos and Bars? What type should the value function return in that case? Below is pseudo code to illustrate the second question.
@StreamListener
public void readFoos(List<Foo> foos) {
    List<BadFoo> badFoos = foos.stream()
            .filter(f -> f instanceof BadFoo)
            .map(f -> (BadFoo) f)
            .collect(Collectors.toList());
    // logic
}

@StreamListener
public void readBars(List<Bar> bars) {
    // logic
}

// Updated to return Object and let apply() determine the subclass
public class FailedFooProvider implements Function<FailedDeserializationInfo, Object> {
    @Override
    public Object apply(FailedDeserializationInfo info) {
        if (info.getTopic().equals("foo-topic")) {
            return new BadFoo(info);
        }
        else if (info.getTopic().equals("bar-topic")) {
            return new BadBar(info);
        }
        return null;
    }
}
Yes, the list will contain the function result for failed deserializations; the application needs to handle them.
The function needs to return the same type that would have been returned by a successful deserialization.
You can't use conditions with batch listeners. If the list has a mixture of Foos and Bars, they all go to the same listener.
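For reference, the wiring might look roughly like the following consumer properties. This is a sketch assuming the Kafka binder with spring-kafka's ErrorHandlingDeserializer; the binding name input, the JsonDeserializer delegate, and the com.example package are placeholders:
spring.cloud.stream.bindings.input.consumer.batch-mode=true
spring.cloud.stream.kafka.bindings.input.consumer.configuration.value.deserializer=org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
spring.cloud.stream.kafka.bindings.input.consumer.configuration.spring.deserializer.value.delegate.class=org.springframework.kafka.support.serializer.JsonDeserializer
spring.cloud.stream.kafka.bindings.input.consumer.configuration.spring.deserializer.value.function=com.example.FailedFooProvider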

What image format should be employed for real-time processing using Camera2 in Android?

I am developing an Android application which processes Camera2 preview frames and displays the processed frames on the texture. At first I tested with the camera1 API, and it works fine for real-time image processing.
private class CameraPreviewCallback implements Camera.PreviewCallback {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        processingRunnable.setNextFrame(data, camera);
    }
}
Then I changed my code to use the camera2 API. To get preview frames, I set the ImageFormat to YUV_420_888:
mImageReaderPreview = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(), ImageFormat.YUV_420_888, 3);
mImageReaderPreview.setOnImageAvailableListener(mOnPreviewAvailableListener, mBackgroundHandler);

private final ImageReader.OnImageAvailableListener mOnPreviewAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image mImage = reader.acquireLatestImage();
        if (mImage == null) {
            return;
        }
        processingRunnable.setNextFrame(convertYUV420888ToNV21(mImage));
        mImage.close();
    }
};
However, it's running slower than camera1, maybe because of the extra conversion from YUV_420_888 to NV21, since camera1 provides NV21 frames directly.
Conversion could be expensive, depending on how you implement it and on the layout of YUV_420_888 on a given device.
If the conversion is written in pure Java, it is probably going to be slow.
That said, if the device you're using is at the LEGACY hardware level, camera2 has to run in a legacy mode that can be slow for receiving YUV information. For those devices, staying on API1 may be preferable for your use case.
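For what it's worth, here is a minimal sketch of a stride-aware conversion in plain Java (the class and method names are made up; it relies on the YUV_420_888 guarantee that the U and V planes share the same strides). Even written carefully, the per-pixel loop over the chroma planes is exactly the kind of cost described above, so if it's too slow you could move it to native code or RenderScript:
import android.media.Image;
import java.nio.ByteBuffer;

final class YuvConverter {
    // Sketch: convert a YUV_420_888 Image to an NV21 byte array.
    // Assumes even width/height.
    static byte[] yuv420888ToNv21(Image image) {
        int width = image.getWidth();
        int height = image.getHeight();
        byte[] nv21 = new byte[width * height * 3 / 2];

        // Copy the Y plane row by row, skipping any row-stride padding.
        Image.Plane yPlane = image.getPlanes()[0];
        ByteBuffer yBuf = yPlane.getBuffer();
        int yRowStride = yPlane.getRowStride();
        int pos = 0;
        for (int row = 0; row < height; row++) {
            yBuf.position(row * yRowStride);
            yBuf.get(nv21, pos, width);
            pos += width;
        }

        // Interleave V and U (NV21 is Y followed by VUVU...).
        ByteBuffer uBuf = image.getPlanes()[1].getBuffer();
        ByteBuffer vBuf = image.getPlanes()[2].getBuffer();
        int cRowStride = image.getPlanes()[1].getRowStride();
        int cPixelStride = image.getPlanes()[1].getPixelStride();
        for (int row = 0; row < height / 2; row++) {
            for (int col = 0; col < width / 2; col++) {
                int idx = row * cRowStride + col * cPixelStride;
                nv21[pos++] = vBuf.get(idx);
                nv21[pos++] = uBuf.get(idx);
            }
        }
        return nv21;
    }
}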

How do I map a texture onto an Entity in a Minecraft Plugin

I'm trying to write a plugin to brand cattle and thought it would be pretty easy, but I'm stuck looking for the information that would help me do this.
Where can I find information that will help me map a texture (from a png, for example) onto an Entity? While there's information about built-in textures for Players etc., I haven't found a resource that would help me understand how I could get something to render on the side of an Entity.
I'm guessing that I'd use something like the following calls...
Minecraft.getMinecraft().renderEngine.bindTexture(new ResourceLocation("tc:textures/gui/my-icon.png"));
Minecraft.getMinecraft().ingameGUI.drawTexturedModalRect(etc);
I'm not certain how I'd work them into the rendering of a cow or a horse.
This isn't possible while using Bukkit, since Bukkit is server-side and can't change textures. There's one exception, though: Servers can send players resource packs. However, there does not appear to be a way to create a unique texture based off of any data, so you'd have to make all cows look the same. It wouldn't really do what you want. (Players are another exception, but protocol-wise, their skins are arbitrary anyways).
However, if you want to use Minecraft Forge, this is far more manageable. You can subclass the entity and change some of the rendering code. Perhaps you can also have an item of some sort (a branding iron, maybe) to convert existing cows into branded cows (they would still spawn as normal cows). I'm not too much of a Forge dev, but something like this should work (though I haven't tested it). This is more of an outline; things like converting the entity and creating an item I'll leave to you.
Here's a basic outline for an entity that tracks a texture between the server and the client:
import net.minecraft.entity.passive.EntityCow;
import net.minecraft.util.ResourceLocation;
import net.minecraft.world.World;

public class EntityBrandedCow extends EntityCow {
    public EntityBrandedCow(World world) {
        super(world);
    }

    @Override
    protected void entityInit() {
        super.entityInit();
        // The data watcher lets you track data between the server and client
        // without handling packets yourself.
        // http://wiki.vg/Entities
        this.dataWatcher.addObject(14, "minecraft:textures/entity/cow/cow.png");
    }

    public void setTexture(ResourceLocation texture) {
        this.dataWatcher.updateObject(14, texture.toString());
    }

    public ResourceLocation getTexture() {
        return new ResourceLocation(this.dataWatcher.getWatchableObjectString(14));
    }
}
You'll need to register a custom renderer for your new entity. This would go in the client proxy.
RenderingRegistry.registerEntityRenderingHandler(EntityBrandedCow.class, new RenderBrandedCow(Minecraft.getMinecraft().getRenderManager(), new ModelCow(), .7f));
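You'd also have to register the entity itself in common code (not the client proxy); something like the following should work on 1.8, though like the rest of this it's untested, and the name, id, and tracking parameters are example values:
// "branded_cow", the id 0, and the tracking values (64, 1, true) are examples;
// modInstance is your @Mod class instance.
EntityRegistry.registerModEntity(EntityBrandedCow.class, "branded_cow", 0, modInstance, 64, 1, true);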
And here's a renderer you can use:
import net.minecraft.client.model.ModelBase;
import net.minecraft.client.renderer.entity.RenderLiving;
import net.minecraft.client.renderer.entity.RenderManager;
import net.minecraft.entity.Entity;
import net.minecraft.util.ResourceLocation;

public class RenderBrandedCow extends RenderLiving {
    public RenderBrandedCow(RenderManager manager, ModelBase model, float shadowSize) {
        super(manager, model, shadowSize);
    }

    @Override
    protected ResourceLocation getEntityTexture(Entity entity) {
        return ((EntityBrandedCow) entity).getTexture();
    }
}
That renderer only changes the texture, and doesn't actually overlay anything. This, among other things, means that texture packs won't change branded cows without creating additional textures. An alternative would be to create a second layer. (This is based off of the way sheep wool works - see net.minecraft.client.renderer.entity.layers.LayerSheepWool and net.minecraft.client.renderer.RenderSheep). You can change the renderer to this:
import net.minecraft.client.model.ModelBase;
import net.minecraft.client.renderer.entity.RenderCow;
import net.minecraft.client.renderer.entity.RenderManager;

public class RenderBrandedCow extends RenderCow {
    public RenderBrandedCow(RenderManager manager, ModelBase model, float shadowSize) {
        super(manager, model, shadowSize);
        this.addLayer(new LayerCowBrand(this));
    }
}
And here's the start of some kind of layer rendering code. This won't work on its own; you'll need to write a ModelBrand (see ModelSheep1 for the basis of that).
public class LayerCowBrand implements LayerRenderer {
    private final RenderBrandedCow renderer;
    private final ModelBrand model = new ModelBrand();

    public LayerCowBrand(RenderBrandedCow renderer) {
        this.renderer = renderer;
    }

    public void doRenderLayer(EntityBrandedCow entity, float p_177162_2_, float p_177162_3_, float p_177162_4_, float p_177162_5_, float p_177162_6_, float p_177162_7_, float p_177162_8_) {
        // It's common to write a second method with the right parameter types...
        // I don't know offhand what the float parameters here are.
        this.renderer.bindTexture(entity.getTexture());
        this.model.setModelAttributes(this.renderer.getMainModel());
        this.model.setLivingAnimations(entity, p_177162_2_, p_177162_3_, p_177162_4_);
        this.model.render(entity, p_177162_2_, p_177162_3_, p_177162_5_, p_177162_6_, p_177162_7_, p_177162_8_);
    }

    public boolean shouldCombineTextures() {
        // I don't know
        return true;
    }

    public void doRenderLayer(EntityLivingBase p_177141_1_, float p_177141_2_, float p_177141_3_, float p_177141_4_, float p_177141_5_, float p_177141_6_, float p_177141_7_, float p_177141_8_) {
        // This is the actual render method that implements the interface.
        this.doRenderLayer((EntityBrandedCow) p_177141_1_, p_177141_2_, p_177141_3_, p_177141_4_, p_177141_5_, p_177141_6_, p_177141_7_, p_177141_8_);
    }
}
Hopefully this at least lets you get started. As I said, I'm not a Forge dev, but this should be the basics. If you want to ask more questions about Forge, post here on Stack Overflow using the minecraft-forge tag (Gaming Stack Exchange also has a minecraft-forge tag, but that's for mod usage, not development).

Camera application for all Android devices

I'm currently developing a camera application for Android, and some problems have occurred. I need it to work on all Android devices, and since these all work in different ways, especially with the camera hardware, I'm having a hard time finding a solution that works for every device.
My application's main goal is to launch the camera on a button click, take a photo, and upload it to a server. So I don't really need the functionality of saving the image on the device, but if that's needed for further image use I might as well allow it.
For example, I'm testing my application on a Samsung Galaxy SII and a Motorola pad. I have working code that launches the camera, which is, by the way, C# code since I'm using Monodroid:
Intent cameraIntent = new Intent(Android.Provider.MediaStore.ActionImageCapture);
StartActivityForResult(cameraIntent, PHOTO_CAPTURE);
And I fetch the result, similar to this guide I followed:
http://kevinpotgieter.wordpress.com/2011/03/30/null-intent-passed-back-on-samsung-galaxy-tab/
I followed this guide because the activity returns null on my Galaxy device (another device-specific problem).
This code works fine on the Galaxy device. It takes a photo and saves it in the gallery, from which I can upload to a server. From further research, this is apparently standard Galaxy behaviour, so it doesn't work on my Motorola pad: the camera works fine, but no image is saved to the gallery.
So with this background my question is: am I on the right path here? Do I need to save the image to the gallery for further use in my application? Is there any solution that works for every Android device? Because that's the solution I need.
Thanks for any feedback!
After reading the linked article, the approach taken there is geared toward the Galaxy line, since those devices appear to write to the gallery automatically.
This article discusses some other scenarios in detail:
Android ACTION_IMAGE_CAPTURE Intent
So I don't necessarily think that following the article you linked is the right path. Not all devices automatically write to the gallery as described there, as far as I know. The article I linked points out that the issues are related to security and suggests writing the image to an /sdcard/tmp folder for storing the original image. Going down a similar path will more than likely lead to code that works reliably across many devices.
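In Mono for Android, that file-based approach might look roughly like this sketch, where PHOTO_CAPTURE is the request code from your snippet and the tmp folder and file name are arbitrary choices:
// Rough sketch of the EXTRA_OUTPUT approach from the linked article.
var tmpDir = new Java.IO.File(Android.OS.Environment.ExternalStorageDirectory, "tmp");
if (!tmpDir.Exists())
{
    tmpDir.Mkdirs();
}
var imageFile = new Java.IO.File(tmpDir, "capture.jpg");

var cameraIntent = new Intent(Android.Provider.MediaStore.ActionImageCapture);
cameraIntent.PutExtra(Android.Provider.MediaStore.ExtraOutput, Android.Net.Uri.FromFile(imageFile));
StartActivityForResult(cameraIntent, PHOTO_CAPTURE);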
Here are some other links for reference:
Google discussion regarding this subject: http://code.google.com/p/android/issues/detail?id=1480
Project with a potential solution to the problem: https://github.com/johnyma22/classdroid
While that discussion/project is in Java/Android SDK, the same concepts should apply to Monodroid. I'd be happy to help you adapt the code to a working Mono for Android solution if you need help.
To long2know:
Yes, the same concepts apply to Monodroid. I've already read the Stack Overflow post you linked, along with some other similar ones. However, I don't like the approach in that particular post, since it checks for bugs on specific devices that are hardcoded into a collection, meaning it might fail to detect bugs in future devices. Since I won't be doing maintenance on this application, I can't allow this. I found a solution elsewhere, adapted it to my case, and I'll post it below in case someone needs it. It works on both my devices, and I'm guessing it would work for the majority of other devices. Thanks for your post!
This solution lets you snap a picture and use it, with the option of using an image from the gallery instead. It uses the options menu for these purposes, just for testing. (Monodroid code.)
The camera code is inspired by:
access to full resolution pictures from camera with MonoDroid
using Android.App;
using Android.Content;
using Android.OS;
using Android.Provider;
using Android.Views;

namespace StackOverFlow.UsingCameraWithMonodroid
{
    [Activity(Label = "ImageActivity")]
    public class ImageActivity : Activity
    {
        private readonly static int TakePicture = 1;
        private readonly static int SelectPicture = 2;
        private string imageUriString;

        protected override void OnCreate(Bundle bundle)
        {
            base.OnCreate(bundle);
            this.SetContentView(Resource.Layout.ImageActivity);
        }

        public override bool OnCreateOptionsMenu(IMenu menu)
        {
            MenuInflater inflater = this.MenuInflater;
            inflater.Inflate(Resource.Menu.ImageMenues, menu);
            return base.OnCreateOptionsMenu(menu);
        }

        public override bool OnOptionsItemSelected(IMenuItem item)
        {
            switch (item.ItemId)
            {
                case Resource.Id.UseExisting:
                    this.SelectImageFromStorage();
                    return true;
                case Resource.Id.AddNew:
                    this.StartCamera();
                    return true;
                default:
                    return base.OnOptionsItemSelected(item);
            }
        }

        private bool IsMounted
        {
            get
            {
                return Android.OS.Environment.ExternalStorageState.Equals(Android.OS.Environment.MediaMounted);
            }
        }

        private void StartCamera()
        {
            // Pre-insert an empty image row so the URI is known before the camera runs.
            var imageUri = ContentResolver.Insert(IsMounted ? MediaStore.Images.Media.ExternalContentUri
                : MediaStore.Images.Media.InternalContentUri, new ContentValues());
            this.imageUriString = imageUri.ToString();
            var cameraIntent = new Intent(MediaStore.ActionImageCapture);
            cameraIntent.PutExtra(MediaStore.ExtraOutput, imageUri);
            this.StartActivityForResult(cameraIntent, TakePicture);
        }

        private void SelectImageFromStorage()
        {
            Intent intent = new Intent();
            intent.SetType("image/*");
            intent.SetAction(Intent.ActionGetContent);
            this.StartActivityForResult(Intent.CreateChooser(intent, "Select Picture"), SelectPicture);
        }

        // Example code for using the result; in my case I want to upload in another activity.
        protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
        {
            // A picture was taken.
            if (resultCode == Result.Ok && requestCode == TakePicture)
            {
                // On some devices the data intent comes back null from the camera activity.
                // For this reason we pass the already-saved imageUriString to the upload
                // activity in order to adapt to every device, instead of using the data
                // intent as in the SelectPicture option.
                var uploadIntent = new Intent(this.BaseContext, typeof(UploadActivity));
                uploadIntent.PutExtra("ImageUri", this.imageUriString);
                this.StartActivity(uploadIntent);
            }
            // The user selected an image from storage.
            else if (resultCode == Result.Ok && requestCode == SelectPicture)
            {
                var uploadIntent = new Intent(this.BaseContext, typeof(UploadActivity));
                uploadIntent.PutExtra("ImageUri", data.DataString);
                this.StartActivity(uploadIntent);
            }
        }
    }
}
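For completeness, here's a sketch of how the receiving side could read the extra. UploadActivity is only referenced above, so this outline and the GetBitmap call are assumptions rather than the poster's actual code:
// Hypothetical UploadActivity: reads the URI extra and loads the bitmap for upload.
[Activity(Label = "UploadActivity")]
public class UploadActivity : Activity
{
    protected override void OnCreate(Bundle bundle)
    {
        base.OnCreate(bundle);
        string imageUriString = this.Intent.GetStringExtra("ImageUri");
        var imageUri = Android.Net.Uri.Parse(imageUriString);
        // Load the full image through the content resolver, then upload it.
        var bitmap = Android.Provider.MediaStore.Images.Media.GetBitmap(this.ContentResolver, imageUri);
        // ... serialize the bitmap and send it to the server ...
    }
}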