What image format should be employed for real-time processing using Camera2 on Android?

I am developing an Android application which processes Camera2 preview frames and displays the processed frames on a TextureView. At first I tested with the Camera1 API, and it works fine for real-time image processing.
private class CameraPreviewCallback implements Camera.PreviewCallback {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        processingRunnable.setNextFrame(data, camera);
    }
}
Then I changed my code to use the Camera2 API. To get preview frames, I set the ImageFormat to YUV_420_888:
mImageReaderPreview = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(), ImageFormat.YUV_420_888, 3);
mImageReaderPreview.setOnImageAvailableListener(mOnPreviewAvailableListener, mBackgroundHandler);
private final ImageReader.OnImageAvailableListener mOnPreviewAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image mImage = reader.acquireLatestImage();
        if (mImage == null) {
            return;
        }
        processingRunnable.setNextFrame(convertYUV420888ToNV21(mImage));
        mImage.close();
    }
};
However, it works slower than Camera1. Maybe that's because of the extra conversion from YUV_420_888 to NV21, since Camera1 provides NV21 frames directly.

Conversion can be expensive, depending on how you implement it and what the layout of the YUV_420_888 image is on a given device.
Certainly if it's written in pure Java it's probably going to be slow.
That said, if the device you're using is at the LEGACY hardware level, camera2 has to run in a legacy mode that can be slow for receiving YUV information. For those devices, staying on API1 may be preferable for your use case.
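For reference, a rough pure-Java sketch of such a conversion (the helper and method names are my own, and real code should reuse buffers rather than allocate per frame):

import android.media.Image;
import java.nio.ByteBuffer;

final class YuvConverter {

    static byte[] yuv420888ToNv21(Image image) {
        int width = image.getWidth();
        int height = image.getHeight();
        byte[] nv21 = new byte[width * height * 3 / 2];

        // Y plane: copy row by row, because rowStride may be larger than width.
        Image.Plane yPlane = image.getPlanes()[0];
        ByteBuffer yBuf = yPlane.getBuffer();
        int yRowStride = yPlane.getRowStride();
        int pos = 0;
        for (int row = 0; row < height; row++) {
            yBuf.position(row * yRowStride);
            yBuf.get(nv21, pos, width);
            pos += width;
        }

        // Chroma: NV21 expects interleaved V then U at quarter resolution.
        // This per-sample loop is always correct but slow.
        ByteBuffer uBuf = image.getPlanes()[1].getBuffer();
        ByteBuffer vBuf = image.getPlanes()[2].getBuffer();
        int cRowStride = image.getPlanes()[2].getRowStride();
        int cPixelStride = image.getPlanes()[2].getPixelStride();
        for (int row = 0; row < height / 2; row++) {
            for (int col = 0; col < width / 2; col++) {
                int idx = row * cRowStride + col * cPixelStride;
                nv21[pos++] = vBuf.get(idx);
                nv21[pos++] = uBuf.get(idx);
            }
        }
        return nv21;
    }
}

On devices where the chroma planes alias one interleaved buffer (pixelStride == 2), replacing the per-sample loop with bulk ByteBuffer copies cuts the cost considerably; doing this copy on every frame is exactly the per-frame overhead that Camera1's direct NV21 callback avoids.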


Xamarin.Forms Communication Between Two Pages Within Same App on Different Devices

Technologies, frameworks and devices I'm using:
Framework: Xamarin.Forms
IDE: Visual Studio 2022
Physical Device (smartphone): Zebra TC26 (Android 10)
Physical Device (smartwatch): Samsung Galaxy Watch4 (Android 11)
Problem definition
Currently I have a test Xamarin.Forms project that consists of two different UIs (XAML files):
User Interface 1: HomePage.XAML - This screen should be displayed on the smartphone
User Interface 2: WatchScreen.XAML - This screen should be displayed on the smartwatch
With the code below I make sure HomePage.xaml is displayed on the smartphone and WatchScreen.xaml on the smartwatch:
Page homePage = new NavigationPage(new HomePage());

// BuildVersionCodes.R is a reference to Android version 11 (mostly now used by Wear OS 3.x)
if (Build.VERSION.SdkInt == BuildVersionCodes.R)
{
    // SM-R870 is a reference to the Samsung Galaxy Watch4
    // Note: This is needed to ensure the UI is specific to the UI of a smartwatch
    if (Build.Model == "SM-R870")
    {
        Page watchScreen = new NavigationPage(new WatchScreen());
        MainPage = watchScreen;
    }
}
else
{
    MainPage = homePage;
}
Now I want to make these pages on different devices communicate with each other, for example by sending a message. Both HomePage.xaml and WatchScreen.xaml exist within the main Xamarin.Forms project.
A Xamarin.Forms solution also comes with a native project. In this native Xamarin.Android project's MainActivity.cs I try to retrieve the button that exists within the main project (the button lives in WatchScreen.xaml, and WatchScreen.xaml.cs has a method that returns it).
Method in WatchScreen.xaml.cs that gives button back:
public Button GetSendButtonFromWearableUI() => btnSendMessage;
In MainActivity.cs I get the button by calling this method:
Button button = (App.Current.MainPage.Navigation.NavigationStack.LastOrDefault() as WatchScreen)
    .GetSendButtonFromWearableUI();
I hook up the button's click event like this:
button.Clicked += delegate
{
    SendData();
};
When the button is clicked, some data should be sent from MainActivity.cs, caught by HomePage.xaml, and displayed on it. I tried several approaches but didn't succeed in achieving what needs to happen, so I'm wondering if you could help me out with this; it would be much appreciated.
In the meantime I've been investigating this issue and came up with a solution. Follow the steps below to get the same result. To make this solution work I've combined the Wearable Data Layer API from Google with MessagingCenter from Microsoft.
Also, the example below only shows communication from the smartwatch to the smartphone. To reverse the process, put the send button on the HomePage instead of the smartwatch screen and make sure to subscribe to the corresponding messages.
One last note: keep in mind that the Google code used below is deprecated, but it still works...
References used to make this work:
Syncing Data Between Wearable and Handheld Devices Using Xamarin in Android
Installed dependencies on the Xamarin.Android project within Xamarin.Forms project:
Xamarin.Android.Support.v4
Xamarin.GooglePlayServices.Base
Xamarin.GooglePlayServices.Wearable
MessageKeys.cs
This class declares the message keys used to send and receive messages between devices.
public class MessageKeys
{
    public const string Smartwatch = "Smartwatch";
    public const string Smartphone = "Smartphone";
}
Xamarin.Forms (Base project) - App.xaml.cs
In App.xaml.cs, as pointed out earlier, I make sure the wearable displays WatchScreen.xaml and any other device displays the regular Android UI, HomePage.xaml.
Xamarin.Forms (Base project) - WatchScreen.xaml.cs
Send message from Wearable device to Android smartphone.
private void btnSendMessage_Clicked(object sender, EventArgs e)
{
    MessagingCenter.Send(Xamarin.Forms.Application.Current, MessageKeys.Smartwatch);
}
Xamarin.Forms (Base project) - HomePage.xaml.cs
public HomePage()
{
    InitializeComponent();

    MessagingCenter.Subscribe<Xamarin.Forms.Application>(Xamarin.Forms.Application.Current, MessageKeys.Smartphone, (sender) =>
    {
        DisplayAlert("Message", "Wearable message received!", "OK");
    });
}
Xamarin.Forms (Native Android Project) - MainActivity.cs
Within MainActivity.cs I implement the following interfaces:
public class MainActivity : WearableActivity, DataClient.IOnDataChangedListener,
    GoogleApiClient.IConnectionCallbacks, GoogleApiClient.IOnConnectionFailedListener
Variables:
private GoogleApiClient client;
const string syncPath = "/[project name]/[subdirectory for watch]";
Internal class 'MessageReceiver' for receiving broadcast messages:
[BroadcastReceiver]
public class MessageReciever : BroadcastReceiver
{
    MainActivity main;

    public MessageReciever() { }

    public MessageReciever(MainActivity owner) { this.main = owner; }

    public override void OnReceive(Context context, Intent intent)
    {
        main.ProcessMessage(intent);
    }
}
Registering the receiver (to receive messages through the Wearable Data Layer API), creating the Google API client, and subscribing to the smartwatch message (to receive it through MessagingCenter):
protected override void OnCreate(Bundle bundle)
{
    IntentFilter filter = new IntentFilter(Intent.ActionSend);
    MessageReciever receiver = new MessageReciever(this);
    LocalBroadcastManager.GetInstance(this).RegisterReceiver(receiver, filter);

    client = new GoogleApiClient.Builder(this, this, this)
        .AddApi(WearableClass.Api)
        .Build();

    MessagingCenter.Subscribe<Xamarin.Forms.Application>(Xamarin.Forms.Application.Current, MessageKeys.Smartwatch, (sender) =>
    {
        SendData();
    });
}
The ProcessMessage method forwards the message received from the wearable on to the smartphone UI:
public void ProcessMessage(Intent intent)
{
    // For now I'm not sending the payload...
    string message = intent.GetStringExtra("WearMessage");
    MessagingCenter.Send(Xamarin.Forms.Application.Current, MessageKeys.Smartphone);
}
SendData(), OnStart(), OnStop(), OnDataChanged() (I didn't do anything with this part, because it is for receiving messages from outside the project and I don't need it for now), OnConnected(), OnConnectionSuspended(), OnConnectionFailed():
See the reference for the code that has been used, since it is exactly the same... P.S.: one thing in SendData has been changed: if you want to keep sending data, remove 'client.Disconnect()' from the finally block after the try/catch.
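For orientation, the shape of that send call in the underlying (deprecated) Java Data Layer API, which the Xamarin binding mirrors, is roughly the following; the path, the "Message" key, and the method name are placeholder assumptions:

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.PutDataRequest;
import com.google.android.gms.wearable.Wearable;

private void sendData(GoogleApiClient googleApiClient) {
    PutDataMapRequest putDataMapReq = PutDataMapRequest.create("/myproject/watch");
    putDataMapReq.getDataMap().putString("Message", "Hello from the watch");
    // A changing value guarantees the data item differs, so onDataChanged always fires.
    putDataMapReq.getDataMap().putLong("Timestamp", System.currentTimeMillis());
    PutDataRequest putDataReq = putDataMapReq.asPutDataRequest().setUrgent();
    Wearable.DataApi.putDataItem(googleApiClient, putDataReq);
}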
Xamarin.Forms (Native Android Project) - WearableService inherits from WearableListenerService:
WearableService is a new class created within the native project. For this part, too, see the reference, because it's exactly the same code as used in my project.
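In rough Java terms (the C# version in the reference is the direct equivalent), the service listens for data-layer changes and rebroadcasts them locally, which is what the MessageReciever above picks up. SYNC_PATH and the "Message" key are assumptions and must match whatever SendData writes:

import android.content.Intent;
import android.support.v4.content.LocalBroadcastManager;
import com.google.android.gms.wearable.DataEvent;
import com.google.android.gms.wearable.DataEventBuffer;
import com.google.android.gms.wearable.DataMap;
import com.google.android.gms.wearable.DataMapItem;
import com.google.android.gms.wearable.WearableListenerService;

public class WearableService extends WearableListenerService {

    private static final String SYNC_PATH = "/myproject/watch";

    @Override
    public void onDataChanged(DataEventBuffer dataEvents) {
        for (DataEvent event : dataEvents) {
            if (event.getType() == DataEvent.TYPE_CHANGED
                    && SYNC_PATH.equals(event.getDataItem().getUri().getPath())) {
                DataMap map = DataMapItem.fromDataItem(event.getDataItem()).getDataMap();
                // Rebroadcast locally so MainActivity's MessageReciever receives it.
                Intent intent = new Intent(Intent.ACTION_SEND);
                intent.putExtra("WearMessage", map.getString("Message"));
                LocalBroadcastManager.getInstance(this).sendBroadcast(intent);
            }
        }
    }
}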
To get an overall overview of what's happening, I visualized the flow from smartwatch to smartphone in a diagram; for communication from smartphone to smartwatch, the same flow applies in reverse.
That's it. You will now receive messages within the same application using the Wearable Data Layer API and MessagingCenter. Instead of having separate projects, we just use separate UIs to make this happen...

vlcj - How to change the volume of audio before playing it?

I tried this
val player: MediaPlayer = MediaPlayerFactory("-vvv").mediaPlayers().newMediaPlayer()
val result0: Boolean = player.audio().setVolume(50) // result0: true
player.media().play("/path/to/audio.ogg")
val result1: Boolean = player.audio().setVolume(50) // result1: false
and this
val player: MediaPlayer = MediaPlayerFactory("-vvv").mediaPlayers().newMediaPlayer()
val result0 = player.audio().setVolume(50) // result0: true
player.media().prepare("/path/to/audio.ogg")
val result1: Boolean = player.audio().setVolume(50) // result1: false
player.controls().play()
val result2: Boolean = player.audio().setVolume(50) // result2: false
but the volume remains at 100%.
The only way I found is something like this:
val player: MediaPlayer = MediaPlayerFactory("-vvv").mediaPlayers().newMediaPlayer()
player.events().addMediaPlayerEventListener(object : MediaPlayerEventAdapter() {
    override fun mediaPlayerReady(mediaPlayer: MediaPlayer) {
        mediaPlayer.submit {
            mediaPlayer.audio().setVolume(50)
        }
    }
})
player.media().play("/path/to/audio.ogg")
But this solution is a bit far from ideal, because it starts to play, plays a bit at full volume, and then, whoosh, the volume changes.
I tried vlcj 4.4.0 and 4.5.2, VLC 3.0.8 and 3.0.10, jdk8 and 14, but it works in the same way.
This is something that unfortunately does not work in VLC 3.x, but does work in the upcoming VLC 4.x (at the time of writing this answer, VLC 4 is still in development).
The following code works for me using the latest VLC 4 built from source, and the latest vlcj-5 snapshot:
import uk.co.caprica.vlcj.player.component.AudioPlayerComponent;
import uk.co.caprica.vlcj.test.VlcjTest;

public class AudioMediaPlayerComponentTest extends VlcjTest {

    public static void main(String[] args) throws Exception {
        String mrl = "/home/music/some-cool-synthwave-tune.mp3";

        AudioPlayerComponent audioMediaPlayerComponent = new AudioPlayerComponent();
        audioMediaPlayerComponent.mediaPlayer().audio().setVolume(5);
        audioMediaPlayerComponent.mediaPlayer().media().play(mrl);

        Thread.currentThread().join();
    }
}
The initial volume for the media player comes from the OS volume settings, and in fact the OS volume setting is linked both ways to the media player. Changing the volume in one place is reflected in the other.
Volume handling through LibVLC generally just seems much better in VLC 4.
If you're stuck on VLC 3, which is reasonable at the present time, then unfortunately you're also stuck with some sort of compromise solution like using the "ready" event that you've already found.
All the ready event does is wait for the first position-changed event; that event was created specifically as a compromise for purposes like this.
I tested all the native event callbacks available for the media player, and nothing worked to set the volume before playback had actually started.
This leaves you with the following, as you already found:
import uk.co.caprica.vlcj.player.base.MediaPlayer;
import uk.co.caprica.vlcj.player.component.AudioPlayerComponent;
import uk.co.caprica.vlcj.test.VlcjTest;

public class AudioMediaPlayerComponentTest extends VlcjTest {

    public static void main(String[] args) throws Exception {
        String mrl = "/home/music/some-cool-synthwave-tune.mp3";

        AudioPlayerComponent audioMediaPlayerComponent = new AudioPlayerComponent() {
            @Override
            public void mediaPlayerReady(MediaPlayer mediaPlayer) {
                mediaPlayer.audio().setVolume(30);
            }
        };

        audioMediaPlayerComponent.mediaPlayer().media().play(mrl);

        Thread.currentThread().join();
    }
}
A completely sideways alternative might be to play the shortest possible silent media as a kind of pre-roll - when that media is finished (there's a finished or stopped event you can listen for) you should then be able to set the volume and play your actual media. I did not try this.
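Untested, but a sketch of that pre-roll idea might look like this ("/path/to/silence.mp3" is an assumed short silent clip, and the one-shot flag prevents the finished event of the real media from re-triggering playback):

import java.util.concurrent.atomic.AtomicBoolean;

import uk.co.caprica.vlcj.player.base.MediaPlayer;
import uk.co.caprica.vlcj.player.component.AudioPlayerComponent;

public class PreRollVolumeTest {

    private static final AtomicBoolean preRollDone = new AtomicBoolean(false);

    public static void main(String[] args) throws Exception {
        String mrl = "/home/music/some-cool-synthwave-tune.mp3";

        AudioPlayerComponent component = new AudioPlayerComponent() {
            @Override
            public void finished(MediaPlayer mediaPlayer) {
                // Fires when the silent pre-roll ends: set the volume, then
                // start the real media. Only do this once.
                if (preRollDone.compareAndSet(false, true)) {
                    mediaPlayer.submit(() -> {
                        mediaPlayer.audio().setVolume(30);
                        mediaPlayer.media().play(mrl);
                    });
                }
            }
        };

        component.mediaPlayer().media().play("/path/to/silence.mp3");
        Thread.currentThread().join();
    }
}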

Agora many to one live streaming

I have a requirement in which different users will stream video from their cameras to a server, and there will be a dashboard on which an admin can view all the streams in real time, something like how surveillance works. I think video broadcasting can help, but the documentation says it enables one-to-many and many-to-many live streaming; there is no mention of the many-to-one case. How can I achieve this?
The use-case you have described would be implemented the same way as a many-to-many broadcast.
For your use-case you would have all of the camera streams join the channel as broadcasters, and the "surveillance" user would join as an audience member. The audience member subscribes to all the remote streams without having to broadcast a stream of their own.
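A minimal sketch of that role split with the Android SDK (the channel name, token handling, and helper names are placeholder assumptions):

import io.agora.rtc.Constants;
import io.agora.rtc.RtcEngine;

class Roles {

    // Each camera device joins the channel as a broadcaster.
    static void joinAsCamera(RtcEngine engine, String token) {
        engine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
        engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
        engine.joinChannel(token, "surveillance", "", 0);
    }

    // The admin dashboard joins the same channel as audience and simply
    // renders every remote stream it receives (onUserJoined callbacks).
    static void joinAsDashboard(RtcEngine engine, String token) {
        engine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
        engine.setClientRole(Constants.CLIENT_ROLE_AUDIENCE);
        engine.joinChannel(token, "surveillance", "", 0);
    }
}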
[Update]
With Agora's SDK you can use an external video source; you just have to manage it yourself. If you are using a custom video source then you don't need to use RTMP.
IVideoFrameConsumer mConsumer;
boolean mHasStarted;

// Create a VideoSource instance.
VideoSource source = new VideoSource() {
    @Override
    public int getBufferType() {
        // Get the current frame type.
        // The SDK uses different methods to process different frame types.
        // If you want to switch to another VideoSource type, create another instance.
        // There are three video frame types: BYTE_BUFFER(1); BYTE_ARRAY(2); TEXTURE(3)
        return MediaIO.BufferType.BYTE_ARRAY.intValue();
    }

    @Override
    public boolean onInitialize(IVideoFrameConsumer consumer) {
        // The consumer was created by the SDK.
        // Save it for the lifecycle of the VideoSource.
        mConsumer = consumer;
        return true;
    }

    @Override
    public boolean onStart() {
        mHasStarted = true;
        return true;
    }

    @Override
    public void onStop() {
        mHasStarted = false;
    }

    @Override
    public void onDispose() {
        // Release the consumer.
        mConsumer = null;
    }
};
// Set the input video stream to the VideoSource instance.
rtcEngine.setVideoSource(source);

// After receiving the video frame data, use the consumer to send the data.
// Choose different methods according to the frame type.
// For example, here the frame type is a byte array, i.e. NV21.
if (mHasStarted && mConsumer != null) {
    mConsumer.consumeByteArrayFrame(data, AgoraVideoFrame.NV21, width, height, rotation, timestamp);
}
Full guide: https://docs.agora.io/en/Video/custom_video_android?platform=Android

change pitch of .3gp file while playing

In my project I have recorded sound and saved it as a .3gp file, but now I want to play it with some audio effect, fast-forwarding, or a pitch change while playing. I used MediaPlayer, but it doesn't work for this. Then I used AudioTrack, but AudioTrack only takes a byte stream as input. I just want to play a .3gp file and change the pitch while playing. I use the code below.
Help me... thanks in advance...
public void play() {
    File path = new File(Environment.getExternalStorageDirectory().getAbsolutePath()
            + "/meditest/");
    File[] f = path.listFiles();
    isPlaying = true;

    int bufferSize = AudioTrack.getMinBufferSize(outfrequency,
            channelConfigurationout, audioEncoding);
    short[] audiodata = new short[bufferSize];

    try {
        DataInputStream dis = new DataInputStream(
                new BufferedInputStream(new FileInputStream(f[0])));

        audioTrack = new AudioTrack(
                AudioManager.STREAM_MUSIC, outfrequency,
                channelConfigurationout, audioEncoding, bufferSize,
                AudioTrack.MODE_STREAM);
        audioTrack.setPlaybackRate((int) (frequency * 1.5));

        AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
        // Set the volume of played media to maximum.
        audioTrack.setStereoVolume(1.0f, 1.0f);

        Log.d("Clapper", "player start");
        audioTrack.play();

        while (isPlaying && dis.available() > 0) {
            int i = 0;
            while (dis.available() > 0 && i < audiodata.length) {
                audiodata[i] = dis.readShort();
                i++;
                if (i % 50 == 0)
                    Log.d("Clapper", "playing now " + i);
            }
            audioTrack.write(audiodata, 0, audiodata.length);
        }

        Log.d("Clapper", "AUDIO LENGTH: " + audiodata.length);
        dis.close();
        audioTrack.stop();
    } catch (Throwable t) {
        Log.e("AudioTrack", "Playback Failed");
    }

    Log.d("Clapper", "AUDIO state: " + String.valueOf(audioTrack.getPlayState()));
    talkAnimation.stop();

    if (audioTrack.getPlayState() != AudioTrack.PLAYSTATE_PLAYING) {
        runOnUiThread(new Runnable() {
            public void run() {
                imgtalk.setBackgroundResource(R.drawable.talk1);
            }
        });
    }
}
I tried a library called Sonic. It's basically for speech, as it uses the PSOLA algorithm to change pitch and tempo.
Sonic Library
I got your problem. MediaPlayer does not support changing the pitch while playing.
Consider using a SoundPool:
http://developer.android.com/reference/android/media/SoundPool.html
It supports changing the pitch in real time while playing:
The playback rate can also be changed. A playback rate of 1.0 causes the sound to play at its original frequency (resampled, if necessary, to the hardware output frequency). A playback rate of 2.0 causes the sound to play at twice its original frequency, and a playback rate of 0.5 causes it to play at half its original frequency. The playback rate range is 0.5 to 2.0.
Once the sounds are loaded and play has started, the application can trigger sounds by calling SoundPool.play(). Playing streams can be paused or resumed, and the application can also alter the pitch by adjusting the playback rate in real-time for doppler or synthesis effects.
http://developer.android.com/reference/android/media/SoundPool.html#setRate(int, float)
If you want to change the pitch while playing a sound, you have to use SoundPool; this is the best way to do it. You can fast-forward playback by some amount and you will hear that the pitch has changed.
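As a rough sketch (API 21+ builder; the class, path, and method names are assumptions, and SoundPool must be able to decode the recorded format):

import android.media.AudioAttributes;
import android.media.SoundPool;

public class PitchPlayer {

    private SoundPool soundPool;
    private int streamId;

    public void playWithPitch(String path, float rate) {
        soundPool = new SoundPool.Builder()
                .setMaxStreams(1)
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                        .build())
                .build();

        soundPool.load(path, 1);
        soundPool.setOnLoadCompleteListener((pool, sampleId, status) -> {
            if (status == 0) {
                // volume 1.0/1.0, priority 1, no loop, initial playback rate
                streamId = pool.play(sampleId, 1f, 1f, 1, 0, rate);
            }
        });
    }

    public void changePitch(float rate) {
        // Can be called while the stream is playing; the valid range is 0.5 to 2.0.
        soundPool.setRate(streamId, rate);
    }
}

Note that SoundPool changes pitch and speed together (it resamples rather than pitch-shifts), which matches what the quoted documentation describes.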

Camera application for all Android devices

I'm currently developing a camera application for Android, and some problems have occurred. I need it to work on all Android devices, and since these all work in different ways, especially with the camera hardware, I'm having a hard time finding a solution that works for every device.
My application's main goal is to launch the camera on a button click, take a photo and upload it to a server. So I don't really need the functionality of saving the image on the device, but if that's needed for further image use I might as well allow it.
For example, I'm testing my application on a Samsung Galaxy SII and a Motorola pad. I have working code that launches the camera, which is by the way C# code since I'm using Monodroid:
Intent cameraIntent = new Intent(Android.Provider.MediaStore.ActionImageCapture);
StartActivityForResult(cameraIntent, PHOTO_CAPTURE);
And I fetch the result, similar to this guide I followed:
http://kevinpotgieter.wordpress.com/2011/03/30/null-intent-passed-back-on-samsung-galaxy-tab/
I followed this guide because the activity returns null on my Galaxy device (another device-specific problem).
This code works fine on the Galaxy device: it takes a photo and saves it in the gallery, from which I can upload it to a server. From further research this is apparently standard Galaxy behaviour, and it doesn't work on my Motorola pad; the camera works fine, but no image is saved to the gallery.
So with this background my question is: am I on the right path here? Do I need to save the image to the gallery for further use in my application? Is there any solution that works for every Android device? Because that's the solution I need.
Thanks for any feedback!
The approach taken in the linked article is geared toward the Galaxy line, since those devices appear to write to the gallery automatically.
This article discusses some other scenarios in detail:
Android ACTION_IMAGE_CAPTURE Intent
So I don't necessarily think that following the linked article is the right path. Not all devices automatically write to the gallery as described in that article, afaik. The article I linked to points to the issues being related to security and suggests writing the original image to an /sdcard/tmp folder. Going down a similar path will more than likely lead to code that works reliably across many devices.
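In Java/SDK terms the suggested approach boils down to something like this fragment inside an Activity (the tmp file name is an assumption, PHOTO_CAPTURE is the request code from your own question, and newer Android versions require a FileProvider instead of Uri.fromFile):

private void startCamera() {
    File photo = new File(Environment.getExternalStorageDirectory(), "tmp/photo.jpg");
    photo.getParentFile().mkdirs();

    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photo));
    startActivityForResult(intent, PHOTO_CAPTURE);
}

The Mono for Android solution posted further down follows a similar path, pre-creating a content Uri instead of a raw file.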
Here are some other links for reference:
Google discussion regarding this subject: http://code.google.com/p/android/issues/detail?id=1480
Project with a potential solution to the problem: https://github.com/johnyma22/classdroid
While that discussion/project are in Java/Android SDK, the same concepts should apply to Monodroid. I'd be happy to help you adapt the code to a working Mono for Android solution if you need help.
To long2know:
Yes, the same concepts apply to Monodroid. I had already read the Stack Overflow article you linked, among other similar ones. However, I don't like the approach in that particular post, since it checks for bugs on devices that are hardcoded into a collection, meaning it might fail to detect bugs in future devices. Since I won't be doing maintenance on this application, I can't allow that. I found a solution elsewhere and adapted it to my case; I'll post it below in case someone needs it. It works on both my devices, and I'm guessing it would work for the majority of other devices. Thanks for your post!
Below is a solution that lets you snap a picture and use it, with the option of using an image from the gallery instead. It uses an options menu for these purposes, just for testing (Monodroid code).
Camera code is inspired by:
access to full resolution pictures from camera with MonoDroid
namespace StackOverFlow.UsingCameraWithMonodroid
{
    [Activity(Label = "ImageActivity")]
    public class ImageActivity : Activity
    {
        private readonly static int TakePicture = 1;
        private readonly static int SelectPicture = 2;
        private string imageUriString;

        protected override void OnCreate(Bundle bundle)
        {
            base.OnCreate(bundle);
            this.SetContentView(Resource.Layout.ImageActivity);
        }

        public override bool OnCreateOptionsMenu(IMenu menu)
        {
            MenuInflater flate = this.MenuInflater;
            flate.Inflate(Resource.Menu.ImageMenues, menu);
            return base.OnCreateOptionsMenu(menu);
        }

        public override bool OnOptionsItemSelected(IMenuItem item)
        {
            switch (item.ItemId)
            {
                case Resource.Id.UseExisting:
                    this.SelectImageFromStorage();
                    return true;
                case Resource.Id.AddNew:
                    this.StartCamera();
                    return true;
                default:
                    return base.OnOptionsItemSelected(item);
            }
        }

        private Boolean isMounted
        {
            get
            {
                return Android.OS.Environment.ExternalStorageState.Equals(Android.OS.Environment.MediaMounted);
            }
        }

        private void StartCamera()
        {
            // Pre-insert a gallery record so the camera app has a Uri to write to.
            var imageUri = ContentResolver.Insert(isMounted ? MediaStore.Images.Media.ExternalContentUri
                : MediaStore.Images.Media.InternalContentUri, new ContentValues());
            this.imageUriString = imageUri.ToString();

            var cameraIntent = new Intent(MediaStore.ActionImageCapture);
            cameraIntent.PutExtra(MediaStore.ExtraOutput, imageUri);
            this.StartActivityForResult(cameraIntent, TakePicture);
        }

        private void SelectImageFromStorage()
        {
            Intent intent = new Intent();
            intent.SetType("image/*");
            intent.SetAction(Intent.ActionGetContent);
            this.StartActivityForResult(Intent.CreateChooser(intent, "Select Picture"), SelectPicture);
        }

        // Example of using the result; in my case I want to upload in another activity.
        protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
        {
            // If a picture was taken
            if (resultCode == Result.Ok && requestCode == TakePicture)
            {
                // On some devices the data intent comes back null from the camera activity.
                // For this reason we pass the already saved imageUriString to the upload
                // activity, in order to adapt to every device. Otherwise we would use the
                // data intent, as in the SelectPicture case below.
                var uploadIntent = new Intent(this.BaseContext, typeof(UploadActivity));
                uploadIntent.PutExtra("ImageUri", this.imageUriString);
                this.StartActivity(uploadIntent);
            }
            // The user has selected an image from storage
            else if (requestCode == SelectPicture)
            {
                var uploadIntent = new Intent(this.BaseContext, typeof(UploadActivity));
                uploadIntent.PutExtra("ImageUri", data.DataString);
                this.StartActivity(uploadIntent);
            }
        }
    }
}