Drawing an arbitrary JPEG to a Canvas - android-canvas

I'm trying to draw a JPEG to a canvas in onDraw(), however the Bitmap type doesn't seem to be drawable (it has no .draw() method). My goal is to have a JPEG that can be moved around the screen with a touch/drag. How do I need to draw this?
Here is my constructor (where I pass the image path to the view):
public TouchViewClass(Context context, AttributeSet attrs, int defStyle, String picPath) {
super(context, attrs, defStyle);
this.picPath=picPath;
}
Here is my onDraw
@Override
public void onDraw(Canvas canvas) {
super.onDraw(canvas);
Bitmap myImg = BitmapFactory.decodeFile(picPath);
canvas.save();
canvas.translate(mPosX, mPosY);
//Here is where the bitmap should be drawn
canvas.restore();
}

First of all, I would not recommend decoding a file into a bitmap in onDraw(). This method is called every single frame, so your draw pass will be extremely slow. Decode the file beforehand (in your case, in the constructor) and reuse that bitmap in onDraw().
As for the actual drawing, you can easily draw the bitmap by calling the Canvas.drawBitmap() method.
So basically you don't even need to translate the canvas - you can specify the destination directly:
canvas.drawBitmap(bmp, mPosX, mPosY, null);

I would change the faulty implementation shown above to something like:
public class TouchViewClass extends View {
private Bitmap myImg;
private float mPosX = 0;
private float mPosY = 0;
private Paint paint = new Paint();
public float getmPosX() {
return mPosX;
}
public void setmPosX(float mPosX) {
this.mPosX = mPosX;
invalidate(); // redraw at the new position
}
public float getmPosY() {
return mPosY;
}
public void setmPosY(float mPosY) {
this.mPosY = mPosY;
invalidate(); // redraw at the new position
}
/**
* So you can inflate from XML
* @param context
* @param attrs
*/
public TouchViewClass(Context context, AttributeSet attrs) {
super(context, attrs);
}
/**
* So you can create it in code
* @param context
* @param attrs
* @param picPath
*/
public TouchViewClass(Context context, AttributeSet attrs, String picPath) {
super(context, attrs);
setPicPath(context, picPath);
}
public void setPicPath(Context context, String picPath){
if(!TextUtils.isEmpty(picPath)){
// Decode once, outside of onDraw()
myImg = BitmapFactory.decodeFile(picPath);
}else{
// Fall back to a bundled drawable when no path is given
Resources resources = context.getResources();
myImg = BitmapFactory.decodeResource(resources, R.drawable.ic_launcher);
}
}
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
if (myImg != null) {
canvas.drawBitmap(myImg, mPosX, mPosY, paint);
}
}
}
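Since the stated goal is moving the image with touch/drag, here is a minimal sketch of touch handling you could add to TouchViewClass. This is my own addition, not part of the original answer; it relies on the mPosX/mPosY fields above and needs an android.view.MotionEvent import:
// Track the last touch position and move the bitmap by the finger delta.
private float lastTouchX, lastTouchY;

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            lastTouchX = event.getX();
            lastTouchY = event.getY();
            return true;
        case MotionEvent.ACTION_MOVE:
            mPosX += event.getX() - lastTouchX;
            mPosY += event.getY() - lastTouchY;
            lastTouchX = event.getX();
            lastTouchY = event.getY();
            invalidate(); // trigger onDraw() at the new position
            return true;
        default:
            return super.onTouchEvent(event);
    }
}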

Related

How to turn on flash in picture mode

[SOLVED]
After searching for an answer, I didn't find a solution for turning on the flash when in picture mode.
The app opens the camera in the background and continuously processes the pictures and detects objects, but the phone sits in a container with no light, so I need to make sure the flash is always on.
I'm also considering other approaches, though I'm not sure how to get them to work either:
1. Switch to video mode (I'm processing the camera preview frames anyway, and in video mode the flash can stay on without recording a video).
2. Set the default camera app to a different app that supports an image preview with flash when tapping the screen (I'd need to figure out how to switch to a different app and how to simulate the tap, maybe even with another device connected over Bluetooth that sends the clicks).
3. Override the camera's API and make sure the flash can be on, or just disable it and let another app turn the flash on.
This doesn't seem to work (in the last code block):
Camera.Parameters parameters = camera.getParameters();
parameters.setFlashMode(Camera.Parameters.FLASH_MODE_ON);
Solution 1 or 3 would be ideal; any ideas how to make it work? This is the code I'm using:
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.Dialog;
import android.app.DialogFragment;
import android.app.Fragment;
import android.content.Context;
import android.content.DialogInterface;
import android.content.res.Configuration;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.ImageReader;
import android.media.ImageReader.OnImageAvailableListener;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.text.TextUtils;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.LayoutInflater;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Toast;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
import org.tensorflow.lite.examples.classification.customview.AutoFitTextureView;
import org.tensorflow.lite.examples.classification.env.Logger;
public class CameraConnectionFragment extends Fragment {
private static final Logger LOGGER = new Logger();
/**
* The camera preview size will be chosen to be the smallest frame by pixel size capable of
* containing a DESIRED_SIZE x DESIRED_SIZE square.
*/
private static final int MINIMUM_PREVIEW_SIZE = 320;
/** Conversion from screen rotation to JPEG orientation. */
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
private static final String FRAGMENT_DIALOG = "dialog";
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
/** A {@link Semaphore} to prevent the app from exiting before closing the camera. */
private final Semaphore cameraOpenCloseLock = new Semaphore(1);
/** A {@link OnImageAvailableListener} to receive frames as they are available. */
private final OnImageAvailableListener imageListener;
/** The input size in pixels desired by TensorFlow (width and height of a square bitmap). */
private final Size inputSize;
/** The layout identifier to inflate for this Fragment. */
private final int layout;
private final ConnectionCallback cameraConnectionCallback;
private final CameraCaptureSession.CaptureCallback captureCallback =
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureProgressed(
final CameraCaptureSession session,
final CaptureRequest request,
final CaptureResult partialResult) {}
@Override
public void onCaptureCompleted(
final CameraCaptureSession session,
final CaptureRequest request,
final TotalCaptureResult result) {}
};
/** ID of the current {@link CameraDevice}. */
private String cameraId;
/** An {@link AutoFitTextureView} for camera preview. */
private AutoFitTextureView textureView;
/** A {@link CameraCaptureSession } for camera preview. */
private CameraCaptureSession captureSession;
/** A reference to the opened {@link CameraDevice}. */
private CameraDevice cameraDevice;
/** The rotation in degrees of the camera sensor from the display. */
private Integer sensorOrientation;
/** The {@link Size} of camera preview. */
private Size previewSize;
/** An additional thread for running tasks that shouldn't block the UI. */
private HandlerThread backgroundThread;
/** A {@link Handler} for running tasks in the background. */
private Handler backgroundHandler;
/**
* {@link TextureView.SurfaceTextureListener} handles several lifecycle events on a {@link
* TextureView}.
*/
private final TextureView.SurfaceTextureListener surfaceTextureListener =
new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(
final SurfaceTexture texture, final int width, final int height) {
openCamera(width, height);
}
@Override
public void onSurfaceTextureSizeChanged(
final SurfaceTexture texture, final int width, final int height) {
configureTransform(width, height);
}
@Override
public boolean onSurfaceTextureDestroyed(final SurfaceTexture texture) {
return true;
}
@Override
public void onSurfaceTextureUpdated(final SurfaceTexture texture) {}
};
/** An {@link ImageReader} that handles preview frame capture. */
private ImageReader previewReader;
/** {@link CaptureRequest.Builder} for the camera preview */
private CaptureRequest.Builder previewRequestBuilder;
/** {@link CaptureRequest} generated by {@link #previewRequestBuilder} */
private CaptureRequest previewRequest;
/** {@link CameraDevice.StateCallback} is called when {@link CameraDevice} changes its state. */
private final CameraDevice.StateCallback stateCallback =
new CameraDevice.StateCallback() {
@Override
public void onOpened(final CameraDevice cd) {
// This method is called when the camera is opened. We start camera preview here.
cameraOpenCloseLock.release();
cameraDevice = cd;
createCameraPreviewSession();
}
@Override
public void onDisconnected(final CameraDevice cd) {
cameraOpenCloseLock.release();
cd.close();
cameraDevice = null;
}
@Override
public void onError(final CameraDevice cd, final int error) {
cameraOpenCloseLock.release();
cd.close();
cameraDevice = null;
final Activity activity = getActivity();
if (null != activity) {
activity.finish();
}
}
};
@SuppressLint("ValidFragment")
private CameraConnectionFragment(
final ConnectionCallback connectionCallback,
final OnImageAvailableListener imageListener,
final int layout,
final Size inputSize) {
this.cameraConnectionCallback = connectionCallback;
this.imageListener = imageListener;
this.layout = layout;
this.inputSize = inputSize;
}
/**
* Given {@code choices} of {@code Size}s supported by a camera, chooses the smallest one whose
* width and height are at least as large as the minimum of both, or an exact match if possible.
*
* @param choices The list of sizes that the camera supports for the intended output class
* @param width The minimum desired width
* @param height The minimum desired height
* @return The optimal {@code Size}, or an arbitrary one if none were big enough
*/
protected static Size chooseOptimalSize(final Size[] choices, final int width, final int height) {
final int minSize = Math.max(Math.min(width, height), MINIMUM_PREVIEW_SIZE);
final Size desiredSize = new Size(width, height);
// Collect the supported resolutions that are at least as big as the preview Surface
boolean exactSizeFound = false;
final List<Size> bigEnough = new ArrayList<Size>();
final List<Size> tooSmall = new ArrayList<Size>();
for (final Size option : choices) {
if (option.equals(desiredSize)) {
// Set the size but don't return yet so that remaining sizes will still be logged.
exactSizeFound = true;
}
if (option.getHeight() >= minSize && option.getWidth() >= minSize) {
bigEnough.add(option);
} else {
tooSmall.add(option);
}
}
LOGGER.i("Desired size: " + desiredSize + ", min size: " + minSize + "x" + minSize);
LOGGER.i("Valid preview sizes: [" + TextUtils.join(", ", bigEnough) + "]");
LOGGER.i("Rejected preview sizes: [" + TextUtils.join(", ", tooSmall) + "]");
if (exactSizeFound) {
LOGGER.i("Exact size match found.");
return desiredSize;
}
// Pick the smallest of those, assuming we found any
if (bigEnough.size() > 0) {
final Size chosenSize = Collections.min(bigEnough, new CompareSizesByArea());
LOGGER.i("Chosen size: " + chosenSize.getWidth() + "x" + chosenSize.getHeight());
return chosenSize;
} else {
LOGGER.e("Couldn't find any suitable preview size");
return choices[0];
}
}
public static CameraConnectionFragment newInstance(
final ConnectionCallback callback,
final OnImageAvailableListener imageListener,
final int layout,
final Size inputSize) {
return new CameraConnectionFragment(callback, imageListener, layout, inputSize);
}
/**
* Shows a {@link Toast} on the UI thread.
*
* @param text The message to show
*/
private void showToast(final String text) {
final Activity activity = getActivity();
if (activity != null) {
activity.runOnUiThread(
new Runnable() {
@Override
public void run() {
Toast.makeText(activity, text, Toast.LENGTH_SHORT).show();
}
});
}
}
@Override
public View onCreateView(
final LayoutInflater inflater, final ViewGroup container, final Bundle savedInstanceState) {
return inflater.inflate(layout, container, false);
}
@Override
public void onViewCreated(final View view, final Bundle savedInstanceState) {
textureView = (AutoFitTextureView) view.findViewById(R.id.texture);
}
@Override
public void onActivityCreated(final Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);
}
@Override
public void onResume() {
super.onResume();
startBackgroundThread();
// When the screen is turned off and turned back on, the SurfaceTexture is already
// available, and "onSurfaceTextureAvailable" will not be called. In that case, we can open
// a camera and start preview from here (otherwise, we wait until the surface is ready in
// the SurfaceTextureListener).
if (textureView.isAvailable()) {
openCamera(textureView.getWidth(), textureView.getHeight());
} else {
textureView.setSurfaceTextureListener(surfaceTextureListener);
}
}
@Override
public void onPause() {
closeCamera();
stopBackgroundThread();
super.onPause();
}
public void setCamera(String cameraId) {
this.cameraId = cameraId;
}
/** Sets up member variables related to camera. */
private void setUpCameraOutputs() {
final Activity activity = getActivity();
final CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
final CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
final StreamConfigurationMap map =
characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
// Danger, W.R.! Attempting to use too large a preview size could exceed the camera
// bus' bandwidth limitation, resulting in gorgeous previews but the storage of
// garbage capture data.
previewSize =
chooseOptimalSize(
map.getOutputSizes(SurfaceTexture.class),
inputSize.getWidth(),
inputSize.getHeight());
// We fit the aspect ratio of TextureView to the size of preview we picked.
final int orientation = getResources().getConfiguration().orientation;
if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
textureView.setAspectRatio(previewSize.getWidth(), previewSize.getHeight());
textureView.setVisibility(View.GONE);
} else {
textureView.setAspectRatio(previewSize.getHeight(), previewSize.getWidth());
textureView.setVisibility(View.GONE);
}
} catch (final CameraAccessException e) {
LOGGER.e(e, "Exception!");
} catch (final NullPointerException e) {
// Currently an NPE is thrown when the Camera2API is used but not supported on the
// device this code runs.
// TODO(andrewharp): abstract ErrorDialog/RuntimeException handling out into new method and
// reuse throughout app.
ErrorDialog.newInstance(getString(R.string.camera_error))
.show(getChildFragmentManager(), FRAGMENT_DIALOG);
throw new RuntimeException(getString(R.string.camera_error));
}
cameraConnectionCallback.onPreviewSizeChosen(previewSize, sensorOrientation);
}
/** Opens the camera specified by {@link CameraConnectionFragment#cameraId}. */
private void openCamera(final int width, final int height) {
setUpCameraOutputs();
configureTransform(width, height);
final Activity activity = getActivity();
final CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
if (!cameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
throw new RuntimeException("Time out waiting to lock camera opening.");
}
manager.openCamera(cameraId, stateCallback, backgroundHandler);
} catch (final CameraAccessException e) {
LOGGER.e(e, "Exception!");
} catch (final InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
}
}
/** Closes the current {@link CameraDevice}. */
private void closeCamera() {
try {
cameraOpenCloseLock.acquire();
if (null != captureSession) {
captureSession.close();
captureSession = null;
}
if (null != cameraDevice) {
cameraDevice.close();
cameraDevice = null;
}
if (null != previewReader) {
previewReader.close();
previewReader = null;
}
} catch (final InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
} finally {
cameraOpenCloseLock.release();
}
}
/** Starts a background thread and its {@link Handler}. */
private void startBackgroundThread() {
backgroundThread = new HandlerThread("ImageListener");
backgroundThread.start();
backgroundHandler = new Handler(backgroundThread.getLooper());
}
/** Stops the background thread and its {@link Handler}. */
private void stopBackgroundThread() {
backgroundThread.quitSafely();
try {
backgroundThread.join();
backgroundThread = null;
backgroundHandler = null;
} catch (final InterruptedException e) {
LOGGER.e(e, "Exception!");
}
}
/** Creates a new {@link CameraCaptureSession} for camera preview. */
private void createCameraPreviewSession() {
try {
final SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
// We configure the size of default buffer to be the size of camera preview we want.
texture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
// This is the output Surface we need to start preview.
final Surface surface = new Surface(texture);
// We set up a CaptureRequest.Builder with the output Surface.
previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(surface);
LOGGER.i("Opening camera preview: " + previewSize.getWidth() + "x" + previewSize.getHeight());
// Create the reader for the preview frames.
previewReader =
ImageReader.newInstance(
previewSize.getWidth(), previewSize.getHeight(), ImageFormat.YUV_420_888, 2);
previewReader.setOnImageAvailableListener(imageListener, backgroundHandler);
previewRequestBuilder.addTarget(previewReader.getSurface());
// Here, we create a CameraCaptureSession for camera preview.
cameraDevice.createCaptureSession(
Arrays.asList(surface, previewReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(final CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
captureSession = cameraCaptureSession;
try {
// Auto focus should be continuous for camera preview.
previewRequestBuilder.set(
CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Flash is automatically enabled when necessary.
// previewRequestBuilder.set(
// CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
previewRequestBuilder.set(
CaptureRequest.FLASH_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
// Finally, we start displaying the camera preview.
previewRequest = previewRequestBuilder.build();
captureSession.setRepeatingRequest(
previewRequest, captureCallback, backgroundHandler);
} catch (final CameraAccessException e) {
LOGGER.e(e, "Exception!");
}
}
@Override
public void onConfigureFailed(final CameraCaptureSession cameraCaptureSession) {
showToast("Failed");
}
},
null);
} catch (final CameraAccessException e) {
LOGGER.e(e, "Exception!");
}
}
}
The second one:
public class LegacyCameraConnectionFragment extends Fragment {
private static final Logger LOGGER = new Logger();
/** Conversion from screen rotation to JPEG orientation. */
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
private Camera camera;
private Camera.PreviewCallback imageListener;
private Size desiredSize;
/** The layout identifier to inflate for this Fragment. */
private int layout;
/** An {@link AutoFitTextureView} for camera preview. */
private AutoFitTextureView textureView;
/**
* {@link TextureView.SurfaceTextureListener} handles several lifecycle events on a {@link
* TextureView}.
*/
private final TextureView.SurfaceTextureListener surfaceTextureListener =
new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(
final SurfaceTexture texture, final int width, final int height) {
int index = getCameraId();
camera = Camera.open(index);
try {
Camera.Parameters parameters = camera.getParameters();
List<String> focusModes = parameters.getSupportedFocusModes();
if (focusModes != null
&& focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
}
List<Camera.Size> cameraSizes = parameters.getSupportedPreviewSizes();
Size[] sizes = new Size[cameraSizes.size()];
int i = 0;
for (Camera.Size size : cameraSizes) {
sizes[i++] = new Size(size.width, size.height);
}
Size previewSize =
CameraConnectionFragment.chooseOptimalSize(
sizes, desiredSize.getWidth(), desiredSize.getHeight());
parameters.setPreviewSize(previewSize.getWidth(), previewSize.getHeight());
camera.setDisplayOrientation(90);
camera.setParameters(parameters);
camera.setPreviewTexture(texture);
} catch (IOException exception) {
camera.release();
}
camera.setPreviewCallbackWithBuffer(imageListener);
Camera.Size s = camera.getParameters().getPreviewSize();
camera.addCallbackBuffer(new byte[ImageUtils.getYUVByteSize(s.height, s.width)]);
textureView.setAspectRatio(s.height, s.width);
camera.startPreview();
}
@Override
public void onSurfaceTextureSizeChanged(
final SurfaceTexture texture, final int width, final int height) {}
@Override
public boolean onSurfaceTextureDestroyed(final SurfaceTexture texture) {
return true;
}
@Override
public void onSurfaceTextureUpdated(final SurfaceTexture texture) {}
};
/** An additional thread for running tasks that shouldn't block the UI. */
private HandlerThread backgroundThread;
@SuppressLint("ValidFragment")
public LegacyCameraConnectionFragment(
final Camera.PreviewCallback imageListener, final int layout, final Size desiredSize) {
this.imageListener = imageListener;
this.layout = layout;
this.desiredSize = desiredSize;
}
@Override
public View onCreateView(
final LayoutInflater inflater, final ViewGroup container, final Bundle savedInstanceState) {
return inflater.inflate(layout, container, false);
}
@Override
public void onViewCreated(final View view, final Bundle savedInstanceState) {
textureView = (AutoFitTextureView) view.findViewById(R.id.texture);
}
@Override
public void onActivityCreated(final Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);
}
@Override
public void onResume() {
super.onResume();
startBackgroundThread();
// When the screen is turned off and turned back on, the SurfaceTexture is already
// available, and "onSurfaceTextureAvailable" will not be called. In that case, we can open
// a camera and start preview from here (otherwise, we wait until the surface is ready in
// the SurfaceTextureListener).
if (textureView.isAvailable()) {
camera.startPreview();
} else {
textureView.setSurfaceTextureListener(surfaceTextureListener);
}
}
@Override
public void onPause() {
stopCamera();
stopBackgroundThread();
super.onPause();
}
/** Starts a background thread and its {@link Handler}. */
private void startBackgroundThread() {
backgroundThread = new HandlerThread("CameraBackground");
backgroundThread.start();
}
/** Stops the background thread and its {@link Handler}. */
private void stopBackgroundThread() {
backgroundThread.quitSafely();
try {
backgroundThread.join();
backgroundThread = null;
} catch (final InterruptedException e) {
LOGGER.e(e, "Exception!");
}
}
protected void stopCamera() {
if (camera != null) {
camera.stopPreview();
camera.setPreviewCallback(null);
camera.release();
camera = null;
}
}
private int getCameraId() {
CameraInfo ci = new CameraInfo();
for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
Camera.getCameraInfo(i, ci);
if (ci.facing == CameraInfo.CAMERA_FACING_BACK) return i;
}
return -1; // No camera found
}
}
SOLUTION: for the Camera2 preview request (CaptureRequest.Builder):
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
CaptureRequest.CONTROL_AE_MODE_ON);
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE,
CaptureRequest.FLASH_MODE_TORCH);
and for the legacy Camera API (Camera.Parameters):
// Check whether the device supports auto-flash; if it does, enable it
List<String> flashModes = parameters.getSupportedFlashModes();
if (flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_AUTO))
{
parameters.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
}
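A side note, not part of the solution above: since the requirement is a light that stays on continuously inside a dark container, torch mode is usually the right setting; FLASH_MODE_AUTO generally only fires when a picture is taken. This is only a sketch, assuming you apply it where the corresponding objects (camera, parameters, characteristics, previewRequestBuilder) from the code above are already in scope:
// Legacy Camera API: keep the LED on during preview (torch), if supported.
Camera.Parameters parameters = camera.getParameters();
List<String> flashModes = parameters.getSupportedFlashModes();
if (flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_TORCH)) {
    parameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
    camera.setParameters(parameters);
}

// Camera2: only request the torch when the device actually has a flash unit.
Boolean hasFlash = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
if (Boolean.TRUE.equals(hasFlash)) {
    previewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
    previewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
}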

Initialize array in Kotlin based on list size

I need to add LineCharts (using MPAndroidChart) dynamically to a LinearLayout.
I have declared an ArrayList named list.
val list = arrayListOf<ABC>()
....
for (i in list) {
chart[] = LineChart(activity)
}
What value should I put inside []? Let's say the list's size is 2; then I need to have 2 charts in the LinearLayout.
How should I initialize the LineChart array?
Example
chart[i] = LineChart(activity) ???
LineChart
public class LineChart extends BarLineChartBase<LineData> implements LineDataProvider {
public LineChart(Context context) {
super(context);
}
public LineChart(Context context, AttributeSet attrs) {
super(context, attrs);
}
public LineChart(Context context, AttributeSet attrs, int defStyle) {
super(context, attrs, defStyle);
}
@Override
protected void init() {
super.init();
mRenderer = new LineChartRenderer(this, mAnimator, mViewPortHandler);
}
@Override
public LineData getLineData() {
return mData;
}
@Override
protected void onDetachedFromWindow() {
// releases the bitmap in the renderer to avoid oom error
if (mRenderer != null && mRenderer instanceof LineChartRenderer) {
((LineChartRenderer) mRenderer).releaseBitmap();
}
super.onDetachedFromWindow();
}
}
Your question is a little unclear, but if I'm following correctly, you can do it in one line like this:
val chart = Array(list.size) { LineChart(activity) }
Or:
val chart = list.map { LineChart(activity) }.toTypedArray()
(The latter creates a temporary list, which may be slightly less efficient; but it iterates through the list instead of indexing, which could be faster if the list isn't random-access.)
This is my answer:
for (i in 0 until list.size) {
chart[i] = LineChart(activity)
}
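If it helps, here is a minimal Kotlin sketch of creating one chart per list entry and attaching each one to the LinearLayout. The names activity and linearLayout are assumptions (your Activity context and target container), and the fixed height is only illustrative:
// Sketch: one LineChart per element of `list`, added to an existing LinearLayout.
val charts = List(list.size) { LineChart(activity) }
for (chart in charts) {
    chart.layoutParams = LinearLayout.LayoutParams(
        LinearLayout.LayoutParams.MATCH_PARENT,
        600 // illustrative fixed height in pixels
    )
    linearLayout.addView(chart)
}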

How to draw several lines slowly in constant velocity on canvas by Android?

I need to trace out a figure on a canvas in Android, with an effect just like the following GIF:
Well, so far I can draw one side at constant velocity with a ValueAnimator. However, I can only draw one side at a time, because I can't keep the previously drawn side when drawing the next one. So, is there a good way to solve this?
Code for drawing a line slowly with a ValueAnimator:
GraphicsView.java
public class GraphicsView extends View {
private int stepX, stepY = 0;
private int startX, startY, stopX, stopY = 0;
private Paint paint = null;
public GraphicsView(Context context) {
super(context);
// Paint
paint = new Paint();
paint.setAntiAlias(true);
paint.setColor(Color.RED);
paint.setStyle(Paint.Style.STROKE);
startX = 40;
startY = 397;
stopX = 1040;
stopY = 397;
Init();
}
public void Init(){
ValueAnimator animatorX = ValueAnimator.ofFloat(startX, stopX);
ValueAnimator animatorY = ValueAnimator.ofFloat(startY, stopY);
animatorX.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
@Override
public void onAnimationUpdate(ValueAnimator valueAnimator) {
stepX = Math.round((Float)valueAnimator.getAnimatedValue()); invalidate();
}
});
animatorY.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
@Override
public void onAnimationUpdate(ValueAnimator valueAnimator) {
stepY = Math.round((Float)valueAnimator.getAnimatedValue()); invalidate();
}
});
AnimatorSet set = new AnimatorSet();
LinearInterpolator l = new LinearInterpolator();
set.setInterpolator(l);
set.setDuration(3000);
set.playTogether(animatorX, animatorY);
set.start();
}
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
canvas.drawLine(startX, startY, stepX, stepY, paint);
}
}
MainActivity.java
public class MainActivity extends AppCompatActivity {
private Display display = null;
private GraphicsView view = null;
private ConstraintLayout layout = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
display = getWindowManager().getDefaultDisplay();
layout = (ConstraintLayout)findViewById(R.id.main_layout);
view = new GraphicsView(this);
view.setMinimumWidth(display.getWidth());
view.setMinimumHeight(display.getHeight());
layout.addView(view);
}
}
You can use the ObjectAnimator class to call back
into one of your class's methods every time you'd like to draw a bit more of the path.
import android.animation.ObjectAnimator;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.DashPathEffect;
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.PathEffect;
import android.graphics.PathMeasure;
import android.util.AttributeSet;
import android.view.View;
import android.util.Log;
public class PathView extends View
{
Path path;
Paint paint;
float length;
public PathView(Context context)
{
super(context);
}
public PathView(Context context, AttributeSet attrs)
{
super(context, attrs);
}
public PathView(Context context, AttributeSet attrs, int defStyleAttr)
{
super(context, attrs, defStyleAttr);
}
public void init()
{
paint = new Paint();
paint.setColor(Color.BLUE);
paint.setStrokeWidth(10);
paint.setStyle(Paint.Style.STROKE);
path = new Path();
path.moveTo(50, 50);
path.lineTo(50, 500);
path.lineTo(200, 500);
path.lineTo(200, 300);
path.lineTo(350, 300);
// Measure the path
PathMeasure measure = new PathMeasure(path, false);
length = measure.getLength();
float[] intervals = new float[]{length, length};
ObjectAnimator animator = ObjectAnimator.ofFloat(PathView.this, "phase", 1.0f, 0.0f);
animator.setDuration(3000);
animator.start();
}
// is called by the animator object (via the "phase" property name)
public void setPhase(float phase)
{
Log.d("pathview","setPhase called with:" + String.valueOf(phase));
paint.setPathEffect(createPathEffect(length, phase, 0.0f));
invalidate(); // will call onDraw
}
private static PathEffect createPathEffect(float pathLength, float phase, float offset)
{
return new DashPathEffect(new float[] { pathLength, pathLength },
Math.max(phase * pathLength, offset));
}
@Override
public void onDraw(Canvas c)
{
super.onDraw(c);
c.drawPath(path, paint);
}
}
Then, just call init() to begin the animation, like this (or if you'd like it to start as soon as the view is inflated, put the init() call inside the constructors):
PathView path_view = (PathView) root_view.findViewById(R.id.path);
path_view.init();
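If you would rather keep the ValueAnimator style from the question, another way to keep the already-drawn sides on screen is to animate a 0..1 progress value and extract the partial path with PathMeasure.getSegment() in onDraw(). This is only a sketch, not the answer above; it assumes the same path, paint, and length fields as PathView, would replace its onDraw(), and needs android.animation.ValueAnimator, android.view.animation.LinearInterpolator, and android.graphics.PathMeasure imports:
// Sketch: drive the drawing with a plain ValueAnimator and PathMeasure.getSegment().
private final Path partialPath = new Path();
private float progress = 0f; // 0..1, updated by the animator

public void startDrawing() {
    ValueAnimator animator = ValueAnimator.ofFloat(0f, 1f);
    animator.setDuration(3000);
    animator.setInterpolator(new LinearInterpolator());
    animator.addUpdateListener(a -> {
        progress = (float) a.getAnimatedValue();
        invalidate(); // redraw with a longer visible segment
    });
    animator.start();
}

@Override
protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);
    partialPath.reset();
    // Copy the first (length * progress) pixels of the full path and draw only that part.
    new PathMeasure(path, false).getSegment(0f, length * progress, partialPath, true);
    canvas.drawPath(partialPath, paint);
}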

Automatically scrolling TextView inside a RecyclerView

My current problem is the following: I have a RecyclerView, and each element of this RecyclerView contains a TextView. I want the TextView to scroll automatically (marquee). What I have tried so far:
Customized TextView:
public class ScrollCustomTextView extends AppCompatTextView {
public ScrollCustomTextView(Context context) {
super(context);
}
public ScrollCustomTextView(Context context, AttributeSet attrs, int defStyle) {
super(context,attrs,defStyle);
}
public ScrollCustomTextView(Context context, AttributeSet attrs) {
super(context,attrs);
}
@Override
protected void onFocusChanged(boolean focused, int direction, Rect previouslyFocusedRect) {
if (focused)
super.onFocusChanged(focused, direction, previouslyFocusedRect);
}
@Override
public void onWindowFocusChanged(boolean focused) {
if(focused)
super.onWindowFocusChanged(focused);
}
@Override
public boolean isFocused() {
return true;
}
}
and my .xml file looks like this:
<package.ScrollCustomTextView
android:id="#+id/main_idea_name"
android:scrollHorizontally="true"
android:ellipsize="marquee"
android:marqueeRepeatLimit="marquee_forever"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="#color/darkGreen"
android:textSize="27dp"
android:layout_marginTop="10dp"
android:maxLines="1"
android:textAppearance="#android:style/TextAppearance.Medium"
/>
And in the RecyclerView adapter, in the ViewHolder constructor, I set:
textView.setSelected(true);
But it doesn't work. Does anyone have an idea to solve this?
Thanks.
So here is the answer:
I put my TextView inside a HorizontalScrollView and added the following code to my adapter for the RecyclerView:
private void animateTextView(TextView mTextView) {
int textWidth = getTextViewWidth(mTextView);
int displayWidth = getDisplayWidth(mContext);
/* Start animation only when the text is longer than the display width. */
if(displayWidth<=textWidth) {
Animation mAnimation = new TranslateAnimation(
0, -textWidth,
0, 0);
mAnimation.setDuration(10000); // Set custom duration.
mAnimation.setStartOffset(1000); // Set custom offset.
mAnimation.setRepeatMode(Animation.RESTART); // This will animate the text back after it reaches the end.
mAnimation.setRepeatCount(Animation.INFINITE); // Infinite animation.
mTextView.startAnimation(mAnimation);
}
}
private int getDisplayWidth(Context context) {
int displayWidth;
WindowManager windowManager = (WindowManager)context.getSystemService(
Context.WINDOW_SERVICE);
Display display = windowManager.getDefaultDisplay();
Point screenSize = new Point();
if(Build.VERSION.SDK_INT>=Build.VERSION_CODES.HONEYCOMB_MR2) {
display.getSize(screenSize);
displayWidth = screenSize.x;
} else {
displayWidth = display.getWidth();
}
return displayWidth;
}
private int getTextViewWidth(TextView textView) {
textView.measure(0, 0); // Need to set measure to (0, 0).
return textView.getMeasuredWidth();
}
And start the animation with:
animateTextView(txtView);
Note: the customized TextView is no longer needed; just use a normal TextView.
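For completeness, a sketch of how the animation might be kicked off from the adapter. MyViewHolder, items, and holder.textView are placeholder names, not from the original code:
// Sketch: start the scroll animation when an item is bound.
@Override
public void onBindViewHolder(MyViewHolder holder, int position) {
    holder.textView.setText(items.get(position));
    // post() so the TextView is attached/laid out before the animation starts.
    holder.textView.post(() -> animateTextView(holder.textView));
}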

Eclipse plugin - ColumnLabelProvider display only image

So I am developing an Eclipse plug-in and using ColumnLabelProvider to provide labels for the columns of my tree viewer.
In one of the columns I only intend to display an image and no text, but in the final display Eclipse still reserves blank space for the text element, even if I return null.
Is there any way to make it display only the image, using the full space provided?
Here is the code snippet:
column4.setLabelProvider(new ColumnLabelProvider() {
@Override
public String getText(Object element) {
return null;
}
@Override
public Image getImage(Object element) {
/* Code to Display an image follows */
.....
}
});
ColumnLabelProvider will always leave space for the text.
You can use a class derived from OwnerDrawLabelProvider to draw the column yourself.
Something like:
public abstract class CentredImageCellLabelProvider extends OwnerDrawLabelProvider
{
protected CentredImageCellLabelProvider()
{
}
@Override
protected void measure(Event event, Object element)
{
}
@Override
protected void erase(final Event event, final Object element)
{
// Don't call super.erase() to suppress non-standard selection draw
}
@Override
protected void paint(final Event event, final Object element)
{
// The question uses a TreeViewer, so event.item is a TreeItem here
TreeItem item = (TreeItem)event.item;
Rectangle itemBounds = item.getBounds(event.index);
GC gc = event.gc;
Image image = getImage(element);
Rectangle imageBounds = image.getBounds();
int x = event.x + Math.max(0, (itemBounds.width - imageBounds.width) / 2);
int y = event.y + Math.max(0, (itemBounds.height - imageBounds.height) / 2);
gc.drawImage(image, x, y);
protected abstract Image getImage(Object element);
}
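A usage sketch for the column from the question; the getImage() body is a placeholder for whatever image lookup you already have. If the provider class lives in a different package, either make its constructor public or create a small named subclass instead of an anonymous one:
// Sketch: plug the owner-draw provider into the tree viewer column.
column4.setLabelProvider(new CentredImageCellLabelProvider() {
    @Override
    protected Image getImage(Object element) {
        // Return the Image you were previously returning from ColumnLabelProvider.getImage().
        return myImage; // placeholder
    }
});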