Android CameraX

Android CameraX in Java

2 Examples

    In this piece we will look at some of the easiest-to-understand CameraX examples. These examples are available on GitHub and can help you master CameraX.

    But first, it is important to understand what CameraX is.

    What is CameraX?

    CameraX is a newer camera API, part of the Android Jetpack suite, that is backwards compatible down to Android API level 21.

    Advantages of CameraX

    Here are the advantages of CameraX:

    1. Consistent across a variety of devices, starting from Android 5.0 (API level 21). No need to write device-specific code in your project.
    2. Easier than previous APIs.
    3. Has all the capabilities of Camera2.
    4. Lifecycle aware, like most Android Jetpack components.

    Installing CameraX

    To install CameraX, add it as a dependency in your app-level build.gradle file:

    // Use the most recent version of CameraX, currently that is alpha06.
    def camerax_version = '1.0.0-alpha06'
    implementation "androidx.camera:camera-core:${camerax_version}"
    implementation "androidx.camera:camera-camera2:${camerax_version}"

    CameraX also requires Java 8 language features; enable them in the same build.gradle:

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }

    1. Simple CameraX Java Example

    1. First create a class extending AppCompatActivity:

    public class MainActivity extends AppCompatActivity {
    //...

    2. Define the request code, permissions array and TextureView as instance fields:

        private int REQUEST_CODE_PERMISSIONS = 10; //arbitrary number, can be changed accordingly
        private final String[] REQUIRED_PERMISSIONS = new String[]{"android.permission.CAMERA", "android.permission.WRITE_EXTERNAL_STORAGE"}; //array with permissions from the manifest
        TextureView txView;

    3. Create a method to start the camera:

        private void startCamera() {
            //make sure there isn't another camera instance running before starting
            CameraX.unbindAll();
    
            /* start preview */
        int aspRatioW = txView.getWidth(); //width of the TextureView
        int aspRatioH = txView.getHeight(); //height of the TextureView
        Rational asp = new Rational (aspRatioW, aspRatioH); //aspect ratio
        Size screen = new Size(aspRatioW, aspRatioH); //target resolution
    
        //configuration object for the preview (viewfinder) use case
        PreviewConfig pConfig = new PreviewConfig.Builder().setTargetAspectRatio(asp).setTargetResolution(screen).build();
        Preview preview = new Preview(pConfig); //build the preview use case
    
            preview.setOnPreviewOutputUpdateListener(
                    new Preview.OnPreviewOutputUpdateListener() {
                        //to update the surface texture we have to destroy it first, then re-add it
                        @Override
                        public void onUpdated(Preview.PreviewOutput output){
                            ViewGroup parent = (ViewGroup) txView.getParent();
                            parent.removeView(txView);
                            parent.addView(txView, 0);
    
                            txView.setSurfaceTexture(output.getSurfaceTexture());
                            updateTransform();
                        }
                    });
    
            /* image capture */
    
        //configuration object for the image capture use case, with the capture mode set
            ImageCaptureConfig imgCapConfig = new ImageCaptureConfig.Builder().setCaptureMode(ImageCapture.CaptureMode.MIN_LATENCY)
                    .setTargetRotation(getWindowManager().getDefaultDisplay().getRotation()).build();
            final ImageCapture imgCap = new ImageCapture(imgCapConfig);
    
            findViewById(R.id.capture_button).setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    File file = new File(Environment.getExternalStorageDirectory() + "/" + System.currentTimeMillis() + ".jpg");
                    imgCap.takePicture(file, new ImageCapture.OnImageSavedListener() {
                        @Override
                        public void onImageSaved(@NonNull File file) {
                            String msg = "Photo capture succeeded: " + file.getAbsolutePath();
                            Toast.makeText(getBaseContext(), msg,Toast.LENGTH_LONG).show();
                        }
    
                        @Override
                        public void onError(@NonNull ImageCapture.UseCaseError useCaseError, @NonNull String message, @Nullable Throwable cause) {
                            String msg = "Photo capture failed: " + message;
                            Toast.makeText(getBaseContext(), msg,Toast.LENGTH_LONG).show();
                            if(cause != null){
                                cause.printStackTrace();
                            }
                        }
                    });
                }
            });
        /* image analyzer */
    
            ImageAnalysisConfig imgAConfig = new ImageAnalysisConfig.Builder().setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE).build();
            ImageAnalysis analysis = new ImageAnalysis(imgAConfig);
    
            analysis.setAnalyzer(
                new ImageAnalysis.Analyzer(){
                    @Override
                    public void analyze(ImageProxy image, int rotationDegrees){
                    //add your frame analysis code here
                    }
                });
    
            //bind to lifecycle:
            CameraX.bindToLifecycle((LifecycleOwner)this, analysis, imgCap, preview);
        }

    4. Check if permissions have been granted:

        private boolean allPermissionsGranted(){
            //check if the required permissions have been granted
            for(String permission : REQUIRED_PERMISSIONS){
                if(ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED){
                    return false;
                }
            }
            return true;
        }

    5. Handle the permission results:

        @Override
        public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
            //start camera when permissions have been granted otherwise exit app
            if(requestCode == REQUEST_CODE_PERMISSIONS){
                if(allPermissionsGranted()){
                    startCamera();
                } else{
                    Toast.makeText(this, "Permissions not granted by the user.", Toast.LENGTH_SHORT).show();
                    finish();
                }
            }
        }
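
    The all-granted check can equivalently be expressed against the grantResults array delivered to the callback above. A minimal off-device sketch of that logic — with PERMISSION_GRANTED hardcoded to its actual Android value, 0, so it runs in plain Java without the SDK:

    ```java
    import java.util.stream.IntStream;

    public class PermissionsSketch {
        // Android's PackageManager.PERMISSION_GRANTED is 0; hardcoded here
        // so the sketch compiles and runs without the Android SDK.
        static final int PERMISSION_GRANTED = 0;

        // True only if every entry in grantResults is PERMISSION_GRANTED.
        // An empty array counts as "not granted", a common defensive guard.
        static boolean allGranted(int[] grantResults) {
            return grantResults.length > 0
                    && IntStream.of(grantResults).allMatch(r -> r == PERMISSION_GRANTED);
        }

        public static void main(String[] args) {
            System.out.println(allGranted(new int[]{0, 0}));  // both granted -> true
            System.out.println(allGranted(new int[]{0, -1})); // one denied (-1 = PERMISSION_DENIED) -> false
        }
    }
    ```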

    Special thanks to @thunderedge for this example.

    Download example

    2. Simple CameraX App

    This is a Java translation of the demo provided in the Google Codelabs CameraX tutorial. This example comprises two files:

    (a). LuminosityAnalyzer.java

    Start by creating a file called LuminosityAnalyzer.java. Make it implement the androidx.camera.core.ImageAnalysis.Analyzer interface.

    public class LuminosityAnalyzer implements ImageAnalysis.Analyzer {

    Create a long instance field to hold the last analysis timestamp:

    private long lastAnalyzedTimestamp = 0L;

    Create a helper method to extract byte array from image plane buffer:

        /**
         * Helper method used to extract a byte array from an
         * image plane buffer
         */
        private byte[] byteBufferToByteArray(ByteBuffer buffer) {
            buffer.rewind();
            byte[] data = new byte[buffer.remaining()];
            buffer.get(data);
            return data;
        }
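
    The helper uses only java.nio, so its behavior can be checked off-device. A self-contained sketch (class name `BufferSketch` is ours, chosen just for this demo):

    ```java
    import java.nio.ByteBuffer;
    import java.util.Arrays;

    public class BufferSketch {
        // Same logic as the helper above: rewind so we read from the start,
        // then copy all remaining bytes into a new array.
        static byte[] byteBufferToByteArray(ByteBuffer buffer) {
            buffer.rewind();
            byte[] data = new byte[buffer.remaining()];
            buffer.get(data);
            return data;
        }

        public static void main(String[] args) {
            ByteBuffer buffer = ByteBuffer.wrap(new byte[]{10, 20, 30});
            buffer.get(); // simulate a partial read; rewind() makes it harmless
            System.out.println(Arrays.toString(byteBufferToByteArray(buffer))); // [10, 20, 30]
        }
    }
    ```

    The rewind() call is what makes the method safe to call on a buffer that has already been partially consumed.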

    Override the analyze method:

        @Override
        public void analyze(ImageProxy image, int rotationDegrees) {
            long currentTimestamp = System.currentTimeMillis();
            // Calculate the average luma no more often than every second
            if (currentTimestamp - lastAnalyzedTimestamp >= TimeUnit.SECONDS.toMillis(1)) {
                // Since format in ImageAnalysis is YUV, image.planes[0]
                // contains the Y (luminance) plane
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                // Extract image data from callback object
                byte[] data = byteBufferToByteArray(buffer);
                // Convert the data into an array of pixel values
                // Equivalent to the Kotlin: val pixels = data.map { it.toInt() and 0xFF }
                int[] pixels = new int[data.length];
                int pos = 0;
                for (byte b : data) {
                    pixels[pos] = b & 0xFF;
                    pos++;
                }
                // Compute average luminance for the image
                double luma = Arrays.stream(pixels).average().orElse(Double.NaN);
                // Log the new luma value
                Log.d("CameraXApp", "Average luminosity: " + luma);
                // Update timestamp of last analyzed frame
                lastAnalyzedTimestamp = currentTimestamp;
            }
        }
    }
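
    The unsigned-byte conversion and the average above are plain Java, so that arithmetic can be verified off-device. A minimal sketch with no camera types involved (class name `LumaSketch` is ours):

    ```java
    import java.util.Arrays;

    public class LumaSketch {
        // Average luminance of a Y-plane byte array. Java bytes are signed,
        // so each value is masked with 0xFF to recover the unsigned 0..255 range.
        static double averageLuma(byte[] data) {
            int[] pixels = new int[data.length];
            for (int i = 0; i < data.length; i++) {
                pixels[i] = data[i] & 0xFF;
            }
            return Arrays.stream(pixels).average().orElse(Double.NaN);
        }

        public static void main(String[] args) {
            // (byte) 0xFF is -1 as a signed byte but 255 as a pixel value.
            System.out.println(averageLuma(new byte[]{(byte) 0xFF, 0x00})); // 127.5
        }
    }
    ```

    Without the 0xFF mask, (byte) 0xFF would contribute -1 instead of 255 and the average would be wrong.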

    (b). MainActivity.java

    Then your main activity:

    public class MainActivity extends AppCompatActivity {
        private final static int REQUEST_CODE_PERMISSION = 10;
        private final static String[] REQUIRED_PERMISSIONS = new String[]{Manifest.permission.CAMERA};
    
        private TextureView viewFinder;
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
    
            viewFinder = findViewById(R.id.view_finder);
            viewFinder.addOnLayoutChangeListener(new View.OnLayoutChangeListener() {
                @Override
                public void onLayoutChange(View v, int left, int top, int right, int bottom, int oldLeft, int oldTop, int oldRight, int oldBottom) {
                    // Every time the provided texture view changes, recompute layout
                    updateTransform();
                }
            });
            if (allPermissionsGranted()) {
                viewFinder.post(new Runnable() {
                    @Override
                    public void run() {
                        startCamera();
                    }
                });
            } else {
                ActivityCompat.requestPermissions(
                        this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSION);
            }
        }
    
        private void updateTransform() {
            Matrix matrix = new Matrix();
    
            // Compute the center of the view finder
        float centerX = viewFinder.getWidth() / 2f;
        float centerY = viewFinder.getHeight() / 2f;
    
            // Correct preview output to account for display rotation
            int rotationDegrees;
            switch (viewFinder.getDisplay().getRotation()) {
                case Surface.ROTATION_0:
                    rotationDegrees = 0;
                    break;
                case Surface.ROTATION_90:
                    rotationDegrees = 90;
                    break;
                case Surface.ROTATION_180:
                    rotationDegrees = 180;
                    break;
                case Surface.ROTATION_270:
                    rotationDegrees = 270;
                    break;
                default:
                    return;
            }
            matrix.postRotate(rotationDegrees, centerX, centerY);
    
            // Finally, apply transformations to our TextureView
            viewFinder.setTransform(matrix);
        }
    
        @Override
        public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
            if (requestCode == REQUEST_CODE_PERMISSION) {
                if (allPermissionsGranted()) {
                    viewFinder.post(new Runnable() {
                        @Override
                        public void run() {
                            startCamera();
                        }
                    });
                }
            }
        }
    
        private void startCamera() {
            // Create configuration object for the viewfinder use case
            PreviewConfig previewConfig = new PreviewConfig.Builder()
                    .setTargetAspectRatio(new Rational(1, 1))
                    .setTargetResolution(new Size(640, 640))
                    .build();
    
            // Build the viewfinder use case
            Preview preview = new Preview(previewConfig);
    
        // Every time the viewfinder is updated, recompute the layout
            preview.setOnPreviewOutputUpdateListener(new Preview.OnPreviewOutputUpdateListener() {
                @Override
                public void onUpdated(Preview.PreviewOutput output) {
                    // To update the SurfaceTexture, we have to remove it and re-add it
                    ViewGroup parent = (ViewGroup) viewFinder.getParent();
                    parent.removeView(viewFinder);
                    parent.addView(viewFinder, 0);
    
                    viewFinder.setSurfaceTexture(output.getSurfaceTexture());
                    updateTransform();
                }
            });
    
            // Create configuration object for the image capture use case
            ImageCaptureConfig imageCaptureConfig = new ImageCaptureConfig.Builder()
                    .setTargetAspectRatio(new Rational(1, 1))
                    .setCaptureMode(ImageCapture.CaptureMode.MIN_LATENCY)
                    .build();
            final ImageCapture imageCapture = new ImageCapture(imageCaptureConfig);
            findViewById(R.id.capture_button).setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    File file = new File(getExternalMediaDirs()[0], System.currentTimeMillis() + ".jpg");
                    imageCapture.takePicture(file, new ImageCapture.OnImageSavedListener() {
                        @Override
                        public void onImageSaved(@NonNull File file) {
                            Toast.makeText(MainActivity.this, "Photo saved as " + file.getAbsolutePath(), Toast.LENGTH_SHORT).show();
                        }
    
                        @Override
                        public void onError(@NonNull ImageCapture.ImageCaptureError imageCaptureError, @NonNull String message, @Nullable Throwable cause) {
                            Toast.makeText(MainActivity.this, "Couldn't save photo: " + message, Toast.LENGTH_SHORT).show();
                            if (cause != null)
                                cause.printStackTrace();
                        }
                    });
                }
            });
    
            // Setup image analysis pipeline that computes average pixel luminance
            // TODO add analyzerThread and setCallbackHandler as in the original example in Kotlin
            ImageAnalysisConfig analysisConfig = new ImageAnalysisConfig.Builder()
                    .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE)
                    .build();
    
            // Build the image analysis use case and instantiate our analyzer
            ImageAnalysis imageAnalysis = new ImageAnalysis(analysisConfig);
            imageAnalysis.setAnalyzer(new LuminosityAnalyzer());
    
            // Bind use cases to lifecycle
            CameraX.bindToLifecycle(this, preview, imageCapture, imageAnalysis);
        }
    
        private boolean allPermissionsGranted() {
            for (String perm : REQUIRED_PERMISSIONS) {
                if (ContextCompat.checkSelfPermission(getBaseContext(), perm) != PackageManager.PERMISSION_GRANTED) {
                    return false;
                }
            }
            return true;
        }
    }
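
    The switch in updateTransform() maps Surface rotation constants to degrees. Since those constants are simply the integers 0 through 3, the mapping can be checked in plain Java (constants hardcoded to their Android values so this sketch runs off-device):

    ```java
    public class RotationSketch {
        // Android's Surface.ROTATION_* constants, hardcoded so the sketch
        // runs without the Android SDK. Their values are 0, 1, 2, 3.
        static final int ROTATION_0 = 0, ROTATION_90 = 1, ROTATION_180 = 2, ROTATION_270 = 3;

        static int toDegrees(int rotation) {
            switch (rotation) {
                case ROTATION_0:   return 0;
                case ROTATION_90:  return 90;
                case ROTATION_180: return 180;
                case ROTATION_270: return 270;
                default: throw new IllegalArgumentException("unknown rotation " + rotation);
            }
        }

        public static void main(String[] args) {
            System.out.println(toDegrees(ROTATION_270)); // 270
        }
    }
    ```

    For valid inputs this is equivalent to `rotation * 90`; the explicit switch, as in updateTransform(), also lets you bail out on unexpected values.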

    Special thanks to @fmmarzoa for this example. Download full code here.
