TL;DR: You can use the front and back cameras at the same time if the device’s software and hardware support it, which can’t be taken for granted. Currently, the only way to query this is to use the concurrent camera streaming API or look for the FEATURE_CAMERA_CONCURRENT feature on the system.
Many modern Android phones have more than one camera on the front and/or the back. Some cameras work together to make your photos look better, like capturing more light with the help of a monochrome camera, or utilizing features such as the bokeh effect using additional information from a depth camera. Other cameras work independently, each offering a different feature to broaden your photo-taking abilities. You can choose a wide-angle camera to take ultrawide selfies, or zoom in on details using a telephoto camera without any physical changes on your device.
However, there’s a catch. These cameras are meant to face the same direction, i.e., either the front or back. What about using the front and back cameras at the same time?
This option has already been with us for a while. It was first introduced by HMD Global, the manufacturer of Nokia Android phones, as Dual-Sight mode (also known as Bothie mode) back in 2017. A few years later, Samsung released Director’s View and Dual recording modes in its camera app. Other examples are Xiaomi smartphones supporting Dual Video mode or Oppo devices with Dual-view Video.
FYI: Apple introduced Multi-Camera Capture for iOS in 2019.
Director’s View (Source: samsung.com)
In general, several phones offer this feature, but it is not widely supported. Why is that, in a world of dual-, triple- or even quad-camera smartphones that are already capable of processing input from multiple lenses? Why do some include it, while others do not?
Well, I don’t know exactly. Whatever the challenge, vendors keep their solutions to themselves to maintain a competitive advantage.
Nevertheless, I’d like to share some insight from my recent investigation. Although it doesn’t provide all the answers, it might serve as a good starting point for anyone interested in the topic.
Overall, there are three options for adding camera functionality to an Android app: the Jetpack CameraX library, the Camera2 framework API, and the deprecated Camera API.
CameraX (version 1.2.0-beta01 at the time of writing) does not support selecting the front and back cameras at the same time. You can request either the default front-facing camera or the default back-facing camera, but not both. To have more granular control over camera selection, you have to use Camera2.
For more information about camera selection in CameraX, check the Camera selection section in the CameraX guide.
It is crucial to understand the multi-camera API, which was introduced in Android 9 (API level 28). Even though it might sound like what we need, it’s not. The multi-camera API allows operating multiple physical cameras simultaneously through an abstract (software) logical camera, but those cameras must point in the same direction.
The output of the logical camera depends on its implementation. It might be a stream from one of its physical cameras or a fused stream coming from more than one. Either way, there is only a single active camera session.
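To make the distinction concrete, here is a minimal sketch of the pure logic involved. On API 28+, `CameraCharacteristics.getPhysicalCameraIds()` reports the physical sub-cameras behind a camera ID; a camera is logical when it exposes two or more physical IDs. The camera IDs in the example are hypothetical, and in a real app the map would be built by querying `CameraManager`:

```kotlin
// A logical camera exposes two or more physical sub-cameras via
// CameraCharacteristics.getPhysicalCameraIds() (API 28+). Given a map of
// camera ID -> physical IDs read from the framework, keep only the logical ones.
fun logicalCameras(physicalIds: Map<String, Set<String>>): Map<String, Set<String>> =
    physicalIds.filterValues { it.size >= 2 }

fun main() {
    // Hypothetical device: camera "0" is a logical camera fusing two
    // physical sensors, camera "1" is a plain single-sensor camera.
    val ids = mapOf(
        "0" to setOf("2", "3"),
        "1" to emptySet<String>()
    )
    println(logicalCameras(ids)) // only "0" remains
}
```

Opening the logical camera ID still produces a single camera session, which is exactly why this API does not help with the front-and-back use case.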
Concurrent Camera Streaming
Although it is not mentioned in the current Camera2 guide, starting with Android 11 (API level 30) the Camera2 API contains methods to find out whether a device supports concurrent streaming of multiple cameras, including the possibility to operate the front and back cameras at the same time.
Its presence can also be checked on PackageManager using the FEATURE_CAMERA_CONCURRENT feature name, which can be advertised by devices on API levels below 30.
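As a sketch of how an app might consume this API: on API 30+, `CameraManager.getConcurrentCameraIds()` returns a `Set<Set<String>>` of camera ID combinations that can stream concurrently, and each camera’s direction can be read from `CameraCharacteristics.LENS_FACING`. The helper below takes those two pieces of data (the sample values are hypothetical) and picks the first combination containing both a front and a back camera:

```kotlin
// Mirrors CameraCharacteristics.LENS_FACING_FRONT / LENS_FACING_BACK.
const val LENS_FACING_FRONT = 0
const val LENS_FACING_BACK = 1

// Given the combinations reported by CameraManager.getConcurrentCameraIds()
// and a map of camera ID -> lens facing (read from CameraCharacteristics),
// return the first front/back pair that can stream concurrently, or null.
fun findFrontBackPair(
    combinations: Set<Set<String>>,
    lensFacing: Map<String, Int>
): Pair<String, String>? {
    for (combo in combinations) {
        val front = combo.firstOrNull { lensFacing[it] == LENS_FACING_FRONT }
        val back = combo.firstOrNull { lensFacing[it] == LENS_FACING_BACK }
        if (front != null && back != null) return front to back
    }
    return null
}

fun main() {
    // Hypothetical device: camera "0" faces back, camera "1" faces front,
    // and the framework reports they may run concurrently.
    val combos = setOf(setOf("0", "1"))
    val facing = mapOf("0" to LENS_FACING_BACK, "1" to LENS_FACING_FRONT)
    println(findFrontBackPair(combos, facing)) // a usable front/back pair
}
```

If the returned set of combinations is empty, or no combination mixes both directions, the device offers no guarantee that the front and back cameras can be opened together.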
Compared to the multi-camera API, the main difference is that concurrent cameras operate as standalone entities — every camera has its own camera session.
To allow app developers to query a device, manufacturers must implement appropriate methods defined in the camera Hardware Abstraction Layer (HAL).
Camera Hardware Abstraction Layer (HAL) defines standardized interfaces that connect the higher-level camera framework APIs with the camera driver and hardware. Every camera vendor must implement HAL so that apps can correctly operate with the camera hardware. For more information, check the documentation.
For instance, the Samsung Galaxy S22 series, with Director’s View in the camera app, allows using pairs of cameras corresponding to the set of combinations returned by the API. Google Pixel 6 behaves the same way, even though it does not have a similar mode in the camera app.
On the other hand, some devices — such as the Samsung Galaxy Z Flip or Xiaomi Poco X3 — ship with the appropriate camera mode and allow running the front and back cameras together programmatically, yet FEATURE_CAMERA_CONCURRENT is not available on them and the returned set of combinations is empty. Does this mean that third-party apps should not rely on the functionality?
Apart from that, the majority of devices I tried declared an empty set of concurrent camera combinations and invoked the onError callback with the ERROR_MAX_CAMERAS_IN_USE error code when I attempted to open the front and back cameras.
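Because the declared combinations can’t be fully trusted, a dual-camera app still has to handle the error at runtime, inside `CameraDevice.StateCallback.onError()`. The sketch below shows one hypothetical degradation policy; the constants mirror the values documented on `CameraDevice.StateCallback`:

```kotlin
// Error codes mirroring CameraDevice.StateCallback (values per framework docs).
const val ERROR_CAMERA_IN_USE = 1
const val ERROR_MAX_CAMERAS_IN_USE = 2
const val ERROR_CAMERA_DISABLED = 3
const val ERROR_CAMERA_DEVICE = 4

// When the second openCamera() call fails, decide whether the app should
// gracefully fall back to single-camera mode (a hypothetical policy: do so
// for resource conflicts, but not for policy or hardware failures).
fun shouldFallBackToSingleCamera(error: Int): Boolean =
    error == ERROR_MAX_CAMERAS_IN_USE || error == ERROR_CAMERA_IN_USE

fun main() {
    // The onError callback for the second camera would feed its code here.
    println(shouldFallBackToSingleCamera(ERROR_MAX_CAMERAS_IN_USE)) // true
    println(shouldFallBackToSingleCamera(ERROR_CAMERA_DISABLED))    // false
}
```

In a real app, the fallback branch would close the second CameraDevice and continue with a single active session instead of crashing.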
But if we can run multiple cameras that are part of one logical camera using the multi-camera API on one device, why can’t the same device automatically handle multiple cameras that are not in one logical unit, thus getting concurrent streaming support for free?
There was only one way to figure it out…
Image Signal Processor
In general, digital photography is about collecting light information through the optical lens and converting it into an electrical signal by image sensors. The raw data from the sensors is then sent to an Image Signal Processor (ISP), where the original characteristics of the scene are reproduced to create an attractive picture that looks similar to what we saw. Smartphones have the ISP integrated within the system-on-chip (SoC). Even though the tiny size of the lens and image sensors on a phone limits the visual output of the camera, the ISP can compensate for it. Therefore, the ISP has as much impact on the image quality as the smartphone’s camera itself.
If we want to process data from more cameras, the ISP must have sufficient hardware resources for it — which might not be the case for every Android device. On the other hand, it is not something exceptional.
Current chipsets already provide dual and triple ISPs with advanced processing pipelines that claim to be able to run two, three or even four cameras concurrently. After all, every device operating multiple cameras facing the same direction should have corresponding hardware support. So why don’t manufacturers also utilize that power for running the front and back cameras simultaneously?
Dual ISP (Source: semiconductor.samsung.com)
Let’s take the Google Pixel 3 as an example.
The phone has a dual front camera. Both cameras are 8 MP, where one is wide and the second is ultrawide. On the back is a wide 12.2 MP single camera.
Pixel 3 is equipped with a Snapdragon 845 processor. Its Qualcomm Spectra 280 image signal processor (dual 14-bit ISPs) is stated to support processing the output from two 16 MP cameras at once.
Based on the specification, I assumed that the device should be able to run the front and back cameras concurrently. But this didn’t end up being true. While testing, I was only able to run both front cameras — which is to be expected, as they are backed by a software logical camera. When I tried to run one of the front cameras and the back camera simultaneously, I received the ERROR_MAX_CAMERAS_IN_USE error code.
Google Pixel 3 Front-Facing Cameras (Source: ifixit.com)
The number of processing pipelines that an ISP offers (dual, triple, etc.) is apparently not the only indicator. The ISP and camera are standalone components that might even be provided by different vendors. Therefore, the way they are wired plays a role. In one scenario, multiple cameras can be connected to the same processing pipeline, which by definition can operate only one camera at a time.
Once the hardware is wired up, it may not be possible to change how it works with a software update.
In the case of Pixel 3, this could potentially mean that the back camera shares a pipeline with one of the front cameras and hence they cannot be opened together. Because the front cameras can run simultaneously, their pipelines should be separate.
Another reason might simply be concerns about performance, battery life, thermal limits, bandwidth constraints… or just a lack of time and motivation to implement such an unusual use case. If the amount of resources is limited, it is logical that manufacturers prefer to utilize it for features with higher added value for the customer — like better captures using two back cameras — rather than sacrificing it for the nice-to-have front-and-back camera support.
This is, of course, just a theory. The truth might be somewhere else and likely varies device by device, manufacturer by manufacturer.
To figure out the approximate ratio of devices that support concurrent streaming of the front and back cameras without manually querying all devices, I decided to reach out to the most popular Android phone manufacturers of 2021 to obtain a list of devices that might provide the functionality in some form. Unfortunately, the majority of their technical-support teams could not provide sufficient data.
As an alternative, I used the statistics of a dating app to create a sample data set from phones on which the app was installed. Then I filtered models that have the FEATURE_CAMERA_CONCURRENT system feature. The application is distributed worldwide with the largest audience in the United States (≈16%).
The outcome is not verified, as I simply do not have enough devices on which to test it.
Dating app’s active devices with the FEATURE_CAMERA_CONCURRENT feature
The result is that, out of nearly 120,000 Samsung, Xiaomi, Oppo and Vivo phones that have the dating app installed, roughly 23% should be able to run the front and back cameras concurrently. The majority of these devices are made by Samsung.
So, can we use the front and back cameras simultaneously? Only on devices with the required software and hardware support. While seemingly obvious, this is not trivial to determine. So far, the only way to establish it is by using the concurrent camera streaming API or by looking for the FEATURE_CAMERA_CONCURRENT feature on the system — the implementation of which doesn’t appear unified among models.
Let's see how the Android camera evolves in the near future. Maybe the option of running the front and back cameras at once will become the new standard, just like having multiple cameras. It could help to boost vlogs, memories, the augmented reality experience… or creative users might come up with something entirely new.
I’d like to thank Helen Koike for advice and consultation, Alexander Kovalenko for endless rubberducking, Juraj Kuliska for providing feedback and Linda Krestanova for the English check.
Helen Koike. Image Signal Processing (ISP) Drivers & How to Merge One Upstream
Douglas Schmidt. Infrastructure Middleware (Part 1): the Android Hardware Abstraction Layer
Android Authority. Qualcomm ISP Explained
Android Developer Backstage. Episode 177: Honor Every Photon
Lifewire. What Are Dual Cameras?