<!-- Copyright (C) 2013 The Android Open Source Project

     Licensed under the Apache License, Version 2.0 (the "License");
     you may not use this file except in compliance with the License.
     You may obtain a copy of the License at

          http://www.apache.org/licenses/LICENSE-2.0

     Unless required by applicable law or agreed to in writing, software
     distributed under the License is distributed on an "AS IS" BASIS,
     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     See the License for the specific language governing permissions and
     limitations under the License.
-->
<HTML>
<BODY>
<p>The android.hardware.camera2 package provides an interface to
individual camera devices connected to an Android device. It replaces
the deprecated {@link android.hardware.Camera} class.</p>

<p>This package models a camera device as a pipeline, which takes in
requests to capture a single frame, captures a single image per
request, and then outputs one capture result metadata packet, plus a
set of output image buffers for the request. Requests are processed
in order, and multiple requests can be in flight at once. Since the
camera device is a pipeline with multiple stages, having multiple
requests in flight is required to maintain full frame rate on most
Android devices.</p>

<p>To enumerate, query, and open available camera devices, obtain a
{@link android.hardware.camera2.CameraManager} instance.</p>

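<p>As a minimal sketch, enumeration and opening might look as follows. The
method and variable names here are illustrative, not part of the API; the
sketch assumes the CAMERA permission has already been granted and that
<code>handler</code> runs on a suitable background thread.</p>

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

// Sketch: enumerate cameras and open the first one.
void openFirstCamera(Context context, Handler handler)
        throws CameraAccessException {
    CameraManager manager =
            (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    String[] ids = manager.getCameraIdList();
    if (ids.length == 0) return; // no cameras available

    manager.openCamera(ids[0], new CameraDevice.StateCallback() {
        @Override public void onOpened(CameraDevice camera) {
            // The device is ready; create a capture session here.
        }
        @Override public void onDisconnected(CameraDevice camera) {
            camera.close();
        }
        @Override public void onError(CameraDevice camera, int error) {
            camera.close();
        }
    }, handler);
}
```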
<p>Individual {@link android.hardware.camera2.CameraDevice
CameraDevices} provide a set of static property information that
describes the hardware device and the available settings and output
parameters for the device. This information is provided through the
{@link android.hardware.camera2.CameraCharacteristics} object, and is
available through {@link
android.hardware.camera2.CameraManager#getCameraCharacteristics}.</p>

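<p>For instance, a short sketch (illustrative only) of querying which way a
camera faces and which JPEG output sizes it supports; <code>manager</code>
and <code>cameraId</code> are assumed to come from the enumeration step
described above:</p>

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.util.Size;

// Sketch: query static properties of a camera device.
void describeCamera(CameraManager manager, String cameraId)
        throws CameraAccessException {
    CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);

    Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
    boolean backFacing =
            facing != null && facing == CameraCharacteristics.LENS_FACING_BACK;

    StreamConfigurationMap map =
            chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    // Every JPEG resolution this device can produce:
    Size[] jpegSizes = map.getOutputSizes(ImageFormat.JPEG);
}
```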
<p>To capture or stream images from a camera device, the application
must first create a {@link
android.hardware.camera2.CameraCaptureSession camera capture session}
with a set of output Surfaces for use with the camera device, with
{@link
android.hardware.camera2.CameraDevice#createCaptureSession}. Each
Surface has to be pre-configured with an {@link
android.hardware.camera2.params.StreamConfigurationMap appropriate
size and format} (if applicable) to match the sizes and formats
available from the camera device. A target Surface can be obtained
from a variety of classes, including {@link android.view.SurfaceView},
{@link android.graphics.SurfaceTexture} via
{@link android.view.Surface#Surface(SurfaceTexture)},
{@link android.media.MediaCodec}, {@link android.media.MediaRecorder},
{@link android.renderscript.Allocation}, and {@link android.media.ImageReader}.
</p>

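<p>A minimal sketch of session creation, assuming an already-opened
<code>CameraDevice</code> and using a single JPEG ImageReader as the only
output. The 1920x1080 size is illustrative; a real application must pick a
size the device actually advertises for that format.</p>

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.media.ImageReader;
import android.os.Handler;
import java.util.Arrays;

// Sketch: configure a capture session with one JPEG output Surface.
void createSession(CameraDevice camera, Handler handler)
        throws CameraAccessException {
    ImageReader reader =
            ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, /*maxImages=*/ 2);

    camera.createCaptureSession(
            Arrays.asList(reader.getSurface()),
            new CameraCaptureSession.StateCallback() {
                @Override public void onConfigured(CameraCaptureSession session) {
                    // The session is ready; submit CaptureRequests here.
                }
                @Override public void onConfigureFailed(CameraCaptureSession session) {
                    // The requested set of outputs could not be configured.
                }
            }, handler);
}
```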
<p>Generally, camera preview images are sent to {@link
android.view.SurfaceView} or {@link android.view.TextureView} (via its
{@link android.graphics.SurfaceTexture}). Capture of JPEG images or
RAW buffers for {@link android.hardware.camera2.DngCreator} can be
done with {@link android.media.ImageReader} with the {@link
android.graphics.ImageFormat#JPEG} and {@link
android.graphics.ImageFormat#RAW_SENSOR} formats. Application-driven
processing of camera data in OpenGL ES, or directly in managed or
native code, is best done through {@link
android.graphics.SurfaceTexture} or {@link android.media.ImageReader}
with a {@link android.graphics.ImageFormat#YUV_420_888} format,
respectively.</p>

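<p>For the last case, a sketch of receiving YUV buffers for in-app
processing might look like this (the 1280x720 size is illustrative and must
match a supported YUV output size):</p>

```java
import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;

// Sketch: deliver YUV_420_888 buffers to application code.
ImageReader createYuvReader(Handler handler) {
    ImageReader reader =
            ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, /*maxImages=*/ 3);
    reader.setOnImageAvailableListener(r -> {
        Image image = r.acquireLatestImage();
        if (image == null) return;
        try {
            // image.getPlanes() exposes the Y, U, and V planes for processing.
        } finally {
            image.close(); // always release the buffer back to the camera
        }
    }, handler);
    return reader;
}
```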
<p>By default, YUV-format buffers provided by the camera use the
JFIF YUV&lt;-&gt;RGB transform matrix (equivalent to Rec.601 full-range
encoding); after conversion to RGB with this matrix, the resulting
RGB data is in the sRGB colorspace. Captured JPEG images may contain
an ICC profile to specify their color space information; if not, they
should be assumed to be in the sRGB space as well. On some devices,
the output colorspace can be changed via {@link
android.hardware.camera2.params.SessionConfiguration#setColorSpace}.
</p>
<p>
Note that although the YUV-&gt;RGB transform is the JFIF matrix (Rec.601
full-range), for legacy and compatibility reasons the output is in
the sRGB colorspace, which uses the Rec.709 color primaries. Image
processing code can safely treat the output RGB as being in the sRGB
colorspace.
</p>

<p>The application then needs to construct a {@link
android.hardware.camera2.CaptureRequest}, which defines all the
capture parameters needed by a camera device to capture a single
image. The request also lists which of the configured output Surfaces
should be used as targets for this capture. The CameraDevice has a
{@link android.hardware.camera2.CameraDevice#createCaptureRequest
factory method} for creating a {@link
android.hardware.camera2.CaptureRequest.Builder request builder} for a
given use case, which is optimized for the Android device the
application is running on.</p>

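<p>A minimal sketch of building a preview request, assuming an open
<code>CameraDevice</code> and a Surface already configured into the
session:</p>

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.view.Surface;

// Sketch: build a request from the preview template, targeting one
// pre-configured output Surface.
CaptureRequest buildPreviewRequest(CameraDevice camera, Surface previewSurface)
        throws CameraAccessException {
    CaptureRequest.Builder builder =
            camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);
    // Individual settings can be overridden from the template's defaults:
    builder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
    return builder.build();
}
```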
<p>Once the request has been set up, it can be handed to the active
capture session either for a one-shot {@link
android.hardware.camera2.CameraCaptureSession#capture capture} or for
an endlessly {@link
android.hardware.camera2.CameraCaptureSession#setRepeatingRequest
repeating} use. Both methods also have a variant that accepts a list
of requests to use as a burst capture / repeating burst. Repeating
requests have a lower priority than captures, so a request submitted
through <code>capture()</code> while a repeating request is active
will be captured before any new instances of the currently repeating
(burst) request begin.</p>

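<p>The interplay of the two can be sketched as follows, assuming the
preview and still requests have already been built:</p>

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;

// Sketch: run a repeating preview, then interleave a one-shot still.
void startCapturing(CameraCaptureSession session,
        CaptureRequest previewRequest, CaptureRequest stillRequest,
        Handler handler) throws CameraAccessException {
    // Repeats until stopRepeating() or another setRepeatingRequest() call.
    session.setRepeatingRequest(previewRequest, /*listener=*/ null, handler);
    // The one-shot capture takes priority over the repeating request.
    session.capture(stillRequest, /*listener=*/ null, handler);
}
```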
<p>After processing a request, the camera device will produce a {@link
android.hardware.camera2.TotalCaptureResult} object, which contains
information about the state of the camera device at time of capture,
and the final settings used. These may vary somewhat from the request,
if rounding or resolving contradictory parameters was necessary. The
camera device will also send a frame of image data into each of the
output {@code Surfaces} included in the request. These are produced
asynchronously relative to the output CaptureResult, sometimes
substantially later.</p>

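<p>Results are delivered through a capture callback, a sketch of which is
shown below; note that the matching image buffers arrive separately, via
the request's output Surfaces, not through this callback:</p>

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;

// Sketch: receive the final metadata for each completed capture.
CameraCaptureSession.CaptureCallback callback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
            CaptureRequest request, TotalCaptureResult result) {
        // The sensor timestamp identifies which frame this result describes,
        // so it can be matched against buffers arriving on the Surfaces.
        Long timestamp = result.get(CaptureResult.SENSOR_TIMESTAMP);
    }
};
```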
</BODY>
</HTML>