author    Linus Torvalds <torvalds@linux-foundation.org>  2023-11-07 02:06:06 +0300
committer Linus Torvalds <torvalds@linux-foundation.org>  2023-11-07 02:06:06 +0300
commit    be3ca57cfb777ad820c6659d52e60bbdd36bf5ff (patch)
tree      2aec9aa9c20d3a82bce9d3df93a049058c3bca4e /Documentation/userspace-api
parent    d2f51b3516dade79269ff45eae2a7668ae711b25 (diff)
parent    3e238417254bfdcc23fe207780b59cbb08656762 (diff)
download  linux-be3ca57cfb777ad820c6659d52e60bbdd36bf5ff.tar.xz
Merge tag 'media/v6.7-1' of git://git.kernel.org/pub/scm/linux/kernel/git/mchehab/linux-media
Pull media updates from Mauro Carvalho Chehab:

 - the old V4L2 core videobuf kAPI was finally removed. All media
   drivers should now be using the VB2 kAPI

 - new automotive driver: mgb4

 - new platform video driver: npcm-video

 - new sensor driver: mt9m114

 - new TI driver used in conjunction with Cadence CSI2RX IP to bridge
   TI-specific parts

 - ir-rx51 was removed and the N900 DT binding was moved to the
   pwm-ir-tx generic driver

 - drop atomisp-specific ov5693, using the upstream driver instead

 - the camss driver has gained RDI3 support for VFE 17x

 - the atomisp driver now detects ISP2400 or ISP2401 at run time. No
   need to set it up at build time anymore

 - lots of driver fixes, cleanups and improvements

* tag 'media/v6.7-1' of git://git.kernel.org/pub/scm/linux/kernel/git/mchehab/linux-media: (377 commits)
  media: nuvoton: VIDEO_NPCM_VCD_ECE should depend on ARCH_NPCM
  media: venus: Fix firmware path for resources
  media: venus: hfi_cmds: Replace one-element array with flex-array member and use __counted_by
  media: venus: hfi_parser: Add check to keep the number of codecs within range
  media: venus: hfi: add checks to handle capabilities from firmware
  media: venus: hfi: fix the check to handle session buffer requirement
  media: venus: hfi: add checks to perform sanity on queue pointers
  media: platform: cadence: select MIPI_DPHY dependency
  media: MAINTAINERS: Fix path for J721E CSI2RX bindings
  media: cec: meson: always include meson sub-directory in Makefile
  media: videobuf2: Fix IS_ERR checking in vb2_dc_put_userptr()
  media: platform: mtk-mdp3: fix uninitialized variable in mdp_path_config()
  media: mediatek: vcodec: using encoder device to alloc/free encoder memory
  media: imx-jpeg: notify source chagne event when the first picture parsed
  media: cx231xx: Use EP5_BUF_SIZE macro
  media: siano: Drop unnecessary error check for debugfs_create_dir/file()
  media: mediatek: vcodec: Handle invalid encoder vsi
  media: aspeed: Drop unnecessary error check for debugfs_create_file()
  Documentation: media: buffer.rst: fix V4L2_BUF_FLAG_PREPARED
  Documentation: media: gen-errors.rst: fix confusing ENOTTY description
  ...
Diffstat (limited to 'Documentation/userspace-api')
-rw-r--r--  Documentation/userspace-api/media/drivers/camera-sensor.rst | 104
-rw-r--r--  Documentation/userspace-api/media/drivers/index.rst         |   2
-rw-r--r--  Documentation/userspace-api/media/drivers/npcm-video.rst    |  66
-rw-r--r--  Documentation/userspace-api/media/gen-errors.rst            |   4
-rw-r--r--  Documentation/userspace-api/media/v4l/buffer.rst            |   4
-rw-r--r--  Documentation/userspace-api/media/v4l/control.rst           |   4
-rw-r--r--  Documentation/userspace-api/media/v4l/dev-subdev.rst        |  49
-rw-r--r--  Documentation/userspace-api/media/v4l/dv-timings.rst        |  21
-rw-r--r--  Documentation/userspace-api/media/v4l/pixfmt-reserved.rst   |   7
-rw-r--r--  Documentation/userspace-api/media/v4l/pixfmt-srggb12p.rst   |   4
10 files changed, 233 insertions, 32 deletions
diff --git a/Documentation/userspace-api/media/drivers/camera-sensor.rst b/Documentation/userspace-api/media/drivers/camera-sensor.rst
new file mode 100644
index 000000000000..919a50e8b9d9
--- /dev/null
+++ b/Documentation/userspace-api/media/drivers/camera-sensor.rst
@@ -0,0 +1,104 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. _media_using_camera_sensor_drivers:
+
+Using camera sensor drivers
+===========================
+
+This section describes common practices for how the V4L2 sub-device interface
+is used to control camera sensor drivers.
+
+You may also find :ref:`media_writing_camera_sensor_drivers` useful.
+
+Frame size
+----------
+
+There are two distinct ways to configure the frame size produced by camera
+sensors.
+
+Freely configurable camera sensor drivers
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Freely configurable camera sensor drivers expose the device's internal
+processing pipeline as one or more sub-devices with different cropping and
+scaling configurations. The output size of the device is the result of a series
+of cropping and scaling operations starting from the size of the device's pixel
+array.
+
+An example of such a driver is the CCS driver.
+
+Register list based drivers
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Register list based drivers are generally not able to configure the device they
+control based on user requests; instead, they are limited to a number of preset
+configurations that combine several parameters which are independent at the
+hardware level. The driver picks such a configuration based on the format set
+on a source pad at the end of the device's internal pipeline.
+
+Most sensor drivers are implemented this way.
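+
+For example, a driver of this kind might be configured by setting the format on
+its source pad. The following is a minimal sketch; the sub-device node path,
+pad number and format are assumptions, and error handling is omitted.
+
+.. code-block:: c
+
+    #include <fcntl.h>
+    #include <sys/ioctl.h>
+    #include <linux/media-bus-format.h>
+    #include <linux/v4l2-subdev.h>
+    #include <linux/videodev2.h>
+
+    int main(void)
+    {
+        int fd = open("/dev/v4l-subdev0", O_RDWR);
+        /* Request a preset via the source pad format (pad 0 assumed). */
+        struct v4l2_subdev_format fmt = {
+            .which = V4L2_SUBDEV_FORMAT_ACTIVE,
+            .pad = 0,
+            .format = {
+                .width = 1920,
+                .height = 1080,
+                .code = MEDIA_BUS_FMT_SRGGB10_1X10,
+                .field = V4L2_FIELD_NONE,
+            },
+        };
+
+        /* The driver picks the preset that best matches this format. */
+        ioctl(fd, VIDIOC_SUBDEV_S_FMT, &fmt);
+        return 0;
+    }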
+
+Frame interval configuration
+----------------------------
+
+There are two different methods for enumerating the possible frame intervals
+and for configuring the frame interval. Which one to implement depends on the
+type of the device.
+
+Raw camera sensors
+~~~~~~~~~~~~~~~~~~
+
+Raw camera sensors do not expose a high level frame interval parameter.
+Instead, the frame interval is a result of the configuration of a number of
+camera sensor implementation specific parameters. Luckily, these parameters
+tend to be the same for more or less all modern raw camera sensors.
+
+The frame interval is calculated using the following equation::
+
+ frame interval = (analogue crop width + horizontal blanking) *
+ (analogue crop height + vertical blanking) / pixel rate
+
+The formula is bus independent and is applicable for raw timing parameters on a
+large variety of devices beyond camera sensors. Devices that have no analogue
+crop use the full source image size, i.e. the pixel array size.
+
+Horizontal and vertical blanking are specified by ``V4L2_CID_HBLANK`` and
+``V4L2_CID_VBLANK``, respectively. The unit of the ``V4L2_CID_HBLANK`` control
+is pixels and the unit of the ``V4L2_CID_VBLANK`` is lines. The pixel rate in
+the sensor's **pixel array** is specified by ``V4L2_CID_PIXEL_RATE`` in the same
+sub-device. The unit of that control is pixels per second.
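+
+As a rough illustration, the following sketch reads these controls and the
+analogue crop rectangle from the sensor's sub-device and computes the frame
+interval using the equation above. The sub-device node path and pad number are
+assumptions, the analogue crop is assumed to be exposed as the
+``V4L2_SEL_TGT_CROP`` target on the pixel array pad, and error handling is
+omitted.
+
+.. code-block:: c
+
+    #include <fcntl.h>
+    #include <stdio.h>
+    #include <sys/ioctl.h>
+    #include <linux/v4l2-controls.h>
+    #include <linux/v4l2-subdev.h>
+    #include <linux/videodev2.h>
+
+    int main(void)
+    {
+        int fd = open("/dev/v4l-subdev0", O_RDWR);
+        struct v4l2_ext_control ctrl[3] = {
+            { .id = V4L2_CID_HBLANK },
+            { .id = V4L2_CID_VBLANK },
+            { .id = V4L2_CID_PIXEL_RATE },
+        };
+        struct v4l2_ext_controls ctrls = {
+            .which = V4L2_CTRL_WHICH_CUR_VAL,
+            .count = 3,
+            .controls = ctrl,
+        };
+        /* Analogue crop on the pixel array pad (pad 0 assumed). */
+        struct v4l2_subdev_selection sel = {
+            .which = V4L2_SUBDEV_FORMAT_ACTIVE,
+            .pad = 0,
+            .target = V4L2_SEL_TGT_CROP,
+        };
+        double interval;
+
+        ioctl(fd, VIDIOC_G_EXT_CTRLS, &ctrls);
+        ioctl(fd, VIDIOC_SUBDEV_G_SELECTION, &sel);
+
+        /* 32-bit controls use .value, 64-bit controls use .value64. */
+        interval = (double)(sel.r.width + ctrl[0].value) *
+                   (sel.r.height + ctrl[1].value) /
+                   ctrl[2].value64;
+
+        printf("frame interval: %f s (%.2f fps)\n", interval, 1.0 / interval);
+        return 0;
+    }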
+
+Register list based drivers need to implement read-only sub-device nodes for
+this purpose. Drivers that are not register list based need these nodes to
+configure the device's internal processing pipeline.
+
+The first entity in the linear pipeline is the pixel array. The pixel array may
+be followed by other entities that allow configuring binning, skipping, scaling
+or digital crop; see :ref:`VIDIOC_SUBDEV_G_SELECTION
+<VIDIOC_SUBDEV_G_SELECTION>`.
+
+USB cameras and similar devices
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+USB video class hardware, as well as many cameras offering a similar higher
+level interface natively, generally use the concept of frame interval (or frame
+rate) at the device level, in firmware or hardware. This means that the lower
+level controls implemented by raw cameras cannot be used in the uAPI (or even
+the kAPI) to control the frame interval on these devices.
+
+Rotation, orientation and flipping
+----------------------------------
+
+Some systems have the camera sensor mounted upside down compared to its natural
+mounting rotation. In such cases, drivers shall expose the information to
+userspace with the :ref:`V4L2_CID_CAMERA_SENSOR_ROTATION
+<v4l2-camera-sensor-rotation>` control.
+
+Sensor drivers shall also report the sensor's mounting orientation with the
+:ref:`V4L2_CID_CAMERA_ORIENTATION <v4l2-camera-sensor-orientation>` control.
+
+Sensor drivers that have any vertical or horizontal flips embedded in the
+register programming sequences shall initialize the :ref:`V4L2_CID_HFLIP
+<v4l2-cid-hflip>` and :ref:`V4L2_CID_VFLIP <v4l2-cid-vflip>` controls with the
+values programmed by the register sequences. The default values of these
+controls shall be 0 (disabled). In particular, these controls shall not be
+inverted, independently of the sensor's mounting rotation.
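+
+For instance, userspace might read the mounting rotation and orientation as
+follows. This is a minimal sketch; the sub-device node path is an assumption
+and error handling is omitted.
+
+.. code-block:: c
+
+    #include <fcntl.h>
+    #include <stdio.h>
+    #include <sys/ioctl.h>
+    #include <linux/v4l2-controls.h>
+    #include <linux/videodev2.h>
+
+    int main(void)
+    {
+        int fd = open("/dev/v4l-subdev0", O_RDWR);
+        /* Both controls are read-only. */
+        struct v4l2_control rot = { .id = V4L2_CID_CAMERA_SENSOR_ROTATION };
+        struct v4l2_control orient = { .id = V4L2_CID_CAMERA_ORIENTATION };
+
+        ioctl(fd, VIDIOC_G_CTRL, &rot);
+        ioctl(fd, VIDIOC_G_CTRL, &orient);
+
+        printf("rotation: %d degrees, orientation: %d\n",
+               rot.value, orient.value);
+        return 0;
+    }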
diff --git a/Documentation/userspace-api/media/drivers/index.rst b/Documentation/userspace-api/media/drivers/index.rst
index 6708d649afd7..1726f8ec86fa 100644
--- a/Documentation/userspace-api/media/drivers/index.rst
+++ b/Documentation/userspace-api/media/drivers/index.rst
@@ -32,11 +32,13 @@ For more details see the file COPYING in the source distribution of Linux.
:numbered:
aspeed-video
+ camera-sensor
ccs
cx2341x-uapi
dw100
imx-uapi
max2175
+ npcm-video
omap3isp-uapi
st-vgxy61
uvcvideo
diff --git a/Documentation/userspace-api/media/drivers/npcm-video.rst b/Documentation/userspace-api/media/drivers/npcm-video.rst
new file mode 100644
index 000000000000..b47771dd8b27
--- /dev/null
+++ b/Documentation/userspace-api/media/drivers/npcm-video.rst
@@ -0,0 +1,66 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. include:: <isonum.txt>
+
+NPCM video driver
+=================
+
+This driver is used to control the Video Capture/Differentiation (VCD) engine
+and Encoding Compression Engine (ECE) present on Nuvoton NPCM SoCs. The VCD can
+capture a frame from digital video input and compare two frames in memory, and
+the ECE can compress the frame data into HEXTILE format.
+
+Driver-specific Controls
+------------------------
+
+V4L2_CID_NPCM_CAPTURE_MODE
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The VCD engine supports two modes:
+
+- COMPLETE mode:
+
+ Capture the next complete frame into memory.
+
+- DIFF mode:
+
+ Compare the incoming frame with the frame stored in memory, and update the
+ differentiated frame in memory.
+
+Applications can use the ``V4L2_CID_NPCM_CAPTURE_MODE`` control to set the VCD
+mode with one of the following control values (enum v4l2_npcm_capture_mode):
+
+- ``V4L2_NPCM_CAPTURE_MODE_COMPLETE``: sets the VCD to COMPLETE mode.
+- ``V4L2_NPCM_CAPTURE_MODE_DIFF``: sets the VCD to DIFF mode.
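+
+A minimal sketch of selecting DIFF mode (the video node path is an assumption;
+error handling is omitted):
+
+.. code-block:: c
+
+    #include <fcntl.h>
+    #include <sys/ioctl.h>
+    #include <linux/npcm-video.h>
+    #include <linux/videodev2.h>
+
+    int main(void)
+    {
+        int fd = open("/dev/video0", O_RDWR);
+        struct v4l2_control ctrl = {
+            .id = V4L2_CID_NPCM_CAPTURE_MODE,
+            .value = V4L2_NPCM_CAPTURE_MODE_DIFF,
+        };
+
+        ioctl(fd, VIDIOC_S_CTRL, &ctrl);
+        return 0;
+    }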
+
+V4L2_CID_NPCM_RECT_COUNT
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+If using the V4L2_PIX_FMT_HEXTILE format, the VCD will capture frame data, and
+the ECE will then compress the data into HEXTILE rectangles and store them in
+the V4L2 video buffer with the layout defined in the Remote Framebuffer
+Protocol, RFC 6143 section 7.6.1
+(https://www.rfc-editor.org/rfc/rfc6143.html#section-7.6.1)::
+
+ +--------------+--------------+-------------------+
+ | No. of bytes | Type [Value] | Description |
+ +--------------+--------------+-------------------+
+ | 2 | U16 | x-position |
+ | 2 | U16 | y-position |
+ | 2 | U16 | width |
+ | 2 | U16 | height |
+ | 4 | S32 | encoding-type (5) |
+ +--------------+--------------+-------------------+
+ | HEXTILE rectangle data |
+ +-------------------------------------------------+
+
+Applications can dequeue the video buffer with VIDIOC_DQBUF and then read the
+``V4L2_CID_NPCM_RECT_COUNT`` control to get the number of HEXTILE rectangles in
+that buffer, as sketched below.
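+
+A minimal sketch (streaming with memory-mapped buffers is assumed to be
+running; error handling is omitted):
+
+.. code-block:: c
+
+    #include <stdio.h>
+    #include <sys/ioctl.h>
+    #include <linux/npcm-video.h>
+    #include <linux/videodev2.h>
+
+    void dequeue_and_count(int fd)
+    {
+        struct v4l2_buffer buf = {
+            .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
+            .memory = V4L2_MEMORY_MMAP,
+        };
+        struct v4l2_control ctrl = { .id = V4L2_CID_NPCM_RECT_COUNT };
+
+        ioctl(fd, VIDIOC_DQBUF, &buf);
+        ioctl(fd, VIDIOC_G_CTRL, &ctrl);
+
+        printf("buffer %u holds %d HEXTILE rectangles (%u bytes)\n",
+               buf.index, ctrl.value, buf.bytesused);
+    }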
+
+References
+----------
+include/uapi/linux/npcm-video.h
+
+**Copyright** |copy| 2022 Nuvoton Technologies
diff --git a/Documentation/userspace-api/media/gen-errors.rst b/Documentation/userspace-api/media/gen-errors.rst
index e595d0bea109..4e8defd3612b 100644
--- a/Documentation/userspace-api/media/gen-errors.rst
+++ b/Documentation/userspace-api/media/gen-errors.rst
@@ -59,9 +59,7 @@ Generic Error Codes
- - ``ENOTTY``
- - The ioctl is not supported by the driver, actually meaning that
- the required functionality is not available, or the file
- descriptor is not for a media device.
+ - The ioctl is not supported by the file descriptor.
- - ``ENOSPC``
diff --git a/Documentation/userspace-api/media/v4l/buffer.rst b/Documentation/userspace-api/media/v4l/buffer.rst
index 04dec3e570ed..52bbee81c080 100644
--- a/Documentation/userspace-api/media/v4l/buffer.rst
+++ b/Documentation/userspace-api/media/v4l/buffer.rst
@@ -549,9 +549,9 @@ Buffer Flags
- 0x00000400
- The buffer has been prepared for I/O and can be queued by the
application. Drivers set or clear this flag when the
- :ref:`VIDIOC_QUERYBUF`,
+ :ref:`VIDIOC_QUERYBUF <VIDIOC_QUERYBUF>`,
:ref:`VIDIOC_PREPARE_BUF <VIDIOC_QBUF>`,
- :ref:`VIDIOC_QBUF` or
+ :ref:`VIDIOC_QBUF <VIDIOC_QBUF>` or
:ref:`VIDIOC_DQBUF <VIDIOC_QBUF>` ioctl is called.
* .. _`V4L2-BUF-FLAG-NO-CACHE-INVALIDATE`:
diff --git a/Documentation/userspace-api/media/v4l/control.rst b/Documentation/userspace-api/media/v4l/control.rst
index 4463fce694b0..57893814a1e5 100644
--- a/Documentation/userspace-api/media/v4l/control.rst
+++ b/Documentation/userspace-api/media/v4l/control.rst
@@ -143,9 +143,13 @@ Control IDs
recognise the difference between digital and analogue gain use
controls ``V4L2_CID_DIGITAL_GAIN`` and ``V4L2_CID_ANALOGUE_GAIN``.
+.. _v4l2-cid-hflip:
+
``V4L2_CID_HFLIP`` ``(boolean)``
Mirror the picture horizontally.
+.. _v4l2-cid-vflip:
+
``V4L2_CID_VFLIP`` ``(boolean)``
Mirror the picture vertically.
diff --git a/Documentation/userspace-api/media/v4l/dev-subdev.rst b/Documentation/userspace-api/media/v4l/dev-subdev.rst
index a4f1df7093e8..43988516acdd 100644
--- a/Documentation/userspace-api/media/v4l/dev-subdev.rst
+++ b/Documentation/userspace-api/media/v4l/dev-subdev.rst
@@ -579,20 +579,19 @@ is started.
There are three steps in configuring the streams:
-1) Set up links. Connect the pads between sub-devices using the :ref:`Media
-Controller API <media_controller>`
+1. Set up links. Connect the pads between sub-devices using the
+ :ref:`Media Controller API <media_controller>`
-2) Streams. Streams are declared and their routing is configured by
-setting the routing table for the sub-device using
-:ref:`VIDIOC_SUBDEV_S_ROUTING <VIDIOC_SUBDEV_G_ROUTING>` ioctl. Note that
-setting the routing table will reset formats and selections in the
-sub-device to default values.
+2. Streams. Streams are declared and their routing is configured by setting the
+ routing table for the sub-device using the :ref:`VIDIOC_SUBDEV_S_ROUTING
+ <VIDIOC_SUBDEV_G_ROUTING>` ioctl. Note that setting the routing table will
+ reset formats and selections in the sub-device to default values (see the
+ sketch after this list).
-3) Configure formats and selections. Formats and selections of each stream
-are configured separately as documented for plain sub-devices in
-:ref:`format-propagation`. The stream ID is set to the same stream ID
-associated with either sink or source pads of routes configured using the
-:ref:`VIDIOC_SUBDEV_S_ROUTING <VIDIOC_SUBDEV_G_ROUTING>` ioctl.
+3. Configure formats and selections. Formats and selections of each stream are
+ configured separately as documented for plain sub-devices in
+ :ref:`format-propagation`. The stream ID is set to the same stream ID
+ associated with either sink or source pads of routes configured using the
+ :ref:`VIDIOC_SUBDEV_S_ROUTING <VIDIOC_SUBDEV_G_ROUTING>` ioctl.
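+
+A minimal sketch of step 2 for the example below (pad and stream numbers are
+illustrative; error handling is omitted):
+
+.. code-block:: c
+
+    #include <stdint.h>
+    #include <sys/ioctl.h>
+    #include <linux/v4l2-subdev.h>
+
+    void setup_routing(int bridge_fd)
+    {
+        struct v4l2_subdev_route routes[] = {
+            /* Sensor A: sink pad 0 -> source pad 2, stream 0 */
+            { .sink_pad = 0, .sink_stream = 0,
+              .source_pad = 2, .source_stream = 0,
+              .flags = V4L2_SUBDEV_ROUTE_FL_ACTIVE },
+            /* Sensor B: sink pad 1 -> source pad 2, stream 1 */
+            { .sink_pad = 1, .sink_stream = 0,
+              .source_pad = 2, .source_stream = 1,
+              .flags = V4L2_SUBDEV_ROUTE_FL_ACTIVE },
+        };
+        struct v4l2_subdev_routing routing = {
+            .which = V4L2_SUBDEV_FORMAT_ACTIVE,
+            .num_routes = 2,
+            .routes = (uintptr_t)routes,
+        };
+
+        /* Note: this resets formats and selections to defaults. */
+        ioctl(bridge_fd, VIDIOC_SUBDEV_S_ROUTING, &routing);
+    }
+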
Multiplexed streams setup example
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -618,11 +617,11 @@ modeled as V4L2 devices, exposed to userspace via /dev/videoX nodes.
To configure this pipeline, the userspace must take the following steps:
-1) Set up media links between entities: connect the sensors to the bridge,
-bridge to the receiver, and the receiver to the DMA engines. This step does
-not differ from normal non-multiplexed media controller setup.
+1. Set up media links between entities: connect the sensors to the bridge,
+ bridge to the receiver, and the receiver to the DMA engines. This step does
+ not differ from normal non-multiplexed media controller setup.
-2) Configure routing
+2. Configure routing
.. flat-table:: Bridge routing table
:header-rows: 1
@@ -656,14 +655,14 @@ not differ from normal non-multiplexed media controller setup.
- V4L2_SUBDEV_ROUTE_FL_ACTIVE
- Pixel data stream from Sensor B
-3) Configure formats and selections
+3. Configure formats and selections
-After configuring routing, the next step is configuring the formats and
-selections for the streams. This is similar to performing this step without
-streams, with just one exception: the ``stream`` field needs to be assigned
-to the value of the stream ID.
+ After configuring routing, the next step is configuring the formats and
+ selections for the streams. This is similar to performing this step without
+ streams, with just one exception: the ``stream`` field needs to be assigned
+ to the value of the stream ID.
-A common way to accomplish this is to start from the sensors and propagate the
-configurations along the stream towards the receiver,
-using :ref:`VIDIOC_SUBDEV_S_FMT <VIDIOC_SUBDEV_G_FMT>` ioctls to configure each
-stream endpoint in each sub-device.
+ A common way to accomplish this is to start from the sensors and propagate
+ the configurations along the stream towards the receiver, using
+ :ref:`VIDIOC_SUBDEV_S_FMT <VIDIOC_SUBDEV_G_FMT>` ioctls to configure each
+ stream endpoint in each sub-device.
diff --git a/Documentation/userspace-api/media/v4l/dv-timings.rst b/Documentation/userspace-api/media/v4l/dv-timings.rst
index e17f056b129f..4b19bcb4bd80 100644
--- a/Documentation/userspace-api/media/v4l/dv-timings.rst
+++ b/Documentation/userspace-api/media/v4l/dv-timings.rst
@@ -33,6 +33,27 @@ current DV timings they use the
the DV timings as seen by the video receiver applications use the
:ref:`VIDIOC_QUERY_DV_TIMINGS` ioctl.
+When the hardware detects a video source change (e.g. the video
+signal appears or disappears, or the video resolution changes), it
+will issue a `V4L2_EVENT_SOURCE_CHANGE` event. Use the
+:ref:`ioctl VIDIOC_SUBSCRIBE_EVENT <VIDIOC_SUBSCRIBE_EVENT>` and
+:ref:`VIDIOC_DQEVENT` ioctls to check whether this event was reported.
+
+If the video signal changed, then the application has to stop streaming, free
+all buffers, and call :ref:`VIDIOC_QUERY_DV_TIMINGS` to obtain the new video
+timings. If the timings are valid, it can set them by calling the :ref:`ioctl
+VIDIOC_S_DV_TIMINGS <VIDIOC_G_DV_TIMINGS>`. This will also update the format,
+so use the :ref:`ioctl VIDIOC_G_FMT <VIDIOC_G_FMT>` to obtain the new format.
+Now the application can allocate new buffers and start streaming again, as
+sketched below.
+
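+A minimal sketch of this sequence (the wait for the event, buffer management
+and error handling are elided):
+
+.. code-block:: c
+
+    #include <string.h>
+    #include <sys/ioctl.h>
+    #include <linux/videodev2.h>
+
+    void handle_source_change(int fd)
+    {
+        struct v4l2_event_subscription sub = {
+            .type = V4L2_EVENT_SOURCE_CHANGE,
+        };
+        struct v4l2_event ev;
+        struct v4l2_dv_timings timings;
+        struct v4l2_format fmt = { .type = V4L2_BUF_TYPE_VIDEO_CAPTURE };
+
+        ioctl(fd, VIDIOC_SUBSCRIBE_EVENT, &sub);
+
+        /* ... wait for the event with select() or poll(), then: */
+        ioctl(fd, VIDIOC_DQEVENT, &ev);
+        if (ev.type != V4L2_EVENT_SOURCE_CHANGE)
+            return;
+
+        /* Stop streaming and free all buffers here, then: */
+        memset(&timings, 0, sizeof(timings));
+        if (ioctl(fd, VIDIOC_QUERY_DV_TIMINGS, &timings) == 0) {
+            ioctl(fd, VIDIOC_S_DV_TIMINGS, &timings);
+            ioctl(fd, VIDIOC_G_FMT, &fmt);
+            /* Allocate new buffers and restart streaming here. */
+        }
+    }
+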
+:ref:`VIDIOC_QUERY_DV_TIMINGS` just reports what the hardware detects; it
+never changes the configuration. If the currently set timings and the actually
+detected timings differ, then typically this will mean that you will not be
+able to capture any video. The correct approach is to rely on the
+`V4L2_EVENT_SOURCE_CHANGE` event so you know when something changed.
+
Applications can make use of the :ref:`input-capabilities` and
:ref:`output-capabilities` flags to determine whether the digital
video ioctls can be used with the given input or output.
diff --git a/Documentation/userspace-api/media/v4l/pixfmt-reserved.rst b/Documentation/userspace-api/media/v4l/pixfmt-reserved.rst
index 296ad2025e8d..886ba7b08d6b 100644
--- a/Documentation/userspace-api/media/v4l/pixfmt-reserved.rst
+++ b/Documentation/userspace-api/media/v4l/pixfmt-reserved.rst
@@ -288,6 +288,13 @@ please make a proposal on the linux-media mailing list.
- 'MT2110R'
- This format is a two-planar 10-bit raster mode format similar to
``V4L2_PIX_FMT_MM21`` in terms of alignment and tiling. Used for AVC.
+ * .. _V4L2-PIX-FMT-HEXTILE:
+
+ - ``V4L2_PIX_FMT_HEXTILE``
+ - 'HXTL'
+ - Compressed format used by the Nuvoton NPCM video driver. This format is
+ defined in the Remote Framebuffer Protocol (RFC 6143, chapter 7.7.4, Hextile
+ Encoding).
.. raw:: latex
\normalsize
diff --git a/Documentation/userspace-api/media/v4l/pixfmt-srggb12p.rst b/Documentation/userspace-api/media/v4l/pixfmt-srggb12p.rst
index b6e79e2f8ce4..7c3810ff783c 100644
--- a/Documentation/userspace-api/media/v4l/pixfmt-srggb12p.rst
+++ b/Documentation/userspace-api/media/v4l/pixfmt-srggb12p.rst
@@ -60,7 +60,7 @@ Each cell is one byte.
G\ :sub:`10low`\ (bits 3--0)
- G\ :sub:`12high`
- R\ :sub:`13high`
- - R\ :sub:`13low`\ (bits 3--2)
+ - R\ :sub:`13low`\ (bits 7--4)
G\ :sub:`12low`\ (bits 3--0)
- - start + 12:
@@ -82,6 +82,6 @@ Each cell is one byte.
G\ :sub:`30low`\ (bits 3--0)
- G\ :sub:`32high`
- R\ :sub:`33high`
- - R\ :sub:`33low`\ (bits 3--2)
+ - R\ :sub:`33low`\ (bits 7--4)
G\ :sub:`32low`\ (bits 3--0)