NVIDIA Jetson platforms powered by Tegra processors have established a strong position in the edge analytics market, particularly for video analytics and machine vision applications. Successful Embedded MIPI CSI camera integration on these platforms is central to unlocking their full computational power. While a wide range of interfaces — MIPI-CSI, USB, and Gigabit Ethernet — can be used for video data acquisition, the CSI interface remains the preferred choice for machine vision use cases due to its low latency and high bandwidth.
In this blog, we discuss in detail the camera interface and data flow in Jetson Tegra platforms, and walk through a typical configuration and setup for a MIPI-CSI driver. For specifics, we will consider the Jetson Xavier and the OmniVision OV5693 camera module. Teams engaged in product engineering services will find this a practical reference for camera driver bring-up.
Jetson Camera Subsystem for NVIDIA Jetson Development
While there are significant architectural differences between the Tegra TX1, TX2, Xavier, and Nano platforms, the camera hardware sub-system remains largely consistent across all of them. The high-level design of the sub-system is captured below.

The major components and their functionalities are:
- CSI Unit: The MIPI-CSI-compatible input sub-system responsible for data acquisition from the camera, organizing the pixel format, and forwarding data to the VI unit. There are 6 Pixel Parser (PP) units, each capable of accepting input from a single 2-lane camera. In addition to this 6-camera model, the inputs can be reconfigured so that 3 Mono or Stereo 4-lane cameras connect to PPA, CSI1_PPA, and CSI2_PPA pairs — a key consideration in Embedded MIPI CSI camera integration projects.
- VI: The Video Input unit accepts data from the CSI unit over a 24-bit bus, with data positioning determined by the input format. This data can then be routed to one or both of the downstream processing units. The VI also includes a Host1x interface with two channels — one for controlling I2C access to cameras and another for VI register programming.
- Memory: Data written to system memory for further consumption by applications.
- Image Signal Processor ISP A: Pre-processes input data and converts or packs it into a different format. ISP A can also acquire data from memory.
- Image Signal Processor ISP B: Pre-processes input data and converts or packs it into a different format. ISP B can also acquire data from memory.
The VI Unit provides a hardware-software synchronization mechanism called VI Sync Points (syncpts), used to wait for a particular condition to be met and increment a counter, or to wait for the counter to reach a particular value. Multiple predefined indices are available, each corresponding to one functionality — such as frame start, line end, or completion of ISP processing. For example, software can choose to wait until one frame is received by the VI, indicated by the next counter value corresponding to the relevant index.
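The syncpt semantics described above — an event increments a counter, and software blocks until the counter reaches a target value — can be sketched as a small software model. This is purely illustrative (the real counters live in VI hardware and are serviced through Host1x, not user-space Python); the class and method names are our own:

```python
import threading

class SyncPoint:
    """Illustrative software model of a VI sync point: a monotonically
    increasing counter that waiters can block on until it reaches a value."""

    def __init__(self):
        self._value = 0
        self._cond = threading.Condition()

    def increment(self):
        # In hardware, this happens on an event such as frame-start,
        # line-end, or ISP completion for the index this syncpt serves.
        with self._cond:
            self._value += 1
            self._cond.notify_all()

    def wait_until(self, threshold, timeout=None):
        # Block until the counter reaches `threshold`; returns False on timeout.
        with self._cond:
            return self._cond.wait_for(lambda: self._value >= threshold, timeout)

# Example: software waits for "one more frame received by the VI" by waiting
# for the counter to advance past the value sampled before capture started.
frame_start = SyncPoint()
frame_start.increment()                     # VI signals frame-start
assert frame_start.wait_until(1, timeout=1.0)
```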
With these powerful components, the Tegra Camera sub-system offers seamless handling of data from multiple sources in different formats, making it a foundation for robust Embedded MIPI CSI camera integration.
Linux 4 Tegra Camera Driver for NVIDIA Jetson Development
With an understanding of the hardware sub-system, we now look into the software architecture of the Tegra camera interface. NVIDIA supports Linux OS through its Linux4Tegra (L4T) software stack. The camera drivers configure and read data from camera sensors over the CSI bus in the sensor's native format, and optionally convert them to a different format. This software stack is the backbone of any NVIDIA Jetson Development workflow involving cameras.
NVIDIA provides two types of camera access paths that can be chosen depending on the camera and application use case:
- Direct V4L2 Interface — Primarily for capturing RAW data from a camera. This is a minimal path where no processing is performed, and data is directly consumed by the user application. It is the simpler option for straightforward Embedded MIPI CSI camera integration where post-processing is handled externally.
- Camera Core Library Interface — In this model, camera data is consumed via NVIDIA libraries such as Camera Core and libArgus. Various data-processing operations can be performed efficiently on the input data by leveraging the GPU available in the Jetson core.
In either case, the application can be a GStreamer pipeline or a custom application. GStreamer is widely used in NVIDIA Jetson Development to build camera capture and processing pipelines. Teams working on MIPI DSI display driver development alongside camera drivers will find that both share common device tree patterns in L4T.
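As a rough illustration of how the two access paths differ in practice, the snippet below keeps a typical capture pipeline for each path as a gst-launch description. The element names (`v4l2src` for the direct V4L2 path, `nvarguscamerasrc` for the libArgus path) come from a standard JetPack/L4T install; the exact caps strings are assumptions and would need to match your sensor mode:

```python
# Typical capture pipelines for the two camera access paths on L4T.
# Caps strings below are illustrative for a 2592x1944@30fps Bayer sensor.
PIPELINES = {
    # Direct V4L2: RAW Bayer data, no ISP processing, consumed as-is.
    "v4l2_raw": (
        "v4l2src device=/dev/video0 ! "
        "video/x-bayer,width=2592,height=1944 ! fakesink"
    ),
    # Camera Core / libArgus: debayered and processed via the Tegra ISP.
    "argus": (
        "nvarguscamerasrc sensor-id=0 ! "
        "video/x-raw(memory:NVMM),width=2592,height=1944,framerate=30/1 ! "
        "nvoverlaysink"
    ),
}

def launch_command(path):
    # Build the gst-launch-1.0 command line for the chosen access path.
    return "gst-launch-1.0 " + PIPELINES[path]

print(launch_command("argus"))
```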
Camera Module Development: OV5693 on Jetson
For a practical deep-dive into camera module development, we will consider the 5MP (2592×1944, Bayer sensor) OmniVision CSI camera module OV5693 that ships by default with the Jetson TX1 and TX2 carrier boards. The high-level software architecture is captured below.

The OV5693 camera sits on I2C bus 6 (default I2C slave address 0x36) behind a TCA9548 I2C multiplexer. The address can be changed to 0x40 by adding a pull-up resistor on the SID pin.
The OV5693 driver is triggered using the I2C bus driver and registers itself with the Tegra V4L2 camera framework. This in turn exposes a /dev/videoX device that applications use to consume the data. This registration pattern is common across all V4L2-based Embedded MIPI CSI camera integration setups on Jetson.
To bring up the OV5693 driver, the following must be addressed:
- Appropriate node in the Device Tree
- V4L2-compatible sensor driver
Device Tree Changes for the Tegra Camera
The tegra194-camera-e3333-a00.dtsi file is located in /hardware/nvidia/platform/t19x/common/kernel-dts/t19x-common-modules/. The same device tree approach used for camera drivers is closely related to the patterns seen in MIPI DSI display driver development, since both camera (CSI) and display (DSI) sub-systems share the MIPI physical layer concepts.
Tegra-camera-platform Node
The tegra-camera-platform node consists of one or more modules that define the basic information of the camera or sensor connected to the Tegra SoC. The common top-level section holds consolidated information about all connected cameras; each module sub-section defines them individually. In this case, a single OV5693 camera is connected over two MIPI lanes — a standard configuration in single-camera Embedded MIPI CSI camera integration.
tegra-camera-platform {
	compatible = "nvidia,tegra-camera-platform";
	num_csi_lanes = <2>; //Number of lanes
	max_lane_speed = <1500000>; //Maximum lane speed
	min_bits_per_pixel = <12>; //Bits per pixel
	vi_peak_byte_per_pixel = <2>; //Bytes per pixel
	vi_bw_margin_pct = <25>; //Don't care
	max_pixel_rate = <160000>; //Don't care
	isp_peak_byte_per_pixel = <5>; //Don't care
	isp_bw_margin_pct = <25>; //Don't care
	modules {
		module0 { //OV5693 basic details
			badge = "ov5693_right_iicov5693";
			position = "right";
			orientation = "1";
			drivernode0 {
				pcl_id = "v4l2_sensor";
				devname = "ov5693 06-0036";
				proc-device-tree = "/proc/device-tree/i2c@31c0000/tca9548@77/i2c@6/ov5693_a@36"; //Device tree node path
			};
		};
	};
};
Device Tree Node
In the device tree node, all camera properties (output resolution, FPS, MIPI clock, etc.) must be added for proper operation of the device. For NVIDIA Jetson Development, getting these timing parameters correct is critical for reliable MIPI-CSI operation.
i2c@31c0000 { //I2C-6 base address
	tca9548@77 { //I2C expander IC
		i2c@6 {
			ov5693_a@36 {
				compatible = "nvidia,ov5693";
				reg = <0x36>; //I2C slave address
				devnode = "video0"; //Device name
				/* Physical dimensions of sensor */
				physical_w = "3.674"; //Physical width of the sensor
				physical_h = "2.738"; //Physical height of the sensor
				/* Enable EEPROM support */
				has-eeprom = "1";
				/* Define any required hw resources needed by driver */
				/* i.e. clocks, io pins, power sources */
				avdd-reg = "vana"; //Power regulator
				iovdd-reg = "vif"; //Power regulator
				mode0 { // OV5693_MODE_2592X1944
					mclk_khz = "24000"; //MIPI driving clock
					num_lanes = "2"; //Number of lanes
					tegra_sinterface = "serial_a"; //Serial interface
					phy_mode = "DPHY"; //Physical connection mode
					discontinuous_clk = "yes";
					dpcm_enable = "false"; //Don't care
					cil_settletime = "0"; //Don't care
					active_w = "2592"; //Active width
					active_h = "1944"; //Active height
					mode_type = "bayer"; //Sensor type
					pixel_phase = "bggr"; //Output format
					csi_pixel_bit_depth = "10"; //Bits per pixel
					readout_orientation = "0"; //Don't care
					line_length = "2688"; //Total width
					inherent_gain = "1"; //Don't care
					mclk_multiplier = "6.67"; //pix_clk_hz / mclk
					pix_clk_hz = "160000000"; //Pixel clock = HTotal x VTotal x FPS
					gain_factor = "10"; //Don't care
					min_gain_val = "10"; /* 1DB */ //Don't care
					max_gain_val = "160"; /* 16DB */ //Don't care
					step_gain_val = "1"; //Don't care
					default_gain = "10"; //Don't care
					min_hdr_ratio = "1"; //Don't care
					max_hdr_ratio = "1"; //Don't care
					framerate_factor = "1000000"; //Don't care
					min_framerate = "1816577"; //Don't care
					max_framerate = "30000000";
					step_framerate = "1";
					default_framerate = "30000000";
					exposure_factor = "1000000"; //Don't care
					min_exp_time = "34"; //Don't care
					max_exp_time = "550385"; //Don't care
					step_exp_time = "1"; //Don't care
					default_exp_time = "33334"; //Don't care
					embedded_metadata_height = "0"; //Don't care
				};
			};
		};
	};
};
The pixel clock is calculated as follows:
pix_clk_hz = HTotal × VTotal × FPS
For the OV5693 running at 2592×1944@30fps, the total width and total height (including blanking) are 2688 and 1984, respectively:
pix_clk_hz = 2688 × 1984 × 30 = 159,989,760 ≈ 160,000,000
And the MCLK multiplier, with both clocks expressed in Hz, is:
mclk_multiplier = pix_clk_hz / mclk_hz = 160,000,000 / 24,000,000 ≈ 6.67
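The arithmetic above can be checked quickly in a few lines of Python, using the mode0 values from the device tree (total frame geometry 2688×1984 at 30 fps, 24 MHz MCLK):

```python
# Recompute the mode0 timing parameters from the total frame geometry.
h_total, v_total, fps = 2688, 1984, 30
mclk_hz = 24000 * 1000            # mclk_khz = "24000" converted to Hz

pix_clk_hz = h_total * v_total * fps
print(pix_clk_hz)                 # 159989760, rounded up to 160000000 in the DT

# With the rounded 160 MHz pixel clock, as used in the device tree:
mclk_multiplier = round(160_000_000 / mclk_hz, 2)
print(mclk_multiplier)            # 6.67
```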
Camera Pipeline Tuning: DTS Binding
Proper Camera Pipeline Tuning on Jetson requires correct binding between the internal DTS ports. As noted earlier, the camera data flows through the following stages:
| Sensor Output | CSI Input | CSI Output | VI Input |
| --- | --- | --- | --- |
| ov5693_ov5693_out0 | ov5693_csi_in0 | ov5693_csi_out0 | ov5693_vi_in0 |
The binding between internal ports is done using the settings below. Correct port binding is essential for both CSI camera integration and, analogously, for MIPI DSI display driver development where display ports must be similarly chained in the device tree.
ports {
	#address-cells = <1>;
	#size-cells = <0>;
	port@0 {
		reg = <0>;
		ov5693_ov5693_out0: endpoint {
			port-index = <0>;
			bus-width = <2>;
			remote-endpoint = <&ov5693_csi_in0>;
		};
	};
};
nvcsi@15a00000 {
	num-channels = <1>;
	#address-cells = <1>;
	#size-cells = <0>;
	status = "okay";
	channel@0 {
		reg = <0>;
		ports {
			#address-cells = <1>;
			#size-cells = <0>;
			port@0 {
				reg = <0>;
				ov5693_csi_in0: endpoint@0 {
					port-index = <0>;
					bus-width = <2>;
					remote-endpoint = <&ov5693_ov5693_out0>;
				};
			};
			port@1 {
				reg = <1>;
				ov5693_csi_out0: endpoint@1 {
					remote-endpoint = <&ov5693_vi_in0>;
				};
			};
		};
	};
};
host1x {
	vi@15c10000 {
		num-channels = <1>;
		ports {
			#address-cells = <1>;
			#size-cells = <0>;
			port@0 {
				reg = <0>;
				ov5693_vi_in0: endpoint {
					port-index = <0>;
					bus-width = <2>;
					remote-endpoint = <&ov5693_csi_out0>;
				};
			};
		};
	};
};
The driver receives data from the VI output via the Host1x DMA engine module.
Overlay
L4T employs a mechanism of DTB overlays to enable or disable drivers. The OV5693 driver can be enabled in the DTS by setting its status field to okay.
fragment-ov5693@0 {
	ids = "2180-*";
	override@0 {
		target = <&ov5693_cam0>;
		_overlay_ {
			status = "okay";
		};
	};
};
During boot, if the correct camera module is detected, the overlay is added to the device tree node and further driver and device registration is completed by the camera driver (ov5693.c).
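Once registration completes, a quick sanity check is to enumerate the /dev/videoX nodes and read back the driver name each one reports through sysfs. The short helper below is our own illustrative sketch (it assumes a standard Linux sysfs layout; on a Jetson with the OV5693 registered, one entry would report the sensor's name):

```python
import glob
import os

def probe_video_nodes():
    """List /dev/video* nodes with the driver-reported name from sysfs."""
    nodes = []
    for dev in sorted(glob.glob("/dev/video*")):
        name_path = "/sys/class/video4linux/{}/name".format(os.path.basename(dev))
        try:
            with open(name_path) as f:
                name = f.read().strip()
        except OSError:
            name = "unknown"
        nodes.append((dev, name))
    return nodes

# On a system with no cameras this simply prints nothing.
for dev, name in probe_video_nodes():
    print(dev, "->", name)
```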
Embien's Expertise in Embedded MIPI CSI Camera Integration
Embien is a leading product engineering services provider with specialized expertise in NVIDIA Jetson Development across Tegra TX1, TX2, Xavier, and Nano platforms. Our team has extensive experience in Embedded MIPI CSI camera integration — interfacing diverse camera modules over MIPI-CSI and other protocols, enabling them with the libArgus framework, and developing custom GStreamer plugins and pipelines.
Beyond camera drivers, Embien's capabilities extend to camera module development, Camera Pipeline Tuning for machine vision and AI inference workflows, and MIPI DSI display driver development for display subsystems on the same Jetson platforms. Embien's Global Partner Ecosystem fosters collaboration and co-innovation, combining diverse expertise to deliver impactful solutions worldwide.
Our customers include Fortune 500 companies in the fields of defence, avionics, industrial automation, medical, automotive, and semiconductors. Get in touch to accelerate your next camera-enabled edge computing product.
