JetCam

JetCam is an easy-to-use Python camera interface for NVIDIA Jetson.

  • Works with various USB and CSI cameras using Jetson's Accelerated GStreamer Plugins
  • Easily read images as numpy arrays with `image = camera.read()`
  • Set the camera to `running = True` to attach callbacks to new frames

JetCam makes it easy to prototype AI projects in Python, especially within the Jupyter Lab programming environment installed in JetCard.

If you find an issue, please let us know!

Setup

```bash
git clone https://github.com/NVIDIA-AI-IOT/jetcam
cd jetcam
sudo python3 setup.py install
```

JetCam is tested against a system configured with the JetCard setup. Different system configurations may require additional steps.
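If the install completed without errors, a quick import check confirms the package is visible to Python (a minimal sketch; assumes you run it with the same `python3` used for the install):

```python
# verify the installation by importing the package and printing where it landed
import jetcam
print(jetcam.__file__)
```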

Usage

Below we show some usage examples. You can find more in the notebooks.

Create CSI camera

Call `CSICamera` to use a compatible CSI camera. `capture_width`, `capture_height`, and `capture_fps` control the capture shape and the rate at which images are acquired. `width` and `height` control the final output shape of the image as returned by the `read` function.

```python
from jetcam.csi_camera import CSICamera

camera = CSICamera(width=224, height=224, capture_width=1080, capture_height=720, capture_fps=30)
```
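Whatever the capture resolution, images returned by `read` are resized to `width` x `height`. A quick sanity check (the expected output in the comment follows from the shape documented in the Read section below):

```python
# frames come back resized to the requested output shape
image = camera.read()
print(image.shape)  # expect (224, 224, 3): height, width, BGR channels
```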

Create USB camera

Call `USBCamera` to use a compatible USB camera. The same parameters as `CSICamera` apply, along with a `capture_device` parameter that indicates the device index. You can check the device index by calling `ls /dev/video*`.

```python
from jetcam.usb_camera import USBCamera

camera = USBCamera(capture_device=1)
```
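If you prefer to stay in Python, the same device listing can be done with the standard library (a sketch equivalent to `ls /dev/video*`; assumes at least one camera is attached):

```python
import glob

# each /dev/videoN node maps to capture_device=N
print(glob.glob('/dev/video*'))
```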

Read

Call `read()` to read the latest image as a `numpy.ndarray` of data type `np.uint8` and shape `(224, 224, 3)`. The color format is `BGR8`.

```python
image = camera.read()
```
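Because the color format is `BGR8`, you may need to reorder channels before passing images to libraries that expect RGB. A minimal sketch, assuming OpenCV is installed:

```python
import cv2

# JetCam returns BGR8; convert to RGB for libraries that expect RGB order
rgb_image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
```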

The `read` function also updates the camera's internal `value` attribute.

```python
camera.read()
image = camera.value
```

Callback

You can also set the camera to `running = True`, which will spawn a thread that acquires images from the camera. These will update the camera's `value` attribute automatically. You can attach a callback to the value using the traitlets library; the callback will be called with both the new and the old camera value.

```python
camera.running = True

def callback(change):
    new_image = change['new']  # the latest frame; change['old'] holds the previous one
    # do some processing...

camera.observe(callback, names='value')
```
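When you are done, you can detach the callback and stop acquisition. `unobserve` is the standard traitlets counterpart to `observe`; setting `running = False` is assumed here to stop the background thread:

```python
# stop receiving callbacks, then stop the acquisition thread
camera.unobserve(callback, names='value')
camera.running = False
```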

Cameras

CSI Cameras

These cameras work with the `CSICamera` class. Try them out by following the example notebook.

| Model | Infrared | FOV (°) | Resolution | Cost |
|:------|:--------:|:-------:|:----------:|:----:|
| Raspberry Pi Camera V2 | | 62.2 | 3280x2464 | $25 |
| Raspberry Pi Camera V2 (NOIR) | x | 62.2 | 3280x2464 | $31 |
| Arducam IMX219 CS lens mount | | | 3280x2464 | $65 |
| Arducam IMX219 M12 lens mount | | | 3280x2464 | $60 |
| LI-IMX219-MIPI-FF-NANO | | | 3280x2464 | $29 |
| WaveShare IMX219-77 | | 77 | 3280x2464 | $19 |
| WaveShare IMX219-77IR | x | 77 | 3280x2464 | $21 |
| WaveShare IMX219-120 | | 120 | 3280x2464 | $20 |
| WaveShare IMX219-160 | | 160 | 3280x2464 | $23 |
| WaveShare IMX219-160IR | x | 160 | 3280x2464 | $25 |
| WaveShare IMX219-200 | | 200 | 3280x2464 | $27 |

USB Cameras

These cameras work with the `USBCamera` class. Try them out by following the example notebook.

| Model | Infrared | FOV (°) | Resolution | Cost |
|:------|:--------:|:-------:|:----------:|:----:|
| Logitech C270 | | 60 | 1280x720 | $18 |

See also

  • JetBot - An educational AI robot based on NVIDIA Jetson Nano

  • JetRacer - An educational AI racecar using NVIDIA Jetson Nano

  • JetCard - An SD card image for web programming AI projects with NVIDIA Jetson Nano

  • torch2trt - An easy-to-use PyTorch to TensorRT converter
