RpiTracker application template
v3.2.0
Table of contents
- Overview
- Versions
- Source code files
- Config file
- Run application
- Build application
- Raspberry PI configuration
- Source code
Overview
RpiTracker application implements a video processing pipeline (video capture > tracking > RTSP server) for Raspberry PI4 and PI5. The application contains the following libraries:
- VSourceV4L2 - video capture from V4L2 compatible devices.
- VSourceLibCamera - video capture from libcamera API compatible devices.
- VSourceOpenCv - video capture based on OpenCV; supports all video sources implemented in OpenCV.
- VCodecV4L2 - H264 and JPEG video encoder for Raspberry PI4 based on the V4L2 API.
- VCodecLibav - H264, H265 and JPEG software encoders for RPI5, which has no hardware codec.
- CvTracker - tracker library.
- RtspServer - RTSP server library.

The application shows how to use a combination of the libraries listed above. It is built for Raspberry PI4 and PI5 (both supported OS versions: Debian Bullseye 11 x64 and Debian Bookworm 12 x64). It supports different video sources and provides real-time video processing.
The RpiTracker application supports two-channel data communication alongside its real-time video processing pipeline. One channel exports telemetry data (tracking results) and the other receives commands for the tracker. Both channels use the UdpSocket C++ library.
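The telemetry packet layout can be seen in the main.cpp listing in the Source code section: 26 bytes holding a packet ID, the tracker mode and six 4-byte integers. A receiver can decode such a packet as sketched below (the struct and function names are illustrative, not part of the application, and the byte order is assumed to match the sender):

```cpp
#include <cstdint>
#include <cstring>
#include <optional>

// Telemetry packet layout used in main.cpp (26 bytes):
// [0] packet id, [1] tracker mode, then six 4-byte ints:
// frame width, frame height, rect X, rect Y, rect width, rect height.
struct TrackerTelemetry // Illustrative name, not part of the application.
{
    uint8_t packetId{0};
    uint8_t mode{0};
    int frameWidth{0}, frameHeight{0};
    int rectX{0}, rectY{0};
    int rectWidth{0}, rectHeight{0};
};

std::optional<TrackerTelemetry> decodeTelemetry(const uint8_t* data, int size)
{
    if (size != 26)
        return std::nullopt; // Unexpected packet size.
    TrackerTelemetry t;
    t.packetId = data[0];
    t.mode = data[1];
    std::memcpy(&t.frameWidth,  &data[2],  sizeof(int));
    std::memcpy(&t.frameHeight, &data[6],  sizeof(int));
    std::memcpy(&t.rectX,       &data[10], sizeof(int));
    std::memcpy(&t.rectY,       &data[14], sizeof(int));
    std::memcpy(&t.rectWidth,   &data[18], sizeof(int));
    std::memcpy(&t.rectHeight,  &data[22], sizeof(int));
    return t;
}
```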
Structure of video processing pipeline:
![]()
After starting, the application reads a JSON config file which includes video capture parameters, tracker parameters, communication ports and RTSP server parameters. If there is no config file, the application will create a new one with default parameters. When the config file is read, the application initializes the video source. Depending on configuration params, the application creates a VSourceV4L2 class object (to capture video from V4L2 compatible cameras) or a VSourceLibCamera class object (to capture video from libcamera API compatible cameras) or a VSourceOpenCv object (to capture video from video files or from RTSP streams). Then the application creates and initializes a CvTracker class object using parameters from the JSON config file. Then the application creates a board-compatible codec and initializes an RtspServer class object (RTSP server) using parameters from the JSON config file and the previously created codec. When all modules are initialized, the application executes the video processing pipeline: capture video frame from video source - tracking - send video frame to RTSP server. The user can connect to the RTSP server with any client (ffmpeg, gstreamer, VLC, Milestone etc.). Additionally, the application creates a “Log” folder to write log information.
The application chooses the appropriate VCodec implementation automatically, based on the board it is compiled on. On a Raspberry Pi 4 it picks VCodecV4L2, because this board provides hardware codecs. On a Raspberry Pi 5 it uses VCodecLibav, since this board has no hardware codecs and needs a software-based solution.
In order to prepare tracker commands from a remote connection, refer to the VTracker interface library. The interface includes static methods such as encodeCommand(…) and encodeSetParamCommand(…) for constructing a buffer that holds the required command.
The application reads the input buffer for incoming commands through UdpSocket in a separate thread. After receiving a command, it decodes it and executes the corresponding action.
Versions
Table 1 - Application versions.
| Version | Release date | What’s new |
|---|---|---|
| 1.0.0 | 16.09.2023 | - First version. |
| 2.0.0 | 03.02.2024 | - Added support for different codecs. |
| 3.0.0 | 06.05.2024 | - RPI5 supported. - Code reviewed. |
| 3.0.1 | 24.05.2024 | - Documentation updated. |
| 3.0.2 | 06.08.2024 | - Submodules updated. |
| 3.0.3 | 18.09.2024 | - VCodecLibav submodule updated. |
| 3.0.4 | 04.12.2024 | - Submodules updated. |
| 3.0.5 | 18.03.2025 | - VSourceV4L2 submodule updated. |
| 3.0.6 | 03.04.2025 | - Multiple submodules updated. |
| 3.0.7 | 22.06.2025 | - UdpSocket submodule updated. |
| 3.0.8 | 27.07.2025 | - RtspServer submodule updated. |
| 3.1.0 | 16.08.2025 | - CMake structure changed. |
| 3.2.0 | 29.03.2026 | - Fixed compilation errors. - Updated documentation. |
Source code files
The application is supplied as source code only. The user receives a set of files in the form of a CMake project (repository). The repository structure is shown below:
CMakeLists.txt -------------- Main CMake file.
src ------------------------- Folder with application source code.
CMakeLists.txt ---------- CMake file.
RpiTrackerVersion.h ----- Header file with application version.
RpiTrackerVersion.h.in -- File for CMake to generate version header.
main.cpp ---------------- Application source code file.
Config file
The RpiTracker application reads the config file RpiTracker.json located in the same folder as the application executable. Config file content:
{
"Params": {
"communication": {
"controlUdpPort": 50020,
"telemetryDstUdpPort": 50021
},
"videoSource": {
"source": "0;1280;720;30;YUYV"
},
"videoStream": {
"bitrateKbps": 3000,
"bitrateMode": 0,
"codec": "H264",
"custom1": 0.0,
"custom2": 0.0,
"custom3": 0.0,
"enable": true,
"fitMode": 0,
"fps": 30.0,
"gop": 30,
"h264Profile": 0,
"height": 720,
"hlsCert": "no",
"hlsEnable": true,
"hlsEncryption": "no",
"hlsKey": "no",
"hlsPort": 8080,
"ip": "0.0.0.0",
"jpegQuality": 80,
"logLevel": 0,
"maxBitrateKbps": 5000,
"metadataEnable": false,
"metadataPort": 9000,
"metadataSuffix": "metadata",
"minBitrateKbps": 1000,
"overlayEnable": true,
"password": "no",
"rtmpCert": "no",
"rtmpEnable": true,
"rtmpEncryption": "no",
"rtmpKey": "no",
"rtmpPort": 1935,
"rtmpsPort": 1936,
"rtpEnable": true,
"rtpPort": 5004,
"rtspCert": "no",
"rtspEnable": true,
"rtspEncryption": "no",
"rtspKey": "no",
"rtspMulticastIp": "224.1.0.1/16",
"rtspMulticastPort": 18000,
"rtspPort": 8554,
"rtspsPort": 8555,
"srtEnable": true,
"srtPort": 6000,
"suffix": "live",
"type": 0,
"user": "no",
"webRtcCert": "no",
"webRtcEnable": true,
"webRtcEncryption": "no",
"webRtcKey": "no",
"webRtcPort": 7000,
"width": 1280
},
"videoTracker": {
"lostModeOption": 0,
"numChannels": 3,
"parallelAlgorithm": true,
"rectAutoPosition": true,
"rectAutoSize": true,
"rectHeight": 72,
"rectWidth": 72,
"windowHeight": 256,
"windowWidth": 256
}
}
}
Table 2 - Config file parameters description.
| Parameter | Type | Description |
|---|---|---|
| Video source parameters: | ||
| source | string | Video source initialization string. Based on initialization string the application creates VSourceV4L2 class object (to capture video from V4L2 compatible cameras) or VSourceLibCamera class object (to capture video from libcamera API compatible cameras) or VSourceOpenCV (to capture video from video files or from RTSP streams). So, the user can define the backend to capture video: Initialization string for V4l2 or libcamera compatible video devices (cameras). Valid formats: [full device name];[width];[height];[fps];[fourcc] or [full device name];[width];[height];[fps] or [full device name];[width];[height] or [full device name] Initialization string parameters: [full device name] - Full name of video device file (for example “/dev/video0”) for V4L2 or device number for libcamera (for example “0”). [width] - Video frame width. Video frame height must be set as well. The library will try to set this value in video capture hardware parameters. If set to 0 the library will detect frame width automatically according to existing video device parameters. [height] - Video frame height. Video frame width must be set as well. The library will try to set this value in video capture hardware parameters. If set to 0 the library will detect frame height automatically according to existing video device parameters. [fps] - FPS. The library will try to set this value in video capture hardware parameters. If set to 0 the library will detect FPS automatically according to existing video device parameters. [fourcc] - Pixel format. Valid values: BGR24, RGB24, GRAY, YUV24, YUYV, UYVY, NV12, NV21, YV12, YU12. If fourcc is not set, the library will choose the appropriate format according to existing video device parameters. Initialization string for OpenCV video sources can be video file name (for example “test.mp4”) or RTSP stream (for example “rtsp://192.168.0.1:7100/live”). |
| Tracker parameters: | ||
| rectWidth | int | Tracking rectangle width, pixels. Set by user or can be changed by tracking algorithm if rectAutoSize == 1. |
| rectHeight | int | Tracking rectangle height, pixels. Set by user or can be changed by tracking algorithm if rectAutoSize == 1. |
| windowWidth | int | Width of search window, pixels. Set by user. |
| windowHeight | int | Height of search window, pixels. Set by user. |
| lostModeOption | int | Option for LOST mode. Parameter that defines the behavior of the tracking algorithm in LOST mode. Default is 0. Possible values: 0. In LOST mode, the coordinates of the center of the tracking rectangle are not updated and remain the same as before entering LOST mode. 1. The coordinates of the center of the tracking rectangle are updated based on the components of the object’s speed calculated before going into LOST mode. If the tracking rectangle “touches” the edge of the video frame, the coordinate updating for this component (horizontal or vertical) will stop. 2. The coordinates of the center of the tracking rectangle are updated based on the components of the object’s speed calculated before going into LOST mode. The tracking is reset if the center of the tracking rectangle touches any of the edges of the video frame. |
| rectAutoSize | bool | Use tracking rectangle auto size flag: false - no use, true - use. Set by user. |
| rectAutoPosition | bool | Use tracking rectangle auto position: false - no use, true - use. Set by user. |
| numChannels | int | Number of channels for processing. E.g., number of color channels. Set by user. |
| parallelAlgorithm | bool | Use multiple threads for calculations. false - one thread, true - multiple. Set by user. |
| Streamer parameters: | ||
| enable | bool | Streamer mode: false - Off, true - On. |
| width | int | Video stream width from 8 to 8192. If the resolution of the source video frame is different from that specified in the parameters for the streamer, the source video will be scaled. |
| height | int | Video stream height from 8 to 8192. If the resolution of the source video frame is different from that specified in the parameters for the streamer, the source video will be scaled. |
| ip | string | Server IP. It is recommended to use “0.0.0.0” independent from board IP. |
| rtspPort | int | RTSP server port. RTSP initialization string to receive video will be: “rtsp://IP:rtspPort/suffix”. |
| rtspsPort | int | RTSPS (secure RTSP) server port. |
| rtpPort | int | RTP server port. |
| webRtcPort | int | WebRTC server port. |
| hlsPort | int | HLS server port. |
| srtPort | int | SRT server port. |
| rtmpPort | int | RTMP server port. |
| rtmpsPort | int | RTMPS (secure RTMP) server port. |
| metadataPort | int | Metadata server port. |
| rtspEnable | bool | Enable/disable RTSP stream. |
| rtpEnable | bool | Enable/disable RTP stream. |
| webRtcEnable | bool | Enable/disable WebRTC stream. |
| hlsEnable | bool | Enable/disable HLS stream. |
| srtEnable | bool | Enable/disable SRT stream. |
| rtmpEnable | bool | Enable/disable RTMP stream. |
| metadataEnable | bool | Enable/disable metadata stream. |
| user | string | Server user: “no” - no user. |
| password | string | Server password: “no” - no password. |
| suffix | string | Stream suffix (stream name). For example “live” or “unicast”. RTSP initialization string to receive video will be: “rtsp://IP:rtspPort/suffix”. |
| metadataSuffix | string | Metadata stream suffix. |
| minBitrateKbps | int | Minimum bitrate, kbps. |
| maxBitrateKbps | int | Maximum bitrate, kbps. |
| bitrateKbps | int | Dedicated bitrate, kbps. Will be set to VCodecV4L2 codec or VCodecLibav codec. |
| bitrateMode | int | Bitrate mode. |
| fps | float | Video stream FPS. Regardless of the FPS of the camera, the video stream will be in accordance with the specified FPS. |
| gop | int | GOP size for H264 codecs. Will be set to VCodecV4L2 codec or VCodecLibav codec. |
| h264Profile | int | H264 profile: 0 - baseline, 1 - main, 2 - high. Will be set to VCodecV4L2 codec or VCodecLibav codec. |
| jpegQuality | int | JPEG quality from 1 to 100% for JPEG codec. Will be set to VCodecV4L2 codec. |
| codec | string | Encoding format. PI4 supports: JPEG and H264. PI5 supports: H264 and HEVC. |
| fitMode | int | Scaling mode: 0 - fit, 1 - cut. Defines how to scale video to stream resolution. |
| overlayEnable | bool | Overlay enable/disable flag. true - enable, false - disable. |
| custom1 | float | Custom parameter 1. |
| custom2 | float | Custom parameter 2. |
| custom3 | float | Custom parameter 3. |
| rtspMulticastIp | string | RTSP server multicast IP. In order to enable multicast stream, type should be 1. |
| rtspMulticastPort | int | RTSP server multicast port. |
| type | int | RTSP server stream type. Value: 0 - unicast, 1 - multicast. |
| rtspEncryption | string | RTSP encryption mode: “no” - disabled. |
| webRtcEncryption | string | WebRTC encryption mode: “no” - disabled. |
| rtmpEncryption | string | RTMP encryption mode: “no” - disabled. |
| hlsEncryption | string | HLS encryption mode: “no” - disabled. |
| rtspKey | string | RTSP TLS key file path: “no” - not set. |
| rtspCert | string | RTSP TLS certificate file path: “no” - not set. |
| webRtcKey | string | WebRTC TLS key file path: “no” - not set. |
| webRtcCert | string | WebRTC TLS certificate file path: “no” - not set. |
| hlsKey | string | HLS TLS key file path: “no” - not set. |
| hlsCert | string | HLS TLS certificate file path: “no” - not set. |
| rtmpKey | string | RTMP TLS key file path: “no” - not set. |
| rtmpCert | string | RTMP TLS certificate file path: “no” - not set. |
| logLevel | int | Log level: 0 - disabled. |
| Communication: | ||
| controlUdpPort | int | UDP port number to get commands for tracker. |
| telemetryDstUdpPort | int | UDP port number to send telemetry data. |
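The video source initialization string described in Table 2 can be parsed by splitting on ";", as sketched below (the struct and function names are illustrative; the real video source libraries use their own parsers):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Parsed form of "[full device name];[width];[height];[fps];[fourcc]".
// Illustrative struct, not part of the application.
struct SourceInit
{
    std::string device;   // "/dev/video0", "0", "test.mp4", "rtsp://...".
    int width{0};         // 0 - detect automatically.
    int height{0};        // 0 - detect automatically.
    int fps{0};           // 0 - detect automatically.
    std::string fourcc;   // Empty - choose automatically.
};

SourceInit parseSourceInit(const std::string& init)
{
    // Split the initialization string on ';'.
    std::vector<std::string> parts;
    std::stringstream ss(init);
    std::string item;
    while (std::getline(ss, item, ';'))
        parts.push_back(item);

    // Width and height must be set together; fps and fourcc are optional.
    SourceInit s;
    if (parts.size() > 0) s.device = parts[0];
    if (parts.size() > 2) { s.width = std::stoi(parts[1]); s.height = std::stoi(parts[2]); }
    if (parts.size() > 3) s.fps = std::stoi(parts[3]);
    if (parts.size() > 4) s.fourcc = parts[4];
    return s;
}
```

A plain file name or RTSP URL contains no ";" and therefore ends up entirely in the device field, which matches how OpenCV sources are specified.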
Run application
Copy the application (RpiTracker executable and RpiTracker.json) to any folder and run:
./RpiTracker -vv
When the application is started for the first time, it creates a configuration file named RpiTracker.json if the file does not exist (refer to the Config file section). If the application is run as a superuser using sudo, the file will be owned by the root user. Therefore, to modify the configuration file, superuser privileges will be necessary. You can run the application manually or you can add the application to auto start as a systemd service. To add the application as a systemd service, create a service file:
sudo nano /etc/systemd/system/rpitracker.service
and add content:
[Unit]
Description=RpiTracker
Wants=network.target
After=syslog.target network-online.target
[Service]
Type=simple
ExecStart=/home/pi/App/RpiTracker
Restart=on-failure
RestartSec=10
KillMode=control-group
KillSignal=SIGTERM
[Install]
WantedBy=multi-user.target
Save (Ctrl + S) and close (Ctrl + X).
Reload systemd services:
sudo systemctl daemon-reload
Commands to control the service:
Start service:
sudo systemctl start rpitracker
Enable for auto start:
sudo systemctl enable rpitracker
Stop service:
sudo systemctl stop rpitracker
Check status:
sudo systemctl status rpitracker
Build application
Before building, the user should configure the Raspberry Pi (Raspberry PI configuration). Typical build commands:
cd RpiTracker
mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make
Raspberry PI configuration
OS install
-
Download Raspberry PI imager.
-
Choose Operating System: Raspberry Pi OS (other) -> Raspberry Pi OS Lite (64-bit) Debian Bullseye 11 or Debian Bookworm 12.
-
Choose the storage (MicroSD) card. If possible, use a 128 GB MicroSD.
-
Set additional options: do not change the default hostname (“raspberrypi.local”), enable SSH (use password authentication), set username “pi” and password “pi”, and configure wireless LAN according to your settings (you will need Wi-Fi for software installation). Also set the appropriate time zone and wireless LAN country.
-
Save changes and push the “Write” button. After that, push “Yes” to rewrite data on MicroSD. At the end, remove the MicroSD.
-
Insert the MicroSD into the Raspberry Pi and power it up.
LAN configuration
-
Configure LAN IP address on your PC. For Windows 11, go to Settings -> Network & Internet -> Advanced Network Settings -> More network adapter options. Right-click on the LAN connection used to connect to Raspberry Pi and choose “Properties”. Double-click on “Internet Protocol Version 4 (TCP/IPv4)”. Set static IP address 192.168.0.1 and mask 255.255.255.0.
-
Connect the Raspberry Pi via LAN cable. After power up, you don’t know the IP address of the Raspberry Pi board, but you can connect to it via SSH using the “raspberrypi” name. In Windows 11, open Windows Terminal or PowerShell terminal and type the command ssh pi@raspberrypi. After connection, type yes to establish authenticity. WARNING: If you work with Windows, we recommend deleting information about previous connections. Go to folder C:/Users/[your user name]/.ssh. Open file known_hosts if it exists in a text editor. Delete all lines containing ”raspberrypi”.
-
You will have to set a static IP address on the Raspberry Pi, but not yet. After setting the static IP, you will lose the Wi-Fi connection, which you still need for software installation.
Install dependencies
-
Connect to the Raspberry Pi via SSH: ssh pi@raspberrypi. WARNING: as noted above, if you work with Windows, delete stale ”raspberrypi” entries from the known_hosts file first.
-
Install libraries:
sudo apt-get update
sudo apt install cmake build-essential libopencv-dev
-
If the libcamera API is not installed, you have to install it. Please note that the RpiTracker application is compatible with a specific version of libcamera. You can build and install it from source code with the following commands:
# Remove everything related to libcamera
sudo apt-get remove --auto-remove libcamera* libcamera-apps*
# Check if the libcamera directory exists and remove it
if [ -d "libcamera" ]; then
    echo "libcamera directory already exists. Removing..."
    sudo rm -rf libcamera
fi
# Get libcamera
git clone https://github.com/raspberrypi/libcamera.git
cd libcamera
# Checkout v0.1.0 version
git checkout v0.1.0+rpt20231122
# Install required packages
sudo apt-get update && sudo apt-get install -y cmake meson ninja-build python3 python3-pip python3-setuptools python3-wheel git pkg-config libgnutls28-dev openssl libjpeg-dev libyuv-dev libv4l-dev libudev-dev libexpat1-dev libssl-dev libdw-dev libunwind-dev doxygen graphviz && pip3 install ply PyYAML
# If you get an error for ply and PyYAML, run the following command.
# sudo apt-get install -y python3-ply python3-yaml
# Build libcamera
meson build
ninja -C build
# Install libcamera
sudo ninja -C build install
# Update linker cache
sudo ldconfig
# Remove source code
cd ../
sudo rm -rf libcamera
echo "libcamera installation complete."
-
Install the required libraries for the libav codec. If you are using RPI4, this step can be skipped.
-
Run the command:
sudo apt-get install -y ffmpeg libavcodec-dev libavutil-dev libavformat-dev libavdevice-dev libavfilter-dev libcurl4
Setup high performance mode (overclock) if necessary
-
Open config file:
sudo nano /boot/config.txt
-
Add the following lines at the end of the file:
- For RPI4:
force_turbo=1
arm_freq=2000
arm_freq_min=2000
over_voltage=6
- For RPI5:
force_turbo=1
over_voltage_delta=500000
arm_freq=3000
gpu_freq=800
-
Save changes (Ctrl + S) and close (Ctrl + X). Reboot the Raspberry Pi: sudo reboot.
Please ensure that the Raspberry Pi has a proper cooling system and power source. Overclocking can cause overheating and system instability in case of insufficient cooling and/or power supply.
V4L2 camera stack configuration
-
To enable legacy (V4L2) camera support on the Raspberry Pi, you have to change system settings. Run configuration manager:
sudo raspi-config
-
Select “Interface Options” -> “Legacy camera”, then select “Yes”.
-
Go to the “Finish” menu and close the configuration manager.
-
Reboot the Raspberry Pi: sudo reboot.
Static IP configuration on the Raspberry Pi
-
Connect to the Raspberry Pi via SSH (ssh pi@raspberrypi) after the reboot. WARNING: as noted above, if you work with Windows, delete stale ”raspberrypi” entries from the known_hosts file first.
-
Open config file:
sudo nano /etc/dhcpcd.conf
-
Scroll down in the file, uncomment the following lines and set IP 192.168.0.2:
interface eth0
static ip_address=192.168.0.2/24
static ip6_address=fd51:42f8:caae:d92e::ff/64
static routers=192.168.0.2
static domain_name_servers=192.168.0.2 8.8.8.8 fd51:42f8:caae:d92e::1
-
Save changes (Ctrl + S) and exit (Ctrl + X). Then reboot: sudo reboot.
-
After configuring the static IP, you can connect to the Raspberry Pi by IP.
ssh pi@192.168.0.2
Source code
Below is the source code of the main.cpp file:
#include <ifaddrs.h>
#include "RpiTrackerVersion.h"
#include "VSourceV4L2.h"
#include "CvTracker.h"
#include "VSourceOpenCv.h"
#include "VSourceLibCamera.h"
#include "FormatConverterOpenCv.h"
#include "UdpSocket.h"
#include "RtspServer.h"
#ifdef IS_RPI5
#include <VCodecLibav.h>
#else
#include <VCodecV4L2.h>
#endif
/// Max command size, bytes.
constexpr int MAX_COMMAND_SIZE = 128;
/// Max telemetry size, bytes.
constexpr int MAX_TELEMETRY_SIZE = 128;
/// Log folder.
#define LOG_FOLDER "Log"
/// Config file name.
#define CONFIG_FILE_NAME "RpiTracker.json"
/// Application parameters.
struct Params
{
/// Video source params.
struct VideoSource
{
/// Init string.
std::string source{"0;1280;720;30;YUYV"};
JSON_READABLE(VideoSource, source)
};
/// Communication.
struct Communication
{
/// Control UDP port.
int controlUdpPort{50020};
/// Output UDP port.
int telemetryDstUdpPort{50021};
JSON_READABLE(Communication, controlUdpPort, telemetryDstUdpPort)
};
/// Video tracker.
struct VideoTracker
{
/// Tracking rectangle width.
int rectWidth{72};
/// Tracking rectangle height.
int rectHeight{72};
/// Width of search window.
int windowWidth{256};
/// Height of search window.
int windowHeight{256};
/// Option for lost mode.
int lostModeOption{0};
/// Use tracking rectangle auto size flag: false - no use, true - use.
bool rectAutoSize{true};
/// Use tracking rectangle auto position: false - no use, true - use.
bool rectAutoPosition{true};
/// Number of color channels for processing.
int numChannels{3};
/// Use parallel algorithms.
bool parallelAlgorithm{true};
JSON_READABLE(VideoTracker, rectWidth, rectHeight, windowWidth,
windowHeight, lostModeOption, rectAutoSize,
rectAutoPosition, numChannels, parallelAlgorithm)
};
/// Video source.
VideoSource videoSource;
/// Control channel.
Communication communication;
/// Video tracker.
VideoTracker videoTracker;
/// Video stream params.
cr::video::VStreamerParams videoStream;
JSON_READABLE(Params, communication, videoStream, videoSource, videoTracker)
};
/// Application params.
Params g_params;
/// Logger.
cr::utils::Logger g_log;
/// Log flag.
std::atomic<cr::utils::PrintFlag> g_logFlag{cr::utils::PrintFlag::CONSOLE};
/// Video tracker.
cr::vtracker::CvTracker g_tracker;
/**
* @brief Load configuration params from JSON file.
* @return TRUE if parameters loaded or FALSE if not.
*/
bool loadConfig();
/**
* @brief Execute a command with a child process.
* @param cmd Command string.
*/
std::string exec(const char* cmd);
/**
* @brief Get host IP.
* @return Host IP.
*/
std::string getHostIp();
/**
* @brief Tracker incoming command thread function.
*/
void trackerCommandThreadFunc();
int main(int argc, char **argv)
{
cr::utils::Logger::setSaveLogParams(LOG_FOLDER, "log", 20, 1);
// Welcome message.
g_log.print(cr::utils::PrintColor::YELLOW, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] " <<
"RpiTracker application v" << RPI_TRACKER_VERSION << std::endl;
// Check arguments and set console output if necessary.
if (argc > 1)
{
std::string str = std::string(argv[1]);
if (str == "-v" || str == "-vv")
{
g_logFlag.store(cr::utils::PrintFlag::CONSOLE_AND_FILE);
}
}
// Load config file.
if (!loadConfig())
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't load config file." << std::endl;
return -1;
}
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] " <<
"Config file loaded." << std::endl;
// Close all processes using RTSP server port.
std::string cmd = "sudo fuser -k " + std::to_string(g_params.videoStream.port) + "/tcp";
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] " <<
"Close processes: " << exec(cmd.c_str()) << std::endl;
// Close all processes using UDP server port.
cmd = "sudo fuser -k " + std::to_string(g_params.communication.controlUdpPort) + "/udp";
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] " <<
"Close processes: " << exec(cmd.c_str()) << std::endl;
// Init video source.
cr::video::VSource* videoSource;
if(g_params.videoSource.source.find("/dev/video") != std::string::npos)
{
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "]: " <<
"V4L2 video source detected." << std::endl;
videoSource = new cr::video::VSourceV4L2();
}
else
{
if(((g_params.videoSource.source.find("0") != std::string::npos) &&
g_params.videoSource.source.find("0") < 2) ||
((g_params.videoSource.source.find("1") !=
std::string::npos) && g_params.videoSource.source.find("1") < 2))
{
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "]: " <<
"libcamera video source detected." << std::endl;
videoSource = new cr::video::VSourceLibCamera();
}
else
{
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "]: " <<
"OpenCv video source detected." << std::endl;
videoSource = new cr::video::VSourceOpenCv();
}
}
if (!videoSource->openVSource(g_params.videoSource.source))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't initialise video source: " << g_params.videoSource.source << std::endl;
return -1;
}
// Init output UDP socket.
cr::clib::UdpSocket outSocket;
if (!outSocket.open(g_params.communication.telemetryDstUdpPort, false))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't init output UDP socket: " << g_params.communication.telemetryDstUdpPort << std::endl;
return -1;
}
// Set video tracker default params.
g_tracker.setParam(cr::vtracker::VTrackerParam::RECT_WIDTH, static_cast<float>(g_params.videoTracker.rectWidth));
g_tracker.setParam(cr::vtracker::VTrackerParam::RECT_HEIGHT, static_cast<float>(g_params.videoTracker.rectHeight));
g_tracker.setParam(cr::vtracker::VTrackerParam::SEARCH_WINDOW_WIDTH, static_cast<float>(g_params.videoTracker.windowWidth));
g_tracker.setParam(cr::vtracker::VTrackerParam::SEARCH_WINDOW_HEIGHT, static_cast<float>(g_params.videoTracker.windowHeight));
g_tracker.setParam(cr::vtracker::VTrackerParam::RECT_AUTO_SIZE, g_params.videoTracker.rectAutoSize ? 1.0f : 0.0f);
g_tracker.setParam(cr::vtracker::VTrackerParam::RECT_AUTO_POSITION, g_params.videoTracker.rectAutoPosition ? 1.0f : 0.0f);
g_tracker.setParam(cr::vtracker::VTrackerParam::NUM_CHANNELS, static_cast<float>(g_params.videoTracker.numChannels));
g_tracker.setParam(cr::vtracker::VTrackerParam::MULTIPLE_THREADS, g_params.videoTracker.parallelAlgorithm ? 1.0f : 0.0f);
cr::vtracker::VTrackerParams trackerParams;
/// Get host IP.
if (g_params.videoStream.ip == "")
{
g_params.videoStream.ip = getHostIp();
}
/// Codec for server
cr::video::VCodec *codec;
/* VCodecLibav supports software codec that can work both on RPi4 and RPi5.
* VCodecV4L2 is hardware codec that can work only on RPi4.
* It is more efficient to use hardware codec for RPi4 and software codec is only option for RPi5.
*/
#ifdef IS_RPI5
codec = new cr::video::VCodecLibav();
codec->setParam(cr::video::VCodecParam::TYPE, 1); // 0 - hardware codec, 1 - software codec.
#else
codec = new cr::video::VCodecV4L2();
#endif
// Init RTSP server.
cr::rtsp::RtspServer server;
if(!server.initVStreamer(g_params.videoStream, codec))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Rtsp server can't init: rtsp://" << g_params.videoStream.ip << ":" <<
g_params.videoStream.port << "/" << g_params.videoStream.suffix << std::endl;
return -1;
}
g_log.print(cr::utils::PrintColor::GREEN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "]: " <<
"RTSP server init: rtsp://" << g_params.videoStream.ip << ":" <<
g_params.videoStream.port << "/" << g_params.videoStream.suffix << std::endl;
/// Init frames.
cr::video::Frame sourceFrame;
cr::video::Frame yuvFrame;
yuvFrame.fourcc = cr::video::Fourcc::YUV24;
/// Init pixel format converter.
cr::video::FormatConverterOpenCv formatConverter;
/// Init buffer for telemetry data.
uint8_t telemetryData[MAX_TELEMETRY_SIZE];
/// Telemetry data ID.
uint8_t telemetryDataId = 0;
// Tracker command input thread.
std::thread trackerCommandThread(trackerCommandThreadFunc);
trackerCommandThread.detach();
// Main loop.
while (true)
{
// Wait new video frame.
if (!videoSource->getFrame(sourceFrame, 1000))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"No input video frame." << std::endl;
}
// Convert source frame to YUV24.
formatConverter.convert(sourceFrame, yuvFrame);
// Video tracking.
g_tracker.processFrame(yuvFrame);
// Get tracker results.
g_tracker.getParams(trackerParams);
// Encode tracker data.
telemetryData[0] = ++telemetryDataId;
telemetryData[1] = static_cast<uint8_t>(trackerParams.mode);
memcpy(&telemetryData[2], &trackerParams.frameWidth, sizeof(int));
memcpy(&telemetryData[6], &trackerParams.frameHeight, sizeof(int));
memcpy(&telemetryData[10], &trackerParams.rectX, sizeof(int));
memcpy(&telemetryData[14], &trackerParams.rectY, sizeof(int));
memcpy(&telemetryData[18], &trackerParams.rectWidth, sizeof(int));
memcpy(&telemetryData[22], &trackerParams.rectHeight, sizeof(int));
int telemetrySize = 26;
// Send telemetry data first time.
if (outSocket.send(telemetryData, telemetrySize) != telemetrySize)
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't send telemetry data." << std::endl;
}
// Wait 50 usec to send again.
std::chrono::time_point<std::chrono::system_clock> packetSlotStart = std::chrono::system_clock::now();
int slotTimeMks = static_cast<int>(std::chrono::duration_cast<std::chrono::microseconds>(
std::chrono::system_clock::now() - packetSlotStart)
.count());
while (slotTimeMks < 50)
{
slotTimeMks = static_cast<int>(std::chrono::duration_cast<std::chrono::microseconds>(
std::chrono::system_clock::now() - packetSlotStart)
.count());
}
packetSlotStart = std::chrono::system_clock::now();
// Send telemetry data second time in order to reduce packet loss.
// If packet loss is not critical or the communication channel is reliable, this part can be removed.
if (outSocket.send(telemetryData, telemetrySize) != telemetrySize)
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't send telemetry data." << std::endl;
}
// Draw results on the luma (Y) plane. The frame is NV12 at this point, so
// it must be wrapped as a single-channel image: a 3-channel wrapper would
// read and write past the end of the frame buffer. Only the first channel
// of the color is used, as luma intensity.
cv::Scalar rectangleColor;
if (trackerParams.mode == 0)
{
rectangleColor = cv::Scalar(255); // Bright.
}
else if (trackerParams.mode == 1)
{
rectangleColor = cv::Scalar(76); // Dark.
}
else
{
rectangleColor = cv::Scalar(180); // Light gray.
}
cv::Mat image(yuvFrame.height, yuvFrame.width, CV_8UC1, yuvFrame.data);
int x0 = trackerParams.rectX - trackerParams.rectWidth / 2;
int y0 = trackerParams.rectY - trackerParams.rectHeight / 2;
cv::rectangle(image, cv::Rect(x0, y0, trackerParams.rectWidth, trackerParams.rectHeight), rectangleColor, 2);
// Send frame to RTSP server.
if (!server.sendFrame(yuvFrame))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't send frame to RTSP server" << std::endl;
continue;
}
}
return 1;
}
bool loadConfig()
{
cr::utils::ConfigReader config;
// Open JSON config file (if it does not exist, create a new one and exit).
if(config.readFromFile(CONFIG_FILE_NAME))
{
// Read values and set to params
if(!config.get(g_params, "Params"))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Params were not read" << std::endl;
return false;
}
}
else
{
// Set default params.
config.set(g_params, "Params");
// Save config file.
if (!config.writeToFile(CONFIG_FILE_NAME))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't save config file." << std::endl;
return false;
}
g_log.print(cr::utils::PrintColor::CYAN, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "]: " <<
"Config file created." << std::endl;
// Exit so the user can review and adjust the default parameters.
return false;
}
return true;
}
// Execute a shell command and return its standard output as a string.
std::string exec(const char* cmd)
{
char buffer[128];
std::string result = "";
FILE* pipe = ::popen(cmd, "r");
if (!pipe)
{
throw std::runtime_error("popen() failed!");
}
try
{
while (fgets(buffer, sizeof buffer, pipe) != NULL)
{
result += buffer;
}
}
catch (...)
{
pclose(pipe);
throw;
}
pclose(pipe);
return result;
}
// Get the first non-loopback IPv4 address of the host ("127.0.0.1" if none).
std::string getHostIp()
{
// Get interfaces.
struct ifaddrs *ptr_ifaddrs = nullptr;
auto result = getifaddrs(&ptr_ifaddrs);
if (result != 0)
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"getifaddrs() failed: " << std::strerror(errno) << std::endl;
return "127.0.0.1";
}
// Find the first suitable IPv4 address.
std::string hostIp = "127.0.0.1";
for (struct ifaddrs *ptr_entry = ptr_ifaddrs;
ptr_entry != nullptr;
ptr_entry = ptr_entry->ifa_next)
{
// Be aware that `ifa_addr` might be nullptr for some interfaces.
if (ptr_entry->ifa_addr == nullptr)
{
continue;
}
if (ptr_entry->ifa_addr->sa_family == AF_INET)
{
// IPv4.
char buffer[INET_ADDRSTRLEN] = {};
inet_ntop(
AF_INET,
&((struct sockaddr_in *)(ptr_entry->ifa_addr))->sin_addr,
buffer,
INET_ADDRSTRLEN);
std::string ipAddress(buffer);
// Skip loopback and unspecified addresses.
if (ipAddress != "127.0.0.1" && ipAddress != "0.0.0.0")
{
hostIp = ipAddress;
break;
}
}
}
// Free the interface list before returning to avoid a memory leak.
freeifaddrs(ptr_ifaddrs);
return hostIp;
}
void trackerCommandThreadFunc()
{
// Init input UDP socket for tracker commands.
cr::clib::UdpSocket inputSocket;
if (!inputSocket.open(g_params.communication.controlUdpPort, true))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't init input UDP socket: " << g_params.communication.controlUdpPort << std::endl;
return;
}
uint8_t commandData[MAX_COMMAND_SIZE];
while (true)
{
// Execute tracker commands.
int commandSize = inputSocket.read(commandData, MAX_COMMAND_SIZE);
if (commandSize > 0)
{
// Execute command.
if (!g_tracker.decodeAndExecuteCommand(commandData, commandSize))
{
g_log.print(cr::utils::PrintColor::RED, g_logFlag.load()) <<
"[" << __LOGFILENAME__ << "][" << __LINE__ << "] ERROR: " <<
"Can't execute command." << std::endl;
}
}
}
}