Hello everyone,
I'm excited to share my experiences and insights on video streaming capabilities when using Bittle in combination with a Raspberry Pi. I've explored two methods to achieve this, each with its own set of advantages and technical considerations.
Method 1: Python with Socket Protocol and OpenCV
Overview: Using Python's socket library together with OpenCV, I developed a streamlined process for video streaming.
Repository & Configuration: For those interested in the technical specifics or looking to replicate this setup, you can find the details in my GitHub repository here. It includes all necessary Raspberry Pi configurations to integrate seamlessly with the NyBoard.
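To give a feel for the approach without digging into the repo, here is a minimal sketch of the sender side. The host, port, camera index, and JPEG quality are placeholder values for illustration, not the repository's actual configuration:

# sender.py (hypothetical name): runs on the Pi, streams JPEG frames over TCP
import socket
import struct

import cv2

HOST = '0.0.0.0'  # listen on all interfaces
PORT = 8485       # placeholder port, pick anything free

cap = cv2.VideoCapture(0)  # first V4L2 device, e.g. the Pi camera

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((HOST, PORT))
server.listen(1)
conn, _ = server.accept()

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # JPEG-encode to cut bandwidth; quality trades size for fidelity
        ok, buf = cv2.imencode('.jpg', frame, [cv2.IMWRITE_JPEG_QUALITY, 40])
        if not ok:
            continue
        data = buf.tobytes()
        # Length-prefix each frame so the receiver knows where it ends
        conn.sendall(struct.pack('>I', len(data)) + data)
finally:
    cap.release()
    conn.close()
    server.close()

The receiving side loops on reading the 4-byte length prefix, then that many bytes, and decodes each frame with cv2.imdecode.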
Method 2: ROS2 Integration
Overview: Adopting ROS2, I focused on publishing and subscribing to topics for video streaming.
ROS2 Package: The repository for the ROS2 package can be found here. This package is based on instructions originally designed for ROS Melodic, detailed here.
Technical Journey: Inspired by the bittle_driver.py in the Melodic repository, I ventured into creating a compatible ROS2 package using ROS2 Humble. My preference for the Raspberry Pi camera led me to explore additional ROS2 packages, culminating in the adaptation of the ROS2 Bittle package for video streaming, guided by this article.
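For anyone who wants to check the ROS2 stream from a desktop machine, a minimal viewer node might look like the sketch below. The node name and topic are assumptions based on v4l2_camera's default compressed image transport topic, not the package's actual nodes:

# viewer.py: hypothetical minimal subscriber that displays the compressed stream
import cv2
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import CompressedImage


class CompressedViewer(Node):
    def __init__(self):
        super().__init__('compressed_viewer')
        # v4l2_camera publishes JPEG frames on this topic via image transport
        self.create_subscription(
            CompressedImage, '/image_raw/compressed', self.on_image, 10)

    def on_image(self, msg):
        # Decode the JPEG payload straight into an OpenCV BGR image
        frame = cv2.imdecode(np.frombuffer(msg.data, np.uint8), cv2.IMREAD_COLOR)
        if frame is not None:
            cv2.imshow('bittle', frame)
            cv2.waitKey(1)


def main():
    rclpy.init()
    rclpy.spin(CompressedViewer())


if __name__ == '__main__':
    main()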
Performance and Configuration
Both methods have demonstrated effectiveness, with video quality closely tied to how image capture and compression are configured. My experiments were primarily conducted on the Raspberry Pi Zero 2W and Raspberry Pi 4. While I have not quantified the frame rate of the socket-based Python approach, since it proved less suitable for this application, the ROS2 framework has consistently delivered superior video quality.
Achievable Frame Rates:
Raspberry Pi Zero 2W: frame rates in the high 20s fps, achievable with the launch file configuration detailed below.
Raspberry Pi 4: frame rates in the low 30s fps, with JPEG quality set to 40%.
Sample launch file (robot_joystic_launch.py):

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Joystick driver for the Bittle
        Node(
            package='bittle_ros2',
            executable='joystick_driver',
            name='main_module',
            output='screen'
        ),
        # Camera node publishing the Raspberry Pi camera feed
        Node(
            package='v4l2_camera',
            executable='v4l2_camera_node',
            name='camera',
            output='screen',
            parameters=[
                {'video_device': '/dev/video0'},
                {'output_encoding': 'rgb8'},
                {'pixel_format': 'YUYV'},
                {'image_size': [640, 480]},
                {'io_method': 'mmap'},
                # JPEG quality for the compressed topic (lower = less bandwidth)
                {'image_raw.jpeg_quality': 25}
            ]
        )
    ])
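Assuming the launch file is installed with the bittle_ros2 package, it can be started with:

ros2 launch bittle_ros2 robot_joystic_launch.py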
Challenges and Ongoing Troubleshooting
Despite achieving satisfactory frame rates, I've encountered intermittent drops in frame rate, leading to sporadically choppy video. Identifying the exact cause remains a challenge; so far I have observed no clear correlation between network I/O, CPU usage, and video quality.
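To make the drops easier to line up against CPU and network traces, a small throwaway node can log the received frame rate in fixed windows. A sketch, with the topic name assumed from the launch file above:

# fps_monitor.py: hypothetical diagnostic node for chasing frame-rate drops
import time

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import CompressedImage


class FpsMonitor(Node):
    def __init__(self):
        super().__init__('fps_monitor')
        self.count = 0
        self.t0 = time.monotonic()
        self.create_subscription(
            CompressedImage, '/image_raw/compressed', self.on_image, 10)

    def on_image(self, _msg):
        self.count += 1
        now = time.monotonic()
        if now - self.t0 >= 5.0:  # report every 5 seconds
            self.get_logger().info(f'{self.count / (now - self.t0):.1f} fps')
            self.count = 0
            self.t0 = now


def main():
    rclpy.init()
    rclpy.spin(FpsMonitor())


if __name__ == '__main__':
    main()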
An important consideration for Raspberry Pi 4 users is that the NyBoard cannot supply adequate voltage, which can lead to system crashes. My workaround has been to wire a voltage regulator from the + and - terminals on the NyBoard to the GPIO pins on the Raspberry Pi.
Next Steps
Updates and breakthroughs will be shared in this forum.
I hope this framework serves as a solid starting point for those looking to explore video streaming capabilities on their robotic projects. Your feedback and contributions are highly welcomed.
Update for Pi Zero 2W: I've found that setting the image size to 320x240 with jpeg_quality at 20 provides low-latency streaming.
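In the launch file above, that corresponds to changing the camera parameters to:

{'image_size': [320, 240]},
{'image_raw.jpeg_quality': 20}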