Hello
I have just ordered one of these and I'm wondering if anyone is working towards autonomy, as in navigation, goals, etc.
The reinforcement learning OpenCat Gym thread shows there is a way to incorporate some form of AI, although a tethered implementation has obvious limitations.
Has anyone used the MU camera for this type of thing? Or maybe proximity sensors or something similar?
I'm just looking for a starting point, and if someone has already done some work in this area it would be helpful to know.
A bit about me: I am a game programmer with decades of experience and have 'played' with neural networks, but nothing too involved so far. I have no experience at all in robotics, and the Bittle looks like an interesting way to start.
Thanks
@Tegleg Just found your post. I am very interested in having autonomous behavior for Bittle. My approach is to use my laptop as the "brain" and use Bluetooth communication as part of the Bittle "nervous system". Initially, the sensors would be the IMU for heading info and the distance sensor for obstacle detection. The Mu camera would come later for me.
Happy to collaborate on autonomous behavior for Bittle if you're still working in this area.
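To make the "laptop as brain" idea concrete, here is a minimal Python sketch. The decision logic is plain code; the serial side assumes pyserial and a paired Bittle, and the skill strings ("kwkF", "kwkL", "kbalance") only follow the style of Petoi's serial commands — treat them and the port name as placeholders and check the Petoi Doc Center for the exact tokens your firmware accepts.

```python
# Hypothetical obstacle-avoidance "brain" for Bittle, run on a laptop.
# Pure decision logic is separated from the hardware link so it can be
# developed and tested without the robot attached.

def decide_action(distance_cm, stop_at=20.0, turn_at=40.0):
    """Map a forward distance reading (cm) to a skill command string.

    The returned tokens are placeholders in the style of Petoi's
    serial skill commands -- verify the real strings for your firmware.
    """
    if distance_cm < stop_at:
        return "kbalance"   # too close: stop and stand still
    if distance_cm < turn_at:
        return "kwkL"       # obstacle ahead: turn away from it
    return "kwkF"           # path clear: walk forward


def drive(read_distance, port="/dev/rfcomm0", baud=115200):
    """Control loop over the Bluetooth serial link.

    read_distance: a callable returning the forward distance in cm
    (e.g. parsed from the ultrasonic sensor's replies -- the query
    protocol is sensor-specific, so it is passed in here).
    Requires pyserial (pip install pyserial); the port name is an
    assumption and depends on how the Bluetooth module is paired.
    """
    import serial  # imported here so the decision logic stays hardware-free
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            distance = read_distance()
            link.write((decide_action(distance) + "\n").encode())
```

The split matters in practice: `decide_action` can be unit-tested (or later swapped for a trained policy) on the laptop, while `drive` is the only part that touches the robot.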
The Bittle is indeed an interesting way to start. There are some extensible modules and a lot of useful tech docs in the Petoi Doc Center. For more details, please refer to:
Good luck and have fun.😀
OK, after a bit more digging I found this page about the Grove Ultrasonic Distance Sensor:
https://www.yuque.com/tinkergen-help-en/bittle_course/sensor_pack_lesson_4
Is this basically an HC-SR04 ultrasonic module?
Is it compatible with Bittle?
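On the first question: both modules measure distance the same way, by emitting a 40 kHz ping and timing the echo; the main practical difference is the pinout (the Grove sensor uses a single shared signal pin, while the HC-SR04 has separate TRIG and ECHO pins). The distance conversion is the standard time-of-flight formula either way, sketched here:

```python
# Convert an ultrasonic echo pulse width to distance. This is the same
# math for the Grove ultrasonic sensor and the HC-SR04: the ping travels
# to the obstacle and back, so the one-way distance is half the round trip.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_cm(pulse_width_us):
    """Convert echo pulse width (microseconds) to distance (cm)."""
    return pulse_width_us * SPEED_OF_SOUND_CM_PER_US / 2.0

print(echo_to_cm(1000))  # a 1000 us echo -> 17.15 cm
```

This is also where the common `distance = duration / 58` shortcut in Arduino examples comes from (1 / (0.0343 / 2) ≈ 58.3 µs per cm). Note the speed of sound drifts slightly with temperature, so readings are approximate.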