Meet Alpha, our quadruped robot! It was built to investigate the future of autonomous robotic companionship, and as development progresses we intend to upgrade it with AI applications. The inspiration for this project came from the Boston Dynamics "Spot Mini" robot. The prototype was completed in a month on a $200 budget.
The 3D design and files on Thingiverse.com were originally created by Kim Deok-yeon. We made small structural and design changes to the parts in Fusion 360 to meet our requirements. Link - https://www.thingiverse.com/thing:3445283
Deok-yeon's design does not include any electronics or coding; these are things that we must create!
Impact Statement
The project showcases the integration of edge AI and robotics into a quadruped robot platform, demonstrating the growing accessibility of embedding intelligent features into physical devices. The robot has 12 degrees of freedom and inverse kinematics algorithms, enabling high mobility and naturalistic movements. It also has ultrasonic sensors, a camera, and deep learning-based object and face recognition capabilities. The project also incorporates voice control and emotional expression through an LED display, making it accessible to a wider audience of makers, students, and hobbyists.
Collecting Parts
PCA9685 - 16 Channel 12-Bit PWM Servo Motor Driver I2C Module For Arduino
B3 Lithium Polymer (LiPo) Battery Charger for 2S-3S Lipo
300W 20A DC-DC Buck Converter Step-down Module Constant Current LED Driver Module
LM2596 DC-DC Buck Converter Adjustable Step-Down Power Supply Module
Tower Pro MG996R Digital Metal Gear High Torque Servo Motor (180 Degree Rotation)
Lipo Battery Voltage Tester with Buzzer Alarm
1602 (16x2) LCD Display with I2C/IIC interface - Blue Backlight
Digital Multi Servo Tester ESC CCPM Consistency Master Speed Control
Most parts are printed in PLA, except the feet, which are 3D printed in TPU. Both are printed with 10% infill.
The black parts represent Alpha's "skeletal framework." The robot has 12 high-torque metal gear servos, three on each leg, for a wide range of motion.
To see the full demonstration video, click on the YouTube Video below.
Here is a brief overview of our robot's main attributes:
Twelve degrees of freedom, three per leg, allowing for biomimetic motion.
Inverse kinematics for adaptable movements such as sitting, standing, and walking.
Two ultrasonic sensors to detect and avoid obstacles.
Its "eye," a Maixduino Kit Cam, offers a real-time point-of-view feed.
The Maixduino Kit Cam uses the YOLO algorithm for face and object recognition.
An LED display for real-time program status updates.
A 3S Lithium-Polymer battery that provides about 30 minutes of runtime.
Assembling the robot is enjoyable and quite simple; it's similar to assembling a Lego set.
Circuit Connection
Both the ESP32-CAM and the Maixduino Kit for AI+IoT share features such as GPIO pin configuration, Wi-Fi capabilities, camera support, and programming interfaces, despite the Maixduino Kit being specifically designed for AI applications and based on a RISC-V architecture.
Because of this, their functional similarities allow wiring and circuit designs from one to be easily modified for use in the other.
Object Classification on Maixduino
Flashing the Firmware
Required Tools and Files
1. Kflash GUI: Used to flash firmware onto the Maixduino board.
Download link: https://github.com/sipeed/kflash_gui/releases/tag/v1.8.1
2. Firmware Files: Choose one of the following firmware files based on your requirements:
• maixpy_v0.6.3_2_gd8901fd22_openmv_kmodel_v4_with_ide_support.bin
• maixpy_v0.6.3_2_gd8901fd22_minimum_with_kmodel_v4_support.bin
• maixpy_v0.6.3_2_gd8901fd22_minimum_with_ide_support.bin
Download link: https://dl.sipeed.com/shareURL/MAIX/MaixPy/release/master/maixpy_v0.6.3_2_gd8901fd22
Flashing Steps
1. Open Kflash GUI.
2. Connect the Maixduino board to your computer.
3. Load the chosen firmware file in Kflash GUI.
4. Select the correct serial port and set the baud rate.
5. Click "Flash" to upload the firmware to the board.
6. Wait for the flashing process to complete, then disconnect and reconnect the Maixduino board.
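If you prefer a terminal over the GUI, the kflash tool that Kflash GUI is built on can be installed from PyPI and run directly. This is a sketch, not part of the original guide: the serial port (/dev/ttyUSB0) and firmware filename are examples you should adjust for your setup.

```shell
# Install the kflash flasher (the same engine Kflash GUI uses)
pip3 install kflash

# Flash the chosen MaixPy firmware to a Maixduino.
# -B selects the board profile, -p the serial port, -b the baud rate.
kflash -B maixduino -p /dev/ttyUSB0 -b 1500000 \
    maixpy_v0.6.3_2_gd8901fd22_minimum_with_ide_support.bin
```

After flashing completes, disconnect and reconnect the board as in step 6.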
Setting Up the MaixPy IDE
Required Tool
• MaixPy IDE: The interface for writing and executing scripts on Maixduino.
• Download link: https://dl.sipeed.com/shareURL/MAIX/MaixPy/ide/v0.2.5
Steps
Install MaixPy IDE on your computer.
Open MaixPy IDE and select your Maixduino board model from the Tool menu.
Click the Connect button at the bottom left to connect the IDE to the Maixduino board.
Transferring Files with uPyLoader
Required Tool
uPyLoader: Allows easy file management on Maixduino’s file system.
Download link: https://github.com/BetaRavener/uPyLoader/releases/tag/v0.1.4
Open uPyLoader and connect it to your Maixduino board.
Transfer any necessary files, like labels and scripts, to the board’s flash memory.
Note: If an error occurs during the first file transfer, go to the File menu and select Init transfer files to initialize.
Loading the Model and Labels
Required Files
Model File: mobilenet_0x300000.kfpkg
Download link: Mobilenet Model
Labels File: labels.txt (contains object classification labels)
Download link: Labels.txt
Steps
Use Kflash GUI to flash the mobilenet_0x300000.kfpkg model to the Maixduino board.
Transfer labels.txt to the file system using uPyLoader.
Adjusting the Garbage Collection Heap Size
Run the following script in MaixPy IDE to adjust the garbage collection heap size.
Once executed, this will reset the board with the new GC heap size.
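The script itself was not reproduced above; a minimal sketch of what it typically looks like in MaixPy is shown below. This is board-side MicroPython, not desktop Python, and the 0x80000 heap size is an example value to tune for your model.

```python
# MaixPy (MicroPython on the Maixduino) -- run from the MaixPy IDE.
# Maix.utils and machine are board-side modules.
from Maix import utils
import machine

# Reserve a smaller GC heap so more RAM is left for the KPU model.
# 0x80000 (512 KB) is an example value; adjust it for your model size.
utils.gc_heap_size(0x80000)

# The new heap size takes effect after a reset.
machine.reset()
```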
Running the Object Classification Script
Script link: https://github.com/Circuit-Digest/Maixduino-AI-Projects/tree/main/1000%20Object%20Detection
Steps
Ensure both labels.txt and the mobilenet_0x300000.kfpkg model are in place.
mobilenet_0x300000.kfpkg
labels.txt
Run the script. The Maixduino board should now begin object classification, displaying the identified object and confidence score on the LCD screen.
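For reference, the classification loop in the linked script generally follows this standard MaixPy MobileNet pattern. This is a sketch rather than the exact repository code; 0x300000 is the flash address the model was flashed to in the previous step.

```python
# MaixPy sketch of MobileNet classification on the Maixduino (board-side code).
import sensor, lcd
import KPU as kpu

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.set_windowing((224, 224))   # MobileNet input size
sensor.run(1)

# Class names transferred to flash with uPyLoader
with open("/flash/labels.txt") as f:
    labels = [line.strip() for line in f]

# Load the model from the flash address it was flashed to
task = kpu.load(0x300000)

while True:
    img = sensor.snapshot()
    fmap = kpu.forward(task, img)    # run inference on the frame
    scores = fmap[:]                 # per-class confidence scores
    best = scores.index(max(scores))
    img.draw_string(0, 0, "%s %.2f" % (labels[best], scores[best]))
    lcd.display(img)
```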
This project will heavily rely on the Maixduino Kit to manage all edge AI functions, including object detection, voice recognition, and real-time data processing. The Maixduino Kit will be the central brain of the quadruped robot, handling advanced computer vision and audio capabilities.
The Maixduino Kit's camera interface and powerful AI processing will enable the robot to detect and identify objects as well as recognize faces. This will allow the robot to track and follow objects or people, further enhancing its awareness and adaptability within the environment. Additionally, the Maixduino's voice control system will enable the robot to comprehend and react to voice commands, providing a more intuitive and interactive user experience.
To support Maixduino's edge AI functions, the robot will incorporate ultrasonic sensors for obstacle avoidance, motion detection, and enhanced environmental perception. The seamless integration between the Maixduino Kit and these peripheral sensors will ensure the robot can navigate its surroundings safely and intelligently.
Furthermore, the robot's LED display will serve as a means for the Maixduino to communicate and express emotions, providing visual feedback to users and showcasing the robot's real-time perceptions. This multi-modal interaction, combining computer vision, voice control, and emotional expression, will create a more engaging and responsive user experience.
Inverse Kinematics
Now that we have the inverse-kinematic model for the robot, we can use it to compute the joint angles needed for useful movements of each leg.
Because the robot's arm and elbow segments are of equal length, each leg forms an isosceles triangle. By applying the cosine rule to this SSS triangle to determine the shoulder angle and arm angle, we can set the robot's height.
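As a concrete sketch of that SSS-triangle calculation, the function below solves the two joint angles of one leg in its plane using the cosine rule. The 108 mm link lengths are placeholder values, not measurements from the printed parts; substitute your own.

```python
import math

def leg_ik(x, y, l1=108.0, l2=108.0):
    """Planar 2-link IK for one leg: hip at the origin, y pointing down.

    Returns (shoulder_deg, knee_deg). l1/l2 are the arm and elbow
    segment lengths in mm (placeholders, equal as on the robot).
    """
    d = math.hypot(x, y)                       # hip-to-foot distance
    if not abs(l1 - l2) <= d <= l1 + l2:
        raise ValueError("target out of reach")
    # SSS triangle: the cosine rule gives the interior angles directly
    knee = math.acos((l1**2 + l2**2 - d**2) / (2 * l1 * l2))
    alpha = math.acos((l1**2 + d**2 - l2**2) / (2 * l1 * d))
    shoulder = math.atan2(x, y) + alpha
    return math.degrees(shoulder), math.degrees(knee)
```

Lowering the target point raises `d`, which opens the knee angle toward 180° (fully extended leg), and that is exactly how the robot's standing height is set.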
Coding
The Sipeed Maixduino supports numerous well-known frameworks, such as the Arduino IDE, PlatformIO IDE, and MaixPy IDE. It also supports several real-time operating systems, such as RT-Thread and FreeRTOS.
IMU Sensor Integration
To keep the body horizontal, Spot Micro uses a PID controller in conjunction with an IMU (MPU-6050).
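A bare-bones version of that control loop is sketched below. The gains are illustrative rather than tuned, and a toy one-dimensional pitch model stands in for the MPU-6050 reading and the leg-servo response on the real robot.

```python
class PID:
    """Minimal PID controller; gains here are illustrative, not tuned."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def level_body(initial_pitch_deg, steps=250, dt=0.02):
    """Toy simulation: the PID correction directly drives the pitch rate,
    standing in for the leg-height adjustments on the real robot."""
    pid = PID(kp=2.0, ki=0.0, kd=0.1, dt=dt)
    pitch = initial_pitch_deg
    for _ in range(steps):
        correction = pid.update(0.0 - pitch)   # setpoint: level body (0 deg)
        pitch += correction * dt
    return pitch
```

On the robot the same loop would read pitch and roll from the MPU-6050 each cycle and feed the corrections into the inverse-kinematics leg heights.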
Voice Controlled Spot
A walking gait implementation and voice-commanding were achieved: https://github.com/cholan2100/ceasar
Turn the robot on while it is sleeping.
When the robot gets power, it will instantly stand up.
If there is a configuration error, the robot will remain idle for 10 seconds, allowing you to maintain control over it.
After ten seconds, the robot will begin practising walking in place.
Three seconds later, the robot will begin to walk in a straight line. The robot may yaw unintentionally while walking; we'll fix this in a future revision.
Raspberry Pi 5 Version
Reference: https://www.instructables.com/Quadruped-Robot-Alpha-ESP32-Based-Spot-Micro-Robot/
For all the code, follow the GitHub link below: