Week 9: Bringing It All Together
Anshul B -
Welcome back! My name is Anshul, and this week was a huge milestone for ICON—we successfully integrated every component, both hardware and software, into a working setup on a small-scale car. After weeks of building and testing each component, our focus this week was making sure everything connected and functioned as one.
On the software side, we finished and tested all three major algorithms: the Convolutional Neural Network (CNN) for object classification, the Depth from Defocus (DFD) method for measuring distance, and the steering command system that tells the car how to respond based on the outputs of the other two. Getting these algorithms to work together smoothly was a huge accomplishment, and our GitHub pipeline is what keeps them integrated in one codebase.
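To give a feel for how the three algorithms hand off to one another, here is a minimal sketch of the loop. The function names, return values, and the 2-metre threshold are illustrative placeholders, not our actual code.

```python
# Illustrative sketch of the perception-to-steering hand-off.
# classify(), estimate_distance(), and steer() are hypothetical stand-ins
# for the team's CNN, DFD, and steering-command modules.

def classify(image):
    # CNN stand-in: returns a label for the dominant object in the frame
    return "obstacle"

def estimate_distance(near_focus, far_focus):
    # DFD stand-in: compares the blur in two differently focused captures
    return 1.5  # metres (placeholder value)

def steer(label, distance_m):
    # Toy rule: stop for close obstacles, otherwise keep driving
    if label == "obstacle" and distance_m < 2.0:
        return "STOP"
    return "FORWARD"

command = steer(classify("frame0"), estimate_distance("frameA", "frameB"))
print(command)  # STOP
```

The real system runs this loop continuously on the Raspberry Pi, but the shape of the data flow is the same: classification and distance in, a steering command out.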
The main task this week was syncing all the software with the hardware: soldering the wires between the Raspberry Pi and the motor driver, confirming the motor driver powers on and supplies voltage correctly, and configuring the GitHub CI/CD pipeline so new code can be deployed to the Raspberry Pi in real time. Our webcam is now mounted and captures two images at different focal lengths to enable the DFD calculations. It was all hands on deck, and each teammate contributed to ensure no connection was missed.
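For readers curious how code pushed to GitHub ends up on the car: one common approach is to register the Raspberry Pi as a self-hosted GitHub Actions runner, roughly like the workflow below. The file name, branch, and restart script are assumptions for illustration, not our exact configuration.

```yaml
# .github/workflows/deploy.yml (illustrative sketch, names assumed)
name: Deploy to car
on:
  push:
    branches: [main]
jobs:
  run-on-pi:
    # "self-hosted" routes the job to a runner registered on the Raspberry Pi
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - name: Restart the driving software
        run: ./restart_icon.sh   # hypothetical restart script on the Pi
```

With a setup like this, every push to the main branch checks out the latest code directly on the Pi, so the car is always running what is in the repository.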
While there wasn’t much independent research this week, collaboration was key. The CNN is running directly on the Raspberry Pi, classifying objects and obstacles. Meanwhile, the DFD system estimates distances, and the steering commands control the car accordingly.
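For anyone wondering how blur can turn into distance, DFD rests on the thin-lens equation: an object at distance u is in sharp focus only when 1/f = 1/u + 1/v, where f is the focal length and v is the lens-to-sensor distance, and the amount of defocus blur in the two captures encodes how far the object sits from that in-focus plane. A toy calculation with invented lens numbers:

```python
# Thin-lens relation behind depth from defocus (values are illustrative).
# An object at distance u is sharp when 1/f = 1/u + 1/v; objects off that
# plane appear blurred, and the blur difference between two captures is
# what the DFD algorithm converts into a distance estimate.

def in_focus_distance(f_mm, v_mm):
    # Solve 1/f = 1/u + 1/v for the object distance u
    return 1.0 / (1.0 / f_mm - 1.0 / v_mm)

# Example: a 4 mm lens with the sensor 4.1 mm behind it focuses ~164 mm away
print(round(in_focus_distance(4.0, 4.1)))  # 164
```

Capturing two images at different focus settings gives two such reference planes, which is what lets the algorithm pin down where between them the object actually lies.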
We’re excited for what’s next: full system testing. ICON is almost ready for the classroom floor! Our tests will begin next week, so stay tuned!