Week 7: Automating The Code & Moving the Car

Anshul B -

Hello and welcome back to my blog! My name is Anshul, and this week has been all about automating our code and getting our car moving. We’ve implemented a system that keeps the code on the Raspberry Pi up to date automatically, and we’ve begun writing the control script that handles the car’s movement.

Creating a CI/CD Pipeline

This week, one of the major developments was setting up a Continuous Integration/Continuous Deployment (CI/CD) pipeline. This system ensures that any changes pushed to our GitHub repository are automatically deployed to the Raspberry Pi, with no manual updates needed. With this setup, we can debug and test our code in real time. I have also included a picture of us creating our CI/CD pipeline on the Raspberry Pi’s monitor.
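
To give a concrete picture of what the deployment half of such a pipeline does, here is a minimal sketch of the kind of script that could run on the Pi after each push (for example, triggered over SSH by the pipeline). The repository path and service name below are hypothetical placeholders, not our exact configuration.

    #!/usr/bin/env bash
    # Minimal deploy sketch: pull the latest pushed code and restart the app.
    # REPO_DIR and SERVICE_NAME are hypothetical placeholders.
    set -euo pipefail

    REPO_DIR="$HOME/icon"          # assumed location of the repository clone on the Pi
    SERVICE_NAME="icon.service"    # assumed systemd unit that runs our code

    cd "$REPO_DIR"
    git fetch origin
    git reset --hard origin/main   # sync the working tree to the latest pushed commit

    # Restart the service so the freshly deployed code takes effect.
    sudo systemctl restart "$SERVICE_NAME"
    echo "Deployed commit $(git rev-parse --short HEAD)"

In our setup, GitHub handles the test and build steps before a deploy step like this ever runs on the Pi.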

Starting the Control Code

We’ve also begun writing a Bash script to control and move the car. The script listens for input from a Bluetooth-connected mouse and interprets specific button presses and scroll events as movement commands; for example, scrolling up turns the car right, and scrolling down turns it left. These commands go to the Raspberry Pi, which relays them to the motor drivers, and the motor drivers in turn control the voltage delivered to each motor, moving the car.
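
As a rough illustration (not our finished script), the sketch below shows one way a Bash loop could turn scroll-wheel events into movement commands, assuming evtest is installed and the mouse shows up as an input event device. The device path and the send_command helper are hypothetical placeholders, and the exact evtest output format can vary.

    #!/usr/bin/env bash
    # Sketch: map mouse scroll-wheel events to turn commands.
    # /dev/input/event2 is a hypothetical path; find the real one with
    # `cat /proc/bus/input/devices`.
    MOUSE_DEV="/dev/input/event2"

    send_command() {
        # Hypothetical stand-in for whatever tells the motor drivers what to do.
        echo "command: $1"
    }

    # evtest prints one line per input event; for the scroll wheel,
    # REL_WHEEL with value 1 is scroll up and value -1 is scroll down.
    sudo evtest "$MOUSE_DEV" | while read -r line; do
        case "$line" in
            *"(REL_WHEEL)"*"value 1"*)  send_command "turn_right" ;;  # scroll up
            *"(REL_WHEEL)"*"value -1"*) send_command "turn_left"  ;;  # scroll down
        esac
    done

In the real script, send_command would hand the command off to the code that drives the motor drivers instead of just printing it.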

Project Progress

This week, my research on the Depth-from-Defocus (DFD) method is wrapping up, meaning we will soon be able to integrate it into our GitHub repository (I have already added the CNN I built to the repository). Finally, getting the car to move is a huge milestone for ICON, as it means we can now begin implementing and testing our code. Thank you for reading this week’s blog, and stay tuned for next week’s updates as we continue working on ICON!


Comments:


    Rahul Patel
    Hey Anshul, incredible progress—getting the car moving and automating deployments is a huge step! The CI/CD pipeline will definitely make iteration smoother. I’m curious, as you transition from Bluetooth-based controls to full autonomy, how will you handle edge cases where the model might misinterpret image data? Can’t wait to see how ICON tackles real-world variability!
    aashi_h
    Your project seems so interesting Anshul! Could you explain a little bit about the process of setting up the CI/CD pipeline?
    Anshul Baddi
    Hey Rahul, thanks for your insightful question. There are cases where the model misinterprets images; however, we optimize the CNN through many rounds of forward and backward propagation during training to reduce those errors.
    camille_bennett
    Hi Anshul, Thank you for sharing! So cool that your car is on the move. Is there a reason you chose a mouse for your controller versus another device?
    tanay_n
    Hi Anshul! Your progress with ICON is impressive, especially the automation of deployments and the implementation of movement control via a Bluetooth mouse. Given your research on Depth-from-Defocus (DFD) and its upcoming integration, how do you plan to leverage this technique for your project?
    Anshul Baddi
    Hey Aashi, that's a great question. Our CI/CD pipeline uses GitHub Actions to automatically test, build, and deploy the project to the Raspberry Pi 5 over SSH or Docker. We then use systemd to manage the app as a service, so it runs continuously and restarts on failure.
    Anshul Baddi
    Hey Ms. Bennett, we used the mouse last week just as a test to make sure ICON is able to move. We will implement our actual code next week and see how it runs!
    Anshul Baddi
    Hey Tanay, that's a wonderful question. The DFD method will be used to estimate distance, and based on that distance, a steering algorithm will decide how the car navigates.
