Week 6: Setting Up the Hardware & GitHub Pipeline

Anshul B -

Hello and welcome back to my blog! My name is Anshul. This week has been a big step forward in setting up ICON’s hardware and creating our software pipeline. We have gathered all the necessary hardware and begun integrating it into our system.

Finalizing the Hardware Setup

At my site placement, we successfully acquired and installed all the hardware onto our autonomous car. This includes mounting the battery pack and webcam onto the vehicle and connecting them to the Raspberry Pi. With these components now installed, we are closer to running real-world tests in our classroom environment. I have also attached an image of our final setup below!

Building the GitHub Pipeline

An important achievement for us this week was setting up the GitHub repository to manage our code efficiently. We created a structured pipeline that allows us to write and test code on our local computers before transferring it to the Raspberry Pi. This is how the pipeline works:

  1. Local Development: Team members write code on their personal computers and save it to a .py file.

  2. Version Control: The code is uploaded to GitHub, allowing for a collaborative development space for all teammates.

  3. Automated Deployment: An update script on the Raspberry Pi automatically pulls the latest changes from GitHub whenever new code is pushed.

  4. Execution on Raspberry Pi: The updated code is run directly on the Raspberry Pi, enabling real-time debugging of the code.
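To make the deployment step concrete, here is a minimal sketch of what the auto-update logic on the Raspberry Pi might look like. This is an illustrative example, not our exact script: it assumes the Pi has a local clone of the repository, that the branch is named `main`, and that `git` is available on the command line.

```python
# Hypothetical sketch of step 3 (Automated Deployment): the Pi checks
# whether GitHub has newer commits and fast-forwards if so.
import subprocess


def run_git(args, repo_dir="."):
    """Run a git command in the repo and return its stripped stdout."""
    result = subprocess.run(
        ["git", *args], cwd=repo_dir,
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


def needs_update(local_hash, remote_hash):
    """Pull only when the remote branch has moved past the local checkout."""
    return local_hash != remote_hash


def sync(repo_dir="."):
    """Fetch from GitHub and pull if the remote branch is ahead."""
    run_git(["fetch", "origin"], repo_dir)
    local = run_git(["rev-parse", "HEAD"], repo_dir)
    remote = run_git(["rev-parse", "origin/main"], repo_dir)
    if needs_update(local, remote):
        run_git(["pull", "--ff-only", "origin", "main"], repo_dir)
        return True  # new code was deployed
    return False  # already up to date
```

A script like this could be run on a timer (e.g. via `cron`) so the Pi always executes the latest pushed code without anyone copying files over by hand.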

Independent Research

On my end, I provided the team with my personal webcam and power bank. Additionally, I continued to learn more about the software side of ICON, particularly in creating and implementing the Depth-from-Defocus (DFD) method and accurate steering commands.

With our setup now fully functional, we can shift our focus to creating ICON’s software and preparing for real-world testing in the classroom. Stay tuned for more updates, and thank you for reading my blog!


Comments:

All viewpoints are welcome but profane, threatening, disrespectful, or harassing comments will not be tolerated and are subject to moderation up to, and including, full deletion.

    shreyash_p
    Hi Anshul, awesome update! I’m curious about how you plan to handle testing and debugging once the code is automatically deployed to the Raspberry Pi. If you find an issue while running it on the Pi, do you roll back changes in GitHub, or do you fix them locally and push a quick patch? I’m looking forward to seeing how you tackle real-world challenges with this pipeline!
    camille_bennett
    Hi Anshul, I love the picture of the car you are working on. Can you share a bit more about the Depth-from-Defocus method?
    Anshul Baddi
    Hello Shreyash, thanks for your insightful question. Since we are running a CI/CD pipeline, any changes made to the code on our local devices will automatically push to the Raspberry Pi, so we can fix issues locally and push a quick patch.
    Anshul Baddi
    Hey Ms. Bennett, thanks for the question. In brief, the DFD method uses the camera's autofocus, which controls light intake and affects accuracy at different depths. A fixed focal length then provides a definitive measure of whether an object is closer, farther, or at the same distance relative to the focal plane.
    Rahul Patel
    Hey Anshul, loving the progress on ICON—especially how you’ve set up the GitHub pipeline for smooth deployment! It’s impressive how everything syncs seamlessly with the Raspberry Pi. How do you plan to handle changes in lighting conditions during real-world testing? Can’t wait to see how ICON performs!
    Anshul Baddi
    Hey Rahul, thanks for the question. Since our first round of testing takes place in a well-lit college classroom, lighting variations won't be a problem at this stage.
