James L's Senior Project Blog
Project Title: Predicting Human Stupidity: Enhancing Safety With Machine Vision Based Motion Prediction
BASIS Advisor: Debby Hermann
Internship Location: University of Texas at San Antonio, Department of Electrical and Computer Engineering, Autonomous Control Engineering Lab
Onsite Mentor: Yufang Jin, Ph.D., Professor, Department of Electrical and Computer Engineering
Project Abstract
In a world increasingly dependent on technology, ensuring human safety has become one of the most pressing challenges, particularly as our environments grow more automated. As industries adopt robotics and automation to enhance efficiency, the interaction between humans and machines introduces risks that must be mitigated to prevent accidents. This project aims to address these concerns by using machine vision techniques to predict and account for human error, ultimately protecting human lives in high-risk environments. The UTSA Autonomous Control Engineering Lab has been working on a foundational project that utilizes camera vision to monitor the surroundings of manufacturing robots. My project will expand on that groundwork by identifying potentially dangerous human activities around the robot and implementing proper response mechanisms. To achieve this, the project will integrate Kalman filters, which isolate human shapes from background noise; OpenPose, which tracks key points on the human body; and machine learning to analyze human poses and movements. The algorithm will be trained to recognize specific behaviors and postures commonly associated with distracted or risky actions, such as looking down at a phone or eating while walking. By identifying these actions in real time, the system will alert robots to slow down or adjust their trajectories to prevent collisions with humans. Through this research, we aim to predict human movement 2-3 seconds in advance with over 90 percent accuracy and limit human injuries from autonomous machines, an approach that could be applied to autonomous vehicles and manufacturing.
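To illustrate the prediction step the abstract describes, here is a minimal sketch of a constant-velocity Kalman filter that smooths one tracked body keypoint (for example, an OpenPose wrist) and extrapolates its position a few seconds ahead. This is not the lab's actual code; the frame rate, noise covariances, and the simple constant-velocity motion model are all illustrative assumptions.

```python
import numpy as np

class KeypointKalman:
    """Track one 2D keypoint and extrapolate it forward in time."""

    def __init__(self, dt=1 / 30):  # assume a 30 fps camera
        # State vector: [x, y, vx, vy]
        self.x = np.zeros(4)
        self.P = np.eye(4)
        # Constant-velocity transition model
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        # We observe only position, not velocity
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 1e-3   # process noise (assumed value)
        self.R = np.eye(2) * 1e-1   # measurement noise (assumed value)

    def step(self, z):
        """Fold in one observed keypoint position z = (x, y)."""
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

    def predict_ahead(self, seconds):
        """Extrapolate the estimate, assuming velocity stays constant."""
        return self.x[:2] + self.x[2:] * seconds
```

In a full pipeline, one such filter per keypoint would feed the smoothed, extrapolated trajectory to the classifier that decides whether the robot should slow down or reroute.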
- Week 10 Senior Project Blog: Project Finished!- Hello everyone! Welcome to the final blog update for my senior project. At the start of the week, we focused on bringing everything together: camera vision, LEGO detection via the trained neural network, and robotic control via inverse kinematics. All the components had worked well independently, but the challenge was to make them operate smoothly... Read More 
- Week 9 Blog: LEGO Grabbing with Vision-Based Control- Welcome to Week 9! This week marks a major leap in functionality for our system! Building on last week’s integration of machine learning for object detection, we’ve now enabled the robots to autonomously grab LEGO pieces based entirely on what they see through the camera, without needing pre-programmed coordinates or operator input. To recap, last... Read More 
- Week 8 Blog Post: Implementing Neural Networks for Smarter LEGO Detection- Hi again! Welcome to Week 8 of my senior project. Building off last week’s success with autonomous picking and basic robot coordination, this week I focused on enhancing our object recognition system. Specifically, I began using machine learning, via a neural network, to more reliably identify LEGO pieces on the board. Up to this... Read More 
- Week 7 Blog: Autonomous LEGOs- Hey everyone, welcome to Week 7 of my senior project! On-site, I was able to adapt the inverse kinematics (IK) code that I got from U-Factory to the code that we wrote. Inverse kinematics allows a robot to calculate how each joint should move in order for the gripper (the tool at the end of... Read More 
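The idea behind inverse kinematics mentioned in Week 7 can be shown on a toy two-link planar arm: given a target (x, y) for the gripper, solve for the two joint angles with the law of cosines. This is a hypothetical illustration with made-up link lengths, not the U-Factory solver the post refers to.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Return (shoulder, elbow) angles in radians that place the
    gripper of a 2-link planar arm at target (x, y)."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)  # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def forward(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics, handy for checking an IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Real 6-axis arms like the one in the lab need a full 3D solver, but the principle is the same: work backward from the desired gripper pose to the joint angles that produce it.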
- Week 6 Blog: Limelight Camera Installation & Virtual Plane Testing- Hey everyone, welcome back to Week 6 of my senior project! This week we got a lot done! We finally installed the new Limelight camera on the robot and started testing it. Alongside that, we began testing the virtual planes drawn by the global camera, which are a key part of making our object detection... Read More 
- Week 5 Blog: Prepping for Hardware Upgrades and Better Camera Control- Hello everyone, and welcome back to Week 5 of my senior project! This week was all about fine-tuning the robot’s ability to pick up LEGO pieces reliably. We had already made some great progress last week: our object recognition system could draw contour lines around LEGO pieces using the local camera, which helped the robot... Read More 
- Week Four Blog: Object Recognition and 3D Printing- Hello everyone! Thanks for sticking around for another week of progress on our project. One of the big milestones this week was actually finishing the 3D printing for the local camera mount and attaching it to the robot. This camera mount is crucial because it allows the robot to have a stable, fixed position for... Read More 
- Week Three Blog: Object Recognition, Machine Learning, and Setting Up a Tripod- Hello everyone, and thanks for sticking around for the third week of my project! This week was a bit slower since my teammates were out of town for most of the week. However, I made significant progress in refining the object recognition component of our global positioning camera. One of my main tasks this week... Read More 
- Week Two Blog: LEGOs, Movement, and CAD- Hello! Welcome back to my senior project for week two! This week has been full of exciting developments and some major milestones! During our Monday meeting, we decided that we would test how accurately the robot could move by using LEGOs (video link: https://youtube.com/clip/Ugkx83eHD2jbNqH14WERBwi4AJdF1ezaFcru?si=QROurc2UOi-Mc8Z8). Our goal is that through camera vision the robot would... Read More 
- Week 1: When life gives you limes, buy a lime-light camera- Hello! My name is James and I’m a current senior at BASIS San Antonio Shavano Campus with a love for robotics and automation. I’m really excited to begin my senior project and would love for you to join me through my blogs! The next 10 weeks I’ll be working at the Autonomous Controls Engineering Lab... Read More 
