Senior Design Team 315:

Control Module/Interface for Service Robots

MEET OUR TEAM

Brendan Laney

Computer Engineering
Team Leader | Control Logic Engineer
Brendan is responsible for planning project timelines, communicating with stakeholders, organizing and leading team meetings, implementing the pathfinding algorithm, and assisting with the vision system.

Diego Guedez

Computer Engineering
Imaging Engineer
Diego is responsible for designing and testing the vision system to detect and track users and objects. He must also interface the vision system with the control module.

Jerry Jean-Pierre

Computer Engineering
Communications Engineer | Web Designer
Jerry is responsible for developing an interface between the control module and the wheelchair that emulates joystick movement. His other responsibilities include understanding wheelchair operation and building a testing environment.

Jossue Arzeta

Computer Engineering
Control Logic Engineer
Jossue is responsible for implementing the pathfinding algorithm on the control module and aiding in the design of the interface that sends directions to the communication system.

Kyle Crawford

Computer Engineering
Applications Engineer
Kyle is responsible for designing and testing a mobile application to connect to the control module and switch between manual and semi-autonomous mode. He also aids in testing the communication system.

THANK YOU TO OUR SPONSOR AND ADVISOR

Department of Electrical and Computer Engineering

Oscar Chuy

Project Sponsor and Advisor

Linda DeBrunner

Project Reviewer

ABSTRACT

Automation, machine learning, and robotics are becoming more prevalent over time and are replacing or assisting manual labor in many fields. These techniques can be applied to moving carts of items in fulfillment centers, grocery stores, and hospitals: rather than being pushed manually, a cart can simply follow the user.

The goal of this project is to create a device that allows a motorized cart to semi-autonomously follow a person; a motorized wheelchair serves as our motorized cart. A control module provides a seamless transfer between semi-autonomous and manual modes, and the user can employ hand gestures and a mobile app to make the robot follow or stop. The control module is designed as a low-cost solution that can be installed on motorized systems lacking semi-autonomy.

The control module uses a camera to process images for object detection, differentiating the user from other people and objects in the environment. The camera also estimates the distance to the user and to nearby obstacles; these estimates serve as feedback to a pathfinding algorithm that determines the best way to navigate the environment and follow the user, and they drive the motor commands. An emergency stop feature prevents the module from harming people or striking nearby objects.
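The distance-to-speed mapping with an emergency stop described above can be sketched as a simple proportional controller. This is an illustrative assumption, not the team's actual code; all names and thresholds (`FOLLOW_DISTANCE_M`, `STOP_DISTANCE_M`, the gain of 0.8) are hypothetical.

```python
# Hypothetical sketch of distance-based speed control with an emergency stop.
# Thresholds and gain are illustrative assumptions.

FOLLOW_DISTANCE_M = 1.5   # desired gap between the cart and the user
STOP_DISTANCE_M = 0.5     # emergency-stop threshold for any obstacle
MAX_SPEED = 1.0           # normalized forward speed command

def speed_command(user_distance_m: float, nearest_obstacle_m: float) -> float:
    """Map the camera's distance estimates to a forward speed command."""
    if nearest_obstacle_m < STOP_DISTANCE_M:
        return 0.0  # emergency stop: something is too close
    # Proportional control: speed grows with the gap beyond the follow distance.
    error = user_distance_m - FOLLOW_DISTANCE_M
    return max(0.0, min(MAX_SPEED, 0.8 * error))
```

In this sketch the cart holds still once it is within the follow distance, speeds up as the user pulls ahead, and stops immediately whenever any obstacle breaches the stop threshold, regardless of where the user is.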

In summary, this project develops a control module that converts a motorized mobile system into one that semi-autonomously follows a person. By automating a manual process, the project frees the user to focus on other tasks rather than pushing a cart.

HIGH RESOLUTION GRAPHIC

CONTENT TO DATE

VDR1, VDR2, VDR3, VDR4, VDR5, Signals Diagram, CAN Communication, Concept Generation, Concept Selection, Customer Needs, Targets, Codes and Standards, Engineering Design Day Poster

FUTURE WORK

The remaining work for this project includes developing the code to recognize and follow a user using artificial intelligence and the artificial potential fields pathfinding algorithm, testing and verifying movement commands from the control module, and testing the Bluetooth connection that controls the wheelchair. The control module, camera, mobile application, and wheelchair must all be connected and powered to perform a complete test of the entire design.
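The artificial potential fields approach named above can be sketched in two dimensions: an attractive pull toward the user plus a repulsive push from obstacles within an influence radius. This is a minimal sketch under assumed gains and ranges, not the team's implementation; the function name and all parameters are hypothetical.

```python
# Minimal 2-D artificial potential fields sketch. Gains (k_att, k_rep) and the
# obstacle influence radius are illustrative assumptions.
import math

def apf_step(robot, user, obstacles, k_att=1.0, k_rep=0.5, influence=1.0):
    """Return a unit direction vector (dx, dy) for the robot's next move."""
    # Attractive force: proportional to the vector from the robot to the user.
    fx = k_att * (user[0] - robot[0])
    fy = k_att * (user[1] - robot[1])
    # Repulsive forces: only obstacles inside the influence radius contribute,
    # with magnitude growing sharply as the robot closes in on them.
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < influence:
            mag = k_rep * (1.0 / d - 1.0 / influence) / (d * d)
            fx += mag * dx
            fy += mag * dy
    norm = math.hypot(fx, fy)
    return (fx / norm, fy / norm) if norm > 0 else (0.0, 0.0)
```

Summing the attractive and repulsive forces and normalizing yields a heading the control module could translate into the joystick-emulation commands sent to the wheelchair; a known limitation of this method is local minima, where the forces cancel and the robot stalls.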

WORK TIMELINE