The document summarizes a prosthetic hand project that aims to develop a new control input for prosthetics using EEG sensors to detect brain waves. A physical prototype hand was created for testing, which uncovered areas for improvement. An EEG sensor system was also developed to measure brain activity and eye blinks for initial prototype testing. Future work includes refining the hand prototype, finalizing the EEG sensor design and electrode placements, improving the hand control code, and potentially presenting the design to end users.
2. Background Information/Goals
Prosthetics are becoming increasingly common due to an aging population and many war veterans
Currently, prosthetic hands and arms are controlled either by a shoulder-mounted sling with linkages that translate shoulder movement into hand movements, or by locating the individual nerve endings that connect to the phantom appendage
Finding and rerouting nerve endings to different muscles is typical, because EMG (muscle activity) sensors can then be used as control inputs
Muscle activity amplifies the nerve signal, making it easier to measure against noise
This method, however, requires extensive surgery to reroute nerves to different muscles
The purpose of this project is to research and implement a new control input for prosthetics that utilizes EEG sensors (brain waves)
A physical prototype prosthetic hand model was created for testing purposes
3. Physical Hand Prototype
Hand created based on an open-source design (Tact Open Hand)
During construction, it was found that multiple modifications were needed for the hand to suit our needs
After construction and initial testing, it was found that numerous improvements should be implemented for a more realistic prototype:
Passive method of reopening the hand: rubber bands currently pull each finger back open, creating the problem of a constantly applied opening force
It was found that DC motors do not hold their position well against a force, so this improvement is necessary for the longevity of the design
More secure method for attaching fingers to hand case
Grippy coating on fingertips for more secure object pickup
Additional sensors
Limit switches for each finger: the DC motors' encoders work acceptably for providing position information, but can lose track of position quite easily
Force feedback: force sensors on the fingertips would let the control system know how hard the hand is gripping an object
5. Hand Control
Controlled with a chipKIT MX4 microcontroller (though most microcontrollers could be used)
Programmed in C using MPLAB X
Takes feedback from each DC motor's quadrature encoder to determine how far the motor has turned, and thus how far the finger is flexed
Allows for each finger to be individually set to a specific flex
Designed to take an analog input and relate its voltage to a flex amount for that specific finger
Controls motors using H-bridges (HB5)
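The control scheme described above (analog voltage in, encoder counts as feedback, H-bridge drive direction out) can be sketched in plain C++. This is an illustrative sketch only: the count range, reference voltage, function names, and deadband are assumptions, not the project's actual MPLAB X code.

```cpp
#include <algorithm>

// Hypothetical constants -- real values depend on the gearing and
// encoder resolution of the prototype's DC motors.
constexpr int kCountsFullFlex = 1200;  // encoder counts from open to fully flexed
constexpr double kAdcMaxVolts = 3.3;   // assumed ADC reference voltage

// Map an analog input voltage to a target encoder count for one finger.
int voltageToTarget(double volts) {
    double frac = std::clamp(volts / kAdcMaxVolts, 0.0, 1.0);
    return static_cast<int>(frac * kCountsFullFlex);
}

// Decide the H-bridge drive direction from the position error:
// +1 = flex, -1 = extend, 0 = hold (within the deadband).
int driveDirection(int target, int current, int deadband = 10) {
    int error = target - current;
    if (error > deadband) return 1;
    if (error < -deadband) return -1;
    return 0;
}
```

For example, a mid-scale input of 1.65 V maps to a target of 600 counts, and a fully open finger (0 counts) would then be driven in the flex direction until it reaches the deadband around the target.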
7. EEG Sensor
Uses an OpenBCI 32 bit EEG amplifier board
Supports eight electrode channels and two reference channels
Allows a very low-noise signal to be sent to the computer (the instrumentation amplifier has a very good CMRR!)
Electrode placement and configuration is still in progress
Testing for functionality of the sensors has been done by measuring eye blinks
[Figures: electrode placement for initial tests; blink recording and GUI interface; EEG amplifier, electrodes, and USB dongle]
8. EEG System Output
In this trial, eye blinks were being measured. As can be seen, three blinks were made, each represented by a spike in the signal of every electrode
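Detecting blinks of this kind can be as simple as an amplitude threshold with edge detection, so that each excursion above the threshold counts once. The sketch below is a generic C++ illustration with an arbitrary threshold, not the OpenBCI processing pipeline; a real system would calibrate the threshold per electrode and per user.

```cpp
#include <cmath>
#include <vector>

// Count blink-like spikes in one EEG channel: a spike is counted once
// when the absolute sample value first rises above the threshold.
int countSpikes(const std::vector<double>& samples, double threshold) {
    int spikes = 0;
    bool inSpike = false;
    for (double s : samples) {
        if (std::fabs(s) > threshold) {
            if (!inSpike) { ++spikes; inSpike = true; }  // rising edge of a spike
        } else {
            inSpike = false;  // spike has ended; arm for the next one
        }
    }
    return spikes;
}
```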
9. What Was Learned
Troubleshooting!!
The hand prototype had multiple design issues that needed to be resolved
3D printing process
Soldering
Individuals new to soldering were taught the basics, as soldering was needed for motor connections
Brain physiology
Decisions needed regarding where to place the electrodes for best performance
Controlling DC motors with encoders
C coding for microcontroller in MPLAB X
In general, this year was a huge learning experience for everyone involved
10. Future Work
Determine all needed improvements to existing prototype hand
Design and construct a refined prototype
Finish calibration of the EEG sensor and finalize the headset design and electrode placements
Improve hand control code for more reliable operations
Possibly present finished design to an end user!
11. Mars Rover
What we accomplished this year:
Wrote a proposal for the Robo-ops competition
Got a working drive system
Created and implemented the electrical system to power the rover
Created the drive control program
Got the rover driving
12. Mars Rover
What's in store for next year:
Write another competition proposal
Fine-tune mechanical drive components
Create and implement secondary systems such as vision and the arm
Add vision and control for the secondary systems to the main program
Tune driving and turning algorithm
14. Team members:
Marcus Blaisdell CS
Vitaly Kubay ME
William Conner Cole EE
Kily Nhan
Photo taken with a Kinect camera we modified to use a regular USB connection
15. Objectives:
Build a fully autonomous chess-playing robot
Increase members' knowledge of robotics
Vision processing
Multi-joint robotic arm movement
16. Intention
Use a pre-made kit arm as the base
Last year, the design of the arm was constantly changing
This made it difficult to progress on any specific attribute, as the requirements differed from week to week
Using a pre-built arm allowed us to begin working on the code sooner
We wanted to use the C++ version of OpenCV this year instead of the Python version we were using last year
We believed working in C++ rather than Python would give us a greater ability to achieve our goal
17. Progress:
During the first semester, Marcus was unable to spend much time working on this due to the demands of another project
Vitaly took the lead on the programming portion and wrote the majority of the code for object detection, movement detection, and shape detection
Conner, Kily, and Marcus modified a Kinect camera to use regular USB, with the intention of using it as the arm's primary camera
In the second semester, Marcus began working on the movement algorithm, attempting to implement the Denavit-Hartenberg method
As Marcus had not yet taken linear algebra, the transformation matrices proved too difficult, and he resorted to using basic trigonometry
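The basic-trigonometry approach can be illustrated with a two-link planar arm solved by the law of cosines instead of full Denavit-Hartenberg transforms. This is a generic inverse-kinematics sketch under assumed conditions (a reachable target, placeholder link lengths), not the project's actual algorithm.

```cpp
#include <cmath>
#include <utility>

// Solve shoulder and elbow angles (radians) for a two-link planar arm
// reaching point (x, y), with link lengths l1 and l2.
std::pair<double, double> twoLinkIK(double x, double y, double l1, double l2) {
    double d2 = x * x + y * y;  // squared distance to the target
    // Elbow angle from the law of cosines; assumes the target is reachable,
    // i.e. the argument to acos lies in [-1, 1].
    double cosElbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2);
    double elbow = std::acos(cosElbow);
    // Shoulder angle: direction to the target, minus the offset
    // introduced by the elbow bend.
    double shoulder = std::atan2(y, x) - std::atan2(l2 * std::sin(elbow),
                                                    l1 + l2 * std::cos(elbow));
    return {shoulder, elbow};
}
```

A quick sanity check is to run the result back through forward kinematics: with unit links and target (1, 1), the solver returns a straight shoulder and a 90-degree elbow, which lands the end effector exactly on the target.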
18. Current state:
There is a movement algorithm that moves one joint at a time
The movement is smooth and effective
The vision system exists as multiple components in various states of functionality
We can detect the pieces
We cannot yet determine their spatial position
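One simple first step toward spatial position, assuming the camera sees the board roughly top-down and axis-aligned, is to divide the board's pixel bounding box into an 8x8 grid and bin each detected piece centroid into a square. The corner coordinates here are made-up placeholders; a real camera view would need perspective correction first.

```cpp
#include <utility>

// Map a detected piece centroid in pixel coordinates to a chess square
// (file 0-7, rank 0-7), given the pixel bounds of the board.
std::pair<int, int> pixelToSquare(double px, double py,
                                  double left, double top,
                                  double right, double bottom) {
    double squareW = (right - left) / 8.0;  // square width in pixels
    double squareH = (bottom - top) / 8.0;  // square height in pixels
    int file = static_cast<int>((px - left) / squareW);
    int rank = static_cast<int>((py - top) / squareH);
    return {file, rank};
}
```

For a hypothetical board spanning pixels (100, 100) to (500, 500), a centroid at (260, 110) falls in file 3, rank 0.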