Prosthetic Limbs and Machine Learning

Preface

I worked on this project collaboratively with my close friend and colleague, Grant Pausanos, with whom I share equal credit for the work.

Grant and I presenting our project

1. Summary and Introduction

A prevalent issue in modern society is the development of medical technology that helps amputees lead more convenient day-to-day lives. Especially at a time when war and fighting are rampant in many parts of the world, the need for an affordable bionic alternative is pressing. A 3D-printed bionic arm, modeled from scratch, was tested on a variety of tasks a typical hand would handle, such as holding everyday items, pouring safe substances, and performing other common hand gestures. An electromyogram (EMG) acts as a biological input device that detects the electrical activity of muscle contractions, so that control by the user is as intuitive as possible. These contractions serve as input for controlling the servo motors that move the arm in various ways. The mechanical component consists of a programmed Arduino with several servo motors and strings that allow flexion of the proximal, intermediate, and distal phalanges. On the electrical side, an EMG sensor is connected to a circuit that controls the motion of the hand.

Figure 1. Example signal data of bicep muscle flexion detected by EMG (rectified data model)

2. Procedure of Mechanical Aspect

Initially, the hand was sketched with reference to one's right hand and forearm. These measurements must be precise for the artificial hand to achieve realistic dimensions. This major draft includes the proximal, intermediate, and distal phalanges of the fingers and thumb. Viewing the hand from the top, the width of the phalanges does not need to be recorded, because each phalanx is fitted with a screw of a specific length. Take extra care with the palm, as its sketch eventually forms an irregular polygon with specific angles accounted for. For the forearm, record only the length; its width may vary because the five servo motors take up a good portion of its volume.

Figure 2. Sketch of the palm of hand to note the measurements, wire flow, and general design of hand taken from personal notebook.
Figure 3. Final version of the palm of the hand to hold two servos and wire pulley mechanics

3. Electrical Aspect and Machine Learning

In terms of Arduino code, all that was needed was to import the appropriate servo library so the Arduino could read the raw data received and assign servo values based on it. The integrated circuit also rectifies the data at this stage. The rectified signal is then tested against a threshold value; if the threshold is crossed, the arm contracts. Generally speaking, this is very simplistic code and does not come close to tackling the more complex tasks a regular arm should be capable of. The schematic for the five-plus servos was not difficult, simply an arrangement of servos on a breadboard. The plan was to use a servo shield, but we encountered some issues, and given the time constraint, the shield was dropped.
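The threshold logic described above can be sketched as follows. This is shown in Python for clarity rather than the actual Arduino C firmware, and the threshold value, angles, and function name are illustrative placeholders, not the real calibration:

```python
# Illustrative sketch of the threshold logic running on the Arduino.
# THRESHOLD and the servo angles are made-up values for demonstration.
THRESHOLD = 350            # rectified EMG level that triggers a contraction
OPEN_ANGLE, CLOSED_ANGLE = 0, 180

def servo_angle(emg_reading):
    """Map a rectified EMG reading to a servo angle."""
    if emg_reading > THRESHOLD:
        return CLOSED_ANGLE  # muscle contracted: curl the fingers
    return OPEN_ANGLE        # below threshold: relax the hand

print(servo_angle(500))  # 180 (contraction detected)
print(servo_angle(100))  # 0 (hand stays open)
```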

Figure 4. Schematic of the electromyogram (EMG) used onboard the hand
Figure 5. Machine learning pipeline from movement to classification for EMG signal data

Movement

The patient who is controlling the arm flexes their muscles or does some other biologically controlled mechanical movement.

Movement Detection

The MyoWare EMG sensor picks up the change in action potential and records it. Some data rectification is done here, combined with the next step; however, more computationally intense processing cannot be done here (e.g., a Fourier transform). The sensor takes the difference of two inputs (gain = 110), which is then amplified by a differential amplifier (gain = -15). This amplified signal is passed through a filter (gain = -1, fc = 106.1 Hz), and any negative values are made positive. Lastly, the data is smoothed (gain = -1, fc = 1.975 Hz) and inverted (gain = -20, amplified again) before being sent to the Arduino.
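The rectify-and-smooth stages can be mimicked in software. The sketch below is an illustrative digital version of the analog chain, not the onboard circuit itself: the 1.975 Hz smoothing cutoff mirrors the hardware stage described above, while the sample rate and the toy input signal are assumptions for demonstration:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # Hz, assumed sampling rate (not from the actual hardware)

def rectify_and_smooth(raw, fs=FS, fc=1.975):
    """Full-wave rectify a signal, then low-pass it to get an envelope."""
    rectified = np.abs(raw)                       # take negative values positive
    b, a = butter(2, fc / (fs / 2), btype='low')  # 2nd-order low-pass at fc
    return filtfilt(b, a, rectified)              # zero-phase smoothing

# Toy stand-in for an EMG burst: a 50 Hz sine over one second
t = np.linspace(0, 1, int(FS), endpoint=False)
raw = np.sin(2 * np.pi * 50 * t)
envelope = rectify_and_smooth(raw)
print(envelope.shape)  # (1000,)
```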

Feature and Labels Extraction

While most of this was technically done on the circuit board itself, any remaining data manipulation can be done here (other than one-hot encoding, which comes later). Fourier transforms were not used, since the signal as a function of frequency was not a feature we needed (it was also quite hard, considering I was taking Calc I at the time).
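For context, two time-domain features commonly used for EMG in the literature are mean absolute value and root-mean-square. These were not part of the original pipeline, which fed mostly raw data to the classifier; they are shown here only as standard examples of what this stage could compute:

```python
import numpy as np

# Hypothetical time-domain EMG features (not used in the original project).
def mean_absolute_value(window):
    """Average magnitude of a window of samples."""
    return np.mean(np.abs(window))

def root_mean_square(window):
    """RMS amplitude of a window of samples."""
    return np.sqrt(np.mean(np.square(window)))

window = np.array([0.5, -0.5, 0.5, -0.5])
print(mean_absolute_value(window))  # 0.5
print(root_mean_square(window))     # 0.5
```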

Dimensionality Reduction

There were not many features to reduce, other than some feedback the servos were giving us because we were not using pulse-width modulation (PWM). As a result, the raw data was essentially fed straight to the classifier. I will say, however, that at this stage I did some one-hot encoding of the labels to make classification easier.

    import numpy as np

    def extract_data(filename):
        labels = []
        fvecs = []
        for line in open(filename):
            row = line.strip().split(',')
            labels.append(int(row[0]))        # first column: class label
            fvecs.append([float(x) for x in row[1:]])  # remaining columns: signal values
        fvecs_np = np.array(fvecs).astype(np.float32)
        labels_np = np.array(labels).astype(np.uint8)
        # One-hot encode the five finger classes
        labels_onehot = (np.arange(5) == labels_np[:, None]).astype(np.float32)
        return fvecs_np, labels_onehot

    if __name__ == '__main__':
        run = extract_data('emg_data.csv')  # path to the recorded EMG CSV
        print(run)

Classification

Since I was unable to deploy any models on the actual arm, this stage was developed but never reached in practice. I decided to use an Extreme Learning Machine, hoping that different finger contractions would produce slightly different signal data. I read several papers and watched many videos on accomplishing what I had set out to do; however, the papers I read used more than a two-channel EMG (i.e., more than two electrodes on the arm).

    import numpy as np
    import mlpy
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    class ELMRegressor():
        def __init__(self, n_hidden_units):
            self.n_hidden_units = n_hidden_units

        def fit(self, X, labels):
            # Append a bias column, project through fixed random hidden
            # weights, and solve the output weights with a pseudoinverse
            X = np.column_stack([X, np.ones([X.shape[0], 1])])
            self.random_weights = np.random.randn(X.shape[1], self.n_hidden_units)
            G = np.tanh(X.dot(self.random_weights))
            self.w_elm = np.linalg.pinv(G).dot(labels)

        def predict(self, X):
            X = np.column_stack([X, np.ones([X.shape[0], 1])])
            G = np.tanh(X.dot(self.random_weights))
            return G.dot(self.w_elm)

    ## Linear Discriminant Analysis
    def run_lda(X_train, y_train, X):
        lda = LinearDiscriminantAnalysis(n_components=2)
        lda.fit(X_train, y_train)
        return lda.transform(X)

    ## Spectral Regression Discriminant Analysis
    def run_srda(X_train, y_train):
        srda = mlpy.Srda()
        srda.compute(X_train, y_train)
        srda.predict(X_train)    # predict SRDA model on training data
        srda.realpred            # real-valued prediction
        return srda.weights(X_train, y_train)  # compute weights on training data
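To illustrate how the ELM above would be trained, here is a small usage sketch on synthetic two-class data (the data is purely made up for demonstration; real EMG feature windows would replace it, and the class is restated so the sketch runs standalone):

```python
import numpy as np

class ELMRegressor:
    # Same ELM as above, restated so this sketch is self-contained.
    def __init__(self, n_hidden_units):
        self.n_hidden_units = n_hidden_units
    def fit(self, X, labels):
        X = np.column_stack([X, np.ones([X.shape[0], 1])])
        self.random_weights = np.random.randn(X.shape[1], self.n_hidden_units)
        G = np.tanh(X.dot(self.random_weights))
        self.w_elm = np.linalg.pinv(G).dot(labels)
    def predict(self, X):
        X = np.column_stack([X, np.ones([X.shape[0], 1])])
        G = np.tanh(X.dot(self.random_weights))
        return G.dot(self.w_elm)

np.random.seed(0)
# Synthetic stand-in for EMG feature windows: two well-separated classes
X = np.vstack([np.random.randn(50, 4) + 2.0, np.random.randn(50, 4) - 2.0])
y = np.array([0] * 50 + [1] * 50)
labels_onehot = (np.arange(2) == y[:, None]).astype(np.float32)

elm = ELMRegressor(n_hidden_units=20)
elm.fit(X, labels_onehot)
pred = elm.predict(X).argmax(axis=1)  # pick the higher-scoring class
print((pred == y).mean())             # training accuracy on the toy data
```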

4. Results

Three tests were done to assess the efficacy of the arm as a practical piece of technology, using an EOS lip balm, a plastic cup, and a roll of electrical tape. The EOS lip balm, being spherical, tested the grip of the distal phalanges; the hand was able to hold the object safely in place. Ascertaining the hand's ability to grip spherical objects gives a better understanding of how it will perform with other objects. The next object tested was the plastic cup. Here the hand initially had trouble grabbing the cup, due to the inherent flexibility of the plastic, but once it achieved a secure grip, it was able to hold the cup. This introduces a more practical aspect of using the hand in an everyday setting, specifically drinking water from a cup. Lastly, the hand attempted to grab and hold the roll of electrical tape. This test indicated how well the hand could lift heavier objects, since the electrical tape was heavier than the other items. The hand failed to hold the tape for a prolonged period. I was unable to incorporate the machine learning pipeline into the arm in time for the presentation, so it was dropped.

Figure 6. Final working product with proper muscle receiving and servo coordination

5. Conclusion

The initial goal of this project was to introduce a novel design for a biomechanical hand so that amputees and others in a healthcare setting can apply such technology to enhance quality of life and productivity. Another goal was to develop a low-cost 3D-printed myoelectric prosthetic arm for developing countries where such technology is needed. The final design performed moderately to well and exhibits characteristics of a better design in that many of its features are easily extendable. Over the course of testing, the system proved to require minimal maintenance and, for the most part, stands as an excellent amateur design when compared against other non-academic designs.

Figure 7. Our proposed design against the designs of other hands in the market and currently being built (graph had been cited from Mahdi Elsayed Hussein, 3D Printed Myoelectric Prosthetic Arm)

Final Remark

The project was such a success that my high school invited Grant and me to speak at the school's award ceremony to showcase our project and encourage future engineers. It was truly an amazing journey, one in which I learned a lot and gained plenty of relevant skills that have transferred to other projects.

I am an undergraduate student studying biomedical engineering at the University of British Columbia. Check out umarali.ca and my GitHub for more!