Process:
First, we developed multiple models on Custom Vision to detect the palm area of the hand, which would later allow us to get its coordinates and map them to control the mouse. To do so, we collected our own data during the hackathon using our own Python script, which can collect thousands of images in a matter of minutes. Then, by hand-selecting images and marking where the hands were, we were able to train a few working iterations with decent accuracy for controlling the mouse.
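The capture script itself is not reproduced here; a minimal sketch of that kind of rapid webcam capture, assuming OpenCV (`opencv-python`) and a default webcam (both assumptions, not part of the original write-up), might look like:

```python
import os
import time


def frame_filename(index, prefix="hand"):
    """Zero-padded name so captured frames sort in capture order."""
    return f"{prefix}_{index:05d}.jpg"


def collect_images(n_images=2000, out_dir="dataset", delay_s=0.05):
    """Save webcam frames as JPEGs as fast as the delay allows.

    At 0.05 s per frame, ~2000 images take under two minutes.
    """
    try:
        import cv2  # pip install opencv-python
    except ImportError:
        print("OpenCV not installed; skipping capture.")
        return 0

    cap = cv2.VideoCapture(0)  # default webcam
    if not cap.isOpened():
        print("No webcam found; skipping capture.")
        return 0

    os.makedirs(out_dir, exist_ok=True)
    saved = 0
    while saved < n_images:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imwrite(os.path.join(out_dir, frame_filename(saved)), frame)
        saved += 1
        time.sleep(delay_s)
    cap.release()
    return saved


if __name__ == "__main__":
    print(f"Saved {collect_images()} images")
```

The zero-padded filenames keep the frames in order when they are later hand-selected and labeled.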
Second, we created a Python script that connects to Azure Cognitive Services and selects the model iteration we want to use to predict the hand's location. From there, simply using the PyAutoGUI library, we were able to move the mouse based on the hand's position.
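The step from a prediction to a cursor move can be sketched as below. The Custom Vision call is shown only in comments with placeholder credentials (`ENDPOINT`, `KEY`, `PROJECT_ID`, `ITERATION_NAME` are assumptions, not values from the project); the coordinate math is what actually drives PyAutoGUI:

```python
def bbox_to_screen(left, top, width, height, screen_w, screen_h):
    """Map a normalized Custom Vision bounding box (all values in 0..1)
    to the pixel position of its center on the screen."""
    cx = left + width / 2.0
    cy = top + height / 2.0
    return int(cx * screen_w), int(cy * screen_h)


def move_mouse_to_palm(prediction, screen_w, screen_h):
    """Move the cursor to the detected palm's center via PyAutoGUI."""
    import pyautogui  # pip install pyautogui
    box = prediction.bounding_box
    x, y = bbox_to_screen(box.left, box.top, box.width, box.height,
                          screen_w, screen_h)
    pyautogui.moveTo(x, y)


# Sketch of the prediction call (placeholder credentials and names):
# from msrest.authentication import ApiKeyCredentials
# from azure.cognitiveservices.vision.customvision.prediction import (
#     CustomVisionPredictionClient)
# client = CustomVisionPredictionClient(
#     ENDPOINT, ApiKeyCredentials(in_headers={"Prediction-key": KEY}))
# results = client.detect_image(PROJECT_ID, ITERATION_NAME, image_bytes)
# best = max(results.predictions, key=lambda p: p.probability)
# move_mouse_to_palm(best, *pyautogui.size())
```

Using the box's center rather than a corner keeps the cursor steady even when the predicted box grows or shrinks between frames.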
Lastly, using an Arduino Uno and the PySerial library, we tried adding buttons for left and right clicks, which would typically be operated with the free (left) hand.
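On the PC side, that button bridge can be sketched as follows, assuming the Arduino sends a single letter per press over serial ("L" or "R" — the actual wire protocol and port name here are assumptions, not taken from the project):

```python
def parse_click(line):
    """Map one serial line from the Arduino to a click type."""
    token = line.strip().upper()
    if token == b"L":
        return "left"
    if token == b"R":
        return "right"
    return None


def click_loop(port="/dev/ttyACM0", baud=9600):
    """Poll the Arduino over serial and issue clicks with PyAutoGUI."""
    import serial     # pip install pyserial
    import pyautogui  # pip install pyautogui
    with serial.Serial(port, baud, timeout=1) as conn:
        while True:
            button = parse_click(conn.readline())
            if button:
                pyautogui.click(button=button)
```

Keeping the parsing in its own small function makes it easy to change the protocol (or add more buttons) without touching the serial loop.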
The Future:
In the future, Magic Mouse could certainly include other features that make the mouse easier to control, such as the ability to detect specific hand gestures. It should also have a way of keeping mouse control precise; tracking one fixed point on the hand might help with that. Lastly, optimizing the program to run faster is another idea worth pursuing to improve its overall quality.
Creators:
Siddharth Nema, Mostafa Hussein, Vinesh Vivekanand, Armaan Rasheed