Paul Philipsberg, a biomedical engineering major on the pre-med track graduating this December (2015), is a student working on one of the unique projects found at the Stony Brook University Innovation Lab. Over the last few months, starting in September, he has assembled a brain-controlled drone using a Mindwave headset, an Arduino board, a wireless module, an Xbox 360 controller, and a quadcopter drone – all connected to a laptop.
This drone is the Syma Quadcopter, shown turned off in this photo.
A Mindwave headset is a portable EEG (electroencephalogram) sensor that monitors brainwave signals. The headset connects wirelessly to the laptop, which in turn is wired directly to an Arduino board and the Xbox 360 controller. When the sensor detects brainwave activity, it sends the readings to the laptop, where they are measured as numerical values. Upon receiving these values, the computer checks the level of relaxation, sets a corresponding throttle value, and sends it to the Arduino board, whose wireless module relays it to the quadcopter. To control the quadcopter's movements, Philipsberg practices regulating the emotional states that shape his brainwaves. When he is calm, the quadcopter rises off the floor, and it can hover for as long as he maintains that composure. To make this possible, the calm and excited states had to be defined: only values past a certain threshold count as "calm," which reduces drift.
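The relaxation-to-throttle mapping described above might be sketched as follows. This is a hypothetical reconstruction, not the project's actual code (which used LabVIEW); the threshold, value range, and function names are illustrative assumptions.

```python
# Hypothetical sketch of the relaxation-to-throttle mapping described above.
# The Mindwave reports a relaxation ("meditation") value from 0 to 100; the
# threshold and output range here are assumptions, not the project's code.

CALM_THRESHOLD = 60   # readings below this count as "not calm enough"
MAX_THROTTLE = 255    # one byte, as might be sent to the Arduino over serial

def relaxation_to_throttle(meditation: int) -> int:
    """Map a 0-100 relaxation reading to a 0-255 throttle value.

    Readings at or below the threshold produce zero throttle, so small
    fluctuations around a resting state do not make the drone drift.
    """
    if meditation <= CALM_THRESHOLD:
        return 0
    # Scale only the portion of the reading above the threshold.
    span = 100 - CALM_THRESHOLD
    return min(MAX_THROTTLE, (meditation - CALM_THRESHOLD) * MAX_THROTTLE // span)

print(relaxation_to_throttle(40))   # 0   (excited/distracted: no lift)
print(relaxation_to_throttle(80))   # 127 (calm: partial throttle)
print(relaxation_to_throttle(100))  # 255 (fully calm: full throttle)
```

In a real setup, the resulting byte would be written to the Arduino over a serial connection, and the Arduino's wireless module would forward it to the quadcopter.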
This is the Mindwave headset used to measure the brainwave signals.
It took Philipsberg several versions to reach this point in his project. Before working with the quadcopter, he used a remote-control toy car to test how the system worked. Currently, Philipsberg is working on improvements. He is considering making changes using open VCI (a file extension) to refine its performance. Ideally, he would like the quadcopter to have more sensors, because it would be capable of balancing and controlling its movement better if it were "aware" of itself in XYZ space.
These are some of the first components that Philipsberg worked with: a toy car, its remote, and an Arduino board.
Philipsberg sees potential in this project to build toward the technology needed to make 3D gaming or wireless IP security cameras a reality. It would need to be a joint effort of brain control and muscle control. Alongside the EEG sensor used here, there is also a sensor capable of detecting muscle activity and movement – the EMG (electromyogram). For 3D gaming, it would also be necessary to include components such as a gyroscope or an accelerometer; together these could provide some of the environmental feedback needed to make it work. For the security cameras, simple but distinct arm motions could be used to control the way the cameras move. For example, pointing your arm in one direction could direct the camera to point in that direction as well. There are many directions in which Philipsberg could choose to branch out.
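The camera idea above amounts to a simple lookup from a recognized gesture to a camera command. A minimal sketch, assuming a gesture classifier and camera API that do not exist in the project (the gesture labels and command names are entirely hypothetical):

```python
# Hypothetical sketch: map classified EMG arm gestures to camera commands.
# A real system would need an EMG classifier upstream and a camera control
# API downstream; both are assumed here.

GESTURE_TO_COMMAND = {
    "point_left": "pan_left",
    "point_right": "pan_right",
    "point_up": "tilt_up",
    "point_down": "tilt_down",
}

def camera_command(gesture: str) -> str:
    """Return the camera command for a recognized gesture, or 'hold'."""
    return GESTURE_TO_COMMAND.get(gesture, "hold")

print(camera_command("point_left"))  # pan_left
print(camera_command("wave"))        # hold (unrecognized gestures do nothing)
```

The point of the lookup-table design is that unrecognized or noisy gestures fall through to a safe "hold" default rather than moving the camera unpredictably.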
Though Philipsberg had never worked with wireless modules or an EEG sensor before, he does have experience with Arduino, quadcopters, and LabVIEW (the software used). This is currently his independent study project under Professor Baldwin and the Biology department. Philipsberg took up this project as a personal challenge while working on several other projects in the Innovation Lab, but found this one the most interesting and the most relevant to the career he wants to pursue in neuropsychology. After graduation, Philipsberg will prepare to apply to graduate schools and medical schools; he plans to take the MCAT and to continue working in this field.
Philipsberg giving a demonstration of his project. The quadcopter is beginning to hover a few inches above the floor.
For more, you can view a video of this project on our video page: https://you.stonybrook.edu/researchtech/videos/