Researchers have developed biosensor technology that enables users to control machines and robots with their thoughts alone.
A new study of the technology, published in the peer-reviewed journal ACS Applied Nano Materials, demonstrates that the graphene sensors developed at the University of Technology Sydney (UTS) are robust, highly conductive, and easy to use.
To capture brainwaves from the visual cortex, hexagon-shaped sensors are placed on the back of the skull; the sensors can operate even in demanding workplace conditions. A head-mounted augmented reality lens displays white, flickering squares. When the operator concentrates on a particular square, the biosensor detects the corresponding brain waves, and a decoder converts the signal into commands. The technique was recently demonstrated by the Australian Army, where soldiers operated a four-legged Ghost Robotics robot using the brain-machine interface. The technology, which achieves up to 94% accuracy, allows the operator to control the robotic dog completely hands-free.
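The decoding step described above can be sketched in simplified form: in a steady-state visually evoked potential (SSVEP) system, each square flickers at a known frequency, and the decoder identifies which frequency dominates the recorded EEG spectrum. The frequencies, command mapping, and sampling rate below are illustrative assumptions, not values from the study, and the real decoder is far more sophisticated.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) mapped to robot commands.
COMMANDS = {7.0: "forward", 9.0: "left", 11.0: "right", 13.0: "stop"}
FS = 256  # assumed EEG sampling rate, Hz


def decode_ssvep(eeg, fs=FS, commands=COMMANDS):
    """Return the command whose flicker frequency dominates the EEG spectrum."""
    # Window the signal and take the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Measure spectral magnitude at each candidate stimulus frequency.
    powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in commands}
    return commands[max(powers, key=powers.get)]


# Simulate 2 s of EEG: an 11 Hz evoked response buried in noise.
t = np.arange(0, 2, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 11.0 * t) + 0.5 * rng.standard_normal(t.size)
print(decode_ssvep(eeg))  # → right
```

A production decoder would typically use methods such as canonical correlation analysis across multiple channels rather than a single-channel FFT peak, but the principle is the same: map the dominant evoked frequency back to the square the user is watching.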
Objective
For brain-machine interfaces (BMIs) to be used widely, there must be accurate and reliable dry sensors for electroencephalography (EEG). Yet dry sensors consistently perform worse than the industry-standard wet Ag/AgCl sensors, and the performance gap is even more pronounced when recording the signal from hairy, curved parts of the scalp, which otherwise requires bulky and uncomfortable needle-like (acicular) sensors.
This work demonstrates that subnanometer-thick epitaxial graphene can be used to create three-dimensional micropatterned sensors that capture the EEG signal from the intricate occipital region of the scalp. The occipital region, which houses the visual cortex, is crucial for BMIs based on the conventional steady-state visually evoked potential (SSVEP) paradigm. Moreover, the patterned epitaxial graphene sensors offer low impedance and good skin contact, and as a result can achieve signal-to-noise ratios comparable to those of wet sensors. Using these sensors, the researchers demonstrated that brain activity alone can command a four-legged robot, without any physical contact.
The scientists fabricated these miniature, micropatterned epitaxial graphene (EG) EEG sensors on silicon-carbide-on-silicon substrates; the graphene layer that contacts the skin is typically only a few nanometers thick. For the back of the head, they created structures roughly 10 μm deep with various shapes and packing ratios.
Conclusion
The researchers found that the graphene area affects the sensor's contact impedance on the flat skin of the forehead; when the sensors are positioned on the occipital region, however, this correlation breaks down. Although producing a final design was not the study's objective, they observed that an optimal design must balance total graphene area against other characteristics, such as accommodating hair and transferring adequate contact pressure.
Using hexagonally patterned epitaxial graphene (HPEG) sensors, the researchers were able to obtain the EEG signal from the occipital region of a subject with 5-mm-long hair. These sensors achieved an excellent signal-to-noise ratio of up to 25 ± 5 dB at 50 Hz and a low average impedance of 155 ± 10 kΩ, very close to the industry standard.
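To put the reported 25 dB figure in perspective, a short calculation shows what it implies about the signal relative to the noise, assuming the conventional amplitude-ratio (20·log10) definition of SNR; the numbers below are illustrative, not measurements from the study.

```python
import math


def snr_db(signal_rms, noise_rms):
    """Signal-to-noise ratio in decibels from RMS amplitudes."""
    return 20 * math.log10(signal_rms / noise_rms)


# A 25 dB SNR corresponds to a signal amplitude roughly 17.8x the noise floor.
print(round(10 ** (25 / 20), 1))  # → 17.8

# Conversely, an amplitude ratio of 17.8 maps back to about 25 dB.
print(round(snr_db(17.8, 1.0), 1))  # → 25.0
```

The ±5 dB spread reported for the sensors therefore corresponds to amplitude ratios ranging from roughly 10:1 (20 dB) to over 30:1 (30 dB).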
Finally, the researchers demonstrated a full BMI system that used the SSVEP paradigm and an eight-channel HPEG sensor array to drive a four-legged robot with 94% accuracy. They note that monitoring EEG from the back of the head (occipital region) with dry sensors is challenging, and that for a given design the observed variability stems from sensor placement rather than from the sensor's individual characteristics; wet sensors are less susceptible to this variability. Although matching the performance of wet Ag/AgCl sensors in practical applications remains difficult, the researchers believe these three-dimensional micropatterned EG sensors represent a significant step towards that goal.