How will brain-computer interface technologies change our relationship with machines?
Every day, millions of employees work in environments where their health and well-being depend on the decisions they and others make. Truck drivers, electrical operators, maintenance engineers, aircraft pilots, and many others make safety-critical decisions on a daily basis.
But, humans being humans, every decision is biased by the emotional state of the person in charge, even with extended experience or after the most intensive training. And contextual factors, such as tiredness or stress, can alter decision-making abilities at moments that prove critical.
To provide a layer of control and safety, systems are now being equipped with additional sensors that capture the emotional state of the people engaging with them and with their environment. This is often done by monitoring the brain's electrical activity, a discipline that goes by the name of the brain-computer interface (BCI), the applications of which are countless.
There are currently two classes of BCI systems:
— Noninvasive: these rely on straightforward helmets fitted with electroencephalogram (EEG) sensors, which detect the brain's electrical activity at the scalp
— Invasive: where miniaturized chips are implanted directly in the user's brain, making the system more precise. These are normally limited to specialized medical applications
BCI helmets have been around for a relatively long time. In 2016, Capgemini Engineering (then Altran) integrated one into a virtual factory solution, allowing users to "thought control" production equipment. Today, the significant engineering challenges that previously prevented the mass roll-out of this technology are being overcome.
The largest problem with EEG has long been electromagnetic noise, which comes both from the user's direct surroundings (for example, operators in electrical facilities) and from the user's brain itself, where algorithms may struggle to pinpoint the precise signal they are looking for.
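To make the noise problem concrete, the sketch below shows one common first line of defense: a notch filter to suppress mains interference, followed by a band-pass filter to keep the frequency range where most EEG rhythms live. It is a minimal, illustrative example assuming a single-channel recording at 250 Hz and the scipy library; the function name and parameter values are illustrative, not taken from any specific product.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 250.0  # assumed sampling rate in Hz, typical for consumer EEG headsets

def clean_eeg(raw, fs=FS, line_freq=50.0):
    """Suppress power-line interference and out-of-band noise in one EEG channel."""
    # Notch out the mains hum (50 Hz in Europe, 60 Hz in the US), the
    # dominant environmental artifact near electrical equipment.
    b_notch, a_notch = iirnotch(w0=line_freq, Q=30.0, fs=fs)
    x = filtfilt(b_notch, a_notch, raw)
    # Keep only the 1-40 Hz band, where most EEG rhythms of interest live.
    b_band, a_band = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
    return filtfilt(b_band, a_band, x)

# Example: one second of synthetic signal with 50 Hz mains contamination.
t = np.arange(0, 1, 1 / FS)
noisy = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)
print(clean_eeg(noisy)[:5])
```

Classical filters like these only remove noise at known frequencies; separating overlapping brain signals is where the AI-based approaches discussed below come in.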
By 2030, three major evolutions will likely dramatically improve the precision of EEG helmets:
1 — Advanced AI solutions are increasingly able to learn from the user and the environment and filter the signal accordingly (see the sketch after this list)
2 — Miniaturization of electronics and increased computing power will allow more sophisticated signal processing to run on the devices themselves
3 — Complementary technologies, such as computer vision and voice analysis, are spreading and converging to provide information on the user's context and emotional state
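As an illustration of the first point, a model can be calibrated on EEG epochs labeled for a specific operator and then score new epochs at run time. The sketch below is a toy example, assuming scipy and scikit-learn, classic band-power features, and entirely synthetic calibration data; it does not reflect the approach of any particular vendor.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 250.0  # assumed sampling rate in Hz

def band_powers(epoch, fs=FS):
    """Average spectral power in the classic EEG bands for one epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=int(fs))
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands.values()])

# Hypothetical calibration data: labeled epochs recorded from one operator
# (1 = stressed, 0 = calm), so the model adapts to that specific user.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(40, int(FS) * 2))   # 40 two-second epochs
labels = rng.integers(0, 2, size=40)          # stand-in calibration labels

X = np.array([band_powers(e) for e in epochs])
model = LogisticRegression().fit(X, labels)

# At run time, each new epoch is scored before the system acts on it.
p_stressed = model.predict_proba(band_powers(epochs[0]).reshape(1, -1))[0, 1]
print(f"estimated stress probability: {p_stressed:.2f}")
```

Calibrating per user is what lets the system learn an individual's signal patterns rather than relying on a one-size-fits-all model.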
As a result, devices and machines should soon be able to capture information on their users' state of mind and adapt their behavior accordingly. Some applications already exist, such as glasses that can detect drowsiness in drivers, and drone command systems that qualify the user's emotional state before accepting commands. The range of applications will likely broaden massively, starting with operators of hazardous equipment, such as power facilities, chemical installations, and critical systems, and becoming progressively mainstream as the technology gains users' and society's acceptance and adequate regulation from public authorities.
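A command gate of the kind used in such drone systems could be as simple as the toy logic below. It assumes an upstream estimator (like the one sketched above) already produces a stress probability; the threshold, function name, and messages are purely illustrative.

```python
STRESS_THRESHOLD = 0.7  # illustrative cut-off, to be tuned per application

def accept_command(command: str, stress_probability: float) -> bool:
    """Forward a command only when the operator's estimated stress is acceptable."""
    if stress_probability >= STRESS_THRESHOLD:
        print(f"rejected '{command}': operator stress too high "
              f"({stress_probability:.2f} >= {STRESS_THRESHOLD})")
        return False
    print(f"accepted '{command}'")
    return True

accept_command("takeoff", stress_probability=0.35)   # accepted
accept_command("descend", stress_probability=0.82)   # rejected
```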