
The rapid advance of technology that gathers and applies information directly from the human brain carries a serious risk of bias and discrimination at work and threatens privacy, the UK data regulator has warned.

In a report published on Thursday, the Information Commissioner’s Office called for new regulations covering neurotechnology applications in non-medical fields, such as wellbeing, marketing and the workplace, to prevent ethical breaches.

Brain monitors and implants are already used in medicine to diagnose and treat diseases of the nervous system. Last month, scientists in Switzerland enabled a paraplegic patient to walk again with a neural implant that bypassed a spinal injury.

But Stephen Almond, ICO executive director of regulatory risk, said that while the idea of “brain data . . . conjures up images of science fiction films”, many people did not realise how fast its use was spreading.

“Neurotechnology collects intimate personal information . . . including emotions and complex behaviour,” he said. “The consequences could be dire if these technologies are developed or deployed inappropriately.”

“We are most worried about the real risk of discrimination when models are developed that contain bias, leading to inaccurate data and assumptions about people,” he added, noting that workers with patterns of brain activity deemed undesirable may be unfairly overlooked for promotion.

The ICO said employers were likely to adopt neurotech devices on a significant scale by 2028 to monitor existing employees and hire new ones, but that it had concerns about how staff data would be held and analysed.

The watchdog is joining an international movement to lay down ethical, regulatory and governance guidelines for neurotech. Unesco will convene what it says will be the first big international conference on the subject at its Paris headquarters next month.


Audrey Azoulay, director-general of the UN scientific and cultural agency, said neurotechnology “could help solve many health issues, but it could also access and manipulate people’s brains and produce information about our identities and our emotions”.

“It could threaten our rights to human dignity, freedom of thought and privacy,” she added.

In the UK, the ICO aims to provide guidance within two years, following a wide-ranging consultation exercise.

Research by the watchdog suggested that professional sports would be an early adopter of neurotech during the next two to three years, using non-invasive brain monitoring devices to analyse athletes’ responses to stimuli and concentration levels, as well as to track the long-term effects of head injuries.

In the longer term, the regulator’s report noted that “neuroenhancement” devices would aim to improve reaction times and muscular responses, “potentially allowing athletes to run faster, jump higher and throw further”.

Gaming and entertainment is set to be one of neurotech’s biggest consumer markets, as more sophisticated devices are introduced not only to control computers but also to boost users’ performance through “neuromodulation”.

The ICO also foresaw concerns about the use of consumers’ brain data “and what risks may be posed, should people choose to share it without fully understanding its potential uses and inferences”.
