Australian privacy laws need to be strengthened to prevent the misuse of a person’s brain data, which is being more easily obtained through the increased use of wearable technology.
While most current wearables track a person’s physical activity, a newer type of product is emerging which is based on neurotechnology.
Earlier this year, the Australian Human Rights Commission released a background paper on neurotechnologies, which it defines as “devices and procedures which can access, monitor, record or manipulate brain data”.
The Commission’s paper noted: “Because neurotechnologies can collect sensitive neural information, there is a significant risk to privacy”.
This risk, and the fact that wearable technology in general is becoming more widely accepted, has led a University of New England PhD law student to urge lawmakers to pay serious attention to the issue.
Writing in The Conversation, Edward Musole said that while much of neurotechnology was either still in the development stage or confined to research and medical settings, consumers could already purchase several headsets that monitored brain activity.
Mr Musole said they were often marketed as meditation headbands and provided real-time data on a person’s brain activity, which was fed into an app.
“Such headsets can be useful for people wanting to meditate, monitor their sleep and improve wellness. However, they also raise privacy concerns – a person’s brain activity is intrinsically personal data,” he said.
He said the subtle creep of neural and cognitive data collection by wearables was resulting in a data “gold rush”, with companies mining our brains so they could develop and improve their products.
“The extent to which tech companies can harvest cognitive and neural data is particularly concerning when that data comes from children. This is because children fall outside of the protection provided by Australia’s privacy legislation, as it doesn’t specify an age when a person can make their own privacy decisions.”
Mr Musole said the government should conduct an inquiry to investigate the extent to which neurotechnology companies collect and retain this data from children in Australia.
“The private data collected through such devices is also increasingly fed into AI algorithms, raising additional concerns. These algorithms rely on machine learning, which can manipulate datasets in ways unlikely to align with any consent given by a user,” he said.
Mr Musole said users should have complete transparency over what data their wearables collected and how it was being used.
“Right now, Australians don’t have any legal protections from privacy infringement on their brain and cognitive data. Technology companies can mine the neural data of Australians – including children – and store this information outside Australia.”
He said one potential solution would be to update privacy legislation to work in conjunction with the Therapeutic Goods Administration (TGA), which regulates the supply of medical devices in Australia.