The intensive development of new neural tools over the past ten years has been accompanied by a vigorous debate over the regulatory framework to be applied to these increasingly privacy-invasive tools.
Let's go back in time: a little over twelve years ago, the first "serious" companies began to put neural tools on the market at reasonable prices: vision-enhancement headsets, augmented-reality prostheses, chips implanted directly into the nervous system, and so on. (These firms had produced enough preliminary research to attest to the impact of their products, distinguishing them from the companies of the early 2020s, which sold more scams than tools.)
Led in large part by companies such as Neuralink (founded by the legendary Elon Musk), and later joined by Facebook, Google, and others, this digital revolution 3.0 has enabled the owners of these new tools to greatly increase their cognitive, physical, and mental capacities.
But as we now know from past technological (r)evolutions, this new step came with consequences and costs for the individual. If social networks were a paradise for data theft, neural tools have become a paradise for thought theft!
Beyond the simple analysis of individual behavior via external sensors (personal data left behind like breadcrumbs along our path across the web), internal sensors now analyze not our actions but our thought processes and cognitive connections.
The era of neuromarketing has knocked on our door much faster than expected, and our new tools have become keys that open the doors to our minds, to the delight of large companies, which have found here even more precise ways to analyze our instincts.
And here is the rub: personal data on social networks ultimately captured only a crude behavioral picture, leaving aside elements such as how we face our fears or indulge our curiosity. Neuromarketing, likewise, does not seek to balance the data harvested directly from our brains with an understanding of our decision-making processes, or of what science still cannot explain: our consciousness, and with it our values and our moral and social choices.
Building on past experience, the European Union has decided to take the lead and work on a regulatory framework that will severely limit access to users' brain data by the companies selling these products (and by the owners of the tools' operating code).
Beyond this, the EU is also seeking a way to oblige companies to have all of their algorithms analyzed by approved research centers, which would bring a neutral, objective scientific perspective and limit the deployment of aggressive sales methods that exploit raw instinctive reactions.
In this context, the Pacific Islands Forum approached the EU to learn from the regulatory work to be carried out in this area, and to see how to help island states incorporate this institutional framework and avoid having these new tools, like social networks before them, deeply disrupt the social organization of Pacific communities.