Brainstorming the research: the future of brain studies | Science in the net

Brainstorming the research: the future of brain studies


One of the least known places, still far from being explored and fully understood, is not (only) the deep universe: it is our brain. That is why the USA and the European Union have, in recent years, started two separate but parallel initiatives to map the human brain. In March 2014 it was publicly announced that the two projects will merge. Or will they clash?

The Human Brain Project

The first attempt to simulate parts of the structure of a brain dates back to 2005, when IBM, in collaboration with the École Polytechnique Fédérale de Lausanne in Switzerland, started a research program called the Blue Brain Project. After just one year, researchers were able to create a virtual but biologically realistic model of a single neuron, thanks to a supercomputer called Blue Gene. In 2008, the Blue Brain Project recreated a cellular neocortical column of 10,000 cells and finally, in 2011, built a virtual model of about a million neurons.

Given these results, the European Union decided in 2013 to scale up the effort and map the entire human brain. Together with the Graphene flagship, the Human Brain Project (HBP) entered the funding program called Future and Emerging Technologies (FET). This gave the researchers access to over one billion euros over ten years. The HBP is still based in Lausanne (even after the recent tensions between Switzerland and the EU) and involves 135 partner institutions from 26 countries.

Briefly, the HBP is supposed to create a virtual but realistic model of the entire human brain. This would not only provide a manageable model for studying poorly understood brain diseases such as Alzheimer's but, through neuroinformatics, is also meant to have implications for neurorobotics and high-performance computing.

The BRAIN Initiative

The Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative is partly, but not radically, different from the European HBP, in both aims and structure. Announced on April 2, 2013 by President Obama (who strongly endorsed the initiative), BRAIN aims to map the activity of every single neuron in our brain. The goal is to produce a "new dynamic picture of the brain that, for the first time, shows how individual cells and complex neural circuits interact in both time and space" (as explained in the interim report, which lays out the six high-priority research areas). The initiative is funded by the National Institutes of Health (which contributed $40 million for fiscal year 2014), the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF), for a total of about $100 million. For the next fiscal year, President Obama has already proposed doubling that funding, from $100 million to $200 million.

Synergies and difficulties

The two programmes will have approximately the same deadlines and the same budget (about 1 billion euros each), and many scientists from the two projects have already begun to collaborate informally, as happens, for instance, at the Allen Institute for Brain Science. As reported by Nature, US government officials have already announced the merging process: HBP and BRAIN will work together. Meanwhile, after settling its political clash with the EU (at least for the moment), Israel has also joined the HBP/BRAIN partnership, through the HBP.

Many scientists called it a natural marriage since, basically, the two programs deal with the same object (the brain) but from different perspectives (briefly: HBP is more focused on creating a "virtual brain", BRAIN on discovering the interactions among all the brain's cells). But many other analysts fear that something could go very wrong: skeptics base their criticism on at least three aspects.

First of all, many scientists say that the paradigm underlying the two projects is by now out of date, or simply wrong. In other words, they argue that the plasticity of brain activity cannot be "photographed", even by an incredibly complex virtual model. Others warn of a so-called "data tsunami": it has been estimated that a complete virtual brain could generate about 300,000 petabytes a year, that is, roughly 300 billion gigabytes of data every year. That is almost as much data as the entire internet carried in 2013 (some analysts estimated that traffic at 32,000 petabytes per month). For comparison, the Large Hadron Collider, once fully working, will generate "just" about 15 petabytes a year. Finally, on the ethical side, shared principles have to be discussed, which is never easy when dealing with the study of parts of the human body.
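The scale of these figures is easy to check with a quick back-of-the-envelope calculation, using only the estimates quoted above (which are themselves rough projections, not measured values):

```python
# Rough comparison of the data-volume estimates quoted in the text.
brain_pb_per_year = 300_000       # estimated output of a complete virtual brain
internet_pb_per_month = 32_000    # analysts' estimate of global internet traffic, 2013
lhc_pb_per_year = 15              # LHC yearly output, for scale

internet_pb_per_year = internet_pb_per_month * 12  # 384,000 PB/year

# A virtual brain would produce a large fraction of 2013's total internet traffic...
print(f"Brain vs internet: {brain_pb_per_year / internet_pb_per_year:.2f}")
# ...and thousands of times more data than the LHC.
print(f"Brain vs LHC: {brain_pb_per_year // lhc_pb_per_year:,}x")
```

On these numbers, a complete virtual brain would generate close to 80% of the data the whole internet moved in 2013, and about 20,000 times the LHC's yearly output.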

