
FP7 – 604102 – HBP

CP-CSA-FF

Appendix 1: Overall Vision for the Human Brain Project


TABLE OF CONTENTS

1 Strategic S&T roadmap
1.1 Conceptual overview
1.2 The HBP work plan
1.2.1 Data
1.2.2 ICT platforms
1.2.3 Applications
1.2.4 SP14: The HBP Ethics and Society Programme
1.2.5 SP15: Project and programme management, education, dissemination and innovation
1.3 Risks and contingencies
2 HBP as a European Programme: collaboration and coordination
2.1 Forms of collaboration
2.2 Governing the collaboration
3 Impact
3.1.1 Transformational impact
3.1.2 Future neuroscience
3.1.3 Future medicine
3.1.4 Future computing
3.1.5 Benefits for European industry and the European economy
3.1.6 Benefits for European society
3.1.7 Strengthening of the interfaces between ICT and other disciplines
3.1.8 Specific impacts of the ICT Platforms
3.1.9 Exploitation and IPR
3.1.10 Use of the platforms
3.1.11 Sustainability
3.1.12 Additional financial resources
4 Ethical, societal and gender policies of the Flagship
4.1 Ethics and society
4.2 Governance of ethical issues within the HBP Project
4.2.1 Strategic oversight
4.2.2 Research ethics
4.3 Gender aspects
4.3.1 Gender balance in HBP recruitment and management
4.3.2 Gender considerations in HBP research
REFERENCES


1 Strategic S&T roadmap

1.1 Conceptual overview

The Human Brain Project is a ten-year project, consisting of a thirty-month ramp-up phase, funded under FP7, with support from a special flagship ERANET, and a ninety-month operational phase, to be funded under Horizon 2020. The project, which will have a total budget of over Euro 1 billion, is European-led with a strong element of international cooperation. The goal of the project is to build a completely new ICT infrastructure for neuroscience, and for brain-related research in medicine and computing, catalysing a global collaborative effort to understand the human brain and its diseases and ultimately to emulate its computational capabilities.

The proposed infrastructure will consist of six ICT-based research platforms, providing neuroscientists, medical researchers and technology developers with access to highly innovative tools and services that can radically accelerate the pace of their research. These will include a Neuroinformatics Platform that links to other international initiatives, bringing together data and knowledge from neuroscientists around the world and making it available to the scientific community; a Brain Simulation Platform that integrates this information in unifying computer models, allowing in silico experiments that are impossible in the lab; a High Performance Computing Platform that provides the interactive supercomputing technology neuroscientists need for data-intensive modelling and simulations; a Medical Informatics Platform that federates clinical data from around the world, providing researchers with new mathematical tools to search for biological signatures of disease; a Neuromorphic Computing Platform that makes it possible to translate brain models into a new class of hardware devices and to test their applications; and a Neurorobotics Platform that allows neuroscience and industry researchers to experiment with virtual robots controlled by brain models developed in the project. The platforms are all based on previous pioneering work by the partners and will be available for internal testing within eighteen months of the start of the project. Within thirty months, the platforms will be open for use by the community, receiving continuous upgrades to their capabilities for the duration of the project.

The HBP will trigger and drive a global, collaborative effort that uses the platforms to address fundamental issues in future neuroscience, future medicine and future computing. A significant and steadily growing proportion of the budget will be devoted to research by groups outside the original HBP Consortium, working on themes of their own choosing. The expected end results will include a new understanding of the brain and its diseases and radically new forms of ICT that exploit this knowledge. The social, economic and industrial impact is potentially enormous.

To achieve these goals the project has set itself six Strategic Objectives for the Full Flagship (SOFF).

SOFF-1. Design, develop, deploy and operate the ICT platforms, providing novel ICT-based services for researchers in neuroscience, medicine and computing and creating a user community of research groups from within and outside the HBP.

SOFF-2. Catalyse ground-breaking research into the structure and function of the human brain, the causes, diagnosis and treatment of brain disease, and brain-inspired computing technology.

SOFF-3. Generate and collect the strategic neuroscience data, create the theoretical frameworks and develop the scientific and technological capabilities required for the design and operation of the platforms.

SOFF-4. Implement a strategy of responsible innovation, monitoring science and technological results as they emerge, analysing their social and philosophical implications, raising awareness of these issues among researchers and among citizens and involving them in a far-reaching conversation about future directions of research.

SOFF-5. Implement a programme of transdisciplinary education, training young European scientists to exploit the convergence between ICT and neuroscience and creating new capabilities for European industry and academia.


SOFF-6. Develop a framework for collaboration that links the partners under strong scientific leadership and professional project management, providing a coherent European approach and promoting effective alignment of regional, national and European research programmes.

1.2 The HBP work plan

The HBP is organized into fifteen subprojects (thirteen in the ramp-up phase) covering five areas of research. Each subproject builds on the partners' existing work, contributes to work in the other subprojects and acts as a catalyst for the integration of research from outside the HBP Consortium. Below we briefly summarise our specific aims in each area.

1. Data. This part of the HBP’s work aims to generate and analyse strategically selected data on the structure and function of the mouse and human brains at different levels of biological organisation (genetics, gene expression, cell numbers and morphologies, long-range connectivity, cognitive function etc.), deriving general principles of brain organisation. An important part of the work will focus on data (e.g. genetic data, data on gene expression in different cell types) that the project will subsequently use to predict features of the brain that are difficult or impossible to measure experimentally. The results, deposited in publicly accessible atlases (see below), will feed multi-level models of the mouse and human brain. Another, equally important part of the work will focus on cognitive function – combining human data from fMRI, DTI, EEG and other non-invasive techniques to provide a comprehensive spatial and temporal description of the neuronal circuits implicated in specific well-characterised cognitive tasks. The results will guide the development of high-level models of neuronal circuitry. High-level models will be used to validate biologically detailed models and vice versa, combining the advantages of top-down and bottom-up approaches.

2. Theory. HBP work in theoretical neuroscience will investigate the mathematical principles underlying the relationships between different levels of brain organisation and the plasticity mechanisms that subserve the acquisition, representation and long-term memorisation of information about the outside world. The results will help to identify the critical data needed for modelling, and to simplify detailed brain models for implementation in IT and specifically in neuromorphic computing systems. The planned work includes the creation of a new European Institute for Theoretical Neuroscience with programmes for visiting scientists and young investigators.

3. ICT Platforms. The HBP will build, operate and continuously update an integrated system of six ICT platforms providing high-quality services to researchers and technology developers inside and outside the HBP. All the platforms will be remotely accessible through a single HBP web portal.

a. Neuroinformatics Platform. The Neuroinformatics Platform will use state-of-the-art ICT (semantic technology, distributed query technology, provenance tracking etc.) to give neuroscientists the ability to organise and search massive volumes of heterogeneous data, knowledge and tools produced by the international neuroscience community. New tools incorporated in the platform will allow researchers to analyse and interpret large volumes of structural and functional data and to construct brain atlases. The HBP will use these tools to develop detailed 3D multi-level atlases of the mouse and human brains. The atlases, accessible to the community through the HBP web portal, will be the main source of high-quality annotated data for brain modelling.
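To make the idea of organising and searching heterogeneous, provenance-tracked data concrete, the following minimal sketch shows the pattern in Python. The `Dataset` schema and `Registry` API are invented for illustration only; they are not the platform's actual interface.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    """A registered dataset with minimal descriptive metadata (hypothetical schema)."""
    name: str
    species: str
    level: str                 # e.g. "cellular", "molecular"
    provenance: list = field(default_factory=list)  # record of how the data got here

class Registry:
    """Toy stand-in for a neuroinformatics registry with provenance tracking."""
    def __init__(self):
        self._datasets = []

    def register(self, ds, source):
        ds.provenance.append(f"registered from {source}")
        self._datasets.append(ds)

    def search(self, **criteria):
        """Return all datasets whose metadata match every given criterion."""
        return [d for d in self._datasets
                if all(getattr(d, k) == v for k, v in criteria.items())]

reg = Registry()
reg.register(Dataset("V1 morphologies", "mouse", "cellular"), "lab A")
reg.register(Dataset("synapse proteome", "mouse", "molecular"), "lab B")
hits = reg.search(species="mouse", level="cellular")
```

The real platform layers semantic technology and distributed query over the same basic pattern: uniform metadata plus a provenance trail that records where each dataset came from and what was done to it.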

b. Brain Simulation Platform. The Brain Simulation Platform will provide a suite of software tools and workflows that allow researchers to build and simulate models of the brain at different levels of description, and to perform in silico experiments that are difficult or impossible in the lab. The project will use the platform to develop and validate first draft models of different levels of brain organisation, in mice and in humans. The ultimate goal will be to build and simulate multi-scale, multi-level models of the whole mouse brain and the whole human brain. The capabilities made available by the platform will contribute to identifying the neuronal architectures underlying specific brain functions, to studies of the mechanisms underlying neurological and psychiatric disease, and to new simulation-based techniques of drug discovery. Simplified versions of brain models will form the basis for novel neuromorphic computing systems.
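As an illustration of simulation at one very simple level of description, the sketch below implements a single leaky integrate-and-fire neuron in plain Python. The parameter values are generic textbook choices, not taken from HBP models, and the platform itself operates at far greater levels of detail.

```python
def simulate_lif(i_ext, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0,
                 v_reset=-65.0, r=10.0):
    """Forward-Euler simulation of one leaky integrate-and-fire neuron.

    i_ext: input current (nA) per time step; returns spike times in ms.
    All parameter values are generic illustrations, not HBP model values.
    """
    v = v_rest
    spikes = []
    for step, i in enumerate(i_ext):
        v += (-(v - v_rest) + r * i) * dt / tau   # leak plus driving current
        if v >= v_thresh:                          # threshold crossing = spike
            spikes.append(step * dt)
            v = v_reset                            # reset after the spike
    return spikes

# a constant 2 nA drive for 100 ms produces regular firing
spikes = simulate_lif([2.0] * 1000)
```

Scaling this kind of loop from one point neuron to multi-compartment models of billions of cells is precisely what makes the dedicated simulation and supercomputing infrastructure necessary.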


c. High Performance Computing Platform. The High Performance Computing Platform will provide the advanced supercomputing capabilities required for brain modelling and simulation and for the design of novel neuromorphic computing systems. The first element in the platform will be the HBP Supercomputer, a machine that will gradually evolve toward the exascale over the duration of the project. This will be complemented by satellite facilities dedicated to software development, molecular dynamics simulations, and massive data analytics. A key goal will be to develop a capability for in situ analysis and visualisation of exascale data sets and for interactive visual “steering” of simulations. These features will be invaluable not just for brain simulation but also for many other applications, in the life sciences and elsewhere.
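The in situ idea (reducing simulation output as it is produced, rather than writing the full data set to disk first) can be sketched as a streaming computation; the simulated output below is an invented stand-in.

```python
def in_situ_stats(chunks):
    """Streaming mean and peak over simulation output, one chunk at a time,
    so the full data set never has to be materialised or written to disk."""
    n, total, peak = 0, 0.0, float("-inf")
    for chunk in chunks:
        n += len(chunk)
        total += sum(chunk)
        peak = max(peak, max(chunk))
    return total / n, peak

def fake_simulation_output(steps=100, cells=10):
    """Stand-in generator for per-timestep simulation output."""
    for step in range(steps):
        yield [0.01 * step + 0.001 * cell for cell in range(cells)]

mean, peak = in_situ_stats(fake_simulation_output())
```

At exascale the same principle applies with parallel reductions across compute nodes: only compact summaries or rendered views ever leave the machine, which is also what makes interactive steering of a running simulation feasible.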

d. Medical Informatics Platform. The Medical Informatics Platform will federate genetics, imaging, and other clinical data currently locked in hospital and research archives and make the data available to relevant research communities. An important goal will be to use the platform to identify biological signatures of disease. Success would accelerate the development of a new category of biologically based diagnostics, supported by strong, mechanistic hypotheses of disease causation. Hypotheses developed in this way could then be tested through in silico experiments on the Brain Simulation Platform. The results will help researchers to identify new drug targets and new strategies for treatment, providing valuable input for industry decision-makers before they invest in expensive programmes of animal experimentation or human trials.
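The federation idea (analysing clinical data where it resides, rather than pooling raw records centrally) can be sketched in a few lines. The example computes a global mean from per-site aggregates only; the function names and values are hypothetical.

```python
def site_summary(values):
    """What each hospital shares: aggregate statistics only, never raw records."""
    return {"n": len(values), "sum": sum(values)}

def federated_mean(summaries):
    """Combine per-site aggregates into a global mean without pooling the data."""
    n = sum(s["n"] for s in summaries)
    return sum(s["sum"] for s in summaries) / n

# invented biomarker values held at two hospitals
site_a = [70.0, 72.0, 68.0]
site_b = [75.0, 71.0]
global_mean = federated_mean([site_summary(site_a), site_summary(site_b)])
```

The same pattern extends to the richer statistics needed for disease-signature mining; the essential property is that patient-level records never leave the hospital that holds them.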

e. Neuromorphic Computing Platform. The Neuromorphic Computing Platform will allow non-expert neuroscientists and engineers to perform experiments with Neuromorphic Computing Systems (NCS): hardware devices incorporating simplified versions of the brain models developed on the Brain Simulation Platform, state-of-the-art electronic component and circuit technologies, and new knowledge arising from other areas of HBP research (experimental neuroscience, theory). The platform will provide access to three classes of NCS: systems based on physical (analogue or mixed-signal) emulations of brain models (NM-PM), running much faster than real time; numerical models running in real time on digital manycore architectures (NM-MC); and hybrid systems. The platform will be tightly integrated with the High Performance Computing Platform, which will provide essential services for mapping and routing circuits to neuromorphic substrates, benchmarking and simulation-based verification of hardware specifications.
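A toy version of the mapping service mentioned above might check, before placement, whether a circuit respects the substrate's resource limits. The constraint names and limits below are invented for illustration; real neuromorphic systems impose many further constraints (routing bandwidth, synapse memory etc.).

```python
def check_mapping(connections, max_fan_in, n_cores, neurons_per_core):
    """Check whether a network fits a (hypothetical) neuromorphic substrate:
    total neuron capacity and a per-neuron fan-in limit."""
    fan_in = {}
    neurons = set()
    for pre, post in connections:
        neurons.update((pre, post))
        fan_in[post] = fan_in.get(post, 0) + 1
    capacity_ok = len(neurons) <= n_cores * neurons_per_core
    fan_in_ok = all(f <= max_fan_in for f in fan_in.values())
    return capacity_ok and fan_in_ok

# a 3-neuron chain fits; a hub receiving 3 inputs violates a fan-in limit of 2
fits = check_mapping([("a", "b"), ("b", "c")],
                     max_fan_in=2, n_cores=2, neurons_per_core=2)
too_dense = check_mapping([("a", "d"), ("b", "d"), ("c", "d")],
                          max_fan_in=2, n_cores=2, neurons_per_core=2)
```

Checks of this kind are one reason the mapping and verification services are hosted alongside the supercomputing facilities: rejected or rewritten circuits can be re-verified in simulation before any hardware run.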

f. Neurorobotics Platform. The Neurorobotics Platform will offer scientists and technology developers a software and hardware infrastructure allowing them to connect brain models, implemented through the Brain Simulation Platform or on neuromorphic computing systems, to detailed simulations of robot bodies and their environments, or to physical robots. The capabilities provided by the platform will allow cognitive neuroscientists to perform closed-loop experiments dissecting the neuronal mechanisms responsible for specific cognitive capabilities and behaviours, and will support the development of neurorobotic systems for applications in specific domains (manufacturing, services, automatic vehicles etc.).
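The closed-loop principle can be illustrated with a deliberately trivial sketch: a toy "brain" (here just a proportional controller standing in for a brain model) receives sensory input from a simulated one-dimensional robot and issues motor commands that change what it senses next.

```python
def closed_loop(target, steps=50, gain=0.2):
    """A toy 'brain' (proportional controller) drives a simulated 1-D robot.

    Each iteration closes the loop: sense position, compute a motor command,
    let the environment update, and sense again on the next pass.
    """
    position, trace = 0.0, []
    for _ in range(steps):
        error = target - position   # sensory input to the brain model
        command = gain * error      # brain model output (motor command)
        position += command         # robot/environment responds
        trace.append(position)
    return trace

trace = closed_loop(1.0)            # the robot converges on the target
```

In the platform itself the controller slot is filled by a spiking brain model and the one-line environment update by a physics simulation or a physical robot, but the sense-compute-act cycle has exactly this shape.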

4. Applications. The HBP will support research projects that use the platforms to accelerate research in neuroscience (dissecting the biological mechanisms responsible for cognition and behaviour), medicine (understanding brain disease, developing new diagnostic tools and treatments, personalised medicine) and computing (novel architectures for high performance computing; low-energy computing systems with brain-like intelligence; hybrid systems integrating neuromorphic and conventional technologies; applications for industry, services, vehicles, and the home). The majority of these projects will be carried out by groups from outside the current HBP Consortium, selected via independent peer review within the HBP Competitive Calls Programme.

5. Ethics and society. The HBP will organise a large Ethics and Society Programme. The programme will include a Foresight Lab, responsible for investigating the project’s likely impact on society; academic studies of the project’s implications for beliefs about the human mind, identity, personhood, and our capacity for control; a far reaching programme to build ethical and social awareness among scientists working in the HBP; and a multifaceted series of initiatives to build and maintain a dialogue with civil society. An independent Ethical, Legal and Social Aspects committee will provide


governance and guidance on the ethical implications of the project’s expected outcomes; a second Research Ethics Committee will supervise day-to-day research practices (clinical research with human subjects, animal experimentation etc.), ensuring compliance with all relevant legal, regulatory and ethical requirements.

As shown in Figure 4, each research area will be further divided into subprojects. In what follows, we provide a detailed description of individual research areas and subprojects.

1.2.1 Data

Modern neuroscience research has generated vast volumes of experimental data and large-scale initiatives launched in recent years will gather much more. Nonetheless, much of the knowledge needed to build multi-level atlases and unifying models of the brain is still missing.

The human brain contains close to 100 billion (10^11) neurons and a million billion (10^15) synaptic connections, each expressing different proteins on the cell membrane and each with its own complex internal structure. Despite huge advances, there is no technology on the horizon that allows us to characterise more than a tiny part of this complexity. Furthermore, obvious ethical considerations place tight constraints on the use of invasive techniques. This means that to understand the human brain we have to maximise the information we can extract from research in other mammalian species and from limited human datasets (e.g. data from neuroimaging, EEG and ECoG; data from autopsied brains; data from human iPSC; genetic data). The HBP’s data generation strategy will thus focus on a small set of strategically critical datasets, for mice and humans, which could allow the project to identify principles making it possible to predict difficult-to-measure values from data that is more readily available. EPFL’s Blue Brain Project has recently published a paper demonstrating the successful use of experimental data on neuron morphologies to predict connectivity in neuronal microcircuits [1]. The HBP plans to systematically extend this Predictive Neuroinformatics approach, deriving and validating general principles of brain organisation that reduce the need for direct experimental measurements and which point to gaps in our knowledge where such measurements are indeed essential. One of the first essential steps will be to develop methods making it possible to synthesise neuron morphologies and electrophysiological behaviour from data on the genes expressed in particular types of cell (cell-type transcriptomics) and to estimate the number of cells of different types in different areas of the mouse and human brains from publicly available maps of gene expression [2, 3].
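A minimal sketch of the predictive step (estimating a hard-to-measure quantity from more readily available data) is shown below using synthetic stand-in data: a linear model is fitted to "measured" cell densities as a function of marker-gene expression and then used to predict the density in an unmeasured region. The data, the linear form and all numbers are illustrative only, not the HBP's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: expression of 3 marker genes in 40 brain regions,
# with "measured" cell densities that depend linearly on them plus noise.
X = rng.random((40, 3))
true_w = np.array([120.0, 80.0, 40.0])
y = X @ true_w + rng.normal(0.0, 1.0, 40)

# Fit on the measured regions, then predict an unmeasured one.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
new_region = np.array([0.5, 0.2, 0.1])
predicted_density = new_region @ w
```

Validating such fitted principles against held-out measurements is what tells the project where prediction suffices and where new experiments remain essential.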

Data generation in the HBP will be divided into three subprojects dedicated respectively to the Multi-level organisation of the mouse brain, the Multi-level organisation of the human brain, and Brain function and cognitive architectures.

1.2.1.1 SP1: Strategic mouse data

Operational objectives

As just described, ethical and technical considerations make it difficult to obtain data about the detailed structure and function of the human brain. To understand the role of different levels of biological organisation, it is essential, therefore, that we learn as much as possible from other mammals. One promising strategy is to collect systematic data describing different levels of brain organisation in a single species and to analyse how variation in structure and function relates to genetic variation. Much of our current knowledge and methods come from studies in mice. The goal of the HBP will thus be to build on this knowledge, generating systematic data sets for mouse genomes and molecules, cells and circuits. The results will help to fill in gaps in the current data and to discover general principles applicable to models of the human brain.

State of the art

Current neuroscience comprises many disciplines and research communities, each focusing on a specific level of biological organisation, and on the brain regions, species and methods best adapted to its specific goals. Progress is rapid at all levels. However, there are many gaps in our current knowledge. At the molecular level, we lack a complete description of the genes expressed in single neurons or the protein composition of synapses. At the cellular anatomy and connectivity levels, we still do not have complete data for a single species; even in C. elegans – the only animal whose neuronal circuitry has been completely deciphered – essential information, such as data on neural morphologies is still missing. At the physiological level, we do not have a clear, quantitatively accurate picture of physiological response in different types of synapse, cell and circuit; data on long-range connections between different brain regions is similarly sparse. Without a


systematic programme of research in a single species, it will be extremely difficult to understand the relationships between different levels of brain organisation – for instance, the way in which a variant in a specific gene can affect the architecture of an animal’s neural circuitry and its subsequent behaviour. The species for which we have the most data and the best techniques of data generation is the mouse.

Although an enormous amount of work remains to be done, new technologies are making it easier to generate data on the mouse brain, and to relate the data to data for humans. At the molecular level, we already have a large volume of quantitative data on DNA sequences and modifications [4], RNA [5] and proteins [3, 6]. The last two years have seen the release of the first molecular-level atlases of the mouse [3] and human [2] brains. In principle, these atlases, combined with RNA and protein profiles for different cell and synapse types, could make it possible to estimate the numbers of cells of different types in different brain regions and to relate the data for the two species. The Human Brain Project will fully exploit these possibilities.

At higher levels of organisation, breakthroughs in scalable methods, and particularly in optogenetics [7] and MRI are paving the way for comprehensive studies comparable to the work currently in progress in molecular biology and proteomics. In particular, there has been considerable progress in connectomics. Molecular tracer methods now make it possible to trace connections between different types of cells and their synapses. Data from these studies can be correlated with results from behavioural studies, traditionally performed in low-throughput settings, but now complemented by high-throughput methods using touchscreen based perception and learning tasks [8]. These methods mean it is now possible to measure and compare data from thousands of animals and compare it to data from human subjects.

Methodology

HBP work in this area will be based on a cohort of genetically diverse mouse strains expressing a range of mutations and normal gene variants. This method will allow the project to systematically generate strategically valuable data (DNA sequences, chromatin, mRNA and protein expression, synaptic connections and cell structure, physiology) and to compare it against human data sets. The results will cast light on the causal relationships between different levels of brain structure. Organisational principles derived from this work will help the HBP to estimate parameter values for human brain models that cannot be measured experimentally.

The study will seek to answer key questions concerning the relationships among different levels of brain organisation.

1. The genome. What is the relationship between differences in gene sequence and chromatin state and higher levels of brain organisation (gene and protein expression, distributions and densities of different cell types, connectivity, the size of different brain regions, large-scale structure of the brain)? How can we characterise the cascade of multi-level effects leading from genes to behaviour?

2. Gene expression. What combinations of genes are expressed in different types of cells at different ages? How do gene expression and its dynamics vary among cell types? What can we predict about cells by reading mRNA profiles? What are the mechanisms underlying spontaneous, stimulus and environmentally driven changes in gene expression?

3. Protein expression. What range of proteins is expressed in different types of neuron, glia and synapse? How do qualitative and quantitative variations in protein expression affect the electrical and pharmacological behaviour of cells? What are the molecular and cell biological principles governing the distribution of proteins within the cell? What can we learn from these distributions?

4. Cells. How many and what types of cells are present in different regions of the brain? What are their morphologies? How much do they vary in number and shape? What are the relationships between gene variants, gene expression and morphology?

5. Connectivity. How many different types of synapse are there? What are the rules governing the formation of synaptic connections between neurons of different types, and the long-term stabilisation of these connections? How are synaptic locations chosen? What are the rules governing long-range connections between different brain regions?

6. Electrophysiology. What are the different profiles of excitability and firing patterns of different neuronal types? What are the different types of synaptic plasticity? What are the mechanisms underlying diversity in synaptic transmission? How do neurons process synaptic input? What are the characteristic emergent behaviours of neuronal microcircuits? How do microcircuits of neurons work together to shape the dynamics of a brain region? How do past activity, network context and


neuromodulation affect the functional expression of neurons’ intrinsic properties and modulate plasticity (metaplasticity)?

7. The neuro-vascular-glial system. How do neurons, glia and blood vessels interact? What is the detailed architecture of the vasculature that directs blood within the brain? What is the structural relationship among neurons, glia and vessels? How do changes in neurons alter the properties of vessels and vice versa?
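As a toy illustration of the connectivity questions above (item 5), and of the morphology-based prediction demonstrated in [1], the expected number of axo-dendritic appositions between two neurons can be estimated from the overlap of their arbor densities. The voxel values below are invented, and the unit proportionality constant is an assumption made for simplicity.

```python
def expected_appositions(axon_density, dend_density):
    """Expected axo-dendritic appositions, taken proportional (with a unit
    constant, for illustration) to the voxel-wise product of the axonal and
    dendritic length densities."""
    return sum(a * d for a, d in zip(axon_density, dend_density))

# invented densities over four voxels: overlapping arbors vs. disjoint ones
overlapping = expected_appositions([0.0, 2.0, 3.0, 1.0], [0.0, 1.0, 2.0, 0.5])
disjoint = expected_appositions([2.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 2.0])
```

Cells whose arbors occupy the same voxels are predicted to touch often; cells with disjoint arbors are predicted never to touch. Validating and refining rules of this kind against measured microcircuits is the core of the predictive neuroinformatics strategy.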

Combined with behavioural data, these coherent, multi-level data sets will provide the fundamental information needed to gain insights into the relationships between different levels of biological organisation, to identify basic principles making it possible to predict parameter values where measurements are not available, and to bring data and principles together in unifying models. In this way, the HBP will be able to formulate and answer new questions. Which combination and sequence of activation of different brain regions support different forms of behaviour? How do genes and gene expression correlate with neuron morphology and electrophysiology? What is their relationship to cognition and behaviour? How are the building blocks of behaviour related to one another and what is their mechanistic underpinning at the molecular, cellular and circuit levels? What is the smallest network of neurons that can perform an isolated task? How does the composition of a neural microcircuit affect the computational operations it performs? What is the role of single cell types in the processing of sensory and motor information? How important is multisensory information processing for the individual senses?

The mouse data sets collected during the ramp-up phase will be carefully planned to match data sets for humans. For example, data on gene expression in single mouse neurons will be matched against equivalent data for human cells induced from pluripotent stem cells. Similarly, cognitive data obtained from touchscreen behavioural testing in mice will be matched against equivalent human data sets – now a standard approach. The HBP is aware that it can only generate a very small fraction of the data needed to build detailed models of the brain. The goal, therefore, is to create a scaffold that can then be fleshed out with data from other groups and from predictive neuroinformatics (see below). To pilot this approach, the HBP will collaborate with the Allen Institute for Brain Science (http://www.alleninstitute.org/) in its on-going study of the mouse visual system. At the structural level, this study will map the volumes of brain areas implicated in the visual system, obtaining cell numbers and distributions, as well as data on genetically characterised cell types and on neuron morphologies. Functionally, it will identify the role of single neurons and cell types in visual information processing and visual perception and learning paradigms. The results will be contributed to the INCF (www.incf.org), the GeneNetwork system (www.genenetwork.org), the Genes-to-Cognition programme (www.genes2cognition.org) and other internationally established databases, which will provide the HBP with standardised data resources for theoretical studies and modelling.

Roadmap and key milestones

M30: High throughput screening of the mouse brain phase 1

Methods. Informatics tools ready for managing molecular and cellular data

High throughput screening. Draft transcriptomes of major cerebellar neurons; draft mouse synapse proteome atlas; distributions and relative densities of cells in the whole brain; statistical parameters characterizing spatial arrangements between neurons, glia and blood vessels; high-resolution synaptic map of brain regions in a given coronal plane at the mesoscopic level (numbers of PSD95 puncta); ultrastructural data using FIB/SEM revealing principles of neuropil organisation in multiple brain regions.

Principles. Principles of quantitative connectomics and neuronal morphologies;

M60: High throughput screening of the mouse brain phase 2

Methods. Optimised workflow for depositing data in the mouse brain atlas.

High throughput screening. Large-scale gene sequencing, electrophysiological and behaviour profiling; refined generic neuronal, glial and synaptic proteomes; transcriptomes of major neuron types in major brain regions; high throughput whole brain mapping of selected ion channels and receptors; neuronal and glial morphologies in major brain regions; refined neuro-vascular-glial system; high throughput tracing of single axons in major brain regions (whole brain); EM block analysis of all brain regions.


Principles. Refined activity-dependent gene expression rules for environment-sensitive brain models; comparisons of mouse and human transcriptomes, proteomes, cellular morphologies, cell counts, presence and dimensions of brain regions.

M90: High throughput screening of the mouse brain phase 3

High throughput screening. Large-scale gene sequencing, electrophysiological and behaviour profiling complete; generic neuronal, glial and synaptic proteomes with data on individual variation; transcriptomes for all neuron types in all brain regions; high throughput mapping of further selected ion channels and receptors (whole brain); neuronal and glial morphologies for all brain regions; presence and dimensions of brain regions; high throughput tracing of single axons (whole brain); final neuro-vascular-glial system with sufficient data on variance to constrain a vascular synthesiser; EM block analysis of all mouse brain regions providing information on individual variations in the statistical data on synaptic connectivity and neuronal ultrastructure.

Principles. Refined activity-dependent gene expression rules for environment-sensitive brain models; comparisons of mouse and human transcriptomes, proteomes, cellular morphologies, cell counts.

M120: High throughput screening of the mouse brain phase 4

High throughput screening. High throughput mapping of further selected ion channels and receptors (whole brain); high throughput tracing of single axons (whole brain).

Principles. Comparisons of mouse and human transcriptomes, proteomes, cellular morphologies, and cell counts; presence and dimensions of brain regions.

1.2.1.2 SP2: Strategic human data

Operational objectives

Mouse data provides many insights that also apply to the human brain. Obviously, however, the human brain is different. It is essential therefore to compare mouse data with measurements from humans. Although ethical considerations limit the choice of methods, recent non-invasive techniques provide new options. The HBP will use these techniques to generate a scaffold of strategically selected data on the structure and functional organisation of the human brain at different ages and at different levels of biological organisation (see Figure 5), which it will use to catalyse and organise contributions from outside the project, filling in the gaps with data from predictive neuroinformatics. The results will provide essential input for multi-level models of the human brain and for the understanding of brain disease.

State of the art

Genetics and gene sequencing. Genetics is the method of choice for understanding genome-to-phenome linkage at the molecular, cellular and behavioural levels. Two genetic strategies have proven particularly valuable. The first compares the phenotypes produced by point mutations against controls; the second examines small populations of individuals and assesses the role of endogenous genetic variation (natural polymorphisms).

Combined with massive “-omic” data sets, such as ENCODE [9] and the recently released atlas of the adult human brain transcriptome [2], these approaches make it possible to build and test complex systems models where every trait, at every level and scale, can be linked to specific gene loci and regulatory sequences [6]. The recent introduction of computerised touchscreen approaches has made it possible to compare a subset of human cognitive functions with equivalent functions in the mouse [10]. Despite the limitations of mouse models for predicting complex behaviour and cognition in humans, comparative studies of mice and humans can provide valuable information about putative mechanisms. Functions amenable to this approach include attentional processing, visual and auditory memory, as well as cognitive flexibility and response inhibition. These methods provide a valuable tool for studies of normal human genetic variation.

Human mutations as a major cause of brain disease. Studies have identified over two hundred single-gene mutations affecting human postsynaptic proteins and over a hundred and thirty brain diseases in which these mutations are believed to play a role. Recent work suggests that mutations in regulatory sequences may also play an important role in pathogenesis [9]. Studies of individuals with these mutations can provide useful insights into the way variation in specific proteins contributes to differences in cognitive, behavioural and emotional phenotypes, while simultaneously providing valuable information on mechanisms of disease causation. Particularly interesting are studies on individuals who carry these mutations, but who display no overt signs of disease.

Molecular systems biology. Molecular systems biology uses mathematical and computational methods to understand the molecular basis of information processing in the brain. For example, multi-scalar analysis of genomic variation data and quantitative phenotype data make it possible to map patterns of gene and protein expression to specific neuronal and synapse types. Massive, well-structured molecular data for key brain cell and synapse types make it possible to build rich quantitative models of higher order components – synapses, cells, neuronal ensembles and brain areas – and to link these models to precisely matched anatomical, functional, and behavioural data sets, a precondition for predictive modelling.

Cataloguing cell types using transcriptomic data. Large-scale mapping of gene expression patterns in the mouse brain [11, 12] has confirmed that morphologically distinct cells express different combinations of the same genes. The Allen Institute is now conducting similar studies on human brain material [13]. Combined with data from single cell transcriptomics – not yet available but on the horizon – this data will make it possible to predict the cell types present in different regions of the brain. In principle, the data could also enable prediction of the proteins present in different types of cells.
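As a minimal illustration of how transcriptomic signatures could support cell-type prediction, the toy sketch below assigns a single-cell expression vector to the reference profile it most resembles. The cell types, marker values and the cosine-similarity rule are illustrative assumptions, not HBP specifications:

```python
import numpy as np

# Hypothetical reference expression signatures (rows: cell types, columns:
# marker genes). Type and gene choices are illustrative only.
reference = {
    "pyramidal":   np.array([9.0, 0.5, 1.0]),
    "interneuron": np.array([0.5, 8.0, 2.0]),
    "astrocyte":   np.array([0.2, 0.3, 7.5]),
}

def predict_cell_type(expression, reference):
    """Assign a cell to the reference type with the most similar expression
    profile (highest cosine similarity)."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(reference, key=lambda t: cosine(expression, reference[t]))

# A noisy single-cell measurement resembling the interneuron signature.
cell = np.array([0.8, 7.2, 1.5])
assert predict_cell_type(cell, reference) == "interneuron"
```

In practice, single-cell transcriptomes have tens of thousands of dimensions and cluster structure is learned from the data rather than fixed in advance; the sketch only shows the matching step.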

Cataloguing synapse types using proteomic data. Proteomics studies of human synapses have demonstrated that human synapses contain over a thousand different proteins [14]. The protein composition of synapses differs between different brain regions, different neuronal types and even along the same dendrite, and certain patterns of synaptic protein are typical of specific cell types and brain regions [15]. Array Tomography, a new technique, makes it possible to analyse between ten and twenty synaptic proteins, mapping synapse diversity at the single synapse level [16]. Recently developed optogenetic methods for labelling synaptic proteins allow rapid, highly efficient mapping of individual synapse types, characterisation of the synapses present in different regions of the brain and identification of their role in neuronal information processing.

Living human neurons from stem cells. It is now possible to study living human neurons derived from human induced Pluripotent Stem Cells (iPSCs) [17]. The combination of iPSCs with developmental neurobiology has made it possible to model human cortical function in a dish [18]. In particular, the zinc finger nuclease technique provides a tool to generate human neurons carrying specific disease mutations [19].

Imaging. Structural and functional imaging of the living human brain provides a valuable supplement to high-resolution data from anatomical post mortem studies [14]. Maps of the density of the main types of neurons in post mortem brains can link functional imaging data to underlying brain anatomy [15]. Although results still need to be validated, recent in vivo imaging techniques, particularly diffusion and resting state imaging, have made it possible to map large-scale patterns of structural connectivity [16-18]. Polarised Light Imaging (PLI), detecting the myelin surrounding axons, makes it possible to link DTI data to the microscopic level and to verify data from in vivo experiments [19]. Intra- and subcortical connection profiles for individual areas, obtained in this way, are likely to provide new insights into the structure and function of the brain. For the human brain, PLI is also one of the few methods that can bridge the gap between macroscopic organisation and more detailed knowledge about long and short fibre tracts. Given that most current information on human brain connectivity is extrapolated from animal and developmental studies, this is a crucial step.

Post mortem studies provide useful information about the distribution of different types of transmitter receptor in different regions of the brain [20]. Receptors play a key role in neurotransmission and are highly relevant for understanding neurological and psychiatric diseases and the effects of drugs. So far, however, most of this work has been based on static interaction representations that do not capture the full molecular dynamics of the nervous system. This will require Molecular Dynamics models that exploit HBP high performance computing capabilities.

Today, evidence suggests that many neurological and psychiatric diseases (e.g., epilepsy, schizophrenia, major depression) depend on the equilibrium among multiple receptors. Modelling and simulation provide an essential tool for understanding these complex mechanisms.
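The kind of receptor-level modelling envisaged here rests on elementary binding kinetics. The sketch below integrates a single ligand-receptor binding equation and checks it against the analytical equilibrium; all rate constants are illustrative, not drawn from the HBP work plan:

```python
def simulate_binding(L, R_total, k_on, k_off, dt=1e-4, steps=200_000):
    """Euler integration of dB/dt = k_on * L * (R_total - B) - k_off * B,
    where B is the concentration of bound receptor (arbitrary units)."""
    B = 0.0
    for _ in range(steps):
        B += dt * (k_on * L * (R_total - B) - k_off * B)
    return B

L, R_total, k_on, k_off = 2.0, 1.0, 1.0, 1.0   # illustrative constants
B = simulate_binding(L, R_total, k_on, k_off)

# Analytical equilibrium: B = R_total * L / (L + Kd), with Kd = k_off / k_on.
Kd = k_off / k_on
assert abs(B - R_total * L / (L + Kd)) < 1e-3
```

Models of receptor equilibria in disease couple many such equations, one per receptor and ligand species, which is what motivates the simulation infrastructure discussed above.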

Brain models require precise data on the cellular organisation of different brain areas (e.g. cortical layers and columns) and their connectivity. Recent studies have combined post mortem studies of laminar cell distributions with in vivo diffusion techniques to measure the distribution of cell and fibre diameters, opening the road to in vivo studies of human cytoarchitecture and connectivity.

Methodology

The single cell transcriptome. The HBP will measure the single cell transcriptome of specific types of clinically derived brain cells and of cells derived from human iPSCs. It will then compare the data with data from mouse studies. Combined with modelling and gene expression maps, this data will make it possible to predict aspects of brain structure that cannot be measured experimentally.

The proteome. The HBP will measure the proteins expressed in human neurons, glial cells and synapses, and compare the results against data from mice.

Distribution of receptors. This work will map the distribution of receptors for different neurotransmitters in different brain regions, making it possible, in principle, to model the role of neurotransmission and neuromodulation in attention, learning, emotion, reward and sleep. Knowledge of receptor distributions will also make it possible to model some of the effects of drugs and of neurotoxins.

Neuron morphologies. This study will characterise the morphologies of different types of neuron present in different regions of the human brain. Combined with modelling, the results will allow the project to predict a large proportion of the short-range connectivity between neurons, without measuring the connectivity experimentally [1].
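A simple version of such morphology-based connectivity prediction counts locations where an axon passes close to a dendrite (potential synapses). The sketch below does this for two toy morphologies; the point sampling, coordinates and touch radius are illustrative assumptions only:

```python
import numpy as np

def potential_synapses(axon_pts, dendrite_pts, touch_radius=2.0):
    """Count axon-dendrite point pairs closer than touch_radius (um) -- a
    crude stand-in for touch detection between reconstructed morphologies."""
    d = np.linalg.norm(axon_pts[:, None, :] - dendrite_pts[None, :, :], axis=-1)
    return int((d < touch_radius).sum())

# Two toy neurites sampled as 3D points (coordinates in um, illustrative):
# a vertical axon at x = 0 and a parallel dendrite at x = 1, partly overlapping.
axon = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 10.0, 11)])
dendrite = np.array([[1.0, 0.0, z] for z in np.linspace(5.0, 15.0, 11)])

n = potential_synapses(axon, dendrite)
assert n > 0   # the overlapping stretch yields candidate contacts
```

Statistical connectivity prediction from morphologies applies this kind of proximity test across whole populations of reconstructed or synthesised cells, rather than to two hand-made point sets.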

Neuronal architecture. Neuronal architecture differs between brain regions with respect to the density, size, and laminar distribution of cells, and the presence of cell clusters. Significant differences have been observed in primary vs. secondary, visual vs. auditory, sensory vs. motor, and phylogenetically old vs. younger areas. This work will map the architectures of different layers and areas of the brain, providing constraints for simulation and modelling by introducing area-specific information at the level of large cognitive systems and behaviour.

Human brain connectomics. The HBP will use Diffusion Tensor Imaging (DTI) and Polarised Light Imaging (PLI) to derive patterns of connectivity between brain regions and to identify fibre tracts connecting layers and cells within brain regions. This data is essential for modelling the large-scale structural architecture of the brain.
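For orientation, DTI fibre directions are conventionally estimated from the principal eigenvector of a 3×3 diffusion tensor, with fractional anisotropy (FA) quantifying how directional diffusion is. A minimal sketch with an illustrative tensor (values are not measured data):

```python
import numpy as np

def principal_direction(D):
    """Principal diffusion direction: eigenvector of the symmetric diffusion
    tensor D with the largest eigenvalue."""
    evals, evecs = np.linalg.eigh(D)   # eigh returns ascending eigenvalues
    return evecs[:, -1]

def fractional_anisotropy(D):
    """FA: standard scalar index of how directional diffusion is (0..1)."""
    l = np.linalg.eigvalsh(D)
    m = l.mean()
    return float(np.sqrt(1.5 * ((l - m) ** 2).sum() / (l ** 2).sum()))

# Toy tensor for a fibre aligned with the x axis (units arbitrary).
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
v = principal_direction(D)
assert abs(abs(v[0]) - 1.0) < 1e-9        # orientation along x, up to sign
assert 0.0 < fractional_anisotropy(D) < 1.0
```

Tractography chains such per-voxel directions into streamlines; PLI provides an independent, microscopically grounded estimate of the same orientations against which DTI results can be validated.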

Mapping of the developing, adult and aging brain. Structural and functional MRI will make it possible to map inter-individual differences in the adult human brain, and to identify structural changes characteristic of different stages of development and aging. Such information is necessary, among other reasons, to understand and model the formation of fibre tracts, the development of human cognition and the transition to disease.

Roadmap and key milestones

M30: High throughput screening of the human brain phase 1

Methods. Method for alignment of DTI and PLI data established; informatics methods and tools refined and validated.

High throughput screening. First set of high-resolution anatomical, diffusion, and functional MRI images from the ten subjects selected for massive mapping; first quantitative description of the major fibre tracts connecting human brain regions; quantification of T1 and T2 relaxation times in each major tract of the atlas.

M60: High throughput screening of the human brain phase 2

Methods. Optimised work flow for depositing data in the human brain atlas.

High throughput screening. Initial generic neuronal, glial and synaptic proteomes; transcriptomes of human neurons using iPSCs; high throughput mapping of selected receptors (whole brain); neuronal and glial morphologies in major brain regions; initial human neuro-vascular-glial system; EM block analysis of selected major brain regions providing statistical data on synaptic connectivity and neuronal ultrastructure; fMRI and DTI mapping of developing, adult and aging brains using standardised stimulus protocols to determine macrostructure connectivity, and patterns of activation.

Principles. Initial activity-dependent gene expression rules for environment-sensitive brain models using iPSCs; comparisons of mouse and human transcriptomes, proteomes, cellular morphologies, cell counts; comparison of presence and dimensions of brain regions.


M90: High throughput screening of the human brain phase 3

High throughput screening. Refinement and continuation of screening from phase 2.

Principles. Activity- and environment-sensitive gene expression rules, transformations from mouse to human data.

M120: High throughput screening of the human brain phase 4

High throughput screening. Refinement and continuation of screening from phase 3.

Principles. Full spectrum data for predictive reverse engineering algorithms.

1.2.1.3 SP3: Cognitive architectures

Operational objectives

The goal of this subproject will be to use well-defined cognitive tasks, already partially studied by cognitive neuroscience, to dissect associated patterns of brain activation and response dynamics, and to extract principles of cognitive architecture that can be ultimately transferred into neuronal models. Studies will span scales ranging from global networks to local cortical maps and, where possible, sets of individual neurons. The results will allow the project to develop high-level models of the cognitive architectures implicated in particular competencies. Combined with behavioural performance data, they will provide benchmarks for the validation of the detailed brain models produced by the Brain Simulation Platform and guide the development of simplified models for use in neuromorphic devices and neurorobotic systems.

State of the art

Functional specialisation. While most early knowledge of the relationship between brain structure and cognitive function came from post mortem neuro-anatomy, the recent neuro-imaging revolution has greatly refined our understanding of cortical and subcortical functional specialisation [21]. Thanks to these techniques, we now have relatively precise information about the areas of the human brain responsible for processing particular categories of visual information (e.g. information on faces, body parts, words), so-called core knowledge systems (systems handling information about space, time or number), language processing, and representing other people’s minds (theory of mind).

Neural codes. The localisation of the areas responsible for specific functions is a means, not an end. Recent studies have thus attempted to characterise areas and regions in functional terms, studying how activation varies with stimuli and tasks, and attempting to understand internal coding principles. Although the neural basis of fMRI is not yet fully understood, high-resolution fMRI, repetition suppression and multivariate analyses of activation patterns form an essential toolkit that, in the best cases, allows precise inferences about the underlying neuronal codes [22, 23]. Although these codes vary across brain areas and cognitive domains, the hierarchical Bayesian perspective emerges as a cross-domain unifying principle: neuronal populations act as statistical predictive-coding devices that represent priors, sensory evidence, and posterior probabilities and use them to infer and anticipate external events.
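In its simplest discrete form, the predictive-coding computation attributed here to neuronal populations reduces to Bayes' rule: posterior proportional to prior times likelihood. A toy sketch with illustrative numbers (the hypotheses and probabilities are invented for the example):

```python
import numpy as np

def bayes_update(prior, likelihood):
    """Posterior proportional to prior * likelihood, normalised over hypotheses."""
    post = prior * likelihood
    return post / post.sum()

# Two hypotheses about a stimulus (e.g. "face" vs "house"), illustrative numbers.
prior = np.array([0.5, 0.5])          # top-down expectation
likelihood = np.array([0.8, 0.2])     # bottom-up sensory evidence

posterior = bayes_update(prior, likelihood)
assert abs(posterior.sum() - 1.0) < 1e-12
assert posterior[0] > posterior[1]    # evidence favours the first hypothesis

# Feeding the posterior back as the next prior sharpens the estimate as
# evidence accumulates -- the anticipatory aspect of predictive coding.
posterior2 = bayes_update(posterior, likelihood)
assert posterior2[0] > posterior[0]
```

Hierarchical versions stack such updates across cortical levels, with each level supplying priors to the level below; the sketch shows only the elementary step.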

Spontaneous activity. Further insights come from studies of the way functional activity changes over time, including “resting state” studies, in which brain activity fluctuates “spontaneously” [24]. While some scientists see these fluctuations as nothing more than a consequence of neural noise in a non-random structural network, others interpret them as memory traces in a dynamic internal model of the environment [25]. What is certain is that continuous but irregular spontaneous activity is a key characteristic of the brain that distinguishes it from engineered information processing systems. Understanding resting states and their dynamics could provide a strategy for systematically parsing functional brain areas and circuits in the living human brain.
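Resting-state analyses of this kind typically start from a functional-connectivity matrix: pairwise correlations between regional time series. A minimal sketch on synthetic signals (all signal shapes and noise levels are illustrative):

```python
import numpy as np

def functional_connectivity(ts):
    """Pairwise Pearson correlations between regional time series
    (rows: regions, columns: time points)."""
    return np.corrcoef(ts)

# Synthetic "spontaneous" signals: regions 0 and 1 share a slow fluctuation,
# region 2 fluctuates independently (all values illustrative).
rng = np.random.default_rng(0)
t = np.arange(200)
shared = np.sin(t / 10.0)
ts = np.vstack([
    shared + 0.1 * rng.standard_normal(t.size),
    shared + 0.1 * rng.standard_normal(t.size),
    rng.standard_normal(t.size),
])

fc = functional_connectivity(ts)
assert fc[0, 1] > 0.9          # coupled regions correlate strongly
assert abs(fc[0, 2]) < 0.5     # the independent region does not
```

Clustering or graph analysis of such matrices is one route to the systematic parsing of functional brain areas mentioned above.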

Neurophysiological dynamics. Timing information from fMRI has made it possible to parse the dynamics of language and executive networks at ~200 millisecond resolution [26, 27]. Electrophysiological recordings, using non-invasive MEG and EEG in healthy human subjects, invasive intracranial grids and single-electrode recordings in epilepsy patients, and grids and multi-electrodes in non-human primates, provide an even higher level of spatio-temporal detail. This work has made it possible to characterise the neural codes for high-level vision and decision-making, and to identify specialised neurons in the human neocortex that respond selectively to particular objects, faces, places, and people or that encode specific action goals independently of motor plans. Similar techniques have thrown new light on attention. A prominent proposal suggests that attentional filtering is implemented by selective synchronisation among neurons representing behaviourally relevant information [28]. Recordings of local field potentials (LFP) and recordings from multiple neurons with MEAs have made it possible to test this proposal [29].

High-level cognitive functions. The recent literature includes descriptions of networks for language comprehension, reading, and mathematics, and their development from infancy to adulthood. Other studies have focused on the way humans and other primates form strategies, detect errors and switch between tasks. This work has shown how networks crossing the prefrontal and parietal regions implement a “central executive” system, also called the “global neuronal workspace” or “multiple-demand” system [30].

Capabilities unique to the human brain. Comparative studies often pose the question of which cognitive abilities, if any, are unique to humans [31, 32]. Recent work shows that, at the sensory-motor level, humans and other primates are highly similar in many respects [33, 34]. However, humans appear to be distinguished by their recursive combinatorial ability, the capacity to bind words or other mental objects into hierarchical nested or constituent structures, as seen in sentence formation, music and mathematics. Monkeys can perform elementary arithmetic operations similarly to humans, and even acquire symbols for digits [35], but apparently only humans can “chain” multiple operations into nested algorithms [32]. Recent studies have identified neuronal networks associated with these capabilities [see for example 36, 37]. Finally, the human brain may have a unique ability to represent an individual’s own mind (second order or “meta” cognition) and the thoughts of others (“theory of mind”). fMRI studies have identified a reproducible social brain network, active during theory of mind, but also during self-oriented reflections and the resting state. Interestingly, this network is modified in autism [38, 39].

Methodology

The HBP will combine human data from fMRI, DTI, EEG and other non-invasive techniques to provide a comprehensive spatial and temporal description of neuronal circuits implicated in specific well-characterised cognitive tasks. The project will focus on the following functions.

Perception-action. Invariant visual recognition; mapping of perceptions to actions; representation of action meaning; multisensory perception of the body and the sense of self.

Multimodal sensory-motor integration. Integration of data from vision, audition, body representations and motor output.

Motivation, decision and reward. Decision-making; estimating confidence in decision and error correction; motivation, emotions and reward; goal-oriented behaviour.

Learning and memory. Memory for skills and habits (procedural memory); memory for facts and events (episodic memory); working memory.

Core knowledge of space, time and numbers. Fundamental circuits for spatial navigation and spatial memory; estimation and storage of duration, size and numbers of objects.

Capabilities characteristic of the human brain. Processing nested structures in language and in other domains (music, mathematics, action); generating and manipulating symbols; creating and processing representations of the self in relation to others.

Architectures supporting conscious processing. Brain networks enabling the extraction and maintenance of relevant information; representations of self-related information, including body states, decision confidence, and autobiographical knowledge.

In each case, the HBP will develop highly structured, easily reproducible experimental paradigms (localiser tasks), applicable initially in human adults, but ultimately also in infants. The study will use a variety of techniques including high-field MRI for the creation of high-resolution activity maps, M/EEG data for the reconstruction of neural dynamics, as well as intracranial electro-corticogram (ECOG) and single-cell recordings in epilepsy patients.

To provide a comprehensive view of the circuitry implementing these functions in individual brains, the project will recruit a small group of subjects for repetitive scanning (10-20 scanning sessions over three months, repeated every year). MRI scans at 3, 7 and ultimately 11.7 Tesla will make it possible to characterise the geometrical relationships between the areas involved in each individual subject and their relationship to anatomy and connectivity. Results will be deposited in the HBP Brain Atlas and Brainpedia.

Work will begin during the ramp-up phase. First, the NeuroSpin centre will set up an environment that makes it possible to scan about ten individual human brains, year after year, for the duration of the HBP. To do this, the centre will address ethical and organisational issues, define recruitment criteria and disclosure policies, scan a primary pool of ~fifty volunteers, selecting ten, and run a first set of high-resolution anatomical, diffusion, and functional MRI protocols with these subjects. Meanwhile, a coordinated network of HBP teams will launch research into a subset of specific cognitive tasks. Work during the ramp-up phase will be dedicated to the definition of experimental protocols allowing accurate parsing of the relevant cortical and subcortical brain networks in individual subjects. The main results will thus be experimental protocols, analysis tools and pilot charts of the relevant brain systems. In two cases (perception of actions and spatial navigation), where the generation of neuroimaging and neurophysiological data is already advanced, the HBP will propose working simulation models of the neuronal circuitry involved.

Following this initial work, demonstrating the feasibility of the approach, the team will begin studies of the full range of cognitive functions covered by the project work plan, using fMRI, MEG, EEG, and intracranial recordings to investigate principles of circuit-level organisation and neuronal coding. The team will then go on to develop simple experimental paradigms, which will first be validated within the relevant HBP team, then transferred to the NeuroSpin facility where they will be run on the reference set of ten human subjects.

To investigate the neurophysiological mechanisms underlying these functions, the team will combine systematic recordings of local-field potentials with single-neuron recordings in human subjects (subjects undergoing clinical investigations of epilepsy). Parallel investigations in non-human animal models, using multi-scale electrophysiological recording and brain-imaging, will make it possible to identify generic principles of cortical organisation that are common to human and non-human brains. For instance we hypothesise that in many species, the layered structure of cortex implements a Bayesian statistical computation for perception, multi-modal sensory integration, spatial navigation, and abstract extraction of “meaning” features. In the second phase of the project, the team will study these functions in a broad range of species, where the layer structure of cortex is simpler. We expect that these investigations will lead to the identification of computational primitives that are conserved in evolution and of others that are highly specific to the human brain (e.g. the structures underlying language, the use of symbols and social representations).

Finally, these neuro-cognitive observations will be synthesised into high-level neuronal models, taking their spatial layout and connectivity from actual anatomical observations, and integrating additional principles of cognitive coding specific to each area, and cognitive computations permitted by intra- and inter-areal computations between subpopulations of neurons. This modelling research will be performed in close collaboration with the Theory and the Brain Simulation subprojects.

Roadmap and key milestones

M30: Cognitive Architectures 1

Localisers. Standardised localisers for perception-action, space, time, numbers, motivation, decision and reward, multimodal perception, learning and memory, and uniquely human capabilities

Cognitive circuits. Cognitive architectures v1 – based on results from localiser experiments

Model preparation. Strategies for neuromorphic implementation of models

Data Management. Strategy for management of data and registration in Human Brain Atlas


M60: Cognitive Architectures 2

Experiments. Experiments to characterise cognitive architectures for perception-action, multimodal perception, space, time and numbers

Cognitive circuits. Cognitive architectures v2 - based on results from localiser experiments

Model construction. Targeted experiments enabling the construction of models of perception-action, multimodal perception, space, time and numbers

Data management. Registration of data in the Human Brain Atlas

Principles of cognition. Principles of perception-action, multimodal perception, space, time and numbers

M90: Cognitive Architectures 3

Experiments. Experiments to characterise cognitive architectures for motivation, decision and reward, learning and memory; first results from longitudinal study of human subjects

Cognitive circuits. Cognitive architectures v2 - based on results from localiser experiments

Model construction. Targeted experiments enabling the construction of models of motivation, decision and reward, learning and memory

Data management. Registration of data in the Human Brain Atlas

Principles of cognition. Principles of motivation, decision and reward, learning and memory

M120: Cognitive architectures 4

Stimulus application. Experiments to characterise cognitive architectures for uniquely human capabilities (symbolic representation, recursive structures and language); first results from the longitudinal study of a consistent group of human subjects

Cognitive circuits. Cognitive architectures v3 - based on results from localiser experiments

Model construction. First models of language

Data management. Registration of data in the Human Brain Atlas.

Principles of cognition. Uniquely human capabilities (symbolic representation, recursive structures, and language)

1.2.1.4 SP4: Mathematical and theoretical foundations of brain research

Operational objectives

Theoretical insights from mathematics can make a valuable contribution to many different areas of HBP research, from modelling of low-level biological processes, to the analysis of large-scale patterns of brain activity and the formalisation of new paradigms of computation. The HBP will thus include a cohesive programme of theoretical research addressing strategically selected themes essential to the goals of the project:

mathematical techniques to produce simplified models of complex brain structures and dynamics; rules linking learning and memory to synaptic plasticity; large-scale models creating a bridge between “high-level” behavioural and imaging data and detailed multi-level models; and mathematical descriptions of neural computation at different levels of brain organisation. In addition to the plannable, cohesive theory programme implemented by the HBP partners, brain theory also needs to explore unconventional ideas, which are most likely to come from outside the current HBP Consortium. To foster interaction with outside scientists, the HBP will therefore establish a European Institute for Theoretical Neuroscience, which will provide a home for HBP postdocs to work, as well as attract theoreticians not presently involved in the project, and act as an incubator for approaches that challenge traditional wisdom. This work will be funded under the HBP Competitive Calls Programme.

State of the art

Understood as mathematical modelling, theoretical neuroscience has a history of at least a hundred years. In general, theoreticians have focused on models addressing specific levels of brain organisation, for instance, the relation of Hebbian learning to cortical development [40], the recall of associative memories [41], the link between temporal codes and Spike Timing-Dependent Plasticity [42] and the dynamics of neuronal networks with balanced excitation and inhibition [43, 44]. In most cases, the output has consisted of “toy models” amenable to mathematical analysis and to simulation on small personal computers. What is not clear is how to connect the insights from these models, or how to ground them in detailed biophysical observations.
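As a concrete example of the "toy models" referred to above, the pair-based Spike Timing-Dependent Plasticity rule maps the spike-time difference between a pre- and a postsynaptic neuron to a synaptic weight change. The sketch below uses illustrative parameter values, not values from any HBP model:

```python
import math

def stdp(dt, A_plus=0.01, A_minus=0.012, tau=20.0):
    """Pair-based STDP: weight change as a function of the spike-time
    difference dt = t_post - t_pre (ms). Parameter values are illustrative."""
    if dt > 0:     # pre fires before post: potentiation
        return A_plus * math.exp(-dt / tau)
    if dt < 0:     # post fires before pre: depression
        return -A_minus * math.exp(dt / tau)
    return 0.0

assert stdp(10.0) > 0                       # causal pairing strengthens
assert stdp(-10.0) < 0                      # anti-causal pairing weakens
assert abs(stdp(20.0)) < abs(stdp(5.0))     # effect decays with |dt|
```

Grounding such phenomenological rules in biophysical synapse models, and connecting them to network-level function, is exactly the open problem this subproject addresses.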


These are key themes in the work of the theoretical neuroscientists who have contributed to the preparation of the HBP proposal. For example, W. Gerstner has shown how to extract parameters for simple neuron models directly from experimental data, and from detailed biophysical models [45, 46]. M. Tsodyks, W. Gerstner, N. Brunel, A. Destexhe, and W. Senn have produced models of synaptic plasticity suitable for integration in models of large-scale neuronal circuitry [47-50]; W. Gerstner, D. Wierstra, and W. Maass have explored models in which plasticity is modulated by a reward signal [11, 12, 51], a basic requirement for so-called reinforcement learning. N. Brunel has produced models of population dynamics using networks of randomly connected simple neurons [44], an approach exploited by G. Deco to construct models of decision-making [52]. A. Destexhe [53, 54] has investigated the integrative properties of neurons and networks, while W. Maass has studied their underlying computational principles [10, 13].

Methodology

Theoretical work in the HBP will address a broad range of issues, all related to the goal of achieving a multi-level understanding of the brain.

1. Bridging scales. Studies will establish mathematical principles making it possible to derive simplified models of neurons and neuronal circuits from more detailed biophysical and morphological models, population models and mean field models from simplified neuron models, and brain region models from models of interacting neuronal populations. Other studies will model brain signals at various scales from intracellular signals to local field potentials, VSD, EEG and MEG. The results, validated by comparison with results from the subproject on Brain Function and Cognitive Architectures, will provide basic insights into the relationships between different levels of brain organisation, helping to choose parameter values for large-scale modelling, and guiding the simplification of models for implementation in neuromorphic technology.

2. Synaptic plasticity, learning and memory. This work will develop learning rules for unsupervised and goal-oriented learning. Key themes will include the derivation of learning rules from biophysical synapse models, the identification of rules for unsupervised learning and emergent connectivity, rules describing the role of neuromodulation in learning (the role of reward, surprise and novelty), and the functional and medical consequences of disturbances in plasticity on different time scales. Theoretical results will be validated against experimental results from the subproject on Brain Function and Cognitive Architectures.

3. Large-scale brain models. The HBP will develop simplified large-scale models of specific cognitive functions. These models will provide a bridge between “high-level” behavioural and imaging data and detailed multi-level models of brain physiology. Topics for modelling will include perception-action, multi-sensory perception, working memory, spatial navigation, reward systems, decision-making and the sleep/wakefulness cycle. These models will make a direct contribution to the design of architectures for neuromorphic computing systems.

4. Principles of brain computation. Studies in this area will develop mathematical descriptions of neural computation at the single neuron, neural microcircuit and higher levels of brain organisation. The results will provide basic insights into the multi-level organisation of the brain, while simultaneously contributing to the high-level design of neuromorphic systems.

5. To encourage collaboration among theoreticians engaged in different areas of theoretical neuroscience, we propose that the HBP create a European Institute for Theoretical Neuroscience (EITN) based in the Paris area. The Institute will run active young researcher, young investigator and visiting scientists programmes and will serve as an attractive meeting point for workshops on topics related to HBP goals.
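The scale-bridging programme in point 1 can be illustrated with a toy reduction: a single-population mean-field rate equation of the kind derived from networks of simplified neurons. The sketch below is a minimal illustration with invented parameters, not an HBP model:

```python
import math

def phi(x):
    """Sigmoidal transfer function mapping net input to population firing rate."""
    return 1.0 / (1.0 + math.exp(-x))

def simulate_mean_field(w=2.0, i_ext=0.5, tau=10.0, dt=0.1, steps=2000):
    """Integrate tau * dr/dt = -r + phi(w*r + i_ext) to its steady state.

    A single self-coupled population: r is the mean rate, w the recurrent
    coupling, i_ext the external drive (all units arbitrary).
    """
    r = 0.0
    for _ in range(steps):
        r += (dt / tau) * (-r + phi(w * r + i_ext))
    return r

rate = simulate_mean_field()  # converges to the fixed point r = phi(w*r + i_ext)
```

Relaxation to the fixed point r = phi(w·r + I) is the simplest instance of the population-level descriptions that the bridging studies aim to derive systematically from spiking models.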
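Similarly, the learning rules discussed in point 2 are often expressed as pair-based spike-timing-dependent plasticity (STDP). The function below sketches the standard phenomenological form; the amplitudes and time constants are illustrative placeholders, not values derived in the HBP:

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair, dt_ms = t_post - t_pre.

    Pre-before-post (dt > 0) potentiates; post-before-pre (dt < 0) depresses,
    each decaying exponentially with the pair's timing difference.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    if dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_minus)
    return 0.0
```

Deriving rules of this shape from biophysical synapse models, and extending them with neuromodulatory factors such as reward, is exactly the theoretical task described above.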

Roadmap and key milestones

M30: Theoretical Foundations 1

Bridging scales. From morphologically and biophysically detailed neurons to point neurons; first draft principles, algorithms and models for representing neural signals deposited in Human Brain Atlas

Learning and memory. Impact of learning rules on neural circuitry; first draft principles, algorithms and models for representing neural signals deposited in Human Brain Atlas


Large-scale models. Theoretical approach to building large-scale models using data from cognition; first draft principles, algorithms and models of perception-action, working memory, and attention, wakefulness, and sleep deposited in Human Brain Atlas.

Principles of brain computation. Stochastic computing in neurons and circuits; methods to implement stochastic computing in neuromorphic computing systems

EITN: Institute set up and in operation. Visiting scientists programme in operation. First two series of workshops completed.

M60: Theoretical Foundations 2

Bridging scales. From molecular detailed neurons to point neurons; from point neurons to mean field equations; from mean field equations to conceptual models; model for representing neural signals at different scales

Learning and memory. Detailed understanding of learning rules under the influence of neuromodulators, in particular motivation and reward, usable for large-scale models

Large-scale models. Models of perception-action, multimodal perception, representation of space, time and numbers in brain theories and neuronal models (see also milestones p. 28)

Principles of brain computation. Stochastic and liquid computing – computational paradigms, which can be translated into neuromorphic implementations

M90: Theoretical Foundations 3

Bridging scales. From molecular and biophysically detailed models of synaptic transmission and plasticity to phenomenological models and learning algorithms

Learning and memory. Multi-level synaptic and circuit learning rules related to goal-oriented learning and memory; theoretical predictions of functional consequences of plasticity for selected human brain diseases

Large-scale models. Models with motivation and reward for learning, decision making and memory (see also milestones p. 28); memory consolidation during sleep

Principles of brain computation. Stochastic computing with self-referential learning in recursive systems and their neuromorphic implementation; principles of brain computation and information routing at the neuronal, circuit and brain areas level

M120: Theoretical Foundations 4

Bridging scales. A comprehensive theory relating neuronal information and signals at different scales, from neurons to brain areas

Learning and memory. A comprehensive theory of learning from synapses to memory including principles of autonomously learning systems

Large-scale models. A comprehensive theory of the interaction between different brain states, from wakefulness to sleep and systems models for (i) spatial navigation (ii) working memory (iii) goal- oriented behaviour

Principles of brain computations. A comprehensive theory of brain computation from the single neuron to the whole brain level, including symbolic structures and recursive representations (see also milestones p. 28)

1.2.2 ICT platforms

The HBP’s first strategic goal will be to develop and operate an integrated system of six ICT platforms, dedicated to Neuroinformatics, Brain Simulation, Medical Informatics, High Performance Computing, Neuromorphic Computing, and Neurorobotics, respectively. Each of these platforms is associated with a distinct subproject in the HBP work plan. Each platform will provide services to the other platforms. To cite just one example, the Neurorobotics Platform will use brain models developed by the Brain Simulation Platform, high performance computing capabilities provided by the High Performance Computing Platform and neuromorphic computing systems provided by the Neuromorphic Computing Platform.

The HBP will use the platforms to support its own research and to provide high quality, professionally managed services for the scientific community. These will take the form of research projects solicited through a programme of Competitive Calls evaluated by independent peer review. The ultimate goal is to provide a new ICT foundation for future neuroscience, future medicine and future computing.


1.2.2.1 SP5: Neuroinformatics Platform

Operational Objectives

One of the HBP’s most important objectives is to make it easier for neuroscientists to organise and access the massive volumes of heterogeneous data, knowledge and tools produced by the international neuroscience community - a goal it shares with the INCF [14] and with other on-going projects – in particular the Allen Institute’s Brain Atlas projects (www.brain-map.org). The Neuroinformatics Platform will contribute to these efforts, offering new tools for the analysis and interpretation of large volumes of structural and functional data and for the construction of multi-level brain atlases. The HBP will use these tools to develop detailed multi-level atlases of the rodent and human brains, bringing together data from the literature and from on-going research, and providing a single source of annotated, high quality data for the HBP modelling effort and for the international neuroscience community.

Another key feature of the platform will be support for Predictive Neuroinformatics: the mining of large volumes of data and analysis of activity data to identify patterns and relationships between data from different levels of biological organisation, making it possible to predict parameters where experimental data is not yet available and to test and calibrate model implementations. Systematic application of this strategy has the potential to drastically increase the amount of information that can be extracted from experimental data, rapidly filling gaps in our current knowledge and accelerating the generation of data required for brain modelling.
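As a cartoon of this predictive strategy — inferring an unmeasured parameter from a measured correlate — here is an ordinary least-squares fit on invented data; the variable names (“gene expression”, “channel density”) are purely illustrative and the numbers are synthetic:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic measurements: expression level of a marker gene vs. a channel density.
expression = [1.0, 2.0, 3.0, 4.0]
density = [2.1, 3.9, 6.2, 7.8]

slope, intercept = fit_line(expression, density)
predicted = slope * 5.0 + intercept  # predicted density where no experiment exists
```

Real predictive-neuroinformatics pipelines use far richer statistical and machine-learning models, but the logic — calibrate a cross-level relationship on measured data, then fill gaps — is the same.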

State of the art

Virtually all areas of modern science face the challenge of providing uniform access to large volumes of diverse data. In neuroscience, with its broad range of experimental techniques, and many different kinds of data, the challenge is particularly severe. Nearly a hundred years of research has generated a vast amount of knowledge and data, spread across thousands of journals. The challenge now is to provide uniform access to this data.

The first attempts to achieve this goal date back to 1989, when the Institute of Medicine at the US National Academy of Sciences received funding to examine how to handle the growing volume and diversity of neuroscientific data. The study report, published in 1991 [15], enabled NIMH to create its own Human Brain Project, an effort that lasted until 2004. The work produced many important neuroscience databases. However, it never created a standard interface for accessing the data and provided no specific tools for relating and integrating the data.

Soon after the end of the NIMH project, the OECD Global Science Forum initiated the INCF [16]. Since 2005, the INCF has driven international efforts to develop neuroscience ontologies, brain atlases, model descriptions and data sharing, and has played an important role in coordinating international neuroscience research and setting up standards. Other important initiatives, such as the US-based Neuroscience Information Framework (NIF) [17] and the Biomedical Informatics Research Network (BIRN) [18], are now collaborating closely with the INCF on issues related to infrastructure, brain atlases, ontologies and data sharing. Another important organisation working in this area is the Allen Institute, today a world leader in industrial-scale acquisition of neuroscience data. The Allen Institute has developed mouse and human atlases for gene expression, connectivity, development, sleep and the spinal cord, and has recently invested an additional $300M in in vivo data acquisition and modelling related to the mouse visual system [19]. Data from this work will contribute directly to the HBP.

A second key area of activity for the HBP will be Predictive Neuroinformatics, a relatively new area of research. Examples of work in this area include a recently published algorithm that can synthesise a broad range of dendritic morphologies based on sample morphologies [20], algorithms to generate specific motifs in network connectivity from sampled connectivity patterns [55], and algorithms to predict synaptic strength based on results from electrophysiological studies of connectivity [56]. In related research, a recent study has demonstrated that biophysical models of neurons’ electrophysiological properties can successfully predict ion channel distributions and densities on the cell surface [57]. Combining these predictions with cellular composition data makes it possible to predict protein maps for neural tissue. Finally, predictive neuroinformatics can help to resolve one of the most important challenges for modern neuroscience, namely the classification and categorisation of different types of cortical interneurons [58]. A recent model [59] uses gene expression data to predict type, morphology and layer of origin with over 80% accuracy. The same model reveals rules for the combinatorial expression of ion channel genes [60].
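The gene-expression-to-type prediction cited above can be caricatured in a few lines as nearest-centroid classification. All profiles, gene counts and class labels below are invented for illustration; the published model is far richer:

```python
# Invented binary "expression profiles" for two interneuron classes; the genes,
# values and labels are placeholders, not data from the cited study.
TRAIN = {
    "basket": [[1, 1, 0, 0], [1, 1, 1, 0]],
    "martinotti": [[0, 0, 1, 1], [0, 1, 1, 1]],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

CENTROIDS = {label: centroid(vecs) for label, vecs in TRAIN.items()}

def classify(profile):
    """Assign the class whose centroid is nearest in squared distance."""
    def sqdist(c):
        return sum((a - b) ** 2 for a, b in zip(profile, c))
    return min(CENTROIDS, key=lambda label: sqdist(CENTROIDS[label]))
```

The point of the toy is only the direction of inference: from a molecular profile to a predicted cell type, which can then be checked against morphology and electrophysiology.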


Methodology

The HBP will build and integrate a Neuroinformatics Platform incorporating five distinct sets of components.

Tools for brain atlases. The HBP will create a general-purpose open source software framework, allowing researchers to build and navigate multi-level atlases of the brain of any species. These tools will allow researchers to upload and access multi-level information about any part of the brain. The information contained in the atlases will be distributed across databases in different physical locations. The framework will provide a shared data space, ontologies, data mining tools, standards and a generic “Atlas Builder”, making it possible to build, manage and query such atlases. In addition to this work, the project will also create and manage an HBP “Brainpedia” – a community-driven Wiki that provides an encyclopaedic view of the latest data, models and literature for all levels of brain organisation.

Tools to analyse data on brain structure. Much of the structural data produced by modern neuroscience takes the form of image stacks from light and electron microscopy, MRI, PET etc. Given that many of these techniques produce terabytes of data in a single session, the best way to unlock the information they contain is through automatic image processing. The HBP will develop tools for this purpose, which it will share with the community, via the INCF. These will include software to automate the extraction of cell densities and distributions; the reconstruction of neuron morphologies; the determination of subcellular properties such as synapse and organelle geometry, size and location; and the identification of the long-range fibre tracts underlying connectivity.

Tools to analyse data on brain function. Understanding of brain function depends on data from a wide range of techniques. It is important that simulation results should be comparable against this data. To meet this need, the HBP will develop new tools and techniques to compare data from simulations against data from experiments (single neuron recordings, measurement of local field potentials, EEG, fMRI, MEG etc.). Some of these tools will build on previous work in the BrainScaleS project.
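One simple pattern such comparison tools follow is to reduce both simulated and recorded spike trains to binned firing-rate vectors and compute an error between them. The sketch below uses generic choices of bin size and metric, not a specific HBP or BrainScaleS tool:

```python
def binned_rates(spike_times_ms, t_max_ms=1000.0, bin_ms=100.0):
    """Convert a list of spike times into a vector of per-bin firing rates (Hz)."""
    n_bins = int(t_max_ms / bin_ms)
    counts = [0] * n_bins
    for t in spike_times_ms:
        if 0 <= t < t_max_ms:
            counts[int(t / bin_ms)] += 1
    return [c / (bin_ms / 1000.0) for c in counts]

def rate_error(simulated, recorded, t_max_ms=1000.0, bin_ms=100.0):
    """Mean absolute difference between two binned rate vectors, in Hz."""
    a = binned_rates(simulated, t_max_ms, bin_ms)
    b = binned_rates(recorded, t_max_ms, bin_ms)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)
```

Validating a simulation then amounts to driving such error measures down across many signal types (spikes, LFP, EEG, fMRI), each with its own appropriate reduction and metric.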

Predictive neuroinformatics. The HBP will make a major effort to develop new tools for predictive informatics, using machine learning and statistical modelling techniques to extract rules describing the relationships between data sets for different levels of brain organisation. An important goal will be to predict neuron morphology and electrophysiology from data on gene expression in different types of neuron.

Brain atlases. The HBP will use the tools just described to build multi-level atlases of the mouse and human brains. The design will encourage research groups outside the project to deposit data in the atlases, enabling global collaboration to integrate data across scales in a single atlas for each species.

Roadmap and milestones

M30: Collaborative platform for databasing the brain (Neuroinformatics Platform v1)

Access and user services. Web site; guidebook

Brainpedia and Atlas tools. Common dataspace, ontologies, and standardised workflows; spatial referencing of data; drag and drop interface for depositing data; integration of genomic and anatomical data; advanced query capabilities using techniques developed in the HPC subproject; analysis capabilities for EM segmentation; functional analysis capabilities for spike sorting and LFP analysis

Brainpedia. Brainpedia v1 (initial data on mouse ion channels, neuron types, and microcircuitry)

Atlases. First draft of a navigable 3D mouse atlas. First draft of a navigable human 3D atlas

Predictive reverse engineering. First draft algorithm to identify brain regions targeted by axonal projections

M60: Internet accessible 3D mouse brain atlas and encyclopaedia (Neuroinformatics Platform v2)

Access and user services. Community measures to drive deposition of data (recognition for researchers, agreements with journals, editing rights, free access to analysis and visualisation capabilities); updated guidebook

Brainpedia and atlas tools. Cross-species links between mouse and human atlases; semantic search; spatial and temporal registration tools; automated search and pull for new data and literature; automated author notifications


Brainpedia. Brainpedia v2 (all levels of brain organisation; community driven maintenance and auto-aggregation of global content).

Atlases. Mouse atlas completed (all levels of structural and functional brain organisation); Human atlas v2.

Predictive reverse engineering. Algorithm to predict statistical features of neuron morphology from cell-type gene expression patterns. Algorithm to predict the cell proteome (including ion channels and receptors) from cell-type gene expression patterns. Algorithm to predict the synaptic proteins and physiological synapse type from pair-wise gene expression and predicted cell-type proteomes. Algorithm to predict ion channel and receptor locations using gene sequences, gene expression patterns, cell biological principles of protein localisation, and constraints from systems biology. Enhanced algorithms to predict cellular composition by matching single cell gene expression patterns, genetic cell-type distributions and staining maps. Enhancements to the algorithm for the prediction of long-range axonal projections.

M90: Globally federated data and knowledge on the human brain (Neuroinformatics Platform v3)

Access and user services. Security systems to protect anonymity of human data; updated guidebook

Brainpedia and atlas tools. Enhancement of v2 tools

Brainpedia. Brainpedia v3 (all levels of human brain organisation)

Atlases. Human atlas v3 (major brain areas, major cells, major proteins, major genes)

Predictive reverse engineering. Enhanced versions of algorithms developed in previous phase. Algorithms to predict missing human data using data from mouse and other animals.

M120: Internet accessible 3D human brain atlas and encyclopaedia (Neuroinformatics Platform v4)

Access and user services. Community development of code for brain Atlas and Brainpedia, updated guidebook; public APIs for community driven data analysis

Brainpedia and atlas tools. Enhancements to previous versions. Cross-species analytics.

Brainpedia. Brainpedia for the human brain.

Atlases. Human Brain Atlas

Predictive reverse engineering. Enhanced versions of existing algorithms

1.2.2.2 SP6: Brain Simulation Platform

Operational Objectives

The Brain Simulation Platform will consist of a suite of software tools and workflows that allow researchers to build biologically detailed multi-level models of the brain displaying “emergent” structures and behaviours that cannot be predicted from small data sets. The platform will make it possible to build models at different levels of description (abstract computational models, point neuron models, detailed cellular level models of neuronal circuitry, molecular level models of small areas of the brain, multi-scale models that switch dynamically between different levels of description), allowing experimentalists and theoreticians to choose the level of detail most appropriate to the questions they are asking and to the data and computing power available. The platform will be designed to support continuous refinement and automated validation as more data becomes available, ensuring that models become steadily more accurate and detailed as the project proceeds.

The tools made available through the platform will allow researchers to perform in silico experiments including systematic measurements and manipulations impossible in the lab. Such experiments will contribute to identifying the neuronal architectures underlying specific brain functions, to studies of the mechanisms underlying neurological and psychiatric disease and to the simplification of neuronal circuitry for implementation in neuromorphic technology (see below). The project will use these tools to develop and validate first draft models of different levels of brain organisation, in mice and in humans. The ultimate goal is to develop multi-scale, multi-level models of whole mouse and human brains, in which different brain areas are modelled at levels of detail appropriate to the state of current knowledge and data, the computational power available for simulation and the needs of researchers.

State of the art

Early models of the brain attempted to explain brain functions, such as learning and memory, in terms of the behaviour of neurons and neuron populations, thus giving rise to the fields of Artificial Neural Networks and Machine Learning [61]. In parallel, other researchers developed mechanistic models that explained brain functions in terms of biological processes. In particular, Hodgkin and Huxley’s seminal model of the generation of neuronal Action Potentials [62] and Rall’s application of cable theory to signal propagation in dendrites [63] made it possible to build models of the brain from its basic components. Yet other models cast light on the dynamics of large networks of excitatory and inhibitory neurons. In the 1980s, Roger Traub [64, 65] used an IBM 3090 mainframe computer to simulate 10,000 neurons, each with about twenty compartments. Ten years later, De Schutter and Bower pushed the complexity of multi-compartment neuron models to simulate a cerebellar Purkinje cell [66, 67] with over 1,600 compartments, and Obermayer et al. pioneered the use of parallel computers for large-scale simulations of simplified neurons [68]. Since then, rapid improvements in supercomputer performance have made it possible to simulate ever-larger models. In 2005, for instance, Izhikevich reported a feasibility study simulating a network with 10^11 neurons and 10^15 synapses, numbers comparable to the numbers of neurons and synapses in the human brain. In this model, each neuron was represented by a single compartment, synapses were not explicitly represented and connections had to be recomputed on each simulation step [69]. However, in 2007, Djurfeldt et al. reported a large-scale simulation of a columnar cortex with 10^7 detailed multi-compartment neurons and 10^10 synaptic connections [70]. In the same year, Morrison reported the simulation of a network with 10^9 synapses and spike-timing dependent plasticity (STDP) [71]. In 2009, the Modha group at the IBM Almaden Research Centre reported the simulation of a network with roughly the same numbers of neurons and synapses as the brain of a cat (10^9 neurons and 10^13 synapses) [72, 73].
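The scale figures above translate directly into storage requirements, which is why synapse representation dominates design choices at human scale. A back-of-envelope estimate (the bytes-per-element values are assumptions, chosen only to show the order of magnitude):

```python
# Human-scale counts from the text; per-element byte costs are assumed
# illustrative figures, not HBP design numbers.
NEURONS = 1e11
SYNAPSES = 1e15
BYTES_PER_NEURON = 500    # state variables and parameters (assumed)
BYTES_PER_SYNAPSE = 16    # weight, delay, plasticity state (assumed)

total_bytes = NEURONS * BYTES_PER_NEURON + SYNAPSES * BYTES_PER_SYNAPSE
petabytes = total_bytes / 1e15
# Synapses dominate (tens of petabytes), which is why the 2005 feasibility
# study recomputed connections on the fly instead of storing them.
```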

In parallel with work on very large-scale networks, many groups have developed general-purpose simulators allowing simulation of the brain at different levels of biological detail. Examples of simulators for large networks of relatively simple neurons include Topografica [74], PCSIM [75], MIIND [76], and NEST[77]. NEURON [78] makes it possible to simulate morphologically complex neurons and networks of neurons, and can be integrated with molecular-scale simulations that add biochemical details to its electrical modelling. STEPS [79], MCELL [80] and Brownian Dynamics simulations bridge the gap between NEURON’s compartment electrical model and the molecular-scale processes of diffusion in complex fluid environments and reaction mechanisms such as ligand binding to receptors. Drug binding events and protein-protein interactions are captured using atomistically accurate but computationally demanding molecular dynamics simulations. To date, however, there have been relatively few attempts to integrate models and simulations across multiple levels of biological organisation. This is one of the aims of EPFL’s Blue Brain Project [81], the first attempt to develop a unifying model of the neocortical column of juvenile rat, based on detailed anatomical and electrophysiological data. A key part of the work of the project has been the development of the necessary software and workflows [82, 83]. This work will be further developed in the Human Brain Project.
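To make concrete what “large networks of relatively simple neurons” means in practice, here is a minimal leaky integrate-and-fire network in plain Python. Parameters, connectivity and units are illustrative only; production simulators such as NEST handle this model class at vastly larger scales:

```python
import random

def simulate_lif(n=100, steps=1000, dt=0.1, tau=20.0, v_th=1.0, v_reset=0.0,
                 drive=1.1, w=0.005, p_conn=0.1, seed=42):
    """Leaky integrate-and-fire network: dv/dt = (drive - v)/tau plus synaptic kicks.

    Returns the total number of spikes fired over `steps` time steps of `dt` ms.
    """
    rng = random.Random(seed)
    # Random excitatory connectivity: each neuron projects to ~p_conn of the rest.
    targets = [[j for j in range(n) if j != i and rng.random() < p_conn]
               for i in range(n)]
    v = [rng.uniform(0.0, v_th) for _ in range(n)]
    total_spikes = 0
    for _ in range(steps):
        fired = [i for i in range(n) if v[i] >= v_th]
        total_spikes += len(fired)
        kicks = [0.0] * n
        for i in fired:
            v[i] = v_reset                 # reset membrane after a spike
            for j in targets[i]:
                kicks[j] += w              # excitatory kick to each target
        for i in range(n):
            v[i] += (dt / tau) * (drive - v[i]) + kicks[i]
    return total_spikes

spikes = simulate_lif()
```

With the suprathreshold drive chosen here, every neuron fires repeatedly; swapping in multi-compartment dynamics per neuron is what separates this class of simulator from NEURON-style detailed modelling.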

Methodology

The HBP will develop a suite of software tools, workflows and services allowing researchers from inside and outside the project to collaboratively build and simulate models of the brain, at the level of detail best adapted to the questions they seek to answer. These will be “snap-shot” models, representing the multi-level structure of a brain at a given stage in its development. Initial parameter values will be based on statistical data from experiments and Predictive Neuroinformatics and validated against data from biological experiments. We hypothesise that by “training” such models in closed-loop set-ups, it will be possible to build systems displaying realistic behavioural and cognitive capabilities.

To achieve these goals the Brain Simulation Platform will provide the following functionality.

1. Brain Builder. The Brain Builder will make it possible to build brain models of any species, at any age at any desired level of detail, so long as the necessary data is available. The same software will make it possible to model hypotheses of disease causation (e.g. absence of specific ion channels or receptors, pathological patterns of network connectivity). The Brain Builder will include tools to embed data from brain atlases (see above), a “multi-scale slider” allowing modellers to vary the resolution of their models, as well as tools to set up closed-loop experiments and to deploy simulations to high performance computing platforms.

2. Brain simulation engines. The platform will incorporate a multi-scale simulation framework integrating existing simulation engines for molecular, cellular, and network level simulation. The framework will make it possible to build and simulate models representing the same tissue at different scales, and lay the foundations for studies of the relations between levels.


3. Molecular dynamics simulations. The HBP will use molecular dynamics simulations and coarse-grained techniques to generate molecular level information for the project’s multi-scale models. The project will use this information to improve models of ion channels and receptors, to create coarse-grained models of the dynamics of cell membranes and organelles, to understand protein-protein interactions and to understand the way drugs bind to proteins. The same information will guide the development of coarse-graining strategies for large-scale molecular simulations.

4. Brain models. The platform will incorporate first draft models representing different levels of brain organisation (molecular level models of selected neurons, neuromodulation and synapses, synaptic plasticity and homoeostasis, glia and neuro-vascular coupling, cellular level models of major classes of neurons, and of neural microcircuits in important regions of the brain, cellular level models of whole brain regions and brain systems, mixing point neurons and detailed neuron models). These will lead to models of whole mouse and human brains, exploiting the platform’s multi-scale capabilities.

5. The Brain Simulation Platform. The platform will provide researchers with an Internet-accessible Brain Simulation Cockpit, allowing them to perform in silico experiments investigating the relationships between different levels of biological organisation in the healthy and the diseased brain and preparing the way for the re-implementation of neuronal circuits in neuromorphic hardware (see below). A professionally managed service centre will provide them with the necessary support and training. The project will support a vigorous visitors programme for external scientists wishing to make use of these services in their own research.

Roadmap and key milestones

M30: Collaborative platform for building and simulating brain models (Brain Simulation Platform v1)

Access and user services. Website; guidebook

Brain model builders. Morphology Synthesizer v1, Ultrastructure Synthesizer v1, Electrical Model Fitter v1, Molecularizer v1, Connector v1, Microcircuit builder v1, Brain region builder v1, Vascularizer v1, Whole Brain v1, Synaptic transmission v1, Plasticity engine v1

Brain simulators. Molecular, Cellular and Network Simulators (v1)

Molecular predictors. Molecular dynamics to predict reaction kinetics

Brain models. Initial molecular level models, cellular-level models of mouse neocortical neurons, microcircuits and brain regions; initial draft of simplified (point neuron) whole brain model for mouse

Experiments, analysis and visualisation. Capability to define experiments and send remote analysis and visualisation requests

Simplification. Export2Neuromorphic 1, a tool implementing a method to simplify detailed models for neuromorphic implementation

Links and Interfaces. Links to HPC, NMC and Neurorobotics platforms

M60: Collaborative brain reconstruction and simulation (Brain Simulation Platform v2)

Access and user services. Enhanced access functionality with user notifications of platform, algorithmic and model updates. Updated guidebook

Brain model builders. Adaptation of previous model builders (v2); Whole Brain Builder v2

Brain simulators. Enhancement of v1 simulators (v2) to exploit initial interactive supercomputing capabilities

Molecular predictors. Enhancement of reaction kinetic predictions; Molecular Dynamics simulations of ion channels

Brain models. Generic molecular models for neurons, synapses, glia; major cell-types and brain regions of the mouse brain

Experiments, analysis and visualisation. Capability to define experiments and send remote analysis and visualisation requests

Simplification. Export2Neuromorphic 2

Links and Interfaces. Enhanced closed-loop coupling to the Neurorobotics Platform


M90: Mouse brain reconstruction and simulation (Brain Simulation Platform v3)

Access and user services. Collaborative model building. Updated guidebook

Brain model builders. Enhance v2 brain model builders (v3) to integrate new biological data in the mouse brain atlas. Enhancement of the Connector v2 to implement constraints from predictive neuroinformatics (Global Neuronal Addressing System)

Brain simulators. Enhancement to v2 simulators (v3) to exploit multi-scale supercomputing capabilities

Molecular predictor. Prediction of reactants and the kinetics of the reactions based on the chemical structure of a drug. Molecular Dynamics of protein-protein interactions, ion channels and receptors for improved biological accuracy.

Brain models. Molecular level, cellular level and simplified models for all possible cell-types based on cell-type and whole-brain gene expression data; mixed detail models for all regions of the mouse brain; complete integration of multi-scale neuro-vascular-glial system models; first draft model of the whole mouse brain with mixed detail depending on the level of reconstruction possible for each brain region and cell-type

Simplification. Export2Neuromorphic 3

Links and Interfaces. Enhanced links to the HPC Platform for simulations of the whole mouse brain; closed-loop coupling to the Neurorobotics Platform for experiments in virtual cognition and behaviour

M120: Reconstruction and simulation of the human brain (Brain Simulation Platform v4)

Access and user services. Updated collaborative model building tools, Updated guidebook

Brain model builders. Enhanced brain builder algorithms to modify template mouse brain models with human-specific features (differences in gene expression, cell morphologies, cellular composition, neuro-vascular-glia system, cellular layering, dimensions of cellular nuclei, dimensions of brain regions, long range connectivity)

Brain simulators. Enhancement of v3 simulators (v4) to exploit exascale computing capabilities

Molecular predictors. Capability to define an eDrug, which provides all the quantitative changes in reactions and reaction kinetics that the drug will induce on the system

Brain Models. Molecular level, cellular level and simplified models for all possible human cell-types based on cell-type and whole-brain gene expression data; mixed detail models for all human brain regions; multi-scale neuro-vascular-glial system models; first draft model of the whole human brain with different levels of detail depending on the level of reconstruction possible for each brain region and cell-type

Simplification. Export2Neuromorphic 4

Links and Interfaces. Enhanced links to the HPC platform for whole human brain simulations; closed-loop coupling to the Neurorobotics Platform for virtual cognition and behaviour experiments.

1.2.2.3 SP7: High Performance Computing Platform

Operational Objectives

The main goal of the High Performance Computing Platform will be to provide the HBP and the wider community with the supercomputing capabilities, system and middleware support necessary to simulate multi-scale models of a complete human brain.

The platform will consist of a central HBP Supercomputer that will gradually evolve toward exascale performance and data management capacity over the duration of the project. This machine will be complemented by three satellite facilities dedicated to software development, molecular dynamics simulations, and massive data analytics, respectively. The HBP supercomputer will complement the capabilities provided by the Partnership for Advanced Computing in Europe (PRACE).

On the hardware side, the HPC capabilities required will be based on innovative, energy efficient technologies including multi and many-core processors expected to reach exaflop/s performance by the end of 2020. These technologies may be complemented by neuromorphic acceleration. The system will include hierarchical memory and I/O sub-systems with multi-Petabytes of capacity and data rates of many Terabits per second, as well as hardware-integrated optical communication technologies with the lowest possible latencies, possibly complemented by brain-inspired communication sub-systems. On the software side, the HBP’s HPC system software and middleware will integrate and operate these vast numbers of elements and eclectic components, providing resilience over millions of processing cores and routing devices through failure anticipation, system-wide checkpoint mechanisms and end-to-end data integrity.

A key objective will be to develop middleware to support in situ analysis of multi-Petabyte data sets and real-time visualisation and visual computational steering of simulations. This kind of interactive supercomputing will be invaluable not just for brain simulation but also for a broad range of other applications, in the life sciences and elsewhere.

A further objective will be to provide tool sets for performance analysis, prediction and improvement, together with numerical algorithms able to cope with the unprecedented scalability requirements of full-brain simulation. In the mid term, such algorithms may benefit from the novel capabilities of brain-inspired neuromorphic computing and communication devices.

State of the art

Today’s supercomputers are massively parallel systems, with more than a million interconnected processor cores and around a Petabyte (10^15 bytes) of memory. Today’s largest machine has a peak performance of 20 Petaflops (1 Petaflop = 10^15 flops) [84].

Since the introduction of the first supercomputers in the 1960s and 1970s, trends in computer performance and memory have followed “Moore’s Law”, with the number of transistors on a computer chip doubling approximately every eighteen months. According to the International Technology Roadmap for Semiconductors (ITRS) [85], this trend will continue for several processor generations to come.

Since the introduction of the Cray-1 in 1976, improvements in supercomputer performance have outstripped Moore’s Law, increasing by roughly a thousandfold every ten years – an improvement primarily due to the use of ever increasing numbers of processors. Achieving exaflop performance by 2020 – a thousandfold increase with respect to 2010 – will require a further massive increase in parallelism, a goal that poses severe technical challenges [86, 87]. For environmental and business reasons, vendors have set themselves the task of containing energy consumption to a maximum of 20 megawatts per exaflop/s, driving processor design in the direction of power-efficient many-core CPUs, similar to today’s GPUs but with greater autonomy. Issues of resilience, combined with memory and I/O constraints, present additional obstacles, including problems with end-to-end data integrity. With present technology, it is unlikely that memory capacity and bandwidth will keep up with the expected increase in compute performance.
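The growth rates and power envelope quoted above can be checked with a short back-of-envelope calculation. All input figures come from the text; the calculation itself is purely illustrative:

```python
# Back-of-envelope arithmetic for the growth rates and power budget above.

def moores_law_factor(years, doubling_months=18):
    """Growth implied by a doubling every eighteen months."""
    return 2 ** (years * 12 / doubling_months)

# Moore's Law alone gives roughly a 100x gain per decade...
print(round(moores_law_factor(10)))     # ~102

# ...while supercomputer performance has grown ~1000x per decade;
# the extra ~10x comes from ever-increasing processor counts.
parallelism_gain = 1000 / moores_law_factor(10)
print(round(parallelism_gain, 1))       # ~9.8

# The 20 MW budget for an exaflop machine implies an efficiency target:
exaflop = 1e18      # flop/s
budget_watts = 20e6
print(exaflop / budget_watts / 1e9, "Gflop/s per watt")  # 50.0
```

The 50 Gflop/s-per-watt target is several times beyond the efficiency of the most power-efficient systems of the early 2010s, which is why the text describes the goal as posing severe technical challenges.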

International supercomputer vendors like IBM and Cray and several European research projects are engaged in intensive efforts to solve these problems [88, 89]. IBM is exploring the use of storage-class memory technologies as in its highly innovative BGAS project. Cray focuses on the exploitation of parallelism, at all levels. In Europe three complementary research projects are studying different aspects of the exascale challenge. CRESTA [90], coordinated by University of Edinburgh, is working with Cray and other partners to explore potential applications of exascale computing and to develop appropriate system software. DEEP [91], led by Jülich, with the collaboration of Italian integrator EuroTech, aims to achieve very high scalability using many-core X86 technology from Intel and the very low latency EXTOLL network. Mont-Blanc [92], led by BSC, is working with French integrator Bull to study energy efficiency using Arm embedded system cores.

Ever since the pioneering work of Gerstein and Mandelbrot in the 1960s [93], brain simulation has used the latest computing hardware available. This tendency continues today as teams in the USA, Europe, and Japan work to increase the power of simulation technology. In the USA, many of these efforts are coordinated by the DARPA SyNAPSE programme [94]. In Japan, efforts to simulate the whole brain are funded by the MEXT “Next Generation Supercomputer” project [95]. In Europe, the EU-funded BrainScaleS [96] and the UK- funded SpiNNaker [97] projects are working to enable multi-scale simulations of the brain on custom neuromorphic hardware.

These projects focus mainly on models with large numbers of neurons and synapses but with relatively little or no detail at lower levels of biological organisation. By contrast, the HBP will build and simulate biologically realistic models of the complete human brain, at least at the cellular level, and use them as the basis for in silico experiments. EPFL’s on-going Blue Brain Project (BBP) [81], which has pioneered this approach, has produced a parallel version of the NEURON code, running on an IBM Blue Gene/P supercomputer with a peak performance of 56 Teraflops. The project has demonstrated that this capability is sufficient to run cellular-level models with up to one million detailed, multi-compartment neurons. A simple extrapolation suggests that after optimisation, a large Blue Gene/P system such as the 6 Petaflop system at the Jülich Supercomputing Centre would provide enough computing power and memory to simulate up to five hundred million neurons. Cellular-level simulation of the 100 billion neurons of the human brain will require compute power at the exascale (10^18 flops, 100 Petabytes of memory).
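The extrapolation above can be made explicit. A minimal sketch, assuming compute per neuron stays constant (a simplification: communication and memory costs do not scale perfectly linearly):

```python
# Linear extrapolation of the cellular-level simulation figures above.

bbp_flops = 56e12     # Blue Gene/P peak used by the Blue Brain Project
bbp_neurons = 1e6     # detailed multi-compartment neurons demonstrated

flops_per_neuron = bbp_flops / bbp_neurons   # 5.6e7 flop/s per neuron

# Naive scaling to the 6 Petaflop Juelich system:
print(6e15 / flops_per_neuron)   # ~1.1e8 neurons before optimisation

# The five hundred million neuron figure in the text therefore assumes
# roughly a 5x gain from code optimisation.  Scaling the optimised
# figure to the 100 billion neurons of the human brain:
optimised_fpn = 6e15 / 500e6
print(100e9 * optimised_fpn)     # ~1.2e18 flop/s, i.e. exascale
```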

A second unique requirement of the Human Brain Project is that supercomputers should act as flexible interactive scientific instruments, providing researchers with visual feedback and allowing them to “steer” simulations while they are underway. This is very different from the batch mode in which most supercomputers are operated today. Creating this capability will require completely new developments in supercomputing software, including new techniques for in situ visualisation and data analysis.

Methodology

Building and simulating multi-level models of the complete human brain will require exascale supercomputing infrastructure with unprecedented capabilities for interactive computing and visualisation. During the ramp-up phase, the HBP will work with manufacturers and European exascale projects to ensure that the architectures and technologies adopted by the project meet the requirements of brain simulation, which in many respects are qualitatively different from those of other applications, for example in physics. The results will guide the step-wise procurement of the HBP Supercomputer, planned to begin at the start of phase two. Key themes for research and development include the following.

Developing exascale supercomputing for brain research. The HBP will collaborate with major international manufacturers (IBM, Cray) and with exascale research initiatives like DEEP, Mont-Blanc and CRESTA, which include European HPC manufacturers (EuroTech, Bull) and several SMEs. The end goal will be to influence the design of, and ultimately deploy, the supercomputing technology required by the project, gradually moving towards exascale capabilities, expected to be available by 2020.

Numerical methods, programming models and tools. To support efficient interactive simulation of brain models, the project will develop new numerical methods, parallel programming models, and performance analysis tools adapted to the extreme parallelism of future exascale systems and new middleware for workflow and I/O management. A long-term goal is the integration of brain-inspired hardware technology into supercomputers.

Interactive visualisation, analysis and control. The project will develop a novel software framework allowing interactive steering and in situ visualisation of simulations. The development work will produce both general-purpose software and neuroscience-specific interfaces – virtual instruments allowing scientists to work with virtual specimens in the same way they work with biological specimens.

Exascale data management. HBP brain simulations will generate massive amounts of data. An important task will thus be the design and development of technology making it possible to manage, query, analyse and process this data, and to ensure that it is properly preserved.

The High Performance Computing Platform. The HBP High Performance Computing Platform will make the project’s supercomputing capabilities available to the project and the community. The platform will consist of a production-scale HBP Supercomputer at Jülich, a smaller software development system at CSCS, Switzerland, a system for molecular-level simulations at Barcelona Supercomputing Centre, and a system for massive data analytics at CINECA, Italy. The four systems will be connected via a dedicated communication network, allowing the partners to share massive datasets ranging from petabytes to exabytes of data. Data storage will be provided directly by the centres and through cloud services. The project will also provide user support and training, coordination with PRACE and other research infrastructures, and cooperation with industry.

These goals will be implemented in several phases, each with specific targets for the size of simulations to be supported. During the ramp-up phase, the HBP will utilise and enhance the computational resources provided by the participating centres and by PRACE and other research infrastructures. This will allow the project to perform cellular-level simulations on the scale of the rodent brain, using 200-400 Terabytes of memory per simulation. In parallel, the High Performance Computing subproject will work with manufacturers and European exascale computing projects to evaluate candidate architectures for the “HBP Supercomputer” – a large machine designed to evolve to the exascale over the duration of the project. This work will prepare the procurement of the new machine at the start of the second phase of the project.

The following two phases will be dedicated to deploying and building up the capabilities of the new machine in a co-design process that brings together applications and systems developers. By the end of phase two, the HBP will have the supercomputing capabilities for brain simulations on the scale of the cat brain (1 Petabyte per simulation); in phase three it will develop the capability to support simulations of the monkey brain (10s of Petabytes per simulation). In the fourth and final phase of the project, when the HBP Supercomputer reaches the exascale, it will be able to perform cellular-level simulations of the complete human brain (100s of Petabytes per simulation).
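The per-phase memory targets above are mutually consistent under a linear memory-per-neuron scaling. A quick sketch, where the species neuron counts are rough literature values assumed for illustration (only the memory targets come from the text):

```python
# Consistency check on the per-phase memory targets, assuming memory
# scales linearly with neuron count.  Neuron counts are rough assumed
# values, not figures from the proposal itself.

phases = {
    # name: (approx. neurons, target memory in bytes)
    "mouse (ramp-up)":  (1e8,   400e12),   # 200-400 TB, upper bound
    "cat (phase 2)":    (1e9,   1e15),     # 1 Petabyte
    "monkey (phase 3)": (6e9,   20e15),    # "10s of Petabytes"
    "human (phase 4)":  (100e9, 400e15),   # "100s of Petabytes"
}

for name, (neurons, mem) in phases.items():
    print(f"{name}: ~{mem / neurons / 1e6:.1f} MB per neuron")
```

Each phase works out to a few megabytes of state per cellular-level neuron, which is why the human-scale target lands in the hundreds of Petabytes.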

Roadmap and key milestones

M30: Roadmap to exascale computing for data intensive applications (High Performance Computing Platform v1)

Access and user services. Unified model for secure access; guidebook

Exascale roadmap. Human brain, multi-scale, interactive and remote access requirements established; test series (phase 3 of the PCP) completed and results assessed

HPC Systems. HBP Supercomputer providing 5-6 Petaflops of peak performance, more than 400 Terabytes of main memory and an 8 Petabyte scratch file system with an aggregated performance of about 160 GB/sec in a full production environment (capability to simulate cellular-level models with 100 million neurons, roughly the scale of the mouse brain, and microcircuits with molecular-level neuron models); HBP Development System providing 100 Terabytes of main memory and 4 Petabytes of storage for software development; HBP Molecular Dynamics Supercomputer providing 1 Petaflop of peak performance and 100 Terabytes of main memory for molecular-level simulations; HBP Massive Data Analytics Supercomputer providing 2 Petaflops of peak performance and 200 Terabytes of main memory, integrated with a mass storage facility of more than 5 Petabytes suitable for data analytics tasks in the HBP; high-speed network linking the HBP systems above with a bandwidth of 10 Gigabits/s

M60: Remote visually guided supercomputing (High Performance Computing Platform v2)

Access and user services. Updated guidebook

HPC capabilities. Middleware for data-intensive, interactive supercomputing (remote steering, sharing and scheduling for simultaneous simulation, visualisation and analysis); multi-scale visualisation; prototype of multi-view analysis tools

M90: Dynamic multi-scale brain model reconstruction and simulation (High Performance Computing Platform v3)

Access and user services. Updated guidebook for the use of the HPC Platform

HPC Systems. Enhancement of v2 with a 100 Petaflops HBP Supercomputer providing 10 Petabytes of main memory, capable of supporting brain models with 10 billion neurons

HPC capabilities. Advancements and scaling of interactive supercomputing middleware; middleware for dynamic multi-scale engines; dynamic indexing and data-prefetching functionality

M120: Exascale supercomputing for human brain simulation (High Performance Computing Platform v4)

Access and user services. Updated guidebook

HPC Systems. Enhancement of v3 with the Exascale HBP Supercomputer, capable of supporting brain models with 100 billion neurons.

HPC Capabilities. Production-quality middleware for interactive simulations, visualisation and analysis of the whole human brain (multiscale models) and associated distributed and federated data management


1.2.2.4 SP8: Medical Informatics Platform

Operational Objectives

The goal of the Medical Informatics Platform will be to federate clinical data, including genetics and imaging, currently locked in hospital and research archives, while providing technical guarantees that researchers cannot link the data to individual patients except under strict medical control and legal supervision. The capabilities provided by the platform will include tools to search, query and analyse the data. An important goal will be to use the platform to identify biological signatures of disease and on this basis to develop a new, comprehensive classification of brain diseases, based on parameterised combinations of biological features and markers. Success would accelerate the development of a new category of biologically based diagnostics, supported by strong mechanistic hypotheses of disease causation. These could then be tested by in silico experiments with the Brain Simulation Platform. The results will help researchers to develop new targets and new strategies for treatment. Brain simulation will make it possible to predict the desirable and adverse effects of treatments, providing valuable input for industry decision-makers before they invest in expensive programmes of animal experimentation or human trials.

State of the art

Traditional epidemiology and drug development both rely on a univariate model in which a single outcome is linked to a small set of risk factors (epidemiology) or the modulation of a single drug target (drug development). This model fails to take account of the complexity of biological systems, in which multiple redundancies can stabilize the functioning of the system even when a particular pathway is blocked [79]. This is particularly true of the brain, whose intrinsic plasticity gives it the ability to adapt to major changes in the external environment and even to significant internal damage. This means that most psychiatric and neurological diseases cannot be identified through a simple biomarker and cannot be treated by modulating a single drug target.

The HBP Medical Informatics Platform is based on the premise that the best way of identifying more complex disease signatures and exploring new treatment options is to use bioinformatics to explore very large volumes of multivariate patient data. Spurred by the Human Genome Project, bioinformatics has already developed extremely effective tools for exploring and annotating genetic data. To date, however, there has been relatively little work on other classes of clinical data.

This poses technical, cultural and organizational issues. On the technical side, it has long been recognized that the needs of researchers seeking to store, query and manipulate scientific data are profoundly different from the commercial needs that have driven the development of relational database technology [98, 99]. In the case of medical informatics, these issues are especially acute, leaving many gaps between the requirements of research and the capabilities of the technology. Despite intensive research, this requirements gap has yet to be adequately filled.

One crucial issue is “differential privacy” – the need to provide formal privacy guarantees where they are required without perturbing the exploitation of the data by legitimate users. Current methods are not adequate for this purpose [100] and new ones are urgently needed. A second issue is how to provide scientists with quick access to raw medical data, such as data from imaging [101]. Loading these large datasets into a database is a time-consuming process, particularly when it is not known what parts of the data will actually be used. Ideally, researchers should be able to access and query their data immediately. However, the development of such functionality will require extensive research on how to execute queries on different raw data formats [102].
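The idea of querying raw data formats in place, rather than bulk-loading them into a database first, can be sketched in a few lines. The file layout, field names and values below are entirely hypothetical and serve only to illustrate the lazy, stream-based access pattern:

```python
# Minimal sketch of "query the raw files in place": records are parsed
# lazily from the raw source and only matching rows are materialised,
# avoiding an up-front database load.  Schema and data are invented.

import csv
import io

RAW = """subject_id,age,diagnosis,scan_field_tesla
s001,64,alzheimer,3.0
s002,58,control,1.5
s003,71,alzheimer,1.5
"""

def in_situ_query(raw_stream, predicate):
    """Stream records straight from the raw file, yielding only matches."""
    for row in csv.DictReader(raw_stream):
        if predicate(row):
            yield row

hits = list(in_situ_query(io.StringIO(RAW),
                          lambda r: r["diagnosis"] == "alzheimer"))
print([r["subject_id"] for r in hits])   # ['s001', 's003']
```

A real system would need format-specific readers (e.g. for imaging data) and query planning over heterogeneous raw formats, which is precisely the research gap the text identifies.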

On the organizational side, sharing of data is less common among clinical scientists than in other scientific communities. According to Visscher et al. [103], the reasons include the need for standardisation, the time required to transfer data to repositories, the need to protect clinical confidentiality, the perceived risk of jeopardising publications, and difficulties in assessing the accuracy of results. All these problems are soluble in principle, and have already been solved by other scientific communities.

Imaging presents an illustration of the challenges and potential solutions. European hospitals and research establishments generate an enormous number of brain images for clinical purposes, most of which are only viewed once before being archived on hospital or laboratory servers. Much of this data consists of structural (sMRI) images scanned at 1.5 or 3.0 Tesla. The variance introduced by averaging image data from multiple imaging platforms is less than the variance attributable to disease [104]. This suggests that archived images represent a largely unused resource for population-based investigations of brain diseases.

Several attempts to exploit such data are already in progress. Preliminary international data generation initiatives, such as the ADNI database [105], have demonstrated practicability and value for money, informing a broad range of experiments conceived, executed and independently published by internal and external collaborators. The ENIGMA Consortium (http://enigma.loni.ucla.edu) has recently brought together 125 institutions in a very large brain imaging study, analysing brain images and genome-wide scan data from 21,151 subjects. As a result of these and similar studies, grant-awarding institutions such as the NIH and the Wellcome Trust now require that studies they fund make their databases available on the Internet, facilitating data sharing. Switzerland, among other countries, already allows hospital data mining by health economists and insurance companies to improve the quality of health care. Pilot studies by partners in the HBP are profiting from this favourable situation to mine anonymised patient data collected by pharmaceutical firms, including data from failed clinical trials.

Methodology

The Medical Informatics Platform will build on existing international data generation initiatives, allowing researchers to query and analyse large volumes of clinical and other data stored on hospital and laboratory servers. The work required to build and manage the platform can be summarised as follows.

Federated data management. The Medical Informatics Platform will provide a software framework allowing researchers to query clinical data stored on hospital and laboratory servers, without moving the data from the servers where it resides and without compromising patient privacy (in situ querying). The framework will include a server on the sites where data is stored, creating an interface that allows researchers to read and query but not to modify the data. The data made available in this way will include brain scans of various types, data from electrophysiology, electroencephalography and genotyping, metabolic, biochemical and haematological profiles, data from validated clinical instruments used to quantify behaviour and emotion as well as relevant data on provenance.
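The federation model described above can be sketched as a toy example: each hospital node evaluates queries locally and returns only aggregate results, so raw patient records never leave the site. The class and method names are illustrative, and the minimum-cohort check is a crude stand-in for the formal privacy guarantees the platform would actually require:

```python
# Toy sketch of privacy-preserving federated querying: data stays on the
# hospital server; researchers receive only aggregates.  All interfaces
# here are hypothetical illustrations, not the platform's real API.

class HospitalNode:
    def __init__(self, records):
        self._records = records          # never leaves the hospital server

    def count_where(self, predicate, min_cohort=5):
        """Answer an aggregate query; refuse cohorts small enough to
        risk re-identification of individual patients."""
        n = sum(1 for r in self._records if predicate(r))
        return n if n >= min_cohort else None

nodes = [
    HospitalNode([{"dx": "epilepsy"}] * 12 + [{"dx": "control"}] * 30),
    HospitalNode([{"dx": "epilepsy"}] * 7),
]

# The researcher sees only the federated total, never individual rows.
total = sum(n.count_where(lambda r: r["dx"] == "epilepsy") or 0
            for n in nodes)
print(total)   # 19
```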

Data acquisition and integration. The HBP will recruit hospitals, research labs, industrial companies and other large-scale data gathering initiatives (e.g. large longitudinal studies) to make their data available through the platform. As part of this work, it will develop common protocols for data capture. A key goal will be to move away from a culture of data protection to one of data sharing, exploiting the robust data protection offered by the platform. To create incentives for this change, the HBP will ensure that researchers who contribute data to the platform have free access to data contributed by other researchers. Such a policy will encourage greater efficiency in the use of data, stronger collaboration among researchers, the creation of larger cohorts, and more effective approaches to rare diseases.

Medical intelligence tools. The HBP will build software tools and techniques of data analysis, making it easier for researchers to analyse data made available through the platform. These will include machine learning tools, Bayesian model selection techniques and high-dimensional data mining algorithms, capable of dealing with incomplete data and data with unusual distributions. The tools and techniques made available by the project will be used to study clinical and experimental data for the full range of brain diseases, and to detect recurrent patterns (biological signatures). Focused scientific investigation will make it possible to associate biological signatures of diseases with differential sites of dysfunction, abnormal development and disorganisation, providing evidence of causative mechanisms and allowing the identification of potential drug targets.
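One of the requirements above, handling incomplete multivariate patient data, can be illustrated with a minimal sketch: patients are matched to hypothetical disease-signature centroids using a distance that simply ignores missing features. Feature names, values and signatures are invented for illustration; real tools would use far richer models:

```python
# Illustrative sketch: matching incomplete patient feature vectors to
# disease "signatures" via a masked distance.  All data is invented.

def masked_distance(x, centroid):
    """Euclidean distance over only the features present (not None)."""
    pairs = [(a, b) for a, b in zip(x, centroid) if a is not None]
    return sum((a - b) ** 2 for a, b in pairs) ** 0.5

signatures = {
    # hypothetical centroids over (atrophy, marker level, clinical score)
    "signature_A": [1.0, 0.0, 2.0],
    "signature_B": [0.0, 3.0, 0.5],
}

def classify(patient):
    return min(signatures,
               key=lambda s: masked_distance(patient, signatures[s]))

print(classify([0.9, None, 1.8]))   # signature_A (missing marker ignored)
print(classify([0.2, 2.7, None]))   # signature_B
```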

Building and operating the platform. The Medical Informatics Platform will offer researchers interactive web-based tools to contribute data to the platform and to analyse and exploit the data it provides. Researchers using the platform will receive all the necessary training, technical support and documentation. During the ramp-up phase, the development team will prepare to move the platform to Neuropolis – a new campus for brain science operated by EPFL, University of Lausanne and University of Geneva, opening in 2016.

Roadmap and key milestones

M30: Collaborative platform for federating clinical data and analysis (Medical Informatics Platform v1)


Access and user services. Patient privacy requirements. Provable guarantees of patient privacy. Website for federated clinical data. Guidebook

Data federation tools, search and analysis. Data federation software; software for data and privacy protection; prototype in situ querying software for local hospital servers; “Sign up, configure and federate” for hospitals, researchers and industry

Algorithms for deriving disease signatures, mechanisms and treatments. Testing of existing algorithms; benchmarking against traditional machine learning approaches; exploratory development of radically alternative approaches (e.g. clustering based on algebraic topology and other advanced mathematical methods)

Federation of hospital, researcher and industry data sources. At least five hospital, researcher and industry data sources

M60: Tools to derive unique signatures of brain disease (Medical Informatics Platform v2)

Access and user services. Updated guidebook; first draft of a standardised informatics-derived disease signature format

Data federation tools, search and analysis. Enhanced versions of data federation and in situ querying software; massive HPC-supported data analytics for clinical data

Algorithms for deriving disease signatures, mechanisms and treatments. Prototype implementations of the three most powerful algorithms identified in previous phase

Federation of researcher, hospital and industry data sources. Federation of at least fifty hospital, researcher and industry data sources

M90: Global federation of clinical researchers, hospitals and industries (Medical Informatics Platform v3)

Access and user services. Updated guidebook. Second draft of standardised informatics-derived disease signature format

Data federation tools, search and analysis. Enhancements to federation and in situ querying software. Enhanced version of massive data analytics. Enhancements for “easy sign-up, configure and federate” module

Algorithms for deriving disease signatures, mechanisms and treatments. Full implementation of the two most powerful algorithms for deriving disease signatures from the previous phase

Federation of hospital, researcher, and industry data sources. Federation of at least five hundred hospital, researcher and industry sources located around the world.

M120: Personalised disease diagnostics: signatures, mechanisms and treatment (Medical Informatics Platform v4)

Access and user services. Updated guidebook; final draft of a standardised informatics-derived disease signature format; personalised reports on individual patients

Data federation tools, search and analysis. Enhanced versions of all tools

Algorithms for deriving disease signatures. Enhancement of the winning algorithms for deriving disease signatures

Federation of researcher, hospital and industry data sources. Federation of at least 500 hospital, researcher and industry sources located around the world

1.2.2.5 SP9: Neuromorphic Computing Platform

Operational Objectives

The HBP will design, implement and operate a Neuromorphic Computing Platform that allows non-expert neuroscientists and engineers to perform experiments with configurable Neuromorphic Computing Systems (NCS) implementing simplified versions of brain models developed on the Brain Simulation Platform, as well as generic circuit models. The NCS will be hardware devices incorporating state-of-the-art electronic component and circuit technologies, as well as new knowledge arising from other areas of HBP research (experimental neuroscience, theory, brain modelling). The platform will provide NCS based on physical (analogue or mixed-signal) emulations of brain models (NM-PM) running in accelerated mode, numerical models running in real time on digital many-core architectures (NM-MC), and hybrid systems. The platform will be tightly integrated with the High Performance Computing Platform, which will provide essential services for mapping and routing circuits to neuromorphic substrates, benchmarking and simulation-based verification of hardware specifications.

State of the art

The primary challenges for traditional computing paradigms are energy consumption, software complexity and component reliability. One strategy to address these challenges is to use neuromorphic technologies inspired by the architecture of the brain. Some approaches have focused on physical emulations of brain circuits. These approaches have the potential to exploit the characteristics of inherently noisy and unreliable micro- or nanoscale components with feature sizes approaching the atomic structure of matter, and with an energy cost per neural operation that is more than six orders of magnitude lower than the equivalent cost for brain models running on conventional supercomputers. Other approaches use massively parallel many-core architectures that simulate neural models on digital processors. In both strategies, communications among model neurons use clockless, inherently asynchronous “spiking neural networks” – a “brain-like” feature that offers major savings in energy consumption. Other advantages include support for plasticity and learning and (for the analogue approach) the ability to run at speeds from 1,000 to 10,000 times faster than biological real time, meaning that model systems can emulate real-world learning processes and physical dynamics lasting weeks, months and even years.
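The practical significance of the acceleration factors quoted above is easy to make concrete with a one-line calculation. The arithmetic below is illustrative; the 10,000x factor is the upper figure from the text:

```python
# What a 10,000x acceleration factor buys in emulated biological time.

accel = 10_000                 # upper acceleration factor from the text
one_day_s = 24 * 3600          # one wall-clock day, in seconds

emulated_years = accel * one_day_s / (365 * 24 * 3600)
print(round(emulated_years, 1))   # one wall-clock day ~ 27.4 emulated years
```

This is why accelerated physical emulation makes developmental and learning processes lasting months or years experimentally tractable.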

Neuromorphic computing was pioneered by the group of Carver Mead [106] at Caltech, the first to integrate biologically inspired electronic sensors with analogue circuits and to introduce an address-event-based asynchronous, continuous time communications protocol. Today, the Mead approach is followed by many groups, notably the Institute for Neuroinformatics at ETH Zürich (Switzerland) [107].

The Mead work focuses on the demonstration of basic computational principles. IBM’s SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) project, by contrast, aims to reproduce large systems that abstract away from the biological details of the brain and focus on the brain’s larger-scale structure and architecture – the way its elements receive sensory input, connect to each other, adapt these connections, and transmit motor output. Proponents argue that the inherent scalability of this approach will allow them to build systems that match the computing efficiency, size and power consumption of the brain and its ability to operate without programming [94].

The European FACETS project has pioneered a different approach that combines local analogue computation in neurons and synapses with binary, asynchronous, continuous-time spike communication [108-110]. FACETS systems can incorporate 50 × 10^6 plastic synapses on a single 8-inch silicon wafer. In the near future, advances in CMOS feature size, connection technologies and packaging will make it possible to build multi-wafer systems with 10^13 plastic synapses operating at acceleration factors of 10,000 compared to biological real time. The FACETS group has also pioneered a unified concept for a network description language (PyNN) that provides platform-independent access to software simulators and neuromorphic systems [111]. BrainScaleS – a follow-up project – is pioneering the use of the technology to replicate behaviour and learning over periods of up to a year while simultaneously emulating the millisecond-scale dynamics of the system.

Another strategy is to implement brain models on classical many-core architectures. This is the approach adopted by the UK SpiNNaker group [112, 113]. The group has a strong grounding in the ARM architecture, which offers an excellent basis for scalable, low-power digital many-core systems operating in real time; it has recently completed the integration of a SpiNNaker chip into an operational system and is now running experiments. Each chip has eighteen cores and 128 Mbytes of shared local RAM, and allows real-time simulation of networks implementing complex, non-linear neuron models. A single chip can simulate 16,000 neurons with eight million plastic synapses, running in real time with an energy budget of 1 W.
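The per-chip figures quoted above imply a striking energy efficiency. A short sketch of the arithmetic, where the 10 Hz mean synaptic event rate is an assumed, typical cortical value and not a figure from the text:

```python
# Per-chip arithmetic for the SpiNNaker figures above.  The 10 Hz mean
# synaptic event rate is an assumption for illustration only.

cores, neurons, synapses, power_w = 18, 16_000, 8e6, 1.0

print(neurons // cores)   # ~888 neurons simulated per ARM core

event_rate_hz = 10                         # assumed mean rate
events_per_s = synapses * event_rate_hz    # 8e7 synaptic events/s
print(power_w / events_per_s * 1e9, "nJ per synaptic event")  # 12.5
```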

As shown above, neuromorphic computing is making enormous progress. However, designing and using a neuromorphic computing system requires detailed knowledge of the underlying electronics and their often proprietary interfaces. This is in stark contrast to the situation with everyday digital computers or even specialised devices like FPGAs (Field Programmable Gate Arrays), which can be easily programmed by users with only limited knowledge of the hardware, and for which software support for programming and configuration is readily available.

Methodology

The distinguishing feature of the HBP’s strategy for neuromorphic computing is that neural architectures will be derived from detailed multi-level brain models, developed on the Brain Simulation Platform. The HBP will systematically study the relationship between different features of the models and their computational performance, identifying and implementing strategies to reduce complexity while preserving functionality. The HBP Neuromorphic Computing Platform will consist of an infrastructure to develop and run Neuromorphic Computing Systems (NCS) in hardware. Starting in month 30, the platform will offer researchers 24/7 access to NCS. It will also provide software simulations of NCS, software support for setting up, running and analysing experiments, as well as individual user support through training and a helpline. The platform will be distributed between two sites, Heidelberg and Manchester. Heidelberg will provide NCS based on non-classical, physical emulation of neural circuits; Manchester will offer a classical, program-based many-core approach. Research and development for the platform will focus on the following themes.

1. Neuromorphic computing through physical emulation of brain models. During the first five years of the project, the HBP will implement two generations of Neuromorphic Computing System (NM-PM-1 and NM-PM-2), based on analogue neuron and synapse circuits and digital spike communication. NM-PM-1, derived from the BrainScaleS Hybrid Multi-scale Facility, will incorporate up to 100 wafer modules and run between 1,000 and 10,000 times faster than real time; a preliminary version with at least 10 wafer modules will be available in month 18. NM-PM-2 will contain 1,000 modules implemented in 65nm circuit technology and will pioneer the use of wafers embedded in printed circuit boards. In the second phase of the project, the HBP will build a third-generation system, NM-PM-3, exploring options for systems that can shift between different speeds of operation, from real time (a pre-condition for robotics and many other applications) to 10,000 times faster than real time. It is planned that NM-PM-3 will incorporate 10,000 wafer modules with 10^16 components. This will give the system the ability to emulate a substantial fraction of the human brain.

2. Neuromorphic computing with digital many-core simulation of brain models. The project will also develop two generations of many-core NCS (NM-MC-1 and NM-MC-2), implemented on scalable many-core digital ASICs. The devices will offer on-chip floating point operations and memory management, as well as fast, light-weight packet-switched networks, making it possible to develop real-world applications that operate in real time (e.g. controllers for robots, systems for applications outside robotics). NM-MC-1 will consist of 1 million ARM CPU cores with a simulated bisection bandwidth of 10^10 spikes per second. NM-MC-2 will contain at least 10^11 components.

3. Common software tools and HPC integration. The platform will include a suite of software tools to support the design and development of neuromorphic systems and applications. These will include tools to import and simplify brain models, to develop Executable System Specifications (software models of the device under construction), and to measure performance. Preliminary versions have already been tested in the FACETS and BrainScaleS projects.

4. Novel technologies for neuromorphic circuits. The HBP will also investigate new hardware approaches to the implementation of neuromorphic circuits. The work programme includes new technologies for distributed memory based on insights from materials science (e.g. resistive memories and spintronics), nanoscale switches, high-density assembly technologies, 3D silicon integration, and novel design methodologies for neuromorphic VLSI. Where appropriate, the project will develop functional demonstrators as a first step towards integrating the technologies in the platform. With the exception of high-density connection technologies and design methodologies, work in these areas will start only after the ramp-up phase. We will also run an exploratory task to follow up on technology development outside the HBP.

5. Integration with HPC. The Neuromorphic Computing Systems developed by the HBP will be complex electronic systems, which, by the end of the project, will contain as many as 10^16 components, mainly memory devices. Today, and for the foreseeable future, the best way to configure such systems and to analyse their behaviour is to integrate them with high performance computing systems to create hybrid set-ups. This is particularly true for the physical model systems, which have to be interfaced to conventional computers for data I/O and analysis. Such hybrid systems will also open the road towards multi-scale systems in which some parts of the model are emulated on neuromorphic devices and others are functionally described by software running on conventional digital hardware. Physical model systems will be integrated with digital numerical processors on three levels: (1) the systems will have local on-silicon digital processors with rapid access to local data for data pre-processing or for running local plasticity, learning and development algorithms; (2) they will be linked to an on-site moderately sized HPC cluster with low latency to run closed-loop experiments with virtual environments; (3) for circuit import, mapping, routing and simulation they will be linked to the High Performance Computing Platform via high-bandwidth, higher-latency links.
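For the physical-model track, the analogue circuits implement neuron models such as the adaptive exponential integrate-and-fire (AdEx) model featured on NM-PM-1. The plain-Python sketch below shows the model's two coupled equations under forward-Euler integration; parameter values follow commonly published defaults rather than any hardware calibration:

```python
# Adaptive exponential integrate-and-fire (AdEx) neuron, forward-Euler.
# Sketch only: parameters are common published defaults, not the
# calibrated values used on the neuromorphic hardware.
import math

def simulate_adex(i_ext_pa, t_ms=300.0, dt=0.1):
    """Return spike times (ms) for a constant input current (pA)."""
    C, g_l, e_l = 281.0, 30.0, -70.6        # pF, nS, mV
    v_t, delta_t = -50.4, 2.0               # mV
    tau_w, a, b = 144.0, 4.0, 80.5          # ms, nS, pA
    v_reset, v_spike = -70.6, 0.0           # mV
    v, w = e_l, 0.0
    spikes = []
    for step in range(int(t_ms / dt)):
        # C dV/dt = -g_l(V-E_l) + g_l*delta_t*exp((V-V_t)/delta_t) - w + I
        dv = (-g_l * (v - e_l)
              + g_l * delta_t * math.exp((v - v_t) / delta_t)
              - w + i_ext_pa) * dt / C
        # tau_w dw/dt = a(V-E_l) - w   (adaptation variable)
        dw = (a * (v - e_l) - w) * dt / tau_w
        v += dv
        w += dw
        if v >= v_spike:                    # spike: reset V, bump adaptation
            spikes.append(step * dt)
            v = v_reset
            w += b
    return spikes
```

On the wafer-scale systems these same dynamics are realised directly by analogue circuits, which is what makes acceleration factors of thousands possible.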

The HBP will integrate hardware, software and methods from this work in a single, remotely accessible Neuromorphic Computing Platform, which it will operate for the benefit of scientists and engineers from inside and outside the HBP Consortium. The first version of the platform, available after thirty months, will allow them to work with NM-PM-1 and NM-MC-1, advanced software tools for the simplification of brain models and for system design, and a local High Performance Computing cluster offering compute services for low-latency closed-loop experiments. A dedicated platform service centre will provide documentation, training and technical support, while an active visitors programme will support researchers wishing to perform on-site research.

Roadmap and key milestones

M30: Collaborative platform for neuromorphic computing (Neuromorphic Computing Platform v1)

Access and user services. Website allowing users to configure and operate NCS; guidebook

Computing capabilities. NM-PM-1: 10-wafer configurable system featuring 2 million AdEx neurons and 500 million dynamic synapses, running 10,000 times faster than real time; NM-MC-1: 1 million ARM CPU cores with a simulated bisection bandwidth of 10B spikes per second

Technology development. Software models for NM-PM-2 and NM-MC-2

M60: Mouse-scale neuromorphic computing systems (Neuromorphic Computing Platform v2)

Access and user services. Enhanced configuration and operation functionality; updated guidebook

Computing capabilities. NM-PM-2: scaled-up version of NM-PM-1 with 100 wafers

Technology development. Massive data analytics; self-adaptive information processing; simplified whole-brain mouse models; closed-loop experiments with the Neurorobotics Platform; cognitive architectures based on Bayesian and other brain-inspired whole-brain models

M90: Human brain-scale neuromorphic computing systems (Neuromorphic Computing Platform v3)

Access and user services. Enhanced configuration and operation functionality; updated guidebook

Computing capabilities. NM-PM-3: 1,000-wafer configurable system in advanced deep-submicron CMOS technology featuring at least 1 billion neurons and 10,000 billion dynamic synapses; NM-MC-2: 4M ARM CPU cores with a simulated bisection bandwidth of 100B spikes per second

Technology development. Workflow from detailed and theory-inspired brain models to implementation in super-real-time physical neuromorphics; parameter optimisation and implementation in real-time multi-core neuromorphics

M120: Emulation of human-level cognition (Neuromorphic Computing Platform v4)

Access and user services. Enhanced configuration and operation functionality; updated guidebook

Computing capabilities. NM-PM-4: 10,000-wafer configurable system with 10^16 components; execution speed ranging from real time to 10,000 times faster than real time

Technology development. Optimised workflow from detailed and theory-inspired brain models to implementation in super-real-time physical neuromorphics; use of models to emulate human cognition; parameter optimisation and implementation in real-time multi-core neuromorphics; support for closed-loop experiments

1.2.2.6 SP10: Neurorobotics Platform

Operational Objectives

The Neurorobotics Platform will offer scientists and technology developers a software and hardware infrastructure allowing them to connect pre-validated brain models to detailed simulations of robot bodies and environments and to use the resulting neurorobotic systems in in silico experiments and technology development.

State of the art

Neurorobotics can be defined as the science and technology of robots that perform behavioural tasks, are situated in a real-world environment, sense environmental cues and act upon their environment, and which


are controlled by a simulated nervous system that reflects, at some level, the architecture and dynamics of the brain [114]. Robots with these properties make it possible to study brain models in closed-loop experiments, that is, experiments where the brain is inside a body, embedded in a realistic environment where the robot’s actions influence future sensory inputs.

Probably the first researcher to develop a robot that fulfilled these criteria was Thomas Ross, who in 1933 devised a mobile robot with a small electromechanical brain, which could navigate through a maze in real time [115]. Today, there are two main strands in neurorobotics research: the first focuses on biologically inspired robots, including body, sensors and actuators; the second on brain-inspired control architectures.

Biologically inspired robots. Historically, biologically inspired robots have mainly come from academic research. However, recent advances in humanoid and four-legged robots have led to a renewed interest in applications for the military (BigDog, BostonDynamics.com), aeronautics (NASA Robonaut2), and entertainment (Honda ASIMO, Sony AIBO).

Biologically inspired robots are adaptable and can display rich perceptual and behavioural capabilities. In contrast to industrial robots, they often use compliant materials, which make their mechanics intrinsically flexible. Researchers have also developed a large number of robots mimicking characteristics of specific animals, such as locomotion in snakes and spiders, or “whisking” in rodents. Three of the most advanced biologically inspired humanoid robots to date are iCub (a humanoid robot “child”) [116], Kojiro (a humanoid robot with about 100 “muscles”) [117] and ECCE (a humanoid upper torso that attempts to replicate the inner structure and mechanisms of the human body) [118].

Brain-inspired control architectures. Brain-inspired control architectures are robotic control systems which, at some level, reflect properties of animal nervous systems. In general, they are tailor-made for a specific set of tasks, often using a combination of Artificial Neural Networks, Computer Vision/Audition, Machine Learning algorithms and, recently, Spiking Neural Networks [119-122]. A typical experiment might involve the emulation of a rat as it navigates through a maze. In this case, the control architecture for the simulated rat could comprise sensory areas, a hippocampus, and a motor area to generate movements. The HBP will deviate radically from this strategy. Rather than designing specific neural control architectures for each experiment, HBP neurorobots will be controlled by generic brain models, provided by the Brain Simulation Platform. To design a robot for use in an experiment, researchers will connect models of sensors (vision, audition, touch, balance) and actuators to a brain model, then calibrate the robot brain so that it can process the relevant signals and translate the model’s neural activity into control signals for the robot. They will then use classical techniques (lesion studies, manipulations of neurons etc.) to identify the control architecture for specific tasks. The advantage of this approach is that it allows researchers to monitor and control all states and parameters of the experiment (brain, body, and environment) – something technically impossible to achieve in the laboratory.
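The closed-loop character of such experiments, where each action changes the next sensory input, can be sketched with a toy maze agent. The two "sensory" and "motor" stages below are hypothetical stand-ins for the brain models the HBP would actually substitute:

```python
# Minimal closed-loop sketch: a toy "rat" in a grid maze whose next move is
# chosen by a hand-wired two-neuron taxis circuit. Purely illustrative; a
# real HBP experiment would substitute a simulated brain model here.

GOAL = (4, 4)

def sense(pos):
    """Sensory 'neurons': firing rates grow as the goal gets nearer on each axis."""
    return (-abs(GOAL[0] - pos[0]), -abs(GOAL[1] - pos[1]))

def act(pos):
    """Motor stage: move to the neighbouring cell with the highest sensory drive."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    candidates = [(pos[0] + dx, pos[1] + dy) for dx, dy in moves]
    return max(candidates, key=lambda p: sum(sense(p)))

def run_trial(start, max_steps=50):
    pos, path = start, [start]
    for _ in range(max_steps):
        if pos == GOAL:
            break
        pos = act(pos)      # closed loop: the action changes the next sensory input
        path.append(pos)
    return path

path = run_trial((0, 0))
```

The point of the closed loop is visible even in this caricature: the trajectory emerges from the interaction of circuit, body and environment, not from any one of them alone.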

Methodology

The Neurorobotics Platform will provide tools to reverse engineer specific computational and cognitive functions and specific cognitive architectures within the brain. To this end, the Neurorobotics Platform will allow researchers to design simulated robot bodies, to connect these bodies to brain models, to embed the bodies in rich simulated environments, and to calibrate the brain models to match the specific characteristics of the robot’s sensors and “muscles”. The resulting set-ups will allow researchers to perform in silico experiments, initially replicating previous work, but ultimately breaking new ground. During the second phase of the HBP, under Horizon 2020, the platform will make it possible to develop physical implementations of Neurorobotic Systems with a broad range of commercial applications.

Even with the high-performance computers of the HBP, it will initially not be possible to simulate HBP brain models in real time. Thus, the Neurorobotics subproject will initially rely on simulated robots and simulated environments. The platform will provide researchers with access to detailed brain models on the Brain Simulation Platform running slower than real time, and to emulated models, on the Neuromorphic Computing Platform, running faster than real time. It will also allow them to use mixed models in which some areas of the brain are represented in full biological detail while others are represented by phenomenological models. The tools provided by the platform will allow researchers to operate robots remotely, to repeat experiments as often as they need, and to visualise the behaviour of the robots – as if they were running in real time.


During the second phase of the HBP, it will also provide access to physical robots, controlled by brain-models that can be executed in real time on analogue or digital neuromorphic hardware, provided by the Neuromorphic Computing subproject.

The Neurorobotics Platform will consist of four core modules:

1. The Neurorobotics Cockpit. This module is the central user interface to all the following modules.

2. Simulated robots. This module will allow researchers to build simulated robots based on detailed specifications and will include the following components.

A Robot Builder: a generic tool to design, develop and deploy simulated robots.

A Sensory System Builder: a tool to generate models of perception in different modalities (auditory perception, visual perception etc.).

A Motor System Builder: a tool to generate models of motor systems (muscles or motors) and of the peripheral nervous system.

A Brain-Body Integrator: automated routines for the calibration of brain models to work with the selected body sensory and motor systems.

3. Simulated environments. This module will allow researchers to build rich simulated environments in which to test their robots and run experiments. The module will provide the following tools.

An Environment Builder: a generic software tool for designing and deploying dynamically changing simulated environments.

An Experiment Designer: a tool to configure experiments and to specify testing and measuring protocols.

An Electronic Coach: a software tool allowing researchers to define and execute multi-stage training protocols to calibrate Neurorobotic systems (brain, body, and environment) and to perform experiments on learning and memory.

4. Closed-loop engine. This module will make it possible to create a closed loop between a brain model, a simulated or a physical robot, and its environment. It will include:

A Closed-Loop Engine: a generic tool to couple software and neuromorphic brain models to simulated and physical robots and to other devices.

A Human Interaction Interface: a software tool that allows human experimenters to interact with robots and their environment.

A Performance Monitor: a set of tools to monitor and analyse the performance of the neurorobotic system in its environment, and to produce configurable diagnostic messages.

Implementation strategy. The Neurorobotics subproject will integrate the four core modules described above in a Neurorobotics Toolkit, which it will make available to the community through the Neurorobotics Platform, due for release in Month 30. A team of four software architects will design the software, develop the core software modules and their interfaces, write and maintain the specifications for the software, and oversee its implementation. The detailed requirements and specifications will be developed in close collaboration with the Brain Simulation and High Performance Computing subprojects, and with future users. The actual code will be developed and tested by outside groups, selected via the HBP Competitive Calls Programme. Software modules will be derived from established open source tools with a strong developer community and from software developed in the Blue Brain Project. All code will flow back to the open source community, allowing groups from other fields of science and technology to join the HBP effort.
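As a rough illustration of the division of labour between these modules, the closed-loop cycle might be organised as below. All class and method names are invented for the sketch; they are not the platform's actual interfaces:

```python
# Hypothetical sketch of the closed-loop idea: at each tick, sensory data
# flow from the environment through the body into the brain model, and
# motor commands flow back out. Names and interfaces are illustrative only.

class ClosedLoopEngine:
    def __init__(self, brain, body, environment):
        self.brain, self.body, self.environment = brain, body, environment

    def step(self):
        stimulus = self.body.transduce(self.environment.observe())
        command = self.brain.process(stimulus)
        self.environment.apply(self.body.actuate(command))

    def run(self, n_steps):
        for _ in range(n_steps):
            self.step()

# Trivial stand-ins so the loop can be exercised end to end.
class EchoBrain:
    def process(self, stimulus):
        return -stimulus            # "push back" against the stimulus

class IdentityBody:
    def transduce(self, obs):
        return obs

    def actuate(self, command):
        return command

class ScalarWorld:
    def __init__(self):
        self.state = 10.0

    def observe(self):
        return self.state

    def apply(self, force):
        self.state += 0.5 * force   # damped response to the motor command

world = ScalarWorld()
engine = ClosedLoopEngine(EchoBrain(), IdentityBody(), world)
engine.run(20)                      # the negative feedback drives the state toward zero
```

The design point is that the brain, body and environment are interchangeable: any of the three can be swapped for a detailed simulation or a neuromorphic emulation without changing the loop itself.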

Building and operating the platform. The Neurorobotics Platform will provide researchers with a control centre, where they can configure, execute and analyse the results of neurorobotics experiments. A dedicated team will provide users with the training, support and documentation required to make effective use of the platform. The HBP will run an active visitors programme for scientists wishing to use the platform.


Roadmap and key milestones

M30: Prototype platform for closed-loop in silico experiments (Neurorobotics Platform v1)

Access and user services. Website; guidebook

Neurorobotics capabilities. Simulated robots and environments; implementation in an open source software environment; software for managing closed-loop experiments.

M60: Virtual prototypes of cognitive devices (Neurorobotics Platform v2)

Access and user services. Enhanced user access and control; updated guidebook

Neurorobotics capabilities. Enhancements to simulated robots, environments and experiments; closed-loop support for conceptual models and simplified models derived from detailed models; links to the Brain Simulation, HPC and Neuromorphic Computing platforms; first simulated robots and devices, environments and experimental conditions

M90: Simulated mouse cognition and behaviour (Neurorobotics Platform v3)

Access and user services. Enhanced user access and control; updated guidebook

Neurorobotics capabilities. Closed-loop support for in silico mouse brain experiments; comprehensive library of simulated robots and devices, environments and experimental conditions for customisation

M120: Simulated human cognition and behaviour (Neurorobotics Platform v4)

Access and user services. Enhanced user access and control. Updated guidebook for the use of the Neurorobotics Platform

Neurorobotics capabilities. Closed-loop support for human brain in silico and emulation experiments; services for customisation of robots and devices, environments and experimental conditions

1.2.3 Applications

Note: during the ramp-up phase, HBP applications research will be limited to three small-scale pilot projects, designed to test the platforms and provide a first indication of their potential value. In this phase, therefore, the three applications subprojects described below (SP11-13) will be merged into a single subproject on Applications.

1.2.3.1 SP11: Future neuroscience

Operational Objectives

The tools provided by the Neurorobotics Platform will allow cognitive neuroscientists to create set-ups in which a brain model (a model of a specific circuit, a brain region or system, or a whole brain model) is coupled to a simulated robot (a virtual “tissue sample”, a “virtual animal”), which interacts with a virtual environment (a simulated experimental set-up). Neuroscientists will use these set-ups to replicate classical experimental paradigms and eventually to develop new ones. This will make it possible not only to replicate the kinds of studies it is possible to carry out in brain slices, but also to dissect the mechanisms linking specific characteristics of the neural system (topology, cell properties, adaptation mechanisms, etc.) to behaviour. To demonstrate these capabilities, the project will design experimental set-ups equivalent to those used in classical neuroscience studies and use them to gain new insights into the causal relationships linking the basic constituents of the brain (genes, molecules, neurons, synapses, microcircuits, brain regions and brain systems) to perception, cognition and behaviour. In the ramp-up phase, the HBP will demonstrate its approach for the case of visual perception (basic psychophysics of visual perception).

State of the art

The evolutionary function of a brain is to control an organism’s behaviour – the way its body interacts with the environment, as defined by the nervous system within the perception-action loop. In principle, the only way to study this is to link a brain model to a body acting in an environment, usually in a closed loop. We can then interrogate the model through well-designed experiments, identifying causal mechanisms by lesioning or manipulating specific brain regions, transmitter systems, types of neuron etc. Although robotics has yet to win broad recognition as a valid tool for cognitive and behavioural research, a number of groups have attempted to use robots as an experimental tool. An interesting example is Barbara Webb and Henrik Lund’s work on


cricket phonotaxis [119]. In this pioneering study, the two researchers built an artificial neural network (ANN) reproducing known features of the neuronal circuits believed to be responsible for the female response to the male mating song. Franceschini has pioneered the use of robots to investigate the fly visuomotor system [123], characterizing and replicating simple behaviours such as obstacle avoidance. Other studies using a similar approach have simulated the role of place cells and the hippocampus in rodent navigation [124], motion control by the cerebellum [125-128], motor control [129-132], binocular vision vergence [133], the behaviour of the basal ganglia [134, 135], and other functions [136].

Different studies have simulated nervous systems at different levels of abstraction, ranging from abstract functional “boxes” with well-defined characteristic functions to spiking neural networks based on point neuron models. Some of the most sophisticated models have incorporated spatio-temporal cell dynamics and the synaptic conductances generated by AMPA, GABA and NMDA channels.
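By way of illustration, such conductance-based synaptic currents are typically computed as the product of a conductance and a driving force, with the NMDA current additionally gated by a voltage-dependent magnesium block. The sketch below uses a widely cited phenomenological fit for that block (Jahr and Stevens, 1990); all conductance and reversal-potential values are examples:

```python
# Sketch of conductance-based synaptic currents (AMPA, NMDA, GABA).
# The NMDA magnesium-block term follows the Jahr & Stevens (1990) fit;
# conductances and reversal potentials below are example values only.
import math

E_AMPA, E_NMDA, E_GABA = 0.0, 0.0, -70.0        # reversal potentials (mV)

def nmda_mg_block(v, mg_mm=1.0):
    """Voltage-dependent fraction of unblocked NMDA channels."""
    return 1.0 / (1.0 + (mg_mm / 3.57) * math.exp(-0.062 * v))

def synaptic_current(v, g_ampa, g_nmda, g_gaba):
    """Total synaptic current (conductances in nS, V in mV, I in pA)."""
    i_ampa = g_ampa * (v - E_AMPA)
    i_nmda = g_nmda * nmda_mg_block(v) * (v - E_NMDA)
    i_gaba = g_gaba * (v - E_GABA)
    return i_ampa + i_nmda + i_gaba
```

The voltage dependence of the NMDA term is what gives these models their characteristic non-linearity: near rest the channel is largely blocked, and it contributes current only when the cell is already depolarised.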

Methodology

As reported earlier, robotics has yet to win broad recognition as a valid methodology for research in cognitive neuroscience. This means that methods are likely to develop rapidly as the HBP proceeds and researchers discover new strategies. In the early stages, the HBP approach will involve the following steps.

Researchers will choose a cognitive or behavioural paradigm that has already been well characterised in classical experimental settings. They will then design one or more in silico experiments reproducing these paradigms.

They will then choose a brain model (a model of a specific microcircuit, a larger brain region or brain system, or a whole brain model) and the level of detail of the model (a biologically detailed model, a point neuron model, or a multi-scale model representing different elements at different levels of detail).

They will use the capabilities of the Brain Simulation Platform to characterise and refine the behaviour of the model (e.g. response to stimuli), and to calibrate it to the requirements of the robot body (sensors, actuators etc.).

Once the model has been established and calibrated, they will export a simplified version to the Neurorobotics Platform and run the model (i) on a conventional computer cluster provided by the platform itself; (ii) on a physical emulation of the model running on an NCS; or (iii) on a multicore NCS. The choice of hardware will depend on the requirements of the experiment. If the experiment requires the model to run for extended periods of time (e.g. for purposes of training), experimenters will choose the physical emulation of the brain model, which runs much faster than real time.

Results from in silico experiments will be compared against results from the original paradigm. Quantitative and qualitative differences will be analysed, and the results used to refine the brain model, the robot body and the training protocol.

Once the model has been validated, researchers will design new experiments to dissect the neuronal mechanisms responsible for its observed behaviour. These may involve manipulations (e.g. simulated lesions, simulated application of drugs) and systematic measurements (e.g. precise measurements of the activity of large numbers of cells) that are difficult or impossible in animals or human subjects.
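The logic of such a simulated lesion study can be reduced to a toy example: silence a chosen subset of model "neurons" and measure how task performance degrades relative to the intact model. The linear rate network below is purely schematic; weights, trials and targets are invented:

```python
# Toy illustration of an in silico lesion study: silence a chosen subset of
# model "neurons" and quantify how task performance degrades. Entirely
# schematic; a real study would run the full simulated experiment instead.

def network_output(weights, inputs, lesioned=frozenset()):
    """Linear rate model: lesioned units contribute nothing."""
    return sum(w * x for i, (w, x) in enumerate(zip(weights, inputs))
               if i not in lesioned)

def task_error(weights, trials, lesioned=frozenset()):
    """Mean absolute error over (inputs, target) trials."""
    errs = [abs(network_output(weights, xs, lesioned) - target)
            for xs, target in trials]
    return sum(errs) / len(errs)

weights = [0.5, 1.5, -0.25, 2.0]
trials = [((1, 0, 1, 0), 0.25), ((0, 1, 0, 1), 3.5), ((1, 1, 1, 1), 3.75)]

baseline = task_error(weights, trials)                    # intact network
lesion_unit_3 = task_error(weights, trials, frozenset({3}))
```

Comparing `baseline` with `lesion_unit_3` identifies unit 3 as causally necessary for the trials in which it is driven, which is exactly the inference pattern a simulated lesion experiment supports at scale.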

Members of the HBP Consortium will pioneer the use of the new techniques. However, the majority of research will be performed by researchers from outside the HBP, selected through the HBP Competitive Calls Programme. The HBP will encourage experimental investigations of a broad range of perceptual, cognitive and motor capabilities, beginning with capabilities that are relatively simple and gradually moving towards more advanced functionality. Candidate capabilities include basic visual, auditory and somatosensory processing including multisensory perception; object recognition (recognition of faces, body parts, houses, words etc.); action recognition; novelty detection (e.g. auditory novelty detection through mismatch negativity); motivation, emotion and reward; premotor transformations, motor planning and execution of motor behaviour; representations of the spatial environment and navigation; decision-making and error correction; information maintenance and memory encoding: working memory, time-dependent stabilisation of cortical representations; and language production and processing. The ramp-up phase will focus on visual perception. This work will benefit from experimental investigations in the Multi-level Organisation of the Mouse Brain (WP 1.7) and in Cognitive Architectures (WP 3.1) subprojects.


Roadmap and key milestones

M30: Foundations for future neuroscience

Preparation. Guidelines for large-scale collaborative experiments

Models and tools. Models of mouse microcircuits; early prototypes of virtual robots and environments.

Experiments. First closed-loop experiments with cortical microcircuit model simplified from a biologically detailed model; basic experiment design and performance testing on a visual cognitive neuroscience task.

Visualization. Parallel visualization of brain model activity and task performance.

Analysis. Learning rates; critical cortical layers, neurons and synaptic pathways involved; spatio-temporal pattern analysis

M60: Demonstrations in future neuroscience: perception and action

Models and tools. Models of mouse microcircuits and brain regions

Experiments. First closed-loop experiments with a simplified model of the mouse visual system and thalamus; set-up of a standardised perception-action task.

Visualisation. Parallel visualisation of brain model activity and perception-action task during learning

Analysis. Learning rates; spatio-temporal pattern analysis; principles of perception-action

M90: Mechanisms and principles of cognition: Spatial navigation and goal-oriented behaviour

Models and tools. Models of whole mouse brain (detailed and simplified models).

Experiments. First closed-loop experiments using simplified whole-brain mouse models for spatial navigation and goal-oriented tasks (Morris water maze, radial 8-arm maze, etc.)

Visualisation. Parallel visualisation of brain model activity and goal-oriented spatial navigation task during learning

Analysis. Learning rates; critical cortical layers, neurons, synaptic pathways, proteins and genes involved; spatio-temporal pattern analysis; mechanisms and principles of goal-oriented spatial navigation

M120: Mechanisms of cognition

Models and tools. Simplified models of whole human brain

Experiments. First closed-loop experiments with a simplified whole-brain human model; replication of standardised cognitive and behavioural tasks defined by the subproject on brain function and cognitive architectures

Visualisation. Parallel visualisation of brain model activity and cognition during learning

Analysis. Learning rates; critical brain regions, connections, cortical layers, neurons, synaptic pathways, proteins and genes involved; spatio-temporal pattern analysis; mechanisms and principles of cognition

1.2.3.2 SP12: Future medicine

Operational Objectives

The HBP seeks to provide researchers in medicine and pharmacology with the tools they need to accelerate research into the causes, diagnosis and treatment of neurological and psychiatric disease. HBP work in this area will have four specific aims:

1. To facilitate the identification of specific differential disease signatures by the Medical Informatics Platform, using data from different levels of biological organisation, and to develop new nosological classifications based on predisposing factors and biological dysfunctions rather than symptoms and syndromes.

2. To use biological signatures of disease as a source of insights into disease processes, testing specific hypotheses about disease mechanisms through modelling and simulation.

3. To use disease models to identify potential drug targets and other possible treatment strategies and to predict desirable and adverse effects.

4. To develop strategies for personalised medicine, allowing the development of treatments adapted to the specific condition of individual patients or of specific subgroups of sensitive or vulnerable patients.


In each of these areas, initial investigations will be led by members of the HBP Consortium. However, the bulk of funding will go to projects proposed by scientists from outside the Consortium.
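The first aim can be caricatured with a toy nearest-centroid classifier: patients described by vectors of biological measurements are grouped by their closest disease signature, independently of their symptomatic labels. All names and feature values below are invented:

```python
# Toy sketch of signature-based grouping: patients described by biological
# feature vectors are assigned to the nearest disease-signature centroid,
# independently of symptomatic labels. Signatures and values are invented.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(patient, signatures):
    """Return the name of the closest biological signature."""
    return min(signatures, key=lambda name: distance(patient, signatures[name]))

signatures = {
    "signature_A": (1.0, 0.0, 0.2),   # e.g. a protein-misfolding profile
    "signature_B": (0.1, 0.9, 0.8),   # e.g. a synaptic-dysfunction profile
}

label = classify((0.9, 0.1, 0.3), signatures)
```

Real signature discovery would of course work with high-dimensional, multi-level clinical data and learned rather than hand-written centroids; the sketch only shows the shift from symptom labels to biological distance.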

State of the art

Since the development of the first anti-psychotic agents in the 1950s and SSRIs in the 1990s, there has been relatively little progress in the development of new drugs for neurological and psychiatric disease. Nearly all current drugs are designed to manage patient symptoms rather than to alter the course of the underlying disease, and in many cases they are effective only for a subset of patients. Many also have unattractive side-effect profiles, leading to poor patient compliance with treatment regimes. Efforts to develop new, more effective medicines have been hampered by expensive failures in clinical trials. Poor success rates are leading large pharmaceutical companies to withdraw from this area of research [137].

Part of the difficulty in developing new treatments for brain disease is due to difficulties in diagnosing patients, particularly in early asymptomatic stages of disease progression, when intervention is most likely to be effective. Information from Genome-Wide Association Studies (GWAS) has made it increasingly clear that many diseases with different biological causes (e.g., the spino-cerebellar ataxias and multiple associated mutations) present with similar symptoms and that diseases with the same underlying cause (e.g. Huntington’s disease) can present with very different symptoms (e.g. emotional disorders, cognitive deficits, movement disorders). This makes it difficult to create homogeneous trial cohorts, meaning that trials are larger and more expensive than in other pathologies, where diagnosis is easier.

However, the main obstacle to the development of new treatments is the lack of detailed causal explanations of neurological and psychiatric diseases and their clinical presentation. Currently, the causes of very few diseases are fully understood, even when their patho-anatomy and patho-physiology are largely known. In Parkinson's disease, for example, we still do not understand the steps that lead from degeneration of fewer than a million specific nigro-striatal cells to the first clinical symptoms (tremor, akinesia), which appear when 60% of these cells have already been lost [138]. This situation is complicated by the fact that other relatively common brain diseases have similar Parkinsonian manifestations. It is not known why such symptoms are so common.

Problems with current systems of disease classification and scientific advances – particularly in genetics – are slowly leading researchers to shift their attention from syndromic to biologically grounded classifications of disease. Until recently, for instance, the dementias were still diagnosed in terms of dementing syndromes, which often failed to match final post mortem analyses [139]. Today, by contrast, clinicians are beginning to interpret neurodegenerative disorders, including the dementias, as diseases of protein misfolding [140]. The Medical Informatics Platform will place Europe in a position where it could pioneer this new biological approach to nosology.

Another area of research, highly relevant to the HBP, is simulation-based pharmacology. Current applications of simulation in drug design focus on the dynamics of molecular interactions between drugs and their targets. To date however, there has been little or no work simulating the complex cascade of events that determines desirable or adverse effects at higher levels of biological organisation. The inability to predict these effects may be one reason for the high rate of failure of CNS drugs in clinical trials. Recent pharmacogenetic studies of anticonvulsants (patient responsiveness to positive drug effects and predisposition to adverse effects) support this hypothesis [141].

Methodology

Biological signatures of disease – new classifications of disease. One of the HBP’s most important goals will be to identify specific biological signatures that characterise disease processes in the brain. The discovery of such signatures could potentially lead to a new nosology, based on objective and reproducible biological and clinical data such as brain scans of various types, electrophysiology, electroencephalography, genotyping, metabolic, biochemical and haematological profiles and validated clinical instruments providing quantitative measurements of emotion and behaviour. Initial work by the HBP will focus on the biologically grounded categorisation of neurodegenerative disease (especially the dementias). However, a large part of the overall budget will be reserved for Competitive Calls for research by scientists from outside the Consortium. The calls will encourage systematic study of the full range of neurological and psychiatric disease, making no distinction between disorders of perception, cognition, action, mood, emotion and behaviour.
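The analytical idea behind disease signatures, grouping patients by multimodal biological measurements rather than by syndromic labels, can be sketched in miniature. The toy example below uses synthetic data, a from-scratch k-means and invented feature semantics; none of it represents the HBP's actual algorithms. It shows two biologically distinct "subtypes" separating cleanly in feature space even though a syndromic label might have lumped them together:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal k-means (Lloyd's algorithm); returns a cluster label per row."""
    # Deterministic initialisation: centroids spread across the data rows.
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each patient profile to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned profiles.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic "patient profiles": each row mixes standardised features such as
# imaging-derived volumes, EEG band power and blood markers (values invented).
rng = np.random.default_rng(1)
subtype_a = rng.normal(loc=0.0, scale=0.3, size=(20, 5))
subtype_b = rng.normal(loc=2.0, scale=0.3, size=(20, 5))
profiles = np.vstack([subtype_a, subtype_b])

labels = kmeans(profiles, k=2)
# The two underlying biological subtypes fall into two distinct clusters.
```

In practice, signature discovery would involve careful feature standardisation, validation against clinical outcomes and far richer models; the sketch only illustrates the shift from symptom-based to data-driven grouping.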


Simulate hypotheses of disease causation. The discovery of biological signatures for a disease will suggest hypotheses of disease causation. The Brain Simulation Platform will allow researchers to test these hypotheses, modelling alterations in brain physiology and structure and simulating the complex non-linear interactions leading to changes in cognition and behaviour.

We are at a tipping point. The realisation that the brain is not susceptible to linear analysis has come slowly. The Brain Simulation Platform will make it possible, for the first time, to simulate the effects of brain lesions on the overall functioning of brain systems, including the short-term, adaptive plasticity effects that normally palliate lesions. Simulation will also facilitate the testing of causative hypotheses for diseases for which there are no available animal models, and for disorders where such models are inadequate, for example, when disorders are associated with defects in higher cognitive function. Finally, simulation will teach researchers to distinguish between causative and secondary alterations associated with disease processes.

Simulation-based testing of drugs and other treatments for brain disease. The Brain Simulation Platform will make it possible to simulate the dynamics of drug delivery on multiple scales extending from the transport of the drug through the vasculature to its uptake at the cellular scale and the resulting changes in synaptic and neuronal function. In particular it will make it possible to explore combination therapies – a strategy which is becoming ever more important in the treatment of infectious disease and cancer [85]. Given the number of drugs with effects on the nervous system and the number of possible combinations, this is an option that is very difficult to explore in the clinic.
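The combinatorial nature of the problem can be made concrete with a deliberately crude sketch. Assuming a purely hypothetical multiplicative dose-response model (the potencies, threshold and dose grid are all invented, with no relation to real pharmacology or to any HBP tool), an exhaustive scan over dose combinations illustrates why in silico screening is attractive where clinical exploration is infeasible:

```python
from itertools import product

def residual_activity(doses, potencies):
    """Hypothetical multiplicative model: each drug independently leaves
    (1 - potency)**dose of a pathological process's activity."""
    r = 1.0
    for dose, potency in zip(doses, potencies):
        r *= (1.0 - potency) ** dose
    return r

potencies = (0.30, 0.20)   # invented per-dose effects for drugs "A" and "B"
dose_grid = range(0, 5)    # 0..4 dose units of each drug

# Exhaustively scan all combinations; keep those suppressing activity below
# 30%, then minimise the total dose (a crude proxy for toxicity burden).
viable = [(a, b) for a, b in product(dose_grid, dose_grid)
          if residual_activity((a, b), potencies) < 0.30]
best = min(viable, key=lambda d: d[0] + d[1])
```

Even this two-drug toy spans 25 combinations; with dozens of candidate CNS drugs the space grows far beyond what any trial programme could explore in the clinic.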

Virtual instruments provided by the platform (in silico analogues of standard methods such as MRI, PET and EEG) will allow researchers to observe, at the brain-region level, the effects of blocking, up-regulating or down-regulating a specific target or combination of targets, allowing comparisons with patient data. The platform will also allow researchers to model disease-specific syndromes or deficits (e.g., impaired vascular function due to stroke, selective loss of cells due to Parkinson's disease, etc.) and to simulate the effects of drugs on the modified model. Once validated, the functionality offered by the platform will play a valuable role in selecting drug targets, candidate drugs and treatment therapies for laboratory investigations and for expensive animal and human trials.
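The principle of observing the downstream effect of modulating a target can likewise be illustrated with a minimal model. Below, a leaky integrate-and-fire neuron (toy constants; the "target efficacy" parameter and the function itself are hypothetical sketches, not the Brain Simulation Platform's API) shows how halving a target's efficacy can silence a model cell entirely:

```python
def lif_spike_count(input_drive, target_efficacy, t_ms=500.0, dt=0.1):
    """Leaky integrate-and-fire neuron with invented constants.
    `target_efficacy` scales the effective drive, mimicking pharmacological
    up- or down-regulation of a hypothetical target."""
    v_rest, v_thresh, tau = -65.0, -50.0, 20.0       # mV, mV, ms
    v, spikes = v_rest, 0
    for _ in range(int(t_ms / dt)):
        drive = target_efficacy * input_drive        # drug-scaled input
        v += dt * (-(v - v_rest) + drive) / tau      # leaky integration (Euler)
        if v >= v_thresh:                            # threshold crossing
            spikes += 1
            v = v_rest                               # reset after a spike
    return spikes

baseline = lif_spike_count(20.0, target_efficacy=1.0)   # untreated model
blocked  = lif_spike_count(20.0, target_efficacy=0.5)   # target half-blocked
# Halving the target's efficacy drops the steady-state potential below
# threshold, so the model neuron stops firing altogether.
```

The same logic, applied across validated multi-scale models rather than a single toy equation, is what would allow simulated region-level read-outs to be compared with patient data.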

Services for personalised medicine. The discovery of reliable biological signatures for psychiatric and neurological disorders will represent a major step towards personalised medicine in which treatments are tailored to the conditions of individual patients. The HBP will collaborate actively with hospitals and industry to develop projects that implement and validate such techniques.

Roadmap and key milestones

M30: Foundations for future medicine – Alzheimer's disease and other dementias

Data and tools. Federated anonymised clinical data; algorithms for the identification of disease signatures

Experiments. First draft signatures for classification and diagnosis of dementias

M60: Demonstrations of future medicine – neurological diseases

Experiments. First draft signatures for classification and diagnosis of neurological diseases present in the data sources

M90: Demonstrations of future medicine – psychiatric diseases

Experiments. First draft signatures for classification and diagnosis of psychiatric diseases present in the data sources. First test of applying disease signatures to a mouse model of disease

M120: Future medicine – personalised disease signature-based diagnostics and disease and drug simulation

Experiments. First simulation of a biological disease signature in a human brain model; first simulation of the impact of a drug on the disease

Services. Personalised diagnostics through application of disease signature to individual patient data


1.2.3.3 SP13: Future computing

Operational Objectives

One of the main goals of the HBP will be to use results from brain modelling to develop new computing technologies. The ICT platforms will make it possible to build and test novel software, hardware and robotic systems, inspired by knowledge of the brain and to explore their applications. Such systems have the potential to overcome critical limitations of current ICT, including limits on programmability, power consumption and reliability. The HBP will implement early projects to demonstrate these possibilities and will dedicate a significant part of its funding to support for projects proposed by researchers from outside the project. If these initiatives are successful, the end result will be novel applications with a potentially revolutionary impact on manufacturing, services, health care, the home, and other sectors of the economy.

State of the art

Although the number of components per chip and the performance per unit of investment continue to grow exponentially, other measures of computing performance, such as power consumption per chip and clock speed, have already reached saturation. On measures such as the number of components per unit area, current technology is rapidly approaching fundamental limits imposed by the atomic structure of matter. Already, today's deep-submicron technologies suffer from extreme requirements in lithography and production technology, making investment in chip foundries a multi-billion dollar endeavour. These trends go hand in hand with ever increasing software complexity and power consumption. The computer industry is already exploiting parallelism and redundancy in many-core processors and many-processor computing systems interconnected by high-bandwidth, low-latency interconnection fabrics. Brain-inspired technologies can take this parallelism to the extreme, while simultaneously introducing new brain-inspired techniques, such as low-power analogue computing devices and asynchronous inter-processor communications. These developments can open the road to low-power, very fast, highly reliable systems with the ability to learn new tasks without explicit programming [85].
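The final claim, learning without explicit programming, can be illustrated at the smallest possible scale by a classic perceptron. In the sketch below the AND function is never written into the program; it is acquired from labelled examples through error-driven weight updates (illustrative only, not a neuromorphic implementation):

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a two-input threshold unit from (inputs, target) examples.
    The task is defined entirely by the training data, not by the code."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1   # error-driven weight correction
            w[1] += lr * err * x2
            b += lr * err           # bias update
    # Return the learned decision function.
    return lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# The AND mapping is supplied only as examples; the weights discover it.
and_gate = train_perceptron([((0, 0), 0), ((0, 1), 0),
                             ((1, 0), 0), ((1, 1), 1)])
```

Neuromorphic hardware pursues the same property, acquiring behaviour through synaptic adaptation, but in massively parallel, low-power analogue or event-driven substrates rather than in a sequential loop.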

Methodology

The HBP will collaborate with industry partners and researchers from outside the HBP to demonstrate the potential of the ICT platforms for the development of novel systems and applications, inspired by the architecture of the brain. The Consortium will consider proposals in four areas of systems or applications development.

High performance computing. Neuromorphic cores for conventional high performance computers, brain-inspired communications protocols; brain-inspired strategies for information storage and retrieval; massively parallel very low-power computing cores

Software. Applications incorporating advanced capabilities for pattern recognition, feature recognition, motor control, decision-making etc. (e.g. applications in industrial control, image processing, language processing etc.)

Neuromorphic computing systems and devices. Neuromorphic controllers for manufacturing, household appliances, vehicles, image and video processing, mobile telecommunications etc.; neuromorphic processors for use in high performance computing, neuromorphic processors for use in commodity computers and mobile devices

Robotics. Specialised neurorobotic systems for applications in manufacturing, services, health-care, ambient assisted living, the home and entertainment.

Milestones and roadmap

M30: Foundations of future computing

Methods and tools. Conceptual design of future neuromorphic computing cores finished; user access strategy for neuromorphic systems demonstrated; hybrid operation of non-von Neumann and von Neumann architectures demonstrated; middleware concept for future interactive supercomputing demonstrated

Applications. Data mining on neuromorphic systems demonstrated


M60: Demonstrations of future computing

Methods and tools. Set-up for learning and development on accelerated neuromorphic system demonstrated; initial interactive visual supercomputing demonstrated; user environment for interactive experiments with virtual robots on HPC demonstrated

Applications. Initial neuromorphic systems for generic prediction making demonstrated; initial neuromorphic controller demonstrated; initial export of a trained circuit to a custom neuromorphic chip demonstrated; initial use of nanoelectronic components in neuromorphic circuits demonstrated

M90: Consolidation of future computing

Methods and tools. Interactive visual supercomputing established; user environment for interactive experiments with virtual robots on HPC in routine use; routine closed-loop learning and development on accelerated neuromorphic systems

Applications. Routine off-site use of neuromorphic systems for non-biological applications; neuromorphic controller used in off-site industrial demonstrator mode; routine export of trained circuits to custom neuromorphic chips; custom neuromorphic chips used in off-site demonstrator mode; hybrid digital and neuromorphic computer operational for project use

M120: Establishment of future computing

Methods and tools. Exascale computing capability established and accessible to the HBP; fully interactive and multi-scale supercomputer remotely accessible and in routine use

Applications. Routine closed-loop virtual robot experiments with an interactive HPC (human scale); hybrid digital and neuromorphic computer operational for use outside the project; use of exported custom neuromorphic chips in industrial applications

1.2.4 SP14: The HBP Ethics and Society Programme

Operational Objectives

HBP research and technology development has numerous social, ethical and philosophical implications. The project thus has an interest in recognising concerns early and in addressing them in an open and transparent manner. In particular, early engagement can provide scientists with opportunities to gauge public reaction to their work, and to hone their research objectives and processes in the light of these reactions. The HBP will therefore launch a major Ethics and Society Programme, with the goals of exploring the project's social, ethical and philosophical implications, promoting engagement with decision-makers and the general public, promoting responsible research and innovation by raising social and ethical awareness among project participants, and ensuring that the project is governed in a way that guarantees full compliance with relevant legal and ethical norms. The programme will draw on the methods developed during empirical investigations of emerging technologies in genomics, neuroscience, synthetic biology, nanotechnology and information and communication technologies [142], as well as on the biomedical tradition of engaging with ethical issues through the application of formal principles [143] – now usually implemented through ethical review processes.

State of the art

Forecasting innovation and its social and economic impact

HBP research entails high expectations of social and economic benefits. However, the impact of basic research results on society often depends not so much on the research itself as on developments in apparently unconnected areas of science and technology, or on social, political and legal factors external to science [144-146].

Current approaches to forecasting development pathways use one of two strategies. The first studies the views, attitudes and strategies of key stakeholders with methods from the empirical social sciences, involving interviews, focus groups and other ways of assessing the views, intentions and strategies of key participants [147, 148]; the second, which has reached its highest stage of development in the UK (www.bis.gov.uk/foresight), uses systematic foresight techniques such as modelling, horizon scanning and scenario planning. The goals of these exercises include, on the one hand, the early identification of new developments and assessment of their potential impact over the short, medium and longer term; on the other, an assessment of key ethical concerns such as privacy, autonomy, transparency, the appropriate balance of risks and benefits, responsibility and accountability, and equity and justice [149]. Foresight exercises play a central role in responsible innovation, as they enable 'anticipatory' action to be taken to shape the pathways of development in desired ways and to assess and manage risks in a timely manner, so as to maximise the societal benefits of research.

Conceptual and philosophical issues

Since the 1960s, scientific and technical advances [150] have made it ever easier to anatomise the brain at the molecular, cellular and circuit levels, encouraging claims that neuroscience is close to identifying the physical basis of mind. Such claims have major implications not only for medicine but also for policies and practices dealing with normal and abnormal human conduct, and for conceptions of personhood. The significance and consequences of these developments are strongly debated, with some authors arguing that we now know enough to understand the neural bases of human selfhood and higher mental functions [151, 152], while for others, the neuroreductionist model attributes capacities to brains that can only properly be attributed to persons [153, 154]. Some have suggested that progress in neuroscience will lead to radical improvements in our ability to treat psychiatric disease [155, 156]; others are more doubtful [157, 158]. Although functional imaging has been crucial in the development of new conceptualisations of human mental states, many leading researchers remain highly critical [159].

Meanwhile, studies of the neural basis of higher brain functions have fed scientific and semi-popular debates about ideas of personhood [160-162] and free will [163-165] while studies combining psychophysics and brain imaging (e.g., [166]) have encouraged philosophers to readdress the eternal mystery of conscious awareness. The emerging discipline of neuroethics, a home for some of these discussions, has produced an extensive literature both on general conceptual issues [167-170], and on specific questions such as the functional neuroimaging of individuals belonging to different ethnic and age groups [171, 172], cognitive enhancement and memory distortion [173-176], neuroscience and law [177-179] and potential military applications of neuroscience [180]. The capabilities developed by the HBP will provide new material for these debates.

The public, dialogue and engagement

Attempts to achieve public dialogue and engagement during the development of new technologies [181, 182] have used a range of methods and approaches [183] including consensus conferences, citizen juries, stakeholder workshops, deliberative polling, focus groups and various forms of public dialogue. In the UK, for example, the dialogue is organised through the nationally funded ‘ScienceWise’ initiative (see http://www.sciencewise-erc.org.uk/). The Rathenau Institute in the Netherlands (http://www.rathenau.nl/en.html) has also been very active. The motivations for such exercises [145, 184, 185] are sometimes normative – citizens affected by research have a right to participate in crucial decision-making – sometimes instrumental. Many authors have argued, for instance, that dialogue can reduce conflict, help to build trust and smooth the introduction of innovative technology. The strongest conclusion from these debates is that not even the best prepared exercises can comprehensively represent the positions of all parts of society or resolve the issue of which groups or opinions should have most weight in a particular decision. It is important, therefore, that such exercises respect scientists’ legitimate desire to inform the public about their research, while avoiding self-conscious attempts to steer public opinion in a particular direction. Experience from other areas of emerging technology research shows that this requires a sensitive approach [186]. Public engagement exercises are successful only if participants are convinced that they can genuinely influence the course of events [187].

Researcher awareness

Ethical issues cannot be reduced to simple algorithms or prescriptions: moral statements and positions always require higher-level ethical reflection and justification. From an ethical point of view, this reflection will come, not just from external “ethical experts”, but also from researchers and their leaders. This kind of general reflexivity is currently not the norm and is likely to meet resistance. Studies suggest that the best way of achieving it is to embed measures to raise researcher awareness in governance structures [188], a technique already applied in other areas of cutting-edge technical research, notably nanotechnology (www.nanocode.eu) [189] and synthetic biology.

Governance and regulation

Today's science regulatory environment is a result of research that provoked a vigorous social and governmental response [190]. One example is animal research, in which the response took the form of the Council of Europe's Convention for the Protection of Vertebrate Animals used for Experimental and other Scientific Purposes (ETS 123) (1985), and the EU Directive for the Protection of Vertebrate Animals used for Experimental and other Scientific Purposes [191] – documents that have set European standards for the laboratory use of mice and other vertebrates. Another more recent example is synthetic biology. In this case, the reaction came only after a private institution had created the first self-replicating bacterial cell from a completely synthetic genome [142]. Equally compelling cases can be gleaned from biomedicine, genetics, information and computer technology, bioengineering, neurorobotics, and nanotechnology [186].

Modern governance of innovation in biotechnology involves a variety of actors, including research organisations, national and supranational regulators, governmental or quasi-governmental organisations, professional bodies, publishers of science journals, and representatives of the mass media and public opinion. As Gottweis [192] noted for the case of transnational research on embryonic stem cells, decision-making takes place “… at the fuzzy intersection between science, society, and politics”. This is complicated, in the case of international projects, by the need to take account of different national jurisdictions.

Methodology

The Human Genome Project's Ethical, Legal and Social Issues (ELSI) programme [193] – which absorbed 3-5% of the project's total budget – demonstrated that open public discussion is an effective strategy for handling potentially controversial issues raised by scientific research. The HBP will draw on the lessons of this programme, setting up its own Ethics and Society Programme, running for the whole duration of the project. The programme will bring together scholars in the brain sciences, social sciences, and the humanities to study and discuss relevant issues, using all available channels to encourage open, well-informed public debate.

The HBP Ethics and Society Programme includes the creation and operation of a Foresight Lab designed to identify project research with potentially large impacts and to study its ethical, social and economic implications. Other work in the programme will investigate the philosophical and conceptual implications of HBP research. This programme will be accompanied by specific measures – a European Citizen's Deliberation, citizen juries, consensus conferences, web-based dialogue tools, education programmes – to encourage debate around central issues, among stakeholders and in civil society. These measures will be complemented by a parallel programme to raise awareness of social and ethical issues within the HBP Consortium. Finally, the HBP will design a detailed system of ethical governance to ensure that research carried out within the project meets the highest possible ethical standards and that it complies with relevant law and regulations. The governance system will include an Ethical, Legal and Social Aspects Committee that oversees the overall activities of the project and a Research Ethics Committee that collects and reviews HBP research ethics applications prior to submission to external Independent Review Boards.

Milestones and roadmap

The HBP Ethics and Society Programme will run continuously throughout the project. The work plan for the ramp-up phase defines specific milestones for the set-up of the programme. The HBP Consortium does not believe it is appropriate to fix detailed milestones for the full duration of the project.

1.2.5 SP15: Project and programme management, education, dissemination and innovation

This subproject will be dedicated to the management of the programme and to running the HBP Education Programme, further described below. The objectives of the subproject and the methods applied will be derived from those developed during the ramp-up phase, adapted and revised in the light of experience gained during that phase and of the new form of the Flagship Programme under Horizon 2020.

1.2.5.1 Programme management

As described in section 1.1, a key strategic objective for the Full Flagship project (SOFF-6) is to “Develop a framework for collaboration that links the partners under strong scientific leadership and professional project management, providing a coherent European approach and promoting effective alignment of regional, national and European research and programmes”. This work will begin in the ramp-up phase, with the creation of a European Research Programme Office that will coordinate efforts to build collaboration with European Programmes, national funding agencies, large European and international research initiatives and projects in relevant areas of research and with industrial companies, thereby bringing in new partners and guaranteeing the long-term financial sustainability of the project. This work will continue in the fully operational phase. The HBP's governance structure will be designed to support these goals. For a detailed discussion of the HBP as a European programme the reader is referred to section 2.

1.2.5.2 The HBP Education Programme

Operational Objectives

In addition to its scientific programme the HBP will organise a large-scale programme of education with three distinct objectives (SOFF-5). The first is to provide HBP researchers with innovative forms of multidisciplinary education that bridge the gap between ICT and the life sciences, especially in the early stages of their scientific careers; the second is to provide them with the specialist training they need to make effective use of specific HBP platforms; the third is to offer these same services to the broader scientific community, preparing a new generation of young researchers with the skills and knowledge to apply ICT to neuroscience, the life sciences in general, and the applications of neuroscience and the life sciences, in medicine and computing.

Methodology

The HBP Education Programme will consciously create and sustain a community of HBP students, providing a reservoir of skills and ideas for the project itself and for related scientific communities. To this end the programme will combine a broad range of different instruments, including studentships and fellowships, schools and workshops, online education, hands-on training at HBP sites, social networking and prizes.

The HBP student community. All students participating in any part of the HBP Education Programme will automatically be enrolled in the HBP Student Community and will maintain their status for at least three years after they have left the programme. The project will provide community members with many different forms of organisational and online support, including formal recognition of the status of “HBP Student”; recognition of special rights (privileged access to HBP tools and data; privileged channels of communication to HBP researchers); privileged access to information on the HBP programme; participation in an annual HBP student conference; participation in online forums; access to a career service, providing them with information and advice on open positions in academia and industry; and formal representation in the HBP General Assembly.

A curriculum for multidisciplinary brain research and its applications. The HBP will design and promote a multidisciplinary curriculum for young European scientists and technologists interested in new ways of using ICT in brain research and its applications in medicine and computing technology.

Lab visits. The HBP will intensively promote exchanges of early stage and advanced researchers between different labs in the project and between HBP labs and outside institutions. Visits will last between one and six months. The HBP will fund expenses for travel and living expenses but will not pay salaries. Funds for exchanges with labs outside the HBP Consortium will come from the Competitive Calls budget; funds for exchanges with labs inside the Consortium will come from the normal research budget.

Studentships and fellowships. The HBP will fund studentships and fellowships for early stage researchers, advanced researchers and senior scientists from outside the HBP Consortium. Studentships and fellowships will be awarded for specific topics in the HBP work programme, and will be funded by the HBP Competitive Call Programme. The award will include funding for travel and accommodation and, in the case of junior awardees, a salary based on local rates. The availability of studentships and fellowships will be well publicised in leading journals and through other appropriate channels. Awardees will be selected by a transparent peer review process based exclusively on candidates' scientific qualifications. Awardees will be able to choose the institution where they carry out their work. Over the ten years of the project, the HBP will fund the following types of award.

Studentships for Ph.D. students. Starting in the second phase of the project, the HBP will provide a certain number of three-year studentships for students wishing to pursue a Ph.D. in one of the partner institutions, on a topic defined in the HBP work plan.

Postdoctoral fellowships. Again in the second phase of the project, the HBP will provide two-year post-doctoral fellowships for advanced researchers (researchers who have received their Ph.D. within the previous four years) wishing to pursue post-doctoral research in one of the partner institutions, on a topic defined in the HBP work plan.


Visiting fellowships. The HBP will provide a certain number of fellowships for senior scientists wishing to carry out original research projects using the facilities provided by one or more of the HBP platforms and/or the European Institute of Theoretical Neuroscience.

Workshops, schools, on-site training and conferences. The HBP will organise a broad range of workshops, schools, on-site training and conferences.

Multidisciplinary workshops on HBP-related research topics. Each year, the HBP will organise a series of at least six multidisciplinary workshops on specific research topics relevant to the HBP. The workshops, each hosted by one of the HBP partners, each lasting between one and two days, will be designed to provide an introduction to the topic for researchers from other specialities. The majority of presentations will be given by HBP researchers, but where useful, workshops may include one or two invited speakers. Participants will cover their own costs. The workshops, which will have an average of twenty participants, will be open to researchers from inside and outside the HBP.

Platform training workshops. Each year from year two onwards, each of the HBP platforms will organise at least two hands-on training workshops, for researchers intending to use the platforms in their research. Participants will cover their own costs. The workshops, which will have an average of ten participants, will be open to researchers from inside and outside the HBP.

Summer schools. Each year, the HBP will organise and fund a summer school dedicated to a specific theme of HBP research. The length (up to two weeks) will be adapted to the subject under discussion. Each school will involve a mix of speakers from inside and outside the HBP. Participants, roughly 50% from inside and 50% from outside the HBP, will be selected by peer review and will receive full funding for their travel and accommodation.

Annual student conference. Each year, the HBP will fund a student conference open to students from inside and outside the HBP Consortium. Participants will cover their own costs. The HBP will work with third party organisations to offer travel grants to a limited number of participants.

Online education and community services. The HBP will use online technology to deliver education and community services to HBP students and those responsible for their education. The HBP website will include a special section dedicated to HBP students. This will provide the following services.

Online lectures and podcasts. The HBP will organise a series of online lectures and shorter online talks by leading researchers within the HBP. The lectures will be accessible to users, inside and outside the HBP.

Student discussion forums. The HBP will host moderated discussion forums with access restricted to HBP students. Each forum will address a specific topic in HBP research. Students will be encouraged to open and manage their own forums. The web platform will provide tools to support this function.

“Virtual laboratory facilities”. The web site will include “virtual laboratory facilities” based on the HBP platforms. Lecturers will use these facilities in their teaching.

Online learning materials for the HBP platforms. The web site will allow students to access online learning materials related to the use of the HBP platforms.

Social networking. The web site will provide “social-networking” services facilitating interactions among HBP students. Services will include the possibility for students to define personal profiles and friends lists, to control their visibility and to communicate with other members of the community.
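The profile, friends-list and visibility features described above could be modelled as sketched below. This is purely illustrative: the proposal specifies features, not a design, so all class, attribute and value names here (StudentProfile, the "public"/"friends"/"private" visibility levels, etc.) are hypothetical assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class StudentProfile:
    """Hypothetical minimal model of an HBP student profile with
    a friends list and a visibility setting (names are illustrative)."""
    username: str
    friends: set = field(default_factory=set)
    visibility: str = "friends"  # assumed levels: "public", "friends", "private"

    def add_friend(self, other: "StudentProfile") -> None:
        # Friendship is symmetric in this sketch.
        self.friends.add(other.username)
        other.friends.add(self.username)

    def visible_to(self, viewer: "StudentProfile") -> bool:
        # Owners can always see their own profile.
        if viewer.username == self.username:
            return True
        if self.visibility == "public":
            return True
        if self.visibility == "friends":
            return viewer.username in self.friends
        return False  # "private": hidden from everyone else
```

In a real deployment these rules would sit behind the web platform's access-control layer; the sketch only shows how per-student visibility control could be expressed.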

Prize. Every year the HBP will provide a prize of EUR 5,000 for the best Ph.D. thesis by a student in the HBP programme for Early Stage Researchers. The first awards will be made when these students complete their theses, probably in year four or five.

Milestones and roadmap

The HBP Education Programme will run continuously for the duration of the programme. The work plan for the ramp-up phase defines specific milestones for the set-up of the programme. The HBP Consortium does not believe it is appropriate to fix detailed milestones for the full duration of the project.


1.2.5.3 Encouraging take-up: dissemination and innovation

A strategic goal for the HBP is to “catalyse ground-breaking research into the structure and function of the human brain, the causes, diagnosis and treatment of brain disease, and brain-inspired computing technology” (SOFF-2). This will require a major effort to disseminate information about the project and to encourage innovation. This work will begin in the ramp-up phase and continue for the full duration of the project. For a detailed description of HBP policy on these issues, the reader is referred to section 0.

1.3 Risks and contingencies

Most of the key risks facing the Human Brain Project are discussed in the DoW for the ramp-up phase and will not be discussed further here. Many of these risks will be cleared during the ramp-up phase. In the section below, we discuss a small number of risks that mainly affect later phases of the project and are therefore not discussed in the DoW for the ramp-up phase.


Objective: Develop, deploy and operate the platforms
Risk from: Failure/delays in the transition to a service-oriented managerial culture
Risk for: Quality of service; acceptance of the platforms by potential users
Probability: Moderate. Impact: High.
Risk management strategy:
Current status. With the exception of partners working in High Performance Computing, the Consortium has limited experience in providing community access to its resources.
Avoidance and contingency planning. The Consortium recognizes that this is a potentially critical problem for the operational phase. During the ramp-up phase, the partners will invest considerable resources in building up their capabilities in this area.
End of risk. This risk will be cleared when all partners are operating regularly with a significant number of users, probably around year 4.

Objective: Develop, deploy and operate the platforms
Risk from: Failure/delays in gaining access to required HPC capabilities (exascale computing)
Risk for: Feasibility of large-scale human brain modelling and simulation
Probability: Moderate. Impact: High.
Risk management strategy:
Current status. It is very unlikely that the Consortium will have the financial capability to purchase a dedicated exascale computer. It is likely therefore that the project will have to share capacity with other organizations. This will be challenging, in organizational and financial terms.
Avoidance and contingency planning. The Consortium recognizes that this is a potentially critical problem. It has already formulated clear plans for the procurement of pre-exascale capabilities. The partners believe that they will have enough time to negotiate satisfactory arrangements for access to the capabilities they need.
End of risk. This risk will be cleared when all partners are operating regularly with a significant number of users, probably around year 4.

Objective: Demonstrate the scientific value of the platforms for research
Risk from: Failures/delays in predictive informatics
Risk for: Increased requirements for experimental data; reduced amounts of data available for modelling; less accurate models
Probability: Moderate. Impact: High.
Risk management strategy:
Current status. Predictive informatics is a fundamental element in HBP strategy to model and simulate the brain. The HBP partners have already demonstrated the power of the approach (e.g. as a means to predict connectivity in neural microcircuits from data on neural morphology). However, each predictive model represents a scientific problem in its own right. Hence, success in one area does not guarantee success in other independent problems. This means that complete failure of the strategy is very unlikely but that the chance of failure for any particular model is high. An individual failure would not halt the project, but multiple failures would be very serious.
Avoidance and contingency planning. Formally, delays will be handled in the same way as delays with the platforms (see above). However, reviews will bear in mind that this is new scientific territory and that it is not possible to plan scientific breakthroughs with the same precision as technological milestones. In cases where predictive informatics is unsuccessful, the project will use existing sources of data and approximations.
End of risk. Risks for individual predictive problems will be cleared one at a time. Some risk will remain for the whole project.

Objective: Demonstrate the scientific value of the platforms for research
Risk from: Failure/delays in scaling up existing brain models and simulations
Risk for: Credibility of HBP strategy for brain modelling and simulation