
Goals
The ultimate goal of the Blue Brain Project is to reverse engineer the mammalian brain. To achieve this goal, the project has set itself four key objectives:

1. Create a Brain Simulation Facility with the ability to build models of the healthy and diseased brain, at different scales, with different levels of detail, in different species.
2. Demonstrate the feasibility and value of this strategy by creating and validating a biologically detailed model of the neocortical column in the somatosensory cortex of young rats.
3. Use this model to discover basic principles governing the structure and function of the brain.
4. Exploit these principles to create larger, more detailed brain models, and to develop strategies to model the complete human brain.


Strategy
The Blue Brain project's strategy hinges on two key elements. The first is the creation of a Brain Simulation Facility integrating the complete process of producing brain models: from the acquisition of data from neuroscience experiments and the literature, through the databasing and analysis of this data, to model building, simulation and the analysis and visualization of the results. This has required the development of detailed workflows and specialized software applications for every stage in the process. It has also required the creation and continuous updating of the necessary technological infrastructure: state-of-the-art set-ups for the acquisition of experimental data and massive supercomputers for neuroinformatics, model building, simulation, data analysis and scientific visualization.

The second element in the strategy is the systematic search for basic principles of design that make it possible to predict specific features of the brain without measuring them directly. Examples include prediction of the distribution of ion channels from neurons' electrical behavior and prediction of microcircuit connectivity from data on neuron morphology. This is what the project calls predictive reverse engineering.
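The idea of predicting connectivity from morphology can be illustrated with a toy "touch detection" calculation: given two digitized fibers, count the points where one passes within a threshold distance of the other. The sketch below uses invented straight-line geometry and an assumed threshold; it is a minimal illustration of the principle, not the project's actual algorithm.

```python
import numpy as np

def count_touches(axon_points, dendrite_points, threshold_um=2.0):
    """Count axon sample points lying within `threshold_um` of any
    dendrite sample point -- a crude stand-in for structural 'touches'."""
    # Pairwise distances between the two point clouds (N x M matrix).
    d = np.linalg.norm(axon_points[:, None, :] - dendrite_points[None, :, :],
                       axis=-1)
    return int(np.sum(d.min(axis=1) < threshold_um))

# Two made-up fibers sampled along straight lines that cross near the origin.
axon = np.linspace([-50, 0, 0], [50, 0, 0], 101)      # along the x-axis
dendrite = np.linspace([0, -50, 1], [0, 50, 1], 101)  # along y, 1 um above
print(count_touches(axon, dendrite))                  # points near the crossing
```

In a real pipeline the point clouds would come from 3D reconstructions of stained neurons, and the touch count would feed statistical predictions of synapse locations.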

Infrastructure

Main components of the infrastructure

The Blue Brain workflow depends on a large-scale research infrastructure, providing:

- State-of-the-art technology for the acquisition of data on different levels of brain organization: multi-patch clamp set-ups for studies of the electrophysiological behavior of neural circuits, Multi-Electrode Arrays (MEAs) allowing stimulation of and recording from brain slices, facilities for the creation and study of cell lines expressing particular ion channels, a variety of imaging systems, and systems for the 3D reconstruction of neural morphologies;
- An IBM 16,384-core Blue Gene/P supercomputer for modeling and simulation (provided by CADMOS);
- A 32-processor SGI system, connected to the Blue Gene machine via dedicated 10 Gbit/s fiber optic cables and providing facilities for users to interact with visual representations of simulation results;
- A data center providing networked servers for use in data archiving and neuroinformatics.

Data acquisition infrastructure


The success of the Blue Brain project depends on very high volumes of standardized, high-quality experimental data covering all possible levels of brain organization. Data comes both from the literature (via the project's automatic information extraction tools) and from experimental work conducted by the project itself. Blue Brain's Data Acquisition Infrastructure provides the physical equipment necessary for this work. Most of the experimental equipment is currently made available by the EPFL Laboratory of Neural Microcircuitry (LNMC). The planned Human Brain Project, if accepted, will massively increase the range of data sources.

High Performance Computing

The Blue Brain workflow creates enormous demands for computational power. In Blue Brain cellular-level models, the representation of the detailed electrophysiology and communication of a single neuron can require as many as 20,000 differential equations. No modern workstation is capable of solving this number of equations in biological real time. In other words, the only way for the project to achieve its goals is to use High Performance Computing (HPC). The Blue Brain project's simulation of the neocortical column incorporates detailed representations of 10,000 neurons. A simulation of a whole-brain rat model at the same level of detail would have to represent up to 100 million neurons and would require 20,000 times more memory. Simulating the human brain would require yet another 1,000-fold increase in memory and computational power. Subcellular modeling, modeling of the neuro-glial vascular system and the creation of virtual instruments (e.g. virtual EEG, virtual fMRI) will further expand these requirements.
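The figures quoted above combine into a simple back-of-envelope calculation. The inputs are taken directly from the text; the derived products are only order-of-magnitude illustrations.

```python
# Back-of-envelope scaling from the figures quoted above.
eqs_per_neuron = 20_000      # differential equations for one detailed neuron
column_neurons = 10_000      # neurons in the neocortical column model
rat_neurons = 100_000_000    # whole rat brain at the same level of detail

column_eqs = eqs_per_neuron * column_neurons   # equations in one column model
rat_eqs = eqs_per_neuron * rat_neurons         # equations in a whole-rat model

rat_memory_factor = 20_000                     # quoted: column -> rat brain
human_memory_factor = rat_memory_factor * 1_000  # quoted: a further 1,000-fold

print(f"Equations per column model: {column_eqs:,}")
print(f"Equations per rat-brain model: {rat_eqs:,}")
print(f"Memory, human brain vs column: {human_memory_factor:,}x")
```

At 200 million coupled equations for a single column, the case for HPC is immediate: no workstation can integrate that system in biological real time.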

In the initial phase of its work, the Blue Brain project used an IBM BlueGene/L supercomputer with 8,192 processors. Today, it uses a 16,384-core IBM BlueGene/P supercomputer with almost 8 times more memory than its predecessor. This machine is large enough to prototype mesoscale circuits containing up to several million neurons. Industry roadmaps suggest that exascale computers large enough to meet the project's requirements will be available by 2018-20. However, the transition to the new class of machines poses significant challenges.

Constraints on energy consumption mean that supercomputers at the exascale and beyond will be less generic than current-generation machines. If the new machines are to be useful for Blue Brain, the project will need to influence technology development. Key requirements include:

- Extremely large memory;
- High memory bandwidth;
- High I/O capabilities;
- In situ data analysis and visualization capabilities;
- High availability, high uptime and interactivity.

The design of the new machines will require novel hardware-software co-design strategies and novel tools supporting these strategies. Designers will need to support new modes of interaction between domain scientists and supercomputers, including real-time, interactive visualization, navigation and control of simulations and the use of the supercomputer as a virtual instrument; this will require a radical rethink of basic architectures. The Blue Brain Facility includes a powerful infrastructure for High Performance Computing, including:

- A 4-rack IBM Blue Gene/P supercomputer for modeling and simulation;
- A 32-processor SGI system, providing facilities for users to interact with visual representations of simulation results.


Publications

2011
1. Hay E., Hill S., Schürmann F., Markram H., Segev I. (2011). Models of Neocortical Layer 5b Pyramidal Cells Capturing a Wide Range of Dendritic and Perisomatic Active Properties. PLoS Computational Biology, 7(7): e1002107. doi:10.1371/journal.pcbi.1002107
2. Lasserre S., Hernando J., Hill S., Schürmann F., Anasagasti P.M., Jaoudé G.A., Markram H. (2011). A Neuron Membrane Mesh Representation for Visualization of Electrophysiological Simulations. IEEE Transactions on Visualization and Computer Graphics, 99 (preprints), 1.
3. Druckmann S., Berger T.K., Schürmann F., Hill S., Markram H., Segev I. (2011). Effective stimuli for constructing reliable neuron models. PLoS Computational Biology, 7(8): e1002133. doi:10.1371/journal.pcbi.1002133
4. Romand S., Wang Y., Toledo-Rodriguez M., Markram H. (2011). Morphological development of thick-tufted layer V pyramidal cells in the rat somatosensory cortex. Frontiers in Neuroanatomy, 5.
5. Anastassiou A.C., Perin R., Markram H., Koch C. (2011). Ephaptic coupling of cortical neurons. Nature Neuroscience, 14(2), 217.
6. Perin R., Berger T.K., Markram H. (2011). A synaptic organizing principle for cortical neuronal groups. PNAS, 108(12).
7. Hines M., Kumar S., Schürmann F. (2011). Comparison of neuronal spike exchange methods on a Blue Gene/P supercomputer. Frontiers in Computational Neuroscience, 5:49. doi:10.3389/fncom.2011.00049
8. Ramaswamy S., Hill S.L., King J.G., Schürmann F., Wang Y., Markram H. (2011). Intrinsic Morphological Diversity of Thick-tufted Layer 5 Pyramidal Neurons Ensures Robust and Invariant Properties of in silico Synaptic Connections. J Physiol. Epub ahead of print, 14 November 2011.
9. Markram H., Gerstner W., Sjöström P.J. (2011). A history of spike-timing-dependent plasticity. Frontiers in Synaptic Neuroscience, 3:4. Epub 29 August 2011.
10. Markram H., Perin R. (2011). Innate neural assemblies for lego memory. Frontiers in Neural Circuits, 5:6. Epub 16 May 2011.

2009
1. King J.G., Hines M., Hill S., Goodman P.H., Markram H., Schürmann F. (2009). A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON. Frontiers in Neuroinformatics, 3, 10.
2. Berger T.K., Perin R., Silberberg G., Markram H. (2009). Frequency-dependent disynaptic inhibition in the pyramidal network: a ubiquitous pathway in the developing rat neocortex. J Physiol, 587(Pt 22), 5411-5425.

2008
1. Markram H. (2008). Fixing the location and dimensions of functional neocortical columns. HFSP Journal, 2(3), 132-135.
2. Druckmann S., Berger T.K., Hill S., Schürmann F., Markram H., Segev I. (2008). Evaluating automated parameter constraining procedures of neuron models by experimental and surrogate data. Biol Cybern, 99(4-5), 371-379.
3. Ascoli G.A., Alonso-Nanclares L., Anderson S.A., Barrionuevo G., Benavides-Piccione R., Burkhalter A., et al. (2008). Petilla terminology: nomenclature of features of GABAergic interneurons of the cerebral cortex. Nat Rev Neurosci, 9(7), 557-568.
4. Köndgen H., Geisler C., Fusi S., Wang X.-J., Lüscher H.-R., Giugliano M. (2008). The Dynamical Response Properties of Neocortical Neurons to Temporally Modulated Noisy Inputs In Vitro. Cerebral Cortex, 18(9), 2086-2097.
5. Hines M.L., Eichner H., Schürmann F. (2008). Neuron splitting in compute-bound parallel network simulations enables runtime scaling with twice as many processors. J Comput Neurosci, 25(1), 203-210.
6. Calì C., Berger T., Pignatelli M., Carleton A., Markram H., Giugliano M. (2008). Inferring connection proximity in networks of electrically coupled cells by subthreshold frequency response analysis. Journal of Computational Neuroscience, 24(3), 330-345.
7. Hines M.L., Markram H., Schürmann F. (2008). Fully implicit parallel simulation of single neurons. J Comput Neurosci, 25(3), 439-448.
8. Kozloski J., Sfyrakis K., Hill S., Schürmann F., Peck C., Markram H. (2008). Identifying, Tabulating, and Analyzing Contacts between Branched Neuron Morphologies. IBM Journal of Research and Development, 52(1/2), 43-55.

2007
1. Arsiero M., Lüscher H.-R., Lundstrom B.N., Giugliano M. (2007). The Impact of Input Fluctuations on the Frequency-Current Relationships of Layer 5 Pyramidal Neurons in the Rat Medial Prefrontal Cortex. The Journal of Neuroscience, 27(12), 3274-3284.
2. Le Bé J.-V., Silberberg G., Wang Y., Markram H. (2007). Morphological, electrophysiological, and synaptic properties of corticocallosal pyramidal cells in the neonatal rat neocortex. Cereb Cortex, 17(9), 2204-2213.
3. Silberberg G., Markram H. (2007). Disynaptic inhibition between neocortical pyramidal cells mediated by Martinotti cells. Neuron, 53(5), 735-746.
4. Markram H. (2007). Bioinformatics: industrializing neuroscience. Nature, 445(7124), 160-161.
5. Druckmann S., Banitt Y., Gidon A., Schürmann F., Markram H., Segev I. (2007). A novel multiple objective optimization framework for constraining conductance-based neuron models by experimental data. Front Neurosci, 1(1), 7-18.

2006
1. Berger T., Lüscher H.R., Giugliano M. (2006). Transient rhythmic network activity in the somatosensory cortex evoked by distributed input in vitro. Neuroscience, 140(4), 1401-1413. doi:10.1016/j.neuroscience.2006.03.003
2. Markram H. (2006). The Blue Brain Project. Nature Reviews Neuroscience, 7 (February 2006), 153-160.
3. Le Bé J.-V., Markram H. (2006). Spontaneous and evoked synaptic rewiring in the neonatal neocortex. Proceedings of the National Academy of Sciences, 103(35), 13214-13219.
4. Migliore M., Cannia C., Lytton W.W., Markram H., Hines M.L. (2006). Parallel network simulations with NEURON. J Comput Neurosci, 21(2), 119-129.
5. Wang Y., Markram H., Goodman P.H., Berger T.K., Ma J., Goldman-Rakic P.S. (2006). Heterogeneity in the pyramidal network of the medial prefrontal cortex. Nat Neurosci, 9(4), 534-542. doi:10.1038/nn1670
6. Le Bé J.-V., Markram H. (2006). [A new mechanism for memory: neuronal networks rewiring in the young rat neocortex]. Med Sci (Paris), 22(12), 1031-1033.

2005
1. Muhammad A.J., Markram H. (2005). NEOBASE: databasing the neocortical microcircuit. Stud Health Technol Inform, 112, 167-177.
2. Maciokas J.B., Goodman P., Kenyon J., Toledo-Rodriguez M., Markram H. (2005). Accurate dynamical models of interneuronal GABAergic channel physiologies. Neurocomputing, 65-66, 5-14. doi:10.1016/j.neucom.2004.10.083
3. Kalisman N., Silberberg G., Markram H. (2005). The neocortical microcircuit as a tabula rasa. Proc Natl Acad Sci U S A, 102(3), 880-885.
4. Silberberg G., Grillner S., LeBeau F.E.N., Maex R., Markram H. (2005). Synaptic pathways in neural microcircuits. Trends in Neurosciences, 28(10), 541-551. doi:10.1016/j.tins.2005.08.004
5. Richardson M.J.E., Melamed O., Silberberg G., Gerstner W., Markram H. (2005). Short-Term Synaptic Plasticity Orchestrates the Response of Pyramidal Cells and Interneurons to Population Bursts. Journal of Computational Neuroscience, 18(3), 323-331.
6. Toledo-Rodriguez M., Goodman P., Illic M., Wu C., Markram H. (2005). Neuropeptide and calcium-binding protein gene expression profiles predict neuronal anatomical type in the juvenile rat. The Journal of Physiology, 567(2), 401-413.
7. Grillner S., Markram H., De Schutter E., Silberberg G., LeBeau F.E.N. (2005). Microcircuits in action: from CPGs to neocortex. Trends in Neurosciences, 28(10), 525-533.

2004
1. Markram H., Toledo-Rodriguez M., Wang Y., Gupta A., Silberberg G., Wu C. (2004). Interneurons of the neocortical inhibitory system. Nat Rev Neurosci, 5(10), 793-807. doi:10.1038/nrn1519
2. Toledo-Rodriguez M., Blumenfeld B., Wu C., Luo J., Attali B., Goodman P., et al. (2004). Correlation maps allow neuronal electrical properties to be predicted from single-cell gene expression profiles in rat neocortex. Cereb Cortex, 14(12), 1310-1327.
3. Wang Y., Toledo-Rodriguez M., Gupta A., Wu C., Silberberg G., Luo J., et al. (2004). Anatomical, physiological and molecular properties of Martinotti cells in the somatosensory cortex of the juvenile rat. J Physiol, 561(Pt 1), 65-90.

2002
1. Maass W., Legenstein R., Markram H. (2002). A New Approach towards Vision Suggested by Biologically Realistic Neural Microcircuit Models. In H. Bülthoff, C. Wallraven, S.-W. Lee & T. Poggio (Eds.), Biologically Motivated Computer Vision (Vol. 2525, pp. 1-6). Springer Berlin / Heidelberg.

The Human Brain Project


Blue Brain's success in modeling the rat cortical column has driven the development of the Brain Simulation Facility and has demonstrated the feasibility of the project's general strategy. But this is only a first step. The human brain is an immensely powerful, energy-efficient, self-learning, self-repairing computer. If we could understand and mimic the way it works, we could revolutionize information technology, medicine and society. To do so, we have to bring together everything we know and everything we can learn about the inner workings of the brain's molecules, cells and circuits. With this goal in mind, the Blue Brain team has recently come together with 12 other European and international partners to propose the Human Brain Project (HBP), a candidate for funding under the EU's FET Flagship program. The HBP team will include many of Europe's best neuroscientists, doctors, physicists, mathematicians, computer engineers and ethicists. The goal is to build on the work of the Blue Brain Project and on work by the other partners to integrate everything we know about the brain in massive databases and in detailed computer models.

This will require breakthroughs in mathematics and software engineering, an international supercomputing facility more powerful than any before, and a strong sense of social responsibility. Experimental and clinical data are accumulating exponentially. Computers powerful enough to meet the project's initial requirements are already here. As technology progresses and the project discovers new principles of brain design, it will build ever more realistic models. The benefits for society will be huge, even before the project achieves its final goals. The HBP's thirst for computing power will drive the development of new technologies for supercomputing and for scientific visualization. Models of the brain will revolutionize information technology, allowing us to design computers, robots, sensors and other devices far more powerful, more intelligent and more energy-efficient than any we know today. Brain simulation will help us understand the root causes of brain diseases, to diagnose them early, to develop new treatments, and to reduce reliance on animal testing. The project will also throw new light on questions human beings have been asking for more than two and a half thousand years. What does it mean to perceive, to think, to remember, to learn, to know, to decide? What does it mean to be conscious? In summary, the Human Brain Project has the potential to revolutionize technology, medicine, neuroscience, and society.

Glossary
-omics: The study of the human genome gave rise to the science and technology of genomics, the study of the genome. Other "-omics" studies and technologies refer to other, higher levels of biological organization, e.g. transcriptomics (the study of mRNA produced when genes are transcribed) and proteomics (the study of the proteins produced when mRNA is translated). When "-omics" is used as a general term, it refers to the complete set of all such studies and technologies, covering many levels of biological information.

Atlas: A work of reference (e.g. the Allen Mouse Atlas) showing how one or more data sets (e.g. gene expression data) map to specific regions and subregions of the brain.

BG/P: See BlueGene/P.

BlueGene/P: IBM supercomputer. The BlueGene/P used in the Blue Brain project is a massively parallel, tightly interconnected machine with 16,384 processors, 56 TeraFlops of peak performance, 16 TeraBytes of distributed memory and a 1 PetaByte file system.

Bouton: An enlarged part of the axon of a nerve cell forming the presynaptic terminal of a synapse with another cell.

Builder: A Blue Brain software application used to build models at a specific level of brain organization.

c-code: Standardized Blue Brain characterization of the electrical behavior of a neural microcircuit, obtained through application of a benchmark stimulation protocol.

Cable equation: A mathematical formulation, derived from classical electrodynamics, describing the propagation of current and voltage along a cable or (in the case of neuroscience) a neural fiber.

Capability job: A class of compute jobs that requires exclusive access to very large, tightly-coupled supercomputers. Simulation-based virtual experiments belong to this class.

Capacity job: A class of compute jobs that requires medium-size supercomputing resources. Numerous preparatory steps in the Blue Brain facility (e.g. builders) involve this kind of job.

Channelome: The full set of ion channels expressed by a cell.

Class hierarchy: Computer science term describing a hierarchy of classes at increasing levels of generality (e.g. individual, species, genus, family, etc.). Any object is an instance of a class in the hierarchy and of the superclasses to which the class belongs.

Compartmentalized model: A spatially discretized model of a neuron in which the neuron is represented by a set of digitized compartments, each with its own individual attributes. Used for purposes of numerical approximation.

Connectome: The complete connectivity map between neurons, including the locations of all synapses.

Connectomics: The study of the connectome.

Curation: Human processing (quality control, normalization of terminology, annotation, etc.) of data prior to use in modeling.

Data Model: An abstract model of the different classes of data used by a computer application or system and the hierarchical relationships between these classes.

Diffusion Tensor Imaging (DTI): A magnetic resonance imaging (MRI) technique used to produce images of neural tracts.

DTI: See Diffusion Tensor Imaging.

e-code: Standardized Blue Brain characterization of the electrical behavior of a cell, obtained through application of a benchmark stimulation protocol.

Electron Microscopy (EM): Use of an Electron Microscope to obtain highly magnified images of biological tissues.

EM: See Electron Microscopy.

Endoplasmic Reticulum (ER): An extensive system of parallel and folded membranes within a neuron.

Ephaptic effects: Effects due to current flow through the extracellular space.

Field Potential: An electrical potential created by a set of current sources.

Genetic Algorithm: An optimization technique based on a highly stylized version of Darwinian evolution.

High Performance Computing (HPC): The use of parallel processing to run advanced applications programs efficiently, reliably and quickly. The term HPC is sometimes used as a synonym for supercomputing, although technically a supercomputer is a system that performs at or near the currently highest operational rate for computers.

Hodgkin-Huxley channel models: Phenomenological description of genetically prescribed ion channels.

HPC: See High Performance Computing.

Ion channels: Proteins controlling the passage of ions through the cell membrane. Ion channels are targets for neuromodulatory systems and for drugs. The distribution of ion channels determines the electrical behavior of the cell.

Macrocircuit: Circuit linking different regions of the brain.

Marching cube space fill: Algorithm used to extract a surface mesh from a volume representation for parametrized geometry.

MCell: A widely used simulator from the Computational Neurobiology Lab, Salk Institute, USA. MCell is used in reaction-diffusion simulations of molecular interactions.

MEA: See Multi-Electrode Array.

Mesh representation: A term from 3D computer graphics. A representation based on a polygon mesh or an unstructured grid: a collection of vertices, edges and faces that defines the shape of a polyhedron.

Mesocircuit: Circuit linking multiple microcircuits.

Microcircuit: A neural circuit lying within the dimensions of the local arborizations of neurons (typically 200-500 µm).

Morph: Adapt a template model to a new set of parameters indicated by experiments or defined by a researcher.

Multi-Electrode Array (MEA): An array of electrodes allowing simultaneous stimulation of and recording from neural tissue.

Multiomics: The study of a system using multiple -omic levels of organization.

Multi-objective Optimization: An optimization procedure that measures the fitness of alternative solutions in terms of more than one objective.

Neocortical column: A basic functional unit of the neocortex, organized as a densely interconnected column of neurons traversing all 6 layers.

NEURON: A well-known environment for the empirically-based simulation of neurons and networks of neurons. Developed by Michael Hines, Yale University, USA.

Out of core: Computer memory outside the core of the machine (e.g. the memory used to store the content of a database). The use of out-of-core memory makes it possible to perform computations on data sets too large to fit into core memory.

Patch clamp: A widely used technique for simultaneously stimulating and recording from neurons. The Blue Brain Project has pioneered the use of patch clamp techniques with as many as 12 neurons.

Phenome: The complete set of phenotypic entities (morphology, behavior, etc.) expressed by a cell, a tissue, an organ, an organism or a species.

Phenomenological model: A model that reproduces an observed behavior without faithfully accounting for the underlying biophysics.

Predictive disease diagnostics: Techniques making it possible to predict the risk that a human subject will develop a disease.

Predictive reverse engineering: Techniques making it possible to predict unknown data from a small subset of the data, from data at other levels of biological organization, or from other species.

Probabilistic Synapse Model: Extension of the Tsodyks-Markram Synapse Model for probabilistic release.

Proteome: The set of information required to fully represent the proteins expressed by a cell.

Receptome: The set of information required to fully represent the receptors expressed by a cell.

Reconstruction: Technique used to trace and digitize the 3D morphology of a nerve cell from stained tissue through 2D microscopy.

Registration: The process whereby a concept (e.g. the name of a brain region) or a set of experimental data (e.g. data describing a brain region in an individual) is mapped to coordinates in an atlas.

Repair (of morphologies): Correction of 3D neuron morphologies to remove artifacts (typically slicing artifacts and artifacts due to tissue shrinkage).

Representation: A structured set of data representing an underlying physical reality (e.g. the morphology of a neuron).

SBML: See Systems Biology Markup Language.

STEPS: A simulator for stochastic reaction-diffusion systems in realistic morphologies, from the Theoretical Neurobiology group, University of Antwerp, Belgium.

Systems Biology Markup Language (SBML): A computer-readable format for representing models of biological processes.

Template model: A generic model representing the structure and/or the function of the brain or a part of the brain at a specific level of detail. In the Blue Brain strategy, template models can be morphed to match different parameter sets coming from experiments or defined by a researcher, e.g. morphing of data from one region of the brain to match the features of another region, or morphing of data from one species to the characteristics of a different species.

Touch: A structural contact. The point where the axon of one neuron comes within a threshold distance of part of another neuron (usually a dendrite).

Touch region: The region surrounding a touch.

Tract tracing: The use of a labeling agent to detect long-range pathways across the brain.

Transcriptome: The set of information required to fully represent all mRNA expressed by a cell during transcription of the genome.

Tsodyks-Markram Synapse Model: Phenomenological description of major synapse classes and their short-term adaptation.

Ultrastructure: The fine (microscopic) structure of a cell.

Volume Representation: Voxelized representation of a 3D space, typically a regular three-dimensional matrix.

Voxel: Term from computer graphics. A volumetric pixel or, more correctly, a Volumetric Picture Element, representing a value on a grid in 3D space. Analogous to a pixel in 2D images.

Workflow: Term used both in management engineering and in computer science. A sequence of steps leading to a well-defined outcome.
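As a concrete illustration of a phenomenological synapse model, the sketch below implements the depression-only variant of the Tsodyks-Markram formulation as commonly published: each spike consumes a fraction U of the available synaptic resources R, which then recover toward 1 between spikes. The parameter values are illustrative assumptions; the project's actual synapse models also cover facilitating classes and probabilistic release.

```python
import math

def tm_depressing_train(n_spikes, isi_ms, U=0.5, tau_rec_ms=800.0, A=1.0):
    """Return response amplitudes for a regular presynaptic spike train.

    R is the fraction of synaptic resources available. Each spike uses a
    fraction U of them; they recover toward 1 with time constant
    tau_rec_ms between spikes. A sets the absolute amplitude scale.
    """
    R = 1.0
    amps = []
    decay = math.exp(-isi_ms / tau_rec_ms)
    for _ in range(n_spikes):
        amps.append(A * U * R)                    # response to this spike
        R = 1.0 + (R * (1.0 - U) - 1.0) * decay   # recovery until next spike
    return amps

amps = tm_depressing_train(5, isi_ms=50.0)
print([round(a, 3) for a in amps])  # amplitudes depress toward a steady state
```

Run with a 50 ms inter-spike interval, the successive amplitudes shrink spike by spike, reproducing the short-term depression seen at many neocortical synapses.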


In brief
Reconstructing the brain piece by piece and building a virtual brain in a supercomputer: these are some of the goals of the Blue Brain Project. The virtual brain will be an exceptional tool, giving neuroscientists a new understanding of the brain and a better understanding of neurological diseases.

The Blue Brain project began in 2005 with an agreement between the EPFL and IBM, which supplied the BlueGene/L supercomputer acquired by EPFL to build the virtual brain. The computing power needed is considerable. Each simulated neuron requires the equivalent of a laptop computer. A model of the whole brain would have billions. Supercomputing technology is rapidly approaching a level where simulating the whole brain becomes a concrete possibility.

As a first step, the project succeeded in simulating a rat cortical column. This neuronal network, the size of a pinhead, recurs repeatedly in the cortex. A rat's brain has about 100,000 columns with on the order of 10,000 neurons each. In humans, the numbers are dizzying: a human cortex may have as many as two million columns, each with on the order of 100,000 neurons.

Blue Brain is a resounding success. In five years of work, Henry Markram's team has perfected a facility that can create realistic models of one of the brain's essential building blocks. This process is entirely data-driven and essentially executed automatically on the supercomputer. Meanwhile, the generated models show behavior already observed in years of neuroscientific experiments. These models will be basic building blocks for larger-scale models, leading toward a complete virtual brain.


A tool for researchers


The brain's extreme complexity makes it one of the most difficult subjects to study. For example, it is impossible to observe what is happening within a small group of neurons while at the same time imaging the activity of the whole brain. A virtual model would make such observations possible. In terms of public health, the stakes are high. A realistic simulation could provide a better understanding of the way drugs act on the brain, and of their possible side effects. It could even help to develop completely new treatments. Today, every new drug put on the market costs an average of 1.3 billion francs to develop. Neurological diseases are destined to represent an ever-increasing share of health-care budgets, and are a source of considerable suffering for those afflicted and their family and friends. The Blue Brain project sets out to make neuroscientific research more efficient and, in the long run, will help to limit the need to use laboratory animals.


What do we simulate?
The Blue Brain project represents an essential first step toward achieving a complete virtual human brain. The researchers have demonstrated the validity of their method by developing a realistic model of a rat cortical column, consisting of about 10,000 neurons. Eventually, of course, the goal is to simulate systems of millions and hundreds of millions of neurons.

Understand the Cortical Column

The cortical column can be considered the basic unit of the cortex. Notably, it is by accumulating an ever-increasing number of columns that the brain has evolved over millions of years. Each column seems to be allotted a simple yet essential function. For example, it has been possible to show that in the rat, one specific column is devoted to each whisker. The cortical column is no larger than the head of a pin. In the rat, it contains only about 10,000 neurons. But as a basic unit, it represents an essential component of cerebral mechanics. That is why, initially, the researchers are working to simulate its functioning. The Blue Brain project team has succeeded in isolating about fifty different types of neuron within the cortical column. As in an ecosystem, each species differs from the others in essential characteristics such as morphology, behavior, population density, etc.

Move From the Real to the Virtual

The researchers have been working to explain the behavior of neurons and the way they connect to form circuits. This kind of knowledge makes it possible to isolate basic principles they can incorporate in their simulations. The scientists have translated their observations into mathematics, developing powerful algorithms to represent neuronal behavior in a realistic way, and to make the best possible use of supercomputing power.

Test the Model

Every two weeks, on average, the BlueGene/P supercomputer generates and runs a new model of the cortical column, incorporating the latest data from experiments.
Often these simulations reproduce experiments that have already been performed with living neurons. For example, the researchers may stimulate a virtual neuron and observe the way it reacts. The scientists do not intervene in the model to affect the results. In this way, they can truly test the general principles behind the model. The results so far show that the models are already achieving a high level of realism.
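The shape of such a virtual experiment can be sketched with a deliberately simple stand-in: a leaky integrate-and-fire cell rather than the project's detailed compartmental models. Inject a current, record the spikes, and compare with the wet-lab result. All parameter values below are illustrative assumptions.

```python
def simulate_lif(i_inj_nA, t_stop_ms=200.0, dt=0.1,
                 tau_m=20.0, r_m=10.0, v_rest=-70.0, v_thresh=-50.0):
    """Return spike times (ms) of a leaky integrate-and-fire neuron
    driven by a constant injected current. Parameters are illustrative:
    membrane time constant tau_m (ms), input resistance r_m (mV/nA),
    resting and threshold potentials (mV)."""
    v = v_rest
    spikes = []
    t = 0.0
    while t < t_stop_ms:
        # Forward-Euler step of dV/dt = (-(V - V_rest) + R_m * I) / tau_m
        v += dt * (-(v - v_rest) + r_m * i_inj_nA) / tau_m
        if v >= v_thresh:
            spikes.append(round(t, 1))
            v = v_rest  # reset after a spike
        t += dt
    return spikes

# "Stimulate the virtual neuron and observe the way it reacts":
weak = simulate_lif(1.0)    # subthreshold: R*I = 10 mV never reaches threshold
strong = simulate_lif(3.0)  # suprathreshold: the cell fires regularly
print(len(weak), len(strong))
```

The weak stimulus produces no spikes while the strong one produces a regular train, the same qualitative comparison an experimenter would make between the virtual cell and its biological counterpart.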


What's next?
The ultimate goals of brain simulation are to answer age-old questions about how we think, remember, learn and feel, to discover new treatments for the scourge of brain disease, and to build new computer technologies that exploit what we have learned about the brain. Blue Brain is a first step in this direction, but we need to go further. This is why Blue Brain recently joined with 12 other partners to propose the Human Brain Project, a very large ten-year project that will pursue precisely these aims. The new grouping has just been awarded a EUR 1.4 million European grant to formulate a detailed proposal. The EU decision on launching the project is expected in 2012.


Timeline
2002 Henry Markram founds the Brain Mind Institute (BMI) at the EPFL.

2005 On June 6, the EPFL and IBM sign an agreement to launch the Blue Brain project. The agreement provides for the installation of a BlueGene supercomputer on campus.

2006 In February, the project takes shape and Henry Markram publishes an article about the Blue Brain project in Nature Reviews Neuroscience. During the summer, the first cortical column model of 10,000 neurons is created using a simplified neuronal model. In December, an auto-generated cortical column is completed. It is a biologically valid model, from a neuronal standpoint.

2007 In January, the project is presented at the Davos forum. On November 26, the end of the first phase, the modeling and simulation of the first rat cortical column, is announced.

2008 Electrophysiological, anatomical and genetic laboratory experiments are used to test the data-driven auto-generation process for neurons. In June, an article on determining the position and size of functional cortical columns is published in the HFSP Journal.

2009 In June, the BlueGene/L supercomputer is replaced by BlueGene/P, increasing computing power by doubling the number of processors. "In silico" experimentation is in full swing, testing the protocols published by other research groups and adding to knowledge of the principles governing cortical column construction.

2010 An article entitled An Approach to Capturing Neuron Morphological Diversity is published by MIT Press in Computational Modeling Methods for Neuroscientists. In December, with a large group of partners, the Blue Brain Project applies to the European Commission (Seventh Framework Programme, FP7) for financing for a larger project aiming to continue the work and simulate an entire brain, specifically a human brain.

Glossary
Axon: A fine extension of the neuron, which makes neurons the longest cells in the human body; it conducts electrical signals very efficiently thanks to a special sheath surrounding it.

Ion channel: A group of specific molecules located in cell membranes. The molecules' special properties allow or prevent the passage of certain substances between the inside and outside of the cell. This process works in many different ways depending on the molecules' electrochemical properties.

Cortical column: A group of neurons (about 10,000 in rats and 70,000 in humans) that is vertically structured in relation to the various levels of the cerebral cortex and constitutes a unit associated with a specific function.

Cortex: A region of the brain consisting of several areas associated with specific complex cognitive processes. It is structured as different layers of neurons and other brain cells.

Dendrite: An extension of the neuron's cell body, ending in synapses; dendrites conduct electrical currents from the synapse toward the center of the cell.

Neuroinformatics: A branch of science covering the organization of neuroscience data and the development of analytical models and tools. Not to be confused with neuromorphic computing, which seeks to reproduce cerebral mechanisms for useful purposes.

Neuron: A cell of the nervous system that can transmit and receive bioelectrical signals (the nerve impulse). Its star-like shape allows it to interconnect with other neurons.

Neurotransmitters: Chemical compounds released into the synapses by neurons. They make communication between neurons possible. They can also facilitate or inhibit signal transmission.

Synapse: The area of contact between nerve cells that enables them to exchange electrical or chemical signals and permits communication between them.
