Why did I do that?
Indeed, no significant contribution to neuroscience could be made by
simulating one second of a model, even one the size of the human
brain. However, I learned what it takes to simulate such a large-scale
system.
Implementation challenges:
Since 2^32 < 10^11, a standard 32-bit integer cannot even encode the
indices of all the neurons.
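A minimal illustration of the indexing problem (the source does not show the
actual code; the names below are illustrative): 10^11 exceeds the range of a
32-bit integer, so neuron indices must be held in 64-bit types.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    const uint64_t num_neurons = 100000000000ULL; /* 10^11 neurons */
    const uint64_t max_u32     = 4294967295ULL;   /* 2^32 - 1 */

    /* A 32-bit index covers only ~4.3e9 values -- far short of 10^11,
       so 64-bit integers are needed just to number the neurons. */
    printf("neurons in the model: %llu\n", (unsigned long long)num_neurons);
    printf("32-bit maximum:       %llu\n", (unsigned long long)max_u32);
    printf("fits in 32 bits? %s\n", num_neurons <= max_u32 ? "yes" : "no");
    return 0;
}
```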
To store all synaptic weights (with ~10^4 synapses per neuron, the model
has roughly 10^15 of them), one needs 10,000 terabytes. Not even
Google has that much free space.
How was the simulation done? Instead of storing the synaptic connections, I
regenerated the anatomy at every time step (1 ms).
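The source does not include the code, but the trick rests on a standard
property of pseudo-random generators: seeded with the same value, they produce
the same sequence every time. So if each neuron's outgoing connections are
drawn from a generator seeded with that neuron's index, the wiring can be
recomputed on demand instead of stored. A minimal sketch, assuming a
splitmix64-style mixer as the generator (all names here are hypothetical,
not from the original simulation):

```c
#include <stdint.h>
#include <stdio.h>

#define NUM_NEURONS     100000000000ULL /* 10^11, as in the model */
#define SYNS_PER_NEURON 16              /* tiny for demonstration; ~10^4 in the model */

/* splitmix64: a small deterministic mixer. Seeded with the same value,
   it always yields the same pseudo-random sequence. */
static uint64_t next_rand(uint64_t *state) {
    uint64_t z = (*state += 0x9e3779b97f4a7c15ULL);
    z = (z ^ (z >> 30)) * 0xbf58476d1ce4e5b9ULL;
    z = (z ^ (z >> 27)) * 0x94d049bb133111ebULL;
    return z ^ (z >> 31);
}

/* Regenerate the postsynaptic targets of one neuron. Because the seed
   depends only on the presynaptic index, the "anatomy" comes out
   identical at every time step without ever being stored. */
static void regenerate_targets(uint64_t pre, uint64_t *targets, int n) {
    uint64_t state = pre; /* seed = presynaptic neuron index */
    for (int i = 0; i < n; i++)
        targets[i] = next_rand(&state) % NUM_NEURONS;
}

int main(void) {
    uint64_t targets[SYNS_PER_NEURON];
    regenerate_targets(42ULL, targets, SYNS_PER_NEURON);
    for (int i = 0; i < SYNS_PER_NEURON; i++)
        printf("neuron 42 -> neuron %llu\n", (unsigned long long)targets[i]);
    return 0;
}
```

This trades memory for computation: the connectivity costs nothing to store,
but must be recomputed at every 1-ms step of the simulation.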
Question: When can we simulate the human brain in real time?
Answer: The computational power to handle such a simulation will be
available sooner than you think.
Extrapolating the benchmark "1 sec = 50 days on 27 3-GHz processors"
according to Moore's law yields the following table:
| Time | Number of processors | Processor speed |
|------|----------------------|-----------------|
| 2006, January 1 (now) | 116,640,000 | 3 GHz |
| 2007, July 1 | 58,320,000 | 6 GHz |
| 2009, January 1 | 29,160,000 | 12 GHz |
| 2010, July 1 | 14,580,000 | 24 GHz |
| 2012, January 1 | 7,290,000 | 48 GHz |
| 2013, July 1 | 3,645,000 | 96 GHz |
| 2015, January 1 | 1,822,500 | 192 GHz |
| 2016, July 1 | 911,250 (possibility*) | 384 GHz |
| 2018, January 1 | 455,625 | 768 GHz |
| 2019, July 1 | 227,813 | 1,536 GHz |
| 2021, January 1 | 113,907 | 3,072 GHz |
| 2022, July 1 | 56,954 | 6,144 GHz |
| 2024, January 1 | 28,477 | 12,288 GHz |
| 2025, July 1 | 14,239 | 24,576 GHz |
| 2027, January 1 | 7,120 | 49,152 GHz |
| 2028, July 1 | 3,560 | 98,304 GHz |
| 2030, January 1 | 1,780 | 196,608 GHz |
| 2031, July 1 | 890 | 393,216 GHz |
| ... | ... | ... |
| 2046, July 1 | 1 | 402,653,184 GHz |
*A cluster of a million processors would not be prohibitively
expensive.
However, many essential details of the anatomy and dynamics of the
mammalian nervous system would probably still be unknown.
Take-home message: size doesn't matter; what matters is what you put into
your model and how you embed it into the environment (to close the loop).