An open access book by Giorgio Griziotti is here: a technical book for you techies. The blurb:
“Technological change is ridden with conflicts, bifurcations and unexpected developments. Neurocapitalism takes us on an extraordinarily original journey through the effects that cutting-edge technology has on cultural, anthropological, socio-economic and political dynamics. Today, neurocapitalism shapes the technological production of the commons, transforming them into tools for commercialization, automatic control, and crisis management. But all is not lost: in highlighting the growing role of General Intellect’s autonomous and cooperative production through the development of the commons and alternative and antagonistic uses of new technologies, Giorgio Griziotti proposes new ideas for the organization of the multitudes of the new millennium.”
Kurzweil builds and supports a persuasive vision of the emergence of a human-level engineered intelligence in the early-to-mid twenty-first century. In his own words,
With the reverse engineering of the human brain we will be able to apply the parallel, self-organizing, chaotic algorithms of human intelligence to enormously powerful computational substrates. This intelligence will then be in a position to improve its own design, both hardware and software, in a rapidly accelerating iterative process.
In Kurzweil's view, we must and will evade obsolescence by integrating emerging metabolic and cognitive technologies into our bodies and brains. Through self-augmentation with neurotechnological prostheses, the locus of human cognition and identity will gradually (but faster than we expect, due to exponential technological advancement) shift from the evolved substrate (the organic body) to the engineered substrate, ultimately freeing the human mind to develop along technology's exponential curve rather than evolution's much flatter trajectory.
The book is extensively annotated and indexed, making the deep-diving reader's work a bit easier.
If you have read it, feel free to post your observations in the comments below. (We've had a problem with the comments section not appearing. It may require more troubleshooting.)
“This is the first time scientists have been able to identify a patient’s own brain cell code or pattern for memory and, in essence, ‘write in’ that code to make existing memory work better, an important first step in potentially restoring memory loss.”
“We showed that we could tap into a patient’s own memory content, reinforce it and feed it back to the patient,” Hampson said. “Even when a person’s memory is impaired, it is possible to identify the neural firing patterns that indicate correct memory formation and separate them from the patterns that are incorrect. We can then feed in the correct patterns to assist the patient’s brain in accurately forming new memories, not as a replacement for innate memory function, but as a boost to it.”
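For the technically curious, the decode-then-write-back loop Hampson describes can be pictured with a toy sketch. The published work reportedly used a multi-input, multi-output (MIMO) nonlinear model of hippocampal activity; everything below (the synthetic spike counts, the logistic-regression classifier, the template selection) is our simplified stand-in, not the authors' method:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data: spike-count feature vectors recorded while a
# memory is being formed, labeled by whether recall later succeeded.
rng = np.random.default_rng(42)
n_trials, n_channels = 200, 32
X = rng.poisson(lam=4.0, size=(n_trials, n_channels)).astype(float)
# Fake ground truth: recall succeeds when a few "key" channels fire a lot.
y = (X[:, :4].sum(axis=1) > 16).astype(int)

# Learn to separate "correct-formation" firing patterns from the rest.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# A naive "write-in" template: the mean pattern of the trials the model
# is most confident about. A real prosthesis would translate something
# like this into spatiotemporal stimulation parameters.
p_correct = clf.predict_proba(X)[:, 1]
top_trials = np.argsort(p_correct)[-20:]
template = X[top_trials].mean(axis=0)
print("stimulation template (per-channel counts):", np.round(template, 1))
```

The point of the sketch is the separation Hampson describes: patterns that precede correct recall are distinguishable from those that don't, and the "correct" pattern becomes the thing you feed back.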
An improvement to the Neural Simulation Tool (NEST) algorithm, the primary simulation tool of the Human Brain Project, expanded the scale of neuron-level brain simulations from the current 1% of the brain's neurons to 10%. The improved algorithm can scale to 100% of neural data, simulated or BCI-derived, a scale within near-term reach as supercomputing capacity increases. It achieves its efficiency boost by eliminating the need for each compute node to explicitly store connectivity data for every neuron in the network.
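NEST itself is scriptable from Python via PyNEST, so the sparse connectivity being exploited is easy to see in an ordinary script. A minimal sketch (the model name, sizes, weights, and rates are illustrative choices of ours, not anything from the paper):

```python
import nest  # PyNEST, the Python interface to the NEST simulator

nest.ResetKernel()

# A toy network: 1,000 leaky integrate-and-fire neurons.
neurons = nest.Create("iaf_psc_alpha", 1000)

# Sparse recurrent connectivity: every neuron receives exactly 100
# incoming synapses, the kind of fixed-indegree sparsity that the
# brain-scale algorithm exploits.
nest.Connect(neurons, neurons,
             conn_spec={"rule": "fixed_indegree", "indegree": 100},
             syn_spec={"weight": 20.0, "delay": 1.5})

# Poisson background drive so the network actually fires.
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
nest.Connect(noise, neurons, syn_spec={"weight": 5.0})

# Spike recording ("spike_recorder" in NEST 3.x; 2.x calls it "spike_detector").
recorder = nest.Create("spike_recorder")
nest.Connect(neurons, recorder)

nest.Simulate(1000.0)  # one second of biological time

events = nest.GetStatus(recorder, "events")[0]
print(len(events["times"]), "spikes recorded")
```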
Abstract of Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers
State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
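To see why exploiting sparsity pays off, consider a toy version of the bookkeeping the abstract alludes to. In a naive scheme, every compute node keeps a marker for every neuron in the network, so per-node memory grows with total network size even when almost none of those neurons connect locally. A two-tier scheme of the kind described, sketched loosely below (our illustration of the general idea, not the actual NEST data structures), stores connection data only where connections exist and sends each spike only to the nodes that subscribe to it:

```python
from collections import defaultdict

class ComputeNode:
    """Toy compute node hosting a slice of the network's neurons."""
    def __init__(self, node_id):
        self.node_id = node_id
        # Second tier: per-source target lists, stored ONLY for sources
        # that actually reach neurons hosted on this node.
        self.local_targets = defaultdict(list)

    def add_connection(self, source_gid, target_gid):
        self.local_targets[source_gid].append(target_gid)

    def deliver(self, source_gid):
        """Local neurons that receive a spike from source_gid."""
        return self.local_targets.get(source_gid, [])

class Network:
    """Toy router modeling directed spike communication between nodes."""
    def __init__(self, n_nodes):
        self.nodes = [ComputeNode(i) for i in range(n_nodes)]
        # First tier (sender side): which nodes need each source's spikes.
        self.subscribers = defaultdict(set)

    def connect(self, source_gid, target_gid, target_node_id):
        self.nodes[target_node_id].add_connection(source_gid, target_gid)
        self.subscribers[source_gid].add(target_node_id)

    def route_spike(self, source_gid):
        # Directed communication: the spike goes only to subscribed
        # nodes rather than being broadcast to all of them.
        for node_id in sorted(self.subscribers[source_gid]):
            yield node_id, self.nodes[node_id].deliver(source_gid)

# Usage: neuron 7 connects to targets on nodes 0 and 3 only, so its
# spike touches 2 of the 1,000 nodes instead of all of them.
net = Network(1000)
net.connect(source_gid=7, target_gid=42, target_node_id=0)
net.connect(source_gid=7, target_gid=99, target_node_id=3)
print(list(net.route_spike(7)))  # [(0, [42]), (3, [99])]
```

Per-node memory here is proportional to the synapses that actually land on the node, not to the total neuron count, which is what lets the kernel grow toward brain scale without exhausting memory.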
The system would simultaneously acquire data from more than 1 million neurons in real time. It would convert the spike data (using bit encoding) and send it in a compact communication format for processing and storage on conventional computer systems. It would also provide feedback to a subject in under 25 milliseconds, stimulating up to 100,000 neurons.
Given the human brain's approximately 80 billion neurons, it would take tens of thousands of these devices (80 billion ÷ 1 million ≈ 80,000) to record a substantial share of neuron-level activity. Still, this is a remarkable achievement.
Applications of this new design include basic research, clinical diagnosis, and treatment. It would be especially useful for future implantable, bidirectional BMIs and BCIs, which communicate complex data between neurons and computers: monitoring large areas of the brain in real time in paralyzed patients, revealing an imminent epileptic seizure, and providing real-time feedback control of robotic arms used by quadriplegics and others.
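The "bit encoding" step is easy to make concrete: within a short time frame, each channel either spiked or it didn't, so a frame compresses to one bit per channel. A rough sketch of the packing and the resulting data rate (the 1 ms frame length is our illustrative assumption, not the published spec):

```python
import numpy as np

CHANNELS = 1_000_000  # neurons monitored per device (from the article)
FRAME_MS = 1          # assumed binning window; illustrative only

# One frame: a boolean per channel saying whether it spiked in this window.
rng = np.random.default_rng(0)
spiked = np.zeros(CHANNELS, dtype=bool)
spiked[rng.integers(0, CHANNELS, size=5000)] = True  # ~5,000 spikes

# Pack 8 channels per byte: 1M channels -> 125 kB per 1 ms frame,
# roughly 1 Gbit/s of raw payload before any further compression.
packed = np.packbits(spiked)
print(packed.nbytes, "bytes per frame")                           # 125000
print(packed.nbytes * 8 * (1000 / FRAME_MS) / 1e9, "Gbit/s raw")  # 1.0

# Receiver side: unpack back to per-channel spike flags.
unpacked = np.unpackbits(packed).astype(bool)[:CHANNELS]
assert np.array_equal(unpacked, spiked)
```

Sparser encodings (sending only the indices of channels that fired) would cut that raw rate dramatically in typical low-firing-rate regimes, which is presumably why an efficient communication format matters at this scale.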