from a 1,576-neuron dataset that was used to assess 42 human patients. The dataset with the patient responses helped map and characterize neuron behavior during behavioral and memory activities. It gave new insights into memory tasks, including forming, retrieving, and describing memories (Faraut 2018).

C. Spiking neural networks

1. Visual system neuron spiking model
Masquelier published the results of his phenomenological spiking model of a cat's early visual system, composed of the retina, the lateral geniculate nucleus, and the primary visual cortex (V1), evaluating relative spike-time coding and spike-timing-dependent plasticity (STDP) orientation factors. From experimental observations of the cat's response to visual stimuli, he created a computational model, using a virtual retina simulator and developing the lateral geniculate nucleus and V1 models in MATLAB and C code (Masquelier 2011).

2. Neuron sensor firing to the brain
Aljadeff et al. published their results on neuronal firing from sensor to brain, seeking to better understand the neural activity. Using experimental data from rat experiments, the authors interpreted spiking information with four different models: the spike-triggered average (STA), spike-triggered covariance combined with STA (STC+STA), maximum noise entropy (MNE), and the generalized linear model (GLM) (Aljadeff 2016).

3. Neuron algorithms and trades
Bouvier et al. published a survey of the strategies used by spiking neural network algorithms in hardware, along with their advantages and challenges (Bouvier 2019).

4. Spiking neuron algorithms
Doborjeh et al. published their findings on an algorithmic method that uses spiking neural networks for learning, classification, and comparative brain data analysis (Doborjeh 2016).

5.
Brain neural networks
Wang and Sun published their results on an example of a brain recurrent neural network (RNN) that connects the neocortex and the somatic motor cortex; work done on artificial recurrent neural networks helps in interpreting their findings. "Here, we show a long-range neuronal network, which can be described as an innate RNN. It is formed with a self-feedback connectivity in the medial prefrontal cortex (mPFC; the hidden unit), which integrates inputs from basal lateral amygdala (BLA) and insular cortex (IC) neurons (the input units) and further innervates the somatic motor cortex (sMO) infragranular-layer-projecting neurons (the output units)" (Wang 2021).

6. Phosphorylation signaling in proteins
Marks, in his textbook Protein Phosphorylation, explores in detail how protein phosphorylation works. He also draws out the many similarities between phosphorylation and neural networks (Marks 1996).

7. Human learning
Benjamin et al., in their book Human Learning: Biology, Brain, and Neuroscience, explore a variety of topics in human learning and cognition, discussing advances in cognitive neuroscience, brain chemistry, and brain imaging. The book contains four sections: (1) human learning and cognition, (2) cognitive neuroscience, (3) human motor learning, and (4) animal model systems. First, the human learning section explores the varied approaches to human learning and memory. Second, the cognitive neuroscience section discusses how thought is implemented in the brain. Third, the human motor learning section explores learning skills and the identification of neural mechanisms for motor learning and control. Fourth, the animal model systems section discusses the animal model systems that have enabled significant progress in understanding the neural mechanisms of learning and memory (Benjamin 2008).

D. Neuromorphic computing

1.
Energy-efficient neuromorphic computing
Zheng, in his textbook Learning in Energy-Efficient Neuromorphic Computing, explores approaches to energy-efficient neuromorphic computing. Starting with a history of neural networks, the book discusses the similarities and differences between the spiking neural networks of the brain and the artificial neural networks that mimic them to a certain level of accuracy, aiming at approaches implementable with current microelectronics. It then explores artificial neural network approaches that have been used in machine learning for decades, how artificial neural networks have been implemented in hardware, and efforts toward creating more realistic spiking neural networks (Zheng 2019).

2. Neuromorphic computing chip
Davies provides an in-depth explanation of Intel's Loihi neuromorphic processor, a microelectronics package that realizes biomimetic neurons in an artificial neural network and implements a spiking neural network with a leaky-integrate-and-fire variant model (Davies 2018). Intel published a technology brief describing its Loihi 2 neuromorphic computing chip, which continues to mature the capabilities demonstrated by the earlier Loihi chip. Using the same architectural model, Intel has created a network on a chip that is, in some regards, closer to a biological neural network (Intel 2021).

3. Neuromorphic computing roadmap
Christensen led the team that published an in-depth article assessing the current capability of neuromorphic computing, along with projections for future capability. Subject matter experts from academia and research laboratories discuss their research in subarticles within the paper, which highlights the types of research required to attain the desired future performance.
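Loihi's neuron model is described above only as a leaky-integrate-and-fire (LIF) variant. As a minimal sketch of plain LIF dynamics, not Intel's actual variant, the following simulation uses illustrative parameter values and a simple forward-Euler discretization, both of which are assumptions for exposition:

```python
def lif_spikes(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
               v_reset=0.0, v_thresh=1.0, r_m=1.0):
    """Simulate a leaky integrate-and-fire (LIF) neuron.

    The membrane potential follows tau * dV/dt = -(V - v_rest) + r_m * I(t);
    when V reaches v_thresh the neuron fires and V resets to v_reset.
    Returns the time-step indices at which spikes occur.
    Parameter values here are illustrative, not taken from any hardware.
    """
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Forward-Euler update of the leaky membrane equation:
        # the potential decays toward v_rest and is driven by the input.
        v += (dt / tau) * (-(v - v_rest) + r_m * i_in)
        if v >= v_thresh:  # threshold crossing: emit a spike, then reset
            spikes.append(step)
            v = v_reset
    return spikes

# A constant suprathreshold input yields regular, periodic firing.
spike_times = lif_spikes([1.5] * 200)
```

Because the leak pulls the potential back toward rest between spikes, a constant input produces evenly spaced spikes, and a subthreshold input produces none; this integrate-leak-fire cycle is the behavior that neuromorphic chips implement directly in silicon.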
It discusses (1) materials and devices, (2) neuromorphic circuits, (3) neuromorphic algorithms, (4) applications, and (5) ethics (Christensen 2022).

4. Neuromorphic computing algorithms and applications
Schuman et al. published a survey article summarizing their assessment of the research and accomplishments in neuromorphic computing algorithms and applications. It compares the von Neumann architecture to the neuromorphic architecture at the operation, or

JOHANSEN Human brain function and the creation model 2023 ICC 314
RkJQdWJsaXNoZXIy MTM4ODY=