Neural Information & Complexity
Coordinator: Nithin Nagaraj
The flow of information in living organisms is as essential to their efficient functioning as the flow of energy. Information is carried by neural signals: stochastic sequences of action potentials produced by interconnected networks of neurons in response to external stimuli. These signals are the language of the brain and nervous system, both for internal communication and computation and for interaction with the outside world. Investigating information flow between individual neurons, as well as across networks of neurons, is necessary for understanding how the brain learns and responds. Shannon's Information Theory (Entropy and Mutual Information) has been the mainstay for analyzing information flow in neurons. However, owing to the limitations of estimating Shannon Entropy from finite, noisy recordings, complexity measures (such as Lempel-Ziv complexity, Approximate Entropy, and Effort-to-Compress) have recently been employed instead. Beyond these measures, which are our focus, a larger question interests us: is it time to revisit the notions of Information and Complexity in brain activity leading to Conscious Experience?
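As a concrete illustration of the kind of estimators mentioned above, the following minimal Python sketch computes the Shannon entropy of a binarized spike train and a simplified Lempel-Ziv phrase count. It is not from the source: the synthetic spike train, its firing rate, and the particular LZ variant are illustrative assumptions.

import numpy as np

def shannon_entropy(spikes):
    # Shannon entropy (bits per time bin) of a binary spike train,
    # estimated from the empirical firing probability.
    p1 = float(np.mean(spikes))
    if p1 == 0.0 or p1 == 1.0:
        return 0.0  # a constant train carries no information
    p0 = 1.0 - p1
    return -(p0 * np.log2(p0) + p1 * np.log2(p1))

def lz_complexity(spikes):
    # Simplified Lempel-Ziv complexity: count the distinct phrases
    # found while scanning the binary sequence left to right.
    s = ''.join(str(int(b)) for b in spikes)
    phrases, i = 0, 0
    while i < len(s):
        j = i + 1
        # extend the current phrase until it has not appeared in the prefix
        while j <= len(s) and s[i:j] in s[:i]:
            j += 1
        phrases += 1
        i = j
    return phrases

rng = np.random.default_rng(0)
train = (rng.random(1000) < 0.1).astype(int)  # sparse, Poisson-like firing
print(shannon_entropy(train))  # about 0.47 bits per time bin
print(lz_complexity(train))    # grows with the train's irregularity

Effort-to-Compress, by contrast, counts the iterations of a pair-substitution scheme (NSRPS) needed to reduce a sequence to a constant string; all three measures sidestep the explicit probability estimation that Shannon Entropy requires.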