
Publications

by Keyword: Information theory

Bos, J. J., Vinck, M., Marchesi, P., Keestra, A., van Mourik-Donga, L. A., Jackson, J. C., Verschure, P., Pennartz, C. M. A. (2019). Multiplexing of self and other information in hippocampal ensembles. Cell Reports 29(12), 3859-3871.e6

In addition to coding a subject’s location in space, the hippocampus has been suggested to code social information, including the spatial position of conspecifics. “Social place cells” have been reported for tasks in which an observer mimics the behavior of a demonstrator. We examine whether rat hippocampal neurons may encode the behavior of a minirobot, but without requiring the animal to mimic it. Rather than finding social place cells, we observe that robot behavioral patterns modulate place fields coding animal position. This modulation may be confounded by correlations between robot movement and changes in the animal’s position. Although rat position indeed significantly predicts robot behavior, we find that hippocampal ensembles code additional information about robot movement patterns. Fast-spiking interneurons are particularly informative about robot position and global behavior. In conclusion, when the animal’s own behavior is conditional on external agents, the hippocampus multiplexes information about self and others.
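
The information-theoretic quantities behind these results (see the Mutual information and Decoding keywords below) can be illustrated with a minimal sketch: a plug-in estimate of the mutual information between a neuron's binned spike counts and a discretized robot-behavior label. This is an illustrative Python toy on synthetic data, not the authors' analysis pipeline, which additionally controls for the rat's own position.

import numpy as np

def mutual_information(x, y):
    """Plug-in MI (in bits) between two discrete 1-D integer arrays of equal length."""
    joint = np.histogram2d(x, y, bins=[np.unique(x).size, np.unique(y).size])[0]
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
behaviour = rng.integers(0, 3, size=5000)   # toy robot-behaviour label with 3 states
spikes = rng.poisson(lam=1.0 + behaviour)   # spike counts whose rate depends on behaviour

print(f"MI(spike count; robot behaviour) = {mutual_information(spikes, behaviour):.3f} bits")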

JTD Keywords: CA1, Decoding, Information theory, Interneuron, Mutual information, Place cells, Place field, Robot, Social behavior, Tetrode


Arsiwalla, X. D., Signorelli, C. M., Puigbo, J. Y., Freire, I. T., Verschure, P. (2018). What is the physics of intelligence? Frontiers in Artificial Intelligence and Applications (ed. Falomir, Z., Gibert, K., Plaza, E.), IOS Press (Amsterdam, The Netherlands), Volume 308: Artificial Intelligence Research and Development, 283-286

In the absence of a first-principles definition, the concept of intelligence is often specified in terms of its phenomenological functions as a capacity or ability to solve problems autonomously. Whenever an agent, biological or artificial, possesses this ability, it is considered intelligent, otherwise not. While this description serves as a useful correlate of intelligence, it is far from a principled explanation that provides a general, yet precise definition along with predictions of mechanisms leading to intelligent behavior. We do not want an explanation to depend on any functionality that itself might be a consequence of intelligence. A possible conceptualization of a function-free approach might be to formulate the concept in terms of dynamical information complexity. This constitutes a first step towards a statistical mechanics theory of intelligence. In this paper, we outline the steps towards a physics-based definition of intelligence.

JTD Keywords: Complexity, Information Theory, Physics of Intelligence


Fonollosa, Jordi, Vergara, Alexander, Huerta, Ramón, Marco, Santiago (2014). Estimation of the limit of detection using information theory measures. Analytica Chimica Acta 810, 1-9

Definitions of the limit of detection (LOD) based on the probability of false positive and/or false negative errors have been proposed over the past years. Although such definitions are straightforward and valid for any kind of analytical system, proposed methodologies to estimate the LOD are usually simplified to signals with Gaussian noise. Additionally, there is a general misconception that two systems with the same LOD provide the same amount of information on the source regardless of the prior probability of presenting a blank/analyte sample. Based upon an analogy between an analytical system and a binary communication channel, in this paper we show that the amount of information that can be extracted from an analytical system depends on the probability of presenting the two different possible states. We propose a new definition of LOD utilizing information theory tools that deals with noise of any kind and allows the introduction of prior knowledge easily. Unlike most traditional LOD estimation approaches, the proposed definition is based on the amount of information that the chemical instrumentation system provides on the chemical information source. Our findings indicate that the benchmark of analytical systems based on the ability to provide information about the presence/absence of the analyte (our proposed approach) is a more general and proper framework, while converging to the usual values when dealing with Gaussian noise.
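
As a rough illustration of the channel analogy described above, the following Python sketch treats the analytical system as a binary asymmetric channel: a Gaussian-noise measurement is thresholded into a detect/no-detect decision, the resulting false-positive and false-negative rates define the channel, and the mutual information between the true state (blank vs. analyte) and the decision is computed for a given prior. All numerical values (noise level, threshold, signal levels, prior) are invented for illustration and are not taken from the paper.

import numpy as np
from scipy.stats import norm

def binary_channel_mi(p_analyte, p_fp, p_fn):
    """MI (bits) between true state and detector decision for a binary asymmetric channel."""
    p = np.array([[(1 - p_analyte) * (1 - p_fp), (1 - p_analyte) * p_fp],   # blank   -> (no detect, detect)
                  [p_analyte * p_fn,             p_analyte * (1 - p_fn)]])  # analyte -> (no detect, detect)
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

sigma, threshold, p_analyte = 1.0, 1.5, 0.5   # invented noise level, decision threshold, prior
for mu in (1.0, 2.0, 3.0, 5.0):               # invented analyte signal levels
    p_fp = 1.0 - norm.cdf(threshold, loc=0.0, scale=sigma)   # blank crosses the threshold
    p_fn = norm.cdf(threshold, loc=mu, scale=sigma)          # analyte stays below the threshold
    print(f"signal {mu}: MI = {binary_channel_mi(p_analyte, p_fp, p_fn):.3f} bits")

Higher signal levels drive both error rates down and push the mutual information toward the 1 bit that a noiseless presence/absence detector would convey at this prior.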

JTD Keywords: Limit of detection, Information theory, Mutual information, Heteroscedasticity, False positive/negative errors, Gas discrimination and quantification


Fonollosa, Jordi, Fernández, Luis, Huerta, Ramón, Gutiérrez-Gálvez, Agustín, Marco, Santiago (2013). Temperature optimization of metal oxide sensor arrays using Mutual Information. Sensors and Actuators B: Chemical 187, 331-339

The sensitivity and selectivity of metal oxide (MOX) gas sensors change significantly when the sensors operate at different temperatures. While previous investigations have presented systematic approaches to optimize the operating temperature of a single MOX sensor, in this paper we present a methodology to select the optimal operating temperature of all the MOX sensors constituent of a gas sensor array based on the multivariate response of all the sensing elements. Our approach estimates a widely used Information Theory measure, the so-called Mutual Information (MI), which quantifies the amount of information that the state of one random variable (response of the gas sensor array) can provide about the state of another random variable representing the gas quality. More specifically, our methodology builds sensor models from experimental data to solve the technical problem of populating the joint probability distribution for the MI estimation. We demonstrate the relevance of our approach by maximizing the MI and selecting the best operating temperatures of a four-sensor array sampled at 94 different temperatures to optimize the discrimination task of ethanol, acetic acid, 2-butanone, and acetone. In addition to being applicable in principle to sensor arrays of any size, our approach gives precise information on the ability of the system to discriminate odors according to the temperature of the MOX sensors, for either the optimal set of temperatures or the temperatures that may render inefficient operation of the system itself.
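
The selection criterion can be sketched in a few lines: for each candidate operating temperature, estimate the mutual information between the (discretized) sensor response and the gas class, and keep the temperature that maximizes it. The Python toy below does this for a single simulated sensor with an invented response model; the paper's method instead builds data-driven models for a four-sensor array and optimizes all temperatures jointly.

import numpy as np

def mutual_information(x, labels, bins=16):
    """Plug-in MI (bits) between a continuous response x and integer class labels."""
    xb = np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])
    joint = np.zeros((bins, labels.max() + 1))
    np.add.at(joint, (xb, labels), 1)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
gases = rng.integers(0, 4, size=4000)            # four gas classes (e.g. ethanol, acetic acid, 2-butanone, acetone)
noise = rng.normal(0.0, 0.5, size=gases.size)    # sensor noise
temperatures = np.linspace(150, 450, 7)          # invented candidate heater temperatures

def response(T):
    # invented sensor model: class separation (selectivity) peaks near T = 300
    return gases * np.exp(-((T - 300.0) / 120.0) ** 2) + noise

best = max(temperatures, key=lambda T: mutual_information(response(T), gases))
print(f"operating temperature with maximal MI: {best:.0f}")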

JTD Keywords: MOX gas sensor, Temperature optimization, Limit of detection, Mutual Information, E-nose, Sensor array, Information Theory, Chemical sensing