Information Theory

Created on Jan. 14, 2013, 1:56 p.m. by Hevok & updated by iliastambler on May 2, 2013, 4:44 p.m.

Information theory is a field of applied mathematics, engineering and computer science concerned with quantifying information.

It originated in the study of the fundamental limits of signal processing operations, such as data compression, and of reliably storing and communicating data.

Aging is a life process that involves multiple parameters and the interrelation of different parameters, e.g. the expression of longevity genes.

Information theory is about handling information. Traditional statistics allows us to interpret results, but information theory precisely measures the relations and combinations of parameters; for example, it can determine how several genetic interventions work together to further extend lifespan.

Specifically, information theory makes it possible to measure the mutual information between parameters and systems. If the (normalized) mutual information is close to 1, the systems are similar; if it is close to 0, the systems are distinct.
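As a minimal sketch of this measurement, the following pure-Python code computes Shannon entropy and normalized mutual information for two discrete sequences. The function names and the choice of the average-entropy normalization are illustrative assumptions, not part of the original text.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy H(X) in bits of a discrete sequence."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def normalized_mi(xs, ys):
    """Scale I(X;Y) into [0, 1]; 1 means each variable fully
    determines the other, 0 means they are independent."""
    hx, hy = entropy(xs), entropy(ys)
    if hx == 0 or hy == 0:
        return 0.0
    return mutual_information(xs, ys) / ((hx + hy) / 2)
```

For identical sequences `normalized_mi` returns 1.0; for independent ones it returns 0.0, matching the interpretation above.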

Information theory makes it possible to combine several parameters and determine how combinations of parameters (such as drugs, genes and lifestyle factors) relate to a parameter of choice (such as lifespan).

Experts from different fields can contribute their diverse kinds of data, and information theory reveals the interrelation of all these different parameters. Whole series of diverse parameters can be used, obtained from lay persons and experts alike.

Information theory uses data from tables, but the results can be visualized as graphs (e.g. decision trees).
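The link between tables and decision trees is information gain: a tree is grown by repeatedly splitting the table on the attribute that most reduces the entropy of the target column, which is exactly the mutual information between that attribute and the target. A sketch under assumed data (the toy table of interventions and lifespan classes below is hypothetical):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy in bits of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, target):
    """Reduction in target entropy from splitting rows on attr;
    equals the mutual information I(attr; target) within the table."""
    base = entropy([r[target] for r in rows])
    n = len(rows)
    remainder = 0.0
    for value in set(r[attr] for r in rows):
        subset = [r[target] for r in rows if r[attr] == value]
        remainder += len(subset) / n * entropy(subset)
    return base - remainder

# Hypothetical toy table: which column best predicts the lifespan class?
rows = [
    {"diet": "CR",     "exercise": "yes", "lifespan": "long"},
    {"diet": "CR",     "exercise": "no",  "lifespan": "long"},
    {"diet": "ad-lib", "exercise": "yes", "lifespan": "short"},
    {"diet": "ad-lib", "exercise": "no",  "lifespan": "short"},
]
```

Here splitting on `diet` yields a gain of 1 bit (it determines the outcome completely in this toy data) while `exercise` yields 0 bits, so a tree builder would choose `diet` as the root split.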

One can establish similarities between humans and model organisms via information theory. With it, one can precisely establish whether a particular model or intervention is applicable to humans: a normalized mutual information of 1 would mean the models are identical and therefore perfectly applicable, while 0 would mean they have nothing in common.

Single parameters often have weak predictive power and are less informative than several parameters combined together by means of mutual information.
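This effect can be made concrete with a small synthetic example (the XOR-style data below is hypothetical): each of two "genes" alone carries zero mutual information with the outcome, yet the pair of them determines it completely.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy in bits of a discrete sequence."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mi(xs, ys):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical data: the outcome is the XOR of two binary "gene" states,
# so neither gene alone predicts it, but their combination does.
gene_a = [0, 0, 1, 1]
gene_b = [0, 1, 0, 1]
outcome = [a ^ b for a, b in zip(gene_a, gene_b)]

combined = list(zip(gene_a, gene_b))  # the two parameters taken jointly
```

Here `mi(gene_a, outcome)` and `mi(gene_b, outcome)` are both 0 bits, while `mi(combined, outcome)` is 1 bit, illustrating why combined parameters can be far more informative than any single one.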

Information theory provides a universal language to describe life processes in terms of entropy and mutual information.

Age-related correlation using information theory

Age-related changes are not easily captured by conventional statistics, as they are often non-linear. Aging is an entropic process.

Lifespan extension is about maintaining homeostasis. Homeostasis is the necessary condition for life. Aging and diseases, especially age-related diseases, are impairments of stable homeostasis. Information theory makes it possible to establish the degree of stability and homeostasis in particular systems and to study the effects of various interventions on homeostasis.

One of the goals of using information theory in the study of aging is to show the strong relation between aging and disease. To show this, massive amounts of health data need to be processed by means of information theory. Efficient ways to retrieve such data and build models need to be constructed, and these capabilities need to be made available to any interested researcher.


Tags: computation, mathematics, engineering, data
Categories: News, Quest
Parent: Programming
