Change - Information Theory

Created on Jan. 14, 2013, 1:56 p.m. by Hevok & updated on April 15, 2013, 7:11 p.m. by iliastambler

Information theory is a field of applied mathematics, electrical engineering and computer science that quantifies information.

It originated from finding the fundamental limits on signal processing operations, such as compressing data, and on reliably storing and communicating data.

Life, or aging, is a process that involves multiple parameters and the interrelation of different parameters, e.g. the expression of longevity genes.

Information theory is about handling information. Traditional statistics allows us to interpret results, but information theory precisely measures the relations and combinations of parameters; for example, it can determine how several genetic interventions work together to further extend lifespan.

Specifically, information theory allows one to measure the mutual information between parameters and systems. If the (normalized) mutual information is close to 1, the systems are similar; if it is close to 0, the systems are distinct.
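
As an illustrative sketch, a normalized mutual information of this kind can be estimated from discretized (binned) data, for instance with scikit-learn's normalized_mutual_info_score; the variable names and values below are purely hypothetical:

```python
# Sketch: normalized mutual information between two discretized parameters.
# Data and variable names are illustrative only.
from sklearn.metrics import normalized_mutual_info_score

# Binned expression of a longevity gene (0 = low, 1 = mid, 2 = high)
# and a lifespan category (0 = short-lived, 1 = long-lived), per individual.
gene_expression = [0, 0, 1, 1, 2, 2, 2, 0, 1, 2]
lifespan_class  = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1]

nmi = normalized_mutual_info_score(gene_expression, lifespan_class)
print(f"normalized mutual information: {nmi:.2f}")  # ~1 = strongly related, ~0 = unrelated
```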

Therefore, information theory allows one to combine several parameters and determine how combinations of parameters (such as drugs, genes and lifestyle factors) are related to a parameter of choice (such as lifespan).
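
For example, several discretized parameters can be encoded as one joint variable and compared against the target parameter; the sketch below uses scikit-learn's mutual_info_score, and all names and values are illustrative assumptions:

```python
# Sketch: relating a combination of discretized parameters (drug, gene variant,
# lifestyle) to a target parameter (lifespan class). All values are illustrative.
from sklearn.metrics import mutual_info_score

drug      = [0, 1, 0, 1, 0, 1, 0, 1]   # 0 = untreated, 1 = treated
gene      = [0, 0, 1, 1, 0, 0, 1, 1]   # 0 = wild type, 1 = longevity variant
lifestyle = [0, 0, 0, 0, 1, 1, 1, 1]   # 0 = ad libitum, 1 = dietary restriction
lifespan  = [0, 0, 0, 1, 0, 1, 1, 1]   # 0 = short-lived, 1 = long-lived

# Encode each combination of parameters as one joint label.
combination = [f"{d}-{g}-{l}" for d, g, l in zip(drug, gene, lifestyle)]

print("I(drug; lifespan)        =", mutual_info_score(drug, lifespan))
print("I(combination; lifespan) =", mutual_info_score(combination, lifespan))
```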

Experts from different fields can input their diverse kinds of data. Information theory allows one to see the interrelation of all these different parameters. Whole series of diverse parameters can be used. Lay persons can find the data for those parameters defined by experts.

Information theory uses data from tables, but the results can be visualized in graphs (e.g. decision trees).
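
As a sketch of such a visualization, an entropy-based decision tree can be fitted to a small table and printed as text; the feature names and data below are hypothetical:

```python
# Sketch: an entropy-based decision tree built from a small table and printed
# as text. Column names and data are hypothetical.
from sklearn.tree import DecisionTreeClassifier, export_text

features = [  # columns: drug, gene_variant, dietary_restriction
    [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1],
    [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1],
]
long_lived = [0, 0, 0, 1, 0, 1, 1, 1]

tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(features, long_lived)
print(export_text(tree, feature_names=["drug", "gene_variant", "dietary_restriction"]))
```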

One can establish similarities between humans and model organisms via information theory. With information theory one can precisely establish whether a particular model or intervention is applicable to humans or not: a mutual information of 1 would mean that the models are identical and therefore perfectly applicable, while 0 would mean they have nothing in common.

Single parameters often have very weak predictive power and are less informative than several parameters combined together by means of mutual information.
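
A minimal sketch of this effect, assuming an XOR-like interaction between two hypothetical interventions: each single parameter carries essentially no information about the outcome, but their combination determines it completely.

```python
# Sketch: two hypothetical interventions that are individually uninformative
# about lifespan extension but fully determine it in combination (XOR-like).
from sklearn.metrics import mutual_info_score

intervention_a = [0, 0, 1, 1] * 25
intervention_b = [0, 1, 0, 1] * 25
extended = [a ^ b for a, b in zip(intervention_a, intervention_b)]  # XOR outcome

combined = [f"{a}{b}" for a, b in zip(intervention_a, intervention_b)]

print("I(A; extended)   =", mutual_info_score(intervention_a, extended))  # ~0
print("I(B; extended)   =", mutual_info_score(intervention_b, extended))  # ~0
print("I(A,B; extended) =", mutual_info_score(combined, extended))        # ln(2) ~ 0.69 nats
```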

Information theory provides a universal language to describe life processes in terms of entropy and mutual information.
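
In standard notation, these two quantities are the Shannon entropy of a discrete variable X and the mutual information between variables X and Y:

```latex
H(X)   = -\sum_{x} p(x)\,\log p(x)
I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}
       = H(X) + H(Y) - H(X,Y)
```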


Age-related correlation using information theory
------------------------------------------------
Age-related changes are not easily ascertainable by standard statistics, as they are often non-linear. Aging is an entropic process.
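
A sketch of this point with simulated data, assuming a hypothetical U-shaped age-related marker: the linear (Pearson) correlation comes out near zero, while the mutual information after binning is clearly above zero.

```python
# Sketch: a simulated U-shaped age-related marker. Linear correlation misses
# the relation, while mutual information (after binning) detects it.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
age = rng.uniform(20, 80, 2000)
marker = (age - 50) ** 2 + rng.normal(0, 50, age.size)  # low in mid-life, high at extremes

print("Pearson correlation:", np.corrcoef(age, marker)[0, 1])  # close to 0

# Discretize both variables into quartile bins before estimating mutual information.
age_bins    = np.digitize(age, np.quantile(age, [0.25, 0.5, 0.75]))
marker_bins = np.digitize(marker, np.quantile(marker, [0.25, 0.5, 0.75]))
print("Mutual information:", mutual_info_score(age_bins, marker_bins))  # clearly > 0
```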

Lifespan extension is about maintaining homeostasis. Homeostasis is the necessary condition for life. Aging, diseases, and especially age-related diseases, are impairments of stable homeostasis. Information theory allows one to establish the degree of stability and homeostasis in particular systems and to study the effects of various interventions on homeostasis.
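
One possible sketch of such a stability measure, assuming simulated data and an arbitrary binning scheme: the entropy of a tightly regulated variable's distribution is lower than that of a dysregulated one.

```python
# Sketch: distribution entropy as a rough proxy for homeostatic stability.
# A tightly regulated variable occupies fewer states than a dysregulated one.
# Data, units and binning are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
bins = np.linspace(60, 140, 21)  # e.g. fasting glucose bins in mg/dL

stable   = rng.normal(90, 3, 5000)   # tightly regulated around a set point
unstable = rng.normal(95, 15, 5000)  # wide fluctuations (impaired homeostasis)

def distribution_entropy(values):
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())  # Shannon entropy in nats

print("entropy (stable):  ", distribution_entropy(stable))    # lower
print("entropy (unstable):", distribution_entropy(unstable))  # higher
```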

One of the goals of using information theory in the study of aging is to show the strong relation between aging and diseases and to find the best biomarkers for aging. To show this, massive amounts of health data and medical records need to be processed by means of information theory. Efficient ways to retrieve such data and build models need to be constructed, and these abilities need to be provided to any interested researcher.

