About: Chain rule for Kolmogorov complexity


The chain rule for Kolmogorov complexity is an analogue of the chain rule for information entropy, which states:

H(X,Y) = H(X) + H(Y|X)

That is, the combined randomness of two sequences X and Y is the sum of the randomness of X plus whatever randomness is left in Y once we know X. This follows immediately from the definitions of conditional and joint entropy, and the fact from probability theory that the joint probability is the product of the marginal and conditional probability:

P(X,Y) = P(X) P(Y|X)
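Spelled out (a standard derivation, not part of the source abstract), expanding the definition of joint entropy and factoring the joint probability gives:

```latex
\begin{align*}
H(X,Y) &= -\sum_{x,y} p(x,y)\,\log p(x,y)
        = -\sum_{x,y} p(x,y)\,\log\bigl(p(x)\,p(y\mid x)\bigr) \\
       &= -\sum_{x,y} p(x,y)\,\log p(x) \;-\; \sum_{x,y} p(x,y)\,\log p(y\mid x)
        = H(X) + H(Y\mid X).
\end{align*}
```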

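As a quick numerical sanity check of the entropy identity (a minimal sketch; the joint distribution below is invented for illustration):

```python
import math

# Toy joint distribution p(x, y) for X, Y in {0, 1}.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def H(dist):
    """Shannon entropy in bits of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal p(x) = sum_y p(x, y)
px = {}
for (x, _y), p in joint.items():
    px[x] = px.get(x, 0.0) + p

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x)
h_y_given_x = sum(
    p_x * H({y: p / p_x for (x2, y), p in joint.items() if x2 == x})
    for x, p_x in px.items()
)

print(H(joint))             # H(X,Y)        ~= 1.846 bits
print(H(px) + h_y_given_x)  # H(X) + H(Y|X) ~= 1.846 bits, as the chain rule predicts
```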
The equivalent statement for Kolmogorov complexity does not hold exactly; it is true only up to a logarithmic term:

K(x,y) = K(x) + K(y|x) + O(log K(x,y))
(An exact version, KP(x,y) = KP(x) + KP(y|x*) + O(1), holds for the prefix complexity KP, where x* is a shortest program for x.) The chain rule states that a shortest program printing x and y is obtained by concatenating a shortest program printing x with a program printing y given x, plus at most logarithmically many extra bits.
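K itself is uncomputable, so the chain rule cannot be checked directly, but real compressors are sometimes used as crude, computable upper-bound stand-ins for K (as in the normalized-compression-distance literature). The sketch below uses zlib in that spirit, with C(x+y) - C(x) standing in for the conditional term; it illustrates the flavour of the identity, nothing more:

```python
import zlib

def C(data: bytes) -> int:
    """Compressed length in bytes: a crude, computable upper-bound
    stand-in for the (uncomputable) Kolmogorov complexity K."""
    return len(zlib.compress(data, 9))

# y is a small edit of x, so given x there is little new information in y.
x = b"the quick brown fox jumps over the lazy dog. " * 20
y = x.replace(b"fox", b"cat")

print("C(x)         =", C(x))
print("C(y)         =", C(y))
print("C(x) + C(y)  =", C(x) + C(y))
print("C(xy)        =", C(x + y))
print("C(xy) - C(x) =", C(x + y) - C(x))  # proxy for the conditional term K(y|x)
```

Because y is nearly a copy of x, C(xy) comes out close to C(x) and far below C(x) + C(y), mirroring K(x,y) = K(x) + K(y|x) with K(y|x) small.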
The result implies that algorithmic mutual information, an analogue of mutual information for Kolmogorov complexity, is symmetric: I(x:y) = I(y:x) + O(log K(x,y)) for all x and y.
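To see why symmetry follows (a standard argument, using the convention I(x:y) = K(y) - K(y|x); other conventions agree up to logarithmic terms), solve the chain rule for K(y|x) and substitute:

```latex
\begin{align*}
I(x:y) &= K(y) - K(y\mid x) \\
       &= K(y) - \bigl(K(x,y) - K(x)\bigr) + O(\log K(x,y)) \\
       &= K(x) + K(y) - K(x,y) + O(\log K(x,y)),
\end{align*}
```

an expression symmetric in x and y, so I(x:y) = I(y:x) + O(log K(x,y)).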