http://dbpedia.org/ontology/abstract
|
The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X and Y, their joint entropy H(X,Y) (the entropy of the random variable representing the pair (X,Y)) is at most the sum of the entropies of X and of Y: H(X,Y) ≤ H(X) + H(Y). Other information-theoretic measures such as conditional information, mutual information, or total correlation can be expressed in terms of joint entropy and are thus related by the corresponding inequalities. Many inequalities satisfied by entropic vectors can be derived as linear combinations of a few basic ones, called Shannon-type inequalities. However, it has been proven that already for four variables, no finite set of linear inequalities is sufficient to characterize all entropic vectors.
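The subadditivity inequality described in the abstract can be checked numerically. The following is an illustrative sketch (not part of the DBpedia record): it estimates Shannon entropies from an assumed toy joint sample of two variables and verifies H(X,Y) ≤ H(X) + H(Y).

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Toy joint samples of two correlated random variables X and Y (assumed data).
pairs = [(0, 0), (0, 1), (1, 1), (1, 1)]
x = [p[0] for p in pairs]
y = [p[1] for p in pairs]

h_x = entropy(x)       # H(X)
h_y = entropy(y)       # H(Y)
h_xy = entropy(pairs)  # H(X, Y), entropy of the pair

# Subadditivity: joint entropy never exceeds the sum of marginal entropies.
assert h_xy <= h_x + h_y + 1e-12
```

For two variables, the triple (H(X), H(Y), H(X,Y)) is exactly the entropic vector the article refers to; subadditivity is one of the Shannon-type inequalities constraining which such triples can occur.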
|
http://dbpedia.org/ontology/wikiPageID
|
17519721
|
http://dbpedia.org/ontology/wikiPageLength
|
14119
|
http://dbpedia.org/ontology/wikiPageRevisionID
|
1103920183
|
http://dbpedia.org/ontology/wikiPageWikiLink
|
http://dbpedia.org/resource/Conditional_information +
, http://dbpedia.org/resource/Coset +
, http://dbpedia.org/resource/Inequalities_in_information_theory +
, http://dbpedia.org/resource/Convex_cone +
, http://dbpedia.org/resource/Claude_Shannon +
, http://dbpedia.org/resource/Ingleton%27s_inequality +
, http://dbpedia.org/resource/Category:Information_theory +
, http://dbpedia.org/resource/Conditional_mutual_information +
, http://dbpedia.org/resource/Kolmogorov_complexity +
, http://dbpedia.org/resource/Mutual_information +
, http://dbpedia.org/resource/Von_Neumann_entropy +
, http://dbpedia.org/resource/Quantum_information_theory +
, http://dbpedia.org/resource/Total_correlation +
, http://dbpedia.org/resource/Submodular +
, http://dbpedia.org/resource/Nicholas_Pippenger +
, http://dbpedia.org/resource/Joint_entropy +
, http://dbpedia.org/resource/Information_entropy +
, http://dbpedia.org/resource/Group_%28mathematics%29 +
, http://dbpedia.org/resource/Closure_%28topology%29 +
, http://dbpedia.org/resource/Convex_set +
, http://dbpedia.org/resource/Information_theory +
, http://dbpedia.org/resource/Discrete_uniform_distribution +
, http://dbpedia.org/resource/Andrey_Kolmogorov +
|
http://dbpedia.org/property/wikiPageUsesTemplate
|
http://dbpedia.org/resource/Template:ISBN +
|
http://purl.org/dc/terms/subject
|
http://dbpedia.org/resource/Category:Information_theory +
|
http://purl.org/linguistics/gold/hypernym
|
http://dbpedia.org/resource/Concept +
|
http://www.w3.org/ns/prov#wasDerivedFrom
|
http://en.wikipedia.org/wiki/Entropic_vector?oldid=1103920183&ns=0 +
|
http://xmlns.com/foaf/0.1/isPrimaryTopicOf
|
http://en.wikipedia.org/wiki/Entropic_vector +
|
owl:sameAs |
http://rdf.freebase.com/ns/m.047r5v9 +
, http://www.wikidata.org/entity/Q5380785 +
, https://global.dbpedia.org/id/4jSXN +
, http://dbpedia.org/resource/Entropic_vector +
|
rdfs:comment |
The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X and Y, their joint entropy H(X,Y) (the entropy of the random variable representing the pair (X,Y)) is at most the sum of the entropies of X and of Y: H(X,Y) ≤ H(X) + H(Y).
|
rdfs:label |
Entropic vector
|