English Kullback–Leibler divergence Cited by user Lfstevens on 14 Feb 2023 In mathematical statistics, the Kullback–Leibler divergence, denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how one probability distribution P is different…
English Bretagnolle–Huber inequality Cited by user Citation bot on 28 Aug 2022 In information theory, the Bretagnolle–Huber inequality bounds the total variation distance between two probability distributions P and Q by a concave and bounded…
English Pinsker's inequality Cited by user 174.91.112.91 on 05 May 2022 In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical…
English Total variation distance of probability measures Cited by user GünniX on 18 Dec 2020 In probability theory, the total variation distance is a distance measure for probability distributions.
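The four quantities listed above are closely related: Pinsker's inequality and the Bretagnolle–Huber inequality both bound the total variation distance in terms of the Kullback–Leibler divergence. A minimal sketch of the relationships, using a pair of Bernoulli distributions chosen purely for illustration:

```python
import math

# Illustrative choice: P = Bernoulli(0.5), Q = Bernoulli(0.1)
p, q = 0.5, 0.1

# Kullback–Leibler divergence D_KL(P || Q) for two Bernoulli distributions
kl = p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

# Total variation distance: TV(P, Q) = (1/2) * sum_x |P(x) - Q(x)|
tv = 0.5 * (abs(p - q) + abs((1 - p) - (1 - q)))

# Pinsker's inequality: TV(P, Q) <= sqrt(D_KL(P || Q) / 2)
pinsker_bound = math.sqrt(kl / 2)

# Bretagnolle–Huber inequality: TV(P, Q) <= sqrt(1 - exp(-D_KL(P || Q)))
# (a concave, bounded function of the divergence, so it never exceeds 1)
bh_bound = math.sqrt(1 - math.exp(-kl))

assert tv <= pinsker_bound
assert tv <= bh_bound
```

Note that the Bretagnolle–Huber bound stays below 1 even when the divergence is large, whereas the Pinsker bound grows without limit; which bound is tighter depends on the size of D_KL(P ∥ Q).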