
An Unforeseen Equivalence between Uncertainty and Entropy

Muller, Tim

Authors

Tim Muller (Tim.Muller@nottingham.ac.uk)
Assistant Professor



Abstract

Uncertainty and entropy are related concepts, so some overlap is to be expected, but the equality shown in this paper is unexpected. In Beta models, interactions between agents serve as evidence used to construct Beta distributions. In models based on the Beta model, such as Subjective Logic, uncertainty is defined to be inversely proportional to evidence. An entropy measure quantifies how much information a distribution lacks. Uncertainty was neither intended nor expected to be an entropy measure. We discover that a specific entropy measure (EDRB) coincides with uncertainty whenever uncertainty is defined. EDRB is the expected Kullback-Leibler divergence between two Bernoulli trials with parameters randomly selected from the distribution. EDRB allows us to apply the notion of uncertainty to other distributions that may occur in the context of Beta models.
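The abstract does not spell out the formula for EDRB, so the following is only a hedged illustration of the claimed coincidence. It assumes the Subjective Logic convention that r positive and s negative observations give the posterior Beta(r+1, s+1) with uncertainty u = 2/(r+s+2), and it reads EDRB as the expected symmetrised KL divergence between two Bernoulli trials whose parameters are drawn independently from that posterior; the paper itself may use a slightly different normalisation. Under these assumptions a Monte Carlo estimate of EDRB matches u numerically:

```python
import math
import random

def edrb_mc(r, s, n=200_000, seed=42):
    """Monte Carlo estimate of the expected symmetrised KL divergence
    between two Bernoulli trials with parameters drawn i.i.d. from the
    posterior Beta(r+1, s+1).  (Assumed reading of EDRB, not the
    paper's definition verbatim.)"""
    rng = random.Random(seed)
    a, b = r + 1, s + 1
    total = 0.0
    for _ in range(n):
        x = rng.betavariate(a, b)
        y = rng.betavariate(a, b)
        # KL(Bern(x)||Bern(y)) + KL(Bern(y)||Bern(x)) simplifies to:
        total += (x - y) * (math.log(x / y) - math.log((1 - x) / (1 - y)))
    return total / n

def uncertainty(r, s):
    """Subjective Logic uncertainty for r positive, s negative observations."""
    return 2.0 / (r + s + 2.0)
```

For example, with r = 2 and s = 4 the uncertainty is 2/8 = 0.25, and `edrb_mc(2, 4)` returns an estimate close to that value. This also hints at the abstract's final point: `edrb_mc` only needs the ability to sample from a distribution, so the same quantity can be evaluated for distributions other than the Beta.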

Citation

Muller, T. (2019). An Unforeseen Equivalence between Uncertainty and Entropy

Conference Name 13th IFIP WG 11.11 International Conference on Trust Management
Start Date Jul 17, 2019
End Date Jul 19, 2019
Acceptance Date May 24, 2019
Publication Date Jul 19, 2019
Deposit Date Jul 8, 2019
Publicly Available Date Jul 8, 2019
Public URL https://nottingham-repository.worktribe.com/output/2142227
Related Public URLs http://ifiptm2019.compute.dtu.dk/IFIPTM19/IFIPTM.html
