UFOs: ‘The truth is still out there’

A Boolean variable ordinarily takes one of exactly two values, true or false; only certain systems, such as SQL's three-valued logic or Java's boxed Boolean type, admit a third state, null, meaning "unknown". Most of the universe is still a mystery, whatever the fact checkers say.
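The three-valued case can be sketched in Python, where `None` plays the role of null/unknown (a minimal illustration of Kleene's three-valued logic, the same semantics SQL uses for NULL; the function name `tri_and` is our own):

```python
from typing import Optional

def tri_and(a: Optional[bool], b: Optional[bool]) -> Optional[bool]:
    """Three-valued AND in Kleene logic: None stands for 'unknown'.
    A known False decides the result regardless of the unknown operand;
    otherwise any unknown operand makes the result unknown."""
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

print(tri_and(False, None))  # False: known no matter what the unknown is
print(tri_and(True, None))   # None:  still unknown
```

Note that `tri_and(False, None)` is False, not unknown: once one operand is false, the conjunction is settled, which is exactly why SQL's `WHERE x AND y` can filter rows even when one column is NULL.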

In information theory, the entropy of a random variable is the average level of “information”, “surprise”, or “uncertainty” inherent in the variable’s possible outcomes. The concept of information entropy was introduced by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication”, and is sometimes called Shannon entropy in his honor.
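Concretely, for a discrete random variable with outcome probabilities p(x), the Shannon entropy is H(X) = −Σ p(x) log₂ p(x), measured in bits. A minimal sketch (the helper name `shannon_entropy` is our own):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = sum over outcomes of -p * log2(p).
    Terms with p == 0 contribute nothing, by the convention 0*log(0) = 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))          # 1.0: a fair coin is one bit of surprise
print(shannon_entropy([1.0]))               # 0.0: a certain outcome carries no surprise
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0: a fair four-way choice
```

The fair coin maximizes entropy for two outcomes; any bias toward one side reduces the average surprise per flip.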

There are more things in heaven and earth, Horatio,

Than are dreamt of in your philosophy.

Hamlet (Act 1, Scene 5)