information entropy

Meaning

Noun

  • A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the average amount of information (measured in, say, bits) contained per character in a stream of characters.
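
For illustration, the quantity described above is standardly computed with Shannon's formula: for a discrete random variable X taking value x with probability p(x),

    H(X) = -\sum_{x} p(x) \log_2 p(x)

where using the base-2 logarithm expresses the result in bits; a fair coin flip, for example, has entropy -(1/2 \log_2 1/2 + 1/2 \log_2 1/2) = 1 bit.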
