| entropy | computing dictionary |
<theory> A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy).
The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (random bits or characters) is much harder, if not impossible, to compress in this way.
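The run-length encoding mentioned above can be sketched in a few lines of Python; the function name and the (symbol, count) pair representation follow the example in the text:

```python
def run_length_encode(s):
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    encoded = []
    for ch in s:
        if encoded and encoded[-1][0] == ch:
            # Extend the current run of this symbol.
            encoded[-1] = (ch, encoded[-1][1] + 1)
        else:
            # Start a new run.
            encoded.append((ch, 1))
    return encoded

# A highly ordered string compresses to a single pair:
print(run_length_encode("0" * 1_000_000))  # [('0', 1000000)]
```

A random string gains nothing from this scheme: in the worst case every symbol differs from its neighbour, and the encoded list of pairs is longer than the input.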
Shannon's formula gives the entropy H(M) of a message M in bits:
H(M) = -log2 p(M)
Where p(M) is the probability of message M.
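The formula can be sketched directly in Python; the fair-coin example (p = 0.5) is an illustrative assumption, not part of the entry:

```python
import math

def information_bits(p):
    """Bits of information in a message with probability p: H(M) = -log2 p(M)."""
    return -math.log2(p)

# A fair coin flip (p = 0.5) carries exactly one bit of information.
print(information_bits(0.5))  # 1.0
```

Rarer messages carry more bits: a message with probability 1/8 yields 3 bits, while a certain message (p = 1) yields 0.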
| entropy | medical dictionary |
<radiobiology> The amount of disorder in a system.
(09 Oct 1997)