Entropy is…
…still hard to define…
“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’” - Claude E. Shannon.
“The entropy of a random variable X is a mathematical measure of the expected amount of information provided by an observation of X. As such, entropy is always relative to an observer and his or her knowledge prior to an observation.” - US National Institute of Standards and Technology (NIST).
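In symbols, the quantity NIST is describing is Shannon’s entropy formula, where the probabilities p(x) encode the observer’s knowledge prior to the observation:

$$H(X) = -\sum_{x} p(x)\,\log_2 p(x) \quad \text{bits}$$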
“Entropy is always relative to an observer…” is the relevant and most misunderstood part. The ‘observer’ can be computer code or an algorithm; it doesn’t have to be a real person. The consequence is that fixed files contain entropy. If they didn’t, every file would compress to absolutely nothing, we’d all be downloading at infinite speed and hard disks would never fill up. As daft as it sounds, there are pro-am cryptographers out there who vigorously maintain that only sources have entropy, and not their products. PKZIP proves them very wrong, although the argument does vindicate Shannon’s quotation above.
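As a minimal sketch of that argument (plain Python standard library only; the file name is made up for illustration), the snippet below estimates the entropy of a fixed file from its byte frequencies and compares the file’s size with its zlib-compressed size. The compressed output is never zero bytes, which is the point: the fixed file carries entropy relative to this observer, i.e. the byte-frequency model and the DEFLATE compressor.

```python
import collections
import math
import zlib

def byte_entropy(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte, treating the byte
    frequencies as the observer's probability model of the file."""
    if not data:
        return 0.0
    total = len(data)
    counts = collections.Counter(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# "example.bin" is a hypothetical file name, purely for illustration.
with open("example.bin", "rb") as f:
    data = f.read()

print(f"Entropy estimate : {byte_entropy(data):.3f} bits/byte")
print(f"Original size    : {len(data):,} bytes")
print(f"zlib (level 9)   : {len(zlib.compress(data, 9)):,} bytes")
```

A different observer (a better model, a cleverer compressor) will assign the same file a different entropy, which is exactly what the NIST wording about being “relative to an observer” means.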
Also see shannon++ for further work on entropy measurement, and the problems therein. But there is a terrific picture of it here ↓