>>16724474I don't particularly like thermodynamics either (or physics for that matter), but entropy itself isn't a particularly confusing or "unscientific" concept. People say a lot of dumb things with "entropy" used as a justification, but entropy is just another way of measuring "disorder" of a particular random variable.
Unlike variance (which tells you how much a set of random samples will spread out), entropy measures disorder in the sense of how much knowing previous samples reduces your uncertainty about future ones. If knowing the previous observation X_0 significantly helps you guess what X_1 will be the next time you make an observation, that's a "low entropy" or structured system. If knowing X_0 barely changes how wrong your next guess at X_1 is, that's a "high entropy" or very unstructured system.
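Rough Python sketch of that idea (not from any particular textbook, numbers are made up for illustration): compare H(X_1) with H(X_1 | X_0) for a "structured" process where the next sample usually repeats the last one, versus independent fair coin flips.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_entropy(x0, x1):
    """H(X_1 | X_0) estimated from paired samples of two binary variables."""
    joint = np.zeros((2, 2))
    for a, b in zip(x0, x1):
        joint[a, b] += 1
    joint /= joint.sum()
    px0 = joint.sum(axis=1)
    h = 0.0
    for a in range(2):
        if px0[a] > 0:
            h += px0[a] * entropy(joint[a] / px0[a])
    return h

n = 100_000

# Structured: X_1 copies X_0 90% of the time.
x0 = rng.integers(0, 2, size=n)
flip = rng.random(n) < 0.1
x1 = np.where(flip, 1 - x0, x0)
print("structured:   H(X1) ~", entropy(np.bincount(x1, minlength=2) / n),
      " H(X1|X0) ~", conditional_entropy(x0, x1))

# Unstructured: X_1 is an independent fair coin flip.
y1 = rng.integers(0, 2, size=n)
print("unstructured: H(X1) ~", entropy(np.bincount(y1, minlength=2) / n),
      " H(X1|X0) ~", conditional_entropy(x0, y1))
```

In the structured case H(X_1 | X_0) drops to around 0.47 bits while H(X_1) stays near 1 bit; in the unstructured case knowing X_0 buys you nothing and both stay near 1 bit.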
This is mostly useful in statistics, since it lets you quantify how much taking another observation can reduce the uncertainty in an estimate. It shows up in plenty of other contexts as well, though.
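For the "uncertainty shrinks as you observe more" point, here's a quick sketch (my own toy example, assuming a coin-flip estimation setup with a uniform prior): the differential entropy of the Beta posterior over the coin's bias keeps dropping as flips come in.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1)
true_p = 0.3
flips = rng.random(1000) < true_p  # simulated coin flips with bias 0.3

for n in (0, 10, 100, 1000):
    heads = int(flips[:n].sum())
    posterior = beta(1 + heads, 1 + n - heads)  # Beta posterior from a uniform prior
    # Differential entropy (nats); it can go negative as the posterior narrows,
    # but the downward trend is the point: more data, less uncertainty.
    print(f"after {n:4d} flips: posterior entropy ~ {posterior.entropy():.3f} nats")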