Like angles can be measured in "radians," "cycles," or "degrees," entropy can be measured in "bits," "digits," or "nats." The units for thermodynamic entropy are $\mathrm{J\,K^{-1}}$, and you can regard Boltzmann's constant as a units-conversion factor, so you can think of measuring entropy in $\mathrm{J\,K^{-1}}$ as measuring entropy in approximately deci-Avogadro's number of nats.

Now, there are many who object that thermodynamic entropy and information entropy are not the same thing. For starters, information-theory entropy is observer dependent, while thermodynamic entropy is, to the best of our ability to determine it, not. I think, honestly, that thermodynamic entropy may end up being observer dependent, but only if quantum entanglement can make the number of states available to the system observer dependent.

In a little more detail, the Boltzmann formula for entropy is
$$S = k_B \ln W,$$
where $W$ is the number of microstates consistent with the macro-state of the system. This is the form of entropy that you use for an isolated system. If you compare it to the Gibbs entropy, $S = -k_B \sum_i p_i \ln p_i$, it is the same as if you assume all states with non-zero probability are equally probable: set $p_i = 1/W$ and the sum reduces to $k_B \ln W$. Information theory borrowed the formula for the Gibbs entropy; that's why "information" came to be called "entropy." There the probabilities aren't tied to "states of the system" but to some enumerated set of symbols that could be produced by a communication channel. If you say that the output of a communication channel is analogous to the outcome of an experiment measuring the state of a system, it's tempting to say that these quantities are the same thing.

Classically, you could invoke a Maxwell's-demon type character who knows the micro-state of the system/the future outputs of the channel. For this demon, both systems would have zero information entropy. In information theory that isn't a problem, since the entropy there is just a measure of the surprise of the observer of the channel's output. For thermodynamics, though, the entropy is tied to things like temperature, chemical reaction rates, internal energy, etc. All of those things have an observer-independent status to them. Furthermore, an observer who is simply ignorant of some aspect of the system's macro-state (e.g. …)
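For concreteness, here is a minimal Python sketch (not part of the original answer) illustrating the two claims above: that the Gibbs/Shannon formula reduces to the Boltzmann form $\ln W$ for a uniform distribution, and that Boltzmann's constant acts as a units-conversion factor between $\mathrm{J\,K^{-1}}$ and nats. The function name `shannon_entropy_nats` is my own illustrative choice.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact in the 2019 SI)
N_A = 6.02214076e23  # Avogadro's number (exact in the 2019 SI)

def shannon_entropy_nats(probs):
    """Gibbs/Shannon entropy H = -sum_i p_i ln(p_i), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W microstates reduces to the Boltzmann form ln(W).
W = 8
uniform = [1.0 / W] * W
print(shannon_entropy_nats(uniform))  # ~2.079 nats
print(math.log(W))                    # same: ln(8) ~ 2.079

# Boltzmann's constant as a units-conversion factor:
# S[J/K] = k_B * S[nats], so 1 J/K corresponds to 1/k_B nats.
print(f"1 J/K = {1.0 / k_B:.3e} nats")         # ~7.243e+22 nats
print(f"deci-Avogadro = {N_A / 10:.3e}")       # ~6.022e+22, same order
```

The last two lines print $1/k_B \approx 7.24 \times 10^{22}$ nats per $\mathrm{J\,K^{-1}}$ against $N_A/10 \approx 6.02 \times 10^{22}$, which is the sense in which one $\mathrm{J\,K^{-1}}$ is "approximately deci-Avogadro's number of nats."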