>Edit: the GPS receiver in my phone is giving me some coordinates defined as “position” that happen to be in the middle of the road. However, I know precisely where I am. Don’t you think that the meaning of that “position” is somehow affected by this additional information?
No, it is not affected by it. The meaning of "position" never changes. Your knowledge of your position can change, but your actual position exists regardless of your knowledge or the inaccuracies of your tools.
>As in jbay808’s xkcd example, if you have a random number generator and you know the sequence of numbers that will be generated, do you have a random number generator? The random number generator is still giving you a number defined as “random”, right?
Random number generators are a rabbit hole. There isn't even a proper mathematical definition for them. We're not sure what a random number is... we just have an intuition for it. Case in point, the xkcd article could not define it mathematically. This is the reason the joke exists: we're not truly sure what a random number is, or whether random numbers are even a thing. We have an intuition for what a random number is, but this is likely some kind of illusion, similar to the many optical illusions produced by our visual cortex. If our intuitions cannot be formalized, then there is a good chance the intuition is not even real.
>Statistical mechanics is based on looking at the ensemble of microscopic descriptions possible given what is known about the system and their probabilities.
Ok, take a look at this: https://math.stackexchange.com/questions/2916887/shannon-ent...

They're talking about deriving the entropy formula for fair dice. But they talk about it as if we had no knowledge of physics, momentum and projectile motion. We have the power to simulate the dice in a computer and know the EXACT outcome of a throw. A die is a cube and is easily modeled with mathematics. So why does the above discussion even exist? What is the point of fantasizing about dice as if we had no idea how to mechanically calculate the outcome? The point is that they chose a specific set of macrostates that has a uniform distribution across all the outcomes. It is a choice that is independent of knowledge.
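To make the "choice of outcomes" point concrete, here is a minimal sketch (my own illustration, not from the linked thread) that computes the Shannon entropy for two different choices of how to carve up the same die - the usual six faces versus a coarser "low/high" grouping. The number you get depends on the set of macrostates you decided to track, before any question of what you happen to know about a particular throw.

    import math

    def shannon_entropy(probs):
        # Shannon entropy, in bits, of a discrete probability distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Choice 1: the usual six equally likely faces.
    print(shannon_entropy([1/6] * 6))   # log2(6), about 2.585 bits

    # Choice 2: a coarser set of outcomes, "low" (1-3) vs "high" (4-6).
    print(shannon_entropy([1/2, 1/2]))  # exactly 1 bit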
You didn't address the first line in my comment about the definition and meaning of "pressure" so maybe we actually agree.
To elaborate a bit, one may define "pressure" as the reading of a device that measures its exchange of momentum with the particles of gas averaged over time. The last bit is important because those microscopic impacts are discrete events. If we know [in a classical mechanics framework] the state of every particle in the gas we can predict when they will happen - and successfully calculate the (averaged) "pressure" measurement.
However, one may also define and interpret "pressure" as a variable that - together with volume and temperature - characterizes completely the behaviour of an ideal gas in equilibrium. But if we had precise knowledge of the physical state we could in principle do things that description forbids - like compressing the gas without effort or creating a temperature gradient.
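Here is a rough sketch of the first definition (my own toy example, with made-up numbers): point particles of an ideal gas bouncing elastically in a 1D box, with "pressure" read off as the momentum delivered to one wall per unit time, averaged over the run. It should roughly agree with the ideal-gas value N*kB*T/L.

    import math
    import random

    kB = 1.380649e-23      # J/K
    T = 300.0              # K
    m = 6.6e-27            # kg, roughly one helium atom
    N = 1000               # particles
    L = 1e-6               # box length, m
    dt = 1e-10             # time step, s
    steps = 5000

    random.seed(0)
    x = [random.uniform(0, L) for _ in range(N)]
    v = [random.gauss(0.0, math.sqrt(kB * T / m)) for _ in range(N)]

    impulse = 0.0          # momentum handed to the right wall
    for _ in range(steps):
        for i in range(N):
            x[i] += v[i] * dt
            if x[i] > L:                  # hit the right wall: reflect, record the kick
                x[i] = 2 * L - x[i]
                impulse += 2 * m * abs(v[i])
                v[i] = -v[i]
            elif x[i] < 0:                # hit the left wall: just reflect
                x[i] = -x[i]
                v[i] = -v[i]

    measured = impulse / (steps * dt)     # time-averaged force on the wall
    print(measured, N * kB * T / L)       # the two numbers should be close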
If we have a fish contaminated with mercury and a concentration of 0.01% completely characterizes its toxicity, we won't eat it. If we also know that the mercury is only on the surface we won't eat it either, but in principle we could if we were careful. The mercury content of the fish remains the same although the meaning of that number changes - but of course if we're a bear unable to clean our fish the additional information doesn't change anything at all.
> They're talking about deriving the entropy formula for fair dice. But they talk about it as if we had no knowledge of physics, momentum and projectile motion. We have the power to simulate the dice in a computer and know the EXACT outcome of a throw. A die is a cube and is easily modeled with mathematics. So why does the above discussion even exist? What is the point of fantasizing about dice as if we had no idea how to mechanically calculate the outcome? The point is that they chose a specific set of macrostates that has a uniform distribution across all the outcomes. It is a choice that is independent of knowledge.
I can make a model where the moon is made of cheese. That model is independent of any knowledge about the true nature of the moon. But if I visit the moon and find that - surprisingly! - it's made of lunar rock I may re-evaluate the pertinence of that model.
The model where all the outcomes of the die are equally likely is particularly useful when all the outcomes of the die are in fact equally likely. If you have no additional knowledge - apart from the number of outcomes - you have no reason to prefer one outcome over another. All of them are equally likely - to you. You can calculate the entropy of one event assuming that there are six equally probable possible outcomes.
If I know exactly the future outcomes of the die - 4, 2, 5, 1, ... - I can also calculate the entropy of each event, assuming that there is one single possible outcome that will happen with certainty. You have one model. I have one model. Are all models created equal? If we play some game you'll painfully realize that my model was better than yours - or at least you'll believe that I'm incredibly lucky.
All mathematical formulas representing physical phenomena are called models. Some models are more accurate than other models.
Entropy is one such model. The mathematical input parameter that goes into this model is a macrostate. We are also fully aware that the model is an approximation, just like we're aware that Newtonian mechanics and probability itself are approximations.
If you feel entropy is too vague a description then you can choose to use another model for the system - one with billions of parameters that can record the exact state of the system. Or you can use entropy, which has its uses, just like classical mechanics still has its uses.
Ok, we agree then. Models may or may not represent a physical reality. They may be in conflict with reality - as in "the moon made of cheese". They may be incomplete - as in "the fish is 0.01% mercury". Those inaccuracies may or may not have practical relevance. Fundamentally it makes a difference though. In principle, someone with a better model of the die can consistently win bets contradicting the predictions of the "fair die" model and someone with a better model of the gas can do things forbidden by the "entropy is a measure of the energy unavailable for doing useful work" interpretation.
To reconcile those views in the context of your first comment: "Entropy is not a function of knowledge."
Entropy is a function of the macrostate. The macrostate is defined by state variables (the constraints on the system). Those state variables represent what is known about the system. Given P1, T1 we calculate S(P1, T1). Given P2, T2 we calculate S(P2, T2). The entropy obviously changes with our knowledge in the sense that if we know that the pressure is P1 and the temperature is T1 we calculate one value, and if we know that the pressure is P2 and the temperature is T2 we calculate a different value. If we don't know P and T we cannot calculate _one_ "entropy value" for the system at all because the corresponding macrostate is not defined.
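To give this a concrete shape (a small sketch with made-up numbers, assuming the textbook ideal-gas relation dS = n*Cp*dT/T - n*R*dP/P): different measured inputs give different entropy values for the same amount of gas.

    import math

    # Entropy difference of n moles of an ideal gas between macrostates
    # (P1, T1) and (P2, T2). The numbers below are purely illustrative.
    R = 8.314            # J/(mol K)
    Cp = 5 * R / 2       # molar heat capacity of a monatomic ideal gas
    n = 1.0              # mol

    def delta_S(P1, T1, P2, T2):
        return n * Cp * math.log(T2 / T1) - n * R * math.log(P2 / P1)

    print(delta_S(1e5, 300.0, 1e5, 600.0))   # heating at constant pressure
    print(delta_S(1e5, 300.0, 2e5, 300.0))   # compressing at constant temperature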
"Two people with varying and different levels of knowledge of a system does not mean the system has two different entropy values."
What is the “entropy value of the system”?
Imagine that the system is composed of two containers with equal volumes of an ideal gas at the same temperature and pressure that are then put together - the volume is now the sum of the volumes, the pressure and temperature don’t change.
Alice can calculate S1 and S2 and the final entropy is SA=S1+S2.
Bob knows something that Alice doesn't: that it was hydrogen in one container and helium in the other. They will mix and he can calculate that in the end SB > S1 + S2.
What is the “entropy value of the system”? It seems to be more a property of the description of the system than of the system itself.
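To put rough numbers on the Alice/Bob scenario (a sketch under the usual ideal-gas assumptions, with an arbitrary value for S1 and S2): Alice simply adds the two entropies, while Bob adds the entropy of mixing, 2nR ln 2, on top.

    import math

    # Two containers, n moles each, at the same T and P, joined together.
    R = 8.314            # J/(mol K)
    n = 1.0              # mol in each container
    S1 = S2 = 150.0      # illustrative entropy of each container, J/K

    # Alice's model: the same gas on both sides, so joining changes nothing.
    S_alice = S1 + S2

    # Bob's model: two different gases (say H2 and He) mix irreversibly.
    S_mix = 2 * n * R * math.log(2)     # entropy of mixing equal amounts
    S_bob = S1 + S2 + S_mix

    print(S_alice, S_bob)               # Bob's value is larger by about 11.5 J/K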
>What is the “entropy value of the system”? It seems to be more a property of the description of the system than of the system itself.
Yes. That is what entropy is as defined.
>If we don't know P and T we cannot calculate _one_ "entropy value" for the system at all because the corresponding macrostate is not defined.
If the input is the macrostate, and you don't know the macrostate, then you can't calculate the value. That's pretty basic, and it applies to ANY model. If you don't know the input variables, you can't calculate anything. Nobody talks about mathematical models this way. This applies to everything.
I don't think you picked up on my model argument either. You seem to think you made progress on us agreeing that entropy is a "model." I'm saying every single math formula that represents physical phenomena on the face of the earth is a "model." Thus it's a pointless thing to bring up. It's like saying all mathematical formulas involve math. If your claim is that entropy uniquely has a parameter called knowledge that affects its outcome, then citing properties universal to everything doesn't lend evidence to your case.
Let's "reconcile" everything:
You're implying that there is some input parameter modeled after knowledge, and that this input parameter affects the outcome of the entropy calculation. I am saying no such parameter exists. Now you're saying that knowledge of the input parameter itself is what you're talking about. If you don't know the input parameter you can't perform the calculation.
The above is an argument for everything. For ANY model on the face of the earth, if you don't know the input parameters you can't derive the output. Entropy is not unique in this property, and by implication we're obviously talking about how you believe entropy is uniquely relative to knowledge.
>Alice can calculate S1 and S2 and the final entropy is SA=S1+S2.
Who says you can add these two entropies, S1 and S2, together? The macrostates are different, and mixing the two gases likely produces a third, unique set of macrostates independent of the initial two.
> You seem to think you made progress on us agreeing that entropy is a "model."
I thought we had agreed that entropy is something you calculate with a model, in fact.
> You're implying that there is some input parameter modeled after knowledge.
I was trying to say that the inputs to S(...) are the things that we know because we did measure them or set their values. It seems that we agree on that because it's extremely obvious.
Hopefully we also agree that if there are other relevant things that we know in addition to the inputs to that model we could refine our model. I fully acknowledge that we may choose to ignore the additional knowledge and keep using the old model - and it may be good enough for some uses. (We may also choose to incorporate the additional knowledge. Maybe it rules out some microstates and we could be using a smaller macroset to represent what we know about the system.)
When all we know is the macrostate, the macrostate is the most detailed description - and gives the most precise predictions - available to us regarding the system. However, if we know more, the original macrostate is no longer "complete", because we do know - and we can predict - more precise things. There is a fundamental change from "the macrostate represents all we know and is the basis of everything we can predict" to "not the case anymore".
Which also seems obvious. Probably we agree on that as well! (Sure, it applies to everything. Anytime one ignores information one has a suboptimal model compared to the model one could have. The improved model may or may not be better for a particular purpose.)
> Who says you can add these two entropies, S1 and S2, together?
Alice, who considers two equal volumes of an ideal gas at the same temperature and pressure.
> The macrostates are different
They were the same in my example. Same volume. Same temperature. Same pressure.
> and mixing the two gases likely produces a third, unique set of macrostates independent of the initial two
For an ideal gas, doubling the volume and the number of particles (so the pressure remains the same for a fixed temperature) doubles the entropy. If you have two identical systems the total entropy doesn't change when you put together the two containers, resulting in a single container twice as large with twice as many particles.
If you thought that the number of microstates - and the entropy - increases when you bring together two identical systems because they will mix with each other, that's not correct. (Even though there are still debates about this issue 120 years later.)
The entropy would increase however if they are different ideal gases (it doesn't matter how different). Bob - who knows that they are different - would calculate the correct entropy.
It could also be the other way around. Maybe they're actually the same gas but Bob treats them as different because he isn't aware of that and keeps to the general case. He calculates an increase in entropy due to the mixing, while for Alice, who knows that they are the same gas, the total entropy hasn't changed.
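A back-of-the-envelope check of both cases (a sketch only, using the Sackur-Tetrode formula for a monatomic ideal gas, with the particle mass arbitrarily set to helium's): merging two identical containers leaves the total entropy unchanged, while letting two different gases expand into each other adds about kB ln 2 per particle.

    import math

    # Sackur-Tetrode entropy of a monatomic ideal gas at fixed temperature.
    kB = 1.380649e-23        # J/K
    h = 6.62607015e-34       # J s
    m = 6.6464731e-27        # kg (helium, as an arbitrary example)
    T = 300.0                # K

    def S(N, V):
        lam = h / math.sqrt(2 * math.pi * m * kB * T)   # thermal de Broglie wavelength
        return N * kB * (math.log(V / (N * lam**3)) + 2.5)

    N, V = 1e22, 1e-3        # particles, m^3

    # Same gas on both sides: joining the containers adds nothing.
    print(S(2 * N, 2 * V) - 2 * S(N, V))                 # ~0

    # Different gases: each species expands from V into 2V.
    print(2 * (S(N, 2 * V) - S(N, V)) / (2 * N * kB))    # ~ln 2 per particle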
As Maxwell wrote: "Now, when we say that two gases are the same, we mean that we cannot separate the one from the other by any known reaction. It is not probable, but it is possible, that two gases derived from different sources but hitherto regarded to be the same, may hereafter be found to be different, and that a method be discovered for separating them by a reversible process."
If we think that the two gases are the same the entropy is 2S, but if we later discover a way to tell one apart from the other the entropy is higher (there are more microstates compatible with the macrostate than we initially thought).
>I thought we had agreed that entropy is something you calculate with a model,
We did agree. I never said otherwise. Where are you getting this idea? I'm saying our agreement on this fact is useless. Why don't you actually fully read what I wrote.
>I was trying to say that the inputs to S(...) are the things that we know because we did measure them or set their values. It seems that we agree on that because it's extremely obvious.
I spent paragraphs remarking on this ALREADY. I get what you're saying. You're not even reading what I wrote. Every mathematical model has this property you describe. It is not unique to entropy. If you don't know the parameters of even the Pythagorean theorem, then you can't calculate the length of the hypotenuse. Does this mean the Pythagorean theorem depends on your knowledge of the system? Yes, but kind of a pointless thing, right? If this is the point you're trying to make, which I highly doubt, then why are we focusing only on entropy? Because knowledge of any system is REQUIRED for every single mathematical model that exists, or the model is useless.
I don't think you're clear about the argument either. If you're not talking about knowledge as a quantifiable input parameter then I don't think you're clear about what's going on.
>I fully acknowledge that we may choose to ignore the additional knowledge and keep using the old model - and it may be good enough for some uses.
Entropy is used with full knowledge that it's a fuzzy model. It's based on probability. It doesn't matter whether we "ignore" or simply don't know the additional properties of the system. The model doesn't incorporate that data regardless of whether that information is known or not known.
>They were the same in my example. Same volume. Same temperature. Same pressure.
No. The Boltzmann distribution changes with gas type as well. The models are different.
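For what it's worth, the velocity distributions do depend on the particle mass - a small sketch (my own numbers, nothing more): the Maxwell-Boltzmann mean speed scales as 1/sqrt(m), so H2 and He at the same temperature have noticeably different speed distributions.

    import math

    # Maxwell-Boltzmann mean speed: <v> = sqrt(8 kB T / (pi m)).
    kB = 1.380649e-23                  # J/K
    T = 300.0                          # K
    N_A = 6.022e23                     # 1/mol

    def mean_speed(molar_mass_kg):
        m = molar_mass_kg / N_A        # mass of one molecule
        return math.sqrt(8 * kB * T / (math.pi * m))

    print(mean_speed(2.016e-3))        # H2: about 1780 m/s
    print(mean_speed(4.003e-3))        # He: about 1260 m/s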
>For an ideal gas, doubling the volume and the number of particles
In this case yes. But only for an ideal gas. I don't recall if you mentioned the gases were both ideal. Let me check. You did mention it. But then you mention the gases are different - hydrogen and helium. Neither gas is technically ideal, and quantum mechanical effects would likely influence the Boltzmann distribution when mixed. There are contradictions in your example that make it unclear.
The article you linked explains it away: it's the choice of macrostates that affects the entropy outcome. The article says it's subjective in the sense that the choice of macrostates is yours. The macrostates don't change based on your knowledge. You choose the one you want.
>>I thought we had agreed that entropy is something you calculate with a model,
>We did agree. I never said otherwise. Where are you getting this idea? I'm saying our agreement on this fact is useless. Why don't you actually fully read what I wrote.
It was a minor correction. I wouldn't say that entropy is a "model". But essentially we agree, that's what I meant. We agree that we agree!
>> I was trying to say that the inputs to S(...) are the things that we know because we did measure them or set their values. It seems that we agree on that because it's extremely obvious.
> I spent paragraphs remarking on this ALREADY.
Again, I was stressing that we had also reached a clear agreement on that point. (Except that I don't know what you mean by me implying something about "some input parameter modeled after knowledge" if every input corresponds to knowledge and that's a pointless thing to discuss.)
> The macrostates don't change based on your knowledge. You choose the one you want.
And if you want, you can choose a new one when your knowledge changes! One that corresponds to everything you know now about the physical state. Then you can do statistical mechanics over the ensemble of states that may be the underlying unknown physical state - with different probabilities - conditional on everything you know. In principle, at least.
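A toy version of "choose a new one when your knowledge changes" (back to the die, purely for illustration): start from the uniform ensemble, learn something partial about the throw, and redo the same calculation over the smaller set of states compatible with what you now know.

    import math

    def H(probs):
        # Shannon entropy in bits.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Before: all six faces are compatible with what we know.
    prior = [1/6] * 6
    print(H(prior))                    # log2(6), about 2.585 bits

    # After learning "the outcome is even": keep the compatible states, renormalize.
    posterior = [1/3] * 3              # faces 2, 4 and 6
    print(H(posterior))                # log2(3), about 1.585 bits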
>Again, I was stressing that we had also reached a clear agreement on that point.
And I'm stressing that the agreement is pointless: even bringing up the fact that entropy is a model doesn't move the needle in any direction. You haven't responded to that. I still don't understand why you brought it up. Please explain.
I also don't understand how zero knowledge of the input parameters applies here. This argument works for every model in existence and is not unique to entropy. Again, I'm not sure why you're bringing that up. Please explain.
I never said it was unique to entropy. (And I still don't understand if there is some meaning that escapes me in calling entropy a “model”. Are temperature and pressure also “models”, or is it unique to entropy?)
If I have a system in thermal equilibrium with a heat bath and know the macrostate P, V, T, I can calculate the energy of the system only as a probability distribution - its exact value is undetermined. If I knew the state precisely I could calculate the exact energy of the system.
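A toy example of that first sentence (a sketch with made-up energy levels): in contact with a bath at temperature T the system's energy is only a Boltzmann-weighted distribution; given the exact microstate it would just be a single number.

    import math

    # A made-up system with four energy levels, in equilibrium with a bath at T.
    kB = 1.0                           # work in units where kB = 1
    T = 1.0
    levels = [0.0, 1.0, 2.0, 3.0]      # arbitrary illustrative energies

    weights = [math.exp(-E / (kB * T)) for E in levels]
    Z = sum(weights)
    probs = [w / Z for w in weights]

    mean_E = sum(p * E for p, E in zip(probs, levels))
    var_E = sum(p * (E - mean_E) ** 2 for p, E in zip(probs, levels))

    # Knowing only the macrostate (T), the energy is a distribution:
    print(mean_E, math.sqrt(var_E))    # its mean and its spread
    # Knowing the microstate, it would simply be one of the values in `levels`.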
If I define lycanthropy as “error in the determination of energy”, it's positive given the macrostate for the system in a heat bath and zero given the microstate of the system. Of course, given the microstate one can know the energy but can also pretend that the energy is still indeterminate.
While the distribution of energies - and the whole thermodynamical model - may still be useful, its meaning would change. It would no longer be the most complete description of the system that encodes what can be predicted about it. Of course if that was never the meaning for you, you'll see no loss. But I thought we were talking about physics, not mathematics. The meaning of thermodynamics is a valid point of discussion. The interpretation of the second law remains controversial.
I think this discussion has run its course - I may no longer reply even if you do. Thanks again, it was interesting.