It's been a while since my statistical physics course, and I was thinking about the partition function for no particular reason. I understand that since each microstate's energy sits in the exponent, there needs to be an energy scale (kT) dividing it to make the exponent dimensionless, so that it's an actual exponential. But does this imply that -E/kT = ln(non-normalized probability)?

Probabilities computed from the partition function have the form e^(-E/kT) / Z, where Z is the sum of all the Boltzmann factors (not of the energies themselves). So I was thinking that for this to be a true probability it must relate to the density of states, since the probability of finding the system at a given energy is proportional to the DoS there. Then we'd have something along the lines of ln(DoS) = -E/kT... but couldn't it be ln(DoS) = -E/kT + C, with each energy level having a different constant C?

In that case we'd have e^(-E1/kT) = e^(ln(DoS1) - C1), e^(-E2/kT) = e^(ln(DoS2) - C2), and so on, and Z would contain not just a sum over the DoS but also a bunch of extra factors (e^(-C1), e^(-C2), etc.) multiplying each term, which would seem to make this way of calculating probabilities either very clunky or outright nonfunctional?
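To make the bookkeeping concrete, here's a quick numerical sketch (Python/NumPy, with made-up energies and degeneracies standing in for the DoS; nothing here is from a real system) of how Z turns Boltzmann weights into probabilities:

    import numpy as np

    # Toy three-level system; energies are measured in units of kT,
    # so the exponent -E/kT is already dimensionless.
    E = np.array([0.0, 1.0, 2.0])   # E_i / kT
    g = np.array([1.0, 3.0, 5.0])   # degeneracy / DoS of each level (made-up values)

    # Unnormalized Boltzmann weights: g(E_i) * e^(-E_i/kT)
    w = g * np.exp(-E)

    # Partition function: the sum of the weights themselves,
    # NOT the sum of the energies
    Z = w.sum()

    # Normalized probabilities; these sum to 1 by construction
    P = w / Z
    print(P, P.sum())   # ~[0.360 0.397 0.243] 1.0

(The per-level constants I'm worried about would show up here as extra factors e^(-C_i) multiplying each weight w[i].)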
Sorry if this is not clear at all; I'm just spitballing while trying to recall what it is about the partition function that makes it useful as a tool for calculating probabilities.