PlanetPhysics/Equation of State for a Monatomic Gas
FOURTH LECTURE.
THE EQUATION OF STATE FOR A MONATOMIC GAS.
From the Eight Lectures on Theoretical Physics delivered at Columbia University in 1909 by Max Planck.
My problem today is to utilize the general fundamental laws concerning the concept of irreversibility, which we established in the lecture of yesterday, in the solution of a definite problem: the calculation of the entropy of an ideal monatomic gas in a given state, and the derivation of all its thermodynamic properties. The way in which we have to proceed is prescribed for us by the general definition of entropy:
$$ S = k \log W \tag{13} $$
The chief part of our problem is the calculation of W for a given state of the gas, and in this connection there is first required a more precise investigation of what is to be understood as the state of the gas. Obviously, the state is to be taken here solely in the sense of the conception which we have called macroscopic in the last lecture. Otherwise, a state would possess neither probability nor entropy. Furthermore, we are not allowed to assume a condition of equilibrium for the gas. For this is characterized through the further special condition that the entropy for it is a maximum. Thus, an unequal distribution of density may exist in the gas; also, there may be present an arbitrary number of different currents, and in general no kind of equality between the various velocities of the molecules is to be assumed. The velocities, as well as the coordinates, of the molecules are rather to be taken a priori as quite arbitrarily given, but in order that the state, considered in a macroscopic sense, may be assumed as known, certain mean values of the densities and the velocities must exist. Through these mean values the state is completely characterized from a macroscopic standpoint.
The conditions mentioned will all be fulfilled if we consider the state as given in such manner that the number of molecules in a sufficiently small macroscopic space element, one which, however, contains a very large number of molecules, is given, and furthermore, that the (likewise great) number of these molecules is given which are found in a certain macroscopically small velocity domain, i.e., whose velocities lie within certain small intervals. If we call the coordinates x, y, z, and the velocity components ẋ, ẏ, ż, then this number will be proportional to

$$ dx \, dy \, dz \, d\dot{x} \, d\dot{y} \, d\dot{z} = \sigma $$ [1]
It will depend, besides, upon a finite factor of proportionality f, which may be an arbitrarily given function of the coordinates and the velocities, and which has only the one condition to fulfill that
$$ \sum f \cdot \sigma = N \tag{14} $$
where N denotes the total number of molecules in the gas. We are now concerned with the calculation of the probability W of that state of the gas which corresponds to the arbitrarily given distribution function f.
The probability that a given molecule possesses such coordinates and such velocities that it lies within the domain σ is expressed, in accordance with the final result of the previous lecture, by the magnitude of the corresponding elementary domain:

$$ dx \, dy \, dz \, d\psi_1 \, d\psi_2 \, d\psi_3 $$

therefore, since here

$$ \psi_1 = m\dot{x}, \quad \psi_2 = m\dot{y}, \quad \psi_3 = m\dot{z} $$

(m the mass of a molecule) by

$$ m^3 \cdot \sigma $$
Now we divide the whole of the six dimensional "state domain" containing all the molecules into suitable equal elementary domains of the magnitude m³ · σ. Then the probability that a given molecule fall in a given elementary domain is equally great for all such domains. Let P denote the number of these equal elementary domains. Next, let us imagine as many dice as there are molecules present, i.e., N, and each die to be provided with P equal sides. Upon these sides we imagine written the numbers 1, 2, 3, … P, so that each of the P sides indicates a given elementary domain. Then each throw with the N dice corresponds to a given state of the gas, while the number of dice which show a given number corresponds to the number of molecules which lie in the elementary domain considered. In accordance with this, each single die can indicate with the same probability each of the numbers from 1 to P, corresponding to the circumstance that each molecule may fall with equal probability in any one of the P elementary domains. The probability sought, W, of the given state of the molecules corresponds, therefore, to the number of different kinds of throws (complexions) through which the given distribution f is realized. Let us take, e.g., N = 10 molecules (dice) and P = 6 elementary domains (sides), and let us imagine the state so given that there are
3 molecules in the 1st elementary domain,
4 molecules in the 2d elementary domain,
0 molecules in the 3d elementary domain,
1 molecule in the 4th elementary domain,
0 molecules in the 5th elementary domain,
2 molecules in the 6th elementary domain,
then this state, e.g., may be realized through a throw for which the 10 dice indicate the following numbers:
Die:          | 1st | 2d | 3d | 4th | 5th | 6th | 7th | 8th | 9th | 10th |
Number shown: | 2   | 6  | 2  | 1   | 1   | 2   | 6   | 2   | 1   | 4    |   (15)
Under each of the characters representing the ten dice stands the number which the die indicates in the throw. In fact,
3 dice show the figure 1,
4 dice show the figure 2,
0 dice show the figure 3,
1 die shows the figure 4,
0 dice show the figure 5,
2 dice show the figure 6.
The state in question may likewise be realized through many other complexions of this kind. The number sought of all possible complexions is now found through consideration of the number series indicated in (15). For, since the number of molecules (dice) is given, the number series contains a fixed number of elements (10 = N). Furthermore, since the number of molecules falling in an elementary domain is given, each number, in all permissible complexions, appears equally often in the series. Finally, each change of the number configuration conditions a new complexion. The number of possible complexions, or the probability W of the given state, is therefore equal to the number of possible permutations with repetition under the conditions mentioned. In the simple example chosen, in accordance with a well known formula, the probability is

$$ \frac{10!}{3! \, 4! \, 0! \, 1! \, 0! \, 2!} = 12{,}600 $$
Therefore, in the general case:

$$ W = \frac{N!}{\prod (f \cdot \sigma)!} $$

The sign Π denotes the product extended over all of the elementary domains.
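As a small numerical illustration (an editorial addition, not part of Planck's lecture), the count of complexions can be checked directly: the sketch below simply evaluates N! divided by the product of the factorials of the occupation numbers, and reproduces the value 12,600 for the example above.

```python
from math import factorial

def complexions(occupation):
    """Number of complexions (distinguishable throws of the dice) realizing a
    given distribution: N! / product of (f*sigma)! over all elementary domains,
    where N is the total number of molecules."""
    w = factorial(sum(occupation))
    for n in occupation:
        w //= factorial(n)
    return w

# Planck's example: N = 10 molecules (dice), P = 6 elementary domains (sides),
# occupied by 3, 4, 0, 1, 0, 2 molecules respectively.
print(complexions([3, 4, 0, 1, 0, 2]))  # 12600
```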
From this there results, in accordance with equation (13), for the entropy of the gas in the given state:

$$ S = k \log N! - k \sum \log (f \cdot \sigma)! $$

The summation is to be extended over all domains σ. Since f · σ is a large quantity, Stirling's formula may be employed for its factorial, which for a large number n is expressed by:
$$ n! = \left( \frac{n}{e} \right)^{n} \sqrt{2 \pi n} \tag{16} $$
therefore, neglecting unimportant terms:

$$ \log n! = n \, (\log n - 1) $$

and hence:
$$ S = k \log N! - k \sum f \cdot \log f \cdot \sigma \tag{17} $$
The quantity Σ f · log f · σ is, up to the universal factor −k, the same as that which L. Boltzmann denoted by H, and which he showed to vary in one direction only for all changes of state.
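To indicate how little is lost in passing from (16) to the truncated form log n! ≈ n(log n − 1) for the very large occupation numbers f · σ, here is a short numerical comparison (an editorial sketch; the sample values of n are arbitrary).

```python
import math

for n in (10, 100, 10_000, 1_000_000):
    exact = math.lgamma(n + 1)                                         # log(n!)
    stirling = n * math.log(n) - n + 0.5 * math.log(2 * math.pi * n)   # eq. (16)
    truncated = n * (math.log(n) - 1)                                  # n(log n - 1)
    print(f"n={n:>9}  exact={exact:.4f}  Stirling={stirling:.4f}  truncated={truncated:.4f}")
```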
In particular, we will now determine the entropy of a gas in a state of equilibrium, and inquire first as to that form of the law of distribution f which corresponds to thermodynamic equilibrium. In accordance with the second law of thermodynamics, a state of equilibrium is characterized by the condition that, with given values of the total volume V and the total energy E, the entropy assumes its maximum value. If we assume the total volume of the gas, V,
and the total energy
$$ E = \frac{m}{2} \sum (\dot{x}^2 + \dot{y}^2 + \dot{z}^2) \, f \sigma \tag{18} $$
as given, then the condition:

$$ \delta S = 0 $$
must hold for the state of equilibrium, or, in accordance with (17):
$$ \sum (\log f + 1) \cdot \delta f \cdot \sigma = 0 \tag{19} $$
wherein the variation δf refers to an arbitrary change in the law of distribution, compatible with the given values of N, V and E.
Now we have, on account of the constancy of the total number of molecules N, in accordance with (14):

$$ \sum \delta f \cdot \sigma = 0 $$
and, on account of the constancy of the total energy, in accordance with (18):

$$ \sum (\dot{x}^2 + \dot{y}^2 + \dot{z}^2) \, \delta f \cdot \sigma = 0 $$
Consequently, for the fulfillment of condition (19) for all permissible values of δf, it is sufficient and necessary that

$$ \log f + \beta (\dot{x}^2 + \dot{y}^2 + \dot{z}^2) = \mathrm{const.} $$
or:

$$ f = \alpha \, e^{-\beta (\dot{x}^2 + \dot{y}^2 + \dot{z}^2)} $$

wherein α and β are constants. In the state of equilibrium, therefore, the space distribution of molecules is uniform, i.e., independent of x, y, z, and the distribution of velocities is the well known Maxwellian distribution.
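The passage from (19) and the two subsidiary conditions to this Maxwellian form can be made explicit with undetermined multipliers. The following worked step is an editorial restatement, not Planck's own wording; the multipliers λ and β are introduced here purely for illustration.

```latex
% Multiply the condition \sum \delta f \,\sigma = 0 by an undetermined constant \lambda,
% the condition \sum (\dot{x}^2+\dot{y}^2+\dot{z}^2)\,\delta f\,\sigma = 0 by \beta,
% and add both to (19):
\[
\sum \bigl[\log f + 1 + \lambda + \beta(\dot{x}^2+\dot{y}^2+\dot{z}^2)\bigr]\,\delta f \,\sigma = 0 .
\]
% Since \delta f may now be varied freely, the bracket must vanish for every domain, whence
\[
\log f = -(1+\lambda) - \beta(\dot{x}^2+\dot{y}^2+\dot{z}^2),
\qquad
f = \alpha\, e^{-\beta(\dot{x}^2+\dot{y}^2+\dot{z}^2)},
\quad \alpha = e^{-(1+\lambda)} .
\]
```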
The values of the constants α and β are to be found from those of N, V, and E. For the substitution of the value found for f in (14) leads to:

$$ N = \alpha V \left( \frac{\pi}{\beta} \right)^{3/2} $$
and the substitution of f in (18) leads to:

$$ E = \frac{3m}{4\beta} \, \alpha V \left( \frac{\pi}{\beta} \right)^{3/2} = \frac{3mN}{4\beta} $$
From these equations it follows that:

$$ \beta = \frac{3mN}{4E}, \qquad \alpha = \frac{N}{V} \left( \frac{3mN}{4\pi E} \right)^{3/2} $$
and hence finally, in accordance with (17), the expression for the
entropy S of the gas in a state of equilibrium with given values for N, V and E is:
$$ S = \mathrm{const.} + k N \left( \tfrac{3}{2} \log E + \log V \right) \tag{20} $$
The additive constant contains terms in N and m, but not in E and V.
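For completeness, the two standard Gaussian integrals that underlie the evaluations of (14) and (18) with the Maxwellian f are the following; this is an editorial worked step, not spelled out in the lecture.

```latex
\[
\int_{-\infty}^{\infty} e^{-\beta u^{2}}\,du = \sqrt{\frac{\pi}{\beta}},
\qquad
\int_{-\infty}^{\infty} u^{2}\,e^{-\beta u^{2}}\,du = \frac{1}{2\beta}\sqrt{\frac{\pi}{\beta}} ,
\]
% so that summing f\,\sigma over the volume V and over all velocities gives
\[
N = \alpha V \Bigl(\frac{\pi}{\beta}\Bigr)^{3/2},
\qquad
E = \frac{m}{2}\,\alpha V \cdot \frac{3}{2\beta}\Bigl(\frac{\pi}{\beta}\Bigr)^{3/2}
  = \frac{3mN}{4\beta} .
\]
```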
The determination of the entropy here carried out permits now the direct specification of the complete thermodynamic behavior of the gas, viz., the equation of state and the values of the specific heats. From the general thermodynamic definition of entropy:

$$ dS = \frac{dE + p \, dV}{T} $$

the partial differential quotients of S with regard to E and V are obtained, respectively:

$$ \left( \frac{\partial S}{\partial E} \right)_V = \frac{1}{T}, \qquad \left( \frac{\partial S}{\partial V} \right)_E = \frac{p}{T} $$
Consequently, with the aid of (20):
$$ \left( \frac{\partial S}{\partial E} \right)_V = \frac{3}{2} \frac{kN}{E} = \frac{1}{T} \tag{21} $$
and
$$ \left( \frac{\partial S}{\partial V} \right)_E = \frac{kN}{V} = \frac{p}{T} \tag{22} $$
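As a quick symbolic cross-check of (21) and (22) against (20) (an editorial addition, not part of the lecture; `const` stands for the additive constant, which is independent of E and V):

```python
import sympy as sp

E, V, N, k, const = sp.symbols('E V N k const', positive=True)

# Entropy of the monatomic ideal gas in equilibrium, eq. (20)
S = const + k * N * (sp.Rational(3, 2) * sp.log(E) + sp.log(V))

print(sp.diff(S, E))  # 3*N*k/(2*E)  -> equals 1/T, eq. (21)
print(sp.diff(S, V))  # N*k/V        -> equals p/T, eq. (22)
```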
The second of these equations:

$$ p = \frac{kNT}{V} $$
contains the laws of Boyle, Gay Lussac and Avogadro, the latter because the pressure depends only upon the number N, and not upon the constitution of the molecules. Writing it in the ordinary form:

$$ p = \frac{nRT}{V} $$
where n denotes the number of gram molecules or mols of the gas, referred to O = 16, and R the absolute gas constant:

$$ R = 8.31 \times 10^{7} \ \mathrm{erg/deg} $$
we obtain by comparison:
$$ k = \frac{R n}{N} \tag{23} $$
If we denote the ratio of the mol number to the molecular number by ω, or, what is the same thing, the ratio of the molecular mass to the mol mass:

$$ \frac{n}{N} = \omega $$
and hence:
$$ k = \omega R \tag{24} $$
From this, if ω is given, we can calculate the universal constant k, and conversely.
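Relation (24) can be made concrete with modern constants (an editorial illustration; the numerical values below are present-day ones, not Planck's 1909 figures): taking the gas constant R and a molecular number of about 6.02 × 10^23 per mol, the ratio ω = n/N yields the familiar value of k.

```python
R = 8.314e7            # gas constant, erg per (mol * deg); modern value, assumed here
N_per_mol = 6.022e23   # molecules per mol; modern Avogadro number, assumed here

omega = 1.0 / N_per_mol   # ratio of mol number to molecular number, n/N
k = omega * R             # eq. (24)
print(k)                  # about 1.38e-16 erg/deg
```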
The equation (21) gives:
$$ E = \frac{3}{2} k N T \tag{25} $$
Now since the energy of an ideal gas is given by:

$$ E = A \, n \, c_v \, T $$

wherein c_v denotes in calories the heat capacity at constant volume of a mol, A the mechanical equivalent of heat:

$$ A = 4.19 \times 10^{7} \ \mathrm{erg/cal} $$
it follows that:

$$ c_v = \frac{3 k N}{2 A n} $$
and, having regard to (23), we obtain:
$$ c_v = \frac{3}{2} \frac{R}{A} = 3.0 \tag{26} $$
the mol heat in calories of any monatomic gas at constant volume.
For the mol heat at constant pressure, c_p, we have from the first law of thermodynamics:

$$ c_p - c_v = \frac{R}{A} $$
and, therefore, having regard to (26):

$$ c_p = \frac{5}{2} \frac{R}{A} = 5.0, \qquad \frac{c_p}{c_v} = \frac{5}{3} $$
a known result for monatomic gases.
The mean kinetic energy L of a molecule is obtained from (25):
$$ L = \frac{E}{N} = \frac{3}{2} k T \tag{27} $$
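Finally, a numerical restatement of (25)–(27) with the constants used in the text, R ≈ 8.31 × 10^7 erg/deg and A ≈ 4.19 × 10^7 erg/cal (an editorial sketch; the temperature of 300 deg below is merely an illustrative choice):

```python
R = 8.31e7     # gas constant, erg per (mol * deg)
A = 4.19e7     # mechanical equivalent of heat, erg per cal
k = 1.38e-16   # erg per deg, from eq. (24)

c_v = 1.5 * R / A        # eq. (26): about 3.0 cal per (mol * deg)
c_p = c_v + R / A        # first law: c_p - c_v = R/A, about 5.0
ratio = c_p / c_v        # about 5/3 for a monatomic gas

T = 300.0                # deg, illustrative temperature
L = 1.5 * k * T          # eq. (27): mean kinetic energy of one molecule, erg

print(c_v, c_p, ratio, L)
```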
You notice that we have derived all these relations through the identification of the mechanical with the thermodynamic expression for the entropy, and from this you recognize the fruitfulness of the method here proposed.
But a method can demonstrate its full usefulness only when we utilize it, not merely to derive laws which are already known, but when we apply it in domains for whose investigation there at present exist no other methods. In this connection its application affords various possibilities. Take the case of a monatomic gas which is not sufficiently attenuated to have the properties of the ideal state; there are here, as pointed out by J. D. van der Waals, two things to consider: (1) the finite size of the atoms, (2) the forces which act among the atoms. Taking account of these involves a change in the value of the probability W and in the energy of the gas as well, and, so far as can now be shown, the corresponding change in the conditions for thermodynamic equilibrium leads to an equation of state which agrees with that of van der Waals. Certainly there is here a rich field for further investigations, which will prove the more promising as experimental tests of the equation of state become available in larger number.
Another important application of the theory has to do with heat radiation, with which we shall be occupied the coming week. We shall proceed then in a similar way as here, and shall be able from the expression for the entropy of radiation to derive the thermodynamic properties of radiant heat.
Today we will refer briefly to the treatment of polyatomic gases. I have previously, upon good grounds, limited the treatment to monatomic molecules; for up to the present real difficulties appear to stand in the way of a generalization, from the principles employed by us, to include polyatomic molecules; in fact, if we wish to be quite frank, we must say that a satisfactory mechanical theory of polyatomic gases has not yet been found. Consequently, at present we do not know to what place in the system of theoretical physics to assign the processes within a molecule - the intra-molecular processes. We are obviously confronted by puzzling problems. A noteworthy and much discussed beginning was, it is true, made by Boltzmann, who introduced the most plausible assumption that for intra-molecular processes simple laws of the same kind hold as for the motion of the molecules themselves, i.e., the general equations of dynamics. It is easy then, in fact, to proceed to the proof that for a polyatomic gas the molecular heat c_v must be greater than 3 and that consequently, since the difference c_p − c_v is always equal to 2, approximately, the ratio c_p/c_v is smaller than 5/3.
This conclusion is completely confirmed by experience. But this in itself does not confirm the assumption of Boltzmann; for, indeed, the same conclusion is reached very simply from the assumption that there exists intra-molecular energy which increases with the temperature. For then the molecular heat of a polyatomic gas must be greater by a corresponding amount than that of a monatomic gas.
Nevertheless, up to this point the Boltzmann theory never leads to contradiction with experience. But so soon as one seeks to draw special conclusions concerning the magnitude of the specific heats, serious difficulties arise; I will refer to only one of them. If one assumes the Hamiltonian equations of mechanics as applicable to intra-molecular motions, he arrives of necessity at the law of "uniform distribution of energy," which asserts that under certain conditions, not essential to consider here, in a thermodynamic state of equilibrium the total energy of the gas is distributed uniformly among all the individual energy phases corresponding to the independent variables of state, or, as one may briefly say: the same amount of energy is associated with every independent variable of state. Accordingly, the mean energy of motion of the molecules corresponding to a given direction in space is the same as for any other direction, and, moreover, the same for all the different kinds of molecules and ions; also for all suspended particles (dust) in the gas, of whatever size; and, furthermore, the same for all kinds of motions of the constituents of a molecule relative to its centroid. If one now reflects that a molecule commonly contains, so far as we know, quite a large number of different freely moving constituents, and certainly that a normal molecule of a monatomic gas, e.g., mercury, possesses numerous freely moving electrons, then, in accordance with the law of uniform energy distribution, the intra-molecular energy must constitute a much larger fraction of the whole specific heat of the gas, and therefore the ratio c_p : c_v must turn out much smaller, than is consistent with the measured values. Thus, e.g., for an atom of mercury, in accordance with the measured value of c_p : c_v, no part whatever of the heat added may be assigned to the intra-molecular energy. Boltzmann and others, in order to eliminate this contradiction, have fixed upon the possibility that, within the time of observation of the specific heats, the vibrations of the constituents of a molecule with respect to one another do not change appreciably, and come so slowly into heat equilibrium with the progressive motion of the molecules that this process is no longer capable of detection through observation. Up to now no such delay in the establishment of a state of equilibrium has been observed. Perhaps it would be productive of results if in delicate measurements special attention were paid to the question as to whether observations which take a longer time lead to a greater value of the mol-heat, or, what comes to the same thing, a smaller value of c_p : c_v, than observations lasting a shorter time.
If one has been made mistrustful through these considerations concerning the applicability of the law of uniform energy distribution to intra-molecular processes, the mistrust is accentuated upon the inclusion of the laws of heat radiation. I shall make mention of this in a later lecture.
When we pass from stable atoms to the unstable atoms of radioactive substances, the principles following from the kinetic gas theory lose their validity completely. For the striking failure of all attempts to find any influence of temperature upon radioactive phenomena shows us that an application here of the law of uniform energy distribution is certainly not warranted. It will, therefore, be safest meanwhile to offer no definite conjectures with regard to the nature and the laws of these noteworthy phenomena, and to leave this field for further development to experimental research alone, which, I may say, with every day throws new light upon the subject.
Footnotes
[1] We can call σ a "macro-differential" in contradistinction to the micro-differentials which are infinitely small with reference to the dimensions of a molecule. I prefer this terminology for the discrimination between "physical" and "mathematical" differentials in spite of the inelegance of the phrasing, because the macro-differential is also just as much mathematical as physical and the micro-differential just as much physical as mathematical.
References
This article is a derivative from the public domain work in [2].
[2] Planck, M. Eight Lectures on Theoretical Physics, delivered at Columbia University in 1909, translated by A. P. Wills. Columbia University Press, New York, 1915.