Entropy

Entropy as a Measure of the Multiplicity of a System

The probability of finding a system in a given state depends upon the multiplicity of that state. That is to say, it is proportional to the number of ways you can produce that state. Here a "state" is defined by some measurable property which would allow you to distinguish it from other states. In throwing a pair of dice, that measurable property is the sum of the number of dots facing up. The multiplicity for two dots showing is just one, because there is only one arrangement of the dice which will give that state. The multiplicity for seven dots showing is six, because there are six arrangements of the dice which will show a total of seven dots.

One way to define the quantity "entropy" is to do it in terms of the multiplicity: S = k ln Ω. This is Boltzmann's expression for entropy, and in fact S = k ln Ω is carved onto his tombstone! (Actually, S = k ln W is there, but Ω is typically used in current texts; see Wikipedia.) The k is included as part of the historical definition of entropy and gives the units joule/kelvin in the SI system of units. The logarithm is used to make the defined entropy of reasonable size. It also gives the right kind of behavior for combining two systems: the entropy of the combined systems will be the sum of their entropies, but the multiplicity will be the product of their multiplicities. The fact that the logarithm of the product of two multiplicities is the sum of their individual logarithms gives the proper kind of combination of entropies. The multiplicity for ordinary collections of matter is inconveniently large, on the order of Avogadro's number, so using the logarithm of the multiplicity as entropy is convenient.

For a system of a large number of particles, like a mole of atoms, the most probable state will be overwhelmingly probable. You can with confidence expect that the system at equilibrium will be found in the state of highest multiplicity, since fluctuations from that state will usually be too small to measure. As a large system approaches equilibrium, its multiplicity (entropy) tends to increase. This is a way of stating the second law of thermodynamics.

The relationship which was originally used to define entropy S is dS = dQ/T. This is often a sufficient definition of entropy if you don't need to know about the microscopic details. It can be integrated to calculate the change in entropy during a part of an engine cycle. For the case of an isothermal process it can be evaluated simply as ΔS = Q/T. In this context, the change in entropy can be described as the heat added per unit temperature, with units of joules/kelvin (J/K) or eV/K.

The overall tendency of a process to occur can be expressed as the resultant of two tendencies, namely:
(i) the tendency to acquire a state of minimum energy, and
(ii) the tendency to acquire a state of maximum randomness or disorder.
The overall tendency of a process to take place by itself is called the driving force. Note that:
(i) the two tendencies act independently of each other,
(ii) the two tendencies may work in the same direction or in opposite directions in a process, and
(iii) the driving force is the resultant of the magnitudes of the two tendencies.
When the two tendencies act in opposite directions, the tendency with the greater magnitude determines whether the process is feasible or not. For example:
(a) The evaporation of water. Evaporation is endothermic, so the energy factor opposes the process. Since the process is known to take place, the randomness factor must be greater than the energy factor.
(b) The reaction between hydrogen and oxygen to form water. It is an exothermic reaction, so the energy factor favours the process, but the randomness factor opposes the reaction. Since the reaction does take place, the energy factor must be greater than the randomness factor.

Physical significance: Entropy has been regarded as a measure of the disorder or randomness of a system. Thus, when a system goes from a more orderly to a less orderly state, there is an increase in its randomness and hence the entropy of the system increases. Conversely, if the change is one in which there is an increase in orderliness, there is a decrease in entropy. For example, when a solid changes to a liquid, an increase in entropy takes place, because with the breaking of the orderly arrangement of the molecules in the crystal to the less orderly liquid state, the randomness increases. The process of vaporisation likewise produces an increase in randomness in the distribution of molecules, and hence an increase in entropy. When two gases are mixed, the molecules of the gases intermix to achieve more randomness.
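The dice-multiplicity counting, Boltzmann's S = k ln Ω, and the isothermal ΔS = Q/T formula above can all be checked numerically. The following is a minimal Python sketch; the function names (multiplicity, boltzmann_entropy) and the 1000 J / 300 K numbers are illustrative choices, not from the original text.

```python
from itertools import product
from math import log, isclose

# Boltzmann constant in J/K (SI defined value)
K_B = 1.380649e-23

def multiplicity(total, dice=2, faces=6):
    """Count the arrangements of the dice whose dots sum to `total`."""
    return sum(1 for rolls in product(range(1, faces + 1), repeat=dice)
               if sum(rolls) == total)

def boltzmann_entropy(omega):
    """Boltzmann's expression: S = k ln(Omega), in J/K."""
    return K_B * log(omega)

# Dice examples from the text: one way to show two dots, six ways to show seven.
assert multiplicity(2) == 1
assert multiplicity(7) == 6

# Entropies add while multiplicities multiply, because
# ln(Omega1 * Omega2) = ln(Omega1) + ln(Omega2).
assert isclose(boltzmann_entropy(6 * 4),
               boltzmann_entropy(6) + boltzmann_entropy(4))

# Clausius form for an isothermal process: Delta S = Q / T.
# Illustrative numbers: 1000 J of heat added reversibly at 300 K.
delta_s = 1000.0 / 300.0   # J/K
print(round(delta_s, 3))   # 3.333
```

The additivity check illustrates why the logarithm appears in the definition: combining two independent systems multiplies their multiplicities but must add their entropies.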