The Definition and Application of Entropy

Abstract: As entropy is used more and more widely in various fields, the physical concept of entropy has become increasingly important. The topic of this paper is the definition and application of entropy. Through literature analysis and comparison with the research of scholars in related fields, this paper summarizes and sorts out the concept of entropy in the history of physics and its multi-dimensional applications. The significance of this paper is to popularize the concept of entropy and its applications for the general public. The study concludes that entropy is an extremely important concept for many fields, including the economic, biological, and cultural fields.


INTRODUCTION
In the industrial age, after the first kind of perpetual motion machine was shown to be impossible, people kept searching for new energy sources. Scientists of that time asked: why not cool seawater by one degree and let the released heat do work? Mankind would then have an inexhaustible source of energy, until the second law of thermodynamics proved that this too is impossible. Entropy was derived from this line of thought. After more than a hundred years of development, entropy, an important concept of thermodynamics, has been extended to many fields of daily life, such as biology, information technology, and even science fiction movies. Generalized entropy has gradually become a significant concept in various fields. The purpose of this paper is to popularize the concept of entropy in the second law of thermodynamics and its applications in other fields. As a thermodynamic state function, entropy even bears on the future of the universe. Thinking about entropy is closely related to energy issues, because entropy concerns the efficient utilization of energy. The significance of this paper is to popularize the concept of entropy and its applications for the general public.

Origin of Entropy
After Carnot proposed Carnot's theorem in 1824, Clausius and Kelvin discovered a limitation of the theorem: the directionality of energy transfer. They found that Carnot's theorem must rest on a further law, and thus proposed the second law of thermodynamics and with it a new concept, entropy. They observed that work can be completely converted into heat without any loss, but heat cannot spontaneously gather itself to do work; instead, it dissipates from hot to cold. To express the new law, Clausius and Kelvin formulated it differently. Clausius's statement is: "It is impossible to construct a device which operates on a cycle and produces no other effect than the transfer of heat from a cooler body to a hotter body." [2] Kelvin's statement is: "It is impossible to construct a device which operates on a cycle and produces no other effect than the transfer of heat from a single body in order to produce work." Both formulations are essentially equivalent and imply that the entropy of a closed system never decreases.

Two Ways to Describe Entropy
The role of mathematics is to quantify physics. In 1865, to explain the physical concept of entropy, Clausius defined entropy through its increment. Using heat and temperature, Clausius defined the new thermodynamic parameter as

dS = dQ/T, (1)

where dS and dQ are the tiny increments of entropy and heat, respectively. Temperature is a macroscopic manifestation of the random motion of a large number of molecules; because Clausius uses temperature in the definition, he defined the thermodynamic parameter entropy macroscopically. This formula is also the expression of entropy in classical thermodynamics. In statistical physics there is another expression, which includes the Boltzmann constant and is known as Boltzmann entropy:

S = k ln ω, (2)

where ω represents the number of microscopic states corresponding to a substance in a certain macroscopic state, and k is the Boltzmann constant.

Figure 1. Gas diffusion at the microscopic level leads to a spontaneous increase in entropy [14].

Boltzmann entropy is also known as entropy at the microscopic level. Since entropy is related to the number of microstates, the degree of chaos of a system can be described by the Boltzmann entropy formula [7]. Therefore, the concept of entropy can serve as a variable used by different professions to describe the degree of disorder. Whether it is Clausius entropy or Boltzmann entropy, entropy always increases in their theories. In Clausius's expression, heat is transferred from high temperature to low temperature and the entropy of the system increases; in Boltzmann entropy, matter always changes from macroscopic states with few microscopic configurations to states with many, so the entropy also increases [7].
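As a quick numerical illustration of the Clausius definition (a minimal sketch; the reservoir temperatures and the amount of heat are arbitrary illustrative values):

```python
# Entropy change when heat Q flows from a hot reservoir to a cold one,
# illustrating the Clausius definition dS = dQ/T. The reservoirs are
# assumed large enough that their temperatures stay constant.
Q = 1000.0      # heat transferred, in joules (illustrative value)
T_hot = 400.0   # hot reservoir temperature, in kelvin
T_cold = 300.0  # cold reservoir temperature, in kelvin

dS_hot = -Q / T_hot    # the hot reservoir loses heat: entropy decreases
dS_cold = Q / T_cold   # the cold reservoir gains heat: entropy increases
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.3f} J/K")
print(f"dS_cold  = {dS_cold:+.3f} J/K")
print(f"dS_total = {dS_total:+.3f} J/K")  # positive: total entropy grows
```

The total change is positive whenever heat flows from hot to cold, which is exactly the directionality that Clausius and Kelvin built into the second law.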

Entropy and Philosophy
In a closed system, entropy always increases. If the universe is viewed as a completely isolated system, then the entropy of the entire universe will also gradually increase, until all the energy available in the universe has been converted into heat and no energy remains to sustain motion or life. This doctrine is called the heat death theory. Many scientists describe entropy as the arrow of time, and when this arrow lands, human civilization ceases to exist. Since everything is destined to perish, human existence would be meaningless. The idea of heat death is very similar to nihilism in philosophy, and such fatalism can make people nihilistic. One of the important questions in philosophy is why people exist, and the concept of entropy increase directly negates the meaning of human existence.

Physicists Question Entropy
The second law of thermodynamics was questioned soon after it was proposed. Even Boltzmann, who proposed the microscopic expression of entropy, raised his own challenge to it. Although the Boltzmann entropy itself embodies a skeptical attitude toward the second law, Boltzmann was a loyal believer in atomism, and he questioned entropy from a statistical perspective. In the formula

S = k ln ω, (3)

the system spontaneously changes from macroscopic states with few microstates to states with many microstates, and ω is called the thermodynamic probability. But even small-probability events can occur, and such a fluctuation causes a reduction in entropy. It is possible, though extremely unlikely, that all air molecules concentrate in one place and do work spontaneously, because according to the molecular kinetic theory gas molecules are in random motion. This statement was later affirmed; Boltzmann entropy is in fact a statistical law, and for a large sample such an event effectively never happens. Another questioner was Maxwell, who assumed that a demon automatically sorts the gas molecules in a closed container: without doing work on the molecules, it puts the faster molecules on one side and the slower molecules on the other. This keeps the container a closed system and yet reduces the entropy inside it. The assumption directly undermines the second law of thermodynamics, but the flaw in Maxwell's demon lies in information. Information is also a kind of entropy: the demon needs to know the molecules' speeds in order to sort them, and at that time the concept of information entropy had not yet been proposed.
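The scale of such fluctuations can be made concrete with a short calculation (a minimal sketch; the particle counts are illustrative). If each molecule is found in the left half of a container with probability 1/2, the chance that all N molecules are there at once is (1/2)^N:

```python
from math import log10

# Probability that all N gas molecules spontaneously occupy one half of a
# container: P = (1/2)**N. Computed in log10 to avoid numerical underflow.
for N in (10, 100, 6.022e23):   # the last value is roughly one mole
    log10_P = -N * log10(2)
    print(f"N = {N:.3g}: P = 10**({log10_P:.4g})")
```

For ten molecules the event is merely rare; for a mole of gas the exponent is of order minus 10^23, which is why entropy-reducing fluctuations are never observed at macroscopic scale.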

Information Entropy
The founders of information theory believed that information is the reduction of uncertainty. Uncertainty is similar to chaos, and entropy is the physical concept that describes the degree of chaos. Therefore, Shannon described information by introducing the physical concept of entropy. Shannon also defined an informatics concept, the information quantity I, which is the amount of information contained in one possibility.
For a possibility that occurs with probability p, the information quantity is

I = -log₂ p. (4)

Shannon then defined information entropy in terms of the information quantity: the information entropy of a message is the amount of information contained in each possibility weighted by the probability of that possibility,

H = Σᵢ pᵢIᵢ = -Σᵢ pᵢ log₂ pᵢ, (5)

where H is the entropy of the information, and pᵢ and Iᵢ are as defined above. The introduction of information entropy is not just a theoretical description like entropy in physics. The unit of information entropy is the bit, the basic unit of data, and it sets a lower limit on the data needed to transmit a specific piece of information: if a piece of information is transmitted with the least possible data, the number of bits used is the information entropy of that information. This limit directly avoids the use of redundant data. Information entropy has made a great contribution to the information age and can be called the foundation of information theory; for introducing it, Shannon is known as the father of information theory.
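A minimal sketch of this definition (the example distributions are illustrative):

```python
from math import log2

def shannon_entropy(probs):
    """Information entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.9, 0.1]))    # about 0.469
# A uniform choice among 8 symbols needs log2(8) = 3 bits.
print(shannon_entropy([1/8] * 8))     # 3.0
```

The biased coin illustrates the compression bound described above: its tosses can, on average, be encoded in fewer than one bit each.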

Economic Entropy
In economics, entropy can be used to represent the number of firms and the frequency of their activities: the higher the entropy, the more investment firms make, the more intense the competition, and the more developed the economy as a whole [11]. In thermodynamics, the entropy of a closed system always increases, and in economics this law still holds if effects such as wars and natural disasters are treated as work done on the system. In the absence of such external effects, the economy can be regarded as a closed system; because of technological innovation, the economy is always developing and economic entropy is always increasing. The fluctuations in Boltzmann entropy show that entropy reduction is possible, and in economic entropy this becomes even more pronounced, with fluctuations turning into cyclical increases and decreases. As economic entropy rises rapidly, the degree of chaos increases and the number of enterprises grows. People's purchasing power cannot support the huge number of enterprises, so enterprises go bankrupt; unemployment rises and purchasing power falls further; demand and prices decrease; consumption then recovers, more companies compete, and economic entropy increases again [11]. On the whole, economic entropy increases.
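Reference [11] is not quoted in enough detail here to reproduce its exact formula, so the following is only a hypothetical illustration: applying a Shannon-style entropy to the distribution of activity across firms, so that more firms competing more evenly yields higher entropy.

```python
from math import log2

def activity_entropy(activity):
    """Hypothetical proxy for economic entropy: Shannon entropy of the
    distribution of activity across firms. This is an assumption for
    illustration, not the actual formula of reference [11]."""
    total = sum(activity)
    shares = [a / total for a in activity]
    return -sum(s * log2(s) for s in shares if s > 0)

# Illustrative numbers: a near-monopoly vs. four evenly matched firms.
print(activity_entropy([95, 5]))           # about 0.29: low entropy
print(activity_entropy([25, 25, 25, 25]))  # 2.0: higher entropy
```

Under this reading, the boom-and-bust cycle in the text corresponds to the entropy of the firm distribution rising and falling around an overall upward trend.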

Biological Entropy
The modern physicist Erwin Schrödinger mentioned biological entropy in his book What Is Life: "Life is a non-equilibrium system and feeds on negentropy." [12] Schrödinger believed that every organic life carries a biological entropy. If an organism does not exchange matter with the outside world, its biological entropy keeps increasing until it reaches a maximum, and the organism dies. One can think of entropy as representing the energy that can no longer be used: all biological activities consume energy and increase entropy. To resist the entropy increase it generates, an organism needs to introduce a flow of negative entropy. For heterotrophs, seeking food is the way of introducing negative entropy flow; for autotrophs, the sun and water are sources of negative entropy flow. The amount of spontaneous change in an organism's entropy differs at each moment. The change in biological entropy can be written as the balance between these two terms [13]:

dS/dt = dSᵢ/dt + dSₑ/dt, (6)

where dSᵢ/dt ≥ 0 is the spontaneous (internal) entropy production and dSₑ/dt is the negative entropy flow introduced from the environment; in [13] the two terms are parameterized with coefficients a, b and time constants c, d. In childhood, the spontaneous change of entropy is smallest and the introduced negative entropy flow exceeds the spontaneous increase, so biological entropy is in a state of decrease and the whole system of life becomes more orderly. In adulthood, the spontaneous entropy increase equals the negative entropy flow, and biological entropy remains unchanged. When the organism enters old age, the system gradually becomes disordered and the metabolic rate slows: both the rate of acquiring negative entropy flow and the rate of spontaneous entropy production slow down, but the spontaneous production gradually exceeds the introduced negative entropy flow, so the overall entropy gradually increases to an extreme value, and the organism finally dies. The essence of death caused by disease is an instantaneous maximization of entropy due to external factors.
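The exact parameterization of [13] is not recoverable from the text above, so the sketch below is a purely hypothetical toy model that reproduces only the qualitative life-stage behavior described in this section: entropy decreasing in childhood, roughly balanced in adulthood, and increasing in old age. The exponential forms and the values of a, b, c, d are all assumptions made for illustration.

```python
from math import exp

# Hypothetical toy model of the biological entropy balance:
#   dS/dt = spontaneous entropy production - negative entropy intake.
# The exponential forms and constants below are NOT the formulas of
# reference [13]; they are assumed only to illustrate the sign changes.
a, b = 1.0, 2.0    # assumed coefficients
c, d = 30.0, 60.0  # assumed time constants, in years

def dS_dt(t):
    production = a * exp(t / c)  # spontaneous production grows with age
    intake = b * exp(t / d)      # negative entropy intake grows more slowly
    return production - intake

for t in (5, 40, 80):  # childhood, adulthood, old age
    print(f"age {t:2d}: dS/dt = {dS_dt(t):+.3f}")
```

With these assumed constants, dS/dt is negative in childhood, close to zero around adulthood, and positive in old age, matching the three stages described above.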

Entropy in Science Fiction Movie Tenet
Director Christopher Nolan mentions the concept of entropy several times in his sci-fi film Tenet. Scientists throughout history have used entropy as a marker of the direction of time, and Nolan boldly takes this one step further, making entropy and time directly equivalent: if the entropy of a closed system can decrease spontaneously, then time can be reversed. Nolan imagines a machine that turns entropy increase into entropy decrease, so that everything develops toward a more orderly state. In the movie, after the protagonist is surrounded by fire in a car accident, he suffers frostbite from the fire, because in a world of decreasing entropy heat is transferred from cold matter to hot matter. In such a world oxygen is not evenly distributed, so people need to carry oxygen cylinders to breathe. People can no longer walk normally, because the heat dissipation caused by friction is gone. The film is unquestionably spectacular as art, and the assumptions it puts forward are refreshing; but from a scientific point of view, it is impossible to build a machine that reverses the entropy increase of a closed system, as this violates the basic laws of thermodynamics.

CONCLUSION
Entropy, a physical variable proposed in the nineteenth century, can provide help and inspiration for many different fields in the twenty-first century as science and technology develop rapidly. From a historical perspective, entropy has become an extremely important concept for human development; whether in informatics, economics, or even art, it has played an important role. We believe that the concept of entropy will also become an important part of sociology, psychology, artificial intelligence, and other fields in the future and will contribute to those areas. This paper lacks the material to explore concrete application examples of entropy in different fields; future research can examine the effect of applying entropy in different fields more deeply by combining data analysis with case analysis.