Disclaimer
I want to make it clear that this post is meant as a conceptual introduction to an often misunderstood topic. I do not go into the mathematics associated with entropy, nor do I discuss the contributions of perhaps the most crucial person in the history and development of entropy, Ludwig Boltzmann. This discussion is meant to introduce the reader to the basic terminology, including the laws of thermodynamics as well as the concept of entropy.
If I asked you, “What is entropy?” how would you answer? Would you describe it as the tendency of a system to move toward disorder or chaos? This is the oversimplified definition that many people have learned. I’d like to offer a different description of entropy, one that is a bit more in-depth and more accurate. The idea of an increase in disorder is not altogether incorrect; it is just incomplete. I hope to give you a conceptual picture of entropy while avoiding the cumbersome math involved.
Thermal Physics Vocabulary and The Laws of Thermodynamics
We need to cover a few topics from thermodynamics in order to understand entropy more accurately, so I’d like to introduce some common vocabulary terms associated with thermal physics, presented in no particular order. Kinetic energy is the energy of a body in motion: the greater the speed of a particle, the greater its kinetic energy. For our purposes, temperature can be described as the average kinetic energy of an object, meaning the average kinetic energy of its particles at the microscopic level. As particles are heated, they move faster and collide with each other more often than they do at a lower temperature. Thermal equilibrium means that two objects in contact with one another have reached the same temperature. Thermal energy is simply heat. Absolute zero is a temperature so low that there is no heat available to transfer to other systems. Sciencedaily.com describes absolute zero as “the point at which the fundamental particles of nature have minimal vibrational motion, retaining only quantum mechanical, zero-point energy-induced particle motion.” Absolute zero is measured as 0 on the Kelvin scale, which corresponds to -273 on the Celsius scale and -460 on the Fahrenheit scale.

Let’s look at the terminology that is specific to the concept of entropy. A microstate, according to khanacademy.org, is “the arrangement of each molecule in the system at a single instant.” A macrostate is the observable configuration of the microstates. So what does that mean? Think of flipping 3 coins, labeled A, B, and C. How can you end up with 2 heads and 1 tail? Coin A could land heads, coin B heads, and coin C tails; coin A heads, coin B tails, and coin C heads; or coin A tails, coin B heads, and coin C heads. The exact configuration of which state (heads or tails) each specific coin lands on is the microstate. In this example, you have a total of 3 microstates corresponding to a single macrostate. The macrostate is 2 heads and 1 tail; the microstate is the specific arrangement of the coins, as seen below.

In the above chart, you can see the 252 specific microstates that are associated with the macrostate of 5 heads and 5 tails in a 10-coin system.
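The microstate counts in these coin examples are just binomial coefficients. As a quick sanity check, here is a short Python sketch (the helper name `microstates` is my own) that reproduces both numbers:

```python
from math import comb

def microstates(n_coins, n_heads):
    """Count the microstates (specific coin arrangements) that
    produce the macrostate 'n_heads heads out of n_coins'."""
    return comb(n_coins, n_heads)

# 3-coin example: the macrostate "2 heads, 1 tail"
print(microstates(3, 2))   # 3 arrangements: HHT, HTH, THH

# 10-coin system: the "5 heads, 5 tails" macrostate
print(microstates(10, 5))  # 252 arrangements
```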
The Laws of Thermodynamics
The next topic worth discussing is the laws of thermodynamics and their unusual numbering. There are four laws, but they begin with the zeroth law and end with the third law. The zeroth law was included as a law much later than laws 1, 2, and 3. It was “British physicist Ralph H. Fowler who first coined the term ‘zeroth law,’ based on a belief that it was more fundamental even than the other laws.” (https://www.thoughtco.com/laws-of-thermodynamics-p3-2699420) The zeroth law states that if two systems are each in thermal equilibrium with a third, then they (the first two systems) are in equilibrium with each other. This is the basis of temperature measurement and is actually how old glass thermometers work: you place the thermometer under your tongue and wait for the patient and the fluid in the thermometer to come to thermal equilibrium, then take the temperature reading.

The first law of thermodynamics is a restatement of the law of conservation of energy. It states that energy in the universe can neither be created nor destroyed; it can, however, be transformed from one form to another and be transferred to another object. An example of transformation from one form to another would be dropping a rock from a high cliff. Before the rock is dropped, it contains potential energy (think stored energy), and once it is dropped, the potential energy is transformed into kinetic energy (energy of motion) as the object falls to the ground. When the rock hits the ground, it transfers its energy to the ground.
According to https://www.grc.nasa.gov/WWW/k-12/airplane/thermo2.html, the second law of thermodynamics states “that if a physical process is irreversible, the combined entropy of the system and the environment must increase. The final entropy must be greater than the initial entropy for an irreversible process: Sf > Si (irreversible process).” This law tells us that the entropy of the universe is always increasing. There may be a local decrease in entropy within a system, but the entropy increase in the surroundings will always be greater than that local decrease.

The third law of thermodynamics states, according to http://physicsforidiots.com/physics/thermodynamics/, “As temperature approaches absolute zero, the entropy of a system approaches a constant minimum.” This law is also used to illustrate that absolute zero is a temperature that is so low that it can never actually be reached.

So Now What?
So now we have these vocabulary terms and a set of laws, but how do they apply to entropy? First, let’s look at a definition of entropy. According to www.merriam-webster.com/dictionary/entropy, entropy is “a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system’s disorder, that is a property of the system’s state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system.” So is this the best working definition of entropy we can come up with? Short answer: no.
Let’s take a look at what is really going on with thermodynamics and entropy, and why “the tendency of a system to increase disorder” is not the best description of entropy. Think of a cup of black coffee into which liquid creamer has just been poured.

It may appear that the creamer is in a highly disordered state, and as a result you might predict that this cup of coffee has greater entropy than the cup of coffee below.

The coffee in the first image does appear more disordered, but it is the coffee in the second image that has greater entropy. How can this be? If we change our working definition of entropy to the tendency of a system toward the state with the maximum number of microstates, we can answer this question. In the first image, as with a macrostate of 3 coins all landing heads, there are fewer ways in which the objects (molecules of creamer, or coins) can arrange themselves to produce that specific configuration. If we look at the coffee with the creamer uniformly mixed in, there are many more ways for the particles to produce a uniform mixture, because it doesn’t matter which particles of creamer mix with which particles of coffee in order to get an even mixture. Likewise, obtaining 2 heads and 1 tail is more likely than obtaining 3 heads because of the number of possible ways to obtain each outcome: there is only one microstate that gives a macrostate of 3 heads, but there are 3 microstates that deliver the macrostate of 2 heads and 1 tail.
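This counting argument can be made concrete with a toy model. Treating 10 coin flips as stand-ins for “particles” (an assumption of mine, not a real model of coffee), this short Python sketch shows that macrostates near an even split account for far more microstates than the extremes, which is exactly why the evenly mixed coffee is the high-entropy state:

```python
from math import comb

n = 10            # coins, a toy stand-in for creamer particles
total = 2 ** n    # total equally likely microstates (1024)

# Microstates and probability for each macrostate
for heads in range(n + 1):
    w = comb(n, heads)
    print(f"{heads:2d} heads: {w:4d} microstates "
          f"({w / total:.1%} of all outcomes)")
```

All-heads corresponds to a single microstate (about 0.1% of outcomes), while the 5-heads macrostate has 252 of them, so a system shuffled at random overwhelmingly ends up near the even mix.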
Another example which illustrates that disorder does not necessarily equate to entropy is the universe itself. The entropy of the universe was very low after the big bang and has been increasing ever since. The universe moments after the big bang had little structure compared to what we have now: galaxies, stars, planets, people. The kicker is that even with so many ordered and structured objects in the universe today, there are many more microstates available to create all these objects.
The above are two examples of why we need a better working definition of entropy than one that focuses on disorder. A more accurate and complete definition must include the idea of a system tending toward the state with the maximum number of microstates for a given macrostate. There are many times when the old-fashioned description of chaos and disorder works just fine, but it is not a completely accurate one.