Information vs. Entropy

Entropy is that strange, difficult-to-define property of the physical world which is somehow related to equilibrium and which, as anyone who has ever sweated through a physics class knows, always increases in a closed system.
The amount of entropy in a system is inversely related to the amount of information needed to describe that system: as one rises, the other falls.
Consider the following classic example: ice melting. Take a glass of ice water and put it into a box at room temperature. You now have a closed system with three different states of water at three different temperatures: water frozen solid as ice, water in liquid form (a little warmer than the ice), and water vapor suspended in the air of the box (at room temperature, a little warmer than the liquid water).
So, in order to describe the system, you need to use six different variables: the three states of matter the water can be in (a. solid, b. liquid, c. gas) and the three different temperatures.

Eventually, the warmer water the ice is floating in causes the ice to melt, imparting some of its heat to the ice and cooling down in the process. Now you need only four variables to describe the system: liquid water (b) and water vapor (c), with the liquid water cooler than the vapor. That is two states and two temperatures, down from the six before.
Next, the water vapor imparts some of its heat to the liquid water, warming it until both are at the same temperature. At that point, you need only three variables: the liquid state of the water (b), the gaseous state of the vapor (c), and the single, common temperature they both share.
Over time, the liquid water evaporates and you are left with only a pair of values: the gaseous state (c) and the single temperature of all the water vapor. No ice, no liquid water, nothing other than water vapor at room temperature.
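The stage-by-stage bookkeeping above can be sketched as a toy tally (illustrative only; the stage names and counts are taken from the example, where each stage's variable count is the number of states of water present plus the number of distinct temperatures):

```python
# Each stage of the melting-ice example: (description, states present, distinct temperatures)
stages = [
    ("ice + liquid water + vapor", 3, 3),  # solid, liquid, gas at three temperatures
    ("liquid water + vapor",       2, 2),  # the ice has melted
    ("liquid + vapor, one temp",   2, 1),  # the temperatures have equalized
    ("vapor only",                 1, 1),  # the liquid has evaporated
]

# Variables needed to describe the system at each stage: states + temperatures
variable_counts = [n_states + n_temps for _, n_states, n_temps in stages]
print(variable_counts)  # → [6, 4, 3, 2]
```

The count shrinks at every step, which is exactly the progression the example walks through.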
The second law of thermodynamics tells us that no closed system will lose entropy over time; in fact, entropy naturally increases.
A given closed system will contain more entropy at any future point than it does in the present.
IF the information required to describe a system is inversely related to its amount of entropy, THEN a given closed system will require less information to describe at any future point than it does in the present. A simpler way to put it is to say that the system will contain less information.
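A minimal sketch of this IF/THEN claim, using the variable counts from the melting-ice example (treating each later stage as the higher-entropy state, per the second law):

```python
# Variables needed to describe the system at each successive (higher-entropy) stage
description_sizes = [6, 4, 3, 2]

# The claimed inverse relationship: every later, higher-entropy stage
# requires strictly fewer variables than the stage before it.
inversely_related = all(
    earlier > later
    for earlier, later in zip(description_sizes, description_sizes[1:])
)
print(inversely_related)  # → True
```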
Therefore information is anti-entropy (or, equivalently, entropy is anti-information): one increases over time while the other decreases.
Another possibility is that information and entropy are the same thing, one viewed along the arrow of time and the other viewed in reverse.