Entropy is a measure of the number of specific ways in which a system may be arranged, and is often taken to be a measure of disorder. The entropy of an isolated system never decreases, because such systems spontaneously evolve toward thermodynamic equilibrium, the state of maximum entropy.
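The idea of counting arrangements is captured by Boltzmann's formula S = k_B ln Ω, where Ω is the number of microstates consistent with a macrostate. A minimal sketch, using an assumed toy model of 100 two-state particles where a macrostate fixes how many are "up":

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_up):
    """Entropy S = k_B * ln(Omega) of the macrostate with n_up particles up."""
    omega = comb(n_particles, n_up)  # number of arrangements (microstates)
    return K_B * log(omega)

# The evenly mixed macrostate admits the most arrangements, so it has the
# maximum entropy -- the equilibrium state the system evolves toward.
entropies = [boltzmann_entropy(100, k) for k in range(101)]
most_likely = max(range(101), key=lambda k: entropies[k])
```

In this toy model `most_likely` is 50: the half-up, half-down macrostate maximizes the count of arrangements, illustrating why equilibrium is the state of maximum entropy.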
Entropy was discovered when it was noticed to be a quantity that behaves as a function of state. Entropy is an extensive property, but it is often expressed as the intensive property specific entropy: entropy per unit mass or per mole.
It is often said that entropy is an expression of disorder, or randomness of a system, or of our lack of information about it (which on some views of probability amounts to the same thing as randomness).
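The link between randomness and lack of information can be made precise with Shannon's information entropy, which measures the average information (in bits) needed to specify an outcome. A minimal sketch:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: average information needed to pin down an outcome."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A certain outcome carries no missing information (zero entropy);
# a uniform distribution, the most "random", carries the most.
certain = shannon_entropy([1.0])        # 0.0 bits
uniform = shannon_entropy([0.25] * 4)   # 2.0 bits
```

On this view, a high-entropy state is simply one about which we know the least: many equally plausible possibilities remain, so many bits are needed to single one out.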