Well, here's a simple explanation...
Entropy is a measure of a system's disorder, or more precisely, of how much you don't know about the system's exact microscopic state...
For example: consider a system of particles, where each particle can be found in one of some number of physical states, say energy states (e.g., every particle can have energy 0 or some value E).
The more particles you have, and the more (energy) states they can be in, the larger the entropy... The point is that you don't (and can't) know the exact state of every particle. You can measure, for example, the temperature, pressure, volume, or total energy of the system... but there are many possible arrangements of the particles (microstates) that correspond to the same macroscopic state.
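To make "many microstates per macrostate" concrete, here's a tiny sketch (my own toy example, not from any standard reference): 3 particles, each with energy 0 or E, grouped by total energy. States with the same total energy look identical macroscopically, yet several microscopic arrangements produce them.

```python
from itertools import product

# Toy model: 3 particles, each with energy 0 or E (take E = 1 unit).
# Enumerate every microstate and group them by total energy (the macrostate).
microstates = list(product([0, 1], repeat=3))

by_total_energy = {}
for state in microstates:
    by_total_energy.setdefault(sum(state), []).append(state)

for total, states in sorted(by_total_energy.items()):
    print(f"total energy {total}E: {len(states)} microstate(s)")
# Total energy 1E, for instance, corresponds to 3 different microstates:
# (1,0,0), (0,1,0), and (0,0,1) -- indistinguishable macroscopically.
```

Measuring the total energy alone can't tell you which particle carries it, which is exactly the "missing knowledge" entropy quantifies.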
If you, for example, have only one particle which can be found in only one state, the entropy is 0, because you know everything about the system.
The more particles you have, and the more states they can be in, the less you know about the system (you can know its macroscopic properties, but you don't know which particle is in which state)... which means the entropy is larger.
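This counting idea is captured by Boltzmann's formula S = k_B ln Ω, where Ω is the number of microstates. A minimal sketch (the specific particle counts below are just illustrative choices): with one particle in one possible state, Ω = 1 and S = 0, matching the case above; for N independent particles with m states each, Ω = m^N, so entropy grows with both N and m.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates):
    """Boltzmann's formula: S = k_B * ln(Omega)."""
    return k_B * math.log(num_microstates)

# One particle, one possible state: Omega = 1, so S = 0
# (you know everything about the system).
print(boltzmann_entropy(1))  # 0.0

# N independent particles with m states each: Omega = m**N,
# so S = N * k_B * ln(m) -- larger for more particles or more states.
for N, m in [(1, 2), (10, 2), (10, 4)]:
    omega = m ** N
    print(f"N={N}, m={m}: S = {boltzmann_entropy(omega):.3e} J/K")
```

Note how the entropy for N particles is just N times the single-particle entropy, since ln(m^N) = N ln(m).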
Hope this helps a bit :)