Entropy tells us about the "uncertainty" of a probability distribution, i.e. roughly how much information is needed to describe an event drawn from that distribution. From this point of view it is natural that the uniform distribution is the one that maximizes the entropy. This is the information-theoretic view of entropy.
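For concreteness, the formula I have in mind here is the Shannon entropy of a discrete distribution $p_1, \dots, p_n$:
$$ H(p) = -\sum_{i=1}^{n} p_i \log p_i, $$
which is maximized precisely when $p_i = 1/n$ for every $i$, i.e. by the uniform distribution.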
On the other hand, physics textbooks state the second law of thermodynamics as: a system tends towards the macrostate that maximizes the entropy. For a finite system this means the macrostate whose corresponding microstates have the greatest multiplicity.
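As I understand it, the entropy in this setting is the Boltzmann entropy
$$ S = k_B \ln \Omega, $$
where $\Omega$ is the multiplicity, i.e. the number of microstates compatible with the macrostate, so maximizing $S$ amounts to maximizing $\Omega$.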
These two viewpoints should agree, but I'm confused because they almost seem to contradict one another. For example, in physics the multiplicity graphs usually have a very sharp and localized peak, which is almost as far from a uniform distribution as you can get. How does the physics view of entropy agree with the idea of getting as close to a uniform distribution as possible?
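To make my confusion concrete with a standard example: for $N$ fair coin flips, the macrostate "$k$ heads" has multiplicity
$$ \Omega(k) = \binom{N}{k}, $$
which for large $N$ is sharply peaked around $k = N/2$. Yet the information-theoretic picture says entropy is maximized by a flat, uniform distribution, which looks like the opposite of a sharp peak.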