I'm honestly very ignorant about 20th century history, but from what I've heard, the 60s and 70s were a very important time for social change. It seems as if a huge amount of social injustice was righted in those two decades, but then the impetus kind of dropped off. Now, in the 21st century, that yearning for social equality seems to be getting a bit of a second wind.
So, what happened in the 80s and 90s? Did people just become complacent after all the change that had just happened and not want to fight anymore? Or is there a large chunk of history I'm ignorant about?