The term "American dream" has existed in America for generations. People came here from all over the world to pursue this dream. To them, it basically meant a better life and a better future for themselves. Many never reached this goal, but at least they had the chance. For once, they had options and a say in how their lives might turn out.
But America is totally different now; we have evolved and formed new ideas about how we should live. So do you think the American dream has a new definition compared to when the term first appeared? I believe it does, for many reasons.
- America is now all about drugs, sex, and money. Nobody truly cares about the person to the left or right of them. We Americans are so selfish that we can't help out a fellow neighbor even when we have more than enough.
- Education matters far more than it once did. This is a dog-eat-dog world, so without knowledge you can't get anywhere in life. When the term was first introduced, most people worked instead of going to school; school was seen as a place mostly for the rich.
Those are just the tip of the iceberg, though. The website reference goes into more depth about it all.