Americans only save the world in movies
It’s the lie we tell ourselves
In the movies, the end of the world means something bad happens in New York or Los Angeles.
No American movie gives a damn about what happens in Guatemala or Cameroon.
So, we are talking about "American movies", then.
Why shouldn't American movies see the end of the world from the perspective of American protagonists, with "and in other countries" represented by the "international news segments"?
It's the same in the movies.
In movies that some of us really believe are real or really possible #TopGun
American movies
What she said!
True… since 1945.
Since ever. 😂😂
In all fairness, they've typically also created the context which endangered the world in those movies, so…
We hope they stayed in books and TV 🤣🤣

Cause they have never once done anything good. 😅
American movies that is.
I'd say the world is pretty shitty. Maybe it's not meant to be saved. But just like anything that we once loved and was never good for us... it doesn't make it any easier to "let go"...
It feels like that, Lena.
America needs a hero right now, but unfortunately they are pretty sparse on the ground. I wish Congress would grow a spine and deal with Trump. How difficult would it be to toss him out of the White House? They managed it with Nixon.
And since none of us can read, a lot of us believe that myth. Telling us this is a lie makes us very upset.

I hate that I have to say us.
...and only in American movies.
Hey now hey now, also in fiction books.
Yeah, but we all know Americans don’t read books and that’s kinda why we’re at this point right now.