Friday, January 13, 2012

Why do Americans think they own America?

First of all, the Pilgrims and the settlers who came after them killed huge numbers of Native Americans and took over the lands where they lived. Then came Manifest Destiny, and the U.S. started a war against Mexico to take over California, Texas, and a whole bunch of other territory that now belongs to the U.S. It also took over Hawaii and Alaska, where much of the population was native. And now people want to say "Mexicans are taking over." What's up with that??? That's called karma. What's your opinion?