Is it normal that I think Republicans are destroying America?
When you say Americans, you mean the United States, right? It's strange that people don't know the difference between a continent and a country.