Daz_Hockey, America didn't colonize the continent. They purchased Louisiana, won the Oregon territory through a peace treaty, united with Texas, and grabbed California from the Mexicans. All that land was under the control of someone else, not indigenous peoples. And they didn't claim it in the name of freedom, read up on Manifest Destiny. If it hadn't been them, it would have been somebody else.
And Britain may have been better than the United States at dealing with the natives, but the French were better than the English before the French and Indian War (the Seven Years' War). The vast majority of the natives sided with the French in that little war.