Tag: France-United States Relations
The history of the United States is linked to France's early involvement in North America, where French explorers and colonies spread across the continent. French troops were indispensable to the United States' independence from Great Britain. France's sale of the Louisiana Territory put the United...