France-United States Relations: How US Influenced France

The history of the United States is closely linked to France's involvement in North America. French explorers and colonists spread across the continent. French troops were indispensable to the United States' independence from Great Britain. The United States' purchase of the Louisiana Territory from France put the United...