Answer:
Manifest Destiny was the widespread 19th-century belief that it was the duty of the United States to expand westward, settling uncharted territory and turning new lands to profitable use. In short, it meant an obligation to move west.