Answer:
The Civil War profoundly changed the United States. First, as a result of the war, slavery was abolished with the ratification of the Thirteenth Amendment in 1865. Although the country would not grant African Americans full civil rights for another century, abolishing a practice that had already been outlawed in much of the Western world was a significant first step.
Explanation: