(1865–1877)—The end of the Civil War marked the beginning of Reconstruction, the era of rebuilding the Union. Congress instituted sweeping political, economic, and social changes in the former Confederate states. During Reconstruction, African Americans gained many well-deserved rights but were still denied full equality, most notably in the South.