Will the country ever unite again, or are we destined to remain a divided America and go the way of the Roman Empire?
I'm not convinced Biden can unite the country. I'm not sure all the executive orders he has signed in the last two weeks are helping to unite it; instead, he may be causing the far right to dig in even deeper. Your thoughts?