The Democrats are destroying America? Democrats are blind to the fact that their policies are destroying not only our cities but our country as well.