Is America better off when the Democrats are in control of government or the Republicans?
Why?
Republicans don't govern. They just engage in performances, virtue signaling, and culture wars. They tout the merits of "small government," so they figure that doing nothing of substance is the way to do their jobs.
Democrats govern. They try to improve people's lives and correct social inequities. They believe that they were elected to do a job and effect change.
Whether you prefer one side or the other largely depends on whether you approve of the changes that Democrats make/propose. An exception to that would be Republican fiscal policy, like tax cuts for the rich and periodically shutting down the government. If you LUVVV all that, well, then, go RepubliQ.
Is the country better off? Republicans make things worse or do nothing. Democrats improve things. But NEITHER side is "in control" right now, so nothing gets done. We're not even paying our bills, while the RepubliQ warm up their idiotic "inquiry" into Hunter's mystical mythical magical laptop.
Oh, and the RepubliQ are unethical, dishonest assholes.