When Joe Biden won the 2020 presidential election, he promised to be a “transition candidate.”
Biden’s presidency sparked hope of stability after four tumultuous years under Donald Trump. A global pandemic was certainly a factor, but the reality is that America had not seen such radical, sweeping changes in any previous four-year term.
President Biden’s role seemed to be to reverse the most extreme Trump-era policies, while laying the groundwork for a future Democratic president to take the reins in 2024. It’s not that anyone expected him to do nothing, but many assumed that Biden would bring America back to the center after four years of right-wing leadership.
Instead, Biden took the nation past the center into a new radical left where even Democrats wonder if he’s gone too far.
[Oh, I don’t know about that. To paraphrase Han Solo, “I imagined quite a bit” of awful when he took the throne. ~ Beege]