Democrats are identified as "liberal," but for the most part they're centrists. Republicans have become extreme right-wing conservatives.
Trump is a Republican; Biden is a Democrat.
In the past, things were different. In the mid-19th century, for example, the Democrats were the establishment party and very conservative. It was the Democrats of the time who supported slavery. The newly founded Republicans, in contrast, opposed slavery and supported maintaining the Union; I don't know if you'd call them liberal or not.
This was the way of things through the Civil War and most of Reconstruction. Over time, though, the Republicans became the party of business interests and favored "rugged individualism" over communalism, while the Democrats became the party of labor and working people. By the time of the Great Depression, the two parties had mostly flipped.
After World War II, there was a lot of turmoil in the Democratic party. Part of the party supported ideas that might be called socialist, such as Social Security, as well as equal rights for women and minorities. Other parts of the party held on to its blatantly racist past; these were the Dixiecrats, who briefly split off as the States' Rights Democratic Party in the 1948 election.
After the recessions, social upheaval and other issues of the '60s and '70s, America turned to an aged former actor named Ronald Reagan to "make America great again." By this time, the Republicans had solidified as the party of business and conservative values. As it was no longer socially acceptable to campaign publicly against things like equal rights, the Republicans allied themselves with the conservative evangelical movement and adopted an anti-abortion stance as their rallying cry. The Reagan years overturned a lot of what America had valued up to that point, laying the groundwork for the disastrous regimes of the Bushes and the current felon-in-chief.
So, there's a quick overview of American politics, and my pedantic lecture of the day.
Class dismissed.