Real American History That The ‘Woke’ Left Doesn’t Want You To Know
It's no secret that school textbooks don't cover the whole of American history, and that important facts about this great nation are often omitted to serve a narrative. In the name of keeping truth alive, The Onion presents these real accounts from American history that the "woke" left doesn't want you to know.