Most people need to wait a lifetime to watch the relevance of their first job fade off into the distance. I only needed to wait a decade.
You see, my first job after college was working as a fact checker for Chicago Magazine.
Back in 2009, this title used to carry a bit more weight than it does today. Nothing went to the presses without passing through the fact-checking team first. We were the gatekeepers, the arbiters of truth. And we took our jobs very seriously.
To “fact check” an article, I’d print out the entire story and attack it with an array of multi-colored pens and highlighters. The goal was to identify any facts or potentially problematic language that I’d need to verify somehow.
To give you a sense of what this might look like, here’s an example paragraph from a piece I wrote for Chicago Magazine back in 2009 showing how I might mark up a document.
Fact Checker’s Key:
Circled in red = Fact: Need to confirm accuracy
Strike-through = Not a fact: Filler language
Highlighted in yellow = Potentially inaccurate; may need to reword
After identifying all of the potentially problematic language, I’d pore over haphazardly organized notes from reporters, re-read research papers, and even call back interviewees to confirm key details.
As you might imagine, this was often a horrifyingly grueling and tedious process.
I once spent an entire week trying to validate how many “out of office” days President Bush had spent in his first 100 days in office, compared to President Obama. This process included several unsuccessful email exchanges with the White House and presidential library teams that only led to even more questions, such as:
- Does a half-day golfing count as an out-of-office day?
- If the President interacted with another national leader on the golf course, was that a “meeting” or a “vacation” day?
- If a President started the day at Camp David, then returned to DC that evening, how should that be counted?
As you might imagine, fact-checking was a fairly thankless job. And it was not for the faint of heart.
To prepare for the herculean task of ensuring accuracy at all times, I spent four years conducting high-intensity journalism training at Northwestern University. Essentially, this was all to make sure we’d be scared shitless of ever telling a lie in public.
To whip me and my peers into shape, our classes featured memorization quizzes on entire sections of The AP Style Guide and obscure spelling words like dybbuk. (Interestingly, that’s the first time I’ve ever managed to actually use that word.) Our homework assignments would earn us an immediate “F” if they contained even a single spelling mistake or misstated fact. And our professors drilled grammar rules and conventions into our heads with so much conviction that I’m fairly certain I’ll still be puzzling over mind-benders like “lay vs. lie” while I’m on my deathbed.
Of course, this obsession with accuracy didn’t come without its consequences.
When I interned at a local newspaper during the summer of my sophomore year, I made my first public mistake when I misspelled the name of a musician who conducted a local concert. As punishment, I locked myself in my bedroom and sobbed for three hours straight as I wrote heartfelt apologies to the conductor, my editor, and the newsroom team. I seriously debated quitting my internship on the spot.
That night, I had a terrifying episode akin to the musical mental breakdown “Spooky Mormon Hell Dream” that Elder Price experiences in The Book of Mormon.
Ten years later, I wish I could tell you that I even remembered the guy’s name. But of course, these things never quite matter as much as you think they do.
Some people might argue that facts don’t quite matter as much as they used to. This naturally makes me pretty sad. Who’d have thought that all those courses on libel and defamation would have the potential to be eradicated with just a single Tweet?
A fact by any other name
The world seems to have flipped 180 degrees in the decade since I graduated.
Now, rather than obsess over every minute detail, it’s almost as if we’ve collectively decided that facts no longer matter. Or rather, that anything can be a “fact” as long as it’s been cited or quoted anywhere else first.
I was having a hard time coming to terms with this new reality, so I made myself a handy guide to help:
Fact Checking Guide: Then And Now
I’ve come a long way since graduation. For one, I’m a lot less anal than I used to be.
Sometimes… I’ll use “less than” when I really mean “fewer,” or link only to a Wikipedia article as evidence that something exists. Once, in an email exchange with my entire team, I used “to” when I really meant “too.” I saw the mistake immediately, but I didn’t “undo send.” I just let it sit there, waiting for the Grammar Gods to strike me down on the spot.
But nothing happened.
(Today, I consider that single act of rebellion to be the start of a slippery slope toward both grammatical and social nonconformity in my life.)
That said, it’s been harder than I thought it’d be to accept this new norm. I’d be lying if I didn’t admit I slept a little easier when AP Style finally changed its approved spelling of “Web site” to “website.” (All those years of guilty emails were starting to get to me.)
The thing that gets to me the most, living in this “post-fact” world, is the way the phrase #factcheck has gotten a bad rap in recent years. Any time I see this hashtag in a Tweet, or the phrase “Fact check!” shouted in a political debate or angry dispute, it triggers me a little bit.
How DARE they throw that phrase around with such flippancy? Who gives the average person with an untrained eye the right to claim ownership over a craft that took me YEARS to learn?
After all, I put in my time learning the rules. I paid my penance in anti-plagiarism lectures and stress-eating during Finals Week as I wondered whether the draft I’d turned in to my professor included that spelling mistake. I’ve got the writer’s wounds, the failed assignments, and the mental scars to prove it.
Today, on the other hand, everyone is a journalist. Anyone can prove anything to be true, just as long as they get it online first.
What’s that you say? You have a blurry Instagram photo from someone’s account with 5 followers that you think proves your point?
I say: You don’t know the first thing about fact checking! To be a 22-year-old college graduate and believe that your entire life and career are built upon your performance on one single task.
To make sense of the jumble of words thrown your way and painstakingly research every last word to guarantee that every reader gets the accurate, objective content that they deserve. To feel the weight of the world on your shoulders to hold true to that implicit promise you make to every reader: That you will always tell the truth.
Oh, I’m sorry. You *also* have a second source from some guy who used his iPhone camera and then posted a short video that has been Retweeted 1,302 times and counting?
To that, I challenge you: How do you REALLY know that he shot that video? Did you actually look at his phone? Track his geolocation and prove he was where he said he was at the time the event took place? I mean, honestly, do you even know this guy is legit?? I bet he’s the kind of person who thinks “a lot” is one word! Come on now. You’re better than this. We all are.
I’m sorry. I’m getting carried away.
I like to think the job of a fact checker is more important now than ever before. I have to believe that there’s still a place in this crazy world for the kind of person who is committed to the truth. That we still care about legitimacy and validity and accuracy.
Maybe today, the biggest difference is simply that fact checking jobs aren’t assigned to some fresh-out-of-college graduate getting paid minimum wage. Maybe it actually is incumbent on all of us to hold each other accountable for what’s factual. Maybe this is the best and only way to make the truth more mainstream, to call attention to liars, to construct evidence-based arguments.
And I guess if that’s the case, then I’m okay with it. It’ll certainly be less lonely than being the only one obsessing over these things. Just so long as we all agree on one thing: Using the Oxford comma is always a good idea.
Also published on Medium.