The Second Eclectic

Technology changes how we relate to God and each other

What What Would I Say? Says About Us

The AutoBots are here. They are us. This week, by storm, on Facebook, they took us. My Facebook feed blew up with autogenerated status updates starting yesterday afternoon. It's addictive. The What Would I Say? website explains: The AutoBots are actually a computer program called a “Markov bot”—based on a statistical math model named for Russian mathematician Andrey Markov. The Markov bot uses “bigram and unigram probabilities derived from your past post history.”

I didn’t know what this meant so I did some research. And by “did some research” I mean, I went to Wikipedia.

What It Does

Bigrams and unigrams are subsets of what are called “n-grams.” The “n” in “n-gram” is simply a placeholder for a number: how many words get grouped together. With unigrams, we’re talking about one word at a time. With bigrams, two. In other words, the What Would I Say? AutoBot selects a word or two and then predicts what the next word or two is most likely to be.
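If the vocabulary is new to you, the idea is small enough to fit in a few lines. Here's a minimal sketch (the example sentence is made up, not from the actual site) of what unigrams and bigrams look like for one status update:

```python
# A made-up status update, standing in for a real post history.
text = "pirate parrot wants a cracker"
words = text.split()

unigrams = words                       # single words
bigrams = list(zip(words, words[1:]))  # adjacent word-pairs

print(unigrams)  # ['pirate', 'parrot', 'wants', 'a', 'cracker']
print(bigrams)   # [('pirate', 'parrot'), ('parrot', 'wants'), ('wants', 'a'), ('a', 'cracker')]
```

That's all an n-gram is: a sliding window of n words.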

How does it predict the next word? By collecting all your previous statuses—to the program, they're just a “bag of words”—and by analyzing the proximity of each word to the words around it. How often does the word “parrot” appear next to “pirate”? And so forth. The Markov bot runs this analysis and then, by selecting a word or two, selects another likely word or word-pair to pair with them. Parrot? Pirate! Perfect.

So: What does it say that What Would I Say? has stormed in and taken us? Why are we so interested to know the answer? Why is it SO ADDICTIVE?

What It Says

The answer seems to be somewhere in our current relationship to computers and computing. We want to know just what a computer could do with all we’ve put into it. “Enough about me. Let’s talk about you. What do you think about me?” It’s the same sort of thing here.

We’ve spent years on Facebook punching in punchy, pithy phrases for ourselves and our friends. Now we want to know what sense of us a computer could make. We want it to reflect ourselves back to us. We like looking in the mirror.

The results, of course, are ridiculous, fun, funny, and somehow sort of true. The sentences are disjointed and fragmentary. Entertaining for their foibles, but also for their accuracy. And in fact, the AutoBots do seem a bit like the mirrors at a carnival. They stretch and exaggerate. They scatter and gather. They distort and contort and compress. But there’s still something accurate about them. And for that, we laugh and wonder. What does What Would I Say? tell me about myself? What do I tend to say? What are the most probable words and word-pairs? What do I post about a lot?

What Would I Say? does exactly what every technology does: reflect something of ourselves back to us. Whether it’s a car translating our financial status into an object and our legs into wheels, or satellites that extend our eyes higher than maps and mountains. Sometimes, like in the chrome of a water faucet, the reflection is too distorted for us to readily recognize our own faces, but somewhere in there, the reflection is. At other times, as in a mirror, the reflection seems uncanny, albeit reversed.

Now, imagine if What Would I Say? became the default way of posting our statuses. Imagine if it became our normal grammar and syntax. Imagine if the distortion came to be recognizable and taken for reality. Well, the reality is that such is the case. We are embedded and ensconced in technology, just like the Markov bot. We use technology to project a distorted image of ourselves all over Facebook, Twitter, Instagram, Pinterest, and the rest. Over voicemail and email and gmail and gchat and text messaging. Only a portion of ourselves makes it through—some portions more recognizable than others. But all of them contribute to who people think we are—for better or worse: The AutoBots are here. They are us. And for a moment, at least, we can see through a glass darkly to the glass darkly, and well enough to see partly how darkly it is.