"India to send world's last telegram. Stop." from Christian Science Monitor
The world's last telegram will be sent on July 14 in India. 169 years after Samuel Morse sent his first telegram in 1844, the technology is finally obsolete.
"So, You Wanna Be an Android?" from MIT Technology Review
Uploading our minds into machines sounds crazy to most of us, but a number of people met recently to discuss the possibilities, including MIT and Harvard professors who have worked with the US government.
It's no surprise that we desire an ongoing conscious existence: we want to avoid death and obtain eternal life. In extreme scenarios like this one, the conflicts between technology and the Christian faith are easier to see, but the conflict exists in less extreme cases as well, and Christians would do well to consider how their faith informs their technology. For example, all technology is driven--implicitly or explicitly--by the impulse for control, and that alone should give Christians pause: after all, God calls us to live by faith. These impulses are often at odds, and we need to explore where the line between the two lies.
"Connecting the Dots, Missing the Story" from Slate
"Why are things the way they are?" Evgeny Morozov says that we care less and less what the answer is. Why? Because we can change it--we can "fix" it.
Evgeny Morozov points out how we prefer Big Data to the Big Narrative. We would rather know how to fix something (data) than know why things are the way they are (narrative). Technology trains us to look past the reasons and look for ways to modify the outcomes.
This preference brings a few things into focus. First, the Bible offers to explain why things are the way they are; but when it comes to fixing the way things are, Big Data is the solution.
Second, method disappears inside every device, so that all we see is the output. Every technology is focused on results, not on understanding what is. We're focused on fixing the problem and changing the outcome, but we've given up trying to understand the problem. Yet everyone knows that how you frame the question determines the range of possible answers. If you're always and only focused on changing outcomes, you may overlook real answers that lie hidden in an understanding of why things are the way they are--in the process itself.
If you're focused on changing outcomes today, step back and try to understand why things are the way they are. That's also one of the reasons I do this blog: to understand how technology has made things the way they are, and how human nature plays into that.
"Mapping the Future with Big Data" from The Futurist
Along with the article above, and in keeping with some other articles about mapping that I have featured in the Warp and Woof, here's another one. It gets more interesting in the second half, which looks at the problems created by mapping.
Among the problems is the potential for maps to empower discrimination. The author exonerates the mapping technology, though, saying "none of these problems is the fault of . . . interactive maps." The underlying assumption here is a common one. Like any technology, maps offer certain options to users while foreclosing others. The user believes he has more options, when in fact he has a different set of options, with various alternatives pared away or elided entirely. Given the options he sees, he is empowered to choose, and he believes he has as many options as possible; but he doesn't see the options he's not offered. The map doesn't show him what he doesn't have, so he's ushered into a certain set of biases. Of course "interactive maps" should take some of the blame for the problems outlined in the article.
The solution offered in the article? Use more technology. This is a classic attitude: using technology to fix the problems created by technology.
The article concludes with another interesting comment: "We've dissected our world into specialists of science." That dissecting is the nature of technology, and in the process we kill the things we've dissected. You can't bring a dissected body back to life. What we've taken apart we can't put back together, no matter how much we'd like to believe we can; there will always be screws left over. The article ends by asking, "How do you put it all back together again? That," he says, "is the interesting part." But why should it make sense to take things apart only to try to put them back together? We refuse to accept the unity of something we can't understand, so we dissect it--and in doing so, we lose the essence of the thing.
"Social Networking in the 1600s" from the New York Times
Tom Standage is a fine writer on the history of technology. He wrote a delightful book called The Victorian Internet and has another coming out about social media; this article is a foretaste. He claims that coffeehouses were an earlier edition of social media. I recommend the article.
However, fundamental differences remain between coffeehouses and the Internet. Even if similar human impulses and attitudes produce both, the structures are quite different, beyond even the lack of face-to-face interaction. The ethos of the coffeehouse was to disregard class distinctions and promote conversation with strangers; the ethos of the Internet, while broad, is closed to certain segments of the world's population. And as for conversation: Twitter is horrendous for coherent conversation; Facebook is a little better, but limited to your social circles; and blogs, which few people read, host comments that are mostly painful. Conversation doesn't thrive on the Internet--just a lot of talking without listening.
The spirit of the Internet and coffeehouses may be similar, but the structures entirely alter the character of what is actually happening.
"Voice of the Graduate" (pdf) from McKinsey on Society
Among college graduates, liberal-arts and performing-arts majors are employed less well than those in business, science, technology, etc. This points to an increasingly technical workplace--one driven by technology and the technical mindset, marginalizing the arts (see the third article above). The mechanical world of business, science, and technology empowers those with such aptitudes and alienates those without. Society as a whole is increasingly mechanized and technical, and it favors those whose skills reinforce that pattern.
Students, as a result, find they need more specialized skills. Again, this parsing out is a product of a technical society, where outcomes matter and experts can obtain them. Those who prefer context, which the arts favor, are marginalized. Morozov points to this in describing Big Data and Big Narrative: one ignores context, the other tries to understand it. McLuhan and Postman both criticized the expert as a product of technical society, and that critique is clearly in play here. The specialist is favored.
Warp and Woof 6/7/13