War Is Over (If You Want It)

Apparently, the only 24-hour waiting period everyone can agree to is a moratorium on controversial speech after gun violence, because everybody and their dog seems to be talking about it on my Facebook today.

My usual stance that “You only think you’re gonna be a hero until you accidentally tag a five-year-old with a ricocheting bullet while trying to save the day on an adrenaline rush” will go largely disregarded by the rabid advocates of the 2nd Amendment, and that’s fine. I never said I was right. I’m just saying that I’m not wrong, either.

Armed violence happens every day in this country. Last night around 4 AM, while the allegedly 24-hour news networks were busy rerunning their analysis of Sandy Hook or playing infomercials for kitchen gadgets, an armed lunatic shot up a hospital in Birmingham, Alabama, wounding a cop and two staffers before being (how did they put it in the Oregon mall case?) “neutralized.” (I’m reminded of Katt Williams’ bit about the power of the word “insurgent” for disinformation purposes.)

The day after the Aurora shooting, a bus full of undocumented immigrants tipped over, killing as many people as the gunman had the night before (if not more; I don’t feel like looking it up). The only difference between those 16 lives and the others is the terror caused by the violation of the sacred space of the theater. But the specifics of a tragedy don’t change the fundamental nature of tragedy itself.

So just go about your lives, people. It’s not that there’s nothing to see here; quite the contrary. It’s just that all that remains in the end is the abyss of the greater demons of human nature, and we all know what happens when you look at that for too long.

Noah, the Ark, and Information Overload

The various media industry trade groups have a very real problem. They can see the floodwaters rising, crashing against the aging levees that protect their cars, their houses, even their penthouse offices in some extreme cases. And they’re scared. And they don’t know what to do. And they’re asking the government to declare a disaster area and extend them special powers to stem the tide. But the tide of history waits for no man, and I think it’s time for them to find ways to divert the waters so there can be fertile ground for the people down their income stream.

Old media face two cascading waters flowing relentlessly towards them: the march of freedom that drives tech innovation, and quite possibly the most disaffected, cynical, and entitled generation of Homo sapiens ever to walk the earth as their client base. The Information Flood has arrived, and there seem to be three approaches to riding it out:

1. At the most basic, a media corporation can shore up against the stream. But right when Newt Gingrich, the Class of ’94, and Bill Clinton had handed the keys to the kingdom to the media corporations with the Telecommunications Act of 1996, the first clouds began forming on the horizon: the Internet, a literal doomsday tool of communication designed to survive a volley of Russian nukes, had been opened like Pandora’s box upon the unsuspecting world just a couple of years prior.

The rivers swelled with innovation. The flowing waters began eroding the barriers to entry for the average citizen of the Web. But these things were never more dramatic than when Web upstarts began cutting canals to divert this flow to their own purposes. In due course, battles formed over rights and damages, as they have since the dawn of civilization. The poor, the bandits, and the unscrupulous cut their own channels such that the dams, now breached, will never be whole again.

2. So some have built massive towers. The five biggest media conglomerates, in their shining monuments to analog distribution, have formidable resources at their disposal to keep sending lawyers after the thieves, either personally or by reporting to Washington to reinforce the levees. But even they can only stem the waters for so long.

We’ve already witnessed the disruption that modest grassroots movements can cause when they push up and crack through the stoneclad foundations of repressive societies in the Arab world. The cracks have become as innumerable as the blades of grass, and the legal attempts by the RIAA et al. to spread weed killer on that grass often destroy more of their client base than they intend to.

3. Or you can build an Ark, like the most successful Web companies have, full of two of everything known to man, whether it’s search results (Google) or the Long Tail (Amazon). Or a party boat with you and your best friends. Or a tiny little boat for surfing the net, like Twitter. And a dozen ways to bottle the water, like freemium (ads in apps), “season pass” (episodic gaming), or appealing to one’s fans to do the right thing (Louis C.K.’s Live at the Beacon).

Or even better, you can be up in the Cloud, among all the free-floating raindrops, building static electricity that is sometimes drawn down to earth along the path of least resistance: a hot, brilliant flash of energy that can power machines and spark wonderful works of art in the eyes of its beholders.

It’s kind of a beautiful metaphor, isn’t it?

Insert Coin to Continue

We had to write a research paper to cap off my Summer II “New Media Issues” class. I was pretty stoked for it, because new media is practically in my DNA. Being born in 1985, I had a natural progression out of the analog era into the digital, and I actually remember what it was like before the Digital Revolution. The nostalgia cuts both ways–“nostalgia” being a word that literally means “the pain of remembering home.” These days, the only place you’re going to hear the beeps and screeches of a telephony modem is in dubstep, there aren’t even physical video stores left to tell you to “be kind, rewind,” and when was the last time anyone saw an arcade?

I did my paper on the few arcades that remain, and how this is possible in a world where technology, sociology, and economics should have utterly destroyed the arcade as a third place in America. Not that the arcade is universally dead. In this article for Ars Technica, Kyle Orland interviews Brad Crawford, who is currently in post-production on 100 Yen: The Japanese Arcade Experience. Crawford has observed that the continued success of Japanese arcades has as much to do with sociology as economics:

To Crawford, the very different fates that have befallen North American and Japanese arcades has to do with demographics and urban planning as much as differing gaming tastes between the two countries. “Japan is so dense in population, such a mass of people, so that’s really why things are this way, there’s so much foot traffic everywhere,” he said. “A lot of North America just doesn’t have that quality to it.”

Japan’s train-based transportation culture makes it more inconvenient for those who live downtown to visit friends in the suburbs, Crawford said. Generally limited living space means the giant suburban basements and rec rooms that serve as gaming palaces for many Americans are a rarity in Japan. In this kind of environment, downtown arcades became a convenient place for people to hang out and have fun before heading home for the night.

“People are stuck downtown waiting for their friends; they don’t have massive social scenes at their houses,” he said. “I don’t know if I invited that many people over to my house in Japan. It doesn’t happen.”

These factors have combined to create a modern Japanese arcade that would be practically unrecognizable to a Westerner with memories of dingy, badly lit rooms full of young hoodlums. The only reason American arcades are remembered with a bad reputation? They never got the chance to grow out of it.

The arcades may have never grown up in America, but the kids that used to frequent them are now thirty-somethings with kids, disposable income, and nostalgia out the yin-yang. Some places, like Dave & Buster’s, have attempted to capture this demographic with a full bar and restaurant, well-appointed facilities, and specials on gameplay credits. Main Event Entertainment doesn’t have a restaurant, but it has a bar, as well as a bowling alley, rock wall, laser tag, and mini golf (among other attractions at its various locations). Chuck E. Cheese’s has climbing equipment and cartoon mascots, some of which have unfortunately been made over to appeal to younger demographics, like it’s the ’90s all over again.

But what all three of those major corporate chains have in common is this: their big money maker is the redemption games. Considering that gambling addiction is a Thing, it’s hardly surprising: in these Skinner boxes, the marginal consideration of risk versus reward can be tweaked as fast as you can say “reinforcement schedules.” This is different from even a casino, where the money ostensibly spends the same on the outside as inside; here, everybody should be aware that the tickets are not worthless, just worth less (in the interest of padding profit margins). At D&B’s, the 250GB Xbox 360 went for far more tickets than its $300 MSRP plus tax next door at the Best Buy would suggest. Even the price difference between it and the 4GB model didn’t scale linearly.
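To make that “worth less” arithmetic concrete, here’s a back-of-the-envelope sketch in Python. Every number in it is hypothetical (I didn’t write down the actual credit costs or ticket prices), but the shape of the math is the point:

```python
# A toy model of redemption-game economics. All numbers are made up
# for illustration; only the structure of the markup is the point.

TICKETS_PER_DOLLAR = 4        # hypothetical average payout per $1 of play
XBOX_MSRP = 300.00            # the 250GB Xbox 360 next door at Best Buy
XBOX_TICKET_PRICE = 150_000   # hypothetical prize-counter price in tickets

# Cash you'd actually feed the machines to grind out the prize:
dollars_spent = XBOX_TICKET_PRICE / TICKETS_PER_DOLLAR
markup = dollars_spent / XBOX_MSRP

print(f"Cash needed to 'win' the console: ${dollars_spent:,.2f}")
print(f"Effective markup over MSRP: {markup:.1f}x")
# The gap between dollars_spent and the MSRP is the padded margin:
# the tickets aren't worthless, just worth less.
```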

Pinballz Arcade, the locally-owned 13,000 sq. ft. retro palace over by Burnet and Research, immediately struck me as a place that knows its history as well as its business. For one, it was the only place set up so that the entrance opened onto a dozen of its namesake pinball tables instead of the money-making midway. (They evidently mix that up once a month, which could be feng shui or marketing.) They have a bunch of old games and other rarities, like a Zoltar the Magnificent fortune-teller (which doesn’t work, or rather, doesn’t grant wishes) and a couple of cocktail table games. Best of all, this place is a showroom, and most of the games are up for sale. Not the really good stuff–would you part with a six-player X-Men cabinet?–but “most” of it. I even found Revolution X, the crappy 1994 Aerosmith-branded light gun game where you fight The Man using SMGs that shoot CDs (upgradable to laserdiscs). Now that’s dedication to a cause.

I’m also supposed to include a brief write-up of the class itself. It was a joy to work with Dr. Royal and the rest of my SJMC graduate colleagues. For the first time in a long time, I felt like what I wanted to do was legitimate research with interesting possibilities, instead of being ignored for being about videogames or the Internet, dismissed with the derision of a parent telling their kid to “turn off that damn Nintendo and go play outside.” I can’t tell you what specifically I learned in the class because I already live and breathe its subject matter, but what DID happen is that I got involved with, and learned the value of, Twitter. I learned a thing or two about personal branding, new media business models, and where we could possibly go from here. Sure, it sucked that I had to drive from North Austin to San Marcos for a 6 PM class two nights a week (softened slightly by the fact that we teleconferenced on Mondays), but for all the time and miles behind me, I’ll say it was a great trip.

Apple, Microsoft, and the Rock Star Mentality

  1. Name someone else who works at Apple besides the late Jobs and his heir, Tim Cook. (You don’t count, Dr. Royal.)

    Go ahead. I’ll wait.
    I can’t do it either. Not without a page open to Wikipedia. I can think of several Googlers (Brin, Page, Schmidt, Levinson) and several folks from Microsoft (Gates, Ballmer, Allen), but nobody at Apple.
    Art and design have hardly ever changed. The reason these famous designers work (or, in Microsoft’s case of late, don’t work) is that they are all archetypes of familiar group dynamics. The Greek physician Galen believed that one’s personality was governed by a mixture of four bodily fluids: blood, yellow bile, black bile, and phlegm. An abundance of one of these four humors yielded a certain temperament. Y’know what, Cracked explains it better than I do:
  2. After Hours – Which Ninja Turtle Are You? Life’s Most Important Question.
  3. So here is my take on the managerial styles of Microsoft and Apple, with a special bonus section for Zuck and Facebook.

    Microsoft found itself facing down higher stakes and longer odds, striving for evolution while Jobs swung for revolution. The Ballmer era has been defined by a toxic corporate structure which ground itself into dust defending its twin bastions of Windows and Office. After the untimely death of my Windows 7 PC, I’ve been using Honeycomb and Ice Cream Sandwich on my tablet (and a handful of office suites that have completely and utterly failed to meet my expectations), and I realize why Microsoft’s product has defined so much of the x86 landscape. On the other hand, it’s not hard to see the more recent gaps in reasoning and corporate groupthink that led to some really questionable decisions in M$-land: The multi-headed suck-and-failbeast of Zune, Windows Vista, Internet Explorer, and Bing? Frustrating design decisions like UAC? The Ribbon in Office and now Windows? The Metro UI in Windows 8? Man, Microsoft; all your new directions are dragging me down. You should just go back to cribbing innovation from Apple. (Caveat: The Mac wouldn’t exist without Jobs visiting Xerox Palo Alto Research Center and seeing their early PC, the Alto.)

    I’ll admit that my relationship with Windows can feel a little abusive sometimes. All my games–Steam, a couple Humble Bundles, a smattering of Telltale games, and more homebrew than you can shake a stick at–are on Microsoft platforms. With no Storify app or mobile interface, I’m literally using my Android tablet to prop up my old Dell XP warhorse so that its tiny, badly designed fans don’t fall behind demand, overheat, and shut the laptop off with minimal warning. I’ve got a couple hundred dollars in Rock Band DLC I’ve accrued over five years that will never leave the Xbox, so I’m stuck with that system if I ever want to play with the music (and other downloaded games) I paid for.
  4. So while I criticize Apple for their walled gardens, Microsoft can be just as guilty when it comes to the sandbox model on their own hardware. The Xbox had a USB 1.1 memory card and controller that both used proprietary connectors to milk money out of end-users, required a $30 accessory to access its DVD player function, and charged subscription rates for Xbox Live 1.0’s online multiplayer with no storefront. Even after nVidia played hardball by holding them to a contractually negotiated price on GeForce3 video cards after costs had significantly decreased, Microsoft failed to learn an object lesson, instead punishing nVidia by intentionally crippling DirectX 11 support in Windows.

    In some ways, the Xbox 360 was even more of a disastrous by-product of design by committee: even more comically overpriced proprietary solutions (its undersized 2.5″ SATA hard drive, its wireless controllers and headsets, its Wi-Fi adapter), a stubborn refusal to allow USB thumb drives as storage until a firmware update in 2010, the abstract fuzzy math of Microsoft Points (80 for $1.00 when Nintendo and Sony both price their online products 1:1), a head-scratchingly bizarre DRM scheme that has locked me out of my own stuff because I didn’t have Internet, and worst of all: pushing the 360 out early, with several fatal design flaws and cut corners, to compete with Sony’s technologically superior PlayStation 3. I have two dead Xboxes in my closet–one by video chip, and the other by Red Ring of Death, both outside their respective warranty periods. I can honestly say that in twenty-three years of gaming, that’s the first time that’s ever happened to me.
  5. But now it’s Jobs and Apple’s turn. I have owned exactly two Apple products in my life, both of them iPod Touches (a 2G and a 3G), because I was on Sprint before they got the iPhone. I’ll admit that they were pretty awesome. The 2G was my Bat-Utility Belt, easily jailbroken and with some incredible features and abilities above and beyond what Apple had intended, with a customized home screen of 5×5 icons (which I apparently still won’t be getting on my Android phone until Jelly Bean). They also both had defective headphone jacks that gave out after a year, and were stolen before any of the other myriad problems with Apple’s closed-design philosophy (like batteries) could manifest themselves. The 3G, I didn’t even bother with. A tethered jailbreak made Apple-approved apps crash before they could load. Sure, going from iOS 2 up to 5 made it a lot more tolerable, but after building your own PC you come to appreciate the value of open design. Did I “love” my iTouch, Dr. Royal? Yes, I did–the 2G, the one that I broke open with a digital crowbar; the one that was MINE, not Jobs’. The 3G, which I never properly cracked, never really felt like mine. When it was stolen, I cursed the neighborhood and forgetting to lock the door, instead of endlessly lamenting that “my baby” had been taken from me. Like my stereo, it was mine, it played my music, and it was important to me–but not irreplaceable.

    I don’t feel any kind of personal connection with Apple devices. I consider them to be the gold standard for multitouch input, their hardware designs are always sleek, stylish, and functional (but man, do you pay for it), and they offer limited flavors of highly refined, polished software design. But Steve’s way is not my way, so he didn’t care about me. In fact, when the Librarian of Congress put out the 2010 triennial update to the Digital Millennium Copyright Act rulemaking, adding jailbroken OSes to the list of non-infringing uses of copyrighted works, Apple responded thus: “Apple’s goal has always been to insure that our customers have a great experience with their iPhone and we know that jailbreaking can severely degrade the experience. As we’ve said before, the vast majority of customers do not jailbreak their iPhones as this can violate the warranty and can cause the iPhone to become unstable and not work reliably.”
  6. Are you kidding me? Honestly, my iTouch 3G crashed and burned more often on official firmware updates than it ever did in my jailbreaking days. I had to learn the hard way to create a playlist and then drag THAT to my device so I didn’t have to spend four hours forced to use the awful, slow Windows version of iTunes to reload 19 GB of music, one album or track at a time, just for having the audacity to upgrade to the latest version of iOS. You needed additional software to pull your own music out from inside the device, instead of the open file structure of every other MP3 player I’ve ever owned.

    Jobs was undoubtedly a visionary genius. He was the biggest triple threat since Oprah. He was also a relentless, iron-fisted monomaniac. But the one thing I can’t call him is a hacker. A real hacker would try to defeat the opposition through coding a better mousetrap, not suing to recover the cheese.
  7. Tim Cook continues Jobs’s “thermonuclear war” on Android, which I’m afraid will turn into mutually assured destruction in the mobile landscape.

    Facebook has an interesting mixture of what I don’t like about both. Although run by hackers, it has an App Store mentality, whereas I prefer the open Web. Instead of trying to innovate, they try to dominate, and then the better part of a billion digital citizens have to accept whatever changes they are handed, for better and (more often) for worse. One of my personal highlights from the build-up to the IPO was Zuckerberg trying to explain the hacker ethos to a bunch of suits and ties from Wall Street. These people can only conceptualize an economy of scarcity. Even someone like Zuck, with a history of building castles in the sky, still looks crazy when he shows up on CNN looking like a scraggly hipster.
    Google hired Eric Schmidt because they figured that the talent needed a manager, just like a rock band. They were probably right.

Free the Long Tail!

I find it equally annoying and reassuring when I read an article by a respected industry insider that reads like one of my own sermons on the topic. Annoying because, well, I’m a grad student trying to make my bones when I obviously know my stuff. (I will never forget the compliment Jacie Yang gave me on a break one night in Fall 2011: “What are you even doing here?!”) Reassuring because, well, I can completely forgive my ego if the right message and right philosophies are being taught.

The Internet, the Cloud, and all the current trends in tech are all old pipe dreams of mine. I am perpetually indebted to the engineers, businessmen, and/or academics who made these things happen. But at the same time, the failure of the dominant paradigm to abandon its top-down model of control and pursue the future has been a constant source of dismay and depression.

For all his ambivalence about “sex sells” (which we covered earlier this minimester with Dr. Royal), Wired’s Chris Anderson has the right idea about scarcity, abundance, and the externalities governing both. There are a few points from his articles (which later became books) “The Long Tail” and “Free” that I would like to explore:

Scarcity of physical space on the shelf:

“Unlimited selection is revealing truths about what consumers want and how they want to get it in service after service… And the more they find, the more they like. As they wander further from the beaten path, they discover their taste is not as mainstream as they thought (or as they had been led to believe by marketing, a lack of alternatives, and a hit-driven culture)…  If the 20th- century entertainment industry was about hits, the 21st will be equally about misses. For too long we’ve been suffering the tyranny of lowest-common-denominator fare, subjected to brain-dead summer blockbusters and manufactured pop. Why? Economics. Many of our assumptions about popular taste are actually artifacts of poor supply-and-demand matching – a market response to inefficient distribution…. We equate mass market with quality and demand, when in fact it often just represents familiarity, savvy advertising, and broad if somewhat shallow appeal. What do we really want? We’re only just discovering, but it clearly starts with more.” (“The Long Tail”)

“Hit-driven economics is a creation of an age without enough room to carry everything for everybody. Not enough shelf space for all the CDs, DVDs, and games produced. Not enough screens to show all the available movies. Not enough channels to broadcast all the TV programs, not enough radio waves to play all the music created, and not enough hours in the day to squeeze everything out through either of those sets of slots. This is the world of scarcity. Now, with online distribution and retail, we are entering a world of abundance. And the differences are profound.” (“The Long Tail”)

Channel conflict, whether it’s about the physical space required for old-media commerce or between new vs. old media:

“[Physical media put] two dramatic limitations on our entertainment:
The first is the need to find local audiences. [Retailers] can pull only from a limited local population… It’s not enough for a great documentary to have a potential national audience of half a million; what matters is how many it has in the northern part of Rockville, Maryland, and among the mall shoppers of Walnut Creek, California… In the tyranny of physical space, an audience too thinly spread is the same as no audience at all.
The other constraint of the physical world is physics itself. The radio spectrum can carry only so many stations, and a coaxial cable so many TV channels. And, of course, there are only 24 hours a day of programming. The curse of broadcast technologies is that they are profligate users of limited resources. The result is yet another instance of having to aggregate large audiences in one geographic area – another high bar, above which only a fraction of potential content rises.” (“The Long Tail”)

“Surprisingly enough, there’s been little good economic analysis on what the right price for online music should be. The main reason for this is that pricing isn’t set by the market today but by the record label demi-cartel. Record companies charge a wholesale price of around 65 cents per track, leaving little room for price experimentation by the retailers. That wholesale price is set to roughly match the price of CDs, to avoid dreaded “channel conflict.” The labels fear that if they price online music lower, their CD retailers (still the vast majority of the business) will revolt or, more likely, go out of business even more quickly than they already are… But what if the record labels stopped playing defense? A brave new look at the economics of music would calculate what it really costs to simply put a song on an iTunes server and adjust pricing accordingly. The results are surprising… it should cost just 79 cents a track, reflecting the savings of digital delivery. Putting channel conflict aside for the moment, if the incremental cost of making content that was originally produced for physical distribution available online is low, the price should be, too. Price according to digital costs, not physical ones.” (“The Long Tail”)

“In the monetary economy it all looks free — indeed, in the monetary economy it looks like unfair competition — but that says more about our shortsighted ways of measuring value than it does about the worth of what’s created.” (Free)

On the fundamental shift from scarcity to abundance created by the Long Tail model:

“The rise of ‘freeconomics’ is being driven by the underlying technologies that power the Web. Just as Moore’s law dictates that a unit of processing power halves in price every 18 months, the price of bandwidth and storage is dropping even faster. Which is to say, the trend lines that determine the cost of doing business online all point the same way: to zero. But tell that to the poor CIO who just shelled out six figures to buy another rack of servers. Technology sure doesn’t feel free when you’re buying it by the gross. Yet… [i]t’s not about the cost of the equipment in the racks at the data center; it’s about what that equipment can do. And every year, like some sort of magic clockwork, it does more and more for less and less, bringing the marginal costs of technology in the units that we individuals consume closer to zero.” (Free)

“What’s interesting is that transistors (or storage, or bandwidth) don’t have to be completely free to invoke this effect. At a certain point, they’re cheap enough to be safely disregarded… In Zeno’s dichotomy paradox, you run toward a wall. As you run, you halve the distance to the wall, then halve it again, and so on. But if you continue to subdivide space forever, how can you ever actually reach the wall? (The answer is that you can’t: Once you’re within a few nanometers, atomic repulsion forces become too strong for you to get any closer.)…[So when do costs] come close enough to zero to say that you’ve arrived and can safely round down to nothing? The answer: almost always sooner than you think…
“From the consumer’s perspective, though, there is a huge difference between cheap and free. Give a product away and it can go viral. Charge a single cent for it and you’re in an entirely different business, one of clawing and scratching for every customer. The psychology of “free” is powerful indeed, as any marketer will tell you. This difference between cheap and free is what venture capitalist Josh Kopelman calls the “penny gap.” People think demand is elastic and that volume falls in a straight line as price rises, but the truth is that zero is one market and any other price is another. In many cases, that’s the difference between a great market and none at all. The huge psychological gap between “almost zero” and “zero” is why micropayments failed. It’s why Google doesn’t show up on your credit card. It’s why modern Web companies don’t charge their users anything. And it’s why Yahoo gives away disk drive space. The question of infinite storage was not if but when. The winners made their stuff free first.” (Free)
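Anderson’s “sooner than you think” claim is easy to sanity-check. Here’s a minimal sketch in Python of his halving-cost trend line; my starting cost and “cheap enough to disregard” threshold are made up, while the 18-month halving period is his:

```python
# A toy model of Anderson's "freeconomics": a resource whose unit cost
# halves every 18 months, Moore's-law style. Starting numbers are hypothetical.

cost = 1.00          # dollars per unit today (made up)
threshold = 0.0001   # "cheap enough to safely disregard" (also made up)

months = 0
while cost > threshold:
    cost /= 2        # one halving period
    months += 18

print(f"Cost rounds down to ~zero after {months / 12:.1f} years")
# About 21 years from $1 to a hundredth of a cent; start at a dime
# and it's only 15. The trend line points at zero either way.
```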

My own opinion: The Problem

Ever-growing copyright terms are chopping out the middle of the Long Tail. This creep on the public domain is destroying any legitimate form of participatory culture built on some of our most cherished stories and characters. If comic books continue the Greek epic tradition, then they are, in effect, our new Aesop. People will work for free if you let them, and put out some really great stuff. (Look at Wikipedia!) The safe play of these necessarily conservative top-down media empires is why every movie has a number at the end (or a subtitle, if they’re worried that the number is high enough to inspire widespread cynicism). Why not let a Star Wars fan direct a Star Trek movie? It worked great for J.J. Abrams. Why not let a DJ cut together a pastiche of old beats and licks to create something else entirely, and how is that a cause for alarm if they do it with a keyboard and mouse instead of a guitar and microphone?

If you love it, set it free. Just sayin’.

“Any damned fool can give you answers. It takes a genius to think of the right questions.”

I will say it until people start taking me seriously: The service with the greatest potential value in the new information economy is curation. Here is how it applies to the news.

Convergence. That big buzzword. But here, it takes on a special meaning: the various hats worn by any member of your newsroom staff. Ideally, they should possess a degree of literacy, research savvy, and artistic sensibility that will allow them to not only tell the story, but to make sure that it is heard. As Dr. Royal stated before (and again in this article), these hybrid kids know that all knowledge is just data to be presented in one form or another, and the best know how to make it happen using multiple media.

Going over reams of data is nothing new to me. When I was responsible for cleansing mailing lists for direct marketing campaigns, I would throw on some headphones and use Excel to reduce massive CSVs full of public records down to a list of 10,000 to send off to the print shop for form letters and envelopes. Now, Sir Tim Berners-Lee thinks that my skillset will be useful in finding the story beneath the rhetoric. One of the greatest quandaries facing any modern journalist is wading through the great crapflood of data in everyday life. Google’s great contribution to humanity is its peerless search engine, which gives anyone with a computer a machete to cut through the vines and brambles of the untamed digital frontier.
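For the curious, the modern version of that Excel grind fits in a few lines of Python. Here’s a minimal sketch of that kind of list cleansing; the file and column names are hypothetical:

```python
# A minimal sketch of mailing-list cleansing with pandas.
# File and column names are hypothetical; the real lists were public records.
import pandas as pd

df = pd.read_csv("public_records.csv")

# Drop rows missing a deliverable address first
df = df.dropna(subset=["street", "zip"])

# Normalize the fields we'll dedupe on, so "SMITH, JOHN " matches "smith, john"
for col in ["name", "street", "zip"]:
    df[col] = df[col].astype(str).str.strip().str.lower()

# Drop exact duplicates, then cap the run for the print shop
df = df.drop_duplicates(subset=["name", "street", "zip"])
df.head(10_000).to_csv("form_letter_batch.csv", index=False)
```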

There is an eminently frightening Tony Blair quote that I believe bears repeating from that article:
“Freedom of information. Three harmless words. I look at those words as I write them, and feel like shaking my head till it drops off my shoulders. You idiot. You naive, foolish, irresponsible nincompoop. There is really no description of stupidity, no matter how vivid, that is adequate. I quake at the imbecility of it …
“The truth is that the FOI Act isn’t used, for the most part, by ‘the people’. It’s used by journalists. For political leaders, it’s like saying to someone who is hitting you over the head with a stick, ‘Hey, try this instead,’ and handing them a mallet.
“But another and much more important reason why it is a dangerous act is that governments, like any other organisations, need to be able to debate, discuss and decide issues with a reasonable level of confidentiality. … Without the confidentiality, people are inhibited and the consideration of options is limited in a way that isn’t conducive to good decision-making.”

Granted, he might have a point: on the flipside, there is a lot of room for error when attempting to interpret and arrange raw data, and people will bring their confirmation biases along when it comes time to crunch the numbers. Just look at the Romney campaign’s repeated attempts to insist that “you didn’t build that” by beating an Obama soundbite–which, under a broader, more modern, digital definition of “data,” is indeed just that–like a rented mule. One of my favorite lines from the original Men in Black (1997) comes when Jay (Will Smith) asks Kay (Tommy Lee Jones) about the clandestine nature of the organization and its mission:
“Why the big secret? People are smart. They can handle it.”
“A person is smart. People are dumb, panicky dangerous animals–and you know it.”

The future is in the hands of those who will fund, create, and release the tools to cut through these swaths of data. This Times article begins with the following:
“In an uncharted world of boundless data, information designers are our new navigators. They are computer scientists, statisticians, graphic designers, producers and cartographers who map entire oceans of data and turn them into innovative visual displays, like rich graphs and charts, that help both companies and consumers cut through the clutter. These gurus of visual analytics are making interactive data synonymous with attractive data. ‘Statistics,’ says Dr. Hans Rosling, a professor of international health at the Karolinska Institute in Sweden, ‘is now the sexiest subject around.'”

The Texas Tribune has made its name thanks to Matt Stiles and Niran Babalola’s tireless work on their amazing public records databases. I remember watching Babalola demonstrate some of TT’s toys at Mass Comm Week and thinking “Man, I remember when I had to teach my databases those tricks with CSVs. And it didn’t look nearly that nice.”

Just make sure you never lose sight of any of your “hats.”
“…[P]ractitioners, and the growing variety of users working with these kinds of visualizations, need to consider and ask fundamental questions about the full process that determines what data gets collected, stored, processed, and ultimately displayed. Otherwise, they become part of the problem of misinterpreting data rather [than] helping to make it clearer and more meaningful.”

The Future of Music

Big Content just doesn’t get it, and Spotify was the Duke Nukem Forever of streaming audio for quite some time.

“[T]hanks to resistant labels and archaic rights systems, Spotify isn’t available in the US. And this isn’t just a problem for American music fans. It’s also a problem for Spotify—the US is the world’s largest music market, bigger than France, Germany, Sweden, Spain, Italy, and the UK combined. Spotify’s lack of a US footprint may even be a problem for the music industry, which has struggled to find new ways to capitalize on digital music and finds itself increasingly crushed under the heavy boot of iTunes.”

Let’s completely ignore today’s discussion of SOPA/PIPA/CISPA/ACTA and the rest of that alphabet soup of poorly-thought-out, Hollywood-backed legislation. Let’s look at the effects that current policy is having on the talent.

The RIAA sob story is easily nailed to the wall when you reverse-engineer their favorite statistics and add a few more of your own. Specifically, you have to know what counts as a “purchase.” Renting music from Spotify is not a purchase, nor is streaming it from online radio. Direct merch sales from artists, Rock Band and Guitar Hero DLC, royalties, the door at a show, and ringtones don’t factor into that figure, even though most would agree that they are no less a part of the music business these days. The truth, says Jeff Price, is this:

  • More musicians are making money off their music now than at any point in history.
  • The cost of buying music has gotten lower but the amount of money going into the artist’s pocket has increased.
  • There are more people listening, sharing, buying, monetizing, stealing and engaging with music than at any other point in history.
  • There are more ways for an artist to get heard, become famous and make a living off their music now than at any point in the history of this planet.
  • Technology has made it possible for any artist to get distribution, to get discovered, to pursue his/her dreams with no company or person out there making the editorial decision that they are not allowed “in”.
  • The majority of music now being created and distributed is happening outside of the “traditional” system.
  • And to reiterate, sales are up…

The unsung victims of industry practices are the creative team behind the process–the very people the industry tries to claim as victims when debating the pros and cons of industry-backed legislation. The creatives always have interesting things to say about the business of their business, and I particularly appreciate the words of OK Go’s Damian Kulash:

“Music is getting harder to define again. It’s becoming more of an experience and less of an object. Without records as clearly delineated receptacles of value, last century’s rules—both industrial and creative—are out the window. For those who can find an audience or a paycheck outside the traditional system, this can mean blessed freedom from the music industry’s gatekeepers.” link

“Technically, [shooting our own video and putting it on YouTube] put us afoul of our contract, since we need our record company’s approval to distribute copies of the songs that they finance. It also exposed YouTube to all sorts of liability for streaming an EMI recording across the globe. But back then record companies saw videos as advertisements, so if my band wanted to produce them, and if YouTube wanted to help people watch them, EMI wasn’t going to get in the way… To the record company, it was a successful, completely free advertisement… But the fans and bloggers who helped spread ‘Here It Goes Again’ across the Internet can no longer do what they did before, because our record company has blocked them from embedding our video on their sites. Believe it or not, in the four years since our treadmill dance got such attention, YouTube and EMI have actually made it harder to share our videos.” link

These towers of old media need to figure themselves out pretty quickly before they start to crumble at the foundations. The grassroots are going to rise up through the cracks either way. Kickstarter has proven to be an amazing tool for financing a labor of love; I recently threw fifteen bucks at Tim Schafer and his amazing Double Fine Productions to not only play their game, but watch the documentary about their creative process–which has so far proven to be the most interesting part of the entire purchase.

Curation: The Next Wave of Social Media

I don’t know if you’ve noticed, but there is a lot of crap on the Internet. Whether you take that to mean clutter or excrement, the sentiment is hard to question. The proliferation of communications tools (hardware, software, and the culture surrounding both) has rendered information scarcity a willful act of protest in the postmodern age–whether it takes the form of “no comment” from producers or “TL;DR” (Too Long; Didn’t Read) from consumers.

When I first heard of Twitter back in 2007, I thought it was the dumbest thing I had ever heard of, having just begun to embrace SMS over my initial misgivings about the short form and the questionable economics of texting plans. That didn’t stop me from claiming @capnoblivious (even if I had no intent of using the service); after all, I was serious about the idea of lesser Cap’ns Oblivious in my presentation–inferior pretenders attempting to pass themselves off as the genuine article. But now, something has changed. Maybe the years of surfing imageboards and news site headlines were the red pill I needed to understand the utility and economy of Twitter. Now it’s kind of an addiction.

Last weekend, the only apparently relevant bit of news in the world was the shooting in Aurora, Colorado, and I fought back with every spare thought I had on the subject of media sensationalism, growing more upset with each hour that the networks continued their dead-horse beating with nothing new to contribute to the ongoing narrative. It was a familiar theme, mirroring my disgust with the endless tape loop of the Twin Towers collapsing into the street (to say nothing of the real-time rebroadcast of the news coverage on the tenth anniversary of the tragedy). Considering the 24/7 coverage to be beyond the pale for prurient interest, I tweeted: “Shame on the news channels. At the point I know everything about this tragedy, it’s not news, it’s porn. #aurora #theatershooting #seriously”.

So we return to my original point about information overload. I watched CNN most of Friday, knowing that the way each network handled its coverage would earn it a Colbertesque hat tip or finger wag in the “media coverage” leg of my planned dissertation about moral panics. I realized, with no small amount of irony, that in our increasingly fragmented mediascape, a stop-the-presses event like Aurora completely breaks everything I love about Twitter. The whole point for me is to have a small collection of trusted experts in their respective fields to point me towards my manifold interests: video games, music, Constitutional law (especially on intellectual property, privacy, and free speech), quantum mechanics, space exploration, politics, and the mass media. When everybody is talking about the same thing, I’m left with the same flood of unwanted bad news as I would get watching television.

That, coupled with last week’s class discussions on virtual communities and social networking sites, got me thinking about what my favorite tools (Google, Facebook, Twitter, reddit) all have in common. The answer: curation of different sorts.

  • Google’s stated aspiration is to be the steward of the entire world’s information, using heuristic analysis and Web-wide referrals as its vetting process.
  • Facebook’s yardstick for relevance is the public and semi-public thoughts of your circle of friends. It has also borrowed…
  • Twitter’s concept of curation: one-to-many thoughts from influential or entertaining public figures or institutions. In events where Twitter shares a virtual presence with a physical event, the model changes to many-to-one. Think a SXSW keynote or a Twitter feed in the ticker of a news program (also ironically taken up by Facebook).
  • reddit provides you with a ready-made list of small communities when you sign up. Each subreddit’s members (ideally) use the system of up- and downvotes to indicate the relevance of each story (although that particular point of reddiquette is lost on many). A toy version of that vote math appears below.
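To illustrate the votes-as-curation idea, here’s a toy ranking function in Python. It is not reddit’s actual algorithm, just a sketch of the general shape: a post’s vote signal is damped logarithmically (the 500th net upvote matters less than the 5th) and decays with age, so fresh, well-received stories float to the top:

```python
# A toy reddit-style "hot" ranking: NOT reddit's real algorithm, just an
# illustration of votes-as-curation, where the crowd's clicks do the editing.
import math
import time

def hot_score(ups: int, downs: int, posted_at: float, half_life_hours: float = 12.0) -> float:
    """Vote signal, damped logarithmically, decaying with age."""
    score = ups - downs
    signal = math.copysign(math.log10(max(abs(score), 1)), score)
    age_hours = (time.time() - posted_at) / 3600.0
    return signal * 0.5 ** (age_hours / half_life_hours)

now = time.time()
# Hypothetical front-page candidates: (upvotes, downvotes, hours since posted)
posts = {"fresh scoop": (500, 50, 1), "old classic": (5000, 400, 48), "spam": (3, 40, 1)}

ranked = sorted(posts, reverse=True,
                key=lambda p: hot_score(posts[p][0], posts[p][1], now - posts[p][2] * 3600))
print(ranked)  # ['fresh scoop', 'old classic', 'spam']
```

The design point: no editor ever touches that list. The community’s votes are the editorial judgment, for better or (per the lost reddiquette) worse.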

Clay Shirky, in this Nieman Labs article, shows the modern evolution of the famous Stewart Brand aphorism “Information wants to be free.” The relevant quote: “[A]bundance changes the value proposition of media as a resource. ‘Scarcity is easier to deal with than abundance,’ Shirky points out, ‘because when something becomes rare, we simply think it more valuable than it was before, a conceptually easy change.’ But ‘abundance is different: its advent means we can start treating previously valuable things as if they were cheap enough to waste, which is to say cheap enough to experiment with.’ Cognitive Surplus, in other words — the book, and the concept it’s named for — pivots on paradox: The more abundant our media, the less specific value we’ll place on it, and, therefore, the more generally valuable it will become. We have to be willing to waste our informational resources in order to preserve them. If you love something…set it free.” [author’s emphasis]

As Dr. Royal has said on numerous occasions, the thing you curate most with these services is your own brand identity. I wouldn’t have known who Neil deGrasse Tyson was without his numerous (and always amusing) Stewart/Colbert appearances. That kind of performance means he’s a like-minded individual I wouldn’t mind listening to as often as he speaks. Do yourself a favor and aspire to the same.

“I’ve used The Google.”

We took our break yesterday right in the middle of my rant about product placement, so of course it was on everyone’s mind as we walked out into the hall. I forgot one of my recent favorites (above), which was Microsoft having Peter Parker look for his father on Bing. Let’s be completely honest: the young nerd behind Spidey’s mask wouldn’t be caught dead using Bing–a service that so pales in comparison to Google, it has become a laughingstock in a number of virtual communities.

Considering how much I know about the Internet, it turns out that one of the holes in my mental encyclopedia—and something, to my shame, that I hadn’t much contemplated—is “How exactly DOES Google make their money? It’s ads or something, right?” The readings for class have, thankfully, started to give me a picture of how and why they became the industry leader—half explicitly, in pieces about Google, and half implicitly, talking about the failings of Yahoo! and Bing.

On Yahoo:

“The portals and search sites figured out that the sponsored links could be placed alongside a more objective set of search results. It was a brilliant way to turn searches into revenue. Google saw the power of this approach and decided to grow its own. Engineers at Google took the concept of pay-per-click search results and in 2002 turned it into a smooth-running, money-printing machine called AdWords. The company developed an automated process for advertisers to bid on keywords… By the time Google published its financial statements for the first time in 2004, everyone knew that the company had harnessed one of the great innovations of the Internet age.”
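The keyword bidding that passage glosses over is, at heart, an auction. Here’s a minimal sketch of a generalized second-price auction, the mechanism AdWords has been widely described as using; this is simplified, since the real auction also weighs ad quality scores, reserve prices, and more:

```python
# A minimal sketch of a generalized second-price (GSP) keyword auction.
# Simplified relative to real ad auctions, which also factor in quality
# scores and reserve prices; advertiser names and bids are hypothetical.

def run_auction(bids: dict[str, float], slots: int) -> list[tuple[str, float]]:
    """Rank by bid; each winner pays the bid of the advertiser just below."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for i in range(min(slots, len(ranked) - 1)):
        advertiser = ranked[i][0]
        price = ranked[i + 1][1]  # pays just enough to beat the runner-up
        results.append((advertiser, price))
    return results

# Hypothetical per-click bids on a single keyword, with two ad slots
bids = {"acme": 1.50, "globex": 1.20, "initech": 0.80}
print(run_auction(bids, slots=2))
# [('acme', 1.2), ('globex', 0.8)] -- acme bids $1.50 but pays $1.20 per click
```

The second-price trick is why the machine runs so smoothly: since you pay the runner-up’s price rather than your own, the safest strategy is simply to bid what a click is actually worth to you.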

“Yahoo was the quintessential Silicon Valley startup. Founded by Stanford engineering students Jerry Yang and David Filo in a campus trailer in 1994, bootstrap innovation and hardcore coding were etched into its corporate DNA. Semel set out to modify that. He had to do a lot of internal politicking to convince his engineers that adapting existing technology from Inktomi and Overture was better than building their own versions from scratch. One way to convince the staff was to appeal to their killer instincts. So after both deals were announced, Semel promised that Yahoo would merge the three to produce a certifiable Google slayer… [But Overture] had been created in a hurry during the boom, and it wasn’t built to work on a global scale. Also, because of the way it was designed — to allow human review of each ad — it was painfully slow compared to Google.”

“The truth is that when [former Yahoo! CEO Terry] Semel worked in Hollywood, he understood more about how movies and TV shows made it to theaters and TV sets than virtually anyone else on the planet… When Semel became co-CEO of Warner Bros. in the early 1980s, he was steeped in the marketing and distribution plumbing of Hollywood. So it’s no surprise, in retrospect, that his legacy is as one of Hollywood’s biggest innovators and risk takers… But now, despite Semel’s achievements in Hollywood and early success at Yahoo, Silicon Valley is buzzing with a familiar refrain: Wouldn’t an executive with a little more technology savvy be a better fit? …[N]ow we have empirical evidence: At Yahoo, the marketers rule, and at Google the engineers rule. And for that, Yahoo is finally paying the price.”

On Bing:

“Privately, Googlers will tell you that the Bing ads rankle. They describe them as misleading and unfair, painting a picture of Google that doesn’t match reality. Maybe, but Microsoft — a company not previously known for its marketing savvy — is taking a page out of a 1960s Procter & Gamble playbook: create a problem consumers don’t know they have, then solve it.”

(This, by the way, was before the thing even launched: its search results, compared to Google’s, are in my humble opinion abysmal.)

“Tale of the Tape”
Google:
-97% of its revenue is from online ads. Everything else is a hobby
-Ignore Bing for now and focus on making Google even better
-Google is losing its halo as it expands into phones and operating systems

Microsoft:
-Windows and Office rule. It needs another big revenue generator
-Bing is spending $100 million to get you to try its “decision engine”
-No one ever loved Microsoft. Bing could help soften its tech-demon image

Those last points–the cracks coming up through “Don’t Be Evil” as they take on Apple in the mobile realm, a few missteps on privacy–have made even me start to question what Google could do if they turned to the Dark Side. I unapologetically admit that Google straight-up owns me, and I keep a close eye on what they do. But the reason they own me is another matter entirely–in much the same way that Dr. Royal is an unapologetic Apple cultist, I say that Google’s cloud simply works for my lifestyle, and I only question it when the cracks start showing.

Google’s real-time ability to serve up and price ads and analytics is unprecedented. The old language of advertising and marketing has been forever changed by their approach. Instead of hoping someone buys your product the morning after seeing your ad on the Tonight Show, the 24/7/365 nature of cyberspace has allowed people to get their order in before the store even opens. And to be able to offer their consumer products for free, they’ve obviously found the secret recipe for pricing their services to advertisers and enterprise clients.

And for now, that’s good enough for me. Stay good, Google. I’m counting on you.

21st Century Digital Boy/Girl, Virtual Community

By some accident of the alphabet on the roll sheet, my esteemed colleague Ashley Goode and I are presenting this week–and what a lineup it is. I asked her if she wanted to take virtual communities or gender in new media, and she chose the latter–which is great, since I’ve been a part of some virtual community or another since adolescence. The biggest security blanket I have in a virtual space is the “Enter” key, which gives me a chance to see what I want to say on the page before I say it out loud. But for some, it provides new layers of identity that are not otherwise possible in the physical realm.

A friend of mine–to respect anonymity, we shall use the pseudonym “Jack”–has attempted to use the Internet to untangle his rather complicated personal identity. Jack, you see, is transgendered, but came from a home environment where even the thought of exploring such an unorthodox thing was anathema. He found solace in a place where he could present what he believed to be his true self: a short, spunky little girl persona I will call “Jill.” Jack and Jill paradoxically occupy the same male body at different times, depending on mood and other environmental factors, but roughly approximate his Freudian id and superego (instead of disparate personalities that take total control). Jack got lucky in a way a lot of LGBT kids don’t: he can pass for straight, which spared him untold horrors in high school and his personal life.

Jack has always played as female characters in videogames–not because he wants the eye candy, like the stereotypical male gamer (to say nothing of the people who pander to them), but because Jill is the fearless part of his soul, the one he would send out to do battle. I was telling him about this presentation, and he told me that he had recently worked up the courage to discuss these issues with his therapist, to post in online communities under a second, throwaway identity (in reddit LGBT groups), and to be more forthcoming with his friends instead of just hiding to fit in. For my part, I’m so proud of him. Jack has a long way to go, and he’s still not quite sure how to deal with Jill in his daily life–I asked him what personal pronoun to use here, and he said, with a note of dejection, “Default to the boy name and pronoun so people can keep up with the story.” I gave him a hug, because I’ve seen him quietly deal with this for a decade now.

“If women have to face the glass ceiling on a daily basis,” he said, “the cultural preconceptions against a man who wishes to be regarded as a woman circumscribe the worst parts of both ends of the sexual double-standard.” We watched Sheryl Sandberg’s TED talk together, and I saw him nodding in solemn agreement with her calls to be more assertive and engaged–things that Jack has habitually lacked in life. I think that for some, this kind of virtual interaction is a buffer between a misunderstood boy and the world that misunderstands him. I hope that this somehow brings him closer to what he’s after.