Marathon bombing leaves cracks in the “new news ecosystem”

When news of the Boston Marathon bombing broke on Monday, my reaction was worthy of one of B.F. Skinner's pigeons: I wanted to press the lever that delivered a newspaper – one of those thrilling extras we used to put out when big news broke in the morning back in the '80s and '90s.

But since nobody puts extras out anymore (just 10 or so years after my former employer, the Boston Herald, printed its last, the notion already seems as quaint – and as dead – as the paperboy), I did what I thought was the next best thing: I went to the Boston papers' websites, bostonherald.com and boston.com. The Herald's website was accessible, but clearly the staff was scrambling. A half-hour after the first bomb went off at 2:50 p.m., a breaking news banner led to a jumbled, four-paragraph story. The Globe's website had crashed entirely.

As I looked up, stunned, a classmate had jumped onto Twitter, and began reading off posts that unfolded like a series of news bulletins, hundreds pouring in every minute, providing everything that the newspaper sites could not.

I'm afraid I sound like a naïf here, but I'm really not. I'm an old-media guy by training and experience for sure, but I've been on Twitter for a couple of years. I understand the role it has played in breaking-news stories since the earthquake in China in 2008 and especially during the Arab Spring that bloomed two years later. I could even pick Andy Carvin out of a crowd. But this was the first time I saw its incredible power to report a story in which I was personally invested – and how lame it made big media look moment by moment. For me, things have changed, and for the better.

Of course, no good in the new-media world is unalloyed. As usual, Twitter was distinguished in part by the amount of bad information it put out along the way. As Slate’s Jeremy Stahl later said, if newspapers provide what Washington Post Publisher Philip L. Graham called “the first rough draft of history,” then Twitter “is the first rough draft of journalism.” And, as anyone who has ever produced one knows, first drafts are messy.

But so what? In this, Twitter and other social media outlets were hardly alone. Perhaps in an effort to keep up with the speed of developments on social media, old-media outlets, ranging from CNN to Fox, from the Globe itself to the New York Post, made egregious errors along the way that they were forced to walk back, often grudgingly. Some of the mistakes were never corrected. As radio and TV reporters have known for decades, the immediacy/accuracy ratio is thornier than any quadratic equation.

But here’s where it gets weird. Not content merely to provide breaking-news developments, social media denizens began to dabble in press criticism, pointing out the mainstream media’s mistakes with all the gleeful high-school snark we have come to expect from the medium. One commentator went so far as to celebrate social media’s new role as ombudsman. Excuse me, but it’s hard to imagine the Times’ Margaret Sullivan pointing out other outlets’ mistakes without taking note of those made by her own. The other way around is more often the case, in fact.

This phenomenon reached the Everest of absurdity on Reddit, where would-be critics of mainstream journalism raged about the New York Post’s decision to splash with a photo of two men (not the brothers later ID’d as the bombers) allegedly sought for questioning by investigators, charging that the paper had unfairly stigmatized, and possibly even libeled, the pair. Flash forward a day, and posters to Reddit named as a suspect a Brown University student who almost assuredly had nothing to do with the blast. What’s wrong with this picture?

What’s wrong with this picture is that both social media and the mainstream media need to understand their new roles in this part of what Jeff Jarvis has called the “new news ecosystem.” Social media gives us a speedy, crowdsourced first draft of breaking news. As such, errors are to be expected, although restraint and discretion about what to post and re-post should be exercised, as Carvin does so well.

The mainstream media provides the context, analysis, depth and breadth that social media by definition cannot. If it wishes to retain the credibility that is one of its natural advantages, the mainstream media needs to be far more careful about competing with social media for immediacy, because errors frozen in print or on tape are more apparent and far tougher to walk back.

For their part, social media natives need to understand that they are part of an ecosystem that includes mainstream media. Where would social media be, for example, without the reporting of the mainstream to post? Instead of behaving like a high-school clique bent on ridiculing the errors of old-media outlets, they need to grow up, put aside the snark and self-righteousness, and strive to improve at doing what they do best. To do otherwise is to display all the intelligence of a pigeon in a Skinner box.

This post brought to you by… me

Our spirited discussion about advertising with Lewis D’Vorkin yesterday reminded me that I’ve wanted to scratch the itch of sponsored content for a few weeks. Happily, the New York Times sanctioned such discussions the week before last, when it discovered – in much the same way Columbus discovered America, I guess – the controversy over this growing form of advertising.

In truth, the controversy’s been simmering for more than a year. And that’s as it should be. Because, done improperly, sponsored content threatens the only thing that old and new media alike really have to sell – and that’s credibility.

First, let's define the terms: sponsored content is exactly what it sounds like – a story, a photo, a chart, a video directly sponsored by an advertiser. Such pieces can be written and edited by the sponsor or by the outlet's staff on behalf of the sponsor. On this everyone agrees. It's not much different from the concept of "advertorial" in the old media world: a piece of print content, often an insert, sometimes a page or two of ROP, prepared by or in cooperation with the advertiser and clearly marked as such.

Notice that phrase "clearly marked as such." That's because this is where the agreement on sponsored content begins to break down these days. In the old media world, the marking would take the form of a narrow banner at the top of the page that said "advertorial" or "advertising." Papers took pains to ensure that readers were further tipped off by fonts, headline and layout styles that differed visibly from those employed in the news pages.

Why did papers go to the trouble? First, because they were aware of the eternal tension between advertisers and news organizations. Advertisers, naturally, want to get their messages across to consumers with the least possible amount of mediation. And, if they can, they also seek to appropriate some of the news organization's credibility by making their message resemble news content as much as possible.

This is undeniable. Some of the most brutal battles I had with my colleagues in advertising were over their requests that advertorial be made to look a little less like what it was, and a little more like news content. The nadir of this approach was the infamous grapefruit diet ad and others like it that some newspapers, including mine at the time, published in the '90s and 2000s. That was one fight I lost.

Forbes handles sponsored content well. If the banner designating it isn’t entirely intelligible, it is large, and there’s a clickable “what’s this?” link that explains it all very clearly. But I can’t say the same for Mashable, Buzzfeed and some other websites, where the game seems to be to cater to advertisers by giving them what they so badly wanted from newspaper advertorial back in the day – as little distinction between it and news content as possible.

Yes, it’s a new digital day, and a much more challenging environment than in the glory days of old media. But one thing hasn’t changed. Readers still value credibility highly when it comes to news websites. And advertisers want to be part of a site in part because it’s credible. For online news executives to squander this with vague justifications such as “content is content” and “readers don’t care where content comes from” is to sell, like Esau, their birthright for a mess of pottage.

Can you hear me now…?

When the blog posts and tweets started cresting last week celebrating the 40th anniversary of the cell phone, I didn’t know whether to sing a chorus of “Happy Birthday” or pitch my iPhone off the roof and then run it over with my car.

Cell phones have had that sort of effect on me almost from the start.

Though most writers were quick to offer the ritual top tens – worst handset, best handset – or the hoary foundational story of the first call, made from a Motorola DynaTAC (a brick-like appliance that made Maxwell Smart's shoe phone look sleek) on a Manhattan street corner in 1973, few dug beneath the surface to examine the impact of the cellphone on our lives.

Though it can hardly be overstated – cellphones, it could be reasonably argued, are the sine qua non of the modern world – that impact, I would suggest, is decidedly mixed. In truth, like Antony, I bury my cellphone much more often than I praise it.

It wasn't always this way. Unlike many of my youthful classmates, I not only remember the early cell phones, but I carried one of the first models to make a commercial splash in the late '80s – the unbelievably ungainly Motorola MicroTAC, which despite its optimistic nomenclature was about the size and weight of half a brick, available in any color as long as it was death-pallor gray, and with its whip antenna looked more like a walkie-talkie than the walkies it was replacing in the newsroom. When you loaded one of those babies into your inside breast pocket, it made your jacket sag so one-sidedly it looked like it had had a stroke.

But we loved them. Call anyone, anywhere, anytime. Reporters in the field. Wow. “Can you hear me now?” Your wife from the train. “I’m calling from a cellphone.” To repeat, wow.

Of course, that was before we realized that cellphones made you reachable at any time as well. Like while you were driving your car, formerly an inviolable oasis of solitude, or traveling, or in the grocery store, or during a hundred other formerly solitary activities that once defined daily life in the era of the landline. And once those boundaries of solitude were erased by the cellphone, it was only a matter of time until the appliance leveled the wall between work and play as well. Like the appliance to which they were increasingly tethered, workers were soon considered always on.

Thus, the era of cell-phone serfdom had begun. And, remember, this was well before email, broadband, mobile internet access, text messaging and social media made the cellphone even more essential – not just a communications tool but the device through which we have, for better or worse, come to access much of our world. Need evidence of the cell's centrality to our lives? Just watch the next time people you know almost leave the house without their cell phones but catch themselves. You would think they had almost left the house without pants.

I don’t want to throw the baby out with the bathwater here. Much of what I do for a living, as well as much of what I do for pleasure, would be infinitely more difficult without a cell phone, and sometimes I even appreciate mine. But you’ll forgive me if I sit out the candles and the birthday cake.

But I do have a birthday present for you, my little cell phone. Come with me. It’s up on the roof.

The days when every business broke guitars…

When Jeremy showed us the “United Breaks Guitars” movie during class the week before last, I had to laugh. Not just because it was a funny bit – although it was – but because there was a time when many businesses treated their customers in that way. Especially in media.

I know – I used to do it myself.

To recap: Band members traveling together to a gig on United looked out the windows of their plane before takeoff only to see United’s baggage handlers tossing their instruments around like the gorilla in the old American Tourister commercials.

Sure enough, the players found on arrival that their instruments had been damaged, thus beginning a customer-service nightmare that culminated in the short film we saw. Even after the film went viral, United’s response was tepid at best.

Like Jeff's well-documented experience in Dell Hell, this was the nature of much customer service in the era before the full bloom of social media. And this despite the lip service paid to such notions as "customer-facing companies" and hoary shibboleths such as "the customer is always right."

This was particularly true in media companies, where the lines of communication tended to run in one direction only and executives were unaccustomed to dealing with unhappy customers, who were treated as annoyances at best and marks at worst.

Home delivery customer service – the lifeblood of most newspapers — was comically bad, with papers delivered everywhere but where they were supposed to go, circulation directors who found a rich vein of humor in the resulting complaints, and little done to solve small problems that sometimes dragged on for years.

Things were hardly better upstairs in editorial, where citizen complaints about stories were routinely laughed off, ignored or treated with disdain. And if you wanted to get a top editor on the phone to talk you had better be the governor, the mayor or the police chief – or you’d find yourself talking to an overworked editorial assistant or secretary-gatekeeper at best.

Small wonder, then, that this period coincided with a noticeable rise in libel suits against newspapers, a decline in trustworthiness ascribed to the media in attitudinal surveys and a growing feeling that the media was just another arrogant, unresponsive institution instead of the public-spirited citizen watchdog it was supposed to be. This attitude has hardened into the current serves-them-right perspective on the disruption of big media.

That disruption has brought a lot of pain to legacy media, but it has also forced it to walk the walk on customer service — now that social media has institutionalized two-way communication and evened the balance of power. After experiencing their own versions of “United Breaks Guitars” and “Dell Hell” campaigns, you can bet that old media doesn’t respond to complaints in the way it used to — even if it is a result of fear rather than respect.

Either way, it’s a good thing.

My Ankle, Part 643…

I want to assure my long-suffering classmates and instructors that I do not intend to make a career of writing about my broken ankle. But a few things that happened to me in New York last week led me to reflect on something Alex Taub and others said repeatedly during our whirlwind tour of start-ups a few weeks ago.

Be nice. Help people.

New York is a city renowned for its Darwinism and brusque mien. If Chicago is the City of Big Shoulders, New York is the City of Hard Shoves. Outta my way, Mack.

Unless you are on crutches. Fold your arms over a pair of these universal injury signifiers and watch what happens: The surly doorman sprints from behind his desk to open the door for you. Strangers actually look up from their iPhones to give a balky brass door the shove you have been unable to deliver effectively with your hip. Your fellow elevator passengers, usually jockeying to escape the car first, insist that you precede them, even though your pathetic crutching will only slow them down.

In short, New York is a different sort of town when people are nice, when they are helpful.

It’s simple to understand why New Yorkers behave this way: Empathy. It’s easy to look at someone on crutches, imagine what it would be like to be so encumbered oneself and move to help out if you can.

I think the same sort of mechanism is at work in the be-nice mantra we heard during our visits. At first I wondered if it was some karmic or Aquarian sea change in the business world, until Jeremy brought me up short: “I doubt people of the same age on Wall Street are very worried about being nice to each other.” Indeed, much as I tried to foster it during my last assignment, American business in general doesn’t place much of a premium on being nice or helping others, unless you are CEO of, say, the Boy Scouts, and even then…

No. What it is, I think, is that all of these young entrepreneurs work in the same small pond, rub elbows with the same people, congregate in the same places, go to the same events. It is said manners evolved as a kind of social lubricant to prevent clashes in the crowded interior spaces of an evolving society. Being nice is one way to keep the peace in an intensely competitive environment.

But there's more to it than that. Every one of these entrepreneurs knows what it's like to need a hand, to be treated kindly, or to get a crucial introduction or sit-down through another's intercession. Or not. And so they're empathetic when someone else needs something they can provide. Like those who've been helping me.

There is one more dimension to the entrepreneurial be-nice admonition that does not line up with the motivations of my helpers. These business people all know that theirs is a small community, and that some of them will make it, a few of them in a very big way. They hope the favor will be returned if they need it, and they realize what a huge potential chit that could be.

I doubt any of my helpers are hoping that they will break their ankle sometime in the future and that I will return the favors they have done me over the last five weeks. At least I hope not.

The state of journalism and Professor Pangloss

Matthew Yglesias of Slate wrote a provocative and widely circulated post last week urging media consumers to ignore “the doomsayers” worrying about the state of journalism and enjoy the incredible profusion of content the Web makes available to readers with even the most arcane interests.

The doom-saying to which Yglesias referred was the annual Pew report on the state of the media – a gloomy document that chronicled continuing revenue and job losses in the nation’s newsrooms, concomitant reductions in coverage and the reader defections those losses have caused. Grim stuff.

Nevertheless, “American news media has never been in better shape,” proclaims Yglesias. “That’s just common sense.”

Putting aside the Panglossian naiveté of Yglesias' point of view – it is clear even as he celebrates this alleged high point in American journalism that he has little idea how journalism is funded – he makes a critical logical error and then compounds it by confining his analysis to the area of journalism that best proves his point. Meanwhile he completely ignores another, equally important, if not more important, area of the business.

In attacking the Pew report, Yglesias calls it “a blinkered outlook that confuses the interests of producers with those of consumers, confuses inputs with outputs, and neglects the single most important driver of human welfare—productivity. Just as a tiny number of farmers now produce an agricultural bounty that would have amazed our ancestors, today’s readers have access to far more high-quality coverage than they have time to read.”

Let me count the ways in which Yglesias errs in this single paragraph.

First, he accuses Pew of focusing entirely on the producer side of journalism rather than the consumer side – which is true enough. But then he proceeds to do exactly the opposite, celebrating the diversity of news on the web from the consumer perspective without giving much if any thought to how that news is produced, and what the impact of declining financial support for that mechanism means for the bounty he is celebrating.

And, by the way, we're not just talking about declining support for old media. Freelance blogger Nate Thayer – presumably one of those contributing to Yglesias' marvelous cornucopia of content – recently wrote bitterly about being asked by Atlantic.com to repurpose a blog post he'd done so the site could run it for free. Notwithstanding Thayer's complaint, this happens all the time. If this is the business model for Yglesias' American journalistic renaissance, he should enjoy it while it lasts.

Then Yglesias says the Pew study fails to sufficiently celebrate productivity. I wonder why? Could it be because it is all too familiar with assessments of this new productivity similar to Allyson Bird's?

"You get called out of a sound sleep to drive out to a crime scene and try to talk with surviving relatives. You wake up at 3 a.m. in a cold sweat, realizing you've misspelled a city councilman's name. You spend nights and weekends chipping away at the enterprise stories that you never have time to write on the clock…

I quit my newspaper job at 28, making less money than I earned when I was 22."

Yglesias then backs up his productivity argument with an analogy about the small number of farmers producing more food than anyone ever thought possible. Except he fails to mention two things: these farmers get paid, and paid well, for what they produce – the average family farm earned nearly $82,000 in 2004, an income most bloggers and reporters would kill for – and many are subsidized by the government itself.

But Yglesias’ biggest gaffe may be what he fails to address at all. In breathlessly describing the bounty of web news, he fails to mention a single instance of what we might call local news. Surely we can read broadly and deeply about the Cypriot banking crisis, as Yglesias suggests. But what about the crisis of corruption at city hall? What about the scandal at the local bank? Not so much, especially as local, old-media operations shrink and their would-be hyper-local successors struggle to find a workable business model.

In Candide, the title character rejects the mindless optimism of his mentor, Professor Pangloss, and comes to the conclusion that "we must cultivate our garden."

Yglesias' celebration of the Web's current news bounty is akin to the farmers he mentions eating their seed corn and mistaking it for a banquet. It's not enough to be optimistic. Those of us who care about journalism, and especially local news, must cultivate our gardens.

Broken bones … and business models

Please help me!

A few weeks ago, I broke my ankle in two places after a fall on some black ice in a parking lot. From the moment I heard the snap of bones breaking as I landed on the joint, it was clear that my world had changed. Always inclined to go it alone, I would now be dependent on others – my wife, my friends and colleagues, my doctors – to help me as I crutched, hopped and wheeled my way through a recovery that could take months.

I can’t stress how hard accepting help is for me. But I have no choice – I just can’t do this by myself.

My situation came to mind recently when I was contemplating the case of an important mid-career journalism education institution whose world has changed in a similar way. Tuitions are way off because of trouble in the industry. Its other traditional means of support are tanking. And unless something is done quickly, the future looks grim.

What's the difference between it and me? It's this: When another journalism educator in another region of the country approached it with a proposition for a partnership – a plan that would have helped both institutions – the mid-career educator scarcely even wanted to talk about it. Like me, its world had changed and it needed help. Unlike me, it declined a potential source of it.

Why?

I can’t say, really, apart from a vague explanation that the institution was too wrapped up in its own problems to contemplate the proposal. As ridiculous as it sounds for a business to be too wrapped up in its problems to consider a way of solving them, I can tell you as a management-level veteran of several large news organizations that it’s sadly familiar.

Based on that experience, here are three reasons why troubled institutions decline help when they badly need it:

Arrogance: A variation on the too-big-to-fail fallacy, this factor causes leaders to believe their institution is too important to the audience to succumb to the laws of business. The names of some news organizations so afflicted can now be found on the rolls of the closed and the bankrupt.

Denial: Few successful leaders are so clueless as not to see the handwriting on the wall, but many refuse to heed it all the same, believing that the crisis is not as bad as it appears, or will somehow pass, or that somebody somewhere will come up with a solution. The final destination is usually the same as in the case of arrogance above.

Inertia: Sometimes, as in the case of the mid-career journalism education institution, companies are so busy worrying about their problems that they forget that the idea is to do something about them – and that requires an open mind and a bias toward action. This is a variation on both the frog in the pot parable and Clayton Christensen’s idea of the Innovator’s Dilemma: By the time institutions start responding to the trouble, their flailing about actually intensifies it.

It seemed ironic and sad to me that an institution that has spent so much time training managers to avoid these pitfalls would itself succumb to one of the simplest and most common: failing to accept help.

Now, could I ask you to please roll my wheelchair over to the water fountain?

Thanks.

In defense of Marissa Mayer and woman managers everywhere…

Yahoo CEO Marissa Mayer can’t win for losing these days, it seems. She’s taken hits in the press for ending telecommuting at her company while building a private nursery next to her office, and more lately for deciding to review all new hires in the company.

Let me tell you something: If I were the CEO of a company that was tanking as severely as Yahoo, I'd be doing the exact same thing. In fact, I'd be doing a lot more than that – as I'm sure Mayer is.

Of course, if I were the CEO of Yahoo, nobody would be writing about me and the heartless things I was doing to my employees. Why? The answer, I hope, is obvious: because I’m male, and it’s not news when male bosses piss workers off. It’s accepted, expected even. Dog bites man.

Let a woman do the same, though, and the best she can expect is for her management style to be called “unorthodox” by the Washington Post. (Exactly what is unorthodox about giving a failing organization a few swift kicks in the ass is beyond me.) What’s the worst she can expect? Well, if you’ve been following the story, you’ve already seen it.

How did this come to be? Some of it, of course, is good old-fashioned sexism. Some of it is same-sex sour grapes – many professional women have noted that some of the cruelest behavior toward women is inflicted by other women (are you listening, Maureen Dowd?). And some of it is backlash against promises women have made about how different the world would be once they were in charge.

There is no doubt that women as a whole are better at team play than men, no doubt that they operate better in groups, no doubt that they see workers more as individuals than as cogs in a big machine. This is all to the good, a boon to the workplace and fair enough.

But just because women possess these attributes doesn’t mean that the mere appointment of a woman to the top job will – or should — transform a company into a workers’ paradise. And especially not if the enterprise is failing, as Yahoo is. Sometimes the sicker a patient is, the stronger the medicine required – whether the physician is male or female.

In her new book, Facebook COO Sheryl Sandberg writes that part of the reason women aren’t getting ahead faster in corporate America is because too many want to be liked. It appears that this is not a problem for Marissa Mayer. And I say good for her.

James Bond, Entrepreneur

As Commander James Bond celebrates 50 years on Her Majesty’s Secret Service at the movies — and cable stations across the grid provide the fireworks with wall-to-wall 007 orgies — it’s a good time to celebrate the lessons Bond can teach entrepreneurs.

Even though Bond drew a paycheck — undoubtedly funneled through some nondescript entity — from MI6, he was in fact an entrepreneur.

Dispatched to some dangerous — and predictably warm and coastal — foreign clime with little more than some gruff orders from M and a few gadgets from Q (the latest Bonds have been notably gadget-shy, as befits Daniel Craig's brutally lean take on the character), he was expected to figure out the situation on the ground and improvise a response on his own.

Hence his “license to kill.” When he needed to liquidate a bad guy, he couldn’t wait around for the home office to clear it.

Herewith, then, are seven lessons entrepreneurs can learn from the legendary secret agent:

1. Fail fast: The producers of the Bond franchise themselves provided this lesson. When George Lazenby quit the role after his wooden portrayal of Bond in 1969's "On Her Majesty's Secret Service" bombed with critics, they lured Connery back for one film and then moved on to Roger Moore. Whatever you think of Moore's fizzy Bond Lite — and I thought it stunk — he played the role seven times, just as many as Connery did. A big flop led to a big success.

2. Embrace new technology: This is practically the unspoken philosophy of the entire series up until Craig. Examples abound, but who can forget Bond escaping an assassin via then-nascent jetpack tech in the pre-title sequence of 1965's "Thunderball"?

3. But sometimes old technology can do the job: In last year's "Skyfall," Craig's Bond lures villain Raoul Silva to his family's abandoned estate (the Skyfall of the title) on the moors of Scotland, there to greet him with improvised booby traps and vintage shotguns. When Albert Finney's gamekeeper empties both barrels into a Silva flunky while saying "Welcome to Scotland," it provides one of the film's few applause lines. Though he is badly outgunned, Bond prevails (not so Skyfall itself, though).

4. Fake it 'til you make it: Pierce Brosnan's Bond obviously has no idea how to pilot the T-55 tank he steals in "GoldenEye" (1995) — an apt metaphor for first-time entrepreneurs contending with unfamiliar forces and technologies. But he sticks with it, acts like he knows what he's doing, and in time he's flattened half of St. Petersburg and nailed a few stooges in the process.

5. Sometimes you have to destroy what you love to win: In "Skyfall," Bond leaves his beloved — and priceless — Aston Martin DB5 outside his childhood home as bait for Silva. Bond loses the car (happily for car nuts like me, the version destroyed on film was a 3D-printed model) but wins the skirmish. Lesson: Don't get so attached to a business model or anything else that you can't pivot if circumstances require.

6. Choose your partners well: Bond's friend, CIA agent Felix Leiter, is one of the film series' most enduring characters, having been played by eight different actors. He's always had Bond's back. Make sure your partners have yours.

7. Whatever happens, maintain your sense of humor: Bond's wisecracks as a laser beam inches toward his crotch so rattle the villain Goldfinger in the 1964 film of the same name that he sputters: "Choose your next witticism carefully, Mr. Bond. It may be your last!" Bond stayed cool. Guess who lived to have a martini, shaken not stirred, later that night?

A massive opportunity for universities in the era of the MOOC …and one danger.

As the world of higher ed buzzes about the explosive growth of Massive Open Online Courses (MOOCs) with equal parts fascination and terror, it's important to cast a hard eye on what MOOCs have accomplished so far, what they haven't accomplished, and what they may never accomplish – and what opportunities that may provide for brick-and-mortar universities in the age of online education.

It’s true that MOOCs have enrolled hundreds of thousands of students for college-level courses lasting as long as a semester. Notice I said enrolled. Because counterbalancing all the crowing about enrollment sizes is the vast, rarely-mentioned-in-the-same-breath number of dropouts, approaching 80 to 85 percent in some cases.

(Size may not be all that it’s cracked up to be — in terms of MOOCs, anyway. Coursera had to call off its MOOC on — wait for it — “Fundamentals of Online Education: Planning and Application” last weekend when the demands of the 41,000 students participating crashed the course platform.)

Even allowing for the low barriers to enrollment in MOOCs and the concomitantly low level of commitment to a course in which the student has no financial investment — most MOOCs are being offered for free at this point — this is the kind of dropout rate that would drive your average university president to mount the campanile for a swan dive.

The low level of commitment endemic to today’s online courses raises an important question about outcomes. Surely there are many students who learn, and learn well, for the simple intellectual pleasure of it. Others require incentives of various kinds, ranging from “Dad’s gonna kill me if I flunk this course” to “I paid a lot to come here and I’m going to get my money’s worth,” to learn effectively. Even when students satisfy MOOC course requirements — and we’ll get to that issue in a moment — it’s certainly worth asking whether their outcomes are equivalent to those of their peers in conventional classrooms.

The fact that most MOOCs are free undoubtedly drives their massiveness. But, as every first-week entrepreneurial-journalism student knows, free is not a business model over the long run; someone always pays. And this raises two more questions: What happens to enrollments when or if that someone winds up being the student? (Less-than-massive, not-so-open online courses, anyone?) And which of the many MOOC platforms now in operation will survive the shakeout?

Finally, the issue of accreditation and certification looms large for MOOCs. Who, precisely, is going to certify to schools, employers and other evaluators that the online coursework has been rigorous to an agreed-upon standard, and that it has been completed legitimately and satisfactorily? Who has the credibility and the standing to accomplish this? Coursera? Open Badges? Pathbrite? Not at this point, certainly.

The point of the argument isn’t to deny the importance and inevitability of widespread online learning — that would be like the newspaper executives who kept debating the potential impact of the web until it disrupted them right into the unemployment line. Obviously, large-scale online learning is here to stay — and that’s a good thing.

The point is the stunning opportunity this provides for universities, especially specialty schools such as CUNY’s Graduate School of Journalism. They don’t have to reinvent the wheel when it comes to student motivation. They already have the tech savvy necessary to make the platform work. Unlike most MOOC platforms, grad schools actually create educational content — they’ve already hired the talent necessary to produce it. They already have infrastructure for marketing, billing, recruitment and the like in place. And who better to certify and accredit online learning achievement than a school whose business is already based on it?

In other words, who are you going to trust: the City University of New York … or Coursera?

The danger for universities in all of this, of course, is the same one always faced by businesses staring down the barrel of disruption: That they’ll fail to see that the new medium requires a new paradigm. That merely shoveling the old educational tools of syllabi and office hours and lectures and recitations onto the web won’t do the job over the long run, any more than newspapers’ early shovelware forestalled their day of reckoning with digital.

The era of the MOOC provides a lot of opportunity for existing educational institutions. But, as in most things, the best opportunities will go to those unafraid to be bold.