Posts tagged content

In mid-April, we went live with a half dozen articles which we call “stubs.” The idea here is to plant a flag in a story right away with a short post—a “stub”—and then build the article as the story develops over time, rather than just cranking out short, discrete posts every time something new breaks. One of our writers refers to this aptly as a “slow live blog.”

This Is What Happens When Publishers Invest In Long Stories (Fast Company Co.Labs)

The results of Fast Company’s experiment with “stubs” — which allowed them to gradually create long-form journalism — pleasantly surprised the team when it brought a lot of traffic. Learn more about their strategy and check out snapshots of their site analytics from Chris Dannen. (via onaissues)

FJP: SBNation, the network of sports blogs, rolled out a feature similar to this when Vox Media redesigned the entire ecosystem. This is how Jeff Clark of SBNation’s CelticsBlog described “Storystreams” when the redesign launched:

This is a kind of post that has several updates within that post. It is a smarter way of handling big stories that have many updates (like trade deadline day and media day) rather than editing a single post or breaking it into several smaller posts.

And yes, I’m a Celtics junkie. — Michael

The situated documentary allows us to examine the emerging transformation of the storytelling model of journalism from the analog to the digital age. In the traditional model of analog journalism, storytelling is dominated by a linear presentation of facts, typically from beginning to end. The audience experiences the story in a passive—almost voyeuristic—mode. Stories tend to have a single or sometimes dual modality of media forms (e.g. text, or text combined with photographs, infographics, audio, and video). A story is published and fixed in time. Corrections might be published later as an afterthought. Stories tend to be based on events, and as such, are episodic rather than contextual. The voice of a typical story is that of a third-person narrative, perhaps best characterized by legendary CBS Evening News Anchor Walter Cronkite’s signature sign-off, “And that’s the way it is.”

The new media storytelling model is nonlinear. The storyteller conceptualizes the audience member not as a consumer of the story engaged in a third-person narrative, but rather as a participant engaged in a first-person narrative. The storyteller invites the participant to explore the story in a variety of ways, perhaps beginning in the middle, moving across time or space, or by topic. Nonlinear storytelling may come as a bit of a shock to some traditional journalists, but it is possible to adapt to new technology without sacrificing quality or integrity.

~John V. Pavlik and Frank Bridges’ monograph, The Emergence of Augmented Reality (AR) as a Storytelling Medium in Journalism, published in Journalism and Communication Monographs, Volume 15, Number 1, Spring 2013. (via virtual300)

FJP: This came across my Twitter radar a few days ago, when Jill Falk was kind enough to share the quote. It’s an interesting concept and one that has roots beyond contemporary multimedia storytelling.

For example, my favorite books growing up were of the choose your own adventure variety. You read a chapter and were then told to proceed to chapter X, Y or Z depending on your plot desires. Later, as a teenager, I was fascinated by Julio Cortázar’s “Hopscotch”. The table of contents told you that you could read the book traditionally, from Page 01 to the end. It also gave you an alternative reading: read Chapter 01, then jump about willy-nilly, forward and back between chapters. The end result is a type of narrative driven more by “impressions” than linear storytelling.

William Burroughs did this as well. “Naked Lunch” can supposedly be read any which way. Front to back, back to front, jumping about the middle. It’s all good. Urban legend has it that Burroughs dropped the manuscript on the way to his publisher. Despite pages spilled on the ground there were no worries. Again, the book could be read any which way so he gathered the pages up, stuffed them in his binder and continued on his way.

Film plays with this too. Fans of Memento enjoy the front to back and back to front chronologies. Other films employ this technique as well. Back in the 1960s, Jean-Luc Godard famously remarked, “I agree that a film should have a beginning, a middle and an end but not necessarily in that order.”

So let’s go back to multimedia storytelling with the Internet as a primary distribution platform. The underappreciated hyperlink is our key to moving back and forth within a narrative. Our design and UX considerations help control where the story inquisitor might go. But despite our best intentions, that independent viewer is going to pick and choose his or her way through a narrative.

Check out our Multimedia Tag for references. These are stories that have a beginning, middle and end. But they’re also stories where the viewer chooses what his or her beginning, middle and end actually is. Site visitors are independent operators. We can try to guide them with our design but they’ll go where their interest leads them.

Which brings me in a roundabout way to the crux of the matter — multimedia storytelling or not — and that’s the atomic unit of online consumption.

This is a concept that’s been around for a while now. In my interpretation it means something like this: Whatever you do, whatever you post, whatever you research, whatever you pour your heart and soul into, the following will happen: your story will be sliced and diced and shared on social networks and otherwise refactored elsewhere. This could be the mere title. It could be a sentence buried deep within your article. It could be seconds 00:45 - 00:55 of a video. It could be an animated gif of that video. It could be metadata of the information that you produce. It could be an API mashup of all the above.

Simply, whatever story you produce, and whatever media you use to produce it in, your content will be broken down into its smallest parts and shared on Tumblr, Twitter, Facebook, Reddit, blogs and the like.

This is not a bad thing. It’s an agnostic thing. This is remix culture.

Simply and unambiguously, we must deal with it. And from this side of the Internet, we deal with it, and pleasurably so. — Michael

Mothership Connection

Dylan Tweney, Executive Editor of VentureBeat, runs through recent kerfuffles on our social networks, from their ever-changing terms and conditions, to restrictions on how you can access your information, to the whole walled garden-ness of them.

"It’s time," he writes, “to take back our social networks.”

To do so he recommends going back to a platform many have left behind, the blog, and turning it into a mothership of sorts.

[It’s] a bit of a retro suggestion, because blogs have taken a back seat to other forms of expression in the past few years. The RSS feed never engendered the kind of reciprocal sharing and commenting that a well-designed social network does, and as a result, many people have migrated away from blogging.

“I’ve used many social networks. Friendster, Facebook, everything. But they come and go. But my blog has always been my home on the web,” Matt Mullenweg, the founder of WordPress, told me last week. “What’s changed in the past few years is that blogging started to feel a bit more lonely, because it wasn’t connected to these social news feeds.”

Like Mullenweg, those of us who have had blogs for a decade or more have been using them less and less, drawn to the ease of tweeting and the warm, friendly responsiveness of Facebook.

But now it’s possible to circle back to the blog without giving up the social networks. In fact, it’s increasingly easy to use a blog as the center of your social universe.

That’s because, while social networks like Facebook and Twitter are reluctant to share data out, they are eager to bring your data in. (This is why Twitter no longer lets you update your LinkedIn status from Twitter, but you can do the reverse and update your Twitter status from LinkedIn.)

So if they won’t share, fine: Make your own website the source, and share it out to various other networks as a way of staying in touch with your friends there.

In other words, with a few simple hooks, your blog becomes the mothership of your online activity.

For example, Tweney points out that WordPress users can use part of the Jetpack suite of plugins to push their content to networks like Twitter, Facebook, Tumblr and LinkedIn. IFTTT is another great resource for managing content flow online.
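
For those who would rather wire something like this up by hand, here is a minimal sketch of the idea (not Jetpack’s or IFTTT’s actual mechanics; it assumes the feedparser and tweepy libraries and a hypothetical feed URL): poll the blog’s RSS feed and push anything new out to Twitter.

```python
# Illustrative only: poll a blog's RSS feed and tweet new posts so the blog
# stays the "mothership." Feed URL and credentials are placeholders; assumes
# the feedparser library and tweepy's OAuth 1.0a client.
import feedparser
import tweepy

FEED_URL = "https://example-blog.com/feed/"  # hypothetical feed URL

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
twitter = tweepy.API(auth)

already_shared = set()  # in practice, persist this between runs


def push_new_posts():
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        if entry.link in already_shared:
            continue
        already_shared.add(entry.link)
        # Keep the update short; the link carries readers back to the mothership.
        twitter.update_status(f"{entry.title} {entry.link}")


if __name__ == "__main__":
    push_new_posts()
```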

This is a good start to maintaining control of your content but here are three quick — and related — caveats to think about as you do so:

  • Social networks are social because of our interaction on them. They shouldn’t just be repositories of our publishing exhaust. You have to go to them, and interact on them within the cultural norms that have evolved on them. Or at least use tools that allow you to do so. One I’ve recently started playing with is Engagio. It basically presents your social interactions across platforms in a Gmail-like interface.
  • Yes, we’d like people to come see how shiny and bright our motherships are but that’s not really the point, is it? We’re trying to engage audiences wherever they may be which means it is not just about sending out a link and asking people to leave the space they’re currently in to learn what might be behind that link. Again, remember the social and think of a real life gathering. You wouldn’t approach people at a party and say, Hey, I have something I want to share with you but we have to leave here in order for me to give you the lowdown.
  • Sometimes — perhaps oftentimes — we need to customize and/or prepare our content for different platforms. Take photos, for example. Tumblr has a great way to present them, and allows you to order them in particular ways and with different layouts. Facebook handles photos differently. Ditto Flickr and Google. By merely pushing content from the mothership to the social network without paying attention to the nuances of each platform, we lose an opportunity to tune that content in the best way for the audiences that will view it on that platform.

Despite the caveats, knowing how to maintain control and “ownership” of your content is important. And it’s not just for us little people.

Last week the Guardian announced that it was killing its Facebook social reader in order to regain control over the user experience people have with its content.

Somewhat Related, Part 01: Anil Dash, The Web We Lost: A look back at how just a few years ago the Web was much more interoperable.

Somewhat Related, Part 02: Bernard Meisler, Why Are Dead People Liking Stuff On Facebook?: Exploring fake likes across Facebook and how/why they might be happening.

What Happens in an Internet Minute

In that single minute, YouTube users upload 48 hours of video, Facebook users share 684,478 pieces of content, Instagram users share 3,600 new photos, and Tumblr sees 27,778 new posts published.

Via Intel:

In just one minute, more than 204 million emails are sent. Amazon rings up about $83,000 in sales. Around 20 million photos are viewed and 3,000 uploaded on Flickr. At least 6 million Facebook pages are viewed around the world. And more than 61,000 hours of music are played on Pandora while more than 1.3 million video clips are watched on YouTube.

All in all, that’s 625 terabytes of information sloshing about the tubes each minute.

If we do some math, that’s 878.9 petabytes per day, which is a bit difficult to wrap our minds around.

But if we convert that to the universal measurement of the MP3, we get the equivalent of about 235.9 billion songs passing through the internet and mobile networks each day.

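For those curious, here is the back-of-the-envelope arithmetic behind those last two figures, assuming a 4 MB average MP3 and 1,024-based unit conversions (a rough sketch, not Intel’s methodology):

```python
# Rough arithmetic behind the figures above: 625 TB per Internet minute,
# scaled to a day, then converted to "songs" assuming a 4 MB MP3.
TB_PER_MINUTE = 625
MINUTES_PER_DAY = 24 * 60

tb_per_day = TB_PER_MINUTE * MINUTES_PER_DAY      # 900,000 TB per day
pb_per_day = tb_per_day / 1024                    # ~878.9 PB per day
mb_per_day = tb_per_day * 1024 * 1024             # TB -> GB -> MB
songs_per_day = mb_per_day / 4                    # ~235.9 billion songs

print(f"{pb_per_day:,.1f} petabytes per day")
print(f"{songs_per_day / 1e9:,.1f} billion songs per day")
```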

Paedophilia, necrophilia, beheadings, suicides, etc. I left [because] I value my sanity.

A Facebook moderator explaining why he quit his job monitoring content on the social network. Via The Daily Telegraph, The dark side of Facebook.

Background: Facebook outsources much of its content moderation around the world. There are privacy concerns, of course, but here’s how it generally works:

Last month, 21-year-old Amine Derkaoui gave an interview to Gawker, an American media outlet. Derkaoui had spent three weeks working in Morocco for oDesk, one of the outsourcing companies used by Facebook. His job, for which he claimed he was paid around $1 an hour, involved moderating photos and posts flagged as unsuitable by other users.

“It must be the worst salary paid by Facebook,” he told The Daily Telegraph this week. “And the job itself was very upsetting – no one likes to see a human cut into pieces every day.”

Derkaoui is not exaggerating. An articulate man, he described images of animal abuse, butchered bodies and videos of fights. Other moderators, mainly young, well-educated people working in Asia, Africa and Central America, have similar stories…

…Of course, not all of the unsuitable material on the site is so graphic. Facebook operates a fascinatingly strict set of guidelines determining what should be deleted. Pictures of naked private parts, drugs (apart from marijuana) and sexual activity (apart from foreplay) are all banned. Male nipples are OK, but naked breastfeeding is not. Photographs of bodily fluids (except semen) are allowed, but not if a human being is also shown. Photoshopped images are fine, but not if they show someone in a negative light.

Once something is reported by a user, the moderator sitting at his computer in Morocco or Mexico has three options: delete it; ignore it; or escalate it, which refers it back to a Facebook employee in California (who will, if necessary, report it to the authorities).

Shawn Price, President, Zuora.com, discusses how content businesses such as publishing, music, and video are transforming themselves to achieve economic viability through a variety of business models.

This video is from a panel discussion our sister site, ScribeMedia.org, produced with the MIT Enterprise Forum in NYC last week.

To watch all the speakers, including a VC that invests in media companies, the NY Times, Glenn Beck’s subscription site TheBlaze.com, and Teleshuttle, visit ScribeMedia.org.

The Impact of Internet Copyright Regulations on Early-Stage Investment in Content Companies

A large majority of the angel investors and venture capitalists who took part in a Booz & Company study say they will not put their money in digital content intermediaries (DCIs) if governments pass tough new rules allowing websites to be sued or fined for pirated digital content posted by users. (DCIs are the companies that provide search, hosting, and distribution services for digital content such as YouTube, Facebook, SoundCloud, eBay, and thousands of others.) More than 70 percent of angel investors reported they would be deterred from investing if anti-piracy regulations against “user uploaded” websites were increased.

More than 80 percent of the angel investors would prefer to invest in a risky, weak economy (with the current internet regulations) vs. a strong economy (but with the new, more stringent proposed regulations on copyright infringement).

If the legal framework for digital content was clarified, and penalties on copyright infringement were limited for content providers acting in good faith, the pool of angels interested in investing would increase by nearly 115 percent.

Tumblr would not have been funded if it was trying to raise capital in the current regulatory environment.

- Peter

Nothing “Appealing” About This: Court Reinstates $675K File Sharing Verdict

Re-blogging or “curating” originally reported news articles in full, anywhere, everywhere, remains an un-fineable offense. But promoting some tunes among a highly desirable consumer group—college kids—that’s illegal, and worth somethin’ like a couple few grand per track. Music doesn’t grow on trees! (But then again neither does good, investigative reporting, does it?)

[Image: broken Reckless record]

FROM WIRED’S David Kravets @dmkravets:

"A federal appeals court on Friday reinstated a whopping $675,000 file sharing verdict that a jury levied against a Boston college student for making 30 tracks of music available.

The decision by the 1st U.S. Circuit Court of Appeals reverses a federal judge who slashed the award as “unconstitutionally excessive.” U.S. District Judge Nancy Gertner of Boston reduced the verdict to $67,500, or $2,250 for each of the 30 tracks defendant Joel Tenenbaum unlawfully downloaded and shared on Kazaa. The Recording Industry Association of America and Tenenbaum appealed in what has been the nation’s second RIAA file sharing case to ever reach a jury.

The Obama administration argued in support of the original award, and said the judge went too far when addressing the constitutionality of the Copyright Act’s damages provisions. The act allows damages of up to $150,000 a track.”

http://www.wired.com/threatlevel/2011/09/file-sharing-verdict-reinstated/
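
For scale, the per-track arithmetic behind those figures works out like this (a quick sketch using only the numbers reported above):

```python
# Per-track arithmetic from the figures reported by Wired.
tracks = 30

jury_award = 675_000               # reinstated by the appeals court
reduced_award = 67_500             # Judge Gertner's reduction
statutory_max_per_track = 150_000  # ceiling under the Copyright Act

print(jury_award / tracks)               # 22,500.0 dollars per track
print(reduced_award / tracks)            # 2,250.0 dollars per track
print(statutory_max_per_track * tracks)  # 4,500,000 dollars possible for 30 tracks
```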

People tell me that content is king, but that is not true at all. Most people make money pointing to content, not creating, curating or collecting content.

Rishad Tobaccowala, chief strategy and innovation officer at Vivaki, to the Wall Street Journal, Content Deluge Swamps Yahoo

The Wall Street Journal outlines how Yahoo and AOL are struggling with their ad-supported business models. 

In a nutshell: in the not-so-distant past, having great scale almost guaranteed profits, but with the proliferation and commoditization of most content, that’s no longer the case.

As the article’s authors write:

It’s a simple rule of any market. The more information that is created, the more the value is reduced. And despite attempts to woo spending with bigger, bolder and more targeted ads, services that help consumers navigate that content, namely search, remain the big money makers online.

In other words, services that make content discoverable either via search (Google) or social (Facebook) are thriving.

In 1998, Yahoo was charging CPM rates of $25, according to the Journal; that’s now down to $6.50.
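
CPM is the price an advertiser pays per thousand impressions, so the drop compounds quickly at scale. A rough illustration, using a hypothetical traffic figure:

```python
# CPM = cost per thousand ad impressions. Traffic figure is illustrative only.
def ad_revenue(impressions, cpm):
    return impressions / 1000 * cpm

monthly_impressions = 10_000_000  # hypothetical: 10 million ad impressions

print(ad_revenue(monthly_impressions, 25.00))  # 1998-era rate: $250,000
print(ad_revenue(monthly_impressions, 6.50))   # today's rate:   $65,000
```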

The Half Life of Shared Links

Via Bitly:

The mean half life of a link on twitter is 2.8 hours, on facebook it’s 3.2 hours and via ‘direct’ sources (like email or IM clients) it’s 3.4 hours. So you can expect, on average, an extra 24 minutes of attention if you post on facebook than if you post on twitter…

…Not all social sites follow this pattern. The surprise in the graph above is links that originate from youtube: these links have a half life of 7.4 hours! As clickers, we remain interested in links on youtube for a much longer period of time. You can see this dramatic difference between youtube and the other platforms for sharing links in the image above…

…Many links last a lot less than 2 hours; other more sticky links last longer than 11 hours over all the referrers. This leads us to believe that the lifespan of your link is connected more to what content it points to than on where you post it: on the social web it’s all about what you share, not where you share it!
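
Bitly’s “half life” is the time it takes a link to receive half of all the clicks it will ever get. One simple way to see what those numbers imply (an illustrative exponential model with made-up totals, not Bitly’s actual methodology):

```python
# An illustrative exponential model of bitly's "half life": the time it takes
# a link to receive half of the clicks it will ever get. Numbers are made up.
def cumulative_clicks(total_clicks, half_life_hours, hours_elapsed):
    return total_clicks * (1 - 0.5 ** (hours_elapsed / half_life_hours))

TOTAL_CLICKS = 1000  # hypothetical lifetime clicks for a link

for platform, half_life in [("twitter", 2.8), ("facebook", 3.2), ("youtube", 7.4)]:
    clicks_by_hour_six = cumulative_clicks(TOTAL_CLICKS, half_life, 6)
    print(f"{platform}: ~{clicks_by_hour_six:.0f} of {TOTAL_CLICKS} clicks within 6 hours")
```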

H/T: Sanjiv Desai.

Technological Innovation: A Publisher’s Dilemma

The news yesterday that newspaper giant Tribune Company is developing a tablet makes me wonder where and how publishers should technologically innovate.

The Tribune plans to offer subscribers free — or highly subsidized — tablets that will reportedly be built by Samsung. Many think the effort is already doomed to failure.

The plan reminds me of a recent Adweek article about the publishing industry’s ongoing woes with Content Management Systems. In it, Erin Griffith catalogues how BusinessWeek spent upwards of $20 million trying to create a social networking layer on top of its proprietary CMS; how Salon.com — which launched in the 90s — is still using the home-rolled CMS it built back then but is reportedly migrating to WordPress; how Time, Inc. has worked on a home-brewed CMS for seven years but will probably abandon it; and how AOL spent three years trying to create a proprietary CMS before ditching the effort, buying Blogsmith for about $5 million and now trying to migrate to the Huffington Post’s highly customized version of Movable Type.

Griffith writes:

Add a marketplace crowded with content-management options, tight budgets, and a string of media mergers—and the corresponding change in personnel—and the result is that these troublesome tools are being plied in a cultural clusterfuck. The result is a growing number of bloated, tangled CMS platforms reviled by the editors that publish on them, and the IT teams that maintain them.

That’s just the tip of the Content Management iceberg and doesn’t even begin to touch on the difficulties of creating a friction-free workflow for multiple platforms (Web, print, mobile, tablet). In hindsight, it’s easy to say publishers shouldn’t have rolled their own. But with foresight, does it make sense for Tribune to get into the tablet game?

The short answer is no, but that’s not to say news organizations should ignore in-house technical innovation.

Instead, it’s to ask how and where they should allocate resources in the pursuit of technological innovation. 

Part of the answer is remembering the core product, journalism, and then investing time and resources into technologies that enhance it. 

For example, technologists from the New York Times and ProPublica collaborated to create DocumentCloud, a Web-based platform that allows organizations to analyze large data dumps across multiple documents.

DocumentCloud, in turn, uses Open Calais, a Web service developed by Thomson Reuters that layers semantic metadata over content.

These are innovative technological investments in the service of a publisher’s core news and information product.

Meanwhile, Tribune remains in bankruptcy, is laying off editorial staff, and is plowing human and financial capital into a product that will compete with the iPad, Kindle and other market leaders.

From this corner of the Internet, it seems an investment gone wrong. From another corner, Markus Pettersson, head of reader relations and social media at Göteborgs-Posten, writes that Tribune is “afraid, clueless and [has] lost track of what is [its] core product: journalism. It tells everyone including your readers and ad buyers that you have business ADHD, and cannot be relied on to focus on developing your core product: journalism.”

Agreed, and we suspect we’ll be writing something very similar to Griffith’s Adweek CMS article a few years down the line. At that point, Tribune will be the poster boy for tech investment gone wrong.

Some might remember when ESPN tried to create a branded phone. Steve Jobs’ response at the time: “Your phone is the dumbest fucking idea I have ever heard.”

ESPN, it’s reported, lost $135 million on the venture.

UX for news content: Designing news to be usable by Alex Gamela

It’s not about just informing people anymore, it’s about creating a product that lets people do something with that information, creating richer and more immersive content, making it more valuable and with a longer lifespan.

For an in-depth look at each piece of the honeycomb, see Innovative Interactivity

Mapping Google’s Transparency Report

Every six months Google releases a report outlining how many times governments either ask it to remove content uploaded to its various services (e.g., YouTube), or request data about specific Google users (e.g., you).

Shown here is an overview for the United States.

Via Google:

Like other technology and communications companies, Google regularly receives requests from government agencies and federal courts around the world to remove content from our services and hand over user data. Our Government Requests tool discloses the number of requests we receive from each government in six-month periods with certain limitations.

Some content removals are requested due to allegations of defamation, while others are due to allegations that the content violates local laws prohibiting hate speech or pornography. Laws surrounding these issues vary by geographic region, and the requests reflect the legal context of a given jurisdiction.

The report does not provide specifics on whose data is being requested or what content is being asked to be removed.