posts about or somewhat related to ‘computing’

The Future of Technology and Computing →

Get ready for some serious reading. The New York Times has a new special out called Future of Computing. Here are just the first few articles in the series:

Power in Numbers: China Aims for High-Tech Primacy
China’s booming economy and growing technological infrastructure may thrust it to the forefront of the next generation of computing, many American experts say.

Creating Artificial Intelligence Based on the Real Thing
Facing the physical limits of conventional design, researchers work to design a computing architecture that more closely resembles that of the brain.

Vast and Fertile Ground in Africa for Science to Take Root
Computer science study in Africa shows great promise, with one Ugandan university even charting its own course in many aspects of mobile computing ahead of the developed world.

With a Leaner Model, Start-Ups Reach Further Afield
The business of Silicon Valley today is less about focusing on an industry than it is about a continuous process of innovation with technology, across a widening swath of fields.

A High-Stakes Search Continues for Silicon’s Successor
As silicon processors grow more packed with each generation, they lose efficiency, and researchers are looking for a new medium.

Out of a Writer’s Imagination Came an Interactive World
The author Neal Stephenson’s reputation for prescience about the online world is well earned, even if he regards it lightly.

Looking Backward to Put New Technologies in Focus
The science historian George Dyson, author of the new book “Turing’s Cathedral,” talks about the genius of Alan Turing and John von Neumann, and growing up in the birthplace of the H-bomb.

Interactive Map
In just four decades the Internet has spread to much of the world. Now, the shift to high-bandwidth connectivity and the global availability of supercomputing is accelerating. Examine the global hotspots.

Looking forward to reading these and the many, many more in the series.

The Decline and Fall of Facebook →

stoweboyd:

Cringely heard a talk by Roger McNamee in which McNamee cited the now-conventional tech viewpoint: Facebook has won.

Again, I’m not saying he’s wrong, but what I took away from this speech was first an image of Microsoft as the Roman Colosseum being mined for marble after the barbarian invasion, and second a sense that while Facebook is certainly a huge social, cultural, and business phenomenon, I just don’t see it being around for very long.

Facebook is a huge success. You can’t argue with 750 million users and growing. And I don’t see Google+ making a big dent in that. What I see instead is more properly the fading of the entire social media category, the victim of an ever-shortening event horizon.

Each era of computing seems to run for about a decade of total dominance by a given platform. Mainframes (1960-1970), minicomputers (1970-1980), character-based PCs (1980-1990), graphical PCs (1990-2000), notebooks (2000-2010), smart phones and tablets (2010-2020?). We could look at this in different ways like how these devices are connected but I don’t think it would make a huge difference.

Now look at the dominant players in each succession – IBM (1960-1985), DEC (1965-1980), Microsoft (1987-2003), Google (2000-2010), Facebook (2007-?). That’s 25 years, 15 years, 15 years, 10 years, and how long will Facebook reign supreme? Not 15 years and I don’t think even 10. I give Facebook seven years or until 2014 to peak.

Does this feel wrong to you? Listen to your gut and I think you’ll agree with me even if we don’t exactly know why.

Roger may not care since he will have already made his Facebook fortune and then some. But I think this foreshortening is important because it makes Facebook the winner, yes, but the winner of what? Super-IPO of the decade? Yes. Dow-30 company of 2025? No.

My interest is in what follows Facebook, which I think must be its disintermediation by all of us reclaiming our personal data, possibly through our embracing the very HTML5 that Roger loves so much. The trend is clear from “the computer is the computer” through “the network is the computer” to what’s next, which I believe is “the data is the computer.”

You’ll notice I didn’t mention Apple. Black swan.

Facebook is the new AOL.

Cringely doesn’t get into my argument about the rise of social operating systems, but he points to Apple, where we just might see it first.

FJP: Our extremely early stage in social computing and shared data reminds me of early photography and film, two technologies that literally turned a lens on a public that was simultaneously bemused, confused, awed and horrified by what it was seeing.

That is us, they pointed, and didn’t quite know what to do with that information. Even laws needed to be created to handle the new technological possibilities. After all, was it legal to “capture” the image of a private object? What about a person?

And while we sometimes gawk with amazement at what was created in that jurassic age, more often we chuckle with amused nostalgia at how crude the technology was and how primitive the cultural creations actually were.

So too will be the reaction of those twenty or thirty years down the line as they look at our “social networks” and what we could or couldn’t do with them.

All of which is to say that we’re still very, very young, and the only truth we know now is that everything we know now will change. — Michael

"Data is expanding faster than Moore’s Law and that’s a compelling problem that we’re trying to solve,” Ranganathan said. It’s apparently a problem that Intel’s Kirk Skaugen, vice president and general manager of the chipmaker’s Data Center Group, is thinking about too. Skaugen said at a speech last week at Interop that there were 150 exabytes of traffic on the Internet in 2009, and 245 exabytes in 2010, and the Internet could hit 1,000 exabytes of traffic by 2015 thanks to more than one billion people joining the web.