Jean-Marie Gustave Le Clézio won the Nobel Prize for Literature. In his Nobel Lecture in Stockholm yesterday, he commented: “if the Internet had existed at the time, perhaps Hitler’s criminal plot would not have succeeded — ridicule might have prevented it from ever seeing the light of day”.
It is certainly a rosy idea that a world with more dialog and discussion would’ve prevented the carnage and slaughter that accompanied World War II and the Holocaust, but the pessimist in me thinks that technology is a neutral player when it comes to world peace. Give every child in the world a laptop, and we’ll end up with a world just as constrained by economic, social, and ethnic divisions. Technology can certainly be used as an educational tool, but the ability to communicate does not, in and of itself, encourage communication. More eyeballs on a news story doesn’t always translate to a more reasoned discussion.
Read more at O’Reilly Broadcast
First interview with a member of Congress today; it went well. It was interesting: I found myself getting a science lesson on the conductivity of carbon nanotubes.
Check out the story
This is from my article over at O’Reilly News:
Social media and social networking are going to be pivotal aspects of engaging and empowering the individual. Decades from now, we won’t be able to imagine a government unaffected by the real-time input of constituents, filtered by the social operating system that will define the culture. It is already happening: in this presidential election cycle, participation has ballooned and conventions are being moved to stadiums. This is much larger than just letting Congress Twitter; this is about letting social networks help to evolve the very concept of governance.
If our government doesn’t keep up with the technology of the surrounding culture, it will grow more and more irrelevant and unresponsive over time. In one sense, this is about government surviving in a connected age. Dramatic? Yep. True? Yep. I really think that if the government doesn’t embrace technology, we’ll probably discard it when it falls further behind the cultural context of social computing.
Lofty? Sure. But, I believe it. Read the rest over at O’Reilly News
Check it out, this is the second audio interview published with O’Reilly. A shorter interview than the Brian Cox interview from last week, but still interesting. Luiz Barroso wrote a very interesting paper with Urs Hölzle for IEEE Computer on energy-proportional computing, and I took this interview as a chance to ask him some questions.
I’ll summarize it: most hardware manufacturers build systems to perform well under the SPECpower benchmark, which measures the efficiency of a computer at maximum performance. While these specifications look great, they are relatively meaningless when you consider the fact that most (data-center, volume) servers run at between 15% and 45% utilization. In this range, the computer operates with terrible efficiency: full power must be used to keep the disks spinning and the DRAM running, but the CPU is idle 70% of the time.
He argues that we need to rethink hardware so that components use energy proportional to the computing they are being asked to perform. If a computer is only operating at 30% utilization… well… duh… we’d expect it to use only about 30% of the power.
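The gap between utilization and power draw is easy to see with a toy model. The numbers below are illustrative assumptions, not figures from the paper: a conventional server is assumed to idle at around 60% of its peak power, so at low utilization it delivers little work per watt.

```python
# Toy model of a non-energy-proportional server.
# PEAK_POWER_W and IDLE_FRACTION are illustrative assumptions, not measured values.

PEAK_POWER_W = 400.0   # power draw at 100% utilization (assumed)
IDLE_FRACTION = 0.6    # idle draw as a fraction of peak (assumed)

def power_draw(utilization: float) -> float:
    """Linear model: a fixed idle cost plus a part that scales with utilization."""
    idle = PEAK_POWER_W * IDLE_FRACTION
    return idle + (PEAK_POWER_W - idle) * utilization

def efficiency(utilization: float) -> float:
    """Useful work per watt, normalized so that 100% utilization scores 1.0."""
    if utilization == 0:
        return 0.0
    return (utilization * PEAK_POWER_W) / power_draw(utilization)

for u in (0.15, 0.30, 0.45, 1.00):
    print(f"utilization {u:4.0%}: power {power_draw(u):5.1f} W, "
          f"efficiency {efficiency(u):.2f}")
```

At 30% utilization this hypothetical server still draws 288 W (72% of peak) while doing only 30% of the work, which is exactly Barroso’s point. Under a fully energy-proportional design, `power_draw(u)` would be roughly `PEAK_POWER_W * u`, and efficiency would stay near 1.0 across the whole operating range.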
This issue is important as data centers grow from about 1.5% of our total national energy budget to somewhere around 3.0% by 2011.
Cool, Slate quoted the Brian Cox interview that I published on O’Reilly News. Very cool. I guess this makes O’Reilly News an official source for technology and science news.
I recorded an interview with Brian Cox of CERN last Wednesday, and I just published it on news.oreilly.com.
Check out the interview here – it is about 45 minutes long. Highlights include some in-depth questions about the science, technology, and computing behind the Large Hadron Collider. Brian also does a great job debunking some of the black hole FUD directed at the accelerator and building up the case for funding direct research. Listen to the whole thing; it gets much more interesting toward the end.
In addition to the interview piece I published another article with more technical and computing background about the Large Hadron Collider and the computing behind the effort. “Large Hadron Collider as Massive Grid Computer” is also published on the O’Reilly News site.
My goal with this story was to cover something broader than just another technology issue. I feel like the industry needs to be reminded that there are people in the world using computers to do more than just another web application. Scientific computing is a huge area that I’m interested in, and I’ll be agitating for more coverage in the weeks and months to come. I was sick of reading technology news focused on the wrong conversations: which language is better, and whose web application framework is going to “revolutionize” everything?
DHH’s Rails framework is nothing compared to René Brun’s and Fons Rademakers’ LGPL’d High Energy Physics (HEP) ROOT Framework.