Google ChromeOS – a non-event

The net is abuzz these days with the announcement of a Google “Operating System” due out in the second quarter of 2010. Yawn.

Beyond the discussion of what qualifies this as an Operating System, for which I will direct you to two excellent articles by TechCrunch and The Register (caution: colorful language), there is also the question of what the product actually is. According to Google, Chrome OS is “Google Chrome running within a new windowing system on top of a Linux kernel.” In other words: install Linux, install Chrome, take away anything that isn’t Chrome, and expect whoever is using this to only use tools that run in Chrome. Seems to me this should take about a week for a competent systems guy to do, assuming he has to write his own scripts.
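That week of scripting boils down to something like the sketch below. To be clear, everything in it is my own guesswork – the `chromium` binary name, the flags, and the use of `.xinitrc` are assumptions for illustration, not anything Google has published about Chrome OS:

```shell
# Hypothetical "ChromeOS in a weekend" sketch: boot X with nothing but the
# browser. The browser binary, its flags, and the .xinitrc mechanism are all
# assumptions, not Google's actual design.
KIOSK_HOME="${KIOSK_HOME:-$HOME}"

# Write an .xinitrc that starts only the browser: no desktop, no window
# manager. If the browser ever exits, the loop relaunches it, so the
# "operating system" effectively IS the browser.
cat > "$KIOSK_HOME/.xinitrc" <<'EOF'
#!/bin/sh
while true; do
    chromium --kiosk --no-first-run "about:blank"
done
EOF
chmod +x "$KIOSK_HOME/.xinitrc"
```

Point the machine at `startx` on login and you have, in spirit, a browser-only "OS" – which is roughly why a year of development sounds like a lot.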

So what’s the big deal here? The “new windowing system”? Gnome, KDE, and the rest aren’t good enough for you, so you need something that will launch Chrome automatically without showing a desktop first? I mean, seriously, what are they going to be spending a year on?

The answer is as simple as it is sad: they are going to spend the next year trying to make Chrome do things the way a real OS does, and trying to make web apps function like real apps. With all kinds of hocus pocus like HTML 5, Google Gears, and G-D knows what else, they’re going to try to develop a comparable platform for running the programs you need inside a browser. I doubt they’ll succeed, and I seriously doubt that they can do it in a year.

And that’s the bottom line. No one really cares whether the OS they use comes from Microsoft, Mac, Linux, or Ed’s Computer Shop and Live Tackle; they simply want to install their applications (the ones they’ve been using for years) and have them work. Period. This is the reason that, after a valiant run at Linux, I came back to Windows. It let me do the things I wanted, and have been doing for over a decade, without having to re-learn and redo everything. Trying to get everyone to shift paradigms and move to Google Docs is one thing (and a daunting task at that); the fact that Google Docs can’t do everything that Office can is another matter entirely. In the larger scheme of the corporate world, a $200 saving on a computer with a free OS is nothing compared to the amount of time, and hence money, wasted on relearning years of established ways of doing things.

There’s even more to it than that. As The Register points out:

But it’s not just Office that will keep Microsoft’s hold on the PC market. Can you replace Active Directory with a web app? Is there a site I can visit to connect to my office’s shared printer? What do you mean World of Warcraft doesn’t run in the browser? How do I play a DVD in Google Chrome?

And he’s absolutely right: the greatness of a true OS is that it can run ANYTHING, not just things that are written in the limited context of the Internet. And if a program is installed on my hard drive, it will run with or without a network connection, and can access and modify the files on my drive without the fear that sudden server congestion will break it. Until ChromeOS can claim even a little of that, it is not an Operating System; it’s a non-event.

Good Weekend,



Asking questions that mean something.

Here is a trivia question for all the minutiae lovers out there: what former US Surgeon General looks like Colonel Sanders and has a name that has something to do with chickens?

Go ahead. Type that into Google, Bing, or Wolfram Alpha, and see what you get. Chances are it’ll be a long laundry list having to do with Obama picking his Surgeon General, KFC, or chicken jokes. In fact, you will get a whole load of matches to your query, but you will not get one simple thing: an answer.

This is, of course, because search engines don’t understand questions. They simply scan your search terms for keywords and try to give you relevant pages. They do some rudimentary grammar analysis to try to determine the subject of the query (i.e., the thing you are actually looking for), but more often than not they get it wrong (which is why you get more entries about chickens than about the Surgeon General). Search engines have a hard time with descriptions, too. A ‘man who looks like a thing’ is the sort of thing that a search engine simply can’t handle. And finally, search engines can easily get confused in determining which pages might contain an answer. (Consider this post, for example: it has links to search engines, chicken jokes, and observations about grammar. If you had to quantify it by keywords, you’d end up with some mighty odd matches.)

I say this because there has been a trend recently of creating “answer engines” – search engines that can understand your question and miraculously supply you with the answer. It started a few years ago with “Ask Jeeves” (now Ask.com), and had its latest arrivals in the much-publicized Wolfram Alpha and the Bing “decision engine.” Sadly, it seems that hype aside, there really is no noticeable difference between a search engine and an answer engine, with the possible exception of Wolfram – the first engine that has the decency to tell you when it doesn’t understand what you want.

I’m not faulting search engine developers, mind you. Understanding plain-English questions is a huge and daunting task, and the field is really only in its infancy. Search engines have gotten a lot better over the past few years, and will continue to improve (and users will continue to get better at searching, which is a different topic for a different day). But we’re still far, far away from the day when all the knowledge of the Internet is at our fingertips. Search engines can fill in many details, but they’re no replacement for a structured approach to learning, no replacement for a simple thirst for knowledge, and no good at trivia. At least, not yet.

Big Endian.


For anyone who’s wondering, the answer is

C. Everett Koop

a great name, if I ever heard one.

Upward mobility

These past couple of weeks have been significant in the world of mobile communications. With the introduction of the Apple iPhone 3GS and the Palm Pre (in addition to the already-released BlackBerry Storm), it seems that the days of the cell phone as a true “mobile computing platform” are finally here. Hurrah! It’s 1983 all over again!

Except that it’s not. 1983 saw two major advancements in the field of personal computing that complemented each other. The first was the IBM XT personal computer. A truly flexible computing platform that combined power (such as there was at the time) with a price attractive enough for regular people, the XT (and its successor, the IBM AT, based on Intel’s 80286 CPU) was not only a great computer, but the fountainhead of a whole new industry: IBM clones. These “PC-compatible” machines took what was just one of many competing computing platforms and made it the de facto standard for computing to this day. Even though IBM no longer makes personal computers, the x86 architecture on the desktop has endured magnificently, in large part due to the availability of cheap, inter-compatible hardware, readily bought from your nearest distributor.

But hardware is only part of the story: 1983 also saw the announcement of MS-DOS version 2.0 – the first operating system that was both truly usable and (more importantly) cross-platform compatible. You could buy MS-DOS separately from the computer, install it on the PC clone of your choice, and IT WORKED! You no longer had to get your hardware and software from the same place. You weren’t locked into whatever your vendor happened to offer. For the first time ever, you had true flexibility in computing. This was astounding. Still is, if you think about it.

The situation in today’s cellular market is the same as it was in the PC world before 1983. We have good platforms out there, and they have tremendous potential as the computing platforms of the future. What we lack is a PC clone and a Microsoft: a platform so convincing that it would sweep all others before it, and so easy to build that everyone (except Apple, as always) would build versions of it, coupled with a company that would bring it all together and develop the definitive mobile OS to run on that platform. I had (and still have) great hopes for Android being that OS. But so far this seems not to be happening.

One final thought: we don’t HAVE to have the hardware platform. You can port an OS to any number of architectures; Linux does it, and it works. But we do HAVE to have ONE (or at most two) mobile OSes – the one platform that would sweep the market and create a standard. When that happens, when you can run any application on any cell phone, when the brand of your cell phone becomes a matter of price and convenience rather than a question of what that brand can and can’t do – then, and only then, can we party like it’s 1983.

Partly cloudy future

Intel has recently announced the “Pine Trail” platform for netbook and nettop computers, which allows for ever-cheaper (albeit lower-powered) computers for specific Internet-oriented tasks. Under Pine Trail, the CPU, GPU, and memory controller are integrated into a single chip, which is then coupled with a new southbridge chip called “Tiger Point.” This allows the whole platform to be based on a two-chip solution, rather than the three chips of the current Diamondville architecture, lowering its price and, more importantly, its power consumption.

While Intel is positioning this as a solution for developing countries and medium-to-large businesses, and as an alternative to NVidia’s ION platform (which includes a separate GPU, the 9400M), I submit that this could be an important step toward a new type of home environment: local cloud computing.

Consider the typical “westernized” family: father, mother, two point four kids, broadband Internet. The family budget, the parents’ email, and work-from-home functions are carried out on the computer in the parents’ room; Junior and Juniorette do their school work and social-network posts on computers in their respective rooms; the family photo albums and streaming media live on the living-room HTPC; and so on. Under the current scheme, the family would buy (at least) four full-fledged computers to answer these needs, plus additional hardware, software, and peripherals as necessary. Assuming the average life span of a computer is about three years, this family would end up buying 1.3 new computers per year – roughly US$1000 worth, at present prices.
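For the skeptics, a quick back-of-the-envelope check of that replacement rate (the four machines and three-year lifespan come from the scenario above; the implied per-machine price of roughly $750 is my own inference from the $1000/year figure):

```shell
# Sanity-check the replacement-rate arithmetic from the paragraph above.
computers=4          # parents' room, two kids' rooms, and the HTPC
lifespan_years=3     # assumed average useful life of a machine

# POSIX shell only does integer math, so lean on awk for the division.
per_year=$(awk "BEGIN { printf \"%.2f\", $computers / $lifespan_years }")
echo "new computers per year: $per_year"    # prints 1.33
```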

But of course, they don’t NEED four computers. The sum total of processing power required for all these tasks can be met quite easily by one decent-sized machine. It does not take much power to write a term paper, to surf the net, or to use Twitter. It takes a little more power to play HD movies, but that’s a dedicated job for the processor – that is, it’s a fair assumption that while you are playing an HD movie you are also watching it, so you’re not really doing anything else on the machine. Not that it really matters, either: a typical Intel Core i7 processor, with its four cores, could easily handle playing the movie and composing email at the same time. No sweat. In terms of available raw power, there’s very little the average user can do to seriously tax a modern CPU (with the exception of playing games or serious graphics/video-encoding jobs), so having four of them in one household is overkill.

Instead, what you could do is have ONE powerful computer (probably the HTPC) and three nettops or even netbooks. The HTPC has one property that makes it compelling for this: although it’s usually always on, most of the time it stands idle. This makes it a perfect candidate to serve as a file server, a media server, a print server, and so on. Of course, this is nothing new. The home-server idea has been around for years, with various degrees of success in its implementations. It seems that it’s just a little hard to get all this stuff working in a way that is easy to set up and use for the average non-techie. Configuring remote print servers is something the average tech-support person does regularly, but faced with setting up a TCP printer port on their own machine, most users tend to simply give up.

But imagine a home server that serves not only files and printers, but also actual applications: word processing, email programs, anything that can be squeezed into an applet, a servlet, AJAX, ASP, and so on. And furthermore, a server that can do all that simply, easily, with as few “scary details” as possible, with an interface so simple that even a grandmother can use it, and so undemanding of resources that it would run on anything that runs a web browser.

This is where cloud computing and home computing need to meet. The power behind things like Google Docs is that they allow you to use complex applications in a very simple environment. It’s a good idea, but limited: not everyone wants Google to read their documents, and there are some things (like streaming media) that simply don’t work very well when they have to be done online. A “Home Application Server,” which combines the benefits of centralized computing with the locality of media and the ease of use of a web browser, could be the ideal computing environment for the home, allowing the family to get the most out of its resources without the waste of duplicating so much hardware.

There are other benefits as well: a central server that runs most of the Internet interaction for the family is an ideal place for a web-caching service to make browsing faster, an up-to-date firewall and virus scanner to make it safer, and content-filtering and monitoring software to let parents keep an eye on what their children are doing on the net. It also allows for a much easier upgrade path: USB 3 devices can let a family add gigabytes of storage to the server at a fraction of the cost it would take to add them to separate computers. And when the time comes to upgrade the home server, they can simply be moved to the new hardware with minimal downtime.

With simple, cheap “access point” computers sharing the resources consumers reap the benifits of lower computing costs and electricity bills, while the environment benefits from having less obsolete computers being dumped every year. As we become more and more conscious of the impact computers have on our pockets and air, I can definitely see this becoming a major trend for the upcoming decade.