« Oops ... | Main | SFWA attempts to commit public suicide »

Commoditizing our future

(Sorry I've been so quiet this week; a seasonal cold turned into bronchitis, and I'm not entirely over it yet ...)

I've spouted off previously in this blog about my lamentably poor saving throw versus shiny! — not to mention my irritation at the refusal of the consumer electronics industry to render me bankrupt by actually giving me what I want. Trouble is, at long last they've turned around and done it.

I have in my possession an Asus Eee subnotebook. It cost me an eye-watering £220 — it's the top-selling laptop on Amazon.com right now, so there's a certain scarcity value attached, and they haven't yet sunk to their real price, somewhere around thruppence ha'penny. But the process is becoming clear.

Back in 1998, I bought a notebook computer: a Hewlett-Packard Omnibook 800, with trimmings. It had a 166MHz processor, 80Mb of RAM, a 4Gb hard disk, an external CDROM drive (reader, not writer), and a docking station. It was being discontinued, which is why I was able to walk out of the store with it for under £1400, rather than paying the full-whack £1900 that bundle was going for a few months earlier.

Compare with the Eee. On processor and memory the Eee wipes out the 1998-vintage high-end subnotebook, with 900MHz and 512Mb respectively. The disk space is the same, except the Eee uses solid state memory rather than a spinning mechanical thingy. The screen resolution ... the Eee has an 800x480 pixel panel to the Omnibook's 800x600, so we'll chalk that one up as a win for HP, but both machines can cope with larger external displays. For an extra £80, I bought the Eee 8Gb of additional storage media (an SDHC card), an upgrade to 1Gb of RAM, and an external CD/DVD rewriter. You don't need any of that stuff to make the Eee useful, but it's interesting to note that with it, the Eee is considerably more capable than the 1998-era high-end HP notebook, has triple the storage, double the battery life, and weighs less (with all its accessories, including the power supply) than the Omnibook on its own.

So, let me cut to the chase. Moore's Law suggests that every component of a PC halves in price on a roughly 18-month cycle. A desktop PC today should be roughly 100 times as powerful as a desktop PC of similar price 10 years ago, and a fiftieth as powerful as one of eight and a half years hence. A naive soul with no prior experience of consumer capitalism might ask why the manufacturers don't concentrate on cutting prices instead of doubling power. But that's not how the industry worked. Until now.
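The arithmetic behind those multiples is just compounding. A minimal sketch (the 18-month doubling period is the rule-of-thumb figure, not a measured constant):

```python
# Moore's Law as compound interest: price/performance doubles
# roughly every 18 months, so after t months the improvement
# factor at constant price is 2 ** (t / 18).

def moores_law_factor(months, doubling_period=18):
    """Performance multiple at constant price after `months` months."""
    return 2 ** (months / doubling_period)

# Ten years (120 months): roughly a 100-fold improvement.
print(round(moores_law_factor(120)))   # ~102

# Eight and a half years (102 months): roughly 50-fold,
# i.e. today's machine is a fiftieth of one from 8.5 years hence.
print(round(moores_law_factor(102)))   # ~51
```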

A couple of years ago Nicholas Negroponte of the MIT Media Lab launched the idea of a $100 laptop for education in the developing world. Well, the OLPC XO-1 is now out, costs $188 in bulk (a chunk of which is attributable to the dollar collapsing in the meantime), and hasn't exactly taken the world by storm — but it succeeded in sticking the proverbial cattle prod up Microsoft and Intel's collective arse. For too long, the software and CPU giants had been treating the PC market as a cash cow, with a natural floor on the price of the product; the XO-1 proved that they were overcharging grossly. Intel's reaction was the Classmate reference design, their own purported rival to the XO-1; the Asus Eee is what you get when a large Far Eastern OEM thinks "hang on, can we commoditize this and sell it in bulk?" Microsoft, incidentally, failed to make it onto the Eee bandwagon because they wanted $40 for a Windows XP license — on a machine that starts at $250 for the stripped-down version. Mine runs Linux perfectly well, thank you, and comes with the basic stuff you need to be productive: OpenOffice, Thunderbird for email, Firefox as a web browser, and some other gadgets (like Skype and a webcam).

The Eee isn't an order of magnitude cheaper than a normal laptop but it is close to an order of magnitude cheaper than previous ultra-lightweight subnotebooks. And I think I'm going to use it as a pointer to a future trend in the computer business, at the low end. The Eee is about 8 times as powerful as that 1998 Omnibook, at a quarter the price. That's an improvement of half an order of magnitude in one direction and close to a full order in the other. And it's a tipping point, I think, showing that the price points that have defined the goal posts for the personal computer business aren't set in stone.

The dirty little fact everybody in the consumer computer trade — Dell, HP, Microsoft, Intel, AMD, Apple, all of them — has been trying to ignore is that the computer biz is overdue for commoditization. There is no intrinsic reason why a kilogram of plastic and metal with a couple of silicon chips in it should sell for more than its weight in silver. Nor do we need ever-more-powerful personal computers; the heavy duty processing has been moving off our desktops and onto servers for years, and only idiocy of the first water (such as Microsoft's attempt to turn Vista into a surveillance state in microcosm) can justify it. Moreover, there is enough competition in this business that prices should be falling, steadily. Apple have staked out a boutique territory for themselves, and more power to them for noticing that they needed to do that in order to survive: but that's a small lifeboat, and not everyone can market themselves on being cooler than everyone else.

The Eee isn't quite the disposable computing resource I've been wanting — they'll have to shave a zero off the price tag for that — but it's close enough for now. It does the basics I need, runs portable cross-platform applications, edits open file formats, and if I leave it on a train or sit on it or something my immediate reaction will be to swear, check my backups, and buy another one, rather than to whimper and go talk to my bank manager. Which is as it should be. We've been held to ransom by these bastards for too long. The only remaining questions are: how long will it take before they wake up and realize the 30-year binge at the expense of the public is over? And how deep will the recession that follows be, once the personal computing industry deflates to its natural value (i.e. peanuts)?




I find this interesting because, to me, this feels like the second time the natural price floor has been broken. In my (fallible) memory, until about 5-8 years ago the computer I wanted for home use, one or two steps down from the high end, was always out of my league at around $5000. I'm typing this on my (expensive, niche) Mac and it's nowhere near that. I think it was around 5 years ago that non-toy PCs really dropped under $1000, later under $500, and that people weren't afraid to buy them for serious use.

I'll be delighted if you're right and we're really in for another price drop of a factor 5-10. Think of all the fun things we can do if programmable general-purpose computers, running a commodity hardware platform, drop below $100. (Yes, I know embedded systems are below that today. They often cannot be re-programmed or run a non-standard CPU, which means I cannot just slap standard software on them, especially codecs.)

Seriously cheap PCs will also mean the end of the shrink-wrapped software industry - who will pay an extra $600 (Windows OS + Office) if the whole computer is $100 and comes with Linux/xxxBSD and OpenOffice/KOffice?


Hildo: Seriously cheap PCs will also mean the end of the shrink-wrapped software industry - who will pay an extra $600 (Windows OS + Office) if the whole computer is $100 and comes with Linux/xxxBSD and OpenOffice/KOffice?

Now consider the combined market cap of the shrinkwrap software industry and the PC business, and contemplate what deflating it by, oh, 80-90% will do to the western economies.


I have to admit, I tend to find myself wondering how much the licence for the software is costing me when I buy a new PC these days. There was a brief, hopeful period a few years ago, back when the MS anti-competition business was going on, where I saw the possibility of a brave new future with the chance of being able to purchase a PC without necessarily having to accept whichever MS OS was being preloaded onto it as the OEM standard today. Alas, that fitful vision flickered and wavered and disappeared, which means my current situation is that I'm not going to be even thinking seriously about purchasing a new PC (despite the fact that my current one is reaching the end of its usable life - no working USB ports, floppy drive not connected, no DVD drive etc) until Windows Vista reaches at least service pack 2. As far as I'm concerned, the notion of paying Microsoft an additional $700 (AU) for the joys of beta-testing and being nagged to upgrade their software is just ridiculous.

Here in Australia, PCs are still largely in the range of $1000 to $5000 - for $1000 you're generally getting a "bitsa" box, put together by chip wranglers somewhere in the local area and with no manufacturer name on the box (so, something much like all of my previous boxen); for $5000 you're getting something with a "guaranteed" brand name (like Dell) or you're getting the latest and greatest in gaming machines with all the bells, whistles, chrome, and go-faster-stripes on the modem.

I already use Open Office for a word-processing package (have done so since I first ran into it as Star Office 5.0, from a CD on the front cover of a PC magazine) and wouldn't switch if I was paid to. (My greatest rave about OO is the size of their native file formats - when a file of 50K words is 663KB in .doc format, but only 161KB as a .sxw file, it's impressive to someone who carried most of their files around on a floppy disk for years. In these days of multi-gigabyte personal storage which is carryable on a lanyard, it's not much of a selling point, but I still find myself impressed by it.) I've tried my hand with Linux a couple of times, but not in the last 4-5 years. I've no doubt things have improved greatly in terms of user-friendliness for the less Unix-knowledgeable of us out there, which makes me more inclined to use it, rather than less.


In Intel's case, the word "recession" doesn't even start to describe what would happen. For the last 30 years, Intel has had a business model based on Moore's Law, and its inevitable corollary: that every 18 months the cost of running the assembly line for a PC component doubles, as does the cost of the engineering required for a new component. So Intel (and other IC companies, to a much lesser extent) has been in a Red Queen's Race with itself, constantly having to pay off the investment in the previous generation by adding huge fixed costs to the production cost of all their components, while simultaneously doubling that investment for the next generation, which is the only tool they will have to prevent the commoditization that would destroy the advantage they have over their competitors. This cycle meant that Intel, in a very real financial sense, has financed the development of all the hyper-integrated technology that makes modern computing possible. Take away the market acceptance of that constant drive for higher capability, and Intel doesn't have a workable business model anymore. Being a large corporation, they won't notice that right away, and will probably do some extreme damage to themselves while they try to turn the battleship around.

The greatest effect of the silicon revolution has always been based on putting as much as possible of the functionality on as few as possible components, and making as many of them silicon chips as possible. The cheapest and most functional solutions are ones that can be built on a single chip, with as few external components as possible. We can do that right now with a commodity PC (although nobody has, as yet; I predict someone is going to make a bazillion dollars by doing it sometime soon).

The only real survival strategy I can think of for companies like Intel that depend on a competitive edge based on high R&D investment is to start diversifying into other application areas and other system architectures than PCs. There are other markets the bleeding edge junkies can go after and the long-term consequences of them should make any SF writer or reader salivate. The server market is one that the big silicon houses haven't really gone after, which is why the most cost-effective server solutions right now are based on cheap consumer components in huge fault-tolerant clusters. What happens when someone puts a server, cpu, ram, and flash storage*, all on one chip and sells it for USD 25 in lots of 10,000? Or when a complete internet node with webcam, mike, speaker, wireless connection, and a multi-farad supercapacitor and solar cell, capable of powering the chip indefinitely in low duty-cycle mode can be manufactured for USD 5?**

These sorts of markets haven't been a high priority for the industry up to now because they've been making their nut in the consumer PC market. When the bottom falls out of that, we may see the world change drastically as the industry moves into other areas in a big way to replace the losses.

* and maybe even power-conditioning circuitry, so the power efficiency of large bunches of the things goes toward maximum? And perhaps ventilation as well, in the form of electrostatic air movers?
** This is known in the trade as "smart dust", and Vernor Vinge has had a few things to say about what it means.


Charlie Stross @2 said:

Consider the combined market cap of the shrinkwrap software industry and the PC business, and contemplate what deflating it by, oh, 80-90% will do to the western economies.

Hmmm. The main thing I can see happening is things like Operating Systems, office suites, graphics manipulation packages, and similar all dropping in price to the point where they're matching prices with things like games. The market for PC games is pretty much "whatever the traffic can bear" already, and this means there's a range of games available at prices from $120 (AU) down to $20 (AU) - the cheaper ones are usually the ones which are being re-issued, which don't require the latest and greatest equipment to run, or which have been selling so slowly the retailers want to get the shelf space back so they can put another copy of "World of Warcraft" on it. The median price is about $80-ish - a bit higher if it's new or popular, a bit lower if it's older or past the first flush of popularity. PC gaming in itself is starting to face serious competition from the console market, from what I've seen locally - things like the PS2, the PS3, the XBox and the Wii are able to do all the things that a high-end gaming PC can do, and they cost less (the most expensive console at the moment is the PS3 - that's about $999, roughly equal with the "bitsa" PC I mention in #3). So do the games, by and large, and the games are what people buy the consoles for.

So, the shrinkwrap software industry as a whole will probably start resembling the game software industry a lot more - in order to be able to charge high prices, you have to provide something new, something interesting and something genuinely different to the majority of the stuff out there. It doesn't matter how many man-hours you put into your product, you're not going to be able to charge people more for it than they'd pay for the PC to run it (see the pricing of games for the hand-held consoles - Gameboy, DS, PSP etc - for further reference). Up at the corporate end of town, we'll probably see a lot of the bigger players starting to take major hits, because this puts the ball back in the customer's court - and many of those customers will be asking for things like better interaction between various tools they're using. As a result of this, I think we might see things like the various Microsoft products becoming much better at playing with others.

I also suspect that there'll be a greater appreciation for the work done by tech support staff. I foresee the realisation on the part of some of the larger firms that the support desk is where they're going to be making the money now. So, it'll probably be paid support, in that you'll have to pay for support calls, but in return, you'll get good support. Buying a warranty will actually mean something (rather than the present situation, where it just says you have money to spend and a willingness to be ignored/put on hold for long periods of time).


The Eee isn't quite the disposable computing resource I've been wanting — they'll have to shave a zero off the price tag for that...

The obvious question - how long ago was it that the average consumer mobile phone sold for £220? Because we've certainly had most of that order-of-magnitude drop.


Huge problem with the pointer, Charlie: indium and other rare metals used in computers. The price of indium currently stands at over ten times what it did a mere five years ago, and this has badly affected the price of LCD screens. This is going to get worse. Much worse.

Also, as an interesting point, the GPU companies have stopped doing something: they've stopped pricing graphics cards at one-to-one currency parity with the dollar. The European cards (when you look at them minus VAT) carry less than a 10% markup.

Anyway, £220 is far too much for something with the limits of the Eee - the battery life is simply far too low, and the only way to get a spare battery is to buy another Eee.

I also have to disagree that offloading the work to the server makes sense in the home. In business, perhaps, but the nature of home connections (UK actual broadband connection speeds and reliability have fallen in the last two years, and will continue to fall - especially given the massive costs of the 21CN network to ISPs) means it's not even remotely viable outside the LAN.

Meg? Er... the PS2, Xbox 1 and Wii don't have as much computing power as PCs of some years ago, and the top-end PCs now already outstrip the gaming performance of the PS3 and 360.


Not all components of a PC obey Moore's law. The case and keyboard are already at commodity levels. The LCD screens follow a much slower price/performance reduction trajectory - although OLEDs may force a big drop in the next few years.

The biggest drive to commoditization of prices is that the CPU has topped out in clock speed. Moore's law is being used to add more cores per CPU chip, but applications cannot yet make this extra power work for them - so even CPU hogs can only use a fraction of the available power at the moment. As a result, there is little reason to buy a premium machine as the performance gains are not perceivable.

I think Intel understood the issues quite clearly over a decade ago. They encouraged and invested in applications that were CPU intensive, and Microsoft obliged by making Windows ever more bloated. [As an aside, I once installed Windows 95 on an ancient HP 486, stripped out every piece of crap I could to slim down the OS, and ended up with a machine that booted up in a few seconds - yet still ran basic applications normally.]

Note that Microsoft is offering a Chinese version of Windows at $3 a copy, to counter the Linux and pirated Windows OSs there.

So all the signs are there that basic PCs should become like simple calculators - you give them away in cereal boxes.

BTW - I really like the extremely low energy consumption of the OLPC - that makes it quite feasible to run off a small solar panel. Removing the weighty batteries from a PC would be the largest contribution to weight reduction and form factor changes.


Andrew: indium, LCDs, yes. But there are OLED and IMOD and epaper waiting in the wings, all with different input materials and different performance parameters. Worst case, if indium turns into a huge bottleneck for LCDs, it'll make it cost-effective to spend the development bucks to bring OLED up to speed as a replacement technology. And so on.

Now, batteries are indeed a bottleneck technology -- they don't follow Moore's Law at all. But again, look at epaper. Today it suffers from rubbish latency and a slight lack of colour (in any escaped-from-the-lab form, anyway), but if they can speed it up by an order of magnitude and get colour working, then we'll have something that will simultaneously make the LCDs obsolete and reduce the power drain of one of the two most juice-hungry components of a PC by more than an order of magnitude.

Broadband connections ... yes: but I'm not convinced that's a technological bottleneck rather than a f*cked-up-industry bottleneck. I suspect it's also hugely aggravated by the fact that we haven't gotten the idiot film and music cartels to switch from a 19th century copyright model onto a compulsory licensing scheme. Once compulsory licensing is in place (with a blanket exemption for downloading content, because users are paying for a license as part of their ISP fees) we can begin to clear up a lot of the p2p traffic that's clogging our networks and replace it with a sensible content-on-demand system with local caching. (I hope.)

The one constant I can predict and be certain of is that the future will not resemble the past. (There may be bits of the past embedded in it, but the future will invariably be more complicated ...)


Speaking parenthetically, there are three things that could improve the Eee experience hugely for me (apart from dropping Xandros and switching to Ubuntu):

1. More FLASH memory. 4Gb is marginal. (But they're promising that for later in 2008, anyway.)

2. Built-in 3G broadband. (Again: promised for 2008.)

3. A non-backlit purely reflective display using something like epaper or IMOD. Doesn't have to be fast enough to play games or video -- just for email/word processing/web -- but if you can cut the power drain from that display to zip, then you can go from a 3.5 hour battery life to about a 6-7 hour life using the same cells. Going to 1024x600 resolution would be great, too (and cheap-ish 7" LCD display units in that resolution are available -- for example, Kohjinsha and some of the UMPC vendors use them).
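The battery-life arithmetic in point 3 is straightforward division of capacity by draw. A hedged sketch — all the wattage and capacity figures below are my own illustrative guesses chosen to match the 3.5-hour baseline, not measured Eee specifications:

```python
# Rough battery-life arithmetic for the "3.5h -> 6-7h" claim.
# Figures are illustrative guesses, not measured Eee numbers.

battery_wh = 22.0        # assumed battery capacity in watt-hours
system_w = 6.3           # assumed total draw with a backlit LCD
display_w = 2.8          # assumed share taken by backlight + panel

hours_backlit = battery_wh / system_w
hours_reflective = battery_wh / (system_w - display_w)

print(f"backlit: {hours_backlit:.1f}h, "
      f"reflective display: {hours_reflective:.1f}h")
```

The point generalises: because runtime is capacity divided by draw, removing a component that accounts for nearly half the power budget nearly doubles the battery life with no change to the cells at all.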


You think the Asus Eee is cheap and small? Check out these mini PCs, one of them under $100 US --




Charlie - I think there's actually an OLED product finally due out in Japan soon. Sure, there are other techs but in the next two to three years it's going to push the price of the mini-screens up nastily.

As for broadband, yes. It is an industry problem, but one which isn't going to get sorted in the UK in the next few decades unless something radical happens because of BT and an utterly, utterly incompetent and impotent industry regulator.

"sensible content-on-demand system with local caching"

So something like...oh...NNTP? (I hate NNTP specifically, but something done even slightly differently..)


Charlie @2: Now consider the combined market cap of the shrinkwrap software industry and the PC business, and contemplate what deflating it by, oh, 80-90% will do to the western economies.

I've seen various estimates that 60-70% of all software is custom-made, not shrink-wrap. In my workplace (an investment bank) our IT costs are over $3B, and a small fraction of that is spent on shrink-wrapped or ready-made "enterprise" software. We have over 1500 software developers in NY alone, writing code that only our firm runs.

So I don't think the deflation of software houses will be a big deal. The importance of in-house software, of customization, and actual customer interaction, will increase. And that's long overdue.


Another SF ref for ultracheap computing would be Bruce Sterling's "spun straw" stuff in "Distraction" -- fueled in part by a Chinese release of virtually all intellectual property they can get a hold of to the public.

But the computing universe has gone through interesting cycles of centralized-decentralized (google for "Dogbert's First Rule of Consulting"): The web almost brought back the mainframe, and it looked like we might just carry ID keyfobs to public thin terminals to do most of our play^h^h^h^hwork. Malware has made that a little less attractive, but it's still an option.

We're still a long way from goggles-and-rings for user interface, that stuff is still pricey. But phone "slates" like Crackberry and rumored Nintendophones bring this convergence a little closer.


Hildo @13: dead right. But the shrinkwrap market is the chunk of the software iceberg that's visible above the industry waterline.


I view the ultimate problem at this stage as being power — electrical power — and not component costs. The "killer micronotebook" must be able to handle a twelve-hour day without getting near a power outlet: Cab to airport to commuter-type aircraft to airport to cab to meeting to cab to hotel. (Or, for the fen among us, a full day in the dealer's room, art show, four or five panels, a smoke-filled bar with one's editor/agent/favorite author/favorite Klingon, and a couple of hours in the con suite before getting back to one's room.)

That's a problem of chemistry and basic physics, not of engineering. Almost all of the advances we've seen in the last 25 years have resulted from engineering solutions to reasonably well-understood chemistry/physics/materials problems (at the basic science level, anyway). Unfortunately, what we don't understand at the basic science level about energy storage/production is running right at the edge of our current engineering prowess for energy storage/production. That's one of the reasons that the Eee is so power-handicapped: Its batteries can't get better and lighter with current knowledge... just bigger, and being drawn down less.

The "stuff it full of AA/AAA batteries and let people use rechargeables if they want" solution isn't any better, as the power loss in the connections wastes almost all of the purported extended life! And the less said about a "multi-farad supercapacitor" the better... because that's bounded by some pretty basic physics.

So, for at least the foreseeable future, I don't see things getting really much better than the Eee, and it isn't quite good enough for me to consider switching. Of course, all it will take is a fundamental breakthrough of some kind to change "foreseeable" from "five to ten years" into "three months." The problem with fundamental breakthroughs is that they're quite unpredictable (one of my major annoyances with Civilization-type tech trees in games).


CEP: We've got an energy bottleneck, true. But look at it from the other end: we have a bunch of components with various power drains and if we optimize the hell out of them we can draw less power. The core components of a current laptop are: fixed storage (hard disk or FLASH disk), CPU, and display. The fixed storage we can work on by switching to FLASH or using physically smaller rotating media. The CPU ... do we really need much more than a 624MHz ARM9 core for toting around in an ultramobile notebook? Those things will run off milliwatts. Even an Intel core-architecture CPU doesn't have to be a stupendous power hog if you underclock it and optimize for power efficiency.

The problem we're left with is the backlit screen. Staring into a light bulb is something we've come to associate with laptops, but it ain't necessary; back in the days of the early 1990s when we still used B&W screens, I had an HP Omnibook 300. It had a non-backlit black and white LCD -- and a battery life of up to 10 hours. Those backlights suck juice, but they were really only required because colour TFT screens are lousy reflectors. Switch to a reflective rather than a transmissive display technology, and I'd expect the power consumption of the display unit to drop like a stone. Switch to one that only draws power on refresh (like epaper) rather than thirty or sixty times a second, and you'll save even more.



There's an additional power savings with using a slow core: you get to use slow DRAM, which uses considerably less power. If we could get flash ram in random access arrays that had reasonably fast access times at reasonable prices, the power drain would go way down.


The shrinkwrap industry isn't composed of the same companies as the enterprise industry, and it's a much larger market cap, just because there's no inhouse development backing it up. If Microsoft alone were actually to have to do a major restructuring and lost most of its value, it would do a lot of damage to the stock market, as well as put a lot of people out of work, and take a lot of aftermarket and service companies with it.


The best route to an order of magnitude increase in power storage I know of is nanostructured battery elements. If that works well, it has the potential for a Moore's Law-style development curve that could easily last a couple of decades before structure size hits the monatomic wall.

That the PC is the mainstay product of the computer business is somewhat of an accident of history; it happened to be the market that, in the technical and financial environment of the 1980s, provided the least resistance to the imposition of a 2 to 3 year product life, thus allowing easy recouping of the massive R&D investments required to keep Moore's Law trundling along. And we're talking here about a specific market segment: ultralight notebooks. The technical constraints on desktops are not the same, nor are the marketing considerations. The same kind of deflation will eventually occur in desktops too, but that market, and the TV/entertainment device market, could serve as a cushion to give the industry some time to find a new business model that's more profitable than commodity but more resistant to technical development. I predict they won't do that, but will instead thrash around trying to keep the ancien regime going.


For those here who know a little basic economic jargon, it's worth remembering that hardware and software are complementary goods, not substitutes. Yes, people sometimes balk at paying more for their software than for their hardware, but when the software is something they feel willing to pay for, the less they're spending on hardware the more they can ante up for the software.

The real story with the OS pricing and Linux ending up on more and more inexpensive computers - Dell is starting to gradually push it more - is that the basic functions of operating systems have become a commodity, after a couple of generations of computer science development. Only Windows' monopoly dominance plus a hell of a lot of marketing has enabled them to maintain the illusion that an operating system is worth a lot of money. Ditto for many standard applications like word processing, browsers, basic databases, software frameworks, etc. (Programming environments and languages hit this point a while ago: the GCC toolchain, Perl, Python, Ruby, Eclipse. The end effect is that MS gives away a 6-month free version of Visual Studio to keep developers interested, and has cut the purchase price of basic VS versions to well under $100, IIRC.)

But the same complementary good relationship is true between operating systems and applications components, and the applications themselves. If your OS is free or nearly so, your WP is free or nearly so, your DB server is free or nearly so, then if there's some highly specialized or personalized software you really want or need, you can now afford it. These are the software development areas I'd expect to boom. Unfortunately I have no idea what they are.

Incidentally, I would expect these factors to mean that Apple suddenly takes a big profitability hit a few years down the road; they're finally perfectly positioned against the last software and hardware market, so they should get a couple more years of great profit and market share gains, and then I'd expect it all goes blooey.


Bruce: I'd argue that the market for low-end desktops has already undergone most of that deflation. Just a few years back, the minimum price for a desktop was over $1000, with the manufacturers making every effort to push you up to $2000 via add-on options. Now Dell's pushing desktop systems with a starting price of $349 (without monitor.) That's roughly a 3-fold price drop, and of course you get more power in that desktop than you did in most desktops of a few years ago. (The next step up, $499 with a monitor, is dual-core, so probably more equal to the typical servers of 4 years ago.)


CEP@17: They can start by selling me additional batteries (and not at 50% of the unit price, either!). Cheaply. Oh, and a charger unit. And a connector I can plug into the unit on one end and the battery on the other, so I can keep it running on battery power when having to shut down for a minute isn't really viable.


"There is no intrinsic reason why a kilogram of plastic and metal with a couple of silicon chips in it should sell for more than its weight in silver." I'm sure that argument could be applied to the constituents of the human body to terrible effect...

There are still problems all over. Symbian 60 seems to need more RAM to operate effectively on my Nokia phone than Windows XP does on a PC, and takes longer to boot. I also wonder how effectively OpenOffice et al. will continue to develop without Oracle, Google and Sun pouring megabucks into them simply to kill the Microsoft cash cow. The commoditization may in the end be paid for by advertising space on every spare pixel.

I haven't looked into it lately, but has anything been done to break through the problem of write-endurance limits on flash memory? If you have a database application writing record-level updates, you can wear out your storage medium extremely quickly.
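A quick back-of-envelope shows why this worries me. All the figures below are illustrative assumptions (typical-sounding numbers for NAND of this era), not specs for any real device:

```python
# Back-of-envelope: how long a single "hot" flash block survives under
# constant record-level rewrites. All figures are illustrative
# assumptions, not specs for any particular device.

ERASE_CYCLES = 100_000       # endurance commonly quoted for NAND flash
WRITES_PER_SEC = 50          # a busy database hammering one block

hours = (ERASE_CYCLES / WRITES_PER_SEC) / 3600
print(f"hot block dead in ~{hours:.1f} hours")       # ~0.6 hours

# With wear-levelling spreading those writes across the whole device
# (say 30,000 erase blocks on a 4GB part), the lifetime scales up:
BLOCKS = 30_000
years = hours * BLOCKS / 24 / 365
print(f"whole device lasts ~{years:.1f} years")      # ~1.9 years
```

So without wear-levelling a hot block dies in under an hour; with it, the same write load is survivable for years, which is presumably what the Eee's flash controller is counting on.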


Hmmm... I'm having a thread like this on LJ at the moment, and I find myself more or less in the opposite camp. I think the phone business is an interesting place to look. For a long time (well, a couple of years around 2002) many people were demanding that the phone business grow up and become commoditised like the PC business; the problem was, of course, that it already had.

The software costs on typical phone platforms are a pretty small fraction of the BOM costs; even something frighteningly expensive like WinMo or Series 60 is a small part of the overall cost. The real software costs rack up in adapting the essentially low-priced stuff you license and putting it together the way you like. Depending on your starting point and feature set, you're looking at somewhere between 20K and 100K man-hours. Assuming around $75/hr for Western engineers and $20(ish) for Eastern ones, that's still a lot of money that needs to be recovered in unit sales. (And that will go for Android too, if half of what I'm hearing about it is true...)

There will be some things that people will be happy to do for free, but there's a lot of dull, hard, crap work, especially in the phone business, which won't get done, or will get done so badly it won't be good enough to ship in a mass-market device. So while software prices are going to drop, there will come a point where the needs of the mass market and the costs of actual mass-market-quality software development intersect.

I have no idea where that price point is; it'll be lower than MS currently charge, but probably more than they charge for WinCE Professional ($16).

It could mean the end of Intel's drive for ever faster multi-core processors, but there's still a lot of development work needed in low-power, higher-speed mobile and embedded processors, where there's a lot more competition: ST, Freescale, Qualcomm, Marvell, TI, Infineon, Philips, Toshiba, Samsung, etc.

Of course, lots of silicon means lots of NRE porting stuff to it that isn't free either, which in turn...


Symbian 60 seems to need more RAM to operate effectively on my Nokia phone, than Windows XP does on a PC

I'm assuming that you're talking about Series 60 running on Symbian? I'm pretty sure that's a 64MB OS, probably running 64MB RAM/64MB ROM (128MB total) on a sub-400MHz processor with an ARM9 core. The processor is probably the real problem here. One of the reasons the iPhone looks better and runs better than other phones is that its processor runs roughly twice as fast and it's got a lot more memory.

The real issue here is that Apple, by dint of who they are and what they sell, don't face the price pressures that phone makers normally do, and have been able to throw away the BOM rule book.

You can build a GSM phone for sub-$15 (probably sub-$10), but it won't do much. The component costs alone for something that will be interesting, look good, and run a smart OS and platform are still over $150, which is a good $50 more than the global carriers want it to be (they have a target of sub-$100, at which point they can start weaning users off subsidised handsets).

The killer is how you amortise the NRE of getting the product developed, tested, certified and into the shops, and why the economics of the phone business are so bad. For every RAZR or T610 you'll have box loads of duds, all of which will have cost you basically the same fixed cost in software development. Making the basic core software free doesn't solve that problem if you still have to pay people to do the dev work, even if that dev work is done by a thousand dirt-cheap software engineers in Nizhny Novgorod.


CEP@17, the OLPC uses crank power. Would you find that acceptable for 24-hour power?

Charlie, the 10-year-old Mitsubishi Amity subnotebook (http://www.businessweek.com/1997/46/b3553067.htm) I have was $500 with all the extras I could add. It has a 7.5" color LCD screen, the display is 640x480, and it weighs 2.4 pounds. There are two problems: 1) Mitsubishi stopped making subnotebooks, so the upgrades I expected didn't happen, and 2) the peripheral market changed to USB, so there aren't other companies providing add-ons.

So the big difference between what I have and the Asus Eee is really better upgrades and peripheral support. It'll probably come out more expensive than the Amity, too. And I definitely think I'll wait for the next edition, since it will take me a month or so in the new year to recover from this year's Medicare Plan D.


My Z88 ran 40 hours between battery changes, had a full-sized keyboard and sufficient memory for what I did, and fitted into the top of my bag for visits.

The EeeeeeeeePC does indeed look like the current tool, but ...


Adrian: I've still got a Z88.

You omit the fact that it ran for 40 hours on a four-pack of alkaline AA cells, that if it ran down you had to change cells PDQ before the backup capacitor discharged and it lost its memory, and that the only non-volatile backup memory cartridges for it were EPROMs that took half an hour under an ultraviolet lamp to erase once you'd filled them.

I'll grant you the 40 hour battery life, but I'd like to point out that the Eee's power supply is a whole lot smaller than a Z88's brick. Indeed, it's about the size of a last-generation mobile phone charger. And how often do you have to work for four hours without any access to a mains socket (or enough privacy to change a battery)?


Hah, scooped you!

Actually the OLPC crank has been replaced by a pull-string, but a small solar panel also works.


Dave @23: You raise a good point about licensing issues. There's a lot of IP to pay for: MP3 and other codecs (not all of them, of course), DVD, USB, blah blah blah. And don't forget language issues: aside from UI translation, developing tools for language parsing, speech recognition and handwriting recognition generally involves licensing lists, texts and samples from third parties. Building your own isn't necessarily cheaper by the time you've assembled enough computational linguists to do the work. Want to provide the tools for one of the many, many languages with fewer than 10 million speakers? The rates go up due to lack of competition.

Want it all really, really thoroughly tested before it goes out, rather than shipped prematurely, whether it be from MS, Apple, Nokia or Mozilla? Then you're going to have to pay, pay, pay.


Mike: Where the PC industry does have it easier than the phone business is that there are no significant third-party certification requirements on the final device. Launching a device for the global phone community requires third-party GCF testing by an accredited testing agency and, typically, approval from the operators on whose networks you plan to deploy it.

It's not all window dressing either. A few thousand phones with poor radio management can seriously affect the base station controllers in a network. There are things you can do with 3G QoS (Quality of Service) demands which are plain scary when you start to think of an era of more open radios and software-defined radio.


Dave @30: I can't speak to the hardware, but PC software will usually have to be certified as complying with various standards for accessibility (US government), privacy and encryption (various EU rules), etc., in order to be sold to government agencies, including schools.

I've been out of the game for a few years, but every year there were more and more of these to work through.


No, the crank isn't going to cut it. Just try deploying it on an airplane that's been sitting on the tarmac for 45 minutes, pushed back from the gate due to a weather hold at the destination. The same goes for pulling a string (which I'd expect to result in some stupid doll-like phrases anyway).

My point was that battery life is going to be a limfac (limiting factor) for the foreseeable future; no matter how much more efficient we make screens, processors, etc., we're still going to be dealing with the same power budget, in a way that differs from the progression in processor power, memory/drive density, LCD size, etc. over the last couple of decades.


7W is a fairly small solar panel, I think. I realise that would mean going outside...

On a related note, has anyone ever made a laptop with a translucent cover, so ambient light would pass through the LCD, and you could turn off the backlight? I'm tempted to try it with my elderly Thinkpad.


Power: I know it costs considerably more than 300, but I can't complain about my Sony Vaio TX1. It's put up with almost two years of my business travel, a half dozen major trade shows and umpteen drops and dings, and still gives me a solid 6-7 hours out of its battery; it can play two DVDs on a plane in DVD-only mode and weighs in at under 2kg. With my Kensington 70W power-everything-I-carry pack it's an amazing piece of kit.

I can do London-Seattle doing some work and watching some TV, and still not have to plug in at the seat.


The low-end notebook market in the US is really dropping in price. While they aren't as lightweight as the Eee PC or other subnotebooks, you do get a larger screen and more storage for around the same price or less. I just picked up an Acer Aspire for $350, which is more powerful than the desktop computer I was using 5 years ago.

In the near future I wouldn't be surprised if low end notebooks replace low end desktops as the starter computer of choice for light users. The costs aren't much different, and neither are the capabilities. You get a bit more in a desktop, and they'll always be more expandable, but most light users aren't going to max out their machine's capabilities or go poking around inside the case.

And notebooks offer the advantage of being at least semi-portable and having a smaller footprint around the home. It seems like every college student who isn't a PC gamer has a notebook now, and I find it likely they'll stick with them after graduation and once they move into the professional world.

So in the coming years the place where we'll likely see the most competition in the market is between low end notebooks and subnotebooks. Desktops will be mainly something reserved for the office, professionals who need a lot of power, or gamers.


Thanks a bunch for your very helpful rundown of the Eee's pros and cons -- I think it's going to be my next computer after my 12" iBook gives up the ghost, probably in another couple of years. Like CEP said, the short battery life's a killer for me; hopefully by then they'll have figured something out.

Now...is 12 Gb of storage enough for a primary computer? I suppose if one stored music, etc., on external hard drives...


Remove the screen, sound hardware and input devices-- and add some storage (and ideally upgrade to gigabit ethernet) -- and you have the basis for a nice, quiet, storage / backup appliance.

Also of interest in the storage world: BitMicro are to start shipping (in volume in Q2) 3.5in flash HDDs with up to 1.6TB capacity each. No prices as yet, and I'm still waiting to hear how many rewrites you get per sector, and, more importantly, how cool they run. But it's an indication of the direction things might go.


Clifton @ 20

I don't think desktops have bottomed out yet. Exclusive of the display, a low-end PC isn't much more complicated than a commodity DVD player or a TiVo (which, really, is a low-end PC), and those retail for 80-100 USD. That looks like a bottom price point for consumer electronics to me: down where the components are dirt cheap and most of the cost is assembly and supply chain. So with the display, commodity PCs should sell for around $150-180. Once we have mass production of a display that doesn't use scarce materials or require expensive assembly techniques, I think the price will go down to $100. I give that four years max.


Interesting discussion, but premature, I think. We're still well within the feature-driven phase of the computing revolution (vs. cost-driven). If people were ready to start pushing the market based on cost, Linux would be on millions more desktops, no one would make PC games any more, and items like the iPhone wouldn't sell like crazy. No, people are still choosing their electronics on a feature (and "cool factor") basis.

The only way we'll really see a move to commodity PCs in the near future is if the technology advances to the point where someone can produce the same features at commodity prices. Not "almost as good" features, like many of the iPod clones; not "just as good but different" like Linux vs. Windows. Until then, this stuff is going to remain a purely niche market because your average consumer is willing to pay for those extra features on the higher-cost PC (even if he never uses them).


In the Royal Society archive is a video of the 2006 Clifford Paterson Lecture, given by Professor Richard Friend, discussing plastic/polymer semiconductors, which can be used for displays and laid out using inkjet printers. Well, at least in the lab. It does seem an interesting direction and should help costs come down. It could open up a whole new approach to chip fabrication once you've got an inkjet that can print semiconductors.
RepRap for chips, anyone?

I can't link directly to the video but here's the link to the main page http://www.royalsoc.ac.uk


Andrew@7: the Eee PC has been out for less than a month. There will be third-party batteries available by the time the original one won't hold a charge. I'll bow to wiser heads than mine and agree they probably won't be much better, although my old 3G iPod runs much better on its third-party replacement battery than on the one it came with.

Charlie: you mentioned you upgraded the RAM. I understand this voids the warranty - can you confirm?


There are some questionable assumptions being made here.

The first is that increased processor power does nothing to improve our lives, and merely allows hardware makers to fleece us for more cash while lazy software developers write sloppy bloatware to suck up the extra processor cycles without any functional benefit.

The second assumption is that higher-bandwidth, ubiquitous internet access naturally implies that we will return to a thin-client, thick-server model of computing.

I think both assumptions are false, or at least require independent justification rather than being held as self-evident.

Faster computers have several benefits. Firstly, the sales of expensive, fast computers subsidise the cost of cheap, slower hardware such as the iPod Touch I'm writing this on, a machine comparable in power and price to the laptop you describe. It has always been the case that fast, expensive chips subsidise the slower, cheaper versions. If we stopped pushing forward the state of the art, the new generation would not exist to become the old generation that ends up in budget hardware.

The idea that we could focus on reducing cost INSTEAD of increasing power is a false dichotomy. It is only by increasing power that we can pay for the falling cost of the older technology. Unless, that is, you're proposing that the electronics industry simply stop making faster machines and lower its profit margins year on year to keep the cost of hardware falling at the same rate.

As for the idea that servers will become more powerful, this also makes no economic sense. A server is a computer bound by the same rules as any other. If a thousand users of a web site all buy a computer half as powerful and push all the processing to the server, the server must now take on a thousand times the workload that each individual client has offloaded. The cost of that hardware upgrade will then be pushed back to the consumers, and on top of that they will also have to pay for the extra bandwidth required to shuffle data to and from a remote server when it could have stayed on the client. Then there's all the overhead of wrapping that data in layers of encryption to stop it being snooped on or tampered with in transit.
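To make the arithmetic explicit (the figures are purely hypothetical, just to show the aggregation effect):

```python
# Hypothetical figures for the thin-client cost shift: work removed from
# each client piles up on the shared server.

clients = 1000
client_work = 1.0      # arbitrary processing units per client
offloaded = 0.5        # each client pushes half its work to the server

server_extra = clients * client_work * offloaded
print(server_extra)    # 500.0 units of aggregate load the server absorbs
```

Each client saves half a unit; the server gains five hundred. The per-client saving is tiny, the central cost is not, and someone still has to pay for it.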


David@37: You do realise that a lot of routers can have USB hard disks plugged in, right? Heck, one of the ASUS routers has a built-in hard disk. No need for another box :)

Harry@41: Except I'd want 2 batteries straight off - I do travel for longer than 3 hours on a fairly frequent basis. (Well, it shouldn't be longer than 3 hours, but given the English transport system...)

Incidentally, looks like ASUS has managed *again* to violate the GPL on the Eee release. I'm sure it'll get sorted (again), but you'd think they'd not need prodding to do it one day...


Harry: there's a sticker across the memory hatch that says "warranty void if removed or broken". I would be interested to see whether that holds up in court in the UK, as I suspect it violates the Sale of Goods Act. On the other hand, the Eee is basically solid state, with no moving parts, so after a basic burn-in period it's hard to see what might go wrong with it within a one year warranty period.


Talking of computers ...

Someone has won the Wolfram Turing prize - see here:


An interesting detail re: EEE PC & RAM comes from ThoughtFix:

"It [the EEE PC] takes both 1GB and 2GB SO-DIMMs, but only 1GB is recognized in Xandros. XP [which runs on EEE PCs, too] recognizes 2GB. It's just a kernel option on the Xandros side."


Xandros is, IMO, a rather crappy distro; it's a Debian fork (so installing Debian or Ubuntu packages can break it badly), it's an out of date Debian fork at that (hence things like OpenOffice being stuck at 2.0 on the Eee, rather than the current version 2.3), and there's lots of stupid nonsense stuck to it. For example, Xandros supply a frankly inferior file manager for KDE that isn't as functional or elegant as Konqueror (which is still there if you know what to look for); they've got that silly maximum memory limit compiled into their kernel; they've left out HFS and HFS+ filesystem support (hello? Ever heard of Macintosh users? Or Mac-formatted iPods?); the version of Python is out of date; and so on.

Once someone digs up a workable driver for the wifi hardware (so I don't have to dick around with ndiswrapper) I will probably blow Xandros off my Eee and stick Kubuntu on it, clawing back about a gigabyte of disk space and adding functionality while bringing the apps up to date.

But you may take this to be a complaint by a Linux gearhead. As a second machine for someone who is not a Linux gearhead, it's fine.


Asus have apparently released the source code required under the GPL: see


Sounds like an oversight, rather than deliberate breaking of the GPL.


Spike, I'm pretty sure that's the ACPI driver, not the wifi chipset. Still handy, though.


Skip @39.

re Asus being a niche market - I have current visibility of lots of orders (note: I don't want to get fired for giving out numbers, but let's say 'many' thousand); that's in the first month of actual release, and with people willing to wait until after Christmas for delivery.

They are in such demand that I know certain folk are making bets on taking orders for 40,000 units before '08.

And that's from selling to a niche market.

You forget that the only clowns who want to pay £3k for a laptop are (generally) the ones who have no idea what to use it for.

The Eee, however, appeals to exactly the right people: our children, who by and large are (or will be) far more tech-savvy than we'll ever be.


Incidentally, I would expect these factors to mean that Apple suddenly takes a big profitability hit a few years down the road; they're finally perfectly positioned against the last software and hardware market, so they should get a couple more years of great profit and market share gains, and then I'd expect it all goes blooey.

Apple is rapidly transitioning from being a PC manufacturer to being the highest-margin consumer electronics manufacturer. I'd say in a few years most of their revenue will come from the iPhone and similar gadgets. They're Bose on steroids, only instead of being based around loudspeakers, they're based around anything that needs a decent user interface, good industrial design and the ability to run software applications. They're already years ahead of Dell, HP and other PC-industry rivals in moving into the general consumer electronics space - those firms will probably never catch up (look for HP to sell off their PC division to the Koreans or Chinese in the next 5 years).

Microsoft is also screwed. Nintendo is eating them alive in the games market, their mobile platform is under siege from above (iPhone) and below (everybody else), desktop computers are rapidly becoming commodities, and they still aren't a dominant player in the server space even with all of their monopoly money. When that dries up, they dry up.


Wow, what a fascinating discussion.

True, low-end laptops have gotten quite cheap. However, as Charlie mentioned, what may be game-changing about the Asus Eee is its weight. All other two-pound laptops are significantly more expensive. Most UMPCs weigh around two pounds, but they're also more expensive (and don't offer real keyboards). Laptops costing close to what the Eee does probably weigh around 5lbs. The Eee may spur more interest in, and development of, really lightweight laptops.

As for computational power, for what an ultralightweight low-cost computing device is expected to do, a 624MHz ARM9 undoubtedly serves fine. However, I think there's a difference between what people need and what people want. The affordable product that would suffice and the product that people are willing to buy may not be the same product. (OTOH, drop a zero from the end of the price, and no one may care. "The cheaper it is, the better the perceived performance" may be an axiom. Obviously, this fails at the limit.)

As for the "lower price" vs. "increase power" argument, commodization is clearly happening. IIRC, I read about Walmart selling a $200 Linux box. It's not hard to find a sub-$1000 laptop. The interesting question is what happens to the computing industry in the face of this. Not every computing device will be a commodity device, at least in the near future. (We won't get there until we have more computing power, and storage than anyone knows what to do with. For some people, we're already there. But there are still lots of people who, whether out of need or want, are willing to pay extra for more computational power.) It'll be interesting to see how the market deals with this.


Microsoft is also screwed. Nintendo is eating them alive in the games market, their mobile platform is under siege from above (iPhone) and below (everybody else),

Hmmm.... it makes nice rhetoric but it's not quite the case.

Nintendo is eating everybody in the games sector, but Microsoft is doing pretty solid business with the Xbox.

In the phone sector, WinMo isn't, by any real definition, under siege from the iPhone either. Microsoft shipped about 8 million WinMo devices in 2006; Symbian shipped something like 52 million. That's separate from the WinCE licenses they shipped, which I'm actually gathering some numbers on.

The iPhone had a great launch, but even at their target numbers they're still hitting a small part of what is an enormous market.

As I said before, the NRE of building phones cripples your cost basis unless you are going to be shipping multiple millions.

In the server sector, I know that their London (financial group) operation blew away its targets and took in nearly $1BN in deals last fiscal year, and this year looks even bigger.


But there are still lots of people who, whether out of need or want, are willing to pay extra for more computational power.

I wouldn't be so quick to say that. I work in a science department (biology), and the number of people here who would actually notice the difference with a faster machine is pretty small. And for those people, modern "faster" machines aren't actually any faster, because, as someone else pointed out, stuff like GIS and statistical software doesn't really know what to do with multiple cores, etc. I know for a fact that R, which is probably the most up-to-date statistical software, can hardly thread at all.

That kind of work benefits from more memory and better bandwidth into the processor, rather than sheer grunt.

Nick @42: slower chips have traditionally been underclocked versions of faster chips. They're the ones that don't pass the full-speed test at the end of the production line, so they get their multipliers locked at a lower speed and off they go to be sold at a discount. I don't see that model being a whole lot of use when the push is increasingly towards low power rather than high speed. Developing any new product costs a fortune, and manufacturers defray the expense in whatever way they can. There are plenty of fields out there where the new product doesn't really do anything new, but it still had to be developed and production lines paid for. Most of the things that Charlie is calling a commodity fit that bill: new speakers, for instance, don't really do anything that old speakers didn't. Same with food processors, and cars, and... you get the point, I think.


Hey Chris @54:

I do bioinformatics at UC Berkeley. I'll take ALL your spare cycles, plus all of your RAM.

My colleague in the CS dept is doing dynamic programming with FPGAs, mapping Solexa data onto the genome, and other colleagues have been talking about playstation clusters for years.

Then there's BOINC (ex Seti@home). Compbio can ALWAYS use more cycles.


Nintendo is eating everybody in the games sector, but the XBox is doing pretty solid business with the Xbox.

The Wii has already buried the Xbox 360, and it's only been on the market, what, a third the time? See http://www.vgchartz.com/?p=60 Worse, Microsoft invested a reported $21 billion (!!!!) in the development and launch of their videogame platforms, according to Forbes; heck, they supposedly sank a billion into just extending the 360's warranty when so many of them started to go tits up with overheating issues. They'll never break even on that investment, let alone turn the kind of profit it would take to justify an investment that massive.

Whereas the folks at Nintendo could already roll naked in the millions in profit the Wii has generated less than a year after launch, as the Wii is widely believed to have cost Nintendo a mere fraction of what Microsoft spent to develop its next generation console.

In the phone sector, WinMo isn't, by any real definition under seige from iPhone either.

Ha! Apple, selling just a single device, has already moved more than a million units in the US alone in under 6 months. Their plan is to sell 10 million by the end of 2008, which will place them within spitting distance of WinMo sales, and with a vastly higher profit margin. Not only that, but Apple's getting a cut of the monthly phone bill, a deal Microsoft certainly can't command. That may ultimately prove more profitable than selling the phones.

It's hard to see how Microsoft's going to compete with that, especially as hardware prices continue to decline and platforms capable of running the mobile version of OS X grow cheaper and more capable. As with the iPod, it's only a matter of time before Apple dominates the only profitable portion of the market - the high end. Let Microsoft have the junk. I'm sure the phone companies will be delighted to pay Microsoft a buck a phone to use their crappy OS, assuming they don't switch to Linux and get the same functionality for free.


Some software and hardware companies are going to make the transition to the post-PC world, and some aren't going to make it. It's becoming pretty clear Microsoft's not going to make it - certainly not in the consumer space. If they can't make it on the server side, the only real question now is, how long before they implode?


Here's something that might be a pointer to the software future.

Poser is a program at the bottom end of the CGI market. It started out as a digital equivalent of an artist's lay figure--one of those jointed wooden dolls--and became a fairly decent posing and rendering system. The render engine is comparable to professional software in its features, but it's stuck with a user interface with no user-adjustable features. I want to use a slightly larger font for the filenames, and something other than brown on brown for the colour scheme.

It's also stuck in the 32-bit single-processor world, and rendering needs a lot of clock cycles.

There was a promise of an improved version that would make the jump into the future promised by an OS such as Vista (though I gather there's a 64-bit version of XP), and the latest version will support rendering in a separate process.

And a couple of weeks ago it was sold to a company called Smith Micro.

Looking at the company website, it looks to have a lot to do with software for mobile phone apps, and it distributes a range of low-end software, such as photo-editing apps that are not Photoshop. Apparently, it was already distributing Poser in the US retail market.

So either Poser is going to be dropped into that low-end niche, and sold more like a computer game (which likely will lead to a flood of Poser-porn), or Smith Micro are trying to claw their way up-market.

But to get a more mass-market approach to work, the Poser manual is going to need a major rewrite. Major sections are written for people who already know the jargon. Some of the explanations are as limited as "The Phong node implements a Phong reflection model."

And the 64-bit/multi-core barrier is still there.

But the other side of the Poser business is that a huge number of people sell models designed to work with the program, and part of what Smith Micro has bought is one of the content-retailing sites.

At $10 a time for a set of clothes, which is cheap, you soon can spend more than the software cost. And another pose/render program is already being given away (www.daz3d.com).

I can see the Poser market surviving the changes you're anticipating, because it's so different. It's already a market where the program brings in less money than the support. And the support that is being sold is, in a virtual world, more tangible. It's not helpdesk time.

(Sample worksafe poser picture at http://myweb.tiscali.co.uk/zhochaka/Pictures/FurettePirate.jpg )


I'll not argue that the Wii is doing well for Nintendo, but to claim that the Xbox isn't is, well, wrong. Besides, Microsoft doesn't really have to justify investment, any more than Google does. They're still recording record double-digit growth in their core client markets (although I have no idea how). There's room for more than one games platform, and I'll admit that I'm thinking of getting an Xbox in addition to the Wii.

Their plan is to sell 10 million by the end of 2008, which will place them within spitting distance of WinMo sales

Yes, I remember when MS thought they were going to have 100 million devices shipped by 2006, too. I don't see, at this point, where the 10 million sales come from: US sales seem to have flattened dramatically, and European sales aren't tracking the US curve (hardly a surprise). There was some editorial in the press about this yesterday.

Even then, MS shipped nearly 8 million units running their OS in 2006, and we don't know what the official numbers will be for 2007, let alone 2008. I've seen some unofficial ones: more than a million Motorola Qs shipped on Verizon alone in 2007, so Apple will still be playing catch-up.

The "cut" of the phone bill is going to go away too. I've no idea how they managed to persuade the people at AT&T that this was a good idea, but it doesn't really stand up in the GSM phone model, as last week's court action in Germany showed. Apple will end up having to sell unlocked phones to a general market if they want to sustain their sales, and the operators will expect the prices they pay for shifting them to come down accordingly, which will squeeze the margins further.

Of course, MS don't really care about the margins on the actual phones, because all they ship is the OS, and they make it as easy as possible for manufacturers to port it to their chosen silicon.

Based on the new projects we're seeing from OEMs and ODMs I'd say that there's a lot of mileage in Windows Mobile yet.


$21 billion for a gaming platform? Wow. Even with the dollar in the dumps, that's much more than the planned cost of the research fusion reactor ITER (that is, construction plus ten years of running costs).

OK, it might be my German genes taking control and making me complain ;) but darn it, I'd much rather have ITER running now than the Xbox. Why can't any of those firms, or one of the super-rich guys, take on these projects? I mean, for once I wouldn't even complain if Bill Gates were to fund such a reactor. (And he could!)


to claim that the Xbox isn't is, well, wrong

How is it "wrong" to claim that a platform which cost in excess of $20 billion to develop and launch - and hasn't recouped a fraction of that investment - is a failure? If I spent $200,000 building a house, and could only sell it for $50,000, would you consider that a "success"?

The Xbox is a fiasco. Microsoft outspent Nintendo by at least $15 billion and is losing in the marketplace. 360 weekly sales are now comparable to those of the (more expensive) PS3, itself widely considered a flop. Look at this chart if you don't believe me: http://www.vgchartz.com/hwcomps.php?cons1=Wii&reg1=All&cons2=PS3&reg2=All&cons3=X360&reg3=All&start=39047&end=39411 Nothing like spending tens of billions of dollars only to duke it out for third place.

European [iPhone] sales aren't tracking the US curve

I'd be surprised if the current-model iPhone were as successful in Europe as it's been in the US, if only because Europeans already have access to some pretty sweet phones. On the other hand, if you ignore ignorant editorials and go with the actual figures reported by the parties involved, it looks as though the iPhone is selling up to, or in excess of, initial expectations. See for example http://money.cnn.com/news/newsfeeds/articles/newstex/AFX-0013-20911760.htm And of course, it's unlikely the current iPhone will last much beyond the New Year before being replaced by a substantially more powerful version. If Apple can address some of the device's more obvious limitations, they stand a good chance of broadening their market substantially in 2008. I'll certainly be looking at a GSM version of the iPhone.

Microsoft has been trying to crack the phone market for, what, the better part of a decade now? They're still in third place, after having spent billions. Likewise, they're battling it out for third place in the videogame market. Their forays into the consumer electronics space have been massive investment sinkholes - they'd have better served their shareholders by investing their money in Apple stock.

When it comes to consumer electronics, it's clear Microsoft just doesn't get it, and with smaller, profitable, nimble competitors now dominating (or moving to dominate) the niches they've attempted to colonize, it's painfully obvious they never will. Not when their core business and cash cow - the Windows OS and Office suite - are coming under increasing pressure from ever-cheaper PCs and free (or ultra-low-cost) open source alternatives. They simply aren't going to have $10 billion a year left over to spend trying to give their hardware and software away to uninterested consumer electronics customers.


Iron Thighs@55: You're in the less than 1% who actually use a computer for computing. And if you aren't using some decent server-grade hardware, your department must be pretty tight. We've got a multi-CPU SGI system in a basement somewhere that the hardcore climate modellers use. Their stuff just doesn't scale down; you can't break it up across ten desktop systems. Which I think is part of Charlie's point: you don't need the supercomputer on your desk, and one big server per thousand desktops is a much better way to do the heavy lifting (and ultimately more cost-effective).

I've seen clusters of all descriptions described as what happens when 20 different academics all want control of the pie, rather than letting someone set up what's actually the best system for the job ;)


I've dubbed this "bubblepack" computing, since once the price drops a little more it will be selling in laser-sealed bubble packs as impulse buys in Target/WalMart/Tesco.

Frankly, I see a huge market occurring, one that will dwarf the losses at Microsoft. I also suspect that Intel will do well with all the servers, flash, and other chips in demand.

While they initially run the "same" software as PCs, bubblepack (BP?) computers will be more like what microcomputers were to minicomputers. We'll have a lot of PC-based servers, both in the googleplex and in each home (to store things with questionable IP or moral content). But the BPs will crowd out the PC for a lot of basic functionality, just as the PCs did with minis. You will probably use your PC as a NAS, for heavy processing (transcoding, rendering, image analysis), and as the gatekeeper for content within your house, but much of your physical interaction will be a combination of BPs and large monitors (HDTVs).

If the BPs crowd out the PCs and leak through the digital divide to the point that they are as common as cells, I see a much bigger market for smaller software. I also see metro wifi/wimax turning into a no-brainer. Service will probably involve wiping the unit, tossing it into the recondition pile, and handing the customer a new one. Web companies will make out like bandits, and smart content providers will too, once they focus on juggling novelty and availability over IP extremism. I also see a new market in "configuration distros", in which the basic BP gets a distro/accessory set which suits the user: a Hello Kitty distro with an HK skin, panic button, and parental controls; a set of student distros tweaked for note-taking, interfacing with the college network, and major-specific software. At that cost, getting one for work and one for leisure won't be uncommon. Or the distro may just be in the installed flash, allowing for a personality shift as you flip between flash modules.

If Microsoft follows the status quo, Redmond will be the next Detroit. But in this specific case, I do believe in the creative destruction that will result.


Re: Xbox - do you have a cite for the $21bn in investment for the Xbox handy? The only quote I can find has Robbie Bach saying around "$3.8bn", which, while a serious chunk of change, is a lot less than 21. It'll be interesting to see if the PS3 maintains its pick-up. You've got two platforms which are almost neck and neck in sales, according to the chart you provided, both aimed at different parts of the market. We have a Wii because my wife was actually prepared to have one, since it's focused at a different sector of the gaming market. It was a gamble that paid off.

Apple: Microsoft launched Stinger in 2002, so they've been at an actual mobile phone play for 5 years. They're second to Symbian in the Smartphone market and gaining market share. As I said, I base this on the projects we do (I work for a company that puts together the software builds for most of the world's phones), and we're seeing huge amounts of traction in Windows Mobile, especially since they launched 6.0.

I'd not seen that quote from Erskine; I had seen the one from the previous day where he said it was probably "tens of thousands". But as you point out, the iPhone isn't as big a deal in Europe. BTW - the iPhone is a GSM phone; it's just crippled by being locked to certain networks (apart from in Germany).

Microsoft's phone play is more interesting because they didn't go for the consumer electronics play. They're licensing a fast-to-integrate-and-deploy mobile OS which any OEM/ODM can build a phone around. It's proven harder than they expected to get it right, but from what we're seeing of the devices starting to emerge from the leading OEMs (Motorola Q series, Samsung Blackjack, etc.), they're really starting to get some traction. And while there are limits on what you can do, it's pretty flexible: you can have something like the two versions of the Palm Treo, the Blackjack, and the Q, plus the touchscreens.

The thing about the phone market is that the software, as a proportion of your development costs, is a really trivial amount. So it's a better deal to license something at $10(ish) a unit, where the core integration work has been done well, than to use free software and be looking down the thick end of a 40K-hour-plus engineering project.
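To make that trade-off concrete, here's a hypothetical back-of-envelope sketch. All of these figures (unit count, license fee, hourly rate) are invented for illustration; only the "40K hour" project size comes from the comment above.

```python
# Hypothetical: per-unit OS licensing vs. "free" software plus a large
# in-house integration effort. All numbers are illustrative, not real.

def license_total(units, fee_per_unit=10):
    """Total cost of licensing an integrated mobile OS at $10(ish) a unit."""
    return units * fee_per_unit

def inhouse_total(engineering_hours, rate_per_hour=100):
    """Total cost of the do-it-yourself integration project."""
    return engineering_hours * rate_per_hour

units = 300_000  # a modest production run for one handset model
print(license_total(units))   # 3000000 -- $3M in license fees
print(inhouse_total(40_000))  # 4000000 -- the "40K hour+" project costs more
```

The point of the arithmetic: the license fee scales with shipments, but for anything short of a blockbuster run it stays below the fixed cost of the engineering project, and you get the integration work done for you.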

In this regard, MS, Symbian (S60 and UIQ) and probably Android will end up duking it out for market share, with Apple having a nice little niche.

The phone is too much of a commodity good for Apple to repeat the success of the iPod. Even if they ship the numbers they're aiming for over the next 2 years, they'll still be well short of the MS and Symbian totals, and way short of Nokia's _monthly_ shipping numbers.

And, finally, while everybody keeps telling me that MS are coming under increasing pressure, they keep delivering double-digit sales growth in their core markets. Now that has to stop one of these days...


do you have a cite for the $21bn in investment for the Xbox handy


BTW - the iPhone is a GSM phone

I'm old and getting my computer industry acronyms confused. Meant to say 3G, not GSM.

It's proven harder than they expected to get it right

They've spent a fortune and it's taken them years. Apple launched out of the chute with a better product, which they developed in about a year for a fraction of the cost (hardware AND software). This does not bode well for Microsoft's future in the phone market, let alone the rest of the consumer electronics space.

They're second to Symbian in the Smartphone market and gaining market share.

Not according to Wikipedia they aren't. According to this page they rank a distant third, behind Linux. http://en.wikipedia.org/wiki/Smartphone RIM is threatening to overtake Microsoft in the smartphone market, pushing them to fourth.

Sort of embarrassing for the world's largest software maker. 5 years and they're still a niche player with a commodity product. Apple will probably make more money off of smartphones in 2008 than Microsoft.

Even if they ship the numbers they're aiming for over the next 2 years, they'll still be well short of the MS and Symbian totals, and way short of Nokia's _monthly_ shipping numbers.

Yes, but Apple will be making money off of every unit sold. The other guys, not so much - they can't command premium pricing. And I expect to see OS X begin to crop up on a slew of other Apple-branded consumer electronics devices over the coming years - we're already seeing it on the iPod Touch.


I'd note that the iPhone has fairly reliably been scoring really badly on usability tests, and personally, after trying one, there's simply no way I'd buy one - I prefer tactile feedback, thanks! (This is also why I mourn the demise of the rollerball on notebooks - one old laptop I used had it at the hinge, on the right, with the buttons on top of the case. This worked really quite well...)

Dave Bell@57: What do you mean, "lead to", on Poser porn? (Let's just say that being careless with Google image search terms can accidentally expand your knowledge in bad, bad ways.)

PS: today's great spam subject line: "Beware of beam-controlled government mimes"


Hmmm... the Forbes article throws in that number without backing it up, plus it's quite out of date, hailing from April and not taking into account the huge surge in Xbox sales that the charts you posted have shown. Robbie Bach's quoted number of $3.8bn sounds more credible. I'm slightly acquainted with the operating budgets of that MS unit, and $21bn doesn't fit in there.

Ugh. That Wikipedia article is pretty misleading. I'd like to say more, but take the Linux number with a pinch of salt: in the phone sector, not all versions of Linux are actually running what would be known as Smartphones - especially the DoCoMo and Chinese-market devices, which are really more high-end feature phones. By that logic, you could say that all of those are beaten by high-end Nucleus or OSE devices running J2ME applications.

Sort of embarrassing for the world's largest software maker. 5 years and they're still a niche player with a commodity product.

No, they're really not. They're supplying a complete phone platform, like Series 60. It's being used by 3 of the 5 Tier 1 OEMs and every single ODM trying to break into the phone market in Europe. There's a bunch of Tier 2s in there too.

I'd be happy to continue this via email, partly because I think it'll get boring for people, and also because there's lots of stuff I can't talk about publicly because, as I said, I work in this stuff.

Not for MS, in case you were wondering. We work on Linux, MS, Symbian, RIM and lots and lots of RTOS based phones.


OS X - yeah, one last thing. The Apple kit is currently running with 800MHz processors, and apparently it's about 500MB of code.

As a high-end device with a loyal "high-end" spender appeal, that should work. But it will cause them problems in the rest of the market. To put it in perspective, the monstrous bloatware that is Windows Mobile 6.x fits into a 64MB ROM (128MB preferred) and will run on a 250MHz processor. It's one of the reasons the BOM on an iPhone is about $250, compared to sub-$150 for most other Smartphones. As long as Apple can stay playing in the high end of the market, the sums will work for them. But it will take a lot of Moore's Law to get the prices for that kind of performance down to meet their competitors, especially some of the flash-based devices that are coming along running on RTOS platforms.


re: Poser porn

I'm not sure if they can sell the program in Saudi Arabia.


The 21bn quote isn't for the Xbox. It's for MS's home entertainment division in total:

"Microsoft's Home & Entertainment Division ("H&E"), which includes Xbox 360, Xbox, Xbox Live, Consumer Software and Hardware Products, and IPTV."

You should also keep in mind that the box actually makes a loss every time one sells - much like the PS3. The money is made back from the licensing fees on the games, from subscriptions to the Live service (the thing you need to play online/chat), downloadable content, etc.

It's a subscription model, not a purchase model. I think you'll also find that some of the Japan numbers got better after the release of Halo 3 (the killer app, according to MS).
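The razor-and-blades arithmetic behind that model is easy to sketch. None of the figures below are Microsoft's real numbers; they're made up purely to show how a per-box loss gets recouped through software royalties.

```python
# Hypothetical console economics: hardware sold at a loss, recouped via
# per-game licensing fees and subscriptions. Figures are illustrative only.

def games_to_break_even(hardware_loss, royalty_per_game):
    """Games each console owner must buy before the per-unit hardware
    subsidy is recouped by the platform holder."""
    return hardware_loss / royalty_per_game

# e.g. a $100 loss per box and a $10 royalty per game sold:
print(games_to_break_even(100, 10))  # 10.0
```

This is also why attach rate (games sold per console) matters more to the platform holder than raw console sales do.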


Hmmm... the Forbes article throws in that number without backing it up, plus it's quite out of date, hailing from April and not taking into account the huge surge in Xbox sales that the charts you posted have shown

April was hardly the Stone Age, and their point was that MS has blown $20 billion plus in the games market since 2002 and still hasn't turned an annual profit with that division (see the table at this link for details: http://seekingalpha.com/article/32642-when-will-microsoft-own-up-to-the-xbox-360-bomb). Yes, there was a sales blip a few months back when Halo finally came out for the 360, but it wasn't a $20 billion blip, and sales levelled right off again after a couple of months. Current sales appear to be on par with those from last Thanksgiving; in other words, demand is flat year over year. The PS3 is rapidly catching up in sales - recent price cuts shouldn't hurt on that front - and the Wii surpassed the 360 in overall sales in half the time on the market. I wouldn't call the 360 a disaster from a sales perspective, but they'll be lucky to maintain their position as the #2 selling console in the current generation, and could easily fall into third place. All this for only $20 billion. Microsoft would have been better off burning the money to generate steam to drive a turbine.

The funny thing is that PCs are starting to fall into the same price range as the 360 and PS3. One of the arguments Microsoft's (and Sony's) defenders have been making is that this massive investment in gaming consoles is justified because these devices will represent some kind of multimedia convergence hub in the home. But why deal with some crippled, overpriced, proprietary gaming console when you can wait a year and buy a full-featured PC - with more hard drive space - for a similar price, and just hook that up to your TV? I keep looking for Apple to kill the AppleTV and just replace it with a beefed-up Mac Mini with a larger hard drive and Front Row configured to run on startup. If they don't, somebody else will probably do something similar using increasingly commoditized PC hardware. Maybe Google - like Microsoft, they've got loads of cash to throw around, and unlike Microsoft, they seem to have some idea of what consumers actually want.

As for the iPhone's OS X-driven hardware requirements, the cost of that hardware's only gonna continue to decline. At some point it becomes a non-issue, at least in the smartphone arena. Regarding marketshare, those figures don't come from "Wikipedia" but from the source the Wiki article cites, Canalys. It's not just some crap some kids made up.

One area where I see the iPhone / mobile OS X expanding into is gaming. There's nothing to stop 3rd-party developers from producing games for the iPhone. The first-gen device will probably have its batteries sucked dry by games, but the 2nd- and 3rd-gen iPhones and iPod Touch units should be able to handle gaming nicely. I think we'll see some reverse convergence, as phones and PCs begin to assume the functionality of handheld gaming devices and consoles, respectively. I don't think there's ever going to be a true multimedia hub - I think ultimately you're going to have a lot of devices which share common media file formats and can communicate / share with each other, from your iPhone to your PC to your big HDTV. The device I see as most likely to get crushed out of existence in all of this is the dedicated game console - they've become so complicated they're computers in their own right. And with computers selling for only $100 or $200 more, why buy a game console at all? This is the exact same phenomenon which killed game consoles the first time around, back in the early 1980s, when you could score a Commodore 64 or Atari 800XL for only a hundred or two more than a 2600 or Intellivision.


Dan Flanery: I keep looking for Apple to kill the AppleTV and just replace it with a beefed up Mac Mini with a larger hard drive and Front Row configured to run on startup.

Me too, and I expect it soon -- the Mac Mini hit end-of-life about a month ago. I suspect the main cause of the delay is that Steve Jobs has committed Apple to Blu-Ray, and the hypothetical Apple TV/Mac Mini replacement should therefore have a Blu-Ray drive. But BD drives are still a little pricey (although £120 consumer BD drives from Alba are promised for early 2008), so there may be headaches getting the supply chain lined up behind the new product.

(Which, in hardware, I expect to resemble a Mac Mini with HDMI and composite video out, and a CD/DVD/+-RW that can also play Blu-Ray disks, and the aforementioned "boot straight to Front Row" mode, and a slightly different marketing proposition to the Mac Mini.)

As long as Apple can stay playing in the high end of the market, the sums will work for them. But it will take a lot of Moore's Law to get the prices for that kind of performance down to meet their competitors, especially some of the Flash based devices that are coming along running on RTOS platforms.

People used to say similar things about the iPod: "Well, sure, it owns in the high-end, premium hard-drive player market, but that's a diminishing sector compared to the huge overall mp3 player market, in which they'll never be able to compete cost-effectively!". And then iPod Mini, and then Nano, pirouetted in by the tens of millions, dismantled the flash-memory player market and rebuilt it in their own image.

Don't underestimate their ability to scale down and produce an "iPhone Mini" that slashes BoM in half or better, while still bringing something fresh to their customers, even if it involves trade-offs. (These days, Apple are pretty slick on "consumer segmentation," spotting where the sweet spots are to pitch products and prices. The end result might not look anything like an iPhone, but it'll do something spectacular as something to hang the marketing off of, provide a "warm bath" user experience and hit a magic price point.)

Meanwhile, the other companies seem largely to be Doing It All Wrong: They're trying a kind of Cargo Cult product-design: They blindly replicate some of the surface details of Apple's formula -- touch screen! glossy buttons on a black background! tilt sensor! -- but seem unable to see through to the underlying principles, and so they get nearly every detail wrong. Usability on most of these devices is wretched. In a way, they're making things better for Apple -- the thing about being kicked in the teeth is that it's great when it stops, and that's most people's* response to trying an iPhone.

(* Andrew's personal taste aside. I haven't seen any genuine usability studies of the iPhone finding any problems. I've seen some blatantly dodgy "We compared experienced users of system X [T9, thumboard, whatever] against some newbies we rounded off the street on iPhone's keyboard". Yeah, I wonder how that'll turn out.)


Oh, and a quick note with respect to:

"sensible content-on-demand system with local caching"

...in the right context, P2P systems are exactly that. For example, Steam is a legal (commercially run) P2P network for distributing videogames, run by Valve Software. At home, it can take a long while to install a game over Steam, but at work, it installs gigabytes with insane speed, because it's using peers on the local area network. Because you tend to want the same games as the people you work with (you discuss the games over lunch, or play against them in multiplayer), this is a smart local cache that sort of falls out of the system "by accident".
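The "LAN peers first" behaviour described above can be sketched in a few lines. This is purely illustrative - it is not Valve's actual protocol, and the peer records, chunk names, and fallback CDN hostname are all invented:

```python
# Illustrative "prefer local peers" source selection for a Steam-style
# content distribution system. Not Valve's real protocol.
import ipaddress

def pick_source(chunk, peers, cdn="cdn.example.com"):
    """Fetch from the fastest LAN peer holding the chunk, else the CDN."""
    lan = [p for p in peers
           if chunk in p["chunks"]
           and ipaddress.ip_address(p["addr"]).is_private]
    if lan:
        # cheapest local peer wins (e.g. lowest measured latency)
        return min(lan, key=lambda p: p["latency_ms"])["addr"]
    return cdn  # nobody nearby has it: fall back to the wide-area CDN

peers = [
    {"addr": "192.168.1.20", "latency_ms": 2,  "chunks": {"a", "b"}},
    {"addr": "8.8.8.8",      "latency_ms": 80, "chunks": {"a", "b", "c"}},
]
print(pick_source("a", peers))  # 192.168.1.20 -- the LAN peer wins
print(pick_source("c", peers))  # cdn.example.com -- only a remote peer has "c"
```

In this sketch only private-subnet peers are used at all; a real system would also weigh remote peers against the CDN, but the emergent local cache comes from exactly this preference.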

More serious research in these areas can be found by googling "content-centric networking". There's also a Google Video presentation on the subject. I thought it was quite flawed, though.


Charlie... While I can agree that for MOST people the native processing grunt doesn't need to be actually in their device (lap- or desktop), there are a significant number of us creative types using computers as fancy recording studios, where the latencies of even the fastest IP network make real-time audio (and to some extent video) processing anywhere other than right under your fingertips a non-starter.

For me, a fully pimped-out MacBook Pro just about cuts it with a big multitrack project, and I'm more comfortable with the grunt of a big machine under my desk when I really start to throw virtual effects and instruments about.


People used to say similar things about the iPod: "Well, sure, it owns in the high-end, premium hard-drive player market, but that's a diminishing sector compared to the huge overall mp3 player market, in which they'll never be able to compete cost-effectively!". And then iPod Mini, and then Nano, pirouetted in by the tens of millions, dismantled the flash-memory player market and rebuilt it in their own image.

Dan: Your own data doesn't really support the argument you're making about the PS3 and the Xbox360, at least not the charts you supplied. Personally, I'd prefer some open standards on gaming platforms myself, such as the one EA called for a couple of weeks ago. I doubt if we'll get that. But as somebody else said, that $21bn is for the entire home devices group, which is significantly more than just games.

I'm wavering about this type of convergence at the moment; I was rather shocked when the gaming system market reappeared. I also thought that people would go for computers. But people would rather something simple that plugs into the TV and just does what it says on the tin. My Wii does that. I looked at Apple TV but settled for a Media Centre PC because it did more of the stuff I needed it to do, and I'm considering an Xbox 360 because I've seen its media extender working at a neighbour's, and with the HD-DVD it really would solve a lot of my home entertainment problems in a relatively cheap box.

The hardware issue is a problem for Apple because they're coming in so much higher than everybody else. They have a fantastic experience on hardware whose cost will come down, but they're at a BOM $100+ a unit higher than any other Smartphone. As the prices drop, the abilities of their competitors will improve, and they'll start to hit problems they've so far avoided, like having to work out how to do a 3G radio. I'm sure they're bright people, but Nokia, Samsung, Motorola, LG, Sony Ericsson, etc. aren't exactly slouches in the consumer goods market, and they're used to working in the low end already. Their stuff will only get faster as the prices drop, whereas Apple will find themselves with a lot more competition and declining margins.

The other companies sell hundreds of millions of units, not tens.

Regarding marketshare, those figures don't come from "Wikipedia" but from the source the Wiki article cites, Canalys. It's not just some crap some kids made up.

Did you try to look up the actual data? You can buy the report they claim to have used, but the data isn't actually available on the Canalys website.

The problem with the data is that none of the players provide particularly accurate numbers, and all of it is under NDA.


Canis @72: The cellphone market is hundreds of millions of units a year; comparisons with the PC market at one end, or the MP3 player market at the other, are just wrong. They might be able to slash the BOM, but not without radically re-engineering the core system to work on single-core radio and application processors, and that would involve getting their cool stuff to work on a processor running at about half the current speed and in a fraction of the available memory. All the while they'll be chasing down into an area where people have more experience. (*)

The other companies might be "doing it wrong" but not everybody wants the iPhone features.

On usability: I've used an iPhone, and the browser is amazing, the iPod services are pretty good, the phone is average, and the messaging and email, in my opinion, sucked. The visual voicemail is a cool feature, but not hard, and dependent on the network back end rather than the phone.

The really nice eye candy works fast and smoothly because they've got twice the processor power and 4 or 5 times the RAM of any competing phone.

I need a keyboard myself, and I need business email, so I've got 3 real choices: a Blackberry, a Symbian-based phone (UIQ or S60), or WinMo. I use WinMo (Samsung Blackjack) and I'm reasonably happy with it. As I work in the business, I get a lot of phones given to me, and I've used lots of them over the last 5 years. While the iPhone has lots of cool stuff, I'm not a fan of touchscreen phones myself (I've had 3), for all the usual reasons: lack of feedback, lack of keyboard, and the screen gets really disgusting really quickly - there is nothing worse than a touchscreen phone in London on a hot sticky day. Ugh. Finally, I want to be able to run real IM clients, Skype, and any other application that takes my fancy without doing nasty things to my phone - realistically, that limits me to WinMo or Series 60.

A lot of the features Apple have put into the iPhone aren't that different from ones I've seen before that didn't actually make the cut to get into phones we were working on. Mostly they were dropped because the manufacturers couldn't get them to work properly on a device with a sub-$150 or sub-$100 BOM, or they were just not that good in the first place (touch screen), or we had to work on something harder and more time-consuming (getting 24+ hours of battery life out of a 3G radio).

A lot of the UI stuff has been solved by Flash or similar, and you'll see a lot of devices coming out with Flash or Flash-like UIs in 2008 - on devices with BOMs less than half that of the iPhone.

I am tired of being told how amazing Apple are because they've been able to do something that any of the other phone companies could have done if they'd had the luxury of selling to a high-end niche market who don't care about cost. They don't; Nokia is more worried about selling 10 million phones a month, not that number a year.

(*) - actually, everything I've heard so far indicates they intend to keep dominating the high end for as long as possible.


Me too, and I expect it soon -- the Mac Mini hit end-of-life about a month ago. I suspect the main cause of the delay is that Steve Jobs has committed Apple to Blu-Ray, and the hypothetical Apple TV/Mac Mini replacement should therefore have a Blu-Ray drive.

It doesn't sound like the standard Apple strategy to back a weak technology like Blu-Ray. While I'm not an Apple fan (I hate the hype, and the iPod/iPhone does nothing for me), they do have a strong history of taking promising tech and pushing it to the breakthrough stage. I'd expect Jobs to broker a major jump in TV, on a par with a TiVo with downloads and iPhone/iTouch transcoding. Sort of Miro on steroids.


Your own data doesn't really support the argument you're making about the PS3 and the Xbox360, at least not the charts you supplied.

It doesn't? Here are the arguments I made:

1) That Xbox 360 sales are flat year over year. That's true – you can draw a line from last week's sales to the same week last year, and it's almost perfectly straight. Graph supports that assertion.

2) That the 360 saw a big sales spike related to the release of Halo. True: Halo was released, preceded by much hype, in late September, and we see 360 sales jump from roughly the start of August through mid-October, when the PS3 got its big price cut.

3) That PS3 sales spiked when the price was cut. Well, that's obvious from the graph.

4) That PS3 sales are picking up, affording it the opportunity to catch up with the 360 in overall units sold. This leaves Microsoft competing for 2nd place in the next gen console wars with the PS3. I'd say that's true as well, looking at that graph.

I'm not the one who made the post regarding the iPod, although I generally agree with its observations and conclusions.

That $21bn is for the entire home devices group, which is significantly more than just games.

It is???? WHAT??? The Zune? Didn't Toshiba develop that brown brick? The Xbox and 360 are the only home devices released by Microsoft that have been even remotely successful in the marketplace, if by "success" you mean they've been successful in essentially giving units away to consumers for free and losing billions. If Microsoft has spent a sizeable chunk of that $21 billion on anything other than the Xbox, then it's represented an even worse investment than their gaming consoles, because nobody's buying that other junk. (For example, if they spent more than $100 million to develop and launch the Zune they're imbeciles.)

I also thought that people would go for computers. But people would rather have something simple that plugs into the TV and just does what it says on the tin.

Well, when the gaming market reemerged computers cost over $1,000. There's a pretty big gap between $200 and $1,000. Now that computers are approaching the $300 price point – and console costs have ballooned to $300 – that gap has largely vanished.

I think all this nitpicking over graphs, or over where exactly Microsoft incinerated $21 billion of shareholders' money, misses the bigger point: the rapidly decreasing cost of PC-level technology and the proliferation of low-power portable form factors pose an enormous disruptive threat to today's major technology players, including not only traditional PC manufacturers like Dell, but also OS and application vendors like Microsoft, cell phone makers like Nokia and game console manufacturers like Sony.

Here's my nightmare scenario for all of these guys: in 2011, Apple introduces a $399 iPhone running OS X on top of a mobile, low-power x86 platform. It comes with 2 gigs of RAM and 32GB of flash "hard drive" space. It can be plugged into a docking station to utilize a 24" monitor and full-sized keyboard. It can connect to your HDTV to play games. It can play portable games. It functions as an iPod. It can run PC software natively with no need for Windows via virtualization technology. Oh, and it's a cell phone too, as well as an IP phone with wireless connectivity. There's even a portable docking station for it, so those who need to compute on the road can use it as a laptop.

Why would businesses buy anything else? Employees can use it as a phone when they're in the office, over the wireless network. It's their cell phone. It's their computer – they can use it in the docking station at work, and in a docking station at home. If they need a laptop, there's a portable docking station for the road warriors. Now, that's a convergence device.

There's a reason why Apple recently removed the "Computer" from their name.


I am tired of being told how amazing Apple are because they've been able to do something that any of the other phone companies could have done if they'd had the luxury of selling to a high-end niche market that doesn't care about cost.

I do want to address this point. The iPhone is frankly not all that expensive. There's a large up-front cost, but AT&T's monthly fees are pretty reasonable. There are people out there paying a lot more over 2 years for dumb phones than it costs to get an iPhone and hold it for 2 years. AT&T is doing this to attract customers and to polish their deservedly-poor reputation, but like today's Macs the iPhone is not a bad deal and the total cost of ownership certainly isn't exorbitant when compared to its rivals.


Along the theme of low priced, low end notebooks I just saw this article:

It looks like Zonbu is releasing a $279 laptop, though it might have some odd monthly fee attached; I'm not sure if that's optional. Low specs, but it's running Zonbu's custom Linux OS, so it may run pretty fast.


Dan: One of the charts you put up has the Wii on 44% market share, the Xbox on 37%, and the PS3 out of sight in 3rd place. They've had a spike from the price cut - hell, I was tempted to get a Blu-Ray player at that kind of money - but I think any talk that they'll make up the gap is optimistic.

The MS Home Group includes the Xbox et al, plus the live! group supporting them and a bunch of other stuff. I don't know who built the Zune hardware but the software is all MS, as is the portal side of the operation. They also run their MS TV ops and a bunch of other home related stuff which plugs into the Media Centre bundle. Some of which is actually pretty sweet stuff.

Here's my nightmare scenario for all of these guys: in 2011...

Hmmm... it's tricky for me to say too much on this subject, but, that's not a 2011 scenario. There's at least one device I know of that'll be launched at 3GSM in Feb which will be pretty much as you describe. Although, to be honest there are limits to what you want from a converged device and there'll be some sound reasons why businesses and individuals won't actually want that convergence.


Monopole: Steve Jobs publicly committed Apple to Blu-Ray. Think he's going to eat his hat in public?


Dan @79: There's a large up-front cost, but AT&T's monthly fees are pretty reasonable. There are people out there paying a lot more over 2 years for dumb phones than it costs to get an iPhone and hold it for 2 years.

There are? I'm looking at AT&T's website at the moment for a US comparison, and the most expensive comparable phone is the "Tilt", aka an HTC Omni, which is still $100 upfront with a 2-year contract. Plus it's 3G and has GPS. (Personally I'd not have another HTC device, due to their shoddy QA, but in terms of features it's pretty much what you outline above.)

If you want a music phone with a web browser, they're basically free with a contract - the Sony Ericssons are really good devices and have an amazing Flash-based UI in the most recent builds.

If I was still living in the UK and in the market for a new device, I'd probably go for an N95 which is free with a 2 year contract from pretty much anybody. Much more device for my money than an iPhone - although, it still isn't quite what I'm looking for.

I do have my next phone in mind; it's due to ship in November '08, and it's almost exactly like the one you described in your nightmare scenario, except it's not running OS X. It will have expandable memory, external docking capability, WiFi, decent Bluetooth, a touch screen and multimedia capabilities, GPS, a 5MP camera with video recording abilities, etc.


I'm retired on disability, so not in the same set as you guys are for phones and computing, but I would really rather not have the phone and everything else on one device because I almost certainly won't be able to afford it. I like my obsolete Samsung phone. I just need a new subnotebook to travel with (and use in the recliner on the days I can't sit up at the desk -- I can still do that with the current subnotebook, I just can't get email on it without a lot of dial-up downloading and installations).


Now that a considerable number of people have expressed their unfavourable opinion of Xandros -- not least Charlie himself -- I should mention that this particular combination of SSD and Linux has its advantages, too. Kevin Tofel from jkOnTheRun demonstrates in a video that it's feasible to do a complete Eee PC system restore -- via a recovery partition on the SSD -- in a mere 90 seconds.


One of the mighty powers of the reality distortion field is the ability of Jobs to produce a memory hole so deep that not even light can escape. Remember the first iPhone, the ROKR to which Jobs was committed? Nobody in the press does. Jobs would be able to drop Blu-Ray in a nanosecond. The press and the Apple fanbois (as if there is any remaining distinction) will immediately announce that feinting towards Blu-Ray and then dropping it was all "part of the (divine) plan from the beginning", that "Jobs moves in mysterious ways" and "we have always been at war with Eastasia".
Just look at the Intel switchover, one day the PPC was inherently superior to the x86 architecture, a veritable desktop supercomputer, and the next day it never existed, airbrushed out of history like Trotsky. Same thing with video on the iPod, for years the fanbois argued that nobody wanted video on such a small screen and that video was intrinsically different from mp3s. The instant Jobs announced the video iPod, the fanbois announced that he had invented TV.


Dave @76/83: I started trying to address this point by point, but it's late and I'm tired. So here's some highlights:

- I don't think a hypothetical "iPhone Nano" need be a cost-reduced iPhone at all, just as an iPod Shuffle bears no relationship whatsoever to an iPod Classic, so trying to re-engineer OS X into a smaller system is a red herring.

- "The other companies might be "doing it wrong" but not everybody wants the iPhone features." Ironically this sums up exactly what I mean. It's not about chasing features, it's about user experience. If not everybody wants the iPhone features, why are the other companies trying to replicate them? They should ditch features, aim for simplicity, and optimise the hell out of the experience until they have something my grandmother could use.

- All your comments about having seen Apple's features before etc completely miss the point. It's not about features. As far as I'm concerned, "feature" #1 on the iPhone is: it doesn't stab you in the face when you try and use it, like every other phone I've used. The more features a phone has, the worse it is. Phone manufacturers add features to phones as a comparison chart pissing match: "We have 300 features! They have 250! Our phone is clearly 20% better!" They don't actually care if anyone uses -- can stand to use for more than 30 seconds -- those features. GPS? Without an external antenna it'll probably give worse resolution than cell triangulation. N95? An acquaintance of mine had one, got rid of it in favour of an iPhone because it chewed through its batteries in two-thirds of a day.

- Networks care about 3G a lot. They should. They got stung for huge amounts in the carefully-rigged spectrum auctions. Consumers? Not so much.

- Messaging and email are working great for me, better than anything else I've used by far. There's a bunch of features they don't have, but they're not features I use, or indeed features that anyone I know uses.

- Flash has not solved any UI issues. Flash is a tool that can sling some alpha-blended vector and sprite animations around. UI is not about these things, and that is one of the mistakes I see time and time again in phone UIs: A profound lack of understanding as to user-interface design. There will be whizzy animations aplenty in 2008, I am sure -- I've seen the TAT demos, for example -- but they will be whizzy irritants after the initial shine wears off.

- My previous phone was, as you describe, a "music phone with a web browser". Allegedly. My girlfriend just chipped in with "However what it actually was, was a small pinging machine...". This is due to exactly the kind of user-interface stupidity I mentioned earlier. Every time you plugged in the charging cable, a dialog box programmed by some over-eager engineer would pop up to tell you that it wasn't just charging, it was fast charging! And it would ping! And if the cable was a little loose (and the stupidly-designed connector meant it always was), it would do this often. And the dialog boxes stacked up! So you might be woken up at night by a weird alarm-like sound coming from the next room, and discover hundreds of stacked dialog boxes, that can only be cleared by OK-ing each one, before you are allowed use of your phone again. But the real icing on the cake? The button you needed to press to clear the dialog, was the same one that brought up the recent calls list. The same one that, on the recent calls screen, would dial the selected number. And it buffered keypresses, so if you just mashed the OK button a lot to try and make all the dialog boxes go away, you'd randomly end up calling someone! There's nothing like a little social awkwardness to really endear one to modern technology.

And that's just the basic phone facilities; we're not even getting into how abysmal it was as a web browser (utter failure) or music player (embarrassing in the same way as David Brent telling jokes -- not just bad, but cringeworthy because it doesn't seem to realise how bad it is).

I did use the web browser to look up a google map... once... when terrorists bombed the London Underground and I was stranded in an unfamiliar bit of London. That's the kind of extreme circumstances it took to drive me to it...

Seriously: The innovation of the iPhone is not in any one feature, gimmick or whatever. It's simply this: It has features that people want to use, and they can.


Canis: I'm happy you like your iPhone. I've used one, I've had one on loan for a bit. I didn't much like it. I didn't like the address book, I didn't like the contacts search and I didn't like the way it did email and messaging. I did like the browser and some aspects of the UI, but as I said, I really don't like touchscreens or the lack of tactile feedback.

For me it is about features. I want Skype? I download it. I want Google Maps? I download it. A new set of games? A different UI? A new notes application? Etc. ... that's what having a Smartphone means to me. If I wanted something that didn't do that, I'd go back to a Sony Ericsson feature phone. I like the UI, it's easy to use and it works everywhere.

Likewise, I want 3G, I notice the speed difference. I like not having to rent a phone in Japan.

I'll not deny that the iPhone is a sweet piece of kit, I just don't think it's that sweet. But then I have the same problem with most Apple products. I own an old Nano, and so does my wife. She's going to get a Zune next because she hates the new Nano form factor, and both of us have always felt the UI was overrated - it was a convenient shape and size. I might even get a Touch, which does all the stuff I liked from the iPhone without the annoying phone implementation.

At the end of the day, with the business model they've set up, I think Apple will struggle with the iPhone once the "gosh, wow" factor has worn off. A lot of iPhone "like" stuff has been in the works for a while, the problem has been getting it all to work well on low end processors running with 64MB of memory because virtually all the operators want Smartphones with a sub-$100 BOM - when you're down to counting MIPs available for threaded processes, you know you're going to have problems making the UI look cool and work smoothly.

I think it's going to be easier for the people who already work on seriously constrained platforms to take the best of what Apple have implemented and improve on their existing stuff, than it will be for Apple to work the other way.

Finally, phones are more of a fashion accessory than any other consumer good, even MP3 players. I'm interested to see how Apple are going to handle that.


BTW, a review of the OLPC XO by a NY Times tech correspondent (my apologies if it's already been posted - I'm skipping ahead to the end of the thread):



My mobile phone will send text messages.

I've never been able to reliably enter a message on a numeric keypad.

This is the core problem of a mobile phone--the conflict between the number of keys and the number of symbols you might want to use.

I understand that the much-maligned Patientline service in UK hospitals uses a phone handset, conventionally sized so that the speaker is near your ear when the mic is near your mouth, which opens to reveal a conventional alphanumeric keypad. Something of around the same size as a Psion 3. Not that you find anything useful about their supposed internet access from their website.

I think convergence between mobile phone and computer depends on the keyboard substitute.

There are, incidentally, some fundamental optical reasons why mobile phone cameras are never going to match even cheap dedicated hardware. The lens systems are small enough to hit diffraction limits. There are advantages in always having some sort of camera with you, but that convergence will always be imperfect.


Off-topic, but Economics 1.1 is hitting some speed bumps.


Notable comment: http://www.haloscan.com/comments/calculatedrisk/4964319924583004358/#355549

Looking back on this mess it really does look like somebody decided to implement Mixmaster as a strategy for portfolio diversification. Although ideal gases do not all of a sudden decide to move to one half of the room, humans often do.

Is Toast a living document?


I'm going to assert here and now that Canis is right about one thing: almost all mobile phone user interfaces are shit. And it's not about graphical glitz or flash, it's about the fact that they're a dog's dinner of random features glommed together around a lump of awful ergonomics with no thought as to how the user's going to use them.

My parents have Nokia mobile phones that they can barely use. They're too complex for eighty-year-olds; and these are old (monochrome display, simple menus, no icons, no funky features) candybar phones from about four years ago. Modern phones just bewilder people. Another relative of mine -- fifty years old, active -- has a Nokia N95 she is afraid to try and use, because she simply doesn't understand it. She was happy with her Nokia 6310i for many years, but was forced to move on when it more or less fell apart, and the new stuff? It's incomprehensible gibberish that doesn't do anything useful for her. (She's got a digital camera, and she's got an iPod. She understands them. She doesn't want all the other crap in a single package; it's confusing.)

I've also got a beef with modern mobile phones (and it's not just Windows Mobile, with its execrable user interface, or Symbian UIQ 3, which is as buggy as an ant-hill and as slow as Windows 95 on a 386).

Modern phone fashion has driven the mobile phone in a direction that is physically the wrong size for a human being. I want an earpiece next to my ear and a microphone in front of my mouth, fifteen to twenty centimetres apart. I don't live on the phone the whole time so I couldn't care less about bluetooth or even wired handsfree headsets -- the only times I've tried them they were uncomfortable and didn't fit my ear properly. I don't care about voice tagging of contacts -- I am not going to spend six hours programming my phone to recognize names when I maybe make an hour or so of calls a month. I do not want fragile hinges, unless it's to expose a full-QWERTY keyboard for typing. I don't want a touch-screen, I want toughened glass so I can drop it in my pocket among keys and coins and know it's not going to get broken. And I don't care about its weight unless it goes over 250 grams -- half a pound, in old measure.

Modern mobile phones basically suck; meanwhile, if someone took the old form-factor of the Nokia 6310i and stuck a 3G stage in its guts behind the same old simple black-and-white hierarchical menu UI, I'd be very happy with it (and so would millions of other people, judging by the price 6310i's go for on eBay).


Dave@87, I'm sorry you don't like touchscreens. Personally, I've always found them great (long before iPhone), as long as they're carefully implemented. I'm also surprised you and your wife are considering switching to a Zune. I regularly see the opposite happen: People pick up another MP3 player because they want features unavailable on the iPod and/or to save money. Then some time passes, and they ditch it for an iPod because they realise they weren't using those features anyway, and the basic music-playing they really care about, just isn't working right. (I gather that version 2.0 Zunes aren't actually that bad, but they're also the same price as iPods, bulkier and don't bring anything interesting to the table. And they're the best of the alternatives.)

For you it's about features, but I'm not trying to persuade you to buy an iPhone. You're clearly happy putting up with awkward UI if it provides the features you want, and expect to be able to install exactly the configuration you desire. But then again, you work in the industry. There's a reason most people don't use Linux. Most people want a device that covers a few basic features, and works. Apple is simply expanding the range of what's considered "basic" to include email and the web, in line with how pervasive those technologies are nowadays on the desktop.

I think it's going to be easier for the people who already work on seriously constrained platforms to take the best of what Apple have implemented and improve on their existing stuff, than it will be for Apple to work the other way. Finally, phones are more of a fashion accessory than any other consumer good, even MP3 players. I'm interested to see how Apple are going to handle that.

*blink blink* Really? You really think that? You don't think the iPod is a seriously constrained device? You don't think the iPod is considered fashionable?

I think Apple have plenty of experience there.

Stop for a moment. Think back to 1984. Think about the CPU and RAM of an original Apple Mac. Think Apple don't know how to make world-class UI on a constrained device? They invented it. (Not the UI itself -- PARC did that. But PARC threw a shit-tonne of hardware at the job. Guys like Bill Atkinson made it work in 8MHz and 128Kbytes.)

They've screwed up along the way. The 90s were particularly dark years. But to pick a couple of Apple's winning areas and argue that their inexperience there will keep them from success? Oof.

Also, one final point I'd like to make clear: I don't think Apple are invulnerable, or can do no wrong, I'm not even a champion for Apple; I'm a champion for their approach to product design, and I'm mentioning these things because I want to see that adopted more widely. Despite what you say -- and I've seen plenty of pre-release phone stuff -- there are no signs of this happening, because there is too much inertia pulling the wrong way inside the MoPho industry. If anyone's likely to shake things up, it's going to be another new entrant to the field. The Android demos are technically competent but the UI is still atrocious; but maybe a new entrant will take their OS foundation, build a warm-bath UI over it and bring out something really interesting...


Dan@70: Microsoft would have made a profit this year if it weren't for the RROD 360 burnout issue. They will make one next year. There are problems with the two next-gen consoles, but they're largely unrelated to the box cost (more to the point, very few people can afford to *make* games for them), and the Wii has a separate set of issues (lack of access to usability data, low attach rate).

(Game designer. Work in it. It's a heck of a mess...)

The PCs at the same price as the consoles are also nowhere near as powerful. Grab a PS3 and run Linux on it rather than going that way (and you can stick any 2.5" SATA hard disk in there, so the storage argument is moot).

Canis@72: No, not my preference. See: http://www.usercentric.com/news.asp?ID=391

And no, pulling data across interconnects from other users on the same local node isn't sensible caching. It's better than downloading remotely, sure, but it still uses multiples of the bandwidth - I pointed to NNTP for a reason...

As for phones, I use a Nokia 7210. It makes calls, and that's all I want from it. Its keys are on the small side of acceptable, but at least I can usually hit them unlike more "advanced" phones. And touchscreen is even worse. There's NO feedback. Tactile feedback is important for the vast majority of users.


Andrew: it's certainly true that the consoles kick sand in equivalent-price PCs' faces on raw MIPS and graphics rendering. But if you look at them as general-purpose machines, it's another story (because that's not what they're designed for). I've considered buying a PS3 as a household Linux server ... and gave up. It doesn't have enough RAM for the servers I'd be running on it, and it's not upgradable. The hard disk is thoroughly obsolete in size, and replacing it adds a chunk to the cost. By the time you're through turning it into a constantly-swapping server, it costs more than a sensibly spec'd PC.

I'm completely with you on tactile feedback. It's one of the reasons I don't want an iPhone, despite being a part-time Apple fanboy.

Dave O'Neil: what Apple have got going for them is that they understand design. The mobile phone industry? Doesn't, period. They're so bad at it that they can't even judge the depths of their collective incompetence, and because it's an entire industry they're locked in a pathological state of collective group-think.


Charlie - I was referring to using it as a desktop PC, not a server. You can run all the standard Linux productivity software on it with no real issues.


Andrew@93: By "your preference" I meant "personally, after trying one, there's simply no way I'd buy one". I'm aware of the study you cite: it shows that for one specific task, texting, iPhone users are more prone to making errors (though just as fast in the long run) than users of a full hard-key QWERTY keyboard -- but I don't call that "scoring really badly on usability".

I completely agree with you about the games industry spending itself into trouble, though. It's something I burn a lot of cycles thinking about.

I don't understand your point about NNTP and bandwidth, though. Could you explain? I see how NNTP is an improvement over 2 clients both fetching complete copies from a remote server, but not how it's an improvement on 1 client fetching it, then distributing it locally-only to another client on the same switch. (Did you look up content-centric networking, btw? It may be exactly what you're looking for.)

Charlie's assessment of the MoPho industry @94 succinctly sums up exactly what I've been talking about.

Tactile feedback is good, yes. No argument from me here. It assumes relatively static content, though. Whether the tradeoff is worth it, very much down to personal taste. However, some people are working on adding it to touchscreens using, for example, the appallingly-named "VibeTonz". There are demo units out there and apparently the effect is very realistic. Could be the best of both worlds.

(iPhone touchscreen is surprisingly good, though -- smokes "Palm Pilot" style touchscreens. It's toughened glass as you requested, by the way, Charlie :) There's no squishy "pressing the screen on your calculator to make the moire colours" effect, for instance. If you hate touchscreens, nothing will change your mind. But it's the best one I've used by a significant margin.)


The single major problem with bandwidth usage in the UK, on cable and on many BT exchanges, is overloading of the UBR/local switch. Sending from an ISP server to the user costs n in download. Distributing the same data locally means uploading n and downloading n across that same switch.

And if people can't tap places on a fixed touchscreen interface properly, it's only going to get worse if you do anything non-standard. Worse, there are plenty of ways to tap the wrong thing on a cell phone and cost yourself cash.


Along with Canis, I'll chime in with a me too - Charlie is exactly right on this one. About the only use I have for my phone is to call AAA on the interstate. Otherwise I'll seek out a regular-sized phone, please, thank you very much.

But it raises an interesting question - the people who design these phones are not stupid, so what is the reason they are so incredibly non-ergonomic? Are there just a few people like me and thee who hold these opinions? Are most people in fact happy with the form of their phone?

The single major problem with bandwidth usage in the UK, on cable and on many BT exchanges, is overloading of the UBR/local switch. Sending from an ISP server to the user costs n in download. Distributing the same data locally means uploading n and downloading n across that same switch.
Ahh, I understand the confusion now. I was talking about people directly connected, not bouncing off of an ISP -- for example, colleagues in the office, or (biggest P2P users?) university students on a dorm LAN. In the home, this would only apply to people within the household, not on the same street. (Unless someone pulls some crazy WiFi Mesh stuff out of their arse, of course :) But those don't have a good track record, sadly.)

So yes, a cache at the ISP is useful here, if you can persuade them to sign on. In the scheme Charlie envisages, where content is flat-licensed, that's more likely, because currently the legal liability probably outweighs the bandwidth savings.

And if people can't tap places on a fixed touchscreen interface properly
Can't they? I have to say, our desk phones at the office (huge Cisco jobs that -- according to a quick Google search -- cost as much as an iPhone) have lousy touchscreens. I think this is mostly because they're underpowered and run Java, so when you touch the screen, nothing happens for about half a second, and this not only makes it feel unresponsive but also means you get no as-you-press-it feedback on whether you've done the right thing. It's not a good experience.

Sensible touchscreens, though, highlight an option as soon as you press your finger down, but don't activate until you lift it, and allow you to slide the finger while it's down to pick the option you meant (or cancel entirely) if you did miss. Positive feedback loop, happy user. Tweak the hit-rectangles for potentially dangerous (or costly) actions to be smaller, or require a more distinctive action (eg slide-across "rocker switches"). Keep opposing actions well-separated on screen. Input need not be a single point coordinate, it can be a large oval, corresponding to the squished pad of the user's fingertip against the screen. You can use that. Some systems even offer pressure information. Ignore big mushy finger-smears across several buttons, and pay attention to how the oval changes in size or moves over time. There's a lot of heuristics you can apply to improve the experience. Some companies are doing this. Some are not...
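The press-highlight, slide-to-correct, activate-on-lift loop described above is easy to sketch. This is purely illustrative pseudologic in Python -- the class names, padding values and targets are all invented for the example, not taken from any real phone toolkit:

```python
# Sketch of touchscreen hit-testing that highlights on press, lets the
# finger slide to correct a miss, and activates only on lift. Dangerous
# actions get a shrunken hit rectangle so sloppy taps miss them.
from dataclasses import dataclass

@dataclass
class TouchTarget:
    name: str
    x: int
    y: int
    w: int
    h: int
    dangerous: bool = False  # e.g. "delete" or "dial"

    def hit_rect(self):
        # Shrink the effective rectangle for dangerous actions;
        # grow it slightly for safe ones (negative pad).
        pad = 8 if self.dangerous else -8
        return (self.x + pad, self.y + pad,
                self.x + self.w - pad, self.y + self.h - pad)

    def contains(self, px, py):
        x1, y1, x2, y2 = self.hit_rect()
        return x1 <= px <= x2 and y1 <= py <= y2

class TouchTracker:
    """Highlight on press, slide to correct, activate only on lift."""
    def __init__(self, targets):
        self.targets = targets
        self.highlighted = None

    def finger_down(self, x, y):
        self.highlighted = self._hit(x, y)   # immediate visual feedback
        return self.highlighted

    def finger_move(self, x, y):
        # Sliding re-highlights a different option, or highlights
        # nothing at all (slide off everything to cancel).
        self.highlighted = self._hit(x, y)
        return self.highlighted

    def finger_up(self, x, y):
        target = self._hit(x, y)
        self.highlighted = None
        return target.name if target else None   # commit only now

    def _hit(self, x, y):
        for t in self.targets:
            if t.contains(x, y):
                return t
        return None
```

A real implementation would of course track the whole finger-contact oval and pressure rather than a single point, as the paragraph above notes, but the positive-feedback loop (highlight, slide, commit on lift) is the part most phone UIs get wrong.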

Are there just a few people like me and thee who hold these opinions? Are most people in fact happy with the form of their phone?
Don't worry, it's not just you. My girlfriend -- a physicist who operates highly technical lab equipment -- finds her phone awkward. My parents -- elderly, but no technophobes, my father edits video on his Windows PC -- can't figure out how to change the ringtone on their phone. Most people learn just enough to get by: make calls, answer, hang up, read and reply to texts, and nothing else. If anything out of the ordinary happens, they get spooked, and ask someone else.

Why does this happen? I think it's partly because for many phones, the customer -- the person actually paying the manufacturer, anyway -- is not the person who uses the phone, it's the MoPho network.


Think he's going to eat his hat in public?

You're forgetting the most famous slide ever shown at an Apple keynote.

It's True.


There's a certain pattern to the evolution of technology, or at least some kinds of technology. The technology comes out in a crude, clunky, primitive form with limited functionality. Now, what's the simplest way to compete with this device? To add more functionality (assuming we're dealing with a device which can have feature upgrades). The reason this is a simple way to compete is that it's stuff you can mention in an ad, or put into a catalog.

And we go for it. Few people really want to buy a device that is deficient. And there's also the social games that people play. We like to have stuff that other people do not and we like to pay for it. So we economically reward the companies for feature creep as the devices get ungodly complex. Usability is a low priority at this point in the evolution of technology. At this point everyone wants more and more features.

The analogy here is the classic tale of how to boil a frog. Dump a frog in a boiling pot and it hops out. Put it in a cold pot and heat it slowly, and it'll sit there until it boils. If these gadgets had been dumped on the market in this state in the first place, they might have bombed. But people have grown steadily inured to the complexity and unused features until they put up with it.

It's not just in the mobile phone industry. The VCR was a device that was the butt of countless jokes about how hard it was to program. And then someone came up with the DVR. And all of a sudden we had a device whose main selling point was usability. And the customers ate it up in droves and it fundamentally impacted the industry.

I think that this is one of those situations where things have to get bad before they get better. The early mobile phones were usable because they didn't have much functionality, so there was little need to invest in user interface research. At moderate complexity, there still wasn't great demand for UI research. It isn't until now, when we have these monstrously complex gadgets and people seriously frustrated with their mobile phones, that there's real demand for an iPhone.

So when the frog is boiling, someone comes along, shows it that it's boiling, and offers a cooler pot of water to hop into. We go from poor but usable functionality to rich but unusable functionality; then someone sees the pent-up market for a device that may not be so full-featured but is usable, and pushes that onto the market. The bar is raised, and usability now becomes a selling point. Mobile phones are a young technology, and given the other gadgets they've been absorbing (PDAs, digital cameras, etc.) into their hybrid functionality, they're even younger.

So I see this as the maturing of the mobile phone technology.



I think the reason the whole mobile phone industry is so deficient in understanding usability is that the basic hardware and software design jobs are very hard because of size constraints. So the very best engineers for that type of job are hired, and these people are almost by definition the very worst at understanding usability. In their world you have n gates and m megabytes to deal with, and the stuff that fits in there the easiest is the best stuff; no other criteria need apply.

There's also the general industry conception of usability as being the same as having a glitzy user interface designed by a software guy and a graphic designer. There's no usability engineer in most teams; hell, most of these guys have never heard of usability engineering. The very idea of getting a cognitive psychologist or an anthropologist to tell you what "usable" is just doesn't occur to anyone.


Bruce at #102-
"The very idea of getting a cognitive psychologist or an anthropologist to tell you what "usable" is just doesn't occur to anyone."

I could have sworn that in some places and areas, people have thrown psychologists and anthropologists and others at problems of usability. I can't recall any details, but I am sure I have come across mention of such approaches being tried over the past few decades. It would be interesting to know why they have this blind spot.



Damn right it's been done, and I've personally been involved with products where it was done. The blind spot doesn't exist for everyone, but it for sure exists for engineers in the mobile communications industry, where hardware size and power consumption have been the be-all and end-all since the beginning.

If you want to see what can be done if you ask the right people to help you, take a look at the old portable oscilloscope line that Tektronix put out, starting in the mid-80's and continuing to about 7 or 8 years ago. They're called the 2400 series, and they're all based on a common hardware and software platform, with common user interface design. I worked with several of the usability researchers who helped design that product line; there were cognitive psychologists specializing in task analysis and human vision, and at least one psychologist whose primary experience was in experiment design.

Their methodology was simple: design a strawman user interface and mock it up in cardboard slides, or post-it notes, or prototype software, whatever made sense and was quick to build and easy to change. Then they got typical users: experienced technicians and engineers who had used oscilloscopes for years and knew what kinds of things they needed to do with them. And they set these people loose on the mockups and videotaped them going through some set of tasks they might actually face in the real world. When they'd completed a task, they were asked how well it worked out, and asked to suggest changes to the mockup that would work better.

Sounds simple, but this concept of having a user evaluate what to do seems to evade most people who are responsible for the design and implementation of user interfaces.


Now that I think back on projects like the Tektronix 2400, I realize that in a lot of cases decisions about who to get to do interface design, and how to evaluate that design, were made by people who were not professionally trained as engineers. In several cases they were scientists (mathematicians or physicists), and in at least one case I know of, the design was substantially influenced by two people who were trained* as architects.

* By Nicholas Negroponte, as it happens, in his incarnation as head of MIT's Architecture Machine Group.


I worked for Microsoft's Games division for a couple of years in the mid-to-late '90s. They had a very aggressive approach to the games/home market, but Flight Simulator was basically the cash cow that funded the rest; they had a party when they lost "only" a quarter of a million dollars for the quarter. So I don't find it hard to believe they could blow through $21 billion this century.


Both my father and I love our Kyocera SE47s. Hyper-durable, eminently reliable, a slider (no hinge, one display), dead simple UI, tolerable size, texting and voice. The other reason we love them is that they're on Virgin prepaid service. The only reason I'd change phones would be to get prepaid Bluetooth connectivity with a Palm or a Nokia internet tablet. Very simple, but orders of magnitude easier to use than Nokias or Motorolas.

As for Apple UIs, I can't stand them. The 1D nature of the clickwheel annoys me no end, since it relies on mode switching to flip between the wheel's different uses (I end up blasting the volume instead of switching tracks, or vice versa). I much prefer the 2D five-button controller on the Palm TX combined with a touch screen. Volume and track control are orthogonal, as god intended. (I use TCPMP for MP3 and video.) I also hate iTunes with an equal passion. I much prefer simpler and cheaper MP3 players to iPods, and generally use drag-and-drop content management (with the occasional Python script for more complicated operations).


I don't think that computers will be commoditised much further. Ten years ago, the cost of the physical computer components was a significant part of the price, and the other factors (labour, distribution etc.) were comparatively insignificant. Now the balance is tipped in the other direction: it's the other factors that need to be made cheaper, not the 'silicon'.


The thing that makes me suspicious of this is that the thin client with power offloaded to servers is basically the old "Network Computer" concept that flopped so badly in the 1990s.

Though maybe it was just premature. I think companies were trying to sell NCs to corporate IT departments that were heavily standardized on Windows PCs, and also to homes as the "Internet terminal in the kitchen" idea. But these little low-powered devices generally weren't wireless or portable--they looked like toy desktop computers. Smartphones have demonstrated that people are willing to carry thin-client devices around on their person, at least if they are phones.

A naive soul with no prior experience of consumer capitalism might ask why, instead of doubling in power, the manufacturers don't concentrate on cutting prices? But that's not how the industry worked. Until now.

But that is how the industry works. If you didn't want your power to double - you could have that same 166MHz Pentium today FOR FREE.

That's a lot better than 100 times cheaper. The problem is that everybody wants the power. Nobody wants to use a 166MHz machine today, which is why people are throwing them away and paying for new machines.

Most people spend so much time working (or playing) with computers these days that having an underpowered machine is the real waste, as it kills quality of life and productivity. I'd rather pay more and get a powerful tool than suffer for the sake of saving a few dollars.



I should also say I don't agree about the "network computing" thing, either. Consumer applications now demand more power than ever - for example, games and HD video.

The network is not any substitute for a powerful computer. Besides, that bandwidth is needed for things like streaming video and audio (say the team-chat in a game) and is best not burdened with carrying all the data needed.

I really don't understand where this idea comes from that our processing power demands are decreasing. They are doing the opposite. In the old days, only power users and professionals needed anything more than a word processor and email. Today even grandma is watching High Definition content, or playing powerful 3D games. Or using her multi-megapixel digital camera with RAW images - or shooting footage with a digital camcorder.


"Apple have staked out a boutique territory for themselves, and more power to them for noticing that they *needed* to do that in order to survive: but that's a small lifeboat"

Wasn't Apple surviving just fine for many years before deploying their iPod and Apple retail stores? And are they not now *flourishing* with those things? Not exactly a "lifeboat" -- more like stepping up from a speedboat to a luxury yacht or a cruise ship.


Harvard Irving@111: The computer I'm typing this on is 5 years old. It can do all those CPU-intensive tasks you mention without cracking a sweat. It's got, at a rough estimate, less than half the straight-line processing speed of a mid-range 2007 box (it was expensive in 2002). And the modern box spends just as long waiting for drive heads and network connections as this machine does, because latency on those hasn't changed much for years.

And speaking of networks... have you saturated a 10Mbit/s network lately? I have, and it takes a lot more than a little streaming video and some inter-team chat. The "network computer" concept predates the 90s, anyway: X terminal + server was the standard setup for years, since the early eighties or maybe even the seventies; our host will know better than me.
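Since we're arguing about numbers: here's a rough back-of-envelope sketch of how much traffic it takes to fill a 10Mbit/s link. The bitrates below are assumed typical figures of the era, not measurements from anyone in this thread.

```python
# Back-of-envelope: how many simultaneous media streams saturate 10 Mbit/s?
# All bitrates are rough, assumed figures for illustration only.
LINK_MBIT = 10.0

streams = {
    "voice chat (64 kbit/s)": 0.064,
    "MP3-quality audio (128 kbit/s)": 0.128,
    "SD streaming video (~1.5 Mbit/s)": 1.5,
}

for name, mbit in streams.items():
    n = LINK_MBIT / mbit
    print(f"{name}: ~{n:.0f} simultaneous streams to fill the pipe")
```

Even at a generous 1.5 Mbit/s per video stream, it takes half a dozen of them running at once before the pipe is full, which is the commenter's point.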


"Moore's Law suggests that every component of a PC halves in price on a roughly 18-month cycle."

No it bloody well doesn't. Moore's Law has been about one thing and one thing only: the doubling of the number of transistors on integrated circuits roughly every 18 months. It says NOTHING about the cost of said ICs, and implies nothing else.
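For what it's worth, the compounding itself is easy to check. A minimal sketch, assuming the commonly quoted 18-month doubling period (Moore's own formulations varied between one and two years):

```python
# Compound doubling on an 18-month cycle: this is about transistor
# counts on an IC, not about the price of the IC.
def doublings(years, cycle_months=18):
    """Number of doubling periods in the given span of years."""
    return years * 12 / cycle_months

def growth_factor(years, cycle_months=18):
    """Multiplicative growth in transistor count after `years` years."""
    return 2 ** doublings(years, cycle_months)

for years in (3, 9, 18):
    print(f"after {years} years: x{growth_factor(years):.0f} transistors")
```

Three years gives 4x, nine years 64x: the growth is in what you get for a given die, and whether that shows up as cheaper chips or more capable ones is a market decision, not part of the law.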


Bruce at 105- I was thinking more things from kitchen knives through cars to furniture and stuff, but your example sounds very nice, albeit it went over my head. Last time I saw an oscilloscope was at university. I'm more of a chemistry mad scientist than a physics one.
My own example of sensible user interfaces is that on my Sony Cybershot, when you go through the menus to delete pictures, the default setting is "no". You have to physically move the highlight over to "yes", thus preventing accidental double-click deletion of photos.
At least I'm assuming it is deliberate planning. I've seen a camera or two that weren't set up like that, and I know how easy it is to mispress buttons, especially small ones.


You probably know about this already, but in case you haven't: I've been using my 2GB iPod nano to read ebooks, using text-to-note conversion software called PodNote. Although I can't use all the memory for storage, I've been able to get at least 3-4 average-size novels or PDFs onto it simultaneously. It's a bit of an eye strain to be sure, but I'm re-reading Accelerando on it on the bus to and from work. Battery life is also a bit of a pain, but I can plug the charger in just about anywhere.


It doesn't take much to baffle the user. Judging by all the times that flashes go off with shoot-and-click cameras, usually in places where they aren't needed or forbidden (eg art galleries, historical building interiors), not many realise that the little lightning bolt toggles the setting. Almost all will say to me, it's impossible to turn off, yet have either failed to notice that rather prominent button, or assumed that it meant "electrocute me".


Charlie @95: and because it's an entire industry they're locked in a pathological state of collective group-think.

I actually think the problem is the opposite of "group think": the problem is a Darwinian collision of various competing and conflicting urges, which leads to the dog's dinners you see in the marketplace.

People don't want to pay full price for a handset, so the operators subsidise them, which means the operators want to dictate what goes on the device. That leads to conflicts between competing UI teams at the operators and the OEMs. My Samsung Blackjack has been rendered almost unusable by AT&T: nothing to do with Microsoft or Samsung, but purely the operator sticking their inept noses into the mix. Then you have the uber-powerful Tier 1s like Nokia, who want to lock users into their UI design and stop people leaving.

Nobody who invests money in the phone business wants to do anything other than make it as hard as possible for people to change OEM or Operator.

Apple have broken that by getting people to pay full price for a handset. But this is a market shipping several hundred million units a year. The Apple slice of the pie isn't what impresses the board in Helsinki.

Apple might understand design, I like my new iPod Touch - but their software is still flawed. iTunes is just plain evil.

Canis: Compared to even a basic feature phone, the iPod isn't really a constrained device. There's lots in there, but it is, I'm told, orders of magnitude simpler. You can ask me why in email, but this isn't the forum.


Canis: I gather that version 2.0 Zunes aren't actually that bad, but they're also the same price as iPods, bulkier, and don't bring anything interesting to the table. And they're the best of the alternatives.

Well, I've gone with an iPod Touch - it is sweet but I am hating having to use iTunes with it.

My wife wants the new Zune, which looks like an old Nano. We tried out the new Nano and she hated the form factor, calling it the Apple "Chubby". I must admit I wasn't impressed: it's too small to really watch video on, so why go with portrait? I don't get it.

She also wants to stop using iTunes for various reasons.


I've never understood the iTunes hatred, but each to their own -- I'm going to try not to beat the iProduct dead horse any more on this thread :) A quick note to Mike on cameras:

"It doesn't take much to baffle the user. Judging by all the times that flashes go off with shoot-and-click cameras, usually in places where they aren't needed or forbidden (eg art galleries, historical building interiors), not many realise that the little lightning bolt toggles the setting. Almost all will say to me, it's impossible to turn off, yet have either failed to notice that rather prominent button, or assumed that it meant 'electrocute me'."
Actually, in my experience, this is because a fair percentage of cameras have a design flaw whereby you turn the flash off, and then the camera forgets this setting and turns it back on (usually when power-saving kicks in, ironically, since the flash likely uses far more battery power than keeping the camera powered up would have).

People's learning of user-interfaces seems to be like that of pigeons in Skinner boxes; they cotton on faster than you might think, but they're also highly prone to superstition and leaping to the wrong conclusion, especially when the results are irregular/unpredictable.

Of course, some people are just inconsiderate :)


Canis, in the beginning iTunes did one thing, and did it really well -- "rip, mix, burn". That, and organize a library of mp3s and sync it with an iPod.

Today, iTunes does all that. It also provides a web-based storefront interface that insists on SHOUTING IN YOUR FACE the first time you run it, and that's not obviously easy to switch off. It lets you sync photographs from iPhoto onto an iPod. It lets you buy talking books. It tries to sell you movies. If it could control your iCoffee machine it would add purchasing Starbucks coffee beans over the internet to its wide-ranging and inconsistent repertoire.

iTunes is now a hideous collision between about four different user interfaces for four different and not obviously similar tasks, and it is well overdue for a huge redesign.


Sure, but you can ignore all that if you wish. I've never seen the web-based storefront popping up unexpectedly myself -- I think my Dad's mentioned that happening, but he's never been able to repro it when I'm around :) and I've not seen it myself elsewhere. I only ever see the store if I go out of my way to click on it -- which, obviously, I choose not to, since I get my music from independent labels or artists directly (eg Warp Records' Bleep store, or the recent In Rainbows promotion). :)

Apart from the shop, it still does just "rip, mix, burn" and "sync media [and settings] to an iPod" -- they just broadened "media" and "iPod" to include a wider range of media and devices.

Anyway, what mp3 player/manager people use is their own business, and if it doesn't suit their needs, that's cool. Some people seem to reserve a particular vitriol for it, though, and I still don't get that, or see where it urinated in their weetabix. :)

RealPlayer, now... there's a media player deserving of vitriol... maybe it's just that RP reset my expectations so low! :P


The trouble is, iTunes doesn't like the way that _I_ had organised my music, and helpfully re-ordered it and, in many cases, reformatted it to play on the iPod. Fortunately I now have enough storage lying around to keep a "clean" copy of standard MP3s arranged how I want them, and then there's the iTunes library.

I reserve my vitriol for the auto-sync options, especially the way it restarted the entire sync (which was 80% done) because I made the mistake of clicking on the option to also sync favourites.

I also don't understand why the Podcasting handling is so unintuitive. Why can't I manually copy podcasts into the Podcast playlist?

Finally, I'd like to meet the people who wrote their USB code and ask them why it's sooooo slow.


Dave, I'm with you on iTunes' insane sync behaviour. (Set it up to auto-sync a new pod. Decided to switch to manual control. To my horror, watched it delete 77Gb of files, all of which I then had to manually select and copy back ...)


Dave@124: Preferences, Advanced, "Keep iTunes Music folder organized" and "Copy files to iTunes Music folder when adding to library" -> Untick!

(Yes, the defaults are set for everyday users, rather than those who have alphabetised their mp3 collection by hand -- but you can tell it to do it your way)

Now your iTunes library will simply be an index of references back to the original mp3s wherever you keep them. Although I don't recommend keeping them on an external drive as a friend of mine has trouble with that configuration (probably because he keeps disconnecting it then running iTunes...). And I'd try it on something other than your pristine mp3s folders first, and check you're happy with the results before you take the disk-space-saving route... :)



To my horror, watched it delete 77Gb of files, all of which I then had to manually select and copy back ...)

That would accurately describe what happened to me on Friday afternoon. Except it was only 10GB but I'm light on the music side. It's still soooo slow. I move large video files around an awful lot across USB 2.0 on various devices, players etc... and the sync to the iPod has got to be about half the speed of normal.

Other fun sync behaviour. Download new podcast. Press sync. Nothing happens, nothing at all. The only reliable way I've found to do the sync again is unplugging the iPod and plugging it back in again.

Canis: Nice to know, but not this "intuitive" behaviour I keep having rubbed in my face :)

Windows Media Player, OTOH, just seems to find everything and work. All I ask is that I can leave my MP3s ripped in the folders I ripped them to by album in the CD collection folder, separate from other music files, podcasts and other things.

I tried using an external drive for iTunes on my Media Centre and it wouldn't let me do it at all. It wants an internal drive apparently.


Re phone design: I've used quite a number of old-school Nokia phones and I'm really happy with their UI. (Although I should also admit that I never understood why people complained about their VCRs; I always found them easy.) As for an easy-to-use phone that just does calls and maybe text messages: the Motorola Motofone F3. It really literally can't do anything else. And it has big characters on the display. And it's light. And very flat.

As for the people complaining about the form factor: I've seen that quite a bit with people who were upset about the small size of their phones, mainly because they (wrongly) assumed you have to speak into the microphone. You don't. It just totally doesn't matter where the microphone is. Keep the earpiece at the ear, speak normally, and everything is fine. So what would you need a "longer" phone for?

As for touchscreens and tactile feedback: this doesn't apply to me personally, but it does to most of the women I know. The great thing about the old-school numerical keypad, with its tactile feedback and buttons discernible by touch, is that you can write text messages without looking at the phone. My wife, for example, can walk along the street, taking care of our baby or whatever, with her left arm swinging by her side, holding her phone, writing text messages at a speed quicker than most people manage on a QWERTY keyboard. All she needs is a quick look at the display to make sure the addressee is correct. A touchscreen just wouldn't be good _at_ _all_ there.

Which I also remember from using one of those Motorola smartphones that 3 had as early UMTS (3G) gadgets: a touchscreen with stylus is nice, but I only realized after I had bought it that you need both hands to operate it, and you have to actually stand still to write something. Doing it while walking and keeping an eye on your surroundings (like, e.g., cars that might hit you or poles you might run into) was impossible.
Darn. So I switched back to some old Nokia.


> Now consider the combined market cap of the shrinkwrap software industry and the PC business, and contemplate what deflating it by, oh, 80-90% will do to the western economies.

The smart vendors/producers saw this coming a while ago and are adjusting; micropayments are becoming feasible, and we'll soon pay per use/second/GB/whatever instead of a flat fee up front.

This makes more sense from just about everyone's point of view anyway, and even before micropayments we're already in a "subscription model" for much software, especially in business; pay $X per year for the use of the software - lower than the flat-fee price and new versions are included.

What does this do to hardware though? It becomes the razor and software is the blade; the hardware is merely the enabler to get you in the system, making the micropayments. This too is already happening.
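A toy break-even sketch of flat-fee versus subscription pricing; every number here is a made-up illustrative assumption, not real product pricing.

```python
# Toy comparison: one-off licence (plus paid upgrades) vs annual subscription.
# All prices are invented for illustration; not any real product's pricing.
FLAT_FEE = 300.0        # one-off purchase price
SUBSCRIPTION = 120.0    # per year, new versions included
UPGRADE = 150.0         # assume flat-fee users buy an upgrade every 3 years

def flat_cost(years):
    """Cumulative cost of the flat-fee model after `years` years."""
    return FLAT_FEE + UPGRADE * (years // 3)

def sub_cost(years):
    """Cumulative cost of the subscription model after `years` years."""
    return SUBSCRIPTION * years

for years in (1, 3, 5):
    print(f"year {years}: flat ${flat_cost(years):.0f} vs sub ${sub_cost(years):.0f}")
```

Under these made-up numbers the subscription is pricier over a long horizon but cheaper up front, which is exactly why it lowers the barrier to entry for the vendor's "razor and blade" play.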


Going back to the discussion about the bottom being due to fall out of desktop prices too: Wired now has a review of the $200 Everex PC being sold at Walmart. Also Linux, natch.

Hands-on with Everex's $200 gPC

So, yeah... Bruce called it: that's already nearly half the price of the Dell critters I was commenting on.