
Gadget patrol: why my Android tablet experience sucks

(This flame bait brought to you in lieu of a real blog entry, due to exhaustion from traveling.)

Driven by Apple's persistent failure to lighten my wallet by announcing a 7" iPad, I recently acquired an Android tablet: a Samsung Galaxy Tab 2 (7-Inch, Wi-Fi), aka GT-P3113. What can I say? It's cheap, but it's not one of the nasty knock-offs. If the hardware ran iOS and had an Apple dock connector (or just plain ordinary micro-USB) I'd be singing its praises. As it is, it has rapidly become my preferred ebook reader, beating out the Kindle Fire (which was designed for that task). So you can take this as a lukewarm recommendation—if you want a jacket-pocket ebook reader that can do other stuff on the side, this one is quite classy.

But I have reservations about the bigger picture ...


Let's fast-forward through the pros first: the Samsung Galaxy Tab 2 7.0 (such a classy name!) is decently designed hardware, has a good feel, and makes a better 7" ebook reader than the Kindle Fire—it's thinner and lighter, of roughly the same dimensions as the Kindle Keyboard, but comes with extras such as cameras, a microSDHC slot, bluetooth, and GPS. Samsung seem to know what they're doing when they stick to making machinery. Oh, and it runs Ice Cream Sandwich which, while not quite as slick as iOS right now, is a big improvement over earlier versions of Android. Also, yay, 50GB of extra free storage on Dropbox for the next 12 months, until the bill comes due.

Cons: Alas, Samsung majored in the Microsoft OEM school of crapware vendors; they seem to think Sony are a good object of emulation in this respect. The Galaxy Tab 2 may run Android 4, aka Ice Cream Sandwich, but they just couldn't resist the temptation to slather it with embarrassing quantities of junk applications in a pathetically poor attempt to ape Apple's walled garden approach to providing tools. It's full of Samsung-only apps (chat to other Galaxy Tab users, mail via walled garden servers, share photos with other Samsung owners, and so on) that you can't delete. By my estimate they take up around 20-30% of the not-terribly-large internal 8GB of flash storage, and the app launcher they supply keeps trying to push them on you.

I can hide most of the junk so that it doesn't get in my face the whole time, but it's still occupying valuable storage: meanwhile, Android comes with the obvious Google apps. Why does Samsung insist on trying to steer users towards Samsung apps that duplicate their functionality but miss out key features that make them useful, like, oh, being able to share stuff with folks who don't own a Samsung device? (Don't answer that: it's because a high-level marketing committee thought it would be a really good idea to try to sell web services to their customers that locked them into Samsung, not realizing that this is adding negative value to the product.)

Adding insult to injury, it's relatively hard to root the Galaxy Tab 2. (I will freely confess to being a n00b with respect to both Android and Windows: doubtless if you're heavily into these platforms it's quite easy, but there's a bit of a learning curve if you don't routinely work with them.) Rooting—a necessity if one is to remove the crapware or replace it with a Cyanogenmod build—seems at present to require installing Android dev tools on a Windows machine and then using an arcane piece of debugging software. It's not outright impossible, but it's not inviting. (I'll try it later. If I don't return within three hours, send a search party ...)
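
Whatever the rooting route, step zero is usually persuading adb (the Android Debug Bridge, which ships with those dev tools) to see the device at all. Here's a minimal sketch in Python, assuming adb is on the PATH and USB debugging is enabled on the tablet; the output parsing is illustrative rather than definitive:

```python
# Minimal sketch: check that adb can see the tablet before attempting anything
# as drastic as rooting. Assumes the Android platform tools ("adb") are on the
# PATH and that USB debugging is enabled on the device.
import subprocess

def attached_devices():
    out = subprocess.run(["adb", "devices"], capture_output=True,
                         text=True, check=True).stdout
    # "adb devices" prints a header line, then one "serial<TAB>state" line
    # per attached device.
    return [line.split("\t") for line in out.splitlines()[1:] if "\t" in line]

if __name__ == "__main__":
    devices = attached_devices()
    if not devices:
        print("No device visible: check the cable, drivers, and USB debugging.")
    for serial, state in devices:
        print(serial, state)  # state should read "device", not "offline"
```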

To add to the fun, Samsung have some strange ideas about my willingness to buy into their hardware ecosystem. Apple's products use the now-familiar dock connector instead of regular micro-USB. This is annoying, but (a) you can buy a tiny dock connector to micro-USB dongle for about £5 if it irritates you sufficiently, and (b) there are lots of cheap third-party cables. Lots of third party kit out there uses the dock connector, which has been stable for about 8 years: the evidence is in the shape of all those alarm clock radios and speaker docks. Samsung, in contrast, invented a wholly new and incompatible dock connector for the Galaxy Tab 2. One that is not compatible with earlier Galaxy tablets released as recently as late 2010. The cable sells separately for $20 (so if you lose the cable for your tablet you're stiffed paying nearly 10% of the total price for a replacement wire to the wall wart).

What Apple have learned and Samsung appears to be in denial over is that having a decent peripheral ecosystem—both software and hardware—is what makes the tablet computing experience a happy one. By making it hard to hook the Galaxy Tab up to peripherals (see "rapidly changing proprietary connector" above) they've screwed the third party market, and by slapping poorly-performing junk all over the tab they've degraded the end user experience.

Editorial time: tablets are aimed squarely at people who don't use computers (except at work, managed by an IT department, to do business). Apple are trying to build out to a whole new market of people who never bought into the PC or Mac, dismissing computers as "too complicated". These people neither care about nor understand technical specifications. What they care about is aesthetics and convenience and price. Samsung's Android offerings compete well on price, but clunky aesthetics and walled-garden obstacles are not the way to build repeat business. Android is already so badly fragmented that it's hard for developers to work with (just look at the graphs of screen resolutions and sizes in that link if you want to see a user interface horror story in the making) and it's possible that Samsung think the only way to provide a consistent high quality experience is to build their own walled garden for their own machines. But in so doing, they're making it harder to leverage what value exists in the rest of the Android marketplace.

It's no wonder the Android tablet vendors are losing ground to Apple and that Google have bought a chunk of Motorola so they can issue their own tablet as an example of how it should be done; Android is showing all the signs of fragmentation that hit the Windows PC market, only much faster than Windows fragmented—and meanwhile, there's a monolithic, well-designed walled-garden rival that, while more expensive, delivers better value for the money.

425 Comments

1:

Charlie, any thoughts about Surface?

2:

I'm waiting another month for the Google branded 7-inch tablet that is supposed to be on its way. At least that way I get first class Android support and none of the OEM idiocy that seems to be happening these days.

3:

I don't know if it's different between Samsung and Asus (I bought a TF101 Transformer last year; its ICS upgrade happened relatively quickly but rendered the device too unstable to rely on until subsequent patches (marginally) improved it) - and it's impossible to say without owning both - but if you think that's all a pain, wait until you delve heavily into the Play Store and discover just how many ICS apps are actually incompatible and unavailable for tablets, even when the OS is the same.

Or discover apps which drop adverts into your notification area.

If anything shows just how little tablet focus there is in the Android dev sector, it's how many 4.x compatible applications seem to be phone-only device-wise.

4:

The Google Nexus 7" tablet is supposed to be announced Wednesday next week. Hopefully it will do for Android tablets what the Nexus phones have done for Android phones.

What I really want, though, is for Google to state "This is what the dock connector on an Android tablet must be". That more than anything will help the tablet market.

If the dock connector is cheap and easy to do then every Chinese clone (of which there are many) would switch to it pretty quickly. They've all switched to ICS already because it's been so cheap and easy for them to do.

5:

Charlie, any thoughts about Surface?

Yes: I'm ignoring it. Simples!

(The most plausible analyses I've read suggest that (a) MS is giving their OEMs a reference target to aim for, and (b) its commercial goal is to head off the iPad insurgency in corporate boardrooms before it becomes unstoppable.)

But as Steve Jobs noted, the trouble with Microsoft is that they've got no sense of style. Which, in turn, impacts on usability. I am too old to spend my remaining years spending hours making badly designed software do what I want it to do.

6:

I'm waiting for the Google machine, which if not totally crap, I will buy. I am currently playing with a £75 7" ICS tablet from China, which is actually surprisingly good given the price. I expect that Google is going to produce the definitive 7" machine at a very affordable price. A 7" tablet will just fit into a jacket pocket. Anything larger is really not comfortably portable.

7:

Hopefully it will do for android tablets what the nexus phones have done for android phones.

I'm an iphone user: humour me. What have Nexus phones done for Android phones? (I mean, have they goosed the other suppliers into providing a better user experience?)

8:

My only thought is that it's Win8, which looks like continuing the MS tradition of every other OS release being a bit crap. By the time Win9 is available I suspect Android and Chrome OS will have merged and will be appearing on multicore 64-bit ARM desktops.

9:

For me, the big downside to the TF101 is that cable. Guys, yes, we know Apple use a totally proprietary cable instead of USB to connect. And yes, we know you're using a single socket as both external connector and keyboard docking connector. But don't follow them down that path. Or if you do, make sure everyone and his dog can get that cable easily.

If one end of the cable is USB, there's no reason for the other end not to be. I love that when I bought a Kindle, it just worked with the same charger as my Samsung GII phone. None of this having to scare up another cable, which Apple and Asus make you do.

(At least the other end of the cable is standard USB so you can plug it into a PC or into a power brick or whatever. But it's way too short too.)

FWIW, I hit no instability on ICS, but I've probably got a different balance of apps.

10:

Mainly what they've done is push the hardware. Look at the Nexus One, Nexus S and Galaxy Nexus. Each was a leap over what was available at the time. 3 to 4 months later all the mainstream phone makers were announcing products which were very similar in terms of specs.

Yes, it's true that they all want to put their own spin on the UI and this can be a pain in the arse. But ultimately my Android phone is just that, mine. I can do whatever I like to it. And yes, I know that's because I'm a geek and the average man on the street just wants a phone that works.

11:

The primary intent of the Nexus line (I'm typing this on a Galaxy Nexus) is to show how good Android can be when unencumbered by crapware – and thus dissuade folks like Samsung from installing it. So far, this doesn't seem to be working very well, but at least it gives me a good phone that's not a walled garden.

12:

I got a Samsung Galaxy Nexus when my old phone crapped out and I was trying to future proof myself as much as my budget would allow. Samsung definitely makes good hardware, but I wouldn't be able to stand the crapware after doing without.

Since many Android devices have quite limited storage it's ridiculous that app developers don't take that into account when they're bloating up their offerings, especially when they're putting together something that's destined to be a permanent fixture. iOS developers regularly commit the same sins, but that's mostly a matter of testing bandwidth limits. The new Windows 8 phones are likely to be the worst offenders as the big selling point is that the code for desktop applications will be easily ported.

13:

Yeah, the cable - and its piddly length - is a major pain (and like you say, pretty inexcusable design-wise, even allowing for the keyboard dock - which I don't have). ICS reliably freezes Angry Birds in all iterations - which help forums suggest is relatively common, even with both OS and apps updating repeatedly since Asus pushed its initial upgrade - as well as, less frequently, CTDing Dolphin and... uh... whatever text editor it was I generally used. Even Repligo and (uh?) EZPDF, which I use for PDF annotating, occasionally bug out. The stock onscreen keyboard became less responsive too, especially the spacebar. Used to be I could touch type on it, but not reliably any more.

For all that Honeycomb was a massive rush job that shipped without "why isn't this here?" UI elements like the scrollable open apps list, with the exception of the still-buggy stock Android browser and the slow drag waiting for software to catch up to the larger form factor, it was reasonably nippy. I've never wanted to downgrade as much as when 4.0 initially hit, and if it wasn't such a pain (and risked wiping out my 5-year-old's collection of Minecraft saves; no one needs that level of stress!) I would've done. ICS adds some nice features, but I've consistently had the impression that it would be much better on a phone instead.

14:

Sad to hear Samsung have loaded non-deletable applications onto their Android tablet. To me, Android is about Linux really, and Linux is very much about choice. It would be nice to see Samsung not follow the Sony route in terms of loading pre-installed garbage.

Good to hear your recommendation of the device as an ebook reader. What's your take on the best formats for ebook reading these days? Is PDF still really a bad way to travel?

15:

The iPad has become very popular at my workplace. A lot of professionals have started purchasing it out of pocket (it isn't standard equipment yet) and carrying them at all times. It seems that they've ditched laptops in favor of the tablet. Interestingly, no one uses Android tablets. We still use Windows machines in most departments too.

The funny thing to me is when people use these high-end cases with built-in keyboards and basically turn their iPad into a $900 laptop with less capability than a $250 netbook....

Personally, I like the direction Motorola was exploring with their lapdocks - turning phones into netbooks. Webtop isn't a great OS, but I'm hoping that Google will develop the hardware further and merge them with Chromebooks. Being able to plug an Android phone into a laptop-shaped accessory and use it as a cloud-connected laptop would be wonderful.

16:

The proprietary cable thing is so awful. At least cheap i-device dock cables are available. (Though why I can't buy one that charges but doesn't sync is beyond me; my IS department freaks out about certain devices being connected to the USB ports.) I had hoped the EU charger reg would end this nonsense, but it only applies to phones.

The Surface hardware looks interesting, especially the keyboard bit. (Yes, nobody has actually used it yet but it looks interesting.) I'm quite happy with my Adonit Writer Plus keyboard on iPad 3. Though I wish there were pointing devices! I'm annoyed that I will have to jailbreak my device if I want the Bluetooth stack to support mice. It's true that the complaint about iPads not being for creation is overblown, but Apple does make it harder than it should.

17:

I know I'm not Charlie, but my first thought on seeing the pictures was "Oh, cool!" and despite being a mac-buyer for all my hardware needs for a couple of decades I even wondered about saving up for it.

Then I noticed some issues. If I'm going to save up for it I'd like to know how much I need to save and how quickly. M$ failed to provide either piece of information. I know it was an official release, not a leak from somewhere, but it's uncomfortably close to the "iPhone 5 will be liquid metal" leaks and the like. It's a step better than vapour-ware - I understand they let some journalists actually touch it briefly - but there's a lot waiting to appear about how it works, how it feels to use, how much it costs and when I could buy one if I'm still interested after reading the other stuff.

So now I'm aware it's there. I'll look at it and keep it in mind. But not so excited.

18:

You're not Charlie, sure, but having read some of your techie blog entries about things like OOPS, IM(informed)O anything you say on this sort of subject is pretty much equally valid.

19:

I think OEM idiocy sums it up very well.

I have a cheap 7" tablet that runs Ice Cream Sandwich. It doesn't have Bluetooth, but it does have an HDMI connector and the small size, so it makes a useful media player.

It is outside the Google ecosystem for Apps, and some you can download don't work in interesting ways. So you can read an epub, but you have no way of getting the Kindle app. It has a micro-SD slot, but you can't write to it, not unless you connect the machine to a PC and treat it as USB storage. (I suspect permissions.)

Incidentally, my brother bought a refurbished Android smartphone, which had been upgraded with Ice Cream Sandwich. Only trouble was, it didn't work as a phone; the mic only produced noise. He went to the manufacturer's website, and luckily they had the downloadable files to go back to factory-state. And a week later they released their automatic upgrade. Works fine now.

It makes me wary of rooting anything Android.

My conclusion: tablet computers are still a bit of a gimmick, but Apple is a known unknown. You don't know how you will cope with an Apple tablet, but they really want you to have something that works.

20:

I think in your position I would have taken a punt on one of the Very Cheap slates from China - the better ones have a decent reputation for build quality now, and they are driven to be interoperable and generic where Apple and Samsung are driven to be the opposite. Thus they all have standard connectors and run vanilla ICS. Just bought the Eken a90 for my mum and it seems very solid.

http://www.futeko.com/browse.php?cat=tablet

(but rooting the Samsung and flashing CM is fun, right!)

21:

I agree with the other posters that say it's not Android, it's the OEMs. My Archos 70 tablet is underpowered, and stuck on Froyo, but at least it's very close to the stock Google experience for Froyo. The data connector is microUSB, and, with the right cable, it will do USB host as well (so can be used with a standard keyboard or external drive).

I'm looking to replace it with the upcoming Nexus 7 if Asus/Google hit their projected price point; mainly just to get the ICS upgrade more than anything else.

22:

The Surface Pro (the PC-onna-slab model) keyboard has a built-in trackpad so you don't have to fatfinger the screen to make things happen.

I'm rather looking forward to the Surface Pro hitting the market once the OS is gold, in October or thereabouts, in part because I want to see what the OEMs will produce to compare with it now they've been given a kick up the ass, and also to see the race by hackers to install OS X on it. It will run VMs so Linux etc. will be easy-peasy.

A killer vertical-market niche for the Surface Pro is graphical artists and editors. It is, in effect, a pocket-sized portable Cintiq. It can run the full version of Adobe Creative Suite including Photoshop and its 600dpi stylus (which I am assuming is pressure-sensitive although this has not yet been confirmed) makes it a no-brainer for such users.

23:

Google isn't using Motorola as a captive OEM. Asus is Google's partner for the Nexus tablet.

Google is helping clean up the OEMs' act: Instead of a single OEM chosen as the lead for each new Android version, all the top OEMs will release new products and new versions at roughly the same time.

24:

Ebooks: there are two horses in trade fiction -- Amazon's walled garden based on mobipocket format, and ePub. Everyone else uses ePub and Amazon has the internal resources to roll out ePub support if they ever feel like it.

The only fly in the ointment is DRM, about which I have said enough for now. My ebook readers of choice on the Tab are: the Kindle app, and FBReader (which I've been using since, it seems, forever on one OS or another). Oh, and Dropbox to sync my Calibre library with whatever reader I'm using, and Calibre to transcode and organize files.
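
If you want to script the transcoding leg of that pipeline, Calibre ships a command-line converter, ebook-convert, alongside the GUI. A minimal sketch, assuming Calibre is installed and the library lives wherever Dropbox syncs it; the path is a hypothetical placeholder:

```python
# Minimal sketch: batch-transcode ePubs to MOBI for the Kindle app using
# calibre's ebook-convert command-line tool (it ships with calibre).
# The library path is a hypothetical placeholder; point it wherever
# Dropbox syncs your books.
import pathlib
import subprocess

LIBRARY = pathlib.Path("~/Dropbox/calibre-library").expanduser()

for epub in LIBRARY.rglob("*.epub"):
    mobi = epub.with_suffix(".mobi")
    if not mobi.exists():  # skip books that are already converted
        subprocess.run(["ebook-convert", str(epub), str(mobi)], check=True)
```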

25:

Samsung are obviously tools - trying to ape the Apple model on the Android platform, which is never going to work. You end up with the worst of both; a walled-garden of crap. That said, I don't think that's a problem with Android - it's Samsung that're causing most of the issues you've described.

Personally, I have been using Android devices for years, but have always bought HTC after one or two recommendations. I have heard lots of people have problems, but have never encountered them myself (although I have never tried an Android tablet either). They have the miniUSB for starters - no walled garden peripherals there. I'm going to stop now as I'm starting to sound like a spamBOT; maybe I am and this is a more sophisticated obfuscation.

Ok, getting to the point, the problem is that Android is an open platform that is still new and growing into itself. There are still so many pitfalls to avoid and it's not clear they will, but I really hope they do. A lot of people (such as yourself) desire the sleek controlled ease of Apple, but I don't see it as a great evolutionary path. It's easy for Apple to produce the kind of 'ecosystem' you desire, as they maintain authoritarian control over the ecosystem and therefore can guarantee homogeneity (within reason - I have read about new stuff that will not be backwards compatible with the iPad 1) and encourage a greater peripheral ecosystem, but where does this lead? It certainly doesn't chime with my preferred future of more open platforms allowing greater inclusion.

iStuff certainly has its own downfalls - my boss got an iPad for work purposes (primarily, to take to meetings instead of a laptop to show presentation material) but to date it's mostly used by his kids to play games. Why? Because getting the required material onto it is a royal pain in the arse. If it was Android all I'd do is plug it in like any other flash-drive and copy the files across. It's dumb to make such a simple thing so complicated.

As for the Win Surface - the ONLY reason I'm waiting on tenterhooks is I want a tablet that supports ASIO drivers, so I can plug my soundcard peripherals into it, making it actually useful for musical applications. If Android gets that first I wouldn't even care.

My god, sorry I've left such a long reply. In brief - I don't know if Android will get better, but I hope it does because I want a more open future, not one contained within a series of mutually-exclusive walled-gardens....

26:

Rooting the tab should only require installing the tab drivers and an app called Superoneclick -

Here's a straightforward explanation with links to the necessary drivers and software:

http://www.informationweek.com/byte/howto/personal-tech/tablets/232500651

27:

In brief - I don't know if Android will get better, but I hope it does because I want a more open future, not one contained within a series of mutually-exclusive walled-gardens....

I hear where you're coming from: I come from there myself.

The trouble is, you hear "open" and the folks tablets are aimed at -- people who do not use computers -- hear "malware, viruses, and spam". The majority of existing PC owners aren't competent to secure their machines against malware (and don't ask me about Mac users: they've had a wake-up call this year after decades of smug near-immunity, but it's anyone's guess how many heard it), and we're seeing tablets pushed at people who are, culturally, even further removed from the computer users we know. People who don't know any distinction between active memory and storage, between apps and operating system, and who don't know what files or folders are for.

Apple's design philosophy is that these people need protecting. They're arguably wrong for folks like us. But for the 99%? They're right on the mark.

28:

You didn't read my blog entry or that article properly, did you?

Firstly, this is the Galaxy Tab 2. A different, newer machine running a different version of Android.

Secondly, I don't own a PC and if I did it wouldn't run Windows. (We'll tip-toe past the Viliv hand-held, which frankly isn't up to working as a rooting host.)

29:

How is the charger mandate affecting iPhones in Yurp? Is Apple supplying micro-USB adapters or has it changed the socket to micro-USB?

30:

It seems like Samsung is trying to bootstrap itself into an Apple-like position. It might well succeed, but all of these problems will be rampant for several years while it tries (just as they were, several times, with Apple products, while Apple hemorrhaged money trying to bootstrap a new walled garden incompatible with the one it was locking everyone into six months prior).

While I haven't rooted my own Android device (it was like that when I bought it), it sounds strange that everyone who has done such a thing used Windows to do it. I suspect that someone has solved the root-android-devices-on-an-intel-unix problem, and while such a solution may well be Linux-specific and make stupid assumptions like the existence of pyusb and glib2 and apt (a decade ago it would have assumed the existence of RPM and all of CPAN), you can actually expect the thing to be trivially portable to any x86 unix (meaning if you run OS X or some other BSD derivative you're probably OK). I would be very surprised if such a thing didn't exist -- but then, I was quite surprised that all the non-python code for interfacing with consumer EEG machines depended upon a particular version of Visual Studio too.
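
For what it's worth, the first chore for any such cross-platform tool is simply spotting the device on the USB bus, and pyusb makes that trivial. A minimal sketch, assuming pyusb plus a libusb backend are installed; 0x04e8 is Samsung's USB vendor ID:

```python
# Minimal sketch: enumerate the USB bus and look for a Samsung device, which
# is the first thing any cross-platform rooting tool has to do. Assumes pyusb
# (and a libusb backend) is installed; 0x04e8 is Samsung's USB vendor ID.
import usb.core

devices = list(usb.core.find(find_all=True, idVendor=0x04E8))
if not devices:
    print("No Samsung device on the bus.")
for dev in devices:
    print("Found %04x:%04x on bus %d, address %d"
          % (dev.idVendor, dev.idProduct, dev.bus, dev.address))
```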

31:

Why not instead get a Samsung Galaxy Note? The screen is large and beautiful, it fits into your pocket, it's a phone, too, and CyanogenMod 9 nightly builds run on it quite well - the nightlies are stable, as I can attest from first-hand experience.

With the 5.3" screen of the Note, I no longer need my iPad or indeed a tablet of any sort. I don't need a laptop for 95% of what I do, either.

I'm working on trying to convince the Clamcase people to make a clamshell case for the Galaxy Note which includes a Bluetooth keyboard. If they won't do it, I guess I'll pay someone to build a 3D model of what I want and get one of the 3D printing outfits to churn it out for me.

32:

Why not instead get a Samsung Galaxy Note?

Because it costs more than twice as much, and when I'm just dipping a toe in the water I'd rather do so on the cheap.

Oh, also: another goddamn proprietary connector!

I have a perfectly good iPhone 4, and am sitting out the iPhone 4S. I'll make my mind up which direction to jump in when the iPhone 5 is announced -- either go for an iPhone 5, or change platform if Apple have (in my opinion) jumped the shark.

Note that my next-phone options include: stick with the old iPhone 4 (it still works), buy a new iPhone 5, switch to a high-end Android phone and an iPod Touch to keep a toe in the small-handheld-iOS camp, or some other combination (e.g. stick a PAYG data-only SIM in the iPhone and use it as an iPod Touch 3G, while buying an Android phone). I simply don't know yet. If you know what the best value for money and trade-off for my needs will be in 4-6 months' time, you ought to be earning a decent living as an industry analyst :)

33:

tablets are aimed squarely at people who don't use computers (except at work, managed by an IT department, to do business). Apple are trying to build out to a whole new market of people who never bought into the PC or Mac, dismissing computers as "too complicated". These people neither care about nor understand technical specifications. What they care about is aesthetics and convenience and price

This is undoubtedly true, but it also applies to those of us who use computers all the time recreationally but still don't want to have to learn computational arcanery (I'm not even talking about code, just the counterintuitive user interfaces that most software possesses). I have a laptop running Windows and an iPad; I much prefer using the iPad, and if it wasn't for the fact that I'm not good at using it for extended writing I'd chuck the laptop.

A rather large feather in Apple's cap (that other vendors seem to have finally caught onto but are competing by copying Apple in inferior ways) relevant to this discussion is the ease and enjoyment in the user interface. I've only had limited experience in using non-iPad tablets but I've yet to find something that works so intuitively and smoothly. It's still incredibly fun for me to tap, glide and pinch my way around even the most mundane of apps.

34:

No, the Galaxy Note has a MicroUSB connector. Nothing proprietary in the least.

35:

A friend of mine referred me to this article after a discussion about tablets (iOS, Windows RT, Android). He felt that this was a good read. I have to say, I am completely disappointed in what I read, and your focus on what you consider cons is ridiculous.

Let's start with the most obvious idiocy first. You state "By my estimate they take up around 20-30% of the not-terribly-large internal 8GB of flash storage" - you do realize that for a very small cost (about $10 US, not sure about UK pricing) you can add a 16GB MicroSD card to the device, bringing it up to 24GB, don't you? To do the same in the iOS world you would need that little dongle hanging off your device 24/7. Not to mention the dongle costs money too, but more on that later.

Next, you discuss the hardware ecosystem. (I'm skipping rooting because I don't think a person should have to root their device and void their warranty to get a functional device, and I'm skipping the canned apps part because that argument is just... stupid.) Okay, let's be serious for a minute. I've owned a lot of tablets (iPad, iPad 2, Transformer, Streak 7, some Chinese knock-off tablet, Archos 101, Nook Color, and Kindle Fire). I also know a LOT of people with tablets. Not a single person, including myself, has ever really bought more than a cover, sans the dongle for the iPad for my photography friends. I would guess, based on a small sample (N = ~14), that most people don't buy all these add-on hardware items you reference. Perhaps a Bluetooth keyboard, but that is really the most you see.

With regards to the operating system, as you can see from the list, I've had just about every flavor, including ICS on the Fire right now (yeah, I get that this invalidates my comment about rooting, but I'm a gadget whore that likes to try new things). About the ONLY thing iOS seems to do better, in my experience, is web browsing. The stock Android browser, even in ICS, sucks in comparison. However, Chrome Beta is becoming a serious contender, and once that is finished I would imagine that web browsing in Android will be just as slick.

Now, let's talk about the one thing you didn't really draw any attention to in your article: total cost of ownership. It has been my experience that doing anything in the iOS world is more expensive. I'm hard pressed to think of a single app in the iOS world that is less expensive than the Android equivalent. Most Android tablets offer an Office suite as a default app (something you DIDN'T mention in your discussion of the canned apps on the unit). In the iOS world, that's another what, $20 US... give or take? Not to mention that so many useful apps in the Android world are completely free, compared to being nickel-and-dimed to death in the App Store. So, not only do you pay more money for the Apple tablet out of the chute, but when you add all the other crap you need to pay for to get something useful (beyond web browsing), your TOC jumps up DRAMATICALLY.

The last thing I'm going to comment on is fragmentation. This is the big cry of the Apple fanboys. I'm curious what applications you are using that suffer from fragmentation? I use a LOT of apps on a regular basis, not including games, and I have yet to run into a single app that works on my Android phone that doesn't work on an Android tablet, with the large bulk of them scaling correctly and looking beautiful. So what does that mean? I buy one app and it works on my phone and my tablet. In the iOS world, sure, they have tablet-specific apps, but that's a drawback. If there is an app I like, I buy it, and then on the tablet I have to hit the little x2 button and get a horrid looking stretched-out app experience OR I have to buy the tablet version. Yes, you guessed it, TOC goes up AGAIN because you are basically buying the same app twice!

So, no offense, but your review is incredibly horrible, inaccurate, misses key information, and seems like it was written from an incredibly bitter and biased perspective. I mean really, you spend half your article complaining about a couple of stock apps. That's pretty petty.

36:

Regarding the chat client, if it is called ChatOn, there is a huge difference from iMessage, as it is not Samsung-only but available for all Android users, and it also seems available for Blackberry and iOS (not sure about Symbian though). I have not tried it yet (it came with my new phone) but it does not feel like the regular vendor lock-in you see today. I cannot comment on the quality, privacy or security of it though.

37:

you do realize that for a very small cost (About $10 US, not sure about UK pricing) you can add a 16GB MicroSD card to the device bringing it up to 24GB, don't you?

Yes. However, you can't install apps on the external MicroSD card under Samsung's implementation of ICS!

I also know a LOT of people with tablets. Not a single person, including myself, has ever really bought more than a cover, sans the dongle for the iPad for my photography friends.

Then you don't know enough people -- or you know a different self-selected subset of tablet users than me.

I'm hard pressed to think of a single app in the iOS world that is less expensive than the Android equivalent.

You're aware that most Android apps are subsidised by advertising, right? This isn't the place to get into privacy leakage issues, but in general iOS paid apps appear to rely much less on adware and personal information monetization than Android apps.

I'm curious what applications you are using that suffer from fragmentation?

It's not about me, it's about the Android Marketplace (or whatever Google renamed it to last week). Bluntly, the quantity and quality of apps there is inferior to the Apple App Store. Numerically inferior in absolute numbers of apps, and a far higher proportion of adware (and some outright malware).

Final note for Robert Hoppe (if that wasn't just a drive-by): read the moderation policy before you comment again. I'm cutting you some slack this time, but I expect you to be more polite next time I see you around here.

38:

You don't need to install apps on the MicroSD card, just use it for data. The Android apps are small - smaller than the equivalent iOS apps, in most cases.

Yes, you need to root your device and then get CM9 going on it in order to have a decent user experience. I wasn't ever able to get the root stuff working on my MacBook Air, so I used my Windows gaming PC to do it. It's a stupid process, but once you're rooted, it's relatively easy to install CM9; once you're on CM9, it's painless to move from one build of CM9 to a newer build, you can do it all on-device.

39:

Vizio is touting its lack of bloatware as a feature on its newest Windows machines. That move has placed Vizio on my short list for my next laptop.

(The others are Lenovo and Apple, FWIW.)

40:

such a classy name!

Excellent point. I really don't understand why the vast majority of electronics have such convoluted naming schemes that make product differentiation so difficult. Apple makes it so easy: MacBook Air, MacBook Pro, and a few spec increases in each line. Contrast this with any major PC vendor and you get upwards of a dozen different "brands" from any particular vendor, often with non-trivial differences & options intra-brand, none of which makes it obvious what the product-line differences are.

I'm not saying it should be as simple as Apple's spartan product line, only that consistency & clarity of branding would go a long way towards eliminating the confusion of having to parse out what the differences are between an HP ProBook 5330m and a near-identically specced Pavilion dv4t-5100 or Envy Sleekbook 4t-1000.

41:

Since Windows itself is bloatware, this doesn't hold a lot of appeal, heh.

;>

42:

The instructions you (Charlie) link to look pretty average for rooting an Android thingy. Certainly a similar amount of effort compared to everything else I've tried rooting in the last few years (except when someone found that PDF exploit that let you root an iPhone from a web page; that was easy). Mind you, I do find rooting something and installing CM a fun activity, but that's just me I guess. Perhaps a Windows VM would be handy to have lying around?

43:

phuzz, you didn't read my reply. (TL;DR: those instructions are for a different model.)

44:

Any manufacturer who loads non-deletable software on a machine is telling me in effect that I don't own that device. My response to that is simple - if I can't own it I don't buy it.

45:

"The trouble is, you hear "open" and the folks tablets are aimed at -- people who do not use computers -- hear "malware, viruses, and spam". The majority of existing PC owners aren't competent to secure their machines against malware (and don't ask me about Mac users: they've had a wake-up call this year after decades of smug near-immunity, but it's anyone's guess how many heard it), and we're seeing tablets pushed at people who are, culturally, even further removed from the computer users we know"

This thread seems to be generating a lot of comments, so I'll keep it short - yes, this is true for now. But ATM the majority of users with cash were brought up using sub-standard school PCs or some such, or basic home PCs. Over the next decade the first generation raised in a world with ubiquitous mobile phones will be coming into the marketplace. These people will be generally far more savvy. This is the hope I cling to....

46:

Over the next decade the first generation raised in a world with ubiquitous mobile phones will be coming into the marketplace. These people will be generally far more savvy. This is the hope I cling to....

You're an optimist. I'm afraid I have exactly the opposite expectation of that generation's degree of savvy. (Going by things like the Raspberry Pi initiative, the educators agree with me, too: it reeks of desperation -- "oh shit, we've mistaken basic computer literacy for being able to use Microsoft Word and we've raised a generation of clueless lusers instead of software developers! Where do we go now?")

47:

A tablet with Android will always suck because Android sucks. A couple more tablet tweaks to Ubuntu and it will be a far superior choice.

tablets are aimed squarely at people who don't use computers (except at work, managed by an IT department, to do business)

I think you're right that most companies are AIMING at this sector, but I don't think it will prove to be the primary demographic that wants/uses them. As a heavy computer user who loathes the laptop format, I long for a tablet from 2018 (though not a walled garden model).

48:

Thanks, this was very helpful.

49:

I agree, and have similar feelings about any manufacturer (or softco, or indeed media vendor) who claims that they have the right to remove content from a device in my possession.

50:

"You're an optimist"

Always.

Just read your moderation policy (since you told that guy off who posted around No.35) - love it!

51:

"If one end of the cable is USB, there's no reason for the other end not to be."

Certainly true in general, although the Apple cables have a lot more stuff in them besides just USB (e.g., various video and audio signals, which some peripherals make use of). USB itself is just four wires.

Even when both ends are USB, there's a lot of unnecessary annoyance. Kindle, Android phone, iPhone/iPad, USB mic, camera... all "USB", and every one of them uses a different connector, and that's just among the devices that I use regularly. A year or two ago, I searched out all the USB adaptors I could find and bought one of each. I had at least eight (one of which is now MIA), and I don't think that covers the full gamut.

52:

AIUI, if the pinouts aren't as per http://en.wikipedia.org/wiki/USB then the device isn't USB.

53:

Got to say, I'm with Charlie.

There is (I'm learning to my poor brain's aches and WTF?! screams) a pretty significant barrier to writing code for Android and iOS devices.

I'm not convinced that using either of them really encourages you to become a sophisticated IT user with any understanding of underlying processes. Both modern versions of Windows and the Mac OS are continuing a process of making this more and more obscure too - although there might be hope for the small subset of Linux users - you may or may not learn it formally, but if you're cd-ing through directories and the like you've surely got to pick up some clues on the way?

I rather strongly feel that, as computers have penetrated the first world population fully - and including iPods, smart-phones and the like in that category (as I would) extends it to quite a lot of the second and third world too - it's become far less about knowing what's going on, and much more about how to get it to do what you want. All the furore about digital natives, when you actually examine it critically, basically says us old fogeys learn IT stuff better. We poke, prod, experiment because we've had to learn it that way. As a population (and there are definitely individual exceptions) the "digital natives" learn the basics of what they need and stick in their safety zone with that. Next Gen Nonsense is a good source for some research on this.

Free CMS packages and wikis have destroyed the need for HTML skills even. Do you know how many people don't know you can write a (very bad) webpage in Word and export it even?

54:

Not to shill, but dealextreme.com has tonnes of knock-off cables of all sorts -- order three of them and at least two will work, and if only one did, you'd still be ahead.

Now to prove that I'm not a spam-bot:

751 + 2 =

753

55:

Even if the manufacturers adhered to the "standard" (which they don't), you'd still have full-size, mini and micro, each of which comes in both male and female and "A" and "B" configurations. That's at least 12 possibilities for a simple 4 wire cable. That's quite a "standard".

56:

Rooting... [is] not outright impossible, but it's not inviting. (I'll try it later. If I don't return within three hours, send a search party ...)

Seconded. It took me three goes, only to find the "stable" version of Cyanogenmod for my particular phone was lacking in certain areas, like data over 3G.

Asking on the forums for help with my setup got me responses on the lines of "try another mod build" (which wouldn't work on my phone anyway), "get a better phone" (thanks a bunch) and "download this source code here then compile using this compiler and these variables and open this package and enter this string at..." (Oh yes, Android is Linux based and the Linux Taleban love it; it gives them so many opportunities to lord it over the n00bs...)

I am so not going for an Android tablet.

Or a Windows one: two different core architectures, two different OSes, two different keyboards (which hoi polloi haven't been allowed to touch, but they're wonderful, just trust us). Oh, and no prices and launch dates. Looks like they've learned nothing from their Tablet PC launches, which world+dog seem to be citing as "Apple copied Microsoft la la la I can't hear you...".

57:

I'm lacking in OS specifics for some of the cited systems, but I certainly do agree with the principle, since it's a concern I've always had about MacOS since I first used a Mac in about 1990, and about Windows since W95.

58:

Google should have picked 2 (or possibly 3, but IMO Apple are right on that) screen resolutions for Android mobile, announced them to manufacturers a good couple of years in advance, and started making new versions of Android support only those resolutions (and an eventual doubling for higher dpi).

With big screens you can write programs so everything scales and works pretty well, but for small screens you can't. This means that you must restrict available resolutions so your apps don't have to be redesigned for every resolution a hardware manufacturer finds to make some particular model $15 less expensive to manufacture.
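
For reference, Android's partial answer to the scaling problem is density-independent pixels: you lay out in dp and the platform converts to physical pixels per device. A worked example of the documented px = dp * (dpi / 160) conversion; the densities listed are illustrative, not an exhaustive device list:

```python
# Worked example of Android's density-independent pixel formula,
# px = dp * (dpi / 160), which is how one layout is meant to span many
# screens. The example densities are illustrative only.
def dp_to_px(dp, dpi):
    return round(dp * dpi / 160)

BUTTON_DP = 48  # Android's recommended minimum touch-target size
for name, dpi in [("mdpi phone", 160), ("hdpi phone", 240),
                  ("xhdpi phone", 320), ("7-inch tablet", 213)]:
    print("%s: %ddp -> %dpx" % (name, BUTTON_DP, dp_to_px(BUTTON_DP, dpi)))
```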

As far as getting the phone companies and handset companies to stop installing their horrid software... well, building your own brand is probably the way to do that as a second mover (Apple was first mover and was able to impose that condition). Not sure how exclusive a deal Google gave Samsung with the Nexus, but Google needs to be broadening "the real Android experience".

Be happy you aren't in the US where you would find it very difficult to avoid having the phone company apply their own layer of software-fail on top of the hardware company's. :)

59:

Be happy you aren't in the US where you would find it very difficult to avoid having the phone company apply their own layer of software-fail on top of the hardware company's. :)

One of the reasons I bought my Galaxy SII outright rather than go for a bundle from Vodafone.

(The other is the ability to walk away from VDF anytime should their service piss me off sufficiently.)

60:

Free CMS packages and wikis have destroyed the need for HTML skills even. Do you know how many people don't know you can write a (very bad) webpage in Word and export it even?

Such is the fate of all technology.

Time was you HAD better take with you on a car trip an inner tube patch kit and the tools to remove the tire from the rim and put it back on then pump it up. Cause if you had a flat your spare might also get one before you got to a repair shop. Now we have those run flat spares that are good over nails for 80 miles or so.

And as has been discussed around here before, some of us have our timing lights, and know how to set the dwell. Anyone here under 30 have any idea what I'm talking about? (NASCAR is switching to fuel injection so will there be any carbs left outside of lawn mowers and such?)

As technology gets better people will have to understand it less. And yes, they will not be as efficient with every bit of storage the way us old farts are, but then again why should they? My iPhone has more than 1000 times as much RAM and flash as the computers I first got paid to program had in CORE and DISK.

Computers are changing from projects to tools. Just like most technology does over time.

61:

Re #5 "...as Steve Jobs noted, the trouble with Microsoft is that they've got no sense of style. Which, in turn, impacts on usability. I am too old to spend my remaining years spending hours making badly designed software do what I want it to do...."

I'd cast this as Stross's Third Law: "Any sufficiently advanced Microsoft technology is indistinguishable from really crappy, bloated, buggy, noninteroperable, overpriced magic."

62:

You can delete my post. Don't mind at all. Granted I did not know you can't move apps to SD on the Tab, and auto-correct changed TCO to TOC for some reason, but the rest of my points were valid. I did read your terms of service and you can censor me if you want. Enjoy!

63:

According to ZDNet via TechCrunch, the iPhone 5 is dropping the dock connector. We'll see.

64:

Actually I kind of like the post. Nice to hear from someone who thinks everyone on the planet should think and act like him and his close circle of friends. :)

65:

Ditto. Not deleting it.

66:

I completely agree with your criticisms of the Galaxy Tab; similar considerations for the 10" model were the reasons I went ASUS rather than Samsung for my tablet (and the nifty keyboard dock, which makes it enough of a netbook for me not to need a separate one). The flakiness of the first ICS updates was unfortunate; luckily most, including mine, seem rock-solid again now. (I don't blame the Angry Birds freezes on anyone but Rovio.) It is a custom connector, but with that keyboard dock I feel like it's warranted.

The iPad was never an option for me; I don't have a machine that runs iTunes natively, and don't want one. Of course, I'm not a typical user.

My phone's an HTC (One X); there's a little bloat, but their UI overlay is really good and works well with ICS, and oh that screen. Seriously. That screen.

Sadly it does look like there's no real developer interest in Android tablets, which is a terrible shame because the Transformers are lovely machines. Luckily there's little I want to do that there isn't a good app for. (Not nothing, though.)

67:

The iPad was never an option for me; I don't have a machine that runs iTunes natively, and don't want one.

No longer an issue. No second machine needed.

68:

Hence the past tense :).

When my Transformer comes to replacement time, I will look at the iPad. But I've looked at current iPads, and for me at the moment the Transformer wins, for a variety of reasons not all of which I can articulate clearly and some of which are possibly irrational. (But if I were buying a machine for my mother, it'd have an apple on the back.)

69:

In defense of Samsung, the connector on the Galaxy Tab has a lot more than USB on it. It breaks out HDMI, analog audio and composite video (and possibly a few other items).

On the hardware side, the proliferation of different incompatible proprietary connectors is the bane of Android devices. I do hope that the Nexus tablet provides a standard. Dense connectors are handy for consolidated docks.

On the software side, OEMs and cell carriers are the bane of Android. I'd liken it to televisions. The default settings of most consumer televisions are horribly off-standard, to allow them to stand out in a showroom. It generally takes an hour or so to straighten things out with a calibration disc. In the same fashion the big name tablet and phone producers tend to try to differentiate via "improvements" to the UI and cute apps. Surprisingly, most of these faults do not exist with no-name tablets. ASUS is also particularly good at adding only genuinely "value added" content to their tablets. I haven't rooted my TF-101 since it pretty much just works.

Keeping in mind the moderation policy, I'd suggest trying CyanogenMod 9 on the Galaxy. This is probably the link to a simple root and flash (I don't know your specific model number): http://www.redmondpie.com/how-to-root-galaxy-tab-2-7.0-gt-p3110-on-android-4.0.3-ics/ I just flashed my original-flavor Galaxy Tab 7" last night (in half an hour with little or no drama) to CM9 and regard it as a major improvement for that model.

70:

One central failure: Google believed in the magic of the market; they sincerely thought that the vendors would Do The Right Thing. Dear gods, they even trusted cell phone carriers.

Surprise.

Dependent on that, smaller failures. It's important to remember that what we are calling ecosystems are in fact markets, and very different in their ordering from actual ecosystems.

BTW, a lot of good UI research is now coming from Microsoft Labs, and if some of that finds its way into MS's new products we may have to take another look at them. The Windows 7 UI seems to me to work very well. It would be interesting to see MS's own usability studies on the matter.

71:

I'm not completely convinced that a more open approach couldn't work. Apple devices obviously have a very closed ecosystem, and Android devices don't seem to be that much more open (though they do run the Linux kernel), but there might still be room for somebody to be more open with mobile devices.

I know that there are a lot of problems in doing this, and of course the issue of figuring out what software to let on the devices is a big part of that. I have no real solution to this, but I have this dream that a large enough company could make a phone which has vendor-checked software available for normal users, but which can also use more open software for people who want that.

It's not an easy task, obviously.

There was this one mobile phone company which somewhat tried this with phones. On my N900 I installed a lot of software from the open source repositories and had no problems - I could've looked at the source code of the programs if I'd wanted to, but that's of course not feasible for most users, and even I didn't do that really often.

Currently I'm reminded every day of the future that never will be, as my current phone is the N9. It's more closed than the N900, but there's still a lot of open source code running on the device straight from the package, and there's even some software available in the Ovi store. Too bad it was stopped before it even really got started.

72:

These devices all spy on you, out of the box--they work for Google and Apple, and we may reasonably assume that Google and Apple have been penetrated by various security agencies.

There are, so far as I can tell, no spyware-free Android or iOS devices. As a certain wizard once remarked, "Get it through your head: there is no metal! It's a demon, a bound demon."

73:

Looking forward to OGH's and his readership's assessment of the recently announced MSFT Windows 8 tablet.

74:

One central failure: Google believed in the magic of the market; they sincerely thought that the vendors would Do The Right Thing. Dear gods, they even trusted cell phone carriers.

Which is a problem when you have only engineers making decisions. They just don't get it when it comes to markets and people making decisions on style and profits over "best tech solution". Of course many engineers also can't imagine a case where everyone doesn't see the absolute Zen of their solution. Big/little endian comes to mind. (Big myself.)

And you get an entirely different, but in many ways just as bad or worse, when the marketing/MBA types run things.

So how do we fix human nature?

75:

David, no, it's a problem of the propaganda of capitalism in the USA and Silicon Valley. There's all these stories about "enlightened self-interest" and the market as progenitor of creativity. These were all over Silicon Valley, back when.

And Google's staff -- Google being the major Silicon Valley survivor company -- still believe them, or at least believed them when they drew up the Android marketing strategy.

76:

Sorry. But I see this almost every time I see a bunch of hard-core engineering types making product and marketing decisions. Not just in the US. And not just under capitalism. It is a mindset that if we engineer it right, nothing else matters. It's ingrained into the field. Much of it is due to what we have to learn in school, where there's a "right" answer for almost all problems in most of our classes.

And I'm one of them. Those hard core engineering types. But I've learned over the years to step back and listen to others with different backgrounds. Most of the time.

77:

BTW, a lot of good UI research is now coming from Microsoft Labs, and if some of that finds its way into MS's new products we may have to take another look at them.

I don't think much of what the labs do will filter down to the primary products, though some of it seems to be getting into ancillary products and concept prototypes.

And this is why Apple is going to own the tablet market over Android. There's a common corporate dynamic that exists in large, relatively mature technology companies. The product development and manufacturing divisions are run by very pragmatic ex-engineers who've internalized the marketing strategy of feature laundry lists, and are very suspicious of research from the Labs that leads in new directions rather than adding chocolate frosting and sprinkles to the basic cake. This is exacerbated by the almost universal organizational structure in which the research labs are a completely separate part of the company, and are funded by the P&L of the product divisions. AFAICT Apple is not structured this way; or it may be, but Jobs spent a good part of his time beating the reluctance to use advanced technology out of his senior management.

The Windows 7 UI seems to me to work very well. It would be interesting to see MS's own usability studies on the matter.

I've only used Windows 7 running in a VM on a Macbook Pro, but I was not terribly impressed with its UI; it seems to take much more effort than necessary to get anything done with the default settings. FWIW I wasn't tempted to throw it across the room, either; I was able to do some things.

78:

I'm with you; Maemo showed immense promise, until Nokia panicked and began looking for a partner. Intel: not a good match. Microsoft: even worse. And of course the Microsoft match was the cuckoo in the nest -- prematurely killing off Symbian and then smothering Maemo/Meego.

79:

In terms of the next few years, Android seems on a glide path, MS just showed the planet theirs (to the dismay of many of their supporters) but Apple is the wild card.

Tim Cook is not Steve Jobs. In no way, shape or form. Over the last 3 or 4 years Macs have been getting harder to manage in large installations as Apple kept taking away tools and hooks, with Mountain Lion really cutting deep. But this was all part of Steve's direction to consumerify over any other consideration. Now, Cook has a very different background: over a decade in big-company systems, at IBM and Compaq. And a subtle (or maybe totally misread by me) key is how he indicated that big things were planned for the Mac Pro and iMac in 2013, which would be in line with him putting his stamp on some internal priority changes. While I don't think he'll switch away from a consumer product focus, he might (many of us hope) reverse the anti-management trends of the various devices.

Anyway it will be interesting to see where MS, Google, and Apple are at the end of 2013. Cook will have had enough time to put his stamp on things and MS will have succeeded or failed with Win8 on everything and Android/Google will have reacted.

80:

Eloise@17, not sure if anyone's addressed this yet as I'm jumping past 50 comments or so, but the Surface has been undergoing testing at MS's corporate headquarters in Redmond for at least 3 or 4 months.

I know I'm an anonymous stranger and thus not credible, but my girlfriend reports that when she interviewed for an SQA position in Redmond, every executive she interviewed with had an MS-branded tablet (now known to be the Surface) on their desk, and all were quite evasive about what it was when asked.

81:

Charlie @ 5 I am too old to spend my remaining years spending hours making badly designed software do what I want it to do. Ditto, I'm too old to do the same with hardware - namely the iPad. I've tried to use one precisely ONCE. Never again. I was trying to type onto its virtual keyboard, with the pad laid flat on (someone else's) desk. It rocked and moved, and I mistyped badly, lost "the lot" twice, and, in the end, dictated the input to the device owner. Euuughhhh.

82:

Looking forward to OGH's and his readership's assessment of the recently announced MSFT Windows 8 tablet.

It's Microsoft. If they don't somehow cock it up I will take an interest. (Which means: probably not.)

83:

ISTM that "technical savvy" as applied to smart phones consists mostly of being able to thumb type fast enough to need the autocorrect feature and to understand the basics of about half a dozen app or webapp UIs: Facebook being the primary one.

The open source revolution of the last 10 or 15 years has been a tremendous boon to IT professionals (look, in a truly shitty job market we're in very high demand, with salaries actually going up), but it's causing some serious problems already, and the long-term prognosis is for cloudy weather (pun intended) and shitstorms. Rather than trending towards a few internally consistent tool sets, based on a few well-designed and well-understood programming languages and paradigms, we've got the Development Environment of Babel. Each new tool or language has generated yet another technology stack, with yet another large learning curve, and everyone wants to do everything in their favorite stack [1], so they just keep growing, like Topsy.

For several years of my career as a software engineer I analyzed programming languages and software tools by considering them as user interfaces to software design and programs, and to runtime environments. Sadly, very few of those languages and tools were designed with that view in mind. And the ones I've seen becoming popular in the last few years are mostly worse; many of them start out to solve one particular type of problem, and then accrete major revision after major revision to add layers of new features, not always compatibly. I don't think I need to list them; I'm sure the readers of this blog are familiar with many of them.

At one time, it was possible for a graphic designer like my younger son, who's quite computer-sophisticated but isn't really a programmer, to do complex and very good-looking web design with CSS and HTML. Now he needs to learn CSS 3 and HTML 5, as well as Javascript, either standalone or in one of about 5 different toolkits which are all incompatible. I can't see how this situation can possibly be sustainable for people who don't want to be IT gurus but who do want to use computers or the web for useful work.

The core of my point is that there are several different sets of requirements for "technical savvy" in computers, phones, and tablets (which, I think, we can all agree are different kinds of devices). A very few people (a few hundred thousand world-wide at most IMO) need to be able to get into the machines and tinker. A larger group, perhaps a few million, need to be able to build tools and applications for particular problem domains on top of the basic devices, and they don't want (nor can they afford) to spend a large part of their time learning tools that provide a level of access beyond their requirements. But most people, potentially a market of more than 6 billion, don't need, don't want, and probably couldn't use any of those tools; they're mostly using rather than creating technology. And that doesn't mean that they deserve the contempt of the smaller groups, especially because they will drive the development of these devices, not us.

[1] I recently read a blog entry by a Javascript programming guru who was using multi-threading inside his client Javascript code. I failed my saving throw for sanity and so cannot remember who wrote it or where I saw it.
84:

    It is a mindset that if we engineer it right, nothing else matters. It's ingrained into the field.

Yup. It's ingrained so deeply that Steve Jobs was able to stand on stage at Apple announcements, year after year, telling them what Apple was about to do, and why, in words of one syllable, and they kept screening him out because his design/aesthetic-led strategy didn't make sense within the engineering paradigm. It was just quackspeak to them, meaningless stuff about art and beauty and the intersection of liberal arts with technology.

    I saw pretty much the same thing inside SCO (back when it was a UNIX company in the early-to-mid 90s) with respect to this open source Linux stuff: blank incomprehension of what it was all about, even when it began to eat their lunch.

I'm coming to the conclusion that what Steve Jobs did when he returned to Apple in 1997 was to turn it into something other than a computer company, at least in terms of its priorities: it's an art/design company manufacturing mass-produced artefacts that rely on the technology emerging from the computer industry, but without the underlying ideology. Their priorities are so different that the rest of the industry just don't understand why they do what they do.

    Most amusing symptom: HP's unibody laptop range, called the "Envy". Kind of says it all, really.

    85:

You can use the software now. Win8/Metro for PCs has been on open beta for over six months now and the latest release is damn near gold. I'm typing this on a Win8 machine at the moment, for example. Software compatibility is excellent, from Firefox down through older apps as far back as a graphics bundle I bought about twelve years ago (Corel Draw) which was written for Win98SE. The only package I've tried which didn't work was 16-bit GWBasic.

Since you're primarily focussed in this blog article on tablet and phone-based computing surfaces, unless you run Win8 on those sorts of devices you'll not be able to decide whether you like it, can stand it while feeling icky, or run away from it screaming.

    86:

    "They funny thing to me is when people use it exclusively high end cases with built in keyboards and basically turn their iPad into an $900 laptop with less capability than a $250 netbook...."

    If I were a busy executive, I'd value the instant start-up of an iPad as being worth a lot of money.

I almost never use my netbook any more, and whenever I do, I'm frustrated that I can't just open it or touch a button and start working at once.

    87:

I don't have to make that decision for a while. My next computer has been ordered, and is a couple of weeks away from delivery: I expect to get a couple of years' use out of it -- yes, you know my habits: this time it's different, honest! -- due to the nose-bleeding price. The nose-bleeding price is partly because I ordered it with a big enough SSD that when the Boot Camp drivers arrive, I'll be installing Ubuntu on it: if OS X truly jumps the shark, I intend to have an exit strategy ready.

    (Yes, it's a retina display Macbook Pro. Because I only have one pair of eyeballs, I stare at a screen 40-80 hours a week, and I don't want raster burns on my fovea, thanks very much.)

    88:

Charlie,

How does ePub work on the iPad? (if you or anybody else here knows).

I've been using iBooks, but it's not good for reading PDFs (for some reason, the 'swipe to turn a page' function doesn't work on many PDFs in iBooks).

    89:

If engineers ruled the world the technology would look like Terry Gilliam's Brazil: http://s4.hubimg.com/u/214471_f260.jpg As for televisions, well, having a rectangular screen instead of a round one makes little engineering sense, especially with CRTs.

    90:

    SATA3 or something faster?

    91:

    "While I haven't rooted my own Android device (it was like that when I bought it), it sounds strange that everyone who had done such a thing used windows to do it. "

The obvious reason would be 'I want my phone to do things that it doesn't, badly enough to work on it. My Windows computer does what I want well enough that working on it wouldn't be worth it. And if I trash my phone, I can wipe and reload; if I trash my Windows computer, that would be a different thing.'

    92:

    How does ePub work on the iPad?

    It's the native format for iBooks. Just load it in via iTunes file sharing, or use Dropbox or something. Works fine.

    Alternatively there's a very good epub reader called Stanza -- but Amazon bought the company who wrote it and borgified them, and it's now unsupported abandonware in the app store (but free).

    And there are other epub readers.

    For PDFs, your best bet is GoodReader.

    93:

    Charlie,

    nowadays you can avoid all your troubles by buying cheap Chinese knock-offs. With the only downside being that they look as cheap as they are.

On the upside, mine came rooted right out of the box (a pleasant surprise), has no useless crapware (what pre-installed software it has can easily be removed) and has all the standard connectors you could dream of - microSD, miniHDMI, miniUSB and standard sized USB as well (3.5mm headphones too, of course).

    All that for 100 Euro.

    94:

The retina Macbook Pro SSD is supposed to be a fast 6Gbit/s SATA drive, able to do sequential writes at up to 400Mbit/s. Which is just insane. (I remember when shunting 200MB around took all day ... on a Sun workstation!) It appears to be optimized for client-side use and so is a bit slower than a server-grade SSD, but not by a huge distance.

Put it another way: that write speed corresponds to writing 3GB of data per minute.
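(For anyone who wants to sanity-check that conversion, here it is as a trivial Python sketch; it assumes the quoted figure really is megabits per second, since spec sheets are notoriously sloppy about the b/B distinction.)

    # Convert a quoted sequential write speed in megabits/s into
    # gigabytes written per minute. Assumes megabits, not megabytes.
    write_speed_mbit = 400                              # quoted Mbit/s
    bytes_per_sec = write_speed_mbit * 1_000_000 / 8    # -> 50 MB/s
    gb_per_minute = bytes_per_sec * 60 / 1_000_000_000
    print(f"{gb_per_minute:.1f} GB per minute")         # -> 3.0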

    95:

    "Any manufacturer who loads non-deletable software on a machine is telling me in effect that I don't own that device. My response to that is simple - if I can't own it I don't buy it."

    This is a sensible attitude for a very few people (and probably a lot fewer than think so).

    The questions are what, how much, how hard, and what effects.

    96:

    You're pretty much as credible as the sources I'll look at before deciding if I'll make the effort to look at one - they're likely to be internet voices after all.

But the issue remains - from your post, they've apparently been rolled out internally. But there's no pricing information, no release date. Tech journalists were barely allowed to touch it.

    It might all be wonderful and sweetness and light of course. But looking at things like that just sets my internal alarms ringing. If it's that wonderful, why aren't they saying "It will be with you by...?" If they wanted a "one more thing" moment they wouldn't have announced it as they did. If they've got a market ready product, where're the market details?

    97:

    "For PDFs, your best bet is GoodReader."

    thanks!

    98:

    non-deletable software on a machine is telling me in effect that I don't own that device.

    My last 3 TVs, microwave, clothes dryer, and car all fit that bill. Ditto things like stereos, NAS boxes, the POTS phones you might buy, and any number of other things. It's only when we attach the word "computer" to it that some folks get up in arms. But for most folks this lock down is what they've wanted for years. They want a device that works a certain way and will continue to work a certain way, even if their nephew plays with it for a while. Unlike that "crazy computer we got 10 years ago" that never did work right after Johnny installed "that game".

    And to be honest while I don't really mind folks modding their engine profiles in a car all that much (except if it makes it harder for me to breathe) I'm not sure I want folks modding the software on the cars that will be coming out in 5 or 10 years or even now. The ones which will be driving themselves on major roads and parking themselves. I can just see the lawsuit when a modded "parks itself" car jumps the curb and runs over someone or through a storefront.

    99:

    Commodity SATA3 SSDs like the low-cost OCZ Agility series are running about 500MB/sec write although they are likely to bottleneck on the computer side of the interface. PCI-e interface SSDs do a lot better (up to 1700MB/sec write speeds) but cost a lot more.

    100:

    And do you also reserve the right to reconfigure the light bulbs you purchase ... just asking.

    101:

    Details? Even if it looks like somebody's unemployed brother-in-law designed it, that sounds alright.

    102:

    tablets are aimed squarely at people who don't use computers (except at work, managed by an IT department, to do business)

    As a corollary to this that you might not have thought of: an iPad or iPhone with a credit card swiper seems to have 100% replaced cash registers in new businesses in my town. Restaurants, stores, YARD SALES.

    I consider this less a testament to the wonders of ubiquitous computing, and more a testament to how shitty and overpriced POS systems are. About the only magic thing that registers can do then is accept cash.

    103:

    Charlie, I would agree with you most of the time, on most things, but I have to come to the defence of Android here :) I'd also like to thank you for giving me the urge to write a blog post in response (where, ironically, I usually bitch about 'Droid).

    104:

    That's been my experience as well. The Kids Nowadays live on their computers, but understand them better than those who grew up without them did.

    You saw the same evolution in the first generations of automobile penetration. The first generation, the ones who bought their first cars as adults, were able to take the car apart and put it back together. They had to -- the things broke down so often.

    Hot-rod culture survived into the 70s. Any real man was able to do basic car repairs and maintenance at home, the same way he could carve a roast or mix a cocktail.

    But now cars are sealed, and you need to have professional tools to work on them.

    We see a similar evolution with digital technology, happening much faster.

    105:

    As a corollary to this that you might not have thought of: an iPad or iPhone with a credit card swiper seems to have 100% replaced cash registers in new businesses in my town. Restaurants, stores, YARD SALES.

    You're not in Europe, I take it.

    (We've moved to chip-and-pin smartcards; the magstripe is a legacy feature for visitors from insecure foreign locales. I suspect chip-and-pin iPhone solutions will show up in due course, but it's intrinsically more expensive to engineer than a magstripe reader connected to the microphone jack.)

    106:

    Argh, I meant to say the kids live on their computers and phones, but understand them LESS WELL.

    Proofreading. I've heard of it.

    107:

    Bruce @83: "Each new tool or language has generated yet another technology stack, with yet another large learning curve, and everyone wants to do everything in their favorite stack, so they just keep growing, like Topsy."

    I hope I didn't post this before, but if you haven't heard of Bret Victor before, you might like this: Bret Victor - Inventing on Principle. It's about making the effects of code more easily visible. He's obviously a UI guy, but I think his ideas apply to controller and model development as well.

    108:

    Charlie, why do you prefer a tablet (the Tab or Fire) as an ebook reader rather than an epaper gadget like the Kindle?

    I'm guessing maybe screen contrast? I find the paper quality can be better, but I love the light weight of my Kindle 4 for reading books.

    109:

    Screen contrast is one reason.

    Another is that e-ink looks like wet newsprint. It's also annoyingly slow to refresh (or flashes at you).

    Finally, I'm used to reading off LCDs for 60 hours a week. And the Samsung is about the same weight as a 3rd generation Kindle Keyboard, or an earlier Sony PRS-505 reader.

    110:

    The accessories market for Android is pretty weak. You can get a lot of useful and/or nifty stuff for Apple phones that doesn't seem to be economical to make in many different sizes for different Android devices.

    http://zgp.org/~dmarti/misc/little-android-phone-why-so-sad/

    111:

    That's why they have USB ports

    112:

    "what Steve Jobs did when he returned to Apple in 1998 was to turn it into something other than a computer company"

    It's a media company now; iOS devices are media devices: music players, readers, viewers, and sometimes communications devices. I think that's 80% of their revenue. Think of RCA, which founded NBC just so it could sell radios.

    113:

    Every Mac notebook I have owned since 2003 goes to sleep when I close it, wakes almost instantly when I open it.

    114:

    Makes sense. The wet-newsprint look is a problem but I find the Kindle is comfortable reading in good, bright light. And the light weight more than compensates for the poor display.

    I'm thinking of getting a 7" tablet myself, so your review is very useful to me, because we generally agree on technology issues. (What often makes these kinds of technology discussions frustrating is so many people fail to grasp that basic point. Consider the current fake-controversy about the Retina-display MacBook Pro. As far as I can see, it comes down to this: If you want a lightweight, stylish notebook with a great display, get a MacBook Pro. If you want a notebook that's user-customizable and user-serviceable get something else. I hear Lenovos are lovely. That wasn't hard, was it?).

    The reason I want a 7" tab is I made the mistake of not buying the iPad 3 when it came out. My iPad 1 is slow as molasses running iOS 5 and as far as I can see there's no way to roll it back. And I don't want to buy an iPad 3 now and deprive myself of 25% of the time when it's the latest, greatest iPad. So I just want something for web browsing and Instapaper and social media and such.

    115:

    Apple stopped calling themselves a computer company back in 2007 or thereabouts.

    116:

    I don't want to buy an iPad 3 now and deprive myself of 25% of the time when it's the latest, greatest iPad

    Is this why you buy an iPad? There are cheaper drugs, you know.

    117:

    Darn it, I'm thumb-fingered today. What I mean to say is that Charlie and I generally agree on technology issues -- which doesn't make the people who disagree with us wrong. And so many people in technology discussion fail to grasp that basic point, which makes those discussions frustrating.

    118:

    Dirk, if engineers designed the world it might look like "Brazil", but if systems programmers designed it it would look like a Unix shell. Programmers aren't engineers. I do it myself. Over the last few years I've been gradually semi-automating my job into heaps of stringy Perl libraries I access via Windows batch scripts called from the command line...

    119:

Terrific subject. I'm using the Galaxy Tab 2 at this very moment. It's my first Android device. I also have a Macbook Pro. I had an iPad 3 but returned it after one day because it was too big to use comfortably. This Galaxy 2 tablet is the perfect size and weight for an ereader. But I'm disgusted and offended by the crapware in this device. The typing function is horrific. I think Android must have been designed by NASCAR. I'm convinced, after reading the comments here, that Apple customers are more sophisticated and design-sensitive. Android customers are tinkerers who are into cheap. If you want to save a few bucks and be stuck with an offensive array of crap such as a useless TV remote control, ugly clocks, and poorly designed Samsung chat apps, Android is for you. It's like buying a house on the highway next to the airport. Cheap... for a reason. I don't want Samsung apps. I want a device that works for my purposes, not to advance the marketing interests of Samsung. Why can't I get this crapware off my tablet? What gives Samsung the right to pollute my experience? By the way, this comment has taken 30 minutes to type on this poorly designed keyboard, which is overly sensitive and was probably never tested prior to delivery. Android must be for cheapskates whose time is worth very little.

    120:

    My iPad 1 is slow as molasses running iOS 5 and as far as I can see there's no way to roll it back.

If you synced/backed up and those files still exist (you have to dig for them if they exist, but not via iTunes) you can restore your iPad to the state of that backup. But it's a wayback machine: your data also goes back to that date.

    121:

    What I mean to say is that Charlie and I generally agree on technology issues -- which doesn't make the people who disagree with us wrong. And so many people in technology discussion fail to grasp that basic point, which makes those discussions frustrating.

    Yes. Yes. And again Yes.

I'm tired of fanboys failing to grasp that the perfect Android/Apple/Dell/ChinaCheap/BuilditatHome/Lenovo/whatever may be just what they want, but I and other people just might have a different value weighting system that makes their perfect choice junk for me or others.

    The best technology to buy is the one you'll like. No matter what the reasons. If you like it then it is for you.

    122:

Charlie, I'm afraid there is a flaw in your plan to use Ubuntu as a fallback path: the Retina display itself.

    Apple has a tremendous advantage in that most modern OS X apps use the Cocoa UI framework (derived from NeXT object technology), and the rest are mostly Carbon (derived from classic MacOS). Most Cocoa apps already get partial Retina support (text and standard UI widgets) for free, and can be brought up to full support without much modification.

    Linux, on the other hand... violent, enthusiastic fragmentation is its middle name, and I doubt any of the Linux UI toolkits were designed with much thought put towards cleanly adapting to high DPI. I expect that Linux (and even Windows, for that matter) will go through a lot more pain adapting to high-DPI displays.

    This and other more traditional annoyances with running Linux native on modern Macs (like trackpad drivers) mean that even if you feel the need to bail out of OS X, you might have a better experience running Linux as a virtual machine under OS X. I sometimes run CentOS in a VM on my ~1yo MacBook Air. When I fullscreen it, it feels like I dual-booted into a native OS, except for the bit where the Mac menu bar drops down if I hit the top of the screen with the mouse.

    What are the shark-jumping risks you see in OS X? I tend not to be that concerned over the changes in Lion and Mountain Lion; I remain unconvinced that Apple intends to make OS X exactly the same as iOS. But I'm interested in your take.

    123:

    I'll have to look into that. I don't mind losing all the data on the iPad; I think I can restore it in other ways. Thanks.

    124:

    "It's a media company now; iOS devices are media devices: music players, readers, viewers, and sometimes communications devices. I think that's 80% of their revenue."

    That is a popular meme, but I'm afraid I must counter it with an ugly fact. Apple is a public company, so you can get data on such things from their SEC filings. In their most recently reported quarter (the one which ended March 31), content stores generated about 5% of their net income, not 80%.

    Note: 5% of net. Unless I misunderstand the way they're reporting the numbers, they turn around and write checks for 70% of that 5% to the content owners. So the best case profit on that slice of their net is 30%, before the costs of running the content stores. They reported 47% profit margin over all operations.
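(To make that arithmetic explicit, a rough sketch in Python; the 5% and 70% figures are the approximate ones quoted above, not exact SEC numbers.)

    # Mechanize the rough figures above: content is ~5% of net, and
    # ~70% of that slice goes straight back out to the content owners.
    content_share = 0.05        # content stores' slice of net (approx.)
    payout_to_owners = 0.70     # share paid out to content owners
    best_case = content_share * (1 - payout_to_owners)
    print(f"best-case content profit: {best_case:.1%} of net")  # ~1.5%
    # ...before the costs of running the stores, versus ~47% margin overall.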

    So, Apple's content stores are a not-so-profitable side business. Why do they do it, then? Because it's strategically important. Streamlining and integrating the process of Buying Things Online helps sell their hardware, especially so long as the equivalent experience on competing platforms continues to be wretched.

    125:

    You have a misunderstanding of 'stable' vs. 'nightly' in Android build terminologies. I'm running a nightly build of CM9 on my Galaxy Note, and it's rock-solid.

    Nightlies are made available at the point where all the major functionality is solid, and additional tweaks are being performed, things like language support esoterica and the like. If CM9 nightlies are available for your phone, you'll be fine.

    126:

The Samsung Galaxy products are perhaps the most overpriced and bloatware-laden Android devices around.

    On the other hand, I bought a 10.1 inch Acer A200 tablet for $300, and I've not yet had a bad experience with it.

    The only apps they've added to it were useful tools that you could already acquire from the Google Play store, like Astro File Manager.

    Suggested Apps:

- Aldiko Ebook Reader
- gReader (to hook into Google Reader RSS feeds)
- Pulse News
- Advanced Task Killer (free)

    127:

    I've got a Samsung Galaxy Player 5.0, which is the not-so-little media player that could. It's essentially an undermarketed minitablet, and my only gripe is that Samsung isn't going to bother updating it to Ice Cream Sandwich. Samsung has since released the GP 3.6 and 4.2, which seemingly will get that upgrade, but the reviews aren't convincing me that those models have anything on this one (and I really like the 5x3 screen, hence "5.0"). I don't mind Samsung's interface, but I'd probably change my mind if I got to see how fast it could be without it. But it works, works well, and serves my purposes.

    Comes with a micro-USB port, 3.5 mm audio jack, micro-USB-to-USB cable with a socket plugin attachment, Wi-Fi, and typically lasts me 8 to 10 hours of thorough use. All the apps I have work terrific on it, including the Kindle app. The preloaded Memo app is useful, but I've barely touched any of the others, so I can't speak for its office app. The keyboard works admirably, in my opinion, since the screen real estate is enough to let me type easily with my gigantic hands, and /suggests/ autocorrections rather than imposing them (Is this a feature on the Galaxy tablets/phones? Seems reasonable.)

    All that being said, I do appreciate Android letting me fool around with so many settings; I can poke around to my satisfaction even though I'm not familiar with the intricacies, just one of those kids from the "more tech-familiar" generation.

    128:

    "Android is showing all the signs of fragmentation that hit the Windows PC market, only much faster than Windows fragmented"

    I only hope that happens even faster. I love having dirt cheap, but quality gear that I can tinker with. Further, one can argue that cheap Windows hardware supports a vibrant Linux ecosystem.

    129:

    I have to admit, I've been carefully avoiding most of the whole mess about tablets, smartphones etc (it's amazing what not having any money will do for one's ability to neep over new technology). Then again, my eventual plan is to take my old Palm m515 into the nearest Tel$tra shop, or Optus or Vodaphone dealership (the phone companies sell us the phones here in .au; it's a workable arrangement, but it does mean they tend to be locked to a particular network) and saying "I want a replacement for this" to them.

    What this means is I'm primarily looking for something which is small, has a decent screen size for the size of the gadget, and is capable of acting as a portable adjunct to my memory. It should be able to hold things like phone numbers, appointments, random notes, etc, and it should have an input system which allows me to enter these quickly and easily (my current cellphone, a Nokia 6070, is theoretically capable of doing this, but the keyboard is so damn fiddly it's not worth the effort of attempting to transfer the information over). I'm not so worried about the e-reader capabilities, the camera, or the online apps, games or whatever - I already own a perfectly good ebook reader, a digital camera, and a PSP Slim & Light. If it makes phone calls as well, all the better, and I'd prefer it if the blasted thing could deal with my present prepaid SIM.

    Of course, first I need the money.

    By the way, Charlie, Samsung's "come play in our walled garden" nonsense extends to their larger items. I have a Samsung laptop which comes complete with a whole heap of various "easy" this and that systems, most of which as far as I can tell just trigger the appropriate system apps in Windows 7. Given Windows 7 is basically dumbed-down one hell of a lot already, I shudder to think who'd need them. I tend to ignore them most of the time, although their "Easy Software Manager" is fast heading up the list of next to be deleted - it has a tendency to nag.

Randolph @ 70 - My nickname for the Windows 7 UI (coming to it straight from having spent most of the past 5 or so years using Windows XP) is "Windows Large Print" - I'm using a laptop with a larger pixel space than the previous one, with a better graphics card (complete with go-faster stripes, according to the bloke who sold me the lapdog) and yet I wind up with less usable screen real-estate than I've had on any screen since I gave up using Windows 3.11. Only seven icons to a side on the screen? No option to make these smaller? I'm not using a bloody tablet where my clumsy fingers would have to be doing the clicking - I'm using a keyboard and a mouse. I can click on things which are smaller than 1cm square - so why not let me have that option? I also don't like that it got rid of my "My Recent Documents" folder. I've finally found the trick that replaces it (you have to click on the program you used to create the document, then it brings up the list). I don't find it to be an improvement.

    There's a reason why the new laptop is named Orac. It's pretty, it's flashy, and it can apparently do everything known to mankind - if it wants to. If it doesn't want to, I get to spend three-quarters of an hour arguing with it before I take out the key and throw it across the room (or rather, switch the bloody thing off, which is the OS equivalent). Windows 7 has made me very keen on the notion of installing one of the Linuxen, because quite frankly, I want something which does what I want it to do when I want it to do it, without stopping and asking about every second click I make in that infuriating "are you SURE you want to do that?" way it does.

    (PS: ENKI-2 @30 - "While I haven't rooted my own Android device (it was like that when I bought it)" - Reading this in an Aussie accent conveys a strong sense of "broken as designed" - is this the impression you were attempting to convey?)

    130:

    (Whoops, sorry about the duplicate - got a 500 error and chose to reload the page - could one of the mods please delete the duplicate? Thanks!)

    131:

Not sure about this model, but every Android I've owned (5+), including my current HTC Sensation XL, I've rooted via the excellent XDA Developers forum. They can also supply a broad range of ROMs for Android devices for a variety of purposes.

    132:

    "if engineers designed the world it" would work. Look how well its doing now.

    133:

    [grumpiness]

So here is this Android tablet, and there seems to be no way to delete a file from a micro-SD card without connecting the whole machine to my PC and using it as external storage.

Yes, file-system permissions. It is how you stop the naive from breaking things by deleting the OS. But this is a chunk of storage that the user has added to the machine, containing user-supplied data, and your stupid fscking set-up doesn't give the user write permissions. Worse, you don't explain.

    [/grumpiness]
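(If anyone else hits this wall, one quick way to confirm it's a permissions or read-only-mount problem rather than a flaky card is to probe the mount point directly. A minimal sketch in Python, runnable from any environment that can see the card; the path is a hypothetical example.)

    # Probe whether a mount point is actually writable. A clean failure
    # with EACCES or EROFS points at permissions or a read-only mount,
    # not a broken card. The path below is a hypothetical example.
    import errno, os, tempfile

    def writable(path):
        try:
            fd, name = tempfile.mkstemp(dir=path)  # try creating a file
            os.close(fd)
            os.remove(name)
            return True
        except OSError as e:
            print("write failed:", errno.errorcode.get(e.errno, e.errno))
            return False

    print(writable("/mnt/external_sd"))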

    Android is the Mills bomb of OS trench warfare, and the idiots keep fiddling with the pin...

Incidentally, the original Eee PC had a pretty poor, badly supported Linux distro, though the UI was a good fit with the screen size and resolution. I replaced it with a Netbook-optimised spin-off from Ubuntu. Asus didn't try to lock the machine down in the way that Android hardware is. I followed some of Charlie's suggestions, chiefly getting a large SDHC card for storing my data. And for almost everything I do, while the UI is different in a few ways, I might as well be using a different version of Windows from the one on my desktop.

    Android and touch-screens: it's such a different way of using a computer that it needs good explanations. It is as big a jump as from MS-DOS to Windows. Am I being led astray by too much computer experience? Or do we have the old problem of the designers thinking that what is easy to them is easy for the naive new customer?

    134:

    "Engineering it Right" Well, actually, in the end ... nothing else DOES matter. IF, if, if .... You are talking about something that will LAST or will operate under awkward/rugged/harsh conditons &/or will really give good value for money. The perfection occurs, occasionally, when not only does it work superbly "out of the box", but goes on working, and looks beautiful as well.

Examples are rare, but they do exist. The Gresley class "A4" Pacific locomotives. The Spitfire and the Termite's Dream (a.k.a. the "Mossie"), the Jaguar XK120-150 & E-types. Clipper ships, generally. H-P instrumentation 1960-80, probably.

    As opposed to the utter crap of an iPad - see my post @ 81. It does NOT work. I mean, if I can't even type names into boxes without the so-called "design" being so bad that it fails, what's the bloody use or point?

Megpie @ 129 Wholeheartedly agree. I can't keep up any more (but then I started with real actual core-store & 80-chars-per-line, as in FORTRAN IV). Now, I still want to change my phone, but I don't want a touchscreen ... but I do want e-mail ... except this machine is so set up (POP-3, I think) that if I read e-mail somewhere else, I will NEVER see it on this one - once it's downloaded once, I can't get it back to here .... And so on. Sigh.

    135:

    I don't think it's quite as bad as all that, actually; public awareness has at least hit the point where everyone has some sort of anti-virus protection installed. MS have apparently raised their game a bit.

    I also think your categorisation of the R-Pi is a little on the harsh side, although I wholeheartedly concur with your assessment of the dire state of IT education in this country. Desperation move or not, a £20 computer aimed at kids who want to experiment with writing their own code but can't have more than an hour at a time on the family PC because their mum wants to order Grandma's birthday present or their brother needs to do his English homework is a really good idea.

    136:

    Greg, I changed from Win XP to Windows 7 a few months ago.

    I'm getting used to the different look, and there is a lot you can change from the defaults. And there is so much that works better. It does depend on what your hardware is, but I found that Win XP had been holding me back, because there were some things it did badly, or not at all.

I was thinking about the timescales, and when I bought Win XP, and the jump from Win XP to Windows 7 looks awfully like the jump from MS-DOS to Win XP. But that isn't a good comparison.

    137:

    I place little store in the Daily Wail, but given your comments on the charger for iProducts, I wondered what your thoughts were on this? http://www.dailymail.co.uk/news/article-2162867/Smartphone-users-hit-Apples-rip-plan-make-ALL-iPhone-accessories-obsolete-changing-design-device.html?ICO=most_read_module You call yourself a n00b wrt rooting; I am a n00b wrt everything Apple - so this article may be completely irrelevant. Hey ho. ;-)

    138:

    And if programmers ruled the world the only thing you would see on bootup would be :>
    I am old enough to remember "real" programmers sneering at GUIs and telling me how much more productive and fast a command line was. Usually Unix bores.

    139:

    What are the shark-jumping risks you see in OS X?

    This one's very unlikely, but: Gatekeeper becomes something you can't unlock without invalidating your warranty. That'd be a very bad sign indeed.

    Again, changes to the boot firmware to make it impossible to boot other OSs. Ditto.

    I see both of these as highly unlikely options: for one thing, it'll be a cold day in hell before Microsoft sells Office or Adobe sell CS through the Apple App Store and hand a 30% cut of their flagships to Apple. And again, a chunk of the Mac's post-2006 appeal to high end customers lies in Boot Camp. If Microsoft lock down the boot firmware on all PCs licensed to run Win8, as they have been threatening to do, this may actually make the Mac the most open platform.

    Oh, and finally: if they make that idiotic trackpad-scrolling-like-a-tablet thing mandatory rather than optional! (It drives me up the wall when I meet a machine that does it. Too much old-school muscle memory.)

    140:

    Given your priorities ... can I note that Palm kit is still available via eBay, often for a song? Here's an eBay auction for a Palm T5, second hand from the US, with a buy-it-now price of $42; the somewhat higher end TX also comes up from time to time (it's a T5 plus wifi), and if you can stretch a bit there's an outfit selling off warehouse stocks as new.

For a giggle, folks are also selling Palm Lifedrives ... with an upgrade: you replace the original 4Gb hard disk with a 4Gb CF card and it goes like shit off a stick and gives you a much longer battery life. (Here's one.)

    The original 4Gb addressing limit on SD/hard drive space has been hacked by an OS extension that allows multiple 4Gb partitions on an SD card, so in principle you can have up to 20Gb (4Gb plus 16Gb) on a LifeDrive.

The T5, TX, or flash-upgraded LifeDrive units run all your existing software; the price is in 2 digits, or 2 digits with a 1 in front, unlike the modern tablets, and you'll find it a night-and-day improvement over a Palm M515 (the screens are 480x640 pixels, rather than the 160x160 you're used to, and the cpu is w-a-y faster). Third party replacement batteries are available and require 5 minutes with a jeweller's screwdriver to fit. The only real drawback is, it's classic PalmOS -- orphaned five years ago and counting.

    141:

    "if engineers designed the world it" would work. Look how well its doing now.

    If engineers designed the world human beings would all be cube-shaped. Easier to stack that way.

    142:

Charlie, stock ICS has this wonderful "Settings > Apps > All > [find an app you don't want] > Disable" which completely hides away the given system app. If Samsung did not yank this feature out, you can use it to effectively get rid of crapware.
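(For anyone who'd rather script that than tap through Settings, the same result is reachable over adb from a host machine. A sketch in Python, assuming the Android SDK's adb is on your PATH and USB debugging is enabled; note that "pm disable" needs root or a system shell on most stock builds, and the package names below are hypothetical placeholders -- list the real ones with "adb shell pm list packages".)

    # Bulk-disable unwanted preinstalled packages over adb.
    # Package names are hypothetical; "pm disable" generally needs root.
    import subprocess

    UNWANTED = [
        "com.example.vendor.chat",     # placeholder crapware packages
        "com.example.vendor.social",
    ]

    def adb(*args):
        result = subprocess.run(["adb"] + list(args),
                                capture_output=True, text=True)
        return result.stdout.strip()

    for pkg in UNWANTED:
        print(pkg, "->", adb("shell", "pm", "disable", pkg))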

    143:

    Am I being led astray by too much computer experience? Or do we have the old problem of the designers thinking that what is easy to them is easy for the naive new customer?

    A bit of both. The real problem is that tablets are marketed and aimed at the computer-illiterates -- folks who have never owned a computer before. Consequently they're either locked down or dumbed down to an astounding degree. Your problem, it seems to me, is that you're expecting a device designed to support general purpose computing, but the designers have gone out of their way to prevent that.

    144:

    "Engineering it Right" ... You are talking about something that will LAST or will operate under awkward/rugged/harsh conditons &/or will really give good value for money.

    You don't understand modern business, Greg.

    It's NOT about making it last or operate under rugged conditions. And it's not about good value for money. It's about making it ATTRACTIVE, so that the punters will spend their money on it, and PROFITABLE, so you can develop a replacement which will make it obsolete inside of two years (so that the punters buy another).

    Making it so unreliable that it breaks prematurely is, I'll grant you, a liability -- that damages the customer experience.

    But you're looking at it from an engineer's viewpoint, contemplating a tool. Whereas computers are mostly consumer items these days.

    145:

Charlie wrote that Stanza (the epub reader) is abandonware since it was borged by Amazon. I don't know how often updates came out before I originally got it, but there was a big lag post-iOS 5 where it didn't work; a patch was eventually issued, though, and it's back to being my ereader of choice. I tried GoodReader (I think - there were quite a few readers when Stanza was dead) but Stanza's swipe-to-change-brightness and pinch-to-change-font combined to make it that much better that I went back.

    146:

    It's the Daily Heil. Reporting on rumours that Apple are going to change their dock connector.

    Cons: they'd FUBAR the entire third-party ecosystem if they did so. I don't think they'd do it lightly.

    Pros: they just obsoleted the Magsafe charger-adapter for their laptops, standard since 2006, by introducing Magsafe 2 on the new Macbook Airs and Retina Macbook Pro.

    On the other hand ...

    They're selling a £10 Magsafe-to-Magsafe-2 adapter dongle for the laptops.

    My guess is that if they obsolete the iPhone/iPad 30-pin interface they'll sell an adapter dongle for the new iPhone 5 for a while -- plugs into an old dock connector port at one end, supports the new iPhone at the other.

    To do anything else would be commercial madness, and they've tried to avoid cutting their own wrists reasonably successfully for the past 15 years. (About pre-1998 Apple we shall say nothing.)

    147:

    Do any of the cheap Chinese tablets have working GPS? My use case for a tablet is as a traveller's tool: pocketable web access, navigation aid and media player. My phone can already do all these things, but map-reading (and movie-watching!) on a 2" screen is a frustrating exercise.

    148:

Just one quick note (I haven't actually used any tablet for any length of time so far, only Android phones), but what I utterly don't get is this claim that iOS devices are "easy to use". Whenever I've tried using an iPhone or iPad I got close to throwing it across the room in utter frustration within a minute. The only thing holding me back was that they didn't belong to me. Just one "bloody wtf" after the other.

Much less so with Android. Some things irritate me but it's tolerable. Reference point: not a Windows user. Not a Mac user. Linux user for 18 years and counting. Utterly annoyed at the death of Maemo. I so wanted an N900 ...

    149:

Charlie @ 144 Actually I do understand that some so-called "modern businesses" are making the ... exact same mistake(s) the motor industries made 1955-90: "planned obsolescence". OK, so real tech-&-engineering advances make replacements inevitable, but just making shiny crap, so you can sell the same sucker NEW different shiny crap in another couple of years, isn't actually a sustainable business model in the long term. So, computers are "consumer items", yes ... but they are ALSO engineering - they have to work - people expect them to work, and to be hassle-free, and they are not, are they?

    Now what?

    Oh, Micheal @ 148 Yes, me too!

    150:

    Ah, that works usefully! It's a well-hidden feature; I'd never have guessed what it did by accident.

    (I just acquired a cheap Motorola bluetooth mouse and keyboard set, and suddenly the tablet looks almost useful for writing purposes. Except it now weighs as much as an iPad plus Logitech ultrathin keyboard cover. Hmm ...)

    151:

    Much the case over here too. Apple users say Apple stuff is easier to use.

    To which I can only say that there is a group of people (let's call them 'Apple Users') whose expectation in UI matches what Apple provide. For those people Apple is the best choice.

    Should your expectations or requirements be different, then Apple is an abysmal choice, because it makes assumptions that aren't valid for you.

    Apple do concentrate a lot on design, but they are rather Big Brother in their assumption that one design can work for all. It doesn't. The most annoying thing about the fanbois is that they think that what works well for them should also work well for you.

    152:

The point is, it's not just planned obsolescence -- it's real obsolescence. For example, a new (mid-2012) Macbook Air has a cpu/gpu/ssd combination that performs comparably to a mid-2011 Macbook Pro, a higher-end machine, and absolutely hammers a late-2010 Macbook Air into the ground (per a recent Ars benchmarking comparison). An entry-level budget Mac laptop today out-performs a top-of-the-range machine from 3 years ago; an iPhone has a couple of orders of magnitude better number-crunching performance than a Cray X-MP (if anyone was crazy enough to use an iPhone as a dedicated number cruncher).

    Why design for permanence when the machine in question is going to be massively out-performed by something cheaper in about 2-3 years?
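(To put rough numbers on that: if mainstream performance doubles every 18 to 24 months, the classic Moore's-law-era cadence and an illustrative assumption rather than a law, the three-year gap works out like this.)

    # How far a 3-year-old machine falls behind, assuming performance
    # doubles every 18-24 months (illustrative assumption, not a law).
    for doubling_months in (18, 24):
        factor = 2 ** (36 / doubling_months)
        print(f"doubling every {doubling_months} months "
              f"-> {factor:.1f}x faster after 3 years")
    # -> 4.0x and 2.8x respectively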

    Now, as/when Moore's Law finally tapers off, I expect we'll see a revival of interest in computers as durable functional machinery. (Simultaneous with certain radically different developments I spoke about at a conference last week and really need to turn into a blog essay over the weekend ...)

    153:

*installs FBReader* Ta!

    I've had a Motorola Xoom 2 for the last few months. It's almost completely free of OEM crapware, and works fine for ebooks, web browsing, Twitter, casual gaming and the like (once I'd installed the Swype keyboard replacement program, that is). I have three problems with it: the one I've got has no 3G, so I can't use it on trains and the like (unless I use my phone as a wifi hotspot, but that's a hassle); it seems rather short on RAM, as evidenced by apps swapping completely out when I switch away from them and having to start up again when I switch back; and as a coder I have to take my laptop with me most of the time anyway.

    I'm still completely in love with my two-year-old Nexus One, though.

    154:

    SSDs Are Your Friends.

    155:

    I've found that Google's "identify my location by nearby wireless networks" service works remarkably well, in UK cities at least. It locks on much faster than GPS (I keep it turned off most of the time, because I fear the black helicopters) and it often gives a more precise location. Damn-all use up a mountain, of course, but I wouldn't like to rely on my tablet in that situation anyway.

    156:

    "Do any of the cheap Chinese tablets have working GPS?"

    Not that I know of - their major drawback IMHO

    157:

    The trouble with the dumbed-down computer argument is that it makes the tablet depend on being connected to a "real" computer to be able to use the micro-SD adapter. Which means the advertised ability to add storage space is looking perilously close to a lie.

    I know enough to recognise that the system doesn't give me write permission for that Micro-SD card. So what bloody use is that Micro-SD adapter?

    158:

    The MS UEFI thing has been blown out of all proportion by the Linux Taliban on Slashdot and other places.

I'm using Win8 right now on a machine with a classic boot system, twenty-five years of hacks and bodges layered over the original Basic Input Output System that IBM created. It's insecure; if I left the wrong CD or thumb drive in place when it reboots then it could be virused and Trojaned up the wazoo before I realise it. There are other attack vectors that can also, theoretically, get at the OS via the boot. MS get flak from everybody about how insecure their OS is and they've decided to take steps by making Win8 and its successors securely bootable on a UEFI-equipped PC system in the same way Apple has been doing with OS X on x86 machines for several years now.

    MS aren't requiring that Win8 only runs on UEFI-based machines; it would be commercial suicide since there are hundreds of millions of PCs out there with a classical BIOS boot system which MS wants to sell upgrades for. However Win8 will (almost certainly, it's not gone gold yet) load on a UEFI PC and do all the verification and signing required to secure the boot process in the same way Apple does on their own hardware. Other OS writers such as the Linux community will have to write UEFI loaders if they want their own products to load onto these machines but there will be production of classical BIOS machines for years to come as well as a historical clade of older machines which already exist and are favourite targets for Linux anyway.

    Ten or fifteen years from now it may well be that UEFI will be universal and no new-manufacture BIOS PCs will be getting built, in the same way that there are no new manufacture ISA-bus PCs being built today (at least I've not heard of any although some odd markets for such come to mind). By that time Linux or its descendants will have come to terms with securing PCs from attack via the boot process and there will be signable loaders for open source and hobbyist OSes which will hopefully shut them up (although I do not hold out much hope on that score -- there's a reason someone came up with the term "Whinux").

    159:

    Now, as/when Moore's Law finally tapers off, I expect we'll see a revival of interest in computers as durable functional machinery. (Simultaneous with certain radically different developments I spoke about at a conference last week and really need to turn into a blog essay over the weekend ...)

    Yes please :)

    160:

    David L., you replied to somebody else's post, using my name.

    161:

Charlie @ 152 Yet desktop machines don't SEEM to be improving much in speed. Storage capacity, yes, and things like Solid State Drives (SSDs?) and other useful peripherals.

    Essay? To do with what you were in München for, or the book thing in the USA?

    162:

Actually desktop machines are very rapidly increasing in speed. However, it's mostly in graphics processing and floating point. A modest graphics card will get you a teraFLOPS, i.e. a top-of-the-range supercomputer circa the mid-1990s.

    163:

Moore's Law will not be tapering off in the near future. As soon as feature size stops shrinking, the substrates will start spreading. The future is what we have in current supercomputers, i.e. million-core heterogeneous architectures. I would guess at that point we might see yet another revival of wafer-scale integration. Or even stacked wafers.

    164:

    Other OS writers such as the Linux community will have to write UEFI loaders if they want their own products to load onto these machines but there will be production of classical BIOS machines for years to come

Not really. If you want to manufacture PCs with Win8 on them they MUST use UEFI. And I doubt that many OEMs will want to produce both UEFI and non-UEFI systems, as the latter will not be able to be sold with Win8, which is 99%+ of the market.

    165:

    Whenever I've tried using an iPhone or iPad I got close to throwing it across the room in utter frustration within a minute.

    Then iOS isn't for you. Just as Win8 isn't for everyone either. Or Ubuntu.

People used to yell at you about how dumb it was to buy a Ford or Chevy or whatever. Nowadays I don't hear much of that. It seems now it's OK to buy the car you like. Maybe in 10 or 20 years it will be OK to just buy the computer/phone/tablet you like.

    166:

    Something else we are likely to see now that mass memory is becoming solid state and no longer a miniaturized washing machine is distributed SSD to go with the million cores.

    167:

One point I really like with Android is that apps are not connected to a single device but to your Google account. So, when your device is lost or broken, you still have everything.

    One thing I do not yet really go along with on tablets in general is the touch screen keyboard. When typing on a touch screen keyboard, I have to type quite slowly to avoid typing errors. Even on the tiny keyboard of my N97 mini, I type faster than on a 7" tablet.

    168:

Something I sorely miss in most computers produced these days is a matte screen (I also miss the 4:3 screen ratio for work use, but I've resigned myself to that...). I understand the logic of "shiny" from a marketing perspective, and when you're in a dark room watching a movie the contrast of a glossy screen is nice, but if there's even one badly placed light around it becomes a real strain to use, never mind outdoor use.

I read on OSNews about a new line of PCs from Vizio that should include matte screens on its laptops... when I change my current one I'll consider them...

    169:

But unless the planet switches to a new technology for the solid-state memory used for non-volatile storage, planned obsolescence is built in by design. The current technologies for such storage have only so many write cycles before they stop working. So we can replace the batteries all we want, but at some point all these ultra-portable devices will stop working as the memory in them quits. And for most people this time will be measured in years, not decades.
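(For a rough feel for that timescale, a back-of-the-envelope sketch in Python; the cycle rating, daily write volume and write-amplification factor are illustrative assumptions, not measured figures.)

    # Back-of-the-envelope flash endurance estimate. All figures are
    # illustrative assumptions: MLC NAND of this era is commonly rated
    # for a few thousand program/erase cycles per cell.
    capacity_gb = 8               # device flash size
    pe_cycles = 3000              # rated program/erase cycles (assumed)
    daily_writes_gb = 5           # data written per day (assumed)
    write_amplification = 3       # controller overhead factor (assumed)

    total_writable_gb = capacity_gb * pe_cycles
    years = total_writable_gb / (daily_writes_gb * write_amplification * 365)
    print(f"~{years:.1f} years to exhaust the rated endurance")  # ~4.4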

    170:

One point I really like with Android is that apps are not connected to a single device but to your Google account. So, when your device is lost or broken, you still have everything.

    Apple iOS devices work the same way but with a few detail differences. But in general it's the same.

    171:

    Indeed that's a really nice feature, and I loved that when I got my Asus Transformer out of the box and turned it on, I had to enter only three bits of information.

a) WiFi password
b) My GMail user name
c) My GMail password

    and shortly thereafter, all my contacts were on it and my apps and photos and so on.

    It also makes the possibility of losing or breaking my phone a lot less worrying.

The downside: it's handing a large chunk of knowledge about me to a central controlling authority.

    172:

    Note - I agree with what you said. The trick is that people who can radically tinker with their computers don't realize that they are in the same position as people who can radically tinker with their cars. Most people aren't and won't be in that position.

    The major widespread legitimate grievance would be things like getting rid of crapware/adware and the ability to change service providers.

    173:

SSDs will be dropping flash within 5 years. My guess is some kind of memristor replacement, since they have been built at 4nm feature size. Not sure whether it will be write-cycle limited like flash, but if so it would be far more durable.

Also, I've just seen a colleague's LG 3D phone, which was really impressive (the 3D). No glasses needed.

    174:

    I wouldn't class a tablet as a data-entry machine. It's a data-access machine.

    I just "rooted" mine. It makes a huge difference. I now have Gmail and the Kindle apps, which just did not work on the Android ICS from the factory. There are a few other apps I might add and use. Without GPS, Maps/Streetview isn't much use, for instance, but I have the choice.

    If this level is what a dumbed-down computer is like, I can only say the manufacturer is selling a moronized model.

    175:

    Note - I agree with what you said.

    Yep. Sorry. After I posted I realized I should have tracked back one more comment before hitting the reply link.

    As to the cars and computer relationships I suspect you have to be older than 40 or 50 to see these cycles. Otherwise this all seems new and fresh.

    I keep remembering, around 1970 or '71, being driven around by someone who went to grade school with my grandfather (which would make this guy about 85 at the time); he only used 1st and 3rd when shifting his car. To him second gear was a useless affectation. He had learned to drive on a Model T with only 2 gears, and that's all anyone needed to drive a car. And to him automatic transmissions were just plain foolish. I kept wondering how long his clutch lasted and how many people had nearly rear-ended him during those slow shifts up to 3rd.

    I feel the same way about people who refuse to buy (and tell others they should also refuse) a computer they can't root, swap out the kernel on, replace all the drivers on, etc. When are they going to realize that most people don't want a computer? They want a device that does something they want done and couldn't care less about how it does it. And I have some reasonably close friends who feel this way and just think it's stupid to buy something like a TiVo when I/they can create a Rube Goldberg setup to do the same thing. (And get a divorce from my wife to go with it.)

    176:
    One central failure: Google believed in the magic of the market; they sincerely thought that the vendors would Do The Right Thing. Dear gods, they even trusted cell phone carriers.

    That, plus they chose a license for Android that gives a fair amount of power to the vendors and cell phone carriers at the expense of the end-user. There were other choices available, but they did not choose them.

    At that point the fate of Android was perhaps not inevitable, but certainly quite likely and reasonably foreseeable. Crapware, fragmentation, vendor lock-in...

    177:

    I feel the same way about people who refuse to buy (and tell others they should also refuse) a computer they can't root, swap out the kernel on, replace all the drivers on, etc. When are they going to realize that most people don't want a computer? They want a device that does something they want done and couldn't care less about how it does it.

    Yes, they do, but ... they also want the current "free market" in software, not the manufacturer's choice. A computer is not like a car: it's a platform for the software to do things the user wants. Right now there is a reasonably free market in software, but if the manufacturers get their wishes, that's history.

    While most people will never root their computer system and reinstall new OSes, the ability to do so keeps vendors honest. It points to an open-source world which they might not like, but have to live with: if Apple and Microsoft systems didn't run free, open-source code, their customers would run away. In a more "mature" market with 2-3 vendors dominating, they can simply decide, cartel-like, not to allow that, if they can lock down the platforms.

    It's not just a generational thing. I've (helped) put a car back together from scratch and don't care that I can't do that with modern cars. I do care that I can't get an app onto an iPhone without Apple agreeing.

    178:

    Sorry, I'm really out-of-date. "Root" your machine? Like, you wipe (almost) everything off it, then re-install just what you want? And how much do you leave behind, so that you can re-boot? Especially if you have been using, say, WinXP as your OS? Interesting, but sounds dangerous to me.

    But then, as an example, I've got a hardware fault on this (no red) and it isn't the new graphics card. I've got a spare replacement motherboard - had it for a couple of months - but I haven't the nerve. It's like being a non-driver who has only watched car mechanics, wondering about, oh, changing the water pump on my L-R, as an analogy.

    179:

    Yep - go to pretty much any computer science conference that isn't a maths conference in disguise, and you'll hear the word "manycore" so often your ears will start bleeding. Everyone's convinced it's the future, but we still don't know how to make good use of all those cores...

    Dave Bell @174: Without GPS, Maps/Streetview isn't much use, for instance, but I have the choice.

    Like I said, try using Maps/Streetview with "Use wireless networks" turned on under "Location Settings". I think you'll be pleasantly surprised.

    Joern Pachl @167: One thing I do not yet really go along with on tablets in general is the touch screen keyboard.

    Have you tried Swype? It's not as fast as touch-typing on a full-sized keyboard, but it's much easier to use than the default keyboards.

    180:

    You are sadly completely correct in this assessment, Charlie. The Computer Science department of Manchester University more or less concurs, to the extent that their Linux image on their PCs is explicitly designed NOT to look friendly and "windowsy"; it boots to runlevel 3, for a start, and includes some fairly basic runlevel 5 interfaces. The entire intent here is to shock new students out of any complacency and get them used to the interface they'll be using for most computing work: the simple command line.

    Recruiting for new computing staff here is equally soul-destroying. We start off by asking that applicants complete a fairly simple application form. Out of this process way over half the prospective applicants for a job get thrown away on the basis of inability to fill out a simple form legibly and correctly. Then we filter for required skills, and end up with a final selection of perhaps half a dozen people whom we think are worth interviewing; the other few hundred applications go in the bin.

    We're not picky here; we'll train up people for a job role if we think they would be able to do it, but we need more than Windows point'n'drool ability. Sad thing is, you have to work very, very hard to find people with these sorts of skills as they're just not taught them in schools these days. Even having the frankly minimal intelligence needed to install Ubuntu on bog-standard PC hardware counts as far-out 'leet computer skills to most people.

    Ask most of this lot how you talk to a minimal Linux instance too small to have ssh or even telnet (it does have wget though) and they'll simply stare in blank horror...

    181:

    You are sadly completely correct in this assessment, Charlie. The Computer Science department of Manchester University more or less concurs, to the extent that their Linux image on their PCs is explicitly designed NOT to look friendly and "windowsy";

    If Charlie was speaking more about the general public and not computer science majors (a very small subset of the public) then I'm with Charlie.

    I fought the battle for a while in my very large school district, starting back in the mid-nineties, that computers should not be fancy typing classes with only IBM PCs running Win 95 or DOS. It was a losing battle and I quit tilting at those windmills after a while. There was this one person brought in to oversee how computers were to be used by students system-wide (about 80K kids at the time), and they came up with a 5-year plan to put Netware in all the schools for directory services, as that was the best future-proof option. They also had some 10-year plans. At the elementary level they tossed out Macs with working, usable teaching programs, as the stated line was "Kids should learn to use what they will be using in the real world after school." This was at a K-6 school where I was told this. And now that all those kids are in college, 50% or more have Macs.

    Anyway, my point, and I think Charlie's, is that much of the "teaching" of computers has to do with email, typing, spreadsheets, etc., not with applying them as tools to get jobs done. And by that I mean they show you how to do column and row math in a spreadsheet but never give you anything close to a simplified real-world problem to solve with those skills. So 18-year-olds can "spreadsheet" but they can't use it to solve problems. Just simple data entry and formatting.

    182:

    "Use wireless networks" is fine. Until you encounter a mobile network, such as one of the ones that spends most of their time doing 15 knots between Bergen and Kirkenes.

    Then it becomes frankly painful. Anyone on the quayside in Oksfjord, for instance, could be transported to Bergen twice a day.

    (It doesn't help that GPS is a bit dodgy above the Arctic Circle.)

    183:

    The cynical response is, of course, they were taught the real-world skills necessary for the modern workplace... ",)

    184:

    One central failure: Google believed in the magic of the market; they sincerely thought that the vendors would Do The Right Thing. Dear gods, they even trusted cell phone carriers.

    Amusingly, this is pretty much 100% wrong.

    Andy Rubin and Rich Miner were really clear from the get-go that they believed that the developer community and Open Source would keep the platform pure, that developers would drive this and do an end-run around the carriers and OEMs.

    Anybody who has spent any time in the phone industry knows this just isn't the case. The software costs of the OS are a fraction of the costs of getting a handset to market and, in relative terms, always will be.

    The reason why the phones get infected is purely the economics of building and deploying phones. A phone is a pricey piece of hardware, if you want to sell them at affordable prices and you don't have the brand 'pull' of Apple then you need the carriers underwriting the cost of the process.

    Likewise, putting together a phone/tablet or similar represents 10K-30K engineering hours, which is an enormous sunk cost to amortize over your hardware BOM. So without Google paying, and with developers frankly having little to no power versus the guys who are sticking $30M of engineering effort into the upfront NRE, you're going to end up with stuff thrown in there by committee.

    One of the reasons I thought Apple would fail is that I didn't think enough people would be prepared to pay the real cost of handsets in order for them to break the model. They did. Kudos. Samsung, HTC, Sony, Motorola and Nokia all still work off a different model.

    185:

    My CTO refers to Android as Windows Mobile 2.0 for it has a lot of the same problems that Microsoft had with WM, in terms of the compromises you have to make when you're supplying an OS to multiple vendors who don't really care about an OS.

    186:

    Multi-core processors are the bees' knees for server applications, but you're right that we don't really know how to use them effectively on the client side (read: personal computers, tablets, and phones). That's one of the things that's driving the move to the cloud: comm bandwidths are still increasing incrementally, while effective processor speeds are stalling out[1]. The exception to that is graphic processor speeds; GPUs now can have more than a thousand cores and will use them effectively when displaying complex 3D scenes. High-performance graphics hardware on the client is useful in cloud scenarios since it improves interaction latency and gaming display. If history is any guide (and it has been quite accurate over the last few swings of this particular pendulum) we'll see that start to turn around in about 8 years or so, and by that time most users will be heartily sick of the security failures and reliability problems of many of the cloud vendors.

    [1] The chips are still following Moore's Law: 8 cores will pump out twice as many GIPS as 4 will; on-chip caches, pipelines, and data buses keep getting bigger; and RAM channel bandwidths are still increasing. But if half the cores remain idle most of the time, that's not very useful.
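
    (One standard way to put a number on that "idle cores" point is Amdahl's law: if only a fraction p of a workload can be parallelized, n cores give a speedup of 1 / ((1 - p) + p/n). A minimal sketch:)

        # Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
        def speedup(p: float, n: int) -> float:
            return 1.0 / ((1.0 - p) + p / n)

        for n in (2, 4, 8, 1024):
            print(f"{n:>4} cores, p=0.5 -> {speedup(0.5, n):.2f}x")
        # 2 -> 1.33x, 4 -> 1.60x, 8 -> 1.78x, 1024 -> 2.00x: a half-serial
        # workload never beats 2x, no matter how many cores you throw at it.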
    187:

    Apple iOS devices work the same way but with a few detail differences. But in general it's the same.

    Oh, I didn't know this. I just remember a guy complaining he had to buy all the apps again after having crashed his iPhone. However, this was a couple of years ago.

    188:
    Something I sorely miss in most computers produced these days is a matte screen

    I agree completely. One of the reasons I haven't yet replaced my mid-2007 MacBook Pro is that Apple no longer provides a matte display as an option. I do much of my work on my computer in the living room, where most of one wall is windows and there are 2 skylights (and 2 more in the kitchen with no wall between the rooms). That flood of daylight is one of the main reasons we bought this house; winters in Oregon are mostly overcast, and SAD is endemic, so natural light is something you treasure. But it does mean that I have to be careful of glare on the screen.

    189:

    It's possible your friend had a problem with iTunes or the App Store. That's been known to happen. However, most of the time it works as designed.

    I've bought new iPhones twice and a new iPad once. Each time, I just register the new device under my iTunes ID and my apps and data just appear on them.

    190:

    Have you tried Swype? It's not as fast as touch-typing on a full-sized keyboard, but it's much easier to use than the default keyboards.

    I have. It's not bad, but I'm not yet completely convinced. For an author of technical textbooks full of specialized terms, this tool has its limitations. Maybe I just need more practice, though.

    191:

    My iPhone backs itself up on my main computer whenever I plug the docking cable into the computer's USB port. And the OS installation process automatically backs up the phone (over WiFi now) before installing a new OS version. I've had to have my phone replaced under warranty three times now (my fault in 2 of those, but Apple is awfully good with customers who've been buying hardware from them for 20 years), and in all cases a simple restore on the new phone brought back all my apps with their old state (although the network preferences had to be put back in again because it's a different network address).

    192:

    According to a recent interview with Google executives that I'm too lazy to Google a link for, Google uses Android as insurance, to be sure they're not beholden to Apple to reach mobile users. By that standard, Android is a wild success even if it doesn't make a profit for Google, and even if it fragments into a million pieces, most of the pieces being loaded with telco crapware.

    Google doesn't care. All Google cares about is that no single hardware vendor controls its path to mobile consumers.

    For a similar reason, Apple came out with its own mapping technology. Apple understands that maps and location services are now essential parts of the mobile device -- more so than the phone itself -- and Apple needs to control that technology internally to avoid ceding control of its platform to a partner.

    Everybody in upper management in computer companies today remembers the days when everybody who didn't want to go out of business had to make the pilgrimage to Redmond, kneel before Bill, kiss his ring, and leave behind wagons heavy with gold and treasure. Apple in particular remembers; Microsoft could have killed the Mac at any time by simply ceasing to support Office on it. Microsoft could STILL do a lot of damage that way.

    193:

    I'm on iDevice #6 or thereabouts and I've never had to re-buy an app. (Let's see: iPod touch 1G to iPhone 3G to iPhone 4, iPad 1 to iPad 2 to New iPad.)

    I suspect your friend who had to re-buy everything did more than crash his phone; it sounds like he completely lost (or wiped) his Apple ID and had to create a new account from scratch.

    194:

    Apple in particular remembers; Microsoft could have killed the Mac at any time by simply ceasing to support Office on it. Microsoft could STILL do a lot of damage that way.

    One of Microsoft's most profitable product lines is Office on Mac. Above all else they're a business; killing that line would have been daft. Not to mention, Microsoft needed Apple to survive to ensure that they could keep the DOJ at bay. Without a credible competitor, even one with only 20ish% of the OS market, Microsoft would never have weathered their problems in 2000.

    If they had been broken up, that Mac business would have been the core of the new apps business that would have had to be spun out.

    By all means complain about Microsoft being aggressive in business, complain they write crap software, but leave the conspiracy nonsense at the door.

    195:

    Something I sorely miss in most computers produced these days is a matte screen (I also miss the 4:3 screen ratio for work use, but I've resigned myself to that...).

    Completely agree. Charlie wrote that the Samsung Tab is an excellent ebook reader. It is, except for the glossy screen. Try to read ebooks in bright daylight: it's simply impossible. All the computers I use for real work have matte screens. There are still a few netbooks (I still love 'traditional' clamshell-style netbooks) available with matte screens. Mine is more than two years old, and I hope I will be able to get a replacement some day.

    196:

    I suspect your friend who had to re-buy everything did more than crash his phone; it sounds like he completely lost (or wiped) his Apple ID and had to create a new account from scratch.

    That's probably true. I have no personal experience with Apple so far. Maybe I should try it some day.

    197:

    He either got some really bad advice or charged ahead with some really bad personal decisions. I can imagine something like "my phone is dead, thus I guess I need a new ID and email address" or some such. He still owns those original apps and can use them at some point on other iOS devices if he knows the Apple ID.

    Heck, my wife and I share our AppleStore ID and when I buy apps they just appear on her phone. :)

    Now understand this is allowed. A family gets to have up to 5 devices on one ID.

    And now with iCloud, if you lose your phone you can have the DATA restored via the cloud if you wipe/destroy/crash/etc. your device. You can even restore between iPads and iPhones, but you lose your folder structure.

    198:

    Try to read ebooks in bright daylight, it's simply impossible

    I live in Scotland. What is this "bright daylight" you speak of?

    199:

    fizz @168 et al.

    You might like the Asus UX31A then. Its display is not a 4:3 at 1920x1080px, but at least it isn't glossy and it provides more desktop real estate for the form factor than most. I was looking into getting one, until I heard some obscure little outfit offers a 2880x1800px display on one of their laptops.

    Sadly, this is no alternative for someone who dislikes this particular vendor's keyboard layout and hates the fact that it's impossible to swap out the battery even if you manage to open the thing without wrecking it. BTW, easy battery removal seems to be very low priority for all of the new rigs. I wonder why so many people seem to have no problem with being unable to carry a spare battery for that extra long workday on client premises.

    So I guess I'll have to hold out until those quantum dot displays really get going (they are supposed to be light emitting like OLEDs only much better).

    200:

    Charlie @198: But, but, but you said ... wet newsprint? Wasn't direct sunlight what e-ink is really, really good at?

    201:

    Not mine, but they exist. (The guys in Shenzhen are running through just about any possible permutation of features in any generally recognized kind of gadget.)

    You'll have to query your favorite, least untrustworthy trader to find out particulars. Back when I was looking for mine, I was simply trying to find a cheap tablet with a reasonably fast chipset and a current version of Android. (Allwinner A10 and ICS respectively.)

    202:

    One of Microsoft's most profitable product lines is Office on Mac. Above all else they're a business; killing that line would have been daft.

    When all that was playing out with the return of Jobs in the late 90s, I was confused, as the numbers didn't add up to Apple being about to go out of business. If you looked at their balance sheet soon after the big MS investment of $150 million to save it, Apple had well over $1 billion in cash. And it was growing. To quote from one report: how did Apple manage to spend nearly two billion dollars more than it earned across two years, lose 14% of its income, and still manage to sit on the same $1.2 billion in cash without pawning anything?

    Later on I found a series of essays on the issue, written by someone who had dug deep into a lot of Apple/MS issues, who basically said it looked like Apple also had MS by the short hairs over patents dealing with media playback on computers, and Jobs apparently was in no way, shape, or form going to get taken by Bill as he had been in the early days of Apple over patents and copyrights. The speculation by this person was that the public announcement of continuing Office for Mac plus the investment was the public face on a much more complicated deal that had MS paying Apple a few billion over some years, plus cross-licensing deals covering all kinds of things. If you notice, Apple and MS are suing everyone but each other over patents all the time.

    I'll rummage around a bit and see if I can find the links. ... This may be the original article I have bookmarked somewhere. Long read but interesting.

    http://www.roughlydrafted.com/RD/RDM.Tech.Q1.07/592FE887-5CA1-4F30-BD62-407362B533B9.html

    203:

    I wonder why so many people seem to have no problem with being unable to carry a spare battery for that extra long workday on client premises.

    I have 3 spare batteries for my iPhone/iPad. They are external and "just plug in" or via a cable. Plus the iPhone market is big enough that there are multiple cases with external batteries built into the case.

    To be honest, I prefer this over the battery covers that will not stay on after a year or two due to constant opening.

    204:

    I live in Scotland. What is this "bright daylight" you speak of?

    Haven't you recently been to Bavaria? They don't always have bright daylight there, but sometimes they do.

    205:

    @blog: I've never understood all the bellyaching about screen resolution. Somehow we've had 20 years of screens at various resolutions, but once you're able to put the computer into your pocket, screen resolution suddenly becomes make-or-break for application developers.

    It's not hard to design your software with layouts that make use of screen real estate. And DPI isn't arcane magic. Just do it...

    @various comments from Charlie: I do think you are selling MSFT short regarding design. I think they learned a lot in the 00s. They aren't SCO, numbly watching their empire crumble. The Windows Phone actually looks pretty good. If they do continue to fade into irrelevance, no one will be able to say they did so without a fight. As a consumer I think it's great. (Some context: I use Linux on the desktop and my phone is an N9.)

    206:

    It's not hard to design your software with layouts that make use of screen real estate. And DPI isn't arcane magic.

    It's about information density. 15" displays and smaller were a pain to deal with if you assumed users' vision was less than perfect. You can't just keep shrinking everything down and/or assume screen scrolling will be "OK".

    And when you start getting small, fonts can look like crap if scaled up or down from their optimal size and dot resolution.

    So a developer has to decide just how much to put on a screen, using which fonts and icons, at what minimum screen size, knowing that anyone using a smaller screen will likely have a crappy experience. Toss tablets into the mix and maybe you decide to go universal. But again you get to make the same decisions about how much you show at what minimum tablet resolution, and the rest get treated as phone displays or scaled, with the inevitable font results. Now, you can do things like have 4 or 5 or 10 different display modes and switch out fonts depending on screen size and dots, but that costs real $$$ in design, development, testing, and support. Real $$$ and/or £££.
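
    (The raw arithmetic is the easy part; Android's own convention is density-independent pixels, where px = dp * (dpi / 160). A minimal sketch of why one layout can't serve every screen:)

        # Android's dp-to-px convention: px = dp * (dpi / 160).
        def dp_to_px(dp: float, dpi: int) -> int:
            return round(dp * dpi / 160)

        for name, dpi in [("mdpi", 160), ("hdpi", 240), ("xhdpi", 320)]:
            print(f"48dp touch target @ {name}: {dp_to_px(48, dpi)}px")
        # mdpi: 48px, hdpi: 72px, xhdpi: 96px. The conversion is trivial;
        # deciding what to actually show at each physical size is the expensive part.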

    Anyone who has designed non trivial user applications and done it right will tend to have a lot of scars and sleepless nights from the experience. I know I have.

    207:

    Rooting does not look that difficult to me; it sounds like a pretty standard Samsung platform. But I'm on Samsung device #4 now, so learning all the background to make sense of that linked page is a sunk cost.

    Heimdall is a reverse-engineered version of the Samsung ODIN boot protocol tool (which is indeed Windows only). Heimdall is cross-platform; the same page with the Windows binaries and source has Ubuntu binaries. It's pretty easy to build from source on Linux too.

    The standard Android development tools are also cross-platform, with binaries for Windows, OS X, and x86 Linux; developer.android.com has download links. In this case the only thing you care about is "adb", the Android USB/IP debug tool. In particular, you just want "adb shell"; this gives you...a shell on the device.

    Once you're there, you know what "su" does. I have no idea what "gtab2-root.sh" does; it's apparently part of the thing you flashed with Heimdall. I'd say "it's just another /bin/sh script" but the commands it invokes may involve some platform-specific tools. Or it could just be remounting /system read/write and dinking with file permissions. Read it and find out.

    I bet the ROM image ships with busybox, but in case you're missing ls, there's always "echo /bin/*".

    There are some Samsung devices which have not been reversed into Heimdall and do require the ODIN3 tool. Some Android root methods have required exploit tools only available as binaries; in those cases you might need Windows. You don't for this, AFAICT.

    208:

    The Windows Phone actually looks pretty good.

    And to be entirely fair to MS, it is a good OS for a phone. It does a lot of things much better than iPhone, and massively better than Android.

    However, it's built around a completely novel UI/UX paradigm, which just pisses developers off; they threw out a lot of stuff they'd got right in Windows Mobile and lost 2 years in the process, which pissed off their OEM partners.

    I'll not write them off, because phones are a fast-moving personal consumer product and people tend to go for the shiny and be less tied to the OS than to the shiny. Or have been, historically. I don't see that changing.

    It's not hard to design your software with layouts that make use of screen real estate.

    Actually, it's horrifically hard to do this well, and enormously time-consuming to boot. As was pointed out, everything from font size to button layouts to how you handle menus and the like changes the less screen real estate you have.

    209:

    It depends largely on what they were reporting/showing as cash on the balance sheet.

    AIG had a LOT of cash assets showing in early 2008, but a lot of them were tied up in properties that technically couldn't be realized as cashflow.

    I can personally testify that it's really easy to have a great looking balance sheet but be on the verge of going bust because you can't make your payroll.

    210:

    BTW, easy battery removal seems to be very low priority for all of the new rigs. I wonder why so many people seem to have no problem with being unable to carry a spare battery for that extra long workday on client premises.

    Firstly, client premises may be presumed to have mains sockets. Right? If your laptop is good for 6+ hours on battery, then you're all set. If the client won't even let you plug in your laptop to recharge, then you have bigger problems with the job than the laptop's battery life.

    Secondly, newer LiION cells are good for 4-5 years before their capacity drops below 80%. At which point, most of them are ready for the scrapheap. Or for someone to pay up once to replace the battery before handing it down to a relative or a kid to see out its twilight years.

    Thirdly, I note that in 2008 when Apple moved from the unibody 13" Macbook to the unibody 13" Macbook pro, the changes they made consisted of (a) adding a Firewire port, and (b) making the batteries non-removable. At which point the battery life jumped by around 30%, because they were able to cram more cells in (in place of a battery casing and the shell of a battery bay).

    Finally, I suspect Apple (and others) are waiting for the TSA to realize that a removable LiION cell and a device for creatively damaging and shorting it makes a nice incendiary device, and bans all devices with removable lithium batteries from flights (they're already illegal in checked baggage due to the inability of aircraft hold fire suppression systems to extinguish a lithium fire inside a bag). Apple's non-removable batteries are actually an advantage in this scenario, because they don't represent a terrrrrst threat.

    211:

    Direct sunlight (and a long battery life) are all e-ink is good for. And I get so little of it in my lifestyle that I've got better things to do with it than read.

    212:

    Daveon - It's not conspiracy nonsense to suggest that Apple is weaker to the extent that it is dependent on other companies, including Microsoft.

    213:

    I love my Mophie iPhone 4 case. I seldom have to use it, but it's comforting to know that it's there when the battery starts getting below 50%.

    214:

    No, it's not. But that's not what was implied.

    215:
    I'm not convinced that using either of them really encourages you to become a sophisticated IT user with any understanding of underlying processes. Both modern versions of Windows and the Mac OS are continuing a process of making this more and more obscure too - although there might be hope for the small subset of linux users - you may or may not learn it formally, but if you're cd-ing through directories and the like you've surely got to pick up some clues on the way?

    What are these underlying processes that you speak of? Seriously. What, in your opinion, are the underlying processes that you need to understand to become a sophisticated IT user? I used to think I knew, but now I'm not so sure.

    This is not an abstract question btw. I teach what is called 'Discrete Mathematics' at least one semester every year, which is billed as "all the math you really need if you're going to get a degree in math for computer science". So, for example, would one of those underlying processes be the notion of recursion? How about binary trees and assorted data structures? Or were you thinking of things like addressable memory, register operations and the like?

    216:

    One other thing about Android is that it is evolving very rapidly with major new releases less than a year apart. Android 5 (jellybean?) should be available within a few weeks. I imagine 6 will be the stable tablet workhorse and 7 something that is merged with ChromeOS with Google eyeing the desktop market.

    217:
    Why design for permanence when the machine in question is going to be massively out-performed by something cheaper in about 2-3 years?

    One point that I hope gets increasing air play is that quantity has a quality all its own. Increasing hardware performance can let you radically redesign the basic algorithms implemented in the software. Remember all that rah-rah about how intelligent behaviour didn't need all that much memory or processing speed, so long as you had a good algorithm? Seems like a lot of buzz is being generated lately to the contrary (I'll let people google partial quotes rather than post a link, to avoid being hung up in moderation):

    To sum it up even more succinctly, the idea behind AskMSR was that: unsophisticated linguistic algorithms + large amounts of data ≥ sophisticated linguistic algorithms + only a small amount of data.

    and:

    But are human readers really so different from the computer plodding through its database of sentences? How do people learn all those nuanced meanings of thousands of words and the elaborate rules for putting them together in well-formed sentences? We don’t do it by consulting dictionaries and grammar books . . . In spirit at least, our experience of language seems closer to statistical inference than to rule-based deduction.

    Notice that not only are the algorithmic approaches quite different, but optimizing the hardware for the two approaches involves substantially different (and sometimes contradictory) tweaks.
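
    (For the curious, here is a toy sketch of the "simple statistics plus lots of data" idea -- an illustration only, not the AskMSR system: predict the next word from nothing but bigram counts.)

        # Next-word "prediction" from raw bigram counts; no grammar rules at all.
        from collections import Counter, defaultdict

        corpus = ("the cat sat on the mat the cat ate the fish "
                  "the dog sat on the rug").split()

        bigrams = defaultdict(Counter)
        for w1, w2 in zip(corpus, corpus[1:]):
            bigrams[w1][w2] += 1

        def predict(word: str) -> str:
            return bigrams[word].most_common(1)[0][0]

        print(predict("the"))  # -> 'cat', the most frequent follower seen so far
        # The algorithm stays this dumb; any improvement has to come from more data.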

    218:

    What are these underlying processes that you speak of? Seriously. What, in your opinion, are the underlying processes that you need to understand to become a sophisticated IT user?

    First, merely having the concept that the machine is a mechanism that executes instructions on data (and can in principle be understood) puts the user w-a-y ahead of most members of the public.

    Understanding how to navigate a filesystem via a command line and a GUI helps. (Different views of the same underlying paradigm to provide a parallax perspective.) Understanding that data is stored in files and operated on by programs. Recognizing the difference between a program file on a storage medium, and an executing process (that has been loaded into dynamic memory and is active).

    Basic concepts of networking.

    Basics for programming include boolean algebra and binary and hexadecimal arithmetic (stuff I was taught in school around age 8-9 and then 12-13 respectively), variable assignment (in turn, dependent on basic algebra), iteration and looping constructs, recursion, indirection (pointers), and finally the concept of data structures. Most of which come in handy even if the only things the user ever programs are macros for MS Word or Excel.
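
    (As a sketch of how little machinery some of those basics need -- Python here only because it runs anywhere -- binary and hex notation, a bit of boolean algebra on bits, and recursion:)

        x = 0b1010          # binary literal: ten
        y = 0xFF            # hexadecimal literal: 255
        print(x & 0b0010)   # boolean algebra applied bitwise: prints 2

        def countdown(n: int) -> None:
            """Recursion: a function defined in terms of a smaller case of itself."""
            if n == 0:
                return      # base case: stops the self-reference
            print(n)
            countdown(n - 1)

        countdown(3)        # prints 3, 2, 1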

    Some idea of the hardware and components that go into the magic box would help, although it's changing so fast that anything more than a high-level overview would be largely pointless.

    This is way before we get into stuff like predicate calculus, computer architecture, operating systems theory and concurrency, or various programming methodologies.

    I think if this stuff was part of the core mathematics and IT curriculum (along with typing skills) we'd have ... well, we'd have young adults who recognized they were operating machinery, rather than magic boxes.

    219:

    One book that's climbing up my to-be-read list is "Tubes," about the physical structure of the Internet: the datacenters and cables that let Joe Schmo order loafers from Zappos.

    Even sophisticated users can lose sight of the complex and enormous physical infrastructure that is the Internet. All this talk about "cyberspace" and "the cloud" encourages literal magical thinking.

    220:

    My point is that the Internet isn't a magic-like cloud of data. Or, it isn't just that. It's a physical object, an artifact built by human beings that encompasses the entire world. That's amazing.

    221:
    At the elementary level they tossed out Macs with working, usable teaching programs, as the stated line was "Kids should learn to use what they will be using in the real world after school." This was at a K-6 school where I was told this. And now that all those kids are in college, 50% or more have Macs.

    Another gripe I have is that the education tools aren't handsier, by which I mean that a lot of people don't process the abstract symbols they see on a display very well, but understand just fine if they can hold the thing in their hands and physically play with it. Explain the concept of inheritance in an abstract data structure and you'll get a lot of puzzled looks. Show them the same thing with a quipu and the light dawns.

    222:

    A big problem at the time, and I'm not so sure it has gotten better, was that the "computer specialist" at this elementary school was the secretary who jumped at the chance to become the school's "IT person". Peter Principle to the nth degree. She was all into keyboarding and such, and got almost hostile when I suggested that we should be buying (in 1995) 800x600 displays instead of the cheapest 640x480 ones. I even said the PTA would spring for the difference, but she wanted the cheapest possible option: maximum quantity over mid-grade quality in slightly smaller quantity. :(

    223:

    I feel the same way about people who refuse to buy (and tell others they should also refuse) a computer they can't root, swap out the kernel on, replace all the drivers on, etc. When are they going to realize that most people don't want a computer? They want a device that does something they want done and couldn't care less about how it does it.

    Don't you see the moral problem with this approach? If you don't, let me rephrase:

    "When are they going to realize that most people don't want a DEMOCRACY. They want a GOVERNMENT that does something they want done and could care less about how it does it."

    224:
    I think if this stuff was part of the core mathematics and IT curriculum (along with typing skills) we'd have ... well, we'd have young adults who recognized they were operating machinery, rather than magic boxes.

    Thank you!!! I'm tilting at windmills, of course, but I've been agitating for years that we drop our algebra requirements (or at least drastically cut down on the number of core topics we're required to teach) and replace them with something like this. Something a little more relevant for our students as well as promoting the social good.[1] This shouldn't be difficult; the concepts you're talking about (and I agree with) are quite accessible to the average high school student. Heck, the average middle school student. And quite bluntly, even if certain design decisions are justified on the basis that, say, a lot of people don't understand a directory structure, well, they should.

    [1] I hate wasting my students' time and money by teaching them subjects that are about as useful and topical as Latin. Instead, I'm of the mind that we should be giving them good value for what they're paying for. Call me old-fashioned ;-)

    225:

    Direct sunlight (and a long battery life) are all e-ink is good for

    Nope.

    The main advantage of e-ink (for me, at least) is that it doesn't flicker and doesn't glow.

    226:

    Don't you see the moral problem with this approach? If you don't, let me rephrase: "When are they going to realize that most people don't want a DEMOCRACY? They want a GOVERNMENT that does something they want done and couldn't care less about how it does it."

    Yes I see it. But this is what happens unless we freeze technology. Where should we freeze it?

    1900? 1920? 1940? 1960? 1980? 2000? 2020?

    You pick.

    Switching to automobiles. (From a US perspective.)

    1900: You buy a car, you'd better know how to take it apart and put it back together, to the point of tearing down the engine and having replacement parts forged at the local blacksmith. (I was recently told that until about 1900 every county in NC had at least one blacksmith who could forge parts for things like autos or railroad engines.)

    1920: You don't have to know how to tear down the engine, but you do need to know a lot about how carburetors work (so you can start it in various weather conditions), how to adjust the spark, etc. Maybe how to clutch a non-synchro transmission.

    1940: You still need to know how to adjust the dwell and timing, change the oil, etc., or pay people to, unless you lived in a major urban area and turned that over to a mechanic. But if you do that, should you be intimate with what he/she is doing?

    1960: Still dwell and timing. But tires are to the point where you don't worry so much about patching an inner tube on the spot. Automatic transmissions and power brakes and steering are around. Are these the equivalent of buying a Mac today? You need to remember that at idle you could run the battery down with the lights on, as alternators were rare or non-existent. Especially if you had air conditioning.

    1980: Gawd. Carbs are starting to go away. Transmissions are mostly automatic. Brakes with ABS are around. Automatic seat belts. Are people who buy such things headed for losing all their freedoms?

    2000: Engines are black boxes. They work or they don't. No adjustments, thanks to those sealed-at-the-factory computers. Ditto airbags and automatic seatbelts. Some cars know which driver is in them and adjust the seat, mirrors, etc. Most drivers have no idea how any of this works. Are they fools for buying such a car?

    2020: Very little of anything a car buyer of 1960 or earlier worried about is even a consideration, and much of 1980 and 2000 too. It is all taken care of for them. And the cars drive themselves in many situations; legally they may be required to do so to be on certain roads. Again, are the buyers of such a car fools?

    My father in the late 40s had a car where every Friday after college he would drop the oil pan, pull the bearing brackets off the piston rods, and insert folded newspaper. This was to avoid the expense of a full overhaul. He did it for about a year. I could do it if needed. My son might. But none of us ever expects to do that again, and we have no desire to go back to such a time. When my dad did it, it wasn't odd. If I had done it, a non-trivial number of my peers would have understood. Doing it today, 99.999% of the population would have no idea of what or why we would be doing it.

    We could say the same things about radios, TVs, stereos, telephones, etc..

    Technology moves forward. At some point it becomes a black box for the vast majority of its users. If this is bad, you're living in the wrong century.

    227:

    Last post was too long. I need to bow out for a bit.

    228:

    Re: Cheap tablets with GPS

    I have an Optus Tab (a ZTE v9) with GPS & 3G that cost the equivalent of £66; everyone bought them and they sold out in hours.

    It's no processor king, but when I replace it with the latest and greatest (probably a Nexus 7), it will end up in the car doing navigation, handsfree phone, and music/radio, and, with Torque and a Bluetooth ELM327, detailed car stats.

    Most companies have ditched the GPS/3G elements to save money, but if you can find a phone manufacturer that has just ripped the guts out of one of their smartphone models and put on a bigger screen, you can sometimes find a tablet with the requisite functionality.

    To Charlie: I'm wondering how the apps Apple dumps on the iPad aren't considered vendor bloat. Can you delete them? Do they not take up space/processing power? With Apple you have no choice over what apps you use for many tasks; they are Apple-mandated. Whilst Samsung's behaviour is unacceptable, you do at least have a route to root and gain control - one denied the iCrowd.

    Seems churlish to say your experience with the Apple-like behaviour of one manufacturer means Android tablets "suck", unless you are going to say "iPads suck" as well.

    229:

    "It's a media company now."

    Apple is like the Radio Corporation of America (RCA), which founded the National Broadcasting Company (NBC), so as to sell radios.

    Gotta have that content.

    NBC was separated from General Electric (GE)/RCA by anti-trust action. It is hard for me to imagine a similar action separating Apple from iTunes, Inc.

    230:

    There seems to be a big gap between the personal machines we have now and the type of machines that can run really useful AI software, e.g. Watson. So we are back to timesharing, with Siri.

    231:

    Can you delete them? Do they not take up space/processing power? With Apple you have no choice over what apps you use for many tasks; they are Apple-mandated. Whilst Samsung's behaviour is unacceptable, you do at least have a route to root and gain control - one denied the iCrowd.

    Jailbreaking (rooting) is common among iOS users. While replacing the OS is not really an option, it does open up many more options. Basically your iPhone is a Mac without the Finder/windowed UI. Using T-Mobile and loading apps not from the Apple store are the biggest reasons. Plus UI tweaks. The T-Mobile store across the mall from my nearest Apple store has a SIM cutter to handle the iPhone owners who want to switch to TM for service. Estimates for jailbreaking range from 10% to 20% of the installed base.

    As to the pre-installed apps: you can gather them up into a folder in a few minutes and they don't take up much room. In general they are fairly lightweight. And if you jailbreak you can dump them. But most of the Apple apps are not really considered "crapware".

    232:

    "Andy Rubin and Rich Miner were really clear from the get go that they believed that the developer community and Open Source would keep the platform pure, that developers would drive this and do an end-run on the carriers and OEMs.

    "Anybody who has spent anytime in the phone industry knows this just isn't the case. The software costs of the OS are a fraction of the costs of getting a handset to market and, in relative terms, always will be."

    Yes, that is so. But Google sure did a good job of ignoring that. They believed in the magic of the market rather than the realities of industrial production and marketing.

    ...or perhaps they are just ahead of their time.

    233:

    The problem of closed systems is not the loss of technical skills, but rather the monopolies and pervasive surveillance which closed systems enable. This point is constantly forgotten. No, I don't want to return to the command line for many aspects of my use of information technology. I am unhappy with, and intimidated by, being dependent on business organizations which do not have any public interest at heart, and which gather vast amounts of information about me, which in turn are available to governments and criminals.

    234:

    This sounds about right.

    When you have so many nits to pick I guess it's sometimes hard to focus on the one wicked thing that actually requires some talking time.

    235:

    "However Win8 will (almost certainly, it's not gone gold yet) load on a UEFI PC and do all the verification and signing required to secure the boot process in the same way Apple does on their own hardware."

    Apple doesn't actually do that. You can boot anything you like on a Mac without needing to circumvent any sort of signing or protection mechanism.

    Microsoft is requiring Windows 8 Certified hardware to (a) have UEFI Secure Boot enabled by default (so it will refuse to boot anything which hasn't been signed by a known key) and (b) ship with a Microsoft key.

    That's actually what has to be done for the idea of "secure boot" to have any meaning: the whole idea is to prevent an attacker from installing alternate bootstrap or kernel code and getting that to execute on the next boot. But it does have some unpleasant implications for Linux.
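
    (A toy illustration of that chain-of-trust idea -- not UEFI code; real Secure Boot verifies cryptographic signatures against enrolled keys, though the UEFI signature database can also hold plain hashes of approved binaries:)

        # Each boot stage refuses to hand control to an image it can't verify.
        import hashlib

        approved = set()  # stands in for the firmware's database of allowed images

        def enroll(image: bytes) -> None:
            approved.add(hashlib.sha256(image).hexdigest())

        def boot_next_stage(image: bytes) -> None:
            if hashlib.sha256(image).hexdigest() not in approved:
                raise RuntimeError("refusing to boot unknown/unsigned image")
            print("verified; handing over control")

        kernel = b"\x7fELF...pretend kernel image..."
        enroll(kernel)               # done at key/hash enrollment time
        boot_next_stage(kernel)      # boots normally
        boot_next_stage(b"rootkit")  # raises: exactly the attack being blocked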

    Google "implementing UEFI secure boot in Fedora" for a very calm and non-Microsoft-blaming blog post about the technical issues, written by the guy at Red Hat who is working on their solution.

    236:

    For those who dislike glossy displays, the Apple Store lists "MacBook Pro 15-inch Hi-Res Antiglare Widescreen Display" as a build-to-order option. This costs a bit more than the standard resolution glossy screen. Another option might be the Retina Display model. Since it uses the front of the LCD panel as is, it should also reflect less glare than the non-Retina glossy screen.

    Also, OGH's comments about daylight in Scotland need to be considered with a pinch of salt. Edinburgh is often sunny for parts of the day. Depending on your point of view about fullness of beverage containers, either the sunshine is interrupted by up to five kinds of weather every hour, or there are occasional glimpses of the sun to leaven the dreadful climate. Either way, I find a glossy screen pretty hard to use in Edinburgh. There is just too much sun.

    237:

    Just don't get over-focused on Office.

    It's a teaching and training problem. There is a huge installed base of Office on computers in schools, and teaching computing has shifted from teaching about computers to teaching how to use a few key applications (and sometimes not to any great depth).

    The initial skills of corporate staff are now taught at public expense, in time and money, while examination and testing processes only incidentally assess those skills. Teaching which might have some public benefit, but which is not assessed, has been squeezed out. Corporate interests point to lower-quality work getting high exam grades, and complain, while pushing less obviously for the non-assessed training.

    And Office is there.

    Conspiracy? Or stupidity?

    I would guess that most of us have the wit and knowledge to break out of that trap. We know that the OpenOffice family of office suites exists and is quite usable. Open or Libre, we have probably all tried them, and wouldn't back off, terrified by the unknown, if a system opened that program when presented with a Word document file.

    OK, it is more complicated for spreadsheets and databases. When we talk about Office in schools, we're talking about the kids being trained, at public expense, to be low-grade workers in the digital typing pool.

    The possibility is there that tablets, because they are not general-purpose computers at a practical level, will shatter the monopoly of concepts. And that is why Microsoft might be doing what it appears to be. Only it isn't intended as a Windows machine; it aims to be a machine running Office, in the limited typing/presentation way in which most people experience Office.

    And if you leave school, full of buzz and fizz about your leet skillz, the nearest you shall get to using a tablet is waving a ruggedised, ugly, portable thingy around the supermarket shelves you are keeping full.

    It isn't the hoi polloi who do PowerPoint.

    238:

    ... and, mere hours after telling you all how much I loved my Nexus One, the power button (which is also the "wake from sleep" button) broke. Fortunately, there appear to be workarounds.

    239:

    I understand what you are saying. Yes, as science and technology progress, one given person can understand less and less of it.

    However, I think The Car is the wrong analogy. A car is a tool that does one specific task, among other tools that do other tasks. A computer, on the other hand, is much more than just a tool. It is rapidly becoming an inseparable part of our intelligence and the main way we interact with the world. To be completely ignorant about how such a thing works is... unwise.

    240:

    This is an unfortunately common problem with Nexus Ones - or at least, that's what I learned when it happened to me a couple of months ago. If you're not able to get it replaced, then this is your chance to root your phone and install Cyanogenmod. Take heart, it's not as difficult as you might be thinking. It went very smoothly for me despite my lack of technical nous, and results in a phone that arguably has some advantages over stock. Good luck!

    241:

    George,

    sorry, I missed your question. The tablet is a Zixoon a71 - currently on offer for 89 Euro in Germany. (No idea about international availability.)

    242:

    I've pretty much given up buying new stuff.

    Computer manufacturers and software devs could learn from luthiers.

    Barring "hidden" materials changes, my electric guitars are pretty much the same as those made in the 1950's, the design for my acoustic is even older. Even newer models have a great deal of commonality with the classic designs.

    I expect that Telecasters etc will be used for quite some time to come. Will we reach that point with computing devices any time soon? I mean, where someone trained fifty years ago can pick up a brand-new machine and already know their way around it and where skills don't ablate with time?

    Would this be a good thing? I'm about Charlie's age and I'm already heartily sick of learning a new way of doing the same things every couple of years. I recently tried out the new release of Ubuntu after using Slackware and Open Solaris, and in the end I uninstalled the thing in disgust. I'm sure it does a lot of clever stuff, and that my older distros do a lot of stupid things, but for what I want a computer for, it was annoying and sluggish.

    Software companies have to knock out a new release every couple of years, but there's got to be a better approach than burning the world down after each Five Year Plan.

    Imagine if every five years or so Gibson changed the order of a guitar's strings, or changed the fret intervals.

    243:

    Charlie @210: Firstly, client premises may be presumed to have mains sockets. Right?

    You'd think.

    Naively one might assume that every meeting room chair would have at least two mains sockets and an ethernet port for a network connection (not into the corporate network, only to the internet) if they're too paranoid to run a wireless network.

    If you work on that assumption, you'll end up rooting around the client's campus in search of a power strip on your very first day. Makes for a remarkable first impression when it comes to thinking ahead, too. Been there, done that, got the T-shirt.

    ... (was going to say something about corporate network and security issues here, too, but self-censored)

    My point: Bring two batteries and turn in your phone and your 3G dongle at the gate if they ask you to.

    Charlie @210: Finally, I suspect Apple (and others) are waiting for the TSA to realize that a removable LiION cell and a device for creatively damaging and shorting it makes a nice incendiary device, and bans all devices with removable lithium batteries

    I don't doubt for a minute the TSA could be manipulated to bring about this little pearl of regulatory wisdom and I'm sure Apple and other manufacturers are lobbying for it as we speak. In other words: I'm afraid you're right.

    Having said that, it would be obviously ineffective. There is no reason at all to believe that you couldn't rig a gadget to burn without any external giveaways, and I think it would make for an even more spectacular show if you managed to ignite a lump of magnesium (e.g. gadget case) in the process. BTW, if you were trying to crash the plane you're on, not damaging your gadget to do so would probably not feature as number one on your priorities list.

    For more tried and proven useless regulation ref. "binary liquid explosives", "terahertz scanner". I wonder who lobbied for that?

    244:

    "I live in Scotland. What is this "bright daylight" you speak of?"

    You know how when you go into an office and turn on the lights?

    Your iris muscles scream in agony as your pupils contract so much that your eye whites are visible, your skin starts smoking and your clothes go under 50% water by weight?

    In some areas of the world, that's a common occurrence outdoors.

    245:

    I'm about Charlie's age and I'm already heartily sick of learning a new way of doing the same things every couple of years.

    I'm older and also tired of it. I was talking to someone a few days ago and told them the difference between my field and theirs is that while we both need continuing training, their knowledge expires slowly while mine turns into antique status at a rate of about 50% every two years. Anyone want someone who can rewrite the microcode on an async card programmed with an 8080 chip? How about writing business applications in 8K? Or how about all those folks who were wizards at SNA and Token Ring? Or who actually know the difference between a hub and a switch? Half duplex, anyone?

    WinXP seems to be a skill that has lasted but the value of that is likely to fall off a cliff soon for the vast majority of practitioners.

    246:

    WinXP and Win7 are almost identical, apart from the latter being more resilient and the control panel being better laid out.

    247:

    Actually, the magnesium alloy they use in computer cases doesn't burn well (long link to highly entertaining article on how to burn a NeXT Cube's case).

    248:

    I was talking to someone a few days ago and told them that the difference between my field and theirs is that while we both need continuing training, their knowledge expires slowly while mine turns antique at a rate of about 50% every two years.

    I have a CS degree from 1990; I like to explain to non-insiders that it's a bit like having an aerospace engineering degree from 1927.

    I stopped programming for a day job in 2000, stopped writing about computers every week for a living in 2005, and I am now officially obsolete. It's terrifying how fast a combination of new software releases with extra features and the slow bit rot of ageing neurones can cost you the ability to configure an Apache or EXIM server.

    249:

    I have a CS degree from 1990; I like to explain to non-insiders that it's a bit like having an aerospace engineering degree from 1927.

    And you posted that remark on the 100th birthday of Alan Turing. Some people don't think about it still being a young field as well as a fantastically quickly changing one.

    I'm about your age and was once hot stuff on an Apple ][...which has been obsolete since before some of today's programmers were born.

    250:

    Yes.

    And No.

    Windows XP is barely adequate with multi-core processors, and network traffic can drag down the speed of the whole machine.

    With the only change—same hardware and same user-software—being the Windows version, switching from XP to 7 removed the network traffic effect.

    You could call Windows 7 a new and better-made pair of pants, and suggest switching to the Mac is like switching to wearing a kilt. But that still doesn't get across how much better Win 7 is than XP.

    251:

    "I have a CS degree from 1990... like having an aerospace engineering degree from 1927 ... I am now officially obsolete."

    -- Your comment above sums up why non-tech users (including students) refuse to waste their time trying to learn bottom-up programming.

    -- A question about multicore:

    Just how different is programming for multicore versus programming/software design for single-core? When I upgraded to a 64-bit (dual-core) OS I was disappointed I couldn't use some popular apps because there was/is no 64-bit Flash. From this experience I'm guessing that while Moore's Law might be applicable to hardware, software needs a much longer incubation/gestation period. This is something that computer-based near-future scenarios also tend to overlook/ignore, i.e., the substantive difference between hardware and software.

    252:

    Actually, my guess would be that he didn't realise as he was re-"buying" all his apps that Apple wasn't charging him a second time.

    I'm an app developer, and I frequently get complaints from users who have had the app crash, deleted it (because they don't know how to force quit) and then reinstalled, thinking that they are having to pay for the app again (despite a modal alert that explicitly tells them each time they download the app that because they have bought it before they won't be charged).

    Users generally don't have much idea what's going on when they use these devices. It's little wonder that iDevices have been so popular given the lengths Apple has gone to in protecting users from themselves, and a little ironic that most of those users' complaints about the platform arise because they don't realise it already works the way they want it to: it does it so seamlessly that they never noticed.

    253:
    -- Your comment above sums up why non-tech users (including students) refuse to waste their time trying to learn bottom-up programming.

    But maybe they should get a short course with a simple language so they could at least learn the basics? Nothing fancy, just stuff like looping and if-then statements. Something that I learned myself in BASIC many years ago on a TRS-80.

    Btw, an honest question from someone who teaches but doesn't have much exposure to the real world - how often do the guys who do actual coding use (tail) recursion in those fancy languages they use these days? I can see most people cottoning on to looping at the middle- or high-school level, so that would be appropriate to teach. But recursion seems to be a persistent stumbling block for most people. And even when I first took C (many years ago, which may be significant), the subject was mentioned but - ahem - 'de-emphasized'.

    254:
    WinXP seems to be a skill that has lasted but the value of that is likely to fall off a cliff soon for the vast majority of practitioners.

    Agreement on WinXP, probably for the same reasons a lot of people still use the Office 2003 suite[1] instead of upgrading to 2007 or 2010: because they know where all the damn buttons are.

    [1]To the extent they use it at all, which typically happens only under threat of force. I still have to submit some official documents to the school twice a year which won't be accepted unless they are in Word.

    255:

    I have been writing s/w for 30 years and have never used recursion. I have also only used pointers to functions once in C.

    256:

    how often do the guys who do actual coding use (tail) recursion in those fancy languages they use these days?

    You might as well ask how much we use register colouring. The answer is that that's the sort of thing we expect our languages to worry about, just as we expect them to worry about where the variables actually are in memory.

    (In practice I'm not sure I've ever encountered a situation where it made sense for me to consider tail recursion in my code. I'm not writing tree structures, for example, even if I'm using them.)

    257:

    Here's my prediction for operating systems of the future. They are all going to get better until to the user they are functionally identical.

    258:

    Nope ... not 'easier to stack that way' as cubes, but rather the shape of choice would be that of those ghastly plastic stackable chairs that have replaced the once-ubiquitous laminated wood and tubular steel chairs of UK schools and institutions, the ones from the 'new' school buildings of the late '50s and early '60s of the last century, when I was young. One size fits all usually meant that that size wouldn't fit anyone.

    259:

    Dirk:

    I have been writing s/w for 30 years and have never used recursion. I have also only used pointers to functions once in C.

    Bellingham:

    In practice I'm not sure I've ever encountered a situation where it made sense for me to consider tail recursion in my code. I'm not writing tree structures, for example, even if I'm using them.

    I'm guessing you guys are representative until I hear otherwise. And what you've said is fairly depressing :-( I'm riffing off the subthread "what do users need to know", in particular DavidL's comments about things getting more black-boxy over time, whether that's consumers with their walled gardens or kids being taught how to use email and spreadsheets in their IT classes instead of BASIC.

    As Charlie notes, there's nothing wrong with the walled garden approach per se, it's what's in the garden. The same thing applies abstractly to teaching; everyone has to do triage and decide what to keep, what goes, and what needs to be emphasized and developed. I'm reluctant to put recursion into the "what every man needs to know" box because a) outside of academia, it doesn't seem to get used much and b) it's one of those things where a lot of people just don't get it on the first pass, so you have to spend extra time on it.

    I was hoping that people would jump in with comments that most of the new shiny! is absolutely reliant on functional programming and that's today's hot ticket into the work force. But no such luck :-( I'm guessing that Androids, iPhones, and most of the tablet offerings are still done in some old garden variety of C or Java.

    260:

    I have seldom used recursion in a real program (as opposed to coursework in my degree), but sometimes it's the easiest way to do something. As for pointers to functions - qsort in the standard library takes a pointer to a comparator, and there are callbacks in all sorts of places.
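
    For anyone who hasn't met that idiom, here's a minimal C sketch of the standard library qsort taking a pointer to a comparator - nothing exotic, just the classic callback:

        #include <stdio.h>
        #include <stdlib.h>

        /* Comparator: qsort hands us pointers to the two elements. */
        static int cmp_int(const void *a, const void *b)
        {
            int x = *(const int *)a;
            int y = *(const int *)b;
            return (x > y) - (x < y);   /* avoids the overflow risk of x - y */
        }

        int main(void)
        {
            int v[] = { 42, 7, 19, 3, 23 };
            qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);
            for (size_t i = 0; i < sizeof v / sizeof v[0]; i++)
                printf("%d\n", v[i]);
            return 0;
        }

    The library owns the sorting algorithm; the function pointer is just how you tell it what "in order" means for your data.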

    261:

    You could say that Android is programmed in Java, but while Java is relatively easy to learn the thousands of classes available are far from trivial. Even just knowing what classes are available is tough for a beginner and you rely heavily on the development environment eg Eclipse.

    262:

    The internal structure of a Galaxy S 4G seems to be more flammable, but the test was performed with filings, not in situ.

    The Apple Wireless Keyboard works pretty well with Android; aside from Fn, it's been my favorite computer keyboard for a few years. The Incase Origami Workstation is cute and functional--the top cover is stiff enough to avoid inadvertent key presses. And you can stash the hated dock cable between the bottom of the keyboard and the bottom of the case....

    263:

    I think that since human beings (and other mammals and quite a few not-mammals) contain liquids it would be better to have a shape which can contain a pressure vessel such as a sphere in a more efficient manner than a cube (or a stackable chair), while being just as nicely stackable as a cube.

    Of course, oblong boxes, shoebox forms, hexagonal prisms, and other shapes with internal stretchable cylinders come to mind immediately, but as always my vote goes to truncated octahedrons:

    http://en.wikipedia.org/wiki/File:Truncated_octahedra.jpg

    In addition to being more efficient than a cube in containing a sphere they are just as great as cubes when it comes to space-filling tessellations.

    Also, apart from being interesting shapes for better humans they would also be good shapes for computers used in environments where heat loss and heat conservation are important issues.

    To tell the truth, I find the shapes of current computers boring and I wish some brave souls would branch out in a greater variety of directions. I mean, here you have Raspberry Pies popping up and nearly all of the casings are totally boring. And those slates? Boring casings! Notebook casings? Boring! Yes, they're rather good from an ergonomic point of view, sometimes, but when it comes to dissipating heat they are often dangerous in addition to being boring!

    264:

    I [am] reluctant to put recursion into the "what every man needs to know" box...

    Well...yeah. I've used it, but it's not my first go-to thing. It should probably be mentioned when teaching programming, but I wouldn't spend too much time on it. For "what every man needs to know," there's looping; if beginning students get through a course without that, something is wrong. Ideally they'll get shown a few variations and be able to play with them, but if looping itself isn't understood they're not going to be very effective programmers.

    265:

    Recursion.

    Over the years of my programming I spent about 1/2 of my time doing business applications and 1/2 doing systems level stuff. Microcode, device drivers, OS task schedulers (waaay back in the day), etc...

    While I used recursion at times doing OS level programming the only time I saw a case for it on the business side was when dealing with a Bill of Materials (BOM) type of application. And asking business programmers to "just do it" tended to lead to either PITA constraints on how deep it would go (due to design or really bad implementations) or no limits and the option to have the programs go off into never never land until they crashed out on lack of memory or virtual storage or similar.

    I wonder if there are other business reasons for recursion other than BOM and if not maybe it's just a special case.
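
    To make the BOM case concrete, here's a toy C sketch (the part structure is invented for illustration) of the depth-constrained flavour of the recursion:

        #include <stdio.h>

        /* Hypothetical part record: each part lists its components. */
        struct part {
            const char  *name;
            struct part *sub[8];    /* NULL-terminated list of sub-parts */
        };

        #define MAX_DEPTH 16        /* the "PITA constraint" in the flesh */

        /* Print the bill of materials, refusing to go deeper than
           MAX_DEPTH rather than recursing until the stack runs out. */
        static int explode(const struct part *p, int depth)
        {
            if (depth > MAX_DEPTH)
                return -1;
            printf("%*s%s\n", depth * 2, "", p->name);
            for (int i = 0; p->sub[i] != NULL; i++)
                if (explode(p->sub[i], depth + 1) != 0)
                    return -1;
            return 0;
        }

        int main(void)
        {
            struct part spoke = { "spoke", { 0 } };
            struct part wheel = { "wheel", { &spoke, 0 } };
            struct part bike  = { "bicycle", { &wheel, &wheel, 0 } };
            return explode(&bike, 0) == 0 ? 0 : 1;
        }

    The unconstrained version is the same code minus the depth check, which works fine right up until someone enters a part that (directly or indirectly) contains itself.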

    266:

    Is there anything that actually needs recursion that cannot be done any other way? My impression is that it was once a neat way of saving on memory, something that is no longer an issue. OTOH, coding recursive functions is just asking for trouble in terms of the ease of introducing bugs. The code is also harder to understand.

    267:

    WinXP and Win7 are almost identical, apart from the latter being more resilient and the control panel better laid out.

    I was speaking from an IT point of view. Installation and support are vastly different. And when Secure Boot comes out many companies will have to finally give up on XP or move to white box OEMs which will cause larger companies in the US to have issues with Sarbanes-Oxley and exceptions to internal corporate practices. I'm sure there will be issues in the EU with this as well.

    268:

    There are plenty of problems where the recursive code is more readable and understandable than the iterative code. The one general example I can think of off the top of my head is Welzl's algorithm; the recursive form is much easier to understand than the iterative form.

    269:

    Quicksort is recursive and everyone uses that. But it's in a library so you don't need to know how it works. Sorts, searches, trees, tries - recursive and it's in a library. The people who implement the library need to know how it works. Everyone else just needs to know how to call it. Recursion is everywhere but it's write once code. Nobody writes stacks, queues, linked lists by hand anymore - the framework already has optimised versions of those and dictionaries, hash tables and every other data structure that CS used to teach. And they all have so many gotchas and obscure optimisations that the library version is bound to be better.

    270:

    And they all have so many gotchas and obscure optimisations that the library version is bound to be better.

    I've known the choice of library to make a big difference to the speed of the program.

    271:

    Logic programming uses recursion a lot (after all, the canonical logic algorithm is a pruned tree search). I've used recursion a fair amount in dealing with tree structures and more general graph structures. The standard algorithm for unpacking a serialized class or datatype stream with internal references is a recursive graph traversal with an identity set for detecting already-traversed subgraphs. And constraint resolvers are typically recursive.

    With a decent compiler you can write recursive code that's much easier to read (and maintain!) than loop code but is just as efficient.

    272:

    Writing recursive code is simple, and IMO less prone to bugs than iterative loops (especially in languages like C that give you only boolean or integer loop controls).

    273:
    "Just how different is programming for multicore versus programming/software design for single-core?"

    After you get to a certain number of cores - it gets pretty different. There are two reasons.

    1) Most software is written in a way that assumes that you don't have massive parallelism - so it won't take advantage of it automatically after a certain point.

    Metaphor time. Think of it like a haulage firm (software application) with 10 trucks. With one road (single core) they get there one after another. With five roads (five core) you get everybody there much faster with two trucks to a road. But after you get to more than ten roads you don't get any additional advantage unless you throw away your trucks and rebuild with lots more small vans... or possibly bicycles.

    Most people in software development world are still building trucks. Many software development tools, techniques and strategies only work well with building trucks.

    To take full advantage of systems with dozens or hundreds of cores people are going to have to adopt different approaches.

    2) The other problem is with the hardware. As you have more and more cores, the existing caching strategies for things like access to memory become less effective. Most systems (hardware and software) are built around the idea of "the processor" accessing single shared resources like "memory". That stops working well as the number of cores increases.

    To switch back to the haulage firm metaphor: if you have four or five trucks arriving at the same time then you can cope with a single gas station or loading dock. When you have hundreds arriving at the same time - not so much.

    This is going to lead to different hardware architectures, and in turn different software development techniques and strategies to take advantage of that hardware.
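
    To put rough code to the truck metaphor: below is a minimal pthreads sketch of the "one truck per road" approach, with the thread count hard-coded and a deliberately embarrassingly-parallel job (summing an array) so no locking is needed. Past the point where the slices get small, adding threads stops paying.

        #include <pthread.h>
        #include <stdio.h>

        #define N        1000000
        #define NTHREADS 4            /* our "roads" */

        static double data[N];

        struct slice { int lo, hi; double sum; };

        /* Each thread sums its own slice; no shared writes, no locks. */
        static void *sum_slice(void *arg)
        {
            struct slice *s = arg;
            s->sum = 0.0;
            for (int i = s->lo; i < s->hi; i++)
                s->sum += data[i];
            return NULL;
        }

        int main(void)
        {
            for (int i = 0; i < N; i++) data[i] = 1.0;

            pthread_t tid[NTHREADS];
            struct slice sl[NTHREADS];
            for (int t = 0; t < NTHREADS; t++) {
                sl[t].lo = t * (N / NTHREADS);
                sl[t].hi = (t + 1) * (N / NTHREADS);
                pthread_create(&tid[t], NULL, sum_slice, &sl[t]);
            }
            double total = 0.0;
            for (int t = 0; t < NTHREADS; t++) {
                pthread_join(tid[t], NULL);
                total += sl[t].sum;   /* combine results sequentially */
            }
            printf("%f\n", total);
            return 0;
        }

    The moment the slices have to talk to each other mid-flight, you're into the locking and cache-coherency territory discussed elsewhere in this thread.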

    274:

    iZettle.com: got one of their iPhone or iPad chip-and-PIN readers for our home-based business; it seems to work fine. It does confuse some people when you explain that yes, it is a proper card reader, and now just type your PIN on the screen.

    275:
    Btw, an honest question from someone who teaches but doesn't have much exposure to the real world - how often do the guys who do actual coding use (tail) recursion in those fancy languages they use these days?

    Recursion I use whenever a recursive solution seems like a good one. Wrote a little continuation-passing style bit of logic solving in a recursive style last week for example (in Perl :-)

    I have to admit I rarely find myself thinking consciously about whether something is tail recursive or not for a few reasons:

    1) It's an optimisation... see http://c2.com/cgi/wiki?RulesOfOptimization - so unless it becomes a performance issue I'm going to prefer clarity

    2) For the subset of languages that understand tail-recursive optimisation and apply it automatically the language will do it for me - so I don't have to think about it

    3) For the much larger set of languages that don't, and where the un-optimised recursive version is causing problems, then I'll happily optimise it out by hand. However I may not be thinking about it in terms of 'tail optimisation". I'll probably be thinking about it in smaller chunks of refactoring logic. I'm sure there are hundreds of developers who have successfully turned a recursive solution into a more efficient looping one without thinking of it in terms of tail-recursive optimisation. Just coz folk don't have a name for it doesn't mean they can't spot it :-)
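
    For readers who haven't seen the refactoring in question, here's a small C sketch. Whether the compiler actually performs the tail-call optimisation is implementation-dependent (gcc and clang typically will at -O2), so the hand-rewritten loop is the portable form:

        #include <stdio.h>

        /* Tail-recursive: the recursive call is the very last thing the
           function does, so a compiler may turn it into a plain jump. */
        static long sum_to(long n, long acc)
        {
            if (n == 0)
                return acc;
            return sum_to(n - 1, acc + n);
        }

        /* The same logic refactored by hand into a loop: the accumulator
           becomes a local variable and the stack stops growing. */
        static long sum_to_loop(long n)
        {
            long acc = 0;
            while (n != 0) {
                acc += n;
                n -= 1;
            }
            return acc;
        }

        int main(void)
        {
            printf("%ld %ld\n", sum_to(10, 0), sum_to_loop(10));  /* 55 55 */
            return 0;
        }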

    276:

    Is there anything that actually needs recursion that cannot be done any other way?

    Fundamentally no. You can always rewrite in a non-recursive style and make the state of the execution stack explicit in a separate data structure.

    My impression is that it was once a neat way of saving on memory, something that is no longer an issue. OTOH, coding recursive functions is just asking for trouble in terms of the ease of introducing bugs. The code is also harder to understand.

    I don't think it's that clear cut.

    1) Clarity is in the eye of the beholder and depends a great deal on their background. For example I've found that folk who have been introduced to development via functional languages, or who have a strong math background before encountering development, find the recursive style much clearer.

    2) Implementation language makes a real difference. A recursive factorial function in Erlang is much clearer than a looped non-recursive one. In C - not so much.

    3) There are a bunch of algorithms that are "naturally" recursive. Expressing them in a non-recursive way just adds noise and decreases clarity (to my eyes anyway - see point 1 :-)
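
    As a concrete illustration of the "explicit stack" rewrite mentioned above, here's a C sketch of the same tree traversal both ways (the fixed-size stack is purely for brevity):

        #include <stdio.h>

        struct node {
            int          value;
            struct node *left, *right;
        };

        /* Recursive: the call stack remembers where we still have to go. */
        static void visit_rec(const struct node *n)
        {
            if (n == NULL) return;
            printf("%d\n", n->value);
            visit_rec(n->left);
            visit_rec(n->right);
        }

        /* Non-recursive: the execution stack made explicit as an array
           of pending nodes that we manage ourselves. */
        static void visit_iter(const struct node *root)
        {
            const struct node *stack[64];   /* fixed depth, sketch only */
            int top = 0;
            if (root != NULL) stack[top++] = root;
            while (top > 0) {
                const struct node *n = stack[--top];
                printf("%d\n", n->value);
                if (n->right) stack[top++] = n->right;
                if (n->left)  stack[top++] = n->left;
            }
        }

        int main(void)
        {
            struct node leaf = { 2, NULL, NULL };
            struct node root = { 1, &leaf, NULL };
            visit_rec(&root);
            visit_iter(&root);
            return 0;
        }

    Which of the two reads more clearly is exactly the point-1 argument: it depends on the reader.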

    277:

    Exactly.

    Recursion is a basic tool, but tail recursion is an optimisation technique. For the structures I have, which are mostly flat trees with an occasional bit of sub-tree, tail recursion makes no sense.

    278:

    Ok, hearing otherwise here.

    I've used recursion in the last month or two: working down through large filesystems (which are tree structures).

    You don't use it (directly) much, but it does simplify the code.
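
    For the curious, that looks roughly like the following in C with the POSIX directory calls (error handling pared down to almost nothing for the sketch):

        #include <stdio.h>
        #include <string.h>
        #include <dirent.h>
        #include <sys/stat.h>

        /* Walk a directory tree, printing every path. The filesystem is
           a tree, so the natural shape of the code is recursive. */
        static void walk(const char *path)
        {
            DIR *d = opendir(path);
            if (d == NULL) return;
            struct dirent *e;
            while ((e = readdir(d)) != NULL) {
                if (strcmp(e->d_name, ".") == 0 || strcmp(e->d_name, "..") == 0)
                    continue;
                char child[4096];
                snprintf(child, sizeof child, "%s/%s", path, e->d_name);
                printf("%s\n", child);
                struct stat st;
                if (lstat(child, &st) == 0 && S_ISDIR(st.st_mode))
                    walk(child);        /* recurse into subdirectories */
            }
            closedir(d);
        }

        int main(int argc, char **argv)
        {
            walk(argc > 1 ? argv[1] : ".");
            return 0;
        }

    (lstat rather than stat, so symlink loops don't turn the "tree" into an infinite graph.)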

    279:

    I'm not yet willing to pay 600 dollars for a tablet, but I did get an HP TouchPad for 150 bucks. I run CyanogenMod on it and it works as well as any other tablet.

    I think the MS Surface will be a train wreck. Users will see their tech-savvy friend with this awesome tablet that runs PC apps. They will run out to buy one but will get the 600 dollar RT version instead of the 1000 dollar Pro version because they will say "I'm not using it for work or business, I don't need the extra memory and faster CPU." Then they will get it home and spend a couple of days trying to get it to run their favorite programs. Finally they will find out that the two different models are really two completely different tablets. One is the cool tablet that is really a complete PC with a touch screen interface and can run all your Windows software plus special tablet apps. The other uses an incompatible CPU and can only run special tablet apps. They will be angry, return the tablet to the store and for that matter probably stay away from Windows 8 completely. In fact, some users will not conclude that there are two different tablets at all. They will conclude that Windows 8 won't run ANY normal Windows programs at all, and be so frustrated by the whole thing they will go buy a Mac; after all, MS made it so the new Windows won't run any old apps anyway, so either way they have to buy all new software.

    Seriously, my mother can't keep the difference between RAM memory, hard drive storage and flash drive storage straight, and has to call me every time she wants to create a new Word document. How's she going to understand that there are two completely different versions of Windows 8 that need different versions of the software you want to run, but look identical and have very similar-sounding names? She has no idea what an ARM chip is; how's she supposed to understand the difference between RT and Pro?

    280:

    You're sort of on the money, only not.

    Many years ago when I was writing for a large computer magazine, they did a survey of the folks who bought the mag (for the adverts -- this was pre-2005). It turned out that about 60% of the customers never installed any applications on their PCs that didn't come with the machine, pre-installed in the first place.

    Welcome to the world of tech-illiterate home users. Those folks will use the RT version of the Surface tablet with the bundled apps. If MS make it easy enough, they may even download and run Angry Birds from a managed app store. Trying to install Photoshop from DVD-ROM? Not so much.

    What you're describing is the frustration of the other 39%, who know just enough about computers to be dangerous to themselves. (As opposed to the 1% who do this stuff for a living.)

    281:

    "I think the MS Surface will be a train wreck. Users will see their tech saavy friend with this awesome tablet that runs PC apps. they will run out to buy one but will get the 600 dollar RT version instead of the 1000 dollar pro version because they will say "I'm not using it for work or business,I don't need the extra memory and faster CPU. Then they will get it home and spend a couple days trying to get it to run their favorite programs. finally they will fin pout that the two different models are really two completely different tablets."

    When the iPad first came out I had a difficult job explaining to a group of pro photographers that no, it wouldn't run Adobe Photoshop even though it was an Apple "computer". It's not just an MS thing.

    The Surface Pro WILL run Photoshop out of the box though, and it's half the weight and more compact than a MacBook Air.

    282:

    Super late to the party but yes, this is exactly why I have a Transformer Prime as my main non-desktop machine and my (admittedly ancient) netbook is sitting on a shelf. I hit a button and it's immediately ready to go. Since I do a lot of freelance writing work on public transport, it's invaluable.

    283:

    I believe the cache coherency problem with arbitrary numbers of cores in heterogenous architectures has been solved.

    284:
    I believe the cache coherency problem with arbitrary numbers of cores in heterogenous architectures has been solved.

    First define "problem" and "solved" :-)

    As far as I am aware every consistency model used in cache design has performance and scalability trade offs. I'm not aware of any system that will keep the same kind of performance advantage you get for each additional core, once you start talking about hundreds of cores, with current hardware architectures.

    To get the full advantage of multiple cores in the hundreds you're going to be looking at more transputer-ish architectures where each core has integrated memory and message-passing facilities. Getting the most out of these kinds of architecture is [understatement]moderately tricky[/understatement] with many development styles since they're built around the assumptions of existing architectures. Suddenly functional languages with immutable state and an in-built message-passing model (like Erlang) look a lot more attractive.

    Disclaimer: I am not a hardware guy. I am not a hard-core algorithms guy. I do occasionally hang around folk who are.

    I'm just a sad techie who finds this sort of thing interesting. If there's new research around that lets you use 100+ cores to the max on current HW architectures I'd be fascinated to get pointers - as long as they're something I have a vague chance of understanding. The last book I read on the topic is ten-plus years old now, so I'm sure there has been some interesting new stuff found since.

    (The book, for folk who are interested, is reasonably approachable if you're of a technical bent. I'm not a hardware guy and found it understandable. "The Cache Memory Book" by Jim Handy. Get it from the library. Silly price last time I looked ;-)

    285:

    I'm intrigued by all of the hostility towards the walled garden approach. Maybe I've been drinking too much Kool-Aid, or maybe it's an age thing, but right now I just want things to work.

    If the rules around the garden are clear, and serve to reduce the risk of malware and enforce a baseline of software quality, isn't that better than having to rely on users being sufficiently technically savvy to not install rubbish on their device?

    286:

    I'm currently an academic, but I'm pretty sure I've used (non-tail) recursion while working in industry, and I've definitely used closures. Recursing over a directory tree is the obvious example for the former, and several of the modern Javascript frameworks rely heavily on the latter.

    Functional programming's making some headway - Clojure and Scala in particular are used by some real companies making real money, and I believe Haskell and OCaml have some devotees in finance - but it's still a minority pursuit AFAICT. But features inspired by functional languages are increasingly being found in mainstream languages.

    287:

    Just how different is programming for multicore versus programming/software design for single-core?

    It depends. I was three years into a career as a software engineer before I got to use a high-level language. Eight years before I got to use an OS. Eleven years before I was programming for a single core... (i.e. the point at which I left the defence sector and the delights of hard-real-time embedded systems in airborne radar. And having jumped to telecoms, promptly got grabbed for a MIMD hard-real-time assembler project in my new firm). It got really fun when the %^#*#+% hardware engineers didn't give you the tools you needed (what, shared memory? We didn't have room on the board), or the project managers didn't (I used to code all of this on a PIC; why do you want a licence for an RTOS? The BoM costs can't take a licence,...).

    It takes a bit more thinking; the linear flow beloved of the single-thread programmer is a simplification. It's more like project management - in fact, it's the one thing that MS Project is good for, namely task scheduling and dependency visualisation on a multiprocessor system. All of those CS lectures about deadlock avoidance and scheduling strategies suddenly become relevant.

    It's not a problem that lends itself to an automated solution - you have to think about it. There have been various attempts to produce parallel compilers over the past several decades, and none has really succeeded - fine-grained parallelism foundering on synchronisation overhead, coarse-grained parallelism needing language support. Meanwhile, the average CPU now contains features that used to be taught in my "Supercomputing Architectures" course at University, and combined with Moore's Law this leads to a "good enough" solution that puts off the necessity. I graduated in 1988, and our dreams of Transputers (ooooh, a T800) evaporated along with the desirability of OCCAM on your CV. Aside: we actually looked at Transputers for the radar in the late 80s, but the lack of a second source / ecosystem, and the cheap licenses for the SPARC, put paid to that.

    Having seen the complete arse that many programmers make of "object-oriented" (I like it, it's just that many people who say they do it are actively missing the point and making a pig's ear of it), I dread to think what they could do when handed a parallel problem.

    The hardware engineers (i.e. the HDL coders) think in parallel because that's how Verilog and VHDL work; but then, they're stuck with 80s programming styles ;) maybe the answer will come with tools that hand off from software to HDL - the first worthwhile ones are finally starting to appear.

    If I'm lucky, the answer will be some form of C++ :)

    288:

    Martin and Adrian Howard have pretty much covered the forked stick we've found ourselves in now as regards multi-core architectures and the problems of building hardware and software that actually make use of the exponential performance gains that Moore's Law has cursed/blessed us with.

    One huge problem is that building massively parallel systems (>1K processors) requires either tailoring the system to the problem (see systolic processors for signal processing, pyramid architectures for machine vision, intelligent RAM frame buffers for graphics) which means a lot of different kinds of hardware, hindering manufacturing economies of scale, or tailoring the problem to the system (moving cpu-intensive operations to server systems where many standard programs can run as individual processes, i.e., cloud computing) which means a whole new set of security, reliability, and data lifetime issues.

    Having seen the complete arse that many programmers make of "object-oriented" (I like it, it's just that many people who say they do it are actively missing the point and making a pig's ear of it), I dread to think what they could do when handed a parallel problem.

    Great Cthulhu, yes! I've been working with object-oriented design and programming since 1985, and it's a truly lovely thing, especially the way it makes refactoring a routine operation[1]. But I've also had to read a lot of other peoples' code in that time, and especially since Java became very popular there are a lot of programmers who write Java code as if it were C, and objects as if they were C data structures.

    As for how badly those same people can screw up parallel and concurrent code, here's a really horrible example. In the late 1980s I was working at Tektronix, writing a pluggable-device graphic input system for an intelligent 3D graphics workstation. The entire graphic firmware system was written in C on top of an RTOS, and because we had 15 programmers, most of whom had never written concurrent code, we went to great lengths to have conventions about locking and synchronizing so that we could prevent deadlocks, and even so, every 3rd project build was massively broken. At one point I was having trouble hitting a deadline on a part of my system and I asked for some help. A junior programmer was assigned to help me write code; I walked him through the design of the subsystem involved (a 3-axis input driver for a particular class of devices), pointed out the hooks that connected it to the rest of the system, wrote out pseudocode for one axis, and explained that there needed to be 3 concurrent threads, one for each axis, and gave him a current snapshot of the rest of the system to build and test against. A few days later he came back and told me he was done, so I ran a test. The first axis worked, but when I tried to get input on the 2nd and 3rd axes, the code executed a branch to Venus and died. I did a quick scan of his source code and discovered he'd written out the code for a single thread and completely ignored the fact that two others needed to be running, leaving garbage pointers in the hooks where they were supposed to be referenced. It took me half an hour of careful explaining for him to get what "three concurrent threads" meant, and this was a guy with an engineering degree and a couple of years of professional programming behind him, 6 months of it on this particular project.

    [1] Refactoring used to be a manual operation that often could take hours for something as simple as moving a variable and the functions that access it to a different scope, because of the need to test for and often fix bugs induced by typos or bad cut&paste. But objects give the IDE designer (whether it's emacs or IDEA) a framework on which to write automatic transformations on code that are guaranteed to be correct, so a complex refactoring can take seconds, and the unit tests for the refactored code can also be automatically refactored.

    289:

    Adrian, Martin and Bruce - thanks all for your explanations! (I'll continue to look up the various terms ... familiarize myself with this lexicon.)

    Bruce -- re: OOP and Cloud: could you elaborate on this relationship? Thanks! (I'm imagining that OOP 'code modules' behave like the 'packets' sent through the Internet, but this is probably wrong.)

    290:

    I think it is worth noting that Sony, who used the Cell processor in the PS3, is abandoning it for the PS4. The programmers at the game companies found it too complex to deal with.

    I was reading the latest supercomputer ratings recently and noticed that one of the numbers published is a figure of merit based on how well the multiple cores are utilized.

    291:

    iZettle.com

    Ah. The EU version of squareup.com

    Apparently it takes more energy to read the chips than to swipe a mag stripe. The Square dongle is powered by a tone put out on the earphone jack.

    292:

    1) Don't knock all the "nasty knock-offs". Some of them are actually quite good. Ainol (yes, it's ripe for jokes) puts out some nice tablets, with IPS and high dot pitch screens for around half the cost of the Samsung kit. And you can get some actually okay, cheap but cheerful ICS 4.0 capacitive touch TFT tablets in the US now for between $60 and $100, with HDMI and SD and even sometimes IR on-board. And many of these come straight from China with no crapware, completely vanilla ICS, pre-rooted. I'd compare the current low-end Android wave of devices to the early-to-mid 1980s microcomputer revolution. Yeah, at that time you could also go buy an Apple ][ or GS or even the weird monochrome fishbowl screen Mac for a more integrated experience, but you would also be spending 3-5x as much. Two generations later Apple is still selling the same experience. The vitality in the low-end Android market is exciting, it's full of younger kids, hackers and hardware mavens enhancing and repurposing these cheap-as-dirt little devices. I've velcro'd one onto my fridge to act as a quick stock checker using a barcode/QR scanner.

    2) The TF101's cable is laughably short. But it's a detachable USB at the other end and you can add a USB female/male extender cable for around $2. I've run USB extenders up to 20m and they work for signal integrity so I'm pretty sure they'll work for something simple like power.

    293:

    Multi-Process, Multi-Thread, Multi-Core

    All of these possibilities have been around for decades, and while I agree that they are difficult for anything that operates with massive data dependencies, there are many applications where it's much easier to make good use of them. Disclosure: I don't concern myself with pipelines and cache structures very much with the silent assumption that the out-of-order execution units will take care of a lot of my problems where cpu performance is concerned (no real-time requirements in any of my current projects).

    To make multi-processing useful, IMO it very much depends how you slice your problem (assuming you had to write them yourself), e.g:

    Would I want a multi-threaded DOM-Parser: I don't think so. It depends too much on the order of SAX-Events, which makes it non-trivial. It's not entirely useless to have threads for different types of events, especially if they are order independent (like attributes associated to an element) if your control thread is smart, but that will only work out if you have very good staff.

    Would I want multiple single-threaded DOM-Parser instances if I want to parse a lot of documents: Yes. Everybody and their dog could do it.

    Another example: Would I want a single-thread program convoluting GUI, Business-Objects, Data-Objects: Hell, no.

    Would I want a single-thread control for DB and IO access: Very likely. It would give me the option to multi-thread transparently if needed.

    These are all very well known, tried and proven concepts, each with several workable implementations, but you make it sound like voodoo.

    For all that may happen across this thread and hesitate to try their hand at multi-whatever programming, there are some fairly simple rules to use it:

    (1) Keep your UI thread(s) and your control thread(s) separate from all threads that manipulate data.

    (2) Be aware that once you've spawned a thread or forked a process, you can never be sure at which point in time which piece of instruction in your code is going to be executed, except that thread-locally (i.e. for each thread instance) execution order is guaranteed.

    (3) A lock isn't a lock if you have more than one thread to control access to it.

    (4) In a thread you may have access to the parent's variables. Don't try to use them as a lock or to concurrently write data unless they are specifically designed for it (see (2)). You can NOT be sure of the order of access to data structures even if they are in a separate control process with higher priority that picks up results from child processes or threads.

    (5) Callbacks don't guarantee order of execution.

    (6) Where you need IPC, use a single, blocking, light-weight, data-only atomic structure (forget everything larger than a few KB unless you're prepared to make all other threads wait) for everything that one process might need to communicate to any other. (A minimal sketch follows after this list.)

    (7) A thread is not so different from a process, except that threads tend to share more data, and one blocking thread can block the entire process it's running in. For a process, you can use the OS to ungracefully abort if you happen across bad code.

    (8) The persons discussing the stuff above are all active in very much more advanced fields of CS than you are now. Worry about the problems they discussed when you come across them, and that's not likely to be soon. If you do, you can probably step back and take a different tack at the same problem and still solve it. THEY are paid to solve a host of potential problems, YOU are most likely paid to solve one.
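
    As promised under rule (6), here's a minimal pthreads sketch of the single, blocking, data-only structure: one worker, one consumer, and the mailbox is the only state they share. An illustration of the rule, not production code.

        #include <pthread.h>
        #include <stdio.h>

        /* Rule (6) in miniature: one small, blocking, data-only
           structure is the only thing the two threads share. */
        struct mailbox {
            pthread_mutex_t lock;
            pthread_cond_t  ready;
            int             full;
            int             payload;
        };

        static struct mailbox box = {
            PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER, 0, 0
        };

        static void *worker(void *arg)
        {
            (void)arg;
            for (int i = 1; i <= 3; i++) {
                pthread_mutex_lock(&box.lock);
                while (box.full)       /* wait until the last result is consumed */
                    pthread_cond_wait(&box.ready, &box.lock);
                box.payload = i * i;
                box.full = 1;
                pthread_cond_signal(&box.ready);
                pthread_mutex_unlock(&box.lock);
            }
            return NULL;
        }

        int main(void)
        {
            pthread_t t;
            pthread_create(&t, NULL, worker, NULL);
            for (int i = 0; i < 3; i++) {
                pthread_mutex_lock(&box.lock);
                while (!box.full)
                    pthread_cond_wait(&box.ready, &box.lock);
                printf("got %d\n", box.payload);
                box.full = 0;
                pthread_cond_signal(&box.ready);
                pthread_mutex_unlock(&box.lock);
            }
            pthread_join(t, NULL);
            return 0;
        }

    Everything else in both threads is thread-local, which is most of what rules (1) to (5) are trying to buy you.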

    294:

    Charlie--the better Android system out there is the Asus. I have the Asus Transformer (not the Transformer Prime) and it lacks a number of the issues you're having with the Samsung.

    I like it as an e-book reader, but it's also good for quick notes on the fly. The word processor that came with it (Libre Office, I think) is fully compatible with my Word or Open Office on the iMac, and I've not had any issues with it.

    Not really happy with the keyboard dock for long-term typing or word processing, but for taking notes when I don't want to carry around the MacBookPro, it works just fine.

    295:

    Making programming easier for the non-elite programmer is a goal of OS vendors like Apple and Microsoft, because if it's hard to write software for your platform it doesn't have as much software, and that makes it less appealing.

    Pointers are hard so Java got rid of those and when Microsoft's embrace and extend on Java failed they created C#. Idiomatic Objective-C doesn't use pointer arithmetic so the id type is basically used as a reference - although the underlying implementation is a C pointer. But you'd have to know C to try dereferencing an id so shooting yourself in the foot would require a special effort.

    Memory management is hard so Microsoft and Apple make their platform languages garbage-collected and/or reference counted. Apple's recent upgrade to automatic reference counting probably makes it as easy to use and more efficient than garbage collection on mobile devices.

    Threading is hard and multi-core makes it necessary so Apple introduces GCD and blocks and Microsoft has thread pools and something else depending which type of developer they are addressing.

    Apple has one platform language (Objective-C) which they are making average-developer-safe whilst retaining full access for those developers that need it. Microsoft has C# for average needs and C++ for those that need to get to the metal.

    296:

    Ah, thank God. I can't tell if I was feeling more stupid that I hadn't thought of chip-and-pin, or appalled that nobody had figured out how to deal with that.

    It's also deja-vu that one of their testimonial cases was, of course, a touring band. Bands, restaurants, and very tiny businesses are the major users of Square that I've seen here in Portland.

    297:

    Ah, lovely. At least under that brand name, it seems to be mostly only available in the German-speaking lands.

    I might make one of my Berliner friends play guinea-pig on my behalf...

    298:

    SFReader:

    I think you misunderstood me. I mentioned the cloud in reference to parallel computation; my point was that most programs running in the cloud don't need to be explicitly coded for concurrency because in the cloud you can run millions of instances of a given app in parallel, but they don't need to synchronize or lock resources at the application level because that's done in the web server, the application server, and the database.

    Let it also be said that I've done a fair amount of enterprise web application programming, and I have yet to see a project that shouldn't have been built with objects, and a lot of them where the programmers should have had a better understanding of how to design and program with objects.

    erald @ 293:

    Some of what you say is true, although I think a deeper understanding of some of the basics of concurrent programming (types of synchronization mechanisms, patterns of concurrency like co-routines and worker threads, programming conventions to avoid deadlock, etc) would be very useful. Different languages and runtime environments provide different locking and synchronization mechanisms, from Unix fork() and wait() to semaphores, message queues, barriers, and so on. Understanding their differences and similarities is important when moving from one environment to another.

    Understanding some of the more subtle points of locking becomes much more important when working with large databases or data structures, or large numbers of threads (even when they're running the same code, they need to play nice together with system resources). Modern server class processors can run tens of thousands of threads against petabyte databases while communicating over thousands of sockets, but that can all turn to molasses if something starts hogging semaphores or not freeing memory when done.

    Most importantly, in highly parallel and especially in distributed systems, good error-handling becomes vitally important. You can't just assume that a resource you need is available or working correctly, that the communication channel to it is functioning, or that it can be guaranteed to respond within a predetermined time interval.

    William T. Goodall @ 295:

    Apple also supports mixing Objective-C, C, and C++, which can be very handy when trying to use libraries from different sources, or if, like me, you're a big fan of C++ templates.

    299:

    No, iZettle is definitely being rolled out here in the UK; now if they can sort out the negotiations with Visa it will be even more useful. It will be interesting to see what sorts of transactions it gets used for.

    300:

    We had some here on the West coast in the last few weeks. The sky went blue instead of the usual grey, and this bright yellow ball appeared in it, staying there from about 4AM to 10PM (all times approximate and local time).

    301:

    OK, I am late to this and realise I'm offering another variant of an existing point. But what I like about Apple iOS is its ease of use and, from the UI point of view, locked-down, predictable handling. Though not a programmer or engineer, I'm fairly tech savvy. Use Windows XP and Linux regularly. Comfortable with the command line. Use the other IOS on Cisco devices... However, when I just want to read an ebook, watch iPlayer, consume media, my iPod Touch is the go-to device.

    Tablets are now being bought by the people that five years ago, having no particular interest in computers, would have bought a laptop / netbook to surf the web, watch videos, maybe do a bit of general note taking and personal admin. Often these would end up clogged with viruses, malware or junkware because they're not aware enough of such things. 20 years ago, these people wouldn't have wanted a computer at all.

    302:

    Ok, to some of you I'm just "some guy on the internet", but I'm already having issues with a legal copy of Office 2010 running on a private network: it actually can't "phone home", so it keeps neeping on about "not being activated". The hardware and software configs are stable, but there are reasons why this machine can not be connected to the wider internet.

    303:

    Gissa job? ;)

    304:

    Not so good if you can't see the screen. Asking someone to put your finger on the 0 kinda defeats the point...

    305:

    It's a hassle, but I thought you could still activate MS products over the telephone. (I dimly recall having to do this once.)

    306:

    I do it occasionally. It's a howling pain in the ass, as Microsoft has implemented a "conversational" interface that talks to you at random moments, with lengthy delays.

    One of my clients has a secure internal network, as in "secure means no connection to the outside world." Separate switches, wiring, two machines per desk with KVM switches. A script periodically pings from each network space to addresses on the other, and will sound an alarm if it sees any alien packets. (for employees who like to play with wires...)

    It's interesting how many commercial software applications - which their vendors swear don't need or use the internet - either refuse to install or report random errors when they can't talk to the mothership.

    "Can't you just hook the server up to the internet for a few minutes so we can push this upgrade out to you?"

    "What part of 'NO' are you having trouble with?"

    307:

    non-elite programmer

    All this discussion about how parallelism and multi-core programming isn't all that hard assumes elite programmers. And most just aren't. So talking about how it isn't that hard to the 1% sounds nice but the issue is how to apply it to the 99%. Maybe even the 99.9%.

    Been there. Have the hat. Not on this specific issue but when "elites" develop a system then management brings in the non elite troops to be trained to enhance it and maintain it (because we know all programmers have basically equal skills and are interchangeable just like on assembly lines), well the results are usually not pretty.

    WTG, not arguing with your comment, just picking up your term.

    308:

    #305 and #306 - Cheers guys, but the point was "if I'm having these hassles now, then how long before MickeySh@ft won't allow us to have secure not-internet computers?" rather than "How do I stop A Mess Office neeping at me?"
    309:

    I take it that now includes most of the Iranian govt :-)

    310:

    "secure means no connection to the outside world."

    That's exactly what I'm talking about; there aren't even leased lines involved.

    311:

    Some of us are not even programmers, not in any formal sense (even if I once did a few things with machine code in BASIC programs on a TRS-80).

    I can see how an ARM-based Android contraption could be something I could understand, because it is so simple. The problem, as I see it, is that we seem to be talking about variations on C, when the people who might want to write programs would rather use something like BASIC or Pascal.

    OK, I'm not up on the current languages. I hope the examples I have picked are clear enough to get across the difference. I don't want to trigger a sub-thread of language suggestions. I already have a headache.

    (You can still get BBC BASIC. Just saying...)

    Anyway, with a good compiler, does the human-readable code need to be quite so obscure?

    312:

    In the wider scheme of things, BASIC and Pascal are minor variations on Procedural Programming, which also includes C.

    Haskell, for example, is a whole different type of language - it's a Functional one.

    (Much Object Oriented programming, at least as practised in C++ and Java, is also procedural.)

    Then there's the Declarative type, such as SQL. If you're used to thinking in BASIC or Pascal, the apparent lack of iteration or flow control in SQL may be painful.

    313:

    "Can't you just hook the server up to the internet for a few minutes so we can push this upgrade out to you?"

    Yes, but then I'd have to kill you.

    314:

    Android programming is done in Java. If you want to use a scripting language, Python for Android or Kivy appear to allow you to build .apk installation files from your Python programs. Writing and reading Python is easier than Java, in my experience, but that's as far as I go on recommendations: I have no experience of Android programming in any language.

    315:

    If you do, be sure to check their twitter account for the latest firmware. There have been several updates (the latest including Angry Birds Galaxy by popular request).

    http://t.co/6EmOws3b

    316:

    It's also much easier to make a dog's breakfast of object programming in Python than in Java. Though, to be fair, I have seen some programmers who can turn any object-oriented language into C (even Lisp, and that was a wonder to behold).

    David L @ 307 is absolutely correct that many programmers are not up to the task of writing concurrent code. Now in many cases it's not because the programmers are incapable of it but because they have never had the requisite training. Even people with CS degrees miss out on some important areas of the field in their education because many schools have turned CS from a branch of mathematics or engineering into a vocational curriculum. So everyone learns how to write code in C and the language du jour (used to be Java, now it's probably Javascript, no relation despite the name), and gets a quick course in algorithms and data structures. Parallel and distributed systems are a high-level elective, and not considered important to the average programmer.

    For several years I was a technological evangelist and transferer at Tektronix; my job was to evaluate tools and techniques for object-oriented programming and design, and help the product development divisions adopt them. This included training their programmers in the basic concepts of object programming as well as how to use the tools themselves. What I found was that some of the programmers were eager to learn new things, some were absolutely opposed to new things, but most were interested if I could prove to them that their work would be improved in some way. On the other hand, most managers were opposed to more than minimal training for their programmers because training was recorded as a loss against their bottom line profit and loss; they wanted me to train their senior programmers and designers (who were often most opposed to the new techniques because they had a career investment in the way things were being done) and have them train the junior folk. The result was that the junior programmers often were poorly trained, and insufficiently comprehended the benefits and tradeoffs of the new techniques. So the pilot project that was to vet the new techniques often had to be saved by assistance from other groups who had gotten it right, and sometimes it became a failed project that "proved" that objects didn't work. And often when it did work the code was at best a hybrid of object programming and the "structured programming" they had been doing, so many of the benefits were reduced or lost.

    This sort of politics played out at several companies that I know about, and I've seen the resultant attitudes generally in the industry since. It was a fascinating lesson in the way technology evolves over time, in ways that are as affected by ignorance as by knowledge, and by politics as by engineering tradeoffs.

    317:

    I'm tempted to ask if there's a sample of the Lisp available. Down that route lies madness, though. I'm three years out of college: I got three modules on concurrent programming, none elective (two ran in the same semester, because the CS department is run by people who can't resist that kind of joke): one on processor stuff (the care and feeding of SIMD registers, Cell programming et al), one on threading, locks, etc, and one on distributed systems. How good they were is debatable, but they were there.

    318:

    This sort of politics played out at several companies that I know about, and I've seen the resultant attitudes generally in the industry since.

    This doesn't just happen with object oriented programming. Going back in the past COBOL/Fortran vs. almost anything else, SQL vs. CICS, Machine code/Assembly vs. Compiled, C vs. C++, ... Heck cards vs. teletypes vs. displays.

    And the biggest one most people here missed. Mainframes then minicomputers then PCs and now tablets. Same debates with each step with the nouns moved down one level in the conversation.

    And for those of us who saw the computer industry in the 70s, Microsoft's industry position and actions look eerily similar to IBM at around 83 or 84. Very similar. Especially as to the MS statements about Apple and Android in the workplace. Almost identical to how IBM was talking about MS.

    There's something built into human nature such that when people get comfortable with something, the majority will resist change even in the face of overwhelming evidence that their lives will be better long term.

    319:

    I can see how an ARM-based Android contraption could be something I could understand, because it is so simple.

    You do know that an Android or iPhone contraption is basically a full fledged computer in terms of programming?

    Anyway, with a good compiler, does the human-readable code need to be quite so obscure?

    The problem here is that our better programming methods still don't lend themselves to easy understanding when transcribing intent into letters on a page/display. It's hard - harder than most people can comprehend - to develop much in the way of complicated software that works well and presents a good interface to John Doe.

    There's this mindset that, gee, this doesn't seem all that hard early in the learning. Even after a few years. But getting from the first few years of college programming to really good object-oriented, multi-core, good-user-interface programming is like jumping to working on unified field theories in physics after mastering a few years of calculus.

    Which is why I think maybe 30% of the programmers out there can be good but only 1%, give or take, will ever be great. And our tools for programming are still at the flint axe and knife stage, so it takes the elite to do great work. Or hordes of the good to maybe do good work. Maybe. See Mythical Man Month. http://en.wikipedia.org/wiki/The_Mythical_Man-Month

    320:

    I'm tempted to ask if there's a sample of the Lisp available.

    Here's the last line of most Lisp programs.

    ))))))))))))))))))))))))))))))))))))))))))))))))))))))))

    321:

    The problem of programming will not be solved until we have a fully functional AI, and anyone can just tell it what is wanted and it can show it all working in real time as the description unfolds.

    322:

    Since I saw the "tablets are better because they are instant-on" argument here multiple times:

    You guys know that you can do the very same thing with a laptop (even a netbook) if you simply use sleep mode instead of shutting down? Optionally with an SSD instead of a normal hard disk.

    You do not need to completely shut down a PC every time. The reason tablets are instant-on is that they do exactly this: they use flash-based storage and go to sleep instead of shutting down. But nothing at all prevents laptops from doing the same. They need power while asleep, but so do tablets.

    323:

    anyone can just tell it what is wanted

    Because we all know how well most folks describe things. :)

    What is needed are DWIM buttons and/or systems.

    Do What I Mean

    324:

    Hence real AI that can actually understand what the user wants, e.g. "I want a box on screen with a graph in it that shows the number of people in Trafalgar Square by the hour on the summer solstice. Then if someone expands the graph at a particular point and keeps on expanding it, they can narrow it down to whoever is standing closest to the pillar at that time. Then see if they are on Foursquare and if so tell me on my phone. Questions?"

    325:

    Bruce,

    You can get a matte-screen MacBook Pro -- I bought one not six months ago. It's a configuration option in the online store. You can't get the Retina display with a matte screen, so you're "limited" to an 8GB i7, but they very much exist, and one should last you another five years.

    326:

    Which is why the "superparen" was invented: "))))))))))))))))))))))))))))))))))))))))))))))))))))))))" == "]". :-)

    327:

    My laptop (a MacBook Pro) is on all the time, and it sometimes takes a couple of minutes to wake up completely. And every few days it doesn't wake up at all and I have to reboot it. That's not "instant on".

    And that's part of my definition of a useable computer: if I can't leave it on for days at a time without it running out of memory or tripping over a kernel panic, then it's not a computer I want.

    328:

    Actually, 8 gig is a limit I can't accept. Right now I have 4 gig, as much as this model will take, and I can't run both an IDE (Xcode + IntelliJ AppCode) and a web browser (Chrome with about 30 tabs open) at the same time without everything slowing to a crawl. I'd like to also have a geometry program open to test out shape configurations, but if I do that the machine freezes in memory thrash. My guess is that I'll need 16 gig to ensure I can run everything I need simultaneously.

    329:

    Last comment for now, honest. I have to go mix up a huge bowl of pasta salad. I just wanted to point out that the AI programming system will be just great until we discover an off-by-one error in its code generation, or a buffer overflow in the symbol table maintainer.

    330:

    The funny thing to me is when people use it exclusively in high-end cases with built-in keyboards and basically turn their iPad into a $900 laptop with less capability than a $250 netbook....

    Instant on, really sharp display, and touchscreen are features my workmates' netbooks don't have that my iPad does have.

    331:

    Bruce, Apple doesn't offer 16GB as an out-of-the-box option on the current non-Retina MacBook Pros, but you can do it yourself. They have two SODIMM sockets, there are 8GB SODIMMs available, and the Intel chipset knows how to speak to them. I just checked Other World Computing's website and they offer a 16GB upgrade kit for these machines for $170 US. So you actually can buy a matte display MacBook Pro and get 16GB. (Sorry if I just caused you to fail your saving throw against the new not-so-shiny.)

    On the other hand, you might want to look at AnandTech's review of the Retina machine. To evaluate Apple's claims about the retina display's reduced glare, Anand took photographs contrasting it with the glossy and matte variants of the 2011 MacBook Pro in various glare-inducing situations.

    I'd actually say that out of the three, the retina machine looks best overall. The matte display does diffuse the edges of glare sources, but tends to look more washed out. For example, in this picture, note the contrast ratio of the menu bar in the area covered by the spotlight. Menu bar text is much more readable on the retina MBP.

    http://images.anandtech.com/reviews/mac/retinaMacBookPro/DSC_7464.jpg

    332:

    In which case you have something wrong with the Pro. I've had MacBooks for years and they always wake up quickly. The thing that fails for me is that my cheapass Netgear router sometimes decides not to connect to the 'net, but the Macs are always back within seconds.

    333:

    It depends on what you have going, and on whether the machine goes into deep sleep, which writes out the RAM to the disk drive; when waking from that, you are in essence doing a faster boot.

    334:

    And that's part of my definition of a useable computer: if I can't leave it on for days at a time without it running out of memory or tripping over a kernel panic, then it's not a computer I want.

    Show me this magical OS with no memory leaks or other bugs, one that never fails when used by live meat for random tasks (i.e. not a server where the task can be closely controlled).

    There's a rule. All software has bugs. Including OS software.

    335:

    The funny thing to me is when people use it exclusively in high-end cases with built-in keyboards and basically turn their iPad into a $900 laptop with less capability than a $250 netbook....

    A netbook is not a tablet. I recently gave my tablet to my wife. She's out of town most every week and home on the weekends. I find I keep wanting to reach for it. There are many times where I just want something without an attached keyboard. Sitting in bed or in front of the TV. Running to a client where I don't need a netbook/laptop. Whatever.

    And many iPad keyboards are built into cases and don't add much heft at all. And all for under $150, most under $100.

    I've had the choice of a MacBook (close to a netbook in size) or an iPad to take out the door for about a year, and I really liked having the choice, since what I have planned for the day determines which one I grab.

    336:

    Surely modules on concurrent programming should be scheduled for the same day/time slot as each other? For extra points, assign the same lecturer to both modules!

    337:

    OS X, for a start; I've a Mac Mini that sits on my windowsill and runs iCal, iTunes and Calibre constantly, and occasional bursts of Firefox and sync-to-Android. Only ever been shut down for patches.

    338:

    If they could have, they would.

    339:

    No spec like that ever works in a corporate environment. When the prototype is built, a dozen new use cases appear, a couple more during development, and another half dozen during user acceptance testing.

    340:

    My perspective is that it comes down to good people and decent training. Both at the codeface, and in management. With those, most problems can be solved.

    Similarly, some design problems are tricky, e.g. concurrent tasks. You can reduce the burden on the programmer (I love the STL and Boost libraries, and I think smart pointers are the cat's pyjamas), but it still requires an understanding of what's going on "under the hood".

    I rather like the idea of using several appropriate languages to attack a problem; something OCCAM-like for the top-level parallel sequencing, something C++ like for the single-thread-of-execution stuff, etc. I don't think it's likely, given the problems we have training people to use only one language properly...

    In the wider scheme of things, BASIC and Pascal are minor variations on Procedural Programming, which also includes C.

    Haskell, for example, is a whole different type of language - it's a Functional one.

    A colleague was recently using Haskell as part of his research into stronger typing of HDL interfaces; I had an interesting experience with Standard ML, because our lecturers had developed it (a good way of making your ears bleed from mental effort).

    From the outside, one problem with the Functional languages appears to be that their tool chains are more research-quality (good enough to work once/some/most of the time) rather than production-quality (good enough to work all of the time). See "Glasgow Haskell" for a worthy effort.

    This isn't helped by the concern that you don't make your name in research by reusing another person's language, when you could invent a whole new one - leading to a certain degree of fragmentation.

    (Much Object Oriented programming, at least as practised in C++ and Java, is also procedural.)

    IMHO, the problem with much "object oriented" programming in C++ is that it isn't - it's just the author writing C, and assuming that because they've grouped their variables and functions into some classes that things are now object-oriented (I did it myself until I learned the error of my ways).

    They then wonder why the promised improvements in defect rate and LOC productivity don't arrive, having made a complete mess of the whole coupling/cohesion thing through lack of effort during the design. Yes, you can draw hideously complicated diagrams with boxes and dotted lines and mutter buzzwords like "realizing interfaces" and "aggregates", but that doesn't mean that the design is any good.

    This isn't helped by cries about "Rapid Development" by people who don't understand how lightweight processes are intended to work... (i.e. just because you have lots of engineers doing lots of small incremental changes, and you don't bother with documentation, doesn't mean that you're "Agile").

    Les Hatton (he of the "Safer C" effort, and critic of "Ready, Shoot, Aim" software development) did some very interesting articles on the subject, demanding numerical evidence rather than gut feel. The Freakonomics of software engineering, if you like. His papers are quite readable :)

    http://www.leshatton.org/category/scientific-writing/computing/

    341:

    I'd partly agree with you about OOP. The other issue is that not every system is equally suited to OOP. For example, a car instrument panel will typically contain 1, sometimes 2, meters for a given type of quantity, so you have about as many classes using OOP as you would modules in a procedural system where you put all the functions for each meter in a separate module. OTOH an aircraft register with OOP might start "Things that can fly", split that into "registered aircraft" and "birds and insects", then split "registered aircraft" into "fixed wing" and "rotary wing"... All registered aircraft have certain characteristics in common, say reg_no, mass, fuel type...

    You know the rest; my complaint is about people who think that OOP solves everything.

    342:

    OS X, for a start; I've a Mac Mini that sits on my windowsill and runs iCal, iTunes and Calibre constantly, and occasional bursts of Firefox and sync-to-Android. Only ever been shut down for patches.

    Great. Glad it works for you. Obviously your usage pattern doesn't expose the cruft that would cause you to do restarts more frequently.

    I have earned a majority of my living supporting mostly Macs in small business for over 15 years. They have issues. Some people don't use them in a way to expose such issues but many do. But a once a day restart keeps most of this at bay.

    "bursts of Firefox" Firefox leaks memory like a sieve. I know. I use it as my primary browser. But after a few days with 3 or 4 windows of 5 to 20 tabs each it can get "odd" and take 20 minutes or more to quit if you don't want to just force quit it.

    343:

    Just use multiple inheritance to simplify things :-)

    344:

    I used Firefox for years but have given up on it for those reasons. Having a browser use 1.5GB on an XP machine is just not on. Now I use Chrome.

    345:

    #344 - #341 was an illustration of why OOP doesn't work for everything, rather than a design spec. #342 and #344 - IE has memory leaks too. :-(

    346:

    IE has memory leaks too.

    To repeat, and I don't know where I first heard it; "All software has bugs".

    347:

    I switched to Win7 with 16GB of RAM and all my problems went away...

    *sticks out tongue*

    348:

    Wait, no. Now I have the problem of organizing 50+ open webpages.

    349:

    Well, I didn't say it had to stay up forever. An occasional crash is to be expected (though I'd like the machine to be able to reboot automatically most of the time). There are (or at least have been) such computers. One Unix computer that a friend of mine and I worked on was quite stable; when the product division that made it failed, my friend bought one of the last units and kept it for years after. I believe the uptime record for it was 140 days.

    350:

    Yes. These days it is mostly in the use. I have an office I support with 20 CAD users. If they restart at the end of the day, hardly anyone ever crashes. If not, they'll usually lock up within a week. But many times they have 3D CAD, a browser or two, various Adobe products, MS Office, a vertical-market time billing package, email, FTP, etc... all going at once. All on Macs.

    We have a collection of Minis and XServes doing server work. They rarely lock up and are on 24/7. But I'm sure segregating functions to individual computers makes a big difference here.

    As to why the separate computers: this office normally has people starting at 6 or 7 AM, many nights working until 7 PM, and several times a month you'll see folks working to midnight or later. And the owners are totally focused on productivity. So with things segregated, futzing with email doesn't interrupt file sharing, which doesn't interrupt the internal/external cloud site, etc... I know with virtualization we would eliminate some of this, but not all of it. And Mac Minis are very cheap and small. And if you put an SSD in them, also fast.

    351:

    This about what you had in mind?

    class ScrewUp : public Object {};

    class Fuckup : public ScrewUp {};

    class Chaos : public Object {};

    class CircularFiringSquad : public Chaos, public Unsynchronized {};

    class ClusterFuck : public Fuckup, public CircularFiringSquad {};

    352:

    Chrome isn't a lot better. With 40-50 open tabs in a single window, Chrome takes up over 2 gigabytes on my MacBook Pro. And if I close anything else to get more room, Chrome eats whatever I free up.

    353:

    SIGH, all of what you say is very true. On top of that, the tool chains for procedural languages are still largely engineered for efficiency and optimized code generation; the IDE user experience is not much better than it was 10 years ago. There is a lot more we could do to assist the programmer by taking over more of the routine, boilerplate tasks, letting the computer do the tedious and (to a human) error-prone work, and letting the human do the things a computer (even an advanced AI of today's technology) can't.

    354:

    "I'm older and also tired of it."

    It's funny. I WAS tired of it. I got hacked off with the deliberate invalidation of my experience base by the industry and their "You've not used Microsoft Widgets eleven-point-five-nine?? No job for you!" thing.

    But it turns out that if you look in the right places there is a world out there where the change isn't a meaningless treadmill... And it's bonkers. Scale that's scary because.. humanity can USE IT.

    Yeah, we don't program them in things like BCPL any more. But curiously, a lot of the people who program them in various languages -- some of which you've never heard of -- are the same people who used to think 32k was quite a lot of memory for a whole computer and now think it's a not unreasonable size for a quick sync message between distributed processes.

    It's like having an engineering degree from 1927, but having kept up enough that you get to pilot moon-rockets.

    355:

    In the spirit of how hard it is to get computers/tablets/phones/whatever "right"...

    Charlie

    Is any of your money missing at RBS or its partners?

    356:

    Programming is the subtle art of turning silk purses into pigs' ears. (Except with Javascript it's not so much a silk purse as a whoopie cushion.) It really is trickier than it looks to the layman.

    357:

    I really like that line about the whoopie cushion. That was very much my experience of writing Javascript; there was always a surprise when I sat down to write it.

    358:

    Off topic, but exciting: UPS tracking reports my copy of The Apocalypse Codex is en route and will arrive by end of business this coming Monday. There will be celebrations.

    359:

    For example, a car instrument panel will typically contain 1, sometimes 2, meters for a given type of quantity, so you have about as many classes using OOP as you would modules in a procedural system where you put all the functions for each meter in a separate module.

    Say you had a data bus that gave you access to measurements; in a procedural language, you would either have multiple sets of cut-and-paste "sensor reader" functions in each meter, or you would have a common set of "sensor reader" functions, but multiple sets of calls to them and no guarantee that every client calls them in exactly the same way.

    Simplification I know... but if you find yourself using a type in a procedural language, that's a big hint you would probably replace it with a class in OO.

    OTOH an aircraft register with OOP might start "Things that can fly", split that into "registered aircraft" and "birds and insects", then split "registered aircraft" into "fixed wing" and "rotary wing"... All registered aircraft have certain characteristics in common, say reg_no, mass, fuel type...

    Don't confuse "explanation of inheritance relationships" and "generic teaching examples" with good object-orientation :)

    One of the problems is beginners tend to use inheritance for everything. Inheritance (only one part of OO) is for modelling "is a" relationships, not "has-a" relationships. They also try to model the world, rather than just the bits they need. For an engine system, you probably don't need to model whether the aircraft lifting surfaces are bolted on or whirling around - if you see that explosion of inheritance, it's a hint that the designer is inexperienced.

    So; for the example above, you might bundle the common functions into a base class that was "sensor reader", and then have "fuel state" is-a "sensor reader", as well as "fuel temperature" is-a "sensor reader". The outside should only see "tell me the fuel state". If you're feeling ambitious, you could then hide away other design decisions - how often do you read the sensor? How many "fuel state" objects are you going to allow to be created?
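
    A minimal C++ sketch of that design, with all names and numbers invented for illustration:

    #include <iostream>

    // Sketch only: the shared read path lives in one base class, so
    // every meter talks to the data bus in exactly the same way.
    class SensorReader {
    protected:
        double readRaw(int channel) const {
            return channel * 1.5;   // stand-in for the real bus access
        }
    public:
        virtual ~SensorReader() {}
    };

    // "fuel state" is-a "sensor reader"; the outside only gets to ask
    // "tell me the fuel state", never how the sensor is read.
    class FuelState : public SensorReader {
    public:
        double litresRemaining() const { return readRaw(3) * 40.0; }
    };

    int main() {
        FuelState fuel;
        std::cout << fuel.litresRemaining() << " litres\n";
        return 0;
    }

    Change the body of readRaw() and every meter picks up the change, which is the whole coupling/cohesion argument in miniature.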

    Another depressing thought is the impact of commercial pressure (insufficient staffing and thus compressed timescales / reduced training) and risk aversion on the introduction of new techniques...

    There's a team in Edinburgh who were auto-generating code from UML over a decade ago (using a tool called Rhapsody, created by i-Logix, since bought by Telelogic and now IBM), and they're still doing it AFAIK. A team of 10 took two years to produce something that had taken a team of 30 four years. Yet you still hear engineers announcing that "autogeneration will never work". Or that "you shouldn't use exceptions in C++", normally just before they write code where they use raw pointers but don't check for non-null values before use, or don't bother checking return codes. Look, a dinosaur :)

    http://exceptionsafecode.com/

    That reminds me - another differentiator between good and bad coders is the imagination required to think through "what might go wrong", and not to write Pollyanna code that assumes everything will work - but to write code that will detect errors, and either recover or tell you what's gone wrong. First rule of embedded code - no crashes are allowed.
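
    A deliberately trivial, invented illustration of the difference; the defensive version costs one null check and a decision about how to recover:

    #include <cstdio>

    // Pollyanna version: assumes the pointer is always good.
    double pollyanna(const double* reading) {
        return *reading * 2.0;               // crashes if reading is null
    }

    // Defensive version: detects the error and recovers with a default.
    double defensive(const double* reading) {
        if (!reading) {
            std::fprintf(stderr, "no reading available, using 0\n");
            return 0.0;
        }
        return *reading * 2.0;
    }

    int main() {
        double r = 21.0;
        std::printf("%g\n", defensive(&r));  // 42
        std::printf("%g\n", defensive(0));   // falls back instead of crashing
        return 0;
    }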

    PS Dirk @ 343

    Just use multiple inheritance to simplify things :-)

    You should have heard the whimper when I confessed that I'd used multiple inheritance to solve a problem we had in our code base. Diamond-shaped, no less :)

    ...Eppur si muove...

    360:

    Or that "you shouldn't use exceptions in C++"

    Ah yes. That's usually the point where I start considering that the speaker is lacking in proper clue.

    In my opinion, it's not that you shouldn't use exceptions, it's that catch blocks are way, way, overused. You need some. But not many.

    (And code that has explicit delete calls in? Unless in an RAII class, that smells.)
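
    A bare-bones sketch of the RAII point (invented example, C++03-style): the destructor owns the cleanup, so callers need neither an explicit close nor a catch block on the normal path.

    #include <cstdio>

    // RAII: the resource is released in the destructor, so cleanup
    // happens automatically even if an exception unwinds the stack.
    class FileHandle {
        std::FILE* f_;
    public:
        explicit FileHandle(const char* path) : f_(std::fopen(path, "r")) {}
        ~FileHandle() { if (f_) std::fclose(f_); }  // no explicit close by callers
        bool ok() const { return f_ != 0; }
    private:
        FileHandle(const FileHandle&);              // non-copyable
        FileHandle& operator=(const FileHandle&);
    };

    void use(const char* path) {
        FileHandle fh(path);    // no try/catch, no delete:
        if (!fh.ok()) return;   // the destructor runs on every exit path
        // ... read from the file ...
    }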

    361:

    That is an issue with your MacBook, not with laptops in general. Also, getting out of sleep with an HD isn't nearly as fast as with an SSD.

    Having to reboot a laptop now and then isn't really a problem (besides which, Win7 has no issues running without reboots for weeks); all you need to do is hit "reboot" instead of "sleep" when you turn it off once in a blue moon. You lose several orders of magnitude more time "typing" on a tablet compared to a real keyboard.

    362:

    besides which, Win7 has no issues running without reboots for weeks

    Again, it depends on the use. Some people manage to expose warts in Win7 such that they should reboot every night. Others can go a month or more.

    It all depends on how your use runs into existing bugs.

    363:

    To add to the reboot thing - you also have to look at it from a tablet use-case perspective. Sure, if you use CAD, 50 Chrome tabs, a music player, office, etc., stuff can get a teeny weeny bit unstable after a while.

    However, you do not really do stuff like that on a tablet, simply because you can't. Not even close. So I do not think it is a fair comparison. With a more "moderate" use case for a PC you will still have far more options than with a tablet and no issues with OS stability.

    364:

    I get my iPhone 3GS (2 models removed from current) and iPad v1 into a state where I reboot them about every week or three. These older models are memory-limited enough, and I suspect some apps leak memory enough, that over time they just get sluggish. Terminating most running apps followed by a reboot brings things back to normal.

    365:
    With a more "moderate" use case for a PC you will still have far more options than with a tablet and no issues with OS stability.

    So my use of my computer is constrained by the low quality of the "professional" products I use? Having been responsible for building some of those products, and having used some that are of reasonable quality, I find that unacceptable. Rather than tell me I'm asking too much of my computer, tell the software application and OS vendors that they're not asking enough of their designers and QA testers.

    366:

    A complete aside, but if we're talking about non-traditional computer experiences which rock then the Wii deserves a solid mention. Growing up I had a few computer gaming options and my dad (born 1958) embraced several of them, but it took the Wii to engage my mother (born 1960) to the extent that she had to buy her own console when my brother moved out. She could never program the VCR, but now she tells me how to use gaming consoles.

    The Wii, I think, was engaging the general public well before the iPad existed (and quite possibly laid the groundwork for the iPad's success, to the same extent Lego prepared me for Ikea). And it performed this feat by ignoring all the conventional wisdom (that it was all about pixels and performance).

    367:

    I reboot my 3GS about once a month, though there are a few apps that seem to get their shorts twisted up more than others, and crash back to the OS.

    368:

    IMO there should be a special license that allows people who've demonstrated their understanding of OO design to use multiple inheritance. There is a class of problems that it works well for, but most people who use it don't seem to know what that class is, or that the Java interface and the Objective-C protocol were invented to solve those problems without multiple inheritance.

    The concept of inheritance is more complicated than most C++ or Java programmers are ever taught or understand. First of all there are several kinds of inheritance, and they're presented in different ways in different languages. There's implementation inheritance, which comes in 2 forms: classes and prototypes, and there's interface inheritance, which is mixed up with class inheritance in some languages. Explaining these things in language-neutral terms is important if programmers want to be able to use them well in more than one language, or even to recognize how the choice of language will affect the design and coding of the inheritance structures.
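
    A toy C++ rendering of that distinction (names invented): a pure-virtual class plays roughly the role of a Java interface or an Objective-C protocol, carrying only a contract, while ordinary derivation hands down an implementation.

    #include <iostream>

    // Interface inheritance: only a contract, no code is inherited.
    class Printable {
    public:
        virtual void print() const = 0;
        virtual ~Printable() {}
    };

    // Implementation inheritance: derived classes reuse actual code.
    class Named {
    public:
        explicit Named(const char* n) : name_(n) {}
        const char* name() const { return name_; }
    private:
        const char* name_;
    };

    // Widget inherits an implementation from Named and a contract
    // from Printable -- two different kinds of "inheritance".
    class Widget : public Named, public Printable {
    public:
        explicit Widget(const char* n) : Named(n) {}
        void print() const { std::cout << name() << "\n"; }
    };

    int main() {
        Widget w("flux capacitor");
        w.print();
        return 0;
    }

    Note that Widget multiply inherits, but only one base contributes code; that's exactly the pattern interfaces and protocols were invented to make safe.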

    Not use exceptions? You might as well ask me to tie my hands behind my back. I find, especially writing server-side code or embedded code (Tektronix sold a line of oscilloscopes whose firmware ran a Smalltalk virtual machine, and another that used C++), that about half my code, measured by LOC, is exception handling in one way or another, and that getting the exception structure for an application right is almost half of the detailed design work. If I don't put that much effort into it, it's guaranteed to come back to bite me.

    Code generation is another interesting issue. I've been using and building code generation tools (and partial evaluation tools as well) since the 1980s, in C, C++, Smalltalk, Java, Lisp, and Python. I find source generators fairly easy to design and maintain, machine code generators somewhat less so. I know a lot of people still don't trust them, but they've been a standard part of the software toolkit for a long time (compilers wouldn't be what they are today without them), so it's time to accept that as programmers we're not routinely in charge of every bit of the code the machine runs on our behalf.

    Oh, and the Java generic and C++ template mechanisms interact strongly with inheritance and make it more complicated. Using them well (beyond creating type-safe collections or simple Facade patterns) is an art that very few programmers know.
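
    One well-worn example of that interaction, sketched with invented names, is the curiously recurring template pattern, where a class derives from a template instantiated on itself so the base can supply behaviour in terms of the derived class:

    #include <iostream>

    // CRTP: the base class template knows its derived type at compile
    // time, so it can provide shared behaviour without virtual calls.
    template <typename Derived>
    class Comparable {
    public:
        bool operator!=(const Derived& other) const {
            // defined once here, in terms of the derived class's operator==
            return !(static_cast<const Derived&>(*this) == other);
        }
    };

    class Version : public Comparable<Version> {
        int n_;
    public:
        explicit Version(int n) : n_(n) {}
        bool operator==(const Version& other) const { return n_ == other.n_; }
    };

    int main() {
        std::cout << (Version(1) != Version(2)) << "\n";   // prints 1
        return 0;
    }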

    Maybe it's time for me to write another rant about software in my blog. Looking back over this thread I can see the steam coming out of my nostrils; I guess it's time to breathe a little fire.

    370:

    News of the day - Google shoots itself in the foot again by failing to include a microSD slot, when it's one of the key advantages they have over Apple. They also play differential pricing games to screw over non-Americans.

    371:

    No differential pricing: $200 = £128; add 20% VAT and we get £153.

    372:

    That seems like you're replying to MickeySh@ft fanboi-ism.

    IME, their view is that, if you can crash MickeySh@ft stuff, you're a "bad user", and if you can crash anyone else's it's because "it's not as good as MickeySh@ft products".

    My answer to that is likely to be to show them a Grapher chart using 1 X axis, 4 Y axes, and different data interval markers on each data series, then ask them to do something similar in Excel!

    373:

    I wouldn't argue with that.

    I remember when the Wii first came out, there was much talk of third-generation consoles and whether the Wii deserved that dubious accolade or not, with many arguing that the Wii's modest graphical capabilities disqualified it. From where I was sitting (with a couple of pre-teen boys merrily waving their motion-sensing controllers around along with their grandparents) it was a no-brainer: more pixels, more polygons, and more colors did not a new generation of console make; a new interaction paradigm, however, most certainly did! The Xbox 360, PS3 etc. were just more of the same, while the Wii was something new and different...

    374:

    I'm replying to anyone who argues that my expectation that personal computers of the current generation should be able to maintain the uptime characteristics of Unix microcomputers of 25 years ago is unrealistic. Anyone making that argument is either disingenuous or a fanboy for badly-made products, whether they come from Microsoft, Apple, Dell, or wherever.

    375:

    is either disingenuous or a fanboy

    Or just realistic in using the tools available today. Not that we're all happy with the situation.

    But on the other hand, does anyone have stats on the lines of code in a Unix system of 25 years ago vs. today, with and without the GUI? (That would be 1987, so we're basically talking Apollo or Sun in terms of a widely installed base of GUI Unix.)

    376:

    #374 and #375 - Well, I've got a network running SunOS 4 with a mixture of OpenWindows and Motif GUIs that can manage months of non-stop uptime.

    377:

    David @375:

    >>"is either disingenuous or a fanboy >Or just realistic in using the tools available today. Not that we're all happy with the situation."

    No, no, no. I run a couple of Debian systems (testing, not stable) for private purposes. One for the classical PC role, one multimedia station and one headless home server, plus a laptop.

    All of them NEED to be rebooted when updated to a different kernel. Otherwise they just go to S3 sleep; they don't need to be rebooted. Period.

    I sometimes do reboot the laptop (for obvious reasons) and the PC because that doubles as my gaming rig.

    378:

    I run a couple of Debian systems (testing, not stable) for private purposes.

    Gee. So the first thing a user should do is pick the OS even if it doesn't support the applications needed to compete in the markets in which the business competes?

    Specialty CAD, vertical market accounting and task management, etc... Plus the need to trade files hourly at times with other firms. None of that should matter in the selection of a platform for an office?

    I don't see the point.

    379:

    Fact - all para 2 does is state some buzzwords. It does not identify the applications you use, or why you need specialised Windoze-only applications.

    380:

    Price segmentation on the built-in storage, and no µSD expansion slot: damn them.

    381:

    From a US perspective.

    OK. Fine. The goal of a business, and I'll pick one I'm familiar with, architecture, is to earn a living for the owners of the firm and its employees. So the firm has to do business in a way that makes them money. Enough to meet payroll, expenses, and capital for future needs. And keeps business coming through the door.

    So these architects can either 1) pick a computer for purely technical IT reasons and deal with the architectural business issues as they come up, or 2) decide the types of projects that they as architects want to pursue and pick perfect computers that let them do that (which do not exist), or 3) something in the middle.

    Most smaller firms (under 50 employees) tend to go with something between options 2 and 3. They are too small to dictate to the construction industry in terms of workflows and file formats (for the most part), but nimble enough to do things as needed.

    Now they have to make hiring decisions. Hire good architects, or good computer people? Now they'd love to hire great architects who can build their own computers and never have any issues with file conversions or choices of CAD software. But that isn't reality. So for the most part they hire good people as architects who aren't computer-stupid. Sometimes they get a computer turkey or a wizard, but in general that's not the case at a good firm.

    Now if you want to work with contractors on projects big and small, you have to live in their worlds. And in the US, if you want to deal with governments federal, state, and local, that almost always means trading files in DWG (AutoCAD) formats. And MS Office formats. (Not just simple .doc but password-protected, macro-filled things.) And if you don't want to do this you really restrict the kind of work you get to do. OK for some high-profile boutique firms, but not for all. So now you have to make computer decisions based on a limited set of software. First you need to trade files in DWG formats, and 90%+ of the time deal with 3D. And second, MS Office. So now you're down to less than a dozen programs that run on Windows or Macs. MS Office is also Win or Mac. (And no, Open/LibreOffice doesn't cut it unless you're into principled stands on file trading or have enough prestige to win all trading debates.)

    Toss in Adobe for marketing, and (ex-Google) SketchUp for marketing and initial 3D designs, and you're more locked into Win and Mac. This is what the students have learned coming out of school, and while you can use other things, again, is the point to make a principled stand or to stay in business?

    And yes, you can run a lot of Win stuff with Wine and whatnot, but at the end of the day do you want your staff doing as much architecture as possible, or bit-fiddling to convert files as they work with other firms, or trying to figure out whether the REVIT errors are in the programs or due to something Wine has not yet implemented? For most firms this size doing smaller commercial work, each project can involve anywhere from 5 to 30 outside firms. More on larger and/or specialty work. And you get repeat business based on your reputation. I've seen firms with good architectural skills go under due to being such a PITA to work with on projects.

    382:

    no µSD expansion slot

    These things add cost and failure points, and cost adds to the minimum price. So what do you want to remove from the base product instead? Built-in memory? OK. But then ....

    One thing Apple has taught more and more product builders: eliminating failure points adds to long-term profits.

    Whether us nerds like it or not.

    383:

    Hence real AI that can actually understand what the user wants, e.g. "I want a ...

    At which point the real AI says "fuck off, EastEnders is on," and puts its metaphorical feet up on the coffee table while it watches a soap opera.

    Srsly, nobody wants a real artificial intelligence, complete with its own messy motivations. What we want is a do-what-I-mean button. Not the same at all.

    384:

    I believe the uptime record for it was 140 days.

    The uptime for the Debian server this blog is hosted on is about to hit 563 days (in the next half hour). It is not exactly under-utilized ...

    385:

    I would probably skimp on the Tegra 3; it doesn't seem too useful outside of games. I have no idea about this failure point wisdom, but it could just as well translate into a bit of extra attention to mechanical properties at design time, rather than failures down the line.

    386:

    I would probably skimp on the Tegra 3; it doesn't seem too useful outside of games.

    Games are one of the main product differentiation points for tablets. So skimping on performance would appear to be rather unwise.

    The failure point thing: I think you're focussing too much on feature bullet points, and not enough on the overall user experience.

    Sure, not having a microSD slot annoys those folks who like swapping tiny sub-fingernail-sized memory cards in and out of their shiny. But my take is that Google -- like Apple -- consider tablets to be machines that keep their data in the cloud -- if not 100% of the time today, then certainly once gigabit wifi and LTE 4G data running at over 40mbps becomes normal. At which point the micro-SD slot becomes an annoying irritant, because ultimately the tablet's own storage is little more than a cache for cloud-hosted data and apps.

    387:

    Way back when I used to work for Digital's HPC/Unix group in Galway (Yes, we sold C shells by the sea shore) .. we had an interesting take on the uptime brag.

    As OS developers we ran Unix on our desktops. We'd keep them up and - pride was at stake - even went to the point of editing a running kernel on the mail server in a debugger to work past known bugs (shudder). At times trolleys with UPSes were known to be deployed when servers needed to be moved.

    But we had a nemesis: the annual maintenance downtime, when the buildings office was allowed to disconnect the power for a weekend of essential work each year. So a new metric was needed: what percentage of the workstations kept full uptime between annual enforced downtimes.

    I believe we got 95% one year...

    388:

    Uptime?

    We had a server. Its uptime passed 1000 days, and then we moved offices, so we shut it down. We brought it back up again after the move, and its uptime was getting on for 400 days before it was then stolen.

    And that was PC class hardware.

    Its replacement ran Windows rather than Netware.

    389:

    David L. @381:

    I think I understand your goals and some of your choices. Sometimes you can't escape bad choices, because all the others are worse for a given set of problems. I see no reason to grow complacent about them, however.

    It needs to be said that some people obviously have little experience with anything else (which is o.k.), because otherwise they'd be aware that several weeks or months without a reboot are the rule rather than the exception for all OS families that are not MS or Apple. For some types of machines that utilize a special variety of virtualisation, uptimes on the scale of years are the order of business.

    It's not meant as an insult to its users if I say that I think an OS with a bunch of applications that doesn't even try to run without reboots is, well, I'll just go ahead and not sugar-coat it, CRAP.

    My life is going to end with an aneurysm, I can tell, because every time I hear this strategy suggested even if it's only for a custom application with a couple of hundred users, and only restarts one application server, I feel like I'm about to blow a gasket. It's a mindset that WILL lead to disaster sooner or later. To me it's like saying, yeah, something is clearly broken, but let's just gloss over it and hope it doesn't blow up before we can get out of here.

    As an architect if your client's house had cracks that kept getting more and bigger, would you just tell them to repaint it every couple of months?

    390:

    Robert, not all apps fit into the memory card (some of the bulkiest, such as the crapware on my Virgin Mobile phone and, of course, Google's own apps, will not go).

    Even if the app can be moved onto the memory card, not all of it will go, and the remainder left in primary memory can be > 50% of the app's original size. Hence, the importance of the primary memory size in Android.

    Being of Scots-Irish stock, I think my genes still remember how to avoid being spendy, yet when it came time to recommend which of the new Google tablets to get for mine vrow, I immediately abandoned all thought of economy and went for the max-memory version, even if it did require an additional 50 American pesos.

    391:

    I have just pre-ordered a Nexus 7. Having only 8GB does not bother me in the least, since almost all the time I expect this to connect to wifi somewhere, and if I want some media I will download it temporarily from home. Having an always-on NAS box attached to my 60Mb/s Virgin router for that purpose is no problem. In fact, 8GB for books and some music would keep me going for weeks of disconnection from the Net. As for apps, I am not interested in collecting GB of them. The major uses are going to be email, skype, browsing and maps/GPS. Being able to slip it in a pocket rather than have it in a bag over my shoulder is a big plus. Having used 10" tablets I have come to the conclusion that they are not conveniently portable at all.

    392:

    because otherwise they'd be aware that several weeks or months without a reboot are the rule rather than the exception for all OS families that are not MS or Apple.... It's not meant as an insult to its users if I say that I think an OS with a bunch of applications that doesn't even try to run without reboots is, well, I'll just go ahead and not sugar-coat it, CRAP.

    Except the point of most businesses isn't to NOT reboot their computers; it is to earn money. Say workers on some relative 1-to-10 scale are productive at a level of 9 with computers that need a reboot at the end of the day, and maybe one during the day every two or three weeks. If a computer that never needs a reboot drops their productivity to a 6 or 7 or worse because of the software options, then the business owner would be dumb, or taking a moral stand, to go with the never-reboot option.

    We computer nerds tend to look at the wrong metrics. Reboot times, bits per second, whatever: they don't really matter until translated into prints per day, hours per job, etc...

    And this thinking is why a lot of business types hate us IT guys. We never address real issues in their minds.

    393:

    When businesspersons talk about rebooting computers, my first question is usually "why are you running them all night?". Sometimes there's a valid reason, but often it turns out that they're just assuming the things magically don't consume power when the chair in front of them is empty.

    394:

    They are supposed to last longer if you don't regularly thermally cycle them. HDDs particularly; like a car, a disproportionate amount of wear occurs when you first start them.

    395:

    why are you running them all night

    Usually they aren't running. They are "sleeping". Most computers are set at the factory to go to sleep after a time limit. And most modern computers are pretty miserly with power these days when sleeping.

    Plus not turning them all the way off allows them to be updated and backed up as needed without disturbing the troops so much.

    396:

    They are supposed to last longer if you don't regularly thermally cycle them. HDDs particularly, ... disproportionate amount of wear occurs when you first start them.

    Used to be, but not so much today. Disk drive platters today weigh ounces, not pounds like 10 or more years ago, so the start-up energy is nowhere near what it used to be. Plus, with the lighter weight there isn't as much G-force shock as there used to be. Nowhere near as much inertia.

    From what I see these days, based on dealing with failures over the last 30 years, thermal shock is the biggest killer. Cold to hot to cold and so on can break connections in chips, especially with the incredibly dense microcircuits of today. I suspect that very slight imperfections in manufacturing allow hot spots to occur, which burn out some chips. And given the short lifespan of any one particular chip design, these issues are never addressed except in designs 2 or 3 generations removed. Assuming a manufacturer of chips gets back enough failures to even try to figure out what went wrong.

    397:

    As an architect if your client's house had cracks that kept getting more and bigger, would you just tell them to repaint it every couple of months?

    Interesting comparison. I recall reading of some rainforest tribes who'd come up with a good way to deal with pest infestations. It seems that when you build a house in the Brazilian jungle insects almost immediately start colonizing the place; within a year there's a large population in the thatched roof and in two years there will be things with too many legs eating the structure and falling out onto folks below. The locals do the rational thing and reboot the house. A replacement is built a little way away and the old instance is burned, along with as many critters as possible.

    Nobody's managed to battle computer bugs with fire, but it would be very satisfying...

    398:

    As an architect if your client's house had cracks that kept getting more and bigger, would you just tell them to repaint it every couple of months?

    No. But the analogy isn't really the same.

    If they had an air conditioning system that would trip and refuse to run after more than 6 hours of continuous operation, it would be a reasonable choice to limit it to a 5-hour continuous run with a 1-hour break instead of replacing it with a more expensive unit.

    399:

    Since this thread has devolved considerably, here's another dumb question from someone who's completely ignorant of the Apple line of products: What are the big differences between the Microsoft OS and the Mac OS?

    What I'm getting at by "difference" is: isn't every operating system all about making calls to various un/reserved memory locations, I/O ports[1], etc.? But to hear some people go on, you'd think there was an unbridgeable chasm between the two ;-) I did VAX/Unix/Microsoft support back in the day, with JCL/JES2 on the VAX with CMS, if that helps place me on the Map of Ignorance.

    [1]I got out of the IT game just before USB ports became a common feature on every home computing device.

    400:

    16GB by the end of the month, so now's the time to cancel that order and re-order the ....

    " GameStop announced that buyers can now preorder a 16GB Google Nexus 7 Tablet. The video game retailer will accept trades of video games, old gaming systems and Android Tablets, iPads, iPod Touches and some phones. Users with old gadgets lying around may want to consider letting GameStop take them in for a discount on the new Android 4.1 Jelly Bean tablet.

    Google announced the ASUS made Nexus 7 Tablet at this year’s Google I/O 2012 Keynote.

    The tablet still costs $249 at GameStop, minus any credit for traded in games or devices. Shoppers who trade in devices or games when pre-ordering will receive a 30% bonus on the trade in value. The tablet is available to purchase without any trades, but it won’t ship until July, when the rest of the pre-orders go out. The Google Play store offers the Google Nexus 7 tablet for $199 with 8GB of storage as well as the 16GB model sold by GameStop."

    Though I gather that it's on sale in the UK at ... "Google ASUS Nexus 7 Tablet PC - 16 GB - £189.99 delivered with code @ Currys. Use code ABER5 at checkout to pre-order the 16GB version of the Nexus 7 for £189.99 delivered from Currys. Also 5% Quidco on tablets this weekend only (1.5% Topcashback).

    Product details Enjoy a whole new experience with the stylish and feature-packed Google Nexus 7 Tablet PC, for fantastic technology that brings tablets into the next generation.

    Full of Beans

    Take advantage of a whole new operating system with the Nexus 7. This great-looking tablet features the new Android Jelly Bean operating system, so you can enjoy the new Google experience.

    The Nexus is the world's first tablet to feature the new Jellybean operating system, making it truly unique.

    Unprecedented power

    Another technological first, the Nexus 7 is the first 7" tablet that boasts a quad core processor at its heart. This impressive tablet features a NVIDIA Tegra 3 processor, offering you speedy processing when you're out and about.

    There's no underpowered graphics either, as this tablet has a GeForce graphics card, providing 12 core graphics processing.

    Vibrant visuals

    The 7" multitouch screen is made with Gorilla Glass which not only offers scratch resistance, but also features viewing angles up to 178° wide. This makes it perfect for sharing, and with the 1280 x 800 resolution, you can be sure that whatever you're using your tablet for it will look superb.

    NFC support and Android Beam

    You are treated to another world first with the Nexus 7 which boasts NFC (Near Field Communication) support, the first tablet to support this technology. NFC works by touching supported devices together or bringing them into close proximity for sharing. "

    I'm tempted. I have been looking forward to a tablet to use about the house on my PC's wifi net and as an ebook viewer ... that has GOT to be better than the Wet Newspaper View of the Kindle Touch that I got to play with a little while ago. I can see the attractions of the Touch for the commuting Reader, but that ain't me these days, and so I've been anticipating the appearance of the Kindle Fire 2, in the highest of its many rumoured specs, at the rumoured third-quarter launch point.

    The appearance of the Nexus 7 does rather seem to light a fire beneath the Fire 2, doesn't it?

    It's too late in the day to magically produce a leap upward from the Nexus 7 spec so it will all turn on what the Kindle Fire 2 has already been specified to do.

    I'm not in the mood to pre-order a tablet just yet given the speed of development in the field ... a 10" Kindle Fire TOO would be nice at, say, £200 with a Hi Definition Retina Screen and 32GB memory ... and so on, and so forth, and, oh, say, £50 worth of ebook downloads .... and Sparkly Unicorns with added Pixie Dust.

    401:

    What are the big differences between the Microsoft OS and the Mac OS?

    Much more different in the past than now. Most of what I'll mention is based on the past and has evolved into not so much of a difference.

    Macs dealt with input devices and displays much more nicely than Windows. You could put up 6 displays on a Mac back in the early 90s, and arrange them any way you wanted without effort. I had three on my desk at home back then; now just 2 (for me) for the last 10 or so years. Individual windows on a Mac could span displays since, well, forever, without the system getting teed off that you had a window partially on a 24-bit color display and partially on a 1-bit B&W display.

    Driver issues on Macs were and still are rare. Except when a brand new model of a Mac came out, you could basically take a boot drive from any Mac to another. Buy a new Mac? Clone the boot drive over. Very handy in business setups. Want to give Joe Mary's computer since Mary left? Just swap out the hard drives. You could go thin and have your system set up with only what was needed, but why bother? Megs and gigs are free.

    USB devices worked better earlier on Macs. Especially devices that followed general standards.

    File extensions were not needed to figure out what program owned what file types on Macs. Although that has been an evolving topic on both platforms for decades as file systems and OS's evolved.

    Much of the debris/details of computers was/is more hidden on a Mac than on Win. In general. Although this difference has narrowed greatly.

    Fonts and typographic controls on Macs were for decades head and shoulders above Win. Close now. On DOS/Win much of this was up to the apps for a long time, where the Mac tried to take care of it for developers. Sort of. Most of the time. :)

    Printing to supported printers was in general easier on the Mac side.

    In general plug and play just happened on Macs with less hassle than on Win.

    On the Win side. Once we got to Win 95/98 memory management and protected memory was done better on Win than Mac. Mostly. Sort of. Most of the time. Depended.

    More programming tools on the Win side and a bigger developer community. And in general a better structure for developers to write a driver when needed. So as you got away from the basic devices, it was easier to find things supported on Windows than on a Mac. That specialty tablet or power interface might never show up on the Mac side; if the Mac didn't have the basic hooks for it, it might never show on a Mac. A web cam or camcorder that followed published standards would just plug into a Mac and work, but many of the lower-cost items did not follow such standards and instead came with a driver for Windows.

    On Win in very rough terms each window is a separate instance of an app so closing a window quits an instance of a program. On a Mac an app is either running or not. Closing windows does not quit the app. This is changing somewhat on the Mac side with Lion and Mountain Lion. And the two styles lead to different work habits. As someone who used early versions of both systems I evolved to like the Mac way better than the Win way but can deal with either.

    For decades on Macs if you plugged in a disk storage device it usually just worked. On Win drivers typically had to be installed which is why so many storage products had a built in micro drive with drivers that would auto install when a device was first plugged in. Much of this has gone away as Win includes more and more generic drivers for various things.

    Behind the scenes, Win systems management with AD is much more advanced than what Macs have. Macs are no slouch, but Win is much further down this road. Which comes from MS attacking the business market and picking up the consumers along the way. Apple followed the opposite path, which led to small market shares in the old days but has MS all in a snit of late.

    Overall I still find Macs more fluid to use than Win. Personal taste and all that.

    And due to smaller market shares and different systems design Macs have had a mostly free ride on the malware front. This is changing. The question is how much how soon.

    Printing to a PDF file is built in on a Mac. Has been for a long time.

    To be honest if you're really used to one setup or the other then switching just to switch can be painful as your muscle memory will take a while to adapt. Ctl-C vs. Cmd-C to copy and such.

    And for a variety of reasons most home and small business folk seem to turn most Win systems into junk in 1 to 3 years with all the added crap and conflicting software they and/or their vendor install. Macs, not so much. Although I have a Mac sitting here now that I have to clean up after a home user managed to bollix it up into a fairly odd mess.

    Price points on NEW Macs tend to be at the upper end of the pricing scale. But not so much if you compare equivalent devices from major brands. There is NOT a clump of cheaper stuff made by what many might consider lesser brands. But on the flip side, many people get longer life from a Mac, and they tend to sell for a decent price on the used market even after 4 or 5 years.

    A lot of it is personal preference. Where Macs are gaining for the last few years is with people who have no interest in anything beyond surfing web sites, visiting Facebook, maybe opening a Word doc, printing a bit, etc... They are tired of the driver installs and the virus update of the hour and want something simpler. For those who want to open up the hood with a wrench in hand, a Mac may not be the way to go.

    For years I'd have people tell me it was dumb to buy a Mac because I couldn't go pick out the perfect video card or some such. To be honest my life would not have been changed by a video card that was 20% faster on rendering the latest upgrade of Halo. And neither was most people's. Which is why Apple has been on a tear for the last 10 years.

    And the single biggest difference, at least in the US and in some other parts of the world, is the stores. You can haul in your 10-year-old laptop that will not boot and they'll look it over and give you an analysis of what's wrong and what your options are. Many times on the SPOT. And many times they'll fix the issue at no charge. And you don't feel like the Geek Squad guys are going to use it for goal post practice when you leave the store. Come in and buy a new computer? They'll transfer your data from the old one, Mac or Win. Bought an iPhone and can't figure out how to make it sync to your Win computer? Bring it in and they'll work it out for you. Want extra help? When you buy a Mac, for an extra $99 you can sign up for 1 hour a week to talk about most anything with someone at the store. Bring in your computer if you want and say "how do I do this" and they'll work with you. 52 sessions for $99. The stores are the secret killer app. And I just read that MS is planning to open 75 more in the US. I wonder why.

    Anyway, too much typing. Others will chime in with different opinions and praise or shred mine.

    402:

    ScentOfViolets @399: "isn't every operating system all about making calls to various un/reserved memory locations, I/O ports[1], etc?"

    Sounds like you're talking about a kernel. IMO an OS is a kernel with some essential libraries, services and applications, but opinions on this vary (and there is substantial difference of opinion on what a kernel's purpose should be, too).

    The Mac OS X kernel is a modified BSD kernel, and OS X is POSIX compatible.

    MS OS products, not so much. There are POSIX implementations, but they are more painful to use.
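    (To make "POSIX compliant" concrete, here's a minimal sketch of my own -- not from any particular codebase: these process calls come straight from the POSIX headers, so the same file builds unchanged on OS X and Linux, but won't build against the native Windows API without a compatibility layer such as Cygwin.)

        // Minimal POSIX sketch: compiles as-is on OS X and Linux.
        #include <unistd.h>    // fork(), getpid() -- POSIX, no native MSVC equivalent
        #include <sys/wait.h>  // waitpid() -- POSIX
        #include <cstdio>

        int main() {
            pid_t child = fork();          // POSIX process creation
            if (child == 0) {              // child branch
                std::printf("child, pid %d\n", (int)getpid());
                return 0;
            }
            waitpid(child, nullptr, 0);    // parent waits for the child to exit
            std::printf("parent, pid %d\n", (int)getpid());
            return 0;
        }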

    403:

    s/compatible/compliant/

    404:

    From my point of view the most important difference between recent versions of Windows and recent versions of Mac OS X is that Mac OS is a Unix variant (out of BSD by way of Mach and OpenStep), and Windows isn't. This means that it's usually not hard to port Unix program and device driver source code to/from Mac OS and other Unix systems, including Linux. In fact, there are tools that can download, configure, build, and install a lot of open source software on Mac OS with a single command (or mouse click, if you prefer tools with GUIs to command lines).

    405:

    What are the big differences between the Microsoft OS and the Mac OS?

    Underlying it: Microsoft's current OSs are all descended from Windows NT, development of which started circa 1990-92 when Microsoft split with IBM over long-term development policy on OS/2. Chief architect of Windows NT was Dave Cutler, formerly of DEC, who had been one of the major movers behind VMS. So -- I am told -- WinNT, under the hood, bore a remarkable resemblance to VMS.

    However, that was 20 years ago.

    Apple's OSX started out as NeXT's NeXTSTEP circa 1990, a non-X11 GUI layer on top of a BSD UNIX layer running on a Mach microkernel. In 1999-2000 it acquired a Mac bag-on-the-side to run classic MacOS apps, which disappeared again circa 2004. So, under the hood, OSX gives you a full UNIX implementation, modulo a whole bunch of tweaks.

    However, that was 20 years ago.

    Today both OSs have expanded to provide much the same services. OSX has perhaps focussed less on large scale headless server applications, but if you scratch a Mac you'll still find stuff like Apache, PostgreSQL, and a Postfix mail server kicking around: the OSX Server pack from the app store provides a cuddly GUI front end for this stuff. Microsoft, I believe, provide similar functionality if you buy their business or enterprise versions.

    Both OSs are, however, primarily focussed on providing a cuddly desktop experience rather than raw transaction processing horsepower. For that, you probably want a non-graphical Linux or Solaris distro running in a VM, either Intel hosted (as on Amazon's EC2) or mainframe hosted (as on an IBM zVM system).

    406:
    Much more different in the past than now. Most of what I'll mention is based on the past and has evolved into not so much of a difference . . .
    To be honest if you're really used to one setup or the other then switching just to switch can be painful as your muscle memory will take a while to adapt. Ctl-C vs. Cmd-C to copy and such . . .
    A lot of it is personal preference. Where Macs have been gaining for the last few years is with people who aren't interested in anything but surfing web sites, visiting Facebook, maybe opening a Word doc, printing a bit, etc. They are tired of the driver installs and the virus update of the hour and want something simpler. For those who want to open up the hood with a wrench in hand, a Mac may not be the way to go.

    Thanks to everyone for their replies to a vague and poorly worded question. It seems that in aggregate the answer to "What's the big difference between the PC and Mac OS's?" is, "not a lot and converging to less" on the functional side.[1]

    What I'm getting at with my questions about recursion and operating systems is that I'm (I think) coming around to Charlie's notion about walled gardens. Because when I analyze my own objections to them (and this is wrt the comments about Samsung's attempts at one), it's not that they're straitjackets; it's that these walled gardens don't take my own preferences into account.

    I'll chime in with what I use on my machine in addition to the usual browser/office/media stuff: MiKTeX (LaTeX), Acrobat, Mathematica and MATLAB, SAS (via a secure connection to the school), Minitab and R, and Cygwin and Macaulay (for the algebra packages). IOW, like a lot of other people - probably the vast majority of them - I'm spending most of my time using a small set of prepackaged apps. And if the iPad came preloaded with just those applications that I need and use, why, I'd be just fine with living in their walled garden.

    This mindset does not seem to be unique to the clueless and unsophisticated hordes who mindlessly buy the Apple brand, btw; AFAICT from my small sample here, it seems endemic up and down the line of IT sophistication. Even programmers (some of them) don't mess around with techniques like recursion even when that's the best way to do the job; they just make a call to a standardized (walled garden!) library.

    So - the question I've been leading up to - if people had to live in a walled garden, what would they put in it? What's not available on the iPad that should be there?

    [1] Some nerd humor - such is the power of convergent design that intrepid space explorers making first contact with another civilization insert a standard CD-ROM into an alien machine and boot up Adobe Photoshop on the first try by pressing their standardized icon for "play".

    407:

    What's not available on the iPad that should be there?

    From my PoV: there's no commandline experience. Not even a sandboxed busybox. (Well, there is, but you have to jailbreak the device before you can access it.)

    The "no interpreters" rule has been watered down, but they're still strict on "no interpretable code you can download arbitrarily off the internet or share with random strangers", so for the various language toys or IDEs you either have to sync your projects via iTunes or they work on code stored and executed on a remote server -- it's a royal nuisance.

    The applications are mostly feature-poor compared to their desktop equivalents (although they're getting gnarlier and more complex with each release; I expect within another 1-3 years there'll be some quite powerful tools there, especially if MS, as rumoured, plan to launch a full-on version of MS Office on iOS this fall).

    The "no web browsers that don't use the iOS version of webkit" rule ... I can see why that made sense at launch, but again, that's getting annoyingly ancient.

    Most irritating of all: if you want to compile code for iOS and transfer it onto an iOS device, Apple want $99/year for a developer license because All Code Must Be Signed. Why they can't hand out single-user/single-machine license codes for free so that hobbyists can dink with their own machines is a mystery to me.

    The one feature of Android that I'd love to see on iOS: the setting you can toggle that says "allow installation of apps from other sources". Even if Apple's version then said "(this will invalidate your warranty and our support)".

    408:

    Even programmers (some of them) don't mess around with techniques like recursion even when that's the best way to do the job; they just make a call to a standardized (walled garden!) library.

    Ah, this is a really weird comment to make, and it seems to betray a misunderstanding of what current-day programming is about. It's a little akin to saying that because a car designer uses off-the-shelf components that have been optimised over generations instead of designing their own spark plug every time, they're somehow not doing it right.

    They are. But they're using higher level components. You just seem to have got fixated on what was important to know several generations ago. These days, the code I write uses recursion all over the place - the point is, I am not the one writing that repetitive code. I've asked for a red-black tree containing instances of foo, and when I want to add a new one in the right place, or to find one I put there earlier, I just make a single function call. Why on earth would I want to write the tree traversal code each time, when someone else has already done it, and tested it, and can do it for me without a bug?
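    (A minimal sketch of what that looks like in practice -- mine, not the poster's code; std::map in the common C++ standard library implementations is exactly such a red-black tree:)

        #include <iostream>
        #include <map>
        #include <string>

        int main() {
            // A balanced (red-black) tree of "foo" records, keyed by id.
            std::map<int, std::string> foos;

            foos.emplace(42, "a foo");      // insert in the right place: one call
            foos.emplace(7, "another foo"); // the library handles the rebalancing

            auto it = foos.find(42);        // find one I put there earlier: one call
            if (it != foos.end())
                std::cout << it->second << '\n';
        }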

    409:

    I'm not overly familiar with architecture as a business/design discipline (other than strange decisions like building a transport museum with lots of wall space and hanging the car collection off it, or putting a collection of ship models on a conveyor belt instead of in fixed cases so you can't inspect the detail...) but that's making a business case for using OS X and Windoze, yes.

    That business case doesn't apply to most industries though, and if you've solved a specific issue once in most industries, then the workaround stays known. In mine, system stability is way more important than ease of communicating with clients in semi-realtime.

    410:

    Even programmers (some of them) don't mess around with techniques like recursion even when that's the best way to do the job; they just make a call to a standardized (walled garden!) library.

    That doesn't match my reality I'm afraid :-)

    • Every vaguely competent developer I know still uses recursion if/when appropriate (see the sketch after this list). People don't avoid recursion because it's hard. People avoid it because it's not always the best solution.

    • Calling the standard library is almost always the right thing to do since the standard libraries have had many, many more hours invested in getting them correct and fast. I know a guy who spent a year increasing the performance of an old C++ app by a ridiculous percentage by just swapping in the standard template libraries instead of various hand-rolled classes that the team "knew" would be more efficient.

    • Many, if not most, standard libraries are not walled gardens. I spend most of my time in the unix/open-source world, and the guts of the stuff that I play with are pretty much always available if I need 'em. In Microsoft land you can usually get at the source if you need it (http://www.microsoft.com/en-us/sharedsource/default.aspx). Apple is about the only big player that doesn't let people get at the raw source, and even there many of the libraries folk use are provided by other developers.
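    (The promised sketch -- mine, not anyone's production code: walking a tree is naturally recursive because the data structure itself is recursive; summing a flat array is naturally a loop. Neither choice is about difficulty.)

        #include <iostream>
        #include <memory>
        #include <vector>

        struct Node {
            int value = 0;
            std::unique_ptr<Node> left, right;
        };

        // Recursive: mirrors the recursive shape of the data structure.
        int tree_sum(const Node* n) {
            if (!n) return 0;
            return n->value + tree_sum(n->left.get()) + tree_sum(n->right.get());
        }

        // Iterative: a flat sequence has no recursive structure, so a loop
        // is clearer and avoids needless stack depth.
        int vec_sum(const std::vector<int>& v) {
            int total = 0;
            for (int x : v) total += x;
            return total;
        }

        int main() {
            Node root;
            root.value = 1;
            root.left = std::make_unique<Node>();
            root.left->value = 2;
            std::vector<int> v{1, 2, 3};
            std::cout << tree_sum(&root) << ' ' << vec_sum(v) << '\n';
        }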

    411:
    Most irritating of all: if you want to compile code for iOS and transfer it onto an iOS device, Apple want $99/year for a developer license because All Code Must Be Signed. Why they can't hand out single-user/single-machine license codes for free so that hobbyists can dink with their own machines is a mystery to me.

    My personal guess...

    1) The hobbyists who really want to play will jailbreak and get a self-signed cert. So it's not an insurmountable barrier to folk who want to get into the area for "free". They are also self-selected out of the group of developers that Apple has to nominally support, since they've done naughty things to their boxen.

    2) It allows Apple to focus resources/research on people who are invested in the platform enough to spend a bit of cash - rather than pure hobbyists.

    3) I suspect that $99 a year is about right to turn their developer support into break-even rather than loss (judging by the per-user support costs I've seen in shops that produce dev tools).

    4) I don't know if it's the value we put on something because we've paid for it, or because the price filters out the most problematic, but in my experience people seem to complain less (or in more productive ways) if they've paid for something than if they got it for free.

    412:

    In mine, system stability is way more important than ease of communicating with clients in semi-realtime.

    When have I ever argued against system stability?

    The real issue is "are the systems stable enough"?

    If they (a typical desktop system) don't crash or get into a state where a restart is needed during a day's work 99% of the time, then for most businesses that's good enough. Especially for a service-oriented business where you have people providing the service on a more normal business schedule rather than 24/7.

    And go find ANY CAD software that allows you to work on reasonably complex projects of reasonable size, trade files with your subs and consultants and contractors and clients without a significant increase in workload, and stays up for months at a time. It doesn't exist. If you want to work in the architecture, civil engineering, and/or construction fields there is a world of standards and work flows you get to deal with. Like it or not.

    413:
    I know a guy who spent a year increasing the performance of an old C++ app by a ridiculous percentage by just swapping in the standard template libraries instead of various hand-rolled classes that the team "knew" would be more efficient.

    Must be getting near time for my annual rant against premature optimization. Right now I'll just mention the time I watched a room full of senior programmer/engineers spend 2 hours of a product planning meeting arguing about the high-level architectural features and the commitment of resources required to fix a problem on an as-yet undesigned product when no one had ever proved the problem actually existed (and no one could explain how it was severe enough to require being fixed even if it did exist). In some ways it reminded me of the strange doctrinal arguments I used to see Marxist-Leninists get into when trying to agree on what to put in the broadsheet for the next political action.

    414:

    "What's not available on the iPad that should be there?"

    The one that irritates me is that third-party media players can't access hardware acceleration.

    In real terms what this means is that if a video format isn't supported by Apple it's not going to work properly in third party players; the most glaring example of this is .MKV files, which, in spite of generally being regular common-or-garden H264 inside the container, won't play if they've been captured/encoded at HD resolutions.

    This is very annoying, as an awful lot of content available for download is in .mkv form. Most of the gadgets I have around the house will play this quite happily off the DLNA media server that sits in my office, so we can watch this stuff on the big TV downstairs, on my laptop PC, or on one of the PCs in the boys' rooms without having to get into re-encoding, remuxing, and tedious stuff like that, but an iPad (this iPad in fact!) chokes on it...

    415:

    Sorry, the point of my last was that "stable enough is good enough" works for you, but doesn't work for everyone. You have needs for "semi-realtime communication in certain formats" and I don't.

    I accept that we have different needs, but my needs do not prevent Autodesk from producing software that runs on a stable Unix platform. They do prevent me from using Windoze servers.

    416:

    Must be getting near time for my annual rant against premature optimization.

    No need - Donald Knuth beat you to it :)

    "premature optimization is the root of all evil".

    http://en.wikipedia.org/wiki/Program_optimization#Quotes

    I came across a bug in one of our IP blocks, caused when an engineer had decided to write their own circular buffer instead of using the STL. Rather foolishly, they were doing their last read of memory immediately after freeing it - ironically, as part of their "clear the buffer" code. They'd have got away with it too, if it hadn't been for my attempts to run that pesky debugger (and its insistence on detecting any such behaviour).

    :) The minimal-impact solution involved recursion :)
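    (For anyone trying to picture it, a hypothetical reconstruction of that failure mode -- my sketch, not the actual IP-block code:)

        #include <cstddef>

        class RingBuffer {
            int*   data_;
            size_t cap_, head_ = 0, tail_ = 0;
        public:
            explicit RingBuffer(size_t cap) : data_(new int[cap]), cap_(cap) {}
            ~RingBuffer() { delete[] data_; }

            // The "clear the buffer" routine: frees the storage, then touches
            // it one last time before reallocating.
            void clear() {
                delete[] data_;
                head_ = tail_ = 0;
                int stale = data_[0];  // BUG: read-after-free. Usually "works",
                (void)stale;           // until a checking debugger or hardened
                                       // allocator flags the access.
                data_ = new int[cap_];
            }
        };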

    As an addendum, I got to be competent at pointer-based data structures because I'd had to write my own in C for an embedded system with limited memory. Given the widespread adoption of the STL and Boost libraries, you have to wonder where the next generation of library programmers are going to come from...

    417:

    I accept that we have different needs, but my needs do not prevent Autodesk from producing software that runs on a stable Unix platform. They do prevent me from using Windoze servers.

    What stops them is the market share of desktop Unix. They walked away from that around 20 years ago when Windoze desktops took off.

    My main argument about the issue of uptime as a metric is that when making a decision it should not be applied out of context. Having a typical uptime measured in months is a useful thing when applied to servers or my TiVo DVR, but not so much when applied to desktops outside of maybe 911 or ops-center control consoles. It is a metric that needs to be applied AFTER the business needs are taken into account. Not before.

    And us nerds tend to put these carts before such horses all the time. And business managers keep thinking we're idiots because of it.

    418:

    It's somewhat dated now, but anecdotally: about fifteen years ago I was involved with Autodesk software, particularly 3D Studio, just as they were transitioning to becoming Discreet. I got to talk to some folks there, developers and support engineers who had been writing for Sun and SPARC kit a few years previously. The impression I got was that they were shifting to Windows for desktop and workstation because Microsoft had a roadmap and a plan for what they were going to be doing in five or ten years' time, and AutoCAD, 3D Studio etc. really needed that sort of forward-thinking support.

    I mentioned Linux to them, as it has a UNIX core and the original codebases for much of Autodesk's big earners were written for Solaris. Their explanation for not supporting Linux was that unless the OS development was ruled with a rod of iron and the developers could produce a timeline for a minimum of three years out, they weren't interested in rewriting their apps for Linux, as it might die on them suddenly or change direction radically.

    419:

    Makes sense. Autodesk went to where the market was and is, and where they could make long-term plans. They figured long-term plans with a not-as-good OS were much better than short-term plans with an OS that might not be an option 3 to 5 years later.

    The key point being they applied long term business logic then figured out how to make use of the computers inside of that box.

    Now, one I don't get, unless it is a trial balloon, is the new AutoCAD on Macs. It doesn't work with any of the add-in packages as far as I can tell, and in most industries, from what I understand, most AutoCAD users live inside their add-ons, with AutoCAD hiding behind their wrappings. And in the industry I know most about, architecture, the firms moving forward after the recession seem to all be switching to Revit. Maybe it was a skunk-works thing like the iPhone/iPad viewer that wasn't an official project until it was almost done.

    420:

    In my first job after I got my CS degree, a code review with my line manager (who had a geology degree) brought up the lack of 'optimisation' in some C function I had written. And I was amazed. The compiler just turns that into the same object code, I said. They believed me because I was so clever. Then I went and looked at the disassembly, and indeed it had. So I wasn't going mad :-) In fact simple source code is easier for the compiler to optimise than tricky stuff. People who don't understand the gubbins underneath often think clever syntactic tricks in the source code will magically make the object code better. Usually not.
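    (A sketch of the sort of thing I mean -- my example, assuming a modern gcc or clang: paste both into a disassembler or godbolt.org and, at any normal optimisation level, you should see essentially the same instructions for each.)

        #include <cstdint>

        // "Optimised" by hand: shifts and an add instead of a multiply.
        uint32_t scale_tricky(uint32_t x) {
            return (x << 3) + (x << 1);  // x*8 + x*2 == x*10
        }

        // Written for the reader; the compiler strength-reduces it by itself.
        uint32_t scale_simple(uint32_t x) {
            return x * 10;
        }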

    421:

    So many comments! I couldn't read them all, but I wonder if one of them suggested that you sell your Samsung tablet and buy a Nexus 7 instead?

    You did bring up a good point about what the Nexus phones have done for phones in general.

    It can be the interface - for example, the Galaxy Nexus is a phone that has no buttons at all on the front, so the screen appears really big. Phones will start to look this way. Slow process.

    But the Nexus 7? I find it hard to imagine it won't have an impact. It's very cheap for what you're getting. The Kindle Fire was selling well due to its price, and the Nexus 7 matches it but blows it away spec-wise.

    Plus it'll get upgrades, being part of the Nexus family. A lot of tablets will find it hard to compete against the Nexus 7.

    I'm considering selling my ASUS Transformer to get the Nexus 7 instead. Love the tablet, but it's a bit big for reading books etc.

    422:

    Wait awhile before you trade in. Lots of rumours to the effect that the Kindle Fire 2 is due to be launched at the end of July, and that maybe the 10" variant of the new generation will be launched at the same time as the 7" Kindle Fire 2. Lots of rumoured specifications on the web if you google for Kindle Fire 2, so there's no need for me to provide a link.

    Anyway, given our host's pattern of Shiny Tech Purchases it's highly likely that he will upgrade to the Fire 2 and report back here on his blog, which removes some of the risk of obtaining the New & Shiny the instant that it appears.

    I'll be interested in the new-generation Kindle Fire 2 7" as an ebook reader because, frankly, the Kindle Touch - with its wet-newsprint-contrast screen and fiddly touch controls - has been a bit of a disappointment.

    423:

    Good Lord, you've just described AMD. After being laid off from Sun, and kicking around some very low-paying jobs, I find myself working at AMD. I'm still amazed at the way they work. As a fellow employee put it to me, "they want you to be a jack of all trades, and a master at none". Multiple groups doing the exact same thing but for different chips. Unbelievable. Having worked for Sun for 14 years, I think AMD's survival route at this point in time is to jump into ARM. They have reached what I consider the end-of-life point. When you spend more time debugging your DFT features than the processor itself, you are simply a disaster waiting to happen.

    I read through most of the comments as a purchasing guide, and it looks like it is time to go the Apple route. I'll take the known quantity over the fragmentation in the PC world any day.

    424:

    So many comments! I couldn't read them all, but I wonder if one of them suggested that you sell your Samsung tablet and buy a Nexus 7 instead?

    That's exactly what I plan to do. Not selling the Samsung until the Nexus 7 arrives, though.

    425:

    Ha! I worked for AMD for 6 months in 1976 before getting an offer from Intel and running away as fast as I could. At the time it was being run as a flat organization for purposes of purchase approval: everything that cost more than a couple of thousand dollars had to be authorized by Jerry Sanders, the CEO. I bailed the third time my request for purchase of a new oscilloscope (we had 1 in the entire Linear IC Design Group for 6 engineers and 3 technicians) was returned for "additional justification".
