Apple Being Confident or Arrogant?

There was an article in Forbes recently that quotes Tim Cook claiming that Apple is the “only innovator” in the personal computing space. This could be viewed two ways: either confidence or arrogance. But the odds against the other big manufacturers (HP and Dell both tried to respond to Tim’s comments, but not very effectively) are long. It really isn’t a pretty picture for most companies not named Apple. But is Apple really the only innovator? Are they really the only one trying to push the boundaries and go into new territory? Let’s look at the two sides of this story.

Apple Is The Only Innovator

If you look at the last decade of personal computing development, the number of real advances is actually pretty small if you take Apple out of the equation. Yes, notebooks have been getting faster, and have more memory and storage. But most 14″ notebooks are about the same size and weight now as they were in 2002. They still only get 2-3 hours of battery life. Sure, they are a bit more durable, and now I can get a solid-state disk as an option. But they still cost about $1,500 for reasonably equipped models, still have Wi-Fi, an optical drive and a bunch of ports to plug things in. They still run a version of Windows that hasn’t changed in any meaningful way in over a decade.

The only real advance in personal computing was the netbook, and that didn’t come from HP or Dell. The first real netbook was the Eee from ASUS, and it didn’t run Windows at first. Yes, HP and Dell eventually jumped into that market, but they didn’t start it. All they did was evolve it a little, here and there. Otherwise, HP, Dell, ASUS, Acer and others have been largely doing “more of the same”: more power, more memory, more storage, better graphics. Yes, they’ve updated the packaging and such, but not in any truly radical way.

Apple has done two things in hardware that are innovative. The first is the MacBook Air. This device offers the portability of a netbook, but with nearly the power of a “normal” laptop. It is super-light, razor-thin and offers astounding battery life. The second generation got even better, offering the usual “more power” along with two sizes. The third generation is an evolution on the previous two. But the line, at first minimized or put down by other manufacturers, has spurred the entire “ultrabook” segment. For all of HP’s claims about “the first business class Ultrabook” (whatever the heck that is), everything that has come since from HP, Dell, ASUS and Acer has been a “me too” copy of the original. And it took these companies nearly three years from the launch of the first Air to realize that this was the real deal (despite soaring sales from day one for Apple).

The other, and perhaps bigger, innovation was the iPad. Here is a machine that, like the netbook, does most of what the vast majority of PC users need: e-mail, browsing the web, social networking, some document editing. It also does far more than any netbook when it comes to entertainment: it can play real, 3D, very rich and highly interactive games, and it can store and play back substantially more media like movies, TV shows and music. It is more portable than a netbook, has better battery life, and has software that is purpose-built for the device. A netbook has to lumber along with software built for far more capable machines; iPad apps are built specifically for the iPad.

Did HP and Dell have tablets? Sure, they did. Windows tablets. Which no one wanted. They were big, bulky, expensive and had lousy battery life. You had to use a special stylus, which was typically pretty expensive to replace. They ran software that wasn’t really built for touch interfaces. I had a Toshiba tablet, and it was horrible: too big and awkward to use as a tablet, and too limited in power to be truly useful as a PC. They were like crossover cars such as the BMW X6 or the Acura ZDX: all the drawbacks of a tall, heavy SUV in terms of handling, and all the drawbacks of a car in terms of interior space and cargo room. The Windows tablet PC was a compromise.

Apple has indirectly fueled change in the enterprise, specifically with the consumerization of IT. Apple has focused on the consumer space, but those consumers want to use their iPhones and iPads at work. Up until the iPhone, there was some separation between work technology and personal technology. You used a company PC and a company phone. You may have had your own phone, as well as your own PC back at home. But the two were generally distinct. The iPhone and the iPad have changed that. People don’t want to carry around two devices, one for work and one for themselves. Apple may not have set out to do that deliberately, but it took their products to push that forward.

In the end, Apple has done more to set the direction of personal computing the past 2-3 years than almost any other PC manufacturer has done in the past decade. Yes, we got faster machines. Yes, we got more capable machines. We got more storage and more memory and better graphics. But they all came in boxes that were about the same physical size, had about the same weight and lasted about the same on battery. Putting in Intel’s latest processor, more RAM and bigger disks isn’t innovation. That’s simple, inevitable evolution of the platform.

Apple Isn’t Alone

Apple has really only innovated in one space: personal computing. Even then, the Air may have been the first, but thinner and lighter machines from the likes of HP and Dell were inevitable. The Air may have sped things up a bit, but Intel had been pushing for ultra-thin, ultra-light designs well before the Air came out. Extremely light and thin machines have been bandied about and discussed for years. Sony had a few machines back in the late 1990s that were close. Their Z505 was a pretty impressive machine, even if it was purple :-).

Look beyond personal computing, and Apple has gaps, holes or a wobbly story. Cloud computing is a good example. Apple wasn’t the first with cloud storage. Microsoft offered an early version of it with “offline files”, allowing multiple machines to keep local cached copies of data stored on a server. Even Apple’s first attempt, with MobileMe, was a bit sketchy when compared to DropBox. MobileMe was adequate when used as something of a “big USB thumb drive” to transfer data, but editing data on a MobileMe “disk” came with problems (particularly with temporary files). DropBox has done a far better job. Yes, iCloud is an improvement, but DropBox is still ahead in cloud storage.

When it comes to cloud services, Apple is basically non-existent. Google and Amazon have done far more to advance the state of the art when it comes to cloud-based services. For example, if I develop an iOS and OS X app that uses push notification, I have to turn to other providers not named Apple to set up the back-end. That means I have to figure out how to integrate my back-end service with Apple’s push servers, using Apple’s published protocols. Apple provides precious little when it comes to helping out there.
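To make that gap concrete, here is a minimal sketch (my own illustration, not Apple sample code) of the kind of glue a do-it-yourself back-end ends up writing: packing a notification into the legacy APNs “simple” binary format described in Apple’s published protocol documentation. The device token and alert text are placeholders, and the TLS connection to Apple’s push gateway is deliberately omitted.

```python
import json
import struct

def build_apns_frame(device_token_hex: str, alert: str) -> bytes:
    """Pack a notification into APNs' legacy "simple" binary format:
    a command byte of 0, then the 32-byte device token and the JSON
    payload, each prefixed with a 2-byte big-endian length."""
    token = bytes.fromhex(device_token_hex)
    payload = json.dumps({"aps": {"alert": alert}}).encode("utf-8")
    if len(payload) > 256:  # the payload limit of that era
        raise ValueError("payload too large for APNs")
    return (struct.pack("!BH", 0, len(token)) + token
            + struct.pack("!H", len(payload)) + payload)

# Actually sending the frame means writing it over a TLS socket to
# Apple's push gateway, authenticated with the push certificate from
# the developer portal -- that connection code is omitted here.
frame = build_apns_frame("ab" * 32, "Hello from my own back-end")
```

None of this is hard, but every piece of it (the queueing, the certificate handling, the feedback service polling) is left to you or a third-party provider.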

As for Apple’s innovations, one could argue that they, too, are simply copying other ideas. Apple wasn’t the first with useful PDAs or smartphones. Some of their latest “innovations” in iOS 5 are actually copies of features that have been in Android from the start. Yes, Apple has certainly improved upon these ideas, but you could argue that those are just as much “evolution” as any other company’s ideas.

Some of Apple’s technologies are still awkward when dealing with the “power user”. Consider iTunes: it is still very, very awkward to use when managing massive media libraries. It is fine when you have dozens of things, but when you have hundreds of albums and thousands of songs, it becomes more difficult to use. It isn’t just iTunes. While being able to group apps in iOS was a huge leap forward, it is still very awkward for people who have hundreds of apps on their devices. The full-screen approach in MacOS 10.7 Lion is hopelessly awkward if you run a multi-screen desktop. I do use fullscreen from time to time on my notebook, but I never use it on my desktop, simply because it doesn’t work well on dual-screen setups. The virtual desktop is brilliant, and I use that all the time. But fullscreen just doesn’t “scale”.

Apple has also done little to directly advance certain types of hardware design. The desire for fast devices that consume little power predates Apple’s current line of products. Companies like Intel, Motorola and IBM have done far more to advance the state of the art when it comes to materials and chip manufacturing. There is no doubt that Apple has acted as a catalyst in many ways, but it is other companies that have done the heavy lifting.

Apple also is riding on the backs of others when it comes to hardware reliability. Companies like IBM and HP have done far more when it comes to making machines that last, and Apple has been the beneficiary of that. Advances in fault detection and fault avoidance are the bread-and-butter of the server world, and the end-user devices get to take advantage of it. But Apple has done little when it comes to dealing with, and avoiding, faults in storage, memory or processing.

Apple has also done nothing in the virtualization space, and I fully expect that we will see an Android phone with virtualized personalities (allowing one for work and one for your personal stuff) before we see it in iOS. Consumerization of IT is creating pressure to give these devices separate and secure personalities that IT can control, while leaving the rest of the device under the user’s control. When someone leaves a company, perhaps abruptly, the company wants to be able to delete just its data, apps and wireless service, without wiping out the end-user’s personal content as well.

But In The End…

When it comes to innovations that the average person sees, Apple is at the very least the most visible. They’ve changed the landscape, and the rules to the game, when it comes to mobile and personal technology. Without the Air, we would be waiting a few more years before ultrabooks came into existence. The iPad changed how consumers viewed “personal computing”, and the iPhone showed just how smart a smartphone could be. Others have followed, and in some cases improved on things. But for the innovations that people remember, Apple has definitely been the standout for the last few years.

However, no position is unassailable, and Apple would do themselves a favour by toning it down a little. No lead in technology is safe. The technology landscape is littered with the remains of companies and products that were once the undisputed leaders in their segment, or with survivors that are shadows of their former selves. The IBM mainframe. The Sony Walkman. The Palm Pilot. Nokia Symbian. Blackberry. Windows Mobile. All of these either defined their segment or dominated it in overwhelming fashion. All have either dwindled to near-insignificance or disappeared entirely. Even Apple is the proud owner of a shrinking segment: the dedicated MP3 player. The iPod is on the decline, having been supplanted by Apple’s own products. The only iPod that probably has a future is the iPod Touch. People want more than a simple machine that just plays music, and Apple has done their best to diminish that part of the market. There will likely come a time when people engage in “remember when” about the iPod, whose name may or may not live on in the iPod Touch (unless Apple finally smartens up and calls it the iPad Nano, which is what it really is).

Making bold statements is one thing. But if you go too far, you risk “poking the bear”. The view from the top is good, but the view on the way down isn’t always pleasant. Apple has had a good ride, and I expect their fortunes to remain positive for a while yet. But it won’t last forever. Besides, with all the negative attention Apple is getting right now, a little humility wouldn’t hurt their cause.