A Sign Of Transformation?

Recently, Microsoft announced that it has joined the Linux Foundation, that SQL Server is now ready for testing on Linux, and that a preview of Visual Studio for Mac is available. This is a transformation that should not be underestimated. These moves reinforce the direction that Microsoft CEO Satya Nadella set a couple of years ago, and they are the right things for the company to do.


Will OS X Become macOS?

There have been rumours that OS X, the operating system for the Macintosh, will be renamed “macOS”.  I certainly hope so. Frankly, I think Apple pushing the whole “OS X” thing was baffling and an exercise in futility. I may not be a marketing person, but when the product’s name is confusing, that is a bad thing.


Multicore Mobile Machines: What’s The Big Deal?

A recent post on Android Guys (a very good site for Android news and information, BTW; I highly recommend it) implies that moving to multicore processors for phones and tablets is a big deal. The article covers very old ground on the benefits of a multicore processor, and I’m just not sure the shift is as hard or as ominous as the author makes it out to be. Why? At the heart of Android and iOS are kernels that already run successfully and efficiently on multicore and multi-CPU machines today. Android is effectively Linux, albeit somewhat stripped down, but the heart of the system is still there. iOS is basically a mobile version of MacOS, which itself is a variant of BSD. Both of these run quite well on multicore machines (or they had better, given that my Macs and Linux boxes are all multicore). I’m not sure how much of the Windows kernel made it into Windows Phone 7, but even Windows 7 is a pretty decent system for multicore/multi-CPU setups.

And it isn’t as if multicore/multi-CPU architectures are new here. Commercial operating systems have been working with SMP-enabled servers for many, many years now. Multicore didn’t change the dynamic that much; it mainly changed the packaging. Dealing with thread-safe memory management, thread-safe device access and thread scheduling is a well-known and well-understood problem, with the basic research going back more than two decades. This isn’t exactly new stuff we are talking about here. There is a generation of software developers who have spent their entire careers using nothing but multicore machines.
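To make that concrete, here’s a minimal sketch in plain Java (nothing platform-specific is assumed; the class name and the loop counts are just for illustration) of the kind of thread-safe shared state developers have been writing for years. The same code behaves identically whether the hardware has one core or sixteen:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SharedCounterDemo {
    public static void main(String[] args) throws InterruptedException {
        // A thread-safe counter: no lost updates, regardless of core count.
        final AtomicInteger hits = new AtomicInteger(0);

        // Size the worker pool to whatever the hardware offers.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < cores; i++) {
            pool.submit(() -> {
                for (int j = 0; j < 1000000; j++) {
                    hits.incrementAndGet();        // atomic increment
                }
            });
        }

        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);

        // Always prints cores * 1,000,000, on one core or sixteen.
        System.out.println("Total: " + hits.get());
    }
}
```

The scheduler, not the application, decides how those threads map onto cores, which is exactly why the kernel work matters more than the packaging.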

Will it be a big deal for app writers? No, not really. I’ve already done a bunch of work with multithreaded apps on both iOS and Android (and one experiment I’m working on explicitly depends on it). Again, designing, coding and maintaining multithreaded apps isn’t exactly new. It is a well-understood problem that many, many people are already familiar with. Putting a multicore machine under the covers will potentially let those apps run better, but the concepts and principles remain unchanged.
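For what it’s worth, the pattern most of those apps use is the same one they would use on a single-core device: push the slow work onto a background thread, then hand the result back for display. Here’s a hedged sketch in plain Java rather than any platform-specific API (loadData is a hypothetical stand-in for real work); iOS and Android each have their own equivalents, but the concept is identical:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundWorkDemo {

    // Hypothetical stand-in for slow work: a network fetch, a big parse, etc.
    static String loadData() {
        try {
            Thread.sleep(500);                     // pretend this is slow I/O
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "result";
    }

    public static void main(String[] args) throws Exception {
        ExecutorService background = Executors.newSingleThreadExecutor();

        // Run the slow work off the "main" thread.
        Callable<String> task = BackgroundWorkDemo::loadData;
        Future<String> pending = background.submit(task);

        // The main thread stays free here; a real app would be updating
        // its UI and handling input instead of just printing.
        System.out.println("Main thread is free while the work runs...");

        // Collect the result once it is ready.
        System.out.println("Update the UI with: " + pending.get());

        background.shutdown();
    }
}
```

Whether the device has one core or four, this code doesn’t change; more cores just mean the foreground stays responsive with less effort.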

Does the impending rise of multicore mobile devices change things? A bit. It brings more processing power into our pockets, and it may help reduce electrical power consumption. It may also make it easier for people to write far more powerful apps, while at the same time making it easier for some lazy people to write apps poorly and still have them perform well. From what I’ve heard, QNX and RIM’s PlayBook depend on it to get their performance. Rumour has it the upcoming iPad 2 (or whatever Apple calls it) will include it.

In the end, this development shouldn’t come as a surprise. It also isn’t something to be feared or worried about. It is an inevitable technology change that was going to happen at some point; it just may be happening sooner than people expected.

No Flash Today, No Java Tomorrow

The new MacBook Air doesn’t ship with Flash installed. The next version of MacOS (10.7 Lion) apparently won’t come with a JVM from Apple. I’ve seen people reading all kinds of things into these moves, most of it guesswork with no real basis in fact. Responses have ranged from “meh” to “Apple is trying to control the universe!” Are these a big deal?

Not having Flash pre-installed is not a big deal. I’ve bought many Windows machines from IBM, Lenovo, HP and Dell, and none of them ever came with Flash. They also didn’t come with QuickTime, Shockwave, Acrobat, or Adobe Reader. I know that when I set up a new machine (Windows, MacOS or Linux), I have some work to do. And as Steve Jobs is rumoured to have said, it also means I get the latest version of all of these technologies. Is it inconvenient? Certainly, particularly for non-technical people. But is it the end of the world? Is it really a big problem? Not really. I honestly don’t see this causing a drop in Mac sales.

But what about the removal of Java from future versions of MacOS? Is this a big deal? It depends on what Oracle does. If Oracle does indeed bring out a MacOS version of Java, then I’d say this isn’t a problem; personally, I’d rather have an up-to-date JVM from Oracle than an older, a-few-releases-behind JVM from Apple. It’s not like this situation would make MacOS unique. The version of Java that normally comes with Windows is generally substandard, and anyone who ships Java-based software either recommends the user install an up-to-date JVM, or ships one as part of their product. People who use Java regularly will get the most up-to-date JVM themselves. Again, this is annoying, but not fatal. I can’t count on the JVM that ships with Windows or most Linux distros, so why would I expect it to be any different on the Mac? But to reiterate, this is only the case if Oracle starts to ship their own MacOS version of Java.
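As an aside, vendors who care about the runtime typically just verify it themselves at startup rather than trusting whatever the OS shipped. A minimal, hypothetical sketch (the “Java 6 or newer” policy and the version-string handling are my assumptions, not anything Apple or Oracle specifies):

```java
public class JvmVersionCheck {
    public static void main(String[] args) {
        // On JVMs of this era the value looks like "1.6.0_22".
        String version = System.getProperty("java.version");
        System.out.println("Running on Java " + version);

        // Hypothetical policy: insist on Java 6 or newer.
        if (version.startsWith("1.4") || version.startsWith("1.5")) {
            System.err.println("This application needs Java 6 or newer; "
                    + "please install an up-to-date JVM.");
            System.exit(1);
        }
    }
}
```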

The real risk is that Oracle decides not to release the latest JVM and JDK for MacOS. This makes Apple’s decision problematic, particularly on the server OS, since a lot of server-based software depends on Java. Not having it on the desktop is a barrier for developers, and for anyone who depends on Java-based desktop software (not that there is necessarily an enormous amount of that in the current Mac user community). Perhaps some 3rd party organization will step up and maintain a JVM for MacOS if Oracle doesn’t, but who knows if that is a viable solution. If I were to guess, I would expect to see a MacOS version of Java from Oracle, simply because the Mac has started to capture a respectable (albeit still small) share of the PC marketplace. If Apple were to allow Java on iOS, then I would be very confident that Oracle would step up. The odds of Java on iOS don’t look good, since it does contravene the developer’s agreement, but who knows: Apple said that Safari would be the only browser on iOS, and now it isn’t.

The lack of pre-installed Flash is a non-event in my mind. It’s inconvenient but not fatal, since Flash does still run. The slight disappointment is that it makes setup of a Mac start to look more like setup of a Windows PC. It isn’t the end of the world, just a minor annoyance for most people.

If Java doesn’t make it past Snow Leopard, I will be very disappointed. Apple has been on a roll with the Mac, as it continues to capture a larger and larger share of the PC market. But you don’t continue to grow by limiting your options for your customer base. Growth comes by expanding options and opportunities. Let’s hope that Oracle steps up and agrees to fill this approaching void.

Once Again: Market Domination? Why Worry?

I read an article on MoneyCentral where the author ponders whether Apple is blowing its chance to “dominate the market” in PCs and smartphones. On PCs, his assertion is that the Windows Vista debacle was an opportunity for Apple to sweep in and push Microsoft aside. On smartphones, he implies that Apple had a chance to dominate a green field.

From where I sit, I’m not sure the author really understands either market. On PCs, yes, Vista was a disaster, but it didn’t affect the largest and most profitable segment of PC buyers very much: corporations. Many, many corporations continued to buy PCs running Windows XP, and Microsoft had to continually extend support for it to keep those customers happy. The reality is that, no matter how technologically superior MacOS is, businesses were not going to just dump their Windows machines in favour of it. They have far too much invested in terms of software, infrastructure and employee training/familiarity to abandon it. This isn’t like switching brands of toasters or microwave ovens. Switching PC operating systems is a significant undertaking. The Mac was able to make small inroads in some businesses, but Apple has not focused on the Mac as an enterprise computing platform. The opportunity on the enterprise side was extremely small, if not largely non-existent. The release of Windows 7 won’t cause corporations to suddenly buy more PCs: they were already buying what they needed (or not buying, courtesy of the recession). The Mac wasn’t on most lists to begin with, and didn’t have a realistic chance of being there, either.

On the consumer side, Apple has had some success. But no matter how much better MacOS could be for the average person, again, Apple is up against the fact that buying a Windows machine (even a Vista-based one) is viewed as the “safe” alternative. It isn’t about price. The average person doesn’t buy a super-cheap $500 PC; they spend $1k or more, right in the range where low-end Macs are positioned. The PC is “safe” because of the vast library of consumer software available, specifically things like games. It’s also “safe” because the computer most people use at work is a Windows-based machine, and most of their computer-knowledgeable friends are more likely to know how to deal with a Windows PC. Ultimately, running XP Home, Vista or Windows 7 doesn’t really matter to most consumers. Apple is up against a monumental amount of inertia here, and has done as well as could be expected.

When it comes to smartphones, the author of the article appears out of their depth, and seems to focus solely on the US market. Globally, the biggest players in smartphones are those based on Symbian. In the US, most smartphones right now are corporate phones, and that’s where BlackBerry rules. The consumer market in the US is still nascent, and Apple is doing very well there. What Apple is doing is avoiding the sorts of things that have made it hard for other players to sustain their profits on handsets: too many models, too many networks and not enough (if any) profitability per handset. Current handset manufacturers have no effective control over what they build: they are essentially told by the networks what to build, and the result is far too many handset variants, with all the attendant costs.

Apple is trying to change that formula by taking control. By limiting their network providers initially, Apple got a chance to “test drive” the market in a controlled environment while keeping their risk and costs low. It appears Apple is going to add other networks in the US. They’ve already done that here in Canada, since you can now get the iPhone on Rogers, Fido, Telus and Bell. Apple is moving in the same direction in the much larger mobile phone markets in Europe and Asia. The exclusive AT&T deal wasn’t forever. Apple’s success allows them a measure of control and leverage over other networks. This means they can say “no” to overly-customized variants of the handset that will only work well on a specific provider’s network. That in turn keeps costs down and margins higher.

Google, on the other hand, is trying to go down the road Microsoft has already travelled. Microsoft built Windows Mobile, and left it up to the device makers to do something with it. What Google is doing differently is, first, not expecting a significant revenue stream from Android on handset sales, and second, actually maintaining the operating system consistently. The risk for Google, though, is the loss of control to the handset makers. Some handset makers want to change the look and feel quite drastically (Samsung is apparently looking at this), and the risk is that the result won’t look or act like other Android phones. The OS becomes a commodity, and could be marginalized to the point where people don’t care about it. This means Android’s success is completely at the mercy of the handset makers, and Google has no effective control of the brand. As a result, Google is hedging their bets, making sure their software is available on other smartphones as well.

Either way, it is far too soon to tell who is going to dominate the US marketplace, and whether Symbian will be supplanted as the leader globally. Apple is doing well, and RIM is making inroads on the consumer side by leveraging its position on the enterprise side. Android should do well, but the risk there is that the handset makers marginalize it and don’t buy into it exclusively. I expect Android to be around, but not necessarily as a well-known individual brand in the minds of consumers (much as Symbian is today).

The last element of the article is the author’s dismissal of Apple’s stance that marketshare doesn’t matter, but being profitable does. On this one, Apple has it right. My only addition would be that it should be sustainable profit. Marketshare doesn’t pay the rent; profit does. Why should Apple attempt to bankrupt themselves trying to dominate the PC market? Particularly a market where the entrenched player (Microsoft) is there not because of a superior product, but because they are the de facto standard on the business side, and that spills over to the consumer market. For Apple to dominate, they would have to go after the enterprise market in a huge way, and that is a hard sell. That isn’t to say it couldn’t happen. The mainframe dominated the business computing landscape for decades, but was eventually supplanted by UNIX, Windows and now Linux-based servers. Changes in the computing landscape are possible.

Going back to a previous post I wrote (The End of Windows?), I believe that the next big battleground is smartphones, or more specifically, high-powered, highly-mobile computing. The mainframe didn’t die (IBM still sells billions of dollars’ worth of them); it was overshadowed by other server computing. I don’t think the PC will die, but it will change form as more and more people rely on computing that fits in their pocket.

So, has Apple squandered some opportunity to dominate PCs and smartphones? Given that smartphones for consumers are still quite new, I don’t think Apple has missed an opportunity there. That battle is just beginning. For PCs, I don’t believe that there was an opportunity to begin with, as described by the article’s author. I think Apple (and Linux) would still have made the gains they have made. The Windows juggernaut may have stumbled a bit with Vista, but not enough to harm it as much as the author implies, because the real fight for the PC desktop is the enterprise, and Vista isn’t the competitor there, XP is. Ultimately, Windows has far too much momentum behind it to be replaced in a matter of a few years.

Is Apple Really Working On A Tablet?

The collective disappointment about the lack of an Apple tablet as “just one more thing” was amusing to see, but honestly, how much of a surprise should this have been? The continuing evolution of the iPod meant that something was getting a camera (just not the iPod everyone figured would get it). The updates to the iTunes store interface are certainly welcome. The absence of a tablet was certainly not that much of a surprise.

But is Apple actually working on a tablet, and not some answer to the netbook phenomenon? If they are working on a tablet, will it be a MacOS or iPhone OS based machine? They could be working on a tablet, or they could just be working on an ultraportable MacBook with a touch-sensitive screen (possibly along the lines of the convertible Windows tablets that exist today). My instinct says that a netbook is the more likely choice, but one with a touchscreen of some kind.

Is it a Tablet?

If they really are working on a tablet, I don’t see it being much bigger than a 10″ unit, possibly smaller. Much larger and it becomes a bit awkward to handle (even though 10-12″ would offer a viewing size more akin to a conventional magazine). It would need to be enough larger than the iPod Touch/iPhone to differentiate it, but not so big as to be unwieldy.

Even if it is a pure tablet, I would be surprised if it uses the iPhone OS. Making it the same operating system as the iPhone doesn’t automatically open up all of the iPhone/Touch apps to the device, given that the screen sizes are radically different, and the various apps are largely built around assumptions about the current iPhone screen size (whether they should be or not). If the tablet’s dimensions are some multiple of an iPhone/Touch screen, then I could see several apps laid out like playing cards, tiled to run beside each other. The alternative would be for each app to run in its own free-floating window, fixed to the same dimensions as the iPhone/Touch screen. I’m not sure that the iPhone OS would make the most sense at this time.

I’m just not convinced that, right now, there is a market for a tablet. The tablets offered on the Windows side of the world have done very poorly outside of specific vertical markets, and Apple has done better by being late to a robust market than by trying to ignite a stagnant one. The one time they did try a stagnant market, the results were poor: the AppleTV. The home media PC market simply hasn’t materialized the way it was “supposed to”, but Apple tried to enter it anyway. MP3 players were starting to get popular when the iPod hit the market, and it took off like a rocket. The MacBook, iMac and PowerMac have all simply been alternatives to, or attempts at improvements on, an existing market. The iPhone was introduced into a market where smart phones were proven. I believe that the tablet market today is small and stagnant, not because of pent-up demand for a better product, but simply because no one wants tablets right now. People already wanted MP3 players when the iPod came out; the iPod merely made them more compelling. People already wanted smart phones when the iPhone came out; the iPhone, again, was perceived as “better”, and consumers have responded accordingly. Most consumers don’t seem to want a tablet.

Is it an Ultralight/Netbook?

I think this may be the more likely case. Apple already got their feet wet with ultrathin machines with the MacBook Air. The other physical dimensions of the Air, as well as the price, put it outside of netbook territory. It also seems that Apple executives have spent more effort putting down existing netbooks, and the market they are in, than they have on other product areas. Apple’s modus operandi has been to start by running down a market segment, and then “save” it by introducing something better (real or perceived). Other than some limited remarks about the Kindle, Apple has been a bit more vocal on netbooks than on tablets or tablet-like machines.

An ultralight, ultrasmall MacBook with modest CPU, memory and storage, but boasting a touchscreen, could have a better chance at short- and medium-term success. The form factor would be familiar. The expectations for such a device would not be as high as for regular-sized notebooks. Having the ability to convert it to a tablet, and having a multitouch screen, could be compelling to consumers.

Pricing will be key though. While Apple products are generally competitive, or at least not outrageous, in their own markets, the netbook segment appears to be the most price sensitive. Apple might get away with charging $50-$150 more for their product relative to the other machines that are out there. Much higher and people will either ignore it, or compare it to higher priced machines that will be more functional and more powerful.

The advantage of this approach is that it builds on technology that is fairly well understood. The only “new” element is a larger-sized touchscreen that isn’t pen-based. Even this isn’t new, because Apple has experience in this area with the iPhone and iPod Touch (albeit with smaller screens), and other manufacturers like HP have desktop computers with larger touch-sensitive screens today. The laptop parts, however, are well known. Apple has already had to deal with heat, power and feature issues in limited spaces with the MacBook Air. A laptop arrangement allows them to go head-to-head with the existing netbooks in a form that most people are familiar with, but adds the ability to make the machine easier to use by allowing it to be a tablet when it needs to be.

I would also expect that this device will be based on MacOS, again simply because it is familiar ground. As an operating system, it isn’t very resource intensive, certainly not when compared to the high-end operating systems that it is analogous to (Windows XP Pro, Windows Vista Business/Ultimate, Windows 7 Professional/Ultimate). It has a reasonable selection of 3rd party software. Besides, the device will probably support running some version of Windows as well, so if someone needs to use Windows software, but wants the Apple hardware, that would still be an option.

Lessons From the iPod

Apple has already tested the waters of a tablet-like-device-as-netbook in a limited way, trying to position the iPod Touch as also being a netbook. The response was decidedly lukewarm. I think it would be easier to position an ultraportable MacBook as a netbook, because it would simply be Apple’s take on the segment, much like the original iPod was.

When the iPod was introduced, companies like Creative and iRiver had feature-laden offerings with significant storage and reasonable prices. The first iPod was more expensive, didn’t have quite as much storage, and lacked features like an FM radio. The form factor was similar to the hard-drive based MP3 players at the time: your basic box about the size of a deck of playing cards. If features and price had been the governing factors, there is no way the iPod should have been able to dominate the market. Instead, the iPod did a few things differently. First, it was typically easier to use. The interface was simple, but still functional without feeling crippled. But it was more than that: the iPod was nice to use. People told other people this, and between word-of-mouth and a slick print and TV campaign, a device that came with a premium price and a dearth of features dominated the market. Adding the iTunes store behind it simply put the iPod in a league of its own, and it became the device that many other manufacturers now try to emulate and follow.

The netbook market is, in many ways, much like the MP3 player market was at the time the iPod came out. MP3 players were already on their 2nd or 3rd generation when the iPod (compatible with the PC) hit the market. Netbooks today are on their 2nd and 3rd generation, depending on the manufacturer. Unlike MP3 players, though, the software that drives them is either a variant of Linux or a version of Windows: the software and interface aren’t exclusive to the manufacturer of the machine. However, there is a proven market here, and one that appears to have legs.

Lessons from AppleTV

The other end of the spectrum is the AppleTV. So far, this product has been a disappointment. A lot of that seems to stem from the fact that there wasn’t a growing and exciting market for the device when it came out. Home theater PCs and their ilk simply have not been adopted by the mainstream consumer, and are largely relegated to a very small market of enthusiasts. HP tried and failed with a media center PC. Microsoft has been pushing the media center concept for many years now. Apple’s entry into this segment did nothing to kick-start the market, or create significant new demand in this product space.

Still Not Enough Data

As always, working from rumors and speculation means there is a distinct lack of hard evidence to support either guess :-). The 10″ and 12″ touchscreens that have reportedly been ordered in quantity could just as easily be put in some kind of netbook as in a tablet. The netbook market is hot right now, with little sign of slowing down. The tablet market, however, is generally slow, and the “pure tablet” without a keyboard or pointing device is virtually non-existent in the consumer space. I don’t see a new tablet changing this, much like the AppleTV didn’t do anything to ignite the home theater PC market.

Apple is very good at taking an existing technology segment and finding ways to make it easier to use, more functional or simply just better. The iPod was “better” in the minds of consumers because it was cool and easier to use, and it had the iTunes store behind it for content. The iPhone was easier for many people to use, and has a huge catalog of applications available via the App Store. The restrictions around contracts and network selection have not slowed the iPhone down. An Apple netbook would follow the same lines: take a market segment that is robust and has a future, and put out a product that is similar enough to be “part of the crowd” but different enough (and compelling enough) that people will pay more to have it.

I could very well be wrong, and Apple could introduce a tablet of some kind in the first quarter of 2010. They may even spring it on consumers right around Christmas. It just seems that the netbook market makes more sense at this time than a tablet does.

The End of Windows?

Will Windows ever be supplanted as the dominant operating system on the desktop? Is there an “end” for Windows? Hypothetically, it is possible. Virtually no technology has an unassailable position. The very few exceptions are things like 110 volt, 60 Hz outlets in Canada and the US (and 230 V, 50 Hz in other places) and lightbulb sockets. Examples of dominant technologies and business models being replaced are everywhere. Outside of computing, propeller-driven aircraft gave way to jets. Analog TV was replaced in the US with digital broadcast TV. Cable-driven excavators were largely replaced by hydraulic equipment. Catalog sales by mail were replaced by catalog sales by phone, which are being replaced by online ordering via the Internet.

In computing, mainframe computers were supplanted, or more correctly supplemented and eclipsed, by UNIX and Windows servers. Proprietary hardware from a limited selection of vendors has faded into the background, to be replaced by commodity hardware from multiple manufacturers. MS-DOS gave way to Windows, UNIX is slowly slipping back behind Linux, and Internet Explorer has given ground to Firefox.

So too, over time, will Windows give way to other operating systems. Windows has already started to slip in market share, losing a little bit of ground to MacOS and Linux. In almost all of the cases above, though, the old technology never really disappeared, and most of it is still around today (we still have mainframes, prop-driven planes and cable-driven excavators). What happened is that they lost their position as the dominant technology as people turned to other forms of technology to do what they needed to do. I do expect that, over time, Windows will find itself holding a less prominent place in the computing world.

However, conventional operating systems like MacOS or Linux are, in many ways, “more of the same”. They don’t represent a fundamental shift in how people approach computing. They simply present two other large operating systems as alternatives to Windows. And, like Firefox against Internet Explorer, they are gaining some ground. But I’m not sure they are better “enough” to replace Windows, or at least take over a significant share of the desktop and end-user computing space. The potential is there for MacOS to gain further ground, particularly in the enterprise space once Outlook is available, but I’m not sure we would see it dominate the desktop the way Windows has to date. What would be different enough is to have computing and end-user information always available, at any time, for users to access and use, and it is this type of shift that I think will reduce the dominance of Windows on end-user computing.

There have been many, many attempts at giving people a way to “take their information with them”. Xerox and the Alto workstation, with its dedicated network, let people in an office sit down at any Alto, log in, and get “their” desktop with their files and data. Something close to this is available with Solaris and dynamically mounted NFS filesystems that can give you your environment on any machine in your network. The original vision for the NeXT was for you to use a removable disk that contained your data and your environment, including your software. You sit down at a computer, plug in your disk, and get to work. Everything you need fits in your hand.

Docking stations and notebooks are the current iteration in that direction. You can have a large, multiscreen environment with a dedicated keyboard and mouse in your office, a smaller setup at home, and all you do is take your notebook back and forth. Compaq and Apple started this with some very early subnotebooks and dedicated docking stations. Many business laptops from Lenovo and Dell offer docking stations that provide this same type of functionality today.

These various solutions had problems. The “network as computer” approach of the Alto and Solaris was limited by the technologies available at the time they were conceived. Since they required specialized hardware and software, and needed a lot of available bandwidth, they didn’t scale up outside of an office environment. The NeXT concept was closer, since the removable disk was quite portable (although not quite pocket-sized), but it was burdened with the limits of the technology of the time, primarily speed, and was also limited to only working on NeXT devices. The dockable notebook, while more functional and ‘scalable’ in that it is a standalone computing device you can use anywhere, still results in (for the most part) bulky devices that you have to cart around.

Cloud computing and browser-based services like Google Docs take the “network as computer” model and scale it better, since the passage of time has produced standards for browsers and browser environments. Our available bandwidth has also increased over time. The downfall, though, is availability, and questions about the security of your data. Some web-based services only work when you are connected to the web, and don’t work when you happen to be offline; for business travelers, this can be an issue. However, this type of computing means that it doesn’t matter what operating system you run. As long as you have a browser with the right capabilities, you can do your work. This is a notable shift, and could supplant Windows. However, I’m not sure it is fundamental “enough”, since accessing these services is largely done via notebooks and desktops running Windows in the first place. It isn’t getting people to think about computing in a different way, just changing how they use some of their software.

There are two potential game changers that could throw a spanner into the Windows machinery: netbooks and smartphones. Currently, neither is in a direct position to challenge the supremacy of the desktop/notebook for resource-intensive computing or the enterprise. With their small screens, tiny keyboards and limited computing capability, they aren’t there yet. But what they are doing is getting people to think differently about their computing needs. As more and more people move beyond dedicated Internet connections at home or the office, and start to move to “always on, always there” Internet access from nearly anywhere you can sit, stand or lie down, the potential for desktop computing to shift is there. Inevitable advances in technology could allow these devices to become our primary computing platform, and as a result supplant Windows as the operating system on our “desktop”. They offer a balance: they can use Internet-enabled services, but also act as standalone computers when the network isn’t there.

Of the two, I believe that the smartphone has the greatest potential, but it also has the steepest hill to climb. While the netbook is compelling, and for some users will make sense as their core computing device, I believe that for a lot of people a powerful smartphone, with a rich set of applications, supplemented by external monitors, keyboards and the like, could change the game and start to move Windows into the background. I believe this because of the size and portability of these devices, which make it trivial to take the machine with you.

Imagine a world where your computer is in your pocket. All of your key files and other data are available on a device that is with you whenever you need them. Using wireless connections or a dedicated docking station for your phone, you can use normal-sized keyboards and large screens to edit documents and spreadsheets, manage e-mail, and engage in real-time conversations either via voice or text messaging. When on the go, you have this little device that can get you your e-mails, allow you to search for information, browse documents, and engage in other forms of communication. It can inform you and entertain you. You can still do some things with it, even when you have no network connection available.

Right now, Blackberries, iPhones and Android phones can provide anywhere from 70-90% of this, depending on your needs. I had occasions recently where all I had was my iPhone. I discovered that I can use an iPhone in its current form to do about 80-90% of what I need to do, leaving out programming and heavy-duty document editing tasks. As these devices get more powerful, and get more storage and better battery life, I could see them being useful for the bulk of the office tasks that I do today using dedicated desktop and notebook hardware. I don’t believe I am atypical in this case. If I could get the iPhone connected to a bigger screen (with more pixels) and a real keyboard, I could probably do almost everything I need to except for programming.

Will this work for everyone? Of course not. There are some things that are simply too resource intensive. Gaming, writing and testing software, computer design and engineering simulations come to mind right off the top. But, just like no one motor vehicle can “do it all”, I’m not naive enough to think that a souped-up smartphone will be able to do it all either. But I don’t believe that smartphone technology will simply stay put, and can see it evolving and improving over time.

How close are these devices? I believe they are maybe halfway there. The three shortfalls I see are CPU power, storage, and battery life. As always, technology marches forward, and these will be overcome. I don’t expect it to be tomorrow, or even in the next year or two. But they will improve.

Ultimately, for Windows to fade away, it will take a fundamental change in how computing works. I don’t see it being replaced by another variation of a desktop OS like MacOS or Linux, simply because they aren’t better “enough” to compel a change of that magnitude. I also don’t see Windows, MacOS or Linux disappearing completely: they (or something like them) will have a place on the desktop and notebooks for a long time to come, just as large UNIX servers and mainframes still have their uses. But for a lot of what people do, an uber-smartphone, with supplemental accessories under some circumstances, could be what eventually moves Windows and similar operating systems to a secondary or tertiary role in day-to-day computing tasks.