Apple’s keynote for WWDC (which isn’t considered confidential, so I can talk about it) only had one real surprise. The rest of what was presented was more-or-less expected, and some of it looks promising. The only surprise was the upcoming Mac Pro.
There are some (many?) who dislike comments in code, to varying degrees. Some believe comments have no place in code at all; others feel that comments should be used very, very sparingly. My own take isn’t quite as harsh. Personally, I like comments in code, but only comments that meet specific criteria.
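The post doesn’t spell out those criteria here, but one commonly cited criterion is that a comment should explain *why* the code does something, not restate *what* it does. A hypothetical sketch (the function, the loyalty-discount rule, and the policy number are all invented for illustration):

```python
def apply_discount(price: float, customer_years: int) -> float:
    """Return the price after any loyalty discount."""
    # A comment like "multiply price by 0.9" would merely restate the code.
    # A useful comment records the *why*, which the code cannot express:
    # marketing mandates a 10% discount after 5 years of loyalty
    # (a hypothetical policy, used here only as an example).
    if customer_years >= 5:
        return price * 0.9
    return price
```

The code alone tells a reader what happens; only the comment tells them the business reason, which is the part that can’t be recovered by reading the implementation.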
Yesterday, Thorsten Heins of RIM proclaimed that the tablet market will diminish within five years. His view is that the smartphone will be the focus of our attention, and that we will attach it to “big screens” when we need more real estate. The idea of dockable phones isn’t new, but so far it hasn’t caught on with smartphone buyers. Given current growth rates, and the overwhelming mass of predictions about tablets replacing PCs, it would be easy to dismiss Heins’ prediction as the words of a madman trying to shake things up. But there is a cautionary tale that should at least give us pause before proclaiming the tablet the future of personal computing.
I managed to win the WWDC 2013 lottery and get a ticket. I’ve written and trashed at least two very long posts about the complaining I’ve read from those who didn’t get in (or who just don’t like how fast the tickets sold). But I realize it’s pointless. No matter what system Apple uses to sell tickets, there are about 50x more people who want to go than can go. Most people seem determined to ignore this basic fact.
I’ve already covered why bigger venues (or a different city) probably won’t work. Even if Apple switched to some kind of lottery, there would still be those who act as if they are more deserving of a ticket, or who complain that the system is rigged or unfair in some way. It’s a game Apple can’t win; the best they can hope for is not to lose too badly. At least this year we had a chance to prepare to participate. I missed out last year (and it was my own fault), but I’m glad I get to go this year. I’m not making assumptions about whether I’ll get to go next year. If I do, great. If not, oh well. It won’t be the first event I’ve missed, and it won’t be the last. You see, I know one basic fact about the universe: life’s not fair. I deal with it, and move on.
There is an excellent two-part series on AMD at Ars Technica that provides a succinct review of the company’s history and a concise assessment of its strengths and weaknesses. I, however, want to take a different viewpoint on AMD’s future. It isn’t that I disagree entirely with the articles’ conclusions; it’s that I have a different perspective on why AMD is in more trouble than it realizes, and why it may be heading the wrong way.
I’ll start this with a confession: I have written and discarded about a half-dozen posts on Bitcoin, but none of them ever seemed “right”. Something else caught my eye in the meantime: a piece on Ars Technica trying to sensationalize an EU investigation into Google’s practices with Android as an “attack on open source”. I’m not sure the author understands antitrust law, even in a casual sense, or knows what constitutes a predatory practice.
The past few years in computing technology discussions have been dominated by mobile computing, followed closely by cloud computing. Now the talk is turning to “wearable computing”: embedding computing technology in jewellery, accessories and clothing. But I think the real future is bigger than that. It isn’t specifically about what you wear, what you carry, or what sits on a desk/table/lap in front of you. It is about what computing is available, and what computing you need at the time.
News came out that Apple has hired Kevin Lynch, most recently the CTO of Adobe, as Vice President of Technology. The rumblings on the Interwebs are raising questions, the biggest being: is this a good idea? Given Apple’s recent track record on senior hires (admittedly, a sample of one), the question is valid. But there may be method in the madness.
Some colleagues and I are in the midst of a (potential) new startup, and one of the questions we are exploring is around principles and philosophies for running the business. One element I contributed was that profit, while important, cannot be the underlying motivator for what we do. What matters is the product or service, and the execution to build and deliver it. It’s trite, but it’s about “doing the right things” and “doing things right”. Does that mean I’m against profit? That I think “profit” is a dirty word? Not in the least. I just want to see it put into the right context; when it is, good things can follow. When it isn’t, disaster is usually lurking.
A recent post by Tog speaks to issues with recent Apple UI design decisions, and it got me thinking about a pet peeve of mine. My complaint? Much of the Apple desktop experience for Macs seems to presume the use of modest-sized screens (up to around 1920×1200 or so). The historic holdovers (the menubar fixed at the top of the screen, the Dock fixed at one location on the desktop) as well as the newer arrivals (full-screen mode for Mac apps) all assume that no one actually wants to use either a really big monitor or a multi-monitor setup.