This column originally appeared in the September 2011 issue of About This Particular Macintosh.
September 11th this year marks a decade since the United States suffered the worst-ever attack on its own soil. Like my parents’ generation with the assassination of John F. Kennedy, I can vividly recall where I was and what I was doing when the news broke. I remember watching NBC’s live video as the second plane flew into the South Tower. That moment told us this was no accident. That moment, in hindsight, was when everything changed for America.
A familiar mantra rose up: “Never forget.” Such a simple phrase has obvious connotations, yet it can carry different meanings for different people. For some, it denotes revenge: not only never forgetting, but never forgiving those who attacked our nation and killed our fellow citizens. For others, myself included, it means learning from the history that led up to the attack so as to prevent another in the future.
Millions of bits and reams of paper have been published over whether the US should be in Afghanistan and Iraq. My personal position has shifted to one degree or another in the decade since 9/11, and we have yet to experience another successful attack. That appears to be the result of fighting the terrorist groups who wish us ill over there, leaving them without the resources to attack us here. It’s a lesson best summarized by the military maxim, “Take the fight to the enemy,” and it falls into the learning-from-history category.
Over the past decade, Apple has been doing quite a bit of learning from its own history. When Steve Jobs returned to the company he’d co-founded and then been summarily driven out of, he certainly put his stamp on the organization moving forward, and he did so with an eye on the corporation’s past. Model lines were streamlined, costs were slashed, and then new products began to emerge, with a new executive team to back it all up.
The debut of the iMac was the shot across the industry’s bow announcing that this was no longer the old Apple. Building upon that success, ten years ago this past March, Apple debuted Mac OS X. While that initial release had its issues, the decade since has brought polish that has indeed made each successive version of the operating system, including today’s Lion, “the best yet.”
That same year, Apple began an industry disruption with the release of the iPod. Apple didn’t invent the MP3 player category, but the little white electronic box the size of a deck of cards would go on to dominate that same category. Apple under Jobs certainly did not forget lessons from the company’s past here, and did something so audacious, it’s still being talked about in MBA classes*. The iPod mini, the company’s most popular iPod model, was killed. Nuked. Replaced. And the iPod nano then shot to the stratosphere.
(* Totally made that up, but it sounds good, don’t it?)
When Apple killed the iPod mini, it was a signal that not only was this no longer the Apple of years past, but that Apple was, as many of us have long observed, very different from other tech companies. Would Michael Dell have killed his best-selling model of anything? Would HP? Toshiba? The old Apple would have continued to milk the iPod mini for all it was worth while allowing innovation to stagnate. Not so with Jobs at the helm. How do you innovate your way away from a best-selling product? Make another best-selling product.
So you continue to polish the best operating system on the market, and you pretty much take over an entire market segment. What’s the encore? Another industry disruption: the iPhone.
Apple wasn’t going to just walk into the mobile phone industry and do well, remember? Now, for the non-tech crowd, “smartphone” has become synonymous with “iPhone”. Four years ago, in my little corner of Texas suburbia, I would never have envisioned the penetration the iPhone has since seen amongst the soccer/band mom crowd. Every time I turn around, if a middle-aged, minivan-driving mom has a smartphone, it’s an iPhone. Sure, there are a few Android phones floating around, as well as the rare Windows Phone 7 device, but the iPhone remains dominant. And the industry has only begun to scratch the surface of smartphone adoption.
Then there’s the iPad. Remember the tablet market before the iPad? On the Apple side of things, a third party was taking PowerBooks, and later MacBooks, and converting them to touchscreen tablets with swivel tops to cover the keyboard. PC vendors, working closely with Microsoft or not, had developed similar models for one Windows flavor or another. A few were sold in niche areas, but never in sufficient volume to justify there being a “tablet market”. Then Apple released the iPad, and it was all over before the rest of the tech industry could even blink.
The iPad was derided as an oversized iPhone without the phone. As it turned out, that actually sounded like a feature to quite a few people, rather than a bug. Here was a tablet that shared the same ecosystem of vetted apps available for purchase, was isolated from the threat of viruses, and didn’t require a For Dummies book to get up to speed with.
When we first started going to our current pediatrician a few years ago, all of the doctors and nurse practitioners were using netbooks to track patient information during a visit. Now they all have iPads, each running in a ZAGG keyboard case. Here’s a niche the Windows-based tablets of old would have targeted, and they have now been supplanted. As the industry has too slowly come to grips with, there isn’t a tablet market; there’s pretty much only an iPad market.
How did it come to this? Learning from the past. Over the last decade, Apple has looked to its own history to see what worked and what didn’t. It has also looked to the past of the entire tech industry. With such knowledge in hand, Apple has charted its own course and marched to the beat of its own drum. Apple’s profits and highly valued stock are the result of Apple setting the trends, not following what others might have done. The rest of the industry has yet to grasp this important distinction, and thus continues to flail about, chasing the tail of Apple’s comet.
Now the man who energized and turned the company around with his vision is stepping down. Steve Jobs has turned the CEO reins over to the able Tim Cook, and I have no doubt that in the Cook era Apple will remain the dominant player in the tech industry. (Yes, I said the dominant player. Who else has accomplished what Apple has in the past decade? Google? Microsoft? Please.) I imagine the current management team will remain relatively unchanged under Cook moving forward. When something’s not broke, why fix it?
Yet Tim Cook and the other Apple executives will be in a unique position to learn from their history. For while Jobs is no longer Apple’s CEO, he remains Chairman of the Board, and everyone knows he will continue to have some say in product development. Cook and Company have been living Apple’s history, and they will continue to do so. They must check future development against what has worked for the company in the past, making changes as needed so that it continues to work in the future. Be disruptive. Don’t do what everyone else is doing. Go against the grain. Think different.
The actions taken in Afghanistan and Iraq over the past ten years have reverberated across the Middle East, even the entire globe. We are still learning valuable lessons which our leaders, current and future, need to take heed of and understand. Be disruptive. Don’t do what everyone else is doing. Go against the grain. Think different.
For that’s how the world truly gets changed.