Many of these posts were written before the web publishing infrastructure was finished, with the effect that some of them refer to events that have since passed. This post was originally written on Steve Jobs’s resignation from Apple, and edited a bit on his death.
A great many posts written after the man’s death celebrate him as a visionary, as the strongest influence on our ‘technology culture.’
I am writing this as someone who has encountered the man and his products in ways that most people have not, at least among those who have written publicly. As someone trying myself to make what he once called ‘a dent in the universe,’ I often take apart the choices he made, because I want to be lucid and focused in the decisions I make in my own work.
Some history: I was peripherally involved in the Mach project, driven by the idea at DARPA that the operating system should be unique, universal, free and subsidized. (Mach, based on what was then the strongest variant of Unix, is the kernel that eventually became the core of OS X.) I was also involved, as a monitor, with Taligent in the early nineties. This was an effort by IBM, Apple and HP to create the next-generation consumer operating system, one that was object oriented and that integrated user interface concepts deep into the abstractions of services. This was during the period when Jobs was not with Apple, and in fact Apple bought his company NeXT because Taligent failed. NeXT, meanwhile, was being sustained by large purchases from the agency I was supporting. His NeXTSTEP (later OpenStep), based on Mach, was considered the most mature and capable development environment for real men at the time, and the creative people in the intelligence labs really valued it.
In this period I was somewhat more involved, also as a monitor, in the development of Dylan, a language that promised to merge functional, object-oriented and dynamic programming concepts. Dylan (DYnamic LANguage) was being developed by Apple’s Cambridge labs, heavily influenced by ideas from MIT and by Apple’s own implementation of Common Lisp. An immediate use was expected to be in the Newton PDA, probably leveraging the SK8 environment.
I had, and have, rather strong notions about ideas that would be ‘good’ for society, about their implementation in technology, and about their appearance in products and services. These values are based on individual empowerment, the ability to form self-directed creative collaborations, and freedom from unnecessary institutional constraints.
I am also aware of the limits of getting from here to there using the very infrastructure we hope to change. That infrastructure has two components: what government can usefully do, and how to harness the many horses of market forces. I succeeded and failed on the government side in various efforts. Steve failed and succeeded on the market side.
It is those choices that I bring to this post.
Now what I will write below should be taken as a personal reflection, closer to thoughts about my own next moves rather than a judgement on the man himself. He surely deserves the credit he is getting now. He was a good man and we are better off for having him.
I contrast him with Bill Gates. Bill and his cohorts were the greatest evil I encountered in my past life — and that life exposed me to much of the world that is evil. I use the term evil with care so as to deliberately place him in the category of having done conscious serious harm to the well-being of society.
(Meeting both of these men early in their careers tells you nothing of the personalities they would be known for. They were both personable but strong advocates for their products, people of the kind you routinely ran into that could have become friends.)
Bill did three things that I deplore.
He Stole His Founding Products.
Now, this offense was not as unique to Gates as the others. It was and still is common in other industries, but we forget that before his rise, work in the technology sector was more of an art, closer to science, where sharing was the norm, than to the culture of commerce. Every key research center worked according to the values of art and of what Jobs would later call ‘the liberal arts.’ It wasn’t a perfect time, but it was radically different from what followed, from what Microsoft brought.
The Basic interpreter and then the operating system were stolen, pure and simple. I cannot fully convey the chilling effect that had on the communities that could have fed the world. Sharing went away; it had to. Now we have the extreme situation of companies that make nothing at all; instead they ‘own’ ideas so they can extort patent payments.
Some analysts understand the cost to society as Microsoft sucked wealth out of the ecosphere, but few understand the far, far greater opportunity cost, lost because of the creation of a huge business based on what can fairly be called theft. This extends to Microsoft even today: though they were the giant and accumulated untold wealth, none of it, absolutely none, was turned to innovation. For the more than a trillion dollars they pulled out of the economy, they have invented essentially nothing beyond the business model and what it takes to reinforce it. The profit was based on the innovations of others.
The bottom line is that the sector was too immature to have protections against predators, and Gates took unseemly advantage.
He Developed a Culture of Destruction.
In post-war America, companies pretty much did well or not based on what they made and how good it was. This is the key idea behind market forces, the one that earns the faith we place in them: if you create value for the customer, you will do well. Gates took a different approach; whenever a competitor or potential competitor appeared, the entire company was turned to crush it. Their primary product was destruction, and they were very good at it. Within Silicon Valley, it was known that Microsoft would go further and personally humiliate the defeated, the way a mob boss would, so as to deter competition.
The threat was so effective that Microsoft could force a competitor to withdraw or fail simply by announcing that it would offer a competing product, even if that were a lie. Microsoft did this so often without actually creating the product that it became the subject of an intimidation watch in the Justice Department. (I may have testified.) Even with friendly clout purchased in the US Congress and the Administration, their egregious behavior was deemed criminal. Bottom line here: Gates was a dangerous thug when he had the power to be, and to the extent he could.
He Built a Closed Empire.
This is what powerful business organizations do when they can: become bigger and assert more control over the market. Microsoft did so by unethical means, but the nature of what Gates built was an offense of its own. Microsoft was the first company whose tools were used for everything. Every significant business on the planet had to use them, for better or worse. These tools were never designed with the first priority of enhancing the lives and work of the customer. They were instead designed to be merely ‘good enough’ in that regard while being excellent in how they served the strategic business goals of the company.
Few people will criticize Microsoft for this. After all, the tools were in fact good enough. But what if Microsoft really had engaged in open competition when it mattered? What if every business process and clerical transaction, every desktop analytical tool on the planet, had been designed not to benefit a single monopoly but to unlock the creative power the market could have released in those businesses? I solidly claim that we would be vastly wealthier, happier and more just as a society.
I do honestly believe that we would have made significant progress in eliminating the problems we face, instead of watching every one of them get worse. One man and his support crew prevented this, for gluttonous gain. It is why Google, when they started, took the slogan ‘don’t be evil.’
Steve Jobs, by contrast, has almost singlehandedly been able to reverse the second of these offenses. By this I mean he fixed it for the individual; it is too late for American businesses. Apple is not going to fight the fight of restoring creativity in business, and it may be too late for American industry to overcome its structural problems, many of which I blame on the tech industry. (Read my professional work on virtual enterprises for more on this.) But what Apple has done is restore at least the consumer technology marketplace to health. Now you simply have to make the customer’s life better or you cannot sell your gear. Apple gets that ‘better’ is multidimensional, but it has to be actually better for the consumer. (Jobs himself had Zen values, and understood that what we call kutachi matters.)
This was not an insignificant feat. Jobs is rightfully credited with saving and building Apple, but what he did as well was save the soul of an entire industry. For this we need to celebrate the man. All the qualities we have been hearing about after his death feed this triumph.
But now to the qualifications. Apple has to play the intellectual property game the same as everyone else. I am not a detractor of software patents: I believe they have their place, and in fact I have some personal interests in that dimension. Jobs was not able to influence this, though Apple does play a non-trivial role in the free and open source community. OS X has a central core built on open source, and Apple gives back to the BSD Unix community in the form of Darwin. The open source WebKit is the basis of every relevant browser today: Safari, Chrome, and the browsers in the Kindle, Android and Linux KDE. Wherever there can be an open standard, Apple pushes it. The most notable example is HTML5/CSS3 (a web standard), where the other big guys are being dragged in against their will.
Where Jobs is not a model for my life is in the last of Gates’s three offenses, the empire building. Others have remarked that Apple has gone from supporting producers of creative content to serving the much larger class of content consumers. This is a profound shift for Apple. From the introduction of the original Mac in 1984, the target customer was the creative professional; more precisely, the target was ‘the rest of us,’ those with unrealized creative talents. I remember going to Macworld Expo shows and thinking that I was in the midst of the most concentrated gathering of creative souls that could possibly be pulled together in the West; included among these folks were our most creative programmers.
Almost every user customized their Mac using some of the many third-party utilities, which ranged from changing the appearance of windows and controls to altering more fundamental behavior. Programming-language diversity was as great on the Mac as on research Unix systems, and in some ways greater. The appeal of the Mac in those days came close to my ideal of the future of computing: a future in which each person is in control of their own life and tools, with the personal computer adding to and extending that control. People could celebrate their talents, probably discovering new ones, by modifying their creative environments to fit themselves. Using a computer to create a magazine, website or movie was the least of it; people would be creating their tools for life as well.
One religious war in the computing world at the time involved the programmer’s editor vi/vim. It was, and is, a great editor, geared to helping you do your work. Others used Emacs. This was a completely different beast: it was designed to be extended, the assumption being that any programmer worth his salt would want to make his own tools. Emacs was more a foundation for building tools than the end vision of a tool. I was in the Emacs camp, and I still cannot imagine anyone living in a machine (or any creative tool) without tailoring it to one’s life. The early Mac was in the Emacs tradition, a tool that allowed you to make your own tools.
My Mac in those days was used mostly for writing. Nearly everything about the environment was designed by me to suit, down to the very font design and its antialiasing algorithm. I extended my word processor (Nisus Writer) to appear and behave like no other. Most elements were different from the norm, and no element went unexamined.
The machine had some serious deficiencies in its operating and file system, but it was clearly designed for those of us who lived in the machine, modified it to suit and then used it to create great things to send out into the world.
Then Jobs came back to Apple. He brought the wonders of NeXT, which had been the deep, rich environment of my professional life. NeXT was based on a philosophy similar to what we valued, but with vastly better tools and infrastructure. The machine came with a development environment that accessed and leveraged ‘objects all the way down,’ as Taligent had tried and failed to do.
We were in heaven, because we all thought the promise of computing had opened up. Sure, programming environments would have to mature to become more accessible to the ‘liberal arts user,’ but we had HyperCard, Frontier, OneClick and SK8 as guides, with Dylan pushing the envelope. Haskell was right around the corner. We thought that every user could potentially have a unique creative environment, and that user communities would grow specialized suites of tools to be shared.
But Jobs adopted a different set of fundamentals, which could also be said to be in the DNA of the original Mac if you considered it a consumer appliance. The inner workings of the machine should be set in stone and made invisible to the user. It should be an appliance like a toaster: easy to use and impossible to make complicated. Applications should ideally be powerful in only a few ways, so that the user won’t get confused. You weren’t supposed to make stuff so much as consume it. Apple got into the music player, movie and photo businesses, and the Mac became the hub for your digital life.
Third-party modifications vanished and the Apple Way became the only way. Personal expression, if you wanted any, was limited to the case color of your lickable new iMac. Dylan was killed and Apple Lisp was discontinued. Ultimately, if you wanted to program for the Mac, you pretty much had to use the Apple language and the Apple development tools. To qualify for their software stores, you now have to conform to a very restrictive set of requirements.
In the big shootout among Apple, Amazon, Facebook, Google and Microsoft, Apple is still the winner in my book, but I have invested enough in it to have earned the privilege to complain. In ten years, three of these will likely be irrelevant; I am betting on Apple and Amazon to win, but I think they will get there only if they allow us some room to innovate as individuals, so we can get down to the business of solving problems.
Computer science curmudgeon Alan Kay said of the Mac that it was the first computer worth criticizing. Similarly, Steve Jobs is the first personal computing entrepreneur worth criticizing. He fixed something that Microsoft broke, and did it using values I cherish. He taught the value of excellence and focus. He had the sense to ignore MBA wisdom about customers, deciding instead to find the few who would value that excellence and then to grow that base. In all of this he really was something of a genius. He had aggressive-passive Zen centering.
But he made tradeoffs that I would not, benefitting the institution of Apple at the cost, I believe, of individual creative empowerment and small group collaboration.
I hope to make different choices where I can.
When I was active in public research policy, the computer companies we hoped would counter the Japanese Threat were given a Congressional exemption from the antitrust laws to create a research consortium. These were Control Data, Digital Equipment, Harris, Honeywell, Motorola, National Semiconductor, NCR, RCA and Sperry-Univac, the best of the best in computing at the time. Many of them don’t exist now, and none of them are still in computing. I remember when Wang, which absolutely owned the word processing business, disappeared within a couple of years. We are in the midst of another big shift, and I do expect there to be imminent losers.