A bit of context: I wrote this for a magazine a couple years ago, but it ended up not working out, so I’m publishing it here instead.
I
Most of us remember when we taught our children to ride their bicycles, and I am no exception. I also remember many moments teaching my daughters another tool: our family computer. From the moment we started letting our daughters use a computer regularly, we have let them play video games a little — but given them far more time if they will spend it creating. “Does sending iMessages full of gibberish count as ‘creative time’?” is a question we have had to answer perhaps a few too many times as a result. Slowly but surely, though, the message has stuck.
I took a picture a few years ago of my older daughter, then about ten years old. In the photo, she is sitting at our family iMac: she has headphones on, the GarageBand app open, a MIDI keyboard at hand, and is happily composing a piece for drum kit and strings and her own voice. An hour later, she proudly shared it with us. It was about what you would expect from a talented, but not prodigy-like, ten-year-old. We smiled wholeheartedly at her and found things to praise in that first attempt. I loved how she wanted to follow, even a little, in my footsteps as a composer. Even more, though, I loved that she was learning how to make the computer not only one more means of entertainment, but a tool for bringing the good and beautiful into the world.
II
Ivan Illich, in his 1973 book Tools for Conviviality, defined a convivial tool as a “responsibly limited tool.” As an example, he offered a bicycle. Nearly anyone can ride a bicycle. Bicycles require no licensure, and no training process. They are easy to borrow and lend, and relatively simple to repair. They are very unlike automobiles, which both enable and demand a radical restructuring of geography. A bicycle extends our reach only so far as we are willing to pedal in the elements. Yet bicycles are also remarkably capable tools which can sharply alter the possibilities for any given person on any given day.
The most convivial tool in my life (besides my own bicycle, anyway) is my Mac. It was easy to learn and has grown with me through the many phases of life since I first got one nearly twenty years ago. Over those years, I have used my computers to learn, write, use social media, compose music, play video games, and build software. Those uses are decidedly mixed: ranging from entertainment and distraction to creative and life-giving work. Social media, for example, has been a great dissipation at times, but also a place where I made some of my closest and most enduring friendships. Similarly, video games have filled very different roles in my life: joyful play with my family and mere diversion after a long work day. Yet in the main, my laptop has been a tool, not a toy. Every one of the many seminary papers, blog posts, and essays I have written in the past fifteen years was written on one of my Macs. With these machines, I have composed orchestra music and recorded it with musicians on the far side of an ocean, written church music for our little congregation, even teamed up with a poet-friend who lives in another state to co-create a bit of music for my wife and one of our best friends. Most importantly, I have made my living on these machines for the past fifteen years as a wholly self-taught software engineer.
The idea of a “home economy” sometimes gets conflated with an agrarian economy, as if it is only available to farmers. One of the few upsides to 2020’s terrible disruption, though, was that many people, and at least some companies, realized much more of their work could be done anywhere. The door to working at home cracked open for many of us in the knowledge economy. If some knowledge work certainly falls under the rubric of “bullshit jobs”, much of it is also generative and good. (That division goes for all work in a broken world. Nor should it escape our notice that “bullshit job” and “taskmaster” are metaphorical, if meaningful, for knowledge workers. Both terms were and are quite literal for many agrarian workers.) Even a frustrating job in the knowledge economy can be a step toward a real home economy. It was for me.
In early 2013, my wife and I moved across the country, our seven-month-old first daughter in tow, for me to start a Master of Divinity degree. I was able to persuade my employer to keep me on part-time, working remotely as a contractor. The bookcase in our kitchen I used as a makeshift standing desk was not glamorous, but I wrote a lot of code and seminary papers alike there. One day a few years along stands out in my memory. Our first daughter was now deep in the throes of potty training and we had another energetic little girl on our hands as well. My wife looked at me very seriously and told me she was going to lose it if she had to clean another puddle of toddler pee off the floor. Most of the other seminary families would have been stuck in that moment. The dads were always out of the house, either studying or working. I worked at home, though. I sent my wife out to her favorite coffee shop with our younger daughter. For the rest of the day, I alternated between stints of coding and running our toddler to the bathroom… and sometimes, yes, cleaning up puddles off the floor. A home economy indeed.
Speaking at the Library of Congress in 1990, Steve Jobs described how remarkable bicycles are. Humans are far outclassed by many other animals when it comes to our ability to turn energy into motion with our own two feet. Our toolmaking makes us capable of feats we could never manage otherwise, though. A bicycle vaults humans’ energy efficiency far beyond any other animal. The same, Jobs argued, is true of computers. They enable creative work hardly possible before: any group which cares can publish a top-notch magazine, for example. In Jobs’ telling, a computer is like “a bicycle for the mind”. The analogy is suggestive. Perhaps computers, like bicycles, can be convivial tools.
III
Jobs’ speech has its roots in an intellectual and practical project Douglas Engelbart launched nearly thirty years earlier with his 1962 report, Augmenting Human Intellect. Engelbart outlined a vision of computer programs for note-taking, ready to be adapted to each user, so as to “harness your creativity more continuously.” A variety of “personal knowledge management” and “tools for thought” apps carry on this stream today. Curiously, though, “tools for thought” enthusiasts often end up focused on the tools rather than the thinking. I confess: I have indulged in this mistake myself. A folder on this very laptop contains thousands of plain text files, some 1½ million words of notes. I have spent many hours reorganizing them, experimenting with naming schemes, trying to get the links between them just right, and following interesting trails through them. All my tinkering with that system has not made me a better thinker. No intelligence inheres in interlinked documents. (Else the world wide web would have made geniuses of us all.)
One reason I get sidetracked by tinkering is zeal for the quality of my tools. Another is that tinkering is always easier than actually thinking. I suspect, though, that the impulse is stronger with this particular machine. The flexibility, the sheer generality, of computers means they do not focus or direct our use. They are not like a camera, dedicated solely to capturing photographs. This is computers’ greatness, but it is also what makes them easy to misuse. Their programmability can mislead us into believing the hard work of thinking itself is avoidable. Phrases like “outboard brain” indicate a failure to understand how thinking works, and why it is — always — work. That goes equally for a notes system made up of plain text files and for large language model chat interfaces. Engelbart’s dream, like Jobs’ word-picture, was of computers as tools allowing us to think better; too often we act as if they will do the thinking for us. We treat them like automobiles instead of bicycles. No matter how good our notes system or how fluent our chatbots, though, true understanding is hard-won. We have to pedal.
As with any such imaginative framing, then, the “implementation details” matter very much. After all, some of my time spent with my notes system has been illuminating. Writing down ideas and considering how they relate to each other has sharpened, clarified, and expanded my thinking over time. Good tools can provide scaffolding for that kind of work. They can make it easier to use one’s existing notes for reflection and revision. At the end of the day, though, the thinking has to be done in real time by a human. That is what makes pen and paper note-taking so powerful.1 A convivial note-taking app would respect that reality — and promise no more.
IV
In his February 2020 essay An app can be a home-cooked meal, novelist Robin Sloan describes a tiny video messaging app he built for his own family, with the delightfully silly name BoopSnoop. The app did just enough for Sloan and his family. It had absolutely no need for a monetization strategy, the level of polish of a published app, or even a less silly name for app store marketing. Sloan concludes the essay by arguing that there is something good about learning to code the way we might learn to cook. Few of us will become Michelin-starred chefs. Many of us can enjoy cooking in its own right, though. Most of us can bless our family and friends and neighbors with food. The same can — should — be true of software.
This is not the only path to convivial computing, though, any more than the goodness of home-cooking rules out the goodness of great restaurants.
The most powerful companies in the world include computer companies. Amazon, Apple, Alphabet (née Google), Meta (née Facebook), Microsoft, and lately Nvidia top the list in America. There are many differences between these companies, but they all operate at nearly inconceivable scales. They have billions of users. Their products have cost billions of dollars to develop over the span of many decades; building competing platforms would take comparable amounts of both money and time. (For all that bicycles make for a pretty good business, it is hard to imagine them dominating our social, political, and even artistic spaces — still less our cultural imaginary — the way those tech giants do.) Does that fact of scale alone make computers un- or even anti-convivial technologies?
Social media is certainly industrial in scale, and most of its harms are directly due to industrializing what should not be industrialized: community and conversation. Scale itself is indeed most of the problem there. No one can moderate four billion “users” (that is: people). At the same time, Sloan’s BoopSnoop was itself possible only because it built on the foundations laid by those massive corporations. It stood on the shoulders of thousands of corporate software developers and indie open source contributors alike. Likewise, consider how my own little home economy has both benefited from and contributed to mega-corporations — not least in a nearly five-year stint at one of them.2
Scale can dehumanize. It can also empower and enable. In the small, we are free to play, to experiment, to learn, to write essays for small magazines, to compose orchestra music, to keep in touch with friends around the world. In the large, we can collaborate on vaccines and antibiotics to defang a pandemic. The sheer generality of computing makes all of these possible. We are responsible to choose how, and when, and where, and even why we use computers. We must use them to magnify rather than diminish our humanity, and we must use them for human ends.
Technological enthusiasts in every age think novel technologies good — often rightly, but not always. The technological skeptics of every age think them bad, often justly — but not always. Computer aficionados have certainly fallen prey to the first temptation often enough. Tech critics have equally often fallen prey to the second. Mostly, the two camps speak past each other. Perhaps the two can have a real meeting of the minds today in the recognition that computing is not all it could be: that convivial computing is possible. Moral hazards around computing are real. So is the potential for the beautiful and good, though. I, for one, am unwilling to let the ills of social media, or the extractive aims of the largest companies in the world, have the final word on what computing is or can be. We must continue to push back against the idea of tools as substitutes for our humanity, must always insist on shaping them into the kinds of things which make us more truly human. But we should also rejoice in the ways they do just that.
These slabs of silicon, and the software that runs on them, can be a help to home economies. They need not be substitutes for thinking; they really can be bicycles for our minds. Convivial computing exists in the world today, and we should all of us encourage it wherever we find it. The next time you use a computer, then, think about what you are about to do. Will you scroll endlessly, feed the trolls, hate-read your political enemies? Or will you build a friendship, create a work of art, cultivate a new skill, share something beautiful: ultimately, glorify God? We get to choose whether our computing is convivial. Every time we make something genuinely good of computers’ capaciousness, we build — just a little more — a life fit for humans, tool-makers that we are.
V
A month ago,3 my younger daughter asked me to compose a setting for an acrostic poem she wrote — and so I did. Last night, I sat down at the piano with her to practice it: we both played, and she sang. I composed the piece as I always do: sitting at my Mac, with a MIDI keyboard at hand, entering the notes into one of my favorite pieces of software, listening to playback with samples from some of the best musicians in the world, printing it out at a quality unthinkable for this kind of home project even fifteen years ago. All of this could be done purely by hand, yes, but it was better this way — much better. And when we finished playing through it together, my younger daughter looked up at me, smiled a huge smile, said “Thank you, daddy,” and hugged me hard.
Notes
1. Though we should also acknowledge that even writing can be a crutch.
2. I worked at LinkedIn, a subsidiary of Microsoft and a behemoth in its own right, January 2019 – October 2023.
3. Per the context comment at the top, this was “a month ago” when I wrote this essay, but that was a couple years ago.