N.B. This essay, in the same spirit as my previous post, appears in the Communications of the ACM blog.
Those of us of a certain age (i.e., once able to use a slide rule) remember when the university computer (note the singular) was a scientific and engineering shrine, protected by computer operators and secure doors. We acolytes extended offerings of FORTRAN, ALGOL or COBOL via punched card decks, hoping for the blessings that accrued from a syntactically correct program that compiled and executed correctly.
The commonality across all our experiences was the need to husband computer time and plan job submissions carefully, particularly when one's job might wait in the queue for six to ten hours before entering execution. I distinctly remember spending many evenings laboriously examining my latest printout, marking each syntax error and tracing the program flow to catch as many logic errors as possible before returning to the keypunch room to create a new punched card deck.
Because computing time was scarce and expensive, we devoted considerable human effort to manual debugging and optimization. (The subject of manual memory overlays before virtual memory shall remain for another day.) Today, of course, my wristwatch contains roughly as much computing power as that vintage university mainframe, and we routinely devote inexpensive computing time to minimize human labor. Or do we?
Yes, we routinely use WIMP (windows, icons, menus, pointer) interfaces for human-computer interaction, cellular telephony is ubiquitous, and embedded computers enhance everyday objects – from microwave ovens to thermostats and running shoes. However, I suspect much of computing is still too socially conditioned by its roots in computational paucity to recognize fully the true opportunity afforded by computational plethora.
Many of us are still wed to a stimulus-response model of computing, where humans provide the stimulus and computers respond in preprogrammed ways. For example, traditional web search (traditional indeed – how quickly the new becomes commonplace) requires typed or spoken search terms to initiate a search. In a world of plethora, computing could glean work, personal, and even emotional context, anticipating information queries and computing on our behalf rather than in response. My computer could truly become my assistant.
In economics, the Jevons paradox posits that a technological increase in the efficiency with which a resource can be used stimulates greater consumption of the resource; Jevons himself observed that more efficient steam engines led to greater, not lesser, coal consumption. So it is with computing. I believe we are just at the cusp of the social change made possible by our technological shift from computational paucity to computational plethora.