An Unevenly Distributed Future


by Matt Briggs

It is hardly news to anyone in Seattle that humanity across the entire planet is experiencing an unprecedented rate of technological change. In Seattle this is visible in the entire neighborhoods that have been replaced in the last ten years. According to Governing Magazine, Seattle has experienced a 50% gentrification rate since 2000, compared to a 40% rate in the 1990s. Cleveland, in contrast, has experienced a 6.7% rate since 2000. In Seattle, to travel to a new city, you only have to spend an afternoon watching a movie. You will find a new skyline when you go outside. Major shifts such as the movement from stone to metal tools, from hunting and gathering to agriculture, or from human labor to mechanical labor once took place over millennia or centuries. Since the end of the 19th century, however, we have experienced a continual and increasingly rapid succession of equally large technological shifts: the internal combustion engine, the rise of machines capable of computation, nuclear power, global communication networks, the spread of pervasive data collection, and the automation of complex information and physical systems.

The speed of change has become a given for us. For example, as increasingly automated or abstracted labor (robots, sweatshop workers, three-dimensional printers) produces material goods, the design, style, and function of those goods are disposable by design. Built-in obsolescence doesn’t just mean more sales; it also means these goods can be replaced with newer (meaning improved) items. Anyone on the treadmill of computer or smartphone upgrades understands how this works. You don’t buy a computer to hand down to your grandchildren. You buy a computer with specs strong enough to last you through one or two inevitable upgrade cycles. The experience of the new isn’t just the experience of new style, but of new capabilities. My computer now does things that can’t even be compared to the black-and-white Mac Classic I used in 1990; even my watch has a more powerful computer.

This winter, my toaster broke. I had originally bought it in 1990, and it worked perfectly until a few months ago. It wasn’t anything fancy. My toaster took in a slice of bread, heated some coils, and toasted it. A Paleolithic man would have no use for milled grain, much less sliced bread, and a toaster would be uselessness wrapped in uselessness. But for me, the toaster did a fine job for more than twenty-five years, and then one day it stopped working. My new toaster cost 1/4 of the old one’s price, and it’s made out of metal instead of plastic. It has expandable sides to toast bagels, muffins, whatever. I could probably toast a hot pocket in it. And it has a button and settings for toasting different things. I am certain I will have to replace this toaster in five years, and will pay yet again 1/4 of the price (or 1/16 of the original toaster’s price), and get a toaster that will do things as inconceivable to me now as a toaster itself would be to a Paleolithic man.

But this isn’t just about toasters. This is about everything in our society from the type of buildings being erected in the Denny Regrade to the kind of jobs that people do.

In 1980, my mother started her first white-collar job as a technical illustrator, drawing parts for a Boeing 737 on a drafting board in a hangar filled with other drafters. It was a sea of tables and lamps. By the mid-1980s, those tables had been replaced with CAD software, and she was drawing components on a screen. By the mid-1990s, my mom was interacting with a data model describing what was originally a circuit diagram. She created parts as database records. And her story is anomalous and a bit nostalgic because she has worked for the same company for the last thirty-some years.

Most of us are aware that what we are doing right now will not be done in the same way, for the same reasons, in ten years (but probably more like three). In my generation, working for a single company for more than five years is very rare. I don’t know what I’ll be doing or how I will be doing it in five years. We are all longshoremen in the decade before container shipping put them out of work, radically lowered the price of importing goods, and made it feasible to have goods manufactured by cheaper labor in overseas markets. Only we are experiencing this sort of change in months rather than in years.

In Abundance: The Future Is Better Than You Think, Peter H. Diamandis and Steven Kotler write:

Culture is the ability to store, exchange, and improve ideas. This vast cooperative system has always been one of abundance’s largest engines. When the good ideas of your grandfather can be improved upon by the good ideas of your grandchildren, then that engine is up and running. The proof is the enormous bounty of cumulative innovation produced by specialization and exchange. “A large proportion of our high standard of living today derives not just from our ability to more cheaply and productively manufacture the commodities of 1800,” writes J. Bradford DeLong, an economist at the University of California at Berkeley, “but from our ability to manufacture whole new types of commodities, some of which do a better job of meeting needs that we had back in 1800, and some of which meet needs that were unimagined back in 1800.”

Every couple of months we encounter objects as new to us, on the same scale of utility and change, as a toaster would be to a caveman, and this rate of novelty, newness, or whatever you want to call it is only accelerating.

Conventional wisdom says that in a rapidly changing world, a science and math education is essential. Presumably this is because these hard skills take a long time to learn and master. How long does it take to gain a STEM education? It probably takes a decade to become a mathematician, but how about a run-of-the-mill computer programmer?

It probably takes about four years of study to become a good software developer, especially if you are using one of the industrial-grade computer languages such as C#, C++, or Java.

But really, becoming a great developer, like becoming a great writer, takes at least a decade. In school, you can start writing coherently in the second or third grade. You can write a computer program within months of learning to code. Neither of those accomplishments will qualify you for greatness. The answer to “how long” is really “it depends.” It can take a decade, or four years, or a year. It doesn’t take months.

Four years? A decade? A year?

Technology is in constant motion. Since 2000, we have seen the rise (and relative fall) of Microsoft’s .NET initiative and Java; the rise of mobile computing as tablets and phones have become the dominant consumer platform; and the shift from large applications on company-hosted servers to what is sometimes called Software as a Service (SaaS) or Service-Oriented Architecture (SOA): you use this type of model when you use a travel web site or Facebook. And then there is the surprising rise of JavaScript, once a handy, lightweight language for HTML jockeys, into a flexible and widely used language, not just for the face of the web but for the nuts-and-bolts infrastructure that processes and provides its data. Technological change is not happening at the level of paradigms, but in recursively nested and increasingly rapid spirals as you get deeper and deeper into any discipline, from software development to genetics.
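To make that last shift concrete, here is a minimal sketch, my own illustration rather than anything from the talk, of what JavaScript in the “nuts and bolts” looks like: a tiny web service written in TypeScript (JavaScript’s typed cousin) using only Node.js’s built-in http module. The port number and the JSON payload are arbitrary choices for the example.

```typescript
// A small back-end service: the language family that once only animated
// buttons in a browser now answers HTTP requests on a server.
import { createServer } from "node:http";

const server = createServer((_req, res) => {
  // Return a small JSON payload, the kind of response a SaaS front end
  // or a phone app would consume.
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ message: "served by JavaScript, not just decorated by it" }));
});

// Port 3000 is an arbitrary choice for this sketch.
server.listen(3000, () => {
  console.log("listening on http://localhost:3000");
});
```

The point is not these dozen lines; it is that a language once dismissed as a toy now runs on both sides of the network, which is exactly the kind of reversal a worker has to be able to absorb.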

In this rapidly changing environment, a worker must assess how long it takes to learn a technology and how long that technology will remain current. If it takes ten years to become a good software developer, odds are very good that you will not end up programming in the language in which you started. It is not the language itself that is important, but the capacity to learn a programming language. And it is not even the capacity to learn the language that matters most, but the capacity to learn and to communicate what you have learned.

While this is a stereotype, one of the things that turns off people who have gone into the arts is the idea of “the right answer.” I liked math for this reason, and found as I progressed that the students who could get the right answer were rewarded. At first, in arithmetic, algebra, geometry, and trigonometry, there were in fact right answers. But very quickly our education system moved into applied mathematics, and it is the precision and correctness of answers, of results, that is rewarded. And it is from these students that the future software developers are culled.

In the humanities, however, we tend to learn that all answers are relative, and that what is rewarded is the act of communication, the ability to distill an essence into something tangible, and the student’s ability to navigate changing value systems. In a system that does not have correct answers, the individual value system of our instructors is something we always have to figure out if we want a good grade. In high school, I could not figure this out. I had a New Critical idea of aesthetics as a kind of calculus of signs and symbols, and I thought something was good because of its ability to demonstrate mastery and coherence as a self-contained system. I never got a good grade. In a way, this was a good lesson: something could be meaningful to me and yet not really worth much, not even worth something like a good grade, to a teacher. Each class I took ended up being a kind of paradigm shift as I encountered a teacher with a different worldview.

The most extreme version of this was in college where I had just come out of a critical theory class and learned about feminism, Marxism, New Historicism, Neo-Formalism, and Deconstructionism: a library of ways in which to interpret a text (and to produce contradictory readings of the same source material). Then I had a class on the great English poet, John Milton. John Milton to me was a figure of the late English Renaissance, a rational person who wrote a great defense of the freedom of thought and its expression in the written word, Areopagitica. Like any humanist, I felt that an individual human experience was of principal value and the source of truth. If I have faith in anything I have faith in “the genius of man … the unique and extraordinary ability of the human mind.” Less Descartes’s “I think, therefore I am” (although I believe that, too), but rather, I am human and therefore I value that which is human. It is tautological, but in an age when a phrase such as “post-human” is not merely a fever dream of Philip K. Dick, it seemed important to me to say the obvious. In this class, as a diligent English major eager to try my new critical tools, I applied New Historicism to a paper on Paradise Lost, and got back a C-. So I applied a Marxist framework and got back another C-.

I went to talk to the professor during his office hours because I couldn’t figure this out. Spelling errors by themselves usually didn’t get me a C. As he talked to me, I realized that not only was the professor a Christian, but he believed Milton was a prophet of God, and therefore Milton didn’t have a literary context or intent: his words revealed the mind of God. I may be a self-conscious humanist, but I am also an atheist. I went away thinking I should have talked to him sooner, because it was getting pretty late to drop his class. Then I figured, “What the hell?” I could figure out how to understand Milton with the assumption that his words came from the mind of God. I put myself into the worldview of this professor.

In retrospect, my education in the humanities was just that: learning the ability to deal with this sort of thing. An understanding of history and the history of ideas, knowing the dates of major events such as the adoption of agriculture, the reign of Augustus Caesar, the Norman Invasion, and the Armory Show, was important insofar as it revealed how much I did not know and could not know. It was less a fluency in facts and the verifiable than a fluency in my own ignorance and my ability to navigate the unknown.

It is the cultivation of this plasticity that is essential to any sort of work in the future. To repeat Diamandis and Kotler’s quote, “Culture is the ability to store, exchange, and improve ideas,” and an education in culture, in the humanities, is an education in how to store, exchange, and improve ideas.

Encountering a disciple of the prophet Milton in a state-run, public university (separation of church and state!), I could have flipped into a geek rage at the arbitrariness of this professor’s nonsensical authority. I might have been upset at the system of tenure that allowed the professor to use his chair as a religious pulpit. But this type of reaction, having to be right, has the unfortunate side effect of closing down lines of communication, closing down the mind, and embracing the known instead of the unknown.

The only thing we know about the future is that we can’t predict it. An education in the humanities prepares us for this uncertainty.


Matt Briggs is the author of eight works of fiction including The Remains of River Names and Shoot the Buffalo. His most recent book, Virility Rituals of North American Teenage Boys, was released by the Publication Studio in Portland in 2013. You can find him at The Suburgain.

This essay was presented on Feb. 4, 2016, at Seattle Central College Library as part of their Conversations on Social Issues Series. The topic was: “How Artists/Writers Rule the Information Economy.”