Let’s Explore Wired’s Article about “The End of Code”


An article published in Wired controversially predicts “The End of Code.” That’s an effective way to get the attention of people who code for a living. Let’s explore what this means for professional coders like me and my A&L colleagues; for our clients who hire us to write code; and for society as a whole.

Is coding as a human activity going away?

No, coding isn’t going away in the foreseeable future. The article emphasizes that its headline prediction is still an unknown possibility, a plausible bit of science fiction. Coders remain in high demand today, and the successes of artificial intelligence, while impressive, are still confined to relatively narrow domains. Today’s CS students can breathe a sigh of relief, because the robots haven’t jumped the fences just yet.

Is society mistaken to teach coding in the classroom?

There is a wide push to introduce coding and computer science into today’s classrooms. If coding is soon going to go away, or even if the job pool is expected to shrink, this push would seem shortsighted. I’d argue, however, that coding is valuable as a way of expressing and manipulating abstract ideas using math and logic. Students who study coding will come away changed by their exposure to this powerful concept, just as with the study of algebra, trigonometry, or formal logic. It’s another way of thinking about our world and, in my opinion, a bit of a steroid injection to inflate the muscle of our kids’ intelligence — if you’ll pardon the terrible analogy.

Should professional coders be afraid of the future?

The Wired article might give casual readers the impression that coders are shuddering in fear at the thought of AI creeping up on our jobs. The reality is that we don’t fear AI; we consider it part of our own wheelhouse. We invented it, right? Far from taking away my job, AI actually makes me better at my job.
It’s not a new phenomenon that advances in technology make low-level drudgery obsolete. The critical insight is that this frees engineers to use new and powerful tools to express their customers’ requirements, instead of spending all their time doing the plumbing. For example, there was a time when every programmer had to work in machine code. Very few of today’s programmers even know how anymore; yet the minute they import a state-of-the-art, pre-written library, they become more powerful as programmers than any of the computing pioneers.
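To make that point concrete, here’s a minimal sketch in Python. The sqlite3 module is just a convenient stand-in for any pre-written library (the table and values are invented for illustration): one import hands a modern programmer a complete SQL database engine, the kind of infrastructure that early computing pioneers would have had to build from scratch in low-level code.

```python
import sqlite3

# One import gives us a full relational database engine.
conn = sqlite3.connect(":memory:")  # an in-memory throwaway database
conn.execute("CREATE TABLE clients (name TEXT, project TEXT)")
conn.execute("INSERT INTO clients VALUES (?, ?)", ("Acme", "inventory app"))

# Query it with declarative SQL instead of hand-rolled file I/O.
rows = conn.execute("SELECT name FROM clients").fetchall()
print(rows)  # [('Acme',)]
```

Five lines of "plumbing" that would once have been a multi-year systems project — which is exactly why today’s engineers can spend their time on the customer’s problem instead.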
My colleagues and I find artificial intelligence fun to play with. This ranges from occasionally kicking the tires, to an experiment recently published on GitHub by our colleague Davy Wybiral, all the way up to A&L alumnus Dr. Alan Lockett, who did a postdoctoral fellowship in Switzerland researching AI.

How will our customers benefit from AI?

In the months and years ahead, programmers will take tools like machine vision, speech recognition, and even natural language processing for granted, just as today we take database systems and network stacks for granted. Software engineers will push the envelope to help clients think of creative ways to use these technologies. Amazing new experiences are coming, whether in the cloud, in handheld and wrist-worn devices, or via totally new appliances like the Amazon Echo with Alexa.

So what’s the problem?

I’d suggest there are two much bigger and more immediate problems to worry about than coders losing their jobs to artificial intelligence:
First, AI will eliminate many other categories of careers sooner, touching an alarmingly large fraction of the workforce, especially blue-collar jobs. This is not science fiction: much of the technology exists today, and its rollout is limited mainly by national infrastructure. Transportation, factory and warehouse jobs, food services, custodial services, and many other occupations are likely to be taken over in a dramatic and rapidly accelerating trend. Technology and economics will drive this shift, and society needs to prepare for the political and humanitarian implications of the coming displacement.
Second, we shouldn’t ignore the privacy and security concerns that are only growing every day with the ubiquity of surveillance and the AI software to process it. While superficially useful for law enforcement, these trends are widely criticized for their potential abuses. Are we ready to live in a world without privacy? Are we comfortable with search engines revealing details about our personal lives to strangers? Are we prepared for governments to go all Orwellian with this kind of surveillance, and for the chilling effects on personal freedom? These are immediate questions for us to grapple with as a society.

The future will reward the bold.

Circling back, will coders ever be vulnerable to automation of their jobs? Maybe. We can’t know the future. Certainly the tides of AI are rising and lapping at the heels of drudgery and repetition in software engineering, but that’s a natural part of the business. Software engineers who are “coding the impossible” today and who keep up with the pace of technology will be best prepared to ride the wave. We will always need analysts who can talk to the customer’s project team and glean a cohesive requirements picture from potentially conflicting information. We will always need engineers who can pick up the raw materials and tools of software in whatever constitutes the current state of the art, and who can synthesize out of all of this something that fulfills a client’s vision.
My own prediction: there will always be engineers, and we will always rule the machines. Perhaps someday engineers will converse more naturally with their computers, much like Mr. Scott (“Scotty”) from Star Trek: “Keyboard. How quaint.”
