Friday Linked List (revisited)

[Image via http://10print.org]

It’s been a long time since I’ve done a link dump here. The recurring (but not exclusive!) theme today is that specialization is for insects.

A post went up on Dr. Dobb's yesterday showing some very telling data; it's maybe obvious to many of us, but worth pointing out to anyone who doesn't see this happening all around them:

About every ten years, it seems, we’re told that we’re on the precipice of some revolutionary development in computing: The PC revolution. Ten years later, the networking and client/server revolution. (…) However, during the last 24 months, the sheer volume of change in the computing paradigm has been so great that programming has felt its imprint right away. Multiple programming paradigms are changing simultaneously: the ubiquity of mobile apps; the enormous rise of HTML and JavaScript front-ends; and the advent of big data. (…) The greatest effect these changes have had on software development is the requirement of multiple languages. Web apps, by definition, are multilanguage beasts. Many mobile apps require more than one language; and even if they can be written in one language, they often rely on a Web- or cloud-based back-end that’s written in an entirely different language. And, of course, big data is written using all forms of specialized languages. We have had to become polyglots to develop almost any current application.

That post shows two graphs of interest (not going to post them here, go read the article!) displaying survey results from 2010 and 2012. In 2010, a large number of developers reported that they spent most of their time working with a single language. In 2012, we see a massive shift to regular use of multiple languages. However, in my day job recruiting developers I still see a scary number of folks who started working in C++ or old-school Java a decade or more ago and…stopped. There's still work out there someplace for those folks, I'm sure, but outside of some very specific niches, how interesting is that work going to be? I'm sure that there's still some level of demand for COBOL developers, but unless you're a pure mercenary, how likely are those projects to be ones that you'd be excited to work on?

There are more interesting graphs in a post today on GigaOM from Stowe Boyd, No Experience Necessary, Just Polymathematics, which looks at how the kinds of jobs that the American education system is built to supply workers for are vanishing. The kinds of routine mid-skill jobs that our economy relied on for most of the 20th century are evaporating, whether because of automation or globalization. While our schools double down on standardized testing and tightly confined subject areas, the people most in demand will be those able to adapt quickly into job roles that don't exist yet.

On the same topic, last night I read a new book from NYU professor Kio Stark, Don’t Go Back to School:

Here is a radical truth: school doesn’t have a monopoly on learning. More and more people are declining traditional education and college degrees. Instead they’re getting the knowledge, training, and inspiration they need outside of the classroom. Drawing on extensive research and over 100 interviews with independent learners, Kio Stark offers the ultimate guide to learning without school. Don’t Go Back to School provides models and methods for taking a new kind of path through learning, and transforming that alternative education into an exciting career path. This inspiring, practical guide provides concrete strategies and resources for getting started as an independent learner.

I’m entirely sympathetic to this idea; on paper, I’m not qualified for the job that I have. Other than a course studying C in music school and a single graduate course in non-procedural programming, I’m completely self-taught as a developer. I’m in the process of re-learning all the math that I’ve forgotten since high school (no one told me that the AP Calculus class I half-heartedly took would eventually be crucial when I wanted to do audio signal processing work 30 years later).

However, the book leans more on interviews with people who've learned things outside of traditional educational channels and gone on to successful careers than on practical guidance. The recurrent model here is 'Fake It Till You Make It': claim knowledge or experience that you don't really have, then scramble to get up to speed quickly. There's much less in the way of actionable steps or approaches for becoming a more effective self-teacher than I had hoped there would be. Perhaps a second reading will reveal depths that I missed the first time.

One other book that I’ve been meaning to write about here for quite a while is 10 PRINT CHR$(205.5+RND(1)); : GOTO 10 by Nick Montfort, et al.

This book takes a single line of code—the extremely concise BASIC program for the Commodore 64 inscribed in the title—and uses it as a lens through which to consider the phenomenon of creative computing and the way computer programs exist in culture. The authors of this collaboratively written book treat code not as merely functional but as a text—in the case of 10 PRINT, a text that appeared in many different printed sources—that yields a story about its making, its purpose, its assumptions, and more. They consider randomness and regularity in computing and art, the maze in culture, the popular BASIC programming language, and the highly influential Commodore 64 computer.
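To give a flavor of the thing itself, the title really is the entire program. Here's that single line of Commodore 64 BASIC, followed by a rough Python sketch of mine (not from the book) that does roughly the same job: pick one of two diagonal characters at random, print it, and repeat until the screen fills up with a maze.

10 PRINT CHR$(205.5+RND(1)); : GOTO 10

import random

# The BASIC line nudges 205.5 by RND(1), a value between 0 and 1; CHR$
# truncates the result to either 205 or 206, the two diagonal-line glyphs in
# the C64's PETSCII set. The Unicode diagonals below are stand-ins for those.
for _ in range(40 * 25):  # one C64 screen's worth; the original loops forever
    print(random.choice("╱╲"), end="")
print()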

If you’re heavily biased toward reading things with purely practical applications, this book is not for you. If you find software and source code itself a worthwhile subject of study, whether from aesthetic, philosophical, or anthropological points of view, it’s definitely worth at least grabbing the free PDF version of the book and checking it out.

Brett g Porter

Lead Engineer, Audio+Music development at Art+Logic. Always looking for excuses to write code. Tweets at both @artandlogic and @bgporter.


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

2 Comments

  1. John

    “…how interesting is that work going to be?”
    I disagree with the sentiment. Different problem domains and different environments lead to different challenges, and nearly all can be interesting. And don’t forget there’s a significant difference between the typical job for a COBOL specialist and one for a C++ specialist: COBOL was likely chosen because it seemed like the best tool for the job a number of decades ago, while C++ was (usually) chosen as the best tool for the job today. C++ is usually chosen for embedded development and for jobs that have significant latency requirements. That’s hardly a specific niche, and a tight latency requirement colors how you look at each piece of code – in many other environments people will laugh you off if you point out unnecessary dynamic memory allocation in a code review.
    I find that working across mixed languages/platforms adds interesting challenges to a job, too. Even UI development should be interesting in principle; the only reason I’m not fond of it is that, in practice, the developer ends up with no decisions to make. It’s as if you’re assembling a jigsaw puzzle with five people looking over your shoulder telling you where to put pieces, complaining about where you put some other piece, and disagreeing with each other.

    • bgporter

      Perhaps I made my point poorly here — all I’m saying (I think) is that developers who don’t keep their skill set current are at the mercy of whatever kind of work their aging skills permit them to do (and perhaps mentioning C++ was a red herring — I still use it daily for audio/music development, for exactly the reasons you list here). On the other hand, someone who’s done nothing but 2001-era client-server work in Java using whatever was current then is probably going to be restricted to maintaining buggy ten-year-old code. Some people thrive on that, and ‘how interesting can that be’ is more a rhetorical question expressing my point of view — for me, the answer is ‘not at all.’