Friday Linked List (revisited)


It’s been a long time since I’ve done a link dump here. The recurring (but not exclusive!) theme today is that specialization is for insects.

A post went up on Dr. Dobb’s yesterday with some very telling data that may be obvious to many of us, but should be pointed out to anyone who doesn’t see it happening all around them:

About every ten years, it seems, we’re told that we’re on the precipice of some revolutionary development in computing: The PC revolution. Ten years later, the networking and client/server revolution. (…) However, during the last 24 months, the sheer volume of change in the computing paradigm has been so great that programming has felt its imprint right away. Multiple programming paradigms are changing simultaneously: the ubiquity of mobile apps; the enormous rise of HTML and JavaScript front-ends; and the advent of big data. (…) The greatest effect these changes have had on software development is the requirement of multiple languages. Web apps, by definition, are multilanguage beasts. Many mobile apps require more than one language; and even if they can be written in one language, they often rely on a Web- or cloud-based back-end that’s written in an entirely different language. And, of course, big data is written using all forms of specialized languages. We have had to become polyglots to develop almost any current application.

That post shows two graphs of interest (not going to post them here, go read the article!) displaying survey results from 2010 and 2012. In 2010, a large number of developers reported spending most of their time working with a single language. In 2012, we see a massive shift to regular use of multiple languages. However, in my day job recruiting developers I still see a scary number of folks who started working in C++ or old-school Java a decade or more ago and…stopped. There’s still work out there someplace for those folks, I’m sure, but outside of some very specific niches, how interesting is that work going to be? I’m sure that there’s still some level of demand for COBOL developers, but unless you’re a pure mercenary, how likely are those projects to be ones that you’d be excited to work on?

More interesting graphs appear in a post today on GigaOM from Stowe Boyd, No Experience Necessary, Just Polymathematics, which looks at how the kinds of jobs the American education system is built to supply workers for are vanishing. The routine mid-skill jobs our economy relied on for most of the 20th century are evaporating, whether because of automation or globalization. While our schools double down on standardized testing and tightly confined subject areas, the people most in demand will be those able to quickly adapt into job roles that don’t exist yet.

On the same topic, last night I read a new book from NYU professor Kio Stark, Don’t Go Back to School:

Here is a radical truth: school doesn’t have a monopoly on learning. More and more people are declining traditional education and college degrees. Instead they’re getting the knowledge, training, and inspiration they need outside of the classroom. Drawing on extensive research and over 100 interviews with independent learners, Kio Stark offers the ultimate guide to learning without school. Don’t Go Back to School provides models and methods for taking a new kind of path through learning, and transforming that alternative education into an exciting career path. This inspiring, practical guide provides concrete strategies and resources for getting started as an independent learner.

I’m entirely sympathetic to this idea; on paper, I’m not qualified for the job that I have. Other than a course studying C in music school and a single graduate course in non-procedural programming, I’m completely self-taught as a developer. I’m in the process of re-learning all the math that I’ve forgotten since high school (no one told me that the AP Calculus class I half-heartedly took would eventually be crucial when I wanted to do audio signal processing work 30 years later).

However, the book relies mostly on interviews with people who’ve learned things outside of traditional educational channels and gone on to successful careers. The recurrent model is ‘Fake It Till You Make It’: claim knowledge or experience you don’t really have, then scramble to get up to speed quickly. There’s much less in the way of actionable steps or approaches for becoming a more effective self-teacher than I had hoped there would be. Perhaps a second reading will reveal depths that I missed the first time.

One other book that I’ve been meaning to write about here for quite a while is 10 PRINT CHR$(205.5+RND(1)); : GOTO 10 by Nick Montfort, et al.

This book takes a single line of code—the extremely concise BASIC program for the Commodore 64 inscribed in the title—and uses it as a lens through which to consider the phenomenon of creative computing and the way computer programs exist in culture. The authors of this collaboratively written book treat code not as merely functional but as a text—in the case of 10 PRINT, a text that appeared in many different printed sources—that yields a story about its making, its purpose, its assumptions, and more. They consider randomness and regularity in computing and art, the maze in culture, the popular BASIC programming language, and the highly influential Commodore 64 computer.
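If you’ve never seen the one-liner run, here’s a rough Python sketch of what it does. On the Commodore 64, `CHR$(205.5+RND(1))` picks PETSCII character 205 or 206 at random, the two diagonal line glyphs; I’m approximating those with the Unicode box-drawing diagonals here, and the fixed width and row count are my own additions (the original loops forever across a 40-column screen):

```python
import random

def ten_print(width=40, rows=10):
    """Approximate the C64 one-liner:
    10 PRINT CHR$(205.5+RND(1)); : GOTO 10
    Each cell is a coin flip between the two diagonal glyphs,
    which together form a maze-like pattern."""
    return "\n".join(
        "".join(random.choice("╱╲") for _ in range(width))
        for _ in range(rows)
    )

print(ten_print())
```

Run it a few times; the emergent maze is the whole point of the book’s meditation on randomness and regularity.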

If you’re heavily biased toward reading things with purely practical applications, this book is not for you. If you find software and source code itself a worthwhile subject of study, whether from aesthetic, philosophical, or anthropological points of view, it’s definitely worth at least grabbing the free PDF version of the book and checking it out.
