# What’s the Best Way to Learn? Just-In-Time versus Just-In-Case

*Illustration of an 18th-century classroom*

You will never be dumber than you are right now. You will also never have more time than you do right now. Thus, you have a relative abundance of time and a relative dearth of knowledge. How do we strike a balance between these resources to optimally leverage them for learning?

These questions came up as I listened to two episodes of the Ruby Rogues podcast. In episode 70, David brings up just-in-time versus just-in-case learning. David’s ideas were prompted by Katrina Owen, who has a list of learning resources here. The other thought-provoking episode (responsible for the above paragraph) was number 87, in which the rogues discuss Sandi Metz’s new book, Practical Object-Oriented Design in Ruby. (I had the pleasure of meeting Sandi last night at a local Ruby meetup, after a draft of this post was written.) Here’s Chuck riffing off of a quote from the book:

“Practical design does not anticipate what will happen to your application. It merely accepts that something will and that in the present, you cannot know what. It doesn’t guess the future. It preserves your options for accommodating the future.” And so, what that says to me is you don’t always have enough information. You may never have enough information. You will never have less information than you have now. So make the design decisions that you feel like you have to and defer the rest, until you don’t have to anymore. And so it was basically, “Here are some rules. But use your best judgment because you’re going to get more information that’s going to inform you better later.” And so, that kind of opens things up. Here are the rules but if you have the information that says that you have to break them, then break them.

The just-in-time and just-in-case distinctions are useful in answering the question I posed at the beginning. But before I give concrete examples, I think it is important to introduce another dimension to our learning classification: formal and informal. Being the good social scientists that we are, we can now formulate a two-by-two table:

|                  | Formal             | Informal                    |
| ---------------- | ------------------ | --------------------------- |
| **Just-in-case** | Algebra class      | Native language acquisition |
| **Just-in-time** | Programming course | Asking for directions       |

Just-in-case learning is done well ahead of the time that it is needed for practical purposes. Children learn English (or whatever their native language is) without thought for the letters, emails, and blog posts they will write in years to come. In a formal setting this can lead to the use of toy problems to make the skill seem practical. Students in an algebra class may have trouble seeing ‘the point’ of those skills until much later, and even then they may not fully recognize where that learning originated.

Just-in-time learning occurs at or very near the point of need. I could ask for travel directions to your house when we first meet, but that would be useless until you actually invite me (not to mention presumptuous). It is better to learn something like that when I can use it right away, since it has little value in the abstract. Programming, for me at least, has been much more of a just-in-time skill. I have taken one formal course in the topic and am currently enrolled in another. But the great benefit of these courses is that you get to put your skills to work immediately.

To answer the question we started with, I think that we need to place more value on just-in-time learning and less on just-in-case learning. As the quote from Sandi’s book points out, we live in a world of uncertainty. There are some skills that you simply cannot learn at a just-in-time pace (math being the main one that comes to mind). But for the plethora of other cases that our modern world and its tools make available, learning at the point of need is satisfactory and perhaps even superior. That is why we need to develop more avenues for just-in-time learning. Programmers have this in spades with sites like StackOverflow, but many other skill areas do not. Sites like Coursera also have a chance to provide a middle road between the categories in the table above. The ability to iterate quickly and pick up new skills on the fly will be increasingly valuable in the years to come.

# Wednesday Nerd Fun: Python for iOS

This one is short and sweet. Would you like to be able to write Python code on an iOS device? Now you can, with this app.

I have spent some time playing around with the app this week, and it seems to have two main uses. The first is entertaining yourself while waiting in line/riding the bus/whatever dead time you have where you would like to do something slightly more productive than checking Twitter. The second usage I see is making small edits with an iPad or iPhone while you happen to be away from your computer. (See the screenshot below showing the file loading functionality, more here.) In other words, this would not be my primary programming environment but it is a nice complement.

If you have never given programming a chance, having this app might make learning Python seem more like a game. If that’s what it takes for you to plunge into coding, go for it!
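If you want something small to tinker with in a mobile interpreter, a classic starter exercise is FizzBuzz. This is just an illustrative sketch of the kind of bite-sized program that fits a few spare minutes; it assumes nothing beyond the Python standard library:

```python
def fizzbuzz(n):
    """Return "Fizz", "Buzz", "FizzBuzz", or n itself as a string."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# Print the first 15 results, one per line
for i in range(1, 16):
    print(fizzbuzz(i))
```

Small, self-checking exercises like this are exactly the sort of thing that makes learning to code feel like a game.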

# Brookings: Hybrid University Classes as Good as Traditional Format

We randomly assigned students in seven introductory statistics courses on six public university campuses to take the course in a hybrid format (with machine-guided instruction accompanied by one hour of face-to-face instruction each week) or a traditional format (as it is usually offered by their campus, typically with 3-4 hours of face-to-face instruction each week).

We found that students in the hybrid format did just as well—in terms of pass rates, final exam scores, and performance on a standardized statistics test—as their counterparts in the traditional version of the same course.  This finding of “no effect” may seem disappointing, but we view it as hugely consequential because it shows that fears of online learning leading to worse outcomes are unfounded.  We certainly hope that more sophisticated versions of interactive online courses will produce even better outcomes, but clearly the first test they must pass is that they “do no harm.”

That’s from Matthew M. Chingos’ piece at Brookings, citing a study by ITHAKA. Read his comments here, including a link to the study’s full report. See also my thoughts from earlier in the week.

Update: The Boston Globe gave this some press too.

# Scaling the Apprenticeship Model

In his book The Success of Open Source, Steven Weber mentions how companies can integrate open source development into their workflow.* One reason for doing this is the competitive advantage that comes from a widespread, thorough understanding of the code. He then makes this remark:

A company that organized itself to boost the rate at which that kind of knowledge grew among its employees would have, in effect, created a business model that was quite viable in, and familiar to, capitalist economies.

In fact, many big companies try to foster knowledge growth among their employees as a way to gain an edge on their competitors. Google has a speaker series. Until recently, Yahoo! had a pretty good research department. Companies outside of tech do this too. I had a friend in Houston who did training exercises for Shell on learning styles and so forth. Supposedly, Wal-Mart encourages its employees to watch additional training videos while on the clock (see this podcast at about 10:04).

But none of this is a particularly new idea. Flash back to the late 18th century and you’ll see this model, on a smaller scale, all over the place. Back then it was known as “apprenticeship.” A professional, such as a silversmith, would hire a young man on a 4-6 year contract. During the first year or two, this would require a substantial investment in time on the part of the silversmith to train the young man. But by the fourth or fifth year, the apprentice would be able to repay the cost of his education by providing cheap labor. This arrangement is still around today, and is particularly widespread in Germany.

In the US, the apprenticeship model is most familiar to academics, but we have gotten away from this model in much of higher education.** Rather than providing practical training for a career, many college degrees serve only as a signal that the individual in question can learn. Thus, in many fields on-the-job training and hands-on experience are more valuable than degrees. In this type of environment you see widespread unemployment among recent college graduates.

This brings me to the key point of the post. The main shortcoming (as I see it) of new online education sites like Udacity and Coursera is not that they are a departure from the traditional model. It’s that they don’t depart far enough. In seeking to be reputable, they stay too close to the lecture format, simply exchanging in-person lectures for video ones. This is not true innovation. The true innovation will come when they learn to scale the apprenticeship experience. This is what lectures were first meant to do. It took several hundred years to reach a level of affluence at which enough people were attending college that we could find the scale limits of this model, but we have reached that point.

Apprenticeship was a one-to-one, hands-on education. Lecturing is a one-to-many, hands-off teaching style. With modern technology, there is an opportunity for a many-to-many, hands-on approach. This will mean conflicting influences, competing ideas, and room for independent thought. A marketplace of ideas, indeed.

See also: Peter Thiel on last night’s “60 Minutes.” (This post has been in the works since before that interview aired.)

______________________

*Notwithstanding here Eric Raymond’s point that open → corporate is the wrong direction for thinking about the flow of open source.