Robert Sedgewick has the slides for a talk, Algorithms for the Masses on his web site.

My favorite slide is the one titled “O-notation considered harmful.” Sedgewick observes that it’s more useful to say that the running time of an algorithm is ~aN^{c} (and back this up with actual evidence from running the algorithm) than to have a theorem that it’s O(N^{c}) (based on a model of computation that may or may not hold in practice).
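The usual way to gather that kind of evidence is a doubling experiment: time the algorithm at N, 2N, 4N, …; if the running time really is ~aN^{c}, the ratio of consecutive timings approaches 2^{c}, so log2 of the ratio estimates c. Here's a minimal sketch of the idea in Python (my code, not Sedgewick's; the function names are mine):

```python
import math
import random
import time

def running_time(algorithm, n):
    """Wall-clock time of one run on a random input of size n."""
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    algorithm(data)
    return time.perf_counter() - start

def doubling_estimate(algorithm, start_n=250, trials=5):
    """Print the doubling ratios; log2(ratio) estimates c in T(N) ~ a*N^c."""
    prev = running_time(algorithm, start_n)
    for i in range(1, trials):
        n = start_n * 2 ** i
        t = running_time(algorithm, n)
        ratio = t / prev
        print(f"N={n:>8}  time={t:.4f}s  ratio={ratio:.2f}  c = {math.log2(ratio):.2f}")
        prev = t

# A quadratic algorithm to try it on: for insertion sort on random
# input, the ratios should drift toward 4, i.e. c near 2.
def insertion_sort(a):
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

doubling_estimate(insertion_sort)
```

The estimates are noisy at small N (timer resolution, caching, interpreter overhead), so in practice you'd average several runs per size and look at the trend rather than any single ratio.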

The serious point of the talk, though, is that everyone should learn some computer science, preferably in the context of intellectually interesting real-world applications. This is what Sedgewick is doing in his Princeton course and in his book with Kevin Wayne, *Algorithms, 4th edition*, which I confess I have not read. There’s a Coursera course in two six-week parts, starting in August and November respectively. For a lot of the heavy-duty mathematics you can see Sedgewick’s book with Flajolet, *Analytic Combinatorics*, a favorite of mine from my grad-school days. There’s even a Coursera course for that too: a 5-week part 1 in February 2013 and a 5-week part 2 in March 2013.



*Algorithms, 4th Edition* is the book I currently go to when I want an easy-to-understand definition of an algorithm. It’s very “readable” compared to Sedgewick’s other books.

Here’s an interesting side-effect of O-notation.

Got a (constant, finite) bunch of different algorithms for a job and can’t decide which one to use? Just run them all in parallel, giving each one an equal slice of computation time, until one of them finishes! By O-analysis, the fastest one finishes first, so you’re guaranteed the best result, only off by a constant factor!
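This trick is sometimes called dovetailing. A toy sketch of the idea (my own illustration, not from the post): model each algorithm as a Python generator that yields once per unit of work, interleave them round-robin, and return whichever finishes first. With k algorithms, the total work is at most about k times the work of the fastest one, which is the "constant factor" in the comment above.

```python
def dovetail(algorithms):
    """Run generator-based algorithms round-robin, one step each per
    turn; return the result of whichever finishes first."""
    active = list(algorithms)
    while True:
        for gen in active:
            try:
                next(gen)  # one unit of work for this algorithm
            except StopIteration as done:
                return done.value  # a generator's `return` value

# Two toy "algorithms" that find a target by counting up in
# different step sizes; each `yield` is one unit of work.
def slow_search(target):
    i = 0
    while i != target:
        i += 1
        yield
    return ("slow", i)

def fast_search(target):
    i = 0
    while i != target:
        i += 10
        yield
    return ("fast", i)

print(dovetail([slow_search(1000), fast_search(1000)]))
# fast_search needs only 100 steps, so it wins the race
```

Of course the joke cuts both ways: O-analysis only promises the *asymptotically* fastest algorithm wins for large enough inputs, and the constant factor grows with the number of candidates you race.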