But Officer, you didn’t see me stop!

Physicist uses math to beat traffic ticket, via Physics Central and Hacker News.

The paper is dated 1 April 2012, so it may be a joke, but the idea is at least theoretically reasonable. The ticket in question was for not stopping at a stop sign. The argument is that, to a police officer, a driver might appear not to stop, despite having actually reached zero speed for an instant, if another car happens to obstruct the officer’s view at the critical moment.

But is that a “stop” anyway? Is there some minimum amount of time one must be stopped at a stop sign? Still, it’s a nice little piece of mathematical modeling.

Weekly links for April 15

Andrew Gelman asks: do statisticians practice what we preach in teaching? (His conclusion: no.)

Samuel Arbesman, Probability and game theory in The Hunger Games and Brett Keller, Hunger Games survival analysis. Andrew Gelman writes: “I think it’s always good to get practice. Analyzing a book/movie is like doing sports statistics; it can keep you in shape.”

Here’s that annual list of the year’s best jobs. Mathematician is #10.

MagicTile: Geometrical and topological analogues of Rubik’s Cube.

Luis Apiolaza has some ideas about what an introductory book on Bayesian statistics should be like; commenters there have listed some of their favorite such books. He also has a post on first impressions of Kruschke’s book “Doing Bayesian Data Analysis” (which has puppies on the cover!) and a link to a free-for-noncommercial-purposes PDF version of Joseph Kadane’s Principles of Uncertainty.

Pink is a real color

They did it to Pluto, but not to pink! Please not pink!

Robert Krulwich points out that there is no pink in the rainbow, linking to a YouTube video:

Saying that pink isn’t a color is a little silly, though. The space of colors that humans perceive is three-dimensional. Start with an actual light source, which has a continuous spectrum; human vision roughly projects from the space of possible spectra to a three-dimensional space, each dimension corresponding to one of the three types of cones. The “pure” colors (single wavelengths) correspond to a two-dimensional manifold in that space, with one dimension for hue and one for brightness. Just because you wouldn’t call anything on that two-dimensional manifold “pink”, does that mean pink isn’t a real color?
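
As a toy illustration of that projection (just a sketch, with made-up Gaussian sensitivity curves standing in for the real cone responses), here is the collapse from a whole spectrum to three numbers in R:

wavelengths = 400:700   # visible range, in nanometers

# made-up Gaussian sensitivities standing in for the S, M, and L cones
cone = function(peak, width) {exp(-((wavelengths - peak)/width)^2)}
S = cone(440, 30); M = cone(540, 40); L = cone(570, 45)

# an arbitrary example spectrum: mostly long wavelengths plus some short ones
spec = exp(-((wavelengths - 650)/40)^2) + 0.5*exp(-((wavelengths - 450)/30)^2)

# the projection: each cone reports a single number (its overlap with the
# spectrum), so the whole continuous spectrum becomes a point in 3-space
c(S = sum(S*spec), M = sum(M*spec), L = sum(L*spec))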

There must be some aliens, somewhere out there in space, that have yellow-sensitive cones as well and are offended that we don’t think yeen and grellow are real colors.

(hat tip: LA)

A coin-tossing decision problem

You have two coins, one of which comes up heads one-third of the time (coin A) and one of which comes up heads one-half of the time (coin B).

You pick one of these coins uniformly at random. Then you flip it twelve times, and it comes up heads five times. Which of the two coins is it?

The first instinct is to say that coin A is expected to come up heads four times, and coin B is expected to come up heads six times… five is halfway between those, so we don’t know anything.

But the standard deviation of the number of heads from A is \sqrt{12 \times 1/3 \times 2/3} = 1.63 and the standard deviation of the number of heads from B is \sqrt{12\times 1/2 \times 1/2} = 1.73; so in standardized terms, maybe B is more likely?

So let’s use maximum likelihood. The probability of getting 5 heads in 12 flips from coin A is

{12 \choose 5} \left( {1 \over 3} \right)^5 \left( {2 \over 3} \right)^7 = {11264 \over 59049} = 0.1907568

and the probability of getting 5 heads in 12 flips from coin B is

{12 \choose 5} \left( {1 \over 2} \right)^5 \left( {1 \over 2} \right)^7 = {99 \over 512} = 0.1933594

The likelihood of coin B is larger, so we choose B.
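
For what it’s worth, these two binomial probabilities are one-liners in R:

dbinom(5, 12, 1/3)   # coin A: 0.1907568
dbinom(5, 12, 1/2)   # coin B: 0.1933594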

Alternatively, this is equivalent to using Bayes’ theorem with a uniform prior. Say P(A) = P(B) = 1/2. Then we can find the posterior probability of having picked coin B, given 5 heads:

P(B|5 H, 7 T) = {P(5 H, 7 T|B) \over P(5 H, 7 T|A) + P(5 H, 7 T|B)}

and this is 0.5033877; the evidence from these twelve flips is very weak. The probability of having picked coin B given k heads in n tosses can be computed similarly, and it’s

{{n \choose k} (1/2)^k (1/2)^{n-k} \over {n \choose k} (1/2)^k (1/2)^{n-k} + {n \choose k} (1/3)^k (2/3)^{n-k}}.

After some cancellation, this is

f(n,k) = {1 \over 1 + (2/3)^k (4/3)^{n-k}}

and in the n = 12 case this is

k        0       1       2       3       4       5       6       7       8       9       10      11      12
f(n,k)   0.0307  0.0596  0.1125  0.2022  0.3364  0.5034  0.6697  0.8022  0.8902  0.9419  0.9700  0.9848  0.9924
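
The table can be reproduced with a couple of lines of R:

f = function(n, k) {1/(1 + (2/3)^k * (4/3)^(n-k))}
round(f(12, 0:12), 4)   # gives the row of values above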

We might ask: is there a general decision rule? Fix n; when is f(n,k) = 1/2? If k is less than that threshold, guess that coin A was used; if k is greater, guess that coin B was used. To get f(n,k) = 1/2 we must have (2/3)^k (4/3)^{n-k} = 1; rearranging, this is (4/3)^n = 2^k. Taking logs we get n \log (4/3) = k \log 2 and so k/n = (\log 4/3)/(\log 2) = 0.4150375, slightly less than 5/12 = 0.41\bar{6}. If the proportion of heads is greater than 0.4150375 then guess coin B; if it’s smaller, guess coin A.
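
And the cutoff itself, in R:

log(4/3)/log(2)       # 0.4150375
12*log(4/3)/log(2)    # 4.98, so with twelve flips, five or more heads points to coin B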

This was inspired by an exercise from Freedman, Pisani, and Purves, Statistics, fourth edition (Chapter 24, exercise B1), with the language regularized: a thumbtack is thrown in the air. It lands either point up or point down. Someone suggests that the point comes up half the time; someone else, one-third of the time. How could you decide which model is correct? The answer given in the text is to toss the thumbtack many times and see whether the percentage of point-up times is closer to one-half or one-third. This is correct in spirit, but it turns out that the midpoint, 5/12, is not quite the optimal splitting point.

Weekly links for April 8

A free iPad app from IBM recreates the “Mathematica: A World of Numbers” exhibit from the 1964 World’s Fair, which is still on display at the New York Hall of Science and the Boston Museum of Science.

The BBC on Henry Moore’s mathematical sculptures.

An archive of figures from the history of probability and statistics.

Gwen Fisher’s hyperbolic beading.

John Cook, Willie Sutton and the multivariate normal distribution. (This is from September 2011, but John Cook relinked to it recently in his twitter feed StatFact.)

Jonathan Wai asks why is it okay to suck at math? (via Andrew Sullivan and John Allen Paulos)

Frank Morgan, the best doughnut has a small hole. (For a peculiar meaning of “best” which is totally nonculinary.) But the size of the hole is irrelevant to the existence of God.

Mark Liberman at Language Log writes about Evaluative words for wines, on sentiment analysis of wine reviews.

Carl Bialik, the Wall Street Journal’s “Numbers Guy”, on wind chill, heat index, and other pseudo-temperatures.

Andrew Lipson’s Lego Klein bottle is one of his many mathematical Lego sculptures.

Chris Stucchio on how to leave academia, for people going from technical fields to technical jobs. No great new advice here, but lots of useful links.

Jorge Stolfi seeks the Hollywood constant, the smallest non-negative integer that has never been used in the title of a movie. (It’s at least 85.)

Nick Berry has a better strategy for hangman, although he won’t tell you what to do after you actually find a letter that’s in the word. (At Lifehacker; at his own blog he has similar articles for Yahtzee, Battleship, Chutes and Ladders, Risk, Candy Land, and Darts.)

Test Prep Authority’s “Math God” problems

Test Prep Authority, a test prep company, will now be sending “Math God” problems to students preparing for the SAT and ACT who want them. The idea behind this, according to their press release, is that using very hard problems for training purposes is useful if you want to be able to solve easier problems quickly.

So far they’ve put out three of them: one, two, three. They don’t strike me as being particularly difficult so much as annoying. But they’re annoying in the same way that SAT problems are annoying, only more so; from the point of view of training for the narrow slice of mathematics that is on these tests, they’re pretty reasonable.

I’m actually of two minds about this. On the one hand, I instinctively shudder when I see test prep resources, because I’m afraid that they’ll be devoted to “how to take tests” and not actual learning. (Test Prep Authority seems to be pretty good in this regard.) On the other hand, the philosophy of doing harder things than you “need” to be able to do is certainly one I can get behind. When it really matters, you don’t want to spend all your time working at the limits of your capabilities; that’s exhausting. You may never need to drive faster than, say, seventy miles an hour, but you wouldn’t want a car that only goes that fast.[1] Just because you never need to run to actually get somewhere doesn’t mean that running won’t make walking less tiring.[2] And of course they say that you never really learn algebra until you take calculus, or that you never really learn a subject until you have to teach it.

I’m rather curious about that last statement. How would you test that?

[1] This describes my first car, a 1987 Subaru DL which I drove for much of 2000-2001. Around seventy it started shaking pretty violently. But that’s not a horrible trait in a car for a new driver, who might otherwise go too fast.

[2] I should take this advice, but I don’t, because I really can’t stand running.

Today’s not-so-astonishing calendrical coincidence

Alright, this is too easy. From my hometown newspaper: Good Friday, Passover overlap.

Passover is supposed to start around the full moon, and last for a week.

Easter is supposed to fall on the Sunday after the full moon.

So this should happen pretty frequently. The calculations involved are both complicated enough that I’ll just jump straight to simulation. Here’s a quick implementation of Gauss’s computus in R:


easter = function(year){
  a = year %% 19;                  # position in the 19-year lunar cycle
  b = year %% 4;
  c = year %% 7;                   # b and c feed the weekday correction below
  k = floor(year/100);             # century
  p = floor((13+8*k)/25);
  q = floor(k/4);
  M = (15 - p + k - q) %% 30;      # century-dependent correction for the moon
  N = (4 + k - q) %% 7;            # century-dependent correction for the weekday
  d = (19*a + M) %% 30;            # days from March 21 to the paschal full moon
  e = (2*b + 4*c + 6*d + N) %% 7;  # makes 22 + d + e land on a Sunday
  day = 22 + d + e;                # Easter, as a day of March
  # the two standard exceptions to Gauss's rule
  if( d == 29 && e == 6) {day = 50}
  if( d == 28 && e == 6 && ((11*M+11) %% 30) < 19 ) {day = 49}
  day;
}

This code inputs the year and outputs the date of Easter as a day of March; for example this year, Easter is April 8, or “March 39”, and easter(2012) outputs 39.
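
If you want an actual date rather than a day-of-March count, a small wrapper (not part of the original code, just a convenience) does the conversion:

easter.date = function(year) {as.Date(sprintf("%d-03-01", year)) + (easter(year) - 1)}
easter.date(2012)   # "2012-04-08"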

Conway has a formula for the date of Passover. Actually it’s a formula for the date of Rosh Hashanah; if (the first day of) Rosh Hashanah falls on September M in a given Gregorian calendar year, then the first day of Passover in the same Gregorian year is March M + 21. In both cases the beginning is actually at sundown the previous evening. For example this year the first day of Rosh Hashanah is September 17, 2012 (actually sundown on the 16th) and Passover begins on March 17 + 21 = 38, or April 7 (actually sundown on April 6).

First one computes a tentative Rosh Hashanah date based on the full moon. (This is actually a rational number; it’s convenient for later purposes to keep the fractional part.)


roshhashanah.t = function(year) {
  g = (year %% 19) + 1;   # position of the year in the 19-year cycle
  n = floor(year/100) - floor(year/400) - 2 + 765433/492480*((12*g) %% 19) + (year %% 4)/4 - (313*year + 89081)/98496;
  n;                      # tentative Rosh Hashanah, as a (fractional) day of September
}

The rational constants are here because the Hebrew calendar incorporates a fractional approximation to the time between full moons.
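
Specifically, if I have the details right, the traditional interval between new moons is 29 days, 12 hours, and 793 “parts” (with 1080 parts to the hour), and the constant 765433/492480 is exactly that interval divided by the 19 years of the cycle:

molad = 29 + (12*1080 + 793)/(24*1080)   # length of a lunar month, in days
all.equal(molad/19, 765433/492480)       # TRUE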

But Rosh Hashanah is sometimes postponed by a day or even two because it falls on a day of the week that is inconvenient. (For example, one does not want Yom Kippur to fall on a Friday or Sunday, which works out to not having Rosh Hashanah on a Wednesday or Friday.) So we need a day-of-week function, namely Zeller’s congruence:


dayofweek = function(year, month, day) {
  if (month<3) {year = year-1; month = month+12}   # treat January and February as months 13 and 14 of the previous year
  x = day + floor((month+1)*26/10) + year + floor(year/4) + 6*floor(year/100) + floor(year/400);
  x %% 7;
}
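
A quick check of this function against a date we already know: this year’s Easter, April 8, 2012, was a Sunday.

dayofweek(2012, 4, 8)   # 1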

This returns 0 for Saturday, 1 for Sunday, …, 6 for Friday. Now we implement the postponements:


roshhashanah = function(year) {
  rh = roshhashanah.t(year);
  dow = dayofweek(year,9,floor(rh));
  # Rosh Hashanah may not fall on a Sunday, Wednesday, or Friday
  if(dow == 1 | dow == 4 | dow == 6) {rh = rh+1};
  frac = rh-floor(rh);
  g = (year %% 19) + 1;
  # the two further postponements, which depend on the fractional part
  if(dow == 2 & frac >= 23269/25920 & ((12*g) %% 19) > 11) {rh = rh+1};
  if(dow == 3 & frac >= 1367/2160 & ((12*g) %% 19) > 6) {rh = rh+2};
  rh;
}

Finally, to get the date of Passover (as a day in March):


passover = function(year) {floor(roshhashanah(year)) + 21};
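
As a check against the example above (Passover beginning this year on April 7, i.e. “March 38”):

passover(2012)   # 38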

So how often does the current coincidence happen? Let’s consider a thousand-year window centered on 2000. Generate the dates for Easter and Passover in that period:


E = sapply(1500:2499, easter)
P = sapply(1500:2499, passover)

Then table(P-E) gives a table of the date of the first day of Passover relative to the date of Easter. This year we have -1: the first day of Passover is Saturday, April 7, while Easter is Sunday, April 8. The table is as follows:

difference (P - E, days)     -8    -7    -5    -3    -1     0    23    25    27    28    30
number of years (of 1000)    12   101   234   236   228    20    27    49    48    23    22

In particular the situation we have this year occurs in 228 years of this thousand-year window, almost a quarter of the time! This is actually more often than I expected — I was thinking a bit under one year in seven, basically the one-seventh of the time when Passover starts on a Friday night, except somewhat less because sometimes the Christian and Jewish calculations pick out different full moons. But the Hebrew calendar postponement rules mean that Passover can only start on certain days of the week. You can see that even in this table — the difference of -2 never occurs, for example, because that would mean Passover starts on a Friday (i.e. Thursday night), which it can’t.

April 11, 2012: there seem to be some bugs in this code that lead its results to not agree with those from other tables. I’ll fix this at some point; for the time being please don’t rely on what’s here for anything serious.

Life lessons in mathematics classes

At the AMS graduate student blog, Luke Wolcott asks if one can take life lessons from a mathematics class.

In the past couple years I’ve taught probability, statistics, and game theory. I often find myself lamenting that, in daily life, I don’t necessarily handle uncertainty or strategic interactions with other people well, even though I certainly understand the theory! But it’s easy to lose sight of the greater context in the struggle to “cover” the material that “needs” to be covered. And students, in my experience, seem to tune out when they realize something won’t be on the test.

But I agree with Wolcott that it’s part of the teacher’s job to teach life lessons. In the probability classes that I teach I hope I am teaching the lesson of perseverance and of being willing to try different approaches to a problem – that course has the strange property that there aren’t that many big theorems, and it’s not necessarily obvious which approach to a problem will work until you try it. And in both probability and statistics I want to show people that we are bad at intuitively reasoning about uncertainty. (Don’t believe me? Read the excellent book Thinking, Fast and Slow.)

Krantz’s book on mathematical maturity

Steven Krantz is a prolific mathematician; he is also the author of various books on the mathematics profession. A mathematician’s survival guide was sometimes useful to me as a book on how to survive graduate school in math; it focuses on graduate school and early-career mathematicians. Academic mathematicians spend lots of time writing and teaching; for these he’s written A Primer of Mathematical Writing: Being a Disquisition on Having Your Ideas Recorded, Typeset, Published, Read & Appreciated and How To Teach Mathematics. A book that I haven’t read (because it doesn’t apply to me), a sort of sequel to the Survival Guide, is the similarly-titled The survival of a mathematician: from tenure to emeritus. These books are perhaps not as all-encompassing as one might expect – they are rather idiosyncratic, being books on how to have a successful career at a research institution, and to do a good job of teaching without necessarily having it take up all of one’s time. But this is to be expected – Krantz is at Washington University in St. Louis and this is the milieu he knows best. In particular I’d say that the Survival Guide is better at giving advice on graduate school than books such as Getting What You Came For which purport to advise potential graduate students in all fields.

His most recent book is A Mathematician Comes of Age, which is a short book-length exploration of the concept of mathematical maturity and how to develop it in students. Sol Lederman interviews him about the book on his podcast series Inspired by Math.