An Unbiassed View of What We Should Eat . . . From a Rat

In nature animals must choose a healthy diet based on what tastes good. This doesn’t work for modern humans — lots of people eat poor diets — but why it fails is a mystery. There are many possible reasons. Are the wrong (“unnatural”) foods available (e.g., too much sugar, too little omega-3, not enough fermented food)? Is something besides food causing trouble (e.g., too little exercise, too little attention to food)? Are bad cultural beliefs too powerful (e.g., “low-fat”, desire for thinness)? Is advertising too powerful? Is convenience too powerful? Lab animals are intermediate between animals in nature and modern humans. They are not affected by cultural beliefs, advertising, and convenience (the foods they are offered are equally convenient). Their choice of food may be better than ours.

Nutrition researchers understand the value of studying what lab animals choose to eat. In 1915, the first research paper about “dietary self-selection” was published, followed by hundreds more. The general finding is that in laboratory or research settings, animals choose a relatively healthy diet. There are two variations:

[1.] Cafeteria experiments with chemically defined [= synthesized] diets showed that some of these animals, when offered the separate, purified nutrient components of their usual diet, eat the nutrients in a balance that more or less resynthesizes the original diet and that is often superior to it. [2.] Other animals eat two or more natural foods in proportions that yield a more favorable balance of nutrients than will any one of these foods alone.

Both findings imply that housing an animal in a lab does not destroy the mechanism that tells it what to eat.

Which is why I was fascinated to recently learn what Mr. T (pictured above), the pet rat of Alexandra Harney, the author of The China Price, and her husband, liked to eat. It wasn’t obvious. “We tried so many foods with him and always thought it made a powerful statement that even a wild rat turned his nose up at potato chips,” says Alexandra. “He hated most processed food. He also hated carrots, though.” Here are his top three foods:

  1. pate
  2. salmon sashimi
  3. scrambled eggs

Pate = protein, animal fat, complex flavors (which in nature would have been supplied by microbe-rich, i.e., fermented, food). Salmon sashimi = protein, omega-3. Scrambled eggs = ??

He liked beer in moderation, but not yogurt. “Owners of domestic rats say they love yogurt,” says Alexandra, “but Mr T only liked it briefly and then hated it, even lunging to bite a friend who brought him some. [Curious.] He loved cheese, stored bread for future consumption (but almost never ate it). Loved pesto sauce and coconut.” Note the absence of fruits and vegetables. Alexandra and her husband have no nutritional theories that I am aware of. They did not shape this list to make some point.

For me the message is: Why scrambled eggs? I too like eggs and eat them regularly and cannot explain why.

MORE Alex Tabarrok’s Thanksgiving post shows the connection between libertarian ideas (economies work better when more choice is allowed) and dietary self-selection.

iTunes For Windows is Horrible

May I interrupt my usual posts to complain about something? Something minor?

It is that iTunes for Windows — from Apple, the maker of what are said to be brilliantly-designed products — is horribly designed. I have two examples.

1. Suppose I want to see what’s in the iTunes Store. I open a new window. I can’t close that window without closing iTunes! And if, after closing the whole program, I open it again, it still gives me the Store window! Maybe the Store window went away after a few weeks…I don’t even want to think about it.

2. I pressed the wrong button and started 181 downloads. There is no way to cancel them! If I stop the whole program, they will resume the next time I start it. This is software design from the 1960s.

And this is iTunes version 10.something, not version 0.3.

 

Duct Tape, the Eurozone, Status-Quo Bias, and Neglect of Innovation

In 1995, I visited my Swedish relatives. We argued about the Euro. They thought it was a good idea; I thought it had a serious weakness.

ME It ties together economies that are different.

MY AUNT It reduces the chance of war in Europe.

You could say we were both right. There have been no wars between Eurozone countries (supporting my aunt) and the Eurozone is now on the verge of breaking apart for exactly the reason I and many others pointed out (supporting me).

Last week a friend said to me that Europe was in worse shape than America. I was unconvinced. I said that I opposed Geithner’s “duct-tape solution”. It would have been better to let things fall apart and then put them back together in a safer way.

MY FRIEND Duct-tape works.

ME What Geithner did helped those who benefit from the status quo and hurt those who benefit from change. Just like duct tape.

This struck me as utterly banal until I read a one-sided editorial in The Economist:

The consequences of the euro’s destruction are so catastrophic that no sensible policymaker could stand by and let it happen. . . . the threat of a disaster . . . can anything be done to avert disaster?

and similar remarks in The New Yorker (James Surowiecki):

The financial crisis in Europe . . . has now entered a potentially disastrous phase . . . with dire consequences not just for Europe but also for the rest of us. . . . This is that rarest of problems—one that you really can solve just by throwing money at it [= duct tape]

Wait a sec. What if the Eurozone is a bad idea? Like I (and many others) said in 1995? Why perpetuate a bad idea? Why drive further in the wrong direction? Sure, the dissolution will bring temporary trouble (“disaster”, “dire consequences”), but that will be a small price to pay for getting rid of a bad idea. Of course the Euro had/has pluses and minuses. Anyone who claimed to know that the pluses outweighed the minuses (or vice versa) was a fool or an expert. Now we know more. Given that what the nay-sayers said has come to pass, it is reasonable to think that they (or we) were right: The minuses outweigh the pluses.

You have seen the phrase Japan’s lost decade a thousand times. You have never seen the phrase Greece’s lost decade. But Greeks lost an enormous amount from being able to borrow money for stupid conventional projects at too low a rate. Had loans been less available, they would have been more original (the less debt involved, the easier it is to take risks) and started at a smaller scale. Which I believe would have been a better use of their time and led to more innovation. Both The Economist’s editorial writer and Surowiecki have a status-quo “duct-tape” bias without realizing it.

What’s important here is not what two writers, however influential their magazines, think or fail to think. It is that they are so sure of themselves. They fail to take seriously an alternative (breakup of the Eurozone would in the long run be a good thing) that has at least as much to recommend it as what they are sure of (the breakup would be a “disaster”). I believe they are so sure of themselves because they have absorbed (and now imitate) the hemineglect of modern economics. The whole field, they haven’t noticed, has an enormous status-quo bias in its failure to study innovation. Innovation — how new goods and services are invented and prosper — should be half the field. Let me repeat: A few years ago I picked up an 800-page introductory economics textbook. It had one page (one worthless page) on innovation. In this staggering neglect, it reflected the entire field. The hemineglect of economics professors is just as bad as the hemineglect of epidemiologists (who ignore immune function, study of what makes us better or worse at fighting off microbes) and statisticians (who pay almost no attention to idea generation).

MORE Even Joe Nocera, whom I like, has trouble grasping that the Euro might be a bad idea. “The only thing that should matter is what works,” he writes. Not managing to see that the Euro isn’t working.

Vitamin D: More Reason to Take at Sunrise

I blogged earlier about what I called a “stunning discovery”: Primal Girl found her sleep got much better when she started taking Vitamin D first thing in the morning (= soon after she got up) rather than mid-afternoon. This suggested that Vitamin D acts on your circadian system similar to a blast of sunlight. (More evidence and discussion here.) In his blog, Joseph Buchignani reports another experience that supports the idea that you should take Vitamin D first thing in the morning:

I picked up a bottle of Vit-D and Calcium. Dosage of Vit-D per pill was 1.6ud. Per the instructions, I took 1 at morning and 1 at night. I began this regimin on the night of the 24th of November. It’s now the night of the 25th of November, and my circadian rhythm is completely fucked. . . . I’m fully awake now (12:30 AM), and I probably took the last dose of Vit-D around 7-8 PM. . . . I woke up with dark eye rings on the morning of the 25th. My energy level did not rise as it should have, but sort of meandered in the middle, before finally tailing off. Stress levels and depression were both elevated. I got little productive done.

Yesterday I started taking Vitamin D first thing in the morning. I took 2000 IU of Vitamin D3 at 8 am. In the afternoon I felt more energetic than usual. The next morning (this morning) I woke up feeling more rested than usual. This also supports Primal Girl’s experience.

Let me repeat: first thing in morning. If you wake up before sunrise, take at sunrise (say, 7 am). Sunlight has a considerably different effect on your circadian system at 7 am than 10 am. (Look up circadian phase-response curve and especially the work of Patricia DeCoursey if you want to understand why three hours makes a big difference.) I have two bottles of Vitamin D. Neither mentions time of day. Both say take with meals.

Assorted Links

  • Salem Comes to the National Institutes of Health. Dr. Herbert Needleman is harassed by the lead industry, with the help of two psychology professors.
  • Climate scientists “perpetuating rubbish”.
  • A humorous article in the BMJ that describes evidence-based medicine (EBM) as a religion. “Despite repeated denials by the high priests of EBM that they have founded a new religion, our report provides irrefutable proof that EBM is, indeed, a full-blown religious movement.” The article points out one unquestionable benefit of EBM — that some believers “demand that [the drug] industry divulge all of its secret evidence, instead of publishing only the evidence that favours its products.” Of course, you need not believe in EBM to want that. One of the responses to the article makes two of the criticisms of EBM I make: 1. Where is the evidence that EBM helps? 2. EBM stifles innovation.
  • What really happened to Dominique Strauss-Kahn? Great journalism by Edward Jay Epstein. This piece, like much of Epstein’s work, sheds a very harsh light on American mainstream media. They were made fools of by enemies of Strauss-Kahn. Epstein is a freelance journalist. He uncovered something enormously important that all major media outlets — NY Times, Washington Post, The New Yorker, ABC, NBC, CBS (which includes 60 Minutes), the AP, not to mention French news organizations, all with great resources — missed.

Climategate 2.0: How To Tell When an Expert Exaggerates

The newly-released climate scientist emails (called Climategate 2.0) from University of East Anglia (Phil Jones) and elsewhere (Michael Mann and others) show that top climate scientists agree with me. Like me (see my posts on global warming), they think the evidence that humans have caused dangerous global warming is weaker than claimed. Unfortunately for the rest of us, they kept their doubts to themselves: “I just refused to give an exclusive interview to SPIEGEL because I will not cause damage for climate science.”

This is a big reason I have found self-experimentation useful. It showed me that experts exaggerate, that they overstate their certainty. At first I was shocked. My first useful self-experimental results were about acne. I found that one of the two drugs my dermatologist had prescribed didn’t work. He hadn’t said This might not work. He didn’t try to find out if it worked. He appeared surprised (and said “why did you do that?”) when I told him it didn’t work. Another useful self-experimental result was that breakfast caused me to wake up too early. Breakfast is widely praised by dieticians (“the most important meal of the day”). I have never heard a dietician say It could hurt your sleep or even a modest There’s a lot we don’t know. My discoveries about morning faces and mood are utterly different from what psychiatrists and psychotherapists say about depression.

As anyone paying attention has noticed, it isn’t just climate scientists, doctors, dieticians, psychiatrists, and psychotherapists. How can you tell when an expert is exaggerating? His lips move. There are two types of journalism: 1. Trusts experts. 2. Doesn’t trust experts. I suggest using colored headlines to make them easy to distinguish: red = trusts experts, green = doesn’t trust experts.

Butter and Arithmetic: How Much Butter?

I measure my arithmetic speed (how fast I do simple arithmetic problems, such as 3 + 4) daily. I assume it reflects overall brain function. I assume something that improves brain function will make me faster at arithmetic.

Two years ago I discovered that butter — more precisely, substitution of butter for pork fat — made me faster. This raised the question: how much is best? For a long time I ate 60 g of butter (= 4 tablespoons = half a stick) per day. Was that optimal? I couldn’t easily eat more but I could easily eat less.

To find out, I did an experiment. At first I continued my usual intake (60 g /day). Then I ate 30 g/day for several days. Finally I returned to 60 g/day. Here are the main results:

The graph shows that when I switched to 30 g/day, I became slower. When I resumed 60 g/day, I became faster. Comparing the 30 g/day results with the combination of earlier and later 60 g/day results, t = 6, p = 0.000001.
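For readers who want to run this kind of comparison themselves, here is a minimal sketch of the two-sample (Welch) t statistic in Python. The numbers are invented per-day reaction times, not my data; I did the real analysis in R.

```python
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    se = (var_a / len(a) + var_b / len(b)) ** 0.5
    return (mean_a - mean_b) / se

# Invented per-day mean times (ms/problem) -- not my actual data.
thirty_g = [590, 585, 592, 588, 595]      # 30 g/day days
sixty_g = [560, 555, 562, 558, 565, 559]  # 60 g/day days (before + after)
t = welch_t(thirty_g, sixty_g)            # positive t: slower on 30 g/day
```

A large positive t here would mean the 30 g/day days were reliably slower, which is the pattern in my data.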

The amount of butter also affected my error rate. Less butter, fewer errors:

Comparing the 30 g/day results with the combination of earlier and later 60 g/day results, t = 3, p = 0.006.

The change in error rates raised the possibility that the speed changes were due to movement along a speed-accuracy tradeoff function (rather than to genuine improvement, which would correspond to a shift in the function). To assess this idea, I plotted speed versus accuracy (each point a different day).

If differences between conditions were due to differences in speed-accuracy tradeoff, then the points for different days should lie along a single downward-sloping line. They don’t. Within conditions, there was no sign of a speed-accuracy tradeoff (the fitted lines do not slope downward). If this is confusing, look at the points with accuracy values in the middle. Even when equated for accuracy, there are differences between the 30 g/day phase and the 60 g/day phases.
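One way to make that check concrete: fit a least-squares line of speed on accuracy within each condition and look at the sign of the slope. A true speed-accuracy tradeoff should produce a negative slope. A sketch with invented points (not my data):

```python
def ols_slope(xs, ys):
    """Least-squares slope of y on x."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Invented within-condition points: accuracy (%) and speed (problems/min).
accuracy = [94, 95, 96, 97, 98]
speed = [40, 41, 40, 42, 41]
# A non-negative slope argues against a within-condition tradeoff.
slope = ols_slope(accuracy, speed)
```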

What did I learn?

1. How much butter is best. Before these results, I had no reason to think 60 g/day was better than 30 g/day. Now I do.

2. Speed of change. Environmental changes may take months or years to have their full effect. Something that makes your bones stronger may take months or years to be fully effective. Here, however, changes in butter intake seemed to have their full effect within a day. I noticed the same speed of change with pork fat and sleep: How much pork fat I ate during a single day affected my sleep that night (and only that night). With omega-3, the changes were somewhat slower. A day without it made little difference. You can go weeks without Vitamin C before you get scurvy. Because of the speed of the butter change, in the future I can do better balanced experiments that change conditions more often.

3. Better experimental design. An experiment that compares 60 g/day and 0 g/day probably varies many things besides butter consumption (e.g., preparing the butter to eat it). An experiment that compares 60 g/day and 30 g/day is less confounded. When I ate less butter, I ate more of other food. Compared to a 60 g/0 g experiment, this experiment (60 g/30 g) has less variation in other food. Another sort of experiment, neither better nor worse, would vary type of fat rather than amount. For example, replace 30 g of butter with 30 g of olive oil. Because the effect of eliminating 30 g/day of butter was clear, replacement experiments become more interesting — 30 g/day olive oil is more plausible as a sustainable and healthy amount than 60 g/day.

4. Generality. This experiment used cheaper butter and took place in a different context than the original discovery. I discovered the effect of butter using Straus Family Creamery butter. “One of the top premium butters in America,” says its website, quoting Food & Wine magazine. This experiment used a cheaper, less-lauded butter (Land O’Lakes). Likewise, I discovered the effect in Berkeley. I did this experiment in Beijing. My Beijing life differs in a thousand ways from my Berkeley life.

The results suggest the value of self-experimentation, of course. Self-experimentation made this study much easier. But other things also mattered.

First, reaction-time methodology. In the 1960s my friend and co-author Saul Sternberg, a professor of psychology at the University of Pennsylvania, introduced better-designed reaction-time experiments to study cognition. They turned out to be far more sensitive than the usual methods, which involved measuring percent correct. (Saul’s methodological advice about these experiments.)

Second, personal science (science done to help yourself). I benefited from the results. Normal science is part of a job. The self-experimentation described in books was mostly (or entirely) done as part of a job. Before I collected this data, I put considerable work into these measurements. I discovered the effect of butter in an unusual way (measuring myself day after day), I tried a variety of tasks (I started by measuring balance), I refined the data analysis, and so on. Because I benefited personally, this was easy.

Third, technological advances. Twenty years ago this experiment would have been more difficult. I collected this data outside of a lab using cheap equipment (a Thinkpad laptop running Windows XP). I collected and analyzed the data with R (free). A smart high school student could do what I did.
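For concreteness, here is a rough Python sketch of the kind of timed arithmetic task I describe. The function names and details are my illustration, not the actual program I used:

```python
import random
import time

def make_problem():
    """Generate a simple sum, e.g. '3 + 4', and its answer."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"{a} + {b}", a + b

def run_block(n_trials, answer_fn):
    """Run n_trials; answer_fn maps a prompt string to a numeric answer
    (in real use it would wait for a keypress). Returns a list of
    (correct, seconds) pairs -- the raw material for speed and errors."""
    results = []
    for _ in range(n_trials):
        prompt, truth = make_problem()
        start = time.perf_counter()
        reply = answer_fn(prompt)
        elapsed = time.perf_counter() - start
        results.append((reply == truth, elapsed))
    return results
```

From such a log, daily speed is just the mean time on correct trials, and the error rate is the fraction of incorrect trials.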

There is more to learn. The outlier in the speed data (one day was unusually fast) means there can be considerable improvement for a reason I don’t understand.

The Genomera Buttermind Experiment.

Assorted Links

  • Bruce Handy (who wrote for Spy) on Newsweek. “The second biggest problem is the way each issue begins with a miles-long slog of columns by A-list writers eager to champion the incontrovertible and rehash the already thoroughly hashed. . . . Niall Ferguson has discovered that, thanks to technology, ‘the human race is interconnected as never before.’”
  • The Willat Effect in Venice, CA: side-by-side coffee comparisons at Intelligentsia.
  • Why is the headline 28 Unexpected TV Ratings Facts more attractive than Unexpected TV Ratings Facts?
  • Engaging interview with Julia Schopick, creator of Honest Medicine. “After they [his surgeons] were done with him . . . “

“Allergic to the Practical”: Law Schools Imitating Academia

Thorstein Veblen might have gloated that this 2011 article — about the uselessness of law schools and legal scholarship — so thoroughly supports what he wrote in a book published in 1899 (see the last chapter of The Theory of the Leisure Class). Why are law schools useless? Because law professors feel compelled to imitate the rest of academia, which glorifies uselessness:

“Law school has a kind of intellectual inferiority complex, and it’s built into the idea of law school itself,” says W. Bradley Wendel of the Cornell University Law School, a professor who has written about landing a law school teaching job. “People who teach at law school are part of a profession and part of a university. So we’re always worried that other parts of the academy are going to look down on us and say: ‘You’re just a trade school, like those schools that advertise on late-night TV. You don’t write dissertations. You don’t write articles that nobody reads.’ And the response of law school professors is to say: ‘That’s not true. We do all of that. We’re scholars [i.e., useless], just like you.’ ”

Yeah. As I’ve said, there’s a reason for the term ivory tower. And seemingly useless research has value. Glorifying useless research has the useful result of diversifying research, causing a wider range of research directions to be explored. Many of my highly-useful self-experimental findings started or received a big boost from apparently useless research.

The pendulum can swing too far, however, and it has. A large fraction of health researchers, especially medical school researchers, have spent their entire careers refusing to admit, at least in public, the uselessness of what they do. Biology professors have some justification for useless research; medical school professors have none, especially given all the public money they get. Like law professors, they prefer prestige and conformity. The rest of us pay an enormous price for their self-satisfaction (“I’m scientific!” they tell themselves) and peace of mind. The price we pay is stagnation in the understanding of health. Like clockwork, every year the Nobel Prize in Medicine is given to research that has done nothing or very close to nothing to improve our health. And every year, like clockwork, science journalists (all of them!) fail to notice this. If someone can write the article I just quoted about law schools, why can’t even one science journalist write the same thing about medical schools — where it matters far more? What’s their excuse?