Nate Silver and the Two Cultures

You’ve probably read that Nate Silver has left the New York Times for ESPN. In this article, the Times’ public editor, Margaret Sullivan, says:

A number of traditional and well-respected Times journalists disliked his work. The first time I wrote about him I suggested that print readers should have the same access to his writing that online readers were getting. I was surprised to quickly hear by e-mail from three high-profile Times political journalists, criticizing him and his work. They were also tough on me for seeming to endorse what he wrote, since I was suggesting that it get more visibility.

I tend to view the world as divided by C.P. Snow’s Two Cultures, so this story plays right into my biases. Silver is in one culture and the traditional journalists are in the other. It’s no wonder they disliked him and his methods.

Strictly speaking, I see the Two Cultures a little differently from Snow. He divides them into literary intellectuals and scientists; I divide them into those who believe in the power of argument and those who believe in the power of facts. Of course, we all have some of both cultures in us, but most of us lean strongly in one direction or the other. The first group thinks that matters get settled through discourse—may the best debater win. The second group thinks you “win” by figuring out what the objective truth is and aligning yourself with it, because facts are immune to rhetoric.

Many things are not amenable to an objective, factual investigation, and an irony of the Two Cultures viewpoint is that it is one of them. Snow, who favored the scientific approach, was attacked by those who preferred argument. Since the debate was on the arguers’ home field, and was typically judged by other arguers, Snow was often declared the loser. Those of us on the fact side don’t accept the results.

Indeed, we on the fact side tend to refer to the debating tools of the other side as “bullshit,” a term akin to the phrase Silver applied to traditional punditry: “fundamentally useless.” But that’s an overstatement. Just because something can’t be reduced to numbers doesn’t mean it can’t be subjected to rigorous thought. But it certainly seems that those inclined toward bullshit rather than rigor tend to end up among the arguers. It’s easier to hide over there.

As I said in this post back in November, too many people oversimplified Silver’s election predictions and declared the results a “victory for science.” Actually, Silver never said Obama would get 332 electoral votes. He said it was the most likely outcome, but he assigned it only a 20% probability. Still, this misinterpretation wasn’t Silver’s fault—he used the information presented to him and delivered an answer based on that information and generally accepted statistical methods. He’s definitely on the facts side of the cultural divide.
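The distinction between the single most likely outcome and a confident prediction is easy to see with a toy calculation. The sketch below (Python, with made-up toss-up states and win probabilities; this is an illustration, not Silver's actual model) enumerates every combination of six hypothetical contested states and shows that the most probable electoral-vote total can still carry only a modest probability.

```python
from itertools import product

# Hypothetical illustration: a safe base of 237 electoral votes plus six
# toss-up-ish states, each with an assumed independent win probability
# for the favored candidate. All numbers here are invented for the example.
states = {  # state: (electoral votes, assumed win probability)
    "FL": (29, 0.55), "OH": (18, 0.75), "VA": (13, 0.75),
    "CO": (9, 0.70), "IA": (6, 0.80), "NV": (6, 0.80),
}
base_ev = 237

# Exhaustively enumerate all 2**6 win/lose combinations.
dist = {}  # electoral-vote total -> probability
for outcome in product([0, 1], repeat=len(states)):
    ev, p = base_ev, 1.0
    for won, (votes, prob) in zip(outcome, states.values()):
        ev += votes * won
        p *= prob if won else (1 - prob)
    dist[ev] = dist.get(ev, 0.0) + p

mode_ev, mode_p = max(dist.items(), key=lambda kv: kv[1])
print(mode_ev, round(mode_p, 3))
```

With these invented numbers, the single most likely total (sweeping every toss-up, 318 votes) has a probability of only about 14%: the most likely outcome, and yet far from a sure thing, which is exactly the distinction that got lost in the "victory for science" framing.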

It’s not surprising that other political journalists at the Times took a dim view of his methods. First, they didn’t understand them. Second, those methods were at odds with their own method of prediction, which consisted of listening to the arguments of various political insiders and then making an assessment based on those arguments as leavened by their own experience, judgement, and savvy. In other words: bullshit. If Silver had come up with predictions by this method, no matter how different they were from the predictions of others, he would have been welcomed into the fraternity.

Political punditry fascinates me because it is so much at odds with my business. In engineering, judgement is considered vitally important because one often must make decisions with incomplete information. But there is, ultimately, an objective reality: either the machine works or it doesn’t. If your designs consistently produce machines that don’t work, you won’t be designing very long. Political punditry also has an objective reality: in an election, one candidate wins and the others lose. You would think that making predictions that are no better than a coin flip would remove a pundit from his position at the Times or CNN or wherever, but it doesn’t seem to work that way. Have you ever heard of a political columnist getting fired because his predictions were wrong?

I wish Nate Silver well at ESPN. Sports journalism is certainly an area that’s used to dealing with numbers, even though most sportswriters and sportscasters are no more numerate than political pundits. Come to think of it, I’ve never heard of a sportswriter getting fired for making bad predictions, either.

The Seven-Year Postdoc

I’m starting to think that maybe I need to add “Work-life Balance” to the tagline of this blog, given all the recent posting about such things (but then, one of the benefits of having done this blogging thing for eleven years is that I know this is just a phase, and I’ll drift on to the next obsession soon). Anyway, the genre of work-life blogging generally just picked up a new must-read post from Radhika Nagpal at Scientific American: The-Awesomest-7-Year-Postdoc or: How I Learned to Stop Worrying and Love the Tenure-track-faculty-life:

I’ve enjoyed my seven years as junior faculty tremendously, quietly playing the game the only way I knew how to. But recently I’ve seen several of my very talented friends become miserable in this job, and many more talented friends opt out. I feel that one of the culprits is our reluctance to openly acknowledge how we find balance. Or openly confront how we create a system that admires and rewards extreme imbalance. I’ve decided that I do not want to participate in encouraging such a world. In fact, I have to openly oppose it.

So with some humor to balance my fear, here goes my confession:

Seven things I did during my first seven years at Harvard. Or, how I loved being a tenure-track faculty member, by deliberately trying not to be one.

  • I decided that this is a 7-year postdoc.
  • I stopped taking advice.
  • I created a “feelgood” email folder.
  • I work fixed hours and in fixed amounts.
  • I try to be the best “whole” person I can.
  • I found real friends.
  • I have fun “now”.

This is really excellent, refreshing advice from somebody who got tenure at the rat-raciest of institutions. This ought to be required reading for everybody on the tenure track. Go thou and Read the Whole Thing.

I wish there had been something this clear-headed around back before I got tenure. I worked a lot of this stuff out on my own, in a less coherent fashion. I particularly agree with the second item– around the time I was advised through unofficial channels that I needed to “be less visible” (that is, not go to lectures and other events on campus because people might think I wasn’t working as hard as I could be), I made more or less the same decision to stop trying to follow official advice. I didn’t pursue this job because I felt a burning need to drive myself insane trying to follow contradictory checklists– I went to grad school and sought a professorship because I liked the atmosphere of a small liberal arts college campus, and wanted to work there. I decided that if I couldn’t get tenure while being a part of the sort of community that I was seeking when I set out on this path, then it wasn’t worth getting tenure. And I felt a lot better for it.

And while there are significant differences between our institutions, the “real friends” item also resonates with me– one of the essential factors that helped me maintain sanity during my tenure-track years was lunchtime pick-up basketball. On the one hand, that was time away from the classroom and lab, but after a couple of stretches where I stopped playing to spend more time on work, I realized that I was actually less productive when I didn’t play, because I was grumpy and irritable and out of shape without the stress relief and exercise. (It also got me a very nice letter in my tenure file from one of the student affairs folks who played with us; I doubt it carried much weight, but it was definitely something for the feel-good email folder (not that I had one of those, but I might create one…)). They weren’t as professionally useful in terms of grant-reading and that kind of thing, but on a personal level, the friends I made through basketball helped keep me sane. And it’s even more important to me these days.

A couple of these items I also discovered relatively late– we waited to have SteelyKid and The Pip until after I got tenure (one of the harder sacrifices for the job), so I didn’t need to be quite as strict about time management until recently. I’m taking a somewhat similar approach now, though– I basically write the weekends off completely, and do my best to accept that nothing work related will get done between 5 and 10 pm. I do try to get some blog/book stuff done in night and weekend hours, but I’ve also blocked out whole days during the week when I refuse to deal with administrative stuff at work. And, of course, the family balance stuff wasn’t necessary until very recently, but Kate and I have done some negotiation of who’s responsible for what, when, in a manner that’s sort of similar to what she describes.

I’ll also say, since I have more years since tenure than she does, that very little has changed since tenure on these points, other than having more drains on my time. Ironically, I’m actually “less visible” on campus now, because of the kids– I bring them to some home sporting events and concerts in the evening after day care, but they’re not generally up for lectures and dinner discussions, so I’ve mostly stopped going to those except under exceptional circumstances. But I’ve mostly stuck to my goal of doing things on and around campus only when I find them personally worthwhile, not because they conform with somebody else’s idea of what I ought to be doing. I suspect that there are some costs to this– in particular, I’m not sure how writing popular-audience books will be considered when I come up for full professor, and that will likely delay my application for promotion a bit. But again, I went into this business because there were certain kinds of things I wanted to do, and I don’t see the point in avoiding doing those kinds of things, or taking on unpleasant stuff that I don’t want to or need to do, simply because it would better fit somebody else’s idea of what I ought to be doing.

(This is not a totally absolutist position, by the way, as I’m more than willing to consider ways to shift the things I want to be doing anyway into channels that are more professionally valued, when/if I can find some. I’m not enough of a Zen master to avoid being irritated when the stuff I pour blood and sweat into gets looked at askance as not “serious” enough.)

Anyway, again, I wholeheartedly endorse Nagpal’s post and the advice therein. Including the anti-advice bits, for what it’s worth– if you read that and recoil in horror, feel free to disregard it and go in another direction. The important thing is to do what works for you, not throw away whatever drew you to the job in the first place in order to follow somebody else’s advice.

Eight Toxic Foods: A Little Chemical Education

Many people who read this blog are chemists. Those who aren't often come from other branches of the sciences, and if they don't, it's safe to say that they're at least interested in science (or they probably don't hang around very long!) It's difficult, if you live and work in this sort of environment, to keep in mind what people are willing to believe about chemistry.

But that's what we have the internet for. Many science-oriented bloggers have taken on what's been called "chemophobia", and they've done some great work tearing into some really uninformed stuff out there. But nonsense does not obey any conservation law. It keeps on coming. It's always been in long supply, and it looks like it always will be.

That doesn't mean that we just have to sit back and let it wash over us, though. I've been sent this link in the last few days, a popular item on BuzzFeed with the BuzzFeedy headline of "Eight Foods That We Eat in The US That Are Banned in Other Countries". When I saw that title, I found it unpromising. In a world that eats everything that can't get away fast enough, what possible foods could we have all to ourselves here in the States? A quick glance was enough: we're not talking about foods here - we're talking about (brace yourselves) chemicals.

This piece really is an education. Not about food, or about chemistry - on the contrary, reading it for those purposes will make you noticeably less intelligent than you were before, and consider that a fair warning. The educational part is in the "What a fool believes" category. Make no mistake: on the evidence of this article, its author is indeed a fool, and has apparently never yet met a claim about chemicals or nutrition that was too idiotic to swallow. If BuzzFeed's statistics are to be believed (good question, there), this crap has already accumulated a million views. Someone who knows some chemistry needs to make a start at pointing out the serial stupidities in it, and this time, I'm going to answer the call. So here goes, in order.

Number One: Artificial Dyes. Here's what the article has to say about 'em:

Artificial dyes are made from chemicals derived from PETROLEUM, which is also used to make gasoline, diesel fuel, asphalt, and TAR! Artificial dyes have been linked to brain cancer, nerve-cell deterioration, and hyperactivity, just to name a few.

Emphasis is in the original, of course. How could it not lapse into all-caps? In the pre-internet days, this sort of thing was written in green ink all around the margins of crumpled shutoff notices from the power company, but these days we have to make do with HTML. Let's take this one a sentence at a time.

It is true, in fact, that many artificial dyes are made from chemicals derived from petroleum. That, folks, is because everything (edible or not) is made out of chemicals, and an awful lot of man-made chemicals are derived from petroleum. It's one of the major chemical feedstocks of the world. So why stop at artificial dyes? The ink on the flyer from the natural-foods co-op is made from chemicals derived from petroleum. The wax coating the paper wrapped around that really good croissant at that little bakery you know about is derived from petroleum.

Now, it's true that more things you don't eat can be traced back to petroleum feedstocks than can things you do eat. That's because it's almost always cheaper to grow stuff than to synthesize it. Synthesized compounds, when they're used in food, are often things that are effective in small amounts, because they're so expensive. And so it is with artificial dyes - well, outside of red velvet cake, I guess. People see the bright colors in cake icing and sugary cereals and figure that the stuff must be glopped on like paint, but paint doesn't have very much dye or pigment in it, either (watch them mix it up down at the hardware store sometime).

And as for artificial colors causing "brain cancer, nerve-cell deterioration, and hyperactivity", well, these assertions range from "unproven" all the way down to "bullshit". Hyperactivity sensitivities to food dyes are an active area of research, but after decades of work, the situation is still unclear. And brain cancer? This seems to go back to studies in the 1980s with Blue #2, where rats were fed the dye over a long period in much larger concentrations (up to 2% of their total food intake) than even the most dedicated junk-food eater could encounter. Gliomas were seen in the male rats, but with no dose-response, and at levels consistent with historical controls in the particular rat strain. No one has ever been able to find any real-world connection. Note that glioma rates increased in the 1970s and 1980s as diagnostic imaging improved, but have fallen steadily since then. The age-adjusted incidence rates of almost all forms of cancer are falling, by the way, not that you'd know that from most of the coverage on the subject.

Number Two: Olestra

This, of course, is Procter & Gamble's attempted non-caloric fat substitute. I'm not going to spend much time on this, because little or nothing is actually made with it any more. Olestra was a major flop for P&G; the only things (as far as I can tell) that still contain it are some fat-free potato chips. It does indeed interfere with the absorption of fat-soluble vitamins, but potato chips are not a very good source of vitamins to start with. And vitamin absorption can be messed with by all kinds of things, including other vitamins (folic acid supplements can interfere with B12 absorption, just to pick one). But I can agree with the plan of not eating the stuff: I think that if you're going to eat potato chips, eat a reasonable amount of the real ones.

Number Three: Brominated Vegetable Oil. Here's the article's take on it:

Bromine is a chemical used to stop CARPETS FROM CATCHING ON FIRE, so you can see why drinking it may not be the best idea. BVO is linked to major organ system damage, birth defects, growth problems, schizophrenia, and hearing loss.

Again with the caps. Now, if the author had known any chemistry, this would have looked a lot more impressive. Bromine isn't just used to keep carpets from catching on fire - bromine is a hideously toxic substance that will scar you with permanent chemical burns and whose vapors will destroy your lungs. Drinking bromine is not just a bad idea; drinking bromine is guaranteed agonizing death. There, see what a little knowledge will do for you?

But you know something? You can say the same thing for chlorine. After all, it's right next to bromine in the same column of the periodic table. And its use in World War I as a battlefield gas should be testimony enough. (They tried bromine, too, never fear). But chlorine is also the major part, by weight, of table salt. So which is it? Toxic death gas or universal table seasoning?

Knowledge again. It's both. Elemental chlorine (and elemental bromine) are very different things from their ions (chloride and bromide), and both of those are very different things again when either element is bonded to a carbon atom. That's chemistry for you in a nutshell: knowing these differences and understanding why they happen and how to use them.

Now that we've detoured around that mess, on to brominated vegetable oil. It's found in citrus-flavored sodas and sports drinks, at about 8 parts per million. The BuzzFeed article claims that it's linked to "major organ system damage, birth defects, growth problems, schizophrenia, and hearing loss", and sends readers to this WebMD article. But if you go there, you'll find that the only medical problems known from BVO come from two cases of people who had been consuming, over a long period, 4 to 8 liters of BVO-containing soda per day, and did indeed have reactions to all the excess bromine-containing compounds in their system. At 8 ppm, it's not easy to get to that point, but a determined lunatic will overcome such obstacles. Overall, drinking several liters of Mountain Dew per day is probably a bad idea, and not just because of the BVO content.

Number Four: Potassium Bromate. The article helpfully tells us this is "Derived from the same harmful chemical as brominated vegetable oil". But here we are again: bromate is different from bromide, which is different again from bromine, and so on. If we're going to play the "made from the same atoms" game, well, strychnine and heroin are derived from the same harmful chemicals as the essential amino acids and B vitamins. Those harmful chemicals, in case you're wondering, are carbon, hydrogen, oxygen, and nitrogen. And to get into the BuzzFeed spirit of the thing, maybe I should mention that carbon is found in every single poisonous plant on earth, hydrogen is the harmful chemical that blew up the Hindenburg, oxygen is responsible for every death by fire around the world, and nitrogen will asphyxiate you if you try to breathe it (and is a key component of all military explosives). There, that wasn't hard - as Samuel Johnson said, a man might write such stuff forever, if only he would give over his mind to it.

Now, back to potassium bromate. The article says, "Only problem is, it’s linked to kidney damage, cancer, and nervous system damage". And you'll probably fall over when I say this, but that statement is largely correct. Sort of. But let's look at "linked to", because that's an important phrase here.

Potassium bromate was found (in a two-year rat study) to have a variety of bad effects. This occurred at the two highest doses, and the lowest observed adverse effect level (LOAEL) was 6.1 mg of bromate per kilo body weight per day. It's worth noting that a study in male mice took them up to nearly ten times that amount, though, with little or no effect, which gives you some idea of how hard it is to be a toxicologist. Whether humans are more like mice or more like rats in this situation is unknown.

I'm not going to do the whole allometric scaling thing here, because no matter how you do it, the numbers come out crazy. Bromate is used in some (but not all) bread flour at 15 to 30 parts per million, and if the bread is actually baked properly, there's none left in the finished product. But for illustration, let's have someone eating uncooked bread dough at the highest level, just to get the full bromate experience. A 75-kilo human (and many of us are more than that) would have to take in 457 mg of bromate per day to get to the first adverse level seen in rats, which would be... 15 kilos (about 33 pounds) of bread dough per day, a level I can safely say is unlikely to be reached. Hell, eating 33 pounds of anything isn't going to work out, much as my fourteen-year-old son tries to prove me wrong. You'd need to keep that up for decades, too, since that two-year study represents a significant amount of a rat's lifespan.
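The arithmetic in that paragraph is easy to check. A minimal sketch, using only the figures already quoted (the 6.1 mg/kg/day rat LOAEL, a 75 kg person, and dough treated as if it carried the full 30 ppm top of the usage range, which overstates the real exposure):

```python
# Reproduce the bread-dough arithmetic from the post.
loael_mg_per_kg = 6.1   # rat lowest-observed-adverse-effect level, mg/kg/day
body_kg = 75            # assumed body weight
dough_ppm = 30          # mg bromate per kg of dough, top of the range (generous)

daily_mg = loael_mg_per_kg * body_kg   # bromate needed per day
dough_kg = daily_mg / dough_ppm        # dough needed to supply it
dough_lb = dough_kg * 2.2046

print(round(daily_mg, 1), round(dough_kg, 2), round(dough_lb, 1))
```

That comes out to about 457 mg of bromate per day, or roughly 15 kilos (33-plus pounds) of raw dough daily, matching the figures above.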

Number Five: Azodicarbonamide. This is another bread flour additive. According to the article, "Used to bleach both flour and FOAMED PLASTIC (yoga mats and the soles of sneakers), azodicarbonamide has been known to induce asthma".

Let's clear this one up quickly: azodicarbonamide is indeed used in bread dough, and allowed up to 45 parts per million. It is not stable to heat, though, and it falls apart quickly to another compound, biurea, on baking. It is not used to "bleach foamed plastic", though. Actually, in higher concentrations, it's used to foam foamed plastics. I realize that this doesn't sound much better, but the conditions inside hot plastic, you will be glad to hear, are quite different from those inside warm bread dough. In that environment, azodicarbonamide doesn't react to make biurea - it turns into several gaseous products, which are what blow up the bubbles of the foam. This is not its purpose in bread dough - that's carbon dioxide from the yeast (or baking powder) that's doing the inflating there, and 45 parts per million would not inflate much of anything.

How about the asthma, though? If you look at the toxicology of azodicarbonamide, you find that "Azodicarbonamide is of low acute toxicity, but repeated or prolonged contact may cause asthma and skin sensitization." That, one should note, is for the pure chemical, not 45 parts per million in uncooked flour (much less zero parts per million in the final product). If you're handling drums of the stuff at the plastics plant, you should be wearing protective gear. If you're eating a roll, no.

Number Six: BHA and BHT. We're on the home stretch now, and this one is a two-fer. BHA and BHT are butylated hydroxyanisole and butylated hydroxytoluene, and according to the article, they are "known to cause cancer in rats. And we’re next!"

Well, of course we are! Whatever you say! But the cancer is taking its time. These compounds have been added to cereals, etc., for decades now, while the incidence rates of cancer have been going down. And what BuzzFeed doesn't mention is that while some studies have shown an increase in cancer in rodent models with these compounds, others have shown a measurable decrease. Both of these compounds are efficient free radical scavengers, and have actually been used in animal studies that attempt to unravel the effects of free radicals on aging and metabolism. Animal studies notwithstanding, attempts to correlate human exposure to these compounds with any types of cancer have always come up negative. Contrary to what the BuzzFeed article says, by the way, BHT is indeed approved by the EU.

Weirdly, you can buy BHT in some health food stores, where anti-aging and anti-viral claims are made for it. How does a health food store sell butylated hydroxytoluene with a straight face? Well, it's also known to be produced by plankton, so you can always refer to it as a natural product, if that makes you feel better. That doesn't do much for me - as an organic chemist, I know that the compounds found in plankton range from essential components of the human diet all the way down to some of the most toxic molecules found in nature.

Number Seven: Synthetic Growth Hormones. These are the ones given to cattle, not the ones athletes give to themselves. The article says that they can "give humans breast, colon, and prostate cancer", which, given what's actually known about these substances, is a wildly irresponsible claim.

The article sends you to a perfectly reasonable site at the American Cancer Society, which is the sort of link that might make a BuzzFeed reader think that it must then be about, well, what kinds of cancer these things give you. But have a look. What you find, first off, is that this is not an issue for eating beef. Bovine growth hormone (BGH) is given to dairy cattle to increase milk production. OK, so what about drinking milk?

Here you go: for one, BGH levels in the milk of treated cows are not higher than in untreated ones. Secondly, BGH is not active as a growth hormone in humans - it's selective for the cow receptor, not the human one. The controversy in this area comes from the way that growth hormone treatment in cows tends to increase levels of another hormone, IGF-1, in the milk. That increase still seems to be within the natural range of variability for IGF-1 in regular cows, but there is a slight change.

The links between IGF-1 and cancer have indeed been the subject of a lot of work. Higher levels of circulating IGF-1 in the bloodstream have (in some studies) been linked to increased risk of cancer, but I should add that other studies have failed to find this effect, so it's still unclear what's going on. I can also add, from my own experiences in drug discovery, that all of the multiple attempts to treat cancer by blocking IGF-1 signaling have been complete failures, and that might also cause one to question the overall linkage a bit.

But does drinking milk from BGH-treated cows increase the levels of circulating IGF-1 at all? No head-to-head study has been run, but adults who drink milk in general seem to have slightly higher levels. The same effect, though, was seen in people who drink soymilk, which (needless to say) does not have recombinant cow hormones in it. No one knows to what extent ingested IGF-1 might be absorbed into the bloodstream - you'd expect it to be digested like any other protein, but exceptions are known.

But look at the numbers. According to that ACS web summary, even if the protein were not degraded at all, and if it were completely absorbed (both of which are extremely unrealistic top-of-the-range assumptions), and even if the person drinking it were an infant, and taking in 1.6 quarts a day of BGH-derived cow milk with the maximum elevated levels of IGF-1 that have been seen, the milk would still contribute less than 1% of the IGF-1 in the bloodstream compared to what's being made in the human body naturally.

Number Eight, Arsenic. Arsenic? It seems like an unlikely food additive, but the article says "Used as chicken feed to make meat appear pinker and fresher, arsenic is POISON, which will kill you if you ingest enough."

Ay. I think that first off, we should make clear that arsenic is not "used as chicken feed". That brings to mind someone pitching powdered arsenic out for the hens, and that's not part of any long-term chicken-farming plan. If you go to the very NPR link that the BuzzFeed article offers, you find that a compound called roxarsone is added to chicken feed to keep down Coccidia parasites in the gut. It is not just added for some cosmetic reason, as the silly wording above would have you believe.

In 2011, a study found that chicken meat with detectable levels of roxarsone had 2.3 parts per billion (note the "b") of inorganic arsenic, which is the kind that is truly toxic. Chicken meat with no detectable roxarsone had 0.8 ppb inorganic arsenic, threefold less, and the correlation seems to be real. (Half of the factory-raised chickens sampled had detectable roxarsone, by the way). This led to the compound being (voluntarily) withdrawn from the market, under the assumption that this is an avoidable exposure to arsenic that could be eliminated.

And so it is. There are other (non-arsenic) compounds that can be given to keep parasite infestations down in poultry, although they're not as effective, and they'll probably show up on the next edition of lists like this one. But let's get things on scale: it's worth comparing these arsenic levels to those found in other foods. White rice, for example, comes in at about 100 parts per billion of inorganic arsenic (and brown rice at 170 ppb). These, by the way, are all-natural arsenic levels, produced by the plant's own uptake from the soil. But even those amounts are not expected to pose a human health risk (say both the FDA and Canadian authorities), so the roughly fifty-fold lower concentrations in chicken would, one thinks, be even less to worry about. If you're having chicken and rice and you want to worry about arsenic, worry about the rice.
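To put those concentrations side by side, here's a small sketch using only the parts-per-billion figures quoted in this post ("fifty-fold" above is a round figure; the exact white-rice ratio works out to a bit over forty-fold):

```python
# Inorganic arsenic levels quoted in the post, in parts per billion.
levels_ppb = {
    "chicken (roxarsone detected)": 2.3,
    "chicken (no roxarsone)": 0.8,
    "white rice": 100,
    "brown rice": 170,
}

chicken = levels_ppb["chicken (roxarsone detected)"]
for food, ppb in levels_ppb.items():
    # Express each level as a multiple of the roxarsone-detected chicken level.
    print(f"{food}: {ppb} ppb ({ppb / chicken:.1f}x chicken)")
```

Even the "elevated" chicken sits more than forty-fold below plain white rice, and more than seventy-fold below brown rice.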

This brings me to the grand wrap-up, and some of the language in that last item is a good starting point for it. I'm talking about the "POISON, which will kill you if you ingest enough" part. This whole article is soaking in several assumptions about food, about chemistry, and about toxicology, and that's one of the big ones. In my experience, people who write things like this have divided the world into two categories: wholesome, natural, healthy stuff and toxic chemical poisons. But this is grievously simple-minded. As I've emphasized in passing above, there are plenty of natural substances, made by healthy creatures in beautiful, unpolluted environments, that will nonetheless kill you in agony. Plants, fungi, bacteria, and animals produce poisons, wide varieties of intricate poisons, and they're not doing it for fun.

And on the other side of the imaginary fence, there are plenty of man-made substances that really won't do much of anything to people at all. You cannot assume anything about the effects of a chemical compound based on whether it came from a lovely rainforest orchid or out of a crusty Erlenmeyer flask. The world is not set up that way. Here's a corollary to this: if I isolate a beneficial chemical compound from some natural source (vitamin C from oranges, for example, although sauerkraut would be a good source, too), that molecule is identical to a copy of it I make in my lab. There is no essence, no vital spirit. A compound is what it is, no matter where it came from.

Another assumption that seems common to this mindset is that when something is poisonous at some concentration, it is therefore poisonous at all concentrations. It has some poisonous character to it that cannot be expunged nor diluted. This, though, is more often false than true. Paracelsus was right: the dose makes the poison. You can illustrate that in both directions: a beneficial substance, taken to excess, can kill you. A poisonous one, taken in very small amounts, can be harmless. And you have cases like selenium, which is simultaneously an essential trace element in the human diet and an inarguable poison. It depends on the dose.

Finally, I want to return to something I was saying way back at the beginning of this piece. The author of the BuzzFeed article knows painfully little about chemistry and biology. But that apparently wasn't a barrier: righteous conviction (along with the worldview mentioned in the above three paragraphs) is enough, right? Wrong. Ten minutes of unbiased reading would have served to poke holes all through most of the article's main points. I've spent more than ten minutes (as you can probably tell), and there's hardly one stone left standing on another. As a scientist, I find sloppiness at this level not only stupid, not only time-wasting, but downright offensive. Couldn't anyone be bothered to look anything up? There are facts in this world, you know. Learn a few.

Why Are Physics Classes Full of Old Stuff?

Everybody and their siblings have been linking to this Minute Physics video, an “open letter” to President Obama complaining about the way that most high school and even intro college physics classes don’t teach anything remotely modern:

I’m not entirely sure where the date of 1865 comes from, but it’s true, the standard intro physics sequence doesn’t really touch what’s normally called “modern physics,” a term which is itself laughably out of date, as it generally refers to special relativity and quantum mechanics as it stood around 1935. We don’t teach really new stuff until about the 300 level in college courses (junior/senior year for students on the normal track), with the possible exception of hand-wavey non-majors courses.

So, why is that, anyway? Are physics teachers and professors just totally oblivious to how backwards and out-of-date their curriculum is? No, of course not, especially at the college level. We're acutely aware that what we teach is mostly old physics– it would be difficult not to notice that we're not teaching anything remotely related to the research we do (in most cases). We're pretty much stuck teaching the material that we do, though, because of constraints that are external to physics departments.

A factoid that I picked up at an AAPT workshop about ten years ago is that only about 3% of students taking introductory college physics ever take another physics course. Now, some of that is attributable to the fact that intro courses are often dry and boring, and we should absolutely fix that. But that tiny retention rate is in large part because the students who are taking intro physics are only taking it because it’s required for some other major. At Union, where I teach, we teach close to 140 students intro Newtonian physics every year, the vast majority of whom intend to major in engineering, with a smattering of chemists and mathematicians mixed in. While the class serves as the entry point to the physics major, in academic parlance it’s really a “service” course– something we’re doing for another department. The next biggest chunk of our enrollments, 70-ish students per year, is the Physics for Life Sciences course, which most of the pre-med students take to prepare for the Physics section of the MCAT.

The fact that these courses are service courses first and foremost constrains what we can teach. And much as we might wish it were otherwise, the engineering and chemistry departments don't particularly want us to teach the cool modern stuff. They want us to teach old physics from 1865, because that serves as the foundation for some of their courses. We have to teach classical mechanics first because that's what the departments that provide most of our students want us to teach.

High school physics ends up covering the same basic material as intro college physics for the same reason that high school biology and chemistry resemble introductory college biology and chemistry– because that's what high school classes do. They offer classes that cover the most basic stuff for those who will never take another science class, and provide a foundation for those who go on to take another course in college.

Why do chemistry and biology teach more modern material than physics does? Because chemistry and biology as sciences developed more recently than physics did. Lord Rutherford's division of science into Physics and Stamp Collecting was snide, but not without some truth in 1900 or so when he said it. The existence of atoms as real physical things wasn't definitively settled until the early 20th century, and a solid understanding of how and why atoms combine into molecules the way they do didn't come along until quantum mechanics was worked out in the 1930s. The theory of evolution, without which nothing in biology makes any sense, wasn't put forth until the 1850s, and most of the biochemistry of life wasn't figured out until the 20th century. Chemistry and biology classes don't spend a whole lot of time on chemistry and biology from before 1865 because most of what was known about those subjects before 1865 has been superseded or cast into a completely different light by more recent discoveries.

Physics before 1865, on the other hand, had accomplished a hell of a lot that's still useful today. The basic laws of mechanics date from the late 1600s, and still work brilliantly for describing the motion of macroscopic objects moving at everyday speeds. Maxwell's equations, which might be the source of the 1865 date, provide a complete and correct description of classical electromagnetism, full stop. They're even compatible with relativity– in fact, relativity grew out of attempts to reconcile Maxwell's equations with the rest of physics. Classical thermodynamics, the laws of which are another possible source of the 1865 date, works extremely well for describing the flow of heat in macroscopic systems (and, in fact, thermodynamics probably accounts for most of the pre-1865 material taught in chemistry classes).
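For readers who haven't seen them, the "1865 physics" in question is remarkably compact. In modern vector-calculus notation (SI units), Maxwell's equations read:

```latex
\begin{align}
\nabla \cdot \mathbf{E} &= \frac{\rho}{\varepsilon_0} && \text{(Gauss's law)} \\
\nabla \cdot \mathbf{B} &= 0 && \text{(no magnetic monopoles)} \\
\nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t} && \text{(Faraday's law)} \\
\nabla \times \mathbf{B} &= \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t} && \text{(Ampère--Maxwell law)}
\end{align}
```

These four equations, together with the Lorentz force law, contain the whole of classical electromagnetism, which is a big part of why they still anchor the intro sequence a century and a half later.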

We spend a lot of time teaching students about physics from before 1865 because physics from before 1865 is pretty damn useful. It's the same reason why the Math department spends most of their time teaching students about math that was developed before 1865 (differential and integral calculus, Euclidean geometry, trigonometry)– because "old" math is still extremely useful. Blaming math and physics for their early success is kind of ridiculous, given that the stuff works. It's also an essential foundation for the cool modern stuff– it's almost impossible to really understand quantum physics without first knowing a good deal about classical physics.

Yes, but what about Carl Sagan and Neil deGrasse Tyson and Richard Feynman? Look, I love what they do, but Sagan and Tyson aren't even in the same business as most people teaching physics. Feynman's the only one of those three who attempts to teach people how to solve problems. As my angry quantum prof in grad school put it, though, while reading the Feynman Lectures may make you feel like you understand everything, "when you try to solve a problem, you realize that, well, that you're not Feynman."

Sagan and Tyson, and Feynman to a large extent, are popularizers, not educators. Their job is to get people fired up about science in a more general way, not to teach them how to do anything with their knowledge. Those are very different businesses– believe me, I know, having written two popular books about cool modern physics. And while I made every effort to ensure that How to Teach Physics to Your Dog and How to Teach Relativity to Your Dog are rigorous and correct (to the point of probably limiting their sales…), I wouldn't begin to claim that they teach people how to do physics. If you want to know that, you need to take actual physics classes at the college level, and you need to start with classical mechanics and E&M.

Finally, as much as I love modern physics, I have a problem with the suggestion that "old" physics is intrinsically boring. True, Newton's Laws don't fire the imagination the way Schrödinger's Cat does, but that doesn't mean you can't do cool things with classical physics. If you don't think classical physics can be cool, you're clearly not reading enough Dot Physics. Rhett's blog is one of the most consistently awesome things on the Internet, and he almost never talks about physics developed after 1865.

This is where there’s a glimmer of hope regarding the complaints in the video. The problem isn’t that old physics is intrinsically boring, but that the traditional way of teaching physics is kind of dull. But if you look at the stuff that Rhett, and Frank Noschese, and John Burk, and Kelly O’Shea, and the whole Global Physics Department crowd are doing, you’ll see that there’s a lot of room to teach “old” physics in new ways that make it much more interesting and appealing, while still satisfying our obligations to other programs and departments.

Of course, there’s another way to look at this, too, which is to try to do cool things with modern physics on a conceptual level even earlier– the whole Physics First approach to high school science. That’s a cool idea, and I’d be happy to see it pushed more strongly because I think it has a lot of potential. As it is, though, it runs up against a lot of entrenched interests, so I don’t give it great odds.

But anyway, if you’re wondering why it is that we teach all this old stuff in physics classes, those are my slightly rant-y thoughts about why. It’s not because we haven’t thought about it, believe me.

Science Communication: The Audience Exists

In the twelve years I’ve been at Union, there are only two times I’ve tried to go to an evening speaker and been turned away. Once was 4-5 years ago, when Maya Angelou spoke on campus; the second time was last night, when Bill Nye the Science Guy spoke. I managed to make it to the foot of the steps of Memorial Chapel before they hit the fire-code limit (939, I think they said the number was) and turned everybody away. There were probably 20-30 students behind me in the line, so even if I had made it all the way to the front, I might’ve stepped aside and let one of them in instead.

It’s worth remembering that, as frustrating as the science communication business gets sometimes, when you can feel like you’re shouting into the void, there really is an audience out there for it. Over 1000 people were willing to stand in line on a cold March evening in hopes of seeing a guy in a bow tie talk about science for an hour. While I’m sorry I didn’t get in– the people I’ve heard from who were there say it was terrific– the mere fact that he drew that kind of a crowd makes me really happy.

(“Featured Image” photo by Matt Milless, grabbed off Facebook.)

Why Should You Think Like a Scientist?


As you may or may not know, I’m currently at work on a book called How to Think Like a Scientist. This raises the fairly obvious question in the post title, namely, why should people think like scientists? What’s the point?

In a sense, this is (as Ethan Zuckerman pointed out at lunch the other day) the underlying question at the heart of the whole endeavor of science communication. I mean, I’ve written two books about modern physics for a general audience, and when I have time, I write this blog aimed at non-scientists. What’s the point of doing all that, anyway? What is it I hope to achieve by all this?

There are, of course, many answers to that, from the cynical (if more people were enthusiastic about science, our funding would be more secure) to the personal (I think modern physics is about the coolest thing ever, and want to share that with others). Neither of those really covers the big picture, though, or makes a satisfying justification for a book about scientific thinking in general.

In the end, the core justification for everything I do in terms of trying to bring science to a broader audience comes back to the idea that science isn’t a collection of facts, it’s an approach to the world. Stripped to its essentials, science is a four-step process: you look at something interesting in the world, you think about why it might work that way, you test your idea with further observations and experiments, and you tell everybody you know what you found.

The formal, institutional version of this is relatively recent, but the basic practice is as old as humanity. The central point of the book-in-progress is that this kind of thinking is something that everybody does, often without even knowing it, in pursuit of various hobbies and interests. The hope is that by helping people recognize their ability to think like a scientist, they will be more conscious of scientific thinking, and this will encourage them to apply it more widely.

But why bother? Why do I think this is an important goal? Well, the simple and obvious answer is to point to all the wonders of modern technology that are only possible through science– airplanes, the Internet, antibiotics. That’s a little glib, though, because it’s perfectly possible to use those devices without understanding how they function. A lot of people will also answer in terms of specific policy goals– if more people understood science, they would be more likely to accept the science involved in a particular set of public health and safety issues, and would vote for policies that address them. That’s also a little narrow, and runs the risk of turning people off on tribal grounds– if you tell someone “You should learn about science so you will abandon your cherished beliefs and cultural practices,” well, that’s not the most persuasive technique.

The bigger, more philosophical answer, for me, anyway, is that I think the world would be a better place if more people thought scientifically because, ultimately, science is an empowering and fundamentally optimistic approach to the world. And we could all benefit from a little more empowerment and optimism.

That might seem like a funny thing to say, when the scientific position on a host of modern issues involves telling us we’re all going to die– crushed by flaming space rocks, killed by drug-resistant diseases, roasted alive by global warming. But I’m talking about a somewhat more abstract form of optimism, here (and anyway, all of those scientific doomsday scenarios are at least partially avoidable, given knowledge of the problem…). The whole foundation of science is the idea that questions have answers and that we can find those answers. Even more than that– an essential part of the scientific approach to the world is the feeling that not knowing is not acceptable.

One of the most frustrating things I encounter in my job is the person who says “I don’t know,” and regards that as a final answer. Mostly, this comes from students asked what the next step of some problem is, hoping that I’ll just tell them what to do. It’s especially infuriating, though, when it comes from colleagues on the faculty and staff. I’m not saying that we have to instantly know everything about everything, but I find it maddening to hear “I don’t know” and not have it followed by “… but here’s what we can do to figure it out.”

That, I think, is the empowering and optimistic side of science– the idea that there isn’t any question that can’t be answered. When you hit something that you don’t know, scientific thinking is a tool you can use to figure it out. It may take a while, and require meticulous planning and testing, but given time and that look-think-test-tell process, you can get an answer to just about any question that’s worth asking.

That’s an incredibly powerful idea, and it’s something that huge numbers of people would benefit from. You’re never forced to stop doing anything because you don’t know how something works– you have the tools you need to figure out whatever it is that you need to know. And once you know how it works, you can often use that knowledge to make it work better, improving your life in the process.

(Now, I don’t want to go completely overboard into arrogance, here– using science to improve things can be taken too far, and we need to be mindful of the consequences of any scientific project. And there may be projects that are just too expensive or impractical to take on. But even there, the way we know about the costs and consequences is through science and scientific thinking…)

So, ultimately, I think we need to communicate science to a broader audience not because it enables practical technologies, or serves specific policy goals, or helps people get good jobs. We should communicate science to a broader audience because scientific thinking turns “I don’t know” into “I don’t know… yet.” That’s an incredibly liberating tool, and the world would be a better place if more people were comfortable using it.
