Wednesday, June 18, 2014

Only Some of These Really Bother Me

(A version of this post has also appeared on my Tumblr)
Article header from "10 Scientific Ideas That Scientists Wish You Would Stop Misusing"
There's an article listing ten words/concepts from various fields of science that are commonly misused by laypeople, so I had to look at it and see whether any of my pet peeves made the list.

There were a few:
3. Quantum Uncertainty and Quantum Weirdness
[Astrophysicist Dave] Goldberg adds that there's another idea that's been misinterpreted even more perniciously than "theory." It's when people appropriate concepts from physics for new agey or spiritual purposes:
This misconception is an exploitation of quantum mechanics by a certain breed of spiritualists and self-helpers, and epitomized by the abomination, [the movie] What the Bleep Do We Know? Quantum mechanics, famously, has measurement at its core. An observer measuring position or momentum or energy causes the "wavefunction to collapse," non-deterministically. (Indeed, I did one of my first columns on "How smart do you need to collapse a wavefunction?") But just because the universe isn't deterministic doesn't mean that you are the one controlling it. It is remarkable (and frankly, alarming) the degree to which quantum uncertainty and quantum weirdness get inextricably bound up in certain circles with the idea of a soul, or humans controlling the universe, or some other pseudoscience. In the end, we are made of quantum particles (protons, neutrons, electrons) and are part of the quantum universe. That is cool, of course, but only in the sense that all of physics is cool.
4. Learned vs. Innate
Evolutionary biologist Marlene Zuk says:
One of my favorite [misuses] is the idea of behavior being "learned vs. innate" or any of the other nature-nurture versions of this. The first question I often get when I talk about a behavior is whether it's "genetic" or not, which is a misunderstanding because ALL traits, all the time, are the result of input from the genes and input from the environment. Only a difference between traits, and not the trait itself, can be genetic or learned — like if you have identical twins reared in different environments and they do something different (like speak different languages), then that difference is learned. But speaking French or Italian or whatever isn't totally learned in and of itself, because obviously one has to have a certain genetic background to be able to speak at all.
6. Gene

[Synthetic biologist Terry] Johnson has an even bigger concern with how the word gene gets used, however: 
It took 25 scientists two contentious days to come up with: "a locatable region of genomic sequence, corresponding to a unit of inheritance, which is associated with regulatory regions, transcribed regions and/or other functional sequence regions." Meaning that a gene is a discrete bit of DNA that we can point to and say, "that makes something, or regulates the making of something". The definition has a lot of wiggle room by design; it wasn't long ago that we thought that most of our DNA didn't do anything at all. We called it "junk DNA", but we're discovering that much of that junk has purposes that weren't immediately obvious. 
Typically "gene" is misused most when followed by "for". There are two problems with this. We all have genes for hemoglobin, but we don't all have sickle cell anemia. Different people have different versions of the hemoglobin gene, called alleles. There are hemoglobin alleles which are associated with sickle cell diseases, and others that aren't. So, a gene refers to a family of alleles, and only a few members of that family, if any, are associated with diseases or disorders. The gene isn't bad - trust me, you won't live long without hemoglobin - though the particular version of hemoglobin that you have could be problematic.
I worry most about the popularization of the idea that when a genetic variation is correlated with something, it is the "gene for" that something. The language suggests that "this gene causes heart disease", when the reality is usually, "people that have this allele seem to have a slightly higher incidence of heart disease, but we don't know why, and maybe there are compensating advantages to this allele that we didn't notice because we weren't looking for them".
Those were the ones that resonated with me the most; others were only minor peeves or didn't actually bother me at all.

Misused Word #1, "Proof," was only a minor annoyance for me, in that I'm almost never talking about mathematical proofs, and even if I were the sort of person who uses them routinely, it still seems to me that most of the things people colloquially talk about "proving" are impossible to express in mathematical terms anyway.

It just seems to me like there wouldn't be very many circumstances in which mixing up the technical and colloquial meanings of "proof" would be an issue that would even arise.

(I have found that the most annoying sources of confusion in scientist/layperson conversations about proof have to do with standards of evidence, or with degrees of uncertainty. You can be less sure of one thing than you are of another, even if you're not 100% certain about the thing you're more sure of.)

Similarly, "theory" also doesn't annoy me that much because I don't usually have much trouble adjusting to different usages of words in different contexts.

I can see how it would get really old having to explain the technical meaning of "theory" over and over again, though.
Those peeves are more about the meanings of specific words than about whole networks of ideas, so they're easier for me to adapt to when they surprise me in conversation.
The ones discussed in the quoted text above, though? Misuse of ideas derived from quantum mechanics, misinterpretations of evolution and natural selection, or the idea that genes are "for" specific things? Those come with so many other ideas connected to them, so many wrong things tacitly accepted as premises, that I feel like I need a ball of yarn to slowly pick my way back to the start of the conceptual maze.
