I was escorted off a plane once. I had just successfully scored a stand-by seat on an earlier flight home. I was the last one on and boarded with an extra spring in my step. Seconds after I sat down, a uniformed airline person approached me and told me to get my carry-on and follow him. No explanation. I could feel the eyes of all the other passengers on me. I am sure they thought I was headed to Guantanamo. Once off the plane, the airline staff told me I was taken off because the plane was overweight. I’m not sure which is more humiliating: being taken off the plane as a suspected terrorist, or as the deadweight that could prevent take-off. I am somewhat tall, but I don’t think I am overweight. I spent a long time online checking whether an “overweight plane” is a real thing. Apparently it is: even ticketed passengers sometimes have to be taken off planes and rebooked, though it only happens with small planes. I still fantasize about telling my former fellow passengers on my way off the plane, “Nothing to worry about here, just a real-world implementation of a stack: last-in, first-out”.
There has been a spate of reports of articulate and educated individuals being reported to security for “suspicious behavior” and quickly escorted off the plane. They became objects of suspicion after speaking on the phone in a foreign language, wearing a turban, or scribbling undecipherable equations. I only know what is in the popular press, but the general sequence of events appears to be the same: a fellow passenger “sees something and says something” to a flight attendant; security is quickly called in; and with little to no additional risk assessment, the individual is escorted off the plane, which leaves without them. They are then questioned and rebooked on a later flight.
The overwhelming majority of “suspicious” individuals detained by security are not terrorists (even the idiots who bring loaded guns in their carry-ons). So nearly everyone being deplaned or otherwise humiliated is highly likely to be innocent.
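This is the base-rate effect in action. Here is a minimal back-of-the-envelope sketch (all numbers are made up for illustration) of why a screen for something extremely rare flags mostly innocent people, no matter how sharp-eyed the fellow passengers are:

```python
def prob_threat_given_flag(base_rate, sensitivity, false_positive_rate):
    """Bayes' rule: P(actual threat | flagged as suspicious)."""
    true_flags = base_rate * sensitivity                    # real threats caught
    false_flags = (1 - base_rate) * false_positive_rate     # innocents flagged
    return true_flags / (true_flags + false_flags)

# Illustrative, made-up numbers: threats are vanishingly rare among
# passengers, and even a small per-passenger false-positive rate
# swamps the true positives.
base_rate = 1e-7            # assumed fraction of passengers who are actual threats
sensitivity = 0.99          # assumed chance a real threat looks "suspicious"
false_positive_rate = 0.001 # assumed chance an innocent passenger looks "suspicious"

p = prob_threat_given_flag(base_rate, sensitivity, false_positive_rate)
print(f"P(threat | flagged) = {p:.6f}")  # a tiny number: nearly all flags are innocent
```

With these (invented) inputs, fewer than one in a thousand flagged passengers would be an actual threat, which is the arithmetic behind "nearly everyone being deplaned is highly likely to be innocent."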
False positives are an inevitable price for vigilance. Perhaps we just need to accept that searching for terrorists means a proportion of people will always be inconvenienced and humiliated through no fault of their own. The harder we look, the more false positives we generate. We should embrace the idea and institute a system of appropriate compensation. If you are subjected to the inconvenience of being taken off a plane (e.g. being rebooked on a later flight) and you turn out not to be a terrorist, then you should get an automatic reward, say $1,000, as a way of saying: sorry we doubted, humiliated, and inconvenienced you; false positives are an inevitable outcome of vigilance.
A similar system might apply in science. I think there is a stigma attached to admitting that a past experiment (particularly a published one) may not be all it was cracked up to be. I am not talking about major, retraction-level problems here, but rather the kinds of good-faith experiments that we all perform and report, yet over time come to believe in less and less. It is not that they are wrong; rather, the observation was unknowingly overstated, or does not mean what we thought it did in the original paper. For instance, I have seen a number of genetics papers (including even TCGA) that report genes that do not stand up to later scrutiny of the same data. It is likely that having the sensitivity to detect causal genes means accepting the risk that other genes might show similar events without the causality (false positives).
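The same arithmetic applies to gene screens: test enough candidates at a fixed significance threshold and false positives are guaranteed by chance alone. A tiny sketch, with made-up numbers:

```python
# Multiple-testing sketch (illustrative, made-up numbers): screening many
# genes at a per-gene threshold produces false positives by chance alone.
n_genes = 20_000       # assumed number of genes tested
n_truly_causal = 50    # assumed number of genuinely causal genes
alpha = 0.05           # per-gene significance threshold

# Each non-causal gene "passes" the threshold with probability alpha,
# so the expected count of false positives is simply:
expected_false_positives = (n_genes - n_truly_causal) * alpha
print(f"Expected false positives: {expected_false_positives:.1f}")  # on the order of a thousand
```

Under these (invented) assumptions, chance hits would outnumber the real causal genes roughly twenty to one, which is why corrections like Bonferroni or false-discovery-rate control exist, and why some reported genes fade under later scrutiny even when everyone acted in good faith.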
It would be interesting if we could revisit our published papers and provide accompanying commentary on how different aspects fared over time (similar to the “director’s commentary” that accompanies movies on DVD, but with the benefit of hindsight). We do this in lab meetings immediately following publication (“Mimosa Mondays”), but it would be interesting to revisit our papers after a few years. Maybe we’ll do that in a future blog entry.