By Daniel T. Blumstein
Despite the falsehoods that some politicians peddle, facts still matter, and getting those facts right is essential for survival. I know, because I regularly see the deadly consequences of getting facts wrong.
I am a behavioral ecologist, and I study how animals assess and manage predation risk. But, rather than study the flashy predators – with their sharp teeth, stealthy approaches, and impressive sprinting abilities – I focus on their food.
Some wallabies make bad use of facts. Too often, these four-legged snacks ignore information right in front of them – like rustling in the underbrush or the scent of a passing carnivore. And they pay for this ignorance dearly, with the sudden slash of talons, or the constricting squeeze of a powerful jaw.
But my research has shown that many would-be meals – marmots, birds, lizards, fish, and sessile marine invertebrates among them – are better at assessing risk. In 1979, the evolutionary biologists Richard Dawkins and John Krebs proposed the “life-dinner principle,” which holds that prey, with more to lose than predators, are more creative survivalists. The risk of being eaten – and thus removed from the gene pool – provides a strong incentive to up one’s game. For the predator, the only consequence of failure is going hungry until the next meal.
We see the life-dinner principle at work all around us. When shorebirds or ducks flock together as a dog runs down the beach or along a pond, it is because the birds understand that there is safety in numbers. People do the same thing. We feel more anxious, for example, when we surf alone, because we know that, in the extremely unlikely event that a shark decides it wants a fiberglass-and-neoprene meal, our odds of survival increase when the shark has more than one target from which to choose.
People, just like animals, need reliable, truthful data to make good decisions. Once, while I was studying marmots in the Karakoram mountain range between China and Pakistan, a lack of facts almost got me killed. A cataclysmic rainstorm and resulting landslides had cut off all access into and out of my study site, disorienting me as I sought to leave. As conditions worsened, it became impossible to craft an exit strategy.
Because I was battling typhoid and carrying a lot of research gear, I simply didn’t have the energy to walk across miles of slumping rock and mud. Only days later, when the threat had passed and I was finally able to leave the area, did I realize how unreliable the information about the roads and alternative routes that I had been relying on really was.
While any self-respecting scientist must question everything and be critical of accepted wisdom, it is possible to make predictions, design experiments to collect data, and, after analyzing those data, draw conclusions that support or refute the original prediction. We learn – and science advances – by constantly challenging assumptions with fresh, factual information. In this way, we test and hone our ideas until we are left with a conclusion that cannot easily be refuted. We call this our “revealed truth.”
But revealed scientific truth is always subject to new analysis, new scrutiny, and new interpretation. It is always regarded as provisional – that is, subject to later falsification – rather than becoming a fully accepted dogma.
When scientists, and the public at large, dismiss well-supported hypotheses by citing so-called alternative facts, supported by nothing more than emotion or personal belief (post-truth in political-speak), we miss an important opportunity to strengthen our analysis. When we glibly dismiss fact-checked articles in reputable news sources as “fake news,” we fail to use evidence to support our conclusions. In politics as in science, when we dismiss revealed truth, we increase the likelihood of catastrophically bad outcomes.
People have survived because their ancestors got their facts right, like the shorebird that flocks at the hint of danger. In all aspects of life, we should insist on a scientific process that bases decisions on accumulated observations. If, and when, there is enough evidence to support a particular conclusion, we should accept it. Ongoing, self-critical analysis is essential, but only if new ideas, new evidence, or new experimental methods are introduced.
For humanity, emulating the predator-naïve wallaby, and simply ignoring the rustle in the bushes, is no way to avoid being killed. Rather, it is a certain recipe for extinction.
Daniel T. Blumstein is a professor at the UCLA Department of Ecology and Evolutionary Biology, and the UCLA Institute of the Environment and Sustainability. His most recent book, co-edited with William E. Cooper, Jr., is Escaping From Predators: An Integrative View of Escape Decisions.
Copyright: Project Syndicate, 2017.