

Guest Editorial: Don’t Slow Down
by  Richard A. Lovett

A few years ago, my city installed digital signs along several freeways. When activated, they post advisory speeds—not official limits, but suggestions for the fastest you might want to go, given the conditions you are approaching. In light traffic or good weather, they are turned off: just follow the normal speed limits. In bad weather, or if there is congestion ahead, they can read anything from 45 miles per hour to “slow.”

In the argot of traffic engineers, these signs are called variable speed-limit systems. The speeds they advise are based on real-time assessments of how to get everyone to their destinations as quickly and safely as possible—the type of advice we all should welcome. But that’s not how it appears to work. My own experience is that when the signs light up, advising people to slow down in preparation for dangerous conditions ahead, most do the opposite, accelerating in a mad race to get to the choke point a bit ahead of everyone else. It’s particularly dramatic if one of the signs changes status as you are approaching it. As I was working on this editorial, I watched that happen when a sign ahead of me flipped from 25 miles per hour to “slow.” Immediately, everyone around me surged forward. Clearly, whatever message these signs are intended to convey isn’t the one being received.

The idea that good ideas can backfire is, of course, nothing new. It’s sometimes called the law of unintended consequences. But usually, the unintended consequence is the result of an overlooked effect of compliance with a seemingly good idea. A great example in my part of the country (the American West) is what’s been called the Smokey Bear Effect. Seventy-five years ago, “Only you can prevent forest fires” seemed a great idea. Now we know that over-aggressive fire suppression allowed a buildup of brush that helps fuel today’s megafires.

Another example was the loss of the SS Eastland in 1915. That was, in part, caused by a law passed in the aftermath of the Titanic disaster that required steamships to increase their contingent of lifeboats. It looked like a good idea, but complying with it rendered the Eastland so top-heavy that it capsized in twenty feet of water, only twenty feet from shore, drowning 844 passengers and crew. It was one of the worst maritime disasters in American history, and it was precipitated by compliance with what was supposed to be a safety precaution.

*   *   *

The backfire I’m seeing from variable speed-limit signs, however, is different, because it doesn’t come from unexpected consequences of people doing what’s advised: it comes from them doing the opposite.

Joshua Madsen, an accounting professor turned traffic-safety researcher at the University of Minnesota, isn’t surprised. “The story you share makes a lot of sense,” he says. That’s because Madsen has spent nearly a decade studying an entirely different type of advisory sign, and he has found that those signs, too, appear to cause people to do exactly the reverse of what’s being advised.

Madsen’s signs have sprung up in twenty-eight American states and several other countries. They are intended to reduce traffic fatalities by alerting motorists to the risks they encounter every time they take the wheel, starkly proclaiming such messages as “1159 people have died this year on Georgia roads.”

Madsen first saw one on a trip through Chicago. “Then, I saw another in Nashville,” he says. That got him wondering whether these signs actually worked.

In an effort to find out, he teamed up with Jonathan Hall, now an economist at the University of Toronto, Canada, and spent years scouring traffic-safety data. It wasn’t an easy task. The problem was that these messages are generally shown on large digital displays, called dynamic message signs (DMSs), which are also used for other purposes: road-construction advisories, estimated travel times, and unrelated safety reminders such as “Click It or Ticket” and “Drive Sober.” That meant there was no pattern to when the fatality warnings were given, making it very difficult to collect meaningful statistics from which to assess their effectiveness.

The breakthrough came when Madsen and Hall discovered that the State of Texas had a unique policy of displaying these warnings one week each month: always (and only) on the week before the Texas Department of Transportation’s monthly board meeting. Why the state did this isn’t clear, but it made a perfect natural laboratory for testing their effectiveness.

Madsen and Hall identified the locations of nearly nine hundred such signs and compared traffic-accident statistics from the ten kilometers downstream from them on weeks when they were displaying traffic-fatality warnings to those from weeks when they weren’t. It was the perfect controlled study, because the only thing that changed from the “on” weeks to the “off” weeks was what was being displayed on the signs.
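
What that comparison amounts to is easy to sketch. Here is a minimal illustration in Python, using invented weekly crash counts rather than the study’s actual Texas data, of the kind of “message on” versus “message off” comparison Madsen and Hall performed (their real analysis, of course, involved years of records and far more careful statistics):

```python
# Illustrative sketch only: toy numbers, not the actual Texas crash data.
# Idea: for each sign, compare crash counts in the 10 km downstream during
# "message on" weeks (the week before the DOT board meeting) with counts
# during "message off" weeks, when the sign shows ordinary messages or nothing.

on_weeks = [22, 23, 21, 24]            # hypothetical weekly crash counts, message displayed
off_weeks = [21, 22, 20, 22, 21, 22]   # hypothetical weekly crash counts, message not displayed

mean_on = sum(on_weeks) / len(on_weeks)
mean_off = sum(off_weeks) / len(off_weeks)
pct_change = 100 * (mean_on - mean_off) / mean_off

print(f"Mean crashes, message on:  {mean_on:.2f}")
print(f"Mean crashes, message off: {mean_off:.2f}")
print(f"Change when the fatality message is displayed: {pct_change:+.1f}%")
```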

It was also a mammoth exercise in data collection. “I started in 2013,” Madsen says. “We have been working on it ever since.” But eventually, he and Hall had a data set so extensive that they could, if they wanted, examine the effects on an hour-by-hour basis. Nine years into the project, on April 22, 2022, they published their findings in Science,1 reporting that the signs not only didn’t work, but that they actually increased accident rates by about 4.5%. That may not seem like a lot, but U.S. traffic fatalities are now close to 43,000 people per year. A 4.5% increase would be nearly 2,000 additional deaths.2
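
The arithmetic behind that last sentence is simple enough to show directly (a back-of-the-envelope figure only; as the endnote explains, not every traffic death happens on a freeway near one of these signs):

```python
# Rough arithmetic behind the "nearly 2,000 additional deaths" estimate.
annual_us_traffic_deaths = 43_000   # approximate recent annual total
reported_increase = 0.045           # 4.5% increase reported in the study
extra_deaths = annual_us_traffic_deaths * reported_increase
print(f"{extra_deaths:,.0f} additional deaths")   # about 1,935
```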

Madsen thinks the reason the signs backfired is probably that they were too shocking (though he prefers the word salient). “The concept of death is much more salient than ‘Drive Sober’ or ‘Click It or Ticket,’” he says. “[It] has its own special place in human existence.” That special place, he says, may mean that the signs grab attention in the wrong way, making people ponder their mortality rather than doing what they are being encouraged to do, which is to drive more safely. Either that, or perhaps drivers overcompensate, braking more strongly than necessary when they need to slow down.

An alternative explanation comes from Gerald Ullman, a transportation engineer at Texas A&M University, College Station, Texas. Maybe, he says, it’s just because these signs are providing too much information, at a time when drivers, already engaged in the complex process of navigating urban freeways, really don’t need it.3

It’s a plausible argument, but Madsen counters by diving deeper into his dataset. His data don’t just show the overall effect of warning signs: he can relate that effect to the number of traffic deaths stated on them. “The bigger the number displayed, the more crashes,” he says. That suggests that larger fatality numbers are more frightening, and therefore more likely to start drivers thinking about the wrong things, such as their own mortality, as opposed to not running into the vehicle ahead of them.4
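
That is, in effect, a dose-response claim. As a rough illustration of how such a relationship can be checked, one can fit a simple least-squares line relating the fatality count displayed on a sign to the downstream crash rate; the sketch below uses invented numbers purely for illustration, not the study’s data:

```python
# Toy dose-response check: does the relative crash rate rise with the
# number displayed on the sign? Invented numbers, for illustration only.
displayed_deaths = [100, 500, 1000, 1500, 2000, 2500]    # number shown on the sign
crash_rate = [1.00, 1.02, 1.03, 1.05, 1.06, 1.08]        # relative downstream crash rate

n = len(displayed_deaths)
mean_x = sum(displayed_deaths) / n
mean_y = sum(crash_rate) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(displayed_deaths, crash_rate)) / \
        sum((x - mean_x) ** 2 for x in displayed_deaths)

print(f"Fitted slope: each additional 1,000 deaths on the sign "
      f"adds about {1000 * slope:.3f} to the relative crash rate")
```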

*   *   *

One thing Madsen and Ullman agree on is the importance of thinking before implementing seemingly good ideas willy-nilly, especially because traffic signs aren’t the only things that can backfire.

In 2007, researchers from California State University, San Marcos, examined the effect of telling people how their electric power consumption compared to the average. Many of us have gotten such analyses in our utility bills (I do, every month). They are intended to tell us how well we are doing at conserving power and to encourage us to do better, but what the CSU San Marcos team found, Madsen says, was that their effect varied. If people were consuming more power than average, telling them so encouraged them to conserve. But if they were already doing better than average, they tended to relax their efforts rather than trying to do better still.5

For Madsen, the take-home message is that it’s important to put good ideas to the test before introducing them into general practice. If Texas hadn’t accidentally created the perfect experiment with its strict, monthly cycle of how it used highway fatality signs, it’s unlikely anyone would ever have realized that they didn’t work. After all, they really did look like a great idea.

Controlled experiments, however, aren’t the only way ideas can be put to the test. Speculative fiction may not be as rigorous as the statistical analyses conducted by Madsen and Hall, but that doesn’t mean it isn’t a very good laboratory for figuring out where such ideas might go wrong.

A staple of science fiction, especially here in the pages of Analog,6 has long been the “cautionary tale,” designed to ask just that kind of question: what might go wrong with a seemingly good idea. I myself have written a number of these, but Isaac Asimov was far ahead of the rest of us when, in 1958, he penned his short story, “The Feeling of Power,” about a future in which people had become so dependent on what today we would call calculators that they became incapable of doing the simplest arithmetic on their own. When he wrote that story, sixty-four years ago, it seemed far-fetched. Today? If most people need to do anything much more complex than adding 2+2, they are grabbing for their devices.

My own best (or, at least, favorite) cautionary tale was one I called “The Last of the Weathermen” (July/Aug 2007). It centered on a backpacker going into a remote mountain range, right on the cusp of winter. He had a smart device that would give him hour-by-hour forecasts, but as a retired meteorologist, he didn’t like what he saw in the sky and trusted his eyes and experience over the predictions of his device. In the process he narrowly saved both himself and a desperate hiker he encountered along the way.

What most readers didn’t know is that the mountain range in the story is real. The hike was real. The weather was real. I was there. I did it, and bailed from my planned route because I too didn’t like what I saw in the sky. When a nearby city unexpectedly awoke to twelve inches of snow, I was safe and sound.

The bottom line is this. Whenever we have a “great” idea for how to make the world a better place, one of the first things we need to do is to find a way to test whether it works. Sometimes, that’s via a scientific experiment. Sometimes, it’s through science fiction. Either way, it’s an exercise in asking “what could go wrong?” and being willing to accept the possibility that our best brainstorms might be fatally flawed.

There’s a term for a person unwilling to ask those questions. That term is “intellectual coward.” And that’s not what those of us reading this magazine see ourselves as being.

 

Endnotes:

1  Can behavioral interventions be too salient? Evidence from traffic safety messages. Jonathan D. Hall and Joshua M. Madsen. Science, 22 April 2022, 376(6591). DOI: 10.1126/science.abm3427.

2  Not that freeway crashes are the leading cause of traffic deaths. But still, even a small increase in freeway crashes almost certainly means substantially more deaths and serious injuries.

3  How safe are safety messages? Gerald Ullman and Susan Chrysler. Science, 22 April 2022, 376(6591): 347-348. DOI: 10.1126/science.abq1757.

4  Ullman’s counter is that the larger the number on the sign, the harder it is to process, meaning that “2937 fatalities to date this year” might be harder to understand (and therefore more distracting) than, say, 29 fatalities. This makes sense, but the bottom line remains: these signs backfire, and the psychology of why that happens is of more interest to psychologists than to traffic safety planners.

5  The Constructive, Destructive, and Reconstructive Power of Social Norms. P. Wesley Schultz, Jessica M. Nolan, Robert B. Cialdini, et al. Psychological Science, May 1, 2007. DOI: 10.1111/j.1467-9280.2007.01917.x.


 

Richard A. Lovett first appeared in Analog in 1999. Since then, he’s returned more than 185 times. In the process, he has collected thirteen AnLabs, more than anyone else in the magazine’s history. His latest book of fiction is Neptune’s Treasure: The Adventures of Floyd and Brittney. His latest book of science fact is Here Be There Dragons: Exploring the Fringes of Human Knowledge. Find him on Facebook or at http://richardalovett.com/wp.

Copyright © 2023 Richard A. Lovett
