Weather and Society Watch
Guest Editorial

Testable Hypotheses and the Societal Impacts Discussion Board
by Jeffrey K. Lazo*

At some point in the past, at least by the time of Aristotle, philosophers recognized that a fundamental distinction should be drawn between two kinds of scientific knowledge—roughly, knowledge that and knowledge why. It is one thing to know that each planet periodically reverses the direction of its motion with respect to the background of fixed stars; it is quite a different matter to know why. Knowledge of the former type is descriptive; knowledge of the latter type is explanatory. It is explanatory knowledge that provides scientific understanding of the world. (Salmon, 1990)

It is an interesting phenomenon that after every severe weather event the Societal Impacts Program Discussion Board (http://www.rap.ucar.edu/forums/phpBB2/) is filled with discussions about why people reacted the way that they did, about what they should have done differently, about whether or not they received the warning or forecast, about why they didn't respond if they did receive it, about why they didn't receive it if they didn't, about ….

There is inevitably a passionate discussion about what the weather community can do to get people to do what they "should do" when there is a severe weather event, about what role social media plays, about how sirens do or don't work, about whether or not people should have shelters, about how to improve the spatial scale of warnings, about …

And many of the same issues, questions, concerns come up again and again with every new event.

With the devastating tornado, flood, and wildfire season we've experienced so far this year, there has been no shortage of similar discussions on the Board. While I admit I haven't been able to keep up with all of the posts, I have seen enough to know that there has been much enlightening and thoughtful discussion.
All of these discussions are in keeping with the purpose of the Board – "for societal impacts researchers, forecasters, policy makers, and other interested parties to post and receive information relevant to the societal impacts of weather and weather forecasting."

I feel sometimes, though, that we are covering the same ground and that a certain level of frustration exists that we don't have answers to many of the discussion points.

I would therefore like to offer one approach that may help move some of these discussions forward.

I can't take credit for the idea but borrow an approach George Youngs used at the Red River and Devils Lake Integrated Warning Team workshop in Fargo, N.D., this June. George is a sociologist with the Department of Emergency Management at North Dakota State University.

During the first day of the workshop, George listened to what people were saying about their agencies, the problems they face, and issues related to flooding in the Red River and Devils Lake area. He identified a number of common themes and issues that came up during that first day. The second morning of the workshop, he presented these themes and issues—but George had transformed these "discussion points" into hypotheses that could be tested using social science research and methods.

Why did this make sense? Because by taking problems and issues that were presented as assertions, anecdotes, frustrations, or concerns and recasting them as hypotheses, he offered a way to actually deal with them. He offered the scientific method as embodied in the social sciences as a way to develop explanatory understanding that would allow decision makers to base their decisions on "knowledge" rather than continuing to feel frustrated with not knowing why people reacted the way that they did, what they should have done differently, whether or not they received the warning or forecast, why they didn't respond if they did receive it, or why they didn't receive it if they didn't.

When reading some of the recent postings on the Discussion Board, I felt that a similar approach could be taken with many of the comments there.

In a quick scan of these discussions, I have seen virtually no citations of research supporting or refuting the assertions made. Note: I am not saying that a citation proves something true or false, but it does move the discussion to a level where issues can be debated based on evidence and the scientific practice of evaluating that evidence (i.e., the scientific process).

So … here is my attempt to "George Youngs" some of these comments. I skimmed several of the comments and did not choose any particular issue to address or anyone in particular to pick on or support but simply chose some statements that caught my eye. Many of these are assertions that may be based on extensive personal experience or anecdotal evidence and may or may not be true. My point is not that they are or aren't correct but that we may not know whether or not they are—and we can't make sound decisions based on anecdotes.

Also, I don't know the extent to which some of these hypotheses have already been tested and answered by valid and reliable social science research. I would encourage more discussion on the Board about this research where it is available. I also note that many of the comments below could generate many different hypotheses, so feel free to make up some of your own!

  • Assertion: "…you have to use social media to reach younger people…"
    Hypothesis: "Social media is the primary channel by which younger people access weather warnings."
  • Assertion: "… [on TV] a constant barrage of bugs, crawls, and cut-ins over-saturates viewers with information, and they basically tune things out…"
    Hypothesis: "Increased provision of weather information by multiple methods in broadcast media causes cognitive overload."
  • Assertion: "…with wall-to-wall [media coverage of an event] it becomes harder for people to distinguish between low-end storms and those like we've seen this year in Tuscaloosa and Joplin…"
    Hypothesis: "Wall-to-wall broadcast coverage of every severe weather event reduces viewers' ability to distinguish low-end storms from major events such as the Tuscaloosa and Joplin tornadoes."
  • Assertion: "…as long as ratings are involved, the thought of sharing anything will be difficult for broadcasters to swallow…"
    Hypothesis: "Broadcast meteorologists' decision process is based on a highly competitive environment measured by ratings and, thus, they are unwilling to work cooperatively."
  • Assertion: "…we just have to work on educating people on how to use that knowledge and information effectively…"
    Hypothesis: "Providing people with educational opportunities about weather watch and warning information will increase the likelihood that they respond effectively to this information."
  • Assertion (about the broadcast media/industry): "…the industry itself is doing everything it can to push away the good guys…"
    Hypothesis: "The media industry is collectively working to remove high quality conscientious broadcast meteorologists."

Okay, I chose some of the comments to raise issues of the degree to which assertions, even if they are testable as hypotheses, may involve generalizations. On the other hand, if this last assertion were true, that would indicate a significant problem for the future communication of accurate and reliable weather warning information!

As you may expect, I could pull out many, many more assertions from the posts on the Discussion Board. Most of these just happened to relate to media communication.

The point is that making assertions again and again doesn't add to knowledge—in fact, it may add to frustration. By recognizing that these are assertions—perhaps based on experience and observation—and moving to formulate these as hypotheses, we take the first step toward developing knowledge. One description of four steps of the "scientific method" is:

  1. Characterization from experience and observation
  2. Hypothesis: a proposed explanation
  3. Deduction: prediction from the hypothesis
  4. Test and experiment
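As an illustration of step 4, consider the social-media assertion from the list above. The sketch below is purely illustrative: it assumes a hypothetical survey asking respondents in two age groups whether social media is their primary source for weather warnings (all counts are made up), and applies a standard two-proportion z-test using only Python's standard library.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided two-proportion z-test.

    x1/n1: successes/sample size in group 1 (e.g., younger respondents
           who name social media as their primary warning channel).
    x2/n2: same quantities for group 2 (e.g., older respondents).
    Returns (z, p_value).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled proportion under the null hypothesis
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical (invented) survey counts: 62 of 100 younger respondents
# vs. 35 of 100 older respondents name social media as their primary channel.
z, p = two_proportion_z_test(62, 100, 35, 100)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If real survey data behaved like these invented counts, the hypothesis would survive the test; real data could just as easily refute it, which is precisely the point of moving from assertion to test.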

Let's recognize that many of the Discussion Board assertions are, at most, "characterization[s] from experience and observation." Building on these characterizations, let's take the next step of developing hypotheses. As defined in Wikipedia, "A hypothesis is a suggested explanation of a phenomenon, or alternately a reasoned proposal suggesting a possible correlation between or among a set of phenomena." This would be a first step from the world of "societal impacts" into the world of the social sciences.

Applying the scientific method to experience and observations, making deductions, testing hypotheses, and building knowledge could move us from assertions and frustrations to developing approaches to reduce the societal impacts of hazardous weather. Think what we could do to improve societal outcomes if we had "explanatory knowledge" about human behavior during severe weather to the same extent that we have "explanatory knowledge" about the weather events themselves!

Remember the quote from Salmon: "It is explanatory knowledge that provides scientific understanding of the world."

*Jeff Lazo (lazo@ucar.edu) is the director of the Societal Impacts Program (SIP) at the National Center for Atmospheric Research (NCAR).


Reference

Salmon, Wesley C., Four Decades of Scientific Explanation, University of Minnesota Press, Minneapolis, MN, 1990.




