“If you don’t like the weather, just wait ten minutes.”
Long before there were “meteorologists”, we had “weathermen”. Their job was to give us weather forecasts. And we took their predictions with a grain of salt.
Recently, Hurricane Matthew raged through the Caribbean and up the coast of the southern Atlantic states.
It was initially forecast as a Category 1 hurricane, but it rapidly intensified into a Category 5 before battering Haiti and brushing the Florida coast. The storm’s winds accelerated by 63 miles per hour in a single day, a far cry from the forecasted 17-mile-per-hour increase.
Dalia Kirschbaum is an Earth scientist working for NASA. She’s studying the aftermath of Matthew’s devastation in Haiti, where 20 inches of rain created mudslides and flooding that led to some 800 deaths.
With all the technology we have—satellites equipped with microwave imaging and super-sensitive radar—how did Hurricane Matthew slip through the predictive models?
The Pacific Tsunami Warning Center was established in 1948.
Since then, about 75% of the Center’s warnings have been false alarms.
A tsunami typically results from an undersea earthquake that abruptly displaces the ocean floor, generating massive waves. The tsunami that struck the coasts of the Indian Ocean in 2004 resulted from a magnitude-9.1 earthquake off northern Sumatra, Indonesia. More than 230,000 people died, and hundreds of thousands more were injured or left homeless.
Why did they have no advance warning?
There was no system like the Pacific Tsunami Warning Center in the Indian Ocean.
Scientists are now drawing on these growing volumes of data to better predict weather shifts.
The rapid rise of smart technology is putting more sensors in more places, creating opportunities to gather meteorological data from more sources, like vehicles.
IBM developed a weather forecasting tool, Deep Thunder, in collaboration with the Weather Channel.
The system creates high-resolution, 3-D models. Deep Thunder is supercomputer-driven, mining data from thousands of weather stations and satellites. Lloyd Treinish, the project’s chief scientist, explained that weather can be forecast by building systems of equations from the data that has been collected. The more data they gather, the better the equations, which results in more accurate predictions.
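Treinish’s point, that more data yields better equations and therefore better predictions, can be illustrated with a toy sketch. This is not Deep Thunder’s actual method; it is a hypothetical example that fits a simple straight-line relationship between one sensor reading and temperature, showing how the fitted coefficient homes in on the true value as the sample grows.

```python
# Toy illustration (not IBM's model): fit a linear relationship
# between a simulated sensor reading and temperature, and watch
# the estimated coefficient improve as the data volume grows.
import numpy as np

def fit_slope(n_samples, rng):
    """Simulate n noisy readings and fit y = slope*x + intercept."""
    true_slope, true_intercept = 2.0, 1.0    # hypothetical 'real' physics
    x = rng.uniform(0, 10, n_samples)        # sensor input
    noise = rng.normal(0, 1.0, n_samples)    # measurement noise
    y = true_slope * x + true_intercept + noise
    slope, _intercept = np.polyfit(x, y, 1)  # least-squares fit
    return slope

rng = np.random.default_rng(0)
small = fit_slope(20, rng)      # few observations: noisy estimate
large = fit_slope(20_000, rng)  # many observations: close to 2.0
print(f"20 samples:     estimated slope = {small:.3f}")
print(f"20,000 samples: estimated slope = {large:.3f}")
```

Real forecasting models involve vast systems of physics equations rather than one fitted line, but the principle is the same: more observations constrain the equations more tightly.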
With major budget cuts hitting the National Weather Service, private industry has stepped up to explore big data for weather forecasting. Why?
According to the National Oceanic and Atmospheric Administration (NOAA), the first nine months of 2016 saw 12 weather and climate disasters that each caused more than $1 billion in losses. Damage to a facility (including networks); lost inventory, equipment, and revenue; and downtime can deliver a hefty blow to a business.
We have an ever-growing arsenal of tech tools to drive productivity and efficiency in the workplace. Now, we need to turn our attention to channeling big data into reducing the economic impact of the weather.
Big data is a big opportunity!
About the Author:
Gayle DeRose is proud to be the COO and Marketing Director for L-Tron. Her passions are serving customers, all things creative and her family. She has been with the company for over 20 years, continuously developing her expertise in operations & marketing, as well as the strategy, implementation and ongoing training required to deliver the exceptional service standard L-Tron models today. Want to get in touch with her? Call 800-830-9523 x118 or email Gayle.DeRose@L-Tron.com.