Friday, November 16, 2018

Ground-Up Rethink of Disaster Relief


Disaster relief has not kept up with the volume of mega natural calamities, which have increased four-fold since 1970. We can play the Trump blame game and deny or diminish relief – coincidentally worse in blue states and territories than in red – we can let people wallow in rebuilding hell, often exacerbated by economic realities and national policies, or we can keep pouring billions and billions of dollars into reactive relief. You’d think we could plan against such horrors better, but kicking the can down the road is a new American tradition. We seem always to choose the most expensive alternative; fixing or responding without a precise plan is always more expensive than preventing, or being truly ready for, what is inevitable.
Look at Japan this past year, particularly devastated by natural disasters. “Japan's economy shrank in the third quarter as natural disasters hit spending and disrupted exports… [it] contracted by an annualised 1.2% between July and September, preliminary figures showed… A devastating earthquake and typhoon were among the disasters to hit Japan this year, and prompted the bigger than expected [GDP] contraction.” BBC.com, November 14th.
According to the corporate website of One Concern (a new company applying artificial intelligence analytics to government disaster relief efforts), 3.3 million people have died in natural disasters over the last 40 years, disasters that have inflicted $2.3 trillion in economic damage. As populations increasingly live in cities – One Concern estimates that 60% of the earth’s population will live in cities by 2030 – a natural disaster will have much greater potential for death and damage with such a concentration of people.
The United States continues to labor under the vestiges of red-state climate change denial. We see reluctance from developing nations to embrace severe limitations on greenhouse gas emissions, and from countries like ours that place business earnings over human safety concerns. But by not grappling with climate change variables, and by not redesigning our disaster relief efforts to reflect modern capacities and realities, we are spending – and will spend – far more than any aggregation of business advantages is worth.
As the United States deals with retrofitting its energy generation and consumption policies, which it will sooner or later, it also needs to revisit how governments analyze conditions and deploy resources once a natural disaster hits. That’s where artificial intelligence can benefit us all, increasing effectiveness and delivering cost efficiencies.
One Concern is launching a machine learning platform that provides cities with specialized maps to help emergency crews decide where to focus their efforts in a flood. The maps update in real-time based on data about where water is flowing to estimate where people need help the most. It’s the latest in a wave of AI-powered tools aimed at helping cities prepare for an era of severe, and increasingly frequent, disasters.
“Since 1980, the U.S. has suffered from 219 climate disasters that cost over $1 billion, with the total cost exceeding $1.5 trillion. In 2017 alone, these disasters cost the country $306 billion. Since 2000, more than 1 million people have perished from these extreme weather events. As climate change heralds more devastating natural disasters, cities will need to rethink how they plan for and respond to disasters. Artificial intelligence, such as the platform One Concern has developed, offers a tantalizing solution. But it’s new and largely untested. And the urgency is growing by the day.” FastCompany.com, November 15th.
One Concern is the brainchild of Stanford University student Ahmad Wani, a Kashmiri who watched helplessly in his native country as floodwaters decimated everything around him. He vowed to address the misfeasance in disaster relief. Even in modern countries, disaster relief agencies deal with static maps and old-world analytical tools that cannot process the complexity of vulnerabilities, temperatures, weather patterns, population concentrations, wind history, likely structural failures, access risks and availabilities, water availability and the inevitable changes during a disaster that simply were not anticipated. Simply put, we are living in relatively unsophisticated dark ages when it comes to real-time disaster analytics.
“After surviving the devastating flood in Kashmir, Wani returned to Stanford, where he was studying structural engineering. He began contemplating how to predict a disaster’s damage. The idea was that if city officials could anticipate which areas would be most harmed, they would be able to deploy resources faster and more efficiently throughout the disaster zone. But he had a problem: analyzing a single building using traditional structural engineering software took seven days on Stanford’s supercomputer. ‘We had to recreate that for the entire city’ for the idea to work, Wani says. ‘We didn’t have seven days or seven years. We wanted to do it in three to five minutes.’
“He decided to focus first on earthquakes, which are more of a threat than floods in California. Wani teamed up with fellow Stanford students Nicole Hu, a computer scientist who focuses on machine learning, and Tim Frank, an earthquake engineer, to build an algorithm that can digest data about how a building was built and how it’s been retrofitted over time. This data is combined with information on the building’s materials and surrounding soil properties to extrapolate what happens to this system when shaking occurs. Then, when a quake hits, the model absorbs new information coming from on-the-ground emergency responders, 911 calls, or even Twitter to make its predictions of the damage more accurate.
“Because the model identifies patterns by looking through large amounts of data, it needs less computing power than the previous method of asking a computer to perform complex physics equations to understand how shaking will impact a structure. The trade-off is accuracy: Hu estimates that the algorithm is only about 85% accurate. With more data over time, that number will improve, but the team believes that it’s good enough to paint a broad picture of damage immediately after a quake. (Of course, they won’t know for sure until a major earthquake hits.)
“Wani, Hu, and Frank started One Concern in 2015 and then released its earthquake platform, called Seismic Concern, in 2016. Seismic Concern predicts the damage caused by earthquakes on a block-by-block level and is now used by eight different municipalities, including the cities of San Francisco, Los Angeles, and Cupertino.
“Now, the company is launching Flood Concern, a constantly evolving risk map that crunches huge amounts of data based on the physics of how water flows, information about previous floods, and even satellite imagery to approximate the depth, direction, and speed of the water–and determine which areas of a city are most at risk. Layered on top of the damage prediction is demographic data, so that emergency planners can see what areas of a city might have particularly vulnerable populations, like a significant percentage of seniors or disabled citizens. With that kind of information, planners can figure out which areas should be evacuated, where to put shelters, and what critical infrastructure–like schools or hospitals–needs the most help when flooding begins.” FastCompany.com
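To make the quoted description concrete, here is a minimal, hypothetical sketch in Python of the general pattern FastCompany describes: a fast machine-learning surrogate trained on building and soil features stands in for days of physics simulation, and demographic vulnerability is layered on top of the predicted damage to rank city blocks for response. The field names, numbers and weighting below are illustrative assumptions, not One Concern’s actual data, model or code.

# Hypothetical sketch of the pattern described above; not One Concern's code.
# A learned surrogate predicts block-level damage from building/soil features,
# then a demographic-vulnerability layer is used to prioritize response.
from dataclasses import dataclass
from sklearn.ensemble import RandomForestRegressor

# 1. Train the surrogate on historical or simulated damage outcomes.
#    Features per building: [year_built, retrofit_level, stories, soil_amplification]
X_train = [
    [1925, 0, 3, 1.8],
    [1968, 1, 5, 1.2],
    [1999, 2, 2, 0.9],
    [2012, 2, 8, 1.0],
]
y_train = [0.85, 0.45, 0.10, 0.05]   # fraction of structural damage (illustrative)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# 2. Describe city blocks: same features plus a demographic-vulnerability layer.
@dataclass
class Block:
    name: str
    features: list            # same feature order as X_train
    vulnerable_share: float   # share of seniors / disabled residents (assumed field)

blocks = [
    Block("Block-A", [1931, 0, 4, 1.7], vulnerable_share=0.30),
    Block("Block-B", [2005, 2, 2, 0.8], vulnerable_share=0.10),
    Block("Block-C", [1972, 1, 6, 1.4], vulnerable_share=0.22),
]

# 3. Rank blocks for emergency response: predicted damage weighted by vulnerability.
def priority(block):
    damage = float(model.predict([block.features])[0])
    return damage * (1.0 + block.vulnerable_share)   # illustrative weighting

for b in sorted(blocks, key=priority, reverse=True):
    print(f"{b.name}: priority {priority(b):.2f}")

The point of the sketch is the trade-off the article names: a learned model answers in seconds rather than days and can be re-scored as new reports arrive, at the cost of approximate accuracy (the roughly 85% figure Hu cites).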
There is a double message in this story. One deals with the use of modern tools to address the rise in natural disasters. The other deals with an immigrant, one of the best and the brightest, whom Donald Trump wants to prevent from entering the United States, a person of color with a different religious background who is a “them” and not “us.” You tell me who the biggest loser is under that less-than-subtle Trump nationalist policy.
I’m Peter Dekom, and never before has the world required more international cooperation, more group think, to solve complex problems… and the United States needs to remember that it became great because it was a nation of immigrants!
