The long-feared magnitude 9.0 Cascadia subduction zone earthquake, which seismologists say is inevitable, could claim more than 10,000 lives and force thousands more from their homes indefinitely, according to projections by the Federal Emergency Management Agency.
Beyond the human toll, the violent shaking and the tsunami less than half an hour later will damage or destroy much of the infrastructure in Oregon and neighboring states. Buildings, roads and bridges, seaports and airports, electrical and telecommunications grids, natural gas and water lines — all could fall victim to the earthquake, which is expected to be among the worst natural disasters in American history.
Measures can be taken to safeguard infrastructure. But before a sustained program can be established to replace, retrofit, or redesign vulnerable components, these parts must be identified and stratified by their level of vulnerability, so each state can prepare accordingly.
This approach to planning was among the chief recommendations in the 2013 Oregon Resilience Plan. This landmark report details what Oregonians can expect to happen to the state’s infrastructure during and after the earthquake, and it lays out the priorities and actions needed to survive and bounce back. Though significant remedial steps have been taken in the intervening years, Oregon remains woefully underprepared to weather the Big One.
Ted Brekken’s focus is on the electrical grid, which will fail throughout most of western Oregon, and probably beyond, shortly after the shaking starts. Brekken, a professor of electrical and computer engineering at Oregon State University, is in the second year of a three-year project, funded by the National Science Foundation, to gauge the performance of the entire Western electrical grid after a CSZ earthquake. Eduardo Cotilla-Sanchez, associate professor of electrical and computer engineering, Mike Olsen, professor of geomatics engineering, and Armin Stuedlein, professor of geotechnical engineering, join Brekken on the grant.
“Research is lacking on electrical grid performance in the context of a major earthquake,” Brekken said. “We want to know how much is going to fail and how quickly, what portions will remain functional, and how long recovery will take.” He aims to create a thorough, data-driven process that results in an unparalleled understanding of the impact that a major CSZ earthquake will have on the extent and duration of a Western grid failure. The information could prove invaluable for policymakers charged with the enormous task of boosting the grid’s resilience and accelerating its revival.
No single method or tool is adequate to analyze and understand the power flow through an electrical system that spans 14 states and 1.8 million square miles. So, Brekken has assembled a multidisciplinary team with expertise in power system operations and analysis, geotechnics, earthquake engineering, mapping, and geospatial analytics.
The foundation of their analysis is a computer model of the Western power grid, overlaid by a digital representation of the grid’s assets — its discrete physical components — like generators, substations, transformers, power lines, poles, towers, and distribution buses, and the amount of power flowing through the system. About 30,000 assets are represented. The model can’t possibly represent every physical piece in the grid, but it contains enough to capture the grid’s large-scale behavior, Brekken explains.
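The description above — discrete assets linked together, with power flow limited by the components it passes through — can be sketched in code. This is a minimal illustration only; the asset names, capacities, and the bottleneck rule are hypothetical simplifications, not the project's actual model.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One discrete physical component of the grid (hypothetical example)."""
    name: str
    kind: str            # e.g., "generator", "substation", "line"
    capacity_mw: float   # power the asset can produce or carry
    connected_to: list = field(default_factory=list)

def build_example_grid():
    # A toy three-asset grid: generator -> transmission line -> substation.
    gen = Asset("gen_A", "generator", 500.0)
    line = Asset("line_1", "line", 400.0)
    sub = Asset("sub_1", "substation", 400.0)
    gen.connected_to.append("line_1")
    line.connected_to.extend(["gen_A", "sub_1"])
    sub.connected_to.append("line_1")
    return {a.name: a for a in (gen, line, sub)}

def deliverable_power(grid, path):
    # Power along a path is limited by its weakest asset -- a crude
    # stand-in for real power-flow analysis, used here only to show
    # why modeling individual assets matters.
    return min(grid[name].capacity_mw for name in path)
```

In this toy grid, the 500 MW generator can deliver only 400 MW to the substation because the line in between is the bottleneck; a real model scales this idea to roughly 30,000 assets.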
The next data layer maps the geologic hazards that will threaten assets in the earthquake’s aftermath. For example, the model might show a substation resting on a type of soil expected to amplify the convulsions, or reveal which assets sit within mapped landslide hazard zones. “Now we can actually look at everything in the system, how it’s all connected, the power flow at every point, which assets are at greatest risk for damage and failure, and how that will affect the flow of electricity,” Brekken said.
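Overlaying a hazard layer on the asset layer amounts to adjusting each asset's chance of failure by its local site conditions. The sketch below shows the idea; the fragility values and hazard multipliers are illustrative assumptions, not figures from the team's model.

```python
# Baseline chance that an asset type fails at a reference level of
# shaking (hypothetical values for illustration).
BASE_FRAGILITY = {
    "generator": 0.05,
    "substation": 0.10,
    "line": 0.20,
}

# How much local geology amplifies that chance (also hypothetical).
HAZARD_MULTIPLIER = {
    "firm_rock": 1.0,
    "soft_soil": 2.0,        # soils expected to amplify shaking
    "landslide_zone": 3.0,
}

def failure_probability(asset_kind, site_hazard):
    """Estimated failure probability for one asset at one site."""
    p = BASE_FRAGILITY[asset_kind] * HAZARD_MULTIPLIER[site_hazard]
    return min(p, 1.0)       # a probability can never exceed certainty
```

Under these toy numbers, an identical substation is twice as likely to fail on soft soil as on firm rock, which is exactly the kind of distinction the hazard layer makes visible.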
To test the model, researchers will simulate a major earthquake over and over — perhaps a million times — creating ample data for a Monte Carlo analysis that predicts which assets will survive and which will fail. The Monte Carlo method is a computational technique based on repeated random sampling, producing numerical results in the form of probability distributions. By repeating the analysis at regular intervals, corresponding to the passage of time in the real world, they’ll forecast how long it will take before service can be restored. “We’ll get a picture of the expected function of the grid as a probability distribution, and how that distribution will move forward in time to account for assets coming back on line,” Brekken explained. The team is conducting early analyses of the model and fine-tuning virtual emergency grid recovery operations to accurately reflect expected real-world behavior in the seconds and minutes following the earthquake.
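The Monte Carlo procedure described above can be sketched in a few lines: randomly sample which assets fail, repeat many times, and average the results, then step forward in time as repairs bring assets back. The failure probabilities and repair durations below are made-up placeholders, not the project's calibrated values.

```python
import random
from statistics import mean

# Hypothetical assets: name -> (failure probability, days to repair).
ASSETS = {
    "gen_A": (0.10, 30),
    "sub_1": (0.30, 14),
    "line_1": (0.50, 7),
}

def one_trial(rng):
    """One simulated earthquake: sample which assets fail."""
    return {name for name, (p, _) in ASSETS.items() if rng.random() < p}

def fraction_online(failed, day):
    """Share of assets working on a given day after the quake.

    An asset is online if it survived, or if enough time has passed
    for it to be repaired.
    """
    up = sum(1 for name, (_, repair) in ASSETS.items()
             if name not in failed or day >= repair)
    return up / len(ASSETS)

def simulate(n_trials=100_000, day=0, seed=42):
    """Expected fraction of the grid online, averaged over many trials."""
    rng = random.Random(seed)
    return mean(fraction_online(one_trial(rng), day) for _ in range(n_trials))
```

Running `simulate` at increasing values of `day` traces out the recovery curve Brekken describes: the distribution of grid function shifting upward as assets come back online, reaching full service once the slowest repair is complete.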
Oregon and Washington remain well behind California in earthquake preparedness, and the West Coast is far behind seismically active countries like Chile and Japan, according to Brekken.
“A big difference is that those countries experience major earthquakes regularly, and each one highlights weak spots in their infrastructure,” he said. Having weathered so many of these disasters, they’ve already weeded out many of the components that failed in earlier ones.
“It’s not that they don’t still have problems, but the recovery has become fairly quick, and there’s not as much uncertainty about what will survive and what will fail,” Brekken said. “But in our case, we don’t know precisely what will fail. We won’t know all the weak spots until an earthquake shows us, and then it’s too late. Our work, I hope, will answer those questions and tell us what we need to know before we find out for real.”