The opinions expressed by TU Delft employees and students, and the comments posted, do not necessarily reflect the opinion(s) of TU Delft. TU Delft is therefore not responsible for the content visible on the TU Delft weblogs. TU Delft does, however, consider it important, and of added value, that employees and students can voice their opinions in this environment facilitated by TU Delft.

Posted in February 2011

Predicting earthquakes will come sooner than we think

This article by Kees Vuik and Mehfooz ur Rehman first appeared in The Australian on February 24, 2011.

Kees Vuik is a professor and Mehfooz ur Rehman is a PhD candidate at Delft University of Technology in The Netherlands.

THE 6.3 magnitude earthquake that hit Christchurch is a truly appalling tragedy and it is little wonder that New Zealand’s Prime Minister John Key has said that we may be witnessing his country’s darkest day.

The country is, of course, no stranger to earthquakes; it experiences more than 14,000 a year, of which only about 20 on average have a magnitude greater than 5.0. However, this week’s earthquake, NZ’s deadliest disaster in at least 80 years, has already caused an estimated $3 billion in damage and was so forceful it shook 30 million tonnes of ice from NZ’s biggest glacier.

While world attention is rightly focusing on the effort to save the lives of those trapped by fallen buildings, some media coverage has so far obscured the fact that the science of earthquake prediction is improving and holds much promise in the next few years. Although this will be of no solace for the people of Christchurch, what this means in practice is that in the not too distant future scientists might be able to provide warnings for at least some similar events, thus helping to minimise loss of life and wider devastation.

Predicting earthquakes was once thought to be impossible owing to the difficulty of calculating the motion of rocky mantle flows. However, thanks to an algorithm created by the Delft University of Technology, we now know that it is possible to model these underground streams.

Much of the early experimentation with the new algorithm has been based around the North Anatolian Fault. This is an active fault that runs along the tectonic boundary between the Eurasian Plate and the Anatolian Plate. It extends across northern Turkey and into the Aegean Sea. The last time there was a severe earthquake along this fault line, at Izmit in Turkey in 1999, 17,000 people were killed.

Our colleagues in Utrecht are applying our algorithm to create a model (consisting of 100 million underground grid points) of the North Anatolian Fault, essentially the subsurface beneath Greece and Turkey down to a depth of 1000km. What this information allows us to ascertain is where the underground stresses are strongest, an often telltale sign of the most dangerous potential earthquake trigger points.

As good as this model is, however, it still needs refinement. This is because the link between earthquakes and underground flows is complex and hard to compute. In practice, calculating such flows means constructing very complex mathematical systems made up of millions of pressure and velocity values at all of the underground grid points.
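As a rough illustration of what "millions of pressure and velocity values" means computationally, the sketch below (our own toy example, not the Delft solver) assembles a small one-dimensional diffusion-type system and solves it with a Krylov method; such iterative solvers need only matrix-vector products, which is what makes systems with millions of unknowns tractable at all.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Toy stand-in for the far larger coupled pressure/velocity systems:
# a 1D diffusion (Poisson-type) equation on a uniform grid.
n = 1000                    # unknowns; real models use millions of grid points
h = 1.0 / (n + 1)           # grid spacing
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
b = np.ones(n)              # an arbitrary uniform source term

# Conjugate gradients: an iterative Krylov solver that only touches A
# through sparse matrix-vector products.
x, info = cg(A, b)
```

The same structure carries over to the real problem: the matrix stays sparse because each grid point interacts only with its neighbours, so storage and work per iteration grow roughly linearly with the number of points.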

This has created the need for solution times that scale linearly with the number of grid points, a feat that researchers had previously found too difficult. This linear scaling has now been achieved, making the model more accurate and comprehensive, though at the cost of considerably greater complexity.

Nevertheless, after finishing our work on the North Anatolian Fault, Delft and Utrecht universities intend to try to model the tectonics of the entire earth, a truly ambitious project that will involve perhaps one billion grid points (what we call our fine grid).

To make the computations for these one billion points will require surmounting yet another big hurdle, the "parallel computing" problem.

That is, increasing the number of computers in a system generally means that they work less efficiently.
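The diminishing returns of adding computers are commonly captured by Amdahl's law, which we use here as a generic illustration (it is not the Delft algorithm itself): if even a small fraction of the work cannot be parallelised, efficiency collapses as the machine count grows.

```python
# Amdahl's law: a hedged, generic illustration of why adding computers
# yields diminishing returns when part of the work stays serial.
def speedup(p, serial_fraction):
    """Maximum speedup on p processors if serial_fraction of the work
    cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

# With just 5% serial work, per-processor efficiency drops sharply:
for p in (1, 10, 100, 500):
    s = speedup(p, 0.05)
    print(f"{p:4d} processors: speedup {s:6.2f}, efficiency {s / p:.0%}")
```

This is why an algorithm that keeps the serial (communication-bound) fraction small, as ours aims to do, matters more than raw machine count.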

However, the Utrecht team has already been working with a test sample of 500 computers and we believe we have mitigated this problem with our algorithm.

Despite this breakthrough, computing the one billion parts of the fine grid is a long-term program and, to push forward in the meantime, we are also working on a technique called coarse grid acceleration.

Our coarse grid uses only a small number of sample points in all of the earth’s various strata, thus allowing us to obtain fast, accurate solutions for all of these sample points, leading to considerable savings in computer time.
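The idea behind coarse grid acceleration can be sketched on a toy problem (a minimal two-grid cycle of our own construction, far simpler than the production solver): smooth the solution on the fine grid, solve a much smaller coarse problem, and use that cheap coarse solution to correct the fine-grid approximation.

```python
import numpy as np

def poisson(n):
    """Tridiagonal 1D Poisson matrix with n interior points."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

n = 31                                   # fine-grid unknowns
nc = (n - 1) // 2                        # coarse-grid unknowns (15)
A = poisson(n)
x_true = np.random.default_rng(0).standard_normal(n)
b = A @ x_true

x = np.zeros(n)                          # initial guess
for _ in range(2):                       # damped-Jacobi smoothing sweeps
    x += 0.8 * (b - A @ x) / 2.0

# Linear interpolation (prolongation) from coarse to fine points.
P = np.zeros((n, nc))
for j in range(nc):
    P[2 * j, j] = 0.5
    P[2 * j + 1, j] = 1.0
    P[2 * j + 2, j] = 0.5
R = 0.5 * P.T                            # full-weighting restriction

r = b - A @ x                            # fine-grid residual
ec = np.linalg.solve(R @ A @ P, R @ r)   # small, cheap coarse solve
x += P @ ec                              # coarse-grid correction

print(np.linalg.norm(x_true - x) / np.linalg.norm(x_true))
```

The coarse solve involves only half the unknowns here, and a small fraction of them in the real three-dimensional model, yet it removes exactly the smooth, long-range error components that fine-grid iterations struggle with; that division of labour is where the savings in computer time come from.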

Finally, we also plan to implement the algorithm on graphics cards (GPUs), which can speed up the computation by a factor of 100 or more.

While much more hard work and innovation lie ahead, this new frontier of seismology is therefore genuinely path-breaking and already achieving exciting results. However, as the Christchurch earthquake has painfully reminded us, true success will be achieved only when we reach the stage at which human lives are saved by applying our research in practice.

© 2011 TU Delft