
Disclaimer

The opinions expressed by TU Delft employees and students, and the comments posted, do not necessarily reflect the opinion(s) of TU Delft. TU Delft is therefore not responsible for the content of what is visible on the TU Delft weblogs. TU Delft does, however, consider it important, and of added value, that employees and students can express their opinions in this environment, which is facilitated by TU Delft.


New algorithm holds promise for earthquake prediction

This article was posted on Reuters The Great Debate on January 31st, 2010 by Kees Vuik

– Kees Vuik is a professor and Mehfooz ur Rehman is a PhD candidate at Delft University of Technology. The opinions expressed are their own. –

The Haiti earthquake was a truly appalling tragedy and it is little wonder that the United Nations has described it as the worst humanitarian disaster it has faced in its history.  The 2010 earthquake follows several earlier ones, including those of 1751, 1770, 1842 and 1946, that have struck the island of Hispaniola (the tenth most populous island in the world), which is shared by Haiti and the Dominican Republic.

While world attention is rightly focusing now upon the aid effort in the country, much media coverage has so far obscured the fact that the science of earthquake prediction is improving and holds much promise for the next few years.  While this will be of no solace to the people of Haiti, what it means in practice is that scientists might be able, in the not-too-distant future, to provide warnings for at least some similar events, thus helping to minimise loss of life and wider devastation.

Predicting earthquakes was once thought to be impossible owing to the difficulty of calculating the motion of rocky mantle flows.  However, thanks to an algorithm developed at Delft University of Technology, we now know that it is possible to model these underground streams.

Much of the early experimentation with the new algorithm has been based around the North Anatolian Fault.  This is a major active geologic fault which runs along the tectonic boundary between the Eurasian Plate and the Anatolian Plate.

The fault extends westward from a junction with the East Anatolian Fault at the Karliova Triple Junction in eastern Turkey, across northern Turkey and into the Aegean Sea.  The last time a major earthquake struck along this fault line, at Izmit in Turkey in 1999, around 17,000 people were killed.

Our colleagues in Utrecht are currently applying our algorithm to create a model (consisting of some 100 million underground grid points) of the North Anatolian Fault region, essentially the subsurface beneath Greece and Turkey down to a depth of 1,000 kilometres.  What this information allows us to ascertain is where the underground stresses are strongest, which is often a tell-tale sign of the most dangerous potential earthquake trigger points.

As good as this model is, however, it still needs refinement.  This is because the link between earthquakes and underground flows is complex and hard to compute.  In practice, calculating such flows means solving very large systems of equations whose unknowns are the millions of pressure and velocity values at all of the underground grid points.
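
The article does not write these systems out, but slow mantle flow is commonly modelled as an incompressible creeping (Stokes) flow; assuming that standard formulation (an assumption on our part, not a statement of the project's exact equations), the unknowns at every grid point are the flow velocity \(\mathbf{u}\) and the pressure \(p\), coupled by

\[
-\mu\,\nabla^{2}\mathbf{u} + \nabla p = \rho\,\mathbf{g},
\qquad
\nabla \cdot \mathbf{u} = 0,
\]

where \(\mu\) is the viscosity (taken constant here for simplicity), \(\rho\) the density and \(\mathbf{g}\) gravity.  Discretising these equations on the grid produces one large, sparse saddle-point system,

\[
\begin{pmatrix} A & B^{T} \\ B & 0 \end{pmatrix}
\begin{pmatrix} \mathbf{u} \\ \mathbf{p} \end{pmatrix}
=
\begin{pmatrix} \mathbf{f} \\ \mathbf{0} \end{pmatrix},
\]

whose zero pressure block makes it indefinite and therefore awkward for standard iterative solvers; this is one concrete reason why such systems are hard to solve quickly.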

This has given rise to the need for a solution method whose computing time grows only linearly with the number of grid points, a feat that researchers had previously found too difficult.  While this linear scaling has now been achieved, thus making the model more accurate and comprehensive, it has also increased the project's complexity considerably.
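
To make "scaling up the solution time linearly" concrete: the ideal is that doubling the number of grid points should only double the time needed to solve the system.  The sketch below is our own illustration, not the project's solver; it times SciPy's conjugate-gradient routine on a simple Poisson-like model problem of growing size.

    # Illustration only: time an off-the-shelf iterative solver on a model
    # problem of growing size to see how the solution time scales.
    import time
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    def model_matrix(n):
        # Sparse 5-point-stencil approximation of a Laplacian on an n-by-n
        # grid: n*n unknowns, a crude stand-in for a real flow problem.
        return sp.diags([4.0, -1.0, -1.0, -1.0, -1.0],
                        [0, -1, 1, -n, n],
                        shape=(n * n, n * n), format="csr")

    for n in (100, 200, 400):                      # 10,000 to 160,000 unknowns
        A = model_matrix(n)
        b = np.ones(A.shape[0])
        t0 = time.perf_counter()
        x, info = spla.cg(A, b)                    # unpreconditioned conjugate gradients
        elapsed = time.perf_counter() - t0
        print(f"{A.shape[0]:>7d} unknowns solved in {elapsed:6.2f} s")

Running it shows the time growing faster than the number of unknowns, which is exactly the behaviour that a carefully designed algorithm and preconditioner are meant to cure.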

Nevertheless, after finishing our work on the North Anatolian Fault, Delft and Utrecht Universities intend to try to model the tectonics of the entire Earth: a truly ambitious project that will involve perhaps some 1 billion grid points (what we call our “fine grid”).

Making the computations for these 1 billion points will require surmounting yet another major hurdle: the ‘parallel computing’ problem.  That is, increasing the number of computers in a system generally means that each of them works less efficiently.  However, the Utrecht team has already been working with a test sample of 500 computers, and we now believe we have mitigated this problem with our algorithm.
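
A common way to express this ‘parallel computing’ problem is parallel efficiency: the fraction of the ideal speed-up that actually survives when the work is spread over more machines.  The short calculation below shows how that figure is obtained; the timings in it are invented for illustration and are not measurements from the Utrecht cluster.

    # Illustration only: speed-up and parallel efficiency from wall-clock
    # timings.  The timing numbers below are hypothetical.
    timings = {1: 4000.0, 10: 430.0, 100: 52.0, 500: 13.0}   # machines -> seconds

    t_single = timings[1]
    for machines, t in sorted(timings.items()):
        speedup = t_single / t            # how many times faster than one machine
        efficiency = speedup / machines   # fraction of the ideal speed-up
        print(f"{machines:>4d} machines: speed-up {speedup:6.1f}, "
              f"efficiency {efficiency:5.1%}")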

Despite this breakthrough, computing with the one billion points of the fine grid is a long-term programme and, in order to push forwards in the meantime, we are also working on a technique called ‘coarse grid’ acceleration.  Our coarse grid uses only a small number of sample points in each of the earth’s various strata, allowing us to obtain fast, accurate solutions at those sample points and leading to considerable savings in computer time.
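
In broad terms, coarse-grid acceleration is a two-level idea: solve a much smaller version of the problem on the sample points, and use that cheap solution to correct the expensive fine-grid iteration.  The sketch below shows a generic two-grid correction step on a one-dimensional model problem; the project's actual coarse grid through the earth's strata is of course far more elaborate, so this is only an illustration of the principle.

    # Illustration only: a generic two-grid correction step on a 1-D model
    # problem, showing how a cheap coarse-grid solve accelerates convergence.
    import numpy as np

    def laplacian(m):
        # Small dense 1-D Laplacian with m interior points.
        return (np.diag(2.0 * np.ones(m))
                - np.diag(np.ones(m - 1), 1)
                - np.diag(np.ones(m - 1), -1))

    n_fine, n_coarse = 127, 63                 # coarse grid: every other point
    A = laplacian(n_fine)

    # Restriction (fine -> coarse) and prolongation (coarse -> fine) operators.
    R = np.zeros((n_coarse, n_fine))
    for i in range(n_coarse):
        R[i, 2 * i:2 * i + 3] = [0.25, 0.5, 0.25]    # full-weighting stencil
    P = 2.0 * R.T                                    # linear interpolation
    A_coarse = R @ A @ P                             # Galerkin coarse-grid operator

    b = np.ones(n_fine)
    x = np.zeros(n_fine)
    for cycle in range(5):
        for _ in range(3):                           # cheap smoothing sweeps
            x += 0.6 * (b - A @ x) / np.diag(A)      # damped Jacobi
        residual_coarse = R @ (b - A @ x)            # restrict the residual
        x += P @ np.linalg.solve(A_coarse, residual_coarse)   # coarse correction
        print(f"cycle {cycle + 1}: residual {np.linalg.norm(b - A @ x):.2e}")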

Finally, we also plan to implement the algorithm on video cards (GPUs), which can speed up the computation by a factor of 100 or more.
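
The attraction of a video card is that it can apply the same simple update to millions of grid values simultaneously.  The fragment below sketches that pattern using the CuPy library (our choice for illustration; the article does not say which tools the group will use): it keeps a large grid on the card and performs repeated neighbour-averaging sweeps of the kind iterative solvers rely on.  The factor of 100 quoted above is the project's own estimate, not something this sketch demonstrates.

    # Illustration only: keep a large grid on the graphics card (via CuPy)
    # and apply a stencil sweep to every point in parallel.
    import time
    import cupy as cp

    n = 4096
    u = cp.zeros((n, n), dtype=cp.float64)   # about 16 million grid values on the GPU
    f = cp.ones((n, n), dtype=cp.float64)    # source term, also on the GPU

    cp.cuda.Device().synchronize()           # make sure setup has finished
    t0 = time.perf_counter()
    for _ in range(100):
        # one Jacobi-style sweep: every point replaced by the average of its
        # four neighbours plus the source term, all points updated at once
        u = 0.25 * (cp.roll(u, 1, axis=0) + cp.roll(u, -1, axis=0) +
                    cp.roll(u, 1, axis=1) + cp.roll(u, -1, axis=1) + f)
    cp.cuda.Device().synchronize()           # wait for the GPU to finish
    print(f"100 sweeps over {n * n} points: {time.perf_counter() - t0:.2f} s")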

While much more hard work and innovation lie ahead, this new frontier of seismology is therefore genuinely path-breaking and already achieving exciting results.  However, as the Haiti earthquake has painfully reminded us all, true success will only be achieved when we reach the stage at which human lives are saved by applying our research in practice.

