Disclaimer

The opinions expressed by TU Delft employees and students, and the comments posted, do not necessarily reflect the view(s) of TU Delft. TU Delft is therefore not responsible for the content shown on the TU Delft weblogs. TU Delft does, however, consider it important, and of added value, that employees and students can express their opinions in this environment, which is facilitated by TU Delft.

‘Water management is of vital importance’

Ir. Annemieke Nijhof MBA

‘The Netherlands lies in a delta. Without good water management it would be a constant give and take between water and (dearly won) land. It is therefore of vital importance that we keep our water system in good order. In addition, the water is constantly in motion, at different scale levels, and many parties at various levels are involved. For that reason too, we need a well-oiled, efficiently operating system.

There are threats lurking, however. For example, that Rijkswaterstaat and the water boards organise and carry out water management so perfectly that nobody notices it. Society even questions whether all those bodies and investments are really necessary. But how do we make clear, without frightening people, that in this field you are never finished? The message is double-edged. We are proud that citizens have so much confidence in our water management. On the other hand, the subject is not top of mind; it is a silent killer. When frost opens up potholes in the asphalt we notice it immediately and action is taken at once. But when we condemn flood defences, no time limit is laid down in the law and ten years can easily go by.
The water problems in our country, as in the rest of the world, are considerable. The sea level is rising, which is very worrying, but there are also dangerous changes in the discharge of water through the rivers. In effect, water threatens our country through both the front door and the back door! At the same time we face a looming shortage of fresh water as a result of reduced precipitation and salinisation of the soil. The users of this water are, moreover, in fierce competition with one another. So there are many conflicting interests at play.

Still, I am hopeful. I am very pleased, for instance, that the Netherlands has a Delta Act, a Delta Programme, a Delta Fund and a Delta Commissioner. I am fully committed to these, so that attention to flood safety is safeguarded. We must stay ahead of a catastrophe, so that our children too can say that they were born after the flood disaster. My second great ambition is to connect the water sector with other worlds. I want to make more people aware of what water management involves and what a wonderful profession it is. I would also like to invite (TBM) students to choose water management. For ambitious, creative people there are plenty of opportunities!

One of my deeper motivations for working in public office is that solidarity and sustainability are guiding principles there. I do strive to fulfil this public task in a new way, however: I believe you cannot devise policy from behind a desk, but must be open to society and the processes within it. We have to seek connection with society, and I want to contribute to that change in attitude. I would like to do the same in healthcare, were it not that water is so fascinating. It is an uncontested public domain. If the government does nothing, the country floods. It is as simple as that.’

Annemieke Nijhof (1966) is Director-General for Water at the Ministry of Transport, Public Works and Water Management (Verkeer en Waterstaat). She previously served, among other roles, as adviser to Prime Minister Balkenende on Spatial Development, Sustainability and the Environment.

This opinion piece was published in the June 2010 issue of TBM Quarterly of the Faculty of Technology, Policy and Management (Techniek, Bestuur en Management) at TU Delft, following the seminar Watermanagement, Bestuur en Beleid held earlier this year.

Water management and politics

Prof. dr. Sybe Schaap

‘Anyone reading through the election manifestos will notice that water management attracts little interest, but the water boards all the more. The hype about reducing administrative congestion once again puts the administrative autonomy of this tier of government under fire. Transferring it to the provinces is the most frequently mentioned option. It will not deliver savings or efficiency, and the administrative congestion will not decrease. Rather, this is an ill-considered attempt to show decisiveness. Even though such a move is hardly feasible, it does demonstrate a risky trend: the politicisation of water management. General democracy (national government and the provinces) is trying to get a grip on the age-old functional democracy. This is unadulterated power politics, with the risk of neglecting a vital part of our infrastructure: the safety and liveability of our country. A few points.

First of all, the risk of forced budget cuts. Regional water management could be dragged into what has also befallen the ‘wet’ management branch of Rijkswaterstaat: the loss of in-house management expertise, as well as the postponement of investment and maintenance. Water management requires an administrative culture focused on planned work over long time horizons; compare the time perspective of the Delta Programme.
General democracy pays too little attention to the long term. Hence the lagging investment in infrastructure, education and scientific research. The Netherlands of the short term gets political attention; the long term does not score.
That is why the governance of the water boards should also be depoliticised again; far too much is being debated about tariffs and too little about long-term returns. In addition, the ‘wet’ branch of Rijkswaterstaat could well be placed under the water boards, including financing through a levy. Merging water boards and provinces would shift the financing of water management to the provincial fund; and that begins with cutbacks.

A further risk lurks if the water board disappears as an autonomous institution. What then threatens is already manifesting itself worldwide: a distorted relationship between water management and spatial planning. Everywhere, spatial planning dominates the agenda and water management is the weak, compliant party. If the provinces take over the water boards’ powers, the same threatens here: subordination of water management to the interests that dominate spatial planning. Everywhere, water management is forced to adapt to the dictates of other interests, such as project development and nature objectives. A salutary duality of water and spatial planning, with safeguarded powers for water management, would be eroded.

After 1995 and 1998, water management had only just acquired a position meant to prevent major damage from irresponsible spatial planning. Previously, the Westland could simply be built over; water storage was an alien concept. Consider how the bypass of the IJssel near Kampen would have been planned if the water board there had not held a strong position. Worldwide, major risks to safety and liveability are driving countries to establish independent regional water authorities, following the Dutch example. It is to be hoped that constitutional insight and constitutional law can withstand an irresponsible political hype.’

Sybe Schaap (1946) is Professor of Water Policy & Governance at TU Delft. He has been a member of the VVD group in the Eerste Kamer (Dutch Senate) since 2007. Until 2010 he was dijkgraaf of Waterschap Groot Salland and chairman of the Unie van Waterschappen.

This opinion piece was published in the June 2010 issue of TBM Quarterly of the Faculty of Technology, Policy and Management (Techniek, Bestuur en Management) at TU Delft, following the seminar Watermanagement, Bestuur en Beleid held earlier this year.

International Crises and the value of Global System Dynamics

Lord Professor Julian Hunt – Visiting Professor at Delft University of Technology 

This article was published on Reuters The Great Debate UK on June 15th 2010

In their different ways, the disruption and damage caused by the ongoing Icelandic volcano eruption and the major oil leak in the Gulf of Mexico have underlined how low-probability events can wreak havoc locally and across the world.

Both events underline the continuing need for well-established crisis response by international bodies. Risk assessments taking into account all the diverse scientific and social interactions should enable the public and private sector to prepare in advance.

•    Although international procedures by UN bodies for dealing simultaneously with volcanic eruptions, meteorology and aviation had been agreed and tested at a technical level since the 1990s, the disruption caused by the Icelandic volcano led EU Transport Ministers to call for quicker and more coordinated reaction to such crisis situations.

•    In the Gulf of Mexico, the ‘unprecedented environmental disaster’ from the oil spillage shows the need for environmental risk assessment as much as for the economic risk assessment now being considered in the context of the volcano.

While the volcano and oil spills have causes and consequences that can be explained in terms of earth science, engineering, ecology and economics, other disruptive events with rapid global impacts can result simply from people’s actions – notably the fall of Lehman Brothers and the September 11, 2001 terrorist attacks.

Taken as a whole, the growing global attention being paid to these types of urgent, international, complex and inter-connected problems has led a group of scientists, working with policymakers from the European Commission and the private sector, to collaborate in new ways to explore how they could be dealt with more effectively in future.

Particular emphasis is being paid to global system dynamics when they are applied to making decisions, consulting with the public and identifying critical research problems for the future. Essentially these systems involve data input and output, models, networking with other systems and decision making. The role of feedback through public consultation is an essential but poorly understood part of the process.
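
To make that structure concrete, here is a minimal, purely illustrative sketch in Python of a system-dynamics style feedback loop: one stock, one flow, and a feedback channel standing in for the public-consultation loop described above. All quantities and coefficients are invented for illustration and are not drawn from any of the models mentioned in this article.

```python
# Minimal, illustrative system-dynamics sketch: one stock ("resource"),
# one flow ("demand"), and a feedback in which growing scarcity triggers
# a policy response that slows demand growth. All numbers are invented.

def simulate(years=50, dt=1.0):
    resource = 1000.0   # stock (arbitrary units)
    demand = 20.0       # flow per year
    history = []
    for year in range(int(years / dt)):
        scarcity = 1.0 - resource / 1000.0           # 0 = abundant, 1 = exhausted
        policy_brake = 0.05 * scarcity               # feedback: consultation/regulation
        demand *= 1.0 + (0.03 - policy_brake) * dt   # 3% baseline demand growth
        resource = max(resource - demand * dt, 0.0)  # deplete the stock
        history.append((year, resource, demand))
    return history

if __name__ == "__main__":
    for year, resource, demand in simulate()[::10]:
        print(f"year {year:2d}: resource={resource:7.1f}  demand={demand:5.1f}")
```

The point of the sketch is the structure rather than the numbers: data goes in, a model of stocks and flows runs forward in time, a feedback term stands in for consultation or regulation, and the output is what a decision maker would inspect.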

From philosophical and multi-disciplinary beginnings in the 1920s, applications of systems methods for industry and defence began in the 1940s. With the emergence of regional and global environmental problems of pollution, concerns about the devastating effects of nuclear war, planning the future resources of the planet, and then dealing with climate change, the global systems dynamics approach, with ever growing computer power, has become the only method available for policy making, with of course a thoroughgoing involvement of the social sciences.

Systems analysis is not yet the accepted method for managing financial crises, but it is suggesting some of the instabilities that have contributed to the most recent international recession. This could be a valuable tool for developing regulation policies for the highly computerised financial networks.

Can global systems science provide insights and quantitative methods to policy makers, beyond the usual, but essential, approaches of cost-benefit analysis, political factors (which may be quite scientific, such as the use of focus groups), historical example and crisis response planning?

One answer comes from several private sector entities which are employing dynamic, time varying computer models of present and future behaviour of the natural, technological and social components of activities or organisations. For instance, the French utility company Veolia uses system models to discuss policy options with city authorities. In this case, the requirement for integrated civic policies has meant that the system models had to be integrated.

These practical demonstrations provide lessons for how public organisations and politics can apply the systems approach in their domain. Guide books and road maps are already being written to promote this development through a project funded by the European Commission.

Information technologies are playing a key role in establishing the enhanced interfaces and appropriate communication channels needed between science, policy and society. A recent development of highly focused data provision is the use of Twitter by environmental agencies to send out topical warning messages.

Technical advances in information science are going beyond software engineering, model specification and formal methods to address the inherent speed limits for man-machine interactions, which when exceeded can cause so-called ‘flash-crash’ disruptions in the financial markets.

There are also other limits to the complexity and size of the models that are used. Firstly, system models that rely on the gathering and managing of large scale, heterogeneous sets of diverse data use ever larger and more energy consuming computing capacity. Will the current requirements of 5-7 MW in the largest centres keep on increasing?

Secondly, as computer programmes become larger and more complex, their reliability can become questionable, since the only evidence that they are correct comes from the highly skilled but unsystematic process of looking at the results of thousands of calculations and studying their patterns. Computer science has not yet been able to find a fool-proof proof!

Social and political aspects in the gathering, analysis and dissemination of data also have to be recognised. For social administration and security systems, intrusive searching for data must be minimised, which means that the most advanced ICT methods are needed for the most efficient use of data for analysis and decision making.

Political negotiations about climate change and the controversies about the scientific data have highlighted the need for wide communication of the policy process and about different sources and methods of analysis of data. Without this openness and public trust, systems based decisions will always be suspect.

The challenge for science is therefore two-fold: to advance the modeling of global systems and to engage in novel forms of interaction with policy, with regard to problems that span from local to global decision-making. Global research initiatives are under way to generate data, using new ICT and remote sensing methods, and to develop models in diverse global contexts such as city systems, conflicts between societies and nations, water and food security, climate change impact, and the dynamics and regulation of financial systems.

As global systems science becomes more directed towards policy making, research and practice, it is focusing on:

•    Understanding and explaining better how the ways in which individuals and organisations deal with issues can be described by the methods of systems analysis. The next step is to use the basic steps of data, modeling, and communication/consultation to make improvements, noting that there are many levels of complexity, cost and consultation. These steps can be effective at anything from giving conceptual and qualitative advice to providing massive quantitative policy recommendations derived from extensive computation.

•    Developing techniques and concepts for systems approaches to: (a) assist integrated policy making, such as managing complex crises or the connected energy, environmental and resource aspects of sustainable development strategies; and (b) predict the dynamical behaviour of different types of organisation, which for example can depend on how their parts are connected, or how events in the system develop in time; sometimes chaotic fluctuations are followed by sudden changes, as occurs in organisational as well as volcanic eruptions and in the pattern of communications chatter before critical events.

Arguably, the world of science and decision making should be encouraged by the growing and open collaboration between different disciplines, from economists to engineers and biologists, in exploring new policies for dealing with natural disasters and societal failures with their global impacts. Many international organisations, both public and private, are constructively involved. Serious disruption has resulted, but long term physical and social disaster has generally been averted.

In dealing with the multi-decadal problem of global warming and the wholesale destruction of biodiversity, global systems analysis is even more relevant as a framework for considering all the scientific, technological and social interactions. In addition, it is accepted as a framework for specialists in countries with differing policies and scientific understanding to discuss controversial issues, as was evident at an EU-China seminar last May when China presented its policy position very clearly.

Hopefully, if this approach is adopted more widely, international scientific and political understanding will improve and practical climate change measures will be agreed before it is too late.

A Turning Point Has Been Reached In the Gulf of Mexico, But This Crisis Will Still Transform The Oil Industry

Kees Willemse, Professor of Offshore Engineering, Delft University of Technology

This article also appeared on Reuters The Great Debate UK on June 11th, under the heading ‘BP Gulf of Mexico crisis will transform the oil industry’ 

The news that a huge metal cap has been successfully placed over several of the leaking oil vents at the Deepwater Horizon site marks a potential turning point in the Gulf of Mexico crisis. It is already estimated that some 10,000 of the 12,000-19,000 barrels of oil that are spilling out each day into the ocean are being captured and diverted to ships on the sea surface. Barring further setbacks, it is expected that the percentage of captured oil will be progressively increased over the next few days.

Despite this major engineering success, a complete end to the oil leakage is unlikely until new relief oil wells are completed – a drilling process that could take most of the summer and potentially extend into the autumn. This is because the newly installed metal cap is unlikely, even in the best case scenario, to stop all of the oil spilling out.

In advance of the completion of the relief wells, a potentially major new complicating factor is the arrival of the hurricane season last week. The National Oceanic and Atmospheric Administration is already predicting between 8 and 14 hurricanes this season, with perhaps a similar number of smaller storms, any of which could complicate (or indeed force a postponement of) the ongoing mitigation and clean-up activities in and around Deepwater Horizon.

Although the leakage will therefore continue for weeks to come, and the clean-up operations will take even longer, it is already clear that the disaster will herald a transformation in the oil industry by:

– Signaling a new re-regulatory era, especially in the United States, including the new six month moratorium on offshore drilling announced last month by US President Barack Obama. Authorities around the North Sea are also reasserting their positions.

– This will be paralleled by a movement in the industry towards a new paradigm which, as is explored later in this article, may require companies to adopt new operating models and much better engineering expertise in how to manage and mitigate disasters when they occur, especially for hard-to-reach oil in deep waters. As BP CEO Tony Hayward has asserted, ‘After the Exxon Valdez spill in 1989, the industry created the Marine Spill Response Corporation to contain oil on the sea surface… The issue now will be to create the same sub-sea response capability’.

The crisis is now officially the worst US spill in history, with between 500,000 and 800,000 barrels estimated to have leaked into the Gulf of Mexico as of May 31. It is still dwarfed by the largest ever offshore oil leak, the estimated 3,300,000 barrels spilled in 1979 at Ixtoc-1 in the Gulf of Mexico’s Bay of Campeche. However, Deepwater Horizon is now at least twice as large a spillage as the 260,000 barrels estimated to have polluted the shores and caused major environmental damage in Alaska following the Exxon Valdez disaster in 1989, although it should be noted that the crude oil from that vessel was considerably more harmful to the local environment.
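
As a quick sanity check on this comparison, the ratios implied by the figures quoted above can be computed directly; a back-of-envelope sketch in Python, using only the numbers given in this article:

```python
# Back-of-envelope comparison using the barrel figures quoted above.
deepwater_low, deepwater_high = 500_000, 800_000   # barrels, as of May 31
exxon_valdez = 260_000                             # barrels, Alaska, 1989
ixtoc_1 = 3_300_000                                # barrels, Bay of Campeche, 1979

print(f"vs Exxon Valdez: {deepwater_low / exxon_valdez:.1f}x to "
      f"{deepwater_high / exxon_valdez:.1f}x as large")    # ~1.9x to ~3.1x
print(f"vs Ixtoc-1: {deepwater_low / ixtoc_1:.0%} to "
      f"{deepwater_high / ixtoc_1:.0%} of that spill")      # ~15% to ~24%
```

Even the lower estimate is roughly twice the Exxon Valdez volume, while both estimates remain well below a quarter of Ixtoc-1, which is the basis of the comparison above.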

Hence Obama’s reassertion last week that Deepwater Horizon will be the ‘worst environmental disaster of its kind’ in US history.
With US mid-term congressional elections fast approaching in November, a regulatory clampdown on the oil drilling industry appears likely. However, re-examining the regulatory framework is just one part of what is now needed. Much more emphasis should also be placed on enhancing safety technology, engineering expertise and preparedness for managing and mitigating disasters when they do occur, especially for hard-to-reach oil in deep waters.

On the safety technology side, we will need, in particular, to improve blow-out preventer technologies. It is also likely that more than one of these blow-out devices will be needed at major sites in the future. Had a second blow-out preventer been in place at Deepwater Horizon, it is possible that the current disaster could have been completely averted.

What is also badly needed is better management and mitigation preparation for sub-sea disasters. Much of the equipment used in the various attempts to stem the oil flow was hastily designed on the spot, while the well was spewing out oil in huge quantities. Hence BP CEO Tony Hayward’s candid admission that his firm ‘did not have the tools you would want in your tool kit’ to respond faster to the crisis.

That the necessary expertise and preparedness has been lacking is reflected, for instance, in the fact that workers have had to experiment with new technologies and devices such as the five-storey, 100 tonne containment dome that was used, unsuccessfully, as a giant funnel to collect the oil and pipe it to the sea surface. Before Deepwater Horizon, this procedure had only been used at much shallower depths with much lower pressures. There is now an urgent need for ‘before-the-event’ development of, and experimentation with, this and other techniques in much deeper waters.

Meeting these ambitions head-on will require, in the words of Hayward, nothing less than a ‘paradigm change’ in the oil industry.

At a minimum, this will require all key industry parties (including exploration and production companies and drilling contractors, but also the engineers and researchers of the major universities with offshore expertise) to work together more closely to reduce the safety risks. However, it may also require a transformation in the oil industry business model which some assert has become excessively reliant on outsourcing important work to contractors.

For instance, while BP was in overall control of the Deepwater Horizon platform, responsibility for safety was shared with Transocean (which owned and operated the rig), Halliburton (which cemented the well), and Cameron International (which manufactured the blow-out preventer). Reducing the risk of accidents in the future could involve moving towards a different model with fewer sub-contractors, or at least one company tasked with overall responsibility for safety.

Whether such an operating model transformation happens or not, this new safety reform agenda can only be cost-effectively achieved if the oil industry prioritises it. It is likely that the global public will settle for nothing less in the post-Deepwater Horizon world.

Can we stop Deepwater Horizon becoming an unprecedented environmental disaster?

This article was also published on Reuters The Great Debate UK on May 14th 2010, titled How much damage will the BP oil spill cause?
 

Kees Willemse, Professor of Offshore Engineering, Delft University of Technology

Last month’s explosion at the Deepwater Horizon rig continues to result in the leakage of an estimated 200,000 gallons of oil into the Gulf of Mexico each day.  According to US President Barack Obama, ‘we are dealing with a massive and potentially unprecedented environmental disaster’.

While the leak is extremely serious, and Obama’s words may ultimately ring true, it is not (as yet) among the 50 biggest oil spillages in history from either oil rigs or tankers:
 
•    Some 7-10,000 tonnes of oil are so far estimated to have leaked into the Gulf of Mexico from Deepwater Horizon.
•    The Exxon Valdez leaked some 36,000 tonnes of crude oil on the shores of Alaska.
•    The largest ever off-shore leakage of oil occurred in 1979 in the Ixtoc-1 spillage when an estimated 476,000 tonnes of oil polluted the Gulf of Mexico (Bay of Campeche).
•    The biggest ever on-shore spillage occurred during the 1991 Gulf War, when an estimated 1.4 to 1.5 million tonnes was released in Kuwait by Iraqi military forces.

Most at risk from the Deepwater Horizon spill are the coastlines of Texas, Florida, Mississippi, Alabama, and Louisiana, including the wetlands near New Orleans where millions of migratory birds are currently nesting and fish are spawning.  The oil spill could also be catastrophic for the Gulf Coast’s substantial seafood industry, including oysters and shrimp.

To mitigate the environmental impact, measures will continue to be taken to prevent as much of the oil as possible reaching the shoreline, including setting fires to ‘burn-off’ the oil; soaking up the oil; and placing protective ‘barriers’ around shorelines.   

The precise scale of the unfolding disaster remains uncertain owing to the lack of clarity over how long the leak will last.  In the worst-case scenario, as US Interior Secretary Ken Salazar has suggested, the leak could continue for several more months.  Uncertainty is also increased by the fact that BP executives are reported to have admitted to members of the US Congress that the amount of oil spilling could intensify — perhaps by several multiples of the current leakage per day — if they cannot cap the flow.

Hopes for a relatively early end to the leakage are resting largely upon the success of the operation to install a five-storey, 100 tonne containment dome.  The device was to be lowered by cranes around 1500 metres to the sea floor and, if possible, positioned over the two areas of leaking pipe.  If successful, the dome would serve as a giant funnel, collecting the oil and piping it to the sea surface for collection. However, this operation has previously only been applied at much shallower water depths, and the first attempt to employ the device here has not been successful.
        
Even if successful, the dome will not stem all of the leakage (perhaps 80-90% of it) and there remains a risk of explosion when the oil reaches the surface because of the volatile mix of oil, gas and water.  Thus, the operation can only be a ‘holding’ one to buy time until the spillage can be shut off at the two remaining sources of the leak.  

While shutting off the leakage at source will be an immensely difficult task, hopes will have been raised last week by the fact that BP successfully shut off the smallest of the three original leakages.  This was done by placing a valve over the ruptured pipe and shutting it off using a remotely controlled submarine.  

The best hope of shutting off the two remaining sources of leakage is getting the blow-out preventer working again.  This is the system that should act as an emergency cut-off to stop oil continually spilling out if a pipe is damaged, and that failed catastrophically last month.

The Deepwater Horizon blow-out preventer is proving immensely hard to fix, in large part because of the exceptional depth of the water.  Robot submarines are attempting to re-start the system, albeit without success to date.

In the event that the blow-out preventer cannot be fixed, relief wells have begun to be drilled that could be used to siphon off the oil leaking from the holed pipeline. This operation will take an estimated two to three months, as the drilling is taking place at 1500 metres water depth and a further 5 kilometres into the hard rock. The relief well can also be drilled such that the shaft of the new well enters the shaft of the existing, problematic well. A cement prop can then be inserted to stop the flow in the first well. This requires extremely accurate drilling, but the technique has been proven before.

Inevitably, all of this environmental mitigation and emergency replacement activity is proving extremely expensive. The cost to BP alone is an estimated 6 million dollars a day, and independent estimates have put the final bill at between 3 billion and 12 billion dollars.  However, the effects of this accident cannot be expressed in money terms alone, because of the growing scale of the environmental disaster if the oil spill cannot be contained soon.  

Once the crisis is over, industry and government will need to make an in-depth analysis of the cause of the accident to ensure that similar incidents can never happen again.  And Delft University of Technology is ready to mobilise its resources to join this effort and provide the technological support that will be required to make offshore activities safe for the generations to come.

Why the Icelandic volcano eruption could herald more disruption

This article by Dr. Andy Hooper first appeared on TimesOnline on 18th April 2010. It was rerun by Reuters UK The Great Debate on April 19th 2010.
Dr Andrew Hooper is an Assistant Professor at Delft University of Technology and is an expert on monitoring deformation of Icelandic volcanoes.

The unprecedented no-fly zone currently in force across much of Europe has already caused the greatest chaos to air travel since the Second World War. Thousands of flights have been cancelled or postponed with millions of travel plans affected. It has been estimated that shutting down the UK’s airspace alone over the weekend could cost airlines over 100 million pounds, with the share price of some leading airlines already taking a hit.

The wider economic consequence for our ‘just-in-time’ society is incalculable at this stage, given the disruption to holidays, business plans and indeed the wider business supply chain. However, the global cost of the disruption will surely run to billions rather than hundreds of millions of pounds.

It is exceptionally hard to gauge how long the current grounding of flights will remain in force, although Eyjafjallajökull, the Icelandic volcano which has erupted, could potentially sputter on for months or even more than a year. Much could depend upon weather patterns, especially wind direction, over the next few days.

The worst-case scenario in terms of precedent here is the 1783-1784 eruption at Laki (a very large eruption of 14 km3, compared with 1 km3 for Mount St. Helens in 1980) that had a huge impact on the northern hemisphere, reducing temperatures by up to 3 degrees. This led to catastrophe far beyond the shores of Iceland (where 25% of the population died), with thousands of recorded deaths in Britain due to poisoning and extreme cold, and record low rainfall in North Africa.

By contrast, the eruption of Eyjafjallajökull in 1821-1823 (when only about 0.1 km3 was erupted) had little impact beyond the shores of Iceland, where livestock were killed by fluorine poisoning. Like the 1821-1823 event, this current eruption is likely to remain small in terms of volume, but in an age of mass aviation, a relatively small amount of erupted ash is having huge consequences.

One volcanic eruption in Alaska in 1989 necessitated the postponement and cancellation of flights in North America for days. The fallout from yesterday’s eruption is likely to be worse, because European airspace is more congested with global airline traffic than North American airspace.

How much material will be erupted? Observations of surface deformation can throw light on this. These come principally from two sources — the first being a handful of GPS receivers dotted around the volcano by the University of Iceland and the Iceland Met Office, and the second being imaging radar on board satellites.

Differencing of subsequent radar images can give surprisingly accurate maps of the movement of the ground. I and others at Delft University of Technology have been developing algorithms to push the limits of the technique, to extract measurements from radar data in regions where it is more difficult, and with greater accuracy.
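
The core of the differencing step can be illustrated in a few lines of Python. This is a simplified sketch, not the Delft algorithms themselves: it assumes a C-band radar wavelength of about 5.6 cm, assumes that topographic, orbital and atmospheric phase contributions have already been removed, and ignores the sign-convention differences between processing packages.

```python
import math

# Illustrative conversion of an interferometric phase difference to
# line-of-sight (LOS) ground displacement. Assumes a C-band wavelength
# (~5.6 cm) and that topographic, orbital and atmospheric phase terms
# have already been removed; sign conventions vary between processors.
WAVELENGTH_M = 0.056  # assumed radar wavelength in metres

def phase_to_los_displacement(delta_phase_rad):
    """One 2*pi fringe corresponds to half a wavelength of LOS motion (two-way path)."""
    return -WAVELENGTH_M / (4.0 * math.pi) * delta_phase_rad

# Example: one full fringe of phase change between two acquisitions
fringe_m = phase_to_los_displacement(2.0 * math.pi)
print(f"one fringe ~ {abs(fringe_m) * 100:.1f} cm of line-of-sight motion")  # ~2.8 cm
```

Maps of such displacements, acquired before and after an event, are in essence what the deformation patterns described below are built from; the research effort lies in removing the other phase contributions reliably.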

The usual pattern with Icelandic eruptions is for rising and stretching of the surface as magma moves up to shallow depths of a few kilometres, followed by contraction and sinking of the surface as magma exits the shallow magma chamber and erupts at the surface.

However, in this case Delft University of Technology, working in collaboration with the University of Iceland, detected magma moving upwards until the onset of the initial eruption on March 20th, but very little deformation since then. This implies that the volume of erupted magma is balanced by new magma coming from deep within the crust, perhaps even from the crust-mantle boundary, and it is impossible to know how much magma may be stored at these depths. Thus, it remains a very real possibility that the volcano will continue to erupt on and off for months to come, as occurred during the last eruptive period in 1821-1823.

Measurements of the surface deformation, together with detected earthquakes, have indicated magma rising to shallow depths since the beginning of this year. Questions have been raised as to why an eruption was not predicted more than a couple of hours before it started.

From analysis of radar data we know of two events at Eyjafjallajökull, in 1994 and 1999, that started in a similar way, with magma moving to shallow depths (5-6 kilometres). However, in both cases the magma then spread out laterally and remained in the crust. Apparently something differed this time, in that stress conditions favoured continued upward migration of the magma. We have some way to go before we can answer what seems like a simple question: whether magma moving upwards to shallow depths is likely to erupt or to stall within the crust.

At the end of the last ice age, the rate of eruption in Iceland was some 30 times higher than historic rates. This is because the reduction in the ice load reduced the pressure in the mantle, leading to decompression melting there. Since the late 19th Century the ice caps in Iceland have been shrinking yet further, due to changing climate. This will lead to additional magma generation, so we should expect more frequent and/or more voluminous eruptions in the future.

Eyjafjallajökull is a relatively small volcano and is unlikely to erupt the volumes of material that would have a significant impact on climate. However, the eruptions of Eyjafjallajökull in 1821-1823 and 1612 were followed in short order by eruptions of its much larger neighbour, Katla.

Katla has thus shown the potential for large eruptions in the past: the last catastrophic Icelandic eruption prior to Laki was from Katla in 934, when an even greater volume of lava was erupted. If Katla were to erupt in a significant way, the potential for travel chaos and economic damage would be much greater than has occurred in the last 24 hours.

Haiti: Moving beyond the conventional redevelopment paradigm

This article by Alexander Vollebregt appeared on Reuters The Great Debate UK on March 31st 2010. Alexander Vollebregt is Assistant Professor at Delft University of Technology and Head of the Urban Emergencies Programme.

At the request of President Rene Preval’s Strategic Advisory Group, several members of Delft University’s Urban Emergencies Centre are now working with the Haitian authorities to assess how post-disaster urban redevelopment can support the rebuilding process in Port au Prince.

We believe what is now urgently needed is a paradigm shift in current thinking and methods in order to facilitate sustainable reconstruction in the country, and indeed in similarly disaster-affected areas.

Since the devastating earthquake on January 12, 2010, international organisations, aid and money have flooded into Haiti, with the government itself having little control over the reconstruction efforts.  This disjunction between the local, governmental and international organizations could hamper any significant long-term reconstruction and development, creating a post-disaster transitional environment that never transitions.

To achieve this shift, it is essential to better integrate grassroots (local) voices with international NGO and governmental voices.  This will create a mutual platform for contributing knowledge about what is best in the long term for this devastated landscape and its peoples.

This perspective stems from a major study that Delft University undertook last year on Urban Emergencies.  This included a focus using action research to investigate various aspects of the disaster relief management and reconstruction process, such as hurricane proofing, risk assessment, spatial planning, socio-economic development, land administration and water, waste and energy management.

The Delft research focused upon the interrelationship of the social, spatial, political and economic consequences of post-disaster urban responses.  We worked for three months on-site studying six case studies: Venezuela (landslide), El Salvador (earthquake), Ghana (floods), Indonesia (tsunami), Bangladesh (floods, landslide, cyclones), and the Philippines (volcano eruption, earthquake, typhoons).

After 50 years of experience in post-disaster redevelopment, there are unfortunately many more worst-case practices than best-case practices. This is predominantly due to the lack of an integrated redevelopment vision, as the reconstruction process is divided into various phases with different relief agencies each responsible for separate tasks.

This conventional paradigm used by governments and NGOs after disasters is often not sustainable or appropriate in the long term.

To take one example of this paradigm, the three common stages for shelter response are:

1. Temporary housing (installation of emergency relief tents),
2. Transitional housing (easy and quick to construct shelters), and
3. Permanent housing (better quality constructed buildings).

The second stage, while described as ‘transitional’, appears to hinder any permanent reconstruction. For example, some residents have remained in their ‘stage 2’ accommodation for over 10 years. Such housing is not designed to be disaster resilient, so if a natural disaster occurs again it would devastate the inhabitants, bringing the area back to stage 1.

Furthermore, the constant relocation and displacement of the affected populace prevents them from picking up their livelihoods (e.g. job opportunities, social relations), resulting in psychological traumas and frustrations. Often, even after receiving ‘permanent’ housing, inhabitants sell (or simply leave) these ‘permanent’ homes and move back to the vulnerable spaces inside the city, owing to their remote location, disconnection from the city and inability to find economic opportunities.

Why do post-disaster communities get stuck in phase 2 for such long periods of time?

The answer is that a great deal of resources tend to be put into this phase because global pressure to see quick results mounts upon the aid agencies and governments operating within the post-disaster space.  This is coupled with the reality that very few local governments in the past have shown willingness to invest in long term disaster preparedness interventions.

This can be attributed to a lack of public funds or insufficient taxation systems in many vulnerable states, hence putting money into preparing for something that ‘may not happen’ has lacked justification on their part.

History shows, however, that this view can end up proving much more expensive for these governments, who have to reinvest should their infrastructure prove insufficient to cope with natural disasters.

How do we move efficiently from temporary relief to sustainable permanent solutions (Phase 3)?

Firstly, we should introduce spatial thinkers (e.g. architects and urbanists) to the reconstruction efforts at an earlier stage, bringing not only expertise on resilient and pragmatic urban construction into the redevelopment process, but also a deeper understanding of socio-spatial and spatial-economic relationships for the long term. Though this may take more time, it is the best opportunity to include resilience strategies within the redevelopment proposals.

Furthermore, it is the only moment that vast resources are available to implement Disaster Preparedness Strategies.

Secondly, we need to bring together local knowledge and opinion, which understands the community’s cultural needs and practices, with international and governmental opinion. An integrated, multi-stakeholder perspective is the only way to ensure that reconstruction and planning bring together expertise and cultural understanding. This approach facilitates local ‘ownership’ of the redevelopment and means that the inhabitants will engage in and drive it.

Thirdly, rather than applying textbook, standardised responses and procedures, we need to move more towards a collaborative, tailored response, coordinated by government and developed by multiple stakeholders; by and for the people.

Over the next few months, Delft graduates of the Urban Emergencies programme will immerse themselves in Port au Prince engaging with local inhabitants to support ‘acute’ architectural reconstruction in the rural areas; they will collaboratively construct semi-permanent housing with local inhabitants.

Simultaneously, academic staff will offer short capacity building workshops enhancing spatial and technical skills for participants in the current reconstruction phase.

Finally, when the rubble has been cleared away in the most devastated areas, an inter-university team of students and researchers will work for one year with UN Habitat and local stakeholders to understand what is needed by local people, the government and the international forces in order to collaboratively research and design sustainable development strategies at neighbourhood, city and regional scales, transforming Haiti into a resilient and prosperous urban space.

If this can be achieved, it will go a long way towards securing significant urban, economic and social recovery. To be sure, this will not be easy, but we are committed to doing all we can to avoid further degeneration for Haiti and its people.

Check out the website of the Urban Emergencies programme at www.urbanemergencies.org 

Bringing a new perspective to World Water Day

This opinion piece by prof.dr.ir. Jules van Lier was published on Reuters The Great Debate on World Water Day, Monday 22nd March 2010

The international observance of World Water Day, this year on March 22, is an initiative that grew out of the 1992 United Nations Conference on Environment and Development (UNCED) in Rio de Janeiro.  This year’s theme — ‘Clean Water for a Healthy World’ — reflects the fact that population and industrial growth are adding new sources of pollution and increased demand for clean water across the world.  Human and environmental health, drinking and agricultural water supplies for the present and future are at stake, yet water pollution rarely warrants mention as a pressing issue.

It is absolutely right that water quality considerations should be highlighted just as much as water quantity issues going forwards.  However, what is sometimes obscured in this important debate is that, even with a step change in global water treatment efforts, vast amounts of potentially valuable wastewater will continue to be produced for the foreseeable future.  

Indeed, in some developing countries some 80% of all wastewater is discharged completely untreated, because of a lack of regulations, resources and control. Globally, it is estimated that 1,500 cubic kilometres of wastewater is produced annually, whereas the world’s renewable fresh water reserves amount to only 40,000 cubic kilometres per year. Given that 1 m3 of untreated wastewater may spoil over 1,000 m3 of fresh water for human consumption or other uses, the urgency of the matter is obvious.
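
A back-of-envelope calculation, using only the figures quoted in the paragraph above, makes the scale of the problem explicit (a sketch in Python):

```python
# Back-of-envelope calculation using the figures quoted above.
wastewater_km3_per_year = 1_500    # global wastewater production
freshwater_km3_per_year = 40_000   # renewable fresh water reserves
spoil_ratio = 1_000                # 1 m3 untreated wastewater spoils ~1,000 m3

# Fresh water that could be spoiled if all wastewater went untreated:
potentially_spoiled = wastewater_km3_per_year * spoil_ratio
print(f"potentially spoiled: {potentially_spoiled:,} km3 per year")   # 1,500,000

# Fraction of wastewater that, left untreated, would nominally match the
# entire renewable freshwater supply:
critical_fraction = freshwater_km3_per_year / potentially_spoiled
print(f"critical untreated fraction: {critical_fraction:.1%}")        # ~2.7%
```

On these figures, leaving even a few percent of the world’s wastewater untreated is nominally enough to compromise the entire annual renewable freshwater supply, which is why the 80% untreated figure for some developing countries is so alarming.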

My research at Delft University of Technology, UNESCO-IHE and my previous employer, Wageningen University, has convinced me that, especially in the developing world, it is crucial that we change our perspective on wastewater for two main reasons:  

•    In an era of increasing water scarcity, especially in the developed world, it is increasingly vital that we use all our water supplies efficiently.  As a result of climate change, it is estimated that by 2025 some 1.8 billion people will live in countries with absolute water scarcity.
 
•    Moreover, in recent years, the treatment technologies for removing the harmful components from wastewater have become increasingly effective.  Thus, far from being a useless by-product which is collected in pipes and gutters and flows into a dump-hole somewhere in the ground, wastewater is actually fast becoming a potential source of valuable raw materials including water and energy that can be reused productively for energy and irrigation.

Going forwards, the potential of wastewater is truly huge, especially in the developing world.

For instance, if we assume only a 50% recovery of the chemical energy enclosed in human excreta, the potential energy generation equals about 100 watt-hours (Wh) per person per day. This alone would be enough to light a substantial part of the poorest cities of Africa all night long!
 
Moreover, a city of 1 million inhabitants with an average water consumption of 100 litres per person per day can theoretically irrigate and fertilise between 1,500 and 2,000 hectares of farmland with its wastewater, while nutrients from the wastewater are put to good use and the farmland serves as a sand filter to purify the water. With the world’s phosphorus reserves being depleted in about 60-70 years from now, we simply have to recover our valuable resources from our urban waste streams. In fact, the word ‘waste’ no longer applies and must be replaced by ‘a stream of non-defined resources’, ready for valorisation.
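
The two figures above can be roughly reproduced with a short calculation. In the sketch below the per-person organic load, its energy content and the irrigation water demand per hectare are illustrative assumptions, not values given in the article; they merely show that the quoted numbers are of the right order.

```python
# Rough reproduction of the two figures above. The per-person organic load,
# its energy content and the irrigation water demand are illustrative
# assumptions, not values taken from the article.

# 1) Energy recoverable from human excreta
cod_g_per_person_day = 60.0     # assumed organic load (g COD per person per day)
energy_kj_per_g_cod = 14.0      # assumed chemical energy content of COD
recovery = 0.5                  # 50% recovery, as quoted above
wh_per_person_day = cod_g_per_person_day * energy_kj_per_g_cod * recovery / 3.6
print(f"recoverable energy: ~{wh_per_person_day:.0f} Wh per person per day")  # ~117

# 2) Farmland irrigable with a city's wastewater
inhabitants = 1_000_000
litres_per_person_day = 100.0
m3_per_year = inhabitants * litres_per_person_day * 365 / 1_000.0    # ~36.5 million m3
irrigation_m3_per_ha_year = 20_000.0    # assumed crop water demand (~2 m per year)
hectares = m3_per_year / irrigation_m3_per_ha_year
print(f"irrigable area: ~{hectares:,.0f} hectares")                  # ~1,825
```

Both results land in the ranges quoted above (about 100 Wh per person per day and 1,500 to 2,000 hectares), suggesting the article's figures are internally consistent under reasonable assumptions.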
 
In the past, public sector municipalities, especially in the developing world, have failed to appreciate this potential and have underinvested in sanitary engineering and construction infrastructure and personnel.  

Going forwards, however, it is likely that a new generation of developing country entrepreneurs will be able to unlock the value and potential profitability in wastewater and play a key role in the construction and implementation of basic sanitary infrastructure, opening up new opportunities in areas such as micro-financing and environmental engineering.  

This would be hugely important in the developing world where 2.6 billion people still have no proper sanitation, resulting in some 200 deaths per hour, with the highest number among children under five.

Indeed, it is perfectly possible in the future that entrepreneurs might, under appropriate regulation, operate municipal-wide sewage treatment systems with investment costs being covered by loans, donations, franchise systems and/or lease contracts, and profit margin coming from sources like sewerage levies, nutrients benefits, stabilised organic matter, and recovered energy.  

Such wastewater treatment plants may even eventually become reprocessing plants that produce water suitable for reuse.  This will lend an entirely new impetus to the process that could lead to the application of new reprocessing technologies, especially in areas where waste water treatment is still seen as a ‘Western luxury’.  

One final key element that will drive this process forwards is the relaxation of very stringent Western-driven standards that have paralysed construction and implementation of sewerage and waste treatment plants in the developing world.  The resulting costs are often beyond the means of local municipalities, or encourage development of the wrong kind of sewerage and waste treatment plants for their needs.

A good example here is the city of Amman which, driven by Western donors, has built a high-technology treatment plant with costly wastewater treatment systems.  Amman would instead have benefited from decentralised treatment plants yielding an extra 5 to 6 megawatts (MW) of electrical power, which could then be used to drive irrigation pumps, for example, to benefit agriculture in local dry regions.
     
Unfortunately, what has happened in Amman is a common phenomenon in the developing world, where insufficient interest is paid to potential alternative sewerage and treatment plants that would be more robust and suitable for these regions.  The end result is often abandoned or under-performing systems, and/or plants that consume so much of the available financial resources that only a fraction of the pollution problems can be handled.
 
If local entrepreneurs can shift this perspective towards one more focused on adaptation to the local situation and a proper financial cost-benefit analysis, wastewater could easily grow to become an exceptionally valuable source of resource recovery, powering a broader development process in developing countries which still often lack even basic sanitation and water treatment infrastructure.

Implications of recent climate science controversies

This article was posted on Reuters The Great Debate by Sir Julian Hunt on February 18th 2010

– Julian Hunt is visiting professor at Delft University of Technology (The Netherlands) and formerly director general of the UK Meteorological Office. The opinions expressed are his own.-

In the past few weeks, there has been a steady stream of stories highlighting major concerns over scientific evidence relating to climate change.  One example has been the world-wide furore relating to the Intergovernmental Panel on Climate Change’s (IPCC’s) assertion that all Himalayan glaciers would melt by 2035.

Going forwards, as the UK Government Chief Scientist Professor John Beddington has stated strongly, standards of openness about sources, verification and presentation must be at the highest level.

The most regrettable implication of recent events is that further confusion has been sown amongst global publics about climate change.  What I believe most people want now is enlightenment, not further argument, about what might be the gravest issue confronting humanity in the twenty-first century.

One of the key challenges for scientists and indeed politicians is communicating the reality of climate change to global publics in an accurate and intelligible way.  Contrary to belief in some quarters, the leading models that forecast global climate temperature in decades ahead are reliable and this is strongly supported by satellite data.

Dismissive views expressed about climate predictions are often based on the uncertainty of long range weather forecasts.  However, this is a false comparison: even sceptics know how long it takes to heat water in a saucepan, and that this does not depend on understanding the eddy movements in the pan (which are analogous to weather patterns and are only approximately described by models).

What is needed is more openness and clarity about the huge complexity of the climate change phenomenon.  For instance, over the last decade, while the earth’s land surface has been warming overall, trends of weather and climate records reveal larger and more unusual regional and local variations — some unprecedented since the end of the last ice age 10,000 years ago.

Among such warning signs are the disappearing ice fields around the poles and on all mountain ranges, more frequent droughts in Africa and now in wet regions (such as the 2006 drought in Assam, India, previously one of the wettest places in the world), floods in dry regions (such as the recent worst floods in 50 years in northwest India), and ice storms in sub-tropical China in 2008 (for the first time in 150 years).

What these data patterns underline is that, while climate change is a reality, it is impacting regions and indeed sub-regions of the world in very different ways.

Although such variations are approximately predicted by global climate models such as the IPCC’s, these data-sets need buttressing with more local measurements and studies for sub-national governments, industry and agriculture to better understand their climatic situation and develop reliable and effective strategies to deal with all the ways that climate change affects their activities and well being.

Post-Copenhagen, adopting this approach is especially crucial, as we may be heading towards a future in which no long-term, comprehensive successor to the Kyoto regime is even politically possible at the international level.  One of the chief flaws in the Copenhagen negotiations was that they were aimed at an ambitious top-level deal that did not account for political imperatives in developed and developing countries.

Experience shows that a ‘bottom-up’ approach works very effectively.  Publics and businesses are far more likely to believe local monitoring reports on climate change.  Moreover, it is often only when sub-national areas learn how they specifically will be affected that grassroots action is aroused.

This latter lesson was one I learned as a City Councillor in Cambridge in the 1970s, when I helped introduce air pollution measurements to show the effects of traffic in the city’s town centre.  Once the high air pollution was measured and better understood by local people, traffic control measures were quickly introduced.

I am therefore delighted at the increasing numbers of regional monitoring centres across the world which, by communicating and interpreting climate change predictions and uncertainties, are contributing towards local adaptation plans:
•    In China, where provinces require targets for power station construction, regional environmental and climate change centres are now well developed.
•    In the United States, a recent report has highlighted the value of non-official centres, such as a severe storm centre in Oklahoma, which gives independent advice to communities and businesses, while relying on government programmes for much of the data.
•    In Brazil, a regional data centre is providing data and predictions about agriculture and deforestation and informs legislation about policy options.

What this activity points to is the need for a broader global network of such centres to support national climate initiatives, and to facilitate international funding and technical cooperation in delivering the right information to the right place, at the right time.

Local actions can only be effective if measurements of climate and environment are made regularly and are publicised, along with information about targets and projections of emissions.  Experience shows that full exposure is needed about what is happening, what is planned, and how every individual can be involved (as the Danes show by their community investment in wind power).

Moreover, as legislators in GLOBE (the Global Legislators Organisation for a Balanced Environment) and city governments across the world are already putting into practice, adaptation to climate change also needs to build on existing knowledge and infrastructures in local settings.

Forming loose collaborative networks will enable regional facilitation centres, their experts and decision makers to learn from one another and also draw upon the resources of existing national and international databases and programmes, such as the growing number of consortia linking major cities, local governments, and the private sector.

The overall message is clear.  ‘Localisation of action and data’ must be the post-Copenhagen priority if we are to facilitate public understanding of climate change and truly tackle the menace it poses.

Time to invest in Europe’s bio-clean tech delta

This opinion piece by Luuk van der Wielen and Roger Wyse was published on Reuters The Great Debate on February 4th 2010

– Luuk van der Wielen is at BE-Basic and Delft University of Technology; Roger Wyse is Managing Director, Burrill & Company, San Francisco. The opinions expressed are their own. –

Today the global megatrends of food security, energy security, global climate change and sustainability command the attention of nations worldwide.  Confronting these challenges will test political systems, drive policy and stress international relations.

To address them successfully, nations and companies are making massive investments in R & D, seeking solutions that will drive global innovation for decades.  The application of modern discoveries in biology and bio-clean technology will be a major enabling force in tackling these issues.

Indeed, the application of bio-clean technology can potentially mitigate many of Europe’s ecological and economic challenges.  The markets for bio-based (or green) products and technologies made from agricultural waste — instead of oil — are currently large and open.

However, the public and private sectors must act now; otherwise we will miss a huge opportunity to generate economic value, and delay will only worsen our environmental predicament.  Access to innovation must be global in nature, as no country has the resources necessary to discover, develop and implement solutions.  Furthermore, the problem, and therefore the solution, knows no national boundaries.

The Netherlands is a first-class example of how bio-clean technology can and should drive a new wave of innovation.

Much of the Dutch economy is founded on the immediate post-Second World War industrial wave that brought the likes of Shell, DSM, Philips and Unilever to global prominence.  In recent decades, however, the economy has consolidated, leaving competitive and entrepreneurial potential under-utilised.

A new wave of industrial innovation is now badly needed in the country to respond to the rise of the BRIC economies (Brazil, Russia, India and China) and the recent financial downturn, not to mention the vulnerable, climate-sensitive Dutch delta region (with about 50 percent of the country below the rising sea level), which will be affected by the consequences of energy security and climate change.

Bio-clean technology can power that new wave of innovation.  The Netherlands has consistently worked to implement a number of national public-private innovation programmes around bio-based and other clean technologies; these are now starting to pay off in a major way, and within them there will be attractive near- to mid-term private-sector investment opportunities.

Earlier this month, a new wave of public-private R & D investments exceeding 0.5 billion euros was launched, including:

  • large-scale initiatives on industrial biotechnology and biorenewables (BE-Basic),
  • direct solar-to-chemicals and energy production (Biosolar),
  • process intensification, carbon capture and storage (CATO), and others.

This large package of bio-clean technology programmes also forms the backbone of the Climate-KIC recently awarded by the European Institute of Innovation and Technology (EIT).

The participants in the Netherlands’ public-private programmes include major Dutch companies (DSM, AkzoNobel) and universities (Delft University of Technology, as well as Groningen, Utrecht and Wageningen Universities), other premier European academic institutes such as London’s Imperial College and the German technical universities of Dortmund and Karlsruhe, and a number of institutions outside Europe, such as the Energy Biosciences Institute at UC Berkeley.

An important driver is the potential to establish a sustainable bio-based economy for the production of chemicals, materials and transportation fuels.  This could also add further value to the residues of a food-producing agricultural sector.  We are well on our way to converting biomass into a range of renewable chemicals and materials.

Applications of biotechnology in the chemical and energy industries are already resulting in sustainable sources of second generation biofuels, biocatalysts and renewable chemicals, meeting the growing demand for energy and renewable materials.

Thus, bio and clean technologies will be major contributors to addressing the challenges posed by the global megatrends of food security, energy security, climate change, and sustainability.  With the continued success of these technologies, exciting investment opportunities exist within and across the agricultural, chemical and energy sectors.

Prudent investments in technology-rich, private companies that contribute viable solutions should generate attractive returns to venture investors while addressing important societal issues.

This requires investments in companies at two stages:

  • early on, in companies with potentially disruptive or broadly applicable technologies, and
  • later on, in the adaptation and commercialization of proven technologies for new markets.

Companies invested in this way will be well positioned to succeed in global markets and to become attractive acquisition targets for global chemical, energy and agricultural companies, or candidates for IPOs in ready markets throughout the world.  At present, Northern Europe is a most promising place for these governmental and private investments, given its solid home market with clear consumer awareness of sustainability and resource security, its world-class R & D potential, and its well-established industrial and logistics infrastructure, which needs to be reshaped toward the challenges of the coming generation.

However, two key challenges stand in the way of realising this vision.

First, Europe needs to better channel the flow of scientific and technological concepts from the large pool of sustainable bio-clean technology initiatives into new companies, jobs and economic growth. Given the substantial amount of public resources in these programmes, novel mechanisms will have to be implemented to guarantee clear returns to the public purse.

The second challenge is to stimulate activities that also lead to climate adaptation (most bio-clean tech programmes target mitigation only).  This is crucial for the Netherlands, for instance, given that about 50 percent of its land lies below the rising sea level.

The well-known Dutch expertise in water and delta management, much of which was developed at Delft University of Technology, has to be advanced further.  Interesting examples have arisen in the BE-Basic programme, for instance, from the exploration of biogrouting, bio-construction materials and other completely new bio-based mechanisms, leading to new ways to build dikes and reinforce weak soils.

If we can surmount these twin challenges and harness the major bio-clean tech innovation wave, the prize for Europe will be not only positive climate effects and energy security but also the foundation stone for a new generation of sustainable economic growth.
