Research with a vision
Scientists discuss digital methods and virtual laboratories at the Helmholtz-Zentrum Geesthacht’s 2017 Annual Meeting. These topics include X-ray imaging, undertaken by HZG materials researchers at DESY’s PETRA III X-ray source, which quickly gobbles up several terabytes of storage space. The data archive that climate researchers rely on stores as much data as 135,000 laptop computers could hold. Coastal researcher Dr Volker Matthias and materials researcher Dr Martin Müller explain here how coastal and materials scientists at the Helmholtz-Zentrum Geesthacht work with huge quantities of data and what they learn from this data.
Volker Matthias heads the Department of Chemistry Transport Modelling at the Institute of Coastal Research
Photo: HZG/ Christian Schmid
Mr. Matthias, you’re interested in how ship exhaust gases spread in coastal regions. Why is that?
Approximately one thousand ships sail the North Sea alone every day. They discharge pollutants such as soot, sulphur dioxide and nitrogen oxides. For some time now, only ships using low-sulphur fuel have been allowed to travel the North and Baltic Seas. This has considerably reduced their sulphur dioxide emissions. Seagoing vessels, however, still contribute a substantial amount of nitrogen oxides to emissions. These nitrogen oxides contribute, for example, to the formation of ozone and fine particulate matter – particularly in harbour cities: a third of nitrogen oxide pollution in Hamburg is estimated to come from ships.
Our research aims to determine in detail how much shipping contributes to harmful emissions and how effective counter-measures could be – for example, by using catalytic converters.
How do computer simulations – chemistry transport models – help you in this work?
Our simulations are similar to those from computer models used for weather forecasting. In addition, we take into account pollutant emissions, material transport and chemical reactions. Our simulations are based on grids. We divide the atmosphere into several hundred thousand grid cells, and the computer calculates a set of chemical reaction equations for each of these cells. The cells then exchange information with each other – and that's how we simulate the material transport.
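The principle described above can be sketched in a few lines. This is a deliberately simplified toy illustration, not the actual HZG chemistry transport model: each grid cell holds a pollutant concentration, a first-order decay term stands in for the full set of chemical reactions, and mixing with the four neighbouring cells stands in for material transport. All numbers are assumptions chosen for the illustration.

```python
import numpy as np

def step(conc, decay=0.05, mix=0.1):
    """One toy time step: chemistry inside each cell, then exchange between cells."""
    # "chemistry": first-order decay inside every grid cell
    conc = conc * (1.0 - decay)
    # "transport": mix each cell with its four neighbours
    # (np.roll gives periodic boundaries, kept only for simplicity)
    neighbours = (np.roll(conc, 1, 0) + np.roll(conc, -1, 0) +
                  np.roll(conc, 1, 1) + np.roll(conc, -1, 1)) / 4.0
    return (1.0 - mix) * conc + mix * neighbours

grid = np.zeros((50, 50))
grid[25, 25] = 100.0          # a single emission "puff" in the centre
for _ in range(10):
    grid = step(grid)
print(grid.sum())             # the puff spreads out while chemistry removes mass
```

A real model replaces the decay constant with coupled reaction equations per cell and the neighbour averaging with wind-driven advection, but the structure – local chemistry plus exchange between cells – is the same.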
Our simulation initially works with a rough grid for all of Europe, with grid cells measuring 64 × 64 kilometres. We then calculate the North and Baltic Seas area more precisely, with grid cells measuring 16 × 16 kilometres. In the end, we simulate the North Sea coast at our highest currently available resolution, which is 4 × 4 kilometres.
We feed our models with meteorological data and the best available estimates of emissions from traffic, industry, households and agriculture. Pollutant emissions from ships can be determined very precisely by taking into account the movements of every single vessel. We then let two simulations run: one with all pollutant sources and the other without including shipping emissions. The proportion from shipping can be derived with great precision from the difference.
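The two-run attribution method described above can be sketched on synthetic data. The fields and numbers below are invented for illustration; in the real model the two runs differ through the full nonlinear chemistry, which is exactly why the difference of two complete simulations is used rather than the emission inventory alone.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical NOx concentration fields on a small grid (assumed numbers)
land_sources = rng.uniform(5.0, 20.0, size=(16, 16))
ship_sources = rng.uniform(0.0, 10.0, size=(16, 16))

# run 1: simulation with all pollutant sources
run_all = land_sources + ship_sources
# run 2: identical simulation, but with shipping emissions switched off
run_no_ships = land_sources

# the per-cell difference isolates the contribution from shipping
shipping_contribution = run_all - run_no_ships
share = shipping_contribution.sum() / run_all.sum()
print(f"shipping share of total NOx: {share:.1%}")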
Computers at the German Climate Computing Center in Hamburg. Photo: HZG/ Christian Schmid
What kind of computers are you working with?
On the one hand, we’re working with the in-house computer cluster. It includes about 2500 processors, and we can use two to three hundred of those simultaneously. In addition, we can also fall back on the computers at the German Climate Computing Centre (DKRZ). We require not only a great deal of storage space for our calculations, but the data must also be written to storage quickly. A single simulation produces a data set of about one terabyte; the time required for the calculations ranges from several days to an entire week.
What have the results produced so far?
One thing we could show was how different emission sources interact. Ship exhaust gas can react with ammonia emissions from agriculture and form fine particulate matter. We have also looked into the future with our simulations and calculated different scenarios. For example, starting in 2021, all newly constructed ships on the North and Baltic Seas will only be allowed to emit a quarter of the nitrogen oxide quantities they emit today. What does that mean for 2030? For 2040? Our simulations predict that nitrogen oxide pollution by 2040 will decrease by up to eighty percent, although we also assume that ships will use considerably less fuel. We don’t see as much improvement in 2030, as there will still be plenty of old ships traveling the seas to which this stricter limit doesn’t apply. If we want to see improvement more quickly, we need to consider outfitting older ships with catalytic converters.
Atmospheric Simulation Copyright: HZG/ KBT
And what plans do you have for the future?
For example, we would like to improve our model resolution from 4 × 4 kilometres to 1 × 1 kilometre. We could then resolve regions like the Hamburg Harbour in much greater detail. We simply need more computing power to do that. We also plan to use typical big data methods. To capture emissions from street traffic more precisely, we could use traffic information or data from Toll Collect. We could also take advantage of satellite data to determine which agricultural pollutants are emitted during different seasons.
Thank you for your time.
Prof Martin Müller heads the “German Engineering Materials Science Centre” (GEMS) at the Institute of Materials Research
Photo: HZG/ Christian Schmid
Mr Müller, GEMS holds a special position at the Helmholtz-Zentrum Geesthacht. What makes it special?
Our laboratories are not in Geesthacht. We instead operate several measurement instruments at two large facilities – PETRA III at DESY in Hamburg and FRM II in Garching bei München. The storage ring PETRA III produces extremely bright X-ray light, while the research reactor FRM II produces neutrons. Both are highly efficient tools for examining the inside of materials.
How do you contribute to the HZG materials researchers’ work?
Together with the institute division led by Prof Norbert Huber, we have looked into what precisely happens during laser welding. Using focused X-rays, we can observe how the laser beam welds the material and how the welding seam solidifies afterwards. Another example is that we have integrated a machine for friction stir welding at one of the measurement stations – a method developed substantially at the HZG. The PETRA III X-ray beam allows us to track how the process takes place with the utmost precision. Lately, we’ve also been working with Prof Regine Willumeit-Römer’s team to examine novel metal biomaterials: How does a magnesium bone screw degrade, especially under conditions present in the human body?
How much data has resulted from these experiments?
Just a single image can take up several gigabytes. If we, however, want to track a process, we need to take thousands of images in quick succession. That will quickly take up several terabytes – and with future detector generations, this will probably even be petabytes. The computer, data storage and data transmission requirements will grow accordingly. This also applies to evaluating the data – for example, in displaying 3D images. To master these challenges, we work closely with the DESY and FRM II computing centres.
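The volumes mentioned above can be checked with quick back-of-envelope arithmetic. The specific numbers below are assumptions chosen to match the order of magnitude in the interview ("several gigabytes" per image, "thousands of images").

```python
# Back-of-envelope check of the data volumes (illustrative, assumed numbers)
gb_per_image = 3          # "several gigabytes" per image (assumption)
images_per_series = 2000  # "thousands of images" in quick succession (assumption)

total_tb = gb_per_image * images_per_series / 1000
print(f"{total_tb:.0f} TB for one time series")  # several terabytes per experiment
```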
Automatic sample changer at a GEMS HZG beamline. Photo: HZG/ Christian Schmid
How quickly can the measurement data be displayed and how rapidly are they available to those conducting the experiments?
Our goal is to evaluate the data in real time, or at most with a delay of a few minutes. Fast data-reduction algorithms, for example, help us here. The advantage of this rapid evaluation is that the researchers can already see how the experiment is running during the measurements themselves. They can therefore recognize in time when a test is heading in the wrong direction. They can optimize their measurement strategy so that they can achieve their goal more quickly and with fewer experiments. This is already possible in some instances at GEMS. We want to develop this considerably in the future.
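One simple form of the data reduction mentioned above is block-averaging (binning) each detector frame before it is inspected. This is a minimal sketch under assumed frame sizes, not the actual GEMS pipeline:

```python
import numpy as np

def bin_frame(frame, factor=8):
    """Block-average a 2D frame by `factor` in each direction (8x8 binning
    shrinks the data volume 64-fold, enough for a live view to keep up)."""
    h, w = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

frame = np.random.rand(2048, 2048)       # stand-in for one detector image
reduced = bin_frame(frame)
print(reduced.shape)                     # (256, 256): 64x less data to move and view
```

The full-resolution frames still go to the archive; the reduced stream only feeds the live evaluation, so researchers can watch the experiment without waiting for the complete data set.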
What role can big data methods play here – for example, in automatic recognition of imaging data patterns?
Methods such as “machine learning” have a lot of potential. The vision is for algorithms to recognise particular patterns in the measurement data and then to be able to automatically inform us in what direction the experiment is headed. Face recognition systems from internet firms like Google have already proven that the technology works. For us, it’s now about transferring this to the scientific realm. The Helmholtz Association is helping us in this regard with the “Helmholtz Incubator Information and Data Science” initiative. Scientists from completely different fields of specialisation who are facing similar problems come together: How can we extract the right measurement strategy from large data quantities as quickly as possible?
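The idea of automatically flagging the direction an experiment is headed can be illustrated with one of the simplest learning rules, a nearest-centroid classifier. Everything here is synthetic and assumed; it only shows the shape of the approach, not a production system:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic, labelled example data: reduced feature vectors of past frames
good = rng.normal(0.0, 1.0, size=(100, 64))   # frames from "on track" runs
bad  = rng.normal(0.5, 1.0, size=(100, 64))   # frames from "drifting" runs

# "training": learn the mean signature of each experiment state
centroids = np.stack([good.mean(axis=0), bad.mean(axis=0)])

def classify(frame):
    """Nearest-centroid rule: 0 = on track, 1 = drifting."""
    return int(np.argmin(np.linalg.norm(centroids - frame, axis=1)))

new_frame = rng.normal(0.5, 1.0, size=64)     # an incoming frame during a run
print("drifting" if classify(new_frame) else "on track")
```

Real pattern recognition on imaging data would use richer models, but the workflow is the same: learn signatures from past measurements, then score each incoming frame fast enough to steer the running experiment.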
Thank you for your time.
The Helmholtz presidents introduced the “Helmholtz Incubator Information and Data Science” initiative to improve handling of large quantities of data (“big data”) in science. The incubator comprises thirty-six IT and data science experts from all Helmholtz centres and is supported by experts from industrial research.
Both interviews were conducted by science journalist Frank Grotelüschen.
Published in in2science #4 (June 2017)