by Ulrich Trottenberg and Han La Poutré

Many phenomena and processes in nature, science, technology and the economy are today modelled mathematically, and these models are used for control, prediction and optimization. Virtual models substitute for real systems, and simulation replaces costly, long-lasting and dangerous experiments. Problems that seemed utopian twenty years ago can now be treated on computers.

by Juan C. Vallejo and Miguel A. F. Sanjuán

All numerical calculations have inherent inaccuracies, and beyond certain timescales even the best method will diverge from the true orbit. The concept of shadowing time allows the reliability of a computer-generated orbit to be characterized. This indicator is being applied to some models of galactic potentials, where areas of high and low predictability mix in a fractal-like way.
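
A common way to make this precise is the notion of δ-shadowing (a standard definition, sketched here; the exact criterion used in the study may differ): a computed orbit $p_n$ is shadowed for $N$ steps if there is a true orbit $x_n$ of the system $f$ with

\[
x_{n+1} = f(x_n), \qquad \| p_n - x_n \| < \delta \quad \text{for } n = 0, 1, \dots, N .
\]

The shadowing time is the largest such $N$ (converted to physical time); beyond it, the computed orbit no longer stays close to any true trajectory.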

by Michal Haindl, Jiří Filip and Martin Hatka

Physically correct rendering or analysis of the visual appearance of material surfaces requires complex descriptive models capable of capturing a material's dependence on variable illumination and viewing conditions. While recent advances in computer hardware and virtual modelling finally allow the view and illumination dependencies of natural surface materials to be taken into account, this comes at the expense of an immense increase in the required number of material sample measurements. The introduction of fast compression, modelling and rendering methods for these visual measurements is therefore essential.
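
To illustrate why the measurement burden grows so quickly (a standard formulation, not necessarily the exact representation used in this work): the appearance of a flat material sample is often captured as a bidirectional texture function

\[
\mathrm{BTF}(x, y, \theta_i, \varphi_i, \theta_v, \varphi_v),
\]

a six-dimensional function of surface position $(x, y)$, illumination direction $(\theta_i, \varphi_i)$ and viewing direction $(\theta_v, \varphi_v)$. Sampling all six dimensions densely yields thousands of images per material, hence the need for compact models and fast compression.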

by Evangelia Flouri, Dimitrios Mitsoudis, Nektarios Chrysoulakis, Manolis Diamandakis, Vassilios A. Dougalis and Costas E. Synolakis

Tsunami waves (long waves) can be efficiently simulated by numerical models that solve the shallow water equations. In the context of several EU projects, FORTH-IACM has recently applied depth-averaged computational models of shallow water flow, with an emphasis on complex 3D domains. One of these projects was TRANSFER (Tsunami Risk ANd Strategies For the European Region).
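
For reference, the depth-averaged shallow water equations in one horizontal dimension take the form (a standard statement; the models referred to here solve two-dimensional and further refined variants):

\[
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0, \qquad
\frac{\partial (hu)}{\partial t} + \frac{\partial}{\partial x}\!\left(h u^2 + \tfrac{1}{2} g h^2\right) = -\,g h \frac{\partial b}{\partial x},
\]

where $h$ is the water depth, $u$ the depth-averaged velocity, $g$ the gravitational acceleration and $b$ the bottom topography (bathymetry).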

by Petr Pecha and Radek Hofman

Potential failures in man-made processes can result in the accidental release of harmful substances into the environment. Risk evaluation and decision-making focused on protecting the population then have the highest priority. Historically, accidents in nuclear facilities have revealed a lack of sufficiently advanced decision support software tools, and great attention has been paid to this topic since the Chernobyl disaster. The software system HARP (HAzardous Radioactivity Propagation) is designed for the fast assessment of the radiological consequences of such a release of radionuclides into the environment.
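
Fast consequence-assessment tools of this kind typically build on (segmented) Gaussian plume dispersion models; a minimal sketch of the classical plume formula, not necessarily the exact parameterization used in HARP:

\[
C(x, y, z) = \frac{Q}{2 \pi u \, \sigma_y \sigma_z}
\exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
\left[\exp\!\left(-\frac{(z - H)^2}{2\sigma_z^2}\right) + \exp\!\left(-\frac{(z + H)^2}{2\sigma_z^2}\right)\right],
\]

where $C$ is the time-averaged concentration, $Q$ the release rate, $u$ the wind speed, $H$ the effective release height, and $\sigma_y(x)$, $\sigma_z(x)$ dispersion parameters that grow with downwind distance; radioactive decay and deposition enter as further correction factors.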

by Stephen O’Sullivan and Turlough Downes

Many of the stars in our universe form inside vast clouds of magnetized gas known as plasma. The complexity of these clouds is such that astrophysicists wishing to run simulations could comfortably use hundreds of thousands of processors on the most powerful supercomputers. In the past, a serious obstacle to capitalizing on such computational power has been that the methods available for solving the necessary equations were poorly suited to implementation on massively parallel supercomputers.

by Emanuele Salerno

A research team at the Signal and Image Processing Lab of ISTI-CNR has been involved in studying data analysis algorithms for the European Space Agency’s Planck Surveyor Satellite since 1999. The huge amount of data on the cosmic microwave background radiation provided by the Planck sensors requires very efficient analysis algorithms and high-performance computing facilities. The CNR group has proposed some of the source separation procedures that are now operational at the Planck data processing centre in Trieste, Italy.
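
The core separation problem can be illustrated with the standard linear mixing model (a toy Python sketch, not the Planck pipeline; the mixing matrix, component shapes and noise level below are invented): each observed channel is a weighted sum of physical components plus noise, and separation amounts to inverting that mixture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "sky components" observed in three frequency channels (toy 1-D signals).
n = 1000
t = np.linspace(0, 1, n)
cmb = np.sin(2 * np.pi * 7 * t)            # stand-in for the CMB component
dust = np.exp(-((t - 0.5) ** 2) / 0.01)    # stand-in for a foreground component
sources = np.vstack([cmb, dust])           # shape (2, n)

# Hypothetical mixing matrix: rows = frequency channels, columns = components.
A = np.array([[1.0, 0.3],
              [1.0, 1.0],
              [1.0, 2.5]])
noise = 0.05 * rng.standard_normal((3, n))
observations = A @ sources + noise         # what the instruments would "see"

# With a known (or previously estimated) mixing matrix, separation reduces
# to a least-squares inversion of the mixture.
recovered = np.linalg.pinv(A) @ observations

for name, true, est in zip(["CMB", "foreground"], sources, recovered):
    corr = np.corrcoef(true, est)[0, 1]
    print(f"{name}: correlation between true and recovered component = {corr:.3f}")
```

In the real problem the mixing matrix is not known in advance and must be estimated blindly from the data, which is precisely what the proposed source separation procedures address.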

by Alexandros Roniotis, Kostas Marias and Vangelis Sakkalis

One of the major aims of the ContraCancrum Project is to develop a composite multilevel platform for simulating glioma development as well as tumour and normal tissue response to therapeutic modalities and treatment schedules. By efficiently predicting the evolution of a tumour and how this alters with different therapeutic schemes, clinicians could optimize the disease treatment procedure in the patient's individualized context.
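
Glioma evolution is commonly modelled with a diffusion-proliferation (reaction-diffusion) equation of the following form (a standard formulation, sketched here; the project's simulators add terms for therapy and tissue heterogeneity):

\[
\frac{\partial c}{\partial t} = \nabla \cdot \big( D(\mathbf{x}) \nabla c \big) + \rho \, c \left( 1 - \frac{c}{c_{\max}} \right),
\]

where $c(\mathbf{x}, t)$ is the tumour cell density, $D(\mathbf{x})$ a diffusion coefficient that differs between white and grey matter, $\rho$ the proliferation rate and $c_{\max}$ the carrying capacity of the tissue.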

by Karl N. Kirschner, Axel Arnold, and Astrid Maaß

Multiscale modelling requires the transfer of knowledge gained at different resolutions. Through the use of an expert-driven workflow we have developed reliable pathways for transferring information from quantum mechanics to atomistic and coarse-grained simulations.

by Pierluigi Contucci, Cristian Giardinà, Claudio Giberti and Cecilia Vernia

Real-world phenomena are often described by complex systems with competitive and cooperative behaviour. Such systems, like the phenomena they describe, are hard to understand from a scientific perspective, mainly due to the lack of general exact solutions. In such cases, the computational sciences provide a very useful virtual laboratory. Disordered systems are an example in which scientific computing techniques are used to test theoretical predictions and to uncover new phenomena that remain unreachable by traditional analytical methods.
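
A minimal example of such a virtual laboratory (a toy Python sketch, not the authors' code): Metropolis Monte Carlo sampling of a small disordered mean-field spin system, in which random couplings encode competing cooperative and antagonistic interactions.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 64                                   # number of spins
beta = 1.5                               # inverse temperature
# Symmetric random couplings: positive entries cooperate, negative ones compete.
J = rng.standard_normal((N, N)) / np.sqrt(N)
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)

spins = rng.choice([-1, 1], size=N)

# Metropolis sweeps: flip one spin at a time, always accepting moves that
# lower the energy and uphill moves with probability exp(-beta * dE).
for sweep in range(1000):
    for _ in range(N):
        i = rng.integers(N)
        dE = 2 * spins[i] * (J[i] @ spins)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i] = -spins[i]

energy = -0.5 * spins @ J @ spins
print(f"energy per spin after sampling: {energy / N:.3f}")
```

Averaging such runs over many realizations of the random couplings is what allows theoretical predictions for disordered systems to be tested numerically.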

by Marco Hülsmann, Thorsten Köddermann, and Dirk Reith

The Fraunhofer Institute for Algorithms and Scientific Computing (SCAI) has developed a software tool for the automated parameterization of force fields for molecular simulations using efficient gradient-based algorithms. This tool, combined with well-established simulation techniques, can quantitatively determine many physicochemical properties for given compounds.
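
The idea can be sketched as follows (a hypothetical toy example, not the SCAI tool itself: the "simulator", target values and parameter names are invented). Force-field parameters are adjusted so that simulated properties match experimental reference values, by minimizing a weighted least-squares loss with a gradient-based method.

```python
import numpy as np

# Hypothetical stand-in for a molecular simulation: maps force-field
# parameters (epsilon, sigma) to two observable properties.  A real tool
# would run an expensive, noisy MD simulation here.
def simulate_properties(params):
    epsilon, sigma = params
    density = 800.0 * epsilon / sigma            # invented relationship
    vap_enthalpy = 30.0 * epsilon + 2.0 * sigma  # invented relationship
    return np.array([density, vap_enthalpy])

targets = np.array([750.0, 38.0])                # invented "experimental" values
weights = np.array([1.0, 1.0])

def loss(params):
    rel_err = (simulate_properties(params) - targets) / targets
    return float(np.sum(weights * rel_err ** 2))

def numerical_gradient(params, h=1e-5):
    grad = np.zeros_like(params)
    for k in range(len(params)):
        step = np.zeros_like(params)
        step[k] = h
        grad[k] = (loss(params + step) - loss(params - step)) / (2 * h)
    return grad

# Plain gradient descent; the actual tool uses more robust gradient-based
# optimizers suited to noisy simulation output.
params = np.array([1.0, 1.0])
for _ in range(200):
    params -= 0.5 * numerical_gradient(params)

print("optimized parameters:", params, " loss:", loss(params))
```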

by Klaus Wolf and Pascal Bayrasy

The Fraunhofer Institute SCAI has developed an application-independent interface for the coupling of different simulation codes, known as MpCCI (Mesh-based parallel Code Coupling Interface). The MpCCI interface has been accepted as a de facto standard for a neutral, vendor-independent coupling interface. Currently MpCCI supports Abaqus (© Simulia), Ansys (© Ansys Inc), Flowmaster (© Flowmaster Ltd), Fluent and Icepak (© Ansys Inc), FineHexa and FineTurbo (© Numeca Intl), Flux3D (© Cedrat SA), MD.Nastran and MSC.Marc (© MSC Software Corp), Permas (© Intes GmbH), STAR-CD and STAR-CCM (© CD adapco), and RadTherm (© TAI). An open programming interface has been widely used to adapt customers' in-house codes as well as public research codes to MpCCI, allowing them to be coupled with the codes MpCCI already supports.

by Daan Crommelin and Jason Frank

The multiscale character of many natural systems poses a major challenge for computational studies. Often the variables of interest are macroscopic: researchers care about phenomena on large spatial scales and long timescales, not about the details of the microscopic (small-scale) behaviour. However, if the small scales do not merely perturb the large scales but fundamentally alter their behaviour, it becomes impossible to simulate macroscopic behaviour without taking into account microscale influences. Resolving the microscales explicitly usually requires such high model resolution that it becomes computationally infeasible to do sufficiently long simulations of the macroscopic behaviour (or even to simulate the macroscales at all). One approach to this problem is to use stochastic methods to represent the small scales, thereby making the macroscale simulations feasible.
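
A toy illustration of the idea (a hypothetical sketch, not a specific model from this research): a slow variable is driven by a fast unresolved term, and instead of resolving that term explicitly it is replaced by a cheap stochastic surrogate, here an Ornstein-Uhlenbeck-like process matching the fast dynamics' relaxation rate and noise level.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n_steps = 0.001, 50_000

# "Truth": slow variable x forced by a fast variable y that relaxes quickly
# towards -x and fluctuates (timescale separation eps << 1).
eps = 0.01
x, y = 1.0, 0.0
xs_truth = np.empty(n_steps)
for k in range(n_steps):
    x += dt * (-x + y)
    y += (dt / eps) * (-(y + x)) + 0.5 * np.sqrt(dt / eps) * rng.standard_normal()
    xs_truth[k] = x

# Reduced model: replace y by its mean effect (approximately -x) plus a
# stochastic surrogate eta with the assumed relaxation rate and noise amplitude.
x_red, eta = 1.0, 0.0
xs_reduced = np.empty(n_steps)
for k in range(n_steps):
    x_red += dt * (-x_red + (-x_red + eta))
    eta += dt * (-eta / eps) + 0.5 * np.sqrt(dt / eps) * rng.standard_normal()
    xs_reduced[k] = x_red

print(f"truth:   mean={xs_truth.mean():.3f}  std={xs_truth.std():.3f}")
print(f"reduced: mean={xs_reduced.mean():.3f}  std={xs_reduced.std():.3f}")
```

The reduced model never integrates the fast variable at its own cost structure in a realistic setting; it only needs statistics of the fast dynamics, which is what makes long macroscale simulations affordable.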

by Tanja Clees and Daniela Steffes-lai

Among the results of the Fraunhofer project CAROD (Computer-Aided Robust Design) is a novel strategy for the statistical analysis and multi-objective robust design-parameter optimization of chains of production processes. This strategy, PRO-CHAIN, is built upon several software tools that allow for an efficient sensitivity, stability and robustness analysis, even for simulation results on highly resolved grids. Within CAROD, concrete results have been obtained for a highly crash-relevant part of a car. In this case, the strategy also includes new material and damage models and comprises both physical experiments and numerical simulations.

by Ismael Marín Carrión, Julio José Águila Guerrero, Enrique Arias Antúnez, María del Mar Artigao Castillo and Juan José Miralles Canals

Physicists and computer scientists from the University of Castilla-La Mancha are currently performing transdisciplinary work on nonlinear time series analysis. This research will analyse some important properties of time series, and will also provide a set of high-performance algorithms that allow this analysis to be performed in a reasonable time, especially in real applications such as biomedicine or climate science in which real-time responses are required.
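
Typical building blocks of such analyses are time-delay embedding and the correlation sum used to estimate invariants such as the correlation dimension; a minimal Python sketch (illustrative only, with brute-force distance computation rather than the optimized high-performance algorithms developed by the group):

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Reconstruct state vectors [x(t), x(t + tau), ..., x(t + (dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

def correlation_sum(vectors, radius):
    """Fraction of pairs of reconstructed states closer than 'radius'."""
    diffs = vectors[:, None, :] - vectors[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(axis=-1))
    n = len(vectors)
    close = (dist < radius).sum() - n          # remove self-pairs
    return close / (n * (n - 1))

# Toy data: a noisy periodic signal standing in for a measured time series.
rng = np.random.default_rng(3)
t = np.arange(2000) * 0.05
x = np.sin(t) + 0.05 * rng.standard_normal(len(t))

emb = delay_embed(x, dim=3, tau=10)[:500]      # keep the brute-force step small
for r in (0.1, 0.2, 0.4):
    print(f"C(r={r}) = {correlation_sum(emb, r):.4f}")
```

The pairwise-distance step is the computational bottleneck for long series, which is why parallel, high-performance implementations are needed for real-time applications.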

by Antonio Frangioni and Luis Perez Sanchez

The I-DARE system aims to help practitioners bridge the gap between mathematical models cast in their natural form and the myriad of available specialized solvers capable of exploiting the valuable, but possibly hidden, structures in the model. It does this by automating the search for the best combination of (re)formulation, algorithm and parameters (including the computational architecture), a task that until now has been firmly the domain of human intervention.

by Abder Aggoun, Nicolas Beldiceanu, Mats Carlsson and François Fages

Packing items in bins is an old but nevertheless challenging combinatorial problem with numerous applications in industry. We report on an original approach based on constraint programming and rule-based modelling, which has been investigated in the framework of the FP6 ‘specific targeted research project’ Net-WMS (Towards integrating virtual reality and optimization techniques in a new generation of Networked businesses in Warehouse Management Systems under constraints). It has applications in the automotive industry.
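
As a point of reference for the problem itself (not the Net-WMS approach, which relies on constraint programming and rule-based modelling), here is the classic first-fit decreasing heuristic on a one-dimensional instance with invented item sizes:

```python
def first_fit_decreasing(items, capacity):
    """Pack item sizes into bins of the given capacity: place each item,
    largest first, into the first bin where it still fits."""
    bins = []                         # each bin is a list of item sizes
    for size in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])       # no existing bin fits: open a new one
    return bins

items = [7, 5, 5, 4, 3, 3, 2, 2, 1]   # invented item sizes
for i, b in enumerate(first_fit_decreasing(items, capacity=10), start=1):
    print(f"bin {i}: {b} (used {sum(b)}/10)")
```

Real warehouse instances are three-dimensional and carry many industrial side constraints, which is where the combination of constraint programming and packing rules becomes attractive.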

by Alina Sîrbu, Heather J. Ruskin and Martin Crane

Integration of large amounts of experimental data and previous knowledge is recognized as the next step in enhancing biological pathway discovery. Here, data integration for quantitative regulatory network modelling is under investigation, using evolutionary computation and high-performance computing.
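
A minimal illustration of the evolutionary ingredient (a toy Python sketch with an invented target network and fitness; real regulatory-network inference evolves richer model structures and parameters against integrated experimental data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented "ground truth" network: expression x(t+1) = tanh(W_true @ x(t)).
n_genes, n_steps = 5, 20
W_true = 0.5 * rng.standard_normal((n_genes, n_genes))
x0 = rng.random(n_genes)

def simulate(W):
    x, traj = x0.copy(), []
    for _ in range(n_steps):
        x = np.tanh(W @ x)
        traj.append(x)
    return np.array(traj)

target = simulate(W_true)                         # plays the role of measured data

def fitness(W):
    return -np.mean((simulate(W) - target) ** 2)  # higher is better

# Simple (mu + lambda) evolutionary loop: mutate candidate interaction
# matrices, evaluate them against the data, keep the best.
mu, lam, sigma = 10, 40, 0.1
population = [0.5 * rng.standard_normal((n_genes, n_genes)) for _ in range(mu)]
for generation in range(200):
    parents = [population[i] for i in rng.integers(0, mu, size=lam)]
    offspring = [p + sigma * rng.standard_normal(p.shape) for p in parents]
    population = sorted(population + offspring, key=fitness, reverse=True)[:mu]

print(f"best fitness (negative mean squared error): {fitness(population[0]):.5f}")
```

Evaluating thousands of candidate networks against large integrated datasets is what makes high-performance computing an essential part of the approach.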

by Magnus Jahre and Lasse Natvig

The computer architecture group at the Norwegian University of Science and Technology (NTNU) in Trondheim, Norway, is working on issues that are arising as increasing numbers of processors are integrated on a single chip. Discrete event simulators and high-performance computers are indispensable tools in this quest. By combining the cutting-edge multi-core simulator M5 from the University of Michigan with the 5632-core Stallo cluster at the University of Tromsø, researchers are making progress on the issues facing future multi-core architectures.

by Massimo Cossentino, Vincent Hilaire and Abder Koukam

Nobel Laureate Herbert Simon observed: “Empirically a large proportion of the complex systems we observe in nature exhibit hierarchic structure.” Starting from this assertion, we developed a novel approach for designing and implementing complex systems that combines the multi-agent approach with the holonic perspective on social organization.

by Chris Greenough, Shawn Chin, David Worth, Simon Coakley, Mike Holcombe and Mariam Kiran

The Flexible Large-scale Agent Modelling Environment (FLAME) has been developed in a collaboration between the Computer Science Department at the University of Sheffield and the Software Engineering Group at the STFC Rutherford Appleton Laboratory. FLAME is an applications program generator for agent-based simulations. From the modeller's definition of the agent-based model and the associated C code that implements the agents' actions and state changes, FLAME uses program templates to generate the user's application as either a serial or a parallel code.

by Anke Hutzschenreuter, Peter Bosman and Han La Poutré

With the ageing of the population and the demand for cost-efficiency, logistics and planning in hospitals are becoming increasingly important. In many countries, hospitals are organized in a decentralized fashion, with (medical) departments and units having a high degree of autonomy in management and planning. The Agi-Care project (Agent-based intelligent health care planning) develops computational approaches to the optimization of patient flow logistics in hospitals, i.e. the various pathways along which inpatients move through the units of a hospital. The project has been carried out at Eindhoven University of Technology, in cooperation with the Catharina Ziekenhuis Eindhoven and Centrum Wiskunde & Informatica (CWI, the Dutch national research centre for mathematics and computer science), in the Netherlands.

by Gerhard Chroust, Karin Rainer and Markus Roth

In responding to the growing need to be prepared for chemical, biological, radiological and nuclear (CBRN) emergencies, First Responders (i.e. fire brigades, emergency medical services and police) must quickly evaluate such incidents and take appropriate actions to minimize negative effects on humans and goods. Achieving such interventions poses a critical challenge, since humans do not possess any inborn, natural sensors with which to recognize these dangers early enough. Additionally they are not equipped with natural, semi-autonomous reaction patterns. Nevertheless, it is of the utmost importance to avoid endangering the First Responders. This requires special training and adequate tools, especially since a considerable proportion of First Responders are volunteers providing part-time services.
