Vienna, Austria, 12-14 October 2015
ECSS 2015 is the 11th Summit of Informatics Europe. This major event, held in Vienna, Austria, on 12-14 October 2015, is designed as the meeting place for anyone interested in issues of research, education, and policy in Informatics. The central topic of the 2015 Summit is “Informatics in the future – in the year 2025”.
In view of closer cooperation between ERCIM and Informatics Europe, we present the keynotes of this event, of undoubted interest to the ERCIM community and the ERCIM News readership.
The background of ECSS 2015 is the idea that informatics, as the science behind Information Technology, has two faces:
- informatics “in itself”, e.g., algorithm design, information presentation, programming languages, distribution aspects, complexity issues;
- informatics “for others” and “behind others”, as a tool or methodological approach in other sciences and application fields. Informatics is pervasive and changes the world, through its artifacts as well as through its vision.
Informatics is interdisciplinary almost by nature, combining engineering, formal methods (logic, mathematics) and approaches based on the natural sciences. The conference will discuss and reflect on research issues and methods with a ten-year perspective.
ECSS is a unique opportunity to meet some of the leading decision makers in informatics research and education, and discuss the critical issues of the discipline. The Summit is devoted to strategic issues and trends regarding all aspects of informatics: education, research, funding, entrepreneurship, management, career development, and policies.
Traditionally, ECSS conferences are organized by Informatics Europe in collaboration with a host institution, an active member of the association. ECSS 2015 is hosted by the Faculty of Informatics, Vienna University of Technology (TU Wien).
ECSS 2015 is co-chaired by Carlo Ghezzi, President of Informatics Europe, and Gerald Steinhardt, Dean of the Faculty of Informatics, TU Wien. The program chairs are Hannes Werthner, professor at the Faculty of Informatics, TU Wien, and Frank van Harmelen, Department of Computer Science & The Network Institute, VU University Amsterdam.
ECSS 2015 Keynotes
Ebola, Pandemic Influenza, MERS & SARS – How Computational Models Reveal the Hidden Geometry of Global Contagion Phenomena
The last decade has witnessed the emergence and global spread of new, often highly contagious and virulent pathogens that travel across the globe in a matter of weeks or months. Emergent infectious diseases have not only become a key threat to global public health, but also carry the potential of triggering major economic crises. Understanding and predicting the geographic spread of emergent infectious diseases has become a major challenge for epidemiologists, public health organizations and policy makers. Large-scale computer simulations that combine methods from statistical physics, complex network theory and dynamical systems theory have become a key tool in this context.
Dirk Brockmann is professor at the Institute for Theoretical Biology at Humboldt Universität zu Berlin. He also leads the research group at the Robert Koch-Institut Berlin, Germany’s federal Public Health institute.
On the Big Impact of Big Computer Science
Big science is bringing unprecedented progress in many fundamental fields, such as biology and medicine. While this progress cannot be questioned, when looking at the foundations and models of big science, one wonders whether this new approach is in conflict with critical thinking and model-driven scientific methods, which have shaped higher education in science, including computer science, for decades. Computer science education is changing under the impact of big science, in some cases for better, in others for worse, and the question seems to be whether academia is a good fit for data scientists. New models are needed for interdisciplinary education.
Stefano Ceri is Professor at Politecnico di Milano. He is currently leading the PRIN project GenData 2020 on genomic computing. He is the recipient of the ACM-SIGMOD “Edward T. Codd Innovation Award” (2013), an ACM Fellow and member of the Academia Europaea.
Ethics and ICT: Learning to Design for Moral Values
ICT is a formidable shaping force in society. We need to treat it as such. This implies among other things that we need to shape it to express and accommodate our shared moral values and ethical considerations (e.g. regarding privacy, autonomy, responsibility, transparency, democracy, equality, social justice, safety, etc.). The more central big data, Internet (of everything), mobile and cloud computing, social media are becoming in our society, the more urgent the need is to educate the next generation of computer scientists to appreciate this and to help ourselves to the methodologies, tools and conceptual frameworks that support us in shaping our future and destiny by means of responsible innovations in ICT.
Jeroen van den Hoven is full professor of Ethics and Technology at Delft University of Technology and Founding Editor-in-Chief of Ethics and Information Technology (Springer). For his work in ethics and ICT he won the World Technology Award for Ethics in 2009, and, in the same year, the IFIP prize for ICT and Society.
Interdisciplinarity in Robotics and ICT
Robots are cyber-physical systems interacting with the physical world through their sensors and actuators. Robotics is an inherently interdisciplinary area comprising engineering fields, such as mechanical and control engineering, as well as computer engineering and computer science, but possibly also other disciplines, such as biology or cognitive science. There are challenges in the education, research and development of robots in the context of current technology trends: the further merging of the virtual and physical worlds, the drive towards greater autonomy, and the rise of consumer robotics.
Maarja Kruusmaa is a professor of Biorobotics and the head of the Centre for Biorobotics at Tallinn University of Technology. She is also a co-founder of Fits.me, a company using robotics technology in a novel way, and is involved in ICT policy making through several advisory bodies, such as the EU DG Connect Advisory Board.
Ethics of Computing
Recommending ethical principles is risky business: the basis for the recommendation should be universal, yet people differ in their assumptions; and the recommendations should be credible, yet no one is beyond question. Prudence therefore suggests keeping the general ideas justifying the ethical advice to a minimum; even so, there are general ethical guidelines that can help progress in computing.
Bertrand Meyer is an entrepreneur, author and academic specializing in software engineering. He created the Eiffel programming language and the idea of design by contract. Since 2001 he has been Professor of Software Engineering at ETH Zurich.
Leadership and Balance in Research
Successful leadership of a large research group (approx. 50 people) requires clear philosophical fundamentals shared by all members of the team. This includes maintaining a common vision and high enthusiasm for achieving results (the no-nonsense rule). In order to be sustainable in the long term, we have to maintain the flow of: (a) knowledge and experience, (b) a social network of partners, and (c) constant funding. The organization of the team should preferably be flat (but not too flat), with well-defined roles, but also as fluid as possible (the no-rigidity rule), facilitating personal and group progress. One of the fundamentals is to develop trust between people and maintain good human relationships within the team (the no-fighting rule).
Dunja Mladenić works as a researcher and project manager at the J. Stefan Institute, the leading Artificial Intelligence laboratory, and teaches at the J. Stefan International Postgraduate School, the University of Ljubljana and the University of Primorska.
ICT-Innovation – How Digital Sovereignty and IT-Security Can Help to Push Europe Forward
With the Digital Single Market and the supporting programs H2020, CEF, ISA2 and others, Europe is making a big effort to shape up its ICT. This is also supported by legislation in which IT security plays a prominent role. The eIDAS regulation is not only an example of a seamless legal framework for all 28 member states; it is also a unique chance for Europe to show its ICT strength. The open and innovative approach needs to attract European industry as a provider, and businesses as major enablers. IT security and data protection need to enable digital sovereignty. These are fields where Europe has developed renowned expertise in the past and has the potential to develop further strength in the future.
Reinhard Posch has been professor at Graz University of Technology since 1984. Since 1999 he has also been Scientific Director of the eSignature confirmation body “Austrian Secure Information Technology Center” (A-SIT), and since 2001 Federal Chief Information Officer for the Austrian Government.
Ada Countess of Lovelace, a One-Person Opera, and The Role of Women in Computing
Ada, Countess of Lovelace, was the first “programmer” of a mechanical computer, Charles Babbage’s “Analytical Engine”, in a sense comparable to today’s programming practice. She was the daughter of Lord Byron and Lady Milbanke and, like her mother and her mentor Mary Somerville, was deeply interested in science and mathematics. She welcomed all the technical innovations of her age: the first railways and telegraphs, and the discovery of the role of electromagnetism. She translated a French account of a lecture given by Charles Babbage and extended his ideas into a program to compute the Bernoulli numbers, specifying the elementary operations on punched cards and the structure of their ordering, in effect an early form of today’s flow diagrams. In the first part of this lecture, some of Ada’s ideas are illustrated through the presentation of a one-person opera on her life. The second part of the talk highlights the role of women in the history of computing, programming and computer science. Currently, the number of women participating in computer science studies is gradually decreasing in Western countries, while this is not the case in the rest of the world. Reasons for these differences will be discussed.
Britta Schinzel was professor of theoretical computer science at RWTH Aachen and worked in several areas of Artificial Intelligence, in interdisciplinary cooperation with medicine, biology, sociology and other fields. Since 1991 she has been professor at the Institute for Computer Science and Social Research at the University of Freiburg.
A Computational Paradigm of Science and its Discontents
A number of theoretical and technical innovations in the 1930s and 1940s led to a new era of computing, and computing started to develop as an independent academic discipline. Some pioneers of computing emphasized the theoretical elements of science, advocating a mathematical view of computing as a discipline. Others distanced computing from the natural sciences and championed the academic legitimacy of the sciences of the artificial. When debates about experimental computer science emerged, many meta-studies compared research in computing with the natural sciences and engineering, condemning computing as methodologically deficient. But in the new century, the success of computing in many scientific applications made computing, in the minds of many, a “paradigm” for other sciences: computing can learn from nature, it might be the best tool for studying natural phenomena, or it might actually be what nature does. The journey of computing is that of a young field struggling for the legitimacy of the vision that computing might be not only “a” science but “the” science.
Matti Tedre is the author of “The Science of Computing: Shaping a Discipline” (Taylor & Francis, 2014). He works as associate professor at Stockholm University, Department of Computer and Systems Sciences. Tedre was professor and head of IT program at Tumaini University, Tanzania, an adjunct professor of computer science at the University of Eastern Finland, and adjunct professor of Informatics and Design at Cape Peninsula University of Technology, South Africa.
From Model-Driven Computer Science to Data-Driven Computer Science and Back
Computer science seems to be undergoing a paradigm shift. Much of earlier research was conducted in the framework of well-understood formal models. In contrast, some of the hottest trends today shun formal models and rely on massive data sets and machine learning. A canonical example of this change is the shift in AI from logic programming to deep learning. Two further examples of this trend are relational vs. graph databases and formal vs. dynamic verification. However, in each case the data-driven approach does not replace the formal-model approach; rather, the data-driven approach is supported by the formal-model approach.
Moshe Vardi is the George Distinguished Service Professor in Computational Engineering and Director of the Ken Kennedy Institute for Information Technology at Rice University. He is a member of the US National Academy of Engineering and National Academy of Sciences, the American Academy of Arts and Sciences, the European Academy of Sciences, and Academia Europaea. He holds honorary doctorates from Saarland University in Germany and Orleans University in France. He is the Editor-in-Chief of the Communications of the ACM.