Progress toward Exascale computing

Mike Vildibill1
1Vice President, Exascale Development, Federal Programs & HPC Storage groups, Hewlett Packard Enterprise

 

The U.S. Department of Energy has selected HPE to rethink the fundamental architecture of supercomputers and deliver the technology blueprint that will make Exascale computing a commercial reality. HPE brings a clear vision, strategy and execution capability as an industry innovator with deep expertise in Memory-Driven Computing, VLSI, photonics, non-volatile memory, software and systems design. HPE is now in a race to the future to deliver Exascale performance with breakthrough reductions in energy consumption by 2023. Once operational, these systems, based on ‘The Machine’ technologies, will help our customers accelerate research, education and development.

PRAGMA — What does it take to really enable international collaboration for long-tail science communities?

Dr Philip Papadopoulos1,2
1Steering Committee Member, PRAGMA,
2Program Director, UC Computing Systems, San Diego Supercomputer Center, UCSD

The Pacific Rim Applications and Grid Middleware Assembly (PRAGMA) is a grass-roots consortium of more than 20 Pacific Rim institutions. Its activities focus on developing and deploying practical cyberinfrastructure to assist lab-to-lab collaboration for long-tail science communities. Science focus areas include biodiversity, fresh water ecology, software-defined networking, telescience, education, biosciences, and geosciences. PRAGMA defines scientific expeditions in which domain scientists and cyberinfrastructure specialists work together over long periods of time to develop solutions that meet the needs of the science but can also be applied more generally. Worked examples will be presented, such as bringing high-throughput computing to the R desktop of fresh water ecologists, which has dramatically simplified access to lake model simulations.

This talk will survey the various technology components that PRAGMA has used, evaluated, developed, and, where appropriate, discarded. The long time arc (15+ years) of PRAGMA affords a unique perspective on the promises, the near-misses, and the successes in the space where technology meets international collaboration.

Stories that Data Tells And Beyond: Successes and Challenges of a Data Science Practitioner

Dr Shonali Krishnaswamy1

1Chief Technology Officer, Artificial Intelligence Driven Analytics (AiDA)

 

This talk will present case studies of how data science and machine learning are transforming organisations across sectors including Telecommunications, Transportation, Financial Services and Healthcare. The case studies will cover a spectrum of challenges, such as extracting deep insights from sparse and not-so-rich data, innovating through smart applications of current methods, and creating new techniques and approaches. As the practice of data science matures and real-world implementations of machine learning become widespread, key challenges must be addressed for the discipline and technology to scale. The talk will discuss the very real challenges of “solutionising” machine learning, and the difficulty of moving from the point solutions that are the norm today to re-usable solutions in the future. The talk will also present key research challenges that need to be addressed to enable the next generation of machine learning systems that “learn to learn” from data and beyond – including background knowledge, user feedback, and observations.

Software in Research: Underappreciated and Underrewarded

Daniel S. Katz1
1Assistant Director for Scientific Software and Applications, National Center for Supercomputing Applications (NCSA)

 

Software has become omnipresent in research as the research process has become increasingly digital, ranging from system software, middleware, and libraries, to science gateways and web portals, to computational modeling and data analysis applications. In fact, research is becoming dependent on maintaining and advancing a wide variety of software. However, software development, production, and maintenance are people-intensive; software lifetimes are long compared with hardware; and the value of software is often underappreciated. Additionally, because software is not a one-time effort, it must be sustained, meaning that it must be continually updated to work in changing environments and to solve changing problems.

Thus, a challenge to the research community is how to sustain software. Tied to this challenge is the fact that in academia, research software may be developed by faculty members, students, postdocs, and staff members, none of whom are generally measured on their software contributions.  Can we improve this situation, perhaps by recognizing that developing software can be a creative research process, and that software can be a research output?

While software is increasingly important to research, the sociocultural system under which research is performed has not changed as quickly. This talk will discuss the current state of research software, a number of the challenges around it that exist in academia, and some potential solutions. For example, led by highly motivated research software creators, new career paths and new models for software credit are now developing, with the potential to promote and reward academic research software development.

Managing Scientific Workflows in Heterogeneous Environments

Professor Ewa Deelman1, 2
1Research Professor, University of Southern California – Computer Science Department
2Research Director, The USC Information Sciences Institute (ISI)

Modern science often requires the processing and analysis of vast amounts of data in search of postulated phenomena, and the validation of core principles through the simulation of complex system behaviors and interactions. This is the case in fields such as astronomy, bioinformatics, physics, and climate and ocean modelling. In order to support the computational and data needs of today’s science, new knowledge must be gained on how to deliver the growing high-performance and distributed computing resources to the scientist’s desktop in an accessible, reliable and scalable way.

In over a decade of working with domain scientists, the Pegasus project has developed tools and techniques that automate the computational processes used in data- and compute-intensive research. Among them is the Pegasus scientific workflow management system, which researchers are using to discover gravitational waves, model seismic wave propagation, discover new celestial objects, study RNA critical to human brain development, and investigate other important research questions.

This talk will examine data-intensive workflow-based applications and their characteristics, the execution environments that scientists use for their work, and the challenges that these applications face.  The talk will also discuss the Pegasus Workflow Management System and how it approaches the execution of data-intensive workflows in distributed heterogeneous environments.  The latter include HPC systems, HTCondor pools, and clouds.

The best science for the world ‐ not only the best science in the world

Professor Marina Jirotka1
1Professor of Human Centred Computing – Computer Science, University of Oxford

 

Whether it is the discovery of DNA‐based diagnoses of diseases that cannot be cured, the future of a person’s identity in an interconnected world or the consequences of technical developments on the global climate, scientists and researchers can no longer afford to ignore the societal context of their research. An approach is needed to open up debate about the potentially rapid changes brought about by technology and the ethical and societal impacts these changes might have. The approach has to allow ICT practitioners to identify and address issues early in the development process whilst also being practical and manageable.

In recent years, the concept of Responsible Research and Innovation (RRI) has been used to anticipate and respond to the consequences of research and innovation more broadly, by reflecting on whether the processes and products of research and innovation are acceptable and socially desirable.

In this talk, I will discuss ways in which e‐Research professionals might engage in RRI. I will illustrate this with some example problems and controversies that have emerged, and show some practical methods that have been used to engage participants in discussing the implications of new technologies. These include novel approaches such as the use of animations, ‘ethicons’ and observatories that both inform multiple stakeholders of the issues at hand and allow for the sharing of experiences and solutions.

Creativity in Digital Scholarship

Professor David De Roure1
1Professor of e-Research, University of Oxford, Director of the Oxford e-Research Centre

 

Experimental methods have been used in research since at least the seventeenth century. In recent years, e-Research has given us new computational approaches and enabled in silico experimentation, and our community has been responsible for innovating and studying these new methods. In this talk we look at the trajectory of the experimental approach in the face of both increasing citizen engagement and increasing automation, and demonstrate the emerging practice of “experimental humanities”. Ultimately this is about the role of the human in the future of research.

About the conference

eResearch Australasia provides opportunities for delegates to engage, connect, and share their ideas and exemplars concerning new information-centric research capabilities, and how information and communication technologies help researchers to collaborate, collect, manage, share, process, analyse, store, find, understand and re-use information.
