Accelerating HPC innovation with Artificial Intelligence for today and tomorrow

Steve Tolnai1
1Group Manager & Chief Technologist, HPC & AI Asia Pacific and Japan, Hewlett Packard Enterprise


System developers, scientists and researchers face obstacles deploying complex new HPC technologies, such as energy efficiency, reliability and resiliency requirements, and developing software to exploit HPC hardware. All of these can delay technology adoption and critical projects. The requirement to accelerate real-time insights and intelligence for deep learning is growing at break-neck speed. HPE and Intel have forged a multi-faceted alliance to advance customer innovation and expand HPC accessibility to enterprises of all sizes. Join this session to discover how HPE’s ecosystem of industry partnerships is delivering breakthroughs in HPC deployment, security, power use and density, to make supercomputing more accessible and affordable for today and tomorrow.


Steve Tolnai is the Group Manager and Chief Technologist in the Hewlett Packard Enterprise servers business across Asia Pacific & Japan. In this position, he is responsible for all aspects of High Performance Computing technical strategy within Asia Pacific & Japan.

Steve manages the Technical and Solutions team for High Performance Computing and Artificial Intelligence encompassing Scientific Research, Financial Services Industry, Life and Material Sciences, Computer Automated Engineering, Government and Defence, Oil and Gas, Digital Content Creation and Electronic Design Automation.

Steve’s prior role was Principal Architect for HPC in Asia Pacific, where he architected the largest Life Science supercomputers in Australia, Korea and Singapore, as well as the largest supercomputer in Asia Pacific (excluding Japan), located in Australia.

In his 30+ years at Digital, Compaq, HP and HPE he has worked as a member of the Technical Sales and Marketing divisions where he has been involved with IT consulting to major corporations and governments in the areas of UNIX/Linux, Windows and VMS; using VAX, Alpha, x86 and NVIDIA architectures.

Progress toward Exascale computing

Mike Vildibill1
1Vice President, Exascale Development, Federal Programs & HPC Storage groups, Hewlett Packard Enterprise


The U.S. Department of Energy has selected HPE to rethink the fundamental architecture of supercomputers and deliver the technology blueprint to make Exascale computing a commercial reality. HPE has a clear vision, strategy and execution capabilities as an industry innovator with deep expertise in Memory-Driven Computing, VLSI, photonics, non-volatile memory, software and systems design. HPE is now in a race to the future to deliver Exascale performance with breakthrough reductions in energy consumption by 2023. Once operational, these systems, based on ‘The Machine’ technologies, will help our customers accelerate research, education and development.

PRAGMA — What does it take to really enable international collaboration for long-tail science communities?

Dr Philip Papadopoulos1,2
1Steering Committee Member, PRAGMA,
2Program Director, UC Computing Systems, San Diego Supercomputer Center, UCSD

The Pacific Rim Applications and Grid Middleware Assembly (PRAGMA) is a grass-roots consortium of more than 20 Pacific Rim institutions. Activities focus on developing and deploying practical cyberinfrastructure to assist lab-to-lab collaboration for long-tail science communities. Science focus areas include biodiversity, fresh water ecology, software-defined networking, telescience, education, biosciences and geosciences. PRAGMA defines scientific expeditions in which domain scientists and cyberinfrastructure specialists work together over long periods of time to develop solutions that meet the needs of the science but can also be applied more generally. Worked examples will be presented, such as bringing high-throughput computing to the R desktops of fresh water ecologists, which has dramatically simplified access to lake model simulations.

This talk will survey the various technology components that PRAGMA has used, evaluated, developed and, where appropriate, discarded. The long time arc (15+ years) of PRAGMA affords a unique perspective on the promises, the near-misses, and the successes in the space where technology meets international collaboration.


The agenda for the International Workshop on Science Gateways – Australia on 16-17 October in Brisbane is now available.

This workshop offers participants the opportunity to engage with other members of the Science Gateways community, to explore common issues and share successes. A Pacific Rim Applications and Grid Middleware Assembly (PRAGMA) workshop will also be co-located at eResearch Australasia on 16-17 October. Registration at the IWSG-A workshop will enable participants to also access the PRAGMA workshop.

A Science Gateway is a community-developed set of tools, applications, and data collections that are integrated through a tailored web-based environment. Often Science Gateways leverage larger scale computing and data storage facilities that would otherwise be inaccessible to many domain scientists. Gateways can be used to tackle common scientific goals, engage with industry, and offer resources for educating students and informing non-experts.

To continue the development of this community, this workshop offers a venue for knowledge exchange and skills development. Australian science gateways have demonstrated many valuable impacts for their research communities, including collaboration with international gateways in their fields. The significance of science gateway programs is evidenced by the range of national and regional programs that facilitate their development.

Don’t forget to register to attend through the eRA conference registration process.

Appreciating the ‘Method to the madness’ in Research; Optimizing the Madness via eResearch Technology

Miss Amanda Miotto1

1Griffith University, Nathan, Brisbane, Australia,



This presentation aims to highlight the value of documenting processes for research groups: working alongside researchers to map out their workflow pipelines, with the goal of pre-emptively identifying potential issues and opening up possibilities for using technology to accelerate their research.

In our experience, most research groups work organically: small groups working on two or three problems at the same time, all interlinked, but each group looking at unique questions, often for an answer that leads to more questions. It can be difficult to maintain efficient and methodical workflows when you are heading in a number of directions at the same time, uncertain where your question will lead you.

Our aim is to gain an understanding of their workflows and highlight areas where technology can enable and accelerate research. Sitting with the researchers on the ground floor, we work together to understand their research path, map their process and data workflow, and expand their documentation.

Introducing someone with a fresh perspective, without assumptions, can bring new viewpoints to problems and offer ‘out of the box’ thinking. This can illuminate areas that previously needed to be complicated or flexible but, on re-evaluation, have stabilized and are ready for optimization. These interactions can also spark conversations about relevant emerging eResearch technology that can enable new avenues for outcomes and collaborations, as well as highlighting appropriate data management.

Mapping these processes and data flow can have further benefits. Having proper documentation can assist new staff coming into the team, technical groups needing current infrastructure information, offer transparency for managers and audits, encourage reproducible and responsible research and reduce the knowledge lost when contracts finish or students move on.

To complicate matters, when researchers do have the time to invest in their data management, it’s often difficult to know where to start. Solutions can err on either side of being extremely broad or far too specialized and intricate. Then there is the paradox of when to implement a data management plan: at the beginning of a research project there may not be enough information about future data to form one, yet further down the research lifecycle there may be an overwhelmingly diverse stockpile of data to keep track of. These discussions on workflow lead to suggestions for data management resources and provide ready-made documentation for data librarians.

In this session, we will share our experiences and lessons learnt; moving on to an open discussion regarding the experiences of others. This talk would be of interest to researchers, managers and supporting staff.



Amanda Miotto is an eResearch Support Specialist and Software Developer at Griffith University. She started off in the field of bioinformatics and learnt to appreciate the beauty of science before discovering the joys of coding. She is heavily involved in Software Carpentry, Hacky Hours and Research Bazaar, and has worked on platforms around HPC, microscopy and scientific database portals, as well as engagement with research groups to highlight relevant upcoming technologies.

The Astronomy Data and Computing Services (ADACS) Story

Dr Jenni Harrison1,2, Professor Andrew Rohl3

1Pawsey Supercomputing Centre, Australia,

2CSIRO, Australia,

3Curtin University, Bentley, Australia


Title The Astronomy Data and Computing Services (ADACS) Story
Synopsis ADACS has been established and is funded by Astronomy Australia Ltd (AAL).  ADACS is providing eResearch services exclusively tailored to the needs of the Australian astronomy community. Services are being delivered via a unique partnership between Swinburne University, Curtin University and the Pawsey Supercomputing Centre.  Through bespoke training, support and expertise, astronomers are being supported to maximise the scientific return from eResearch infrastructure.
Format of demonstration Slide Show
Presenter(s) Dr Jenni Harrison, Director of Strategic Projects and Engagement, Pawsey Supercomputing Centre and

Professor Andrew Rohl, Director of Curtin Institute for Computation and Professor of Computational Science, Curtin University

Target research community Astronomy, or anyone who may wish to use the ADACS model to deliver eResearch services to other communities.
Statement of Research Impact ADACS was only established in March 2017, and hence it is too early to evaluate the impact of this initiative on research. ADACS will be evaluated in due course, with research impact considered.
Request to schedule alongside particular conference session If possible co-located with “National Programs and Partnerships”


Any special requirements Standard AV, to allow two presenters with questions




Jenni is the Director of Strategic Projects and Engagement at the Pawsey Supercomputing Centre in WA.  Jenni’s present responsibilities include leading projects in areas of national priority, such as astronomy, and as a result she currently co-directs the ADACS initiative.  Jenni is also responsible for engagement, and correspondingly is leading the Capital Refresh for the next generation of supercomputing, data and associated services for Pawsey, expected by 2020.  For the previous five years, Jenni led the Data (and eResearch) Team at Pawsey.  Prior to working in Australia, Jenni directed significant Digital Health education and research projects for approximately five years for the NHS in Scotland.  Before this role, Jenni was the policy advisor in eResearch to the Ministry of Research, Science and Technology in New Zealand.

Andrew is the Director of the Curtin Institute for Computation and has been engaged in eResearch service delivery since its inception in Australia.  Prior to being the Executive Director of iVEC (now Pawsey), he was part of the grid computing program in the Australian Partnership for Advanced Computing.  As iVEC Executive Director, Andrew was a key contributor to attracting $80M in Pawsey Centre funding to iVEC.  Andrew is currently the independent member of the NeSI Board.

AUS-SPECCHIO: taking spectroscopy data from the sensor to discovery

A/Prof. Laurie Chisholm1, Dr Cindy Ong1, Dr Andreas Hueni1

1University of Wollongong, Wollongong, Australia


Title AUS-SPECCHIO: taking spectroscopy data from the sensor to discovery
Synopsis AUS-SPECCHIO is a national spectral information system supported by the Australian Government through the NCRIS Australian National Data Service. Funded as a data capture project, the mission of the system is to collate, share and discover new and existing spectral libraries related to any earth and environmental feature. AUS-SPECCHIO is open source for the benefit of all proximal and remote sensing researchers, established from user demand, with functionality based upon extensive stakeholder consultation, feedback and testing. The system incorporates features such as: a metadata standard to improve interoperability and sharing, links to published best practice guides, mechanisms to house validation data associated with spectra, semi-automated operations such as automatic validation of airborne hyperspectral data, and a metadata export feed to ANDS RDA. Currently hosted by the University of Wollongong, a transition is planned to Geoscience Australia, where the use of the system will extend to sensor calibration and meet the national call for validation of image products housed in Digital Earth Australia.
Format of demonstration Video, Slide Show
Presenter(s) Laurie Chisholm, Associate Professor, University of Wollongong (presenter), Cindy Ong, Andreas Hueni
Target research community Australian Proximal and Remote Sensing Community
Statement of Research Impact Case studies from operational testing and use will be shown which demonstrate the capacity of the system to capture and manage an expanding range of spectroscopy research data to support research.  As the basis of a spectral information system, AUS-SPECCHIO is delivering a benefit to the end users by greatly improved management of existing and new data, increased data quality by applying algorithms to a centralised and well-defined data pool, facilitating quicker acquisition to product/publication cycles, and supporting sensor calibration and satellite image product validation. The newly structured and enhanced version of AUS-SPECCHIO, including a robust metadata standard has served as a model for international adoption.
Request to schedule alongside particular conference session  
Any special requirements n/a



A/Prof Laurie Chisholm has over 20 years of experience in remote sensing and spatial analysis in the environmental sciences.  She is Project Leader for the ANDS DC-10 project to develop a national spectroscopy information system, “AUS-SPECCHIO”. She has particular expertise in the use of hyperspectral data to discriminate between plant species, and to assess the physiological effects of various stressors (fungal, nutrient, water) on spectral reflectance. Additional research interests focus on evaluating the impact of disturbance events on ecosystem function and resilience at the landscape scale using satellite imagery. She has been a participant in several TERN AusCover supersite field campaigns, conducting vegetation surveys in support of airborne remote sensing data acquisition.  Currently, Laurie is using multi-sensor remote sensing data to map invasive plant species for input into a novel mixed-methods cultural environmental research framework to address Natural Resource Management issues.

Collaborate, coordinate and thrive

Dr Markus Buchhorn1

1Australasian eResearch Organisations (AeRO), Canberra, Australia



The 2016 Research Infrastructure Roadmap1 references a vision for eResearch that includes an Australian Research Data Cloud. Establishing it will require an increasingly wide range of stakeholders and service providers to work together, coordinating and aligning their platforms under a broad framework. However, effectively using it across the entire national research endeavour will require much more than that. We will need the workforce to build, operate and support it. We will need the user community to be properly skilled and supported to take advantage of it. We will need the underpinning systems to be properly integrated with institutional services, with national and state scientific computing platforms, with international frameworks, and with emerging commercial services. We will need a rich and smooth flow of communication about the many services and benefits. We will need these services to be trustworthy and valued, and to increase their maturity as expectations continue to grow.

The Roadmap is largely silent on all these issues. The Members of AeRO though are collaborating hard to ensure that the many investments, from all sources, are properly coordinated, designed, deployed and operated, to ensure that researchers can thrive in the continuously growing data-driven research world. This presentation will discuss a range of activities to support AeRO Members to achieve these important goals, to seek input from the wider community, and to encourage more participation across the sector, ultimately to deliver a seamless and transformative experience for our research community.



  1. Research Infrastructure Roadmap, available from:, accessed 15 June 2017.



Markus is the Chief Executive Officer of AeRO.

Building Artificial Intelligence workflows for 21st Century research

Andrew Underwood1 and a guest from NVIDIA

1High-Performance Computing Leader, Dell EMC Australia and New Zealand


The world we know is built on artificial intelligence: Netflix suggests our entertainment, Facebook suggests our friends, and transportation is fast becoming driverless.

Artificial intelligence has the same potential in the field of scientific research: to complete tasks that may be mundane, impossible (or so we thought!) or too complex for us to undertake manually. This session will take you through the concept of the modern-day precursor to artificial intelligence, known as “Deep Learning”, and how Dell EMC PowerEdge HPC systems with NVIDIA technology can be used to build your first “AI workflow”.
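The session abstract does not include code; as an illustrative sketch only (not Dell EMC or NVIDIA material), the loop at the heart of any deep learning workflow — a forward pass, a loss gradient, and a weight update — can be shown with a single logistic neuron in pure Python. Production workflows implement the same cycle at scale with frameworks such as TensorFlow or PyTorch on GPU hardware.

```python
import math

def train(data, lr=0.5, epochs=500):
    """Fit one logistic neuron p = sigmoid(w*x + b) by stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # forward pass
            grad = p - y                              # gradient of cross-entropy loss w.r.t. the logit
            w -= lr * grad * x                        # update step ("learning")
            b -= lr * grad
    return w, b

# Toy dataset: inputs below 0.5 labelled 0, at or above 0.5 labelled 1.
data = [(x / 10.0, 1 if x >= 5 else 0) for x in range(10)]
w, b = train(data)

def predict(x):
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
```

The same three steps repeat in a real deep network, just over millions of weights and many layers, which is why GPU-accelerated HPC systems matter for training times.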




Andrew Underwood leads the Dell EMC High-Performance Computing and Machine Learning strategy in Australia and New Zealand. His passion for innovation has driven him to architect some of the world’s largest and most powerful Supercomputers and Artificial Intelligence platforms critical to the scientific advancement and global economic competitiveness of our clients throughout the world.

Working towards the “end-to-end”: Research Data Management Capability Development at UNE

Mr Thomas Reeson1, Dr Paddy Tobias2

1University of New England, Armidale, Australia,

2Intersect Australia, Sydney, Australia,


The implementation of an enterprise-level research data management solution is cumbersome and complex. Sets of requirements and use-cases in the research data management lifecycle vary considerably, which is further complicated by the differing motivations of parties involved. While the institution is driven by responsibilities in administration and compliance, researchers are seeking greater support to facilitate good research practice and improved outcomes.

As such many in the sector now recognise that any enterprise-level solution for research data management needs to be “end-to-end”, meaning that it integrates into a single workflow all processes from start to finish of the research data lifecycle. Simply implementing or leveraging existing systems to address one aspect of the data lifecycle is not enough without working to integrate these with other existing data management processes [1, p. 158].

Despite this, there is a shortage in the sector of support and guidance in relation to end-to-end institutional solutions, even in terms of advice on asking the right questions when scoping them. This lack of holistic and critical guidance engenders frequent second-guessing and wasted investment by institutions.

This paper is delivered mid-way through a 12-month project at the University of New England to enhance capabilities and encourage engagement in new methodologies for research data management using the ANDS Capability Maturity Model as a guide. In the context of the problem outlined above, the paper will present the end-to-end workflow developed at UNE and discuss the thinking behind the solutions put in place. It will also address the difficulties confronted along the way with retrofitting systems integration, gaining researcher buy-in, and establishing standards for the University of New England. The paper will finish by listing a number of questions that are currently unresolved and need answering in the sector in relation to end-to-end institutional support. The paper is intended to be thought-provoking and generate a discussion.


  1. Johnston, Lisa R. “Curating Research Data Volume One: Practical Strategies for Your Digital Repository.” (2017). Available from:




Thomas Reeson is a recent addition to the University of New England Library’s Research Advisory and Engagement Services team. Thomas has worked at Griffith University, the State Library of Queensland, the University of Southern Queensland, and Queensland University of Technology as the QULOC Graduate Librarian. Thomas also worked as the Paramedic Sciences Liaison Librarian at the University of Queensland. As the Research Data Librarian, Thomas assists UNE researchers with planning, storage, description, discovery, and preservation of research data.

Dr. Paddy Tobias represents Intersect at the University of New England. Paddy has several years of experience working in developing countries as an academic, program director, policy adviser and researcher. As an eResearch Analyst, Paddy works to improve research projects with better adoption of digital solutions. The role covers policy advice, training and engagement for research data management, digital research support and data-intensive research.




    About the conference

    eResearch Australasia provides opportunities for delegates to engage, connect, and share their ideas and exemplars concerning new information centric research capabilities, and how information and communication technologies help researchers to collaborate, collect, manage, share, process, analyse, store, find, understand and re-use information.
