Offloading the impact of security – piloting DPUs for security

Dr Steve Quenette1, Mr Sudarshan Ramachandran2, Mr Arik Roztal3, Mrs Gin Tan1, Mr Swe Aung1, Mr Adam Bretel4

1Monash eResearch Centre, Monash University, Melbourne/Clayton, Australia
2Nvidia, Melbourne, Australia
3Nvidia, Tel Aviv, Israel
4eSolutions, Monash University, Clayton, Australia

Designing high-performance infrastructure for sensitive data workflows is challenging. A typical research project today has partners beyond institutional boundaries, and requires simulation, image processing and/or AI that is ideally suited to the scheduling of precious but shared resources (e.g. via HPC, Blazar and Kubernetes), yet must be orchestrated within safe havens. To this end, Nectar Research Cloud users and nodes have collaboratively determined firewalls and other security concerns at the project level. In today's cyber landscape, however, more is needed to integrate the robust security operating practices now prolific throughout institutions.

In 2018/19 we piloted a micro-segmentation-based security tool to assure sensitive data workflows that flowed across Monash's own pools of resources (e.g. web, VDI, HPC, AI, etc.). When made transparent to the researcher, this approach would ultimately scale to tighter firewalls and more actively and deeply monitored data workflows. We observed, however, a loss on the order of 10% of the researcher's allocated computing resources to do this work.
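To make the idea concrete, micro-segmentation can be thought of as a default-deny, per-project allow-list evaluated for every flow. The following is a minimal illustrative sketch only, not the tool piloted at Monash; the project name, subnets and the Flow/policy shapes are invented for illustration:

```python
# Toy per-project micro-segmentation policy (illustrative only; names,
# subnets and data shapes are hypothetical, not Monash's actual tool).
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass(frozen=True)
class Flow:
    src: str       # source IP of the connection
    dst: str       # destination IP
    dst_port: int  # destination port

# Each project carries its own allow-list; traffic is denied unless a rule
# explicitly permits it. "Default deny" per workload is what distinguishes
# micro-segmentation from a single perimeter firewall.
POLICY = {
    "genomics": [
        (ip_network("10.1.0.0/24"), ip_network("10.1.1.0/24"), 22),   # SSH within project
        (ip_network("10.1.0.0/24"), ip_network("10.9.0.0/24"), 443),  # HTTPS to storage
    ],
}

def allowed(project: str, flow: Flow) -> bool:
    """Return True only if an explicit rule for this project matches the flow."""
    for src_net, dst_net, port in POLICY.get(project, []):
        if (ip_address(flow.src) in src_net
                and ip_address(flow.dst) in dst_net
                and flow.dst_port == port):
            return True
    return False  # default deny
```

Evaluating such rules on the host CPU for every flow is one source of the ~10% overhead noted above; the DPU approach discussed below moves this enforcement onto the NIC itself.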

Simultaneously, NVIDIA (Mellanox) has developed BlueField, an RDMA-enabled Ethernet SmartNIC, also known as a Data Processing Unit (DPU). In essence these DPUs extend the ConnectX NICs now prolific in the Research Cloud with Arm cores and supporting APIs. A key use case for this technology is the emerging shift in security technology towards 'security everywhere'.

In this talk we will discuss the collaboration between NVIDIA and Monash that explores micro-segmentation and SOC integrations that scale with cloud size. We will also discuss early findings from precursor experiments, such as off-loaded encryption and introspection.


Dr Steve Quenette, Deputy Director of the Monash eResearch Centre. This multi-disciplinary centre now includes over 40 eResearch and IT professionals providing expertise, computing, visualisation and data capabilities into numerous research areas. Since 2010, the centre has been selected to host over $50M of Australia's federally-funded national eResearch infrastructure for specialised high-performance computing, research cloud services, data storage and data management, underpinning the research of over 4,000 researchers. The centre is also a global Centre of Excellence or strategic technology partner for NVIDIA, Mellanox, Dell, and Red Hat.

Capturing the Full-path of Any Data Asset: Engineering Vertical Integration From Initial Capture, Through Intermediate Processing Stages to Multiple Derivative Products.

Dr Lesley Wyborn1, Dr Jens Klump2, Dr Ben Evans1, Nigel Rees1, Sheng Wang1, Dr Mingfang Wu3, Mr Ryan Fraser4

1National Computational Infrastructure, ANU, Canberra, Australia
2CSIRO, Kensington, Australia
3ARDC, Melbourne, Australia
4AARNet, Perth, Australia

Science has always required that all claims are capable of being evaluated against testable hypotheses and that research be reproducible. This necessitates transparency in data observations, methods of analysis and descriptions. Modern research data processing pipelines involve complex systems of physical and digital infrastructure and publishable artefacts: starting from initial samples and observations collected at the source (e.g., field measurements, raw data from laboratory instruments), through intermediate datasets (which can be derived from multiple processing steps), to data ‘products’ that are referenced in scholarly publications. The artefacts can be released at any stage along the ‘Full-path of data’: each component can be made accessible in multiple formats, often on numerous unrelated digital locations.

The technical complexity involved in exposing the Full-path of data, from initial capture to final released product, has been a major challenge, particularly in HPC. Too often, the various stages and processes along a data pipeline are either not documented or buried within unpublished scientific workflows: source or intermediate data is misplaced and not appropriately persisted; provenance information is routinely recorded in text-rich files that are not machine readable or accessible.

For transparent science, each release from any stage of the Full-path of data should be uniquely identified to ensure it can be ‘vertically’ integrated with predecessor(s) and subsequent derivative(s). Each release should include identifiers that enable attribution for any person/organisation making contributions (including funding). Portraying the Full-path of data as a ‘Knowledge graph’ offers more possibilities than just provenance, in particular, pattern analysis of research processes.
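The 'vertical' integration described above can be pictured as a directed graph of persistent identifiers linked by derived-from edges, which a provenance query then walks. The following is a minimal sketch under that assumption; the identifiers are invented for illustration and do not refer to real datasets:

```python
# Illustrative sketch: the 'Full-path of data' as a tiny knowledge graph of
# PID -> derived-from edges. All identifiers below are hypothetical.
DERIVED_FROM = {
    "doi:10.x/product-v1":     ["hdl:x/intermediate-grid"],
    "hdl:x/intermediate-grid": ["hdl:x/raw-survey", "hdl:x/calibration"],
    "hdl:x/raw-survey":        [],   # initial field observations
    "hdl:x/calibration":       [],   # laboratory instrument data
}

def lineage(pid: str) -> list[str]:
    """Walk back from a released product to its source observations,
    returning every predecessor along the full path of data."""
    ancestors = []
    for parent in DERIVED_FROM.get(pid, []):
        ancestors.append(parent)
        ancestors.extend(lineage(parent))
    return ancestors
```

Because each node is a resolvable identifier rather than free text, the same graph supports attribution queries and the pattern analysis of research processes mentioned above, not just provenance lookup.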


Lesley Wyborn is an Adjunct Fellow at the National Computational Infrastructure at ANU and works part-time for the Australian Research Data Commons. She has 42 years' experience at Geoscience Australia in scientific research and in geoscientific data management. She is currently Chair of the Australian Academy of Science 'National Data in Science Committee' and is on the American Geophysical Union Data Management Advisory Board and the Earth Science Information Partners Executive Board. She was awarded the Public Service Medal in 2014, the 2015 Geological Society of America Career Achievement Award in Geoinformatics and the 2019 US ESIP Martha Maiden Award.

Resolving multidisciplinary challenges of seamless integration of data on people, business, and the environment through the AusPIX Discrete Global Grid System Framework

Mr Joseph Bell1, Mr Shane Crossman1, Ms Irina Bastrakova1

1Geoscience Australia

Increasingly, crisis situations require transparent, verifiable and trusted information. Solutions commonly have to simultaneously cover multiple diverse use cases (e.g. social, environmental and economic). The 2020 Australian bushfire crisis and the global COVID-19 pandemic are examples of these complex crisis events.

The unifying factor for these events is location: everything is happening somewhere at some time. The AusPIX framework is being developed using the OGC's emerging, cutting-edge Discrete Global Grid System (DGGS) technology. The AusPIX data integration framework is changing the way spatial data are enabled, leading to a wide range of diverse and powerful data integration, statistics and visualisation possibilities.

The AusPIX DGGS provides a common framework to greatly increase the amount of location intelligence by flexibly linking big and small data in multiple formats, types and structures, and provides a framework for quick, reliable, repeatable, reusable infrastructure and code. It fosters cross-community collaboration and facilitates quick responses to stakeholder needs across multiple use cases.
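The linking mechanism behind a DGGS is that every location, at every resolution, maps to a single hierarchical cell identifier on which heterogeneous datasets can be joined. The sketch below is a deliberately simplified stand-in: AusPIX itself uses the equal-area rHEALPix DGGS, whereas this toy version merely subdivides a plain lat/lon rectangle 3x3 per level to show the idea:

```python
# Simplified DGGS-style index (illustrative only). AusPIX uses the equal-area
# rHEALPix DGGS; this toy version refines an equal-angle lat/lon box 3x3 per
# level, so coordinates and cell IDs here are for demonstration, not AusPIX IDs.
def cell_id(lat: float, lon: float, resolution: int) -> str:
    """Return a hierarchical cell identifier: one digit (0-8) per level.
    Nearby points share ID prefixes, which is what enables joining
    social, environmental and business data on location."""
    lat0, lat1 = -90.0, 90.0
    lon0, lon1 = -180.0, 180.0
    digits = []
    for _ in range(resolution):
        # Which of the 3x3 sub-cells contains the point?
        row = min(2, int(3 * (lat - lat0) / (lat1 - lat0)))
        col = min(2, int(3 * (lon - lon0) / (lon1 - lon0)))
        digits.append(str(3 * row + col))
        # Shrink the bounding box to the chosen sub-cell and recurse.
        lat0, lat1 = (lat0 + row * (lat1 - lat0) / 3,
                      lat0 + (row + 1) * (lat1 - lat0) / 3)
        lon0, lon1 = (lon0 + col * (lon1 - lon0) / 3,
                      lon0 + (col + 1) * (lon1 - lon0) / 3)
    return "".join(digits)
```

Datasets indexed this way need no geometry intersection at query time: records about people, businesses and the environment that carry the same cell ID (or a shared prefix, at coarser resolution) are already co-located.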

This paper will outline how Geoscience Australia and its partners implemented the AusPIX framework to allow rapid and repeatable analysis of the cross-portfolio response to the devastating 2020 Australian bushfires. AusPIX provided location-based data in a consistent way for seamless integration of data on people, business, and the environment.


Joseph Bell is a Geospatial Data Scientist at Geoscience Australia. As Spatial Scientist and Python programmer Joseph is invested in, and enjoys, designing workflows and technology to move data up in the knowledge triangle from data to information, and on to knowledge and wisdom. With a focus on the knowledge layer Joseph is developing the AusPIX framework for data integration, statistics, and visualisation, including tools for linked-data and vocabularies. Joseph has applied the discrete global grid system (DGGS) to define enduring locations and he is aiming for online tools to bring social, environmental and business data together.

About the conference

eResearch Australasia provides opportunities for delegates to engage, connect, and share their ideas and exemplars concerning new information-centric research capabilities, and how information and communication technologies help researchers to collaborate, collect, manage, share, process, analyse, store, find, understand and re-use information.
