Prof. Anne E. Trefethen is Chief Information Officer and a Professor of Scientific Computing at the University of Oxford, and former Director of the Oxford e-Research Centre. She has been a researcher, developer, and leader in high-performance computing technologies and algorithms since 1989, and a leader in e-Science since 2001. Before joining Oxford, Anne was Director of the UK e-Science Core Programme, having previously been its Deputy Director for four years. The Core Programme focused on the generic issues for e-Science applications and Grid infrastructure through the development of appropriate middleware and infrastructure in collaboration with UK industry.
Anne has worked for almost 20 years in industry and academia, focusing on numerical algorithms and software, computational science, and high-performance computing. She has led the design and development of high-performance software libraries in two industrial contexts: at Thinking Machines Corporation, developing linear algebra algorithms for the Connection Machine, and at NAG Ltd, where she led the development of libraries and statistical products. She was a researcher in parallel computing at the Cornell Theory Center, one of the four NSF national HPC facilities in the USA, where she later became Associate Director for Computational Support and Software.
A recent report by the McKinsey Global Institute notes that 30 billion pieces of content are shared on Facebook every month; that the US Library of Congress had collected 235 terabytes of data by April 2011; and that in 15 of 17 US sectors, companies store more data on average than the US Library of Congress. This content will only grow, with global data generated by commerce and individuals projected to increase by 40% per year. In science, fields such as genomics, radio astronomy and the search for fundamental particles at the Large Hadron Collider are producing many petabytes of data from advanced instruments. Digital forms of data require new ways of working and are enabling new discoveries.
In this presentation I will provide an overview of the diverse nature of the data forms in this age of data-driven science and consider some lessons learned from a data platform called NeuroHub.
The NeuroHub project aims to develop a research information system for neuroscientists at three partner institutions: the Universities of Oxford, Reading and Southampton. Each research group has different working practices, research methodologies and user requirements, which have led to the development of a system that supports a wide variety of tasks in the neuroscience research life cycle.
Rudolf Dimper earned his degree in chemical engineering in 1981 in Hamburg. His diploma work was done at the European Molecular Biology Laboratory (EMBL) on the design of a real-time data acquisition system for X-ray muscle diffraction experiments. After his studies he held a position at the Institut Laue-Langevin (ILL), Grenoble, designing assembler programs for real-time display of neutron detector data, followed by a position at the Institute for Millimetre Radio Astronomy (IRAM). There his work focused on software design for a radio wave correlator and on accompanying the construction of the Plateau de Bure Radio Interferometer (2550 m altitude) as its Station Manager. Since 1987 he has worked at the European Synchrotron Radiation Facility (ESRF), where he has held various positions in the Computing Services Division. In 2004 he was appointed Head of this Division and a member of the ESRF management team. In 2010, following an internal reorganisation of the laboratory, he was nominated Head of the Technical Infrastructure Division, bringing together services encompassing the computing infrastructure, management information systems, building construction and maintenance, electrotechnics, the vacuum systems, as well as geodesy and alignment.
In his private life, he devotes most of his time to his family and to marriage and family counselling. Whenever possible he enjoys going for a hike or cross-country skiing in one of the magnificent Grenoble mountain areas.
The highly intense X-rays produced by a third-generation light source like the ESRF are used for probing condensed matter in many different ways and in a stunning array of scientific domains. As a service institute, the ESRF invites more than 5000 scientists per year to carry out peer-reviewed experiments, in addition to an internal world-class scientific programme. The ESRF is a typical example in the scientific landscape of science and computing being intimately coupled and interdependent. New experiments are becoming possible because of an enabling high-performance computing environment; other experiments push the computing infrastructure to its limits and encourage us to explore new ways of staying abreast of an unprecedented data avalanche.
In this presentation I will briefly present the principles of synchrotron radiation research, and then show some key examples of research addressing societal challenges in advanced materials, life sciences, and energy. IT is a key enabling technology of this research, allowing scientists to control sophisticated instrumentation, read data from high-resolution imaging detectors, and process the data on CPU/GPU clusters in quasi real-time. The new experimental stations the ESRF is currently designing will allow samples to be explored at the nano-scale with extraordinary precision and detail, and will require cutting-edge IT. The contribution of gifted engineers and computer scientists is essential in enabling scientists to do the research that will significantly shape the future of our societies. I will conclude my talk with an overview of current efforts to implement a federated data management scheme across European large-scale facilities for long-term preservation and open access.
Dr. Jacobs has been with the National Science Foundation (NSF) for 27 years, and for 25 of those years he provided oversight of the National Center for Atmospheric Research (NCAR) and its managing organization, the University Corporation for Atmospheric Research (UCAR). NCAR is the largest assistance award made by NSF, at approximately $100 M/year.
His oversight responsibilities cover a wide range of topics such as:
Dr. Jacobs has represented the geosciences in a variety of NSF studies and initiatives related to high performance computing and information technology, observing facilities, and best practices in the operation and management of facilities. Recently, Dr. Jacobs has assisted NSF with procurement activities associated with providing support services to the US Antarctic Program (a contract with an anticipated value of over $2B over 13 ½ years). In addition, while on detail to OPP, he has helped guide the upgrade of the National Ice Core Laboratory (NICL) and provided a framework for the competition for the management of NICL. As co-chair of the internal working group on cybersecurity for NSF large facilities, Dr. Jacobs supported the development of five community workshops and helped craft cybersecurity language in the cooperative agreements for large facilities.
Within the Geosciences Directorate office, Dr. Jacobs assists with the oversight, planning and execution of several complex agency activities, including the operation and management of major facilities and the EarthCube endeavor. Dr. Jacobs co-chairs an internal Directorate working group on Geoinformatics and data, and serves as a member of the GEO facilities working group.
Prior to coming to NSF, Dr. Jacobs was executive VP and senior research scientist at The Center for the Environment and Man (CEM) in Hartford, CT. Dr. Jacobs and colleagues at Travelers Research Center and later CEM conducted pioneering studies in air-sea interactions and regional modeling. In addition, his basic research interests included data analyses of large environmental databases and the development of computer graphics software for the analysis of observed and model data. Domestic and foreign governments as well as private industry have sponsored Dr. Jacobs’ research.
Dr. Jacobs has published numerous technical reports and five papers in peer-reviewed journals, and has given hundreds of presentations at conferences and meetings. His most recent work is a chapter for a publication on science and technology, to appear in 2011.
Dr. Jacobs received his Bachelor of Arts degree in Mathematics from Texas A&M University and his Master of Science degree in Oceanography, also from Texas A&M University. His Doctor of Philosophy degree was awarded by New York University in Oceanography.
The National Science Foundation (NSF), a US government agency, seeks to transform the conduct of research in geosciences by supporting innovative approaches to community-created cyberinfrastructure that integrates knowledge management across the Geosciences. Within the NSF organization, the Geosciences Directorate (GEO) and the Office of Cyberinfrastructure (OCI) are partnering to address the multifaceted challenges of modern, data-intensive science and education. NSF encourages the community to envision and create an environment where low adoption thresholds and new capabilities act together to greatly increase the productivity and capability of researchers and educators working at the frontiers of Earth system science. This initiative is EarthCube.
NSF believes the geosciences community is well positioned to plan and prototype transformative approaches that use innovative technologies to integrate and make interoperable vast resources of heterogeneous data and knowledge within a knowledge management framework. This belief is founded on the tsunami of technology development and application that has engulfed, and continues to engulf, science, and on the investments the geosciences have made in cyberinfrastructure (CI) to take advantage of these technological developments. However, no master framework for the geosciences was employed in the development of the technology-enabled capabilities required by the various geosciences communities.
The NSF believes it is time to develop an open, adaptable and sustainable framework (an “EarthCube”) to enable transformative research and education on the Earth system. This will involve, but not be limited to: fostering common data models and data-focused methodologies; developing next-generation search and data tools; and advancing application software to integrate data from various sources to expand the frontiers of knowledge.
Also, NSF looks to the community to develop a robust and balanced paradigm to manage a collaborative effort and build community support. Such a paradigm must engage a diverse range of geosciences data collections and collectors, establish sustainable partnerships with other entities that collect data (e.g. other Federal and international agencies), integrate simulations and observations, and foster symbiotic relationships with industry.
To realize this vision, NSF has carried out a number of activities that engaged the community in a process of dialogue, collaboration, and convergence. These activities and their expected and actual outcomes will be presented, and NSF will look to the community for the early development of all or part of the EarthCube framework.
Professor Jeffrey Shaw has been internationally recognized as a leading figure in new media art since the 1960s. In a prolific oeuvre of widely exhibited and critically acclaimed works he has pioneered and set benchmarks for the creative use of digital media in the fields of virtual and augmented reality, immersive visualization environments, navigable cinematic systems and interactive narrative. Shaw was the founding director of the ZKM Institute for Visual Media, Karlsruhe (1991-2002), and in 2003 he was awarded an Australian Research Council Federation Fellowship to co-found and direct the UNSW iCinema Centre for Interactive Cinema Research. Since 2009 Shaw has been Chair Professor of Media Art and Dean of the School of Creative Media at City University of Hong Kong, as well as Director of the Applied Laboratory for Interactive Visualization and Embodiment (ALiVE) and the Centre for Applied Computing and Interactive Media (ACIM).
During its infancy the cinema was rich in technological innovation and creative diversity, but Hollywood now defines its dominant forms of production and narration. This situation is changing because of the new digital modalities of cinematic production and presentation, and over the last ten years we have been witnessing a creative renaissance providing a whole new range of experiences, from situated micro-movies to immersive interactivity in virtual 3D worlds. Professor Shaw’s presentation will discuss the convergent multiplicity of these new techniques of cinematic representation and intercommunication, with examples of groundbreaking artworks that herald the digitally expanded cinema of tomorrow. He will refer in particular to benchmark artistic and technological achievements at the City University of Hong Kong’s Applied Laboratory for Interactive Visualization and Embodiment (ALiVE), whose program of research reaches beyond the dominant industrial media forms and pushes the creative and critical boundaries of the cinematic imaginary in ways that enrich human experience and are of transforming benefit to society.