The panel discussed opportunities for and challenges to international collaboration among national and international research computing organizations. Panelists represented Compute Canada, PRACE, EGI, and XSEDE. They described their organizations' user communities, which span many scientific domains, as well as practices, such as resource allocation and user support, that could benefit from sharing. Key challenges include securing long-term funding and balancing local needs with international cooperation. The panel saw potential for collaboration on operational practices, shared resources, and challenges that no single organization can solve alone.
2. Abstract
How might the international community of research computing users and stakeholders benefit from knowledge sharing among national- or international-scale research computing organizations and providers? It is common for large-scale investments in research computing systems, services and support to be guided and funded with government oversight and centralized planning. There are many commonalities, including stakeholder relations, outcomes reporting, long-range strategic planning, and governance. What trends exist currently, and how might information sharing and collaboration among resource providers be beneficial? Is there desire to form a partnership, or to build upon existing relationships? Participants in this panel will include personnel involved in US, Canadian and European research computing jurisdictions.
3. Panelists
Gregory Newby
Chief Technology Officer, Compute Canada
Florian Berberich
Member of the Board of Directors, PRACE aisbl
Gergely Sipos
Customer and Technical Outreach Manager, EGI Foundation
John Towns
Director of Collaborative eScience Programs, National Center for Supercomputing Applications
4. Panel structure
Opening remarks from each panelist (~5 minutes each)
Discussion roundtable on three topics (15-20 minutes each)
Audience questions and participation
6. The EGI federated infrastructure
300+ HTC providers
23 cloud providers
23 members (NGIs and CERN)
650k CPU cores
285 PB online storage
2.6 billion CPU hours/year
240 Virtual Organisations
>48,000 users
EGI Foundation headquartered in Amsterdam
7. EGI Service Catalogue
Compute
• Cloud Compute: Run virtual machines on demand with complete control over computing resources
• Cloud Container Compute: Run Docker containers in a lightweight virtualised environment
• High-Throughput Compute: Execute thousands of computational tasks to analyse large datasets (also known as "grid computing")
Storage and Data
• Online Storage: Store, share and access your files and their metadata on a global scale
• Archive Storage: Back up your data for the long term and future use in a secure environment
• Data Transfer: Transfer large sets of data from one place to another
Training
• FitSM training: Learn how to manage IT services with a pragmatic and lightweight standard
• Training infrastructure: Dedicated computing and storage for training and education
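The High-Throughput Compute service above is about running thousands of independent computational tasks over a federation of providers. A minimal sketch of that many-task pattern, using only Python's standard library rather than any EGI-specific interface (the `analyse` function and its inputs are hypothetical stand-ins for real per-file or per-record analysis jobs), might look like:

```python
# Generic sketch of the high-throughput ("many independent tasks")
# pattern. This is NOT an EGI API: analyse() is a hypothetical
# stand-in for one real analysis job.
from concurrent.futures import ThreadPoolExecutor

def analyse(x: int) -> int:
    """Stand-in for one independent analysis task (here: squaring)."""
    return x * x

def run_campaign(inputs, workers: int = 4):
    # Tasks share no state, so any free worker can take any task --
    # the property that lets HTC infrastructures scale to thousands
    # of concurrent jobs across many providers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyse, inputs))

if __name__ == "__main__":
    print(run_campaign(range(10)))
```

The key design property is task independence: because no task waits on another, a scheduler can farm the work out to whichever of the federation's providers has free capacity.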
12. EGI Foundation • Science Park 140 • 1098 XG Amsterdam • The Netherlands
+31 (0)20 89 32 007 • egi.eu • support@egi.eu
Thank you!
13. Opening remarks: XSEDE
John Towns
Director of Collaborative eScience Programs, National Center for Supercomputing Applications
14. XSEDE Intro (July 15, 2017)
John Towns
XSEDE Principal Investigator
jtowns@ncsa.illinois.edu
15. Motivation for XSEDE
• Scientific advancement across multiple disciplines requires a variety of resources and services
• XSEDE is about increased productivity of the community and providing expanded capabilities
  – leads to more science
  – is sometimes the difference between a feasible project and an impractical one
  – lowers barriers to adoption
• XSEDE provides a comprehensive eScience infrastructure composed of expertly managed and evolving advanced heterogeneous digital resources and services integrated into a general-purpose infrastructure
16. XSEDE Factoids: high order bits
• 5-year, US$110M project
  – pursuing additional funding via independent proposals
  – initial 5-year award: $121M project + ~$4.6M in supplements
• plus $9M, 5-year Technology Investigation Service
  – separate award from NSF
• No funding for major hardware
  – coordination, support and creating a national/international cyberinfrastructure
  – coordinates allocations, support, training and documentation for >$100M of concurrent project awards from NSF
• ~90 FTEs / ~200 individuals funded across 19 partner institutions
  – this requires solid partnering!
17. Total Research Funding Supported by XSEDE to Date
$2.86 billion in research supported by XSEDE, July 2011 - May 2017:
• NSF: $982.2M (34%)
• NIH: $613.6M (21%)
• DOE: $560.6M (20%)
• DOD: $222.1M (8%)
• NASA: $114.3M (4%)
• DOC: $57.8M (2%)
• All others: $313.6M (11%)
Research funding only. XSEDE leverages and integrates additional infrastructure, some funded by NSF (e.g. Track 2 systems) and some not (e.g. Internet2).
18. Field of Science Trends
[Chart: NUs used (billions) by field of science: Molecular Biosciences, Materials Research, Astronomical Sciences, Physics, Chemistry, Chemical/Thermal Systems, Atmospheric Sciences]
More details on domains with less usage will be discussed further in the ECSS and CEE presentations.
19. XSEDE offers efficient and effective integrated access to a variety of resources
• Leading-edge distributed-memory systems
• Very large shared-memory systems
• High-throughput systems, including the Open Science Grid (OSG)
• Support for VMs, containers and HPC cloud
• Visualization engines
• Accelerators such as GPUs and Xeon Phis
• Extensive library of research applications
Many scientific problems have components that call for the use of more than one platform.
20. International Collaborations of Infrastructures
• Some opportunities
  – science teams are international; coordinated support
  – sharing of operational practices and policies
  – organizational benchmarking
  – coordination of resources
• Some challenges
  – infrastructures have varying structures, missions, funding timelines, user communities, resource and application types, …
  – infrastructures are typically not actually funded to support collaborations
  – challenges in the "optics" of potential resource sharing
23. Partnership for Advanced Computing in Europe (www.prace-ri.eu)
National Scale Research Computing and Beyond
PEARC 2017, New Orleans, Thursday, July 13
Florian Berberich, PRACE Board of Directors
24. Partnership for Advanced Computing in Europe
▶ Open access to best-of-breed HPC systems for EU scientists
▶ Variety of architectures to support the different scientific communities
▶ High standards in computational science and engineering
▶ Peer review at European scale to foster scientific excellence
▶ Robust and persistent funding scheme for HPC supported by the national governments and the EC
▶ Support the development of IPR in Europe by working with public services and European industry
▶ Collaborate with European HPC industrial users and suppliers
26. Supporting Many Scientific Domains
Share of total core hours awarded, up to and including Call 14:
• Chemical Sciences and Materials: 26%
• Engineering: 17%
• Universe Sciences: 16%
• Biochemistry, Bioinformatics and Life Sciences: 15%
• Fundamental Constituents of Matter: 15%
• Earth System Sciences: 7%
• Mathematical and Computer Sciences: 4%
27. PRACE Achievements to Date
▶ More than 500 scientific projects enabled
▶ Over 14 billion core hours awarded since 2010, of which 63% are trans-national
▶ R&D access for industrial users, with >50 companies supported
▶ More than 10,000 people trained by 6 PRACE Advanced Training Centres and other PRACE events
▶ More than 60 Pflop/s of peak performance on 7 world-class systems
▶ 24 PRACE members, including 5 Hosting Members (France, Germany, Italy, Spain and Switzerland)
28. Trends in Europe
[Diagram by Augusto Burgueño Arjona: Virtual Research Environment and Information Infrastructure]
29. Trends in Europe
[Diagram: the EOSC and the EDI, funded by the EU and Member States (MS). Governance covers rules of engagement, standard setting and agenda setting. Each layer offers researchers a user interface, a catalogue of services for research, core service provision and brokerage of external services. Related actors include international RIs (ESFRI, CERN, EMBL), commercial providers, and e-infrastructures such as EGI, EUDAT, OpenAIRE, RDA, PRACE and GÉANT.]
32. Canada's Only National Provider of Shared Essential Digital Research Infrastructure
● Compute Canada (CC) leads Canada's national advanced research computing (ARC) platform.
● There is no other major supplier in Canada.
● CC is a not-for-profit corporation. Its membership includes 35 of Canada's major research institutions and hospitals.
● Funding is through a federal grant with matching funds from provincial and institutional partners (40% federal / 60% provinces and institutions), which is the basis of the federated Canadian model.
● We provide shared services to over 11,000 researchers across Canada, at no fee. Large requests are granted through a merit-based access system.
33. Exciting things are happening up North! Canada's largest research computing update ever, with CAD $125M over ~3 years, funded by CFI with matching funds from provinces and campuses. Consolidating from 27 campus-based data centers to 5 national data centers with larger, more integrated systems.
Stage 1: Four sites
Stage 2: One additional site (TBA)
38. Topic 1
How similar or different are your organization's USERS and APPLICATION AREAS compared with those of the other panelists' organizations? What are some key similarities or differences?
39. Topic 2
Which of your organization's PRACTICES might be suitable for sharing or coordination with other organizations?
40. Topic 3
What are the biggest challenges facing your organization today for which some aspects of INTERNATIONAL COLLABORATION or cooperation might be beneficial?
41. Audience Questions and Participation
Questions should be brief and focused.
Use the program feedback form to add longer observations, external links, etc.: https://pearc17.sched.com/event/Asod
42. Panel conclusion
Thanks to all panelists.
Thanks for audience participation and engagement.
Follow-up and discussions on next steps are welcome.
Use the program feedback form: https://pearc17.sched.com/event/Asod