1. The Data Fabric
How to keep control and gain more choice in
hybrid cloud architectures
Peter Hanke
Country Manager Austria
2. That’s what the analysts say
Cloud is not hype. It’s happening. And it’s happening quickly.
- Goldman Sachs: Spending on cloud computing infrastructure and platforms to grow at a 30% CAGR from 2013 through 2018, compared with 5% growth for overall enterprise IT.
- IDC: Predicts an 11% shift of IT budget away from traditional in-house IT delivery toward various versions of cloud computing as a new delivery model in 2016; 35% of new applications will be cloud-enabled in 2017. By 2018, IDC forecasts that public cloud spending will more than double to $127.5 billion ($82.7 billion in SaaS, $24.6 billion in IaaS, and $20.3 billion in PaaS).
- Amazon: Growing fast, accelerating, and profitable, with over 49% y/y revenue growth (Q1 2015); usage growth is close to 90% y/y (Q4 2014).
- Cisco: By 2018, 59% of total cloud workloads will be Software-as-a-Service (SaaS) workloads, up from 41% in 2013.
- Forrester: Hyperscale public clouds: if you can’t beat ’em, join ’em!
[Pie charts. Current distribution of total data: Public 9%, Private 26%, with the Hybrid and “not in a…” shares truncated in the source. Expected distribution of total data, 18 months from now: Hybrid 13%, Public 19%, Private 33%, not in any cloud 35%.]
14. The need for seamless cloud services
[Diagram: data moves seamlessly across the Private Cloud, Cloud Service Providers, and Hyperscale Cloud Providers, spanning private and public environments.]
15. Where to start? Backup and archive will be the first workloads to move to the cloud

“For which of the following purposes does/did your organization use cloud infrastructure services?”
- Data backup and archive: 49%
- Test and development: 38%
- Disaster recovery: 35%
- Primary storage for files: 34%
- Web servers: 32%
- High-performance…: 31%
- Business intelligence: 30%
- Temporary projects: 27%
- Internal production apps: 27%
- Application bursting: 25%
- Workload spikes: 25%

N=327, multiple responses accepted. Source: ESG, 2015.
21. Thank you for your attention
Peter Hanke
Country Manager Austria
Peter.Hanke@netapp.com
Mobile: +43 664 88661856
Speaker notes
Are you looking at optimization or innovation?
+ Own IT has classical goals that are linked to specific KPIs.
+ However, specialty departments (product management, R&D, marketing) are increasingly requesting new dynamic services from their IT departments. Because only some of these requests can be met, specialty departments are forced to circumvent their IT by purchasing pay-as-you-go services themselves. This is also called shadow IT!
+ This has enormous impact on IT quality, data control and purchasing.
The answer is: Hybrid Cloud or Multi Cloud
Hybrid cloud environments will dominate for the foreseeable future.
Their biggest question is how they can exist in the two worlds of on-premises and off-premises without a painful redesign. How do they integrate the world of traditional, legacy IT and everything they own on-premises in the data center with the new world of the public cloud and everything they want to rent or lease off-premises? How do they take advantage of the wide choice of cloud environments that offer a seemingly limitless pool of compute, network, and storage resources?
The combination of these two worlds will become the new normal for enterprise IT. There are characteristics that prove value in each. On-premises will continue to offer a level of predictability that you will continue to need in your organization. But the public cloud will provide a new level of variability of resources that you can use when you need it.
Organizations want the ability to move applications and workloads to the optimal environment, whether on-premises or off-premises, as their needs change and as new options gain traction.
Therefore, as customers move down this path of hybrid cloud and operating in these two worlds, NetApp plays a significant role. Hybrid cloud is what results when these two worlds merge.
What does today‘s situation look like?
What are the challenges of operating three different deployment modes?
1. Data gravity: migrating data is too time-consuming and expensive
2. Format incompatibility between on-premises and off-premises environments, as well as between hyperscalers
3. Data protection and security
How can I efficiently combine and connect these deployment modes and data silos?
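The data-gravity point can be made concrete with a back-of-the-envelope calculation (illustrative figures, not from the slides): even over a dedicated link, moving a large dataset takes days to weeks.

```python
# Rough estimate of bulk-migration time: how long does it take to move
# a dataset over a dedicated network link? (Illustrative numbers only.)

def migration_days(dataset_tb: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Days needed to transfer `dataset_tb` terabytes over a `link_gbps`
    link that achieves `efficiency` of its nominal throughput."""
    bytes_total = dataset_tb * 1e12                   # TB -> bytes (decimal)
    bytes_per_sec = link_gbps * 1e9 / 8 * efficiency  # Gbit/s -> usable bytes/s
    return bytes_total / bytes_per_sec / 86_400       # seconds -> days

# Moving 100 TB over a 1 Gbps link at 80% efficiency:
print(round(migration_days(100, 1), 1))  # roughly 11.6 days
```

Even at ten times the bandwidth the same move still takes more than a day, which is why the argument below favors keeping data adjacent to the cloud rather than repeatedly migrating it.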
Left box - EU Model Clauses, Binding Corporate Rules or Consents
The Safe Harbor treaty, which was held invalid by the ECJ, will be replaced by a new agreement, “Privacy Shield”.
Currently, companies can use US providers for the processing of personal data e.g., if they execute the EU model clauses – this has been confirmed by the Article 29 Working Party.
Two of these sets of contractual clauses apply to transfers from data controllers in the EU/EEA to controllers in third countries: see Decision 2001/497/EC (Set I) and Decision 2004/915/EC (the so-called “business clauses” — Set II). The third set applies to transfers from data controllers in the EU/EEA to processors in third countries (Decision 2002/16/EC). Consequently, companies will either have the possibility to choose between two sets of standard contractual clauses (for transfers from EU/EEA controllers to non-EU/EEA controllers — Set I and Set II) or only have the opportunity to use one set (for transfers from EU/EEA controllers to non-EU/EEA processors). As explained in question 4 below, this does not, however, prevent companies relying on different contracts approved at national level by data protection authorities.
Consequence: From a purely legal perspective, personal data can still be processed by US providers.
The main risk is that companies have old contracts in place that rely on Safe Harbor; data protection authorities have started to investigate and have already imposed fines where companies still rely on the invalidated treaty. In addition to the monetary risk, reputation can suffer significantly.
Binding corporate rules: Binding corporate rules may be described as an international code of practice followed by a multinational corporation for transfers of personal data between the companies belonging to the same multinational corporation. Any multinational corporation wishing to transfer personal data between its own companies on an international basis can consider using binding corporate rules, which must be approved by the national data protection authority pursuant to its own national legal procedures.
http://ec.europa.eu/justice/policies/privacy/docs/international_transfers_faq/international_transfers_faq.pdf
Right box - Data Control: Data/IP Protection, Vendor Lock-in
Critics say that due to the Patriot Act, none of the available legal means sufficiently protect the personal data of EU data subjects: it remains possible that such data will be disclosed to US courts or authorities, such as secret services.
If a company uses US hyperscalers (such as AWS), there is an additional risk of vendor lock-in. Due to data gravity, the consequence would be a strong dependency on the provider, with an enormous impact on price increases and data mobility.
With the emergence of more value-added data services (e.g., media streaming, data analytics, and trading: Netflix or Spotify vs. AWS), it can also be a material risk to put critical business data into a public cloud service (risk of unauthorized disclosure).
The solution: NPS
Web Infrastructure example
NetApp has many departmental intranet sites that were built using different content management systems (CMSs), such as Jive, WordPress, Drupal, Joomla, Oracle Content Management platform, and custom Java/HTML/CSS frameworks. Over time, these portals have grown in size and reach. A lack of common standards caused CMS deployments to sprawl across the company, making it difficult to control and manage the data.
Solution Approach
To gain control and stop the sprawl, NetApp IT introduced a cloud-based hosting platform that brings all of the intranet portals into a standard design framework. This platform is built on NPS and AWS Cloud. It uses the open-source WordPress CMS.
Each portal is assigned a midsize AWS EC2 instance with a LAMP (Linux, Apache, MySQL, and PHP) stack and a NetApp-branded WordPress IT blueprint. The portal contents are stored on a NetApp FAS6290 using the NPS deployment model. The FAS is accessed by the EC2 nodes using NFS-mounted volumes over a 1Gbps direct connect link.
Cloud ONTAP
Cloud ONTAP for AWS is a software-only storage appliance that runs the clustered Data ONTAP storage operating system in the cloud, managing Amazon Elastic Block Store (EBS) volumes.
Enterprise-class features:
- Multiprotocol support (NFS, CIFS, and iSCSI)
- Data protection (NetApp Snapshot copies, SnapMirror technology, and SnapVault technology)
- Storage efficiency (thin provisioning, data deduplication, and data compression)
- Data-at-rest encryption using encryption keys that are stored on key managers under your control
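The deduplication feature listed above can be illustrated with a minimal sketch of content-hash block deduplication. This is a generic illustration of the technique, not ONTAP's implementation: fixed-size blocks are indexed by hash so identical blocks are stored only once.

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once.
    Returns (unique block store, list of block hashes to rebuild the data)."""
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep only the first copy of each block
        recipe.append(digest)
    return store, recipe

def rehydrate(store, recipe) -> bytes:
    """Reassemble the original data from the block store and the recipe."""
    return b"".join(store[d] for d in recipe)

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # 4 logical blocks, 2 unique
store, recipe = dedupe(data)
print(len(recipe), len(store))  # 4 logical blocks, 2 unique blocks stored
assert rehydrate(store, recipe) == data
```

The same idea, applied at scale with much more engineering, is what lets storage-efficiency features shrink the footprint of repetitive enterprise data.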
DSS Virtualized servers example
Case for Security: Providing Secure, Global Access to Corporate Data
NetApp users in offices around the globe need secure access to corporate business data for quarterly reporting and analytics. The centralized data warehouse had become inadequate: report demand is bursty in nature, which often results in access bottlenecks.
To provide globally distributed users with secure access to corporate data, NetApp IT deploys Cloud ONTAP in multiple AWS regions. SnapMirror replicates data from the centralized data warehouse to the various Cloud ONTAP endpoints. Users access reports locally in their region through a business analytics web portal. Once report demand has been met, the Cloud ONTAP instances are torn down.
NetApp follows a strict data governance policy, and hosting business data on public cloud storage met requirements due to Cloud ONTAP encryption capabilities. IT controls and manages the encryption keys for the data residing on cloud storage. The data is secured at rest and permanently inaccessible when the cloud resources are torn down.
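The last point is sometimes called crypto-shredding: because IT holds the encryption keys, destroying the key renders ciphertext left on cloud media permanently unreadable. A toy sketch of the idea follows; a SHA-256 counter-mode keystream stands in for a real cipher such as AES, so do not use this for production encryption.

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Symmetric, so the same call both encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

key = b"held-on-premises-key-manager"          # never leaves IT's control
plaintext = b"quarterly revenue report"
ciphertext = keystream_xor(key, plaintext)     # what the cloud actually stores
assert keystream_xor(key, ciphertext) == plaintext       # key present: readable
# Destroy the key and the cloud-resident ciphertext stays opaque:
assert keystream_xor(b"wrong-key", ciphertext) != plaintext
```

Tearing down the cloud resources while retaining key control is what makes the data "permanently inaccessible" in the scenario above.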
HR and recruitment provider Randstad has chosen NetApp to revamp its data storage infrastructure and ensure the reliability of its core business applications.
Randstad’s legacy storage platform had neared capacity and risked undermining operations, with core business applications occasionally freezing, hindering front-line operations and business continuity.
Randstad chose to revamp its data infrastructure through Storm Technologies with the NetApp All-Flash FAS8040 and NetApp Cloud ONTAP for AWS solutions. The FAS8040 supports the highly variable I/O workloads and read cycles typical of a virtualized desktop environment.
Cloud ONTAP for AWS allows Randstad to manage its on-premises data in the cloud. It can meet varied capacity requirements and performance demands, enabling the company to develop a disaster recovery platform for AWS cloud.
“The technology provided by NetApp was clearly top drawer. The all-flash array delivered exceptionally high performance. Moving to the cloud is a big step, even though we were well placed to do this, but NetApp Cloud ONTAP with AWS makes this simple, and the level of control it enables has made it easy to replicate our disaster recovery environment in the cloud.”
Source: http://www.channelbiz.co.uk/2016/02/23/randstad-moves-into-the-aws-cloud-with-netapp/
To make it more convenient and to follow the idea of pay-as-you-go consumption: NPSaaS
Purpose of this slide: Describe how AltaVault is used to back up and restore any storage array.
Key points: An AltaVault appliance can back up any storage array to any cloud. AltaVault supports a wide variety of backup software. After the data has been backed up into the Data Fabric, it can be restored to any storage array within the fabric.
Script:
The Data Fabric is a great way to meet backup requirements.
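The any-to-any backup flow described above can be sketched as a toy model (hypothetical names, not the AltaVault API): once data has been backed up from any array into the fabric, it can be restored to any other array in the fabric.

```python
# Toy model of any-to-any backup and restore through a shared data fabric.
# All names here are illustrative, not real AltaVault interfaces.

class DataFabric:
    def __init__(self):
        self.cloud_store = {}  # backup_id -> (payload, source array name)

    def backup(self, backup_id: str, array_name: str, payload: bytes):
        """Back up a payload from any source array into the fabric."""
        self.cloud_store[backup_id] = (payload, array_name)

    def restore(self, backup_id: str, target_array: str) -> dict:
        """Restore a backup to any array in the fabric, not just its source."""
        payload, source = self.cloud_store[backup_id]
        return {"target": target_array, "from": source, "data": payload}

fabric = DataFabric()
fabric.backup("b1", "FAS-onprem", b"db dump")
result = fabric.restore("b1", "CloudONTAP-eu-west")  # restore to a different array
print(result["data"])  # the original payload, now on the new target
```

The point of the sketch is the decoupling: the restore target is chosen at restore time, independently of where the backup originated.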
NetApp’s vision of a Data Fabric provides organizations with the choice and control to ensure that their data is stored in the right place, at the right cost, and kept compliant:
Component Solutions:
NetApp Private Storage – store data next to the cloud rather than in it, for total data control
Cloud ONTAP – a common format that allows data to be quickly moved back to the data centre if needed
AltaVault – encrypts data into multiple clouds, with the ability to move data between clouds to remain compliant
StorageGRID Webscale – a rules engine for multi-national environments; data can be tracked
OpenStack – NetApp’s close integration with OpenStack helps you build your own private cloud environment