Watch full webinar here: https://bit.ly/3mPLIlo
A shift to the cloud is a common element of any current data strategy. However, a successful transition to the cloud is not easy and can take years. It comes with security challenges, changes in downstream and upstream applications, and new ways to operate and deploy software. An abstraction layer that decouples data access from storage and processing can be a key element to enable a smooth journey to the cloud.
Attend this webinar to learn more about:
- How to use Data Virtualization to gradually change data systems without impacting business operations
- How Denodo integrates with the larger cloud ecosystems to enable security
- How simple it is to create and manage a Denodo cloud deployment
A Successful Journey to the Cloud with Data Virtualization
1. DATA VIRTUALIZATION PACKED LUNCH
WEBINAR SERIES
Sessions Covering Key Data Integration Challenges
Solved with Data Virtualization
2. A Successful Journey to the Cloud with Data
Virtualization
Pablo Alvarez-Yanez
Director of Product Management, Denodo
3. Agenda
1. A Transition to a Cloud-first Data Strategy
2. The Role of Data Virtualization in the Journey to the Cloud
3. Demo: Denodo in the Cloud
4. Customer Stories
5. Q&A
4. 4
Motivations for the Transition to Cloud
New Capabilities and Reduced Cost
Lower cost of operations (automation)
No up-front HW investment
Flexibility to upgrade capacity
Less dependency on desktop software
New challenges
Security
Network latency
Existing challenges in data management still
apply!
5. 5
Approaches to Cloud Transition
1. Rehost
1. Move data as is to the same system, but hosted in a cloud provider
2. For example, from on-prem Oracle to Oracle RDS in AWS
2. Replatform
1. Move data as is to a new cloud-native system
2. For example, from an on-prem Teradata EDW to Snowflake
3. Refactor
1. Move to a new cloud-native system, but taking the opportunity to also change schemas,
data structure, ingestion and reporting tools, etc.
2. Changes not just in the systems, but in the data strategy itself
7. 7
The Data Strategy as part of the Transition to the Cloud
• A complete change in architecture is a complex process,
that modifies multiple elements in the data ecosystem
• However, it guarantees that the data strategy evolves and
follows new trends
• It’s not just a change in RDBMS vendor
• It addresses existing challenges and limitations of the existing
data strategy
• A change of this caliber implies longer projects with
intermediate stages
• It can last years
• It involves intermediate (or permanent) hybrid states, where
cloud and on-prem systems coexist
8. 8
What Pieces are Involved in a Strategy Change?
• Adoption of cloud-based SW solutions
• AWS, Azure, Google offer cloud alternatives for most common software
• Some companies have developed specialized cloud-based solutions, like Snowflake, Databricks, Looker
• Traditional on-prem software has been adapted to the needs and requirements of cloud deployments, like
Tableau and Denodo
• Adoption of new approaches to data management (e.g. Data lakes, streaming) that adapt to new
trends and requirements (predictive analytics, machine learning, etc.)
• Migrate to SaaS options for packaged applications
• For example, migrating from an on-prem CRM to Salesforce.com, marketing tools, etc.
• Extended use of web APIs for application-to-application communication
• New authentication and authorization systems based on Identity Providers (SAML, OAuth, OpenID,
etc.)
• And many more
9. 9
Stages in Cloud Adoption
On-prem
Transition
to Cloud
Hybrid
100%
Cloud
Multi-
Cloud
10. 10
What if We Migrate all Data to a Single Cloud System?
Since we are re-architecting the transition to the cloud, couldn't we simply consolidate all data
in a single system, like a data lake? Is that possible? Is that realistic?
• Loss of capabilities: data lake capabilities may differ from those of original
sources
• E.g. quick access by ID in operational RDBMS
• Huge up-front investment: creating ingestion pipelines for all company
datasets into the lake is costly
• Questionable ROI as a lot of that data may never be used
• Large recurrent maintenance costs: those pipelines need to be constantly
modified as data structures change in the sources
• Loss of real-time access: with SaaS adoption, data can be checked in real
time via API calls. Sometimes it’s useful to replicate some key information,
but is that the case for live metrics?
• Risk of inconsistencies:
• Data needs to be frequently synchronized to avoid stale datasets
• Raw data needs to be curated in iterations, creating additional replicas
• Complexity to navigate the lake can lead to “data swamps”
11. The Role of Data Virtualization in the
Journey to the Cloud
14. 14
The Value of a Data Delivery layer
• For Business Users
• Simplicity: users don’t need to navigate the complexity of the architecture. Where is the data
(on-prem, cloud, multi-cloud)? How to access it? Which location has priority?
• Agility: all data is securely delivered from a single (virtual) system
• Accessibility: data is accessible in a variety of formats (SQL, REST, OData, GraphQL) and in a
web-based Data Catalog, regardless of original format and location
• For IT
• Abstraction: decouples storage and processing engines from the delivery of data
• Flexibility: allows IT to change technologies and move data without service interruptions
• Security: centralized governance and security controls for all data assets
15. 15
As a Global Strategy
For the first time, a technology allows you to
define and implement a data delivery strategy
• Independent from the sources where you
store and process your data
• Independent from the consuming
applications
• Independent from the location of the
deployment
• Can enforce security and access policies
• Provides strong governance management
16. 16
Use Case: Virtualizing to Accelerate Data Integration
DV becomes the common access layer for both on-prem
and cloud systems:
• Access to all data from a single system
• Data can be accessed straight from the original
systems, without the need for an additional copy
• Data can be easily replicated and cached if
necessary
• Simplifies the combination of data, regardless of
original format and location
• Enables the definition of semantic models,
independent from original formats and structures
• Adds advanced security settings to all data
• Documentation and usage statistics included in the
Denodo Data Catalog
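The federation idea behind this use case can be shown with a toy sketch. This is not Denodo's API; all class names and sample rows are hypothetical stand-ins. A "virtual view" unions rows from an on-prem source and a cloud source at query time, so no extra copy of either dataset is needed.

```python
# Toy sketch of query-time federation; not Denodo's implementation.
# All class names and sample rows are hypothetical.

class OnPremOrders:
    """Stand-in for a table in an on-prem RDBMS."""
    def rows(self):
        return [{"id": 1, "region": "EU", "amount": 120}]

class CloudOrders:
    """Stand-in for a table in a cloud warehouse."""
    def rows(self):
        return [{"id": 2, "region": "US", "amount": 300}]

class VirtualView:
    """Combines several sources on demand - no replication."""
    def __init__(self, *sources):
        self.sources = sources

    def query(self, predicate=lambda row: True):
        # Pull rows straight from each original system at query time
        return [row for src in self.sources
                for row in src.rows() if predicate(row)]

orders = VirtualView(OnPremOrders(), CloudOrders())
assert len(orders.query()) == 2                    # data from both locations
assert orders.query(lambda r: r["amount"] > 200)[0]["region"] == "US"
```

A real engine would push predicates down to each source and add caching; the sketch only illustrates the single-access-layer idea.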
17. 17
Use Case: Virtualizing to Simplify Migrations
• Migrations of key systems are complex
• Normally involve multiple phases
• Can last months or years
• Data Virtualization, thanks to the
decoupling and abstraction capabilities,
simplifies the process:
• Shields the consumers from changes in
the backend
• Allows IT to move data from one system
to the other without changes in the
consuming applications
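The decoupling described above can be sketched minimally. This is not Denodo's API; the names `Source` and `DataDeliveryLayer` are hypothetical. Consumers always call the same virtual endpoint, and IT re-points it from the legacy system to the new cloud system mid-migration.

```python
# Toy sketch of backend swapping behind a stable delivery layer;
# not Denodo's implementation. All names are hypothetical.

class Source:
    def __init__(self, name, data):
        self.name, self.data = name, data
    def fetch(self):
        return self.data

class DataDeliveryLayer:
    def __init__(self, backend):
        self._backend = backend
    def swap_backend(self, backend):
        # IT-side operation: invisible to consuming applications
        self._backend = backend
    def customers(self):
        # The stable contract that consumers code against
        return self._backend.fetch()

legacy = Source("on-prem EDW", ["alice", "bob"])
layer = DataDeliveryLayer(legacy)
before = layer.customers()

# Migration step: data is copied to the cloud, then the layer is re-pointed
cloud = Source("cloud warehouse", ["alice", "bob"])
layer.swap_backend(cloud)
after = layer.customers()

assert before == after == ["alice", "bob"]  # consumers never noticed
```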
18. 18
Use Case: Virtualizing to Accelerate and Reduce Cost
• Many data sources charge based on usage or data volumes. For example:
• Snowflake charges by “compute credits”
• Athena by bytes scanned
• Smaller summaries mean less data processed and less CPU time
• Summarized queries are not just faster, but also cheaper
• Until now, these technologies were only available in some reporting tools
(BO, Microstrategy) and on-prem EDWs (Oracle, Teradata)
• Most cloud-based RDBMSs do not include these capabilities
• Denodo supports aggregate-aware query acceleration regardless of consuming
tools and source capabilities
• For more details:
• https://www.denodo.com/en/webinar/accelerate-your-queries-data-
virtualization
[Diagram: SALES fact table (10 billion rows) accelerated via a sales summary table (1 million rows)]
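Aggregate-aware acceleration can be illustrated at tiny scale, mirroring the slide's SALES example. The matching logic here is a deliberate simplification and not Denodo's implementation; all names and sizes are stand-ins.

```python
# Toy sketch of aggregate-aware query acceleration; not Denodo's
# implementation. Data and names are hypothetical stand-ins.

# Stand-in for the 10-billion-row SALES fact table: (region, amount)
sales = [("EU", 10), ("EU", 20), ("US", 5)]

# Pre-computed summary, stand-in for the 1-million-row summary table
sales_summary = {"EU": 30, "US": 5}

def total_by_region(region, use_summary=True):
    """Answer an aggregate query, preferring the small summary table."""
    if use_summary and region in sales_summary:
        return sales_summary[region]  # one lookup: faster and cheaper
    # Fallback: full scan of the fact table
    return sum(amount for r, amount in sales if r == region)

# Both plans agree; the summary plan just scans far less data,
# which on pay-per-scan engines also means a smaller bill
assert total_by_region("EU") == total_by_region("EU", use_summary=False) == 30
```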
19. 19
Capabilities: Is Denodo ready for the Cloud?
Denodo 8 has been re-designed as a cloud-native platform!
• Native Deployment Options
• Automated, web-based management of clusters, instances,
autoscaling, load balancing, etc.
• Servers run on customer account, for security and latency reasons
• Available for AWS, coming for Azure
• Tight integration with cloud ecosystems:
• Snowflake, Redshift, BigQuery, Synapse, Athena, and many others
• SSO and popular IdPs like AWS, Azure, Okta, Ping, Duo, etc.
• Web-based client for development, monitoring, management,
etc.
• Available in cloud marketplaces for AWS, Azure and GCP with
attractive pay-as-you-go options