Cloud ecosystems have the power to transform a business by delivering quick insights at a low cost. But when you must connect legacy systems like mainframe and IBM i to the cloud, your project can become expensive, time-consuming, and reliant on highly specialized skillsets. So much for low cost and efficiency!
During this session, gain a deeper understanding of the value that data from mainframe and IBM i systems can bring to the Microsoft Azure ecosystem.
Transform Legacy Data Stores with Microsoft Azure and Precisely
1. Transform Legacy Data Stores:
Microsoft and Precisely
Precisely Connect and Microsoft Azure
Partnership Overview
Ashwin Ramachandran | Senior Product Manager
2. Housekeeping
Webinar Audio
• Today's webcast audio is streamed through your computer speakers
• If you need technical assistance with the web interface or audio, please reach out to us using the Q&A box
Questions Welcome
• Submit your questions at any time during the presentation using the Q&A box
Recording and Slides
• This webinar is being recorded. You will receive an email following the webinar with a link to the recording and slides
3. Challenges of legacy data in cloud ecosystems
• Complexity
• Governance
• Bottlenecks
• Scaling
Precisely: Transform Legacy Data Stores: Microsoft and Precisely
4. Data complexity: all your data to the cloud
• How fast can you unlock legacy data for use in cloud environments?
• Are the solutions you are using today slowing down or improving processing?
• Will open source frameworks help address the complexities of legacy data?
• Do you have legacy data expertise in-house, or will you have to seek external resources to access data?
5. Data governance must cover the entire data lifecycle
• Consider the full data lifecycle
• Have a metadata strategy in place
• Clear understanding of data movement
• Data replication and data lineage are critical foundations
6. Data bottlenecks: prevent a single point of failure
Checklist: does your solution (yes/no)…
✔ Provide native connectivity to the target cloud environment?
✔ Enable a design once, deploy anywhere approach?
✔ Have no impact on performance?
✔ Guarantee data delivery?
7. Data scaling for real-time business needs
• Consider that real-time data delivery may break current data pipelines
• Select solutions that have reliable transfer of information
• Keep in mind that streaming platforms such as Kafka on Azure could enable real-time delivery
• Assess how your overall cloud strategy supports real-time data delivery
9. Connect unlocks data for Microsoft Azure
• Expertise in legacy data
• No installation of software on mainframe
• Use existing metadata
• Quickly understand and access legacy data
10. Precisely Connect and Microsoft Azure ecosystem
[Architecture diagram: mainframe, business/custom apps (structured), and logs, files, and media (unstructured) feed an Ingest → Store → Prep and Train → Model and Serve pipeline: data lands in Azure Data Lake Storage, is prepared and trained in Azure Databricks (Python, Scala, Spark SQL, SparkR, Spark ML, SparklyR), loaded via PolyBase into Azure Synapse Analytics, and served through Azure Analysis Services for data lake analytics.]
Editor's notes
Data complexity: Legacy data is not readily compatible with cloud ecosystems
Data governance: Copybooks no longer match data, metadata unreadable, record descriptors and data copies lost
Data bottlenecks: Loading large data sets to the cloud can drag down performance and increase time to insight
Data scaling: Data delivered on a delay causes a loss in efficiency and an inability to deliver at the speed of the business
Legacy data can provide a treasure-trove of information that can transform your business when used within cloud ecosystems
Consider the speed at which you are able to unlock legacy data for use in your cloud environments
Understand how the solutions you are using today handle variable length records – are they slowing down processing?
Determine if open-source frameworks will work for your business – mainframe data is not readily compatible with open system data formats in the cloud
Do you have legacy data expertise in-house or will you have to seek external resources to access data?
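One concrete source of the incompatibility mentioned above: mainframe datasets commonly use IBM variable-length (V/VB) record formats, where each record is prefixed by a 4-byte Record Descriptor Word (RDW) rather than delimited by newlines, so generic cloud tooling cannot split records without format-aware parsing. A minimal sketch of an RDW splitter (illustrative only; real datasets also involve block descriptors, EBCDIC payloads, and copybook layouts):

```python
import struct

def read_rdw_records(data: bytes):
    """Split a byte stream of IBM variable-length (V/VB) records.

    Each record begins with a 4-byte Record Descriptor Word (RDW):
    a 2-byte big-endian length that includes the RDW itself,
    followed by 2 reserved bytes.
    """
    records = []
    pos = 0
    while pos < len(data):
        (length,) = struct.unpack(">H", data[pos:pos + 2])
        if length < 4 or pos + length > len(data):
            raise ValueError(f"corrupt RDW at offset {pos}")
        records.append(data[pos + 4:pos + length])  # payload excludes the RDW
        pos += length
    return records

# Two records: payloads b"HELLO" (RDW length 9) and b"OK" (RDW length 6)
stream = b"\x00\x09\x00\x00HELLO" + b"\x00\x06\x00\x00OK"
print(read_rdw_records(stream))  # [b'HELLO', b'OK']
```

A newline-oriented open-source ingestion framework would misread this stream entirely, which is the complexity the questions above are probing.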
Moving to the cloud means you must guarantee that high data quality exists throughout the complete lifecycle of the data
Change data capture and data lineage are foundational elements of a data governance strategy for legacy data
Change Data Capture – what changed in the data
Data Lineage – where the data came from
Understand that when moving data to the cloud, most tools will unpack, expand, and convert data to readable formats
Determine how you can use existing metadata including copybooks to meet tightening SLAs
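To make "use existing metadata including copybooks" concrete: a COBOL copybook is the metadata that maps raw record bytes to named fields. The toy parser below (hypothetical field names; handles only elementary `PIC X(n)`/`PIC 9(n)` DISPLAY items, ignoring REDEFINES, OCCURS, and COMP-3, which real tooling must support) shows how copybook metadata yields field offsets and widths:

```python
import re

# Hypothetical minimal copybook: elementary items, DISPLAY usage only
COPYBOOK = """
05 CUST-ID     PIC 9(6).
05 CUST-NAME   PIC X(20).
05 BALANCE     PIC 9(7).
"""

def parse_copybook(text):
    """Extract (name, offset, width) for simple PIC X(n)/9(n) fields."""
    fields, offset = [], 0
    for m in re.finditer(r"\d+\s+(\S+)\s+PIC\s+[X9]\((\d+)\)\.", text):
        name, width = m.group(1), int(m.group(2))
        fields.append((name, offset, width))
        offset += width  # fixed-width DISPLAY fields are laid out back to back
    return fields

print(parse_copybook(COPYBOOK))
# [('CUST-ID', 0, 6), ('CUST-NAME', 6, 20), ('BALANCE', 26, 7)]
```

Because the copybook already encodes this layout, reusing it avoids hand-maintaining schemas in the cloud and keeps lineage traceable back to the source record definition.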
A single point of failure or constrained throughput undermines the performance and reliability of the data being connected to the cloud
Determine if there is native connectivity to the cloud environment you want to move mainframe data to
Calculate how much time and effort it will take to load database resources into cloud environments
How will mainframe data ingestion affect performance overall?
Consider where your single point of failure exists
Tracking and detection of data changes needs to be surfaced as close to real time as possible
Consider that faster data delivery may break current data pipeline structures
Look for solutions that insulate your organization against the underlying complexities of your technology stack
Select solutions that guarantee data delivery and have reliable transfer of information
Look at how streaming platforms such as Kafka on Azure could be leveraged to feed real-time delivery
Assess how your overall cloud strategy can support real-time data delivery
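To illustrate the kind of payload a streaming platform such as Kafka on Azure would carry: change data capture emits per-record insert/update/delete events rather than full reloads. Real CDC tools read database logs or journals; the snapshot diff below is only a toy stand-in to show the event shapes (keys, operations, and row data are invented for illustration):

```python
import json

def change_events(before: dict, after: dict):
    """Diff two keyed snapshots into insert/update/delete events.

    Illustrative only: production CDC reads logs/journals instead of
    diffing snapshots, but the resulting event stream looks similar.
    """
    events = []
    for key, row in after.items():
        if key not in before:
            events.append({"op": "insert", "key": key, "row": row})
        elif before[key] != row:
            events.append({"op": "update", "key": key, "row": row})
    for key in before:
        if key not in after:
            events.append({"op": "delete", "key": key})
    # Serialized JSON strings are ready to publish as Kafka message values
    return [json.dumps(e, sort_keys=True) for e in events]

before = {"1001": {"balance": 250}, "1002": {"balance": 90}}
after  = {"1001": {"balance": 300}, "1003": {"balance": 10}}
for msg in change_events(before, after):
    print(msg)
```

Delivering only these small events, instead of repeatedly bulk-loading full datasets, is what lets real-time pipelines keep pace with the business without breaking under load.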
Precisely Connect offers:
50+ years of mainframe expertise
No installation of software on the mainframe needed to restructure mainframe and IBM i data into readable formats for use with Azure services
VSAM connections via: FTP, FTPS, Connect:Direct
Db2/z connections via: ODBC or JDBC
Leverage existing metadata including copybooks to meet tightening SLAs
Convert packed and zoned decimals to readable formats
On ingestion, handle COBOL high/low values
Quickly and easily convert EBCDIC to Unicode
Handle REDEFINEs in COBOL copybooks
Ingest OCCURS DEPENDING ON variable length arrays
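As a sketch of what "convert packed decimals" and "convert EBCDIC to Unicode" involve under the hood (not Precisely's implementation): COMP-3 packs two BCD digits per byte with a trailing sign nibble, and Python's built-in `cp037` codec covers the US EBCDIC code page (real data may use other code pages, and production decoders drive this from copybook metadata):

```python
def unpack_comp3(raw: bytes, scale: int = 0):
    """Decode a COBOL COMP-3 (packed decimal) field.

    Each byte holds two BCD digits; the final nibble is the sign
    (0xD negative, 0xC/0xF positive). `scale` is the implied number
    of decimal places from the PIC clause.
    """
    digits = []
    for b in raw:
        digits.append(b >> 4)     # high nibble
        digits.append(b & 0x0F)   # low nibble
    sign_nibble = digits.pop()    # last nibble carries the sign
    value = 0
    for d in digits:
        value = value * 10 + d
    if sign_nibble == 0x0D:
        value = -value
    return value / (10 ** scale) if scale else value

# PIC S9(5) COMP-3 value +12345 is stored as 0x12 0x34 0x5C
print(unpack_comp3(bytes([0x12, 0x34, 0x5C])))   # 12345
print(unpack_comp3(bytes([0x12, 0x3D])))         # -123

# EBCDIC to Unicode via the standard cp037 (US EBCDIC) codec
print(b"\xc8\xc5\xd3\xd3\xd6".decode("cp037"))   # HELLO
```

A bulk loader that skips this conversion step ships unreadable binary into the lake, which is why these capabilities sit at the ingestion boundary.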