Wonder what this data mesh stuff is all about? What are the principles of data mesh? Can you or should you consider data mesh as the approach for your analytics platform? And most important - how can Snowflake help?
Given in Montreal on 14-Dec-2021
Data Lakehouse, Data Mesh, and Data Fabric (r2) - James Serra
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean, and how do they compare to a modern data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. They all may sound great in theory, but I'll dig into the concerns you need to be aware of before taking the plunge. I’ll also include use cases so you can see what approach will work best for your big data needs. And I'll discuss Microsoft's version of the data mesh.
Overcoming the Challenges of your Master Data Management Journey - Jean-Michel Franco
This presentation runs you through all the key steps of an MDM initiative. It considers and showcases the key milestones and building blocks that you will have to roll out to make your MDM journey a success.
-> Please contact Talend for a dedicated interactive session with a storyboard by customer domain.
Embarking on building a modern data warehouse in the cloud can be an overwhelming experience due to the sheer number of products that can be used, especially when the use cases for many products overlap with others. In this talk I will cover the use cases of many of the Microsoft products that you can use when building a modern data warehouse, broken down into four areas: ingest, store, prep, and model & serve. It’s a complicated story that I will try to simplify, giving blunt opinions of when to use what products and the pros/cons of each.
Data governance: the potential of DATA at the service of your business!
Data governance is becoming a strategic priority for many companies. But what, concretely, are the key success factors, the impacts of its implementation, and the value creation to be expected?
Most companies have already largely begun their digital transformation. Like all major shifts, this transformation was at first underestimated, for lack of a concrete valuation of its benefits. The same situation seems to be repeating itself today with data governance.
Yet at a time when Big Data projects are being industrialized and data is being stored in every direction, it becomes essential to question the real value created by collecting, using, and governing all this data.
So beyond this promise, what do we mean by data governance? Which data should be collected? Which tools should be used? What are the concrete benefits? What are the obstacles to its implementation? Which dimensions should be prioritized? What methodology should be adopted? What is the role of the Chief Data Officer?
Data Architecture Best Practices for Advanced Analytics - DATAVERSITY
Many organizations are immature when it comes to data and analytics use. The answer lies in delivering a greater level of insight from data, straight to the point of need.
There are so many Data Architecture best practices today, accumulated from years of practice. In this webinar, William will look at some Data Architecture best practices that he believes have emerged in the past two years and are not yet worked into many enterprise data programs. These are keepers that you will need to move toward, by one means or another, so it’s best to mindfully work them into your environment.
Promo coupon at €9.99: https://www.udemy.com/rgpd-les-fondamentaux/?couponCode=SLIDERGPD2018
Broader in scope and application than the current data protection law, the EU GDPR extends individuals' rights over their data and requires organizations to draw up clear policies and procedures to protect personal data and to adopt appropriate technical and organizational measures. European organizations have until May 2018 to comply with the new law, under penalty of fines of up to 4% of annual turnover or €20 million, whichever is higher.
Delivered by an experienced data protection practitioner, this training draws on our extensive hands-on experience in complying with data protection laws and information security standards such as ISO 27001.
Contents of the full course:
- 3.5 hours of video
- Exercises and open discussions
Coupon: https://www.udemy.com/rgpd-les-fondamentaux/?couponCode=SLIDERGPD2018
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming, and it can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
This is Part 4 of the GoldenGate series on Data Mesh - a series of webinars helping customers understand how to move off of old-fashioned monolithic data integration architecture and get ready for more agile, cost-effective, event-driven solutions. The Data Mesh is a kind of Data Fabric that emphasizes business-led data products running on event-driven streaming architectures, serverless, and microservices based platforms. These emerging solutions are essential for enterprises that run data-driven services on multi-cloud, multi-vendor ecosystems.
Join this session to get a fresh look at Data Mesh; we'll start with core architecture principles (vendor agnostic) and transition into detailed examples of how Oracle's GoldenGate platform is providing capabilities today. We will discuss essential technical characteristics of a Data Mesh solution, and the benefits that business owners can expect by moving IT in this direction. For more background on Data Mesh, Part 1, 2, and 3 are on the GoldenGate YouTube channel: https://www.youtube.com/playlist?list=PLbqmhpwYrlZJ-583p3KQGDAd6038i1ywe
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Mr. Pollock is an expert technology leader for data platforms, big data, data integration and governance. Jeff has been CTO at California startups and a senior exec at Fortune 100 tech vendors. He is currently Oracle VP of Products and Cloud Services for Data Replication, Streaming Data and Database Migrations. While at IBM, he was head of all Information Integration, Replication and Governance products, and previously Jeff was an independent architect for US Defense Department, VP of Technology at Cerebra and CTO of Modulant – he has been engineering artificial intelligence based data platforms since 2001. As a business consultant, Mr. Pollock was a Head Architect at Ernst & Young’s Center for Technology Enablement. Jeff is also the author of “Semantic Web for Dummies” and "Adaptive Information,” a frequent keynote at industry conferences, author for books and industry journals, formerly a contributing member of W3C and OASIS, and an engineering instructor with UC Berkeley’s Extension for object-oriented systems, software development process and enterprise architecture.
Enabling a Data Mesh Architecture with Data Virtualization - Denodo
Watch full webinar here: https://bit.ly/3rwWhyv
The Data Mesh architectural design was first proposed in 2019 by Zhamak Dehghani, principal technology consultant at Thoughtworks, a technology company that is closely associated with the development of distributed agile methodology. A data mesh is a distributed, decentralized data infrastructure in which multiple autonomous domains manage and expose their own data, called “data products,” to the rest of the organization.
Organizations adopt a data mesh architecture when they experience shortcomings in highly centralized architectures, such as the lack of domain-specific expertise in data teams, the inflexibility of centralized data repositories in meeting the specific needs of different departments within large organizations, and the slowness of centralized data infrastructures in provisioning data and responding to changes.
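The decentralized ownership model described above can be sketched in a few lines of code. This is a hypothetical illustration (the class and field names are invented, not the API of any data mesh product): each autonomous domain registers the data products it owns, while the shared catalog holds only discovery metadata.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DataProduct:
    """A dataset exposed by an autonomous domain, with explicit ownership."""
    name: str
    domain: str       # the owning domain, e.g. "orders" or "logistics"
    owner_email: str  # who is accountable for quality and availability
    schema: tuple     # published column names consumers can rely on

@dataclass
class MeshCatalog:
    """The central piece stays thin: discovery metadata only, no data storage."""
    products: dict = field(default_factory=dict)

    def register(self, product: DataProduct) -> None:
        self.products[f"{product.domain}.{product.name}"] = product

    def find(self, qualified_name: str) -> DataProduct:
        return self.products[qualified_name]

# A domain team publishes its product; a consumer discovers it by name.
catalog = MeshCatalog()
catalog.register(DataProduct("daily_orders", "orders",
                             "orders-team@example.com",
                             ("order_id", "customer_id", "total")))
product = catalog.find("orders.daily_orders")
```

The point of the sketch is the division of responsibility: the domain owns the data and its schema, while the mesh-level catalog owns only the index of what exists and who is accountable for it.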
In this session, Pablo Alvarez, Global Director of Product Management at Denodo, explains how data virtualization is your best bet for implementing an effective data mesh architecture.
You will learn:
- How data mesh architecture not only enables better performance and agility, but also self-service data access
- The requirements for “data products” in the data mesh world, and how data virtualization supports them
- How data virtualization enables domains in a data mesh to be truly autonomous
- Why a data lake is not automatically a data mesh
- How to implement a simple, functional data mesh architecture using data virtualization
Uncover how your business can save money and find new revenue streams.
Driving profitability is a top priority for companies globally, especially in uncertain economic times. It's imperative that companies reimagine growth strategies and improve process efficiencies to help cut costs and drive revenue – but how?
By leveraging data-driven strategies layered with artificial intelligence, companies can achieve untapped potential and help their businesses save money and drive profitability.
In this webinar, you'll learn:
- How your company can leverage data and AI to reduce spending and costs
- Ways you can monetize data and AI and uncover new growth strategies
- How different companies have implemented these strategies to achieve cost optimization benefits
Improving Data Literacy Around Data Architecture - DATAVERSITY
Data Literacy is an increasing concern, as organizations look to become more data-driven. As the rise of the citizen data scientist and self-service data analytics becomes increasingly common, the need for business users to understand core Data Management fundamentals is more important than ever. At the same time, technical roles need a strong foundation in Data Architecture principles and best practices. Join this webinar to understand the key components of Data Literacy, and practical ways to implement a Data Literacy program in your organization.
Activate Data Governance Using the Data Catalog - DATAVERSITY
Data Governance programs depend on the activation of data stewards who are held formally accountable for how they manage data. The data catalog is a critical tool that enables your stewards to contribute to and interact with an inventory of metadata about data definition, production, and usage. This interaction is active Data Governance in the truest sense of the word.
In this RWDG webinar, Bob Seiner will share tips and techniques focused on activating your data stewards through a data catalog. Data Governance programs that involve stewards in daily activities are more likely to demonstrate value from their data-intensive investments.
Bob will address the following in this webinar:
- A comparison of active and passive Data Governance
- What it means to have an active Data Governance program
- How a data catalog tool can be used to activate data stewards
- The role a data catalog plays in Data Governance
- The metadata in the data catalog will not govern itself
Slides: Taking an Active Approach to Data Governance - DATAVERSITY
A Look at How Riot Games Implemented Non-Invasive Data Governance
Riot Games created and runs “League of Legends,” the world’s most-played PC game and most viewed eSport — and is now transforming to become a multi-title publisher. To keep pace with this transformation and support a growing player base of millions, Riot Games is taking a page from Bob Seiner’s book, “Non-Invasive Data Governance: The Path of Least Resistance and Greatest Success” and leveraging the Alation Data Catalog to help guide accurate, well-governed analysis.
Bob Seiner will join Riot Games’ Chris Kudelka, Technical Product Manager, and Michael Leslie, Senior Data Governance Architect, and Alation’s John Wills, VP of Professional Service, for an inside look at Data Governance at one of the world’s leading gaming companies.
Join this webinar to learn:
• How Riot Games is implementing Non-Invasive Data Governance
• How this new approach to Data Governance helps to drive the business
• How the Alation Data Catalog helps Riot Games create the foundation for guiding accurate, well-governed data use
Data Mesh in Practice - How Europe's Leading Online Platform for Fashion Goes... - Dr. Arif Wider
A talk presented by Max Schultze from Zalando and Arif Wider from ThoughtWorks at NDC Oslo 2020.
Abstract:
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
At Zalando - Europe’s biggest online fashion retailer - we realised that accessibility and availability at scale can only be guaranteed when moving more responsibilities to those who pick up the data and have the respective domain knowledge - the data owners - while keeping only data governance and metadata information central. Such a decentralized and domain-focused approach has recently been coined a Data Mesh.
The Data Mesh paradigm promotes the concept of Data Products which go beyond sharing of files and towards guarantees of quality and acknowledgement of data ownership.
This talk will take you on a journey of how we went from a centralized Data Lake to embrace a distributed Data Mesh architecture and will outline the ongoing efforts to make creation of data products as simple as applying a template.
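The "guarantees of quality" that distinguish a data product from plain file sharing can be made concrete with a check run before publication. The following is a hedged sketch, not Zalando's actual tooling; the threshold and column names are invented for illustration:

```python
def completeness(rows, column):
    """Fraction of rows with a non-null value in `column`."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

def publishable(rows, required_columns, threshold=0.99):
    """A data product ships only if every required column meets the bar."""
    return all(completeness(rows, c) >= threshold for c in required_columns)

rows = [{"order_id": 1, "total": 9.5},
        {"order_id": 2, "total": None},   # a gap the owner must explain or fix
        {"order_id": 3, "total": 4.0}]

ok_ids = publishable(rows, ["order_id"])            # order_id fully populated
ok_all = publishable(rows, ["order_id", "total"])   # total misses the bar
```

A template for creating data products, as the talk describes, would bundle checks like this with the schema and ownership metadata so every new product inherits the same guarantees by default.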
Every business today wants to leverage data to drive strategic initiatives with machine learning, data science and analytics — but runs into challenges from siloed teams, proprietary technologies and unreliable data.
That’s why enterprises are turning to the lakehouse because it offers a single platform to unify all your data, analytics and AI workloads.
Join our How to Build a Lakehouse technical training, where we’ll explore how to use Apache Spark™, Delta Lake, and other open source technologies to build a better lakehouse. This virtual session will include concepts, architectures and demos.
Here’s what you’ll learn in this 2-hour session:
- How Delta Lake combines the best of data warehouses and data lakes for improved data reliability, performance and security
- How to use Apache Spark and Delta Lake to perform ETL processing, manage late-arriving data, and repair corrupted data directly on your lakehouse
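Handling late-arriving data, mentioned above, is essentially an upsert keyed on a record identifier. Here is a plain-Python sketch of that merge semantics (in Delta Lake itself this would be a MERGE operation on a Delta table, not this hand-rolled function):

```python
def merge_upsert(target, updates, key):
    """Apply late-arriving records: update rows that already exist
    in the target, insert the rows that don't."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # the late-arriving record wins
    return list(merged.values())

target = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
late   = [{"id": 2, "amount": 25}, {"id": 3, "amount": 30}]

result = merge_upsert(target, late, "id")
```

After the merge, id 2 carries the corrected amount from the late batch and id 3 is newly inserted, while id 1 is untouched; that is the behavior a lakehouse MERGE gives you atomically at table scale.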
Building a Logical Data Fabric using Data Virtualization (ASEAN) - Denodo
Watch full webinar here: https://bit.ly/3FF1ubd
In the recent Building the Unified Data Warehouse and Data Lake report by leading industry analyst firm TDWI, 64% of organizations stated that the objective of unifying the Data Warehouse and Data Lake is to get more business value, and 84% of organizations polled felt that a unified approach to Data Warehouses and Data Lakes was either extremely or moderately important.
In this session, you will learn how your organization can apply a logical data fabric, and how the associated technologies of machine learning, artificial intelligence, and data virtualization can reduce time to value, thereby increasing the overall business value of your data assets.
KEY TAKEAWAYS:
- How a Logical Data Fabric is the right approach to assist organizations to unify their data.
- The advanced features of a Logical Data Fabric that assist with the democratization of data, providing an agile and governed approach to business analytics and data science.
- How a Logical Data Fabric with Data Virtualization enhances your legacy data integration landscape to simplify data access and encourage self-service.
Data-Ed Slides: Data-Centric Strategy & Roadmap - Supercharging Your Business - DATAVERSITY
In many organizations and functional areas, data has pulled even with money in terms of what makes the proverbial world go ‘round. As businesses struggle to cope with the 21st century’s newfound data flood, it is more important than ever before to prioritize data as an asset that directly supports business imperatives. However, while organizations across most industries make some attempt to address data opportunities (e.g. Big Data) and data challenges (e.g. data quality), the results of these efforts frequently fall far below expectations. At the root of many of these failures is poor organizational data management—which fortunately is a remediable problem.
This webinar will cover three lessons, each illustrated with examples, that will help you establish realistic goals and benchmarks for data management processes and communicate their value to both internal and external decision makers:
- How organizational thinking must change to include value-added data management practices
- The importance of walking before you run with data-focused initiatives
- Prioritizing specification and data governance over “silver bullet” analytical tools
Club Urba-EA - Vers un SI "data centric" (Towards a "data centric" IS) - Club Urba-EA
"Excerpt from the 2016 project report of Club Urba-EA, "Vers un SI data centric".
The transformation factors leading to business lines and information systems in which data is "at the center" are many and vary from one organization to another. Based on some fifteen experience reports, five factors are analyzed and key ideas drawn out...."
Data Architecture Best Practices for Advanced AnalyticsDATAVERSITY
Many organizations are immature when it comes to data and analytics use. The answer lies in delivering a greater level of insight from data, straight to the point of need.
There are so many Data Architecture best practices today, accumulated from years of practice. In this webinar, William will look at some Data Architecture best practices that he believes have emerged in the past two years and are not worked into many enterprise data programs yet. These are keepers and will be required to move towards, by one means or another, so it’s best to mindfully work them into the environment.
Coupon promo à 9.99€ : https://www.udemy.com/rgpd-les-fondamentaux/?couponCode=SLIDERGPD2018
D’une portée et d’une application plus étendues que l’actuelle loi sur la protection des données, le GDPR de l’UE étend les droits des individus en matière de données et exige des organisations qu’elles élaborent des politiques et des procédures claires pour protéger les données personnelles et qu’elles adoptent des mesures techniques et organisationnelles appropriées. Les organisations Européennes ont jusqu’en mai 2018 pour se conformer à la nouvelle loi, sous peine d’amendes pouvant aller jusqu’à 4 % du chiffre d’affaires annuel ou 20 millions d’euros – le montant le plus élevé étant retenu.
Dispensée par un praticien expérimenté dans le domaine de la protection des données, cette formation s’appuie sur notre vaste expérience pratique acquise en matière de conformité aux lois sur la protection des données et aux normes de sécurité de l’information, telles que la norme ISO 27001.
Contenu de la formation complète :
_3h30 de vidéos
_Des exercices et des discussions ouvertes
Coupon : https://www.udemy.com/rgpd-les-fondamentaux/?couponCode=SLIDERGPD2018
To take a “ready, aim, fire” tactic to implement Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming and can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
This is Part 4 of the GoldenGate series on Data Mesh - a series of webinars helping customers understand how to move off of old-fashioned monolithic data integration architecture and get ready for more agile, cost-effective, event-driven solutions. The Data Mesh is a kind of Data Fabric that emphasizes business-led data products running on event-driven streaming architectures, serverless, and microservices based platforms. These emerging solutions are essential for enterprises that run data-driven services on multi-cloud, multi-vendor ecosystems.
Join this session to get a fresh look at Data Mesh; we'll start with core architecture principles (vendor agnostic) and transition into detailed examples of how Oracle's GoldenGate platform is providing capabilities today. We will discuss essential technical characteristics of a Data Mesh solution, and the benefits that business owners can expect by moving IT in this direction. For more background on Data Mesh, Part 1, 2, and 3 are on the GoldenGate YouTube channel: https://www.youtube.com/playlist?list=PLbqmhpwYrlZJ-583p3KQGDAd6038i1ywe
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Mr. Pollock is an expert technology leader for data platforms, big data, data integration and governance. Jeff has been CTO at California startups and a senior exec at Fortune 100 tech vendors. He is currently Oracle VP of Products and Cloud Services for Data Replication, Streaming Data and Database Migrations. While at IBM, he was head of all Information Integration, Replication and Governance products, and previously Jeff was an independent architect for US Defense Department, VP of Technology at Cerebra and CTO of Modulant – he has been engineering artificial intelligence based data platforms since 2001. As a business consultant, Mr. Pollock was a Head Architect at Ernst & Young’s Center for Technology Enablement. Jeff is also the author of “Semantic Web for Dummies” and "Adaptive Information,” a frequent keynote at industry conferences, author for books and industry journals, formerly a contributing member of W3C and OASIS, and an engineering instructor with UC Berkeley’s Extension for object-oriented systems, software development process and enterprise architecture.
Enabling a Data Mesh Architecture with Data VirtualizationDenodo
Watch full webinar here: https://bit.ly/3rwWhyv
The Data Mesh architectural design was first proposed in 2019 by Zhamak Dehghani, principal technology consultant at Thoughtworks, a technology company that is closely associated with the development of distributed agile methodology. A data mesh is a distributed, de-centralized data infrastructure in which multiple autonomous domains manage and expose their own data, called “data products,” to the rest of the organization.
Organizations leverage data mesh architecture when they experience shortcomings in highly centralized architectures, such as the lack domain-specific expertise in data teams, the inflexibility of centralized data repositories in meeting the specific needs of different departments within large organizations, and the slow nature of centralized data infrastructures in provisioning data and responding to changes.
In this session, Pablo Alvarez, Global Director of Product Management at Denodo, explains how data virtualization is your best bet for implementing an effective data mesh architecture.
You will learn:
- How data mesh architecture not only enables better performance and agility, but also self-service data access
- The requirements for “data products” in the data mesh world, and how data virtualization supports them
- How data virtualization enables domains in a data mesh to be truly autonomous
- Why a data lake is not automatically a data mesh
- How to implement a simple, functional data mesh architecture using data virtualization
Uncover how your business can save money and find new revenue streams.
Driving profitability is a top priority for companies globally, especially in uncertain economic times. It's imperative that companies reimagine growth strategies and improve process efficiencies to help cut costs and drive revenue – but how?
By leveraging data-driven strategies layered with artificial intelligence, companies can achieve untapped potential and help their businesses save money and drive profitability.
In this webinar, you'll learn:
- How your company can leverage data and AI to reduce spending and costs
- Ways you can monetize data and AI and uncover new growth strategies
- How different companies have implemented these strategies to achieve cost optimization benefits
Improving Data Literacy Around Data ArchitectureDATAVERSITY
Data Literacy is an increasing concern, as organizations look to become more data-driven. As the rise of the citizen data scientist and self-service data analytics becomes increasingly common, the need for business users to understand core Data Management fundamentals is more important than ever. At the same time, technical roles need a strong foundation in Data Architecture principles and best practices. Join this webinar to understand the key components of Data Literacy, and practical ways to implement a Data Literacy program in your organization.
Activate Data Governance Using the Data CatalogDATAVERSITY
Data Governance programs depend on the activation of data stewards that are held formally accountable for how they manage data. The data catalog is a critical tool to enable your stewards to contribute and interact with an inventory of metadata about the data definition, production, and usage. This interaction is active Data Governance in the truest sense of the word.
In this RWDG webinar, Bob Seiner will share tips and techniques focused on activating your data stewards through a data catalog. Data Governance programs that involve stewards in daily activities are more likely to demonstrate value from their data-intensive investments.
Bob will address the following in this webinar:
- A comparison of active and passive Data Governance
- What it means to have an active Data Governance program
- How a data catalog tool can be used to activate data stewards
- The role a data catalog plays in Data Governance
- The metadata in the data catalog will not govern itself
Slides: Taking an Active Approach to Data GovernanceDATAVERSITY
A Look at How Riot Games Implemented Non-Invasive Data Governance
Riot Games created and runs “League of Legends,” the world’s most-played PC game and most viewed eSport — and is now transforming to become a multi-title publisher. To keep pace with this transformation and support a growing player base of millions, Riot Games is taking a page from Bob Seiner’s book, “Non-Invasive Data Governance: The Path of Least Resistance and Greatest Success” and leveraging the Alation Data Catalog to help guide accurate, well-governed analysis.
Bob Seiner will join Riot Games’ Chris Kudelka, Technical Product Manager, and Michael Leslie, Senior Data Governance Architect, and Alation’s John Wills, VP of Professional Service, for an inside look at Data Governance at one of the world’s leading gaming companies.
Join this webinar to learn:
• How Riot Games is implementing Non-Invasive Data Governance
• How this new approach to Data Governance helps to drive the business
• How the Alation Data Catalog helps Riot Games create the foundation for guiding accurate, well-governed data use
Data Mesh in Practice - How Europe's Leading Online Platform for Fashion Goes...Dr. Arif Wider
A talk presented by Max Schultze from Zalando and Arif Wider from ThoughtWorks at NDC Oslo 2020.
Abstract:
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
At Zalando - Europe's biggest online fashion retailer - we realised that accessibility and availability at scale can only be guaranteed when moving more responsibilities to those who pick up the data and have the respective domain knowledge - the data owners - while keeping only data governance and metadata information central. Such a decentralized and domain-focused approach has recently been coined a Data Mesh.
The Data Mesh paradigm promotes the concept of Data Products which go beyond sharing of files and towards guarantees of quality and acknowledgement of data ownership.
This talk will take you on a journey of how we went from a centralized Data Lake to embrace a distributed Data Mesh architecture and will outline the ongoing efforts to make creation of data products as simple as applying a template.
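To make "creating a data product as simple as applying a template" concrete, here is a minimal sketch of a data product descriptor in Python. The field names and the validation rule are illustrative assumptions, not Zalando's actual template:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Minimal descriptor for a data product in a data mesh.

    Field names are illustrative assumptions, not any specific
    company's template.
    """
    name: str
    owning_domain: str        # the team with the domain knowledge owns the data
    schema: dict              # column name -> type: the published contract
    freshness_sla_hours: int  # a quality guarantee, beyond "here is a file"
    tags: list = field(default_factory=list)  # feeds central metadata/governance

    def validate_record(self, record: dict) -> bool:
        """Check one record against the published schema (ownership in action)."""
        return set(record) == set(self.schema) and all(
            isinstance(record[k], t) for k, t in self.schema.items()
        )

# A domain team instantiates the template for its dataset:
sales = DataProduct(
    name="daily_sales",
    owning_domain="checkout",
    schema={"order_id": str, "amount_eur": float},
    freshness_sla_hours=24,
)
print(sales.validate_record({"order_id": "A1", "amount_eur": 9.99}))  # True
```

The point of the template is that only governance metadata (owner, schema, SLA, tags) is centralized, while the data itself stays with the owning domain.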
Every business today wants to leverage data to drive strategic initiatives with machine learning, data science and analytics — but runs into challenges from siloed teams, proprietary technologies and unreliable data.
That’s why enterprises are turning to the lakehouse because it offers a single platform to unify all your data, analytics and AI workloads.
Join our How to Build a Lakehouse technical training, where we’ll explore how to use Apache Spark™, Delta Lake, and other open source technologies to build a better lakehouse. This virtual session will include concepts, architectures and demos.
Here’s what you’ll learn in this 2-hour session:
How Delta Lake combines the best of data warehouses and data lakes for improved data reliability, performance and security
How to use Apache Spark and Delta Lake to perform ETL processing, manage late-arriving data, and repair corrupted data directly on your lakehouse
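The "late-arriving data" handling mentioned above boils down to an upsert, which Delta Lake exposes as a MERGE operation. A minimal plain-Python sketch of the idea (not the Delta Lake API itself, just the semantics):

```python
def merge_upsert(target: dict, updates: list, key: str) -> dict:
    """Upsert late-arriving records into a target table keyed by `key`.

    Mirrors the semantics of a MERGE: matched keys are updated in place,
    unmatched keys are inserted. Plain-dict sketch, not the Delta Lake API.
    """
    for row in updates:
        target[row[key]] = row  # update if the key exists, insert if not
    return target

# Target table keyed by order_id; one late record corrects, one inserts.
table = {"o1": {"order_id": "o1", "amount": 10.0}}
late = [{"order_id": "o1", "amount": 12.5},   # correction to an existing row
        {"order_id": "o2", "amount": 7.0}]    # record that arrived late
table = merge_upsert(table, late, key="order_id")
print(table["o1"]["amount"], len(table))  # 12.5 2
```

At lakehouse scale the same logic runs transactionally over files, which is what distinguishes a table format like Delta Lake from a raw data lake.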
Building a Logical Data Fabric using Data Virtualization - ASEAN (Denodo)
Watch full webinar here: https://bit.ly/3FF1ubd
In the recent Building the Unified Data Warehouse and Data Lake report by the industry analysts at TDWI, 64% of organizations stated that the objective of a unified data warehouse and data lake is to get more business value, and 84% of organizations polled felt that a unified approach to data warehouses and data lakes was either extremely or moderately important.
In this session, you will learn how your organization can apply a logical data fabric, together with the associated technologies of machine learning, artificial intelligence, and data virtualization, to reduce time to value and thereby increase the overall business value of your data assets.
KEY TAKEAWAYS:
- How a Logical Data Fabric is the right approach to assist organizations to unify their data.
- The advanced features of a Logical Data Fabric that assist with the democratization of data, providing an agile and governed approach to business analytics and data science.
- How a Logical Data Fabric with Data Virtualization enhances your legacy data integration landscape to simplify data access and encourage self-service.
Data-Ed Slides: Data-Centric Strategy & Roadmap - Supercharging Your Business (DATAVERSITY)
In many organizations and functional areas, data has pulled even with money in terms of what makes the proverbial world go ‘round. As businesses struggle to cope with the 21st century’s newfound data flood, it is more important than ever before to prioritize data as an asset that directly supports business imperatives. However, while organizations across most industries make some attempt to address data opportunities (e.g. Big Data) and data challenges (e.g. data quality), the results of these efforts frequently fall far below expectations. At the root of many of these failures is poor organizational data management—which fortunately is a remediable problem.
This webinar will cover three lessons, each illustrated with examples, that will help you establish realistic goals and benchmarks for data management processes and communicate their value to both internal and external decision makers:
- How organizational thinking must change to include value-added data management practices
- The importance of walking before you run with data-focused initiatives
- Prioritizing specification and data governance over “silver bullet” analytical tools
Toward a "Data-Centric" IS (Club Urba-EA)
Excerpt from the report of the Club Urba-EA's 2016 project, "Toward a data-centric IS."
The transformation factors leading toward business functions and information systems in which data is "at the center" are numerous and vary from one organization to another. Based on some fifteen experience reports, five factors are analyzed and key ideas drawn out.
Data-Centric and Message-Centric System Architecture (Rick Warren)
Presentation from April 2010 summarizing the principles of data-centric design and how they apply to DDS technology. Message-centric design is presented by way of contrast.
Jacobs has used Endeavour (AVEVA NET) for more than 12 years to deliver project data. The use has been primarily driven by customer or contract requirements for data handover, but over time both Jacobs’ project teams and customers have recognized the value of having trustworthy and complete data at the completion of a project, and Jacobs is now making a focused effort to execute data-centric projects going forward. To support this, Jacobs is implementing AVEVA Engineering to drive data-centric collaboration between disciplines and enable greater work efficiencies. This game-changing approach using Endeavour and AVEVA Engineering will provide data alignment across the full project spectrum of EPC delivery.
Presented by: Marc-Henri Cerar—Jacobs
Discover how AVEVA can transform your business today
www.aveva.com
Data-Centric Infrastructure for Agile Development (DATAVERSITY)
Most data centers are filled with rigid data servers that are tightly linked to specific applications, leading to data duplication, lengthy development cycles, and unnecessary costs. Learn how you can use an Enterprise NoSQL database platform to help create a flexible, agile data fabric that will allow you to iterate your application development, optimize your data, and reduce costs.
When your enterprise infrastructure is data-centric instead of application-centric, you make it easy for anyone to pull crucial data without spending unnecessary time and money on plumbing, freeing resources for building better applications. Learn how other companies have built, and benefited from, a data-centric infrastructure for agile development.
Ingest and manage all your data, documents, and semantic triples in a flexible, schema-agnostic platform – without sacrificing the ACID transactions, granular security, database management tools and other features you’ve come to expect in a mature database platform
Quickly build complex, interactive search applications
Deliver robust, real-time search and alerting within your applications
Use – and optimize – modern infrastructure including Hadoop and cloud to attain operational agility
Simplify implementation of data governance requirements around security, privacy, provenance, retention, continuity, and compliance – while reducing risk, cost, and time
The employee master: getting full value from your HR data (Jean-Michel Franco)
Is the information about your employees reliable and consistent across all the information systems that need to use it? Is it secure and easily accessible to operational teams?
In this presentation, Talend and Edis Jems Group illustrate how to:
- federate HR information across the organization and its information systems;
- improve HR processes with this information, to digitize HR services or to better manage individual and collective performance, relations with employee representative bodies, and workforce and skills planning;
- enrich HR data with Big Data, to know your own talent better or discover new talent by connecting to external sources such as LinkedIn.
Data quality in the Big Data era (Precisely)
With the rise of Big Data and the multiplication of data sources, companies have more and more data at their disposal.
Unfortunately, 40% of enterprise data is still inaccurate, incomplete, or unavailable.
Because poorly qualified data has a major impact on performance, managing data quality must be seen as a real performance lever.
A better single customer view is a success factor for accelerating innovation through reliable data, while meeting regulatory requirements to reduce operational risk and increase efficiency.
Join our webinar to discover:
- how to meet short-term needs for better data access, integration, and quality management
- real data governance, through agile master data management by business domain
- how to achieve a true single customer view
- customer case studies
Collaboration and management of data and content in the digital transformation (KnowledgeConsult)
Rencontres internationales de la Conduite du Changement, September 24, 2014
Banks and insurers: opt for reliable, accurate, and consistent data! (Precisely)
Today, banks and insurers must face new technological challenges and the stakes of digitalization.
They operate in constrained, highly regulated environments, where every action is subject to a multitude of rules and compliance controls.
Facing an increasingly competitive environment, customer retention is a game-changer for financial services.
Having reliable, accurate, consistent, and contextual data has therefore become a business imperative, and Precisely, the leader in data integrity, helps you achieve it.
Key points of the webinar:
- Improving the customer experience: personalized offers are now a decisive factor for customers.
- Financial crime and compliance: discover how to overcome data-related difficulties and strengthen your anti-fraud operations.
- Geo-enrichment of commercial data with Precisely ID: an address file with "geo-enriched" information provides context and value for better decision-making.
Smile and Talend webinar: make your applications communicate in real time (Smile - I.T is open)
The ESB is the key to interconnecting your applications and enabling real-time information exchange. Talend and Smile invite you to this webinar to discover the Talend Platform for Data Services solution as applied at two of the largest companies in mass retail.
On the agenda:
A Talend expert will present the only unified open source platform combining ETL and ESB tools.
Smile will walk you through two real-world use cases:
- A concrete implementation of this solution to manage the flows that synchronize the product repository, the central group ERP, and the store ERPs at a leading garden-supplies retailer.
- Talend at the heart of the IS, through a true e-business service-oriented architecture at a leading mass-market retailer.
Discover how to build a complete data governance solution (Precisely)
Data governance systems provide a framework for the policies, processes, rules, roles, and responsibilities that help you manage your data. A great data governance strategy, built on tools like the Collibra Data Governance Center (DGC), is essential to getting the most out of your data.
Trillium Discovery integrates seamlessly with Collibra to create a complete data governance solution that provides indicators on the health of your data to help you better steer your organization, while meeting regulatory constraints.
Join this webinar to discover how you can leverage this integration in your organization to easily build, apply, and execute business rules based on the data governance policies within Collibra.
Talend, leading open source data integration platform (Cédric Carbone)
Corporate slides from Talend (Oct. 2008) and its four open source platforms:
- Talend Open Studio
- Talend Integration Suite
- Talend Open Profiler
- Talend Data Quality
More info at http://www.talend.com
Discover Talend Data Preparation, the self-service data preparation tool for everyone.
Talend Data Preparation empowers every decision-maker in the company to quickly prepare and shape data, so they can spend more time using it for analysis and sharing.
IBM Information Management - No quality decision without quality ... (Nicolas Desachy)
On an ever smarter, more instrumented, and more interconnected planet, the mass of information is exploding. There is no quality decision-making without reliable, relevant information reaching the right person at the right time. At the Tendances Logicielles New Intelligence event, Dan Benouaisch of IBM developed these concepts and presented the IBM InfoSphere offering that addresses these imperatives.
Is your product data reliable and consistent across all your systems in real time? Do the relevant departments have access to the most recent and complete information about your products?
This presentation shows how to:
- leverage Big Data to connect your products through the Internet of Things;
- optimize maintenance, repair, and overhaul (MRO) processes;
- harmonize information across your supply chain;
- manage your multichannel commerce strategies;
- exchange with your partners and regulatory authorities.
Data enrichment and cleansing (Foule Factory)
Foule Factory publishes a solution for enriching and cleansing your data!
Ideal for removing duplicates and enriching your customer data.
[French] A 360° view of your customers with Master Data Management and ... (Jean-Michel Franco)
Is your customer data reliable and consistent across all the information systems that need to use it? Is it secure and easily accessible to operational teams?
This presentation looks at implementing a Master Data Management system to:
• obtain a single view of your customers across your entire infrastructure;
• turn this information into usable data (that can be used by marketing, sales, service, or customer-loyalty teams);
• capture data from new sources by combining MDM and Big Data technologies.
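The "single view of your customers" above is typically built by consolidating duplicate records into one golden record. A minimal sketch in Python; the survivorship rule used here ("latest non-empty value wins") is an illustrative assumption, real MDM tools let you configure such rules per field:

```python
def golden_record(records: list) -> dict:
    """Merge duplicate customer records into one 'golden record'.

    Survivorship rule (illustrative assumption): for each field,
    keep the most recently updated non-empty value.
    """
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated_at"]):
        for k, v in rec.items():
            if v not in (None, ""):
                merged[k] = v  # later records overwrite earlier ones
    return merged

# Two versions of the same customer, from CRM and billing systems:
crm = {"id": "c1", "email": "jane@old.com", "phone": "", "updated_at": 1}
billing = {"id": "c1", "email": "jane@new.com", "phone": None, "updated_at": 2}
print(golden_record([crm, billing])["email"])  # jane@new.com
```

The hard part in practice is upstream of this function: deciding that two records refer to the same customer in the first place (matching), which is where MDM platforms add most of their value.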
Hadoop adoption: unlimited possibilities - Hortonworks and Talend (Hortonworks)
Hadoop has become unavoidable. Companies of all sizes are at different stages of their Big Data journey. Whether you're just starting to explore the platform or already run several clusters, everyone faces the same challenge: developing in-house expertise.
Watch this webinar from the Big Data specialists at Talend and Hortonworks to discover how to unify all your data in Hadoop, without specialized Big Data skills.
Data matters to all of us, and business expectations are rising everywhere in the company.
But it has not always lived up to its promises.
The conditions are in place to generalize its use and adoption, by answering these three questions:
- Organization: centralize or decentralize data management?
- Architecture: how to establish flexible and sustainable foundations?
- Governance: how to manage and encourage use and collaboration?
Reveal the Intelligence in your Data with Talend Data Fabric (Jean-Michel Franco)
Discover the Winter'20 release of Talend Data Fabric.
Find out about the newly released product, Talend Data Inventory, and the powerful new capabilities and AI that accelerate and modernize data engineering. Find out how to:
- Ensure trusted data at first sight with Data Inventory
- Increase efficiency and productivity with Pipeline Designer
- Automate more integration tasks with AI and APIs
3 Steps to Turning CCPA & Data Privacy into Personalized Customer Experiences (Jean-Michel Franco)
Your company’s success lies in your capacity to keep your customers’ trust while offering them a personalized experience. With the right Data Privacy framework and technology for your data governance project you will maintain compliance and prosper.
CCPA isn’t the first privacy regulation to impact virtually every organization that does business in the United States – it’s simply the one starting in 2020. As these regulations continue to expand and change, what if there was a way to turn compliance into your advantage? Attend this session and learn how a strong, carefully considered data governance program can help you stay ahead of new regulations like CCPA, and also enhance customer experiences with trusted data.
Learn how a 3-step approach can help you:
Ensure regulatory compliance at scale
Deliver advanced analytics with trusted data
Enable customer personalization for more accurate business insights, targeted offers, and behavioral knowledge
The reliability of data, and your company’s reputation for protecting it, have become essential to doing business in the data age. Modern data governance works at the speed of business, the scale of data, and still has a human touch so you can say “yes” and deliver trusted data.
In these presentations, Stewart Bond, Research Director of IDC’s Data Integration and Integrity Software Service, and Talend will highlight this modern approach to data governance.
Watch now to learn how to:
Put trust and data literacy at the core of your digital transformation
Tackle the growing complexity of data management
Identify the value and ROI levers that drive success
Leverage Data Intelligence Software from discovery to enablement
Are you tired of saying “no” when it comes to data? IDC and Talend share insights into how you can deliver data governance with a “yes”.
Data is everywhere, and delivering trustable data to anyone who needs it has become a challenge. But innovative technologies come to the rescue: through smart semantics, metadata management, auto-profiling, faceted search and collaborative data curation there is a way to establish a Wikipedia like approach for your data. Find out how Talend will help you to operationalize more data faster and increase data usage for everyone with an Enterprise Data Catalog
Delivering Analytics at Scale with a Governed Data Lake (Jean-Michel Franco)
Data privacy is on everyone's mind right now. Regulations such as GDPR, as well as public sentiment, mean that governance and compliance are must-have capabilities for data lakes. Learn how to curate meaningful data from your data lake, accelerate governance and compliance, and enable your organization with searchable, trusted datasets.
Enacting the data subject access rights for GDPR with data services and data... (Jean-Michel Franco)
GDPR is more than another regulation to be handled by your back office. As stated by the European Commission, “The primary objective of this new set of rules is to give citizens back control over their personal data.” And surveys show that European citizens are eager to exercise these new fundamental rights, such as access to information, data portability, and the right to be forgotten. Will you be ready to deliver, or will you be forced to tell your customers that, unfortunately, you are not yet ready to respect their rights?
Enacting the GDPR’s Data Subject Access Rights (DSAR) requires practical actions. There’s a mandate for an integrated data governance strategy to establish your data inventory, operationalize controls, foster accountability across teams and ensure compliance, and finally unleash personal data to your customers, employees, visitors, and prospects. Only a strong data governance program on top of a modern, collaborative data hub ensures that you have the policies, standards, and controls in place to enforce compliance.
This presentation outlines the practical steps to deploy governed data services that:
Know your customers and employees with a data inventory
Track and trace data using audit trails and data lineage
Manage and propagate opt-in consent across customer-facing applications
Reconcile and protect your sensitive data in a data hub with automated controls, data stewardship, and data masking
Respect the rights of your data subjects with collaborative data management and portals
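As one concrete illustration of the data masking step above (a generic sketch, not Talend's stewardship tooling), deterministic pseudonymization with a keyed hash protects direct identifiers while keeping records joinable across datasets:

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # assumption: in practice, held in a secrets store

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash.

    Deterministic, so the same customer still joins across datasets,
    but the original value is not recoverable without the key.
    """
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Mask the sensitive fields of one record and leave the rest intact."""
    return {k: (pseudonymize(v) if k in sensitive_fields else v)
            for k, v in record.items()}

row = {"email": "jane@example.com", "country": "FR", "basket_eur": 42.0}
masked = mask_record(row, {"email"})
print(masked["country"], masked["email"] != row["email"])  # FR True
```

Pseudonymized data is still personal data under GDPR (the key allows re-identification), so this is a protection control, not a substitute for the governance program described above.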
The race is on for GDPR compliance, and now it is time to get hands-on with personal data. Surveys highlight that the toughest operational challenges for compliance relate to data management.
This presentation shares use cases and concrete experiences of how companies are:
- Applying Metadata and Master Data Management to track, reconcile, control and trace personal data
- Establishing compliant consent mechanisms
- Enacting a privacy control center for the data subject access rights, data portability and rights to be forgotten.
The business can't wait to turn data into insights, which means it often can't wait for IT. But that increases the risk of bad data and inaccurate results. Learn how IT can engage the business to accelerate data integration; build perfect, trusted, and compliant data; and increase data usage and time-to-insight.
to support talent, manage skills, and ensure data compliance for GDPR
Your employees are at the heart of your company's success, but do you have accurate and reliable information about them? Is it secure and compliant with regulations such as GDPR, while remaining easily accessible for decision-making and operational activities?
In this on-demand webinar, Orange's HR teams and consultants from Orange Consulting share their experience implementing the 360° employee view within the group: the method used, the difficulties encountered, and the results obtained.
Join this one-hour on-demand webinar to learn how to:
- federate and reconcile the 18 different HR information sources coming from multiple information systems;
- know employees better, to meet managers' HR and business needs with a dynamic, real-time map of employee data;
- set up dashboards of relevant indicators to support strategic thinking and HR action plans, with a synthetic, up-to-date, multi-criteria view of employee data;
- anticipate the application of the new European regulation on personal data (GDPR).
As the deadline for GDPR approaches, it is time to get practical about protecting personal data.
We break down the steps for turning a data lake into a data hub with appropriate data management and governance activities: from capturing and reconciling personal data to providing for consent management, data anonymization, and the rights of the data subject.
A smart approach to GDPR compliance lays a foundation for personalized and profitable customer and employee relations.
Join us, as experts from MAPR and Talend show you how to:
Diagnose the maturity of your GDPR compliance
Set up milestones and priorities to reach compliance
Create a foundation to manage personal data through a data lake
Master compliance operations - from data inventory to data transfers to individual rights management
First, I'd like to take a few moments to tell you about Talend and present the company's history. (There is a lot of information on this slide, and not every point needs to be developed, depending on the time available.)

Key facts: the company was founded in 2006. Today we have about 400 employees, are established in 7 countries, and have two headquarters: one in Los Altos, California, and the other near Paris (Suresnes), France, where the company was founded.

Our business model is called "open core." It is unique and somewhat different from the traditional models you know from proprietary vendors selling perpetual licenses: we provide subscription licenses and offer training, expert services, and support. We will discuss this in more detail during this presentation. The open core model allows Talend to deliver integration at every level, letting you consume what you need today and increase your use of the products as your needs evolve. As the chart shows, Talend has grown strongly since its creation and continues to grow at an impressive rate (CAGR is the compound annual growth rate, i.e., the average growth per year).

Growth-driven deployment model: how was this growth achieved? Our business model is unique and based on the open source software philosophy on which the company was founded. This model creates widespread brand awareness: our product has been downloaded more than 1.6 million times, which shows not only that Talend is known to a broad audience, but also that many companies benefit from using Talend products.

We see proof of this in our dynamic community. More than 100,000 registered users actively use our products and share their experience: what they have learned about the products and how the rest of the community can benefit from it. It is a mutual arrangement for sharing information, experiences, and best practices. You may be wondering, "How does an open source company make money?" We monetize the products' popularity by commercializing certain features and charging for these enhanced products through the subscription model I mentioned; we provide enterprise-grade capabilities that complement the open source technology at the core of our products.

On this slide, you can see that we have more than 1,800 active subscribers (companies). (If your audience has seen our slides saying we have 4,000 customers, you can tell them that, since our creation, we have had 4,000 paying customers benefiting from Talend products.) Our subscription model means we must prove our value to our customers every year. 86% of customers renew their license, a rate that indicates strong loyalty among Talend customers and is proof that our products deliver the promised value in an inexpensive and effective way.

(If you are told this figure is too low, you can explain that some projects come to an end and the licenses are no longer required, for example migration projects.)

Solutions: the rest of the presentation contains more details about our solutions, but, in a few words, we provide solutions that address the challenges of every integration project, whether data, application, or process oriented. We also cover data quality and data governance, since these areas also involve integration challenges. We are recognized as a leading and innovative company, notably in the integration market but also in the areas I just mentioned: Gartner and Forrester position us as a visionary leader in these product categories.

If you are asked about our investors, here is some additional information. We have raised 102 million dollars since our creation, which places us among the top 10 private vendors of Big Data and integration solutions.
- Silver Lake Sumeru: our most recent investor, Silver Lake Sumeru joined us with the acquisition of Sopera in 2010. They are the largest technology venture capital firm in Silicon Valley.
- Balderton Capital: based in London, one of the first investors in MySQL. The partner at Balderton who led this investment is Bernard Liautaud, the founder of Business Objects; he now sits on our Board of Directors.
- Idinvest Partners: formerly AGF Private Equity, our first investor, who believed in us from the start.
- Bpifrance: our latest investor, a public investment bank that invests in French companies; Talend is seen as the most promising company in the French software industry, particularly thanks to its international success and rapid growth. Iris Capital generally invests alongside Bpifrance.
POINT OF SLIDE: provide a standard understanding of what MDM is. MDM supports a business case, but what exactly does it DO, functionally? Typical questions it answers: Are these products the same across these two catalogs? Has Paul in accounting approved my request for a new account structure? Is this data valid? Why are these two reports so radically different? How can I better serve my customers? How many customers do we have?
We have seen strong customer momentum in the last couple of years across all geographies, domains, and industries.

British Sugar: British Sugar, a leading supplier of sugar to the UK market, has signed a 5-year agreement to implement Talend MDM internally across the business and within their retail arm, Silver Spoon. In a highly competitive market, a company goal for British Sugar is to maintain the best price for sugar and deliver on time to its customers. Supermarkets levy large fines for products delivered with any form of data inaccuracy, for example labeling, product content, and incorrect ingredients. Customers will also levy large fines for late delivery if bulk sugar to a manufacturing line slips, and the cost of delivering to an outdated address can be large. The business decided to upgrade their ERP system and, alongside this, create a governance process at the departmental level for accurate data validation, cleansing, and manipulation. Domains: Supplier, Item, Customer.

Kuoni: Kuoni is a global travel services and information broker between travel agencies and providers of holiday and tourism services, formerly known as GTA Travel. They use MDM for the Customer, Supplier, Product, Location, and Calendar domains, to enable more intelligent and effective brokerage of services between their customers and suppliers, ensuring more revenue is routed directly through Kuoni GTS at the point of booking. For example, rather than a traveller going to Egypt and paying for a scuba diving trip with a local company after arriving (revenue Kuoni loses), effective MDM lets Kuoni offer these services directly to the traveller at the point of booking the holiday and ensure the revenue is routed through them.

Veolia: An international brand operating in France.
Veolia has 9 billion in revenue, 85,600 employees in 33 countries, and 819,000 industrial and commercial customers; they service nearly 80 million people. Its lines of business are:
- Utilities: collection and related services, management of waste disposal facilities, urban cleanliness, sanitation
- Business services: collection, logistics, consolidation and transit, maintenance
- Industrial: sorting, recycling, and treatment of hazardous and non-hazardous waste; material recovery, energy, and agricultural
The current business environment requires the ability to do more, faster, with fewer resources: create products faster, better serve the customer, improve reporting and metrics, optimize the supply chain, identify new opportunities, increase revenues, handle mergers and acquisitions, and align with legal requirements.
What do they master: Assets, Products, and now Employees.
Key capabilities:
- Integrates big data with big data platforms and appliances without coding
- Improves big data quality and information accuracy
- Simplifies big data project governance and administration
Talend has answered all these market considerations with the introduction of Talend MDM.

Talend MDM is rapid to implement. We are the first solution to build integration, quality, and all the master data management components, such as hierarchies, modeling, and workflow, on a single platform. There is one simple install for all the features across all categories; everything is installed and connected in under ten minutes. Other vendors require development to cobble the features together. This approach also allows the master data to be IN CONTROL of its own destiny: because integration, quality, and workflow are unified with MDM, the master data has direct access to each of these functions. More about this a little later.

Talend MDM is a flexible, general MDM solution. We do not ship with a rigid domain model like other products. The Active Data model allows you to master any data, on your terms. We provide intuitive graphical tools to create the data model and a unique set of processes and triggers to help you implement corresponding business processes. We also provide an easy-to-use method for importing a data model and corresponding business processes into the solution to get you started quickly.

Our unique XML-based approach also allows us to provide a dynamic web-based interface for collaborating on, authoring, and searching hub data. As the data model changes, so do the corresponding services and any web forms.

Talend is dedicated to the open source community. We are not a proprietary black-box solution; we provide code so that you can modify and extend as necessary. Download the software today, get started today, and show real business value now. Join the community and gain value from thousands of community members.
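To make the XML-based idea concrete for a technical audience: because the data model is itself a schema document, services and forms can be derived from it rather than hand-coded. The sketch below is illustrative only and assumes a hypothetical Customer schema and a made-up `form_fields` helper; it is not Talend MDM's actual implementation, just the schema-driven principle behind "as the data model changes, so do the web forms."

```python
# Illustrative sketch of schema-driven form generation: derive the fields of
# an entry form directly from an XML Schema data model, so a model change
# automatically changes the form. The Customer schema here is hypothetical.
import xml.etree.ElementTree as ET

XSD_NS = "{http://www.w3.org/2001/XMLSchema}"

# A hypothetical MDM data model for a Customer entity, expressed as XML Schema.
customer_model = """<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Customer">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Id" type="xs:string"/>
        <xs:element name="Name" type="xs:string"/>
        <xs:element name="Email" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

def form_fields(xsd_text):
    """Derive form field names from the schema's simple-typed leaf elements."""
    root = ET.fromstring(xsd_text)
    return [el.get("name")
            for el in root.iter(f"{XSD_NS}element")
            if el.get("type")]  # leaf elements carry a type attribute

print(form_fields(customer_model))  # -> ['Id', 'Name', 'Email']
```

Adding, say, a Phone element to the schema would make it appear in the generated field list with no code change, which is the flexibility claim made above.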
Thank you again for the opportunity to speak with you about Talend and how we believe that we can help your organization manage all of your integration needs.I’ll take questions at this time.