The growth of information system complexity is a natural trend. Data governance is one solution. Another way is to act on reference components (MDM, data wells), giving them the capability of "perfect" integration.
1. IS INCREASING ENTROPY OF INFORMATION SYSTEMS FATALITY? R. MANDEL
1
IS INCREASING ENTROPY OF INFORMATION SYSTEMS A FATALITY?
Rene MANDEL
www.value-architecture.com
Version 1
13/04/2015
TABLE OF CONTENTS
1 FOREWORD
1.1 CONTROL OF INFORMATION SYSTEMS COMPLEXITY
1.2 A SCIENTIFIC AND TECHNICAL CHALLENGE
2 INTEGRATION OF INFORMATION SYSTEMS
2.1 ALGEBRA OF IS COMPONENTS
2.2 FATE OF COMPLEXITY
2.2.1 Drift towards complexity
2.2.2 Existence of reference data
2.2.3 Imperfect integration of reference components
2.3 MECHANICS OF COMPLEXITY
2.3.1 Divergence according to the rate of imperfection
2.3.2 Real reasons for particularities
3 ACT ON THE HEART OF INTEGRATION
3.1 MAKE THE COMPONENTS MARKET AS PERFECT AS POSSIBLE
3.1.1 ROI for the IS with a perfect algebra
3.1.2 Act on the internal market
3.2 GENERIC INTEGRATION CAPABILITIES ("JANUS" PRINCIPLE) FOR REFERENCE COMPONENTS
3.2.1 Generic data integration functions
3.2.2 Generic modelling of "data wells"
3.2.3 Case of Master Data
3.2.4 Manage subsidiarity
3.3 PROFESSIONALIZING INFORMATION SYSTEM ACTORS
4 AN ALTERNATIVE TO GOVERNANCE
4.1 RESISTANCE TO PYRAMIDAL GOVERNANCE AND HEAVY METHODS
4.2 HAVE DECISIONS NATURALLY GO IN THE RIGHT DIRECTION
4.3 SYNCHRONIZE DATA ON SIMPLICITY
4.4 CHANGE OF PARADIGM
1 FOREWORD
1.1 CONTROL OF INFORMATION SYSTEMS COMPLEXITY
Information systems are the result of an increasingly complex reality: multiplication of sensors, extension of application areas, stacking of technologies, application sedimentation, ...
Intuitively, one can describe this drift over time as an increase in the "entropy" of systems (distinct from other known entropies, since a human factor is taken into account). Enterprise Architecture aims to impose a model as a source of order. This is a rational and authoritarian approach: it assumes that the model is respected and applied over time. In practice, rational planning, however desirable, is often impractical and expensive. A more modest approach relies on gradual ordering, at the pace of projects, at lower cost. But clearly, in the absence of a central and collective effort, disorder increases "naturally". How much energy is needed to return to order? By imposing agility, or by facilitating it? And how should IS entropy even be defined? In short, is the increase of IS entropy inevitable?
The same causes that produce fatal dead ends and embolisms in a vicious-circle mode of operation could instead act through virtuous circles and gradually reduce complexity. This simplification would be natural, unplanned and undirected. One way to achieve this natural transformation of the IS legacy is to introduce a few key components, "entropy killers": the socio-technical system then transforms itself, in a localized manner. The ordering effort would be minimal and naturally relayed by all IS stakeholders. However this way, opened up, as described below, by technological advances, is ignored by the dominant ideology.
The increase in IS entropy is not inevitable; it results on the one hand from a natural drift, and on the other hand from the utopia of an unrealistic control that ignores the dynamics of the micro-decisions of IS construction, and ignores technological opportunities. We need to change the EA paradigm, and consider alternatives to purely methodological approaches.
1.2 A SCIENTIFIC AND TECHNICAL CHALLENGE
Over time, software development has become a major activity in the economy:
• Software components are present everywhere, in all types of physical media, and virtually in the "cloud".
• All organizations have internal teams, or subcontractors, to assemble and integrate components into ISs that are more and more crucial to their activity: analysis, strategy, customer engagement, ...
• The macroeconomic weight of these activities is paramount: number of jobs, global redistribution of skills, induced financial flows, ...
Software engineering should rely on a science of software that is up to these stakes.
It is clear that this science of software has so far not been commensurate with the stakes, and that in practice the information systems of large organizations face challenges of complexity. ISs are indeed rigid, over-complex, expensive, fragile, stovepiped "in silos", and ultimately difficult to control.
The efforts of governance and Enterprise Architecture methodologies seek to alleviate these practical and organizational difficulties. This authoritarian and rational approach has limits, in particular its heaviness and its pyramidal organizational implementation, even if it seeks to reform itself by integrating "agile" approaches.
Indeed, the extension of ISs is in an explosive phase, which no longer allows relying exclusively on old recipes based on an obsolete technological past and old tools.
The challenge is both scientific and technical:
• Scientific, because there is not enough hindsight to explain and model the behavior of project stakeholders that results in the nesting, duplications and dysfunctions observed.
• Technical, because the technological world is evolving rapidly, and new technological foundations allow us to expect a paradigm shift in which the mastery of complexity becomes consubstantial with software development. This major fact is an overlooked opportunity that should shake up methodological approaches. It still needs to be examined with a rigor matching the stakes.
2 INTEGRATION OF INFORMATION SYSTEMS
In the activity of software production, the limitations have been known for several decades. In particular, the perverse effects of poorly controlled integration led to the development of Enterprise Architecture methods (Zachman, TOGAF, ...) and governance. One of the key questions is the integration of the components that, by assembly, constitute ISs.
2.1 ALGEBRA OF IS COMPONENTS
ISs, whatever they may be, are formed from various components (software, technical).
We can have a global vision of an IS as a whole, which evolves by:
• assembly and disassembly of components,
• and the life cycles of the components themselves.
This cycle of an IS, which is not the "run time" cycle but the evolution of the software and technical components, is complex.
Each component has its own life cycle, with a creation phase, maintenance episodes, reshaping, necrosis (phasing out), ...
A strong feature of ISs is the interdependence of components. Indeed, multiple interactions are necessary, appropriate, or even involuntary.
We can say that, between two components, there is a "function" of interaction that allows us to consider that they constitute a new component. This new component can itself interact with others. This function is "transitive", and by successive assemblies, sets of larger size are created (associativity).
Thus, from the set-theoretic point of view, components obey specific laws.
An analogy could be drawn with molecular physics and the behavior of atoms among themselves (see http://www.value-architecture.com/2015/03/les-limites-de-la-complexite-des-si.html).
Appropriate interactions are called "integration".
Component integration complies with a typical "algebra" (it can be represented by an algebraic formulation).
Thus, one could hypothesize a 'perfect' algebra of components respecting a composition law that is associative ((x * y) * z = x * (y * z)) and commutative (x * y = y * x).
But in practice, the particularities of integration, such as the dispersion of projects between teams and over time, suggest that this assumption is not realistic in the current state of the art of software development.
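As a toy illustration (not from the source), such a 'perfect' algebra can be sketched by modeling a component as a set of capabilities and composition as set union, which is associative and commutative by construction:

```python
# Hypothetical sketch: a component is a frozenset of capabilities;
# composition (*) is set union, which satisfies both laws by construction.
def compose(x, y):
    return x | y  # union: grouping and order do not matter

a = frozenset({"orders"})
b = frozenset({"billing"})
c = frozenset({"customers"})

# Associativity: (x * y) * z == x * (y * z)
assert compose(compose(a, b), c) == compose(a, compose(b, c))
# Commutativity: x * y == y * x
assert compose(a, b) == compose(b, a)
```

Real integration is not like this: as the text notes, each concrete link carries particularities, so composing in a different order or grouping generally does not yield the same result.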
2.2 FATE OF COMPLEXITY
2.2.1 Drift towards complexity
Over time, new interactions appear, by "transitivity" of integration: step by step, involuntary and non-articulated dependencies emerge. Indeed, integration is controlled by proximity, but the spread of knowledge remains limited, and on a human basis. Thus, beyond the circles of proximity, indirect and unintended dependencies are created.
Imperfections increase over time:
• knowledge is gradually diluted,
• new needs appear, answered with new integrations,
• new components even emerge, creating gaps or inconsistencies.
This phenomenon is classic and well known in its different forms: spaghetti effect, chaining of systems with systemic risk (domino effect), rigidity of the legacy making evolutions (requiring heavy redesigns) expensive or impossible, technology stacking, ...
For example, the spaghetti effect is due to the combination of the particularities of emitters and receivers: if a set of information is emitted under n variants, and received under p other variants, then n X p types of exchanges are built over the projects, whereas passing through a focal point would require only n + p. More generally, n + p integrations are turned into n X p.
Thus, ISs are 'over-complex', having been extended like a "coral reef" by opportunistic additions and integrations focused on proximity and the short term. They are over-complex because a simpler composition could produce the same result, or even a better one, if one takes into account the improvement in flexibility.
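The n X p versus n + p arithmetic above can be checked directly:

```python
def point_to_point_links(n, p):
    # every emitter variant wired directly to every receiver variant
    return n * p

def focal_point_links(n, p):
    # each variant converted once, to or from the focal point's format
    return n + p

# e.g. 10 emitter variants and 10 receiver variants
assert point_to_point_links(10, 10) == 100
assert focal_point_links(10, 10) == 20
```

The gap widens quadratically: at n = p = 30, point-to-point wiring needs 900 exchanges against 60 through a focal point.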
2.2.2 Existence of reference data
In every IS there is a need to share "reference data". This state of fact, which suffers no exception, has been known for a long time. The key role of these data in controlling the disorder of systems, well beyond the strict sphere of the IS, is also accepted by all organizations.
The organizational response is to implement a governance of these data, whether minimalist or, on the contrary, maximalist, but this is often a source of internal tension. In fact, a balance of powers has to be found to preserve the essential consistency, at the center, and the no less essential autonomy, at the periphery.
As a result of this search for balance, there are excesses, in one direction or the other, which generate dysfunctions: inconsistencies, redundancies, stiffness, confusion, double entries, workarounds. And in many cases the reference data, for example about clients (the notion of 'golden data'), are not of adequate quality.
2.2.3 Imperfect integration of reference components
Still from the set point of view, we should focus on a particular class of components: the reference components (of type 'master data' or 'data wells').
A reference component plays a particular role within a perimeter (the whole IS or a subset): it is likely to be 'integrated' with all the other components of that subset, and this integration is perfectly transitive: a component incorporating such a reference inherits this property and can in turn serve identically as a reference.
In a way the law of composition is associative, (x * y) * r = x * (y * r), and the inheritance of the reference property might be noted Hr(x * y * r) = Hr(y * r) = Hr(r).
In reality, the integration of these components is always "imperfect" and gives rise to aberrations. How can the imperfections of this "algebra" be explained?
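A minimal sketch (hypothetical, not the author's formalism) of the inherited reference property Hr: a composite can serve as a reference exactly when it incorporates the reference component r, however it was assembled.

```python
# Hypothetical sketch of the inherited reference property Hr:
# a composite carries the reference property iff it incorporates r.
def compose(x, y):
    return x | y  # composition as set union, as in the toy algebra above

def Hr(component, r="r"):
    # the component inherits the reference property when it contains r
    return r in component

x, y, r = {"x"}, {"y"}, {"r"}
# Hr(x * y * r) == Hr(y * r) == Hr(r): all composites containing r
# can in turn serve identically as references.
assert Hr(compose(compose(x, y), r)) == Hr(compose(y, r)) == Hr(r) == True
assert Hr(compose(x, y)) == False  # no reference inside, no inheritance
```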
2.2.3.1 Usurper components
Components which are not classified as reference components can, for various reasons (lack of knowledge, desire for control, will to modify the model, to add particularities, ...), be substituted for the reference within the subset.
Spoofing can be:
• Conceptual: modification of the model, introduction of a variant without real contribution, denial of genericity, nesting of a supplement,
• Syntactic: introduction of a local dialect, making local exchanges specific and requiring a translation to be included within the scope of election of the reference component,
• On latency modes: introduction of special exchange kinematics (batches, update pace, messages, invocation of services, ...).
2.2.3.2 Opportunistic integration
Even without the deliberate creation of usurper components, integration links can grow in disarray, by proximity, without respecting the quality of integration offered by the reference components.
For instance, if a reference component does not offer the expected exchange mode or the required temporality, a usurper component may seem deceptively legitimate and take up a durable position, even if the gap that justified it disappears.
A haphazard mesh of components is thus created, and makes the system more complex.
[Figure: a healthy configuration ("sain"), where components integrate with the true source ("vraie source"), gradually becomes perverted ("pervers"): a usurper ("usurpeur") takes the place of the true source, leading to the development of an over-complex mesh ("développement d'un maillage sur-complexe").]
2.2.3.3 Integration gaps
It is also common, given the difficulties of integrating components, to avoid integration altogether and rely on "manual" solutions: double entry, instructions, copying data from one view to another, ...
These solutions, even when volumes are low, are hardly safe in the medium term, and must be framed by controls that again raise the question of reference data.
This transparency of the internal market of components, applied to reference components, would avoid the 'spoofing' that places usurper components in the legacy, with perverse effects in the short term and above all in the medium term.
It also implies that reference components are available on the market "ahead of phase", compared to the development cycles of the other components.
3.2 GENERIC INTEGRATION CAPABILITIES ("JANUS" PRINCIPLE) FOR REFERENCE COMPONENTS
Another key to optimizing the integrations around reference components is to give them 'generic' integration capabilities, enabling them to be assembled in all contexts, 'perfectly' and without extra cost.
For this purpose the required qualities are:
• Anticipation of future developments, to provide projects with a 'service' adapted to new technologies and future requirements: new exchange methods, more aligned with the current opportunities for projects (ESB, low latency, cloud, ...) and with the expectations imposed on ISs (everything connected, mobility, traceability, transparency, all sources, agility, ...).
• Interfaces for a 'non-intrusive' insertion within the existing legacy: accept all exchange and integration methods, for coexistence with the existing legacy without additional costs, and in a short time.
This can be described as a "Janus" capability, after the Roman god with two faces, one turned to the past, the other towards the future.
This capability is achieved through the development of data services using a data-integration library (several solutions exist on the market), and the design of a generic model adapted to the type of reference data.
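As an illustrative sketch (all names hypothetical, not a specific product), a "Janus" reference component keeps one canonical store and exposes it through two faces: a legacy batch export for the existing estate, and a modern service interface for new projects.

```python
import csv
import io
import json

class JanusCustomerReference:
    """Hypothetical sketch: one canonical reference store exposed through
    two 'faces' - a legacy batch export and a modern JSON service."""

    def __init__(self):
        self._records = [{"id": 1, "name": "Acme"}]

    def export_batch_csv(self):
        # face turned to the past: flat-file exchange with the legacy estate
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["id", "name"])
        writer.writeheader()
        writer.writerows(self._records)
        return buf.getvalue()

    def serve_json(self, record_id):
        # face turned to the future: low-latency service invocation
        record = next(r for r in self._records if r["id"] == record_id)
        return json.dumps(record)

ref = JanusCustomerReference()
assert "Acme" in ref.export_batch_csv()
assert json.loads(ref.serve_json(1))["name"] == "Acme"
```

The point of the design is that both faces read the same store, so no consumer has a reason to build a usurper copy.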
An interoperability architecture can, for example, be incorporated to cover both the world of flow distribution (batches, messages) and that of service orchestration (web services, SOA, ...), and make them converge.
[Figure: an interoperability system, combining service orchestration and flow distribution, connects the data wells ("puits de données") and the MDM to Web, CRM, ERP and legacy applications.]
3.2.1 Generic data integration functions
The data-integration library breaks down into several functions. It must also be enriched by a layer of services and steering. These functions are described below (one can also refer to Gartner's publications on the subject).
3.2.1.1 The "transport".
This is the level of the tubing that allows to carry lots of data or messages.
There are batchs of data file transfers on the one hand, on the other hand posts, for example through an
ESB.
3.2.1.2 "Connectivity".
These components enable to connect to a variety of databases of various publishers or the open source
world. Allow others to interact in various formats: fixed fields, marked (XML, JSON, EDI), specific to
ERP,...
3.2.1.3 'Conversion ' functions.
Thanks connectivity components, it is possible to convert various formats to each other. In addition, by
a few clicks certain libraries allow mapping of data to recompose new exchanges or flow from data
selected in the stream or existing trade.
These components can be activated in a batch context, for example in a classical ETL, as in a 'message'
or real time context.
3.2.1.4 "Storage" function.
This 'layer' is of variable use according to usage. The 'storage' function can use several techniques:
Classic DBMS, or new family "NoSql" solutions. It is necessary to offer all historical views of data.
3.2.1.5 "Synchronizing" functions, data quality
Consolidation of the different flows and exchanges reveals various data-quality problems: duplicates, differences, inconsistent identification across repositories, temporal inconsistencies, poor quality of identifiers, addresses, ... These functions must encapsulate diagnosis and correction. They generally rely on solutions running with Master Data Management (MDM).
3.2.1.6 "Data services" functions
Data services aim to expose and disseminate data. They must expose it functionally, by the groupings, batches and latencies expected by the internal or external structures and applications that wish to use it. This part is to be developed and adapted to the domain.
3.2.1.7 The "cockpit"
This level pilots and oversees the distribution of flows and the orchestration of services.
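The functions above can be sketched as stages of one hypothetical integration library (all names illustrative; a real library would be far richer):

```python
import json

def transport(payloads):               # "transport": carry batches or messages
    yield from payloads

def connect_and_convert(raw):          # "connectivity" + "conversion": parse a
    return json.loads(raw)             # marked format (here JSON) into records

def synchronize(records):              # "synchronizing": deduplicate by id,
    latest = {}                        # keeping the last version of each record
    for record in records:
        latest[record["id"]] = record
    return latest

store = {}                             # "storage": keep the historical views

def load(records_by_id):
    for rid, record in records_by_id.items():
        store.setdefault(rid, []).append(record)

def data_service(record_id):           # "data services": expose stored data
    return store[record_id][-1]

# The "cockpit" pilots the whole flow end to end.
raw_flow = ['{"id": 1, "v": "a"}', '{"id": 1, "v": "b"}', '{"id": 2, "v": "c"}']
load(synchronize(connect_and_convert(r) for r in transport(raw_flow)))
assert data_service(1)["v"] == "b"
assert data_service(2)["v"] == "c"
```

Each stage matches one function of the library; swapping a stage (e.g. XML instead of JSON in conversion) leaves the rest of the chain untouched, which is the genericity the text argues for.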
3.2.2 Generic modelling of "data wells"
Wells were first designed to substitute N + P interactions for the N X P interactions that emerge naturally, according to the following scheme:
[Figure: N X P flows with N X P format conversions become, with a well ("puits") as focal point, N + P flows: CRM sources, external sources, the 360 vision, business intelligence and partners all exchange through the well's format conversions and the master data.]
For a discussion of the principle see: http://www.value-architecture.com/2014/03/a-la-decouverte-des-puits-de-donnees.html
The key to the design is the identification of the generic "grain" that makes it possible to trace all the evolutions of objects. Data, at this finest "grain" level, is placed in the well. Defining the events and the different dates is essential to ensure the stability of the model.
(See also: http://fr.slideshare.net/RenMANDEL/principe-du-puits.)
In principle wells must not be intrusive, nor require process redesign. It is out of the question to dispossess applications of their processes, which are in general deeply nested. However, wells provide traceability, in particular for the monitoring of data quality (cf. the "tri-dated" generic model).
[Figure: the "tri-dated" generic model of a well.]
There is a simple way to identify potential wells: objectify the cycles present in the ecosystem (see: http://www.value-architecture.com/2013/02/les-azimuts-des-chaines-de-valeur.html), because each cycle is at the origin of events, motivating a well dedicated to it. The well, key to coherence, aims at tracing one such cycle.
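One plausible reading of the "tri-dated" model is a record at the finest grain carrying three timestamps. The source does not spell out the three dates; the choice below (event date, date of knowledge, recording date) is an assumption, as are all names.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class WellEntry:
    """Hypothetical 'tri-dated' record at the finest grain.
    The three dates below are an assumed interpretation."""
    object_id: str
    attribute: str
    value: str
    event_date: date       # when the change occurred in the real world
    knowledge_date: date   # when the IS learned of it
    load_date: date        # when it was recorded in the well

# Entries are only ever appended, so an object's full history can be
# replayed along any of the three time axes (traceability, data quality).
history = [
    WellEntry("C1", "address", "Old St", date(2015, 1, 1),
              date(2015, 1, 2), date(2015, 1, 3)),
    WellEntry("C1", "address", "New St", date(2015, 3, 1),
              date(2015, 3, 5), date(2015, 3, 5)),
]
latest = max(history, key=lambda e: e.event_date)
assert latest.value == "New St"
```

Keeping the three axes separate is what makes the model stable: late-arriving or corrected data changes knowledge and load dates without rewriting the event history.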
3.2.3 Case of Master Data
Note that master data, which structure the IS, are rather positioned between several cycles, and ensure inter-cycle consistency.
The market of MDM solutions provides flexible and operational solutions.
Using the integration library will also allow a non-intrusive and scalable insertion, unlike some governance-heavy MDM projects.
The principle is to organize a focal point to disseminate the data that currently play the role of master data.
This does not exclude a gradually controlled evolution, in particular to ensure data quality and to synchronize sources. The impact on the supply process is then to be examined, and involves reengineering.
Some data and control functions can be shared between wells and Master Data repositories.
3.2.4 Manage subsidiarity
One might think that there is only one possible configuration for the architecture of reference data. In reality, the perimeters of these systems can vary, and current solutions can anticipate this flexibility.
As mentioned above, subsidiarity should on this occasion be modelled, and the model propagated between the different repositories and wells. A unique, integrated model is unlikely to strike a balance between:
• consistency on objects and common concepts,
• essential autonomy.
Subsidiarity is a sensitive issue; the "cursors" of subsidiarity are not always obvious. It is therefore useful to provide this setting, to create flexibility and resist possible organizational earthquakes.
3.3 PROFESSIONALIZING INFORMATION SYSTEM ACTORS
Around this issue of IS simplification, with the emergence presented above, opportunities will arise if actors move beyond current craft practices.
The question, overall, is to divert part of the energy placed in opportunistic investments, or in major methodological engagements, towards:
• investing, at lower cost, in "Janusian" components, avoiding the pitfall of conventional MDM approaches, too greedy in governance and "big bang",
• promoting internal "commercialization", with offers of high-level data services,
• taking into account the needs of the projects, at the decentralized level.
[Figure: wells and a 360 vision organized around a focal point, built on a conversion library.]
4 AN ALTERNATIVE TO GOVERNANCE
4.1 RESISTANCE TO PYRAMIDAL GOVERNANCE AND HEAVY METHODS
We have seen that the response to the inconsistencies, drifts and other confusions occurring naturally has been to introduce ever more governance of data and, more particularly, of reference data.
This governance responds effectively to a major control issue.
However it runs into several sources of resistance:
• It is an organization to set up, often with dedicated resources,
• There is often internal opposition, with different objectives and different trades, and a pyramidal development in the organization,
• Transitional arrangements to implement the system (MDM or other), in a rational and exclusive approach, are long and strewn with pitfalls,
• The conceptual foundations of these systems are fragile, and do not solve the issues of subsidiarity, dating, archiving, traceability, ...
Historical Enterprise Architecture methodologies are also heavy (impressive documentary volumes, consulting, ...) and costly ("certified" personnel, specialty procedures) to implement, and find their justification in large projects, which combine costs and risks.
4.2 HAVE DECISIONS NATURALLY GO IN THE RIGHT DIRECTION
We have seen that the availability on the market of "data integration" routines makes it possible to create, around these wells and repositories, a conversion belt for access in all temporalities and latencies, and according to all types of protocols.
These components, organized into 'data services', manage the complexity of "dialects" and opportunistic variabilities.
The challenge is to make these services transparent, easily accessible, operational, and demonstrable by a Proof of Concept, so that project stakeholders make good decisions without any form of coercion.
This technological opportunity offers a low-cost solution to the enormous issues mentioned here.
The motivation must come from project actors, responding to their wishes for:
• cost control,
• availability in the short term,
• agility in the medium term.