On Analyzing Self-Driving Networks: A Systems Thinking Approach
1. On Analyzing Self-Driving Networks:
A Systems Thinking Approach
Junaid Qadir, Associate Professor, Information Technology University (ITU), Pakistan
3. Session Outline
1. The Problems: Motivating Systems Thinking
• Complex Adaptive Systems
• Tame Problems vs. Wicked Problems
• Problems Faced Due to Conventional (Non-Systems) Thinking
2. The Solution: What Is Systems Thinking?
• Defining Characteristics of Systems Thinking
• The Systems Thinking Concepts of Leverage and Archetypes
• Systems Thinking Tools
3. Systems Thinking for the Internet and Future AI-Driven SDNs
• The Paradoxes of Internet Design
• Ethical and Policy Challenges in SDNs
4. Motivating Systems Thinking
You think that because you understand "one" you must therefore understand "two," because one and one make two. But you forget that you must also understand "and."
5. So, What Is a System?
"A system must consist of three kinds of things: elements, interconnections, and a function or purpose."
"A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time."
"The behavior of a system cannot be known just by knowing the elements of which the system is made."
― Donella H. Meadows, Thinking in Systems: A Primer
6. Complex Systems
Most important systems of interest are complex systems, which have the following properties:
• Tightly coupled: "Everything influences everything else"; "You can't just do one thing."
• Dynamic: Change occurs at many time scales.
• Policy resistant: Many obvious solutions to problems fail or actually worsen the situation.
• Counterintuitive: Cause and effect are distant in time and space.
• Exhibit trade-offs: Long-term behavior is often different from short-term behavior.
7. Complex Systems vs. Complex Adaptive Systems
• In complex systems (e.g., the weather), the system does not react to predictions about it.
• In complex adaptive systems (e.g., markets), the system reacts to predictions about it.
Social systems are typically nonlinear, multi-loop-feedback complex adaptive systems, which makes their behavior counterintuitive.
8. Tame Problems vs. Wicked Problems
Wicked problems are stubborn problems that refuse to go away and often become worse despite our efforts and best intentions.
Examples: racism, poverty, terrorism, water management.
Why are these problems so stubborn?
9. Problems Faced Due to Conventional (Non-Systems) Thinking
1. Incorrect mental models
2. Reductionism
3. Policy resistance
4. Problem solving through symptom treatment
5. Unintended consequences
6. Optimizing the parts at the cost of the whole
7. Intervening and causing problems
10. Mental Models
A mental model is how we perceive reality. It comes from our backgrounds, values, expectations, and experiences. (Example: the six blind men and the elephant.)
Incorrect assumptions in common mental models:
1. Linear instead of nonlinear
2. Causal links rather than loops
3. Immediate feedback rather than delayed feedback
4. Reductionism vs. holism
11. Reductionism
A common way of doing analysis is to break the system down into parts and then analyze the parts. But this approach suffers from a problem, which may be called the Humpty Dumpty problem or the broken-mirror problem: once you have broken the mirror into pieces, you cannot see the whole again.
12. Policy Resistance
The cobra effect (policy resistance and unintended consequences): During the British Raj, a bounty system was devised to counter the rise of venomous cobras. The system worked really well; a lot of cobras were killed. Except that entrepreneurs figured out they could make money by farming cobras and killing them for the bounty. After the government scrapped the system, there were more cobras than before.
Measures taken to improve a situation can directly make it worse, due to policy resistance arising from people adapting to the intervention. (Or: the perils of giving lollipops.)
13. Problem Solving Through Symptom Treatment
Leverage points are places within a complex system where a small shift in one thing can produce big changes in everything. Interventions that address the systemic structure are higher-leverage than those that only address symptoms.
"There's always an easy solution to every problem that is neat, plausible, and wrong."—H. L. Mencken
15. Mercedes
A star-studded team can fail spectacularly if the parts don't fit well, or are not focused on the overall purpose.
Optimizing the parts rather than the whole.
16. Intervening and Causing Problems
"The chief source of problems is solutions."—Eric Sevareid
In medicine, there is a special name for problems created by a doctor's interventions: iatrogenesis, the "wounds brought forth by the healer."
"When you are confronted by any complex social system, … you cannot just step in and set about fixing with much hope of helping. This realization is one of the sore discouragements of our century. … You cannot meddle with one part of a complex system from the outside without the almost certain risk of setting off disastrous events that you hadn't counted on in other, remote parts. If you want to fix something you are first obliged to understand … the whole system. … Intervening is a way of causing trouble."
17. Peter Senge's Laws of Systems
1. Today's problems come from yesterday's solutions.
2. The harder you push, the harder the system pushes back.
3. Behavior grows better before it grows worse.
4. The easy way out usually leads back in.
5. The cure can be worse than the disease.
6. Faster is slower.
7. Cause and effect are not closely related in time and space.
8. Small changes can produce big results—but the areas of highest leverage are often the least obvious.
9. You can have your cake and eat it too—but not at once.
10. Dividing an elephant in half does not produce two small elephants.
11. There is no blame.
18. What Is Systems Thinking?
"Systems thinking is the art and science of linking structure to performance, and performance to structure—often for purposes of changing structure (relationships) so as to improve performance."—Barry Richmond
19. Linear Thinking vs. Systems Thinking
If we enhance tourism in a place, it will attract temporary immigrants, which causes the population to increase, creating its own problems (social issues, pollution, more resources needed, more crime, etc.).
20. Feedback View of the World
"Everything we do as individuals, as an industry, or as a society is done in the context of an information-feedback system."—MIT's Jay Forrester
The non-feedback view of the world (open-loop thinking) vs. closed-loop thinking.
21. System-as-a-Cause Thinking
Exogenous point of view: "Sam is always mean to Pam. It's all his fault. If he would be nicer, Pam's life would be better."
Endogenous point of view: "Maybe there is something Pam is doing which is causing Sam to be mean…"
Causal-loop view (R): Pam's mean behavior → Sam's hurt feelings → Sam's mean behavior → Pam's hurt feelings → back to Pam's mean behavior (a reinforcing loop).
"Systems are perfectly designed to achieve the results they are currently achieving."—W. Edwards Deming
Exogenous: originating externally. Endogenous: originating internally.
22. There's No Blame! System-as-a-Cause Thinking
"Stop looking for who's to blame; instead you'll start asking, 'What's the system?' The concept of feedback opens up the idea that a system can cause its own behavior." ―Donella H. Meadows
The endogenous way is more empathetic and proactive. A famous quote in management: "If you are not aware of how you are part of the problem, you can't be part of the solution."—Bill Torbert
23. Taking Delayed Feedback Into Account
"Because of feedback delays within complex systems, by the time a problem becomes apparent it may be unnecessarily difficult to solve." ―Donella H. Meadows, Thinking in Systems: A Primer
The parable of the boiled frog: we have the intuition that cause and effect are located close in time and space. This is not always true in complex systems.
25. The Concept of Leverage
"Small changes can produce big results—but the areas of highest leverage are often the least obvious."—Peter Senge
Pioneering research by Jay Forrester showed that in many social systems the high-leverage interventions are non-obvious, and even when identified, they are liable to be pushed in the wrong direction.
For example, to fix urbanization problems, we need to adjust low-cost public housing: we need less, rather than more, low-cost public housing.
26. Systems Thinking Tools
• Qualitative tools: causal loop diagrams, a framework for seeing interrelationships rather than things; they can help in identifying reinforcing (R) and balancing (B) processes.
• Quantitative tools: stock and flow diagrams, which, unlike causal loops, provide information about rates of change and accumulations.
System dynamics is grounded in control theory and the modern theory of nonlinear dynamics and offers many other rigorous tools.
27. Causal Links
An arrow indicates a causal relationship between two variables: a change in the state of one variable produces a change in the other.
28. Only Two Kinds of Feedback Loops
• Reinforcing loop (R): positive feedback. The R-loop represents growing or declining action.
• Balancing loop (B): negative (counteracting) feedback. The B-loop seeks stability, a return to control, or a specific target.
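The characteristic behavior of the two loop types can be sketched in a few lines of code. This is a minimal illustration, not a model from the slides; the growth and adjustment rates are made-up values chosen only to make the two behaviors visible.

```python
# Minimal sketch of the two basic feedback loops.
# The rates below are illustrative values, not taken from any real system.

def reinforcing_loop(stock: float, growth_rate: float, steps: int) -> list[float]:
    """R-loop: the stock feeds its own growth (positive feedback)."""
    history = [stock]
    for _ in range(steps):
        stock += growth_rate * stock            # change is proportional to the stock itself
        history.append(stock)
    return history

def balancing_loop(stock: float, target: float, adjust_rate: float, steps: int) -> list[float]:
    """B-loop: the stock is pushed toward a goal (negative, counteracting feedback)."""
    history = [stock]
    for _ in range(steps):
        stock += adjust_rate * (target - stock)  # change shrinks as the gap to the goal closes
        history.append(stock)
    return history

r = reinforcing_loop(stock=100.0, growth_rate=0.1, steps=20)
b = balancing_loop(stock=100.0, target=40.0, adjust_rate=0.3, steps=20)
print(f"R-loop: {r[0]:.0f} -> {r[-1]:.0f} (accelerating growth)")
print(f"B-loop: {b[0]:.0f} -> {b[-1]:.1f} (settles near the target of 40)")
```

The R-loop compounds (exponential growth or decline); the B-loop's correction shrinks as the gap closes, so it converges on the target instead of overshooting forever.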
31. Thinking in Stocks and Flows
If birth rates decline all over the world, will the population continue to grow? We can use the bathtub analogy to understand why, despite a global reduction in birth rates, the world population continues to increase. We can understand this using accumulations (stocks) and rates (flows): the world population is a stock, filled by the birth rate (inflow) and drained by the death rate (outflow). As long as the inflow exceeds the outflow, the stock keeps rising even while the inflow falls.
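The bathtub logic above can be made concrete with a toy stock-and-flow simulation. All numbers here are invented for illustration (they are not real demographic data): the birth rate falls every single year, yet the population keeps rising because births still exceed deaths.

```python
# Toy stock-and-flow model: world population as a bathtub.
# All numbers are illustrative, not real demographic data.

population = 7.0e9          # stock (people)
birth_rate = 0.020          # inflow, as a fraction of the stock per year (declining)
death_rate = 0.008          # outflow, as a fraction of the stock per year (held constant)

trajectory = [population]
for year in range(30):
    births = birth_rate * population   # inflow fills the tub
    deaths = death_rate * population   # outflow drains it
    population += births - deaths      # the net flow is what changes the stock
    birth_rate *= 0.98                 # the birth rate declines 2% every year
    trajectory.append(population)

# The inflow fell every single year, yet the stock still grew throughout:
print(f"birth rate fell from 2.00% to {birth_rate:.2%}")
print(f"population grew from {trajectory[0]:.2e} to {trajectory[-1]:.2e}")
```

The point of the sketch is the stock/flow distinction itself: a stock responds to the *difference* between its flows, not to the trend of any single flow.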
32. The bathtub analogy was popularized by Sterman to clear up the common misconception, widespread even in the sustainable development movement, that CO2 in the atmosphere will drop simply because we reduce our emissions.
34. Archetypes 1 and 2
Fixes That Backfire (a quick solution with unexpected long-term consequences): In a "Fixes That Fail" situation, a problem symptom cries out for resolution. A solution is quickly implemented that alleviates the symptom (B1), but the unintended consequences of the "fix" exacerbate the problem (R2). Over time, the problem symptom returns to its previous level or becomes worse.
Shifting the Burden (systems unconsciously favor short-term, addictive solutions): In a "Shifting the Burden" situation, a problem is "solved" by applying a symptomatic solution (B1), which diverts attention away from more fundamental solutions (R3). In an "Addiction" structure, "Shifting the Burden" degrades into an addictive pattern in which the side effect gets so entrenched that it overwhelms the original problem symptom.
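The "better before worse" signature of Fixes That Fail can be sketched numerically. This is a hypothetical model with made-up coefficients: the quick fix suppresses the symptom immediately (the B1 loop) while quietly feeding a hidden side effect whose delayed consequences eventually dominate (the R2 loop).

```python
# Hypothetical "Fixes That Fail" sketch: a quick fix relieves the symptom now,
# but each application feeds an underlying side effect that re-inflames the
# symptom later. All coefficients are made up for illustration.

symptom = 10.0          # visible problem level
side_effect = 0.0       # hidden consequence accumulated by repeated fixing
history = [symptom]

for step in range(25):
    fix_effort = 0.5 * symptom          # B1: react to the symptom with a quick fix...
    symptom -= fix_effort               # ...which relieves it immediately
    side_effect += 0.4 * fix_effort     # R2: each fix adds to the hidden side effect
    symptom += 0.3 * side_effect        # the delayed consequence worsens the symptom
    history.append(symptom)

best = min(history)
print(f"symptom: start {history[0]:.1f}, best {best:.1f}, end {history[-1]:.1f}")
# Better before worse: the symptom first drops well below its starting level,
# then climbs back past it as the side effect compounds.
```

The trajectory dips early and then overshoots the starting level, which is exactly the pattern the archetype warns about: judging the fix by its short-term effect hides the long-term reversal.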
35. Archetypes 3 and 4
Success to the Successful (things get better for "winners" and worse for "losers"): If one person or group (A) is given more resources, it has a higher likelihood of succeeding than B (assuming they are equally capable). The initial success justifies devoting more resources to A, and B's success diminishes, further justifying more resource allocation to A (R2)—a virtuous cycle for A and a vicious cycle for B.
Limits to Growth (improvement accelerates and then suddenly stalls): Continued efforts initially lead to improved performance. Over time, however, the system encounters a limit which causes performance to slow down or even decline (B2), even as efforts continue to rise.
36. Archetypes 5 and 6
Tragedy of the Commons (a shared, unmanaged resource collapses due to overconsumption): Each person pursues actions which are individually beneficial (R1 and R2). If the amount of activity grows too large for the system to support, however, the "commons" becomes overloaded and experiences diminishing benefits (B5 and B6).
Escalation (different parties take actions to counter a perceived threat): One party (A) takes actions that are perceived by the other as a threat. The other party (B) responds in a similar manner, increasing the threat to A and resulting in more threatening actions by A. The reinforcing loop is traced out by following the figure-8 outline produced by the two balancing loops.
37. Systems Thinking for the Internet and Future AI-Driven Self-Driving Networks
• The Paradoxes of Internet Design
• Ethical and Policy Challenges in SDNs
41. The Paradoxes of Internet Architecture
Three fundamental problems with the Internet today: (1) spam, (2) privacy and security, (3) quality of service.
Keshav points out that, paradoxically, the Internet's architectural elements most responsible for its success are also responsible for its most vexing problems.
"Technology is both a burden and a blessing; not either-or, but this-and-that."—Neil Postman
42. System Archetypes in Networking
• Fixes That Backfire (a quick solution with unexpected long-term consequences): IPv4 NAT; bufferbloat
• Shifting the Burden (systems unconsciously favor short-term, addictive solutions): IPv4 NAT; cross-layer design; tussles in cyberspace
• Limits to Growth (improvement accelerates and then suddenly stalls): IPv4
• Success to the Successful (things get better for "winners" and worse for "losers"): network neutrality; walled gardens
• Tragedy of the Commons (shared, unmanaged resource collapses due to overconsumption): spectrum commons
• Escalation (different parties take actions to counter a perceived threat): ad blocking and counter-blocking
43. Ethical and Policy Challenges for SDNs
There are ethical choices in every single algorithm we build. The question of agency—i.e., "who will take the ethical decision?"—also looms large for self-driving networks.
In the movie Minority Report, Tom Cruise's character tackles and handcuffs individuals who have committed no crime (yet), proclaiming: "By mandate of the District of Columbia Precrime Division, I'm placing you under arrest for the future murder of Sarah Marks and Donald Dubin."
The arrested person confronts him and asks: "You ever get any false positives?"
44. AI Can Help Strengthen Stereotypes
Harvard professor Latanya Sweeney showed, for example, that online ads suggestive of arrest records were served disproportionately often for searches of black-identifying names.
45. Security Challenges
"No problem stays solved in a dynamic environment."—Russell Ackoff
Since their algorithms are trained on historical datasets, self-driving networks will always be vulnerable to future, evolved adversarial attacks.
46. Concluding Remarks
Systems thinking helps us make sense of interdependency in complex systems, and of the holistic behavior of a system, by understanding the feedback loops at play.
With the rise of interest in self-driving networks, which will become part of the larger Internet, there is a need to look rigorously at how these technologies will affect—positively as well as negatively—all the stakeholders.
Contact: junaid.qadir@itu.edu.pk
47. Credits/Acknowledgments
Figures from various sources:
• Cover pages of various IEEE and ACM research papers
• Slides made available by G. Richardson
• Slides made available by Professor Ockie Bosch and Dr. Nam Nguyen
• Wikipedia/online pictures of Forrester and Deming
• Various book-cover photos from Amazon.com
• Snapshots from YouTube videos and online websites
• Stock photos from various places; the respective owners retain ownership of the content
Books:
• Business Dynamics: Systems Thinking and Modeling for a Complex World (Sterman)
• Systems Thinking Made Simple: New Hope for Solving Wicked Problems (Cabrera & Cabrera)
These resources have been used in these lecture slides for educational purposes under the fair-use doctrine. The ownership of these resources, if copyrighted, is retained by their respective copyright owners.
Various presentations at SlideShare, including:
https://www.slideshare.net/Think2Impact/
Speaker notes
Social systems are not linear but belong to the class of systems called multi-loop nonlinear feedback systems.
Level two chaotic systems (Harari):
"History cannot be explained deterministically and it cannot be predicted because it is chaotic. So many forces are at work and their interactions are so complex that extremely small variations in the strength of the forces and the way they interact produce huge differences in outcomes. Not only that, but history is what is called a 'level two' chaotic system. Chaotic systems come in two shapes. Level one chaos is chaos that does not react to predictions about it. The weather, for example, is a level one chaotic system. Though it is influenced by myriad factors, we can build computer models that take more and more of them into consideration, and produce better and better weather forecasts. Level two chaos is chaos that reacts to predictions about it, and therefore can never be predicted accurately. Markets, for example, are a level two chaotic system."
A complex adaptive system, as distinguished from a complex system in general, is one that can understand itself and change based on that understanding. Complex adaptive systems are social systems. The difference is best illustrated by thinking about weather prediction contrasted to stock market prediction. The weather will not change based on an important forecaster’s opinion, but the stock market might. Complex adaptive systems are thus fundamentally not predictable.
Us vs. Them Mentality; Vested Interests
The systems thinking worldview dispels the “us versus them” mentality by expanding the boundary of our thinking. Within the framework of systems thinking, “us” and “them” are part of the same system and thus responsible for both the problems and their solutions.
“From a very early age, we are taught to break apart problems, to fragment the world. This apparently makes complex tasks and subjects more manageable, but we pay a hidden, enormous price. But, as physicist David Bohm says, the task is futile—similar to trying to reassemble the fragments of a broken mirror to see a true reflection. Thus, after a while we give up trying to see the whole altogether.”
Unfortunately, our problem-solving instinct also creates a number of followup problems and networking systems (including future self-driving networks) are not immune to this tendency [40]. Systems thinking can help us anticipate and avoid the negative consequences of well-intentioned solutions. This can be done both prospectively by anticipating unintended consequences during strategic planning or retrospectively by understanding more deeply the non-obvious causes of existing chronic complex social problems.
As we move our perspective from the event level to the structural level, we have a better understanding of what is really going on in a system.
We also increase our ability to influence and change the system’s behaviour, i.e., we can make adjustments to the structure which are consistent with the behaviour we would like to produce.
As we move our perspective from the event level to the structural level, we have a better understanding of what is really going on in a system.
We also increase our ability to influence and change the system’s behaviour, i.e., we can make adjustments to the structure which are consistent with the behaviour we would like to produce.
As pointed out by H. L. Mencken, "There's always an easy solution to every problem that is neat, plausible, and wrong."
2) The harder you push, the harder the system pushes back.
This is, in systems thinking parlance, "compensating feedback". This happens in conversation, government programs, business and personal wellness efforts. In conversations we often try to argue our point by disagreeing with the other person. Our "push" helps them strengthen their position. We make them think and fight harder. Social programs are rampant with examples of community improvement initiatives that resulted in worse conditions. Government programs that aim to improve the living conditions for a group in one part of town result in more people moving into the area, placing an unsupportable burden on the systems in place.
4) The easy way out usually leads back in.
Kaplan called this the law of the instrument saying ”Give a small boy a hammer, and he will find that everything he encounters needs pounding”. Maslow reframed this saying “If all you have is a hammer, everything looks like a nail”.
“In a modern version of an ancient Sufi story, a passerby encounters a drunk on his hands and knees under a street lamp. He offers to help and finds out that the drunk is looking for his house keys. After several minutes, he asks, "Where did you drop them?" The drunk replies that he dropped them outside his front door. "Then why look for them here?" asks the passerby. "Because," says the drunk, "there is no light by my doorway.” Excerpt From: Peter M Senge. “The Fifth Discipline: The Art and Practice of the Learning Organization: First Edition.” iBooks.
“Pushing harder and harder on familiar solutions, while fundamental problems persist or worsen, is a reliable indicator of nonsystemic thinking—what we often call the "what we need here is a bigger hammer" syndrome.”
5) The cure can be worse than the disease.
This is also called “shifting the burden” and is easy to confuse with the push/push back law. It’s slightly different though and can occur at the same time. The “cure” in this case is an intervention that is enabling and becomes addictive. As dependence on the intervention increases the system’s ability to cure itself lessens. This is about the difference between giving a man a fish and teaching him how to fish. If an intervention is needed then we have to make sure the intervention doesn’t weaken the entire system causing more and more dependence. In some ways public education shifted the burden of teaching children from parents to teachers. Engaging stakeholders in defining problems and finding solutions keeps the burden where it belongs, shared across the entire system and not just on one part of it.
“ The phenomenon of short-term improvements leading to long-term dependency is so common, it has its own name among systems thinkers—it's called "Shifting the Burden to the Intervenor." The intervenor may be federal assistance to cities, food relief agencies, or welfare programs. All "help" a host system, only to leave the system fundamentally weaker than before and more in need of further help.”
7) Cause and effect are not closely related in time and space.
How many times do you push an elevator button? How often have you over compensated for the amount of cold water in the shower? We tend to believe that when we do something there should be an effect that we can see within a set amount of time. All of our testing, funding and business practices reflect this belief. The challenge is that sometimes there is a clear and present relationship between cause and effect. Just not all the time. When you actively inform, engage and include your community you provide them with an opportunity to see the real space between cause and effect.
8) Small changes can produce big results – but the areas of highest leverage are often the least obvious.
Butterfly wings and hurricanes. This is the law of leverage. Small, focused actions at the right place in the system can produce the biggest and best changes. The challenge is that the “right place” is not obvious and can seem counterintuitive. Donella Meadows describes 12 places of leverage in a system and close to the top of the list is the goals of the system. Some say that the education system isn’t working. I think it works quite well given the original goal of producing factory workers. Right now the goal is changing, in part because of changes in technology that allow education system stakeholders to collaborate and cooperate and influence the goals of the system. See #6 and 7 above. A great example of small, counterintuitive actions are the use of insects to control insects. Wasps are introduced in many a greenhouse as a way to control other insects that feed on the greenhouse crop. Brilliant, effective and not the most intuitive solution. The key to being able to use leverage in a system is knowing the structure of the system. In education the structure is massive and very few people know it well enough to intuit where the leverage points are.
11) There is no blame.
In a complex adaptive system there is no separate “other”. Everything and everyone is connected and together we co-create the whole system. Sometimes we have difficulty with this. We reflex to blame, we deflect, and deny. Its hard to take full responsibility for something that seems to be outside of our control without trying to control everything. It can feel like two competing ideas and for many that feeling is uncomfortable. Senge suggests:
The cure lies with the relationships with the very people we typically blame for the problems we are trying to solve.
“Systems thinking shows us that there is no outside; that you and the cause of your problems are part of a single system. The cure lies in your relationship with your "enemy.”
“The feedback perspective suggests that everyone shares responsibility for problems generated by a system. That doesn't necessarily imply that everyone involved can exert equal leverage in changing the system.”
Feedback is so commonplace that MIT's Jay Forrester, the pioneer of the system dynamics discipline, said, "Everything we do as individuals, as an industry, or as a society is done in the context of an information-feedback system." A key tenet of the systems thinking worldview is that problems are internally generated—we often create our own "worst nightmares."
Previous research in system dynamics has shown that most intuitively obvious policy interventions in complex social systems are low-leverage (i.e., they do not produce significant long-run change and will likely also create other problems) and only a few policy interventions are high-leverage (i.e., capable of producing substantial long-run change). System dynamics research has consistently highlighted the counterintuitive nature of complex social systems in that the high-leverage points are not where most people expect, and even if these points are identified, they are prone to be altered in the wrong direction by people [11].
In addition, the process reinforces the stock and flow way of thinking by emphasizing the difference between information and material flows, and the importance of unit consistency throughout a diagram.
Breaking a “Fixes that Fail” cycle usually requires acknowledging that the fix is merely alleviating a symptom, and making a commitment to solve the real problem now.
A two-pronged attack of applying the fix and planning out the solution will help ensure that you don't get caught in a perpetual cycle of solving yesterday's "solutions."
“People “shift the burden” of their problem to other solutions—well-intentioned, easy fixes which seem extremely efficient.” ― Peter M. Senge, The Fifth Discipline: The Art and Practice of the Learning Organization: First edition
Fixes that Backfire
This system archetype is associated with the concept of unintended consequences. Fixes that backfire are characterized by the use of a quick fix to reduce a problem symptom that works in the short run but at the cost of long-term consequences (which people often fail to see due to long system delays). This is a common pitfall in networks, with some networking examples including (1) increasing queue buffers to decrease packet loss but instead causing bufferbloat [14], and (2) introducing additional links to an existing system only to see overall performance drop (Braess' paradox) [24].
Shifting the Burden
This archetype is associated with the concept of unintended dependence. This arises from dependence on a quick fix, which is resorted to when the more fundamental solution is too expensive or too difficult to be implemented. This archetype differs from "fixes that backfire" since the fundamental solution may not be apparent or applicable in the latter. The best example is seen in network capacity planning, where operators would rather over-dimension the network than implement more complex long-term solutions.
Shifting the burden
• Problem symptoms are usually easier to recognize than the other elements of the structure.
• If the side-effect has become the problem, you may be dealing with an “Addiction” structure.
• Whether a solution is "symptomatic" or "fundamental" often depends on one's perspective. Explore the problem from a differing perspective in order to come to a more comprehensive understanding of what the fundamental solution may be.
Limits to Growth
This archetype describes the concept of unanticipated constraints, based on the insight that no physical system can sustain growth indefinitely. Any engine of growth, however successful, will inevitably be constrained by internal and external bottlenecks and constraints, e.g., Meadows [32] showed that we cannot sustainably support perpetual growth in a finite world. This is a long-standing worry in communications networks. For example, researchers are now exploring how a permanent energy crisis scenario may fundamentally limit our ability to maintain the current-day Internet architecture and what our response should be in such an eventuality [41] [39].
Effective solutions for a "Tragedy of the Commons" scenario never lie at the individual level.
Ask questions such as:
“What are the incentives for individuals to persist in their actions?”
“Can the long-term collective loss be made more real and immediate to the individual actors?”
Find ways to reconcile short-term individual incentives with long-term cumulative consequences. A governing body that is chartered with the sustainability of the resource can help.
Of all the system structures that affect our lives and world, escalation takes the prize for its negative impact. In situations such as price wars and arms races—prime examples of the escalation dynamic—two competing entities operate, each with the goal of staying ahead of the other. As shown in “Escalation Template,” the basic causal loop structure of escalation has two balancing loops, as each party takes action to achieve its goal and move a bit ahead of its opponent.
It is important to note that reinforcing loops do not have to be made up of all, or even any, “s” links. This diagram has two “o” links and six “s” links, so it is a reinforcing loop.
We can use the systems thinking concept of system-as-a-cause to explain how the perennial Internet nuisances (such as spam and lack of privacy, security and QoS) are not isolated problems but, as noted by Keshav [25], follow endogenously as the byproducts of the Internet's design preferences. This work points out that paradoxically the Internet's architectural elements most responsible for its success are also responsible for its most vexing problems. It is clear that if we want to fix these ancillary problems, this cannot be achieved superficially without changing the systemic causes.
Thus, there is "not a single happy family of people" on the Internet with aligned goals [7]. Apart from tussles and conflicts, Internet protocols and applications also often face dilemmas in which the goals of the subsystem and the overall system conflict. One of the major insights of systems thinking is that the best way to optimize a system is not to independently optimize each subsystem but to optimize the relationships among the parts (which often is the bottleneck).
“Addiction is finding a quick and dirty solution to the symptom of the problem, which prevents or distracts one from the harder and longer-term task of solving the real problem.” ― Donella H. Meadows, Thinking in Systems: A Primer
Ben Shneiderman (University of Maryland) argues against autonomous systems.
His point is that it is essential to keep a human in the loop. If not, you run the risk of abdicating ethical responsibility for system design.