Many of us follow well-established practices for the development side of an integration project with tools like BizTalk, but even after many years of doing integration, a lot of projects still struggle with the process of working out what needs to be done. That puts a big burden on the development team, who are asked to deliver a project with poor information about the requirements. Analysis before development can be non-existent, or it can take a long time and still not capture the right information.
How do we do an effective job of gathering just the right amount of information, so that it is easy for a developer to deliver a good solution which is fit for purpose?
In this session Mike will share some ideas on this part of a project, with the aim of encouraging some community activity to help people in this area.
The Analysis Part of Integration Projects
1. Connected Systems Consulting Ltd
Sponsored & Brought to you by
The Analysis Part of Integration Projects
Michael Stephenson
https://twitter.com/michael_stephen
https://www.linkedin.com/in/michaelstephensonuk1
2. The Analysis Part of Integration Projects
Connected Systems Consulting Ltd
3. Connected Systems Consulting Ltd
Who am I?
Michael Stephenson
– UK-Based Freelance Consultant specializing in:
• BizTalk
• Windows Azure
• Integration
– Microsoft Integration MVP for 6+ years
– Pluralsight Author
– Azure Insider/Advisor
– One of the organizers of the UK Connected Systems User Group
– Founder of BizTalk Maturity Assessment – www.biztalkmaturity.com
– Worked on approx. 30 projects that have leveraged Azure
– First project went live about 4 years ago
– Contact Info:
• Blog: http://microsoftintegration.guru
• Twitter: @michael_stephen
• Linked In: http://www.linkedin.com/in/michaelstephensonuk1
• Email: michael_stephensonuk@yahoo.co.uk
5. Question
“How often do you feel thrown in at the deep end when starting an integration
development project because no one seems to have a clear understanding of
what is required?”
+1 in the chat window if you think this is common
7. Common Scenario 1
“I need a feed of customer data from system A to system B”
Err… OK, so what does this mean?
8. Common Scenario 2
[Diagram: a system landscape spanning Contoso, Fabrikam and Acme — a CRM System, ERP System, Finance System, Marketing System, two Products Systems, three Orders Systems and a Customers System, joined by arrows]
“It’s this interface here, can we just do this?”
Those arrows are pretty, but we don’t know anything about what they mean.
9. Common Scenario 3
“I need an additional data attribute added to an existing integration process”
This may be straightforward and we could just get on with it… but let’s get a little info to be sure.
10. Common Scenario 4
“We need to change the existing new customer process so it also updates the new CRM system”
I hope we have all of the data we might need.
12. What systems are involved?
• What does the data look like?
• How much data do we need to process?
• When do things happen?
• What is the process flow?
• Are there any alternative or exception scenarios?
• How do I talk to each application?
• Are there any SLAs?
• Is there any business logic?
• What are the data transformation rules?
• Will we need any new infrastructure?
• Are there any performance requirements?
• Are there any security requirements?
• Is there any documentation?
13. “There’s just so much unknown that affects how we design
and build the system”
14. How Important is Analysis?
Integration is easy when…
• The solution is very like something you have done before
• You are reusing stuff you have built in the past
• The applications involved are well understood
• Only a small number of people are involved
• The data is not complicated
• The process logic is simple
• You have a good test environment setup
• Everyone understands the process to be built
• You have good testing
• Your dependencies are well understood
Integration is hard when…
• You need new infrastructure
• You have lots of dependencies or they are poorly understood
• Lots of external vendors or people are involved
• You need to use patterns you have not implemented before
• The applications involved are not well understood
• The applications involved are unreliable
• You have a poor test environment setup
• You don’t follow good ALM practices
• You don’t have enough information
• Your requirements are not well understood or clear
• You have a poor or incorrect solution design
• You’re making it up as you go along
15. Getting it wrong
Not enough analysis
• Developer gets things wrong
• Architect makes bad assumptions
• Poor documentation
• Testing problems
• Delays to redo and fix things
• Developer ends up chasing around to do analysis that wasn’t
done
• Takes longer to deliver value
• Unhappy team
• Unhappy customer
Too much analysis
• Wasted time
• Too much documentation
• Lots of the information doesn’t add value
• Takes longer to deliver
• Costs too much
• Over complicates things
• Difficult to deal with change
• Time to deliver value is high
• Unhappy customer
16. Amount of Analysis
[Chart: cost of project risk plotted against amount of analysis, from low to high — the sweet spot is just enough analysis, but we are usually at one of the extremes]
18. Methodologies – In Theory
Waterfall: Analysis → Design → Develop, as one sequential pass
Agile: Analysis → Design → Develop, repeated in each iteration
19. Methodologies – In Practice
Waterfall: Analysis → Design → Develop, followed by re-doing the analysis and development we missed or got wrong
Agile: analysis and development blended together (what happened to architecture?)
[Chart: time spent compared with value delivered for each approach]
20. Agile vs Waterfall
Agile
• Just give it to the scrum team
• We are doing agile so things are assumed to be simpler
• The team will figure it out
• Often teams don’t do analysis sprints
• This should often be a sprint 0 type activity
• You usually have more stakeholders than just the product
owner
• Encourages POC of risk areas which is good
Waterfall
• Lots of analysis and design before any real world
validation
• Analyst/Architect not close enough to delivery to handle
change
• Analysis sometimes too detailed and overkill in terms of
presentation format
• Analysis that doesn’t add value
“In the real world a methodology often doesn’t solve the problem,
because teams still aren’t getting the right amount of the right information”
22. “Gathering the information we need to allow us to design, build and test the solution the customer
needs, and being able to communicate that information to the entire team”
Analysis is….
23. “Just enough analysis to do an effective job”
“Analyst and Architect working together”
25. Definition of Ready
Do we feel ready?
• Do we feel confident about what we need to do?
• Is the project broken down into stories or features?
• Do we need, or have, the right analysis artefacts?
Measuring Readiness
• INVEST
• I – Independent (of others)
• N – Negotiable
• V – Valuable
• E – Estimable
• S – Small
• T – Testable
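The INVEST criteria lend themselves to a simple scorecard. A minimal sketch with the six checks as booleans — the interpretation comments are one reading of each letter, not a prescribed definition:

```python
from dataclasses import dataclass

@dataclass
class StoryReadiness:
    """Illustrative INVEST scorecard for one candidate story."""
    independent: bool = False  # deliverable without waiting on other stories
    negotiable: bool = False   # scope can still be discussed
    valuable: bool = False     # the "in order to" value statement is clear
    estimatable: bool = False  # the team can put a size on it
    small: bool = False        # fits comfortably within an iteration
    testable: bool = False     # we know how we would prove it works

    def is_ready(self) -> bool:
        # Ready only when every INVEST check passes
        return all(vars(self).values())

story = StoryReadiness(independent=True, negotiable=True, valuable=True,
                       estimatable=True, small=True, testable=False)
assert not story.is_ready()  # not ready: no agreed way to test it yet
```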
26. Analysis Iterations
• Iteration 1 – High Level
• Iteration 2 - More Detail
• Iteration 3 – Reduce Risk
Iteration 1
Iteration 2
Iteration n
28. Step 1 – The User Story
Structure
In order to (why is it important)
As a ………. (who wants it)
I want …… (what do you need)
Example
In order to collect money from customers
As a sales director
I want the CRM system to process credit card
payments on a monthly basis
“If you can’t explain the value statement,
then why do we want to do it?”
29. Step 2 – The Context Diagram
[Diagram: Old CRM, New CRM, Orders Website and Orders System, connected by Lookup Customer, Customer Transfer, Orders and Process Orders flows]
31. Step 4 – High Level Process Logic
• Could you easily explain the process logic to someone?
32. Step 5 – High Level Data Items
• What entities?
• Any relationships?
• Any key data attributes?
• Where does data come from?
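A sketch of how the high-level entities, key attributes and relationships could be recorded at this stage; the Customer and Order entities are hypothetical examples, not from the source deck:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical entities for a customer/orders feed. At this point we only note
# the key attributes, where each entity is mastered, and the relationships.
@dataclass
class Customer:
    customer_id: str   # key attribute, mastered in the CRM system
    name: str

@dataclass
class Order:
    order_id: str      # key attribute, mastered in the Orders system
    customer_id: str   # relationship: each order belongs to one customer
    total: float

def orders_for(customer: Customer, orders: List[Order]) -> List[Order]:
    """A customer has many orders - worth capturing before detailed design."""
    return [o for o in orders if o.customer_id == customer.customer_id]
```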
33. Step 6 – Applications Involved
• What are the supported interfaces?
• What protocols are supported?
• Where are the applications?
• Who manages them?
• Who are the key contacts?
34. Step 7 – Non-Functional Requirements
• Is there a general list of NFRs?
• Are there any specific NFRs raised by the business?
• Are there any IT NFRs?
• How should the system perform?
35. Step 8 – Stakeholder Map
Actively Engage: Product Owner, HR Department, Support Team
Keep Satisfied: CRM Development Team, Infrastructure Team
Keep Informed: ERP Development Team, Finance Dept
Monitor: Warehouse Dept
• Make sure we identify stakeholders and where they fit into the map, so we can talk to the right people at the right time
• Make sure owners of error conditions are identified, not just owners of processes
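The quadrants above can be kept in a simple lookup so the map stays queryable during the project; the stakeholder names are the ones from the slide, and the helper function is illustrative:

```python
# Stakeholder map from the slide, keyed by engagement quadrant.
stakeholder_map = {
    "Actively Engage": ["Product Owner", "HR Department", "Support Team"],
    "Keep Satisfied": ["CRM Development Team", "Infrastructure Team"],
    "Keep Informed": ["ERP Development Team", "Finance Dept"],
    "Monitor": ["Warehouse Dept"],
}

def quadrant_for(stakeholder: str) -> str:
    """Answers 'how closely should we be working with this stakeholder?'"""
    for quadrant, names in stakeholder_map.items():
        if stakeholder in names:
            return quadrant
    # An unmapped stakeholder is itself a finding worth raising
    raise KeyError(f"Unmapped stakeholder: {stakeholder}")
```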
36. Step 9 – Which type of Integration?
• API & Services
• ESB / Messaging
• Batch Based
• Simple Orchestration
• Complex Orchestration
• Mixed
37. Step 10 – Candidate Architecture
• Is it something we already do?
• What patterns do we need?
• How might we build it?
• How confident are we about the design?
38. Step 11 – Create Integration Catalogue
• Agree a friendly Interface Name
• Identify Source and Destination Systems
• Identify Candidate Integration Type
• Batch
• API
• ESB/Messaging
• Complex Orchestration
• Identify Types of Data involved
• Identify grouping of interfaces
• Identify related Use Cases
• Relate non-functional requirements
• Relate candidate architecture
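The catalogue entries described above could be captured in a simple structure; the field names here are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogueEntry:
    """One row of an integration catalogue; field names are illustrative."""
    friendly_name: str
    source_system: str
    destination_system: str
    integration_type: str                 # e.g. Batch, API, ESB/Messaging
    data_types: List[str] = field(default_factory=list)
    interface_group: str = ""
    related_use_cases: List[str] = field(default_factory=list)
    nfr_refs: List[str] = field(default_factory=list)
    candidate_architecture: str = ""

# Example entry using systems from the earlier context diagram
entry = CatalogueEntry(
    friendly_name="Customer Transfer",
    source_system="Old CRM",
    destination_system="New CRM",
    integration_type="Batch",
    data_types=["Customer"],
)
```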
39. Checklist
What have we done
• Created an Integration Catalogue for the project which pulls together:
• User Stories
• High Level Context
• Identified Use Cases
• Drawn Process Logic
• High Level Data Model
• Applications Involved
• Stakeholders
• Non-Functional Requirements
• Candidate Architecture
How do we feel
• Are we confident about the process?
• Are we confident about the data?
• Are we confident about the candidate architecture?
• Does it feel simple or complex?
• Do we feel we have a good understanding?
42. “We have decided we need more information & detail before proceeding, what do we get next?”
What next?
43. Where do we look – Key Areas
1. Processing Logic
• What processing logic is required?
• What data transformation is required?
• Are there any patterns?
2. Application Connectivity
• How might we connect to the application?
• Is the application interface well defined and easy to work with?
3. The Data
• Does the data make sense?
• Does the data map between systems well?
44. Step 1 – Flush out features
• Find alternative scenarios
• Specification by example
• Find exceptions
• Confirm owner of exception scenarios
45. Step 2 – Define Data
• Get sample data
• Data Model
• Get data specifications if they exist
• XSD/XML
• WSDL/Swagger
• JSON
• Flat File Structure
• HL7
• EDI
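Once sample data and a target specification are in hand, even a crude structural comparison surfaces mismatches early. A sketch, with hypothetical field names, showing how a naming mismatch between the sample and the spec would show up:

```python
import json

# Hypothetical sample message captured from the source system.
sample = '{"customerId": "C-001", "name": "Contoso Ltd", "credit_limit": "1000"}'

# Fields the target system's (assumed) specification says it needs.
required_fields = {"customerId", "name", "creditLimit"}

record = json.loads(sample)
missing = required_fields - record.keys()
# The sample carries credit_limit where the spec expects creditLimit -
# exactly the kind of mismatch worth finding during analysis, not testing.
assert missing == {"creditLimit"}
```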
46. Step 3 – Define Data Mappings
• Encoding formats
• Reference data mappings (Look up values)
• Mr AA
• Mrs AB
• Miss AC
• Number formatting
• Leading zeros
• Decimal places
• Text
• Length
• Alignment
• Padding
• Date formatting
Important: this can often be an area which throws up more integration use cases or maintenance challenges.
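As a sketch of how the rule types listed above combine in one place, assuming hypothetical source and target field names and an assumed dd/mm/yyyy target date format:

```python
from datetime import datetime

# Reference data mapping for titles, as listed on the slide.
TITLE_CODES = {"Mr": "AA", "Mrs": "AB", "Miss": "AC"}

def map_customer(source: dict) -> dict:
    """Illustrative field-by-field mapping applying the rule types above."""
    return {
        # Reference data: look up the target system's code for the title
        "titleCode": TITLE_CODES[source["title"]],
        # Number formatting: pad the account number to 8 digits, leading zeros
        "accountNo": source["account_no"].zfill(8),
        # Text: fixed length 20, left-aligned, space-padded
        "name": source["name"].ljust(20)[:20],
        # Date formatting: ISO input, dd/mm/yyyy output (assumed target format)
        "createdOn": datetime.strptime(source["created"], "%Y-%m-%d")
                             .strftime("%d/%m/%Y"),
    }

mapped = map_customer({"title": "Mrs", "account_no": "123",
                       "name": "Contoso", "created": "2015-06-01"})
```

Writing the rules down this concretely is exactly where extra use cases tend to appear (what if the title is unmapped, or the account number exceeds 8 digits?).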
47. Step 4 – Application Integration Capabilities
• Do we have connectors?
• Are the connectors well understood?
• Are there challenges doing the connection?
• How will security work?
• Are there any throttling requirements?
48. Step 5 – Non-Functionals
• Revisit the NFRs and validate them with stakeholders
49. Step 6 – Patterns
• Are there any obvious integration patterns involved?
• Patterns can help create a common understanding
50. Step 7 – Design Decisions
• Are there any key design decisions to make?
• Who will be involved?
• How will the decision be made?
• How do we communicate the decision to everyone?
• How do we ensure the decision is followed?
51. Step 8 – Update Candidate Architecture
• Is our candidate still good?
• Are there any new things?
52. Checklist
What have we done
• More detailed process definition using Specification by Example
• Defined data in more detail
• Defined mappings
• Reviewed NFRs
• Identified patterns
• Updated architecture
How do we feel
• Are we confident about the process?
• Are we confident about the data?
• Are we confident about the candidate architecture?
• Do we understand the complexity?
55. “How do we reduce the risk around the areas we have concerns with”
What next?
56. Step 1 – Logical Process POC
• What to do
• Implement lightweight stubs of the applications so the process logic can be exercised end to end
• What does it achieve
• Allows focus on “will the process logic work”
• Mitigates the risk that the process logic will not work
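The stub-based approach can be sketched as follows; the CRM/ERP stubs and the ordering rule are hypothetical, standing in for whichever applications and process logic the project involves:

```python
# Minimal sketch of a logical-process POC: the real applications are replaced
# with stubs so only the process logic is exercised.
class StubCrm:
    def lookup_customer(self, customer_id):
        # Canned answer standing in for the real CRM lookup
        return {"id": customer_id, "status": "active"}

class StubErp:
    def __init__(self):
        self.created = []
    def create_order(self, customer, order):
        # Record the call instead of touching a real ERP
        self.created.append((customer["id"], order))

def process_order(crm, erp, customer_id, order):
    """The process logic under test: only take orders for active customers."""
    customer = crm.lookup_customer(customer_id)
    if customer["status"] != "active":
        raise ValueError("customer not active")
    erp.create_order(customer, order)

erp = StubErp()
process_order(StubCrm(), erp, "C-001", {"sku": "ABC", "qty": 2})
```

Because the stubs are trivial, any failure here points at the process logic itself, which is exactly the risk this POC is meant to retire.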
57. Step 2 – Application Integration POC
• What to do
• Test that we can connect to the application
• Create BDD-style Specflow tests of the application interface
• What does this achieve
• Tests that the application behaves the way we expect it to
• Tests that the data looks the way we expect it to
• Tests assumptions about error conditions
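Specflow is a .NET tool, but the given/when/then shape of these interface tests is language-neutral. A minimal sketch of that shape using Python's unittest, against a hypothetical stub customer API (the real POC would target the actual application interface):

```python
import unittest

class FakeCustomerApi:
    """Stub standing in for the real application interface (hypothetical)."""
    def get_customer(self, customer_id):
        if customer_id == "C-001":
            return {"id": "C-001", "name": "Contoso Ltd"}
        raise LookupError("customer not found")

class CustomerInterfaceSpec(unittest.TestCase):
    def test_known_customer_is_returned(self):
        # Given a customer that exists in the application
        api = FakeCustomerApi()
        # When we request it over the interface
        result = api.get_customer("C-001")
        # Then the data looks the way we expect it to
        self.assertEqual(result["name"], "Contoso Ltd")

    def test_unknown_customer_raises_the_expected_error(self):
        # Tests our assumption about the error condition
        with self.assertRaises(LookupError):
            FakeCustomerApi().get_customer("C-999")
```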
58. Step 3 – Update Candidate Architecture
• Is our candidate still good?
• Are there any new things?
59. Checklist
What have we done
• POC’d the key risk areas
• Updated architecture candidate
How do we feel
• Do we have any concerns?
• Have we mitigated risks?
61. Analysis Iterations
• Just enough analysis to set us up to be successful
• Use the iterations that are needed
• Make sure information is clear and can be
understood by anyone in the team
Iteration 1
Iteration 2
Iteration n
62. Takeaway
• Define your organisation’s definition of ready?
• What analysis artefacts do you usually need?
• Share it?
• Who is up for some community guidance around this?
• Technet wiki or something?
Editor’s notes
Purple means they are related to the analysis and design phase of a project
Sometimes we use a methodology which proposes ideas around how things should be done which are misinterpreted or misused and it results in not doing an effective job.
Regardless of methodology we need a certain minimum level of understanding of what we need to do otherwise we will be guessing which is never a good thing.
Sometimes too much analysis can be detrimental though
We need to appreciate that all stakeholders in the project have a different perspective and will see things in a different way.
As an architect we need to be able to imagine ourselves in each of these positions to understand the needs of each different stakeholder.