IT Cost Transparency is not about "what does IT cost?" It's about why IT costs what it does, and how IT can use its money and other resources to create more value.
12. [Slide: Apptio TBM Unified Model™ (ATUM™) — Standard IT Cost Categories]
Breakouts: Plan, CapEx/OpEx, Run/Grow/Transform, Fixed/Variable, Location/Regions, Month/Quarter/Year, Cost Centers, Accounts, General Ledger Line Item Detail.
Cost Transparency: Business Units — Business Units 1–4, mapped to business capabilities (Design & Develop; Buy, Build & Deliver; Market & Sell; Service Customer; Shared Services).
Cost Transparency: Applications and Services — Your Applications (Applications 1–4); Technology Services (Application & Integration, Delivery Services, Hosting Services, Professional Services); End User Services (Client Computing, Communication & Collaboration, Connectivity, End-User Support, Records Management); Project Services (Business-Driven, Infrastructure, Process, and Compliance Initiatives).
Cost Transparency: Foundation — IT Towers & Sub-Towers: Data Center, Compute, Storage, Data Network, Voice Network, IT Operations, Delivery Services, User Services, Application, Security & Compliance, IT Management; sub-towers include Wintel/Linux/Unix Servers, Mainframe, Locations, Storage Tiers 1–4, Office LAN, DC LAN, WAN, MAN, Internet, Voice Premise, Wide Area Voice, Call Center, Project Mgmt, Client Mgmt, PCs, Mobile Devices, Service Desk, Field Services, App Dev, App Support, QA (Testing), Middleware, Software Licenses, IT Mgmt & Admin, IT Finance, IT Vendor Mgmt, Ops Center, Service Mgmt, Security, Compliance, Disaster Recovery, and Architecture.
Cost Pools: Hardware, Software, Internal Labor, External Labor, Facilities/Overhead, Outside Services.
(Legend: = Peer Benchmarks Available.)
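To make the model concrete before we go on, here is a minimal sketch of the ATUM layers as plain data structures; the tower-to-sub-tower grouping is an illustrative subset inferred from the slide, not the official ATUM mapping.

    # Minimal sketch of the ATUM layers as plain Python data structures.
    # The tower -> sub-tower grouping below is an illustrative subset
    # inferred from the slide, not the official ATUM specification.

    COST_POOLS = [
        "Hardware", "Software", "Internal Labor", "External Labor",
        "Facilities/Overhead", "Outside Services",
    ]

    IT_TOWERS = {
        "Compute":       ["Wintel Servers", "Linux Servers", "Unix Servers", "Mainframe"],
        "Storage":       ["Tier 1", "Tier 2", "Tier 3", "Tier 4"],
        "Data Network":  ["Office LAN", "DC LAN", "WAN", "MAN", "Internet"],
        "Voice Network": ["Voice Premise", "Wide Area Voice", "Call Center"],
        "User Services": ["PCs", "Mobile Devices", "Service Desk", "Field Services"],
        "Application":   ["App Dev", "App Support", "QA (Testing)", "Middleware"],
    }

    SERVICE_CATEGORIES = ["Your Applications", "Technology Services",
                          "End User Services", "Project Services"]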
14. Challenges of Creating Cost Transparency
[Slide: the manual cost-analysis workflow]
Triggers: month-end close, forecasts, project financials, QBRs, ad hoc questions.
1. Request data: prompt IT and corporate for data extracts.
2. Provide data: extract, validate, explain, fill gaps.
3. Manipulate data: clean and manipulate, create joins and mappings, find data problems.
4. Maintain cost model: build vLookups, debug formulas, find data problems.
5. Build reports: create pivots, build graphs, find data problems.
6. Validate/clarify: timing and categorizations; what cost went where? why is that in MY costs?
7. Publish reports: trick Excel into making waterfall charts; annotate and distribute.
8. Ask questions: why did costs/KPIs change? what does this include? I need to see this sliced by… what if…?
9. Answer questions: the answer requires more granularity or different slices, or more data (and may become a "study").
Disruptions along the way: version control, org changes, data source changes, departments updating their own cost spreadsheets.
Annual cost of staff time spent on manual IT cost analysis (high/low): Finance $3.8M/$180K; Data Owners $225K/$18K; Department Owners $108K/$11K; IT Executives $130K/$11K.
The cost of staff time spent on manual IT cost analysis shows a wide range of breadth and scope across Global 2000 IT organizations.
Most companies take it for granted that they know what it costs to run their business. They track the cost of running key disciplines like R&D, manufacturing, sales, and marketing. They track the cost of providing their key products and services. They track who their customers are and how much they are buying. And they make decisions every year, every quarter, every month, and every day based on this shared understanding. If they didn't have these essential facts, how long would it take to decide whether to invest in a new product or discontinue it, close down a factory or build a new one?
Whether or not you think of your IT organization as a business in its own right, the same principles apply. What we've seen in working with hundreds of IT leadership teams like yours is that there are essential views they all need to drive the right decisions and make them quickly.
We've found there are key moments of truth where better cost information makes an impact for IT organizations, and these are some of the patterns we see. One of our goals for this meeting is to better understand where you could use better cost information in your own organization.
Big changes beyond IT's control are putting transparency and accelerated decision-making on the front burner for more and more companies. For example:
(New competition): Cloud providers are marketing directly to the business, and what the business sees usually isn't a fair apples-to-apples comparison to what its IT organization provides.
(New technologies): Deploying or developing for new technologies like mobile, big data, and hybrid clouds can have huge impacts on the business of IT, from understanding the true total cost of applications for app rationalization, to business adoption of shared services and private clouds.
(New expectations): The business is now an active participant in the technology marketplace. They bring their own devices and expect IT to support them. And now that the business is getting price transparency for even infrastructure services in the cloud, they expect transparency from their own IT as well.
What does all this mean for the business of IT? IT is on a fast path in transforming from technology provider to service provider. With a supply-chain view, we can more clearly see the decisions that affect the cost, utilization, and quality of IT's services from end to end: what we buy from whom at what price, how we build and run services, and how we manage business demand.
Which of these trends are you seeing in your organization? What are the impacts you are seeing?
ATUM is based on management categories and best practice allocation methods distilled from the TBM Council and hundreds of Apptio customers across industries, sizes and geographies. Because ATUM is aligned with categories and costing methods used in Apptio Infrastructure Benchmarking, you can use Apptio to automate benchmarking with apples-to-apples comparisons to your peers.
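As a hedged illustration of the allocation idea (costs flow from GL line items through cost pools into towers and on up to services), a toy roll-up might look like the sketch below; the accounts, amounts, and percentage splits are invented for the example and are not ATUM's actual allocation methods.

    # Toy cost roll-up: GL line items -> cost pools -> IT towers.
    # Accounts, amounts, and weights are invented for illustration only;
    # ATUM's real allocation methods are far richer.

    gl_lines = [
        {"account": "Server HW refresh", "pool": "Hardware",            "amount": 120_000},
        {"account": "DBA salaries",      "pool": "Internal Labor",      "amount":  90_000},
        {"account": "Colo lease",        "pool": "Facilities/Overhead", "amount":  60_000},
    ]

    # Each pool spreads to towers by an assumed percentage split.
    pool_to_tower = {
        "Hardware":            {"Compute": 0.7, "Storage": 0.3},
        "Internal Labor":      {"Application": 0.6, "Compute": 0.4},
        "Facilities/Overhead": {"Data Center": 1.0},
    }

    tower_costs = {}
    for line in gl_lines:
        for tower, weight in pool_to_tower[line["pool"]].items():
            tower_costs[tower] = tower_costs.get(tower, 0.0) + line["amount"] * weight

    print(tower_costs)
    # {'Compute': 120000.0, 'Storage': 36000.0, 'Application': 54000.0, 'Data Center': 60000.0}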
A TBM Advisor is DEDICATED to helping customers understand and ADOPT Technology Business Management within their organization and identify new ways to gain VALUE from their Apptio subscription. We have broad knowledge and experience in IT Finance as well as IT Operations. So by listening to our customers first, we are able to align our product to your process and assist you with building a TBM adoption roadmap that is right for your business.
Why am I talking about DQM? What experience do I have? Data quality is a broad topic and means different things to different people, but I think everyone agrees it can have a significant impact on an organization's ability to run an efficient business. An example I can recall from early in my career, when I was a mid-level manager for a large IT outsourcing company: data quality issues in telecom services. The significance of data quality and data integrity is foundational to a productive business relationship. You can't talk strategy until your house is in order.
In today's presentation you will learn three main things: data quality governance and how it relates to TBM; the five data quality dimensions that can be measured within the Apptio platform; and the process and approach for handling data quality within Apptio.
One of the biggest challenges I've heard from people wanting to start the TBM journey is that they don't want to start because of poor data quality. Well, first, let me quote the famous Chinese philosopher Lao Tzu: "A journey of a thousand miles begins with a single step." That first step begins with data, and remember, data is foundational to driving better business decisions. To add, Apptio has taken the learnings from over 140 customers and developed IP around the concept of Master Data Sets, which is basically our definition of the most commonly used fields within each dataset that help generate actions and decisions with measurable value. So we can tell you what data you need, and your data quality scope is targeted only to the fields we recommend based on our best practices.
Another big challenge I hear is that people don't have enough confidence in their level of DQ to defend the allocations within the Apptio model. To that I say: embrace failure and take risks. Let me quote the Irish author Samuel Beckett: "Ever tried. Ever failed. No matter. Try again. Fail again. Fail better." This quote reminds me of my friends at DIRECTV, who have embraced a twelve-point program that can be summed up as "fast to fail": they want people to recognize failure faster so they can learn from their mistakes and pivot forward, without blame or blowback from the organization.
Data quality can be measured along several dimensions. When choosing these dimensions for Apptio, we had three basic requirements: 1) Does the dimension apply to TBM? 2) Can we quantitatively measure it in Apptio? 3) Will our customers derive value from the output to help them make better decisions?
So we start with Maintainability, which is a measure of how easy it is to extract the data from the source system, as well as the degree to which the data is manipulated before it gets loaded into Apptio.
Uniqueness is a measure of unique rows; the inverse of uniqueness is duplication, so we are checking for duplicate records in the data.
Completeness is a measure of the fields that have a value, so we are checking and reporting on blank values.
Validity is a measure of the values within the dataset compared to a set of allowed values, so this dimension requires a set of "master" lookup values to compare against.
Frequency Distribution is the number of occurrences of unique values within a column. A low number of occurrences in a column translates to "noise" in reports and is an important factor when trying to discover value insights in the data.
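To ground these definitions, here is a minimal pandas sketch of how three of the dimensions could be computed on a dataset; the dataset, column names, and scoring conventions are assumptions for illustration, not Apptio's internal formulas.

    import pandas as pd

    # Illustrative server dataset; column names are invented for the example.
    df = pd.DataFrame({
        "hostname": ["srv01", "srv02", "srv02", "srv04", None],
        "os":       ["Linux", "Windows", "Windows", "BeOS", "Linux"],
    })

    # Uniqueness: share of rows that are not duplicates.
    uniqueness = 1 - df.duplicated().sum() / len(df)

    # Completeness: share of non-blank cells.
    completeness = df.notna().sum().sum() / df.size

    # Validity: share of values found in an allowed ("master") list.
    allowed_os = {"Linux", "Windows", "Unix"}
    validity = df["os"].isin(allowed_os).mean()

    print(f"uniqueness={uniqueness:.0%} completeness={completeness:.0%} validity={validity:.0%}")
    # uniqueness=80% completeness=90% validity=80%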
The ecosystem of data quality spans many facets. It starts by setting up the right governance with proper sponsorship and ownership. The Data Advisor helps accelerate identifying what data you need, so you can define which data sources to go after.
Data Governance, Ownership = sponsorship and support system.
Data Advisor, Data Sources, and Data Stewards = implementation (doing the work).
Change Management, ADM, Monitoring = continuous improvement.
The ecosystem of TBM: data governance vision and strategy, as well as ownership (the TBM Office).
Data Advisor defines what data is required in order to build our out-of-the-box templates. You have the ability to review all the required attributes by component and determine whether or not each is important for your organization. This greatly reduces the scope the data quality initiative will encompass and increases the chances of success.
Once you have defined the data sources and the associated attributes that map to the Data Advisor, the next step is to understand how each attribute is updated in terms of change management processes. For example, Dell found an error in their processes: the owner of an application was never getting recorded, so it was impossible to tie applications to business owners for costing. When they realized the Service Desk wasn't asking the right questions and wasn't requiring that field in their service management system, they modified their process to ensure that App Owner was a required field. Fixing the DQ issue at the source is always the best long-term option when it is as simple as adjusting a process like this one.
Automation / ADM
An example of Adaptive Data Management in Apptio is how we use Data Mapper. On the left-hand side is a list of attributes coming from our Master Data Tables; on the right-hand side are attributes coming from the customer's data. This aliasing technique is just one of the many ways you can apply ADM to quickly translate your data into Apptio's solution.
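The underlying aliasing idea, renaming customer columns to master attribute names, can be sketched as follows; the alias map and column names are hypothetical, and this is not Apptio's Data Mapper implementation.

    import pandas as pd

    # Hypothetical alias map: master attribute name -> customer's column name.
    # This mirrors the Data Mapper idea, not Apptio's actual implementation.
    alias_map = {
        "Server Name":      "HOST_NM",
        "Operating System": "os_type",
        "CPU Cores":        "num_cores",
    }

    customer_df = pd.DataFrame({
        "HOST_NM":   ["srv01", "srv02"],
        "os_type":   ["Linux", "Windows"],
        "num_cores": [8, 16],
    })

    # Rename customer columns to the master names in one pass.
    master_df = customer_df.rename(columns={v: k for k, v in alias_map.items()})
    print(master_df.columns.tolist())
    # ['Server Name', 'Operating System', 'CPU Cores']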
Data quality initiatives require building on the correct foundation layers: start with strategy and governance; understand the data you need, choose the right source systems, and know the processes behind them; then apply the DQ dimensions to that data so you can measure progress over time; and leverage Adaptive Data Management techniques for quick time to value, so you don't always have to wait for data to be fixed in the source systems. The result is a DQ number you can turn into a heat map based on cost × DQ%, to get an idea of how confident you can be in the numbers you are looking at.
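A minimal sketch of that cost × DQ% idea, with invented figures: weight each dataset's cost by its DQ score to see where low-quality data carries the most dollars.

    # Cost-weighted data quality: cost * DQ% per dataset.
    # All figures are invented for illustration.
    datasets = [
        ("Servers",      4_000_000, 0.92),
        ("Applications", 2_500_000, 0.65),
        ("Labor",        6_000_000, 0.80),
    ]

    for name, cost, dq in datasets:
        at_risk = cost * (1 - dq)  # dollars flowing through low-quality data
        print(f"{name:<13} confident=${cost * dq:>12,.0f}  at_risk=${at_risk:>12,.0f}")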
Walk them through this workflow.
Here is the top-level dashboard that shows our matrix of data quality dimensions by dataset. This matrix allows you to quantify the state of data quality within Apptio for a given month. You can then drill into any one of these KPIs to analyze and understand the score and take action to improve it. We want to go through these dimensions in order (left to right), so we'll drill into the Server dataset under the Maintainability dimension.
So remember, Maintainability is the degree of complexity in extracting the data from the source system and how much manipulation takes place prior to loading it into Apptio. For Maintainability, the user is required to answer three simple questions that tie back to the definition:
What is the source system? (Database, Excel, Other)
Rate from 1–5 how long it takes to extract data out of the system.
Rate from 1–5 how much data manipulation takes place prior to loading it into Apptio.
The answers can be versioned by month as data-load processes improve, and the trend charts light up to show historical performance, hopefully for the better.
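The slide doesn't show Apptio's scoring formula, so here is one hedged way the three answers could fold into a single 0–100 maintainability score; the source-system ranking and equal weighting are assumptions.

    # One possible way to fold the three Maintainability answers into a
    # 0-100 score. The weighting is an assumption, not Apptio's formula.

    SOURCE_SCORES = {"Database": 5, "Excel": 3, "Other": 1}  # assumed ranking

    def maintainability(source: str, extract_ease: int, manipulation_ease: int) -> float:
        """source: Database/Excel/Other; ease rated 1 (hard) .. 5 (easy)."""
        raw = SOURCE_SCORES[source] + extract_ease + manipulation_ease  # 3..15
        return round((raw - 3) / 12 * 100, 1)

    print(maintainability("Database", 4, 5))  # 91.7 -- easy, automated feed
    print(maintainability("Excel", 2, 1))     # 25.0 -- heavy manual rework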
Uniqueness is a measure of the number of unique rows. The middle of the page describes how to apply some of Apptio's Adaptive Data Management tools to manage duplicate records without having to fix the source systems. The table below is an exception-based report showing the rows that appear to be duplicates.
Completeness is a measure of the fields that have a value, so the table at the top shows the number of blanks per column… you can see that the columns in the top table align to the columns in the bottom table…
… so if I want to slice into CPU Cores and find all the records associated with the number in the top table, I click on the slicer and the report refreshes with those specific records. You can also export the filtered results to Excel and give them to the data steward for remediation.
Validity measures whether the values in the data match a set of "approved" values. But in order to generate the approved values, some initial configuration is required; let's click into the Settings button…
On the left-hand side is a simple step-by-step guide that walks you through how to set up this dimension. The first table on the right is an exception-based table that shows any value that hasn't been assigned "Valid" or "Invalid" during the configuration process. This is important because it will catch any NEW values appearing in the source data that need to be approved for this dimension. The far table is a list of all values and their approval status; it's a rollup of all the tabs we will go to next.
Each tab has a "Validity" column that is required to be filled out as "Valid" or "Invalid". Blank values are new values coming from the source that have not been assigned (they default to Invalid for the purposes of the score). After assigning values, don't forget to save the table before moving to the next tab.
Back on the Validity page, you'll notice it has a similar look and feel to Completeness: the invalid values are listed in the top table, and the bottom table is used for discovery and for helping with the remediation process. Use the slicers to drill into the records that pertain to the appropriate column, then check those values against the "expected" values on the right-hand side…
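A sketch of the validity-exception idea: classify values against the approved list, default unassigned values to Invalid for scoring (as described above), and surface the exceptions for remediation. Column names and values are hypothetical.

    import pandas as pd

    # Approved-value lookup for one column; unassigned values default to
    # Invalid for scoring, as described above. Names are hypothetical.
    approved = {"Linux": "Valid", "Windows": "Valid", "Legacy-OS": "Invalid"}

    df = pd.DataFrame({"os": ["Linux", "Windows", "BeOS", "Legacy-OS"]})
    df["validity"] = df["os"].map(approved).fillna("Invalid")

    exceptions = df[df["validity"] == "Invalid"]         # rows for remediation
    new_values = df.loc[~df["os"].isin(approved), "os"]  # never-classified values

    print(exceptions)
    print("needs review:", new_values.tolist())  # ['BeOS']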
Frequency Distribution is the number of occurrences of unique values within a column. Here you can review the frequency of values for each column. The right-hand side of the screen describes how the KPI is measured and gives step-by-step instructions for how to use and act on frequency distribution issues. Remember, the idea is to reduce the number of distinct values in a given column to ensure the best reporting experience possible.
So the first thing to do with this dimension is to set the threshold value, which is basically your tolerance for "noise". What I mean by that is that the threshold determines the row count the KPI will calculate against. As shown here, if you have 6,874 records and you set your threshold to 2%, then 2% of 6,874 ≈ 137 rows.
Then you take the 137 and insert it into the slicer to filter down the dataset and review all the flagged values for inspection. Use the slicers to navigate column by column for this dimension.
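To make the threshold arithmetic concrete, here is a short sketch of the filtering logic with an invented dataset: compute the cutoff row count, then flag column values occurring at or below it.

    import pandas as pd

    # Frequency-distribution noise check, as described above.
    # 2% of 6,874 records -> cutoff of 137 rows; values occurring at or
    # below the cutoff count as "noise". Data here is invented.
    n_records = 6_874
    threshold = 0.02
    cutoff = int(n_records * threshold)   # 137

    counts = pd.Series(["Prod"] * 6000 + ["Dev"] * 800 + ["Lab-7"] * 50 +
                       ["Temp"] * 24).value_counts()

    noise = counts[counts <= cutoff]
    print(cutoff)          # 137
    print(noise.to_dict()) # {'Lab-7': 50, 'Temp': 24}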