Trends & special topics
By Leigh Ulpen, Senior Manager, McGrathNicol
Governance Directions, September 2016
How self-service analytics
changes risk
• Self-service analytics
and data discovery
tools are empowering
business users to
perform complex
analysis on much
larger volumes of data
without reliance on a
traditional BI team.
• When access rights
are in place, personal
information cannot be
shared outside of the
intended audiences.
• In the ideal world,
the data being used
for analysis is as
close to the source
as possible.
The rise of self-service
analytics tools is changing
the business reporting and
analytics landscape for
many organisations. As with
any change to a business,
there are new and old risks
that need to be addressed
prior to implementation,
and as part of your
continual monitoring and
improvement processes.
This article will discuss some of the
capabilities of these tools and give you
some useful information to consider for
your own implementation. In summary:
• Self-service analytics can provide
analysts with the ability to connect
to many different data sources,
to perform rapid analytics and
data discovery.
• Monitoring user access and
understanding data privacy
should be a high priority for your
governance frameworks.
• Self-service analytic tools can
provide mechanisms to control
the spread of data through the
organisation, to help prevent data
sprawl, misuse and reliance on
manually produced reports.
• Your data governance framework
should identify the owners of the data.
• You should know exactly where
your analytics data is sourced from.
The closer to the source data your
analysis is, the better; and in lieu
of directly accessing source data,
the data owners should map the
organisation’s data flows so they are
at least understood.
These points and more are discussed
throughout the article in more detail.
The changing analytics
environment
Business Intelligence (BI) has changed
over the past few years, and we no
longer think about data only for the
purpose of creating a single report,
but also the potential the data holds
for analysis and insights about your
business. This change in thinking
is partially driven by the greater
accessibility of tools and techniques to
perform more complicated analysis, and
the impact these tools have on business goes
far beyond the analyst using the tools.
Self-service analytics and data
discovery tools are empowering
business users to perform rapid
analytics without the need for
traditional BI teams. With tools like
‘Tableau’, ‘Qlik’ and ‘Microsoft Power
BI’, business users can perform
analysis and build reports much
faster than before when they were
often at the mercy of the BI and
Information Technology (IT)
workloads and schedules.
Some experienced analysts would
already be capable of producing
complex analysis without these tools,
but they were often limited to Excel
and, if they were more technical, VBA
and SQL; such analysts are also often
in short supply. This limitation meant that many
users were limited to pivot tables, Excel
charts, and a relatively small volume
of data. Anyone needing to analyse
more than one million rows of data (or
65,536 if you haven’t upgraded Excel
in a long time) had to work with the
data professionals in the BI teams, who
often have schedules and priorities to
juggle before they can develop
your reports.
With self-service analytics tools the
barrier to entry for performing complex
analysis on much larger volumes
of data is lower, and given the right
permissions a user can connect
directly to databases (for example,
sales, finance, HR etc) and enrich the
data with other sources (for example,
reports from third parties, online
weather data etc.) to build insightful
analysis to be shared with the rest
of the business. With greater data
connectivity and ease of analysis,
more users are able to
touch more data, which exposes the
business to new variations of the
traditional risks.
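As a sketch of the kind of enrichment described above, the snippet below joins a small sales extract with a third-party weather feed using pandas. All table and column names here are hypothetical, invented for illustration rather than taken from any particular system.

```python
import pandas as pd

# Hypothetical sales extract and third-party weather feed;
# column names are illustrative, not from any specific system.
sales = pd.DataFrame({
    "date": ["2016-08-01", "2016-08-02"],
    "store": ["Melbourne", "Melbourne"],
    "revenue": [12500.0, 9800.0],
})
weather = pd.DataFrame({
    "date": ["2016-08-01", "2016-08-02"],
    "city": ["Melbourne", "Melbourne"],
    "rainfall_mm": [0.0, 14.2],
})

# Enrich the sales data by joining on date and location.
enriched = sales.merge(
    weather, left_on=["date", "store"], right_on=["date", "city"]
)
print(enriched[["date", "store", "revenue", "rainfall_mm"]])
```

In practice the sales frame would come from a governed database connection rather than an inline literal, which is exactly why the access and governance questions below matter.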
There are many questions that should
be addressed prior to establishing an
analytics environment such as:
• Do the data sources being analysed
contain sensitive information about
customers or employees?
• Who is the audience and what will
the analysis be used for?
• Who is responsible for data quality?
While this is not a comprehensive list,
these are some of the most important
questions, for the following reasons.
Does the data contain
sensitive information?
Chapter 11 of the Australian Privacy
Principles (APP) guidelines requires
entities to take active measures
to ensure the security of personal
information they hold, and to take
reasonable steps to protect the
information from misuse, interference
and loss, as well as unauthorised
access, modification or disclosure. This
is important to consider when deciding
what data sources can be used for
analysis, and where your employee
and customer personal information
is stored in the organisation. If basic
access rights are not in place then
you may inadvertently allow analysts
to access and analyse personal
information, or include it inside a data
source that is subsequently published.
Some self-serve analytics tools allow
users to create interactive dashboards
enabling the audience to create their
own analysis with the data, such as
filtering or adding in additional chart
dimensions. This is achieved by the
software ‘packaging up’ the source
data with the dashboard. If users are
not aware of this they could easily
include additional data fields with their
analysis that are subsequently shared
across the organisation. Imagine
a scenario where an analyst uses
employee information without removing
salaries or tax file numbers from the
data, and the issues this may cause.
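One simple safeguard against the scenario above is to whitelist the fields a dashboard actually needs before the tool packages the data with it. The sketch below shows the idea in pandas; the employee fields (salary, tfn) are hypothetical examples of sensitive columns, not a reference to any real schema.

```python
import pandas as pd

# Illustrative employee extract; the field names are assumptions.
employees = pd.DataFrame({
    "employee_id": [101, 102],
    "department": ["Finance", "Sales"],
    "salary": [95000, 88000],
    "tfn": ["123456789", "987654321"],
})

# Whitelist only the fields the dashboard needs, so sensitive
# columns are never packaged into the published workbook.
PUBLISHABLE_FIELDS = ["employee_id", "department"]
publishable = employees[PUBLISHABLE_FIELDS]
```

A whitelist is safer than dropping known-sensitive columns, because a new sensitive field added to the source later is excluded by default rather than published by accident.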
Ensuring your data governance
frameworks restrict the access to
personal information and that analysis
is not shared outside of the intended
audiences is particularly important
when handling sensitive information.
Who is the audience? And what is
the purpose of the analysis?
With the new generation of analytics
tools it is possible to create analysis
workbooks that link directly to the
source data. This means the audience
for the analysis can view the findings
with the most up-to-date information
without the analysts needing to update
the data periodically. They also allow
the audience to interact with the data
and ask their own questions.
In an Excel environment it is easy to
make a copy of the data for your own
purposes, build your analysis and share
it once more, but this often leads to
data sprawl where data and reports
are duplicated across the business,
and can become relied upon by users
who were never intended to be part of
the audience. In this instance you no
longer have visibility of where your data
is and who is using it. This can lead
to misunderstanding of an analysis,
or misuse of data that finds its way
into the inboxes of senior executives.
At worst, this scenario can lead to
significant reports being created with
data that may not be fit for purpose.
Most self-service analytics tools have
the ability to share analysis through
an internal website, or using a reader
application (similar to Adobe Reader
for PDF files). This helps contain the analysis
to the intended use and audience via
access control, and also provides users
with the freedom to explore the data
and perform their own analysis in a
controlled way. The analyst can choose
what data is shared with the audience.
Including steps to audit the access
and ageing of analysis will help keep
the data controlled and relevant. Many
self-serve analytics platforms allow
for user access reporting helping to
identify which dashboards, reports
and sources of data are being viewed
and shared the most, and which ones
are out of date. These access controls
allow you to control the audience
while still letting them perform
further analysis in a controlled manner,
without duplicating data and
contributing to data sprawl.
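An ageing check of this kind can be very simple. The sketch below flags dashboards that have not been viewed within a review threshold; the dashboard names, dates and the 180-day threshold are all illustrative assumptions, and in practice the analytics platform's own access reporting would supply the log.

```python
from datetime import date

# Hypothetical access log: last date each dashboard was viewed.
last_viewed = {
    "monthly_sales": date(2016, 8, 20),
    "2014_campaign": date(2015, 1, 5),
}

STALE_AFTER_DAYS = 180  # review threshold: a policy choice, not a standard

def stale_dashboards(log, today):
    """Return dashboards not viewed within the review threshold."""
    return [name for name, seen in log.items()
            if (today - seen).days > STALE_AFTER_DAYS]

print(stale_dashboards(last_viewed, date(2016, 9, 1)))  # → ['2014_campaign']
```

Flagged dashboards can then be archived or re-certified by their owners, keeping the published analysis current and trusted.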
...data governance frameworks should contain
requirements that any significant reports or
analysis be created from data that has a known
and documented path from source system
to report.
Who is responsible for
data quality?
The quality of data being analysed is
pivotal to creating meaningful analysis.
Without confidence in the quality of the
source data there cannot be confidence
in the outputs. As the old programming
saying goes ‘Garbage in, Garbage out.’
Deciding who will be responsible for the
quality of data is critical to the success
of a self-service analytics environment.
This is where the data professionals
have a large role to play.
The data sources made available
for analysis must flow from trusted
sources. In the ideal world, the data
being used for analysis is as close to
the source as possible; however, this
is not always feasible, and in those
instances the entire flow of data from
origination to analysis warehouse has
to be known and regularly monitored
and assessed for quality. Your data
governance frameworks should contain
requirements that any significant
reports or analysis be created
from data that has a known and
documented path from source system
to report.
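Such a requirement can be enforced with something as simple as a lineage register that significant reports must be checked against before publication. The sketch below is a minimal illustration of the idea; the register entries and system names are invented for the example.

```python
# Hypothetical lineage register: each approved analysis source maps to
# its documented path from source system to report.
LINEAGE = {
    "sales_analysis": ["POS system", "nightly ETL", "analysis warehouse"],
    "hr_headcount": ["HR system", "monthly extract", "analysis warehouse"],
}

def check_lineage(source):
    """Refuse sources that have no documented path from system to report."""
    if source not in LINEAGE:
        raise ValueError(f"No documented lineage for '{source}'")
    return " -> ".join(LINEAGE[source])

print(check_lineage("sales_analysis"))
# → POS system -> nightly ETL -> analysis warehouse
```

Even this crude gate forces an undocumented data source to be registered (and therefore reviewed) before a significant report can be built on it.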
Knowing exactly where your core
business data comes from, and having
processes in place to regularly review
and assess the quality of that data
not only provides confidence to the
analysts, but to the audience as well.
Ensuring analysts use the known data
sources will build confidence that
inputs are correct, and having built-in
reconciliation processes provides
confidence in the outputs.
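A built-in reconciliation can be as simple as comparing a control total from the source system with the same total in the analysis extract. The figures and tolerance below are illustrative assumptions:

```python
# Illustrative control totals: reconcile the figure reported by the
# source system against the dataset loaded into the analytics tool.
source_total = 1204550.25    # control total from the finance system
analysis_total = 1204550.25  # sum of the same field in the analysis extract

TOLERANCE = 0.01  # allowable rounding difference: a policy choice

def reconciles(source, analysis, tolerance=TOLERANCE):
    """True when the analysis dataset matches the source control total."""
    return abs(source - analysis) <= tolerance

assert reconciles(source_total, analysis_total)
```

Running a check like this each time the analysis data is refreshed turns "the numbers look right" into a repeatable, auditable control.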
With this in mind, it is the data
professionals who should own the
process of mapping the organisation’s
data from source systems to analysis.
By doing so, whether through the
creation of an analysis data
warehouse or through an approval
process for each data source, you
can be confident that analysis is
being performed on the right data,
that the results of the analysis
can be reconciled back to the trusted
source of data, and ultimately that
there is confidence to act on the findings.
The risks identified above will no doubt
be familiar to many organisations, so it
is important to continually revisit them
as you add new tools and technologies
to your environment. With the advances
in analysis software and the technical
abilities of the users, ensuring your
frameworks are appropriate for the
goals of the organisation is an ongoing
task, and striking the right balance
between good governance and being
too restrictive is a challenge in itself.
Leigh Ulpen can be contacted on
(03) 9038 3134 or by email at
lulpen@mcgrathnicol.com.
Governance Institute Ethics Index
survey findings released
Australian public see big business,
banks and politicians as ‘unethical’.
That’s according to the findings of Governance Institute of
Australia’s inaugural Ethics Index.
The survey is the first of its kind in Australia. These findings
are vitally important for us as governance professionals. Never
before has the case been stronger for governance strategies
and solutions that enable a company to not only conduct itself
ethically, but to also evolve, grow and succeed.
Find out more about Governance Institute Ethics
Index at governanceinstitute.com.au/EthicsIndex