Understanding Quality of Experience for Mobile Data
Mobile network operators (MNOs) have to adapt to a rapidly changing
market. The explosion of applications accessible through mobile networks
and the advent of over-the-top (OTT) services are clearly putting pressure
on traditional service providers to differentiate themselves, keep the
customer base loyal and, most importantly, recover revenues currently
flowing toward OTT providers (or to other MNO competitors). Understanding
Quality of Experience will play a critical role in this transition.
MNOs NEED TO DIFFERENTIATE SERVICES
In the past, MNOs focused purely on the bandwidth made available to mobile users;
think back to the evolution of network technologies for mobile data services, which
went from 2G to 4G, passing through 2.5G and 3G. The only concern was increasing
bandwidth, with no real focus on the services running on it (the only advertised
service was generic Internet access), based on the assumption that more bandwidth
equaled more customers or subscribers.
Later, OTT providers appeared, relegating the MNO to the role of “dumb pipe” provider
and creating an environment where MNOs have invested significant amounts of money
to increase network capacity, only to have their customers access services provided
(and billed) by other entities.
So, how can MNOs differentiate themselves with customers? One of the best ways
to do this is by regaining ownership of the services, emphasizing what they can control
on the network—that is, the quality of the offering.
But, in order to do that, MNOs need full visibility into who is accessing the network,
through which devices, from where, and for what purpose. Most important of all,
they need to know the quality of the user experience.
DEFINING QUALITY IN AN EVOLVING NETWORK ENVIRONMENT
When telecom networks were mainly used to deliver fixed-line voice services, the
customer experience was tightly linked to physical network quality because there was
a 1:1 match between the network infrastructure and the provided service. Therefore, a
valid quality concept was “if the network is good, the service is good, so my customers
are happy.”
However, the advent of high-speed mobile networks completely invalidated such an
approach because the customers are now accessing services while mobile and, most
importantly, the service chain (meaning all software/hardware entities involved in
service delivery) is much more complex, so the 1:1 match is no longer applicable.
Additionally, MNOs need to focus on not only the quality of voice services, but, even
more importantly, the quality of mobile data services. Given that multitasking is the
norm, providers must understand how customers perceive the quality of service when
accessing both voice and mobile data services simultaneously. Keep in mind that this
could include video as well.
QoS ISN’T ENOUGH
Standards bodies and industry organizations, like ITU and ETSI, have defined how to
measure Quality of Service (QoS), meaning “a network’s ability to provide a service
with an assured service level.” QoS was defined by means of a set of technical metrics
(a.k.a. key performance indicators, or KPIs) such as packet loss, delay, throughput, answer-
to-seizure ratio, and call setup time. It should be noted that QoS was a valid approach in
fixed-line environments for deriving the experience perceived by customers, due to the
previously mentioned concept of “if the network is good, the service is good.”
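To make the distinction concrete, below is a minimal sketch (ours, not taken from any standard or from Empirix) of how a few of these classic QoS KPIs are computed from raw network counters. Note that none of them says anything about how a customer actually perceived the service:

```python
# Minimal sketch (not from a standard): classic QoS KPIs computed from counters.
# These describe the network, not how a customer perceived the service.

def packet_loss_ratio(packets_sent: int, packets_received: int) -> float:
    """Fraction of packets lost on a link or path."""
    return (packets_sent - packets_received) / packets_sent if packets_sent else 0.0

def throughput_mbps(bytes_transferred: int, duration_s: float) -> float:
    """Average throughput in megabits per second."""
    return (bytes_transferred * 8) / (duration_s * 1_000_000) if duration_s else 0.0

def answer_seizure_ratio(answered_calls: int, attempted_calls: int) -> float:
    """Share of call attempts that were answered."""
    return answered_calls / attempted_calls if attempted_calls else 0.0

print(packet_loss_ratio(10_000, 9_970))           # 0.003 -> 0.3% loss
print(round(throughput_mbps(25_000_000, 10), 1))  # 20.0 Mbit/s
print(answer_seizure_ratio(970, 1_000))           # 0.97
```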
To better understand the quality of voice delivery from a customer’s point of view
(their perceived experience), the mean opinion score (MOS) was introduced, providing
a subjective measurement (metric) of the quality of a call. This test and the algorithms
used have evolved over the years to accommodate the transition from circuit-switched
services to packet-switched networks carrying voice.
With the introduction of data services over mobile networks, QoS has proven to be a valid way to check the health of the
physical network and the links between nodes, but lacks the ability to represent the subjective perception of quality from
the customer point of view. While MOS was effective for voice, there is a gap as it pertains to data and video services.
THE REQUIREMENT FOR COMPREHENSIVE QUALITY OF EXPERIENCE MONITORING
MNOs currently lack a standard methodology for measuring and reporting on the subjective experience of their
customers across the services they deliver, including voice, data and video. This is where Quality of Experience, or QoE, can
play a significant role, giving MNOs a normalized approach, expressed in easy-to-understand terms, for all the
services being delivered. Various definitions have been provided for QoE, including:
• The overall acceptability of an application or a service, as perceived subjectively by the end-user
• How a user perceives the usability of a service when in use, and how satisfied the user is with the service
• The degree of delight of the user of a service, influenced by content, network, device, application, user expectations and goals, and context of use
Regardless of the academic definitions, the general consensus is that QoE should be defined by both the services and the user
experience, leveraging subjective and objective measurements. Indicators like “excellent,” “good,” and “bad” (referring to
services) as well as “satisfied,” “tolerating,” and “frustrated” (referring to users), which everyone can understand without
any deep technical know-how, represent a completely different approach from QoS, which has been based on metrics
tightly related to the telecom technology.
To summarize, QoS evaluates the network while QoE evaluates the perception of a service from the user’s standpoint. And
while QoE for well-established services like voice, IPTV and fixed Internet access is defined (respectively, through MOS,
VMOS and a set of standard indicators related to IP traffic like throughput, packet loss, jitter, and delay), very little is
available for scoring the multimedia data services accessed while mobile over the HTTP protocol, including dynamic adaptive
streaming over HTTP (DASH).
In addition, the variety and complexity of the services (e.g., the number of network entities involved, interworking technologies,
and the dynamicity and variability of IP network conditions) require new approaches that can be categorized as:
• Subjective assessment of QoE: by means of user surveys, collection of user complaints, and periodic active testing.
• Objective assessment of QoE: adoption of computational models that collect measurable parameters from live traffic, coupled with correlation rules to derive the overall QoE in as close to real time as possible.
CHALLENGES
Approaching QoE measurement through subjective methods is relatively easy: there are many solutions (active testing)
on the market that can perform a set of tests injecting traffic (called reference traffic) into the network and measure QoE
from real mobile devices and/or dedicated hardware deployed across the network. However, such an approach has many
limitations:
• Customers seldom agree to have software provided by the MNO running on their devices, for both privacy and perceived performance reasons.
• Test traffic can consume end-users’ data plans: why should they consume their monthly allowance to run tests for the MNO?
• The type of traffic that can be tested is limited to the scenarios defined by the tool.
• Visibility of QoE is limited to the times and locations when and where the tests are performed; therefore, everything may look good while the test is executing, while other customers are experiencing poor quality because they are doing something different or are in different locations.
So while it is possible to provide an evaluation of QoE through active testing, it is limited because it is obtained through
a sampling approach and therefore, by definition, does not cover 100% of the traffic, customers, devices and locations.
THE NEED FOR SMARTER SCORING OF QoE
To define the “best” computational model to objectively assess QoE, MNOs need
to take into account many parameters including:
• Network-dependent factors: Examples are the application response time or the time a web page takes to load.
• User expectations: “If I pay more, I expect more.” Note that this requires MNOs to contextualize the analysis depending on the type of contract subscribed to by the end-users; therefore the subscriber profile should be an input to the algorithm (see the sketch after this list).
• Context of use: The type of user, which can be categorized as private, business, M2M, etc.
• Application-specific factors: If users are simply browsing the Internet, their QoE must be scored differently than when they are watching video.
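As a toy illustration of how user expectations and context of use could be fed into the scoring, the sketch below looks up a response-time target per subscriber profile; the profile categories and target values are invented for the example:

```python
# Hypothetical response-time targets (seconds) per context of use and contract type.
# The categories and values are invented for illustration; a premium contract
# implies a stricter expectation ("if I pay more, I expect more").
PROFILE_TARGETS_S = {
    ("private", "standard"): 3.0,
    ("private", "premium"): 2.0,
    ("business", "standard"): 2.0,
    ("business", "premium"): 1.5,
    ("m2m", "standard"): 5.0,
}

def target_for(context: str, contract: str, default_s: float = 3.0) -> float:
    """Look up the expected response-time target for a subscriber profile."""
    return PROFILE_TARGETS_S.get((context, contract), default_s)

print(target_for("business", "premium"))  # 1.5
```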
To better understand the complexity in scoring QoE in the case of mobile data services,
let’s examine a typical data session for Web browsing along with each of the
components that can affect the perceived QoE.
Before accessing any website from a smartphone, a user must be recognized as an
authorized user of a mobile network (by means of the attach procedure, normally issued
when we turn on our device) and also be connected to an APN (access point name)
which serves as the gateway between a 2G, 3G or 4G mobile network and the public
Internet, normally provided by the MNO.
Once connected to a mobile network, a data session from a smartphone normally
starts with entering a Web address into the browser (or clicking a link on an
existing webpage).
The device must translate the web address into an IP address; this is the task of the
DNS resolution procedures, through which the browser sends a query to the MNO’s
DNS server and waits for the answer. Assuming that the address is successfully
translated into a valid IP address, the next step is to open a connection to the
destination remote server, which can be located anywhere on the Internet. This phase
is called the TCP handshake and during it the end-user does not see any data, except
for a small prompt that normally looks like “…contacting site...”
Once the connection with the remote server is established, the device sends the
request for the data to display in the form of an HTTP query. If this is the first visit
to the website, the request from the device is for the homepage. The remote server
sends back a list of all the components (called “page objects”) that will be shown
on the homepage; for each of them, the device normally opens a new TCP connection,
sends the request for the data and waits for the answer. Note that the procedure
is repeated for every component of the page until the entire page is retrieved; it is
very common for a webpage to be composed of parts coming from different servers.
The device normally starts to show the components as soon as they are received
from the remote server; depending on the speed of the connection, the loading of
the various components may or may not be perceptible. Sometimes the page appears
suddenly and fully complete, while other times the various parts appear in sequence.
Finally, the device closes all the TCP connections and waits for the next command.
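The sketch below (illustrative only, using the Python standard library; it is not part of the Empirix solution, which works on passive probe data) times the phases just described for a single base page. These per-phase durations are exactly the kind of measurable parameters an objective QoE model needs to capture:

```python
import socket
import time
import http.client

def measure_page_fetch(host, path="/", timeout=10):
    """Time the phases described above (DNS resolution, TCP handshake,
    wait for the first response, transfer of the base page). Fetching the
    individual page objects over further connections is omitted for brevity."""
    timings = {}

    t0 = time.monotonic()
    ip = socket.gethostbyname(host)                 # DNS resolution
    timings["dns_ms"] = (time.monotonic() - t0) * 1000

    t0 = time.monotonic()
    conn = http.client.HTTPConnection(ip, 80, timeout=timeout)
    conn.connect()                                  # TCP handshake
    timings["tcp_connect_ms"] = (time.monotonic() - t0) * 1000

    t0 = time.monotonic()
    conn.request("GET", path, headers={"Host": host})
    resp = conn.getresponse()                       # wait for the 200 OK
    timings["time_to_first_byte_ms"] = (time.monotonic() - t0) * 1000

    t0 = time.monotonic()
    body = resp.read()                              # transfer of the base page
    timings["download_ms"] = (time.monotonic() - t0) * 1000
    timings["status"] = resp.status
    timings["bytes"] = len(body)

    conn.close()
    return timings

if __name__ == "__main__":
    # example.com is used purely as a reachable public test page
    print(measure_page_fetch("example.com"))
```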
The example above (Figure 1) clearly shows the complexity involved in trying to generate a single normalized metric of the
experience as perceived by the customer. Each step contributes to the overall scoring in a different way; furthermore, the
scoring must take into account both objective parameters that depend on the network and the content provider (e.g., the
time to download the page is a function of the connection’s speed and of the response time of the remote server), and
subjective parameters (how long will a user wait for the page before giving up because the network is slow?).
If the device does not receive an answer to its DNS resolution query, the browser can only show a message like
“…cannot open…” or display an incomplete view of the page, and this is more difficult to score objectively.
Up to now, international recommendations have helped to measure the network parameters (e.g., QoS – ITU-T G.1010,
ITU-T Y.1540, etc.). The question now is: how to correlate existing QoS solutions and subjective customer experience
into a holistic QoE?
Most of the service assurance tools currently deployed at MNOs have been designed to monitor the network, not the
customers’ QoE; a better, smarter approach is now required.
AN INNOVATIVE EMPIRIX APPROACH TO QoE
Recognizing that modern MNO environments require a different approach to QoE scoring, Empirix created a model
that will not only leverage information from existing methods, but also enhance them to ensure they cover mobile
data services.
To start, Empirix defined a set of metrics to evaluate each of the phases composing a mobile data session, grouping
them into three basic categories taken from the existing standards already available for the definition of QoS:
• Service Accessibility: Evaluates the ability of a customer to access a service.
• Service Retainability: Evaluates the ability of a customer to retain the service, once obtained, without interruptions due to the network.
• Service Performance: Evaluates the performance of a service.
Note that each category includes a set of metrics that cover multiple types of procedures: for example, service
accessibility considers the authentication, attach and PDP context procedures. It is, however, possible that not all such
information will be available, depending on the level of coverage of the probe system within the network (e.g., the attach
procedures for 3G require visibility on the Iu-PS interface).
FIGURE 1. PROCEDURES TO ACCESS A WEBPAGE THROUGH A 3G NETWORK
[Sequence diagram: UE – Node B – RNC (Iub) – SGSN (Iu-PS, GTP-C, probe) – GGSN – Internet – remote web server, with the MNO responsibility demarcation marked. Procedures shown: attach and PDP context create; DNS resolution to/from the DNS server; end-to-end TCP handshake; data transfer with HTTP GET (first the home page, then all other objects composing the page); 200 OK + data with TCP ACKs from the UE (or ACK “piggy-backed” in the 200 OK); end of the end-to-end TCP transaction; 3GDT on the user plane.]
The innovation of the Empirix approach lies in the next step, when each QoS category is assigned, for each individual
customer, to a subscriber satisfaction zone (SSZ) by correlating the respective KPIs. The SSZ is defined by:
• Satisfied: Response times are fast enough to satisfy the user, who is then able to concentrate fully on the work at hand with minimal negative impact on his/her thought process.
• Tolerating: Response times are longer than those of the satisfied zone, so the user notices how long it takes to interact with the system. However, they are still able to use the application.
• Frustrated: The user becomes unhappy. The casual user is likely to abandon the current course of action, while a production user will definitely cancel the task.
Such categorization is inspired by the APDEX Alliance, a consortium of vendors that standardizes a way to measure any
type of experience from the end-user perspective, but Empirix introduced a way to correlate the QoS metrics to the
customer’s QoE. Basically, within a specific, configurable time range, users are categorized as satisfied, tolerating or
frustrated by evaluating the respective set of QoS metrics.
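A minimal sketch of this Apdex-style zoning, assuming a response-time KPI and invented thresholds (the actual Empirix thresholds and per-category KPI sets are not published):

```python
# Apdex-style subscriber satisfaction zoning for a response-time KPI.
# The 2-second target and the 4x tolerance factor are illustrative
# assumptions, not the thresholds used by Empirix.
def classify(response_time_s, target_s=2.0, tolerance_factor=4.0):
    if response_time_s <= target_s:
        return "satisfied"
    if response_time_s <= tolerance_factor * target_s:
        return "tolerating"
    return "frustrated"

def zone_counts(response_times_s, **thresholds):
    counts = {"satisfied": 0, "tolerating": 0, "frustrated": 0}
    for rt in response_times_s:
        counts[classify(rt, **thresholds)] += 1
    return counts

# Hypothetical page-load times (seconds) observed for one subscriber
# within one configurable reporting interval.
samples = [0.8, 1.5, 2.4, 3.1, 9.0, 12.5]
print(zone_counts(samples))  # {'satisfied': 2, 'tolerating': 2, 'frustrated': 2}
```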
For example, a customer can have a satisfied score for Service Accessibility but a frustrated score for Service Performance:
the various components affecting the QoE are all taken into account and weighted. The SSZ scoring also records all of the
events that occurred and were evaluated, because they will be required in the final formulas.
Finally, a formula correlates the SSZ results and generates a single normalized QoE Index between 0 (= unacceptable) and
1 (= excellent), as shown in Figure 3 below:
Note that the QoE Index will provide an overall view of the experience, but can be grouped by a number of different
dimensions including a single customer, a group of customers, cells, applications or mobile devices. This will allow, for
example, an MNO to easily understand if poor QoE is affecting only a specific set of devices and not a specific area.
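As an illustration of how the per-category zone counts could be correlated into a single normalized index and mapped to the bands of Figure 3, the sketch below uses an Apdex-like average with configurable weights; the actual Empirix formula is not disclosed, so the math here is an assumption:

```python
def apdex_like_index(counts):
    """Normalize one QoS category's zone counts to 0-1
    (Apdex convention: tolerating samples count half)."""
    total = sum(counts.values()) or 1
    return (counts["satisfied"] + 0.5 * counts["tolerating"]) / total

def qoe_index(category_counts, weights=None):
    """Correlate per-category scores into a single QoE Index between
    0 (unacceptable) and 1 (excellent). The weights and the Apdex-style
    average are assumptions, not the published Empirix formula."""
    weights = weights or {cat: 1.0 for cat in category_counts}
    total_w = sum(weights.values())
    return sum(weights[cat] * apdex_like_index(c)
               for cat, c in category_counts.items()) / total_w

def qoe_band(index):
    """Map the index to the bands shown in Figure 3."""
    if index >= 0.94:
        return "Excellent"
    if index >= 0.85:
        return "Good"
    if index >= 0.70:
        return "Fair"
    if index >= 0.50:
        return "Poor"
    return "Unacceptable"

# Hypothetical zone counts for one subscriber in one reporting interval.
example = {
    "accessibility": {"satisfied": 95, "tolerating": 4, "frustrated": 1},
    "retainability": {"satisfied": 90, "tolerating": 8, "frustrated": 2},
    "performance":   {"satisfied": 60, "tolerating": 25, "frustrated": 15},
}
idx = qoe_index(example, weights={"accessibility": 1, "retainability": 1, "performance": 2})
print(round(idx, 2), qoe_band(idx))  # 0.84 Fair
```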
FIGURE 3. QoE INDEX
[Empirix QoE Index bands: Excellent 0.94–1.00; Good 0.85–0.94; Fair 0.70–0.85; Poor 0.50–0.70; Unacceptable 0.00–0.50.]
FIGURE 2. CORRELATING QoS TO QoE
[Layers: Network Performance → Quality of Service → Individual Subscriber Experience → Overall Experience.]
The key differentiators of the Empirix approach can be summarized as follows and
pictured in the figure below:
• Generate a single “QoE Index,” instead of thousands of uncorrelated metrics.
• Correlate all the procedures that can affect access to the service, not only the performance, regardless of the technology (2G, 3G and 4G).
• Flexibility to customize the formulas, assigning different weights to how the individual QoS categories contribute to the subscriber satisfaction zone assignment (satisfied, tolerating, frustrated).
• Flexibility to introduce application-specific KPIs for the evaluation of service performance: examples include MOS for audio calls (e.g., VoLTE), video stalling in the case of YouTube, and response time in the case of generic web access.
• Options to include the customer profile in the subscriber satisfaction zone categorization.
• Aggregate the QoE Index by configurable dimensions like individual customers or groups of customers, cells, applications and mobile devices to provide valuable information to multiple MNO service assurance departments.
• Provide an automated tool that indicates the possible root cause of low QoE (automatic expert analysis).
THE VALUE OF QoE
MNOs have no shortage of service assurance solutions that generate hundreds or
thousands of KPI/KQI covering specific network domains. However, they lack an easy
way to obtain a holistic view of the experience provided to their subscribers because
such solutions are totally uncorrelated and designed only to score the network
performance. In the past, service quality management (SQM) solutions attempted
to correlate disparate information to provide a single view of the service, but
they are not suitable in mobile scenarios because they are heavily dependent on the
paradigm of “if the network is good, the service is good, so my customers are happy”
and are also very difficult to configure. The Empirix solution has been designed
specifically to continuously monitor the QoE for MNO data services. It accomplishes this by
analyzing customers’ session information generated by a passive probe system; it is
totally independent of the network infrastructure technology deployed. It also provides
a framework that can be easily extended to other types of services (e.g., voice) and
network technology (e.g., fixed).
FIGURE 4. OBTAIN A SINGLE VIEW OF THE OVERALL EXPERIENCE BY SPECIFIC DIMENSIONS
[The QoE Index can be viewed by subscribers, by applications, by devices and by location.]
The Empirix QoE solution can provide a number of benefits to multiple departments (see Figure 5) in an MNO including:
• Drive strategic company decisions: Understanding QoE enables an MNO to understand how customers are using the network and the quality of experience they are receiving, providing invaluable information to help define strategy.
• Provide complementary information for business and customer intelligence: Although many applications are already in place to collect information from billing systems about how customers use the network, they lack the Quality of Experience information needed to properly evaluate the probability of churn and/or drive targeted marketing campaigns.
• Assist in Network Planning: Focus investments to improve QoE in the areas where the high-value accounts are located and enhance the performance of the applications they use.
• Enhance Network Supervision and Operations: Monitoring the network from the customers’ standpoint (i.e., looking at their experience and not just at the infrastructure’s health) allows MNOs to prioritize network operations resources, focusing them on the issues most impacting the customer base.
FIGURE 5. QoE PROVIDES HIGH BUSINESS VALUE TO ALL MNO DEPARTMENTS