Banking

Exceed Your Customers' Expectations the First Time, and Every Time

Customer Profile

A major UK-based multinational banking and financial services company: a universal bank with operations in retail, wholesale and investment banking, as well as wealth management, mortgage lending and credit cards. It operates in over 50 countries and territories and serves around 48 million customers.

No. of Agents: ~4,000
Volume: ~350,000 calls/month

Business Challenge

A $60 billion credit card and loan provider, the first to introduce a credit card in the UK, was seeking to improve the customer experience at its contact center in the US. The major challenge was to determine whether the parameters used to measure agent performance through the quality assurance (QA) form were really capable of sensing customers' overall satisfaction with the brand and the quality of service.

For example:
Supervisor: "My team member has an outstanding quality score of 98% on his/her call with the card holder / customer."
Card holder / customer in the survey: "I am not satisfied with the clarity of speech, and the agent did not have the authority to resolve my issue. All in all, the quality of service was poor."

In the above example, the customer rated the agent low on questions (within the Customer Satisfaction - CSAT - survey form) related to overall satisfaction with the quality of service, authority and clarity of speech, whereas the corresponding call quality score suggests otherwise, indicating that the call quality assurance form is not well aligned with the CSAT survey form.

Our Approach

The project was guided by the following set of broadly stated research objectives:

  • Identify the factors that drive the overall customer experience using the HQ's home-grown tool, 'Key Driver Analysis (KDA)'
  • Check whether the attributes used to measure call quality are also aligned to gauge customer satisfaction
  • Analyze a sample of calls to modify the attributes of the existing quality assurance (QA) form

Our Solution

Let's study the key phases to understand how we arrived at the solution.

Key Driver Analysis

In addition to the question on overall customer satisfaction, the existing CSAT survey form also contained questions to gauge satisfaction with the brand, the quality of service and agent performance. The customer could rate every question on a scale of 1-5 (1 being low satisfaction and 5 being highly satisfied).

Key Driver Analysis was performed to identify the questions in the current CSAT survey form that correlate with overall CSAT. This phase was supported by the following approach:

[Figure: Key Driver Analysis approach]
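As a rough illustration of how such a correlation-based Key Driver Analysis can be carried out, the sketch below ranks each survey question by its correlation with the overall-experience rating. The file name and column names are hypothetical placeholders, not the client's actual survey fields.

```python
# A rough sketch of a correlation-based Key Driver Analysis.
# `csat_survey_responses.csv` and the column names are hypothetical placeholders.
import pandas as pd

survey = pd.read_csv("csat_survey_responses.csv")   # one row per survey response, ratings 1-5

overall = "overall_experience_with_cards"
candidate_drivers = [c for c in survey.columns if c != overall]

# Rank every other survey question by its correlation with the overall rating;
# the strongest correlates are treated as key drivers of overall CSAT.
driver_strength = (
    survey[candidate_drivers]
    .corrwith(survey[overall], method="spearman")   # Spearman suits ordinal 1-5 ratings
    .sort_values(ascending=False)
)
print(driver_strength.head(10))
```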

During the analysis, it was found that the CSAT survey question "Overall Experience with Cards" is significantly driven by the question "Quality of Service on Phone" (considered the Level 1 factor), which in turn is driven by another question, "Quality of Service by Agent" (considered the Level 2 factor), as shown below:

[Figure: Drivers of "Overall Experience with Cards" - Level 1 and Level 2 factors]

In order to control the overall satisfaction, "Quality of Service by Agent" was analyzed further. This question was divided into seven sub-questions (e.g. "Did the agent take ownership?", "Did the agent answer clearly?"), which were considered the Level 3 factors. The inter-relationships among these sub-questions were analyzed with statistical tools to check their interdependence (Factor Analysis), and it was found that these 7 sub-questions (Level 3 factors) can be grouped into 3 categories based on the direction of their dependence relationship with "Quality of Service by Agent" (the Level 2 factor), as shown below:

[Figure: Grouping of the 7 Level 3 sub-questions into 3 categories]
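The grouping step can be sketched along the following lines, using an off-the-shelf factor analysis to extract three latent categories from the seven sub-question ratings. The sub-question names below are illustrative, not the actual survey wording.

```python
# A rough sketch of the grouping step using factor analysis.
# The seven sub-question column names are illustrative placeholders.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

sub_questions = [
    "agent_ownership", "agent_clarity", "agent_courtesy", "agent_knowledge",
    "agent_authority", "issue_resolution", "hold_time_handling",
]

ratings = pd.read_csv("csat_survey_responses.csv")[sub_questions]

# Extract three latent categories from the seven Level 3 sub-question ratings.
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(ratings)

# Each sub-question is assigned to the category (factor) on which it loads most heavily.
loadings = pd.DataFrame(fa.components_.T, index=sub_questions,
                        columns=["category_1", "category_2", "category_3"])
print(loadings.round(2))
print(loadings.abs().idxmax(axis=1))
```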

Recommendations were provided for each of the three categories identified in this phase.

Current QA (Quality Assurance) form alignment study

To see whether the attributes in the existing QA form are able to gauge the true customer experience, the relationship was analyzed between:

  • Overall QA Score & Overall CSAT Score
  • Overall QA Score & Drivers of CSAT (Customer Experience)
  • Section-level QA Score & Drivers of CSAT (Customer Experience)

With the help of statistical tools such as correlation and regression, it was found that the call QA form was not aligned with the overall Customer Experience, meaning that the quality score generated through the QA form does not represent the true customer satisfaction level.
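A simplified view of that alignment check, assuming a hypothetical dataset of calls matched to both a QA evaluation and a survey response, might look like this:

```python
# A simplified alignment check between the overall QA score and overall CSAT.
# `qa_vs_csat_matched_calls.csv` and its columns are hypothetical placeholders.
import pandas as pd
from scipy import stats

matched = pd.read_csv("qa_vs_csat_matched_calls.csv")   # one row per call with both scores

# Correlation: does a higher QA score go hand in hand with a higher CSAT rating?
r, p_value = stats.pearsonr(matched["overall_qa_score"], matched["overall_csat"])

# Simple regression: how much of the variation in CSAT does the QA score explain?
slope, intercept, r_value, p_reg, std_err = stats.linregress(
    matched["overall_qa_score"], matched["overall_csat"]
)

print(f"Pearson r = {r:.2f} (p = {p_value:.3f}), R^2 = {r_value ** 2:.2f}")
# A weak correlation and low R^2 would indicate, as found here, that the QA score
# does not represent the true customer satisfaction level.
```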

The attributes in the QA form needed to be redefined from the customer's perspective. In order to gauge the true customer experience, the client was advised to measure "Customer Experience" and "Compliance" separately.

Call Analytics and QA Form Update

By studying a statistically determined sample of calls for which customer ratings were available, the meaning of each sub-question in the CSAT survey was clearly defined, and these definitions were then converted into new attributes in the QA form.

Through this exercise, 13 new QA attributes were formed and logically divided into 7 categories, one for each of the 7 sub-questions under "Quality of Service by Agent" in the CSAT survey form.

Once all 13 QA attributes were marked Yes/No/NA, the new QA form was able to convert the QA results into customer satisfaction ratings, called Surrogate CSAT Ratings, for each of the drivers of Customer Experience with an accuracy of over 85%.
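One plausible way to roll the Yes/No/NA marks up into Surrogate CSAT Ratings and check them against the actual survey ratings is sketched below; the attribute naming, scoring rule and rating cut-offs are illustrative assumptions rather than the client's exact mapping.

```python
# An illustrative roll-up of Yes/No/NA marks into Surrogate CSAT Ratings.
# The file, the "attr_" column naming and the scoring rule are assumptions.
import pandas as pd

def surrogate_rating(marks: pd.Series) -> int:
    """Convert one call's Yes/No/NA marks into a 1-5 surrogate rating."""
    scored = marks[marks != "NA"]                 # attributes marked NA are excluded
    if scored.empty:
        return 3                                  # neutral when nothing applies
    pass_rate = (scored == "Yes").mean()          # share of applicable attributes met
    return int(round(1 + 4 * pass_rate))          # map 0..1 onto the 1-5 scale

qa = pd.read_csv("new_qa_form_marks.csv")         # 13 Yes/No/NA attribute columns per call
attribute_cols = [c for c in qa.columns if c.startswith("attr_")]

qa["surrogate_csat"] = qa[attribute_cols].apply(surrogate_rating, axis=1)

# Accuracy against the actual customer ratings collected for the same calls.
accuracy = (qa["surrogate_csat"] == qa["actual_csat"]).mean()
print(f"Surrogate CSAT accuracy: {accuracy:.1%}")
```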

Results

After the successful testing of the new QA form, it was applied to a larger sample and was found to improve the overall customer experience by 2-4% in just a quarter, as shown below.

Overall Customer Experience (Actual)


The new QA form impressed the stakeholders by consistently delivering an accuracy of over 85% in measuring CSAT (customer satisfaction).

Accuracy to measure CSAT (Through new QA Form for 3 Months)

