
7 Steps to Creating a Quality Monitoring Program

Kristin Robertson, KR Consulting
August 2007

A support center’s quality monitoring (or call monitoring) program is an essential element in providing excellent service to customers. There’s a direct correlation between how analysts treat customers and how satisfied and loyal those customers are to the company. Quality interactions help you retain your customers and grow the business. More and more companies find that they can no longer sell solely on the features and benefits of their products, because those products have become commodities. Instead, they must compete on the quality and types of service they offer. The spotlight is on service, so quality must be at the forefront of every support manager’s mind.

There are many benefits to establishing a quality monitoring program for your phone-based analysts, including a means to ensure consistent and professional service, reinforce customer service skills learned in training, demonstrate to analysts how important customer service is, and approximate and measure what your customers think of your service. The goals of a quality program are to ensure consistent, high-quality service, recognize analysts who are doing a great job, and identify opportunities for analyst training. Results of the quality monitoring evaluations are often included in analyst scorecards alongside other, more tangible metrics such as number of calls handled, average handle time, and first contact resolution rate.

Creating a quality monitoring program takes a bit of time and thought, but is well worth the effort. Here’s a seven-step process to create a quality program that reflects your unique support center environment.

Step 1: Define/Redefine Critical Success Factors

The first step is to assemble a task force, composed of both support analysts and support managers, to work through this process. Together, the team can identify goals, define intended outcomes, and assign tasks to individuals to achieve results.

The task force needs to define what constitutes success for this project. In doing that, some important questions to answer are:

  • Why are we doing this?
  • Who is going to be involved in this, and to what degree?
  • What outcomes do we want to see?
  • When do we want to monitor for quality, and how often?
  • How will we measure our success?

Clarifying objectives, expected results, and metrics at this point influences all subsequent work and ensures understanding across the task force.

Step 2: Observe Key Analysts

Every support center has its star analysts. During this step, ask both management and analysts to identify a small group of analysts who are exemplary customer service providers, using criteria the task force agrees on. The task force then observes these analysts as they handle customers and analyzes the interactions to identify structural elements of the call as well as best practices in handling customers’ requests. Members of the task force will listen to calls and observe how the star analysts use their support tools and structure their work. You can then map out the common elements of your star group’s interactions with customers. Compiling the information from this research will direct the next step in the process.

Step 3: Create an Interaction Structure and Elements

Next, the task force outlines the structure of a successful interaction by defining the elements identified while observing the star analysts. All calls have, at minimum, three phases: a greeting, a problem-solving or request-handling phase, and a close. Within that structure, define the behaviors, or elements, that create successful interactions with customers. There are two types of elements to keep in mind: procedural steps, such as gathering the correct information from the customer; and customer service behaviors, such as using a standard greeting, showing empathy, using the customer’s name, and employing an appropriate tone of voice. The procedural steps will be unique to your situation, but the customer service skills are universal. Capture the star performers’ best practices in both areas by creating a flow diagram of a typical interaction. If you handle a wide variety of call types, you may need to define several kinds of interactions.
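To make the two element types concrete, here is a minimal sketch of an interaction structure expressed as a simple data outline. The three phases come from the text above; the individual elements listed are illustrative assumptions, to be replaced with your own star performers’ practices:

```python
# Minimal sketch of an interaction structure: three phases, each holding
# procedural steps and customer service behaviors. The specific elements
# are illustrative assumptions, not a prescribed set.
INTERACTION_STRUCTURE = {
    "greeting": {
        "procedural": ["verify the caller's identity", "open a ticket"],
        "customer_service": ["use the standard greeting", "use the customer's name"],
    },
    "problem_solving": {
        "procedural": ["gather the correct information", "search the knowledge base"],
        "customer_service": ["show empathy", "headline any dead air"],
    },
    "close": {
        "procedural": ["log the resolution"],
        "customer_service": ["set follow-up expectations", "use the closure guidelines"],
    },
}

# Flatten the structure into the checklist that Step 4 turns into a form.
for phase, elements in INTERACTION_STRUCTURE.items():
    for element in elements["procedural"] + elements["customer_service"]:
        print(f"{phase}: {element}")
```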

Step 4: Create a Call Evaluation Form, Legend, and Analyst Scorecard

Using the structure defined in the previous step, the task force creates an evaluation form for quality monitoring sessions. The form lists all the elements, in question form, that an analyst should follow during the interaction to solve the problem and satisfy the customer. Some elements you might want to include on the evaluation form are:

  • Did the analyst use the department’s standard greeting?
  • Did the analyst properly identify the problem?
  • Did the analyst use active listening skills with the customer?
  • Did the analyst use effective questioning skills to diagnose the problem?
  • Did the analyst headline any dead air space during the call?
  • Did the analyst use proper on-hold procedures?
  • Did the analyst make use of available resources to solve the issue?
  • Did the analyst set expectations for follow-up after the call?
  • Did the analyst use the department’s call closure guidelines?

A legend accompanies the evaluation form, and defines in detail the behaviors expected of the analysts in each step of the call structure. The legend can include suggested wordings for each element of the evaluation. These phrases should be ones that are proven, through the experience of your star performers, to work well with your customers.

Don’t forget to include call evaluation scores on a monthly summary or scorecard of each individual’s performance. The scorecard generally includes both quality metrics and performance metrics such as number of calls handled, average handle time, and first contact resolution rate.

Although many support centers grade calls using a 1–5 scoring system (similar to a customer satisfaction survey), a yes-no grading system is undeniably simpler. Says Tracy Coleman, Senior Manager at CompuCom Systems’ Enterprise Help Desk, “We use a yes-no grading system. The analysts either did what you expected or they didn’t.”
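To make the yes-no approach concrete, here is a minimal sketch of how such an evaluation form might be scored. The element names mirror the sample questions above, and the scoring rule (the percentage of “yes” answers) is an illustrative assumption, not a prescribed standard:

```python
# Minimal sketch of a yes-no call evaluation form and its score.
# The element names mirror the sample questions above; the percentage
# scoring rule is an illustrative assumption, not a required standard.

EVALUATION_ELEMENTS = [
    "standard_greeting",
    "problem_identified",
    "active_listening",
    "effective_questioning",
    "headlined_dead_air",
    "on_hold_procedure",
    "used_resources",
    "set_followup_expectations",
    "call_closure_guidelines",
]

def score_call(answers: dict[str, bool]) -> float:
    """Return the percentage of elements the analyst performed ("yes")."""
    yes_count = sum(1 for element in EVALUATION_ELEMENTS if answers.get(element, False))
    return 100.0 * yes_count / len(EVALUATION_ELEMENTS)

# Example: an evaluation where the analyst missed two elements.
evaluation = {element: True for element in EVALUATION_ELEMENTS}
evaluation["headlined_dead_air"] = False
evaluation["set_followup_expectations"] = False

print(f"Call quality score: {score_call(evaluation):.0f}%")  # prints 78%
```

One advantage of yes-no grading is that the arithmetic stays transparent: an analyst’s monthly quality score is simply the average of these percentages across the calls evaluated.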

Step 5: Create an Interaction Monitoring Process

In this step, gather the answers to the questions identified in Step 1 and create the process flow for your quality monitoring program. The task force documents exactly who will be doing what, and what tools and techniques will be used. Be sure to include the answers to questions such as “How many times will each analyst be monitored each month?” and “How will we record both the calls and the evaluations?”

Recording calls is an important step in the quality monitoring process. It’s important for the supervisor to be able to listen several times to a call, but most significantly, it’s beneficial for the analyst to listen to their own calls in a calm environment. I once managed an analyst who had a very loud phone voice. Although I mentioned to her that she needn’t speak so forcefully, it wasn’t until she heard her voice in a recorded call that she fully realized how the volume could be unpleasant to a customer.

Many call centers don’t have the budget to invest in expensive call recording technologies, and for smaller centers, such equipment isn’t necessary. It’s easy to record calls manually by connecting a common tape recorder to a supervisor’s telephone set; you can buy the tape recorder and connecting hardware for under $100 at your local electronics store. Be sure to get a tape recorder with a meter on it so you can note the location of each conversation you record. Using the supervisor functionality on the ACD or phone system, the supervisor dials into an analyst’s extension and records the calls. Roberta Moss, of Medtronic Powered Surgical Solutions, keeps a separate tape for each analyst in her center and records several calls a week in order to have a representative number of monitors at the end of the month. “I mark the tape recorder’s meter number at the top of each call evaluation form,” says Roberta. “That way, I know exactly where to fast-forward on the tape to listen again to the call.”

Then create a document that outlines the quality monitoring process. It helps to organize the tasks in a weekly, monthly, and yearly format. If you agreed, for example, that each analyst should have two coaching sessions per month, your quality process outline might look like this:

Weekly Tasks:

  • Record and evaluate calls for half of the analysts on the team.
  • Prepare coaching for each call evaluated.
  • Coach analysts.
  • Record evaluations in a spreadsheet (or database).

Monthly Tasks:

  • Run summary reports (see the sketch after this outline).
  • Incorporate quality scores into analyst and department scorecards.

Yearly Tasks:

  • Run yearly summary reports.
  • Incorporate quality scores into annual performance evaluations.
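
As an example of the monthly summary task, here is a minimal sketch that averages each analyst’s quality scores for the month. It assumes the weekly evaluations were recorded in a spreadsheet and exported to a CSV file with “analyst” and “score” columns; the file name and column names are hypothetical:

```python
# Minimal sketch: a monthly summary report built from recorded evaluations.
# Assumes a CSV export with one row per evaluated call and columns
# "analyst" and "score"; the file and column names are hypothetical.
import csv
from collections import defaultdict

def monthly_summary(path: str) -> dict[str, float]:
    """Average each analyst's quality scores for the month."""
    scores = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["analyst"]].append(float(row["score"]))
    return {analyst: sum(s) / len(s) for analyst, s in scores.items()}

if __name__ == "__main__":
    for analyst, average in sorted(monthly_summary("evaluations_aug.csv").items()):
        print(f"{analyst}: {average:.0f}%")
```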

Step 6: Train and Pilot

Any new process is going to have wrinkles that need ironing! Conducting a pilot program, in which you roll out the new program to a smaller group of analysts, provides an opportunity to test the new processes. Training is an important part of this step, as the analysts need to adopt new habits and behaviors. Training should emphasize the “What’s in it for me?” aspect of the program from the analyst’s point of view; recognition for a job well done is typically a good motivator to highlight. After a few weeks of the pilot, the task force should get together to assess what went well and what didn’t, and tweak the process before full implementation.

Step 7: Implement

Training will be repeated for all the analysts during the implementation phase. You’re now ready to roll out the program to the entire support center and start reaping the benefits of your hard work!

A quality monitoring process should be reviewed and improved periodically via feedback loops within the support center; that’s why creating a quality program is a continuous cycle. Coleman cites a new program at CompuCom that demonstrates how one company continuously improves its quality monitoring process. CompuCom’s “Put a Face on the Caller” program ensures that basic customer service skills are used in its support center. This back-to-basics program allows any supervisor or manager in CompuCom’s large center to listen to and critique the customer service skills of any analyst, regardless of that supervisor’s technical or process knowledge of the applications supported. The evaluation form is devoid of account-specific process questions and focuses only on the analyst’s interaction skills. “This new layer in our quality monitoring program has focused our analysts on putting a smile on the customer’s face through superior service,” reports Coleman.

Quality is an elusive goal, but with a solid quality monitoring program in place, your support center can indeed put a smile on the caller’s face.
