End-to-End Testing to the Agent Desktop

Now that you’ve completed infrastructure, SBC, and IVR testing, let’s look at the final segments of the end-to-end test best practice: CTI Data and Routing testing and Agent Desktop Integration testing.

CTI Data and Routing Testing

CTI Data and Routing testing validates the accuracy and capacity of the link between your IVR system and the database that feeds customer information into your CTI system. Some key validation metrics to observe include:

  • Intelligent call routing
  • Default routing
  • Data delivery
  • Timeliness of data arrival
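
The checks above can be sketched as a simple per-call validation. This is a minimal, hypothetical sketch in Python: the `CallResult` fields and the two-second latency threshold are illustrative assumptions, not part of any specific CTI product or test tool.

```python
from dataclasses import dataclass

# Hypothetical record of one test call's CTI outcome; field names are
# illustrative placeholders, not from any specific product.
@dataclass
class CallResult:
    routed_to: str        # queue/agent the call actually reached
    expected_route: str   # queue/agent the routing rules should have chosen
    data: dict            # call-attached data delivered with the call
    data_latency_ms: int  # time between call arrival and data arrival

def validate_call(result: CallResult, max_latency_ms: int = 2000) -> list[str]:
    """Return a list of validation failures for one test call."""
    failures = []
    if result.routed_to != result.expected_route:
        failures.append("routing mismatch")       # intelligent/default routing
    if not result.data:
        failures.append("no data delivered")      # data delivery
    elif result.data_latency_ms > max_latency_ms:
        failures.append("data arrived late")      # timeliness of data arrival
    return failures
```

A real test run would feed one `CallResult` per simulated call into a check like this and aggregate the failures by load level.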


Use Case Scenario

Here’s an interesting CTI Data and Routing testing scenario we came across at Empirix.

A large retailer had agents complaining that they occasionally were not getting customer screen pops. To help identify the failure points, the company opted to performance test its home-grown IVR and its integration with its IP PBX.


Using a virtual agent simulator (VAS) to capture the screen pop data, the team determined that at 330 concurrent calls the VAS stopped receiving data. Further investigation into the customer’s IVR and IP PBX revealed that the VXML servers servicing the IVR were not passing the data to the IP PBX.

Once the customer added an additional VXML server, the test was re-executed, and the team validated that the system worked as expected at up to 600 concurrent calls.


Agent Desktop Integration Testing

Agent Desktop Integration testing isolates the CTI-to-CRM integration. Some key validation points include:

  • Screen pop performance
  • CTI call control functionality
  • Verification of screen pop content
  • Time to screen pop
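
Two of these points, time to screen pop and verification of its content, can be captured in one measurement. The sketch below is a hypothetical harness helper: `trigger_call` and `wait_for_pop` are assumed callables supplied by your own test tooling (one places the test call, the other blocks until the agent desktop shows the pop and returns its payload, or None on timeout), and the `account` field is an illustrative example of expected content.

```python
import time

def measure_screen_pop(trigger_call, wait_for_pop, expected_account: str,
                       timeout_s: float = 10.0) -> dict:
    """Time a screen pop and verify its content.

    trigger_call() places the test call; wait_for_pop(timeout_s) blocks
    until the pop appears and returns its payload dict, or None on timeout.
    """
    start = time.monotonic()
    trigger_call()
    pop = wait_for_pop(timeout_s)
    elapsed = time.monotonic() - start
    return {
        "popped": pop is not None,                  # did a pop arrive at all?
        "time_to_pop_s": round(elapsed, 3),         # time to screen pop
        "content_ok": bool(pop) and pop.get("account") == expected_account,
    }
```

Running this per call while ramping concurrency would surface a degradation like the one in the case study below, where pop time stayed under two seconds at one load level and ballooned at the next.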

Use Case Scenario

Here is a real-life example of why Agent Desktop Integration testing makes sense.

A retail provider was experiencing issues with its agents’ CRM and CTI toolbar integration. During testing, screen pop performance was measured: at 250 concurrent calls, response time was under two seconds. When the load was increased to 310 concurrent calls, however, screen pop response time jumped to 38 seconds. In addition, while the agent waited for the screen pop, the entire custom CRM application froze, making it impossible for the agent to assist the customer in any way, even with a request as simple as an account number.


Testing the agent desktop integration revealed the cause: a poorly written set of database queries that logged the call and produced the screen pop.

Do you test new technology before adding it to your contact center? Want to know whether others do too? Check out the infographic and survey results in Customer Service: Who Cares?

For more information about end-to-end testing, read the paper 10 Tips for Developing a Powerful End-to-End Contact Center Testing Plan.

For further information about understanding the testing landscape, read the Gartner research reports covering management tools for the lifecycle of Unified Communications (UC) technologies.
