Integrated Network Solutions Group (INS Group)

VoIP QOS Validation & PoC Testing

Additional Information

The primary purpose of this testing is to validate that the QOS configurations that have been developed for IPT systems will work in all office environments. The basic methodology used in these configurations is to statically set the QOS value at the port to which a phone is attached. The configurations use default QOS settings with no adjustment of parameters such as bandwidth and queue depth. This static methodology is being used in lieu of AutoQOS (Cisco proprietary) to offer a vendor-neutral solution.
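
As an illustration only, a static per-port configuration of this kind on one of the Catalyst 3750 access switches might look like the sketch below; the interface and VLAN numbers are hypothetical, and the project's actual trust settings may differ.

    ! Enable QoS globally, then statically trust the phone's DSCP
    ! marking at the access port (interface/VLANs are hypothetical)
    mls qos
    !
    interface GigabitEthernet1/0/10
     switchport mode access
     switchport access vlan 100
     switchport voice vlan 110
     mls qos trust dscp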

VoIP testing over the Customer AT&T infranet will be done between the Ren Cen and the Customer Staging Center, and between the Pontiac CenterPoint campus and the Customer Staging Center. The IPT/VoIP testing environment will be set up at the Customer Network Engineering Center (Staging Center). Combinations of standard IPT phones, video phones and video soft phones will be used for this testing. Guest VLANs segmented through "vrf" routing domains will be tested to verify that phones (including video soft phones) attached to the guest VLANs will operate properly. Firewall modules will also be enabled with vrf traffic routed to them. This is not an exact simulation of the final guest solution, in which traffic would be routed from the local Customer firewall to an off-site vendor location and then VPN back into the Customer environment.
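
For reference, carving a guest VLAN into its own "vrf" routing domain on a Catalyst 6500 generally takes a form like the following sketch; the VRF name, route distinguisher, VLAN and addressing shown here are hypothetical, not the values used in this project.

    ! Define the guest routing domain (names/numbers hypothetical)
    ip vrf GUEST
     rd 65000:10
    !
    ! Place the guest VLAN interface into that domain
    interface Vlan200
     ip vrf forwarding GUEST
     ip address 192.168.200.1 255.255.255.0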

Another method of setting QOS would be to use AutoQOS and let the switch automatically select the QOS settings based on network conditions. This is the method to be used at Customer manufacturing facilities. Comparative formal testing of the two methodologies was out of scope for this project. 
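
By comparison, enabling AutoQOS on a 3750 access port is a single interface command that the switch expands into a full set of trust and queuing settings; a minimal sketch, with a hypothetical interface and voice VLAN:

    interface GigabitEthernet1/0/10
     switchport voice vlan 110
     ! Expands into conditional trust plus queue/buffer tuning
     auto qos voip cisco-phone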


Test Objective and Scope

Ensure consistent voice quality for IP Telephony under maximum network load and/or resource-constrained conditions. Document quality improvements with a QoS policy versus a non-QoS environment. Evaluate the impact of prioritized VoIP traffic on the performance of existing application traffic.

The testing being done is progressive and builds upon previous tests. The primary parameter to be tested is QOS, so the different scenarios are done in pairs with QOS off and then on. Of equal importance is that all phone types work seamlessly throughout the different scenarios and tests. A baseline test is done first to determine the state of the network before voice testing is started. Then standard IPT phones are introduced to the environment without PCs attached or simulated user traffic generated. The tests then progress by attaching PCs and servers to the phones (the PCs and servers have Chariot traffic generation end points (EPs) installed) and generating traffic on the network through the phone PC ports and other EPs placed throughout the network. The testing then moves to conference calls, video phones, soft phones, testing on a guest VLAN and finally testing over the Customer AT&T infranet. The sequence of these tests is important in order to minimize test time. Standard IP phones, IP conference phones, video phones and video soft phones will be used for testing. The test scenarios will use mixes of these phones so that each phone type is tested with each other phone type.

Chariot/IXIA will be used to generate network traffic to load the uplinks that IPT traffic will be traversing. Ideally these tests will generate enough traffic load to adversely affect the IPT traffic, essentially trying to find an operating threshold. IPT phone calls will be made during these tests and the quality of the calls will be assessed manually, by Chariot, and by the Infinistream Multiport sniffer. The amount of traffic, throughput, and response time of the data traffic generated by Chariot in these simulated network environments will be recorded and evaluated, as will the Infinistream sniffer traces.

All devices involved in this testing will be time synchronized (if they are NTP/SNTP capable) with the Staging Center core NTP server. The server that the Chariot console will run on, the Cisco Call Manager that the phones register with, and the Infinistream Multiport sniffer are all NTP capable. This is extremely important since the sniffer traces will be extracted from the Infinistream at a later date and need to be time synchronized with the Chariot results and the Cisco IPT phones.
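
On the Cisco devices, synchronizing to the Staging Center core NTP server is a one-line configuration; the server address below is hypothetical, and millisecond log timestamps are added here to make trace correlation easier.

    ! Sync the device clock to the Staging Center core NTP server
    ntp server 10.1.1.1
    ! Timestamp logs to the millisecond for trace correlation
    service timestamps log datetime msec localtime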


1) Create a Project Time-Line

2) Create an Architecture Diagram

3) Develop a Test Plan/Scenarios

4) Setup IPT Equipment in Staging Center Lab

5) Develop QOS Electronics Configurations

6) Manage Project

7) Develop VoIP Equipment BOM

8) Setup Equipment in Staging Center Lab

9) IPT QOS Testing

10) Remote VoIP to VoIP Testing

11) Execute Several Test Scenarios with a FWSM Enabled

12) Project Write-up


Test Equipment List


  1. Two Cisco 6513 Cores 
    1. Two Supervisor 720/3Bs
    2. Two FWSM Modules
    3. Two 6724’s
    4. Two 6748’s
  2. 3750’s
    1. Two WS-C3750G-48PS
    2. Three WS-C3750-48TS
    3. Two WS-C3750-48P
    4. Two WS-C3750G-12S
  3. Call Managers
    1. Four MCS7845H2-K9-CMA2
  4. 3845 PSTN Gateway
    1. Two CISCO3845-V/K9
  5. Cisco IPT Phones
    1. Standard Phones
      i. Two CP-7971G-GE
      ii. Two CP-7941G
    2. Video Phones
      i. Two CP-7985-NTSE
    3. Soft Phones with Web Cams
      i. Two UPC-CAMERAS-24 Web Cameras
      ii. Two Cisco IP Communicator software bundles


See the IPT/VoIP Architecture diagram in section five and the Customer Staging Center Lab architecture diagram in section six for further details.


Testing Tools, Test Scenario Format and Test Results Format


Following is a list of test tools and what they were used for:

  1. IXIA Chariot
    1. Chariot is essentially an application-layer test tool that can generate different types of data streams and files. The test scenarios in this document will use scripts that generate TCP/IP connection-based traffic and also connectionless multicast/UDP traffic.
    2. The primary purpose of Chariot is to create network load. In many scenarios the network is incrementally loaded in an attempt to find failure thresholds.
    3. Chariot test results will be saved in HTML format, with links to them found in the test scenario results section.
  2. Sniffers
    1. Sniffer traces using an NGC Infinistream multiport sniffer will be taken at points of high data volume and where VoIP traffic passes. As the tests are running the sniffer will be left in expert mode to observe errors/warnings (e.g. retransmits, TTL expirations, broadcast/multicast storms) and unexpected traffic (e.g. multicasts traversing an uplink that has no group members attached). The Infinistream has 7.5 terabytes of storage and will capture and store data continuously throughout the entire three-week test period.
    2. Using the Chariot and phone date and time stamps, individual traces for each test scenario can be drilled into and extracted from the Infinistream's continuous long-term capture database. These individual traces will then be saved in the format illustrated below.


Following is the basic format of each test scenario:

  1. Test Scenario: a description/definition of what the test will do and, in general, how to set it up.
  2. Test Objective: an overview of what the test is for.
  3. Test Equipment: a list of the type and quantity of equipment being used.
  4. Test Setup: primarily made up of configuration and patch panel allocation files.
  5. Test Procedure: how to start the tests, test durations, test cycles, what to save/observe, etc.
  6. Test Results: this will either have questions to respond to and/or instructions to save Chariot, Cisco Switch Status, Sniffer Traces, etc.
    1. The file naming conventions are as follows:
      i. Chariot files: "IPTQOS_ScenX_TestX_Date_Time"; the date and time are the start date and time of the Chariot test.
      ii. Sniffer traces: "IPTQOS_ScenX_TestX_Date_Time"; the date and time are the same as the Chariot test.
      iii. Cisco log files: "IPTQOS_ScenX_TestX_Date_Time"; the date and time are the same as the Chariot test.
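
For example (with a purely hypothetical date and time stamp), the Chariot results for scenario 3, test 2 would be saved as "IPTQOS_Scen3_Test2_120407_0930", with the matching sniffer trace and Cisco log files carrying the same stamp.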


General Observations & Conclusions of Tests Performed


Overall the configuration validation testing went very well, with some minor issues and observations that are mentioned below. The configurations used for this testing worked as designed and will be recommended for inclusion in the office VoIP architecture.

MLS QOS parameters (buffer size, queue depth and bandwidth) should be monitored on a periodic basis and when new applications (video, multicast-based, and audio, to name a few) are added to the environment. The configurations as written today use default QOS parameters, which worked well in the Staging Center lab testing. The default settings leave a comfortable margin for future growth.
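
On the 3750 access switches, a periodic check of this kind could use the standard MLS QOS show commands, for example (interface hypothetical):

    ! Confirm QoS is enabled and look for per-port queue drops
    show mls qos
    show mls qos interface GigabitEthernet1/0/10 statistics
    ! Review buffer and threshold allocation per queue-set
    show mls qos queue-set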

The MOS scores recorded in scenarios two through seven were determined by making actual phone calls, listening to the quality of the call, and applying an appropriate MOS score. MOS scores for the remaining scenarios were recorded from Chariot and Cisco phone statistics.


Observations/Issues by test scenario:

  1. Test scenario 1
    1. No issues
  2. Test scenario 2 (QOS off)
    1. No issues
  3. Test scenario 3 (QOS on)
    1. No issues
  4. Test scenario 4 (QOS off)
    1. No issues
  5. Test scenario 5 (QOS on)
    1. No issues
  6. Test scenario 6 (QOS off)
    1. No issues
  7. Test scenario 7 (QOS on)
    1. Throughput of Chariot EPs in the voice VLAN ran extremely slowly, as can be seen from the previous Chariot EP 9 & 10 throughput data and graph for scenario 7. With AutoQOS enabled the bandwidth is adjusted automatically based on demand and the throughput is normal. For voice calls alone the default queues and buffers are more than adequate, with rates averaging 2.4 Mbps as can be seen in the Chariot reports for test scenario 7, tests 1-3.
  8. Test scenario 8 (QOS off)
    1. The DSCP value coming from the video soft phone is set to zero. This is because the video soft phone is set up in the data VLAN. The IPT traffic could be differentiated using ACLs that key off of IPT port numbers, triggering the DSCP value to be set to 46/EF (a sketch of this marking approach follows this list).
  9. Test scenario 9 (QOS on)
    1. No issues
  10. Test scenario 10 (QOS off)
    1. The Min MOS LQK value on phone 2 was 3.1, which is low; the Avg MOS LQK was 4.43 and the Max Jitter was 271. This would indicate that the call may have had a small amount of noise. QOS is disabled in this test.
    2. A minor issue that was confusing at first was seeing the DSCP value in sniffer trace packets from a switch with the video phone attached set to a value of 34/AF. This could be seen on a sniffer trace when the video phone source was also on that switch. When looking at a packet on a switch where the receiving phone is attached, the DSCP value from the video phone is 46 as it should be. The reason for this is that the QOS policies have not yet been applied when the monitor port captures a packet, whether the monitor port is looking at a VLAN or the port that the phone is attached to. This was only an issue in sniffer traces from scenarios 8 through 15, which were captured on ports monitoring the voice VLAN, not an uplink port.
  11. Test scenario 11 (QOS on): the MOS scores are fairly low, as can be seen below.
    1. MOS LQK 3.2678
    2. Avg MOS LQK 3.2727
    3. Min MOS LQK 3.1591
    4. Max MOS LQK 3.5100
  12. Test scenario 12 (QOS off)
    1. NOTE: Video soft phones did not work originally in this test environment; an ACL needed to be added to the firewall to enable the guest VLAN access to the Call Manager. In order to expedite the testing and stay on schedule, the formal testing in the next two sections was done with only standard phones. Video soft phones were tested informally and worked within the Staging Center lab.
  13. Test scenario 13 (QOS on)
    1. No issues
  14. Test scenario 14 (QOS off)
    1. No issues
  15. Test scenario 15 (QOS on): the Min MOS score is fairly low and the Max Jitter is high, as can be seen in the table below.
    1. Min MOS LQK 2.8227
    2. Max Jitter 496
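
As referenced under test scenario 8, marking soft-phone traffic with an ACL might look like the sketch below (3750-style syntax). The ACL name, UDP port range and interface are hypothetical; the actual port numbers would have to match the IPT deployment.

    ! Match RTP media from the soft phone (standard Cisco RTP UDP range)
    ip access-list extended SOFTPHONE-RTP
     permit udp any any range 16384 32767
    !
    class-map match-all SOFTPHONE-VOICE
     match access-group name SOFTPHONE-RTP
    !
    ! Rewrite the DSCP to 46/EF on ingress
    policy-map MARK-SOFTPHONE
     class SOFTPHONE-VOICE
      set dscp ef
    !
    interface GigabitEthernet1/0/20
     service-policy input MARK-SOFTPHONE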


Other Observations & Issues:

  1. Data throughput is about 25% lower when QOS is enabled. This can be seen in most of the test scenario pairs; for example, in test scenario 8 test 3 (QOS off) and test scenario 9 test 3 (QOS on), the throughput is 1,708 Mbps and 1,304 Mbps respectively.
  2. Remote testing with video soft phones attached to the guest VLAN over the Customer AT&T infranet was not performed.
  3. Video phones worked very well in the over-the-WAN testing; standard phones were used to gather statistics.
  4. Stateful IPT/VoIP test tools would have been very useful for this testing. Their ability to simulate hundreds of actual calls that follow the protocols being used (SCCP & H.323) would have saved time and coordinated/assimilated the data better and more accurately.
  5. When laptop PCs and servers running Chariot EPs were attached to IPT phones (models 7941 & 7971) and traffic was run between them on different model phones, the throughput was low and a lot of retransmissions occurred. This could be an issue between the phones, or it could be that dissimilar test equipment was used for the Chariot EPs. The Chariot tests were changed so that EPs were communicating with EPs attached to the same model phone. This resulted in more typical throughput and retransmission rates. Further testing with the same type of Chariot EPs would be required before any conclusions could be established.
