The primary purpose of this testing is to validate that the QOS configurations that have been developed for IPT systems will work in all office environments. The basic methodology used in these configurations is to statically set the QOS value at the port to which a phone is attached. The configurations use default QOS settings with no adjustment of parameters such as bandwidth and queue depth. This static methodology is being used in lieu of AutoQOS (Cisco proprietary) to offer a vendor-neutral solution.
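For illustration, the following is a minimal sketch of the kind of static per-port QOS configuration described above, assuming a Cisco Catalyst access switch; the interface and VLAN numbers are placeholders, not the tested configuration.

    ! Enable QOS globally on the switch
    mls qos
    !
    ! Access port with an IP phone attached (interface/VLAN numbers are illustrative)
    interface GigabitEthernet1/0/1
     switchport access vlan 10          ! data VLAN for the attached PC
     switchport voice vlan 110          ! voice VLAN for the phone
     mls qos trust device cisco-phone   ! trust markings only if a phone is detected
     mls qos trust cos                  ! trust the CoS value the phone sets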
VoIP testing over the Customer AT&T infranet will be done between the Ren Cen and the Customer Staging Center, and between the Pontiac CenterPoint campus and the Customer Staging Center. The IPT/VoIP testing environment will be set up at the Customer Network Engineering Center (Staging Center). Combinations of standard IPT, video and video soft phones will be used for this testing. Guest VLANs segmented through “vrf” routing domains will be tested to verify that phones (including video soft phones) attached to the guest VLANs operate properly. Firewall modules will also be enabled, with vrf traffic routed to them. This is not an exact simulation of the final guest solution, in which guest traffic would be routed from the local Customer firewall to an off-site vendor location and then VPN back into the Customer environment.
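As a hedged sketch only (not the production guest design), segmenting a guest VLAN with a vrf and handing it to a Firewall Services Module on a Catalyst 6500 might look like the following; the vrf name, route distinguisher, VLAN, addresses, and module slot are all illustrative assumptions.

    ! Define a routing domain for guest traffic (name and RD are hypothetical)
    ip vrf GUEST
     rd 65000:10
    !
    ! Place the guest VLAN interface into the vrf
    interface Vlan200
     ip vrf forwarding GUEST
     ip address 10.200.0.1 255.255.255.0
    !
    ! Hand the guest VLAN to the FWSM in slot 3
    firewall vlan-group 1 200
    firewall module 3 vlan-group 1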
Another method of setting QOS would be to use AutoQOS and let the switch apply Cisco's recommended QOS settings automatically. This is the method to be used at Customer manufacturing facilities. Comparative formal testing of the two methodologies was out of scope for this project.
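For comparison, a minimal sketch of the AutoQOS alternative on a Catalyst access port (the interface number is an illustrative assumption):

    ! AutoQOS generates the trust, marking, and queueing settings for the port
    interface GigabitEthernet1/0/2
     auto qos voip cisco-phone   ! apply voice QOS only when a Cisco phone is detected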
Test Objective and Scope
Ensure consistent voice quality for IP Telephony under maximum network load and/or resource-constrained conditions. Document quality improvements with a QoS policy versus a non-QoS environment. Evaluate the impact of prioritized VoIP traffic on the performance of existing application traffic.
The testing being done is progressive and builds upon previous tests. The primary parameter to be tested is QOS, so the different scenarios are done in pairs with QOS off and then on. Of equal importance is that all phone types work seamlessly throughout the different scenarios and tests. A baseline test is done first to determine the state of the network before voice testing is started. Then standard IPT phones are introduced to the environment without PCs attached or simulated user traffic generated. The tests then progress by attaching PCs and servers to the phones (the PCs and servers have Chariot traffic-generation endpoints (EPs) installed) and generating traffic on the network through the phone PC ports and other EPs placed throughout the network. The testing then moves to conference calls, video phones, soft phones, testing on a guest VLAN, and finally testing over the Customer AT&T infranet. The sequence of these tests is important in order to minimize test time. Standard IP phones, IP conference phones, video phones, and video soft phones will be used for testing. The test scenarios will use mixes of these phones so that each phone type is tested with every other phone type.
Chariot/IXIA will be used to generate network traffic to load the uplinks that IPT traffic will be traversing. Ideally these tests will generate enough traffic load to adversely affect the IPT traffic, essentially trying to find an operating threshold. IPT phone calls will be made during these tests, and the quality of the calls will be assessed manually, by Chariot, and by the Infinistream Multiport sniffer. The amount of traffic, throughput, and response time of the data traffic generated by Chariot in these simulated network environments will be recorded and evaluated, as will the Infinistream sniffer traces.
All devices involved in this testing will be time-synchronized (if they are NTP/SNTP capable) with the Staging Center core NTP server. The server that the Chariot console runs on, the Cisco CallManager that the phones register with, and the Infinistream Multiport sniffer are all NTP capable. This is extremely important because the sniffer traces will be extracted from the Infinistream at a later date and must be time-synchronized with the Chariot results and the Cisco IPT phones.
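As a hedged illustration of this synchronization on the Cisco side, pointing a switch or router at the core NTP server takes a single command; the server address below is a placeholder assumption.

    ! Synchronize the device clock with the Staging Center core NTP server
    ntp server 10.0.0.10        ! placeholder address for the core NTP server
    clock timezone EST -5       ! illustrative time zone setting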
1) Create a Project Time-Line
2) Create an Architecture Diagram
3) Develop a Test Plan/Scenarios
4) Set Up IPT Equipment in Staging Center Lab
5) Develop QOS Electronics Configurations
6) Manage Project
7) Develop VoIP Equipment BOM
8) Set Up Equipment in Staging Center Lab
9) IPT QOS Testing
10) Remote VoIP to VoIP Testing
11) Execute Several Test Scenarios with a FWSM Enabled
12) Project Write-up
Test Equipment List
Standard IP phones:
i. Two CP-7971G-GE
ii. Two CP-7941G
Video phones:
i. Two CP-7985-NTSC
Video soft phones:
i. Two UPC-CAMERAS-24 web cameras
ii. Two Cisco IP Communicator software bundles
See the IPT/VoIP Architecture diagram in section five and the Customer Staging Center Lab architecture diagram in section six for further details.
Following is a list of test tools and what they were used for:
i. Chariot/IXIA: network traffic generation and call-quality (MOS) measurement
ii. Infinistream Multiport sniffer: packet capture of the test traffic for later analysis
iii. Cisco CallManager and IP phones: call setup, registration, and phone statistics
Following is the basic format of each test scenario:
i. Chariot file format: “IPTQOS_ScenX_TestX_Date_Time”; the date and time are the start date and time of the Chariot test.
ii. Sniffer traces: “IPTQOS_ScenX_TestX_Date_Time”; the date and time are the same as the Chariot test.
iii. Cisco log files: “IPTQOS_ScenX_TestX_Date_Time”; the date and time are the same as the Chariot test.
Overall, the configuration validation testing went very well, with some minor issues and observations that are mentioned below. The configurations used for this testing worked as designed and will be recommended for inclusion in the office VoIP architecture.
MLS QOS parameters (buffer size, queue depth, and bandwidth) should be monitored on a periodic basis and when new applications (video, multicast-based, and audio, to name a few) are added to the environment. The configurations as written today use default QOS parameters, which worked well in the Staging Center lab testing. The default settings leave a comfortable margin for future growth.
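As a hedged example of this kind of periodic monitoring on a Catalyst access switch (exact commands vary by platform, and the interface number is illustrative):

    ! Per-queue enqueue and drop counters for a phone-facing port
    show mls qos interface gigabitethernet1/0/1 statistics
    !
    ! Global queue-set configuration: buffer allocation and thresholds
    show mls qos queue-set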
The MOS (Mean Opinion Score) values recorded in scenarios two through seven were determined by making actual phone calls, listening to the quality of the call, and assigning an appropriate MOS score. MOS scores for the remaining scenarios were recorded from Chariot and Cisco phone statistics.
Observations/Issues by test scenario:
Other Observations & Issues: