Network Working Group                                        J. H. Dunn
INTERNET-DRAFT                                              C. E. Martin
Expires: April, 2000                                           ANC, Inc.
                                                           October, 1999

                    Methodology for ATM Benchmarking

Status of this Memo

This document is an Internet-Draft and is in full conformance with all provisions of Section 10 of RFC 2026.  Internet-Drafts are working documents of the Internet Engineering Task Force (IETF), its areas, and its working groups.  Note that other groups may also distribute working documents as Internet-Drafts.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time.  It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

The list of current Internet-Drafts can be accessed at http://www.ietf.org/ietf/1id-abstracts.txt

The list of Internet-Draft Shadow Directories can be accessed at http://www.ietf.org/shadow.html.

Copyright Notice

Copyright (C) The Internet Society (1999).  All Rights Reserved.

Abstract

This document discusses and defines a number of tests that may be used to describe the performance characteristics of ATM-based switching devices.  In addition to defining the tests, this document also describes specific formats for reporting the results of the tests.

Appendix A lists the tests and conditions that we believe should be included for specific cases and gives additional information about testing practices.

This memo is a product of the Benchmarking Methodology Working Group (BMWG) of the Internet Engineering Task Force (IETF).

1. Introduction

This document defines a specific set of tests that vendors can use to measure and report the performance characteristics of ATM network devices.  The results of these tests will provide the user with comparable data from different vendors with which to evaluate these devices.  The methods defined in this memo are based on RFC 2544, "Benchmarking Methodology for Network Interconnect Devices".
55 The document "Terminology for ATM Benchmarking" (draft-ietf-bmwg-atm- 56 term- 01.txt), defines many of the terms that are used in this document. 57 The terminology document should be consulted before attempting to make 58 use of this document. 60 The BMWG produces two major classes of documents: Benchmarking 61 Terminology documents and Benchmarking Methodology documents. The 62 Terminology documents present the benchmarks and other related terms. 63 The Methodology documents define the procedures required to collect the 64 benchmarks cited in the corresponding Terminology documents. 66 2. Background 68 2.1. Test Device Requirements 70 This document is based on the requirement that a test device is 71 available. The test device can either be off the shelf or can be easily 72 built with current technologies. The test device must have a 73 transmitting and receiving port for the interface type under test. The 74 test device must be configured to transmit test PDUs and to analyze 75 received PDUs. The test device should be able to transmit and analyze 76 received data at the same time. 78 2.2. Systems Under Test (SUTs) 80 There are a number of tests described in this document that do not apply 81 to each SUT. Vendors should perform all of the tests that can be 82 supported by a specific product type. It will take some time to perform 83 all of the recommended tests under all of the recommended conditions. 84 Appendix A lists some of the tests and conditions that we believe should 85 be included for specific cases. 87 2.3. Test Result Evaluation 89 Performing all of the tests in this document will result in a great deal 90 of data. The applicability of this data to the evaluation of a 91 particular SUT will depend on its expected use and the configuration of 92 the network in which it will be used. For example, the time required by 93 a switch to provide ILMI services will not be a pertinent measurement in 94 a network that does not use the ILMI protocol, such as an ATM WAN. 95 Evaluating data relevant to a particular network installation may 96 require considerable experience, which may not be readily available. 98 Finally, test selection and evaluation of test results must be done with 99 an understanding of generally accepted testing practices regarding 100 repeatability, variance and the statistical significance of a small 101 numbers of trials. 103 2.4. Requirements 105 In this document, the words that are used to define the significance of 106 each particular requirement are capitalized. These words are: 108 * "MUST" This word, or the words "REQUIRED" and "SHALL" mean that the 109 item is an absolute requirement of the specification. 111 * "SHOULD" This word or the adjective "RECOMMENDED" means that there may 112 exist valid reasons in particular circumstances to ignore this item, but 113 the full implications should be understood and the case carefully 114 weighed before choosing a different course. 116 * "MAY" This word or the adjective "OPTIONAL" means that this item is 117 truly optional. One vendor may choose to include the item because a 118 particular marketplace requires it or because it enhances the product, 119 for example; another vendor may omit the same item. 121 An implementation is not compliant if it fails to satisfy one or more of 122 the MUST requirements for the protocols it implements. 
An implementation that satisfies all the MUST and all the SHOULD requirements for its protocols is said to be "unconditionally compliant"; one that satisfies all the MUST requirements but not all the SHOULD requirements for its protocols is said to be "conditionally compliant".

2.5. Test Configurations for SONET

The test device can be connected to the SUT in a variety of configurations depending on the test point.  The following configurations will be used for the tests described in this document.

1) Uni-directional connection: The test device's transmit port (labeled Tx) is connected to the SUT receive port (labeled Rx).  The SUT's transmit port is connected to the test device receive port (see Figure 1).  In this configuration, the test device can verify that all transmitted packets are acknowledged correctly.  Note that this configuration does not verify internal system functions, but verifies one port on the SUT.

      +-------------+                 +-------------+
      |           Tx|---------------->|Rx           |
      |  Test     Rx|<----------------|Tx    SUT    |
      |  Device     |                 |             |
      +-------------+                 +-------------+

                         Figure 1

2) Bi-directional connection: The test device's first transmit port is connected to the SUT's first receive port.  The SUT's first transmit port is connected to the test device's first receive port.  The test device's second transmit port is connected to the SUT's second receive port.  The SUT's second transmit port is connected to the test device's second receive port (see Figure 2).  In this configuration, the test device can determine if all of the transmitted packets were received and forwarded correctly.  Note that this configuration does verify internal system functions, since it verifies two ports on the SUT.

      +-------------+                 +-------------+
      |  Test     Tx|---------------->|Rx           |
      |  Device   Rx|<----------------|Tx    SUT    |
      |  Tx   Rx    |                 |  Tx   Rx    |
      +-------------+                 +-------------+
         |    ^                          |    ^
         |    |                          |    |
         |    +--------------------------+    |
         |                                    |
         +------------------------------------+

                         Figure 2

3) Uni-directional passthrough connection: The test device's first transmit port is connected to the SUT1 receive port.  The SUT1 transmit port is connected to the test device's first receive port.  The test device's second transmit port is connected to the SUT2 receive port.  The SUT2 transmit port is connected to the test device's second receive port (see Figure 3).  In this configuration, the test device can determine if all of the packets transmitted by SUT1 were correctly acknowledged by SUT2.  Note that this configuration does not verify internal system functions, but verifies one port on each SUT.

   +-------------+            +-------------+            +-------------+
   |           Tx|----------->|Rx         Tx|----------->|Rx           |
   |  SUT1     Rx|<-----------|Tx   Test  Rx|<-----------|Tx   SUT2    |
   |             |            |    Device   |            |             |
   +-------------+            +-------------+            +-------------+

                                  Figure 3

2.5. SUT Configuration

The SUT MUST be configured as described in the SUT user's guide.  Specifically, it is expected that all of the supported protocols will be configured and enabled (see Appendix A).  It is expected that all of the tests will be run without changing the configuration or setup of the SUT in any way other than that required to do the specific test.  For example, it is not acceptable to disable all but one transport protocol when testing the throughput of that protocol.
If PNNI or BISUP is used to initiate switched virtual connections (SVCs), the SUT configuration SHOULD include the normally recommended routing update intervals and keep-alive frequency.  The specific version of the software and the exact SUT configuration, including which functions are disabled and which are used during the tests, MUST be included as part of the report of the results.

2.6. Frame formats

The formats of the test IP PDUs to use for TCP/IP over ATM are shown in Appendix C: Test Frame Formats.  Note that these IP PDUs are in accordance with RFC 2225.  These exact IP PDU formats SHOULD be used in the tests described in this document for this protocol/media combination.  These IP PDUs will be used as a template for testing other protocol/media combinations.  The specific formats that are used to define the test IP PDUs for a particular test series MUST be included in the report of the results.

2.7. Frame sizes

All of the described tests SHOULD be performed using a number of IP PDU sizes.  Specifically, the sizes SHOULD include the maximum and minimum legitimate sizes for the protocol under test on the media under test and enough sizes in between to be able to get a full characterization of the SUT performance.  Except where noted, at least five IP PDU sizes SHOULD be tested for each test condition.

Theoretically, the minimum size UDP Echo request IP PDU would consist of an IP header (minimum length 20 octets), a UDP header (8 octets), an AAL5 trailer (8 octets) and an LLC/SNAP code point header (8 octets); therefore, the minimum size PDU will fit into one ATM cell.  The theoretical maximum IP PDU size is determined by the size of the length field in the IP header.  In almost all cases the actual maximum and minimum sizes are determined by the limitations of the media.  In the case of ATM, the maximum IP PDU size SHOULD be the ATM MTU size, which is 9180 octets.

In theory it would be ideal to distribute the IP PDU sizes in a way that would evenly distribute the theoretical IP PDU rates.  These recommendations incorporate this theory but specify IP PDU sizes that are easy to understand and remember.  In addition, many of the same IP PDU sizes are specified on each of the media types to allow for easy performance comparisons.

Note: The inclusion of an unrealistically small IP PDU size on some of the media types (i.e. with little or no space for data) is to help characterize the per-IP PDU processing overhead of the SUT.

The IP PDU sizes that will be used are:

   44, 64, 128, 256, 1024, 1518, 2048, 4472, 9180

The minimum size IP PDU for UDP on ATM is 44 octets; this minimum size of 44 is recommended to allow direct comparison to token ring performance.  The IP PDU size of 4472 is recommended instead of the theoretical FDDI maximum size of 4500 octets in order to permit the same type of comparison.  An IP (i.e. not UDP) PDU may be used in addition if a higher data rate is desired, in which case the minimum IP PDU size is 28 octets.
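As an illustration of the size list above, the following sketch (not part of the methodology) estimates the per-PDU cell count and the theoretical maximum PDU rate for each recommended size.  The OC-3c figures are assumptions made only for this example, and the listed sizes are treated as complete AAL5 CPCS-PDU payloads, following the 44-octet minimum-size arithmetic above.

   import math

   CELL_PAYLOAD = 48                 # octets carried per ATM cell
   CELL_SIZE_BITS = 53 * 8           # full cell, header included
   OC3C_PAYLOAD_BPS = 149_760_000    # assumed OC-3c SPE payload rate, bit/s

   def cells_needed(size_octets: int) -> int:
       """Cells required for one AAL5 CPCS-PDU of size_octets."""
       return math.ceil(size_octets / CELL_PAYLOAD)

   def max_pdu_rate(size_octets: int) -> float:
       """Theoretical maximum PDUs per second at the assumed line rate."""
       return OC3C_PAYLOAD_BPS / (cells_needed(size_octets) * CELL_SIZE_BITS)

   # Section 2.7 example: 20 (IP) + 8 (UDP) + 8 (AAL5 trailer) + 8 (LLC/SNAP)
   # = 44 octets, i.e. a single cell.
   assert cells_needed(44) == 1

   for size in (44, 64, 128, 256, 1024, 1518, 2048, 4472, 9180):
       print(f"{size:5d} octets: {cells_needed(size):4d} cells, "
             f"{max_pdu_rate(size):9.1f} PDUs/s max")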
2.8. Verifying received IP PDUs

The test equipment SHOULD discard any IP PDUs received during a test run that are not actual forwarded test IP PDUs.  For example, keep-alive and routing update IP PDUs SHOULD NOT be included in the count of received IP PDUs.  In any case, the test equipment SHOULD verify the length of the received IP PDUs and check that they match the expected length.

Preferably, the test equipment SHOULD include sequence numbers in the transmitted IP PDUs and check for these numbers on the received IP PDUs.  If this is done, the reported results SHOULD include, in addition to the number of IP PDUs dropped, the number of IP PDUs that were received out of order, the number of duplicate IP PDUs received and the number of gaps in the received IP PDU numbering sequence.  This functionality is required for some of the described tests.

2.9. Modifiers

It is useful to characterize the SUT's performance under a number of conditions.  Some of these conditions are noted below.  The reported results SHOULD include as many of these conditions as the test equipment is able to generate.  The suite of tests SHOULD be run first without any modifying conditions, then repeated under each of the modifying conditions separately.  To preserve the ability to compare the results of these tests, any IP PDUs that are required to generate the modifying conditions (excluding management queries) will be included in the same data stream as that of the normal test IP PDUs and in place of one of the test IP PDUs.  They MUST NOT be supplied to the SUT on a separate network port.

2.9.1. Management IP PDUs

Most ATM data networks now make use of ILMI, signaling and OAM.  In many environments, there can be a number of management stations sending queries to the same SUT at the same time.

Management queries MUST be made in accordance with the applicable specification, e.g., ILMI sysUpTime getNext requests will be made in accordance with ILMI 4.0.  The response to the query MUST be verified by the test equipment.  Note that, for each management protocol in use, this requires that the test equipment implement the associated protocol state machine.  One example of the specific query IP PDU that should be used is shown in Appendix B.

2.9.2. Routing update IP PDUs

The processing of PNNI updates could have a significant impact on the ability of a switch to forward cells and complete calls.  If PNNI is configured on the SUT, one routing update MUST be transmitted before the first test IP PDU is transmitted during the trial.  The test SHOULD verify that the SUT has properly processed the routing update.

PNNI routing update IP PDUs SHOULD be sent at the rate specified in Appendix B.  Appendix C defines two routing update PDUs for the TCP/IP over ATM example.  The routing updates are designed to change the routing on a number of networks that are not involved in the forwarding of the test data.  The first IP PDU sets the routing table state to "A"; the second one changes the state to "B".  The IP PDUs MUST be alternated during the trial.  The test SHOULD verify that the SUT has properly processed the routing update.
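The sketch below shows one way the offered stream for a trial under a modifying condition could be built; the PDU constructors are hypothetical placeholders.  The routing update PDUs take the place of test IP PDUs within the same data stream, never on a separate port, and alternate between the "A" and "B" routing table states as required above.

   def test_pdu(seq: int) -> dict:
       return {"kind": "test", "seq": seq}

   def pnni_update(seq: int, state: str) -> dict:
       return {"kind": "pnni-routing-update", "state": state, "seq": seq}

   def offered_stream(n_slots: int, modifier_every: int) -> list:
       """Every modifier_every-th slot carries a routing update in place of
       a test PDU; the update state alternates A, B, A, B during the trial."""
       stream, updates_sent = [], 0
       for seq in range(n_slots):
           if modifier_every and seq and seq % modifier_every == 0:
               state = "A" if updates_sent % 2 == 0 else "B"
               stream.append(pnni_update(seq, state))
               updates_sent += 1
           else:
               stream.append(test_pdu(seq))
       return stream

   for pdu in offered_stream(12, modifier_every=4):
       print(pdu)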
2.10. Filters

Filters are added to switches to selectively inhibit the forwarding of cells that would normally be forwarded.  This is usually done to implement security controls on the data that is accepted between one area and another.  Different products have different capabilities to implement filters.

The SUT SHOULD first be configured to add one filter condition and the tests performed.  This filter SHOULD permit the forwarding of the test data stream.  This filter SHOULD be of the form:

   To be determined.

The SUT SHOULD then be reconfigured to implement a total of 25 filters.  The first 24 of these filters SHOULD be of the form:

   To be determined.

The 24 ATM addresses SHOULD NOT be any that are represented in the test data stream.  The last filter SHOULD permit the forwarding of the test data stream.  By "first" and "last" we mean to ensure that in the second case, 25 conditions must be checked before the data IP PDUs will match the conditions that permit the forwarding of the IP PDU.  Of course, if the SUT reorders the filters or does not use a linear scan of the filter rules, the effect of the sequence in which the filters are input is properly lost.

The exact filter configuration command lines used SHOULD be included with the report of the results.

2.10.1. Filter Addresses

Two sets of filter addresses are required, one for the single filter case and one for the 25 filter case.

The single filter case should permit traffic from ATM address [Switch Network Prefix] 00 00 00 00 00 01 00 to ATM address [Switch Network Prefix] 00 00 00 00 00 02 00 and deny all other traffic.  Note that the 13 octet Switch Network Prefix MUST be configured before this test can be run.

The 25 filter case should use the following sequence:

   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 03 00
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 04 00
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 05 00
   ...
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 0C 00
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 0D 00
   allow [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 02 00
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 0E 00
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 0F 00
   ...
   deny  [Switch Network Prefix] 00 00 00 00 00 01 00
      to [Switch Network Prefix] 00 00 00 00 00 18 00
   deny all else

All previous filter conditions should be cleared from the switch before this sequence is entered.  The sequence is selected to test whether the switch sorts the filter conditions or accepts them in the order in which they were entered.  Both of these procedures will result in a greater impact on performance than will some form of hash coding.

2.11. Protocol addresses

It is easier to implement these tests using a single logical stream of data, with one source ATM address and one destination ATM address; for some conditions, like the filters described above, this is a practical requirement.  Networks in the real world are not limited to single streams of data.  The test suite SHOULD first be run with a single ATM source and destination address pair.  The tests SHOULD then be repeated using a random destination address.  In the case of testing single switches, the addresses SHOULD be random and uniformly distributed over a range of 256 seven octet user parts.  In the case of testing multiple interconnected switches, the addresses SHOULD be random and uniformly distributed over the 256 network prefixes, each of which should support 256 seven octet user parts.  The specific address ranges to use for ATM are shown in Appendix B.  IP to ATM address mapping MUST be accomplished as described in RFC 2225.
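A minimal sketch of such destination address generation follows; the 13 octet network prefix value is a placeholder rather than one of the Appendix B ranges, and the helper names are illustrative only.

   import random

   def user_part(index: int) -> bytes:
       """Seven octet user part: a 6 octet end system value plus a 0x00 selector."""
       return index.to_bytes(6, "big") + b"\x00"

   def prefix(index: int) -> bytes:
       """Hypothetical 13 octet network prefix; the last octet selects the switch."""
       return bytes.fromhex("47000580ffde000000000000") + bytes([index])

   rng = random.Random(1)            # fixed seed, so a trial can be repeated

   def single_switch_destination() -> bytes:
       """Uniform over 256 user parts under one switch network prefix."""
       return prefix(0) + user_part(rng.randrange(256))

   def multi_switch_destination() -> bytes:
       """Uniform over 256 network prefixes, each with 256 user parts."""
       return prefix(rng.randrange(256)) + user_part(rng.randrange(256))

   for _ in range(3):
       print(single_switch_destination().hex(), multi_switch_destination().hex())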
2.12. Route Set Up

It is not reasonable to expect that all of the routing information necessary to forward the test stream, especially in the multiple address case, will be set up manually.  If PNNI and/or ILMI are running, at the start of each trial a routing update MUST be sent to the SUT.  This routing update MUST include all of the ATM addresses that will be required for the trial.  This routing update will have to be repeated at the interval required by PNNI or ILMI.  An example of the format and repetition interval of the update IP PDUs is given in Appendix B.

2.13. Bidirectional traffic

Bidirectional performance tests SHOULD be run with the same data rate being offered from each direction.  The sum of the data rates should not exceed the theoretical limit for the media.

2.14. Single stream path

The full suite of tests SHOULD be run with the appropriate modifiers for a single receive and transmit port on the SUT.  If the internal design of the SUT has multiple distinct pathways, for example, multiple interface cards each with multiple network ports, then all possible permutations of pathways SHOULD be tested separately.  If multiple interconnected switches are tested, the test MUST specify routes that allow only one path between source and destination ATM addresses.

2.15. Multi-port

Many switch products provide several network ports on the same interface module.  Each port on an interface module SHOULD be stimulated in an identical manner.  Specifically, half of the ports on each module SHOULD be receive ports and half SHOULD be transmit ports.  For example, if a SUT has two interface modules, each of which has four ports, two ports on each interface module will be receive ports and two will be transmit ports.  Each receive port MUST be offered the same data rate.  The addresses in the input data streams SHOULD be set so that an IP PDU will be directed to each of the transmit ports in sequence.  That is, all transmit ports will receive an identical distribution of IP PDUs from a particular receive port.

Consider the following 6 port SUT:

               ----------------
   ---------| Rx A       Tx X |---------
   ---------| Rx B       Tx Y |---------
   ---------| Rx C       Tx Z |---------
               ----------------

The addressing of the data streams for each of the inputs SHOULD be:

   stream sent to Rx A:
      IP PDU to Tx X, IP PDU to Tx Y, IP PDU to Tx Z
   stream sent to Rx B:
      IP PDU to Tx X, IP PDU to Tx Y, IP PDU to Tx Z
   stream sent to Rx C:
      IP PDU to Tx X, IP PDU to Tx Y, IP PDU to Tx Z

Note: Each stream contains the same sequence of IP destination addresses; therefore, each transmit port will receive 3 IP PDUs simultaneously.  This procedure ensures that the SUT will have to process multiple IP PDUs addressed to the same transmit port simultaneously.  A sketch of this rotation is given at the end of this section.

The same configuration MAY be used to perform a bi-directional multi-stream test.  In this case all of the ports are considered both receive and transmit ports.  Each data stream MUST consist of IP PDUs whose addresses correspond to the ATM addresses of all of the other ports.
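The destination rotation described above, including its bi-directional extension, could be generated along the following lines; the port labels are the hypothetical ones from the figure.

   from itertools import cycle, islice

   def stream_for(tx_ports, pdus):
       """Destination sequence offered to one receive port: the transmit
       ports are visited in a fixed order, repeated for pdus PDUs."""
       return list(islice(cycle(tx_ports), pdus))

   RX = ["Rx A", "Rx B", "Rx C"]
   TX = ["Tx X", "Tx Y", "Tx Z"]

   # Uni-directional case: every receive port uses the same rotation, so all
   # transmit ports see an identical distribution from each receive port.
   for rx in RX:
       print(rx, "->", stream_for(TX, 6))

   # Bi-directional multi-stream case: every port is both a receive and a
   # transmit port, and each stream rotates over all of the other ports.
   ALL = RX + TX
   for port in ALL:
       others = [p for p in ALL if p != port]
       print(port, "->", stream_for(others, len(others)))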
2.16. Multiple protocols

This document does not address the issue of testing the effects of a mixed protocol environment other than to suggest that, if such tests are wanted, then PDUs SHOULD be distributed between all of the test protocols.  The distribution MAY approximate the conditions on the network in which the SUT would be used.

2.17. Multiple IP PDU sizes

This document does not address the issue of testing the effects of a mixed IP PDU size environment other than to suggest that, if such tests are required, then the IP PDUs SHOULD be evenly distributed among all of the PDU sizes listed in this document.  The distribution MAY approximate the conditions on the network in which the SUT would be used.

2.18. Testing beyond a single SUT

In the performance testing of a single SUT, the paradigm can be described as applying some input to a SUT and monitoring the output, the results of which can be used to form a basis of characterization of that device under those test conditions.

This model is useful when the test input and output are homogeneous (e.g., 64-byte IP, AAL5 PDUs into the SUT; 64-byte IP, AAL5 PDUs out).

By extending the single SUT test model, reasonable benchmarks regarding multiple SUTs or heterogeneous environments may be collected.  In this extension, the single SUT is replaced by a system of interconnected network SUTs.  This test methodology would support the benchmarking of a variety of device/media/service/protocol combinations.  For example, a configuration for a LAN-to-WAN-to-LAN test might be:

   (1) ATM UNI -> SUT 1 -> BISUP -> SUT 2 -> ATM UNI

Or an extended LAN configuration might be:

   (2) ATM UNI -> SUT 1 -> PNNI Network -> SUT 2 -> ATM UNI

In both examples 1 and 2, end-to-end benchmarks of each system could be empirically ascertained.  Other behavior may be characterized through the use of intermediate devices.  In example 2, the configuration may be used to give an indication of the effect of PNNI routing on IP throughput.

Because multiple SUTs are treated as a single system, there are limitations to this methodology.  For instance, this methodology may yield an aggregate benchmark for a tested system.  That benchmark alone, however, may not necessarily reflect asymmetries in behavior between the SUTs, latencies introduced by other apparatus (e.g., CSUs/DSUs, switches), etc.

Further, care must be taken when comparing benchmarks of different systems to ensure that the features and configuration of the tested systems have the appropriate common denominators to allow comparison.

2.19. Maximum IP PDU rate

The maximum IP PDU rate used when testing LAN connections SHOULD be the listed theoretical maximum rate for the IP PDU size on the media.

The maximum IP PDU rate used when testing WAN connections SHOULD be greater than the listed theoretical maximum rate for the IP PDU size on that speed connection.  The higher rate for WAN tests is to compensate for the fact that some vendors employ various forms of header compression.

A list of maximum IP PDU rates for LAN connections is included in Appendix B.

2.20. Bursty traffic
It is convenient to measure the SUT performance under steady state load; however, this is an unrealistic way to gauge the functioning of a SUT.  Actual network traffic normally consists of bursts of IP PDUs.

Some of the tests described below SHOULD be performed with both constant bit rate and bursty traffic.  The IP PDUs within a burst are transmitted with the minimum legitimate inter-IP PDU gap.

The objective of the test is to determine the minimum interval between bursts that the SUT can process with no IP PDU loss.  Tests SHOULD be run with burst sizes of 10% of Maximum Burst Size (MBS), 20% of MBS, 50% of MBS and 100% of MBS.  Note that the number of IP PDUs in each burst will depend on the PDU size.

2.21. Trial description

A particular test consists of multiple trials.  Each trial returns one piece of information, for example the loss rate at a particular input IP PDU rate.  Each trial consists of five phases:

a) If the SUT is a switch supporting PNNI, send the routing update to the SUT receive port and wait two seconds to be sure that the routing has settled.

b) Send an ATM ARP PDU to determine the ATM address corresponding to the destination IP address.  The formats of the ATM ARP PDU that should be used are shown in the Test Frame Formats document and MUST be in accordance with RFC 2225.

c) Stimulate the SUT with the traffic load.

d) Wait for two seconds for any residual IP PDUs to be received.

e) Wait for at least five seconds for the SUT to restabilize.

2.22. Trial duration

The objective of the tests defined in this document is to accurately characterize the behavior of a particular piece of network equipment under varying traffic loads.  The choice of test duration must be a compromise between this objective and keeping the duration of the benchmarking test suite within reasonable bounds.  The SUT SHOULD be stimulated for at least 60 seconds.  If this time period results in a high variance in the test results, the SUT SHOULD be stimulated for at least 300 seconds.

2.23. Address resolution

The SUT MUST be able to respond to address resolution requests sent by another SUT, an ATM ARP server or the test equipment in accordance with RFC 2225.

2.24. Synchronized Payload Bit Pattern

Some measurements assume that both the transmitter and receiver payload information is synchronized.  Synchronization MUST be achieved by supplying a known bit pattern to both the transmitter and receiver.  This bit pattern MUST be one of the following: PRBS-15, PRBS-23, 0xFF00, or 0xAA55.

2.25. Burst Traffic Descriptors

Some measurements require bursty traffic patterns.
These patterns MUST 614 conform to one of the following traffic descriptors: 616 1) PCR=100% allotted line rate, SCR=50% allotted line rate, and MBS=8192 618 2) PCR=100% allotted line rate, SCR=50% allotted line rate, and MBS=4096 620 3) PCR=90% allotted line rate, SCR=50% allotted line rate, and MBS=8192 622 4) PCR=90% allotted line rate, SCR=50% allotted line rate, and MBS=4096 623 5) PCR=90% allotted line rate, SCR=45% allotted line rate, and MBS=8192 625 6) PCR=90% allotted line rate, SCR=45% allotted line rate, and MBS=4096 627 7) PCR=80% allotted line rate, SCR=40% allotted line rate, and MBS=65536 629 8) PCR=80% allotted line rate, SCR=40% allotted line rate, and MBS=32768 631 The allotted line rate refers to the total available line rate divided 632 by the number of VCCs in use. 634 3. Performance Metrics 636 3.1. Physical Layer- SONET 638 3.1.1. Pointer Movements 640 3.1.1.1. Pointer Movement Propagation. 642 Objective: To determine that the SUT does not propagate pointer movements 643 as defined in "Terminology for ATM Benchmarking". 645 Procedure: 647 1) Set up the SUT and test device using the uni-directional 648 configuration. 650 2) Send a specific number of IP PDUs at a specific rate through the SUT. 651 Since this test is not a throughput test, the rate should not be greater 652 than 90% of line rate. The cell payload SHOULD contain valid IP PDUs. 653 The IP PDUs MUST be encapsulated in AAL5. 655 3) Count the IP PDUs that are transmitted by the SUT to verify 656 connectivity and load. If the count on the test device is the same on 657 the SUT, continue the test, else lower the test device traffic rate 658 until the counts are the same. 660 4) Inject one forward payload pointer movement. Verify that the SUT 661 does not change the pointer. 663 5) Inject one forward payload pointer movement every 1 second. Verify 664 that the SUT does not change the pointer. 666 6) Discontinue the payload pointer movement. 668 7) Inject five forward payload pointer movements every 1 second. Verify 669 that the SUT does not change the pointer. 671 8) Discontinue the payload pointer movement. 673 9) Inject one backward payload pointer movement. Verify that the SUT 674 does not change the pointer. 676 10) Inject one backward payload pointer movement every 1 second. Verify 677 that the SUT does not change the pointer. 679 11) Discontinue the payload pointer movement. 681 12) Inject five backward payload pointer movements every 1 second. 682 Verify that the SUT does not change the pointer. 684 13) Discontinue the payload pointer movement. 686 Reporting Format: 688 The results of the pointer movement propagation test SHOULD be reported 689 in a form of a table. The rows SHOULD be labeled single pointer 690 movement, one pointer movement per second, and five pointer movements 691 per second. The columns SHOULD be labeled pointer movement and loss of 692 pointer. The elements of the table SHOULD be either True or False, 693 indicating whether the particular condition was observed for each test. 695 The table MUST also indicate the IP PDU size in octets and traffic rate 696 in IP PDUs per second as generated by the test device. 698 3.1.1.2. Cell Loss due to Pointer Movement. 700 Objective: To determine if the SUT will drop cells due to pointer movements 701 as defined in "Terminology for ATM Benchmarking". 703 Procedure: 705 1) Set up the SUT and test device using the uni-directional 706 configuration. 708 2) Send a specific number of cells at a specific rate through the SUT. 
709 Since this test is not a throughput test, the rate should not be greater 710 than 90% of line rate. The cell payload SHOULD contain valid IP PDUs. 711 The IP PDUs MUST be encapsulated in AAL5. 713 3) Count the cells that are transmitted by the SUT to verify 714 connectivity and load. If the count on the test device is the same on 715 the SUT, continue the test; else lower the test device traffic rate 716 until the counts are the same. 718 4) Inject one forward payload pointer movement. Verify that the SUT 719 does not drop any cells. 721 5) Inject one forward payload pointer movement every 1 second. Verify 722 that the SUT does not drop any cells. 724 6) Discontinue the payload pointer movement. 726 7) Inject five forward payload pointer movements every 1 second. Verify 727 that the SUT does not drop any cells. 729 8) Discontinue the payload pointer movement. 731 9) Inject one backward payload pointer movement. Verify that the SUT 732 does not drop any cells. 734 10) Inject one backward payload pointer movement every 1 second. Verify 735 that the SUT does not drop any cells. 737 11) Discontinue the payload pointer movement. 739 12) Inject five backward payload pointer movements every 1 second. 740 Verify that the SUT does not drop any cells. 742 13) Discontinue the payload pointer movement. 744 Reporting Format: 746 The results of the cell loss due to pointer movement test SHOULD be 747 reported in a form of a table. The rows SHOULD be labeled single pointer 748 movement, one pointer movement per second, and five pointer movements 749 per second. The columns SHOULD be labeled cell loss and number of cells 750 lost. The elements of column 1 SHOULD be either True or False, 751 indicating whether the particular condition was observed for each test. 752 The elements of column 2 SHOULD be non-negative integers. 754 The table MUST also indicate the traffic rate in IP PDUs per second as 755 generated by the test device. 757 3.1.1.3. IP Packet Loss due to Pointer Movement. 759 Objective: To determine if the SUT will drop IP packets due to pointer 760 movements as defined in "Terminology for ATM Benchmarking". 762 Procedure: 764 1) Set up the SUT and test device using the uni-directional 765 configuration. 767 2) Send a specific number of IP packets at a specific rate through the 768 SUT. Since this test is not a throughput test, the rate should not be 769 greater than 90% of line rate. The IP PDUs MUST be encapsulated in 770 AAL5. 772 3) Count the IP packets that are transmitted by the SUT to verify 773 connectivity and load. If the count on the test device is the same on 774 the SUT, continue the test; else lower the test device traffic rate 775 until the counts are the same. 777 4) Inject one forward payload pointer movement. Verify that the SUT 778 does not drop any packets. 780 5) Inject one forward payload pointer movement every 1 second. Verify 781 that the SUT does not drop any packets. 783 6) Discontinue the payload pointer movement. 785 7) Inject five forward payload pointer movements every 1 second. Verify 786 that the SUT does not drop any packets. 788 8) Discontinue the payload pointer movement. 790 9) Inject one backward payload pointer movement. Verify that the SUT 791 does not drop any packets. 793 10) Inject one backward payload pointer movement every 1 second. Verify 794 that the SUT does not drop any packets. 796 11) Discontinue the payload pointer movement. 798 12) Inject five backward payload pointer movements every 1 second. 
799 Verify that the SUT does not drop any packets. 801 13) Discontinue the payload pointer movement. 803 Reporting Format: 805 The results of the IP packet loss due to pointer movement test SHOULD be 806 reported in a form of a table. The rows SHOULD be labeled single pointer 807 movement, one pointer movement per second, and five pointer movements 808 per second. The columns SHOULD be labeled packet loss and number of 809 packets lost. The elements of column 1 SHOULD be either True or False, 810 indicating whether the particular condition was observed for each test. 811 The elements of column 2 SHOULD be non-negative integers. 813 The table MUST also indicate the packet size in octets and traffic rate 814 in packets per second as generated by the test device. 816 3.1.2. Transport Overhead (TOH) Error Count 818 3.1.2.1. TOH Error Propagation. 820 Objective: To determine that the SUT does not propagate TOH errors as 821 defined in "Terminology for ATM Benchmarking". 823 Procedure: 825 1) Set up the SUT and test device using the uni-directional 826 configuration. 828 2) Send a specific number of IP PDUs at a specific rate through the SUT. 829 Since this test is not a throughput test, the rate should not be greater 830 than 90% of line rate. The cell payload SHOULD contain valid IP PDUs. 831 The IP PDUs MUST be encapsulated in AAL5. 833 3) Count the IP PDUs that are transmitted by the SUT to verify 834 connectivity and load. If the count on the test device is the same on 835 the SUT, continue the test, else lower the test device traffic rate 836 until the counts are the same. 838 4) Inject one error in the first bit of the A1 and A2 Frameword. Verify 839 that the SUT does not propagate the error. 841 5) Inject one error in the first bit of the A1 and A2 Frameword every 1 842 second. Verify that the SUT does not propagate the error. 6) 843 Discontinue the Frameword error. 845 7) Inject one error in the first bit of the A1 and A2 Frameword for 4 846 consecutive IP PDUs in every 6 IP PDUs. Verify that the SUT indicates 847 Loss of Frame. 849 8) Discontinue the Frameword error. 851 Reporting Format: 853 The results of the TOH error propagation test SHOULD be reported in a 854 form of a table. The rows SHOULD be labeled single error, one error per 855 second, and four consecutive errors every 6 IP PDUs. The columns SHOULD 856 be labeled error propagated and loss of IP PDU. The elements of the 857 table SHOULD be either True or False, indicating whether the particular 858 condition was observed for each test. 860 The table MUST also indicate the IP PDU size in octets and traffic rate 861 in IP PDUs per second as generated by the test device. 863 3.1.2.2. Cell Loss due to TOH Error. 865 Objective: To determine if the SUT will drop cells due TOH Errors as 866 defined in "Terminology for ATM Benchmarking". 868 Procedure: 870 1) Set up the SUT and test device using the uni-directional 871 configuration. 873 2) Send a specific number of cells at a specific rate through the SUT. 874 Since this test is not a throughput test, the rate should not be greater 875 than 90% of line rate. The cell payload SHOULD contain valid IP PDUs. 876 The IP PDUs MUST be encapsulated in AAL5. 878 3) Count the cells that are transmitted by the SUT to verify 879 connectivity and load. If the count on the test device is the same on 880 the SUT, continue the test; else lower the test device traffic rate 881 until the counts are the same. 883 4) Inject one error in the first bit of the A1 and A2 Frameword. 
Verify 884 that the SUT does not drop any cells. 886 5) Inject one error in the first bit of the A1 and A2 Frameword every 1 887 second. Verify that the SUT does not drop any cells. 889 6) Discontinue the Frameword error. 891 7) Inject one error in the first bit of the A1 and A2 Frameword for 4 892 consecutive IP PDUs in every 6 IP PDUs. Verify that the SUT does drop 893 cells. 895 8) Discontinue the Frameword error. 897 Reporting Format: 899 The results of the Cell Loss due to TOH errors test SHOULD be reported 900 in a form of a table. The rows SHOULD be labeled single error, one error 901 per second, and four consecutive errors every 6 IP PDUs. The columns 902 SHOULD be labeled cell loss and number of cells lost. The elements of 903 column 1 SHOULD be either True or False, indicating whether the 904 particular condition was observed for each test. The elements of column 905 2 SHOULD be non-negative integers. 907 The table MUST also indicate the traffic rate in IP PDUs per second as 908 generated by the test device. 910 3.1.2.3. IP Packet Loss due to TOH Error. 912 Objective: To determine if the SUT will drop IP packets due to TOH errors 913 as defined in "Terminology for ATM Benchmarking". 915 Procedure: 917 1) Set up the SUT and test device using the uni-directional 918 configuration. 920 2) Send a specific number of IP packets at a specific rate through the 921 SUT. Since this test is not a throughput test, the rate should not be 922 greater than 90% of line rate. The IP PDUs MUST be encapsulated in 923 AAL5. 925 3) Count the IP packets that are transmitted by the SUT to verify 926 connectivity and load. If the count on the test device is the same on 927 the SUT, continue the test; else lower the test device traffic rate 928 until the counts are the same. 930 4) Inject one error in the first bit of the A1 and A2 Frameword. Verify 931 that the SUT does not drop any packets. 933 5) Inject one error in the first bit of the A1 and A2 Frameword every 1 934 second. Verify that the SUT does not drop any packets. 936 6) Discontinue the Frameword error. 938 7) Inject one error in the first bit of the A1 and A2 Frameword for 4 939 consecutive IP PDUs in every 6 IP PDUs. Verify that the SUT does drop 940 packets. 942 8) Discontinue the Frameword error. 944 Reporting Format: 946 The results of the IP packet loss due to TOH errors test SHOULD be 947 reported in a form of a table. The rows SHOULD be labeled single error, 948 one error per second, and four consecutive errors every 6 IP PDUs. The 949 columns SHOULD be labeled packet loss and number of packets lost. The 950 elements of column 1 SHOULD be either True or False, indicating whether 951 the particular condition was observed for each test. The elements of 952 column 2 SHOULD be non-negative integers. 954 The table MUST also indicate the packet size in octets and traffic rate 955 in packets per second as generated by the test device. 957 3.1.3. Path Overhead (POH) Error Count 959 3.1.3.1. POH Error Propagation. 961 Objective: To determine that the SUT does not propagate POH errors as 962 defined in "Terminology for ATM Benchmarking". 964 Procedure: 966 1) Set up the SUT and test device using the uni-directional 967 configuration. 969 2) Send a specific number of IP PDUs at a specific rate through the SUT. 970 Since this test is not a throughput test, the rate should not be greater 971 than 90% of line rate. The cell payload SHOULD contain valid IP PDUs. 972 The IP PDUs MUST be encapsulated in AAL5. 
974 3) Count the IP PDUs that are transmitted by the SUT to verify 975 connectivity and load. If the count on the test device is the same on 976 the SUT, continue the test, else lower the test device traffic rate 977 until the counts are the same. 979 4) Inject one error in the B3 (Path BIP8) byte. Verify that the SUT 980 does not propagate the error. 982 5) Inject one error in the B3 byte every 1 second. Verify that the SUT 983 does not propagate the error. 985 6) Discontinue the POH error. 987 Reporting Format: 989 The results of the POH error propagation test SHOULD be reported in a 990 form of a table. The rows SHOULD be labeled single error and one error 991 per second. The columns SHOULD be labeled error propagated and loss of 992 IP PDU. The elements of the table SHOULD be either True or False, 993 indicating whether the particular condition was observed for each test. 995 The table MUST also indicate the IP PDU size in octets and traffic rate 996 in IP PDUs per second as generated by the test device. 998 3.1.3.2. Cell Loss due to POH Error. 1000 Objective: To determine if the SUT will drop cells due POH Errors as 1001 defined in "Terminology for ATM Benchmarking". 1003 Procedure: 1005 1) Set up the SUT and test device using the uni-directional 1006 configuration. 1008 2) Send a specific number of cells at a specific rate through the SUT. 1009 Since this test is not a throughput test, the rate should not be greater 1010 than 90% of line rate. The cell payload SHOULD contain valid IP PDUs. 1011 The IP PDUs MUST be encapsulated in AAL5. 1013 3) Count the cells that are transmitted by the SUT to verify 1014 connectivity and load. If the count on the test device is the same on 1015 the SUT, continue the test; else lower the test device traffic rate 1016 until the counts are the same. 1018 4) Inject one error in the B3 (Path BIP8) byte. Verify that the SUT 1019 does not drop any cells. 1021 5) Inject one error in the B3 byte every 1 second. Verify that the SUT 1022 does not drop any cells. 1024 6) Discontinue the POH error. 1026 Reporting Format: 1028 The results of the Cell Loss due to POH errors test SHOULD be reported 1029 in a form of a table. The rows SHOULD be labeled single error and one 1030 error per second. The columns SHOULD be labeled cell loss and number of 1031 cells lost. The elements of column 1 SHOULD be either True or False, 1032 indicating whether the particular condition was observed for each test. 1033 The elements of column 2 SHOULD be non-negative integers. 1035 The table MUST also indicate the traffic rate in IP PDUs per second as 1036 generated by the test device. 1038 3.1.3.3. IP Packet Loss due to POH Error. 1040 Objective: To determine if the SUT will drop IP packets due to POH errors 1041 as defined in "Terminology for ATM Benchmarking". 1043 Procedure: 1045 1) Set up the SUT and test device using the uni-directional 1046 configuration. 1048 2) Send a specific number of IP packets at a specific rate through the 1049 SUT. Since this test is not a throughput test, the rate should not be 1050 greater than 90% of line rate. The IP PDUs MUST be encapsulated in 1051 AAL5. 1053 3) Count the IP packets that are transmitted by the SUT to verify 1054 connectivity and load. If the count on the test device is the same on 1055 the SUT, continue the test; else lower the test device traffic rate 1056 until the counts are the same. 1058 4) Inject one error in the B3 (Path BIP8) byte. Verify that the SUT 1059 does not drop any packets. 
3.1.3.3. IP Packet Loss due to POH Error.

Objective: To determine if the SUT will drop IP packets due to POH errors as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the uni-directional configuration.

2) Send a specific number of IP packets at a specific rate through the SUT. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

3) Count the IP packets that are transmitted by the SUT to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

4) Inject one error in the B3 (Path BIP-8) byte. Verify that the SUT does not drop any packets.

5) Inject one error in the B3 byte every second. Verify that the SUT does not drop any packets.

6) Discontinue the POH error.

Reporting Format:

The results of the IP packet loss due to POH errors test SHOULD be reported in the form of a table. The rows SHOULD be labeled single error and one error per second. The columns SHOULD be labeled packet loss and number of packets lost. The elements of column 1 SHOULD be either True or False, indicating whether the particular condition was observed for each test. The elements of column 2 SHOULD be non-negative integers.

The table MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device.

3.2. ATM Layer

3.2.1. Two-Point Cell Delay Variation (CDV)

3.2.1.1. Test Setup

The cell delay measurements assume that both the transmitter and receiver timestamp information is synchronized. Synchronization SHOULD be achieved by supplying a common clock signal (minimum of 100 MHz or 10 ns resolution) to both the transmitter and receiver. The maximum timestamp values MUST be recorded to ensure synchronization in the case of counter rollover.

3.2.1.2. Two-point CDV/One VCC

Objective: To determine the SUT variation in cell transfer delay with one VCC as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with one VCC. The VCC SHOULD contain one VPI/VCI. The VPI/VCI MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets containing time stamps at a specific rate through the SUT via the defined test VCC. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the packet timestamps at the transmitter and receiver ends of the test device (a non-normative sketch of deriving CDV from these timestamps follows this section).

Reporting Format:

The results of the Two-point CDV/One VCC test SHOULD be reported in the form of text, graph, and histogram.

The text results SHOULD display the numerical values of the CDV. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, maximum and minimum CDV during the test in us, and peak-to-peak CDV in us.

The graph results SHOULD display the cell delay values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the cell delay in us. The integration time per point MUST be indicated.

The histogram results SHOULD display the peak-to-peak cell delay. The x-coordinate SHOULD be the cell delay in us with at least 256 bins. The y-coordinate SHOULD be the number of cells observed in each bin.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST also be indicated.
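As a non-normative illustration of step 5 and of the figures called for above, the sketch below converts recorded transmit and receive timestamps into per-cell transfer delays and a peak-to-peak two-point CDV. It assumes the timestamps are integer ticks of the common clock of the test setup (10 ns per tick for a 100 MHz clock) and that max_timestamp is the recorded value at which the counter rolls over; the function names are ours.

   def transfer_delays_us(tx_ticks, rx_ticks, max_timestamp, ns_per_tick=10):
       # Per-cell transfer delay in microseconds, tolerant of one rollover
       # of the timestamp counter between transmit and receive.
       delays = []
       for tx, rx in zip(tx_ticks, rx_ticks):
           ticks = (rx - tx) % (max_timestamp + 1)
           delays.append(ticks * ns_per_tick / 1000.0)
       return delays

   def peak_to_peak_cdv_us(delays_us):
       # Two-point CDV reported as the maximum minus the minimum delay.
       return max(delays_us) - min(delays_us)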
3.2.1.3. Two-point CDV/Twelve VCCs

Objective: To determine the SUT variation in cell transfer delay with twelve VCCs as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with twelve VCCs, using 1 VPI and 12 VCIs. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets containing time stamps at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the packet timestamps at the transmitter and receiver ends of the test device for all VCCs.

Reporting Format:

The results of the Two-point CDV/Twelve VCCs test SHOULD be reported in the form of text, graph, and histograms.

The text results SHOULD display the numerical values of the CDV. The values given SHOULD include: time period of the test in s, test VPI/VCI values, total number of cells transmitted and received on each VCC during the test in positive integers, maximum and minimum CDV on each VCC during the test in us, and peak-to-peak CDV on each VCC in us.

The graph results SHOULD display the cell delay values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the cell delay for each VCC in us. There SHOULD be 12 curves on the graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The histograms SHOULD display the peak-to-peak cell delay. There will be one histogram for each VCC. The x-coordinate SHOULD be the cell delay in us with at least 256 bins. The y-coordinate SHOULD be the number of cells observed in each bin.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST also be indicated.
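The per-VCC histograms described above can be produced by any tool that supports at least 256 equal-width bins. The following non-normative sketch shows one straightforward binning of the observed delays (one call per VCC); the function name is ours.

   def delay_histogram(delays_us, bins=256):
       # Bin the observed delays into 'bins' equal-width bins and return
       # (bin_edges_us, counts); the y-axis is the count in each bin.
       lo, hi = min(delays_us), max(delays_us)
       width = (hi - lo) / bins or 1e-9      # guard against zero width
       counts = [0] * bins
       for d in delays_us:
           idx = min(int((d - lo) / width), bins - 1)
           counts[idx] += 1
       edges = [lo + i * width for i in range(bins + 1)]
       return edges, counts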
3.2.1.4. Two-point CDV/Maximum VCCs

Objective: To determine the SUT variation in cell transfer delay with the maximum number of VCCs supported on the SUT as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with the maximum number of VCCs supported on the SUT. For example, if the maximum number of VCCs supported on the SUT is 1024, define 256 VPIs with 4 VCIs per VPI. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets containing time stamps at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the packet timestamps at the transmitter and receiver ends of the test device for all VCCs.

Reporting Format:

The results of the Two-point CDV/Maximum VCCs test SHOULD be reported in the form of text, graphs, and histograms.

The text results SHOULD display the numerical values of the CDV. The values given SHOULD include: time period of the test in s, test VPI/VCI values, total number of cells transmitted and received on each VCC during the test in positive integers, maximum and minimum CDV on each VCC during the test in us, and peak-to-peak CDV on each VCC in us.

The graph results SHOULD display the cell delay values. There will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on each graph. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the cell delay for each VCC in us. There SHOULD be no more than 10 curves on each graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The histograms SHOULD display the peak-to-peak cell delay. There will be one histogram for each VCC. The x-coordinate SHOULD be the cell delay in us with at least 256 bins. The y-coordinate SHOULD be the number of cells observed in each bin.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST also be indicated.

3.2.2. Cell Error Ratio (CER)

3.2.2.1. Test Setup

The cell error ratio measurements assume that both the transmitter and receiver payload information is synchronized. Synchronization MUST be achieved by supplying a known bit pattern to both the transmitter and receiver. If this bit pattern is longer than the packet size, the receiver MUST synchronize with the transmitter before tests can be run.

3.2.2.2. Steady Load/One VCC

Objective: To determine the SUT ratio of errored cells on one VCC in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with one VCC. The VCC SHOULD contain one VPI/VCI. The VPI/VCI MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCC. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT to verify connectivity and load.
If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of bit errors at the receiver end of the test device.

Reporting Format:

The results of the Steady Load/One VCC test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CER. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CER for the entire test.

The graph results SHOULD display the cell error ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CER. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated.

3.2.2.3. Steady Load/Twelve VCCs

Objective: To determine the SUT ratio of errored cells on twelve VCCs in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with twelve VCCs, using 1 VPI and 12 VCIs. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of bit errors at the receiver end of the test device for all VCCs.

Reporting Format:

The results of the Steady Load/Twelve VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CER. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CER for the entire test.

The graph results SHOULD display the cell error ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CER for each VCC. There SHOULD be 12 curves on the graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated.
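The CER figures reported above are the ratio of errored cells to the total cells sent on the VCC (see the objectives of these tests). The following non-normative sketch computes that ratio; per-point graph values would use the counts within each integration interval, and the function name is ours.

   def cell_error_ratio(errored_cells, cells_sent):
       # CER for one VCC over the reporting interval: errored cells
       # relative to the total cells sent, per this document's definition.
       if cells_sent <= 0:
           raise ValueError("no cells sent on this VCC")
       return errored_cells / cells_sent

   # Example: 3 errored cells out of 1,200,000 sent gives a CER of 2.5e-6.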
3.2.2.4. Steady Load/Maximum VCCs

Objective: To determine the SUT ratio of errored cells with the maximum number of VCCs supported on the SUT in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with the maximum number of VCCs supported on the SUT. For example, if the maximum number of VCCs supported on the SUT is 1024, define 256 VPIs with 4 VCIs per VPI. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of bit errors at the receiver end of the test device for all VCCs.

Reporting Format:

The results of the Steady Load/Maximum VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CER. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CER for the entire test.

The graph results SHOULD display the cell error ratio values. There will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on each graph. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CER for each VCC. There SHOULD be no more than 10 curves on each graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated.
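The bursty load tests that follow configure a PCR, SCR, and MBS for each VCC. As a non-normative illustration, one simple on/off schedule that matches such a descriptor sends bursts of MBS cells back-to-back at the PCR, separated by idle gaps long enough that the long-term average rate equals the SCR; the function name and example numbers below are ours, and conformance testing of the descriptor (e.g. GCRA policing) is outside the scope of this sketch.

   def onoff_schedule(pcr_cps, scr_cps, mbs_cells):
       # Return (burst_seconds, idle_seconds) for one burst/idle cycle of
       # an on/off source: MBS cells at the PCR, averaging the SCR.
       if not (0 < scr_cps <= pcr_cps):
           raise ValueError("require 0 < SCR <= PCR")
       burst = mbs_cells / pcr_cps
       cycle = mbs_cells / scr_cps
       return burst, cycle - burst

   # Example: PCR = 100000 cells/s, SCR = 25000 cells/s, MBS = 200 cells
   # gives 2 ms bursts separated by 6 ms of idle time.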
3.2.2.5. Bursty Load/One VCC

Objective: To determine the SUT ratio of errored cells on one VCC in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with one VCC. The VCC SHOULD contain one VPI/VCI. The VPI/VCI MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]). The PCR, SCR, and MBS must be configured using one of the specified traffic descriptors.

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCC. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of bit errors at the receiver end of the test device.

Reporting Format:

The results of the Bursty Load/One VCC test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CER. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CER for the entire test.

The graph results SHOULD display the cell error ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CER. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated. The PCR, SCR, and MBS MUST be indicated.

3.2.2.6. Bursty Load/Twelve VCCs

Objective: To determine the SUT ratio of errored cells on twelve VCCs in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with twelve VCCs, using 1 VPI and 12 VCIs. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]). The PCR, SCR, and MBS must be configured using one of the specified traffic descriptors.

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The PCR, SCR, and MBS must be indicated. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of bit errors at the receiver end of the test device for all VCCs.

Reporting Format:

The results of the Bursty Load/Twelve VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CER. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CER for the entire test.

The graph results SHOULD display the cell error ratio values.
The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CER for each VCC. There SHOULD be 12 curves on the graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated. The PCR, SCR, and MBS MUST be indicated.

3.2.2.7. Bursty Load/Maximum VCCs

Objective: To determine the SUT ratio of errored cells with the maximum number of VCCs supported on the SUT in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with the maximum number of VCCs supported on the SUT. For example, if the maximum number of VCCs supported on the SUT is 1024, define 256 VPIs with 4 VCIs per VPI. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]). The PCR, SCR, and MBS must be configured using one of the specified traffic descriptors.

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of bit errors at the receiver end of the test device for all VCCs.

Reporting Format:

The results of the Bursty Load/Maximum VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CER. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CER for the entire test.

The graph results SHOULD display the cell error ratio values. There will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on each graph. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CER for each VCC. There SHOULD be no more than 10 curves on each graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated. The PCR, SCR, and MBS MUST be indicated.

3.2.3. Cell Loss Ratio (CLR)
3.2.3.1. Steady Load/One VCC

Objective: To determine the SUT ratio of lost cells on one VCC in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with one VCC. The VCC SHOULD contain one VPI/VCI. The VPI/VCI MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets at a specific rate through the SUT via the defined test VCC. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of cells transmitted and received on the test device (a non-normative CLR computation sketch follows this section).

Reporting Format:

The results of the Steady Load/One VCC test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CLR. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CLR for the entire test.

The graph results SHOULD display the cell loss ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CLR. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated.
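The CLR figures reported above are the ratio of lost cells to the total cells sent on the VCC, computed from the counts recorded in step 5. A non-normative sketch (function name ours):

   def cell_loss_ratio(cells_sent, cells_received):
       # CLR for one VCC over the reporting interval: lost cells relative
       # to the total cells sent, per this document's definition.
       if cells_sent <= 0:
           raise ValueError("no cells sent on this VCC")
       return (cells_sent - cells_received) / cells_sent

   # Example: 1,000,000 cells sent and 999,990 received gives a CLR of 1e-5.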
3.2.3.2. Steady Load/Twelve VCCs

Objective: To determine the SUT ratio of lost cells on twelve VCCs in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with twelve VCCs, using 1 VPI and 12 VCIs. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of cells transmitted and received per VCC on the test device.

Reporting Format:

The results of the Steady Load/Twelve VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CLR. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CLR for the entire test.

The graph results SHOULD display the cell loss ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CLR for each VCC. There SHOULD be 12 curves on the graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated.

3.2.3.3. Steady Load/Maximum VCCs

Objective: To determine the SUT ratio of lost cells with the maximum number of VCCs supported on the SUT in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with the maximum number of VCCs supported on the SUT. For example, if the maximum number of VCCs supported on the SUT is 1024, define 256 VPIs with 4 VCIs per VPI. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]).

3) Send a specific number of IP packets at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of cells transmitted and received per VCC on the test device.

Reporting Format:

The results of the Steady Load/Maximum VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CLR. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CLR for the entire test.

The graph results SHOULD display the cell loss ratio values. There will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on each graph. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CLR for each VCC. There SHOULD be no more than 10 curves on each graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated.
3.2.3.4. Bursty Load/One VCC

Objective: To determine the SUT ratio of lost cells on one VCC in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with one VCC. The VCC SHOULD contain one VPI/VCI. The VPI/VCI MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]). The PCR, SCR, and MBS must be configured using one of the specified traffic descriptors.

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCC. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of cells transmitted and received on the test device.

Reporting Format:

The results of the Bursty Load/One VCC test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CLR. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CLR for the entire test.

The graph results SHOULD display the cell loss ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CLR. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated. The PCR, SCR, and MBS MUST be indicated.

3.2.3.5. Bursty Load/Twelve VCCs

Objective: To determine the SUT ratio of lost cells on twelve VCCs in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with twelve VCCs, using 1 VPI and 12 VCIs. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]). The PCR, SCR, and MBS must be configured using one of the specified traffic descriptors.

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The PCR, SCR, and MBS must be indicated. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load.
If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of cells transmitted and received per VCC on the test device.

Reporting Format:

The results of the Bursty Load/Twelve VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CLR. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CLR for the entire test.

The graph results SHOULD display the cell loss ratio values. The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CLR for each VCC. There SHOULD be 12 curves on the graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated. The PCR, SCR, and MBS MUST be indicated.

3.2.3.6. Bursty Load/Maximum VCCs

Objective: To determine the SUT ratio of lost cells with the maximum number of VCCs supported on the SUT in a transmission in relation to the total cells sent as defined in "Terminology for ATM Benchmarking".

Procedure:

1) Set up the SUT and test device using the bi-directional configuration.

2) Configure the SUT and test device with the maximum number of VCCs supported on the SUT. For example, if the maximum number of VCCs supported on the SUT is 1024, define 256 VPIs with 4 VCIs per VPI. The VPI/VCIs MUST NOT be one of the reserved signaling channels (e.g. [0,5], [0,16]). The PCR, SCR, and MBS must be configured using one of the specified traffic descriptors.

3) Send a specific number of IP packets containing one of the specified bit patterns at a specific rate through the SUT via the defined test VCCs. All of the VPI/VCI pairs will generate traffic at the same traffic rate. Since this test is not a throughput test, the rate should not be greater than 90% of line rate. The IP PDUs MUST be encapsulated in AAL5.

4) Count the IP packets that are transmitted by the SUT on all VCCs to verify connectivity and load. If the count on the test device is the same as on the SUT, continue the test; else lower the test device traffic rate until the counts are the same.

5) Record the number of cells transmitted and received per VCC on the test device.

Reporting Format:

The results of the Bursty Load/Maximum VCCs test SHOULD be reported in the form of text and graph.

The text results SHOULD display the numerical values of the CLR. The values given SHOULD include: time period of the test in s, test VPI/VCI value, total number of cells transmitted and received on the given VPI/VCI during the test in positive integers, and the CLR for the entire test.

The graph results SHOULD display the cell loss ratio values. There will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on each graph.
The x-coordinate SHOULD be the test run time in either seconds, minutes or days depending on the total length of the test. The x-coordinate time SHOULD be configurable. The y-coordinate SHOULD be the CLR for each VCC. There SHOULD be no more than 10 curves on each graph, one curve indicated and labeled for each VCC. The integration time per point MUST be indicated.

The results MUST also indicate the packet size in octets and traffic rate in packets per second as generated by the test device. The VCC and VPI/VCI values MUST be indicated. The payload bit pattern MUST be indicated. The PCR, SCR, and MBS MUST be indicated.

3.2.4. Cell Misinsertion Rate (CMR)

To be done.

3.2.5. Cell Rate Margin (CRM)

To be done.

3.2.6. CRC Error Ratio

To be done.

3.2.7. Cell Transfer Delay (CTD)

To be done.

3.3. ATM Adaptation Layer (AAL) Type 5 (AAL5)

To be done.

3.3.1. AAL5 Reassembly Errors

To be done.

3.3.2. AAL5 Reassembly Time

To be done.

3.3.3. AAL5 CRC Error Ratio

To be done.

4. Security Considerations.

As this document is solely for the purpose of providing methodology and describes neither a protocol nor an implementation, there are no security considerations associated with this document.

5. Notices

The IETF takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on the IETF's procedures with respect to rights in standards-track and standards-related documentation can be found in BCP-11. Copies of claims of rights made available for publication and any assurances of licenses to be made available, or the result of an attempt made to obtain a general license or permission for the use of such proprietary rights by implementors or users of this specification can be obtained from the IETF Secretariat.

The IETF invites any interested party to bring to its attention any copyrights, patents or patent applications, or other proprietary rights which may cover technology that may be required to practice this standard. Please address the information to the IETF Executive Director.

6. Disclaimer

Copyright (C) The Internet Society (1999). All Rights Reserved.

This document and translations of it may be copied and furnished to others, and derivative works that comment on or otherwise explain it or assist in its implementation may be prepared, copied, published and distributed, in whole or in part, without restriction of any kind, provided that the above copyright notice and this paragraph are included on all such copies and derivative works. However, this document itself may not be modified in any way, such as by removing the copyright notice or references to the Internet Society or other Internet organizations, except as needed for the purpose of developing Internet standards in which case the procedures for copyrights defined in the Internet Standards process must be followed, or as required to translate it into languages other than English.
The limited permissions granted above are perpetual and will not be revoked by the Internet Society or its successors or assigns. This document and the information contained herein is provided on an "AS IS" basis and THE INTERNET SOCIETY AND THE INTERNET ENGINEERING TASK FORCE DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

7. References

[IETF-RFC-2544] IETF RFC 2544, "Benchmarking Methodology for Network Interconnect Devices", March 1999.

[IETF-RFC-2225] IETF RFC 2225, "Classical IP and ARP over ATM", April 1998.

[IETF-TERM-ATM] IETF, "Terminology for ATM Benchmarking", Internet Draft, September 1999.

[AF-ILMI4.0] ATM Forum, Integrated Local Management Interface Version 4.0, af-ilmi-0065.000, September 1996.

[AF-TEST-0022] Introduction to ATM Forum Test Specifications, af-test-0022.00, December 1994.

[AF-TM4.0] ATM Forum, Traffic Management Specification Version 4.0, af-tm-0056.00, April 1996.

[AF-UNI3.1] ATM Forum, User Network Interface Specification Version 3.1, September 1994.

[AF-UNI4.0] ATM Forum, User Network Interface Specification Version 4.0, July 1996.

8. Editor's Addresses

Jeffrey Dunn
Advanced Network Consultants, Inc.
4214 Crest Place, Ellicott City, MD 21043 USA
Phone: +1 (410) 750-1700, E-mail: Jeffrey.Dunn@worldnet.att.net

Cynthia Martin
Advanced Network Consultants, Inc.
4214 Crest Place, Ellicott City, MD 21043 USA
Phone: +1 (410) 750-1700, E-mail: Cynthia.E.Martin@worldnet.att.net