Benchmarking Methodology Working Group                       S. Poretsky
Internet-Draft                                       Reef Point Networks
Expires: May 22, 2008                                         V. Gurbani
                                       Bell Laboratories, Alcatel-Lucent
                                                               C. Davids
                                        Illinois Institute of Technology
                                                       November 19, 2007

          Methodology for Benchmarking SIP Networking Devices
                 draft-poretsky-bmwg-sip-bench-meth-02

Status of this Memo

   By submitting this Internet-Draft, each author represents that any
   applicable patent or other IPR claims of which he or she is aware
   have been or will be disclosed, and any of which he or she becomes
   aware will be disclosed, in accordance with Section 6 of BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF), its areas, and its working groups.  Note that
   other groups may also distribute working documents as Internet-
   Drafts.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.  It is inappropriate to use Internet-Drafts as
   reference material or to cite them other than as "work in progress."

   The list of current Internet-Drafts can be accessed at
   http://www.ietf.org/ietf/1id-abstracts.txt.

   The list of Internet-Draft Shadow Directories can be accessed at
   http://www.ietf.org/shadow.html.

   This Internet-Draft will expire on May 22, 2008.

Copyright Notice

   Copyright (C) The IETF Trust (2007).
Abstract

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the companion
   SIP performance benchmarking terminology document.  The methodology
   and terminology are to be used for benchmarking signaling plane
   performance with varying signaling and media load.  Signaling plane
   performance is measured in terms of both scale and session
   establishment rate.  The SIP Devices to be benchmarked may be a
   single device under test (DUT) or a system under test (SUT).
   Benchmarks can be obtained and compared for different types of
   devices such as a SIP Proxy Server, SBC, P-CSCF, and a Server
   paired with a Firewall/NAT device.

Table of Contents

   1.  Terminology
   2.  Introduction
   3.  Test Setup
     3.1.  Test Topologies
     3.2.  Test Considerations
       3.2.1.  Selection of SIP Transport Protocol
       3.2.2.  Server
       3.2.3.  Associated Media
       3.2.4.  Selection of Associated Media Protocol
       3.2.5.  Number of Associated Media Streams per SIP Session
       3.2.6.  Session Duration
       3.2.7.  Attempted Sessions per Second
       3.2.8.  Stress Testing
     3.3.  Reporting Format
       3.3.1.  Test Setup Report
       3.3.2.  Device Benchmarks
   4.  Test Cases
     4.1.  Maximum Session Attempt Rate
     4.2.  Maximum Session Attempt Rate with Media
     4.3.  Maximum Session Attempt Rate with Loop Detection Enabled
     4.4.  Maximum Session Attempt Rate with Forking
     4.5.  Maximum Session Attempt Rate with Forking and Loop
           Detection
     4.6.  Maximum Session Attempt Rate with TLS Encrypted SIP
     4.7.  Maximum Session Attempt Rate with IPsec Encrypted SIP
     4.8.  Maximum Session Attempt Rate with SIP Flooding
     4.9.  Maximum Registration Rate
     4.10. Maximum IM Rate
     4.11. Maximum Presence Rate
     4.12. Maximum Session Establishment Rate
     4.13. Maximum Session Establishment Rate with Media
   5.  IANA Considerations
   6.  Security Considerations
   7.  Acknowledgments
   8.  References
     8.1.  Normative References
     8.2.  Informational References
   Authors' Addresses
   Intellectual Property and Copyright Statements

1.  Terminology

   In this document, the key words "MUST", "MUST NOT", "REQUIRED",
   "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT
   RECOMMENDED", "MAY", and "OPTIONAL" are to be interpreted as
   described in BCP 14, RFC 2119 [1], and indicate requirement levels
   for compliant implementations.

   Terms specific to SIP performance benchmarking are defined in [5].

   RFC 2119 defines the use of these key words to help make the intent
   of standards track documents as clear as possible.
   While this document uses these key words, this document is not a
   standards track document.  The term Throughput is defined in
   RFC 2544 [3].

2.  Introduction

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the
   Terminology document [5].  The methodology and terminology are to
   be used for benchmarking signaling plane performance with varying
   signaling and media load.  Signaling plane performance is measured
   in terms of both scale and session establishment rate.

   The SIP Devices to be benchmarked may be a single device under test
   (DUT) or a system under test (SUT).  The DUT is a SIP Server, which
   may be any RFC 3261 [6] conforming device.  The SUT can be any
   device or group of devices containing RFC 3261 conforming
   functionality along with Firewall and/or NAT functionality.  This
   enables benchmarks to be obtained and compared for different types
   of devices such as a SIP Proxy Server, SBC, P-CSCF, a Proxy Server
   paired with a Firewall/NAT device, and a P-CSCF paired with a
   Firewall/NAT device.  SIP Associated Media benchmarks can also be
   made when testing SUTs.

   The test cases covered in this methodology document provide
   benchmark metrics for Registration Rate, SIP Session Setup Rate,
   Session Capacity, IM Rate, and Presence Rate.  These can be
   benchmarked with or without Associated Media.  Some cases are also
   included to cover Forking, Loop Detection, Encrypted SIP, and SIP
   Flooding.  The test topologies that can be used are described in
   the Test Setup section.  Topologies are provided for benchmarking
   of a DUT or SUT.  Benchmarking with Associated Media can be
   performed when using a SUT.

   SIP permits a wide range of configuration options that are also
   explained in the Test Setup section.  Benchmark metrics could
   possibly be impacted by Associated Media.
   The selected values for Session Duration and Media Streams per
   Session enable the benchmark metrics to be obtained without
   Associated Media.  Session Setup Rate could possibly be impacted by
   the selected value for Maximum Sessions Attempted.  The benchmark
   for Session Setup Rate is therefore measured with a fixed value for
   Maximum Sessions Attempted.

3.  Test Setup

3.1.  Test Topologies

   Figures 1 through 5 below provide the various topologies that may
   be used to perform the SIP performance benchmarking.  These figures
   show the Device Under Test (DUT) to be a single server, or a System
   Under Test (SUT).  Test topology options that include benchmarking
   with Associated Media require use of a SUT and are shown in
   Figures 4 and 5.

        DUT
     ---------                  ---------
     |       |                  |       |
     |       |                  |       |
     |       |       SIP        |       |
     |Server |<---------------->| Tester|
     |       |                  |       |
     |       |                  |       |
     |       |                  |       |
     ---------                  ---------

     Figure 1.  Basic SIP Test Topology

              SUT
     ------------------------
     ---------    ---------     ---------
     |       |    |       |     |       |
     |       |    |       |     |       |
     |       | SIP|Fire-  | SIP |       |
     | Server|<---------------->| Tester|
     |       |    |Wall   |     |       |
     |       |    |       |     |       |
     |       |    |       |     |       |
     ---------    ---------     ---------

     Figure 2.  SIP Test Topology with Firewall

              SUT
     ------------------------
     ---------    ---------     ---------
     |       |    |       |     |       |
     |       |    |       |     |       |
     |       | SIP| NAT   | SIP |       |
     | Server|<---------------->| Tester|
     |       |    |       |     |       |
     |       |    |       |     |       |
     |       |    |       |     |       |
     ---------    ---------     ---------

     Figure 3.  SIP Test Topology with NAT Device

              SUT
     ------------------------
     ---------    ---------     ---------
     |       |    |       |     |       |
     |       |    |       |     |       |
     |       | SIP|Fire-  | SIP |       |
     | Server|<---------------->| Tester|
     |       |    |Wall   |     |       |
     |       |    |       |Media|       |
     |       | ---|       |-----|       |
     --------- |  ---------     ---------
               |      Media     ^
               -----------------|

     Figure 4.  SIP Test Topology with Media through Firewall

              SUT
     ------------------------
     ---------    ---------     ---------
     |       |    |       |     |       |
     |       |    |       |     |       |
     |       | SIP| NAT   | SIP |       |
     | Server|<---------------->| Tester|
     |       |    |       |     |       |
     |       |    |       |Media|       |
     |       | ---|       |-----|       |
     --------- |  ---------     ---------
               |      Media     ^
               -----------------|

     Figure 5.  SIP Test Topology with Media through NAT Device

3.2.  Test Considerations

3.2.1.  Selection of SIP Transport Protocol

   Discussion:
      Test cases may be performed with any transport protocol
      supported by SIP.  This includes, but is not limited to, SIP
      over TCP, SIP over UDP, and SIP over TLS.  The SIP transport
      protocol used must be reported with benchmarking results.

3.2.2.  Server

   Discussion:
      The Server is a SIP-speaking device that complies with RFC 3261.
      The purpose of this document is to benchmark SIP performance,
      not conformance.  Conformance to RFC 3261 [6] is assumed for all
      tests.  The Server may be the DUT or a component of a SUT that
      includes Firewall and/or NAT functionality.  The components of
      the SUT may be a single physical device or separate devices.

3.2.3.  Associated Media

   Discussion:
      Some tests may require associated media to be present for each
      SIP session.  The Server is not involved in the forwarding of
      media.  Associated Media can be benchmarked only with a SUT in
      which the media traverses a Firewall, NAT, or Firewall/NAT
      device.  The test topologies to be used when benchmarking SUT
      performance for Associated Media are shown in Figures 4 and 5.

3.2.4.  Selection of Associated Media Protocol

   Discussion:
      The test cases specified in this document provide SIP
      performance independent of the protocol used for the media
      stream.  Any media protocol supported by SIP may be used.  This
      includes, but is not limited to, RTP, RTSP, and SRTP.
      The protocol used for Associated Media must be reported with
      benchmarking results.

3.2.5.  Number of Associated Media Streams per SIP Session

   Discussion:
      Benchmarking results may vary with the number of media streams
      per SIP session.  When benchmarking a SUT for voice, a single
      media stream is used.  When benchmarking a SUT for voice and
      video, two media streams are used.  The number of Associated
      Media Streams must be reported with benchmarking results.

3.2.6.  Session Duration

   Discussion:
      SUT performance benchmarks may vary with the duration of SIP
      sessions.  Session Duration must be reported with benchmarking
      results.  A Session Duration of zero seconds indicates
      transmission of a BYE immediately following successful SIP
      session establishment, as indicated by receipt of a 200 OK.  An
      infinite Session Duration indicates that a BYE is never
      transmitted.

3.2.7.  Attempted Sessions per Second

   Discussion:
      DUT and SUT performance benchmarks may vary with the rate of
      attempted sessions offered by the Tester.  Attempted Sessions
      per Second must be reported with benchmarking results.

3.2.8.  Stress Testing

   Discussion:
      The purpose of this document is to benchmark SIP performance,
      not system stability under stressful conditions such as a high
      rate of Attempted Sessions per Second.

3.3.  Reporting Format

3.3.1.  Test Setup Report

   SIP Transport Protocol = _____________________________

   Session Duration = ___________________________________

   Maximum Sessions Attempted = _________________________

   Media Streams per Session = __________________________

   Media Protocol = _____________________________________

3.3.2.  Device Benchmarks

   Failed Session Attempts = ____________________________

   Session Capacity = ___________________________________

   Maximum Session Establishment Rate = _________________

   Maximum Retransmits = ________________________________

   Mean Session Setup Delay = ___________________________

   Mean Session Disconnect Delay = ______________________

4.  Test Cases

4.1.  Maximum Session Attempt Rate

   Objective:
      To benchmark the maximum session attempt rate of the DUT/SUT
      with zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2 or 3.
      2.  Configure the Tester for SIP over UDP with an Attempted
          Session Rate = 100 SPS, Session Duration = 0 sec, Maximum
          Sessions Attempted = 100,000, and media streams per
          session = 0.
      3.  Start the Tester to initiate SIP session establishment with
          the DUT.
      4.  Measure Failed Session Attempts and Total Sessions
          Established at the Tester.
      5.  If a Failed Session Attempt is recorded, reduce the
          Attempted Session Rate configured on the Tester by 50%.
      6.  If no Failed Session Attempt is recorded, increase the
          Attempted Session Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Maximum Session
          Establishment Rate is obtained.

   Expected Results:

4.2.  Maximum Session Attempt Rate with Media

   Objective:
      To benchmark the maximum session establishment rate of the SUT
      with zero failures when Associated Media is included in the
      benchmark test.

   Procedure:
      1.  Configure the SUT in the test topology shown in Figure 4
          or 5.
      2.  Configure the Tester for SIP over UDP with an Attempted
          Session Rate = 100 SPS, Session Duration = 30 sec, Maximum
          Sessions Attempted = 100,000, and media streams per
          session = 1.
          The rate of offered load for each media stream SHOULD be

             Offered Load per Media Stream =
                Throughput / Maximum Sessions Attempted        (eq. 1)

          where Throughput is defined in [3].
      3.  Start the Tester to initiate SIP session establishment with
          the SUT and transmit media through the SUT to a destination
          other than the server.
      4.  At the Tester, measure Failed Session Attempts, Total
          Sessions Established, and Packet Loss [3] of the media.
      5.  If a Failed Session Attempt or Packet Loss is recorded,
          reduce the Attempted Session Rate configured on the Tester
          by 50%.
      6.  If no Failed Session Attempt or Packet Loss is recorded,
          increase the Attempted Session Rate configured on the Tester
          by 50%.
      7.  Repeat steps 3 through 6 until the Session Setup Rate is
          obtained.
      8.  Repeat steps 1 through 7 for multimedia, in which media
          streams per session = 2.

   Expected Results:
      Maximum Session Establishment Rate results obtained with
      Associated Media, with any number of media streams per SIP
      session, will be identical to the Session Setup Rate results
      obtained without media.

4.3.  Maximum Session Attempt Rate with Loop Detection Enabled

   Objective:
      To benchmark the maximum session attempt rate of the DUT/SUT
      with zero failures when the Loop Detection option is enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2 or 3.
      2.  Configure the Tester for SIP over UDP with an Attempted
          Session Rate = 100 SPS, Session Duration = 0 sec, Maximum
          Sessions Attempted = 100,000, and media streams per
          session = 0.
      3.  Turn on the Loop Detection option in the DUT or SUT.
      4.  Start the Tester to initiate SIP session establishment with
          the DUT.
      5.  Measure Failed Session Attempts and Total Sessions
          Established at the Tester.
      6.  If a Failed Session Attempt is recorded, reduce the
          Attempted Session Rate configured on the Tester by 50%.
      7.  If no Failed Session Attempt is recorded, increase the
          Attempted Session Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Maximum Session
          Establishment Rate is obtained.

   Expected Results:

4.4.  Maximum Session Attempt Rate with Forking

   Objective:
      To benchmark the maximum session attempt rate of the DUT/SUT
      with zero failures when the Forking option is enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2 or 3.
      2.  Configure the Tester for SIP over UDP with an Attempted
          Session Rate = 100 SPS, Session Duration = 0 sec, Maximum
          Sessions Attempted = 100,000, and media streams per
          session = 0.
      3.  Turn on the Forking option in the DUT or SUT.
      4.  Start the Tester to initiate SIP session establishment with
          the DUT.
      5.  Measure Failed Session Attempts and Total Sessions
          Established at the Tester.
      6.  If a Failed Session Attempt is recorded, reduce the
          Attempted Session Rate configured on the Tester by 50%.
      7.  If no Failed Session Attempt is recorded, increase the
          Attempted Session Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Maximum Session
          Establishment Rate is obtained.

   Expected Results:

4.5.  Maximum Session Attempt Rate with Forking and Loop Detection

   Objective:
      To benchmark the maximum session attempt rate of the DUT/SUT
      with zero failures when both Forking and Loop Detection are
      enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2 or 3.
      2.  Configure the Tester for SIP over UDP with an Attempted
          Session Rate = 100 SPS, Session Duration = 0 sec, Maximum
          Sessions Attempted = 100,000, and media streams per
          session = 0.
      3.  Turn on both the Forking and the Loop Detection options in
          the DUT or SUT.
      4.  Start the Tester to initiate SIP session establishment with
          the DUT.
      5.  Measure Failed Session Attempts and Total Sessions
          Established at the Tester.
      6.  If a Failed Session Attempt is recorded, reduce the
          Attempted Session Rate configured on the Tester by 50%.
      7.  If no Failed Session Attempt is recorded, increase the
          Attempted Session Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Maximum Session
          Establishment Rate is obtained.

   Expected Results:

4.6.  Maximum Session Attempt Rate with TLS Encrypted SIP

   Objective:
      To benchmark the maximum session attempt rate of the DUT/SUT
      with zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2 or 3.
      2.  Configure the Tester for SIP over TCP with TLS enabled, an
          Attempted Session Rate = 100 SPS, Session Duration = 0 sec,
          Maximum Sessions Attempted = 100,000, and media streams per
          session = 0.
      3.  Start the Tester to initiate SIP session establishment with
          the DUT.
      4.  Measure Failed Session Attempts and Total Sessions
          Established at the Tester.
      5.  If a Failed Session Attempt is recorded, reduce the
          Attempted Session Rate configured on the Tester by 50%.
      6.  If no Failed Session Attempt is recorded, increase the
          Attempted Session Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Maximum Session
          Establishment Rate is obtained.

   Expected Results:

4.7.  Maximum Session Attempt Rate with IPsec Encrypted SIP

   Objective:
      To benchmark the maximum session attempt rate of the DUT/SUT
      with zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2 or 3.
      2.  Configure the Tester for SIP over TCP with IPsec enabled, an
          Attempted Session Rate = 100 SPS, Session Duration = 0 sec,
          Maximum Sessions Attempted = 100,000, and media streams per
          session = 0.
      3.  Start the Tester to initiate SIP session establishment with
          the DUT.
      4.  Measure Failed Session Attempts and Total Sessions
          Established at the Tester.
      5.  If a Failed Session Attempt is recorded, reduce the
          Attempted Session Rate configured on the Tester by 50%.
      6.  If no Failed Session Attempt is recorded, increase the
          Attempted Session Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Maximum Session
          Establishment Rate is obtained.

   Expected Results:

4.8.  Maximum Session Attempt Rate with SIP Flooding

   Objective:
      To benchmark the maximum session attempt rate of the SUT with
      zero failures while SIP Flooding is occurring.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1, or
          the SUT as shown in Figure 2.
      2.  Configure the Tester for SIP over UDP with an Attempted
          Session Rate = 100 SPS, Session Duration = 0 sec, Maximum
          Sessions Attempted = 100,000, Associated Media Streams per
          session = 0, and a SIP INVITE Message Flood = 500 per
          second.
      3.  Start the Tester to initiate SIP session establishment with
          the SUT and a SIP Flood targeted at the Server.
      4.  At the Tester, measure Failed Session Attempts, Total
          Sessions Established, and Packet Loss [3].
      5.  If a Failed Session Attempt or Packet Loss is recorded,
          reduce the Attempted Session Rate configured on the Tester
          by 50%.
      6.  If no Failed Session Attempt or Packet Loss is recorded,
          increase the Attempted Session Rate configured on the Tester
          by 50%.
      7.  Repeat steps 3 through 6 until the Session Setup Rate is
          obtained.
      8.  Repeat steps 1 through 7 with a SIP INVITE Message Flood =
          1000 per second.
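The rate-search procedure shared by the test cases above (halve the attempted session rate after a trial with failures, raise it by 50% after a clean trial, repeat until the zero-failure rate is found) can be sketched as follows. This is an illustrative sketch only, not part of the methodology: `trial_passes(rate)` is a hypothetical Tester hook that runs one trial at the given rate and reports whether it completed with zero Failed Session Attempts, and the bisection refinement once the answer is bracketed is an assumption added so the loop terminates.

```python
# Sketch of the rate-search loop used by the test cases in this section.
# trial_passes(rate) is a hypothetical Tester interface: run one trial at
# `rate` attempted sessions per second and return True when the trial
# completes with zero Failed Session Attempts.  Bisecting once both a
# passing and a failing rate are known is an added assumption so the
# +/-50% adjustment converges; the draft itself does not specify it.

def find_max_rate(trial_passes, start_rate=100.0, tolerance=1.0):
    """Return the highest attempted-session rate (SPS) with zero failures."""
    rate = start_rate
    low = 0.0        # highest rate observed with zero failures
    high = None      # lowest rate observed with failures (None until one fails)
    while high is None or (high - low) > tolerance:
        if trial_passes(rate):
            low = max(low, rate)
            # No failures: increase the attempted session rate by 50%,
            # or bisect once a failing rate brackets the answer.
            rate = rate * 1.5 if high is None else (low + high) / 2
        else:
            high = rate if high is None else min(high, rate)
            # Failures: reduce the attempted session rate by 50%,
            # or bisect once a passing rate brackets the answer.
            rate = rate * 0.5 if low == 0.0 else (low + high) / 2
    return low
```

The `tolerance` parameter stands in for the tester's rate-granularity; a real campaign would stop once successive trial rates differ by less than the Tester can configure.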
534 Expected Results: Session Setup Rate results obtained with SIP 535 Flooding may be degraded. 537 4.9. Maximum Registration Rate 538 Objective: 539 To benchmark the maximum registration rate of the SUT with zero 540 failures. 541 Procedure: 542 1. Configure the DUT in the test topology shown in Figure 1 or 543 SUT as shown in Figures 2 or 3. 544 2. Configure Tester for SIP UDP with an Attempted Registration 545 Rate = 100 SPS, Maximum Registrations Attempted = 100,000. 546 3. At the Tester measure Failed Registration Attempts, Total 547 Registrations and Packet Loss. 548 4. If a Failed Registration Attempt or Packet Loss is recorded 549 then reduce the Attempted Registration Rate configured on the 550 Tester by 50%. 551 5. If no Failed Registration or Packet Loss is recorded then 552 increase the Attempted Registration Rate configured on the 553 Tester by 50%. 554 6. Repeat steps 3 through 6 until the Session Setup Rate is 555 obtained. 557 Expected Results: 559 4.10. Maximum IM Rate 560 Objective: 561 To benchmark the maximum IM rate of the SUT with zero failures. 562 Procedure: 563 1. Configure the DUT in the test topology shown in Figure 1 or 564 SUT as shown in Figures 2 or 3. 565 2. Configure Tester for SIP UDP with an Attempted IM Rate = 100 566 SPS, Maximum IM Attempted = 100,000. 567 3. At the Tester measure Failed IM Attempts, Total IM and Packet 568 Loss. 569 4. If a Failed IM Attempt or Packet Loss is recorded then reduce 570 the Attempted IM Rate configured on the Tester by 50%. 571 5. If no Failed IM or Packet Loss is recorded then increase the 572 Attempted IM Rate configured on the Tester by 50%. 573 6. Repeat steps 3 through 6 until the Session Setup Rate is 574 obtained. 575 Expected Results: 577 4.11. Maximum Presence Rate 578 Objective: 579 To benchmark the Maximum Presence Rate of the SUT with zero 580 failures. 581 Procedure: 582 1. Configure the DUT in the test topology shown in Figure 1 or 583 SUT as shown in Figures 2 or 3. 584 2. 
Configure Tester for SIP UDP with an Attempted Presence Rate = 585 100 SPS, Maximum Registrations Attempted = 100,000. 586 3. At the Tester measure Failed Presence Attempts, Total Presence 587 Attempts and Packet Loss. 588 4. If a Failed Presence Attempt or Packet Loss is recorded then 589 reduce the Attempted Presence Rate configured on the Tester by 590 50%. 591 5. If no Failed Presence Attempt or Packet Loss is recorded then 592 increase the Attempted Registration Rate configured on the 593 Tester by 50%. 594 6. Repeat steps 3 through 6 until the Session Setup Rate is 595 obtained. 596 Expected Results: 598 4.12. Maximum Session Establishment Rate 599 Objective: 600 To benchmark the Session Capacity of the SUT with Associated 601 Media. 603 Procedure: 604 1. Configure the DUT in the test topology shown in Figure 1 or 605 SUT as shown in Figures 2 or 3. 606 2. Configure Tester for SIP UDP with an Attempted Session Rate = 607 Zero-Failure Session Setup Rate, Session Duration = 0 sec, 608 Maximum Sessions Attempted = 10,000 and media streams per 609 session = 0. 610 3. Start Tester to initiate SIP Session establishment with the 611 DUT. 612 4. Measure Failed Session Attempts, Total Sessions Established, 613 and Packet Loss [3] at the Tester. 614 5. If a Failed Session Attempt or Packet Loss is recorded then 615 reduce the Maximum Sessions Attempted configured on the Tester 616 by 5,000. 617 6. If no Failed Session Attempt or Packet Loss is recorded then 618 increase the Maximum Sessions Attempted configured on the 619 Tester by 10,000. 620 7. Repeat steps 3 through 6 until the Session Capacity is 621 obtained. 622 8. Repeat steps 1 through 7 for multimedia in which media streams 623 per session = 2. 624 Expected Results: 626 4.13. Maximum Session Establishment Rate with media 627 Objective: 628 To benchmark the Maximum Session Establishment Rate of the DUT/SUT 629 with associated media. 630 Procedure: 631 1. 
Configure the DUT in the test topology shown in Figure 1 or 632 SUT as shown in Figures 2 or 3. 633 2. Configure Tester for SIP UDP with a Session Attempt Rate = 100 634 SPS, Session Duration = 30 sec, Maximum Sessions Attempted = 635 100,000 and media streams per session = 1. The rate of 636 offered load for each media stream SHOULD be (eq 1) Offered 637 Load per Media Stream = Throughput / Maximum Sessions 638 Attempted, where Throughput is defined in [3]. 639 3. Start Tester to initiate SIP Session establishment with the 640 SUT and transmit media through the SUT to a destination other 641 than the server. 642 4. Measure Failed Session Attempts and Total Sessions Established 643 at the Tester. 644 5. If a Failed Session Attempt is recorded then reduce the 645 Maximum Sessions Attempted configured on the Tester by 5,000. 646 6. If no Failed Session Attempt is recorded then increase the 647 Maximum Sessions Attempted configured on the Tester by 10,000. 648 7. Repeat steps 3 through 6 until the Session Capacity is 649 obtained. 651 Expected Results: Session establishment rate results obtained with 652 Associated Media with any number of media streams per SIP session 653 will be identical to the Session Capacity results obtained without 654 media. 656 5. IANA Considerations 658 This document requires no IANA considerations. 660 6. Security Considerations 662 Documents of this type do not directly affect the security of 663 Internet or corporate networks as long as benchmarking is not 664 performed on devices or systems connected to production networks. 665 Security threats and how to counter these in SIP and the media layer 666 is discussed in RFC3261, RFC3550, and RFC3711 and various other 667 drafts. This document attempts to formalize a set of common 668 methodology for benchmarking performance of SIP devices in a lab 669 environment. 671 7. 
Acknowledgments

   The authors would like to thank Keith Drage and Daryl Malas for
   their contributions to this document.

8.  References

8.1.  Normative References

   [1]  Bradner, S., "Key words for use in RFCs to Indicate Requirement
        Levels", RFC 2119, March 1997.

   [2]  Bradner, S., "Benchmarking Terminology for Network
        Interconnection Devices", RFC 1242, July 1991.

   [3]  Bradner, S. and J. McQuaid, "Benchmarking Methodology for
        Network Interconnection Devices", RFC 2544, July 1999.

   [4]  Mandeville, R., "Benchmarking Terminology for LAN Switching
        Devices", RFC 2285, February 1998.

   [5]  Poretsky, S., Gurbani, V., and C. Davids, "SIP Performance
        Benchmarking Terminology", draft-poretsky-sip-bench-term-02
        (work in progress), October 2006.

   [6]  Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston, A.,
        Peterson, J., Sparks, R., Handley, M., and E. Schooler, "SIP:
        Session Initiation Protocol", RFC 3261, June 2002.

   [7]  Rosenberg, J., "A Presence Event Package for the Session
        Initiation Protocol (SIP)", RFC 3856, August 2004.

   [8]  Garcia-Martin, M., "Input 3rd-Generation Partnership Project
        (3GPP) Release 5 Requirements on the Session Initiation
        Protocol (SIP)", RFC 4083, May 2005.

   [9]  Sparks, R., Hawrylyshen, A., Johnston, A., Rosenberg, J., and
        H. Schulzrinne, "Session Initiation Protocol (SIP) Torture
        Test Messages", RFC 4475, August 2006.

   [10] Malas, D., "SIP Performance Metrics",
        draft-malas-performance-metrics-01 (work in progress),
        August 2006.

   [11] Lingle, K., Mule, J., Maeng, J., and D. Walker, "Management
        Information Base for the Session Initiation Protocol (SIP)",
        draft-ietf-sip-mib-11 (work in progress), May 2006.

8.2.
Informational References

Authors' Addresses

   Scott Poretsky
   Reef Point Networks
   8 New England Executive Park
   Burlington, MA 01803
   USA

   Phone: +1 508 439 9008
   Email: sporetsky@reefpoint.com

   Vijay K. Gurbani
   Bell Laboratories, Alcatel-Lucent
   2701 Lucent Lane
   Rm 9F-546
   Lisle, IL 60532
   USA

   Phone: +1 630 224 0216
   Email: vkg@alcatel-lucent.com

   Carol Davids
   Illinois Institute of Technology
   201 East Loop Road
   Wheaton, IL 60187
   USA

   Email: davids@iit.edu

Full Copyright Statement

   Copyright (C) The IETF Trust (2007).

   This document is subject to the rights, licenses and restrictions
   contained in BCP 78, and except as set forth therein, the authors
   retain all their rights.

   This document and the information contained herein are provided on
   an "AS IS" basis and THE CONTRIBUTOR, THE ORGANIZATION HE/SHE
   REPRESENTS OR IS SPONSORED BY (IF ANY), THE INTERNET SOCIETY, THE
   IETF TRUST AND THE INTERNET ENGINEERING TASK FORCE DISCLAIM ALL
   WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY
   WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE
   ANY RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS
   FOR A PARTICULAR PURPOSE.

Intellectual Property

   The IETF takes no position regarding the validity or scope of any
   Intellectual Property Rights or other rights that might be claimed
   to pertain to the implementation or use of the technology described
   in this document or the extent to which any license under such
   rights might or might not be available; nor does it represent that
   it has made any independent effort to identify any such rights.
   Information on the procedures with respect to rights in RFC
   documents can be found in BCP 78 and BCP 79.
   Copies of IPR disclosures made to the IETF Secretariat and any
   assurances of licenses to be made available, or the result of an
   attempt made to obtain a general license or permission for the use
   of such proprietary rights by implementers or users of this
   specification can be obtained from the IETF on-line IPR repository
   at http://www.ietf.org/ipr.

   The IETF invites any interested party to bring to its attention any
   copyrights, patents or patent applications, or other proprietary
   rights that may cover technology that may be required to implement
   this standard.  Please address the information to the IETF at
   ietf-ipr@ietf.org.

Acknowledgment

   Funding for the RFC Editor function is provided by the IETF
   Administrative Support Activity (IASA).