Benchmarking Methodology Working                             S. Poretsky
Group                                               Allot Communications
Internet-Draft                                                V. Gurbani
Expires: August 12, 2010               Bell Laboratories, Alcatel-Lucent
                                                               C. Davids
                                        Illinois Institute of Technology
                                                        February 8, 2010

          Methodology for Benchmarking SIP Networking Devices
                    draft-ietf-bmwg-sip-bench-meth-01

Abstract

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the SIP
   benchmarking terminology document.  The methodology and terminology
   are to be used for benchmarking signaling plane performance with
   varying signaling and media load.  Signaling plane performance is
   characterized by both scale and session establishment rate.  The SIP
   devices to be benchmarked may be a single device under test (DUT) or
   a system under test (SUT).  Benchmarks can be obtained and compared
   for different types of devices such as a SIP Proxy Server, an SBC,
   and a server paired with a media relay or Firewall/NAT device.

Status of this Memo

   This Internet-Draft is submitted to IETF in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF), its areas, and its working groups.  Note that
   other groups may also distribute working documents as Internet-
   Drafts.

   Internet-Drafts are draft documents valid for a maximum of six months
   and may be updated, replaced, or obsoleted by other documents at any
   time.  It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."

   The list of current Internet-Drafts can be accessed at
   http://www.ietf.org/ietf/1id-abstracts.txt.

   The list of Internet-Draft Shadow Directories can be accessed at
   http://www.ietf.org/shadow.html.

   This Internet-Draft will expire on August 12, 2010.
Copyright Notice

   Copyright (c) 2010 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with respect
   to this document.  Code Components extracted from this document must
   include Simplified BSD License text as described in Section 4.e of
   the Trust Legal Provisions and are provided without warranty as
   described in the BSD License.

Table of Contents

   1.  Terminology
   2.  Introduction
   3.  Test Topologies
   4.  Test Considerations
     4.1.  Selection of SIP Transport Protocol
     4.2.  Server
     4.3.  Associated Media
     4.4.  Selection of Associated Media Protocol
     4.5.  Number of Associated Media Streams per SIP Session
     4.6.  Session Duration
     4.7.  Attempted Sessions per Second
     4.8.  Stress Testing
   5.  Reporting Format
     5.1.  Test Setup Report
     5.2.  Device Benchmarks for IS
     5.3.  Device Benchmarks for NS
   6.  Test Cases
     6.1.  Session Establishment Rate
     6.2.  Session Establishment Rate with Media
     6.3.  Session Establishment Rate with Loop Detection Enabled
     6.4.  Session Establishment Rate with Forking
     6.5.  Session Establishment Rate with Forking and Loop Detection
     6.6.  Session Establishment Rate with TLS Encrypted SIP
     6.7.  Session Establishment Rate with IPsec Encrypted SIP
     6.8.  Session Establishment Rate with SIP Flooding
     6.9.  Maximum Registration Rate
     6.10. Maximum Re-Registration Rate
     6.11. Maximum IM Rate
     6.12. Session Capacity without Media
     6.13. Session Capacity with Media
     6.14. Session Capacity with Media and a Media Relay/NAT and/or
           Firewall
   7.  IANA Considerations
   8.  Security Considerations
   9.  Acknowledgments
   10. References
     10.1. Normative References
     10.2. Informative References
   Authors' Addresses

1.  Terminology

   In this document, the key words "MUST", "MUST NOT", "REQUIRED",
   "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT
   RECOMMENDED", "MAY", and "OPTIONAL" are to be interpreted as
   described in BCP 14 [RFC2119] and indicate requirement levels for
   compliant implementations.

   Terms specific to SIP [RFC3261] performance benchmarking are defined
   in [I-D.sip-bench-term].
   RFC 2119 defines the use of these key words to help make the intent
   of standards track documents as clear as possible.  While this
   document uses these keywords, it is not a standards track document.
   The term Throughput is defined in [RFC2544].

2.  Introduction

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the Terminology
   document [I-D.sip-bench-term].  The methodology and terminology are
   to be used for benchmarking signaling plane performance with varying
   signaling and media load.  Signaling plane performance is
   characterized by both scale and session establishment rate.

   The SIP devices to be benchmarked may be a single device under test
   (DUT) or a system under test (SUT).  The DUT is a SIP Server, which
   may be any [RFC3261] conforming device.  The SUT can be any device or
   group of devices containing RFC 3261 conforming functionality along
   with Firewall and/or NAT functionality.  This enables benchmarks to
   be obtained and compared for different types of devices such as a
   SIP Proxy Server, an SBC, or a SIP proxy server paired with a media
   relay or Firewall/NAT device.  SIP Associated Media benchmarks can
   also be made when testing SUTs.

   The test cases covered in this methodology document provide benchmark
   metrics for Registration Rate, SIP Session Establishment Rate,
   Session Capacity, IM Rate, and Presence Rate.  These can be
   benchmarked with or without Associated Media.  Some cases are also
   included to cover Forking, Loop Detection, Encrypted SIP, and SIP
   Flooding.  The test topologies that can be used are described in the
   Test Setup section.  Topologies are provided for benchmarking of a
   DUT or SUT.  Benchmarking with Associated Media can be performed when
   using a SUT.

   SIP permits a wide range of configuration options that are also
   explained in the Test Setup section.
   Benchmark metrics could possibly be impacted by Associated Media.
   The selected values for Session Duration and Media Streams Per
   Session enable benchmark metrics to be obtained without Associated
   Media.  Session Setup Rate could possibly be impacted by the selected
   value for Maximum Sessions Attempted.  The benchmark for Session
   Establishment Rate is measured with a fixed value for maximum Session
   Attempts.

3.  Test Topologies

   Figures 1 through 6 below provide various topologies for performing
   SIP performance benchmarking.  These figures show the Device Under
   Test (DUT) to be a single server or a System Under Test (SUT).  Test
   topology options that include benchmarking with Associated Media
   require use of a SUT and are shown in Figures 4, 5, and 6.

               DUT
            ---------                 ---------
            |       |                 |       |
            |       |                 |       |
            |       |      SIP        |       |
            |Server |<--------------->| Tester|
            |       |                 |       |
            |       |                 |       |
            |       |                 |       |
            ---------                 ---------

              Figure 1: Basic SIP Test Topology

                    SUT
        ------------------------
        ---------    ---------       ---------
        |       |    |       |       |       |
        |       |    |       |       |       |
        |       | SIP|Fire-  | SIP   |       |
        | Server|<------------------>| Tester|
        |       |    |Wall   |       |       |
        |       |    |       |       |       |
        |       |    |       |       |       |
        ---------    ---------       ---------

          Figure 2: SIP Test Topology with Firewall

                    SUT
        ------------------------
        ---------    ---------       ---------
        |       |    |       |       |       |
        |       |    |       |       |       |
        |       | SIP| NAT   | SIP   |       |
        | Server|<------------------>| Tester|
        |       |    |       |       |       |
        |       |    |       |       |       |
        |       |    |       |       |       |
        ---------    ---------       ---------

         Figure 3: SIP Test Topology with NAT Device

                    SUT
        ------------------------
        ---------    ---------       ---------
        |       |    |       |       |       |
        |       |    |       |       |       |
        |       | SIP|Fire-  | SIP   |       |
        | Server|<------------------>| Tester|
        |       |    |Wall   |       |       |
        |       |    |       | Media |       |
        |       | ---|       |-------|       |
        ---------  |  ---------      ---------
                   |      Media  ^
                   --------------|

     Figure 4: SIP Test Topology with Media through Firewall

                    SUT
        ------------------------
        ---------    ---------       ---------
        |       |    |       |       |       |
        |       |    |       |       |       |
        |       | SIP| NAT   | SIP   |       |
        | Server|<------------------>| Tester|
        |       |    |       |       |       |
        |       |    |       | Media |       |
        |       | ---|       |-------|       |
        ---------  |  ---------      ---------
                   |      Media  ^
                   --------------|

     Figure 5: SIP Test Topology with Media through NAT Device

                    SUT
        ------------------------
        ---------    ---------       ---------
        |       |    |       |       |       |
        |       |    |       |       |       |
        |       | SIP|Media  | SIP   |       |
        | Server|<------------------>| Tester|
        |       |    |Relay  |       |       |
        |       |    |       | Media |       |
        |       | ---|       |-------|       |
        ---------  |  ---------      ---------
                   |      Media  ^
                   --------------|

    Figure 6: SIP Test Topology with Media through a Media Relay

4.  Test Considerations

4.1.  Selection of SIP Transport Protocol

   Discussion:
      Test cases may be performed with any transport protocol supported
      by SIP.  This includes, but is not limited to, SIP over TCP, SIP
      over UDP, and SIP over TLS.  The SIP transport protocol used must
      be reported with benchmarking results.

4.2.  Server

   Discussion:
      The Server is a SIP-speaking device that complies with RFC 3261.
      The purpose of this document is to benchmark SIP performance, not
      conformance.  Conformance to [RFC3261] is assumed for all tests.
      The Server may be the DUT or a component of a SUT that includes
      Firewall and/or NAT functionality.  The components of the SUT may
      be a single physical device or separate devices.

4.3.  Associated Media

   Discussion:
      Some tests may require associated media to be present for each SIP
      session.  The Server is not involved in the forwarding of media.
      Associated Media can be benchmarked only with a SUT in which the
      media traverses a Media Relay, Firewall, NAT, or Firewall/NAT
      device.
      The test topologies to be used when benchmarking SUT performance
      for Associated Media are shown in Figures 4, 5, and 6, in which
      the SIP signaling is bidirectional and the Associated Media is
      unidirectional.

4.4.  Selection of Associated Media Protocol

   Discussion:
      The test cases specified in this document provide SIP performance
      independent of the protocol used for the media stream.  Any media
      protocol supported by SIP may be used.  This includes, but is not
      limited to, RTP, RTSP, and SRTP.  The protocol used for Associated
      Media must be reported with benchmarking results.

4.5.  Number of Associated Media Streams per SIP Session

   Discussion:
      Benchmarking results may vary with the number of media streams per
      SIP session.  When benchmarking a SUT for voice, a single media
      stream is used.  When benchmarking a SUT for voice and video, two
      media streams are used.  The number of Associated Media Streams
      must be reported with benchmarking results.

4.6.  Session Duration

   Discussion:
      SUT performance benchmarks may vary with the duration of SIP
      sessions.  Session Duration must be reported with benchmarking
      results.  A Session Duration of zero seconds indicates
      transmission of a BYE immediately following successful SIP session
      establishment, indicated by receipt of a 200 OK.  An infinite
      Session Duration indicates that a BYE is never transmitted.

4.7.  Attempted Sessions per Second

   Discussion:
      DUT and SUT performance benchmarks may vary with the rate of
      attempted sessions offered by the Tester.  Attempted Sessions per
      Second must be reported with benchmarking results.

4.8.  Stress Testing

   Discussion:
      The purpose of this document is to benchmark SIP performance, not
      system stability under stressful conditions such as a high rate of
      Attempted Sessions per Second.

5.  Reporting Format

5.1.
 Test Setup Report

   SIP Transport Protocol = ___________________________
   Session Attempt Rate = _____________________________
   IS Media Attempt Rate = ____________________________
   Total Sessions Attempted = _________________________
   Media Streams Per Session = ________________________
   Associated Media Protocol = ________________________
   Media Packet Size = ________________________________
   Media Offered Load = _______________________________
   Media Session Hold Time = __________________________
   Establishment Threshold Time = _____________________
   Loop Detecting Option = ____________________________
   Forking Option
     Number of endpoints request sent to = ____________
     Type of forking = ________________________________

   Note: Total Sessions Attempted is used in the calculation of the
   Session Establishment Performance ([I-D.sip-bench-term], Section
   3.4.5).  It is the number of session attempts ([I-D.sip-bench-term],
   Section 3.1.6) that will be made over the duration of the test.

5.2.  Device Benchmarks for IS

   Registration Rate = ________________________________
   Re-registration Rate = _____________________________
   Session Capacity = _________________________________
   Session Overload Capacity = ________________________
   Session Establishment Rate = _______________________
   Session Establishment Performance = ________________
   Session Attempt Delay = ____________________________
   Session Disconnect Delay = _________________________

5.3.  Device Benchmarks for NS

   IM Rate = _______________________________

6.  Test Cases

6.1.  Session Establishment Rate

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.
 Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 0.
      3.  Start Tester to initiate SIP session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded, then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded, then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.

   Expected Results: This is the scenario to obtain the maximum Session
      Establishment Rate of the DUT/SUT.

6.2.  Session Establishment Rate with Media

   Objective:
      To benchmark the Session Establishment Rate of the SUT with zero
      failures when Associated Media is included in the benchmark test.

   Procedure:
      1.  Configure the SUT in the test topology shown in Figure 4, 5,
          or 6.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 1.  The rate of offered load for each
          media stream SHOULD be

            (eq 1)  Offered Load per Media Stream =
                    Throughput / maximum sessions attempted,

          where Throughput is defined in [RFC2544].
      3.  Start Tester to initiate SIP session establishment with the
          SUT and transmit media through the SUT to a destination other
          than the server.
      4.  At the Tester, measure Session Attempt Failures, total
          Established Sessions, and Packet Loss [RFC2544] of the media.
      5.  If a Session Attempt Failure or Packet Loss is recorded, then
          reduce the Session Attempt Rate configured on the Tester by
          50%.
      6.  If no Session Attempt Failure or Packet Loss is recorded,
          then increase the Session Attempt Rate configured on the
          Tester by 50%.
      7.
 Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
      8.  Repeat steps 1 through 7 for multimedia, in which Media
          Streams Per Session = 2.

   Expected Results: Session Establishment Rate results obtained with
      Associated Media with any number of media streams per SIP session
      are expected to be identical to the Session Establishment Rate
      results obtained without media in the case where the server runs
      on a platform separate from the platform on which the Media Relay,
      NAT, or Firewall runs.  Session Establishment Rate results
      obtained with Associated Media may be lower than those obtained
      without media in the case where the server and the NAT, Firewall,
      or Media Relay run on the same platform.

6.3.  Session Establishment Rate with Loop Detection Enabled

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when the Loop Detection option is enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 0.
      3.  Turn on the Loop Detection option in the DUT or SUT.
      4.  Start Tester to initiate SIP session establishment with the
          DUT.
      5.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      6.  If a Session Attempt Failure is recorded, then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      7.  If no Session Attempt Failure is recorded, then increase the
          Session Attempt Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Session Establishment Rate
          is obtained and recorded.
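   The adjust-and-repeat loops in these procedures all follow the same
   pattern: halve the Session Attempt Rate after any failure, raise it
   by 50% after a clean run, and repeat until the rate is obtained.  A
   minimal sketch of that search follows; `trial(rate)` is a
   hypothetical hook standing in for one Tester run that returns True
   on zero Session Attempt Failures, and the bisection refinement is an
   added assumption, since this document does not say how "until
   obtained" terminates:

```python
def find_max_rate(trial, start_rate=100.0, tolerance=1.0):
    """Search for the highest attempt rate (SPS) with zero failures."""
    # Phase 1: bracket the capacity with the +/-50% adjustments
    # (procedure steps: halve on failure, increase by 50% on success).
    lo, hi = 0.0, None
    rate = start_rate
    while hi is None:
        if trial(rate):
            lo = rate          # clean run: capacity is at least this rate
            rate *= 1.5        # no failure recorded -> raise rate by 50%
        else:
            hi = rate          # failure recorded -> capacity is below this
    # Phase 2: narrow the bracket by bisection (an assumption; the draft
    # only says to repeat until the rate is obtained and recorded).
    while hi - lo > tolerance:
        mid = (lo + hi) / 2
        if trial(mid):
            lo = mid
        else:
            hi = mid
    return lo                  # highest rate observed with zero failures
```

   A Tester with a true capacity of 1000 SPS, modeled as
   `trial = lambda r: r <= 1000.0`, would converge to just under 1000
   within the configured tolerance.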
   Expected Results: Session Establishment Rate results obtained with
      Loop Detection may be lower than those obtained without Loop
      Detection enabled.

6.4.  Session Establishment Rate with Forking

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when the Forking option is enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 0.
      3.  Set the number of endpoints that will receive the forked
          invitation to a value of 2 or more (subsequent tests may
          increase this value at the discretion of the tester).
      4.  Start Tester to initiate SIP session establishment with the
          DUT.
      5.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      6.  If a Session Attempt Failure is recorded, then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      7.  If no Session Attempt Failure is recorded, then increase the
          Session Attempt Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Session Establishment Rate
          is obtained and recorded.

   Expected Results: Session Establishment Rate results obtained with
      Forking may be lower than those obtained without Forking enabled.

6.5.  Session Establishment Rate with Forking and Loop Detection

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when both the Forking and Loop Detection options are
      enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 0.
      3.
 Enable the Loop Detection option on the DUT.
      4.  Set the number of endpoints that will receive the forked
          invitation to a value of 2 or more (subsequent tests may
          increase this value at the discretion of the tester).
      5.  Start Tester to initiate SIP session establishment with the
          DUT.
      6.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      7.  If a Session Attempt Failure is recorded, then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      8.  If no Session Attempt Failure is recorded, then increase the
          Session Attempt Rate configured on the Tester by 50%.
      9.  Repeat steps 5 through 8 until the Session Establishment Rate
          is obtained and recorded.

   Expected Results: Session Establishment Rate results obtained with
      Forking and Loop Detection may be lower than those obtained with
      only Forking or Loop Detection enabled.

6.6.  Session Establishment Rate with TLS Encrypted SIP

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when using TLS encrypted SIP.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP TCP, enable TLS, Session Attempt
          Rate = 100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 0.
      3.  Start Tester to initiate SIP session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded, then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded, then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
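   The TLS case above changes only the transport-related parameters of
   the test setup; everything else stays at the common defaults.  As a
   sketch, the reported parameters could be captured in a small record
   (field names are condensed from the Section 5.1 reporting format and
   are illustrative, not taken from any real Tester API):

```python
from dataclasses import dataclass

@dataclass
class TestSetup:
    # Condensed from the Section 5.1 reporting format (illustrative names).
    sip_transport: str = "UDP"            # "UDP", "TCP", "TLS", "IPsec", ...
    session_attempt_rate: float = 100.0   # sessions per second (SPS)
    max_session_attempts: int = 100_000
    media_streams_per_session: int = 0

# Section 6.6 varies only the transport: SIP over TCP with TLS enabled,
# with the other parameters left at their common values.
tls_setup = TestSetup(sip_transport="TLS")
```

   Recording each run this way makes it easy to report, per Section 4.1,
   exactly which transport produced which Session Establishment Rate.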
   Expected Results: Session Establishment Rate results obtained with
      TLS encrypted SIP may be lower than those obtained with plaintext
      SIP.

6.7.  Session Establishment Rate with IPsec Encrypted SIP

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when using IPsec encrypted SIP.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP TCP, enable IPsec, Session Attempt
          Rate = 100 SPS, maximum Session Attempts = 100,000, and Media
          Streams Per Session = 0.
      3.  Start Tester to initiate SIP session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded, then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded, then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.

   Expected Results: Session Establishment Rate results obtained with
      IPsec encrypted SIP may be lower than those obtained with
      plaintext SIP.

6.8.  Session Establishment Rate with SIP Flooding

   Objective:
      To benchmark the Session Establishment Rate of the SUT with zero
      failures while SIP Flooding is occurring.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, maximum Session Attempts = 100,000, Associated Media
          Streams Per Session = 0, and SIP INVITE Message Flood = 500
          per second.
      3.  Start Tester to initiate SIP session establishment with the
          SUT and a SIP Flood targeted at the Server.
      4.
 At the Tester, measure Session Attempt Failures, total
          Established Sessions, and Packet Loss [RFC2544] of the media.
      5.  If a Session Attempt Failure or Packet Loss is recorded, then
          reduce the Session Attempt Rate configured on the Tester by
          50%.
      6.  If no Session Attempt Failure or Packet Loss is recorded,
          then increase the Session Attempt Rate configured on the
          Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
      8.  Repeat steps 1 through 7 with SIP INVITE Message Flood = 1000
          per second.

   Expected Results: Session Establishment Rate results obtained with
      SIP Flooding may be degraded.

6.9.  Maximum Registration Rate

   Objective:
      To benchmark the maximum registration rate of the DUT/SUT with
      zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with an attempted Registration
          Rate = 100 SPS and maximum registrations attempted = 100,000.
      3.  Set the registration timeout value to at least 3600 seconds.
      4.  At the Tester, measure failed registration attempts, total
          registrations, and packet loss.
      5.  If a Failed Registration Attempt or Packet Loss is recorded,
          then reduce the Attempted Registration Rate configured on the
          Tester by 50%.
      6.  If no Failed Registration or Packet Loss is recorded, then
          increase the Attempted Registration Rate configured on the
          Tester by 50%.
      7.  Repeat steps 4 through 6 until all registrations succeed.
          The maximum rate at which this occurs is obtained and
          recorded.

   Expected Results:

6.10.  Maximum Re-Registration Rate

   Objective:
      To benchmark the maximum re-registration rate of the DUT/SUT with
      zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.
 Execute the test detailed in Section 6.9 to register the
          endpoints with the registrar.  The remaining steps below MUST
          be performed at least 5 minutes after, but no more than 15
          minutes after, the test performed in Section 6.9.
      3.  Configure Tester for SIP UDP with an attempted Registration
          Rate = 100 SPS and maximum registrations attempted = 100,000.
      4.  Configure Tester to re-register the same address-of-records
          that were registered in Section 6.9.
      5.  At the Tester, measure failed registration attempts, total
          registrations, and packet loss.
      6.  If a Failed Registration Attempt or Packet Loss is recorded,
          then reduce the Attempted Registration Rate configured on the
          Tester by 50%.
      7.  If no Failed Registration or Packet Loss is recorded, then
          increase the Attempted Registration Rate configured on the
          Tester by 50%.
      8.  Repeat steps 5 through 7 until all re-registrations succeed.
          The maximum rate at which this occurs is obtained and
          recorded.

   Expected Results: The rate should be at least equal to, but not more
      than, the result of Section 6.9.

6.11.  Maximum IM Rate

   Objective:
      To benchmark the maximum IM rate of the SUT with zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with an Attempted IM Rate = 100
          SPS and maximum IMs attempted = 100,000.
      3.  At the Tester, measure Failed IM Attempts, total IMs, and
          Packet Loss.
      4.  If a Failed IM Attempt or Packet Loss is recorded, then
          reduce the Attempted IM Rate configured on the Tester by 50%.
      5.  If no Failed IM or Packet Loss is recorded, then increase the
          Attempted IM Rate configured on the Tester by 50%.
      6.  Repeat steps 3 through 5 until the Maximum IM Rate is
          obtained and recorded.

   Expected Results:

6.12.
  Session Capacity without Media

   Objective:
      To benchmark the Session Capacity of the SUT without Associated
      Media.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          Session Establishment Rate, maximum Session Attempts =
          10,000, and Media Streams Per Session = 0.
      3.  Start Tester to initiate SIP session establishment with the
          DUT.
      4.  Measure Session Attempt Failures, total Established Sessions,
          and Packet Loss [RFC2544] at the Tester.
      5.  If a Session Attempt Failure or Packet Loss is recorded, then
          reduce the maximum Session Attempts configured on the Tester
          by 5,000.
      6.  If no Session Attempt Failure or Packet Loss is recorded,
          then increase the maximum Session Attempts configured on the
          Tester by 10,000.
      7.  Repeat steps 3 through 6 until the Session Capacity is
          obtained and recorded.

   Expected Results: This is the scenario to obtain the maximum Session
      Capacity of the DUT/SUT.

6.13.  Session Capacity with Media

   Objective:
      To benchmark the Session Capacity of the DUT/SUT with Associated
      Media.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2 or 3.
      2.  Configure Tester for SIP UDP with a Session Attempt Rate =
          100 SPS, Session Duration = 30 sec, maximum Session Attempts
          = 100,000, and Media Streams Per Session = 1.

          NOTE: The total offered load to the DUT/SUT SHOULD be equal
          to the Throughput of the DUT/SUT as defined in [RFC2544].
          The offered load to the DUT/SUT for each media stream SHOULD
          be equal to Throughput / maximum Session Attempts.
      3.
       Start Tester to initiate SIP Session establishment with the
       SUT and transmit media through the SUT to a destination other
       than the server.
   4.  Measure Session Attempt Failures and Total Established
       Sessions at the Tester.
   5.  If a Session Attempt Failure is recorded, then reduce the
       maximum Session Attempts configured on the Tester by 5,000.
   6.  If no Session Attempt Failure is recorded, then increase the
       maximum Session Attempts configured on the Tester by 10,000.
   7.  Repeat steps 3 through 6 until the Session Capacity is
       obtained and recorded.

   Expected Results: Session Capacity results obtained with
   Associated Media, with any number of media streams per SIP
   session, will be identical to the Session Capacity results
   obtained without media.

6.14.  Session Capacity with Media and a Media Relay/NAT and/or
       Firewall

   Objective:
      To benchmark the Session Capacity of the SUT with Associated
      Media when a Media Relay, NAT, and/or Firewall is present.
   Procedure:
   1.  Configure the SUT in the test topology shown in Figure 4, 5,
       or 6.
   2.  Configure Tester for SIP UDP with a Session Attempt Rate =
       100 SPS, Session Duration = 30 sec, maximum Session Attempts
       = 100,000, and Media Streams Per Session = 1.
       NOTE: The offered load for each media stream SHOULD be as in
       Equation 1.
   3.  Start Tester to initiate SIP Session establishment with the
       SUT and transmit media through the SUT to a destination other
       than the server.
   4.  Measure Session Attempt Failures and Total Established
       Sessions at the Tester.
   5.  If a Session Attempt Failure is recorded, then reduce the
       maximum Session Attempts configured on the Tester by 5,000.
   6.  If no Session Attempt Failure is recorded, then increase the
       maximum Session Attempts configured on the Tester by 10,000.
   7.  Repeat steps 3 through 6 until the Session Capacity is
       obtained and recorded.
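   The iterative search in the procedures above (measure, back off
   by 5,000 on failure, push up by 10,000 on success, repeat until
   the capacity is found) can be sketched informally as follows.
   This is an illustration only, not part of the benchmark: the
   run_trial function is a hypothetical stand-in for driving the
   Tester and reporting whether any Session Attempt Failure or
   Packet Loss occurred at the configured maximum Session Attempts.

```python
def find_session_capacity(run_trial, initial_attempts=100_000,
                          decrement=5_000, increment=10_000):
    """Search for the largest maximum-Session-Attempts value that
    completes with zero Session Attempt Failures and no Packet Loss.

    run_trial(attempts) is a hypothetical hook that runs one trial
    at the given maximum Session Attempts and returns True if any
    failure or loss was recorded.  Assumes the true capacity lies
    below initial_attempts, so the search converges.
    """
    attempts = initial_attempts
    best = 0
    seen = set()
    while attempts not in seen:       # stop once the search revisits a value
        seen.add(attempts)
        failed = run_trial(attempts)  # step 4: measure failures/loss
        if failed:
            attempts -= decrement     # step 5: reduce by 5,000
        else:
            best = max(best, attempts)
            attempts += increment     # step 6: increase by 10,000
    return best                       # step 7: record the Session Capacity
```

   For example, against a system whose true capacity is 95,000
   sessions, the search tries 100,000 (fails), 95,000 (succeeds),
   105,000 (fails), then revisits 100,000 and stops, reporting
   95,000.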
   Expected Results: Session Capacity results obtained with
   Associated Media, with any number of media streams per SIP
   session, may be lower than the Session Capacity without Media
   result if the Media Relay, NAT, or Firewall is sharing a platform
   with the server.

7.  IANA Considerations

   This document does not require any IANA considerations.

8.  Security Considerations

   Documents of this type do not directly affect the security of the
   Internet or of corporate networks as long as benchmarking is not
   performed on devices or systems connected to production networks.
   Security threats, and how to counter them in SIP and at the media
   layer, are discussed in [RFC3261], [RFC3550], and [RFC3711], as
   well as in various other documents.  This document attempts to
   formalize a common methodology for benchmarking the performance
   of SIP devices in a lab environment.

9.  Acknowledgments

   The authors would like to thank Keith Drage and Daryl Malas for
   their contributions to this document.

10.  References

10.1.  Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119, March 1997.

   [RFC2544]  Bradner, S. and J. McQuaid, "Benchmarking Methodology
              for Network Interconnect Devices", RFC 2544,
              March 1999.

   [I-D.sip-bench-term]
              Poretsky, S., Gurbani, V., and C. Davids, "SIP
              Performance Benchmarking Terminology",
              draft-ietf-bmwg-sip-bench-term-01 (work in progress),
              January 2010.

10.2.  Informative References

   [RFC3261]  Rosenberg, J., Schulzrinne, H., Camarillo, G.,
              Johnston, A., Peterson, J., Sparks, R., Handley, M.,
              and E. Schooler, "SIP: Session Initiation Protocol",
              RFC 3261, June 2002.

   [RFC3550]  Schulzrinne, H., Casner, S., Frederick, R., and V.
              Jacobson, "RTP: A Transport Protocol for Real-Time
              Applications", STD 64, RFC 3550, July 2003.

   [RFC3711]  Baugher, M., McGrew, D., Naslund, M., Carrara, E., and
              K. Norrman, "The Secure Real-time Transport Protocol
              (SRTP)", RFC 3711, March 2004.

Authors' Addresses

   Scott Poretsky
   Allot Communications
   300 TradeCenter, Suite 4680
   Woburn, MA  08101
   USA

   Phone: +1 508 309 2179
   Email: sporetsky@allot.com

   Vijay K. Gurbani
   Bell Laboratories, Alcatel-Lucent
   1960 Lucent Lane
   Rm 9C-533
   Naperville, IL  60566
   USA

   Phone: +1 630 224 0216
   Email: vkg@bell-labs.com

   Carol Davids
   Illinois Institute of Technology
   201 East Loop Road
   Wheaton, IL  60187
   USA

   Email: davids@iit.edu