Benchmarking Methodology Working Group                       S. Poretsky
Internet-Draft                                       Allot Communications
Expires: January 13, 2011                                      V. Gurbani
                                        Bell Laboratories, Alcatel-Lucent
                                                                C. Davids
                                          Illinois Institute of Technology
                                                            July 12, 2010

          Methodology for Benchmarking SIP Networking Devices
                   draft-ietf-bmwg-sip-bench-meth-02

Abstract

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the SIP
   benchmarking terminology document.  The methodology and terminology
   are to be used for benchmarking signaling plane performance with
   varying signaling and media load.  Signaling plane performance is
   measured in terms of both scale and session establishment rate.  The
   SIP devices to be benchmarked may be a single device under test (DUT)
   or a system under test (SUT).  Benchmarks can be obtained and
   compared for different types of devices, such as a SIP Proxy Server,
   an SBC, or a server paired with a media relay or Firewall/NAT device.

Status of this Memo

   This Internet-Draft is submitted to IETF in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF), its areas, and its working groups.  Note that
   other groups may also distribute working documents as Internet-
   Drafts.

   Internet-Drafts are draft documents valid for a maximum of six months
   and may be updated, replaced, or obsoleted by other documents at any
   time.  It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."

   The list of current Internet-Drafts can be accessed at
   http://www.ietf.org/ietf/1id-abstracts.txt.

   The list of Internet-Draft Shadow Directories can be accessed at
   http://www.ietf.org/shadow.html.

   This Internet-Draft will expire on January 13, 2011.
Copyright Notice

   Copyright (c) 2010 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with respect
   to this document.  Code Components extracted from this document must
   include Simplified BSD License text as described in Section 4.e of
   the Trust Legal Provisions and are provided without warranty as
   described in the BSD License.

Table of Contents

   1.  Terminology . . . . . . . . . . . . . . . . . . . . . . . . .  4
   2.  Introduction  . . . . . . . . . . . . . . . . . . . . . . . .  4
   3.  Test Topologies . . . . . . . . . . . . . . . . . . . . . . .  5
   4.  Test Considerations . . . . . . . . . . . . . . . . . . . . .  6
     4.1.  Selection of SIP Transport Protocol . . . . . . . . . . .  6
     4.2.  Signaling Server  . . . . . . . . . . . . . . . . . . . .  6
     4.3.  Associated Media  . . . . . . . . . . . . . . . . . . . .  7
     4.4.  Selection of Associated Media Protocol  . . . . . . . . .  7
     4.5.  Number of Associated Media Streams per SIP Session  . . .  7
     4.6.  Session Duration  . . . . . . . . . . . . . . . . . . . .  7
     4.7.  Attempted Sessions per Second . . . . . . . . . . . . . .  7
     4.8.  Stress Testing  . . . . . . . . . . . . . . . . . . . . .  8
   5.  Reporting Format  . . . . . . . . . . . . . . . . . . . . . .  8
     5.1.  Test Setup Report . . . . . . . . . . . . . . . . . . . .  8
     5.2.  Device Benchmarks for IS  . . . . . . . . . . . . . . . .  9
     5.3.  Device Benchmarks for NS  . . . . . . . . . . . . . . . .  9
   6.  Test Cases  . . . . . . . . . . . . . . . . . . . . . . . . .  9
     6.1.  Baseline Session Establishment Rate . . . . . . . . . . .  9
     6.2.  Session Establishment Rate  . . . . .
 . . . . . . . 10
     6.3.  Session Establishment Rate with Media . . . . . . . . . . 10
     6.4.  Session Establishment Rate with Loop Detection Enabled  . 11
     6.5.  Session Establishment Rate with Forking . . . . . . . . . 12
     6.6.  Session Establishment Rate with Forking and Loop
           Detection . . . . . . . . . . . . . . . . . . . . . . . . 12
     6.7.  Session Establishment Rate with TLS Encrypted SIP . . . . 13
     6.8.  Session Establishment Rate with IPsec Encrypted SIP . . . 13
     6.9.  Session Establishment Rate with SIP Flooding  . . . . . . 14
     6.10. Maximum Registration Rate . . . . . . . . . . . . . . . . 14
     6.11. Maximum Re-Registration Rate  . . . . . . . . . . . . . . 15
     6.12. Maximum IM Rate . . . . . . . . . . . . . . . . . . . . . 16
     6.13. Session Capacity without Media  . . . . . . . . . . . . . 16
     6.14. Session Capacity with Media . . . . . . . . . . . . . . . 17
     6.15. Session Capacity with Media and a Media Relay/NAT
           and/or Firewall . . . . . . . . . . . . . . . . . . . . . 17
   7.  IANA Considerations . . . . . . . . . . . . . . . . . . . . . 18
   8.  Security Considerations . . . . . . . . . . . . . . . . . . . 18
   9.  Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 18
   10. References  . . . . . . . . . . . . . . . . . . . . . . . . . 18
     10.1. Normative References  . . . . . . . . . . . . . . . . . . 18
     10.2. Informative References  . . . . . . . . . . . . . . . . . 19
   Authors' Addresses  . . . . . . . . . . . . . . . . . . . . . . . 19

1.  Terminology

   In this document, the key words "MUST", "MUST NOT", "REQUIRED",
   "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT
   RECOMMENDED", "MAY", and "OPTIONAL" are to be interpreted as
   described in BCP 14 [RFC2119] and indicate requirement levels for
   compliant implementations.

   Terms specific to SIP [RFC3261] performance benchmarking are defined
   in [I-D.sip-bench-term].
   RFC 2119 defines the use of these key words to help make the intent
   of standards track documents as clear as possible.  While this
   document uses these key words, it is not a standards track document.
   The term Throughput is defined in [RFC2544].

2.  Introduction

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the companion
   terminology document [I-D.sip-bench-term].  The methodology and
   terminology are to be used for benchmarking signaling plane
   performance with varying signaling and media load.  Signaling plane
   performance is measured in terms of both scale and session
   establishment rate.

   The SIP devices to be benchmarked may be a single device under test
   (DUT) or a system under test (SUT).  The DUT is a SIP server, which
   may be any [RFC3261] conforming device.  The SUT can be any device or
   group of devices containing RFC 3261 conforming functionality along
   with Firewall and/or NAT functionality.  This enables benchmarks to
   be obtained and compared for different types of devices, such as a
   SIP Proxy Server, an SBC, or a SIP proxy server paired with a media
   relay or Firewall/NAT device.  SIP Associated Media benchmarks can
   also be made when testing SUTs.

   The test cases covered in this methodology document provide benchmark
   metrics of Registration Rate, SIP Session Establishment Rate, Session
   Capacity, IM Rate, and Presence Rate.  These can be benchmarked with
   or without Associated Media.  Some cases are also included to cover
   Forking, Loop Detection, Encrypted SIP, and SIP Flooding.  The test
   topologies that can be used are described in the Test Setup section.
   Topologies are provided for benchmarking of a DUT or SUT.
   Benchmarking with Associated Media can be performed when using a SUT.

   SIP permits a wide range of configuration options that are also
   explained in the Test Setup section.
   Benchmark metrics could possibly be impacted by Associated Media.
   The selected values for Session Duration and Media Streams Per
   Session make it possible to obtain benchmark metrics without
   Associated Media.  Session Setup Rate could possibly be impacted by
   the selected value for Maximum Sessions Attempted.  The benchmark for
   Session Establishment Rate is measured with a fixed value for maximum
   Session Attempts.

3.  Test Topologies

   Figures 1 through 4 below provide the topologies used to perform the
   SIP performance benchmarking.  These figures show the SIP server as
   either a single Device Under Test (DUT) or as part of a System Under
   Test (SUT).  Test topology options that include benchmarking with
   Associated Media require the use of a SUT and are shown in Figure 3.

          DUT
       ---------                 ---------
       |       |                 |       |
       |signal-|                 |       |
       |ing    |                 |       |
       |server |       SIP       |       |
       |       |<--------------->|  EA   |
       |       |                 |       |
       |       |                 |       |
       |       |                 |       |
       ---------                 ---------

                Figure 1: Basic SIP Test Topology

                       SUT
           -------------------------
       ---------    ---------    ---------
       |       |    |media- |    |       |
       |signal-|    | relay |    |       |
       |ing    | SIP|  OR   |SIP |       |
       |server |<-------------->|  EA   |
       |       |    |fire   |    |       |
       |       |    |wall OR|    |       |
       |       |    | NAT   |    |       |
       ---------    ---------    ---------

           Figure 2: SIP Test Topology with Firewall

                       SUT
           -------------------------
       ---------    ---------    ---------
       |       |    | media-|    |       |
       |signal-|    | relay |    |       |
       |ing    | SIP|  OR   |SIP |       |
       |server |<-------------->|  EA   |
       |       |    |fire   |    |       |
       |       |    |wall OR| Media     |
       |       | ---|  NAT  |----|       |
       --------- |  ---------    ---------
                 |     Media        ^
                 -------------------|

        Figure 3: SIP Test Topology with Media through Firewall

                    ---------
                    |       |
       |----------->|       |
       ^            |       |
       |    SIP     |       |
       |<-----------|  EA   |
                    |       |
                    |       |
                    |       |
                    ---------

               Figure 4: Baseline Test Topology

4.  Test Considerations

4.1.
Selection of SIP Transport Protocol

   Test cases may be performed with any transport protocol supported by
   SIP.  This includes, but is not limited to, SIP over TCP, SIP over
   UDP, and SIP over TLS.  The transport protocol used for SIP MUST be
   reported with benchmarking results.

4.2.  Signaling Server

   The Server is a SIP-speaking device that complies with RFC 3261.
   The purpose of this document is to benchmark SIP performance, not
   conformance.  Conformance to [RFC3261] is assumed for all tests.
   The Server may be the DUT or a component of a SUT that includes
   Firewall and/or NAT functionality.  The components of the SUT may be
   a single physical device or separate devices.

4.3.  Associated Media

   Some tests may require Associated Media to be present for each SIP
   session.  The Server is not involved in the forwarding of media.
   Associated Media can be benchmarked only with a SUT in which the
   media traverses a Media Relay, Firewall, NAT, or Firewall/NAT
   device.  The test topology to be used when benchmarking SUT
   performance for Associated Media is shown in Figure 3, in which the
   SIP signaling is bidirectional and the Associated Media is
   unidirectional.

4.4.  Selection of Associated Media Protocol

   The test cases specified in this document provide SIP performance
   independent of the protocol used for the media stream.  Any media
   protocol supported by SIP may be used.  This includes, but is not
   limited to, RTP, RTSP, and SRTP.  The protocol used for Associated
   Media MUST be reported with benchmarking results.

4.5.  Number of Associated Media Streams per SIP Session

   Benchmarking results may vary with the number of media streams per
   SIP session.  When benchmarking a SUT for voice, a single media
   stream is used.  When benchmarking a SUT for voice and video, two
   media streams are used.  The number of Associated Media Streams MUST
   be reported with benchmarking results.
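   The media parameters above feed the per-stream load calculation given
   later as Equation 1 (Section 6.3), with the NOTE in Section 6.14
   requiring that the total offered load equal the [RFC2544] Throughput
   of the DUT/SUT.  The arithmetic can be sketched as follows (a
   non-normative illustration; the function names are ours, not part of
   this document):

```python
def offered_load_per_stream(throughput_pps, max_session_attempts):
    """Equation 1 (Section 6.3): offered load per media stream, in
    packets per second, derived from the [RFC2544] Throughput of the
    DUT/SUT and the configured maximum Session Attempts."""
    return throughput_pps / max_session_attempts


def total_media_load(throughput_pps, max_session_attempts,
                     active_sessions, streams_per_session):
    """Aggregate media load offered to the DUT/SUT: the per-stream load
    times the number of concurrently active media streams."""
    per_stream = offered_load_per_stream(throughput_pps,
                                         max_session_attempts)
    return per_stream * active_sessions * streams_per_session
```

   Note that when every attempted session is active and carries one
   media stream, the aggregate load equals the Throughput, as the NOTE
   in Section 6.14 requires.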
4.6.  Session Duration

   SUT performance benchmarks may vary with the duration of SIP
   sessions.  Session Duration MUST be reported with benchmarking
   results.  A Session Duration of zero seconds indicates transmission
   of a BYE immediately following successful SIP session establishment,
   as indicated by receipt of a 200 OK.  An infinite Session Duration
   indicates that a BYE is never transmitted.

4.7.  Attempted Sessions per Second

   DUT and SUT performance benchmarks may vary with the rate of
   attempted sessions offered by the Tester.  Attempted Sessions per
   Second MUST be reported with benchmarking results.

4.8.  Stress Testing

   Discussion:
      The purpose of this document is to benchmark SIP performance, not
      system stability under stressful conditions such as a high rate of
      Attempted Sessions per Second.

5.  Reporting Format

5.1.  Test Setup Report

   SIP Transport Protocol = ___________________________
      (valid values: TCP|UDP|TLS|SCTP|specify-other)
   Session Attempt Rate = _____________________________
      (session attempts/sec)
   IS Media Attempt Rate = ____________________________
      (IS media attempts/sec)
   Total Sessions Attempted = _________________________
      (total sessions to be created over duration of test)
   Media Streams Per Session = ________________________
      (number of streams per session)
   Associated Media Protocol = ________________________
      (RTP|RTSP|specify-other)
   Media Packet Size = ________________________________
      (bytes)
   Media Offered Load = _______________________________
      (packets per second)
   Media Session Hold Time = __________________________
      (seconds)
   Establishment Threshold Time = _____________________
      (seconds)
   Loop Detection Option = ____________________________
      (on|off)
   Forking Option
      Number of endpoints request sent to = ___________
         (1 means forking is not enabled)
      Type of forking = _______________________________
         (serial|parallel)

   Note: Total Sessions Attempted is used in the calculation of the
   Session Establishment Performance ([I-D.sip-bench-term], Section
   3.4.5).  It is the number of session attempts ([I-D.sip-bench-term],
   Section 3.1.6) that will be made over the duration of the test.

5.2.  Device Benchmarks for IS

   Registration Rate = _______________________________
      (registrations per second)
   Re-registration Rate = ____________________________
      (registrations per second)
   Session Capacity = ________________________________
      (sessions)
   Session Overload Capacity = _______________________
      (sessions)
   Session Establishment Rate = ______________________
      (sessions per second)
   Session Establishment Performance = _______________
      (total established sessions/total sessions attempted) (no units)
   Session Attempt Delay = ___________________________
      (seconds)

5.3.  Device Benchmarks for NS

   IM Rate = _______________________________ (IM messages per second)

6.  Test Cases

6.1.  Baseline Session Establishment Rate

   Objective:
      To benchmark the baseline Session Establishment Rate of the test
      bed, with no DUT/SUT present and zero failures.
   Procedure:
      1.  Configure the test bed in the test topology shown in Figure 4.
      2.  Configure Tester with a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Start Tester to initiate SIP Session establishment across the
          test bed.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
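   The search in steps 3 through 7 (halve the Session Attempt Rate after
   any failure, raise it by 50% after a clean run) can be sketched as
   follows.  This is a non-normative illustration: `run_trial` is a
   hypothetical tester hook, and the sketch narrows the final search to
   a bisection between the best passing and worst failing rates so that
   it terminates at a stated resolution, an assumption the draft leaves
   implicit:

```python
def find_session_establishment_rate(run_trial, start_rate=100.0,
                                    max_attempts=100_000,
                                    resolution=1.0):
    """Sketch of the rate search in Section 6.1, steps 3-7.

    run_trial(rate, max_attempts) is a hypothetical tester hook that
    offers `max_attempts` sessions at `rate` sessions/sec and returns
    the number of Session Attempt Failures observed.
    """
    good, bad = 0.0, None   # highest passing / lowest failing rate seen
    rate = start_rate
    while bad is None or bad - good > resolution:
        if run_trial(rate, max_attempts) > 0:
            bad = rate                  # step 5: failures -> back off
            rate = rate * 0.5 if good == 0.0 else (good + bad) / 2
        else:
            good = rate                 # step 6: clean run -> push up
            rate = rate * 1.5 if bad is None else (good + bad) / 2
    return good                         # Session Establishment Rate
```

   The same search loop recurs, with different adjustment quantities, in
   the capacity tests later in this section.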
   Expected Results: This is the scenario to obtain the maximum Session
      Establishment Rate of the test bed when no DUT/SUT is present.
      The results of this test might be used to normalize test results
      performed on different test beds or simply to better understand
      the impact of the DUT/SUT on the test bed in question.

6.2.  Session Establishment Rate

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester with a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
   Expected Results: This is the scenario to obtain the maximum Session
      Establishment Rate of the DUT/SUT.

6.3.  Session Establishment Rate with Media

   Objective:
      To benchmark the Session Establishment Rate of the SUT with zero
      failures when Associated Media is included in the benchmark test.
   Procedure:
      1.  Configure the SUT in the test topology shown in Figure 2 or 3.
      2.  Configure Tester for a Session Attempt Rate = 100 SPS, maximum
          Session Attempts = 100,000 and Media Streams Per Session = 1.
          The rate of offered load for each media stream SHOULD be

             Offered Load per Media Stream =
                Throughput / Maximum Sessions Attempted    (Equation 1)

          where Throughput is defined in [RFC2544].
      3.
          Start Tester to initiate SIP Session establishment with the
          SUT and transmit media through the SUT to a destination other
          than the server.
      4.  At the Tester measure Session Attempt Failures, total
          Established Sessions, and Packet Loss [RFC2544] of the media.
      5.  If a Session Attempt Failure or Packet Loss is recorded then
          reduce the Session Attempt Rate configured on the Tester by
          50%.
      6.  If no Session Attempt Failure or Packet Loss is recorded then
          increase the Session Attempt Rate configured on the Tester by
          50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
      8.  Repeat steps 1 through 7 for multimedia in which Media Streams
          Per Session = 2.
   Expected Results: Session Establishment Rate results obtained with
      Associated Media with any number of media streams per SIP session
      are expected to be identical to the Session Establishment Rate
      results obtained without media in the case where the server is
      running on a platform separate from the platform on which the
      Media Relay, NAT or Firewall is running.  Session Establishment
      Rate results obtained with Associated Media may be lower than
      those obtained without media in the case where the server and the
      NAT, Firewall or Media Relay are running on the same platform.

6.4.  Session Establishment Rate with Loop Detection Enabled

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when the Loop Detection option is enabled.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for a Session Attempt Rate = 100 SPS, maximum
          Session Attempts = 100,000 and Media Streams Per Session = 0.
      3.  Turn on the Loop Detection option in the DUT or SUT.
      4.  Start Tester to initiate SIP Session establishment with the
          DUT.
      5.
          Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      6.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      7.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Session Establishment Rate
          is obtained and recorded.
   Expected Results: Session Establishment Rate results obtained with
      Loop Detection may be lower than those obtained without Loop
      Detection enabled.

6.5.  Session Establishment Rate with Forking

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when the Forking Option is enabled.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for a Session Attempt Rate = 100 SPS, maximum
          Session Attempts = 100,000 and Media Streams Per Session = 0.
      3.  Set the number of endpoints that will receive the forked
          invitation to a value of 2 or more (subsequent tests may
          increase this value at the discretion of the tester.)
      4.  Start Tester to initiate SIP Session establishment with the
          DUT.
      5.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      6.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      7.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Session Establishment Rate
          is obtained and recorded.
   Expected Results: Session Establishment Rate results obtained with
      Forking may be lower than those obtained without Forking enabled.

6.6.
Session Establishment Rate with Forking and Loop Detection

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when both the Forking and Loop Detection Options are
      enabled.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for a Session Attempt Rate = 100 SPS, maximum
          Session Attempts = 100,000 and Media Streams Per Session = 0.
      3.  Enable the Loop Detection Option on the DUT.
      4.  Set the number of endpoints that will receive the forked
          invitation to a value of 2 or more (subsequent tests may
          increase this value at the discretion of the tester.)
      5.  Start Tester to initiate SIP Session establishment with the
          DUT.
      6.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      7.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      8.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      9.  Repeat steps 5 through 8 until the Session Establishment Rate
          is obtained and recorded.
   Expected Results: Session Establishment Rate results obtained with
      Forking and Loop Detection may be lower than those obtained with
      only Forking or Loop Detection enabled.

6.7.  Session Establishment Rate with TLS Encrypted SIP

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when using TLS encrypted SIP.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for SIP over TCP, enable TLS, Session Attempt
          Rate = 100 SPS, maximum Session Attempts = 100,000 and Media
          Streams Per Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.
          Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
   Expected Results: Session Establishment Rate results obtained with
      TLS Encrypted SIP may be lower than those obtained with plaintext
      SIP.

6.8.  Session Establishment Rate with IPsec Encrypted SIP

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when using IPsec encrypted SIP.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for SIP over TCP, enable IPsec, Session
          Attempt Rate = 100 SPS, maximum Session Attempts = 100,000 and
          Media Streams Per Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
   Expected Results: Session Establishment Rate results obtained with
      IPsec Encrypted SIP may be lower than those obtained with
      plaintext SIP.

6.9.  Session Establishment Rate with SIP Flooding

   Objective:
      To benchmark the Session Establishment Rate of the SUT with zero
      failures when SIP Flooding is occurring.
   Procedure:
      1.
          Configure the DUT in the test topology shown in Figure 1 or
          the SUT as shown in Figure 2.
      2.  Configure Tester for SIP over UDP with a Session Attempt Rate
          = 100 SPS, maximum Session Attempts = 100,000, Associated
          Media Streams Per Session = 0, and SIP INVITE Message Flood =
          500 per second.
      3.  Start Tester to initiate SIP Session establishment with the
          SUT and a SIP Flood targeted at the Server.
      4.  At the Tester measure Session Attempt Failures, total
          Established Sessions, and Packet Loss [RFC2544] of the media.
      5.  If a Session Attempt Failure or Packet Loss is recorded then
          reduce the Session Attempt Rate configured on the Tester by
          50%.
      6.  If no Session Attempt Failure or Packet Loss is recorded then
          increase the Session Attempt Rate configured on the Tester by
          50%.
      7.  Repeat steps 3 through 6 until the Session Establishment Rate
          is obtained and recorded.
      8.  Repeat steps 1 through 7 with SIP INVITE Message Flood = 1000
          per second.
   Expected Results: Session Establishment Rate results obtained with
      SIP Flooding may be degraded.

6.10.  Maximum Registration Rate

   Objective:
      To benchmark the maximum registration rate of the DUT/SUT with
      zero failures.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester with a Registration Rate = 100 SPS and
          maximum registrations attempted = 100,000.
      3.  Set the registration timeout value to at least 3600 seconds.
      4.  At the Tester measure failed registration attempts, total
          registrations and packet loss.
      5.  If a Failed Registration Attempt or Packet Loss is recorded
          then reduce the Attempted Registration Rate configured on the
          Tester by 50%.
      6.  If no Failed Registration or Packet Loss is recorded then
          increase the Attempted Registration Rate configured on the
          Tester by 50%.
      7.
          Repeat steps 4 through 6 until all registrations have
          succeeded.  The resulting rate is recorded as the Maximum
          Registration Rate.
   Expected Results:

6.11.  Maximum Re-Registration Rate

   Objective:
      To benchmark the maximum re-registration rate of the DUT/SUT with
      zero failures.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Execute the test detailed in Section 6.10 to register the
          endpoints with the registrar.  The rest of the steps below
          MUST be performed at least 5 minutes after, but no more than
          15 minutes after, the test performed in Section 6.10.
      3.  Configure Tester for an attempted Registration Rate = 100 SPS
          and maximum registrations attempted = 100,000.
      4.  Configure Tester to re-register the same address-of-records
          that were registered in Section 6.10.
      5.  At the Tester measure failed registration attempts, total
          registrations and packet loss.
      6.  If a Failed Registration Attempt or Packet Loss is recorded
          then reduce the Attempted Registration Rate configured on the
          Tester by 50%.
      7.  If no Failed Registration or Packet Loss is recorded then
          increase the Attempted Registration Rate configured on the
          Tester by 50%.
      8.  Repeat steps 5 through 7 until all re-registrations have
          succeeded.  The resulting rate is recorded as the Maximum
          Re-Registration Rate.
   Expected Results: The rate should be at least equal to but not more
      than the result of Section 6.10.

6.12.  Maximum IM Rate

   Objective:
      To benchmark the maximum IM rate of the SUT with zero failures.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for an IM Rate = 100 SPS and Maximum IM
          Attempted = 100,000.
      3.  At the Tester measure Failed IM Attempts, Total IM and Packet
          Loss.
      4.
          If a Failed IM Attempt or Packet Loss is recorded then reduce
          the Attempted IM Rate configured on the Tester by 50%.
      5.  If no Failed IM or Packet Loss is recorded then increase the
          Attempted IM Rate configured on the Tester by 50%.
      6.  Repeat steps 3 through 5 until the Maximum IM Rate is obtained
          and recorded.
   Expected Results:

6.13.  Session Capacity without Media

   Objective:
      To benchmark the Session Capacity of the SUT without Associated
      Media.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for a Session Attempt Rate = Session
          Establishment Rate, maximum Session Attempts = 10,000 and
          Media Streams Per Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures, total Established Sessions,
          and Packet Loss [RFC2544] at the Tester.
      5.  If a Session Attempt Failure or Packet Loss is recorded then
          reduce the maximum Session Attempts configured on the Tester
          by 5,000.
      6.  If no Session Attempt Failure or Packet Loss is recorded then
          increase the maximum Session Attempts configured on the Tester
          by 10,000.
      7.  Repeat steps 3 through 6 until the Session Capacity is
          obtained and recorded.
   Expected Results: This is the scenario to obtain the maximum Session
      Capacity of the DUT/SUT.

6.14.  Session Capacity with Media

   Objective:
      To benchmark the session capacity of the DUT/SUT with Associated
      Media.
   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          SUT as shown in Figures 2 or 3.
      2.  Configure Tester for a Session Attempt Rate = 100 SPS, Session
          Duration = 30 sec, maximum Session Attempts = 100,000 and
          Media Streams Per Session = 1.
714            NOTE: The total offered load to the DUT/SUT SHOULD be equal
715            to the Throughput of the DUT/SUT as defined in [RFC2544].
716            The offered load to the DUT/SUT for each media stream
717            SHOULD be equal to
718            Throughput/Maximum Session Attempts.
719        3.  Start Tester to initiate SIP Session establishment with the
720            SUT and transmit media through the SUT to a destination other
721            than the server.
722        4.  Measure Session Attempt Failures and total Established
723            Sessions at the Tester.
724        5.  If a Session Attempt Failure is recorded then reduce the
725            maximum Session Attempts configured on the Tester by 5,000.
726        6.  If no Session Attempt Failure is recorded then increase the
727            maximum Session Attempts configured on the Tester by 10,000.
728        7.  Repeat steps 3 through 6 until the Session Capacity is
729            obtained and recorded.
730     Expected Results:  Session Capacity results obtained with Associated
731        Media with any number of media streams per SIP session will be
732        identical to the Session Capacity results obtained without media.

734  6.15.  Session Capacity with Media and a Media Relay/NAT and/or Firewall

736     Objective:
737        To benchmark the Session Capacity of the SUT with
738        Associated Media.
739     Procedure:
740        1.  Configure the SUT in the test topology shown in Figure 3.
741        2.  Configure Tester for a Session Attempt Rate = 100 SPS, Session
742            Duration = 30 sec, maximum Session Attempts = 100,000 and
743            Media Streams Per Session = 1.
744            NOTE: The offered load for each media stream SHOULD be as
745            in Equation 1.

747        3.  Start Tester to initiate SIP Session establishment with the
748            SUT and transmit media through the SUT to a destination other
749            than the server.
750        4.  Measure Session Attempt Failures and total Established
751            Sessions at the Tester.
752        5.  If a Session Attempt Failure is recorded then reduce the
753            maximum Session Attempts configured on the Tester by 5,000.
754        6.
If no Session Attempt Failure is recorded then increase the
755            maximum Session Attempts configured on the Tester by 10,000.
756        7.  Repeat steps 3 through 6 until the Session Capacity is
757            obtained and recorded.
758     Expected Results:  Session Capacity results obtained with Associated
759        Media with any number of media streams per SIP session may be
760        lower than the Session Capacity without Media result if the Media
761        Relay, NAT or Firewall is sharing a platform with the server.

763  7.  IANA Considerations

765     This document does not require any IANA considerations.

767  8.  Security Considerations

769     Documents of this type do not directly affect the security of the
770     Internet or corporate networks as long as benchmarking is not
771     performed on devices or systems connected to production networks.
772     Security threats and how to counter them in SIP and the media layer
773     are discussed in RFC3261, RFC3550, and RFC3711 and various other
774     drafts.  This document attempts to formalize a common set of
775     methodologies for benchmarking the performance of SIP devices in a
776     lab environment.

778  9.  Acknowledgments

780     The authors would like to thank Keith Drage and Daryl Malas for their
781     contributions to this document.

783  10.  References

785  10.1.  Normative References

787     [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
788                Requirement Levels", BCP 14, RFC 2119, March 1997.

790     [RFC2544]  Bradner, S. and J. McQuaid, "Benchmarking Methodology for
791                Network Interconnect Devices", RFC 2544, March 1999.

793     [I-D.sip-bench-term]
794                Poretsky, S., Gurbani, V., and C. Davids, "SIP Performance
795                Benchmarking Terminology",
796                draft-ietf-bmwg-sip-bench-term-02 (work in progress),
797                July 2010.

799  10.2.  Informative References

801     [RFC3261]  Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston,
802                A., Peterson, J., Sparks, R., Handley, M., and E.
803                Schooler, "SIP: Session Initiation Protocol", RFC 3261,
804                June 2002.
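Appendix A.  Informative Example of the Search Procedure

   The procedures of Sections 6.10 through 6.15 share one pattern: run a
   trial at an attempted rate, halve the rate after any failure or packet
   loss, and increase it by 50% after a clean run, repeating until the
   maximum failure-free value is found.  The sketch below is informative
   only and is not part of the methodology; the `run_trial` callback is a
   hypothetical stand-in for whatever interface a real Tester exposes.

```python
def find_maximum_rate(run_trial, initial_rate=100.0, iterations=20):
    """Search for the highest attempt rate with zero failures.

    run_trial(rate) -> True when the trial completed with no failed
    attempts and no packet loss, False otherwise.  Per the methodology,
    the rate is reduced by 50% after a failure and increased by 50%
    after a clean run; the best failure-free rate seen is recorded.
    """
    rate = initial_rate
    best = 0.0
    for _ in range(iterations):
        if run_trial(rate):
            best = max(best, rate)  # clean run: record it, then push higher
            rate *= 1.5             # "increase ... by 50%"
        else:
            rate *= 0.5             # "reduce ... by 50%"
    return best


def per_stream_offered_load(throughput, max_session_attempts):
    """Equation 1 (see NOTE in Section 6.14): the offered load for each
    media stream SHOULD equal Throughput / Maximum Session Attempts."""
    return throughput / max_session_attempts
```

   For example, against a Tester whose device sustains at most 150 SPS,
   `find_maximum_rate(tester)` converges on 150 from the initial 100 SPS.
   The capacity searches of Sections 6.13 through 6.15 follow the same
   shape with fixed step sizes (reduce by 5,000 / increase by 10,000)
   instead of percentages.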
806  Authors' Addresses

808     Scott Poretsky
809     Allot Communications
810     300 TradeCenter, Suite 4680
811     Woburn, MA 08101
812     USA

814     Phone: +1 508 309 2179
815     Email: sporetsky@allot.com

817     Vijay K. Gurbani
818     Bell Laboratories, Alcatel-Lucent
819     1960 Lucent Lane
820     Rm 9C-533
821     Naperville, IL 60566
822     USA

824     Phone: +1 630 224 0216
825     Email: vkg@bell-labs.com

827     Carol Davids
828     Illinois Institute of Technology
829     201 East Loop Road
830     Wheaton, IL 60187
831     USA

833     Phone: +1 630 682 6024
834     Email: davids@iit.edu