Benchmarking Methodology Working Group                         C. Davids
Internet-Draft                         Illinois Institute of Technology
Expires: September 15, 2011                                   V. Gurbani
                                      Bell Laboratories, Alcatel-Lucent
                                                             S. Poretsky
                                                    Allot Communications
                                                          March 14, 2011

     Terminology for Benchmarking Session Initiation Protocol (SIP)
                           Networking Devices
                    draft-ietf-bmwg-sip-bench-term-03

Abstract

   This document provides a terminology for benchmarking SIP
   performance in networking devices.  Terms are included for test
   components, test setup parameters, and performance benchmark metrics
   for black-box benchmarking of SIP networking devices.  The
   performance benchmark metrics are obtained for the SIP control plane
   and media plane.  The terms are intended for use in a companion
   methodology document for complete performance characterization of a
   device in a variety of conditions, making it possible to compare the
   performance of different devices.  It is critical to provide test
   setup parameters and a methodology document for SIP performance
   benchmarking because SIP allows a wide range of configuration and
   operational conditions that can influence performance benchmark
   measurements.  Terminology and methodology standards are necessary
   to ensure that reported benchmarks have consistent definitions and
   were obtained following the same procedures.  Benchmarks can be
   applied to compare the performance of a variety of SIP networking
   devices.

Status of this Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current Internet-
   Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.
   It is inappropriate to use Internet-Drafts as reference material or
   to cite them other than as "work in progress."

   This Internet-Draft will expire on September 15, 2011.

Copyright Notice

   Copyright (c) 2011 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.  Code Components extracted from this
   document must include Simplified BSD License text as described in
   Section 4.e of the Trust Legal Provisions and are provided without
   warranty as described in the Simplified BSD License.

Table of Contents

   1.  Terminology
   2.  Introduction
     2.1.  Scope
     2.2.  Benchmarking Models
   3.  Term Definitions
     3.1.  Protocol Components
       3.1.1.   Session
       3.1.2.   Signaling Plane
       3.1.3.   Media Plane
       3.1.4.   Associated Media
       3.1.5.   Overload
       3.1.6.   Session Attempt
       3.1.7.   Established Session
       3.1.8.   Invite-initiated Session (IS)
       3.1.9.   Non-INVITE-initiated Session (NS)
       3.1.10.  Session Attempt Failure
       3.1.11.  Standing Sessions Count
     3.2.  Test Components
       3.2.1.  Emulated Agent
       3.2.2.  Signaling Server
       3.2.3.  SIP-Aware Stateful Firewall
       3.2.4.  SIP Transport Protocol
     3.3.  Test Setup Parameters
       3.3.1.  Session Attempt Rate
       3.3.2.  IS Media Attempt Rate
       3.3.3.  Establishment Threshold Time
       3.3.4.  Session Duration
       3.3.5.  Media Packet Size
       3.3.6.  Media Offered Load
       3.3.7.  Media Session Hold Time
       3.3.8.  Loop Detection Option
       3.3.9.  Forking Option
     3.4.  Benchmarks
       3.4.1.  Registration Rate
       3.4.2.  Session Establishment Rate
       3.4.3.  Session Capacity
       3.4.4.  Session Overload Capacity
       3.4.5.  Session Establishment Performance
       3.4.6.  Session Attempt Delay
       3.4.7.  IM Rate
   4.  IANA Considerations
   5.  Security Considerations
   6.  Acknowledgments
   7.  References
     7.1.  Normative References
     7.2.  Informational References
   Appendix A.  White Box Benchmarking Terminology
   Authors' Addresses

1.  Terminology

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
   document are to be interpreted as described in BCP 14, RFC 2119
   [RFC2119].  RFC 2119 defines the use of these key words to help make
   the intent of standards track documents as clear as possible.  While
   this document uses these key words, it is not a standards track
   document.  The term Throughput is defined in RFC 2544 [RFC2544].

   For the sake of clarity and continuity, this document adopts the
   template for definitions set out in Section 2 of RFC 1242 [RFC1242].
   Definitions are indexed and grouped together for ease of reference.

   This document uses existing terminology defined in other BMWG work.
   Examples include, but are not limited to:

   o  Device Under Test (DUT) (c.f. Section 3.1.1 of RFC 2285
      [RFC2285]).
   o  System Under Test (SUT) (c.f. Section 3.1.2 of RFC 2285
      [RFC2285]).

   Many commonly used SIP terms in this document are defined in RFC
   3261 [RFC3261].  For convenience, the most important of these are
   reproduced below.  Use of these terms in this document is consistent
   with their corresponding definitions in [RFC3261].

   o  Call Stateful: A proxy is call stateful if it retains state for a
      dialog from the initiating INVITE to the terminating BYE request.
      A call stateful proxy is always transaction stateful, but the
      converse is not necessarily true.
   o  Stateful Proxy: A logical entity that maintains the client and
      server transaction state machines defined by [RFC3261] during
      the processing of a request, also known as a transaction stateful
      proxy.
      The behavior of a stateful proxy is further defined in Section 16
      of [RFC3261].  A (transaction) stateful proxy is not the same as
      a call stateful proxy.
   o  Stateless Proxy: A logical entity that does not maintain the
      client or server transaction state machines defined in [RFC3261]
      when it processes requests.  A stateless proxy forwards every
      request it receives downstream and every response it receives
      upstream.
   o  Back-to-Back User Agent: A back-to-back user agent (B2BUA) is a
      logical entity that receives a request and processes it as a user
      agent server (UAS).  In order to determine how the request should
      be answered, it acts as a user agent client (UAC) and generates
      requests.  Unlike a proxy server, it maintains dialog state and
      must participate in all requests sent on the dialogs it has
      established.  Since it is a concatenation of a UAC and a UAS, no
      explicit definitions are needed for its behavior.
   o  Loop: A request that arrives at a proxy, is forwarded, and later
      arrives back at the same proxy.  When it arrives the second time,
      its Request-URI is identical to the first time, and other header
      fields that affect proxy operation are unchanged, so that the
      proxy will make the same processing decision on the request that
      it made the first time.  Looped requests are errors, and the
      procedures for detecting and handling them are described by the
      protocol.

2.  Introduction

   Service Providers are now planning Voice over IP (VoIP) and
   multimedia network deployments using the IETF-developed Session
   Initiation Protocol (SIP) [RFC3261].  SIP is a signaling protocol
   originally intended to be used for the dynamic establishment,
   disconnection, and modification of streams of media between end
   users.  As it has evolved, it has been adopted for use in a growing
   number of applications and features.  Many of these result in the
   creation of a media stream, but some do not.
   Instead, they create other services tailored to the end users'
   immediate needs or preferences.  The set of benchmarking terms
   provided in this document is intended for use with any SIP-enabled
   device performing SIP functions in the interior of the network.  The
   performance of end-user devices is outside the scope of this
   document.

   VoIP with SIP has led to the development of new networking devices,
   including SIP Servers, Session Border Controllers (SBCs), Back-to-
   Back User Agents (B2BUAs), and SIP-Aware Stateful Firewalls.  The
   mix of voice and IP functions in these various devices has produced
   inconsistencies in vendor-reported performance metrics and has
   caused confusion in the service provider community.  SIP allows a
   wide range of configuration and operational conditions that can
   influence performance benchmark measurements.  When the device under
   test terminates or relays both media and signaling, for example, it
   is important to be able to correlate a signaling measurement with
   the media plane measurements to determine the system performance.
   As devices and their functions proliferate, the need for a
   consistent set of metrics to compare their performance becomes
   increasingly urgent.  This document and its companion methodology
   document [I-D.ietf-bmwg-sip-bench-meth] provide a set of black-box
   benchmarks for describing and comparing the performance of devices
   that incorporate the SIP User Agent Client and Server functions and
   that operate in the network's core.

   The definition of SIP performance benchmarks necessarily includes
   definitions of Test Setup Parameters and a test methodology.  These
   enable the Tester to perform benchmarking tests on different devices
   and to achieve comparable and repeatable results.  This document
   provides a common set of well-defined terms for Test Components,
   Test Setup Parameters, and Benchmarks.
   All the benchmarks defined are black-box measurements of the SIP
   control (signaling) plane.  The Test Setup Parameters and Benchmarks
   defined in this document are intended for use with the companion
   methodology document.  Benchmarks of internal DUT characteristics
   (also known as white-box benchmarks), such as Session Attempt
   Arrival Rate, which is measured at the DUT, are described in
   Appendix A to allow additional characterization of DUT behavior with
   different distribution models.

2.1.  Scope

   The scope of this work item is summarized as follows:

   o  This terminology document describes SIP signaling (control-plane)
      performance benchmarks for black-box measurements of SIP
      networking devices.  Stress and debug scenarios are not addressed
      in this work item.
   o  The DUT must be RFC 3261-capable network equipment.  This may be
      a Registrar, Redirect Server, Stateless Proxy, or Stateful Proxy.
      A DUT MAY also include B2BUA or SBC functionality (this is
      referred to as the "Signaling Server").  The DUT MAY be a multi-
      port SIP-to-switched-network gateway implemented as a SIP UAC or
      UAS.
   o  The DUT MAY have an internal SIP Application Level Gateway (ALG),
      firewall, and/or a Network Address Translator (NAT).  This is
      referred to as the "SIP-Aware Stateful Firewall."
   o  The DUT or SUT MUST NOT be end-user equipment, such as a personal
      digital assistant, a computer-based client, or a user terminal.
   o  The Tester acts as multiple "Emulated Agents" that initiate (or
      respond to) SIP messages as session endpoints and source (or
      receive) associated media for established connections.
   o  Control signaling in the presence of media:
      *  The media performance is not benchmarked in this work item.
      *  It is RECOMMENDED that control plane benchmarks are performed
         with media present, but this is optional.
      *  The SIP INVITE requests MUST include the SDP body.
      *  The type of DUT dictates whether the associated media streams
         traverse the DUT or SUT.  Both scenarios are within the scope
         of this work item.
      *  SIP is frequently used to create media streams; the control
         plane and media plane are treated as orthogonal to each other
         in this document.  While many devices support the creation of
         media streams, benchmarks that measure the performance of
         these streams are outside the scope of this document and its
         companion methodology document
         [I-D.ietf-bmwg-sip-bench-meth].  Tests may be performed with
         or without the creation of media streams.  The presence or
         absence of media streams MUST be noted as a condition of the
         test, as the performance of SIP devices may vary accordingly.
         Even if media is used during benchmarking, only the SIP
         performance will be benchmarked, not the media performance or
         quality.
   o  Both INVITE and non-INVITE scenarios (such as Instant Messages,
      or IM) are addressed in this document.  However, benchmarking SIP
      presence is not a part of this work item.
   o  Different transport mechanisms -- such as UDP, TCP, SCTP, or TLS
      -- may be used; however, the specific transport mechanism MUST be
      noted as a condition of the test, as the performance of SIP
      devices may vary accordingly.
   o  Looping and forking options are also considered, since they
      impact processing at SIP proxies.
   o  REGISTER and INVITE requests may be challenged or remain
      unchallenged for authentication purposes, as this may impact the
      performance benchmarks.  Any observable performance degradation
      due to authentication is of interest to the SIP community.
      Whether or not the REGISTER and INVITE requests are challenged is
      a condition of the test and will be recorded and reported.
   o  Re-INVITE requests are not considered in scope of this work item.
   o  Only session establishment is considered for the performance
      benchmarks.
      Session disconnect is not considered in the scope of this work
      item.
   o  SIP Overload [I-D.ietf-soc-overload-design] is within the scope
      of this work item.  We test to failure and then can continue to
      observe and record the behavior of the system after failures are
      recorded.  The cause of failure is not within the scope of this
      work.  We note the failure and may continue to test until a
      different failure or condition is encountered.  Considerations on
      how to handle overload are deferred to work progressing in the
      SOC working group [I-D.ietf-soc-overload-control].  Vendors are,
      of course, free to implement their specific overload control
      behavior as the expected test outcome if it is different from the
      IETF recommendations.  However, such behavior MUST be documented
      and interpreted appropriately across multiple vendor
      implementations.  This will make it more meaningful to compare
      the performance of different SIP overload implementations.
   o  IMS-specific scenarios are not considered, but test cases can be
      applied with 3GPP-specific SIP signaling and the P-CSCF as a DUT.

2.2.  Benchmarking Models

   This section shows the models to be used when benchmarking the SIP
   performance of a networking device.  Figure 1 shows the DUT playing
   the role of a user agent client (UAC), initiating requests and
   absorbing responses.  This model can be used to obtain a baseline
   performance for the DUT acting as a UAC without associated media.

   +--------+  Signaling request   +--------+
   |        +--------------------->|        |
   |  DUT   |                      | Tester |
   |        |  Signaling response  |   EA   |
   |        |<---------------------+        |
   +--------+                      +--------+

      Figure 1: Baseline performance for DUT acting as a user agent
                     client without associated media

   Figure 2 shows the DUT playing the role of a user agent server
   (UAS), absorbing the requests and sending responses.
   This model can be used to obtain a baseline performance for the DUT
   acting as a UAS without associated media.

   +--------+  Signaling request   +--------+
   |        +--------------------->|        |
   | Tester |                      |  DUT   |
   |   EA   |      Response        |        |
   |        |<---------------------+        |
   +--------+                      +--------+

      Figure 2: Baseline performance for DUT acting as a user agent
                     server without associated media

   Figure 3 shows the DUT playing the role of a user agent client
   (UAC), initiating requests and absorbing responses.  This model can
   be used to obtain a baseline performance for the DUT acting as a UAC
   with associated media.

   +--------+  Signaling request   +--------+
   |        +--------------------->|        |
   |  DUT   |                      | Tester |
   |        |  Signaling response  |   EA   |
   |        |<---------------------+        |
   |        |<======== Media =====>|        |
   +--------+                      +--------+

      Figure 3: Baseline performance for DUT acting as a user agent
                      client with associated media

   Figure 4 shows the DUT playing the role of a user agent server
   (UAS), absorbing the requests and sending responses.  This model can
   be used to obtain a baseline performance for the DUT acting as a UAS
   with associated media.

   +--------+  Signaling request   +--------+
   |        +--------------------->|        |
   | Tester |                      |  DUT   |
   |   EA   |      Response        |        |
   |        |<---------------------+        |
   |        |<======== Media =====>|        |
   +--------+                      +--------+

      Figure 4: Baseline performance for DUT acting as a user agent
                      server with associated media

   Figure 5 shows that the Tester acts as the initiating and responding
   Emulated Agents as the DUT/SUT forwards Session Attempts.
   +--------+   Session   +--------+   Session   +--------+
   |        |   Attempt   |        |   Attempt   |        |
   |        |<------------+        |<------------+        |
   |        |             |        |             |        |
   |        |  Response   |        |  Response   |        |
   | Tester +------------>|  DUT   +------------>| Tester |
   |  (EA)  |             |        |             |  (EA)  |
   |        |             |        |             |        |
   +--------+             +--------+             +--------+

          Figure 5: DUT/SUT performance benchmark for session
                       establishment without media

   Figure 6 is used when performing those same benchmarks with
   Associated Media traversing the DUT/SUT.

   +--------+   Session   +--------+   Session   +--------+
   |        |   Attempt   |        |   Attempt   |        |
   |        |<------------+        |<------------+        |
   |        |             |        |             |        |
   |        |  Response   |        |  Response   |        |
   | Tester +------------>|  DUT   +------------>| Tester |
   |  (EA)  |             |        |             |  (EA)  |
   |        |    Media    |        |    Media    |        |
   |        |<===========>|        |<===========>|        |
   +--------+             +--------+             +--------+

          Figure 6: DUT/SUT performance benchmark for session
                establishment with media traversing the DUT

   Figure 7 is to be used when performing those same benchmarks with
   Associated Media, but the media does not traverse the DUT/SUT.
   Again, the benchmarking of the media is not within the scope of this
   work item.  The SIP control signaling is benchmarked in the presence
   of Associated Media to determine whether the SDP body of the
   signaling and the handling of media impact the performance of the
   DUT/SUT.
   +--------+   Session   +--------+   Session   +--------+
   |        |   Attempt   |        |   Attempt   |        |
   |        |<------------+        |<------------+        |
   |        |             |        |             |        |
   |        |  Response   |        |  Response   |        |
   | Tester +------------>|  DUT   +------------>| Tester |
   |  (EA)  |             |        |             |  (EA)  |
   |        |             |        |             |        |
   +--------+             +--------+             +--------+
      /|\                                           /|\
       |                    Media                    |
       +=============================================+

          Figure 7: DUT/SUT performance benchmark for session
               establishment with media external to the DUT

   Figure 8 is used when performing benchmarks that require one or more
   intermediaries to be in the signaling path.  The intent is to gather
   benchmarking statistics with a series of DUTs in place.  In this
   topology, the media is delivered end-to-end and does not traverse
   the DUT.

                               SUT
    '-------------------------^^^^^^^^------------------------`
   /                                                           \
   +------+  Session  +---+  Session  +---+  Session  +------+
   |      |  Attempt  |   |  Attempt  |   |  Attempt  |      |
   |      |<----------+   |<----------+   |<----------+      |
   |      |           |   |           |   |           |      |
   |      |  Response |   |  Response |   |  Response |      |
   |Tester+---------->|DUT+---------->|DUT+---------->|Tester|
   |      |           |   |           |   |           |      |
   |      |           |   |           |   |           |      |
   +------+           +---+           +---+           +------+
     /|\                                                /|\
      |                       Media                      |
      +==================================================+

          Figure 8: DUT/SUT performance benchmark for session
           establishment with multiple DUTs and end-to-end media

   Figure 9 is used when performing benchmarks that require one or more
   intermediaries to be in the signaling path.  The intent is to gather
   benchmarking statistics with a series of DUTs in place.  In this
   topology, the media is delivered hop-by-hop through each DUT.
                               SUT
    '-------------------------^^^^^^^^------------------------`
   /                                                           \
   +------+  Session  +---+  Session  +---+  Session  +------+
   |      |  Attempt  |   |  Attempt  |   |  Attempt  |      |
   |      |<----------+   |<----------+   |<----------+      |
   |      |           |   |           |   |           |      |
   |      |  Response |   |  Response |   |  Response |      |
   |Tester+---------->|DUT+---------->|DUT+---------->|Tester|
   |      |           |   |           |   |           |      |
   |      |           |   |           |   |           |      |
   |      |<=========>|   |<=========>|   |<=========>|      |
   +------+   Media   +---+   Media   +---+   Media   +------+

          Figure 9: DUT/SUT performance benchmark for session
           establishment with multiple DUTs and hop-by-hop media

   Figure 10 illustrates the SIP signaling for an Established Session.
   The Tester acts as the Emulated Agent(s) and initiates a Session
   Attempt with the DUT/SUT.  When the Emulated Agent (EA) receives a
   200 OK from the DUT/SUT, that session is considered to be an
   Established Session.  The illustration indicates three states of the
   session being created by the EA: Attempting, Established, and
   Disconnecting.  Sessions can be one of two types: Invite-initiated
   Session (IS) or Non-INVITE-initiated Session (NS).  Failure of the
   DUT/SUT to successfully respond within the Establishment Threshold
   Time is considered a Session Attempt Failure.  SIP INVITE messages
   MUST include the SDP body to specify the Associated Media.  Use of
   Associated Media, to be sourced from the EA, is optional.  When
   Associated Media is used, it may traverse the DUT/SUT, depending
   upon the type of DUT/SUT.  The Associated Media is shown in Figure
   10 as "Media" connected to media ports M1 and M2 on the EA.  After
   the EA sends a BYE, the session disconnects.  Performance test cases
   for session disconnects are not considered in this work item (the
   BYE request is shown for completeness).
        EA                DUT/SUT        M1        M2
         |                   |            |         |
         |      INVITE       |            |         |
    -----+------------------>|            |         |
         |                   |            |         |
     Attempting              |            |         |
         |      200 OK       |            |         |
    -----+<------------------|            |         |
         |                   |            |         |
         |                   |            |         |
         |                   |            |  Media  |
     Established             |            |<=======>|
         |                   |            |         |
         |       BYE         |            |         |
    -----+------------------>|            |         |
         |                   |            |         |
     Disconnecting           |            |         |
         |      200 OK       |            |         |
    -----+<------------------|            |         |
         |                   |            |         |

                 Figure 10: Basic SIP test topology

3.  Term Definitions

3.1.  Protocol Components

3.1.1.  Session

   Definition:
      The combination of signaling and media messages and processes
      that enable two or more participants to communicate.

   Discussion:
      SIP messages in the signaling plane can be used to create and
      manage applications for one or more end users.  SIP is often used
      to create and manage media streams in support of applications.  A
      session always has a signaling component and may have a media
      component.  Therefore, a Session may be defined as signaling only
      or as a combination of signaling and media (c.f. Associated
      Media, Section 3.1.4).  SIP includes definitions of a Call-ID, a
      dialog, and a transaction that support this application.  A
      growing number of usages and applications do not require the
      creation of associated media.  The first such usage was the
      REGISTER.  Applications that use the MESSAGE and SUBSCRIBE/NOTIFY
      methods also do not require SIP to manage media streams.  The
      terms Invite-initiated Session (IS) and Non-INVITE-initiated
      Session (NS) are used to distinguish between these different
      usages.

      A Session, in the context of this document, is considered to be a
      vector with three components:

      1.  A component in the signaling plane (SIP messages), sess.sig;
      2.  A media component in the media plane (RTP and SRTP streams,
          for example), sess.med (which may be null);
      3.  A control component in the media plane (RTCP messages, for
          example), sess.medc (which may be null).

      An IS is expected to have non-null sess.sig and sess.med
      components.  The use of control protocols in the media component
      is media dependent; thus, the expected presence or absence of
      sess.medc is media dependent and test-case dependent.  An NS is
      expected to have a non-null sess.sig component, but null sess.med
      and sess.medc components.

      Packets in the Signaling Plane and Media Plane will be handled by
      different processes within the DUT.  They will take different
      paths within a SUT.  These different processes and paths may
      produce variations in performance.  The terminology and
      benchmarks defined in this document, and the methodology for
      their use, are designed to enable comparison of the performance
      of the DUT/SUT with reference to the type of SIP-supported
      application it is handling.

      Note that one or more sessions can simultaneously exist between
      any participants.  This can be the case, for example, when the EA
      sets up both an IM and a voice call through the DUT/SUT.  These
      sessions are represented as an array, session[x].

      Sessions will be represented as a vector array with three
      components, as follows:

      session->
         session[x].sig, the signaling component;
         session[x].medc, the media control component (e.g., RTCP);
         session[x].med[y], an array of associated media streams (e.g.,
            RTP, SRTP, RTSP, MSRP).  This media component may consist
            of zero or more media streams.

      Figure 11 models the vectors of the session.

   Measurement Units:
      N/A.

   Issues:
      None.
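      The three-component session vector described above can be
      sketched as a small data structure.  This is an illustrative aid
      only, not part of the terminology; the class and field names are
      assumptions.

      ```python
      # Illustrative model of the session vector of Section 3.1.1:
      # sess.sig, sess.medc (may be null), and sess.med[y] (zero or
      # more streams).  Names are assumptions, not defined terms.
      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class Session:
          sig: str                     # sess.sig: signaling-plane component
          medc: Optional[str] = None   # sess.medc: media control (e.g., RTCP); may be null
          med: List[str] = field(default_factory=list)  # sess.med[y]: media streams

          def session_type(self) -> str:
              # An IS is expected to carry media; an NS has null media
              # components (sess.med and sess.medc).
              return "IS" if self.med else "NS"

      # One or more sessions can exist simultaneously between the same
      # participants, e.g., a voice call plus an IM through the DUT/SUT:
      sessions = [
          Session(sig="INVITE dialog", medc="RTCP", med=["RTP audio"]),
          Session(sig="MESSAGE"),
      ]
      ```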
   See Also:
      Media Plane
      Signaling Plane
      Associated Media
      Invite-initiated Session (IS)
      Non-INVITE-initiated Session (NS)

               |\
               | \
               |  \
       sess.sig|   \
               |    \
               |     \
               |      o
               |     /|
               |    / |
               |   /  |
               |  /   |
               | /    |  sess.medc
               |/_____|_______________
              /      /
             /      /
            /      /
   sess.med/_ _ _ /
          /
         /

          Figure 11: Application or session components

3.1.2.  Signaling Plane

   Definition:
      The control plane in which SIP messages [RFC3261] are exchanged
      between SIP Agents [RFC3261] to establish a connection for media
      exchange.

   Discussion:
      SIP messages are used to establish sessions in several ways:
      directly between two User Agents [RFC3261], through a Proxy
      Server [RFC3261], or through a series of Proxy Servers.  The
      Signaling Plane MUST include the Session Description Protocol
      (SDP).

      The Signaling Plane for a single Session is represented by
      session.sig.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Media Plane
      Emulated Agents

3.1.3.  Media Plane

   Definition:
      The data plane in which one or more media streams and their
      associated media control protocols are exchanged after a media
      connection has been created by the exchange of signaling messages
      in the Signaling Plane.

   Discussion:
      Media may also be known as the "bearer channel".  The Media Plane
      MUST include the media control protocol, if one is used, and the
      media stream(s).  Examples of media are audio, video, whiteboard,
      and instant messaging service.  The media stream is described in
      the SDP of the Signaling Plane.

      The media for a single Session is represented by session.med.
      The media control protocol is represented by session.medc.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Signaling Plane

3.1.4.  Associated Media

   Definition:
      Media that corresponds to an 'm' line in the SDP payload of the
      Signaling Plane.

   Discussion:
      Any media protocol MAY be used.

      For any session's signaling component, represented as
      session.sig, there may be one or multiple associated media
      streams, which are represented by a vector array,
      session.med[y]; this is referred to as the Associated Media.

   Measurement Units:
      N/A.

   Issues:
      None.

3.1.5.  Overload

   Definition:
      Overload is defined as the state where a SIP server does not have
      sufficient resources to process all incoming SIP messages
      [I-D.ietf-soc-overload-design].  The distinction between an
      overload condition and other failure scenarios is outside the
      scope of this document, which covers black-box testing.

   Discussion:
      Under overload conditions, all or a percentage of Session
      Attempts will fail due to lack of resources.  SIP server
      resources may include CPU processing capacity, network bandwidth,
      input/output queues, or disk resources.  Any combination of
      resources may be fully utilized when a SIP server (the DUT/SUT)
      is in the overload condition.  For proxy-only devices, overload
      issues will be dominated by the number of signaling messages they
      can handle in a unit of time before their throughput starts to
      drop.

      For UA-type network devices (e.g., gateways), overload must
      necessarily include both the signaling traffic and media streams.
      It is expected that the amount of signaling that a UA can handle
      is inversely proportional to the amount of media streams
      currently handled by that UA.

   Measurement Units:
      N/A.

   Issues:
      The issue of overload in SIP networks is currently a topic of
      discussion in the SIPPING WG.  The normal response to an overload
      stimulus -- sending a 503 response -- is considered inadequate,
      and new response codes and behaviors may be specified in the
      future.
      From the perspective of this document, all these responses will
      be considered to be failures.  There is thus no dependency
      between this document and the ongoing work on the treatment of
      overload failure.

3.1.6.  Session Attempt

   Definition:
      A SIP Session for which the Emulated Agent has sent the SIP
      INVITE or SUBSCRIBE/NOTIFY request and has not yet received a
      message response from the DUT/SUT.

   Discussion:
      The attempted session may be an IS or an NS.  The Session Attempt
      includes SIP INVITEs and SUBSCRIBE/NOTIFY messages.  It also
      includes all INVITEs that are rejected for lack of authentication
      information.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Session Attempt Rate
      Invite-initiated Session
      Non-Invite initiated Session

3.1.7.  Established Session

   Definition:
      A SIP session for which the Emulated Agent acting as the UE/UA
      has received a 200 OK message from the DUT/SUT.

   Discussion:
      An Established Session MAY be of type INVITE-Session (IS) or Non-
      INVITE Session (NS).

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Invite-initiated Session
      Session Attempting State
      Session Disconnecting State

3.1.8.  Invite-initiated Session (IS)

   Definition:
      A Session that is created by an exchange of messages in the
      Signaling Plane, the first of which is a SIP INVITE request.

   Discussion:
      An IS is identified by the Call-ID, To-tag, and From-tag of the
      SIP message that establishes the session.  These three fields are
      used to identify a SIP Dialog (RFC3261 [RFC3261]).  An IS may
      have an Associated Media description in the SDP body.  An IS may
      have multiple Associated Media streams.  The inclusion of media
      is test case dependent.  An IS is successfully established if the
      following two conditions are met:
      1.  session.sig is established by the end of the Establishment
          Threshold Time (c.f.
          Section 3.3.3), and
      2.  If a media session is described in the SDP body of the
          signaling message, then the media session is established by
          the end of the Establishment Threshold Time (c.f.
          Section 3.3.3).

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Non-Invite initiated Session
      Associated Media

3.1.9.  Non-INVITE-initiated Session (NS)

   Definition:
      A session that is created by an exchange of messages in the
      Signaling Plane that does not include an initial SIP INVITE
      message.

   Discussion:
      An NS is successfully established if the Session Attempt via a
      non-INVITE request results in the EA receiving a 2xx reply from
      the DUT/SUT before the expiration of the Establishment Threshold
      timer (c.f., Section 3.3.3).  An example of an NS is a session
      created by the SUBSCRIBE request.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Invite-initiated Session

3.1.10.  Session Attempt Failure

   Definition:
      A session attempt that does not result in an Established Session.

   Discussion:
      The session attempt failure may be indicated by the following
      observations at the Emulated Agent:
      1.  Receipt of a SIP 4xx, 5xx, or 6xx class response to a Session
          Attempt.
      2.  The lack of any received SIP response to a Session Attempt
          within the Establishment Threshold Time (c.f. Section 3.3.3).

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session Attempt

3.1.11.  Standing Sessions Count

   Definition:
      The number of Sessions currently established on the DUT/SUT at
      any instant.

   Discussion:
      The number of Standing Sessions is influenced by the Session
      Duration and the Session Attempt Rate.  Benchmarks MUST be
      reported with the maximum and average Standing Sessions for the
      DUT/SUT.
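As a non-normative illustration, the two reported values can be derived from the periodic samples of the DUT/SUT's established-session count; the function name and the sample counts below are hypothetical:

```python
def standing_sessions_report(samples):
    """Derive the reported maximum and average Standing Sessions from
    periodic measurements of the number of sessions established on
    the DUT/SUT (one count per measurement period)."""
    return max(samples), sum(samples) / len(samples)

# Hypothetical counts sampled once per second over a 10-second run.
per_second_counts = [90, 120, 150, 180, 200, 200, 190, 170, 140, 100]
print(standing_sessions_report(per_second_counts))  # -> (200, 154.0)
```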
      In order to determine the maximum and average Standing Sessions
      on the DUT/SUT for the duration of the test, it is necessary to
      make periodic measurements of the number of Standing Sessions on
      the DUT/SUT.  The recommended value for the measurement period is
      1 second.

   Measurement Units:
      Number of sessions

   Issues:
      None.

   See Also:
      Session Duration
      Session Attempt Rate

3.2.  Test Components

3.2.1.  Emulated Agent

   Definition:
      A device in the test topology that initiates/responds to SIP
      messages as one or more session endpoints and, wherever
      applicable, sources/receives Associated Media for Established
      Sessions.

   Discussion:
      The Emulated Agent functions in the signaling and media planes.
      The Tester may act as multiple Emulated Agents.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:
      Media Plane
      Signaling Plane
      Established Session
      Associated Media

3.2.2.  Signaling Server

   Definition:
      A device in the test topology that acts to create sessions
      between Emulated Agents in the media plane.  This device is
      either a DUT or a component of a SUT.

   Discussion:
      The DUT MUST be RFC 3261-capable network equipment, such as a
      Registrar, Redirect Server, User Agent Server, Stateless Proxy,
      or Stateful Proxy.  A DUT MAY also include a B2BUA or an SBC.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:
      Signaling Plane

3.2.3.  SIP-Aware Stateful Firewall

   Definition:
      A device in the test topology that provides Denial-of-Service
      (DoS) Protection to the Signaling and Media Planes for the
      Emulated Agents and Signaling Server.

   Discussion:
      The SIP-Aware Stateful Firewall MAY be an internal component or
      function of the Session Server.  The SIP-Aware Stateful Firewall
      MAY be a standalone device.  If it is a standalone device, it
      MUST be paired with a Signaling Server.
      If it is a standalone device, it MUST be benchmarked as part of a
      SUT.  SIP-Aware Stateful Firewalls MAY include Network Address
      Translation (NAT) functionality.  Ideally, the inclusion of the
      SIP-Aware Stateful Firewall in a SUT does not degrade the
      measured performance benchmarks.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:

3.2.4.  SIP Transport Protocol

   Definition:
      The protocol used for transport of the Signaling Plane messages.

   Discussion:
      Performance benchmarks may vary for the same SIP networking
      device depending upon whether TCP, UDP, TLS, SCTP, or another
      transport layer protocol is used.  For this reason it MAY be
      necessary to measure the SIP Performance Benchmarks using these
      various transport protocols.  Performance Benchmarks MUST report
      the SIP Transport Protocol used to obtain the benchmark results.

   Measurement Units:
      TCP, UDP, SCTP, TLS over TCP, TLS over UDP, or TLS over SCTP

   Issues:
      None.

   See Also:

3.3.  Test Setup Parameters

3.3.1.  Session Attempt Rate

   Definition:
      Configuration of the Emulated Agent for the number of sessions
      that the Emulated Agent attempts to establish with the DUT/SUT
      over a specified time interval.

   Discussion:
      The Session Attempt Rate can cause variation in performance
      benchmark measurements.  Since this is the number of sessions
      configured on the Tester, some sessions may not be successfully
      established on the DUT.  A session may be either an IS or an NS.

   Measurement Units:
      Session attempts per second

   Issues:
      None.

   See Also:
      Session
      Session Attempt

3.3.2.  IS Media Attempt Rate

   Definition:
      Configuration on the Emulated Agent for the number of ISs with
      Associated Media to be established at the DUT per continuous one-
      second time interval.

   Discussion:
      Note that a Media Session MUST be associated with an IS.
      In this document we assume that there is a one-to-one
      correspondence between IS session attempts and Media Session
      attempts.  By including this definition we leave open the
      possibility that there may be an IS that does not include a media
      description.  Also note that the IS Media Attempt Rate defines
      the number of media sessions we are trying to create, not the
      number of media sessions that are actually created.  Variations
      in the Media Session Attempt Rate might cause variations in
      performance benchmark measurements.  Some attempts might not
      result in successful sessions established on the DUT.

   Measurement Units:
      session attempts per second (saps)

   Issues:
      None.

   See Also:
      IS

3.3.3.  Establishment Threshold Time

   Definition:
      Configuration of the Emulated Agent representing the amount of
      time that an Emulated Agent will wait before declaring a Session
      Attempt Failure.

   Discussion:
      This time duration is test dependent.  It is RECOMMENDED that the
      Establishment Threshold Time value be set to Timer B (for ISs) or
      Timer F (for NSs) as specified in RFC 3261, Table 4 [RFC3261].
      Following the default value of T1 (500 ms) specified in the table
      and a constant multiplier of 64 gives a value of 32 seconds for
      this timer (i.e., 500 ms * 64 = 32 s).

   Measurement Units:
      seconds

   Issues:
      None.

   See Also:
      session establishment failure

3.3.4.  Session Duration

   Definition:
      Configuration of the Emulated Agent that represents the amount of
      time that the SIP dialog is intended to exist between the two EAs
      associated with the test.

   Discussion:
      The time at which the BYE is sent will control the Session
      Duration.  Normally the Session Duration will be the same as the
      Media Session Hold Time.
      However, it is possible that the dialog established between the
      two EAs can support different media sessions at different points
      in time.  Providing both parameters allows the testing agency to
      explore this possibility.

   Measurement Units:
      seconds

   Issues:
      None.

   See Also:
      Media Session Hold Time

3.3.5.  Media Packet Size

   Definition:
      Configuration on the Emulated Agent for a fixed size of packets
      used for media streams.

   Discussion:
      For a single benchmark test, all sessions use the same size
      packet for media streams.  The size of packets can cause
      variation in performance benchmark measurements.

   Measurement Units:
      bytes

   Issues:
      None.

   See Also:

3.3.6.  Media Offered Load

   Definition:
      Configuration of the Emulated Agent for the constant rate of
      Associated Media traffic offered by the Emulated Agent to the
      DUT/SUT for one or more Established Sessions of type IS.

   Discussion:
      The Media Offered Load to be used for a test MUST be reported
      with three components:
      1.  per Associated Media stream;
      2.  per IS;
      3.  aggregate.
      For a single benchmark test, all sessions use the same Media
      Offered Load per Media Stream.  There may be multiple Associated
      Media streams per IS.  The aggregate is the sum of all Associated
      Media for all ISs.

   Measurement Units:
      packets per second (pps)

   Issues:
      None.

   See Also:
      Established Session
      Invite Initiated Session
      Associated Media

3.3.7.  Media Session Hold Time

   Definition:
      Parameter configured at the Emulated Agent that represents the
      amount of time that the Associated Media for an Established
      Session of type IS will last.

   Discussion:
      The Associated Media streams may be bi-directional or
      uni-directional as indicated in the test methodology.
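Relating back to the three reporting components of Media Offered Load (Section 3.3.6), their relationship can be sketched non-normatively as follows; the function name and the rates chosen are hypothetical:

```python
def media_offered_load(pps_per_stream, streams_per_is, num_is):
    """Relate the three reported components of Media Offered Load:
    per Associated Media stream, per IS, and aggregate (all in pps).
    Assumes, as Section 3.3.6 requires for a single benchmark test,
    that all sessions use the same Media Offered Load per stream."""
    per_is = pps_per_stream * streams_per_is
    aggregate = per_is * num_is
    return pps_per_stream, per_is, aggregate

# Hypothetical: 50 pps per stream (e.g., 20 ms packetization),
# 2 Associated Media streams per IS, 100 Established Sessions.
print(media_offered_load(50, 2, 100))  # -> (50, 100, 10000)
```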
      Normally the Media Session Hold Time will be the same as the
      Session Duration.  However, it is possible that the dialog
      established between the two EAs can support different media
      sessions at different points in time.  Providing both parameters
      allows the testing agency to explore this possibility.

   Measurement Units:
      seconds

   Issues:
      None.

   See Also:
      Associated Media
      Established Session
      Invite-initiated Session (IS)

3.3.8.  Loop Detection Option

   Definition:
      An option that causes a Proxy to check for loops in the routing
      of a SIP request before forwarding the request.

   Discussion:
      This is an optional process that a SIP proxy may employ; the
      process is described under Proxy Behavior in RFC 3261,
      Section 16.3, Request Validation, and that section also contains
      suggestions as to how the option could be implemented.  Any
      procedure to detect loops will use processor cycles and hence
      could impact the performance of a proxy.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:

3.3.9.  Forking Option

   Definition:
      An option that enables a Proxy to fork requests to more than one
      destination.

   Discussion:
      This is a process that a SIP proxy may employ to find the UAS.
      The option is described under Proxy Behavior in RFC 3261,
      Section 16.1.  A proxy that uses forking must maintain state
      information, which consumes processor cycles and memory.  Thus
      the use of this option could impact the performance of a proxy,
      and different implementations could produce different impacts.
      SIP supports serial or parallel forking.  When performing a test,
      the type of forking mode MUST be indicated.

   Measurement Units:
      The number of endpoints that will receive the forked invitation.
      A value of 1 indicates that the request is destined to only one
      endpoint, a value of 2 indicates that the request is forked to
      two endpoints, and so on.  This is an integer value ranging
      between 1 and N inclusive, where N is the maximum number of
      endpoints to which the invitation is sent.
      Type of forking used, namely parallel or serial.

   Issues:
      None.

   See Also:

3.4.  Benchmarks

3.4.1.  Registration Rate

   Definition:
      The maximum number of registrations that can be successfully
      completed by the DUT/SUT in a given time period.

   Discussion:
      This benchmark is obtained with zero failure, in which 100% of
      the registrations attempted by the Emulated Agent are
      successfully completed by the DUT/SUT.  The maximum value is
      obtained by testing to failure.  This means that the registration
      rate provisioned on the EA is raised progressively until a
      registration attempt failure is observed.

   Measurement Units:
      registrations per second (rps)

   Issues:
      None.

   See Also:

3.4.2.  Session Establishment Rate

   Definition:
      The average maximum rate at which the DUT/SUT can successfully
      establish sessions.

   Discussion:
      This metric is an average of maxima.  Each maximum is measured in
      a separate sample.  The Session Establishment Rate is the average
      of the maxima established in each individual sample.  In each
      sample, the maximum in question is the number of sessions
      successfully established in continuous one-second intervals with
      prior sessions remaining active.  This maximum is designated in
      the equation below as "rate in sample i".  The session
      establishment rate is calculated using the following equation
      (n = number of samples):

                       n
                      --
                      \   rate in sample i
                      /
                      --
                    i = 1
                 ---------------------
                         (n)

      In each sample, the maximum is obtained by testing to failure.
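The test-to-failure procedure and the average-of-maxima calculation above can be sketched non-normatively as follows; `attempt` is a hypothetical hook standing in for the Tester offering sessions at a given rate for one interval and reporting whether all of them were established:

```python
def max_rate_in_sample(attempt, start_rate=10, step=10):
    """Test to failure within one sample: raise the Session Attempt
    Rate progressively until a Session Attempt Failure is observed.
    The sample's maximum is the rate achieved in the interval prior
    to the failing interval.  `attempt(rate)` returns True when all
    sessions offered at `rate` in one interval are established."""
    rate, last_zero_failure_rate = start_rate, 0
    while attempt(rate):
        last_zero_failure_rate = rate
        rate += step
    return last_zero_failure_rate

def session_establishment_rate(per_sample_maxima):
    """Average the per-sample maxima, as in the equation above."""
    return sum(per_sample_maxima) / len(per_sample_maxima)

# Hypothetical DUT that sustains up to 200 session attempts/second.
dut = lambda rate: rate <= 200
print(max_rate_in_sample(dut))                      # -> 200
print(session_establishment_rate([180, 200, 190]))  # -> 190.0
```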
      With zero failure, 100% of the sessions introduced by the
      Emulated Agent are successfully established.  The maximum value
      is obtained by testing to failure.  This means that the Session
      Attempt Rate provisioned on the EA is raised progressively until
      a Session Attempt Failure is observed.  The maximum rate is the
      rate achieved in the interval prior to the interval in which the
      failure is observed.  Sessions may be IS or NS or a mix of both
      and will be defined in the particular test.

   Measurement Units:
      sessions per second (sps)

   Issues:
      None.

   See Also:
      Invite-initiated Sessions
      Non-INVITE initiated Sessions
      Session Attempt Rate

3.4.3.  Session Capacity

   Definition:
      The maximum value of the Standing Sessions Count achieved by the
      DUT/SUT during the process of steadily increasing the number of
      Session Attempts per unit time, before the first Session Attempt
      Failure occurs.

   Discussion:
      When benchmarking Session Capacity for sessions with media, it is
      required that these sessions be permanently established (i.e.,
      they remain active for the duration of the test).  This can be
      achieved by causing the EA not to send a BYE for the duration of
      the testing.  In the signaling plane, this requirement means that
      the dialog lasts as long as the test lasts.  In order to test
      Session Capacity for sessions with media, the Media Session Hold
      Time MUST be set to infinity so that sessions remain established
      for the duration of the test.  If the DUT/SUT is dialog-stateful,
      then we expect its performance will be impacted by setting Media
      Session Hold Time to infinity, since the DUT/SUT will need to
      allocate resources to process and store the state information.
      The Session Capacity must be reported with the Session Attempt
      Rate used to reach the maximum.
      Since Session Attempt Rate is a zero-loss measurement, there must
      be zero failures to achieve the Session Capacity.  The maximum is
      indicated at the Emulated Agent by the arrival of a SIP 4xx, 5xx,
      or 6xx response from the DUT/SUT.  Sessions may be IS or NS.

   Measurement Units:
      sessions

   Issues:
      None.

   See Also:
      Established Session
      Session Attempt Rate
      Session Attempt Failure

3.4.4.  Session Overload Capacity

   Definition:
      The maximum number of Established Sessions that can exist
      simultaneously on the DUT/SUT until it stops responding to
      Session Attempts.

   Discussion:
      Session Overload Capacity is measured after the Session Capacity
      is measured.  The Session Overload Capacity is greater than or
      equal to the Session Capacity.  When benchmarking Session
      Overload Capacity, continue to offer Session Attempts to the
      DUT/SUT after the first Session Attempt Failure occurs and
      measure Established Sessions until there is no SIP message
      response for the duration of the Establishment Threshold.  It is
      worth noting that the Session Establishment Performance is
      expected to decrease after the first Session Attempt Failure
      occurs.

   Measurement Units:
      Sessions

   Issues:
      None.

   See Also:
      Overload
      Session Capacity
      Session Attempt Failure

3.4.5.  Session Establishment Performance

   Definition:
      The percentage of Session Attempts that become Established
      Sessions over the duration of a benchmarking test.

   Discussion:
      Session Establishment Performance is a benchmark to indicate
      session establishment success for the duration of a test.  The
      duration for measuring this benchmark is to be specified in the
      Methodology.  The Session Duration SHOULD be configured to
      infinity so that sessions remain established for the entire test
      duration.
      Session Establishment Performance is calculated as shown in the
      following equation:

      Session Establishment     Total Established Sessions
      Performance            =  --------------------------
                                  Total Session Attempts

      Session Establishment Performance may be monitored in real time
      during a benchmarking test.  However, the reported benchmark MUST
      be based on the total measurements for the test duration.

   Measurement Units:
      Percent (%)

   Issues:
      None.

   See Also:
      Established Session
      Session Attempt

3.4.6.  Session Attempt Delay

   Definition:
      The average time measured at the Emulated Agent for a Session
      Attempt to result in an Established Session.

   Discussion:
      Time is measured from when the EA sends the first INVITE for the
      Call-ID in the case of an IS.  Time is measured from when the EA
      sends the first non-INVITE message in the case of an NS.  Session
      Attempt Delay MUST be measured for every established session to
      calculate the average.  Session Attempt Delay MUST be measured at
      the Maximum Session Establishment Rate.

   Measurement Units:
      Seconds

   Issues:
      None.

   See Also:
      Maximum Session Establishment Rate

3.4.7.  IM Rate

   Definition:
      The maximum number of IM messages completed by the DUT/SUT in a
      given time period.

   Discussion:
      For a UAS, the definition of success is the receipt of an IM
      request and the subsequent sending of a final response.  For a
      UAC, the definition of success is the sending of an IM request
      and the receipt of a final response to it.  For a proxy, the
      definition of success is as follows:
      A.  the number of IM requests it receives from the upstream
          client MUST be equal to the number of IM requests it sent to
          the downstream server; and
      B.  the number of IM responses it receives from the downstream
          server MUST be equal to the number of IM requests sent to the
          downstream server; and
      C.
          the number of IM responses it sends to the upstream client
          MUST be equal to the number of IM requests it received from
          the upstream client.

   Measurement Units:
      IM messages per second

   Issues:
      None.

   See Also:

4.  IANA Considerations

   This document requires no IANA considerations.

5.  Security Considerations

   Documents of this type do not directly affect the security of the
   Internet or of corporate networks as long as benchmarking is not
   performed on devices or systems connected to production networks.
   Security threats and how to counter them in SIP and the media layer
   are discussed in RFC 3261 [RFC3261], RFC 3550 [RFC3550], RFC 3711
   [RFC3711], and various other drafts.  This document attempts to
   formalize a set of common terminology for benchmarking SIP networks.
   Packets with unintended and/or unauthorized DSCP or IP precedence
   values may present security issues.  Determining the security
   consequences of such packets is out of scope for this document.

6.  Acknowledgments

   The authors would like to thank Keith Drage, Cullen Jennings, Daryl
   Malas, Al Morton, and Henning Schulzrinne for invaluable
   contributions to this document.

7.  References

7.1.  Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119, March 1997.

   [RFC2544]  Bradner, S. and J. McQuaid, "Benchmarking Methodology for
              Network Interconnect Devices", RFC 2544, March 1999.

   [RFC3261]  Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston,
              A., Peterson, J., Sparks, R., Handley, M., and E.
              Schooler, "SIP: Session Initiation Protocol", RFC 3261,
              June 2002.

   [I-D.ietf-bmwg-sip-bench-meth]
              Davids, C., Gurbani, V., and S. Poretsky, "Methodology
              for Benchmarking SIP Networking Devices",
              draft-ietf-bmwg-sip-bench-meth-03 (work in progress),
              March 2011.

7.2.
Informational References

   [RFC2285]  Mandeville, R., "Benchmarking Terminology for LAN
              Switching Devices", RFC 2285, February 1998.

   [RFC1242]  Bradner, S., "Benchmarking terminology for network
              interconnection devices", RFC 1242, July 1991.

   [RFC3550]  Schulzrinne, H., Casner, S., Frederick, R., and V.
              Jacobson, "RTP: A Transport Protocol for Real-Time
              Applications", STD 64, RFC 3550, July 2003.

   [RFC3711]  Baugher, M., McGrew, D., Naslund, M., Carrara, E., and K.
              Norrman, "The Secure Real-time Transport Protocol
              (SRTP)", RFC 3711, March 2004.

   [I-D.ietf-soc-overload-design]
              Hilt, V., Noel, E., Shen, C., and A. Abdelal, "Design
              Considerations for Session Initiation Protocol (SIP)
              Overload Control", draft-ietf-soc-overload-design-05
              (work in progress), March 2011.

   [I-D.ietf-soc-overload-control]
              Gurbani, V., Hilt, V., and H. Schulzrinne, "Session
              Initiation Protocol (SIP) Overload Control",
              draft-ietf-soc-overload-control-02 (work in progress),
              February 2011.

Appendix A.  White Box Benchmarking Terminology

   Session Attempt Arrival Rate

   Definition:
      The number of Session Attempts received at the DUT/SUT over a
      specified time period.

   Discussion:
      Session Attempts are indicated by the arrival of SIP INVITE or
      SUBSCRIBE/NOTIFY messages.  The Session Attempt Arrival Rate
      distribution can be any model selected by the user of this
      document.  It is important when comparing benchmarks of different
      devices that the same distribution model be used.  Common
      distributions are expected to be Uniform and Poisson.

   Measurement Units:
      Session attempts/sec

   Issues:
      None.

   See Also:
      Session Attempt

Authors' Addresses

   Carol Davids
   Illinois Institute of Technology
   201 East Loop Road
   Wheaton, IL 60187
   USA

   Phone: +1 630 682 6024
   Email: davids@iit.edu

   Vijay K.
Gurbani
   Bell Laboratories, Alcatel-Lucent
   1960 Lucent Lane
   Rm 9C-533
   Naperville, IL 60566
   USA

   Phone: +1 630 224 0216
   Email: vkg@bell-labs.com

   Scott Poretsky
   Allot Communications
   300 TradeCenter, Suite 4680
   Woburn, MA 01801
   USA

   Phone: +1 508 309 2179
   Email: sporetsky@allot.com