Benchmarking Methodology Working                              S. Poretsky
Group                                                 Allot Communications
Internet-Draft                                                  V. Gurbani
Expires: August 12, 2010                  Bell Laboratories, Alcatel-Lucent
                                                                  C. Davids
                                            Illinois Institute of Technology
                                                            February 8, 2010

      Terminology for Benchmarking Session Initiation Protocol (SIP)
                            Networking Devices
                    draft-ietf-bmwg-sip-bench-term-01

Abstract

   This document provides a terminology for benchmarking SIP performance in networking devices.  Terms are included for test components, test setup parameters, and performance benchmark metrics for black-box benchmarking of SIP networking devices.  The performance benchmark metrics are obtained for the SIP control plane and media plane.  The terms are intended for use in a companion methodology document for complete performance characterization of a device in a variety of conditions making it possible to compare performance of different devices.  It is critical to provide test setup parameters and a methodology document for SIP performance benchmarking because SIP allows a wide range of configuration and operational conditions that can influence performance benchmark measurements.  It is necessary to have terminology and methodology standards to ensure that reported benchmarks have consistent definition and were obtained following the same procedures.  Benchmarks can be applied to compare performance of a variety of SIP networking devices.

Status of this Memo

   This Internet-Draft is submitted to IETF in full conformance with the provisions of BCP 78 and BCP 79.
   Internet-Drafts are working documents of the Internet Engineering Task Force (IETF), its areas, and its working groups.  Note that other groups may also distribute working documents as Internet-Drafts.

   Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time.  It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

   The list of current Internet-Drafts can be accessed at http://www.ietf.org/ietf/1id-abstracts.txt.

   The list of Internet-Draft Shadow Directories can be accessed at http://www.ietf.org/shadow.html.

   This Internet-Draft will expire on August 12, 2010.

Copyright Notice

   Copyright (c) 2010 IETF Trust and the persons identified as the document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (http://trustee.ietf.org/license-info) in effect on the date of publication of this document.  Please review these documents carefully, as they describe your rights and restrictions with respect to this document.  Code Components extracted from this document must include Simplified BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the BSD License.

Table of Contents

   1.  Terminology  . . . . . . . . . . . . . . . . . . . . . . . . .  5
   2.  Introduction . . . . . . . . . . . . . . . . . . . . . . . . .  6
     2.1.  Scope  . . . . . . . . . . . . . . . . . . . . . . . . . .  7
     2.2.  Benchmarking Models  . . . . . . . . . . . . . . . . . . .  8
   3.  Term Definitions . . . . . . . . . . . . . . . . . . . . . . . 11
     3.1.  Protocol Components  . . . . . . . . . . . . . . . . . . . 11
       3.1.1.  Session  . . . . . . . . . . . . . . . . . . . . . . . 11
       3.1.2.  Signaling Plane  . . . . . . . . . . . . . . . . . . . 14
       3.1.3.  Media Plane  . . . . . . . . . . . . . . . . . . . . . 15
       3.1.4.  Associated Media . . . . . . . . . . . . . . . . . . . 15
       3.1.5.  Overload . . . . . . . . . . . . . . . . . . . . . . . 16
       3.1.6.  Session Attempt  . . . . . . . . . . . . . . . . . . . 17
       3.1.7.  Established Session  . . . . . . . . . . . . . . . . . 17
       3.1.8.  Invite-initiated Session (IS)  . . . . . . . . . . . . 18
       3.1.9.  Non-INVITE-initiated Session (NS)  . . . . . . . . . . 18
       3.1.10. Session Attempt Failure  . . . . . . . . . . . . . . . 19
       3.1.11. Standing Sessions  . . . . . . . . . . . . . . . . . . 19
     3.2.  Test Components  . . . . . . . . . . . . . . . . . . . . . 20
       3.2.1.  Emulated Agent . . . . . . . . . . . . . . . . . . . . 20
       3.2.2.  Signaling Server . . . . . . . . . . . . . . . . . . . 20
       3.2.3.  SIP-Aware Stateful Firewall  . . . . . . . . . . . . . 21
       3.2.4.  SIP Transport Protocol . . . . . . . . . . . . . . . . 21
     3.3.  Test Setup Parameters  . . . . . . . . . . . . . . . . . . 22
       3.3.1.  Session Attempt Rate . . . . . . . . . . . . . . . . . 22
       3.3.2.  IS Media Attempt Rate  . . . . . . . . . . . . . . . . 22
       3.3.3.  Establishment Threshold Time . . . . . . . . . . . . . 23
       3.3.4.  Media Packet Size  . . . . . . . . . . . . . . . . . . 24
       3.3.5.  Media Offered Load . . . . . . . . . . . . . . . . . . 24
       3.3.6.  Media Session Hold Time  . . . . . . . . . . . . . . . 25
       3.3.7.  Loop Detection Option  . . . . . . . . . . . . . . . . 25
       3.3.8.  Forking Option . . . . . . . . . . . . . . . . . . . . 26
     3.4.  Benchmarks . . . . . . . . . . . . . . . . . . . . . . . . 26
       3.4.1.  Registration Rate  . . . . . . . . . . . . . . . . . . 26
       3.4.2.  Session Establishment Rate . . . . . . . . . . . . . . 27
       3.4.3.  Session Capacity . . . . . . . . . . . . . . . . . . . 28
       3.4.4.  Session Overload Capacity  . . . . . . . . . . . . . . 29
       3.4.5.  Session Establishment Performance  . . . . . . . . . . 29
       3.4.6.  Session Attempt Delay  . . . . . . . . . . . . . . . . 30
       3.4.7.  IM Rate  . . . . . . . . . . . . . . . . . . . . . . . 31
   4.  IANA Considerations  . . . . . . . . . . . . . . . . . . . . . 31
   5.  Security Considerations  . . . . . . . . . . . . . . . . . . . 31
   6.  Acknowledgments  . . . . . . . . . . . . . . . . . . . . . . . 32
   7.  References . . . . . . . . . . . . . . . . . . . . . . . . . . 32
     7.1.  Normative References . . . . . . . . . . . . . . . . . . . 32
     7.2.  Informational References . . . . . . . . . . . . . . . . . 32
   Appendix A.  White Box Benchmarking Terminology  . . . . . . . . . 33
   Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . . 33

1.  Terminology

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in BCP 14, RFC2119 [RFC2119].  RFC 2119 defines the use of these key words to help make the intent of standards track documents as clear as possible.  While this document uses these keywords, this document is not a standards track document.  The term Throughput is defined in RFC2544 [RFC2544].

   For the sake of clarity and continuity, this document adopts the template for definitions set out in Section 2 of RFC 1242 [RFC1242].  Definitions are indexed and grouped together for ease of reference.

   This document uses existing terminology defined in other BMWG work.  Examples include, but are not limited to:

      Device under test (DUT) (c.f., Section 3.1.1, RFC 2285 [RFC2285]).
      System under test (SUT) (c.f., Section 3.1.2, RFC 2285 [RFC2285]).

   Many commonly used SIP terms in this document are defined in RFC 3261 [RFC3261].  For convenience the most important of these are reproduced below.  Use of these terms in this document is consistent with their corresponding definition in [RFC3261].

   o  Call Stateful: A proxy is call stateful if it retains state for a dialog from the initiating INVITE to the terminating BYE request.  A call stateful proxy is always transaction stateful, but the converse is not necessarily true.

   o  Stateful Proxy: A logical entity that maintains the client and server transaction state machines defined by this specification during the processing of a request, also known as a transaction stateful proxy.  The behavior of a stateful proxy is further defined in Section 16.  A (transaction) stateful proxy is not the same as a call stateful proxy.

   o  Stateless Proxy: A logical entity that does not maintain the client or server transaction state machines defined in this specification when it processes requests.  A stateless proxy forwards every request it receives downstream and every response it receives upstream.

   o  Back-to-back User Agent: A back-to-back user agent (B2BUA) is a logical entity that receives a request and processes it as a user agent server (UAS).
      In order to determine how the request should be answered, it acts as a user agent client (UAC) and generates requests.  Unlike a proxy server, it maintains dialog state and must participate in all requests sent on the dialogs it has established.  Since it is a concatenation of a UAC and a UAS, no explicit definitions are needed for its behavior.

   o  Loop: A request that arrives at a proxy, is forwarded, and later arrives back at the same proxy.  When it arrives the second time, its Request-URI is identical to the first time, and other header fields that affect proxy operation are unchanged, so that the proxy will make the same processing decision on the request it made the first time.  Looped requests are errors, and the procedures for detecting them and handling them are described by the protocol.

2.  Introduction

   Service Providers are now planning Voice over IP (VoIP) and Multimedia network deployments using the IETF-developed Session Initiation Protocol (SIP) [RFC3261].  SIP is a signaling protocol originally intended to be used for the dynamic establishment, disconnection, and modification of streams of media between end users.  As it has evolved, it has been adopted for use in a growing number of applications and features.  Many of these result in the creation of a media stream, but some do not.  Instead, they create other services tailored to the end-users' immediate needs or preferences.  The set of benchmarking terms provided in this document is intended for use with any SIP-enabled device performing SIP functions in the core of the network.  The performance of end-user devices is outside the scope of this document.

   VoIP with SIP has led to the development of new networking devices, including SIP Servers, Session Border Controllers (SBCs), Back-to-Back User Agents (B2BUAs), and SIP-Aware Stateful Firewalls.  The mix of voice and IP functions in these various devices has produced inconsistencies in vendor-reported performance metrics and has caused confusion in the service provider community.  SIP allows a wide range of configuration and operational conditions that can influence performance benchmark measurements.  When the device under test terminates or relays both media and signaling, for example, it is important to be able to correlate a signaling measurement with the media plane measurements to determine the system performance.  As devices and their functions proliferate, the need to have a consistent set of metrics to compare their performance becomes increasingly urgent.  This document and its companion methodology document [I-D.ietf-bmwg-sip-bench-meth] provide a set of black-box benchmarks for describing and comparing the performance of devices that incorporate the SIP User Agent Client and Server functions and that operate in the network's core.

   The definition of SIP performance benchmarks necessarily includes definitions of Test Setup Parameters and a test methodology.  These enable the Tester to perform benchmarking tests on different devices and to achieve comparable and repeatable results.  This document provides a common set of well-defined terms for Test Components, Test Setup Parameters, and Benchmarks.  All the benchmarks defined are black-box measurements of the SIP Control (Signaling) plane.
   The Test Setup Parameters and Benchmarks defined in this document are intended for use with the companion Methodology document.  Benchmarks of internal DUT characteristics (also known as white-box benchmarks), such as Session Attempt Arrival Rate, which is measured at the DUT, are described in Appendix A to allow additional characterization of DUT behavior with different distribution models.

2.1.  Scope

   The scope of this work item is summarized as follows:

   o  This terminology document describes SIP signaling (control-plane) performance benchmarks for black-box measurements of SIP networking devices.  Stress and debug scenarios are not addressed in this work item.

   o  The DUT must be RFC 3261-capable network equipment.  This may be a Registrar, Redirect Server, Stateless Proxy, or Stateful Proxy.  A DUT MAY also include B2BUA or SBC functionality (this is referred to as the "Signaling Server").  The DUT MAY be a multi-port SIP-to-switched-network gateway implemented as a SIP UAC or UAS.

   o  The DUT MAY have an internal SIP Application Level Gateway (ALG), firewall, and/or a Network Address Translator (NAT).  This is referred to as the "SIP-Aware Stateful Firewall."

   o  The DUT or SUT MUST NOT be end user equipment, such as a personal digital assistant, a computer-based client, or a user terminal.

   o  The Tester acts as multiple "Emulated Agents" that initiate (or respond to) SIP messages as session endpoints and source (or receive) associated media for established connections.

   o  Control Signaling in the presence of Media:

      *  The media performance is not benchmarked in this work item.

      *  It is RECOMMENDED that control plane benchmarks be performed with media present, but this is optional.

      *  The SIP INVITE requests MUST include the SDP body.

      *  The type of DUT dictates whether the associated media streams traverse the DUT or SUT.  Both scenarios are within the scope of this work item.

      *  SIP is frequently used to create media streams; the control plane and media plane are treated as orthogonal to each other in this document.  While many devices support the creation of media streams, benchmarks that measure the performance of these streams are outside the scope of this document and its companion methodology document [I-D.ietf-bmwg-sip-bench-meth].  Tests may be performed with or without the creation of media streams.  The presence or absence of media streams MUST be noted as a condition of the test, as the performance of SIP devices may vary accordingly.  Even if media is used during benchmarking, only the SIP performance will be benchmarked, not the media performance or quality.

   o  Both INVITE and non-INVITE scenarios (such as Instant Messages, or IM) are addressed in this document.  However, benchmarking SIP presence is not a part of this work item.

   o  Different transport mechanisms -- such as UDP, TCP, SCTP, or TLS -- may be used; however, the specific transport mechanism MUST be noted as a condition of the test, as the performance of SIP devices may vary accordingly.

   o  Looping and forking options are also considered, since they impact processing at SIP proxies.

   o  REGISTER and INVITE requests may be challenged or remain unchallenged for authentication purposes, as this may impact the performance benchmarks.
      Any observable performance degradation due to authentication is of interest to the SIP community.

   o  Re-INVITE requests are not considered in the scope of this work item.

   o  Only session establishment is considered for the performance benchmarks.  Session disconnect is not considered in the scope of this work item.

   o  SIP Overload [I-D.ietf-sipping-overload-reqs] is within the scope of this work item, but considerations on how to handle overload are deferred to work progressing in the SIPPING working group [I-D.ietf-sipping-overload-design].  Vendors are, of course, free to implement their specific overload control behavior as the expected test outcome if it is different from the IETF recommendations.  However, such behavior MUST be documented so that it can be interpreted appropriately across multiple vendor implementations.  This will make it more meaningful to compare the performance of different SIP overload implementations.

   o  IMS-specific scenarios are not considered, but test cases can be applied with 3GPP-specific SIP signaling and the P-CSCF as a DUT.

2.2.  Benchmarking Models

   This section shows the five models to be used when benchmarking the SIP performance of a networking device.  Figure 1 shows a configuration in which the Tester, acting as the Emulated Agents, is looped back on itself in order to baseline its own performance.

   +--------+  Signaling request   +--------+
   |        +--------------------->|        |
   | Tester |                      | Tester |
   |   EA   |  Signaling response  |   EA   |
   |        |<---------------------+        |
   +--------+                      +--------+

   Figure 1: Test topology 1 - Emulated agent (EA) baseline performance
                               measurement

   Figure 2 shows the basic configuration for benchmarking the Registration of the DUT/SUT.

   +--------+  Registration        +--------+
   |        +--------------------->|        |
   | Tester |                      |  DUT   |
   |   EA   |  Response            |        |
   |        |<---------------------+        |
   +--------+                      +--------+

   Figure 2: Test topology 2 - Emulated agent (EA) registration to DUT/
                               SUT

   Figure 3 shows that the Tester acts as the initiating and responding Emulated Agents while the DUT/SUT forwards Session Attempts.

   +--------+   Session   +--------+   Session   +--------+
   |        |   Attempt   |        |   Attempt   |        |
   |        |<------------+        |<------------+        |
   |        |             |        |             |        |
   |        |   Response  |        |   Response  |        |
   | Tester +------------>|  DUT   +------------>| Tester |
   |  (EA)  |             |        |             |  (EA)  |
   |        |             |        |             |        |
   +--------+             +--------+             +--------+

   Figure 3: Test topology 3 - DUT/SUT performance benchmark for session
                               establishment without media

   Figure 4 is used when performing those same benchmarks with Associated Media traversing the DUT/SUT.

   +--------+   Session   +--------+   Session   +--------+
   |        |   Attempt   |        |   Attempt   |        |
   |        |<------------+        |<------------+        |
   |        |             |        |             |        |
   |        |   Response  |        |   Response  |        |
   | Tester +------------>|  DUT   +------------>| Tester |
   |        |             |        |             |  (EA)  |
   |        |    Media    |        |    Media    |        |
   |        |<===========>|        |<===========>|        |
   +--------+             +--------+             +--------+

   Figure 4: Test topology 4 - DUT/SUT performance benchmark for session
                               establishment with media traversing the DUT

   Figure 5 is to be used when performing those same benchmarks with Associated Media, but the media does not traverse the DUT/SUT.  Again, the benchmarking of the media is not within the scope of this work item.
   The SIP control signaling is benchmarked in the presence of Associated Media to determine whether the SDP body of the signaling and the handling of media impact the performance of the DUT/SUT.

   +--------+   Session   +--------+   Session   +--------+
   |        |   Attempt   |        |   Attempt   |        |
   |        |<------------+        |<------------+        |
   |        |             |        |             |        |
   |        |   Response  |        |   Response  |        |
   | Tester +------------>|  DUT   +------------>| Tester |
   |        |             |        |             |  (EA)  |
   |        |             |        |             |        |
   +--------+             +--------+             +--------+
       /|\                                           /|\
        |                    Media                    |
        +=============================================+

   Figure 5: Test topology 5 - DUT/SUT performance benchmark for session
                               establishment with media external to the DUT

   Figure 6 illustrates the SIP signaling for an Established Session.  The Tester acts as the Emulated Agent(s) and initiates a Session Attempt with the DUT/SUT.  When the Emulated Agent (EA) receives a 200 OK from the DUT/SUT, that session is considered to be an Established Session.  The illustration indicates three states of the session being created by the EA: Attempting, Established, and Disconnecting.  Sessions can be one of two types: Invite-Initiated Session (IS) or Non-Invite-Initiated Session (NS).  Failure of the DUT/SUT to respond successfully within the Establishment Threshold Time is considered a Session Attempt Failure.  SIP INVITE messages MUST include the SDP body to specify the Associated Media.  Use of Associated Media, to be sourced from the EA, is optional.  When Associated Media is used, it may traverse the DUT/SUT depending upon the type of DUT/SUT.  The Associated Media is shown in Figure 6 as "Media" connected to media ports M1 and M2 on the EA.  After the EA sends a BYE, the session disconnects.  Performance test cases for session disconnects are not considered in this work item (the BYE request is shown for completeness).

        EA               DUT/SUT             M1      M2
         |                  |                 |       |
         |     INVITE       |                 |       |
 --------+----------------->|                 |       |
         |                  |                 |       |
Attempting                  |                 |       |
         |     200 OK       |                 |       |
 --------|<-----------------|                 |       |
         |                  |                 |       |
         |                  |                 |       |
         |                  |                 | Media |
Established                 |                 |<=====>|
         |                  |                 |       |
         |     BYE          |                 |       |
 --------+----------------->|                 |       |
         |                  |                 |       |
Disconnecting               |                 |       |
         |     200 OK       |                 |       |
 --------|<-----------------|                 |       |
         |                  |                 |       |

   Figure 6: Basic SIP test topology

3.  Term Definitions

3.1.  Protocol Components

3.1.1.  Session

   Definition:
      The combination of signaling and media messages and processes that enable two or more participants to communicate.

   Discussion:
      SIP messages in the signaling plane can be used to create and manage applications for one or more end users.  SIP is often used to create and manage media streams in support of applications.  A session always has a signaling component and may have a media component.  Therefore, a Session may be defined as signaling only or a combination of signaling and media (c.f. Associated Media, see Section 3.1.4).  SIP includes definitions of a Call-ID, a dialog, and a transaction that support this application.  A growing number of usages and applications do not require the creation of associated media.  The first such usage was the REGISTER.  Applications that use the MESSAGE and SUBSCRIBE/NOTIFY methods also do not require SIP to manage media streams.  The terms Invite-initiated Session (IS) and Non-invite-initiated Session (NS) are used to distinguish between these different usages.
      A Session, in the context of this document, is considered to be a vector with three components:

      1.  A component in the signaling plane (SIP messages), sess.sig;
      2.  A media component in the media plane (RTP and SRTP streams, for example), sess.med (which may be null);
      3.  A control component in the media plane (RTCP messages, for example), sess.medc (which may be null).

      An IS is expected to have non-null sess.sig and sess.med components.  The use of control protocols in the media component is media-dependent; thus the expected presence or absence of sess.medc is media-dependent and test-case dependent.  An NS is expected to have a non-null sess.sig component, but null sess.med and sess.medc components.

      Packets in the Signaling Plane and Media Plane will be handled by different processes within the DUT.  They will take different paths within a SUT.  These different processes and paths may produce variations in performance.  The terminology and benchmarks defined in this document, and the methodology for their use, are designed to enable comparison of the performance of the DUT/SUT with reference to the type of SIP-supported application it is handling.

      Note that one or more sessions can simultaneously exist between any participants.  This can be the case, for example, when the EA sets up both an IM and a voice call through the DUT/SUT.  These sessions are represented as an array, session[x].

      Sessions will be represented as a vector array with three components, as follows:

      session->
         session[x].sig, the signaling component
         session[x].medc, the media control component (e.g., RTCP)
         session[x].med[y], an array of associated media streams (e.g., RTP, SRTP, RTSP, MSRP).  This media component may consist of zero or more media streams.

      Figure 7 models the vectors of the session.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Media Plane
      Signaling Plane
      Associated Media
      Invite-initiated Session (IS)
      Non-invite-initiated Session (NS)

               |\
               | \
               |  \
       sess.sig|   \
               |    \
               |     o
               |    /|
               |   / |
               |  /  |
               | /   |   sess.medc
               |/____|_____________________
              /
             /
            /
   sess.med/
          /

   Figure 7: Application or session components
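   As an informative illustration only (not part of the terminology itself), the short Python sketch below shows one way a tester might model the session vector discussed above.  The class and field names (Session, sig, med, medc) simply mirror the notation sess.sig, sess.med[y], and sess.medc; they are not defined by this document.

      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class Session:
          """Illustrative model of the session vector session[x]."""
          sig: str                                      # signaling component
          med: List[str] = field(default_factory=list)  # associated media streams, session[x].med[y]
          medc: Optional[str] = None                    # media control component (e.g., RTCP), may be null

          def is_IS(self) -> bool:
              # An IS has non-null sess.sig and sess.med components.
              return bool(self.sig) and len(self.med) > 0

          def is_NS(self) -> bool:
              # An NS has a non-null sess.sig component but null sess.med and sess.medc.
              return bool(self.sig) and not self.med and self.medc is None

      # Example: an IS with one RTP stream, and an NS created by a SUBSCRIBE.
      voice_call = Session(sig="call-id-1;to-tag;from-tag", med=["RTP audio"], medc="RTCP")
      subscription = Session(sig="call-id-2;to-tag;from-tag")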
3.1.2.  Signaling Plane

   Definition:
      The control plane in which SIP messages [RFC3261] are exchanged between SIP Agents [RFC3261] to establish a connection for media exchange.

   Discussion:
      SIP messages are used to establish sessions in several ways: directly between two User Agents [RFC3261], through a Proxy Server [RFC3261], or through a series of Proxy Servers.  The Signaling Plane MUST include the Session Description Protocol (SDP).
      The Signaling Plane for a single Session is represented by session.sig.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Media Plane
      Emulated Agents

3.1.3.  Media Plane

   Definition:
      The data plane in which one or more media streams and their associated media control protocols are exchanged after a media connection has been created by the exchange of signaling messages in the Signaling Plane.

   Discussion:
      Media may also be known as the "payload" or "bearer channel".  The Media Plane MUST include the media control protocol, if one is used, and the media stream(s).  Examples of media are audio, video, whiteboard, and instant messaging service.  The media stream is described in the SDP of the Signaling Plane.
      The media for a single Session is represented by session.med.  The media control protocol is represented by session.medc.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Signaling Plane

3.1.4.  Associated Media

   Definition:
      Media that corresponds to an 'm' line in the SDP payload of the Signaling Plane.

   Discussion:
      Any media protocol MAY be used.
      For any session's signaling component, represented as session.sig, there may be one or multiple associated media streams, which are represented by a vector array session.med[y]; this is referred to as the Associated Media.

   Measurement Units:
      N/A.

   Issues:
      None.

3.1.5.  Overload

   Definition:
      Overload is defined as the state where a SIP server does not have sufficient resources to process all incoming SIP messages [I-D.ietf-sipping-overload-reqs].

   Discussion:
      Under such conditions, all or a percentage of Session Attempts will fail due to lack of resources.  SIP server resources may include CPU processing capacity, network bandwidth, input/output queues, or disk resources.  Any combination of resources may be fully utilized when a SIP server (the DUT/SUT) is in the overload condition.  For proxy-only types of devices, overload issues will be dominated by the number of signaling messages they can handle in a unit time before their throughput starts to drop.
      For UA-type network devices (e.g., gateways), overload must necessarily include both the signaling traffic and media streams.  It is expected that the amount of signaling that a UA can handle is inversely proportional to the amount of media streams currently handled by that UA.

   Measurement Units:
      N/A.

   Issues:
      The issue of overload in SIP networks is currently a topic of discussion in the SIPPING WG.  The normal response to an overload stimulus -- sending a 503 response -- is considered inadequate.  The exact method for handling overload is not determined yet, and as solutions are engineered, this draft will be updated accordingly.  (Note: this draft now has a dependency on the Overload working group created by Dispatch.)

3.1.6.  Session Attempt

   Definition:
      A SIP Session for which the Emulated Agent has sent the SIP INVITE or SUBSCRIBE/NOTIFY and has not yet received a message response from the DUT/SUT.

   Discussion:
      The attempted session may be an IS or an NS.  The Session Attempt includes SIP INVITEs and SUBSCRIBE/NOTIFY messages.  It also includes all INVITEs that are rejected for lack of authentication information.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Session Attempt Rate
      Invite-initiated Session
      Non-Invite initiated Session

3.1.7.  Established Session

   Definition:
      A SIP session for which the Emulated Agent, acting as the UE/UA, has received a 200 OK message from the DUT/SUT.

   Discussion:
      An Established Session MAY be of type Invite-initiated Session (IS) or Non-INVITE-initiated Session (NS).

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Invite-initiated Session
      Session Attempting State
      Session Disconnecting State
3.1.8.  Invite-initiated Session (IS)

   Definition:
      A Session that is created by an exchange of messages in the Signaling Plane, the first of which is a SIP INVITE request.

   Discussion:
      An IS is identified by the Call-ID, To-tag, and From-tag of the SIP message that establishes the session.  These three fields are used to identify a SIP Dialog (RFC3261 [RFC3261]).  An IS may have an Associated Media description in the SDP body.  An IS may have multiple Associated Media streams.  The inclusion of media is test-case dependent.  An IS is successfully established if the following two conditions are met:
      1.  sess.sig is established by the end of the Establishment Threshold Time (c.f. Section 3.3.3), and
      2.  The media session described in the SDP body of the signaling message is present.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Non-Invite initiated Session
      Associated Media

3.1.9.  Non-INVITE-initiated Session (NS)

   Definition:
      A session that is created by an exchange of messages in the Signaling Plane that does not include an initial SIP INVITE message.

   Discussion:
      An NS is successfully established if the Session Attempt via a non-INVITE request results in the EA receiving a 2xx reply from the DUT/SUT before the expiration of the Establishment Threshold timer (c.f., Section 3.3.3).  An example of an NS is a session created by the SUBSCRIBE request.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Invite-initiated Session

3.1.10.  Session Attempt Failure

   Definition:
      A session attempt that does not result in an Established Session.

   Discussion:
      The session attempt failure may be indicated by the following observations at the Emulated Agent:
      1.  Receipt of a SIP 4xx, 5xx, or 6xx class response to a Session Attempt.
      2.  The lack of any received SIP response to a Session Attempt.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session Attempt

3.1.11.  Standing Sessions

   Definition:
      The number of Sessions currently established on the DUT/SUT at any instant.

   Discussion:
      The number of Standing Sessions is influenced by the Session Duration and the Session Attempt Rate.  Benchmarks MUST be reported with the maximum and average Standing Sessions for the DUT/SUT.  In order to determine the maximum and average Standing Sessions on the DUT/SUT for the duration of the test, it is necessary to make periodic measurements of the number of Standing Sessions on the DUT/SUT.  The recommended value for the measurement period is 1 second.

   Measurement Units:
      Number of sessions

   Issues:
      None.

   See Also:
      Session Duration
      Session Rate
      Session Attempt Rate
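   As an informative illustration only (not part of the benchmark definitions), the following Python sketch shows how a tester might derive the maximum and average Standing Sessions from periodic 1-second samples, as described above.  The sample values are hypothetical.

      # Illustrative only: derive maximum and average Standing Sessions from
      # samples taken once per second during the test (hypothetical data).
      samples = [120, 250, 410, 400, 395, 405]    # standing sessions observed each second

      max_standing = max(samples)                 # maximum Standing Sessions
      avg_standing = sum(samples) / len(samples)  # average Standing Sessions

      print(f"Maximum Standing Sessions: {max_standing}")
      print(f"Average Standing Sessions: {avg_standing:.1f}")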
3.2.  Test Components

3.2.1.  Emulated Agent

   Definition:
      A device in the test topology that initiates/responds to SIP messages as one or more session endpoints and, wherever applicable, sources/receives Associated Media for Established Sessions.

   Discussion:
      The Emulated Agent functions in the signaling and media planes.  The Tester may act as multiple Emulated Agents.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:
      Media Plane
      Signaling Plane
      Established Session
      Associated Media

3.2.2.  Signaling Server

   Definition:
      A device in the test topology that acts to create sessions between Emulated Agents in the media plane.  This device is either a DUT or a component of a SUT.

   Discussion:
      The DUT MUST be RFC 3261-capable network equipment such as a Registrar, Redirect Server, User Agent Server, Stateless Proxy, or Stateful Proxy.  A DUT MAY also include B2BUA or SBC functionality.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:
      Signaling Plane

3.2.3.  SIP-Aware Stateful Firewall

   Definition:
      A device in the test topology that provides Denial-of-Service (DoS) protection to the Signaling and Media Planes for the Emulated Agents and Signaling Server.

   Discussion:
      The SIP-Aware Stateful Firewall MAY be an internal component or function of the Signaling Server.  The SIP-Aware Stateful Firewall MAY be a standalone device.  If it is a standalone device, it MUST be paired with a Signaling Server and MUST be benchmarked as part of a SUT.  SIP-Aware Stateful Firewalls MAY include Network Address Translation (NAT) functionality.  Ideally, the inclusion of the SIP-Aware Stateful Firewall in a SUT causes no degradation of the measured performance benchmarks.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:

3.2.4.  SIP Transport Protocol

   Definition:
      The protocol used for transport of the Signaling Plane messages.

   Discussion:
      Performance benchmarks may vary for the same SIP networking device depending upon whether TCP, UDP, TLS, SCTP, or another transport layer protocol is used.  For this reason it MAY be necessary to measure the SIP Performance Benchmarks using these various transport protocols.  Performance Benchmarks MUST report the SIP Transport Protocol used to obtain the benchmark results.

   Measurement Units:
      TCP, UDP, SCTP, TLS over TCP, TLS over UDP, or TLS over SCTP

   Issues:
      None.

   See Also:

3.3.  Test Setup Parameters

3.3.1.  Session Attempt Rate

   Definition:
      Configuration of the Emulated Agent for the number of sessions that the Emulated Agent attempts to establish with the DUT/SUT over a specified time interval.

   Discussion:
      The Session Attempt Rate can cause variation in performance benchmark measurements.  Since this is the number of sessions configured on the Tester, some sessions may not be successfully established on the DUT.  A session may be either an IS or an NS.

   Measurement Units:
      Session attempts per second

   Issues:
      None.

   See Also:
      Session
      Session Attempt

3.3.2.  IS Media Attempt Rate

   Definition:
      Configuration on the Emulated Agent for the number of ISs with Associated Media to be established at the DUT per continuous one-second time interval.

   Discussion:
      The IS Media Attempt Rate can cause variation in performance benchmark measurements.  This is the number of sessions configured on the Tester.  Some attempts may not result in successful sessions established on the DUT.  Media Sessions MUST be associated with an IS.  By including this definition we leave open the possibility that there may be an IS that does not include a media description.  In this document, however, we will assume that there is a one-to-one correspondence between IS session attempts and Media Session attempts.

   Measurement Units:
      session attempts per second (saps)

   Issues:
      None.
   See Also:
      IS

3.3.3.  Establishment Threshold Time

   Definition:
      Configuration of the Emulated Agent representing the amount of time that an Emulated Agent will wait before declaring a Session Attempt Failure.

   Discussion:
      This time duration is test dependent.
      It is RECOMMENDED that the Establishment Threshold Time value be set to Timer B (for ISs) or Timer F (for NSs) as specified in RFC 3261, Table 4 [RFC3261].  Following the default value of T1 (500 ms) specified in the table and a constant multiplier of 64 gives a value of 32 seconds for this timer (i.e., 500 ms * 64 = 32 s).

   Measurement Units:
      seconds

   Issues:
      None.

   See Also:
      Session Attempt Failure

3.3.4.  Media Packet Size

   Definition:
      Configuration on the Emulated Agent for a fixed size of packets used for media streams.

   Discussion:
      For a single benchmark test, all sessions use the same size packet for media streams.  The size of packets can cause variation in performance benchmark measurements.

   Measurement Units:
      bytes

   Issues:
      None.

   See Also:

3.3.5.  Media Offered Load

   Definition:
      Configuration of the Emulated Agent for the constant rate of Associated Media traffic offered by the Emulated Agent to the DUT/SUT for one or more Established Sessions of type IS.

   Discussion:
      The Media Offered Load to be used for a test MUST be reported with three components:
      1.  per Associated Media stream;
      2.  per IS;
      3.  aggregate.
      For a single benchmark test, all sessions use the same Media Offered Load per Media Stream.  There may be multiple Associated Media streams per IS.  The aggregate is the sum of all Associated Media for all ISs.

   Measurement Units:
      packets per second (pps)

   Issues:
      None.

   See Also:
      Established Session
      Invite Initiated Session
      Associated Media

3.3.6.  Media Session Hold Time

   Definition:
      Parameter configured at the Emulated Agent that represents the amount of time that the Associated Media for an Established Session of type IS will last.

   Discussion:
      The Associated Media streams may be bi-directional or uni-directional, as indicated in the test methodology.

   Measurement Units:
      seconds

   Issues:
      None.

   See Also:
      Associated Media
      Established Session
      Invite-initiated Session (IS)

3.3.7.  Loop Detection Option

   Definition:
      An option that causes a Proxy to check for loops in the routing of a SIP request before forwarding the request.

   Discussion:
      This is an optional process that a SIP proxy may employ; the process is described under Proxy Behavior in RFC 3261, Section 16.3 (Request Validation), and that section also contains suggestions as to how the option could be implemented.  Any procedure to detect loops will use processor cycles and hence could impact the performance of a proxy.

   Measurement Units:
      N/A

   Issues:
      None.

   See Also:

3.3.8.  Forking Option

   Definition:
      An option that enables a Proxy to fork requests to more than one destination.

   Discussion:
      This is a process that a SIP proxy may employ to find the UAS.  The option is described under Proxy Behavior in RFC 3261, Section 16.1.  A proxy that uses forking must maintain state information, and this will use processor cycles and memory.  Thus the use of this option could impact the performance of a proxy, and different implementations could produce different impacts.
      SIP supports serial or parallel forking.  When performing a test, the type of forking mode must be indicated.

   Measurement Units:
      The number of endpoints that will receive the forked invitation.  A value of 1 indicates that the request is destined to only one endpoint, a value of 2 indicates that the request is forked to two endpoints, and so on.  This is an integer value ranging between 1 and N inclusive, where N is the maximum number of endpoints to which the invitation is sent.
      Type of forking used, namely parallel or serial.

   Issues:
      None.

   See Also:
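   Purely as an informative illustration of how the Test Setup Parameters in this section might be recorded together for a single test run (this is not a requirement of this document), a hypothetical configuration is sketched below in Python; all names and values are examples only.

      # Hypothetical record collecting the Section 3.3 parameters for one run.
      T1 = 0.5  # seconds, RFC 3261 default T1

      test_setup = {
          "session_attempt_rate": 100,                  # session attempts per second (3.3.1)
          "is_media_attempt_rate": 100,                 # saps (3.3.2)
          "establishment_threshold_time": 64 * T1,      # 32 seconds, Timer B/F (3.3.3)
          "media_packet_size": 160,                     # bytes (3.3.4)
          "media_offered_load_pps": 50,                 # packets per second per stream (3.3.5)
          "media_session_hold_time": 30,                # seconds (3.3.6)
          "loop_detection": False,                      # Loop Detection Option (3.3.7)
          "forking": {"endpoints": 1, "mode": "parallel"},  # Forking Option (3.3.8)
          "sip_transport": "UDP",                       # SIP Transport Protocol (3.2.4)
      }

      print(test_setup["establishment_threshold_time"])  # 32.0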
3.4.  Benchmarks

3.4.1.  Registration Rate

   Definition:
      The maximum number of registrations that can be successfully completed by the DUT/SUT in a given time period.

   Discussion:
      This benchmark is obtained with zero failures, in which 100% of the registrations attempted by the Emulated Agent are successfully completed by the DUT/SUT.  The maximum value is obtained by testing to failure.  This means that the registration rate provisioned on the EA is raised progressively until a registration attempt failure is observed.

   Measurement Units:
      registrations per second (rps)

   Issues:
      None.

   See Also:

3.4.2.  Session Establishment Rate

   Definition:
      The maximum average rate at which the DUT/SUT can successfully establish sessions.

   Discussion:
      This metric is an average of maxima.  Each maximum is measured in a separate sample.  The Session Establishment Rate is the average of the maxima established in each individual sample.  In each sample, the maximum in question is the number of sessions successfully established in continuous one-second intervals with prior sessions remaining active.  This maximum is designated in the equation below as "rate at sample i".  The Session Establishment Rate is calculated using the following equation (n = number of samples):

                           n
                           --
                           \   rate at sample i
                           /
                           --
                          i = 1
                     ---------------------
                              (n)

      In each sample, the maximum is obtained by testing to failure.  With zero failures, 100% of the sessions introduced by the Emulated Agent are successfully established.  The maximum value is obtained by testing to failure.  This means that the session rate provisioned on the EA is raised progressively until a Session Attempt Failure is observed.  The maximum rate is the rate achieved in the interval prior to the interval in which the failure is observed.  Sessions may be IS or NS, or a mix of both, as defined in the particular test.

   Measurement Units:
      sessions per second (sps)

   Issues:
      None.

   See Also:
      Invite-initiated Sessions
      Non-INVITE initiated Sessions
      Session Attempt Rate
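   As an informative illustration only (not part of the benchmark definition), the following Python sketch shows one way a tester might obtain a per-sample maximum by testing to failure and then average the per-sample maxima as in the equation above.  The step size, sample values, and the offer_sessions() hook into the Emulated Agent are hypothetical.

      # Illustrative sketch: testing to failure in one sample, then averaging
      # the per-sample maxima.  offer_sessions(rate) is a hypothetical hook
      # that returns the number of Session Attempt Failures observed at the
      # offered rate.

      def max_rate_for_sample(offer_sessions, start_rate=10, step=10):
          """Raise the provisioned session rate until a Session Attempt
          Failure is observed; the maximum is the last rate with zero
          failures."""
          rate = start_rate
          last_good = 0
          while offer_sessions(rate) == 0:   # zero failures at this rate
              last_good = rate
              rate += step                   # raise the rate progressively
          return last_good

      # Average of the maxima measured in n separate samples.
      sample_maxima = [480, 500, 490, 510]   # hypothetical "rate at sample i" values
      session_establishment_rate = sum(sample_maxima) / len(sample_maxima)
      print(session_establishment_rate)      # 495.0 sessions per second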
3.4.3.  Session Capacity

   Definition:
      The maximum number of Established Sessions that can exist simultaneously on the DUT/SUT until Session Attempt Failure occurs.

   Discussion:
      When benchmarking Session Capacity, it is assumed that sessions are permanently established (i.e., they remain active for the duration of the test).  In order to make this measurement, the Session Duration MUST be set to infinity so that sessions remain established for the duration of the test.  The Session Capacity must be reported with the Session Attempt Rate used to reach the maximum.  Since Session Attempt Rate is a zero-loss measurement, there must be zero failures to achieve the Session Capacity.  The maximum is indicated at the Emulated Agent by the arrival of a SIP 4xx, 5xx, or 6xx response from the DUT/SUT.  Sessions may be IS or NS.

   Measurement Units:
      sessions

   Issues:
      None.

   See Also:
      Established Session
      Session Attempt Rate
      Session Attempt Failure

3.4.4.  Session Overload Capacity

   Definition:
      The maximum number of Established Sessions that can exist simultaneously on the DUT/SUT until it stops responding to Session Attempts.

   Discussion:
      Session Overload Capacity is measured after the Session Capacity is measured.  The Session Overload Capacity is greater than or equal to the Session Capacity.  When benchmarking Session Overload Capacity, continue to offer Session Attempts to the DUT/SUT after the first Session Attempt Failure occurs and measure Established Sessions until there is no SIP message response for the duration of the Establishment Threshold Time.  It is worth noting that the Session Establishment Performance is expected to decrease after the first Session Attempt Failure occurs.

   Measurement Units:
      Sessions

   Issues:
      None.

   See Also:
      Overload
      Session Capacity
      Session Attempt Failure

3.4.5.  Session Establishment Performance

   Definition:
      The percentage of Session Attempts that become Established Sessions over the duration of a benchmarking test.

   Discussion:
      Session Establishment Performance is a benchmark to indicate session establishment success for the duration of a test.  The duration for measuring this benchmark is to be specified in the Methodology.  The Session Duration SHOULD be configured to infinity so that sessions remain established for the entire test duration.
      Session Establishment Performance is calculated as shown in the following equation:

         Session Establishment  =  Total Established Sessions
             Performance           --------------------------
                                     Total Session Attempts

      Session Establishment Performance may be monitored in real time during a benchmarking test.  However, the reported benchmark MUST be based on the total measurements for the test duration.

   Measurement Units:
      Percent (%)

   Issues:
      None.

   See Also:
      Established Session
      Session Attempt

3.4.6.  Session Attempt Delay

   Definition:
      The average time measured at the Emulated Agent for a Session Attempt to result in an Established Session.

   Discussion:
      Time is measured from when the EA sends the first INVITE for the Call-ID in the case of an IS.  Time is measured from when the EA sends the first non-INVITE message in the case of an NS.  Session Attempt Delay MUST be measured for every established session to calculate the average.  Session Attempt Delay MUST be measured at the maximum Session Establishment Rate.

   Measurement Units:
      Seconds

   Issues:
      None.

   See Also:
      Session Establishment Rate
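   The short Python sketch below is informative only and is not part of the benchmark definitions; it shows how an Emulated Agent's log of attempt and establishment timestamps might be reduced to the Session Establishment Performance and Session Attempt Delay benchmarks defined above.  The record layout is hypothetical.

      # Illustrative reduction of a hypothetical EA log to two benchmarks
      # defined above.  Each record is (attempt_time, established_time or None).
      attempts = [
          (0.000, 0.031),
          (0.010, 0.045),
          (0.020, None),    # Session Attempt Failure: no Established Session
          (0.030, 0.070),
      ]

      established = [(t0, t1) for (t0, t1) in attempts if t1 is not None]

      # Percentage of Session Attempts that became Established Sessions (3.4.5).
      session_establishment_performance = 100.0 * len(established) / len(attempts)

      # Average time from Session Attempt to Established Session (3.4.6).
      session_attempt_delay = sum(t1 - t0 for (t0, t1) in established) / len(established)

      print(f"{session_establishment_performance:.1f} %")   # 75.0 %
      print(f"{session_attempt_delay * 1000:.1f} ms")        # 35.3 ms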
3.4.7.  IM Rate

   Definition:
      The maximum number of IM messages completed by the DUT/SUT in a given time period.

   Discussion:
      For a UAS, the definition of success is the receipt of an IM request and the subsequent sending of a final response.
      For a UAC, the definition of success is the sending of an IM request and the receipt of a final response to it.
      For a proxy, the definition of success is as follows:
      A.  the number of IM requests it receives from the upstream client MUST be equal to the number of IM requests it sent to the downstream server; and
      B.  the number of IM responses it receives from the downstream server MUST be equal to the number of IM requests sent to the downstream server; and
      C.  the number of IM responses it sends to the upstream client MUST be equal to the number of IM requests it received from the upstream client.

   Measurement Units:
      IM messages per second

   Issues:
      None.

   See Also:
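   As an informative illustration only, the following Python sketch expresses the proxy success conditions (A), (B), and (C) above as a simple check over hypothetical message counters; the counter names are not defined by this document.

      # Illustrative check of the proxy success conditions for the IM Rate
      # benchmark; the counter names are hypothetical.
      def proxy_im_success(req_in_upstream, req_out_downstream,
                           resp_in_downstream, resp_out_upstream):
          a = req_in_upstream == req_out_downstream      # condition A
          b = resp_in_downstream == req_out_downstream   # condition B
          c = resp_out_upstream == req_in_upstream       # condition C
          return a and b and c

      print(proxy_im_success(1000, 1000, 1000, 1000))    # True
      print(proxy_im_success(1000, 998, 998, 998))       # False: two requests lost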
4.  IANA Considerations

   This document requires no IANA considerations.

5.  Security Considerations

   Documents of this type do not directly affect the security of the Internet or corporate networks as long as benchmarking is not performed on devices or systems connected to production networks.  Security threats and how to counter these in SIP and the media layer are discussed in RFC 3261 [RFC3261], RFC 3550 [RFC3550], RFC 3711 [RFC3711], and various other drafts.  This document attempts to formalize a set of common terminology for benchmarking SIP networks.

6.  Acknowledgments

   The authors would like to thank Keith Drage, Cullen Jennings, Daryl Malas, Al Morton, and Henning Schulzrinne for invaluable contributions to this document.

7.  References

7.1.  Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate Requirement Levels", BCP 14, RFC 2119, March 1997.

   [RFC2544]  Bradner, S. and J. McQuaid, "Benchmarking Methodology for Network Interconnect Devices", RFC 2544, March 1999.

   [RFC3261]  Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston, A., Peterson, J., Sparks, R., Handley, M., and E. Schooler, "SIP: Session Initiation Protocol", RFC 3261, June 2002.

   [I-D.ietf-bmwg-sip-bench-meth]
              Poretsky, S., Gurbani, V., and C. Davids, "Methodology for Benchmarking SIP Networking Devices", draft-ietf-bmwg-sip-bench-meth-01 (work in progress), January 2010.

7.2.  Informational References

   [RFC2285]  Mandeville, R., "Benchmarking Terminology for LAN Switching Devices", RFC 2285, February 1998.

   [RFC1242]  Bradner, S., "Benchmarking terminology for network interconnection devices", RFC 1242, July 1991.

   [RFC3550]  Schulzrinne, H., Casner, S., Frederick, R., and V. Jacobson, "RTP: A Transport Protocol for Real-Time Applications", STD 64, RFC 3550, July 2003.

   [RFC3711]  Baugher, M., McGrew, D., Naslund, M., Carrara, E., and K. Norrman, "The Secure Real-time Transport Protocol (SRTP)", RFC 3711, March 2004.

   [I-D.ietf-sipping-overload-design]
              Hilt, V., Noel, E., Shen, C., and A. Abdelal, "Design Considerations for Session Initiation Protocol (SIP) Overload Control", draft-ietf-sipping-overload-design-02 (work in progress), July 2009.

   [I-D.ietf-sipping-overload-reqs]
              Rosenberg, J., "Requirements for Management of Overload in the Session Initiation Protocol", draft-ietf-sipping-overload-reqs-05 (work in progress), July 2008.

Appendix A.  White Box Benchmarking Terminology

   Session Attempt Arrival Rate

   Definition:
      The number of Session Attempts received at the DUT/SUT over a specified time period.

   Discussion:
      Session Attempts are indicated by the arrival of SIP INVITE or SUBSCRIBE/NOTIFY messages.  The Session Attempt Arrival Rate distribution can be any model selected by the user of this document.  It is important when comparing benchmarks of different devices that the same distribution model be used.  Common distributions are expected to be Uniform and Poisson.

   Measurement Units:
      Session attempts/sec

   Issues:
      None.

   See Also:
      Session Attempt

Authors' Addresses

   Scott Poretsky
   Allot Communications
   300 TradeCenter, Suite 4680
   Woburn, MA 08101
   USA

   Phone: +1 508 309 2179
   Email: sporetsky@allot.com

   Vijay K. Gurbani
   Bell Laboratories, Alcatel-Lucent
   1960 Lucent Lane
   Rm 9C-533
   Naperville, IL 60566
   USA

   Phone: +1 630 224 0216
   Email: vkg@alcatel-lucent.com

   Carol Davids
   Illinois Institute of Technology
   201 East Loop Road
   Wheaton, IL 60187
   USA

   Email: davids@iit.edu