2 Benchmarking Methodology Working Group C. Davids 3 Internet-Draft Illinois Institute of Technology 4 Intended status: Informational V. Gurbani 5 Expires: January 3, 2015 Bell Laboratories, 6 Alcatel-Lucent 7 S. Poretsky 8 Allot Communications 9 July 2, 2014 11 Terminology for Benchmarking Session Initiation Protocol (SIP) Devices: 12 Basic session setup and registration 13 draft-ietf-bmwg-sip-bench-term-11 15 Abstract 17 This document provides a terminology for benchmarking the Session 18 Initiation Protocol (SIP) performance of devices. Methodology 19 related to benchmarking SIP devices is described in the companion 20 methodology document. Using these two documents, benchmarks can be 21 obtained and compared for different types of devices such as SIP 22 Proxy Servers, Registrars and Session Border Controllers. The term 23 "performance" in this context means the capacity of the device-under- 24 test (DUT) to process SIP messages. Media streams are used only to 25 study how they impact the signaling behavior. The intent of the two 26 documents is to provide a normalized set of tests that will enable an 27 objective comparison of the capacity of SIP devices. Test setup 28 parameters and a methodology are necessary because SIP allows a wide 29 range of configuration and operational conditions that can influence 30 performance benchmark measurements. A standard terminology and 31 methodology will ensure that benchmarks have consistent definitions 32 and are obtained following the same procedures. 34 Status of this Memo 36 This Internet-Draft is submitted in full conformance with the 37 provisions of BCP 78 and BCP 79. 39 Internet-Drafts are working documents of the Internet Engineering 40 Task Force (IETF). Note that other groups may also distribute 41 working documents as Internet-Drafts. The list of current Internet- 42 Drafts is at http://datatracker.ietf.org/drafts/current/. 44 Internet-Drafts are draft documents valid for a maximum of six months 45 and may be updated, replaced, or obsoleted by other documents at any 46 time. It is inappropriate to use Internet-Drafts as reference 47 material or to cite them other than as "work in progress."
48 This Internet-Draft will expire on January 3, 2015. 50 Copyright Notice 52 Copyright (c) 2014 IETF Trust and the persons identified as the 53 document authors. All rights reserved. 55 This document is subject to BCP 78 and the IETF Trust's Legal 56 Provisions Relating to IETF Documents 57 (http://trustee.ietf.org/license-info) in effect on the date of 58 publication of this document. Please review these documents 59 carefully, as they describe your rights and restrictions with respect 60 to this document. Code Components extracted from this document must 61 include Simplified BSD License text as described in Section 4.e of 62 the Trust Legal Provisions and are provided without warranty as 63 described in the Simplified BSD License. 65 Table of Contents 67 1. Terminology . . . . . . . . . . . . . . . . . . . . . . . . . 4 68 2. Introduction . . . . . . . . . . . . . . . . . . . . . . . . . 4 69 2.1. Scope . . . . . . . . . . . . . . . . . . . . . . . . . . 6 70 3. Term Definitions . . . . . . . . . . . . . . . . . . . . . . . 7 71 3.1. Protocol Components . . . . . . . . . . . . . . . . . . . 7 72 3.1.1. Session . . . . . . . . . . . . . . . . . . . . . . . 7 73 3.1.2. Signaling Plane . . . . . . . . . . . . . . . . . . . 8 74 3.1.3. Media Plane . . . . . . . . . . . . . . . . . . . . . 8 75 3.1.4. Associated Media . . . . . . . . . . . . . . . . . . . 9 76 3.1.5. Overload . . . . . . . . . . . . . . . . . . . . . . . 9 77 3.1.6. Session Attempt . . . . . . . . . . . . . . . . . . . 9 78 3.1.7. Established Session . . . . . . . . . . . . . . . . . 10 79 3.1.8. Session Attempt Failure . . . . . . . . . . . . . . . 11 80 3.2. Test Components . . . . . . . . . . . . . . . . . . . . . 11 81 3.2.1. Emulated Agent . . . . . . . . . . . . . . . . . . . . 11 82 3.2.2. Signaling Server . . . . . . . . . . . . . . . . . . . 12 83 3.2.3. SIP Transport Protocol . . . . . . . . . . . . . . . . 12 84 3.3. Test Setup Parameters . . . . . . . . . . . . . . . . . . 13 85 3.3.1. Session Attempt Rate . . . . . . . . . . . . . . . . . 13 86 3.3.2. Establishment Threshold Time . . . . . . . . . . . . . 13 87 3.3.3. Session Duration . . . . . . . . . . . . . . . . . . . 14 88 3.3.4. Media Packet Size . . . . . . . . . . . . . . . . . . 14 89 3.3.5. Codec Type . . . . . . . . . . . . . . . . . . . . . . 15 90 3.4. Benchmarks . . . . . . . . . . . . . . . . . . . . . . . . 15 91 3.4.1. Session Establishment Rate . . . . . . . . . . . . . . 15 92 3.4.2. Registration Rate . . . . . . . . . . . . . . . . . . 16 93 3.4.3. Registration Attempt Rate . . . . . . . . . . . . . . 17 94 4. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 17 95 5. Security Considerations . . . . . . . . . . . . . . . . . . . 17 96 6. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 18 97 7. References . . . . . . . . . . . . . . . . . . . . . . . . . . 18 98 7.1. Normative References . . . . . . . . . . . . . . . . . . . 18 99 7.2. Informational References . . . . . . . . . . . . . . . . . 19 100 Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . . 19 102 1. Terminology 104 The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", 105 "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this 106 document are to be interpreted as described in BCP 14, RFC2119 107 [RFC2119]. RFC 2119 defines the use of these key words to help make 108 the intent of standards track documents as clear as possible. 
While 109 this document uses these keywords, this document is not a standards 110 track document. The term Throughput is defined in RFC2544 [RFC2544]. 112 For the sake of clarity and continuity, this document adopts the 113 template for definitions set out in Section 2 of RFC 1242 [RFC1242]. 115 The term Device Under Test (DUT) is defined in the following BMWG 116 document: 118 Device Under Test (DUT) (cf. Section 3.1.1 of RFC 2285 [RFC2285]). 120 Many commonly used SIP terms in this document are defined in RFC 3261 121 [RFC3261]. For convenience, the most important of these are 122 reproduced below. Use of these terms in this document is consistent 123 with their corresponding definition in the base SIP specification 124 [RFC3261] as amended by [RFC4320], [RFC5393] and [RFC6026]. 126 o Call Stateful: A proxy is call stateful if it retains state for a 127 dialog from the initiating INVITE to the terminating BYE request. 128 A call stateful proxy is always transaction stateful, but the 129 converse is not necessarily true. 130 o Stateful Proxy: A logical entity, as defined by [RFC3261], that 131 maintains the client and server transaction state machines during 132 the processing of a request. (Also known as a transaction 133 stateful proxy.) The behavior of a stateful proxy is further 134 defined in Section 16 of RFC 3261 [RFC3261]. A transaction 135 stateful proxy is not the same as a call stateful proxy. 136 o Back-to-back User Agent: A back-to-back user agent (B2BUA) is a 137 logical entity that receives a request and processes it as a user 138 agent server (UAS). In order to determine how the request should 139 be answered, it acts as a user agent client (UAC) and generates 140 requests. Unlike a proxy server, it maintains dialog state and 141 must participate in all requests sent on the dialogs it has 142 established. Since it is a concatenation of a UAC and a UAS, no 143 explicit definitions are needed for its behavior. 145 2. Introduction 147 Service Providers and IT Organizations deliver Voice over IP (VoIP) 148 and Multimedia network services based on the IETF Session Initiation 149 Protocol (SIP) [RFC3261]. SIP is a signaling protocol originally 150 intended to be used to dynamically establish, disconnect and modify 151 streams of media between end users. As it has evolved, it has been 152 adopted for use in a growing number of services and applications. 153 Many of these result in the creation of a media session, but some do 154 not. Examples of this latter group include text messaging and 155 subscription services. The set of benchmarking terms provided in 156 this document is intended for use with any SIP-enabled device 157 performing SIP functions in the interior of the network, whether or 158 not these result in the creation of media sessions. The performance 159 of end-user devices is outside the scope of this document. 161 A number of networking devices have been developed to support SIP- 162 based VoIP services. These include SIP Servers, Session Border 163 Controllers (SBC) and Back-to-back User Agents (B2BUA). These 164 devices contain a mix of voice and IP functions whose performance may 165 be reported using metrics defined by the equipment manufacturer or 166 vendor. The Service Provider or IT Organization seeking to compare 167 the performance of such devices will not be able to do so using these 168 vendor-specific metrics, whose conditions of test and algorithms for 169 collection are often unspecified.
171 SIP functional elements and the devices that include them can be 172 configured in many different ways and can be organized into various 173 topologies. These configuration and topological choices impact the 174 value of any chosen signaling benchmark. Unless these conditions-of- 175 test are defined, a true comparison of performance metrics across 176 multiple vendor implementations will not be possible. 178 Some SIP-enabled devices terminate or relay media as well as 179 signaling. The processing of media by the device impacts the 180 signaling performance. As a result, the conditions-of-test must 181 include information as to whether or not the device under test 182 processes media. If the device processes media during the test, a 183 description of the media must be provided. This document and its 184 companion methodology document [I-D.ietf-bmwg-sip-bench-meth] provide 185 a set of black-box benchmarks for describing and comparing the 186 performance of devices that incorporate the SIP User Agent Client and 187 Server functions and that operate in the network's core. 189 The definition of SIP performance benchmarks necessarily includes 190 definitions of Test Setup Parameters and a test methodology. These 191 enable the Tester to perform benchmarking tests on different devices 192 and to achieve comparable results. This document provides a common 193 set of definitions for Test Components, Test Setup Parameters, and 194 Benchmarks. All the benchmarks defined are black-box measurements of 195 the SIP signaling plane. The Test Setup Parameters and Benchmarks 196 defined in this document are intended for use with the companion 197 methodology document. 199 2.1. Scope 201 The scope of this document is summarized as follows: 202 o This terminology document describes SIP signaling performance 203 benchmarks for black-box measurements of SIP networking devices. 204 Stress and debug scenarios are not addressed in this document. 205 o The DUT must be RFC 3261-capable network equipment. This may be a 206 Registrar, Redirect Server, or Stateful Proxy. This document does 207 not require the intermediary to assume the role of a stateless 208 proxy. A DUT MAY also include B2BUA or SBC functionality. 209 o The Tester acts as multiple "Emulated Agents" (EA) that initiate 210 (or respond to) SIP messages as session endpoints and source (or 211 receive) associated media for established connections. 212 o SIP Signaling in the presence of media: 213 * The media performance is not benchmarked. 214 * Some tests require media, but the use of media is limited to 215 observing the performance of SIP signaling. Tests that require 216 media will annotate the media characteristics as a condition of 217 test. 218 * The type of DUT dictates whether the associated media streams 219 traverse the DUT. Both scenarios are within the scope of this 220 document. 221 * SIP is frequently used to create media streams; the signaling 222 plane and media plane are treated as orthogonal to each other 223 in this document. While many devices support the creation of 224 media streams, benchmarks that measure the performance of these 225 streams are outside the scope of this document and its 226 companion methodology document [I-D.ietf-bmwg-sip-bench-meth]. 227 Tests may be performed with or without the creation of media 228 streams. The presence or absence of media streams MUST be 229 noted as a condition of the test as the performance of SIP 230 devices may vary accordingly.
Even if media is used during 231 benchmarking, only the SIP performance will be benchmarked, not 232 the media performance or quality. 233 o Both INVITE and non-INVITE scenarios (registrations) are addressed 234 in this document. However, benchmarking SIP presence or 235 subscribe-notify extensions is not a part of this document. 236 o Different transports -- such as UDP, TCP, SCTP, or TLS -- may be 237 used. The specific transport mechanism MUST be noted as a 238 condition of the test as the performance of SIP devices may vary 239 accordingly. 240 o REGISTER and INVITE requests may be challenged or remain 241 unchallenged for authentication purposes. Whether or not the 242 REGISTER and INVITE requests are challenged is a condition of test 243 that will be recorded along with other such parameters that may 244 impact the SIP performance of the device or system under test. 246 o Re-INVITE requests are not considered in the scope of this document 247 since the benchmarks for INVITEs are based on the dialog created 248 by the INVITE and not on the transactions that take place within 249 that dialog. 250 o Only session establishment is considered for the performance 251 benchmarks. Session disconnect is not considered in the scope of 252 this document. This is because our goal is to determine the 253 maximum capacity of the device or system under test, that is, the 254 number of simultaneous SIP sessions that the device or system can 255 support. It is true that there are BYE requests being created 256 during the test process. These transactions do contribute to the 257 load on the device or system under test and thus are accounted for 258 in the metric we derive. We do not seek a separate metric for the 259 number of BYE transactions a device or system can support. 260 o IMS-specific scenarios are not considered, but test cases can be 261 applied with 3GPP-specific SIP signaling and the P-CSCF as a DUT. 263 3. Term Definitions 265 3.1. Protocol Components 267 3.1.1. Session 269 Definition: 270 The combination of signaling and media messages and associated 271 processing that enable a single SIP-based audio or video call, or 272 SIP registration. 274 Discussion: 275 The term "session" commonly implies a media session. In this 276 document the term is extended to cover the signaling and any media 277 specified and invoked by the corresponding signaling. 279 Measurement Units: 280 N/A. 282 Issues: 283 None. 285 See Also: 286 Media Plane 287 Signaling Plane 288 Associated Media 290 3.1.2. Signaling Plane 292 Definition: 293 The plane in which SIP messages [RFC3261] are exchanged between 294 SIP Agents [RFC3261]. 296 Discussion: 297 SIP messages are used to establish sessions in several ways: 298 directly between two User Agents [RFC3261], through a Proxy Server 299 [RFC3261], or through a series of Proxy Servers. The Session 300 Description Protocol (SDP) is included in the Signaling Plane. 302 Measurement Units: 303 N/A. 305 Issues: 306 None. 308 See Also: 309 Media Plane 310 EAs 312 3.1.3. Media Plane 314 Definition: 315 The data plane in which one or more media streams and their 316 associated media control protocols are exchanged between User 317 Agents after a media connection has been created by the exchange 318 of signaling messages in the Signaling Plane. 320 Discussion: 321 Media may also be known as the "bearer channel". The Media Plane 322 MUST include the media control protocol, if one is used, and the 323 media stream(s). Examples of media are audio and video. The 324 media streams are described in the SDP of the Signaling Plane. 326 Measurement Units: 327 N/A. 329 Issues: 330 None. 332 See Also: 333 Signaling Plane
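The relationship between the two planes can be illustrated with a short, self-contained sketch. The sketch is not part of the terminology: the SDP lines, address, port, and codec below are invented for this example, and the helper function is a hypothetical aid that merely lists the "m=" lines, each of which describes one media stream carried in the Media Plane (see also Section 3.1.4).

# Illustrative only: a minimal SDP body carried in the Signaling Plane and
# a helper that lists its "m=" lines.  Each "m=" line describes one media
# stream exchanged in the Media Plane.  The address, port, and codec shown
# are made up for this example.
SAMPLE_SDP_LINES = [
    "v=0",
    "o=ea1 2890844526 2890844526 IN IP4 192.0.2.10",
    "s=-",
    "c=IN IP4 192.0.2.10",
    "t=0 0",
    "m=audio 49170 RTP/AVP 0",
    "a=rtpmap:0 PCMU/8000",
]

def media_lines(sdp_lines):
    """Return the 'm=' lines of an SDP body, one per media stream."""
    return [line for line in sdp_lines if line.startswith("m=")]

print(media_lines(SAMPLE_SDP_LINES))   # ['m=audio 49170 RTP/AVP 0']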
335 3.1.4. Associated Media 337 Definition: 338 Media that corresponds to an 'm' line in the SDP payload of the 339 Signaling Plane. 341 Discussion: 342 Any media protocol MAY be used. 344 Measurement Units: 345 N/A. 347 Issues: 348 None. 350 3.1.5. Overload 352 Definition: 353 Overload is defined as the state where a SIP server does not have 354 sufficient resources to process all incoming SIP messages 355 [RFC6357]. 357 Discussion: 358 The distinction between an overload condition and other failure 359 scenarios is outside the scope of black-box testing and of this 360 document. Under overload conditions, all or a percentage of 361 Session Attempts will fail due to lack of resources. In black-box 362 testing, the cause of the failure is not explored. The fact that a 363 failure occurred, for whatever reason, will trigger the tester to 364 reduce the offered load, as described in the companion methodology 365 document [I-D.ietf-bmwg-sip-bench-meth]. SIP server resources 366 may include CPU processing capacity, network bandwidth, input/ 367 output queues, or disk resources. Any combination of resources 368 may be fully utilized when a SIP server (the DUT) is in the 369 overload condition. For proxy-only (or intermediary) devices, it 370 is expected that the proxy will be driven into overload based on 371 the delivery rate of signaling requests. 373 Measurement Units: 374 N/A. 376 3.1.6. Session Attempt 377 Definition: 378 A SIP INVITE or REGISTER request sent by the EA that has not 379 received a final response. 381 Discussion: 382 The attempted session may be either an invitation to an audio/ 383 video communication or a registration attempt. When counting the 384 number of session attempts, we include all requests that are 385 rejected for lack of authentication information. The EA needs to 386 record the total number of session attempts, including those 387 attempts that are routinely rejected by a proxy that requires the 388 UA to authenticate itself. The EA is provisioned to deliver a 389 specific number of session attempts per second, but it must 390 also count the actual number of session attempts made in a given time 391 interval. 393 Measurement Units: 394 N/A. 396 Issues: 397 None. 399 See Also: 400 Session 401 Session Attempt Rate 403 3.1.7. Established Session 405 Definition: 406 A SIP session for which the EA acting as the UE/UA has received a 407 200 OK message. 409 Discussion: 410 An Established Session may result from either an invitation to an audio/ 411 video communication or a registration attempt. Early dialogs 412 for INVITE requests are out of scope for this work. 414 Measurement Units: 415 N/A. 417 Issues: 418 None. 420 See Also: 422 None. 424 3.1.8. Session Attempt Failure 426 Definition: 427 A session attempt that does not result in an Established Session. 429 Discussion: 430 The session attempt failure may be indicated by the following 431 observations at the EA: 432 1. Receipt of a SIP 3xx-, 4xx-, 5xx-, or 6xx-class response to a 433 Session Attempt. 434 2. The lack of any received SIP response to a Session Attempt 435 within the Establishment Threshold Time (cf. Section 3.3.2). 437 Measurement Units: 438 N/A. 440 Issues: 441 None. 443 See Also: 444 Session Attempt
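The two indications listed above can be summarized in a brief, hypothetical sketch of how an EA might classify the outcome of a single Session Attempt. The function name, its inputs, and the 32-second default threshold (taken from Section 3.3.2) are illustrative assumptions only; this document does not define an implementation.

# Hypothetical sketch: one way an EA might classify a Session Attempt,
# based on the two indications listed in Section 3.1.8.  Names, inputs,
# and the default threshold are assumptions made for this sketch only.
from typing import Optional

def classify_attempt(final_status: Optional[int], elapsed_s: float,
                     establishment_threshold_s: float = 32.0) -> str:
    """Classify a Session Attempt as observed at the EA."""
    if final_status is None:
        if elapsed_s > establishment_threshold_s:
            return "session-attempt-failure"  # no response within the threshold
        return "pending"                      # still awaiting a final response
    if 200 <= final_status < 300:
        return "established-session"          # see Section 3.1.7
    if 300 <= final_status < 700:
        return "session-attempt-failure"      # 3xx- to 6xx-class final response
    return "pending"                          # 1xx responses are provisional

For example, classify_attempt(486, 0.8) and classify_attempt(None, 40.0) both report a Session Attempt Failure, while classify_attempt(200, 0.8) reports an Established Session.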
446 3.2. Test Components 448 3.2.1. Emulated Agent 450 Definition: 451 A device in the test topology that initiates/responds to SIP 452 messages as one or more session endpoints and, wherever 453 applicable, sources/receives Associated Media for Established 454 Sessions. 456 Discussion: 457 The EA functions in the Signaling and Media Planes. The Tester 458 may act as multiple EAs. 460 Measurement Units: 461 N/A. 463 Issues: 464 None. 466 See Also: 467 Media Plane 468 Signaling Plane 469 Established Session 470 Associated Media 472 3.2.2. Signaling Server 474 Definition: 475 Device in the test topology that facilitates the creation of 476 sessions between EAs. This device is the DUT. 478 Discussion: 479 The DUT is an RFC 3261-capable network intermediary such as a 480 Registrar, Redirect Server, Stateful Proxy, B2BUA or SBC. 482 Measurement Units: 483 N/A. 485 Issues: 486 None. 488 See Also: 489 Signaling Plane 491 3.2.3. SIP Transport Protocol 493 Definition: 494 The protocol used for transport of the Signaling Plane messages. 496 Discussion: 497 Performance benchmarks may vary for the same SIP networking device 498 depending upon whether TCP, UDP, TLS, SCTP, WebSockets [RFC7118], 499 or any future transport-layer protocol is used. For this reason, 500 it is necessary to measure the SIP Performance Benchmarks using 501 these various transport protocols. Performance Benchmarks MUST 502 report the SIP Transport Protocol used to obtain the benchmark 503 results. 505 Measurement Units: 506 While these are not units of measure, they are attributes of the 507 test that contribute to the value of the 508 measurements to be taken. TCP, UDP, SCTP, TLS over TCP, TLS over 509 UDP, TLS over SCTP, and WebSockets are among the possible values 510 to be recorded as part of the test. 512 Issues: 513 None. 515 See Also: 516 None. 518 3.3. Test Setup Parameters 520 3.3.1. Session Attempt Rate 522 Definition: 523 Configuration of the EA for the number of sessions per second 524 (sps) that the EA attempts to establish using the services of the 525 DUT. 527 Discussion: 528 The Session Attempt Rate is the number of sessions per second that 529 the EA sends toward the DUT. Some of the sessions attempted may 530 not result in a session being established. 532 Measurement Units: 533 Session attempts per second 535 Issues: 536 None. 538 See Also: 539 Session 540 Session Attempt 542 3.3.2. Establishment Threshold Time 544 Definition: 545 Configuration of the EA that represents the amount of time that an 546 EA client will wait for a response from an EA server before 547 declaring a Session Attempt Failure. 549 Discussion: 550 This time duration is test dependent. 552 It is RECOMMENDED that the Establishment Threshold Time value be 553 set to Timer B or Timer F as specified in RFC 3261, Table 4 554 [RFC3261]. Using the default value of T1 (500 ms) specified in 555 the table and a constant multiplier of 64 gives a value of 32 556 seconds for this timer (i.e., 500 ms * 64 = 32 s). 558 Measurement Units: 559 Seconds 561 Issues: 562 None. 564 See Also: 565 None. 567 3.3.3. Session Duration 569 Definition: 570 Configuration of the EA that represents the amount of time that 571 the SIP dialog is intended to exist between the two EAs associated 572 with the test. 574 Discussion: 575 The time at which the BYE is sent will control the Session 576 Duration. 578 Measurement Units: 579 Seconds 581 Issues: 582 None. 584 See Also: 585 None.
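Sections 3.2.3 and 3.3.1 through 3.3.3 define values that a Tester records as conditions of test. The sketch below shows one possible way to group them for a Test Setup Report; the field names and the example values are assumptions made for illustration, not requirements of this document.

# Illustrative sketch of a record a Tester might keep for the Test Setup
# Report.  Field names and example values are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class TestSetup:
    sip_transport: str                # e.g., "UDP" or "TLS over TCP" (Section 3.2.3)
    session_attempt_rate_sps: int     # provisioned attempts per second (Section 3.3.1)
    establishment_threshold_s: float  # wait before declaring failure (Section 3.3.2)
    session_duration_s: float         # time before the EA sends the BYE (Section 3.3.3)

# Example values only; 64 * 0.5 s reproduces the 32-second Timer B/F default.
setup = TestSetup(sip_transport="UDP",
                  session_attempt_rate_sps=100,
                  establishment_threshold_s=64 * 0.5,
                  session_duration_s=60.0)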
587 3.3.4. Media Packet Size 589 Definition: 590 Configuration on the EA for a fixed number of frames or samples to 591 be sent in each RTP packet of the media stream when the test 592 involves Associated Media. 594 Discussion: 595 This document describes a method to measure SIP performance. If 596 the DUT is processing media as well as SIP messages, the media 597 processing will potentially slow down the SIP processing and lower 598 the SIP performance metric. The tests with associated media are 599 designed for audio codecs, and the assumption was made that larger 600 media packets would require more processor time. This document 601 does not define parameters applicable to video codecs. 603 For a single benchmark test, media sessions use a defined number 604 of samples or frames per RTP packet. If two SBCs, for example, 605 use the same codec but one puts more frames into the RTP packet, 606 this might cause variation in the performance benchmark results. 608 Measurement Units: 609 An integer number of frames or samples, depending on whether 610 hybrid- or sample-based codecs are used, respectively. 612 Issues: 613 None. 615 See Also: 616 None. 618 3.3.5. Codec Type 620 Definition: 621 The name of the codec used to generate the media session. 623 Discussion: 624 For a single benchmark test, all sessions use the same size packet 625 for media streams. The size of packets can cause a variation in 626 the performance benchmark measurements. 628 Measurement Units: 629 This is a textual name (alphanumeric) assigned to uniquely 630 identify the codec. 632 Issues: 633 None. 634 See Also: 635 None. 637 3.4. Benchmarks 639 3.4.1. Session Establishment Rate 641 Definition: 642 The maximum value of the Session Attempt Rate that the DUT can 643 handle for an extended, pre-defined period with zero failures. 645 Discussion: 647 This benchmark is obtained with zero failures. The session attempt 648 rate provisioned on the EA is raised and lowered as described in 649 the algorithm in the accompanying methodology document 650 [I-D.ietf-bmwg-sip-bench-meth], until a traffic load over the 651 period of time necessary to attempt N sessions completes without 652 failure, where N is a parameter specified in the algorithm and 653 recorded in the Test Setup Report. 655 Measurement Units: 656 sessions per second (sps) 658 Issues: 659 None. 661 See Also: 662 Invite-Initiated Sessions 663 Non-Invite-Initiated Sessions 664 Session Attempt Rate
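The authoritative search procedure is the algorithm in the companion methodology document [I-D.ietf-bmwg-sip-bench-meth]. The sketch below only illustrates the general idea of raising and lowering the provisioned Session Attempt Rate until the largest rate with zero failures over N attempted sessions is found; run_load() is a hypothetical helper, not an interface defined by either document.

# Simplified sketch only; the normative procedure is defined in the
# companion methodology document.  run_load(rate_sps, n) is a hypothetical
# helper that provisions the EA at rate_sps, attempts n sessions, and
# returns the number of Session Attempt Failures observed.
def session_establishment_rate(run_load, n, low=1, high=1000):
    """Return the highest attempt rate (sps) with zero failures over n attempts."""
    best = 0
    while low <= high:
        rate = (low + high) // 2
        if run_load(rate, n) == 0:   # zero failures: try a higher rate
            best = rate
            low = rate + 1
        else:                        # failures observed: reduce the offered load
            high = rate - 1
    return best

The same style of search applies to the Registration Rate benchmark in Section 3.4.2, with REGISTER requests in place of INVITEs.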
666 3.4.2. Registration Rate 668 Definition: 669 The maximum value of the Registration Attempt Rate that the DUT 670 can handle for an extended, pre-defined period with zero 671 failures. 673 Discussion: 674 This benchmark is obtained with zero failures. The registration 675 rate provisioned on the Emulated Agent is raised and lowered as 676 described in the algorithm in the companion methodology draft 677 [I-D.ietf-bmwg-sip-bench-meth], until a traffic load consisting of 678 registration attempts at the given attempt rate over the period of 679 time necessary to attempt N registrations completes without 680 failure, where N is a parameter specified in the algorithm and 681 recorded in the Test Setup Report. 682 This benchmark is described separately from the Session 683 Establishment Rate (Section 3.4.1), although it could be 684 considered a special case of that benchmark, since a REGISTER 685 request is a request for a Non-Invite-Initiated session. It is 686 defined separately because it is a very important benchmark for 687 most SIP installations. An example demonstrating its use is an 688 avalanche restart, where hundreds of thousands of endpoints 689 register simultaneously following a power outage. In such a case, 690 an authoritative measurement of the capacity of the device to 691 register endpoints is useful to the network designer. 692 Additionally, in certain controlled networks, there appears to be 693 a difference between the registration rate of new endpoints and 694 the registration rate of existing endpoints (register refreshes). 696 This benchmark can capture these differences as well. 698 Measurement Units: 699 registrations per second (rps) 701 Issues: 702 None. 704 See Also: 705 None. 707 3.4.3. Registration Attempt Rate 709 Definition: 710 Configuration of the EA for the number of registrations per second 711 that the EA attempts to send to the DUT. 713 Discussion: 714 The Registration Attempt Rate is the number of registration 715 requests per second that the EA sends toward the DUT. 717 Measurement Units: 718 Registrations per second (rps) 720 Issues: 721 None. 723 See Also: Non-Invite-Initiated Session 725 4. IANA Considerations 727 This document requires no IANA considerations. 729 5. Security Considerations 731 Documents of this type do not directly affect the security of 732 Internet or corporate networks as long as benchmarking is not 733 performed on devices or systems connected to production networks. 734 Security threats and how to counter these in SIP and the media layer 735 are discussed in RFC 3261 [RFC3261], RFC 3550 [RFC3550], and RFC 3711 736 [RFC3711]. This document attempts to formalize a set of common 737 terminology for benchmarking SIP networks. Packets with unintended 738 and/or unauthorized DSCP or IP precedence values may present security 739 issues. Determining the security consequences of such packets is out 740 of scope for this document. 742 6. Acknowledgments 744 The authors would like to thank Keith Drage, Cullen Jennings, Daryl 745 Malas, Al Morton, and Henning Schulzrinne for invaluable 746 contributions to this document. Dale Worley provided an extensive 747 review that led to improvements in the documents. We are grateful 748 to Barry Constantine, William Cerveny, and Robert Sparks for providing 749 valuable comments during the document's last calls and expert 750 reviews. 752 7. References 754 7.1. Normative References 756 [RFC2119] Bradner, S., "Key words for use in RFCs to Indicate 757 Requirement Levels", BCP 14, RFC 2119, March 1997. 759 [RFC2544] Bradner, S. and J. McQuaid, "Benchmarking Methodology for 760 Network Interconnect Devices", RFC 2544, March 1999. 762 [RFC3261] Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston, 763 A., Peterson, J., Sparks, R., Handley, M., and E. 764 Schooler, "SIP: Session Initiation Protocol", RFC 3261, 765 June 2002. 767 [RFC5393] Sparks, R., Lawrence, S., Hawrylyshen, A., and B. Campen, 768 "Addressing an Amplification Vulnerability in Session 769 Initiation Protocol (SIP) Forking Proxies", RFC 5393, 770 December 2008. 772 [RFC4320] Sparks, R., "Actions Addressing Identified Issues with the 773 Session Initiation Protocol's (SIP) Non-INVITE 774 Transaction", RFC 4320, January 2006. 776 [RFC6026] Sparks, R. and T. Zourzouvillys, "Correct Transaction 777 Handling for 2xx Responses to Session Initiation Protocol 778 (SIP) INVITE Requests", RFC 6026, September 2010. 780 [I-D.ietf-bmwg-sip-bench-meth] 781 Davids, C., Gurbani, V., and S.
Poretsky, "SIP Performance 782 Benchmarking Methodology", 783 draft-ietf-bmwg-sip-bench-meth-10 (work in progress), 784 May 2014. 786 7.2. Informational References 788 [RFC2285] Mandeville, R., "Benchmarking Terminology for LAN 789 Switching Devices", RFC 2285, February 1998. 791 [RFC1242] Bradner, S., "Benchmarking terminology for network 792 interconnection devices", RFC 1242, July 1991. 794 [RFC3550] Schulzrinne, H., Casner, S., Frederick, R., and V. 795 Jacobson, "RTP: A Transport Protocol for Real-Time 796 Applications", STD 64, RFC 3550, July 2003. 798 [RFC3711] Baugher, M., McGrew, D., Naslund, M., Carrara, E., and K. 799 Norrman, "The Secure Real-time Transport Protocol (SRTP)", 800 RFC 3711, March 2004. 802 [RFC6357] Hilt, V., Noel, E., Shen, C., and A. Abdelal, "Design 803 Considerations for Session Initiation Protocol (SIP) 804 Overload Control", RFC 6357, August 2011. 806 [RFC7118] Baz Castillo, I., Millan Villegas, J., and V. Pascual, 807 "The Websocket Protocol as a Transport for the Session 808 Initiation Protocol (SIP)", RFC 7118, January 2014. 810 Authors' Addresses 812 Carol Davids 813 Illinois Institute of Technology 814 201 East Loop Road 815 Wheaton, IL 60187 816 USA 818 Phone: +1 630 682 6024 819 Email: davids@iit.edu 821 Vijay K. Gurbani 822 Bell Laboratories, Alcatel-Lucent 823 1960 Lucent Lane 824 Rm 9C-533 825 Naperville, IL 60566 826 USA 828 Phone: +1 630 224 0216 829 Email: vkg@bell-labs.com 830 Scott Poretsky 831 Allot Communications 832 300 TradeCenter, Suite 4680 833 Woburn, MA 08101 834 USA 836 Phone: +1 508 309 2179 837 Email: sporetsky@allot.com