Network Working Group                                          M. Barnes
Internet-Draft                                                   Polycom
Intended status: Informational                                  A. Doria
Expires: January 7, 2012                  Lulea University of Technology
                                                           H. Alvestrand
                                                                  Google
                                                            B. Carpenter
                                                  University of Auckland
                                                            July 6, 2011


            General Area Review Team (Gen-ART) Experiences
                    draft-doria-genart-experience-04

Abstract

The General Area Review Team (Gen-ART) has been doing reviews of Internet-Drafts (I-Ds) since 2004.  This document discusses the experience and the lessons learned over the past 7 years of this process.  The review team initially reviewed the I-Ds before each of the IESG telechats.  Since late 2005, review team members have been assigned to review I-Ds during IETF Last Call, unless no IETF Last Call is necessary for the I-D.  The same reviewer then reviews any updates when the I-D is placed on an IESG telechat agenda.

Status of this Memo

This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.

Internet-Drafts are working documents of the Internet Engineering Task Force (IETF).  Note that other groups may also distribute working documents as Internet-Drafts.  The list of current Internet-Drafts is at http://datatracker.ietf.org/drafts/current/.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time.  It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

This Internet-Draft will expire on January 7, 2012.

Copyright Notice

Copyright (c) 2011 IETF Trust and the persons identified as the document authors.  All rights reserved.
This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (http://trustee.ietf.org/license-info) in effect on the date of publication of this document.  Please review these documents carefully, as they describe your rights and restrictions with respect to this document.  Code Components extracted from this document must include Simplified BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Simplified BSD License.

Table of Contents

   1.  Introduction
   2.  Who are the Gen-ART review team members?
   3.  Goals of Gen-ART
   4.  Gen-ART Reviews
       4.1.  IETF LC Review Process
       4.2.  IESG Telechat Review Process
       4.3.  Form of Review
       4.4.  Gen-ART Process Overview
   5.  Secretarial Process
       5.1.  Maintaining review spreadsheet
       5.2.  Last Call Assignment procedure
       5.3.  Telechat Assignment procedure
       5.4.  Capturing reviews
   6.  Results
   7.  Impressions
       7.1.  Reviewers' Impressions
       7.2.  General Area Directors' Impressions
       7.3.  Gen-ART Secretaries' Impressions
   8.  Needed Improvements
   9.  Applicability
   10. Security considerations
   11. IANA considerations
   12. Acknowledgments
   13. Changes since Last Version
   14. Informative References
   Authors' Addresses

1. Introduction

The General Area Review Team (Gen-ART) was created personally by the General Area Director in 2004.  The review team has been retained by subsequent General Area Directors.  It has no official role in the IETF standards process, except as a set of individuals entitled, like everyone, to comment on Internet-Drafts.  Its Secretary, and the team of volunteer reviewers, serve at the invitation of the General AD.

Discussion of this document is intended to take place on the IETF mailing list in the absence of a better home.  In addition, comments may be specifically sent to the gen-art mailing list.

Note to RFC Editor: Please remove the above paragraph prior to publication.

2. Who are the Gen-ART review team members?

The reviewers are typically individuals who have a fair amount of experience within various IETF Working Groups (WGs), have authored WG I-Ds and RFCs, and are often considered to be subject matter experts (SMEs) in their particular areas of work.
The current review team is composed of such technical experts, including several WG chairs and past and current IAB members.  Several past and current ADs have served as reviewers.  Two past General ADs have also served as reviewers, with one currently serving.

Members of the review team sometimes excuse themselves from the team for various reasons, typically due to "day job" demands.  However, they often rejoin (for periods of time) as their schedules allow.  Also, some reviewers remain on the team while their review workload is decreased by assigning them just one I-D (at Last Call time) to review each month.  Section 12 provides a list of currently active reviewers, along with those who have served on the review team in the past.

3. Goals of Gen-ART

The original and continuing goal of the Gen-ART team was, and is, to offload some of the burden of IESG reviews from the General Area AD.  The load for the bi-weekly IESG reviews is often quite large; occasionally there are more than 20 I-Ds scheduled for discussion in a single telechat.  In addition, ADs have less than a week's notice for many of the I-Ds on the telechat agenda.

Gen-ART was based on a model that had proved productive in the OPS Directorate: quick review close to telechat time, to advise the AD on issues that remain serious.  By having a trusted group of reviewers read and evaluate the I-Ds, the General Area AD would be able to concentrate on those I-Ds where a concern was expressed by the reviewer.  The reviewers are expected to provide feedback based on a whole set of criteria, including the criteria summarized in Section 4.3.  The overall objective is to ensure that the I-Ds are well structured, can be easily understood, at least at a high level, and provide a reasonable basis for implementation (for standards-track I-Ds).

While other area (and WG) directorates/review teams existed prior to Gen-ART and more have been established since, the roles of each are fairly distinct.  Thus, there is little overlap between the goals and review criteria for the various review teams.  It is also very valuable for these other review teams to operate independently.  For example, when both Gen-ART reviews and Security Directorate (SecDir) reviews raise the same sorts of concerns, it's a clear red flag that the I-D needs more work before progressing.  In addition, due to the typical thoroughness (and objectivity) of the various review teams' reviews, the sponsoring AD and document shepherd are often able to work with the editors/WG (and vice versa, depending upon area and WG structure) to improve the overall quality of the final I-D.

Statistics from the Gen-ART reviews over the past 6+ years show a trend of increased quality and readiness for progression of I-Ds by the time they are placed on the telechat agenda.  Additional statistics are discussed in Section 6.

4. Gen-ART Reviews

4.1. IETF LC Review Process

While the original process was meant only for reviews just before the IESG telechat, it was decided to include IETF Last Call reviews in early 2005.  This initially seemed to overload the process and presented some difficulties.  However, over time it has proven to be quite effective.  Assigning the I-Ds at IETF LC time typically gives a reviewer more time to review an I-D.
And, in some cases, the IETF LC version is the one to appear on the telechat.  Thus, by the time I-Ds are added to the telechat agenda, a majority (typically at least 70%) have already been reviewed.  For those I-Ds that have been up-versioned, the amount of time dedicated to re-review depends upon the review summary for the IETF LC review.

The assignments at IETF LC time evolved to minimize the gap between LC announcements and assignment time, with the secretary doing LC assignments every Thursday night.  This typically allows the reviewer at least one week and sometimes two to three weeks to complete the review.  The reviews are obviously most helpful when done on or before the end of IETF LC.

The Last Call assignments are done on a fairly strict round-robin basis to ensure a fair workload amongst all the reviewers.  Reviewers who are unavailable (vacations, etc.) during the review period timeframe are excluded from that round of assignments, but remain in the same queue position for the next round.  The order is occasionally modified to avoid assigning an editor/author or WG chair their own I-Ds.  A reviewer may also NACK an assignment if they feel they may have some bias (although corporate affiliations are not considered to be sources of bias) or they don't feel they can review the I-D in a timely manner.

The assignment process is completely manual, although a spreadsheet tremendously facilitates the process.  The details are described in Section 5.  Ideally, this process could be automated.  However, manual intervention would still be required to maintain the appropriate available reviewer list (unless reviewers took on the task of maintaining their data in some sort of database).  Further details on the tools necessary to automate the entire process are provided in Section 8.

4.2. IESG Telechat Review Process

The process for reviewing I-Ds when they appear on the IESG agenda is as follows:

o  The "nearly final" IESG meeting agenda generally appears on Thursday night, less than one week before the IESG telechat.  The Gen-ART secretary uses this as the input for the assignment process.
o  For I-Ds reviewed at IETF Last Call, a new review is only asked for if the I-D is revised.  In this case the reviewer, typically the person who did the Last Call review, only needs to check that any open issues were resolved.  Often the draft will not have changed between IETF LC and the IESG telechat review.  Section 4.4 provides the step-by-step telechat review assignment process, with specific details on the maintenance of the review assignment data, which is maintained in the spreadsheets detailed in Section 5.

4.3. Form of Review

Rather than invent new guidelines, the Gen-ART requirements for the form of a review borrowed liberally from draft-carpenter-solutions-sirs-01, making adaptations for the special "late, quick review" case and the nature of the General Area's concerns.

Each review must start with a summary statement chosen from or adapted from the following list:

o  This draft is ready for publication as a [type] RFC, where [type] is Informational, Experimental, etc.  (In some cases, the review might recommend publication as a different [type] than requested by the author.)
o  This draft is basically ready for publication, but has nits that should be fixed before publication.
o  This draft is on the right track but has open issues, described in the review.
o  This draft has serious issues, described in the review, and needs to be rethought.
o  This draft has very fundamental issues, described in the review, and further work is not recommended.
o  Unfortunately, I don't have the expertise to review this draft.
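As a purely illustrative aid, the hypothetical Python fragment below captures these summary categories as an enumeration, together with the kind of short form a summary takes in the tracking spreadsheet described in Section 5.1 (e.g., "LC: -02 Ready").  Gen-ART itself records summaries by hand in a spreadsheet; no such code is part of the process, and the condensed wording is an assumption.

   # Hypothetical sketch only: the Gen-ART summary statements as an
   # enumeration, with wording condensed from the list above.
   from enum import Enum

   class ReviewSummary(Enum):
       READY = "Ready"
       READY_WITH_NITS = "Ready with nits"
       OPEN_ISSUES = "On the right track, but has open issues"
       SERIOUS_ISSUES = "Has serious issues and needs to be rethought"
       FUNDAMENTAL_ISSUES = "Has fundamental issues; further work not recommended"
       NO_EXPERTISE = "Reviewer lacks the expertise to review this draft"

   def summary_cell(review_type: str, version: str, summary: ReviewSummary) -> str:
       # Assumed short form matching the "Current Review Summary" field
       # described in Section 5.1, e.g. "LC: -02 Ready".
       return f"{review_type}: {version} {summary.value}"

   print(summary_cell("LC", "-02", ReviewSummary.READY))   # LC: -02 Ready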
The length of a review can vary greatly according to circumstances, and it is considered acceptable for purely editorial comments to be sent privately if it's obvious that the I-D needs substantial revision.  All substantive comments, however, must be included in the public review.  Wherever possible, comments should be written as suggestions for improvement rather than as simple criticism.  Explicit references to prior work and prior IETF discussion should be given whenever possible.

Reviewers are asked to review for all kinds of problems, from basic architectural or security issues, Internet-wide impact, and technical nits, to problems of form and format (such as IANA Considerations or incorrect references) and editorial issues.  Since these reviews are on I-Ds that are supposed to be finished, the review should consider "no issue too small" - but should cover the whole range from the general architectural level to the editorial level.

All reviews should apply generally agreed IETF criteria, such as:

o  [RFC1958]: The Architectural Principles of the Internet
o  [RFC3426]: General Architectural and Policy Considerations
o  [RFC3439]: Some Internet Architectural Guidelines and Philosophy
o  ID-Checklist: The "ID checklist" document maintained by the IESG
o  [I-D.rfc-editor-rfc2223bis]: Instructions to RFC Authors
o  [RFC5226]: BCP 26 - Guidelines for Writing an IANA Considerations Section in RFCs
o  [RFC3552]: Guidelines for Writing RFC Text on Security Considerations
o  any other applicable architectural or procedural documents.

It is considered important that reviews give precise references to such criteria when relevant to a comment.

Of special interest to the General Area, because they do not fall under any other area, are the following:

o  A clear description of why the I-D or protocol is useful to the Internet.
o  Adherence to IETF formalities such as capitalized "must", "should", etc. in normative statements, per the ID-Checklist.
o  Useful and reasonable IANA considerations.  Ensure that all necessary registries are defined/referenced, and ensure definition of and compliance with IANA assignment criteria.
o  Correct dependencies for normative references.
o  That it's written in reasonably clear English.
o  Checking the updates/obsoletes information.
o  Checking that things imported by reference, especially from other RFCs, make sense (notably definitions of terms, security considerations, and lists of criteria) and ensuring they are used as intended by the referenced document.
o  Running idnits and checking the output.
o  Examples (e.g., FQDNs, telephone numbers, IP addresses) are taken from the right spaces.

4.4. Gen-ART Process Overview

The following provides a general overview of the Gen-ART process along with some basic rules associated with assignments.  The very precise details of the secretary's process are provided in Section 5.
o  The availability of reviewers and the order of assignments for the next round of Last Call document assignments are updated weekly and are available in the directory where all the assignments and reviews are cached.
o  At telechat assignment time, all previously reviewed I-Ds are assigned to the reviewer who reviewed them previously, assuming that reviewer is available.  Otherwise, these I-Ds are assigned to a new person using the process described below.
o  The secretary attempts to avoid assigning I-Ds that might conflict with other IETF roles such as WG chairs, other directorates, etc.  However, in the cases where the secretary doesn't note the conflict, the reviewer should notify the secretary and the gen-art mailing list so another reviewer may be assigned.
o  It should be emphasized that assignment is never made according to a reviewer's technical specialty.  When it does happen that, for example, routing I-Ds fall to routing experts or MIBs fall to MIB doctors, it is coincidental.  To the reviewer, the choice looks random.
o  There is an attempt to evenly distribute I-Ds amongst reviewers at LC time by using a round-robin process, starting from where the previous week's assignments stopped.
o  Typically, there is no attempt made to actually equalize the load, as the length and complexity of the I-Ds are not taken into account in this process.  (Thus, a reviewer could end up with a couple of hundred-page I-Ds, but this is statistically rare.)  However, in the case of a reviewer that might receive more than one new LC I-D at one time, the secretary does try to ensure that both are not large I-Ds.
o  Once the assignments are made, the web pages that list the reviews and the assignments are posted.  Since the telechat agenda is not published until the end of the day on the Thursday prior to the telechat (i.e., one week prior), the secretary needs to complete the assignments on that Thursday evening.  This often requires working later in the evening and also requires an Internet connection even when traveling.
o  If the reviewers notice any problems or a conflict of interest, a bargaining process, shifting I-Ds from one reviewer to another, takes place.  The secretary updates the assignment files with any new assignments.
o  Once the review has been completed, the reviewer sends the review to the Gen-ART list, ideally using the template provided in the review assignment emails.  Typically, reviews are also sent to the authors, the responsible AD and the WG chairs/document shepherd.  The only case where this might not be done is when there are no issues found for a re-review and none had been found on an initial review.  Sending the review to the authors, ADs and/or WG chairs/document shepherds was originally voluntary but is now considered standard practice.  Reviewers may also send the reviews to the IETF discussion list, but that is entirely at the discretion of the reviewer, in which case the author must be copied on the review to ensure they see any follow-up discussion.  Reviewers may also send the comments to the WG; however, this typically causes the review to end up in the moderation queue, as most reviewers do not want to subscribe to the WG lists for the I-Ds they review.
Thus, it is expected that any of the original recipients (i.e., authors, WG chairs/document shepherd or responsible AD) may forward the review to the WG mailing list if they believe it is necessary.  In the past, sending these reviews resulted in confusion among authors, who may not have been expecting a Gen-ART review and may not be familiar with Gen-ART.  Thus, reviewers are reminded to prepend a description of Gen-ART and the purpose of the review.  This information is part of the standard template provided in the review assignment emails.
o  The secretary gathers the reviews, sometimes edits them for format, and records the review, including the synopsis, in the spreadsheet on the web pages.  This is typically done on Thursday.  This is one aspect of the process that can be easily delegated, such that one volunteer uploads all the reviews and the secretary need only update the fields in the spreadsheet.  If the reviewer has not provided a synopsis ("Summary" field in the template), the secretary makes a best guess based on the review details.  Note that in most cases the reviewers do include a synopsis.
o  Ideally, the reviews should be posted to the gen-art mailing list by close of business on the US East coast on Tuesday.  This is necessary to allow the General AD time to consider the reviews prior to the telechat.  If a review is received after Tuesday, it may not be read by the AD before the IESG telechat.  Due to time constraints, the spreadsheets containing review summaries/assignments are only updated on Thursday evenings, when the new LC assignments and upcoming telechat assignments are done.  Ideally, the reviews would be uploaded on the Tuesday prior to the telechat along with the updated spreadsheets.
o  If the AD concludes that the concerns raised by the reviewer warrant placing a DISCUSS comment on the I-D, the AD will do so, and the DISCUSS must be resolved before the I-D advances.  Usually, the reviewer will be involved in the resolution process, but the responsibility for the DISCUSS rests with the AD.

5. Secretarial Process

This section summarizes the details of managing the review materials, including the spreadsheet used to track all reviews and the HTML files containing the review assignments.

5.1. Maintaining review spreadsheet

A spreadsheet is used to enter all the I-Ds at the time of assignment and to capture all the reviews.  For IETF LC assignments, the assignments are completed before adding the I-Ds to the spreadsheet, as described in Section 5.2.  For telechat assignments, I-Ds are only added in the cases where there is no previous LC assignment.  For the other I-Ds, the appropriate fields are updated as described in Section 5.3.

All the reviews can be accessed from the spreadsheet via hyperlinks from specific fields, as summarized below.  The following information is maintained in the spreadsheet, in the order listed (an illustrative sketch of this record as a simple data structure appears at the end of this section):

1. "Chat/LC Date": indicates either the date on which the LC review is due or the date of the telechat.
2. "Document": Filename for the I-D, which includes a hyperlink to the IETF I-D tracker.
3. "Assigned": Name of the reviewer assigned to that I-D.
4. "Category": This field contains one of the following self-explanatory values: "Doc - WG", "Doc - Ind/AD", or "IETF LC".
Note that Gen-ART does not review I-Ds submitted directly to the RFC Editor.  The "IETF LC" value is entered for all I-Ds at LC time.  It is changed to one of the other appropriate values, based on the information in the telechat agenda.
5. "Previous Review": This includes a link to any previous reviews.  For example, when a doc appears on a telechat agenda, if an IETF LC review was done, this field is updated with the review summary for the LC review (i.e., the information from the "Current Review Summary" as described below is copied to this column).  The field is set to "New" when an I-D is first assigned/added to the spreadsheet.  In the case of returns, this field has a value of "Return" or "Return/IETF-LC" for I-Ds for which there is an LC review.  It should be noted that since Gen-ART started doing reviews at LC time, there seem to be far fewer returns on the agenda.
6. "Current Review Summary": This field includes the review type and version number of the document which is to be reviewed or has been reviewed (e.g., LC: -02).  When the field also contains a review summary after the review type/version number (e.g., Telechat: -06 Ready), the active hyperlink points to the review.  Occasionally, a reviewer will re-review an I-D prior to its telechat assignment, in which case the review is added to the spreadsheet but the date does not change, to maintain consistency in the date field, since the reviews themselves contain the review date.

The following summarizes the steps to add a new I-D to the spreadsheet:

1. In order to optimize steps, blank rows are first inserted for the number of new I-Ds to be added.
2. To minimize data entry, a row with default fields (including the links for the hyperlinks) is kept at the end of the file.  There is a separate default row for IETF LC versus Telechat assignments.  This row is copied into each of the new blank rows.  The dates are then entered (this allows double checking that all I-Ds are accounted for from the review assignments, especially LC).
3. The I-D name is then copied to the name field as well as being appended to the hyperlink for the "Review Summary" field.  The hyperlink is included as part of the default row.  This minimizes the steps needed to enter the reviews in the spreadsheet.
4. The data is also sorted by "Chat/LC Date", "Assigned" and "Document".  The file is then saved and closed.
5. The file is then reopened and saved as HTML.
6. The file is opened a second time and sorted by "Assigned", "Chat/LC Date" and "Document" to provide the I-D reviewers an easy way to find any outstanding assignments.
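The following hypothetical Python fragment is the sketch referred to above.  It models one row of the tracking spreadsheet using the column names from this section and writes it out as CSV; the actual spreadsheet is maintained by hand, so this is illustration only, and the sample draft name and reviewer are placeholders reused from the example in Section 5.4.

   # Hypothetical sketch only: one row of the review-tracking spreadsheet,
   # using the column names described in Section 5.1.  The real spreadsheet
   # is maintained by hand; this simply makes the data model concrete.
   import csv
   from dataclasses import dataclass, asdict

   @dataclass
   class ReviewRow:
       chat_lc_date: str            # "Chat/LC Date": LC due date or telechat date
       document: str                # "Document": I-D filename, hyperlinked to the tracker
       assigned: str                # "Assigned": name of the assigned reviewer
       category: str                # "Category": "Doc - WG", "Doc - Ind/AD", or "IETF LC"
       previous_review: str         # "Previous Review": earlier review summary, or "New"
       current_review_summary: str  # "Current Review Summary": e.g. "LC: -02 Ready"

   rows = [
       ReviewRow("2010-07-08", "draft-ietf-xyz-00", "Smith", "IETF LC",
                 "New", "LC: -00"),
   ]

   # Write the rows out as CSV (one of many possible storage formats).
   with open("genart-reviews.csv", "w", newline="") as f:
       writer = csv.DictWriter(f, fieldnames=list(asdict(rows[0]).keys()))
       writer.writeheader()
       for row in rows:
           writer.writerow(asdict(row))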
5.2. Last Call Assignment procedure

The secretary can cache the Last Call assignments as they are announced and/or check the IETF announcement mailing list archives.  The assignments are done on Thursday evening, along with any telechat assignments.  This optimizes the process in terms of batch changes to files.

The assignments are listed in an HTML file.  The following are the steps in creating that file:

1. The order of assignment is actually created the week before, with the details below.  Thus, before starting the new assignments, the current file is saved for editing for the following week.  The current filenaming convention is "reviewersyymmdd-lc.html" (e.g., for July 8th, 2010, the file reviewers100708-lc.html was created and the file for the following week is named reviewers100715-lc.html; see the illustrative sketch at the end of Section 5.3).
2. Since the file is already prepared with the appropriate ordering of reviewers, the assignments are done in the order of due dates.  The link to the I-D in the datatracker is copied into the assignment file along with the intended RFC status for each of the new LC I-Ds.
3. The "Due Date" paragraph from the Last Call announcement is shortened as follows: "IETF LC ends on:", keeping the date.
4. Once the assignment file is complete, the new I-Ds are added to the spreadsheet as described above.
5. The assignment file for the next week is then updated to reflect the next reviewer in the round-robin process, by simply cutting and pasting the names in the list in a block and removing any "one doc per month" reviewers (annotated with an "*") that have already received their monthly assignment.  If the next round of assignments occurs at the beginning of a new month, the "one doc per month" reviewers are added back into the list (in the normal "by first name alphabetical order").
6. The assignment files and updated spreadsheets are then cached on the Gen-ART server.
7. An email providing a link to the assignment file, along with the updated spreadsheets, is sent to the gen-art mailing list.  This email has a standard form, such that the reviewers can simply cut and paste the template to include the Gen-ART context statement and link to the FAQ.

5.3. Telechat Assignment procedure

Since LC assignments are now the starting point for Gen-ART I-D reviews, the telechat assignments are generally straightforward, as the majority of the I-Ds are already in the spreadsheet.  The following details the steps:

1. The telechat agenda is typically available around 6 PM PDT.  In order to create the assignment HTML file, the agenda is created from the email announcing the upcoming telechat agenda.  The filename has the format "reviewersyymmdd.html", with the date corresponding to the telechat date (versus the date of assignment, as is the case for Last Call assignments).
2. Rows are added to the agenda for the reviewers' names.
3. The reviewers' names are then added to the weekly assignment file.
4. As each reviewer is added to the assignment file, the review spreadsheet is updated as follows:
   * "Chat/LC Date" is changed to the telechat date.
   * The link to the LC review, if available, is copied as the link for the "Previous Review" column.
   * The text for the "Current Review Summary" is updated to reflect the new review type (i.e., Telechat) and version.
5. In the case of an I-D that did not go through IETF LC, a reviewer is assigned using the order in the file to be used for Last Call assignments for the next week.
6. Once the reviewer(s) have been determined, the LC assignment file for the next week is updated.
7. Any new I-Ds are then added to the spreadsheet (and the updates saved) per the steps described in Section 5.1.
8. The assignment files and updated spreadsheets are then cached on the server used for Gen-ART.
9. An email providing a link to the assignment file, along with the updated spreadsheets, is sent to the gen-art mailing list.  This email has a standard form, such that the reviewers can simply cut and paste the template to include the Gen-ART context statement and link to the FAQ.
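The sketch referenced in Section 5.2, step 1 appears below.  It is a hypothetical Python fragment that derives the assignment filenames from a date and rotates a reviewer queue in the round-robin manner described in Sections 4.1 and 5.2; the real process is manual, and the reviewer names are placeholders.

   # Hypothetical sketch only: assignment filenames and round-robin rotation,
   # following the conventions described in Sections 5.2 and 5.3.  The real
   # process is manual; the reviewer names below are placeholders.
   from datetime import date

   def lc_assignment_filename(assignment_date: date) -> str:
       # Last Call files are named for the date of assignment,
       # e.g. reviewers100708-lc.html for July 8th, 2010.
       return f"reviewers{assignment_date:%y%m%d}-lc.html"

   def telechat_assignment_filename(telechat_date: date) -> str:
       # Telechat files are named for the telechat date itself.
       return f"reviewers{telechat_date:%y%m%d}.html"

   def rotate_queue(queue, assigned_count):
       # The next round starts where the previous week's assignments stopped;
       # reviewers who were unavailable are simply not counted as assigned,
       # so they keep their place in the queue.
       return queue[assigned_count:] + queue[:assigned_count]

   queue = ["Reviewer A", "Reviewer B", "Reviewer C", "Reviewer D"]
   print(lc_assignment_filename(date(2010, 7, 8)))        # reviewers100708-lc.html
   print(telechat_assignment_filename(date(2010, 7, 15)))  # reviewers100715.html
   queue = rotate_queue(queue, assigned_count=2)           # next week starts at Reviewer C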
5.4. Capturing reviews

As noted in Section 4.4, the spreadsheet is typically updated with the review summaries on Thursday evenings, just prior to entering the data for that week's LC and any telechat assignments.  The following summarizes the steps to capture the reviews:

1. Currently, a volunteer is assisting the secretary in caching the email reviews as they arrive.
2. In the cases where the review is included inline in the body of the email, the review is cut and pasted into a text file and saved with the reviewer's last name appended to the filename - e.g., draft-ietf-xyz-00-smith.txt.
3. In the case where the review is included as an attachment to the email, the file can be directly saved and uploaded.
4. The volunteer uploads the reviews by around 5 PM CST on Thursdays; thus, they are available to the secretary at the time that week's assignments are done.  This sequence is necessary to ensure the information for I-Ds on the upcoming telechat is up-to-date.
5. The review summary is entered into the "Current Review Summary" field.  Note that the hyperlink to the review (added at assignment time) will automatically work once the file is uploaded.
6. Once all the reviews have been entered and the spreadsheets formatted, the review spreadsheet is saved and the files uploaded per the last three steps in Section 5.1.

6. Results

Over the past 7 years, Gen-ART has provided reviewing services to 3 ADs and has done around two thousand publicly available reviews.  The reviews have been executed with a team of around a dozen full-time reviewers and other reviewers receiving one I-D assignment each month.  There are currently 9 reviewers in the latter category.  The full-time reviewers receive 2-3 assignments each month.  In terms of improving quality, the number of I-Ds that are ready at the time of the telechat has increased since the reviews are now initiated at LC time.  Based on the data from 2007, there were over 250 I-Ds that were assigned at LC time that went through IESG review.  Of those 250 I-Ds, 80% of the LC reviews were completed (205 I-Ds).  Of the completed reviews, about 75% (144 I-Ds) were "Ready" at the time of the telechat.  Of those 144 I-Ds, roughly 1/4 had been deemed "Ready" (with no nits) at LC time (based on a sample of 50 reviews).  For the I-Ds that were not reviewed at LC time, only about 1/4 were deemed "Ready" when they were reviewed for the telechat.  So, doing the Gen-ART reviews at Last Call time does seem to improve the quality of the I-Ds for the telechat.

7. Impressions

This section is divided into 3 subsections: the impressions gathered from the Gen-ART reviewers, the impressions of the ADs for whom they worked, and the impressions of the secretaries of Gen-ART.

7.1. Reviewers' Impressions

The following comments are excerpted and edited from comments sent in by the reviewers of Gen-ART in response to the request:

"We'd like to ask you each to write a few lines about your personal experience and lessons learned as a Gen-ART reviewer."

o  We really do find problems, but we don't find problems with most I-Ds.
o  Comments seem to be in three areas: editorial/grammar, editorial/what-the-heck-does-this-mean, and actual problems.  I'm seeing fewer reviews in the first category, which is a good thing.
o  It is becoming rarer that we hear back "these guys have suffered enough, I'm voting no objection" (I'm remembering an LDAP I-D that had been around so long it had 2119 referenced AS A DRAFT - some people suffered a lot).
o  The direct assignment of reviews is necessary and effective.  As far as I can tell, it does not matter much what scheme is used to actually do the assignment.
o  Folks are very open to the reviews that come out of Gen-ART.  This somewhat surprised me because I have seen resistance to outside reviews in other cases.
o  The improvements that have come about (for example one of my latest, the sipping conference draft - whatever the outcome) have made a big difference to the comprehensibility and usability of the I-Ds - and provide a useful incentive to keep going.
o  Some form of review like this is desperately needed.  While most of the stuff we see is good, every once in a while really bad errors have made their way all the way to IESG vote.
o  Reading this stuff is interesting.  I like having a reason to read a wide range of materials.
o  I am more than convinced that this can be and is a valuable process.  It is IMO a pity that SIRS and so on did not take off, because this late-stage reviewing is a poor substitute for doing the same thing at a much earlier stage.  Very few of the drafts that have come past my screen are truly fully ready for IESG review.  It is actually a joy to find the occasional nugget that is both well written and a proper technical job, such that the review really can say 'This is ready'.
o  I have certainly found the process intellectually stimulating!  It encourages me to take a wider interest in what is going on in the IETF, but it consumes a fair bit of time to do a proper job and requires a very wide knowledge to be able to properly catch the cross-area implications: I hope (believe!) that this is something that one gets better at with experience and doing a few of these reviews.
o  There is probably a very limited pool of people who have both the time and the inclination to keep on doing these reviews.  It does require a fair bit of dedication.
o  It is difficult to avoid correcting the English, even if that is not really the point: often really bad English (whether from non-mother-tongue authors with a limited grasp or mother-tongue authors using informal language) obscures/corrupts what is being said or just makes it impossible to read.
o  Mostly authors welcome the comments: I think most of them understand the concept of 'ego-free reviewing' and we have generally been constructive rather than destructive.
o  Part of the job of Gen-ART is to think the unthinkable from another point of view, to challenge (apparently undocumented) assumptions and apply experience from other fields.

7.2. General Area Directors' Impressions

It should be noted that these impressions are from multiple General Area Directors; thus, the "I"s are not necessarily associated with a specific AD.

o  It's essential.  The reviewing load for the IESG DOES NOT SCALE.
o  Without Gen-ART, I would be a much less effective General AD.
o  In a single fortnight, for example, the IESG had 21 drafts on the agenda.  It is just impossible (to conscientiously review all the documents), and no wonder we sometimes miss serious issues.
o  So I think a distributed review team with o(30) trusted reviewers needs to be institutionalized.  I suspect that will need to be formalized in a BCP sooner or later - with their reviews having a formal position in the standards process, and the expectation that the whole IESG truly reviews all I-Ds being relaxed.
o  We've learned that polite, well-reasoned, constructive reviews are very positively received by authors and WGs.  Dismissive reviews are counter-productive.  And reviews sent in private eventually show up in public, so it's better to go public at the start.
o  Normally, LC reviews are available in good time for the draft to be revised before reaching the IESG agenda.  It is important that this happens, except for an emergency situation where the responsible AD has good reason to place the draft on the agenda immediately.  In that case it would be preferable for the AD to inform the Gen-ART team, so that the review can be expedited.
o  The other problem is a big detail - between late Thursday or early Friday, when the secretary sends out the assignments, and Wednesday, when the General Area AD likes to start filling in ballots based on the reviews received by COB on Tuesday, there are only three work days (plus possible volunteer time over the weekend).  Even with only one I-D to review, that may be a real challenge.  Sometimes, a lucky reviewer will get 130 pages (e.g., draft-ietf-nntpext-base-27).  That doesn't compute.
o  There are some mechanical issues.  The process followed is far too manual.  Everything needs to be robotic except for the judgment calls about which reviewer gets which draft.  Similarly, the reviewer should be able to just paste the review into a web form, click, and have it sent off to everyone appropriate and posted to the review site.

7.3. Gen-ART Secretaries' Impressions

Serving as the secretary of Gen-ART is a worthwhile experience.  From a personal point of view, it gives the secretary an easy way to track all of the work going through the IESG review process and to see how that work flows through the process.  Also, by reviewing and sometimes creating the one-line abstracts that go on the review web page, the secretary has an opportunity to really get a survey of the work being approved by the IETF.

The nature of these reviews is informal, and originally the reviews were only intended for the General Area AD, though they were made public.  During 2004 there was little if any interaction between authors and reviewers.  There was some discussion during 2004 about trying to expand the role of Gen-ART to a more formal, early review model, i.e., to evolve it into a form of SIRS.  The original Gen-ART secretary was against such a transformation because she felt it would risk something that worked.  She believed that the risk was inherent in formalizing the reviews and in adding mechanisms for standardizing the reviews that would result from that formalization.  Another concern involves the interaction between reviewers and authors.  As discussed above, it has become the practice to send reviews to the authors with an explanation about the nature of Gen-ART reviews.
While it is clear that this has resulted in improved RFCs, it has also resulted in an increased workload for the reviewers.

The secretary thinks that Gen-ART is an experiment that works well, but believes it is fragile.  The secretary is often concerned about overburdening reviewers, and feels it is her responsibility to keep them from burning out.  Adding additional reviewers to the review team would help to alleviate this concern.  In terms of the process, adding additional reviewers has minimal impact.

8. Needed Improvements

The current size of the review team introduces a fairly heavy workload for the individual reviewers who are not on the "one doc per month" assignment cycle.  Additional reviewers would be really helpful to alleviate this workload.  It is also important to note that having additional reviewers adds minimal workload to the secretary's process; thus, the only blocking point is finding the right folks who are interested in this type of volunteer role.  As noted in Section 7.2, 30 would be a good size for the review team.  This would cut the workload for an individual reviewer in half (given the current model of 9 reviewers on the "one doc per month" assignment cycle).

Obviously, automation of the process would be a good thing.  However, Gen-ART secretaries are not necessarily highly motivated to transition to a more automated approach until a significant part of the process is automated.  In more recent consideration of this situation, it likely would be best to first automate the process of entering the reviews, as that benefits the review team as a whole.  This automation should allow the reviewers to enter the reviews via a web interface that would automatically generate the appropriate emails - quite similar to how the draft "Upload" tool currently works.  Also, given consistent naming conventions for the review forms, this step would automate some of the process for the secretary, as the reviews would automatically appear via the spreadsheet hyperlinks, although there would still be a need to manually enter the summary.  But this would eliminate the need to edit/normalize and upload files and, hopefully, eliminate the problems encountered with unflowed text in emails and with getting the review properly formatted in some text editors.

Section 5 was written to facilitate the process of determining tools requirements, by providing the very detailed steps currently applied to the process.  As noted above, automating the upload of the reviews could be a good first step.  This is somewhat starting at the end of the process.  However, it seems that automating in this direction may give optimal results, since one of the earliest steps in the process, the task of assigning reviewers, likely needs the most manual intervention, even with tools available.

The current SecDir secretary does use some tools for assignments and generating assignment emails.  These tools could be considered for use by the Gen-ART secretary.  Since the SecDir reviews are not cached and the information maintained for those reviews is less detailed, there would be no reusability of that aspect.  However, if the Gen-ART spreadsheet can be automatically populated (with assignments and completed reviews), the SecDir may be able to make use of that same tool.
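As a purely illustrative sketch of the kind of automation discussed above, the hypothetical Python fragment below names an uploaded review using the convention from Section 5.4 and drafts the notification email.  The addresses, template wording, and function names are placeholders and assumptions; they are not part of any existing IETF tool.

   # Hypothetical sketch only of the automation discussed above: name an
   # uploaded review per the Section 5.4 convention and draft the
   # notification email.  Addresses and template wording are placeholders.
   from email.message import EmailMessage

   def review_filename(draft: str, reviewer_last_name: str) -> str:
       # e.g. ("draft-ietf-xyz-00", "Smith") -> "draft-ietf-xyz-00-smith.txt"
       return f"{draft}-{reviewer_last_name.lower()}.txt"

   def notification_email(draft: str, review_text: str) -> EmailMessage:
       msg = EmailMessage()
       msg["Subject"] = f"Gen-ART review of {draft}"
       msg["To"] = "gen-art@example.org"      # placeholder for the review team list
       msg["Cc"] = "authors@example.org"      # placeholder for authors, AD, shepherd
       # A context statement and FAQ pointer would be prepended automatically,
       # as the standard review template does today (wording assumed).
       msg.set_content(
           "This review is part of the Gen-ART review process; please see "
           "the Gen-ART FAQ for background.\n\n" + review_text)
       return msg

   print(review_filename("draft-ietf-xyz-00", "Smith"))
   print(notification_email("draft-ietf-xyz-00", "Summary: Ready with nits."))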
9. Applicability

As implemented today, the process has no formal role in the IETF standards process.  But as trust in the review team has built, and as the team itself has learned to deliver reviews that are generally well received, the reviews have had a significant impact on I-D quality and on timeliness.  Rather than becoming a roadblock, they have (in general) allowed the General AD to feel more confident in reaching decisions and more precise in resolving issues.  Since reviews now typically appear during IETF Last Call, the reviews, like the SecDir reviews, are now generally expected.  So, the role of the team has evolved to be more formal than in the past (i.e., when this I-D was first published in 2005).  However, the handling of the reviews remains entirely within the scope of the ADs, document shepherds, WG chairs and authors, as they deem appropriate.

10. Security considerations

Since this is an informational I-D about an open process, the security considerations are specific to the process and the users involved in the process.  The primary concern would be to limit the people who have the ability to create and update the Gen-ART data/files, to ensure that the integrity of the data is maintained.  For example, each Gen-ART reviewer should have a unique user name/password, just as folks do to access any other IETF-maintained data, as appropriate.

11. IANA considerations

As this is an informational I-D about an IETF process, there are no IANA considerations.

12. Acknowledgments

Initial comments were received from the members of the Gen-ART team, and the experiences discussed in this I-D were derived from their hard work over the last 7+ years.  We thank the past reviewers of the Gen-ART team:

   Mark Allman
   Harald Alvestrand (creator of Gen-ART)
   Ron Bonica
   Scott Brim
   Gonzalo Camarillo
   Sharon Chisholm
   Spencer Dawkins
   Lakshminath Dondeti
   Avri Doria (past secretary)
   Pasi Eronen
   Dorothy Gellert
   Eric Gray
   Avashalom Houri
   Glenn Kowack
   John Loughney
   Lucy Lynch
   Enrico Marocco
   Michael Patton
   Stefan Santesson
   Robert Sparks
   Tom Taylor
   Sean Turner
   Christian Vogt
   Suzanne Woolf

as well as the current team of reviewers/secretary:

   Mary Barnes (past secretary, 2005-2010)
   Richard Barnes
   David Black
   Ben Campbell
   Brian Carpenter (past GEN AD)
   Elwyn Davies
   Francis Dupont
   Roni Even
   Miguel-Angel Garcia
   Vijay Gurbani (assisting secretary to upload reviews)
   Wassim Haddad
   Joel Halpern
   Suresh Krishnan
   Peter McCann
   Jean Mahoney (secretary as of Jan. 2011)
   Alexey Melnikov
   Kathleen Moriarty

13. Changes since Last Version

Changes between 03 and 04:

1. Updates based on pre-publication review, including:
   * Editorial changes.
   * Changing the terms "documents" and "drafts" to "I-Ds".
   * Removing all references to "PROTO".
2. Updates to reflect changes in the spreadsheet format and general process.
3. Removing the name of the spreadsheet tool, as any spreadsheet tool should work.

Changes between 02 and 03:

1. Updated to reflect current practices, such as another volunteer uploading the reviews, the spreadsheet only being updated on Thursdays, and the movement of the reviews to another server.
2. Updated step-by-step details on the secretary's process to facilitate tools requirements.
3. Updated lists of reviewers.
4. Added the documents which provide the baseline for reviews to the informative references section.

Changes between 01 and 02:

1. Updated to reflect current practices, such as assignment at Last Call time being the norm, the standard template for reviews/emails, new members of the team, and comments from the current secretary, the General Area AD and review team members.
2. Removed previous comments in the "Impressions" section that are no longer germane, such as problems with no standard boilerplate, problems with LC reviews, etc.
3. Added step-by-step details on the secretary's process to facilitate tools requirements.
4. Added the documents which provide the baseline for reviews to the informative references section.

14. Informative References

[RFC1958]  Carpenter, B., "Architectural Principles of the Internet", RFC 1958, June 1996.

[RFC3426]  Floyd, S., "General Architectural and Policy Considerations", RFC 3426, November 2002.

[RFC3439]  Bush, R. and D. Meyer, "Some Internet Architectural Guidelines and Philosophy", RFC 3439, December 2002.

[I-D.rfc-editor-rfc2223bis]  Reynolds, J. and R. Braden, "Instructions to Request for Comments (RFC) Authors", draft-rfc-editor-rfc2223bis-08 (work in progress), July 2004.

[RFC5226]  Narten, T. and H. Alvestrand, "Guidelines for Writing an IANA Considerations Section in RFCs", BCP 26, RFC 5226, May 2008.

[RFC3552]  Rescorla, E. and B. Korver, "Guidelines for Writing RFC Text on Security Considerations", BCP 72, RFC 3552, July 2003.

Authors' Addresses

   Mary Barnes
   Polycom
   TX
   US

   Email: mary.ietf.barnes@gmail.com

   Avri Doria
   Lulea University of Technology
   Arbetsvetenskap
   Lulea  SE-97187

   Email: avri@acm.org

   Harald Alvestrand
   Google
   Beddingen 10
   Trondheim  7014
   NO

   Email: harald@alvestrand.no

   Brian E Carpenter
   University of Auckland
   PB 92019
   Auckland, 1142
   New Zealand

   Email: brian.e.carpenter@gmail.com