Network Working Group                                     Scott Bradner
Internet Draft                                       Harvard University
draft-bradner-access-control-01.txt

Source directed access control on the Internet.

Status of this Memo

This document is an Internet-Draft. Internet-Drafts are working documents of the Internet Engineering Task Force (IETF), its areas, and its working groups. Note that other groups may also distribute working documents as Internet-Drafts.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as ``work in progress.''

To learn the current status of any Internet-Draft, please check the 1id-abstracts.txt listing contained in the Internet-Drafts Shadow Directories on ftp.is.co.za (Africa), nic.nordu.net (Europe), munnari.oz.au (Pacific Rim), ds.internic.net (US East Coast), or ftp.isi.edu (US West Coast).

1. Abstract

This Internet Draft was developed from a deposition that I submitted as part of a challenge to the Communications Decency Act of 1996, part of the Telecommunications Reform Act of 1996. The Telecommunications Reform Act is a U.S. federal law substantially changing the regulatory structure in the United States in the telecommunications arena. The Communications Decency Act (CDA) part of this law has as its aim the desire to protect minors from some of the material carried over telecommunications networks. In particular, the law requires that the sender of potentially offensive material take "effective action" to ensure that it is not presented to minors.
A number of people have requested that I publish the deposition as an informational RFC, since some of the information in it may be useful where descriptions of the way the Internet and its applications work could help clear up confusion about the technical feasibility of proposed content control regulations.

2. Control and oversight over the Internet

No organization or entity operates or controls the Internet. The Internet consists of tens of thousands of local networks linking millions of computers, owned by governments, public institutions, non-profit organizations, and private companies around the world. These local networks are linked together by thousands of Internet service providers which interconnect at dozens of points throughout the world. None of these entities, however, controls the Internet; each entity only controls its own computers and computer networks, and the links allowed into those computers and computer networks.

Although no organization controls the Internet, a limited number of organizations are responsible for the development of communications and operational standards and protocols used on the Internet. These standards and protocols are what allow the millions of different (and sometimes incompatible) computers worldwide to communicate with each other. These standards and protocols are not imposed on any computer or computer network, but any computer or computer network must follow at least some of the standards and protocols to be able to communicate with other computers over the Internet.

The most significant of the organizations involved in defining these standards include the Internet Society (ISOC), the Internet Architecture Board (IAB), the Internet Engineering Steering Group (IESG), and the Internet Engineering Task Force (IETF).
The following summary outlines the relationship of these four organizations:

The Internet Society (ISOC) is a professional society that is concerned with the growth and evolution of the worldwide Internet, with the way in which the Internet is and can be used, and with the social, political, and technical issues which arise as a result. The ISOC Trustees are responsible for approving appointments to the IAB from among the nominees submitted by the IETF nominating committee and for ratifying the IETF Standards Process.

The Internet Architecture Board (IAB) is a technical advisory group of the ISOC. It is chartered to provide oversight of the architecture of the Internet and its protocols, and to serve, in the context of the Internet standards process, as a body to which the decisions of the IESG may be appealed. The IAB is responsible for approving appointments to the IESG from among the nominees submitted by the IETF nominating committee and for advising the IESG on the approval of Working Group charters.

The Internet Engineering Steering Group (IESG) is responsible for technical management of IETF activities and the Internet standards process. As a part of the ISOC, it administers the process according to the rules and procedures which have been ratified by the ISOC Trustees. The IESG is directly responsible for the actions associated with entry into and movement along the Internet "standards track," including final approval of specifications as Internet Standards.

The Internet Engineering Task Force (IETF) is a self-organized group of people who make technical and other contributions to the engineering and evolution of the Internet and its technologies. It is the principal body engaged in the development of new Internet standard specifications. The IETF is divided into eight functional areas.
They are: Applications, Internet, IP: Next Generation, Network Management, Operational Requirements, Routing, Security, Transport and User Services. Each area has one or two area directors. These area directors, along with the IETF/IESG Chair, form the IESG.

In addition to these organizations, there are a variety of other formal and informal groups that develop standards and agreements about specialized or emerging areas of the Internet. For example, the World Wide Web Consortium has developed agreements and standards for the Web.

None of these organizations controls, governs, runs, or pays for the Internet. None of these organizations controls the substantive content available on the Internet. None of these organizations has the power or authority to require content providers to alter, screen, or restrict access to content on the Internet other than content that they themselves create.

Beyond the standards setting process, the only Internet functions that are centralized are the allocation of numeric addresses to networks and the registration of "domain names." Three entities around the world share responsibility for ensuring that each network and computer on the Internet has a unique 32-bit numeric "IP" address (such as 123.32.22.132), and for ensuring that all "domain names" (such as "harvard.edu") are unique. InterNIC allocates IP addresses for the Americas, and has counterparts in Europe and Asia. InterNIC allocates large blocks of IP addresses to major Internet providers, who in turn allocate smaller blocks to smaller Internet providers (who in turn allocate even smaller blocks to other providers or end users). InterNIC does not, however, reliably receive information on who receives each numeric IP address, and thus cannot provide any central database of computer addresses. In addition, a growing
In addition, a growing 134 number of computers access the Internet indirectly through address 135 translating devices such as application "firewalls". With these 136 devices the IP address used by a computer on the "inside" of the 137 firewall is translated to another IP address for transmission over 138 the Internet. The IP address used over the Internet can be 139 dynamically assigned from a pool of available IP addresses at the 140 time that a communication is initiated. In this case the IP 141 addresses used inside the firewall is not required to be globally 142 unique and the IP addresses used over the Internet do not uniquely 143 identify a specific computer. Neither the InterNIC nor its 144 counterparts in Europe and Asia control the substantive content 145 available on the Internet, nor do they have the power or authority to 146 require content providers to alter, screen, or restrict access to 147 content on the Internet. 149 3. Characteristics of Internet communications 150 There are a wide variety of methods of communications over the 151 Internet, including electronic mail, mail exploders such as listserv, 152 USENET newsgroups, Internet Relay Chat, gopher, FTP, and the World 153 Wide Web. With each of these forms of communication, the speaker has 154 little or no way to control or verify who receives the communication. 156 As detailed below, for each of these methods of communications, it is 157 either impossible or very difficult for the speaker to restrict 158 access to his or her communications "by requiring use of a verified 159 credit card, debit account, adult access code, or adult personal 160 identification number." Similarly, for each of these methods of 161 communication, there are no feasible actions that I know of that the 162 speaker can take that would be reasonably effective to "restrict or 163 prevent access by minors" to the speaker's communications. 
With each of these methods of communication, it is either technologically impossible or practically infeasible for the speaker to ensure that the speech is not "available" to a minor. For most of these methods--mail exploders such as listserv, USENET newsgroups, Internet Relay Chat, gopher, FTP, and the World Wide Web--there are technological obstacles to a speaker knowing about or preventing access by minors to a communication. Yet even for the basic point-to-point communication of electronic mail, there are practical and informational obstacles to a speaker ensuring that minors do not have access to a communication that might be considered "indecent" or "patently offensive" in some communities.

3.1 Point-to-Point Communications

3.1.1 Electronic Mail.

Of all of the primary methods of communication on the Internet, electronic mail carries the highest likelihood that the sender will personally know the intended recipient (and know the intended recipient's true e-mail address), and thus the sender (i.e., the speaker or content provider) may be able to transmit potentially "indecent" or "patently offensive" content with relatively little concern that the speech might be "available" to minors.

There is significantly greater risk for the e-mail speaker who does not know the intended recipient. As a hypothetical example, if an AIDS information organization receives from an unknown individual a request for information via electronic mail, the organization has no practical or effective way to verify the identity or age of the e-mail requester.

An electronic mail address provides no authoritative information about the addressee. Addresses are often chosen by the addressees themselves, and may or may not be based on the addressees' real names. For millions of people with e-mail addresses, no additional information is available over the Internet.
Where information is available (via, for example, inquiry tools such as "finger"), it is usually provided by the addressee, and thus may not be accurate (especially in the case of a minor seeking to obtain information the government has restricted to adults).

There exists no universal or even extensive "white pages" listing of e-mail addresses and corresponding names or telephone numbers. Given the rapidly expanding and global nature of the Internet, any attempt at such a listing likely will be incomplete (and likely will not contain information about the age of the e-mail addressee). Nor is there any systematic, practical, and efficient method to obtain the identity of an e-mail address holder from the organization or institution operating the addressee's computer system.

Moreover, it is relatively simple for someone to create an e-mail "alias" to send and receive mail under a different name. Thus, a given e-mail address may not even be the true e-mail address of the recipient. On some systems, for example, an individual seeking to protect his or her anonymity could easily create a temporary e-mail address for the sole purpose of requesting information from an AIDS information resource. In addition, there exist "anonymous remailers" which replace the original e-mail address on messages with a randomly chosen new one. The remailer keeps a record of the relationship between the original and the replacement address so that return mail will get forwarded to the right person. These remailers are used frequently for discussion or support groups on sensitive or controversial topics such as AIDS.

Thus, there is no reasonably effective method by which one can obtain information from existing online information sources about an e-mail address sufficient to ensure that a given address is used by an adult and not a minor.
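The alias-keeping behavior of an anonymous remailer, as described above, can be sketched as a simple two-way table. This is a minimal illustrative model, not the implementation of any actual remailer; the addresses and the "remailer.example" domain are invented.

```python
import itertools

class Remailer:
    """Toy model of an anonymous remailer: the original sender address
    is replaced by a generated alias, and the mapping is recorded so
    that replies addressed to the alias can be forwarded back."""

    def __init__(self, domain="remailer.example"):
        self.domain = domain
        self._counter = itertools.count(1)
        self._alias_to_real = {}
        self._real_to_alias = {}

    def anonymize(self, real_addr):
        # The same sender always maps to the same alias, so an ongoing
        # correspondence stays consistent.
        if real_addr not in self._real_to_alias:
            alias = f"anon{next(self._counter)}@{self.domain}"
            self._real_to_alias[real_addr] = alias
            self._alias_to_real[alias] = real_addr
        return self._real_to_alias[real_addr]

    def route_reply(self, alias):
        # Return mail sent to the alias is forwarded to the real mailbox.
        return self._alias_to_real[alias]

r = Remailer()
alias = r.anonymize("student@school.example")
# The recipient sees only the alias, yet a reply still reaches the sender.
print(alias, "->", r.route_reply(alias))
```

The point of the sketch is that nothing visible to the recipient reveals the correspondent's identity, let alone their age; only the remailer holds the mapping.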
Absent the ability to comply with the Communications Decency Act based on information from existing online information sources, an e-mail speaker's only recourse is to interrogate the intended e-mail recipient in an attempt to verify that the intended recipient is an adult. Such verification inherently and unavoidably imposes the burden of an entirely separate exchange of communications prior to sending the e-mail itself, and is likely to be unreliable if the recipient intends to deceive the speaker.

This separate preliminary communication is required because with electronic mail, there is a complete electronic and temporal "disconnect" between the sender and recipient. Electronic mail can be routed through numerous computers between the sender and the recipient, and the recipient may not "log in" to retrieve mail until days or even weeks after the sender sent the mail. Thus, at no point in time is there any direct or even indirect electronic linkage between sender and recipient that would allow the sender to interrogate the recipient prior to sending an e-mail. Thus, unavoidably, the Communications Decency Act requires that the sender incur the administrative (and in some cases financial) cost of an entirely separate exchange of communications between sender and recipient prior to the sender having sufficient information to ensure that the recipient is an adult. Even if the sender were to establish that an e-mail addressee is not a minor, the sender could not be sure that the addressee was not sharing the computer account, as is frequently done, with someone else who is a minor.
If an e-mail is part of a commercial transaction of sufficient value to justify the time and expense of obtaining payment via credit card from the e-mail addressee, an e-mail sender may be able to utilize the credit card or debit account options set out in the Communications Decency Act. At this time, however, one cannot verify a credit or debit transaction over the Internet, and thus an e-mail speaker would have to incur the expense of verifying the transaction via telephone or a separate computer connection to the correct banking entity. Because of current concerns about data security on the Internet, such an e-mail credit card transaction would likely also require that the intended e-mail recipient transmit the credit card information to the e-mail sender via telephone or the postal service.

Similarly, utilizing the "adult access code" or "adult personal identification number" options set out in the statute would at this time require the creation and maintenance of a database of adult codes. While such a database would not be an insurmountable technological problem, it would require a significant amount of human clerical time to create and maintain the information. As with the credit or debit transactions, an adult code database would also likely require that information be transmitted by telephone or postal mail.

Moreover, such an adult access code would likely be very ineffective at screening access by minors. For the adult access code concept to work at all, any such code would have to be transmitted over the Internet, and thus would be vulnerable to interception and disclosure. Any sort of "information based" code--that is, a code that consists of letters and numbers transmitted in a message--could be duplicated and circulated to other users on the Internet.
It is highly likely that valid adult access codes would themselves become widely distributed on the Internet, allowing industrious minors to obtain a valid code and thus obtain access to the material sought to be protected.

A somewhat more effective alternative to this type of "information based" access code would be to link such a code to the unique 32-bit numeric "IP" addresses of networks and computers on the Internet. Under this approach, "adult" information would only be transmitted to the particular computer with the "approved" IP address. For tens of millions of Internet users, however, IP addresses for a given access session are dynamically assigned at the time of the access, and those users will almost certainly utilize different IP addresses in succeeding sessions. For example, users of the major online services such as America Online (AOL) are only allocated a temporary IP address at the time they link to the service, and the AOL user will not retain that IP address in later sessions. Also, as discussed above, the use of "firewalls" can dynamically alter the apparent IP address of computers accessing the Internet. Thus, any sort of IP address-based screening system would exclude tens of millions of potential recipients, and thus would not be a viable screening option.

At bottom, short of incurring the time and expense of obtaining and charging the e-mail recipient's credit card, there are no reasonably effective methods by which an e-mail sender can verify the identity or age of an intended e-mail recipient, even in a one-to-one communication, to a degree of confidence sufficient to ensure compliance with the Communications Decency Act (and avoid the Act's criminal sanction).
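The failure mode of IP-address-based screening described above can be illustrated with a toy allowlist. This is a deliberately simplified sketch; the addresses are invented, and the "approval" rule stands in for whatever verification an "adult access" scheme might use.

```python
# An "approved adult" list keyed on IP address assumes the address
# identifies a person.  Dial-up services and address-translating
# firewalls instead hand out a different address from a shared pool
# for each session, so the assumption fails in both directions.
# All addresses below are invented for illustration.

approved = {"203.0.113.40"}      # address verified during an earlier session

def may_receive(session_ip):
    """Screening rule: release 'adult' material only to approved addresses."""
    return session_ip in approved

# Session 1: the verified adult connects and happens to hold the
# approved address, so screening passes.
assert may_receive("203.0.113.40")

# Session 2: the same adult reconnects, draws a different address from
# the provider's pool, and is wrongly refused.
assert not may_receive("203.0.113.41")

# Session 3: an unrelated subscriber (possibly a minor) draws the old
# address from the pool and is wrongly admitted.
assert may_receive("203.0.113.40")

print("IP-based screening both excludes the adult and admits a stranger")
```

Both error cases follow directly from the dynamic assignment described in the text: the address identifies a session, not a person.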
3.2 Point-to-Multipoint Communications

The difficulties described above for point-to-point communications are magnified many times over for point-to-multipoint communications. In addition, for almost all major types of point-to-multipoint communication on the Internet, there is a technological obstacle that makes it impossible or virtually impossible for the speaker to control who receives his or her speech. For these types of communications over the Internet, reasonably effective compliance with the Communications Decency Act is impossible.

3.2.1 Mail Exploders

Essentially an extension of electronic mail that allows someone to communicate with many people by sending a single e-mail, "mail exploders" are an important means by which Internet users can exchange ideas and information on particular topics with others interested in those topics. "Mail exploder" is a generic term covering programs such as "listserv" and "Majordomo." These programs typically receive electronic mail messages from individual users and automatically retransmit each message to all other users who have asked to receive postings on the particular list. In addition to listserv and Majordomo, many e-mail retrieval programs contain the option to receive messages and automatically forward them to other recipients on a local mailing list.

Mail exploder programs are relatively simple to establish. The leading programs such as listserv and Majordomo are available for free, and once set up can generally run unattended. There is no practical way to measure how many mailing lists have been established worldwide, but there are certainly tens of thousands of such mailing lists on a wide range of topics.

With the leading mail exploder programs, users typically can add or remove their names from the mailing list automatically, with no direct human involvement.
To subscribe to a mailing list, a user transmits an e-mail to the automated list program. For example, to subscribe to the "Cyber-Rights" mailing list (relating to censorship and other legal issues on the Internet) one sends e-mail addressed to "listserv@cpsr.org" and includes as the first line of the body of the message the words "subscribe cyber-rights name" (inserting a person's name in the appropriate place). In this example, the listserv program operating on the cpsr.org computer would automatically add the new subscriber's e-mail address to the mailing list. The name inserted is under the control of the person subscribing, and thus may not be the actual name of the subscriber.

A speaker can post to a mailing list by transmitting an e-mail message to a particular address for the mailing list. For example, to post a message to the "Cyber-Rights" mailing list, one sends the message in an e-mail addressed to "cyber-rights@cpsr.org". Some mailing lists are "moderated," and messages are forwarded to a human moderator who, in turn, forwards to the whole list those messages the moderator approves of. Many mailing lists, however, are unmoderated, and postings directed to the appropriate mail exploder program are automatically distributed to all users on the mailing list. Because of the time required to review proposed postings and the large number of people posting messages, most mailing lists are not moderated.

An individual speaker posting to a mail exploder mailing list cannot control who has subscribed to the particular list. In many cases, the poster cannot even find out the e-mail addresses of those who have subscribed to the list. A speaker posting a message to a list thus has no way to screen or control who receives the message. Even if the mailing list is "moderated," an individual posting to the list still cannot control who receives the posting.
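The subscribe-and-retransmit cycle described above can be sketched as a small program. This is a toy model of a listserv-style exploder, not the behavior of any real listserv or Majordomo installation; the addresses and list name are illustrative.

```python
class MailExploder:
    """Toy model of a mail exploder: subscription requests are handled
    automatically with no human involvement, and a message posted to
    the list address is retransmitted to every subscriber.  The poster
    never sees who is on the list.  All addresses are invented."""

    def __init__(self, list_name):
        self.list_name = list_name
        self.subscribers = set()
        self.outbox = []                      # (recipient, message) pairs "sent"

    def handle_admin_mail(self, sender, first_line):
        # e.g. "subscribe cyber-rights Jane Doe".  The supplied name is
        # never checked against the sender's real identity; only the
        # sending address is recorded.
        words = first_line.split()
        if words and words[0].lower() == "subscribe":
            self.subscribers.add(sender)
        elif words and words[0].lower() == "unsubscribe":
            self.subscribers.discard(sender)

    def post(self, sender, message):
        # Automatic retransmission: no review, and no way for the
        # poster to screen or even enumerate the recipients.
        for addr in self.subscribers:
            self.outbox.append((addr, message))

lst = MailExploder("cyber-rights")
lst.handle_admin_mail("alice@example.org", "subscribe cyber-rights A. Person")
lst.handle_admin_mail("bob@example.net", "subscribe cyber-rights Someone Else")
lst.post("alice@example.org", "First posting")
print(len(lst.outbox))   # one copy per subscriber
```

Note that `post` runs identically whether a subscriber joined a year ago or a moment before the message arrived, which is the crux of the screening problem the text describes.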
Moreover, the difficulty in knowing (and the impossibility of controlling) who will receive a posting to a mailing list is compounded by the fact that mail exploder lists can themselves be entered as subscribers to a mailing list. Thus, one of the "subscribers" to a mailing list may in fact be another mail exploder program that re-explodes any messages transmitted using the first mailing list. A message sent to the first mailing list may thus end up being distributed to many entirely separate mailing lists as well.

Based on the current operations and standards of the Internet, it would be impossible for someone posting to a listserv to screen recipients to ensure that the recipients were over 17 years of age. Short of not speaking at all, I know of no actions available to a speaker today that would be reasonably effective at preventing minors from having access to messages posted to mail exploder programs. Requiring such screening for any messages that might be "indecent" or "patently offensive" to a minor would have the effect of banning such messages from this type of mailing list program.

Even if one could obtain a listing of the e-mail addresses that have subscribed to a mailing list, one would then be faced with the same obstacles, described above, that face a point-to-point e-mail sender. Instead of obtaining a credit card or adult access code from a single intended recipient, however, a poster to a mailing list may have to obtain such codes from a thousand potential recipients, including new mailing list subscribers who may have subscribed only moments before the poster wants to post a message. As noted above, complying with the Communications Decency Act for a single e-mail would be very difficult. Complying with the Act for a single mailing list posting with any reasonable level of effectiveness is impossible.
3.2.2 USENET Newsgroups.

One of the most popular forms of communication on the Internet is the USENET newsgroup. USENET newsgroups are similar in objective to mail exploder mailing lists--to be able to communicate easily with others who share an interest in a particular topic--but messages are conveyed across the Internet in a very different manner.

USENET newsgroups are distributed message databases that allow discussions and exchanges on particular topics. USENET newsgroups are disseminated using ad hoc, peer-to-peer connections between 200,000 or more computers (called USENET "servers") around the world. There are newsgroups on more than twenty thousand different subjects. Collectively, almost 100,000 new messages (or "articles") are posted to newsgroups each day. Some newsgroups are "moderated," but most are open access.

For unmoderated newsgroups, when an individual user with access to a USENET server posts a message to a newsgroup, the message is automatically forwarded to adjacent USENET servers that furnish access to the newsgroup, and it is then propagated to the servers adjacent to those servers, etc. The messages are temporarily stored on each receiving server, where they are available for review and response by individual users. The messages are automatically and periodically purged from each system after a configurable amount of time to make room for new messages. Responses to messages--like the original messages--are automatically distributed to all other computers receiving the newsgroup. The dissemination of messages to USENET servers around the world is an automated process that does not require direct human intervention or review.

An individual who posts a message to a newsgroup has no ability to monitor or control who reads the posted message.
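The server-to-server flooding described above can be sketched as a breadth-first traversal of peer links. The topology below is invented for illustration; real USENET propagation (e.g. via NNTP) adds message IDs, duplicate suppression, and per-group configuration, none of which changes the basic point.

```python
from collections import deque

# Toy model of USENET flooding: a message posted at one server is
# forwarded to adjacent servers, which forward it to their neighbors,
# with no central control and no per-recipient screening.
neighbors = {
    "server-a": ["server-b", "server-c"],
    "server-b": ["server-a", "server-d"],
    "server-c": ["server-a", "server-d"],
    "server-d": ["server-b", "server-c", "server-e"],
    "server-e": ["server-d"],       # newly attached; peers with one neighbor
}

def propagate(origin):
    """Return the set of servers that end up carrying a message posted
    at `origin` (a breadth-first flood over the peer links)."""
    have_message = {origin}
    queue = deque([origin])
    while queue:
        server = queue.popleft()
        for peer in neighbors[server]:
            if peer not in have_message:    # each server forwards a message once
                have_message.add(peer)
                queue.append(peer)
    return have_message

# The poster at server-a cannot tell in advance which servers will carry
# the message; "server-e", which joined by peering with server-d alone,
# receives it without the poster ever learning of its existence.
print(sorted(propagate("server-a")))
```

The sketch makes the text's point concrete: reachability is determined entirely by the peering graph, which the poster neither sees nor controls, and adding one new edge silently extends the audience.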
When an individual posts a message, she transmits it to a particular newsgroup located on her local USENET server. The local server then automatically routes the message to other servers (or in some cases to a moderator), which in turn allow the users of those servers to read the message. The poster has no control over the handling of her message by the USENET servers worldwide that receive newsgroups. Each individual server is configured by its local manager to determine which newsgroups it will accept. There is no mechanism to permit distribution based on characteristics of the individual messages within a newsgroup.

The impossibility of the speaker controlling the message distribution is made even more clear by the fact that new computers and computer networks can join the USENET news distribution system at any time. To obtain newsgroups, the operator of a new computer or computer network need only reach agreement with a neighboring computer that already receives the newsgroups. Speakers around the world do not learn that the new computer has joined the distribution system. Thus, just as a speaker cannot know or control who receives a message, the speaker does not even know how many or which computers might receive a given newsgroup.

For moderated newsgroups, all messages to the newsgroup are forwarded to an individual who can screen them for relevance to the topics under discussion. The screening process, however, does not increase the ability of the original speaker to control who receives a given message. A newsgroup moderator has as little control as the original speaker over who receives a message posted to the newsgroup.

Based on the current operations and standards of the Internet, it would be impossible for someone posting to a USENET newsgroup to screen recipients to ensure that the recipients were over 17 years of age.
Short of not speaking at all, I know of no actions available to a speaker today that would be reasonably effective at preventing minors from having access to USENET newsgroup messages. Requiring such screening for any messages that might be "indecent" or "patently offensive" to a minor would have the effect of banning such messages from USENET newsgroups.

A speaker also has no means by which he or she could require listeners to provide a credit card, debit account, adult access code, or adult personal identification number. Each individual USENET server controls access to the newsgroups on that server, and a speaker has no ability to force a server operator to take any particular action. The message is out of the speaker's hands from the moment the message is posted.

Moreover, even if one hypothesized a system under which a newsgroup server would withhold access to a message until the speaker received a credit card, debit account, adult access code, or adult personal identification number from the listener, there would be no feasible way for the speaker to receive such a number. Because a listener may retrieve a message from a newsgroup days after the speaker posted the message, such a hypothetical system would require the speaker either to remain at his or her computer 24 hours a day for as many as ten days after posting the message, or to finance, develop, and maintain an automated system to receive and validate access numbers. All of this effort would be required for the speaker to post even a single potentially "patently offensive" message to a single newsgroup.

Moreover, even if such a hypothetical system did exist and a speaker were willing to remain available 24 hours a day (or operate a costly automated system) in order to receive access numbers, not all computers that receive USENET newsgroups could reasonably transmit such access numbers.
Some computers that receive newsgroups do so only by a once-a-day telephone connection to another newsgroup server. Some of these computers do not have any other type of Internet connection, and indeed some computers that receive USENET newsgroups do not even utilize the TCP/IP communications protocol that is required for direct or real time communications on the Internet. These computers would have no means by which a prospective listener's access code could be communicated back to a speaker.

It is my opinion that if this hypothetical access system ever were created, it would be so burdensome as to effectively ban from USENET newsgroups messages that might be "indecent" or "patently offensive." Moreover, the communications standards and protocols that would allow such a hypothetical access system have not as of today been developed, and no Internet standards-setting body of which I am aware is currently developing such standards and protocols. Specifically, such a hypothetical access system is not part of the "next generation" Internet Protocol that I helped to develop.

3.2.3 Internet Relay Chat.
Another method of communication on the Internet is called "Internet Relay Chat" (or IRC). IRC allows for real time communication between two or more Internet users. IRC is analogous to a telephone party line, using a computer and keyboard rather than a telephone. With IRC, however, at any one time there are thousands of different party lines available, in which collectively tens of thousands of users are engaging in discussions, debates, and conversations on a huge range of subjects. Moreover, an individual can create a new party line to discuss a different topic at any time. While many discussions on IRC are little more than social conversations between the participants, there are often conversations on important issues and topics.
Although I have not personally operated an IRC server in my career, I am familiar enough with the operations of IRC servers to be able to identify the obstacles that a speaker would encounter attempting to identify other participants and to verify that those participants were not minors.

There exists a network of dozens of IRC servers across the world. To speak through IRC, a speaker connects to one of these servers and selects the topic the speaker wishes to "join." Within a particular topic (once a speaker joins a topic), all speakers on that topic can see and read everything that everyone else transmits. As a practical matter, there is no way for each person who joins a discussion to interrogate all other participants (sometimes dozens of participants) as to their identity and age. Because people join or drop out of discussions on a rolling basis, the discussion line would be overwhelmed with messages attempting to verify the identity of the participants.

Also as a practical matter, there is no way that an individual speaker or an individual IRC server operator could enforce an "adults only" rule for a selection of the discussion topics. Dozens of IRC servers are interconnected globally so that people across the world can talk to each other. Thus, a speaker connected to an IRC server in the United States can speak directly to a listener in Asia or Europe. There is no practical way that a speaker in the United States can be reasonably certain that a given IRC discussion is in fact "adults only."

Nor can a speaker, prior to or at the time of joining an IRC discussion, ascertain with any confidence the identity of the other participants in the discussion. Individual participants in an IRC conversation are able to participate anonymously by using a pseudonym.
A new speaker joining the conversation can see a list of pseudonyms of other participants, but has no possible way of determining the real identity (or even the real e-mail address) of the individuals behind each pseudonym.

Based on the current operations and standards of the Internet, it would be impossible for someone participating in an IRC discussion to screen recipients with a level of certainty needed to ensure the recipients were over 17 years of age. Short of not speaking at all, I know of no actions available to a speaker today that would be reasonably effective at preventing minors from having access to speech in an IRC discussion. Requiring such screening of recipients by the speakers for any IRC discussions that might be "indecent" or "patently offensive" to a minor would have the effect of banning such discussions.

4.0 Information Retrieval Systems
With FTP (or File Transfer Protocol), gopher, and the World Wide Web, the Internet is a vast resource for information made available to users around the world. All three methods (FTP, gopher, and the Web) are specifically geared toward allowing thousands or millions of users worldwide to access content on the Internet, and none are specifically designed to limit access based on criteria such as the age of the Internet user. Currently much of this information is offered for free access.

4.1 Anonymous FTP
"Anonymous FTP" is a basic method by which a content provider can make content available to users on the Internet. FTP is a protocol that allows the efficient and error free transfer of files from one computer to another. To make content available via FTP, a content provider establishes an "Anonymous FTP server" capable of receiving FTP requests from remote users.
This approach is called "anonymous" because when a remote user connects to an FTP server, the remote user enters the word "anonymous" in response to the server's request for a user name. By convention, the remote user is requested to enter his or her e-mail address when prompted for a "password." The user is then given access to a restricted portion of the server disk and to the files in that area. Even though the user may have entered their e-mail address in response to the password prompt, no effective validation or screening is possible using the FTP server software that is currently available. Using currently available FTP software, a content provider has no way to screen access by "anonymous" users that may be minors. Even if a content provider could determine the age of a particular remote user, the currently available FTP software cannot be set to limit the user's access to non-"adult" file areas.

FTP server software can allow non-"anonymous" users to access the FTP server, and in that mode can require the users to have individual passwords that are verified against a pre-existing list of passwords. There are two major problems, however, that prevent this type of non-"anonymous" FTP access from being used to allow broad access to information over the Internet (as anonymous FTP can allow). First, with current server software each non-"anonymous" FTP user must be given an account on the server computer, creating a significant administrative burden and resource drain. If more than a limited number of users want access to the FTP system, the requirement of separate accounts would quickly overwhelm the capacity of the server to manage the accounts--the FTP server software was not designed to manage thousands or millions of different user/password combinations.
Second, under existing FTP server software, each of these named users would have complete access to the server file system, not a restricted area like the anonymous FTP function supports. This would create a significant security problem. For these two reasons, as a practical matter FTP cannot be used to give broad access to content except via the anonymous FTP option (which, as noted above, does not allow for screening or blocking of minors).

As discussed below with regard to the World Wide Web, even if someone redesigned the currently available FTP server software to allow the screening of minors, the administrative burden of such screening would in many cases overwhelm the resources of the content provider.

Based on the current operations and standards of the Internet, it is not possible or practically feasible for someone operating an anonymous FTP file server to screen recipients with a level of certainty needed to ensure the recipients were over 17 years of age. Short of not operating an anonymous FTP server at all, I know of no actions available to a content provider today that would be reasonably effective at preventing minors from having access to "adult" files on the FTP server. Requiring such screening by anonymous FTP server operators to prevent minors from accessing FTP files that might be "indecent" or "patently offensive" to a minor would have the effect of banning such anonymous FTP access.

4.2 Gopher.
The gopher program is similar to FTP in that it allows for basic transfer of files from one computer to another, but it is also a precursor to the World Wide Web in that it allows a user to seamlessly jump from one gopher file server to another in order to locate the desired information.
The development of gopher and the linking of gopher servers around the world dramatically improved the ability of Internet users to locate information across the Internet.

Although in many ways an improvement over FTP, gopher is simpler than FTP in that users need not enter any username or password to gain access to files stored on the gopher server. Under currently available gopher server software, a content provider has no built-in ability to screen users. Thus a content provider could not prevent minors from retrieving "adult" files.

As discussed below with regard to the World Wide Web, even if the gopher server software allowed the screening of minors, the administrative burden of such screening would in many cases overwhelm the resources of the content provider.

Based on the current operations and standards of the Internet, it is not possible for someone operating a gopher file server to screen recipients with a level of certainty needed to ensure the recipients were over 17 years of age. Short of not operating a gopher server at all, I know of no actions available to a content provider today that would be reasonably effective at preventing minors from having access to "adult" files on a gopher server. Requiring such screening of users by gopher server operators to prevent minors from accessing files that might be "indecent" or "patently offensive" to a minor would have the effect of banning gopher servers wherever there is any such material.

4.3 World Wide Web (WWW).
Fast becoming the most well known method of communicating on the Internet, the "World Wide Web" offers users the easy ability to locate and view a vast array of content on the Internet. The Web uses a "hypertext" formatting language called hypertext markup language (HTML), and Web "browsers" can display HTML documents containing text, images, and sound.
Any HTML document can include links to other types of information or resources anywhere in the world, so that while viewing an HTML document that, for example, describes resources available on the Internet, an individual can "click" using a computer mouse on the description of the resource and be immediately connected to the resource itself. Such "hyperlinks" allow information to be accessed and organized in very flexible ways, and allow individuals to locate and efficiently view related information even if the information is stored on numerous computers all around the world.

Unlike with USENET newsgroups, mail exploders, FTP, and gopher, an operator of a World Wide Web server does have some ability to interrogate a user of a Web site on the server, and thus has some ability to screen out users. An HTML document can include a fill-in-the-blank "form" to request information from a visitor to a Web site, and this information can be transmitted back to the Web server. The information received can then be processed by a computer program (usually a "Common Gateway Interface," or "CGI," script), and based on the results of that computer program the Web server could grant or deny access to a particular Web page. Thus, it is possible for some (but not all, as discussed below) World Wide Web sites to be designed to "screen" visitors to ensure that they are adults.

The primary barrier to such screening is the administrative burden of creating and maintaining the screening system. For an individual Web site to create a software system capable of screening thousands of visitors a day, determining (to the extent possible) whether a visitor is an adult or a minor, and maintaining a database to allow subsequent access to the Web site would require a significant on-going effort.
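The CGI mechanism described above can be sketched as follows. This is a minimal illustrative sketch, not a production screening system; the form field name ("claimed_age") is invented for the example. The sketch also makes the fundamental limit visible: the script can only test what the visitor typed into the form, not who the visitor actually is.

```python
# Minimal sketch of the CGI screening mechanism described above.
# The form field name ("claimed_age") is hypothetical; a CGI script is
# invoked by the Web server with the submitted form data available to it
# (for a GET request, in the QUERY_STRING environment variable).

from urllib.parse import parse_qs

def screen_visitor(query_string):
    """Grant or deny access based on the submitted form data.

    Note the fundamental limit: this tests only what the visitor
    claimed, since nothing in the protocol verifies the claim.
    """
    fields = parse_qs(query_string)
    try:
        claimed_age = int(fields.get("claimed_age", ["0"])[0])
    except ValueError:
        claimed_age = 0          # treat garbage input as unverified
    if claimed_age >= 18:
        return "200 OK: access granted"
    return "403 Forbidden: access denied"

print(screen_visitor("claimed_age=21"))   # granted -- but unverifiable
print(screen_visitor("claimed_age=12"))   # denied
```

Even this trivial check implies the administrative burden discussed above: to be more than self-declaration, it would need a database of verified visitors and some out-of-band means of verifying each one.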
Moreover, as discussed above with regard to electronic mail, the task of actually establishing a Web visitor's identity or "verifying" a credit card would require a significant investment of administrative and clerical time. As there is no effective method to establish identity over the Internet, nor is there currently a method to verify credit card numbers over the Internet (and given the current cost of credit card verifications done by other means), this type of identification process is only practical for a commercial entity that is charging for access to the Web information.

Beyond the major administrative burden that would be required for a Web site host to comply with the Communications Decency Act, there are two additional problems presented by the Act. First, many Web publishers cannot utilize computer programs such as CGI scripts to process input from a Web visitor. For example, I have been informed that the major online services such as America Online and Compuserve do not allow their customers to run CGI scripts or other processes that could be a significant drain on the online services' computers as well as a potential security risk. Thus, for this category of Web publisher, the Communications Decency Act works as a ban on any arguably "indecent" or "patently offensive" speech. It is impossible for this category of Web publisher to control access to their Web sites.

Moreover, even for Web publishers who can use CGI scripts to screen access, the existence of Web page caching on the Internet can make such screening ineffective. "Caching" refers to a method to speed up access to Internet resources. Caching is often used at one or both ends of, for example, a transatlantic or transpacific cable that carries Internet communications.
An example of caching might occur when an Internet user in Europe requests access to a World Wide Web page located in the United States. The request travels by transatlantic cable to the United States, and the Web page is transmitted back across the ocean to Europe (and ultimately to the user who requested access). But the operator of the transatlantic cable will place the Web page in a storage "cache" located on the European side of the cable. Then, if a second Internet user in Europe requests the same Web page, the operator of the transatlantic cable will intercept the request and provide the page from its "cache" (thereby reducing traffic on the transatlantic cable). This type of caching typically occurs without the awareness of the requesting user. Moreover, in this scenario, the original content provider is not even aware that the second user requested the Web page--and the original content provider has no opportunity to screen the access by the second user. Nevertheless, the original content provider risks prosecution if the content is "adult" content and the second requester is a minor. The use of caching web servers is rapidly increasing within the United States (mostly to help moderate the all too rapid growth in Internet traffic), and thus can affect entirely domestic communications. For example, a growing number of universities use caching web servers to reduce the usage of the link to their Internet service provider. In light of this type of caching, efforts to screen access to Web pages can at best be only partially effective.

In light of the existence of Web page caching on the Internet, it would be extremely difficult if not impossible for someone operating a World Wide Web server to ensure that no minors received "adult" content.
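The caching scenario above can be modeled in a few lines. This is an illustrative sketch, not code for any real cache; the URL and page text are hypothetical. The point it makes concrete is that the origin server, which is the only place the content provider could screen anyone, sees just the first of the two requests.

```python
# Illustrative model of the caching scenario described above
# (not code for any real proxy). The URL and page body are hypothetical.

class OriginServer:
    """The content provider's Web server in the United States."""
    def __init__(self):
        self.requests_seen = 0    # the only visitors the provider can screen

    def fetch(self, url):
        self.requests_seen += 1
        return "page body"

class CachingProxy:
    """The cache on the European side of the transatlantic cable."""
    def __init__(self, origin):
        self.origin = origin
        self.cache = {}           # url -> stored copy of the page

    def fetch(self, url):
        if url not in self.cache:                     # first request crosses
            self.cache[url] = self.origin.fetch(url)  # the ocean to the origin
        return self.cache[url]                        # later requests never do

origin = OriginServer()
proxy = CachingProxy(origin)

first = proxy.fetch("http://example.com/page")    # reaches the origin server
second = proxy.fetch("http://example.com/page")   # served entirely from cache

# Both users received the page, but the provider saw (and could have
# screened) only one request.
print(origin.requests_seen)   # prints 1
```

This is also why hit counts undercount readers, as noted below: each cached copy can serve any number of users who never register at the origin.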
Moreover, for those Web page publishers who lack access to CGI scripts, there is no possible way for them to screen recipients to ensure that all recipients are over 17 years of age. For these content providers, short of not supporting World Wide Web access to their materials, I know of no actions available to them that would be reasonably effective at preventing minors from having access to "adult" files on a World Wide Web server. Requiring such screening by these Web publishers to prevent minors from accessing files that might be "indecent" or "patently offensive" to a minor would have the effect of banning their speech on the World Wide Web.

The Web page caching described above contributes to the difficulty of determining with specificity the number of visitors to a particular Web site. Some Web servers can count how many different Web clients, some of which could be caching Web servers, requested access to a Web site. Some Web servers can also count how many "hits"--or separate file accesses--were made on a particular Web site (a single access to a Web page that contains images or graphic icons would likely be registered as more than one "hit"). With caching, the actual number of users that retrieved information that originated on a particular Web server is likely to be greater than the number of "hits" recorded for the server.

5.0 Client-end Blocking
As detailed above, for many important methods of communication on the Internet, the senders--the content providers--have no ability to ensure that their messages are only available to adults. It is also not possible for an Internet service provider or large institutional provider of access to the Internet (such as a university) to screen out all or even most content that could be deemed "indecent" or "patently offensive" (to the extent those terms can be understood at all).
A large institution could at least theoretically screen a portion of the communications over the Internet, scanning for example for "indecent" words, but not pictures. Such a screening program capable of screening a high volume of Internet traffic at the point of its entry into the institution would require an investment of computing resources of as much as one million dollars per major Internet information conduit. In addition, it would be quite difficult to configure such a system to control content only for those users that are under-age recipients, since in many cases the information would be going to a server within the university where many users, under-age and not, would have access to it.

Based on my experience and knowledge of the Internet, I believe that the most effective way to monitor, screen, or control the full range of information transmitted over the Internet to block undesired content is at the client end--that is, by using software installed in the individual user's computer. Such software could block certain forms of incoming transmissions by using content descriptive tags in the messages, or could use content ratings developed by third parties to select what can and cannot be retrieved for display on a user's computer.

6.0 Tagging Material
I am informed that the government in this action may advocate the use of special tags or flags in electronic mail messages, USENET newsgroup postings, and World Wide Web HTML documents to indicate "adult" material. To my knowledge, no Internet access software or World Wide Web browsers are currently configurable to block material with such tags. Thus, the headers and flags the government may advocate are currently an ineffective means to ensure the blocking of access by minors to "adult" material.
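A hypothetical client-end filter of the kind described in section 5.0 might look like the sketch below. No such tag standard existed at the time of writing; the header name ("X-Content-Rating") and the rating values are invented for illustration. The decision is made entirely on the user's own computer, which is exactly why a speaker cannot know whether any given client enforces it.

```python
# Hypothetical sketch of client-end blocking based on content tags.
# No such tag standard exists; the header name "X-Content-Rating" and
# the rating values are invented for this illustration.

BLOCKED_RATINGS = {"adult"}   # configured locally, e.g. by a parent

def client_allows(message_headers):
    """Decide, on the user's own computer, whether to display a message.

    The check runs at the client end only: the original speaker has no
    way to confirm that any particular client performs it, or performs
    it with these settings.
    """
    rating = message_headers.get("X-Content-Rating", "unrated")
    return rating not in BLOCKED_RATINGS

print(client_allows({"X-Content-Rating": "adult"}))     # prints False
print(client_allows({"Subject": "knitting patterns"}))  # prints True
```

Note that an untagged message passes the filter by default; a client could equally be configured to block anything unrated, but either policy is chosen by the recipient, not the speaker.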
Even in a future where there are defined standards for such tags and there are readily available browsers that are configurable to make use of those tags, a content provider--e.g., a listserv or newsgroup poster or a Web page author--will have little power to ensure that the client software used to receive the postings was in all cases properly configured to recognize these tags and to block access to the posting when required. Thus I feel that the tagging that may be proposed by the government would in fact not be "effective" in ensuring that the poster's speech would not be "available to a person under 18 years of age," as the Communications Decency Act requires. Although I strongly support both voluntary self-rating and third-party rating (as described in the preceding paragraph), I do not feel that the use of tags of this type would satisfy the speaker's obligation to take effective actions to ensure that "patently offensive" material would not be "available" to minors. Furthermore, since it is impossible to embed such flags or headers in many of the documents made available by anonymous FTP, gopher, and the World Wide Web without rendering the files useless (executable programs, for example), any government proposal to require the use of tags to indicate "adult" material would not allow the continued use of those methods of communication for speech that might be deemed "indecent" or "patently offensive."

With the exception of electronic mail and e-mail exploders, all of the methods of Internet communications discussed above require an affirmative action by the listener before the communication takes place. A listener must take specific action to receive communications from USENET newsgroups, Internet Relay Chat, gopher, FTP, and the World Wide Web.
In general this is also true for e-mail exploders, except in the case where a third party subscribes the user to the exploder list. These communications over the Internet do not "invade" a person's home or appear on a person's computer screen unbidden. Instead, a person must almost always take specific affirmative steps to receive information over the Internet.

7.0 Acknowledgment
I owe a great deal of thanks to John Morris of Jenner and Block, one of the law firms involved in the CDA challenge. Without his extensive help this document would not exist, or if it did, it would be even more scattered.

8.0 Security Considerations
To be able to do the type of content access control that the CDA envisions would require a secure Internet infrastructure along with secure ways to determine the minor status of potential recipients around the world. Developing such a system is outside of the scope of this document.

9.0 Author's Address

Scott Bradner
Harvard University
1350 Mass Ave.
Cambridge MA 02138 USA

+1 617 495 3864

sob@harvard.edu