Author Archives: Kathleen Moriarty

Strengthening Internet Security and Accommodating Data Center Requirements: Finding a Path Forward

With the intention of encouraging the development of a solution to an issue currently under discussion within an IETF working group, I wanted to offer a personal view of a possible way forward. This view is informed by a number of conversations with people involved in the discussion who hold different perspectives. However, the following neither represents nor is intended to suggest an IETF position. The work to develop that remains to be done.

One of the strengths of the IETF process is that it brings together a diverse set of technical specialists—network operators, academics, developers, and protocol designers. The IETF seeks broad participation in its standards development processes because it leads to more robust standards—ones that work better and are more broadly applicable to the many different use cases found on the global Internet. However, it is not uncommon for implementers to learn about changes in protocols close to their publication date. This is an opportunity to encourage those using IETF standards to stay informed, at a minimum by reading the appropriate mailing lists.

In my opinion, unless these voices are heard and solutions are found, the objective of end-to-end encryption to protect users’ privacy will be limited as a result of deployment challenges. I fully agree that end-to-end secure communication is necessary to protect the security of Internet sessions, end users’ privacy, and anonymity for human rights considerations. Unless we look at the obstacles and find ways to remove them, we won’t reach the goal of end-to-end security.

For the past several years, the IETF has been working to strengthen the Internet against pervasive monitoring. In line with that effort, the TLS Working Group has been developing Transport Layer Security (TLS) 1.3. TLS 1.3 is designed to improve security and protect information sent over the Internet from being intercepted and decrypted by unauthorized entities. While RFC 7525 had already recommended against the use of static RSA keys for TLS 1.2, static RSA key exchange was formally removed in TLS 1.3 in favor of key exchanges that provide forward secrecy, based on ephemeral (finite-field or elliptic curve) Diffie-Hellman. These are important steps toward protecting end user privacy and the security of TLS-protected sites, keeping pace with the evolving threat landscape.
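For readers deploying services today, the practical effect is that requiring TLS 1.3 automatically rules out the older static key exchanges. A minimal sketch using Python’s standard `ssl` module (Python 3.7+; the context here is an illustrative client-side example, not a recommendation for any specific deployment):

```python
import ssl

# Build a client context that refuses anything older than TLS 1.3, so every
# handshake uses an ephemeral (EC)DHE key exchange and gains forward secrecy.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

# Static-RSA cipher suites simply do not exist in TLS 1.3, so no further
# cipher configuration is needed to exclude them.
print(ctx.minimum_version.name)  # TLSv1_3
```

The same `minimum_version` setting works on server-side contexts created with `ssl.PROTOCOL_TLS_SERVER`.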

In the case of TLS 1.3, which is nearing completion, enterprise data center operators have recently proposed a deployment method that would enable intercepting and decrypting traffic within a data center through the use of a static Diffie-Hellman key. This proposal would continue a method of intercepting and monitoring traffic similar to what is in place for all previous versions of TLS and SSL with static RSA keys. The intention is to restrict the application of this method to use within a data center only, and not for connections to the Internet.
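To see why a static (non-ephemeral) Diffie-Hellman key enables this kind of monitoring, consider a deliberately toy sketch. This is textbook-sized arithmetic, not the TLS 1.3 key schedule, but it shows the core point: any party holding the server’s long-lived private exponent can passively recompute each session’s shared secret.

```python
import hashlib
import secrets

# Deliberately tiny, toy Diffie-Hellman parameters; real deployments use
# large standardized groups or elliptic curves. Illustration only.
p = 0xFFFFFFFB   # a small prime (2**32 - 5); NOT a safe choice in practice
g = 5

# The server reuses one static exponent across every session.
server_static_priv = secrets.randbelow(p - 2) + 1
server_static_pub = pow(g, server_static_priv, p)      # shared with clients

# A client picks a fresh ephemeral exponent for its session.
client_priv = secrets.randbelow(p - 2) + 1
client_pub = pow(g, client_priv, p)                    # visible on the wire

# Both endpoints arrive at the same session secret.
client_secret = pow(server_static_pub, client_priv, p)
server_secret = pow(client_pub, server_static_priv, p)
assert client_secret == server_secret

# A monitoring device that has been handed the static private key can
# recompute the secret from the client's public share alone.
monitor_secret = pow(client_pub, server_static_priv, p)
assert monitor_secret == client_secret

# A traffic key would then be derived from the shared secret.
session_key = hashlib.sha256(monitor_secret.to_bytes(8, "big")).digest()
```

With an ephemeral exchange, the server’s exponent is discarded after each handshake, so there is no long-lived key to distribute to a monitor; that forward-secrecy property is exactly what the proposal trades away within the data center.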

Some data center operators need the ability to look at network traffic for transaction monitoring because of regulations. At the same time, they want to adhere to best practices by encrypting network traffic, thereby protecting against malicious interception. While the proposed scheme for TLS would address this need for monitoring and is compatible with existing techniques used in some data centers, it should be noted that this is not the only possible technical approach that could enable both encrypted traffic in data centers and regulatory-required monitoring.

Within this use case, client TLS sessions over the Internet are not intended to use this proposal, but rather to maintain forward secrecy (likely not using the 0-RTT option), with the sessions terminating at the edge of the data center. The proposal is intended for use within the data center, where both ends of the encrypted sessions are managed by the enterprise.

Others within the TLS working group have voiced concerns about the proposed approach, as there is no technical way to limit its use to the data center. In other words, the method could be used for wiretapping purposes by a third party if it were deployed for a connection over the Internet. The TLS working group agreed to continue to discuss the issue, but how to solve the problem is uncertain at the moment.

The proposal is not likely to be further considered. With this in mind, a discussion on alternate approaches to meet the use case requirements could be quite useful. It may be possible to adapt existing work in the IETF, not necessarily TLS, to meet the requirement, should the IETF choose to work on this problem.  

If TLS 1.3 sessions were terminated at the network edge, another solution could be used within the data center. One possibility is the use of IPsec. While not all data centers have the same needs, some are working toward the use of protocols such as IPsec or tcpcrypt, operating at the IP and TCP layers rather than the application layer.

Could IPsec be adapted to meet the requirements? IPsec transport mode isn’t widely deployed and has some interoperability issues, but this may present an opportunity to work toward a solution outside of TLS. There are also multiple group keying solutions already defined for IPsec, including the Group Domain of Interpretation (GDOI) [RFC6407] and Multimedia Internet KEYing (MIKEY) [RFC3830]. I haven’t yet seen a proposal for a protocol designed to be fit for purpose, but that could be of interest too. My impression from the set of requirements is that we do have time to work collaboratively toward a solution.
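The group keying idea can be illustrated at a very high level. The sketch below is a conceptual toy, not GDOI’s actual key-management exchange, and the HMAC-based XOR keystream is a stand-in for a real cipher; the class and member names are invented for illustration. The point is simply that a group controller hands the same traffic key to every authorized member, so a monitoring device registered as a member can decrypt without any per-session key escrow.

```python
import hashlib
import hmac
import secrets

class GroupController:
    """Toy stand-in for a group controller / key server (a GCKS in GDOI)."""
    def __init__(self):
        self._group_key = secrets.token_bytes(32)

    def register(self, member_id: str) -> bytes:
        # In GDOI the key download is protected by an authenticated
        # exchange; here we simply hand the key to the member.
        return self._group_key

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Expand the group key into a keystream with HMAC-SHA256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def xor_transform(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR "cipher": encryption and decryption are the same operation.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))

gcks = GroupController()
sender_key = gcks.register("app-server-1")
monitor_key = gcks.register("ids-appliance")   # the monitor is just a member

nonce = secrets.token_bytes(12)
ciphertext = xor_transform(sender_key, nonce, b"transaction record")
assert xor_transform(monitor_key, nonce, ciphertext) == b"transaction record"
```

The design trade-off is the same one discussed above for static keys: everyone holding the group key can read all group traffic, which is acceptable inside a closed, enterprise-managed environment but not on the open Internet.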

It may be of interest to know that the I2NSF working group is reviewing a proposal from data center operators, in cooperation with the IPSecME working group, to automate deployment of IPsec tunnels within a data center; a very productive interim call took place on 6 September specific to this proposal.

A third possibility is a multi-party session transport encryption protocol designed specifically for the data center monitoring use case, though to my knowledge none has been proposed to date.

A longer-term solution is to determine what is missing from application logging and endpoint resources so as to maintain end-to-end encryption and eliminate monitoring via interception. There are scaling issues with pushing these functions out to the endpoint, so perhaps there are other methods that could enable monitoring functions without exposing potentially sensitive information, whether privacy-related or not. This will take a lot of work, and I encourage application protocol developers to think toward solutions.

While the exact way forward is yet to be determined, I believe that by working collaboratively we can strengthen the Internet and accommodate a broader set of use cases to make IETF standards more relevant to the implementers and operators who put them into practice.

I personally think that we should be doing something to solve the data center use case and would like to see a solution separate from one that uses TLS. In doing so, the likelihood of TLS 1.3 being deployed as intended to protect users’ privacy increases, as does the security of the terminating site, and data center operators would gain a solution fit for purpose within their closed environments. It should be noted that there is nothing that prevents the implementation strategy described in the proposal from being deployed; an RFC isn’t necessary for that to happen.

After reading the lengthy email discussions, the TLS 1.3 draft, and the proposal, and listening to the presentations at the WG session, it appears that there is motivation and expertise to address use cases not anticipated by TLS 1.3. Some are working toward this with the goal of adoption within the TLS working group of an extension-based solution in which the client is aware of interception. But adoption is not guaranteed.

If this issue affects you and you believe you can contribute, I encourage you to read through the WG mailing list archive and propose ways forward: adapting a solution within TLS, developing alternate encryption and decryption solutions for use within the data center, or improving applications to eliminate the desire to intercept traffic. The use case presented is important to data center operators, and working with those deploying IETF protocols increases the chances of those protocols being deployed as intended. The IETF doesn’t need to take action, but I’d like to encourage those with ideas to advance the thinking in this problem space.

IETF Diversity Update

Kathleen M. Moriarty, Security Area Director

Charts from Jari Arkko, IETF Chair

It’s been a while since we’ve had a diversity-related update, and with the approval of the Anti-Harassment BCP [1] and the publication of the Independent Stream Editor (ISE) document, RFC 7704 [2], it seems timely. The anti-harassment BCP helps to establish a code of conduct, building on the IESG statement on the same topic from 2013. RFC 7704 is an opinion piece from the editors citing issues from the 2011-2012 time period that, while they still surface on occasion, have seen improvement due to many efforts from IETF participants.

The IETF has diversity issues, and like any other organization we will have to work on them continuously. In 2012, when these behavior and diversity issues were glaringly apparent to the IETF, the chair, Jari Arkko, worked with others to establish a “Diversity Design Team,” of which I was appointed one of the co-chairs along with Suresh Krishnan. The final readout from the Diversity Design Team was given at the IETF 87 Plenary in July 2013 [3], switching the focus from diversity to inclusion.

The Internet Society was already working on important programs to increase regional diversity, such as the Fellows and Policy programs, which bring new people from various regions of the world to meetings to introduce them to the IETF. Other efforts, such as the Mentoring Program, were established by individual participants in 2013, with Alissa Cooper and Brian Haberman leading the effort with other individuals on the team [6][5]. The mentoring program assists newcomers at their first IETF meeting, pairing them with volunteer mentors who guide them through their first IETF and help them negotiate the new environment as well as its procedures. The IETF culture and complex procedures are tough to negotiate, and this effort is key to improving retention of newcomers. The IETF does quite a bit of work from the bottom up, so if a need for something is seen, individuals are welcome and encouraged to drive it, as the mentoring program showed.

The 2013 nominating committee (NomCom), led by Allison Mankin, took it upon themselves first to raise awareness of implicit bias among committee members through self-assessments, and then to have those nominated for IETF leadership positions provide ‘blind’ questionnaire responses so as not to identify themselves in the initial selection process. These techniques were supported by research on improving diversity, by the committee’s own research, and by the awareness raised by the Diversity Design Team.
The 2013 NomCom appointed 3 women to the IESG and 1 to the IAB. The 2014 NomCom added one more woman to each of those leadership groups, matching (at least for the IESG) a prior high point of 4 of 15 women on the IESG.

On a personal note, I could be considered part of the 2012 Systers ‘experiment’ mentioned in RFC 7704. I was not appointed in that cycle as a Security Area Director (although I am currently serving in that role), so I do count in the numbers cited. I bring this up because a few women in the 2012 nominating cycle, myself included, fully expected that the cycle would be a dry run for the following year. While that may sound odd, if all things are equal between two candidates, the NomCom will select the incumbent for a second term. This is a result of the complexities of the position and the amount of time it takes to come fully up to speed on the responsibilities of the role. In the 2012 cycle, Stephen Farrell was an incumbent running for his second term. Stephen and his co-AD at the time, Sean Turner, encouraged me to accept their nomination: they wanted the NomCom to have reasonable candidates to select from, and they also wanted me to consider a possible appointment the following year, when there would be no incumbent. Several very qualified IETF participants accepted nominations the following year, 2013. Since I was the Global Lead Security Architect at EMC, accepting would take discussions with management to determine whether there was support for me taking on this time-consuming appointment and stepping away from a very important role at EMC. This is not uncommon for many candidates, as they tend to be in high-level positions in their organizations. For me, this was quite helpful for the long-term process. While there may have been a bias in the 2012 NomCom as cited in RFC 7704, at least a couple of the qualified women had no expectation of being appointed that year. Another woman accepted a nomination for an Area Director role just in case the incumbent was selected for a more senior role and the NomCom needed a reasonable nominee to appoint.
She was also appointed by a subsequent NomCom, as she too was well qualified for the role of Area Director and held senior-level positions in her supporting organization.

The IETF leadership has been very supportive of fostering an inclusive environment, which studies show is one of the most critical factors in improving culture and working toward a more diverse environment. In addition to supporting a number of programs, the IESG issued an anti-harassment statement in November 2013 [4], which was followed by a consensus draft, now on its way to final publication as a Best Current Practice RFC [1]. Research has also demonstrated that such programs are the most effective way to make progress in an area that will require continued focus from the IETF. We are not perfect, but we have come a long way in a few years. The IETF will need continued support from the Chairs, IESG, and IAB to foster continued improvements. As such, this consideration is now part of the NomCom evaluation process.

Unfortunately, the problems cited in RFC 7704 are not unique to the IETF, so there was plenty of research for the Diversity Design Team to draw from. The final readout from the Diversity Design Team was given at a recorded IETF 87 plenary [8], with slides available [3], where the focus was changed to one of promoting an inclusive environment. While the male/female ratios may have sparked the IETF conversations, we also needed to consider regional diversity and ways to develop a pipeline of potential IETF leadership candidates along all measures of diversity. This goal is supported by research showing that diverse groups produce improved outcomes and higher collective intelligence than homogeneous groups. Working Group chairs and Area Directors support these goals by providing training opportunities to individuals and by considering diversity when selections are made. If you take the time to watch the recording, you’ll notice an interaction at the end that might be more typical of the behavior cited in RFC 7704. While on the surface this may look bad, what followed the plenary demonstrated progress: apologies and an additional acknowledgement of the problems we face as a community were the result. What this boils down to is that the IETF is not perfect, but we have moved to an environment where bad behavior is called out more readily, and we now have options such as working with an ombudsman as a neutral arbitrator, thanks to the work in the anti-harassment BCP [1].

For my own working groups, I do feel they are welcoming environments for all to participate in, both on the lists and in the room. When an occasional inappropriate comment occurs (very rarely), it is called out immediately; this is a new norm where the behavior is corrected on the spot. It’s progress. If I or other ADs are missing behaviors we should be catching, please let us know. This is a community effort.


CARIS Workshop Summary and Reflection

Kathleen Moriarty, Security AD & CARIS Program Committee Chair

The day-long Coordinating Attack Response at Internet Scale (CARIS) workshop, hosted by the Internet Architecture Board (IAB) and the Internet Society (ISOC), took place last Friday in coordination with the Forum of Incident Response and Security Teams (FIRST) Conference in Berlin. The workshop included members of the FIRST community, attack response working group representatives (APWG, ACDC, etc.), network and security operators, RIR representatives, researchers, vendors, and representatives from standards communities. Key goals of the workshop were to improve mutual awareness, understanding, and coordination among the diverse participating organizations. The workshop also aimed to provide greater awareness of existing efforts to mitigate specific types of attacks, and greater understanding of the options others have to collaborate and engage with those efforts.

The day-long workshop included a mix of invited and selected speakers, with opportunities to collaborate throughout, taking full advantage of the tremendous value of having these diverse communities with common goals in one room. Approximately 50 participants engaged in the CARIS workshop, drawn from the 25 papers received and an additional 20 template submissions. The template submissions will be maintained on the Internet Society web site and, as a result of the workshop, will be amended to provide additional value to computer security incident response teams (CSIRTs) and attack response communities/operators on their information exchange activities. The CARIS participants found the template submissions very useful in coordinating their future attack mitigation efforts. Nothing like this had previously been done; the collection is open to the global community and hosted in a neutral location. All submissions are linked from the agenda.

The workshop talks and panels involved full participation from attendees, who were required to read all other submissions. The panels were organized to spur conversation between specific groups to see if we could make further progress toward more efficient and effective attack mitigation. See the paper and blog series for additional information on possible approaches to accomplishing more effective attack response and information exchange with methods that require fewer analysts.

Panel groups:

  • Coordination between CSIRTS and attack response mitigation efforts
  • Distributed Denial of Service and Botnet researchers, vendors, and operators
  • Infrastructure: DNS and RIR providers and researchers
  • Trust and Privacy with the exchange of potentially sensitive information
  • IAB wrap up for architecture next steps

There were a few items that stood out to me from the workshop (more to be included in the formal report):

  1. The participants are interested in expanded information on the resources and assistance offered by the RIRs and DNS providers.  Participants are going to define what is needed, with follow-through on next steps.
  2. Another recurring theme was the community’s lack of knowledge of basic security principles, such as the ingress and egress filtering explained in BCP 38.  The CSIRTs, operators, and vendors of attack mitigation tools found this particularly frustrating.  As a result, follow-up activities may include determining whether security guidance BCPs require updates, or whether there are opportunities to educate on these basic principles already documented by the IETF.
  3. After the workshop, the Internet Society hosted a three-and-a-half-hour boat tour through the canals of Berlin, offering additional time for collaboration among participants.  One of the lively discussions concerned the need for better transports for information exchange.  As the author of Real-time Inter-network Defense (RID), I agree.  RID was written more than 10 years ago, and while the patterns it established still show promise, updated solutions are being worked on.  One such solution is in the IETF DOTS working group, which has an approach similar to RID with updated formats and protocols to meet the demands of today’s DDoS attacks.  While TAXII (another transport option) is just transitioning to OASIS, its base is similar to RID in its use of SOAP-like messaging, which will likely prevent it from scaling to the demands of the Internet.  Vendors also cited several interoperability challenges with TAXII.  Alternatively, XMPP-Grid has been proposed in the IETF SACM working group, and it offers promise as a data exchange protocol.  XMPP inherently meets the requirements of today’s information exchanges with features such as publish/subscribe, federation, and use of a control channel.  XMPP-Grid is taking off too, with at least 10 vendors currently using the open source code in their products and several more planning to add support.  Review and discussion of this draft would be helpful.  REST was also brought up as a needed interface.  The IETF’s MILE working group has a draft detailing a common RESTful interface (ROLIE) that could be used with any data format and may be of interest.  It would be good to hear from the community whether this draft would help fill that gap; it could be resurrected if so.
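As a refresher on the BCP 38 principle mentioned in item 2, the idea can be sketched in a few lines. The interface names below are invented for illustration, and the prefixes are documentation addresses, not a real router configuration: an edge router forwards a packet only if its source address falls within a prefix assigned to the interface it arrived on, which blocks traffic with spoofed sources at the network edge.

```python
import ipaddress

# Toy model of BCP 38 ingress filtering at a provider edge. Interface names
# and prefixes are illustrative only (RFC 5737 documentation ranges).
ASSIGNED_PREFIXES = {
    "customer-a": [ipaddress.ip_network("203.0.113.0/24")],
    "customer-b": [ipaddress.ip_network("198.51.100.0/25")],
}

def ingress_permit(interface: str, src_ip: str) -> bool:
    """Forward only packets whose source belongs to the arriving interface."""
    src = ipaddress.ip_address(src_ip)
    return any(src in net for net in ASSIGNED_PREFIXES.get(interface, []))

print(ingress_permit("customer-a", "203.0.113.7"))    # legitimate: True
print(ingress_permit("customer-a", "198.51.100.9"))   # spoofed source: False
```

Egress filtering applies the same check in the opposite direction, preventing a network’s own hosts from emitting packets with sources outside its assigned space.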

This blog offers just a taste of the workshop; a full report will be forthcoming, as will follow-up from the IAB on this important meeting.  As the workshop chair, I was very excited that the CARIS workshop had over 20% female participation!  In a field where the percentage is usually between 12% and 18%, this was impressive.

I would like to offer a sincere thank you to each of the program committee members as well as our sponsors:

  • FIRST provided a room and excellent facilities in partnership with their annual conference in Berlin.
  • The Internet Society hosted the social event, a boat ride through the canals of Berlin.
  • EMC Corporation provided lunch, snacks and coffee throughout the day to keep us going!

Program Committee:
Matthew Ford, Internet Society, UK
Ted Hardie, Google
Joe Hildebrand, Cisco, USA
Eliot Lear, Cisco, Switzerland
Kathleen M. Moriarty, EMC Corporation, USA
Andrew Sullivan, Dyn
Brian Trammell, ETH Zurich, Switzerland

End-to-End Message Encryption — Can it be done?

End-to-end (e2e) encryption for email is hard.  We know this from the OpenPGP and S/MIME efforts, where the main problem has been obtaining, installing, and exchanging keys.  While there are a number of positive efforts underway to fix e2e encryption for email, it may take a while for a viable, easy-to-use solution to be deployed and actively used.  What if, instead, we thought more about what we want to communicate that needs to be encrypted, and whether a shift is needed to be open to new solutions?

Today, with social media, we are trending toward texting and instant messaging as a default form of communication.  We typically use these technologies for quick messages, and some, like XMPP, have e2e encryption capabilities already… but it could be better.

Wouldn’t it be nice to have less mail to manage?  As we all know, quick messages and discussions are better suited to XMPP, SMS, or other IM-related protocols.  Using these technologies gets you away from managing an unwieldy inbox, and if it’s encrypted at the data level (e2e), you don’t have to worry about exposing that data.

XMPP already has Off-the-Record (OTR) encryption support, so aren’t we done?  No, though that would be nice.  OTR is lacking some features and its cryptography could be improved upon, but it certainly fits the bill for ease of use.

Developers and some XMPP working group participants have plans for new features and need to gauge support from the community.  Improved e2e functionality has been proposed in IETF drafts in the past, but it has not received enough attention, and it really could help us improve our e2e capabilities for messaging.  Some of the proposed features include:

  • The ability to send encrypted messages when the recipient is offline, where the recipient can read the messages when they come back online
  • e2e encrypted group chat
  • Improved security and reliability of e2e encryption solution for messaging
  • ‘device mobility’ or the ability to send and receive messages on any of your devices
  • added accountability, to know who has the ability to read messages and who
    has the ability to confer that ability to others

If XMPP were favored in cases where you prefer to have your messages encrypted, it might lessen the need for e2e encrypted email, shifting how we accomplish our communication goals.  For those unfamiliar with it, the XMPP protocol provides instant messaging, or chat.  The use of instant messaging embedded in other tools makes this a more accessible method of communication for everyday users, who may not be technical but want their privacy protected with encryption.

So you say OTR is good enough.  Is it really?  While I do use OTR quite often and am glad to have it, it doesn’t always work.  An OTR session can break and send your text in the clear without forewarning.  This could be a problem.

Not only that: file transfers (both metadata and data) are also completely unprotected by OTR, without warning in most user interfaces, as are many other parts of the protocol.

A couple of highlights of the XMPP extensions affected:

  • Presence — your presence is visible to everyone with a subscription to it — and to all of the servers between you and them.
  • File transfers — and the associated metadata — are completely
    unprotected by OTR.
  • vCard — contact information exchanged in XMPP — is also exposed with current OTR implementations
  • etc.

all of which have the potential to leak personal information without e2e encryption.

What’s needed?

Interest in this work: active reviews and contributions on requirements and proposals to drive work on end-to-end encryption for XMPP forward.  This could be a game changer.


Kathleen Moriarty, Security Area Director

Coordinating Incident Response at Internet Scale (CARIS)

Kathleen Moriarty, Security Area Director

Coordinating incident response at Internet scale as a concept sounds fabulous, but can we achieve it? What will it take?

Those of us working in incident response and information sharing know there is much to be done. While there is a lot of good work progressing in this area of information security, there are still very few resources skilled in forensics and threat mitigation. The CARIS workshop will bring together these diverse sets of experts to collaborate and better scale their efforts.

Last year, I wrote a blog series on the problems in this space, with some ideas on how we might be able to progress in a way that helps not only large organizations with resources to participate, but also smaller organizations with none.  The smaller organizations are part of the supply chain, hence the motivation to assist them. You can find more information in the blog series: Driving Towards More Effective Sharing Models.

One of the key takeaways is the need for coordination among those driving efforts in this space, including those running attack-specific mitigation efforts (APWG, ACDC, etc.), operators at service providers, regional CSIRTs, security professionals at large organizations, researchers, and vendors. Coordination requires getting these folks into the same room to see how we might collectively advance this space and have a greater impact with the few resources dedicated to these activities. The Internet Architecture Board (IAB) and Internet Society (ISOC) CARIS workshop is set to take place on June 19th, the last day of the FIRST conference in Berlin.

CARIS will be run as a workshop to allow for active participation by attendees, with a requirement to submit a research paper or to fill in a template on your organization’s sharing and mitigation efforts. All accepted research papers will be published on the IAB CARIS site, and the template information will be shared with participants via ISOC. The template will provide the information organizations need to participate in each other’s efforts, potentially reducing duplication of effort and improving the scaling of resources. This increased coordination of threat information may help with automation through the involvement of vendors.  Additionally, the increased coordination could assist with the ability to directly address threats where they can be mitigated or stopped by service providers, CSIRTs, or threat-specific working groups.

One goal of this coordination is to address threats more efficiently for everyone, rather than limiting activity to sharing among organizations with adequate resources. This requires coordination among those with resources. The database of sharing efforts has the potential to increase collaborative efforts by involving communities, such as the service providers and vendors, who might be able to address such threats more quickly. Bringing this diverse crowd into a full-day workshop could be a catalyst for future collaboration between organizations. We look forward to your submission and collaboration! The call for papers is open until April 3, 2015.