# Usable formal methods side meeting (IETF 115)

Thursday 10th November, 11.30am, Richmond 6

#fdt #improving-protocol-standards

## Goals and motivation

- RFCs written in English -> informal, ambiguous -> errors/inconsistencies as a result
- Limited adoption of formal languages/techniques/structured approaches
- Does it make sense to form an IRTF research group to explore the systematic use of formal methods? If so, what are the factors limiting their adoption?
- Carsten Bormann: is conformance testing explicitly in scope?
- Jonathan Hoyland: by the academic definition, this wouldn’t include testing. But there is some confusion about what testing means in this sense, so it is reasonable to discuss.

## “A Bird’s Eye View of Formal Methods? And Why We Should Care” (Jonathan Hoyland)

- Slide presentation
- What are formal methods, and why should IETFers care?
- Splits into two: formal analysis (does the design do what I intended?) and formal verification (does this implementation follow from the design?)
- Formal analysis
  - write an algebraic description of the protocol and of the desired properties
  - check compatibility between the two
  - undecidable in the general case
  - both description steps are hard
  - the compatibility check and proof are also hard
- Formal verification
  - implement the protocol in a given language
  - write an algebraic description of the protocol
  - model check
  - produce code using a (verified?) compiler
  - still hard: the problem is undecidable in the general case
  - correct implementation, etc., is also hard
- Why should we care, then, if it’s all so difficult?
  - formal methods have been applied to security-critical drafts for a while, and found bugs before they became RFCs
- A future RG should:
  - encourage and enable formal analysis _during_ the drafting process
  - act as a knowledge base on formal methods
  - provide feedback to developers to make tools easier to use
  - peer review analyses
- Discussion
  - Carsten B: formal description should be in scope. My objectives here are a bit less ambitious: I’d be happy just to get the people I’m working with to generate ABNF, or even to read the ABNF that I generate.
  - Jonathan H: definitely a good first step, so in scope.
  - Carsten B: need to understand why people aren’t doing it. These techniques give an unambiguous definition of the protocol, provide a way to develop a common understanding of it, and serve as an automated input for manual checking, which is very important. Run the formal description through some tooling, generate examples, show people the examples, and check understanding. The same input can go into automated testing tools, and people really like being able to feed a formal description into code generators. These are the main features/benefits.
  - Chris Ignacio: ABNF isn’t verifiable, right? You can specify things that are too complicated to go through a formal proof? It’s a grammar that can have recursive definitions, and has infinite expressions.
  - Jonathan H: recursion and infinite expressions are not necessarily a problem, but might be undecidable in general. What we can try to do is define protocols that _can_ be proven. In the past (TLS 1.3), there were a few points where we told the working group that certain elements weren’t necessarily problematic, but were unprovable. This was useful input: it leads to designs that are provably correct.
  - Toerless Eckert: my starting point is that in a protocol spec I want two independent descriptions of the protocol: English text, and some formal specification. As a minimum, the two can then be verified against each other; even if that’s a manual process, it can move towards automation.
  - Jonathan H: slight risk that the two descriptions don’t align, which can lead to twice as many problems.
  - Tony Li (remote): formal specification would be a major accomplishment, and a far lower bar than formal analysis and verification. Even just for human-to-human communication, it would avoid a great deal of ambiguity.
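Carsten's workflow (one formal description feeding example generation, understanding checks, and automated testing) can be sketched minimally. This is an illustrative sketch only, not from the meeting: the toy grammar and all names are invented, and a Python data structure stands in for a real ABNF definition.

```python
import random

# Hypothetical toy grammar, in the spirit of ABNF:
#   greeting = "HELLO " name
#   name     = 1*3ALPHA
# One description, represented as data, drives both example
# generation and automated conformance checking.
ALPHA = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def generate(rng: random.Random) -> str:
    """Generate an example message from the grammar."""
    name = "".join(rng.choice(ALPHA) for _ in range(rng.randint(1, 3)))
    return "HELLO " + name

def matches(msg: str) -> bool:
    """Check a message against the same grammar."""
    if not msg.startswith("HELLO "):
        return False
    name = msg[len("HELLO "):]
    return 1 <= len(name) <= 3 and all(ch in ALPHA for ch in name)

rng = random.Random(115)
examples = [generate(rng) for _ in range(5)]
assert all(matches(m) for m in examples)  # generated examples conform
assert not matches("HELO AB")             # malformed input is rejected
```

The generated examples are what one would show people to check shared understanding; the same `matches` check is what an automated test harness would apply to implementation output.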
  - Jonathan H: agree that it is a low bar, but we can already do much of this. We don’t necessarily need to set a low bar.
  - Hans-Dieter Hiep: we should not focus too much on one description language. The slides say algebraic, but do we need to fix on that? Can we be more general?
  - Jonathan H: absolutely, we don’t need to fix on it; I just didn’t want to overload the slides.
  - Yaron Sheffer: to Carsten’s point, formal descriptions might expand the attack surface, not contract it. There have been issues with ASN.1, JSON encoders, etc.
  - Jonathan H: of course, but that’s why we need formal descriptions.
  - Stephen Farrell: formal verification should definitely be in scope, but it should happen alongside easier approaches, like socialising these ideas within the IETF. Formal verification should be a later goal.
  - Carsten B: a comment. Two definitions in a document that don’t agree is great: it shows that the WG didn’t agree, or didn’t understand, and that there’s a problem to be fixed. With one description, this isn’t highlighted.

## “Formal Specification and Specification-based Testing of QUIC” (Ken McMillan and Lenore Zuck)

- Slide presentation
- The uses of formal specifications, in the context of specifying the QUIC transport protocol
- We talk a lot about specifications, but few have actually seen one; we assume they’ll be provided to us
- Questions about specs: what is their function/form/content/process?
- Functions: a thinking tool, a contract between designers, maybe part of a formal proof, a component of testing/formal verification
- QUIC intro
- Importantly: an English-language RFC tells you how to implement QUIC. This led to a bunch of implementations, which can be tested for performance and interoperability.
- Test generality: implementations are not sufficiently adversarial
  - might show conformance to an interpretation of the standard that is wrong
  - doesn’t show compliance with a common standard
  - TLS example
- This drives the specification’s function: primary function as a test artefact (generate adversarial tests, check protocol compliance); secondary function as a contract (capture protocol knowledge implicit in implementations, aid future implementers in compliance)
- Everything else then follows from this
  - the form of the spec: needs to be compositional, use assume/guarantee properties, and be deterministic; needs to contain safety properties of events visible on the wire
  - creating the specification: needs to capture properties from testing _actual_ implementations
- Methodology: formal specs of both client and server, then formally prove user-level guarantees. Use _testing_ to show this: client packets are generated automatically from the spec, and server packets are automatically validated against the spec.
- SIGCOMM 2019 paper: capturing the QUIC spec by testing. Used the I-D; specified only safety properties; refined the spec by testing four implementations of QUIC.
  - discovered many errors in the targeted implementations
  - found errors not found in interop testing, which shows the value of the approach
- Conclusions: a specification is a tool with a use; we need to understand what the use of a spec is
  - generate adversarial tests, check compliance of implementations, capture protocol knowledge from implementations
  - a spec doesn’t need to be complete or perfect to serve its function well; developing a specification is an iterative process
  - must have corrective forces pushing on both sides
  - can address significant pain points in protocol development
  - captures knowledge in a precise and actionable form
  - efficient approach to unit test development
- Discussion
  - Corine de Kater: isn’t this just programming? Implementation, and testing?
  - Ken McMillan: I think this is a step before implementation and testing.
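The specification-based testing methodology described above (safety properties over wire-visible events, exercised by adversarial test generation) can be illustrated with a minimal sketch. Everything here is hypothetical: the event names and the property are invented for illustration, not taken from the QUIC spec.

```python
import random

# Illustrative safety property over wire-visible events: a peer must
# not send DATA before HANDSHAKE_DONE. The spec is the property; the
# "implementation" output is any event trace checked against it.

def handshake_before_data(trace):
    """Return True iff no DATA event appears before HANDSHAKE_DONE."""
    done = False
    for event in trace:
        if event == "HANDSHAKE_DONE":
            done = True
        elif event == "DATA" and not done:
            return False  # violation: data before handshake completed
    return True

def random_trace(rng):
    """Adversarial test generation: a random sequence of events."""
    return [rng.choice(["HANDSHAKE_DONE", "DATA", "ACK"]) for _ in range(6)]

rng = random.Random(2019)
traces = [random_trace(rng) for _ in range(200)]
violations = [t for t in traces if not handshake_before_data(t)]
# Random exploration reaches traces that implementation-vs-implementation
# interop testing might never exercise.
assert violations  # some generated traces violate the property
assert handshake_before_data(["HANDSHAKE_DONE", "DATA", "ACK"])
```

In the methodology above, the same property set plays both roles: it drives generation of adversarial inputs and acts as the monitor that validates an implementation's observed packets.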
  - Stephen F: good talk, but I disagree with the last bullet (specs come first). It’s much messier: it starts with a bunch of people interested in something.
  - Ken McM: to clarify, I don’t think the whole process starts with specifications, but the process of getting formal verification into the IETF starts with specs.
  - Stephen F: would push back on the “starts with” phrasing.
  - Jonathan H: agree with Stephen; formal verification in TLS 1.3 didn’t come along until maybe draft 7. But I also liked the presentation, and we need to consider this early too.
  - Ken McM: the purpose here is to get people to specify their protocols formally, so that we can do something with the specifications. Formal verification is hard, but there’s a lot we can do with formal descriptions that has value.
  - Jonathan H: security is further down this path. TLS 1.3 went through this process in full, across multiple iterations.
  - Colin Perkins: I think it’s clear that there are broader questions around how we bring formal specs/verification/validation into the IETF.
  - Dennis Jackson (remote): is the QUIC spec public/open source?
  - Carsten B: on the last bullet point, I think it should be humans first. Formal description is about human communication. It isn’t sufficient to have some code that some grad student wrote, and that they alone understand, but that cannot then be used for us to communicate with each other. We need to get to a point where we can use these formal descriptions as a communication tool.

## “Formal Security Analysis: OAuth and OpenID Connect” (Daniel Fett, Guido Schmitz)

- Slide presentation
- All about security analysis of web protocols: protocols running on top of the web infrastructure (so not the web itself)
- Detailed Dolev-Yao style model: the web infrastructure model (WIM)
- Challenges: the web infrastructure is complex, accurate modelling is hard, and comprehensive tool-supported models aren’t available
- “A Comprehensive Formal Security Analysis of OAuth 2.0” (2016) helped to uncover new and unexpected problems (e.g. the mix-up attack), and led to the creation of the OAuth security BCP and RFC 9207
- Challenges: no attacker model, no clearly defined security goals, and OAuth is not one protocol, but many
- OIDF: formal analysis for high-security OAuth profiles
  - FAPI: a high-security profile of OAuth and OpenID Connect; two versions
  - the analysis found new attacks using the WIM; FAPI 2 has formal analysis as an explicit goal
- Guido Schmitz: ongoing work on a formal protocol analysis framework built on top of the F* verification ecosystem, creating a tool (DY*). Case studies helped to expose problems in the tooling/model.

## Discussion

- Colin P: very useful presentations; a range of tools and techniques across the stack, appropriate to different use cases across the IETF
- This meeting is to discuss the possibility of creating an IRTF RG
  - the IETF WG formation process does not apply; the RG process is described in RFC 2014
  - prepare a draft charter and a list of proposed founding members for a new research group
  - IRTF chair and IRSG review, then forwarding to the IAB for approval. The review considers:
    - are the research objectives clear and relevant?
    - will it foster work that wouldn’t get done otherwise?
    - what (if any) is the overlap with other RGs?
    - is there sufficient interest and expertise in the research group’s topic?
- Quick overview of the proposed charter text
- Next steps/questions:
  - does this seem appropriate for an RG?
  - what, if anything, have we missed?
  - are there volunteers to work on this topic?
  - what would be the initial research directions?
- Nicola Rustignoli: SCION has a research effort to do formal verification; the OSS implementation has been partially formally verified in its core parts. Not something that we have brought to the IRTF or IETF, but definitely an area where we see a gap. Experience from our side might be useful; can’t comment on when/how, but there is definitely a need.
  Questions: would this research group also support verification of existing protocols/standards, providing support/tools or doing the actual work?
- Colin P: by providing a forum where people can get help with such analysis, bringing together expertise.
- Jonathan H: seems appropriate; nothing obvious missed; happy to work on this topic. Several initial research directions: first, publish and review existing and ongoing work; second, provide a forum, knowledge, and evangelism resources.
- ???: the work is appropriate, relevant, and overdue.
- Ken McM: one thing that would be interesting for the RG to do is comparative studies of different tools, languages, and techniques.
- Jonathan H: strongly agree. There is one SoK (Systematization of Knowledge) paper that has done this, but very little else out there. Just understanding the differences would be useful.
- Hans-Dieter H: very interested in working in this area; already have funding for it.
- Stephen F: good idea; would support it, and hope it happens.
- ???: looking at model checking for the ??? protocol. It was hard to find resources on this; it would be good to have this research group.
- Sofia Celi (remote): two sets of initial research directions: first, CFRG specs; second, formal analysis of QUIC.
- Benjamin Beurdouche (remote): collecting needs from WGs.