How should we describe and specify protocols? And how can we ensure that
network protocol specifications are consistent and correct, and that
implementations match the specification?
The IETF community has long used natural language to describe and specify
protocols, with occasional use of formal languages and some limited amount
of formal verification. In many ways this is clearly a successful approach.
After all, the Internet exists as an interoperable, global, multi-vendor
network in large part due to the protocol standards the IETF develops. It
is clearly possible to build interoperable implementations from IETF
standards.
But is this the right approach to developing protocol specifications?
The way the IETF describes protocols has changed little over the years.
It may be time to consider whether new techniques have been developed that
might change how protocols are specified, and possibly improve the process.
With help from Chris Wood,
I organised a side meeting at
IETF 115 to discuss whether there could be benefits to more systematic
use of formal methods in the IETF, what factors limit the adoption of
such techniques, and whether it would make sense to create
an IRTF research group to explore such
topics.
Welcome to Ivan Nikitin who
started as a PhD student this week. Ivan will be jointly advised by
me and Ornela Dardha,
and will be looking at how to make session types a practical design
tool for protocol standards.
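Roughly speaking, a session type specifies the sequence, direction, and payload types of the messages exchanged in a protocol, so that conformance can be checked against the specification. As an illustrative sketch only (session types are normally checked statically by a type system, and this toy Python version checks a trace dynamically), a simple request/response session might be described and checked like this:

```python
# A session type for a simple request/response exchange, written as a
# sequence of (direction, payload type) pairs: the client sends a str
# request, receives an int response, then the session ends.
REQUEST_RESPONSE = [("send", str), ("recv", int)]

def check_trace(session_type, trace):
    """Check that a communication trace conforms to the session type."""
    if len(trace) != len(session_type):
        return False
    for (want_dir, want_ty), (got_dir, payload) in zip(session_type, trace):
        if got_dir != want_dir or not isinstance(payload, want_ty):
            return False
    return True

# A conforming trace, and one with the messages out of order:
assert check_trace(REQUEST_RESPONSE, [("send", "GET /"), ("recv", 200)])
assert not check_trace(REQUEST_RESPONSE, [("recv", 200), ("send", "GET /")])
```

The names and encoding here are invented for illustration; making this kind of check practical, static, and usable by standards authors is exactly the open problem.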
To what extent is the technical development of the Internet driven by a
small clique of standards developers? What influence do those
developers have? And what control do large technology companies have on
the development of the technical standards that define the operation of
the Internet?
We attempted to answer these questions in a paper that
Prashant Khare
recently presented at the International Conference on
Web and Social Media (ICWSM).
In that paper,
we studied more than 2 million email messages, sent by
almost 45 thousand participants in the Internet Engineering Task Force
(IETF) over a twenty year period, to understand the social interactions
and dynamics of the standards-setting process. The IETF is one of the
leading technical standards development organisations for the Internet,
and these discussions record the development of key Internet standards,
including the latest versions of the web protocols, security standards,
and the protocols underpinning widely used telephony, video conferencing,
and video streaming platforms.
The development of IETF standards takes place through a mix of plenary
and interim meetings, and continuous discussion on mailing lists.
We analysed the public IETF
mailing list archives to understand the role email plays in the
development of Internet standards, collecting 2.4 million emails, sent
by 75 thousand participants to 1153 mailing lists during the period
1995-2000.
The IETF, as a standards development organisation for the Internet,
grew out of the original US government-funded project that developed
some of the early internetworking protocols. The organisation has
long recognised
that this history has biased its participants towards "well-funded,
American, white, male technicians, demonstrating a distinctive and
challenging group dynamic, both in management and in personal
interactions", and it is
well understood in the IETF that this lack of diversity is
problematic for an organisation that tries to develop the Internet
for all its
end users. The community has made some efforts to encourage
attendance by a broader range of people. Has it succeeded in these
attempts to diversify?
As the Internet develops, it's natural to ask whether it's getting
easier or harder to develop new standards, and whether the complexity
of those standards is changing over time. In the following, we
show that Internet standards documents have become more densely
interconnected over the years, and are taking longer to publish.
The IETF is becoming slower over time, because new standards are
harder to develop when they must account for, and retain backwards
compatibility with, increasing amounts of prior work.
The technical details that describe the operation of the Internet are
defined in the RFC series of
documents. This series began publication in 1969, as a set of requests
for comment on the initial design ideas for the system that became the
Internet. Over time, it developed into the main series of technical
specifications and standards that describe how the Internet works.
As the Internet has grown from a research project to critical
international infrastructure, it’s natural to ask how the underpinning
standards development process has changed. In particular, one might ask
how the number of Internet standards being published has changed over
time, and how this relates to the development of the network.
Stephen McQuistin will present our paper on
Characterising the IETF Through the Lens of RFC Deployment at the ACM Internet
Measurement Conference today.
This paper examines the shifts and trends within the Internet standards
development process, to show how the time needed to develop a technical
standard for the Internet has increased over the years, and how those
standards have become more complex as the network has evolved.
Then, building on these observations, it develops statistical models to
understand the factors that lead to successful uptake and deployment of
protocols, deriving insights to improve the standardisation process.
Two members of my research group recently gave talks about our work
on parsing protocol standards.
Stephen McQuistin spoke in the
UK Next Generation Networks seminar series, giving an update on the
work presented in our
ANRW'20 paper, outlining the system architecture and showing how
our system can be used to describe the format of real-world protocols
such as TCP. Vivian Band
spoke about
Rust for Safer Protocol Development as part of RustFest Global 2020,
describing our approach to parsing protocols with a focus on the back-end
code generation, and our use of Rust
with the nom parser combinator
framework to generate safe parsing code.
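To give a flavour of the parser combinator style that frameworks like nom provide, here is a minimal sketch in Python rather than Rust (our actual system generates Rust, and nom's API differs in detail): small parsers for individual fields are combined into parsers for larger structures, here the first four bytes of a TCP header.

```python
def be_u16(data, pos):
    """Parse a big-endian 16-bit unsigned integer; return (value, new_pos)."""
    if pos + 2 > len(data):
        raise ValueError("truncated input")
    return int.from_bytes(data[pos:pos + 2], "big"), pos + 2

def sequence(*parsers):
    """Combine parsers so they run one after another over the input."""
    def parse(data, pos):
        values = []
        for p in parsers:
            value, pos = p(data, pos)
            values.append(value)
        return tuple(values), pos
    return parse

# The first four bytes of a TCP header: source and destination ports.
parse_ports = sequence(be_u16, be_u16)

header = bytes([0x1F, 0x90, 0x00, 0x50])   # ports 8080 and 80
(src, dst), _ = parse_ports(header, 0)
assert (src, dst) == (8080, 80)
```

The appeal of this style for generated code is that each combinator performs its own bounds checking, so out-of-bounds reads are rejected rather than silently accepted.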
Stephen McQuistin presented our paper on
Parsing Protocol Standards to Parse Standard Protocols at the
ACM/IRTF Applied Networking Research Workshop 2020 last week. In
this paper we consider the problem of how to parse Internet protocol
standards documents to generate a typed representation of protocol
data units, and from that to generate parsing and serialisation code
to ease implementation of the protocol. The goal is to make it easier
to implement, test, and validate network protocols, and to help improve
security by removing the need to write protocol parsing code by hand.
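As a rough illustration of the end goal (this is a hand-written Python sketch, not the output of our tool, and the field names are taken from the UDP header in RFC 768): a typed representation of a protocol data unit, with parsing and serialisation functions derived from it, gives a round-trip guarantee that hand-written parsers often lack.

```python
from dataclasses import dataclass

@dataclass
class UdpHeader:
    """Typed representation of a UDP header: four 16-bit fields."""
    src_port: int
    dst_port: int
    length: int
    checksum: int

def parse_udp_header(data: bytes) -> UdpHeader:
    """Parse the 8-byte UDP header from the start of a datagram."""
    if len(data) < 8:
        raise ValueError("truncated UDP header")
    fields = [int.from_bytes(data[i:i + 2], "big") for i in range(0, 8, 2)]
    return UdpHeader(*fields)

def serialise_udp_header(h: UdpHeader) -> bytes:
    """Serialise a UdpHeader back to its 8-byte wire format."""
    return b"".join(f.to_bytes(2, "big")
                    for f in (h.src_port, h.dst_port, h.length, h.checksum))

wire = bytes.fromhex("003500350020abcd")     # a DNS datagram header
hdr = parse_udp_header(wire)
assert hdr.src_port == 53 and hdr.dst_port == 53
assert serialise_udp_header(hdr) == wire     # round-trip preserves the bytes
```

Generating this kind of code mechanically from the standards document removes an error-prone manual step, which is where many protocol security bugs originate.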
I was pleased to join a panel discussion at the IFIP Networking 2020
Workshop on the Future of Internet Transport today, along with Jana
Iyengar and Alessandro Ghedini, and moderated by Gorry Fairhurst. The
slides
I used to frame my position are available, and if you're registered for
the conference,
a recording of the discussion is also available.
We released an update to our Python library for access to the IETF
Datatracker and RFC Index
(GitHub,
PyPI). The library
provides supporting infrastructure for our research: we use it to find
and retrieve new drafts to test our protocol parsing and generation
tools, and to support bibliometric analysis of IETF standards
documents.
Welcome to Dejice Jacob, who's joined our project on Improving Protocol
Standards. He will be working on improving our tooling to extract packet
descriptions from IETF standards documents, and on integration with the
IETF datatracker.
I'm seeking a postdoctoral research associate to work on the EPSRC-funded
project "Streamlining Social Decision Making for Improved Internet Standards",
working with Matthew Purver, Patrick Healy, Gareth Tyson, and Ignacio
Castro at Queen Mary, University of London. The project involves
working with the IETF and research community, to understand and improve
the distributed decision making process by which Internet protocol
standards are developed.
I attended the mid-term review workshop for projects funded under the
EPSRC Engineering for a Prosperous Nation call, held at the Law Society
in London earlier today.
The initial goal of our work on improving protocol standards is to make
it easier to automatically generate packet parsers from IETF RFCs. We've
submitted
an internet draft describing a machine-readable way to specify
packet formats, while being only a minimal change compared to existing
authoring methods. Stephen McQuistin and I will be at the IETF meeting
in Montreal this week (20-27 July 2019) to discuss this with the IETF
community.
I'm pleased to announce that EPSRC has agreed to support our new
project on "Streamlining Social Decision Making for Improved Internet
Standards"
(EP/S036075/1).
This is
joint work with Matthew Purver, Patrick Healy, Gareth Tyson, and
Ignacio Castro at Queen Mary, University of London, and will run for
three years starting January 2020.
Welcome to Vivian Band, who's joined our project on Improving Protocol
Standards for a ten week paid summer internship. Vivian will be working
on implementing the back-end code generator that parses protocol data
units in our system, initially aiming to generate protocol parsers in
the Rust programming language.
I'm at IETF 103 in Bangkok this week, along with Stephen McQuistin,
where we'll be discussing how to bring more structured and machine
readable features to IETF specifications, exploring the attitudes
of participants to such formalisms, and trying to understand the
barriers to adoption. If you're at the IETF meeting, and interested
in these topics, please come and talk to me or to
Stephen.
I gave a talk at Loughborough
University last week, entitled “Parsing Protocol Standards,
Parsing Standard Protocols”, and will repeat it as a Systems
Section seminar in Glasgow later today. The talk introduces some of
the work we're doing in my EPSRC
project on Improving Protocol Standards for a more Trustworthy Internet.
I'm seeking a postdoctoral research associate to work on the EPSRC-funded
project "Improving Protocol Standards for a more Trustworthy Internet".
The position involves working with the IETF and research community, to
develop semi-formal tools and methods for specifying protocol standards,
along with tools to support their use in the IETF community. The goal
is to improve the quality of Internet protocol standards.
I'm pleased that the EPSRC has agreed to support our work on "Improving
Protocol Standards for a more Trustworthy Internet". This is a two year
project, funded under the Engineering for a Prosperous Nation call, that
aims to make the Internet more robust by improving the way in which the
underlying protocol standards are developed.
Stephen McQuistin will join the
project as a Research Assistant, and I expect to advertise for a
further RA position shortly.