Networking Research Laboratory
Department of Computer Sciences
The University of Texas at Austin

Director: Simon S. Lam

Adaptive Backoff Algorithms for Multiple Access: A History

The use of random access to share a broadcast channel was first proposed by Norman Abramson for the ALOHA System (AFIPS Conf. Proceedings, 1970).  He carried out his analysis under the assumption that both new and retransmitted packets arrive according to a Poisson process.  This assumption abstracted away the need for a backoff algorithm.

The first backoff algorithm for multiple access was proposed and investigated for the slotted ALOHA protocol in our 1973 paper in the National Computer Conference proceedings.  We proposed to delay the retransmission of a collided packet by a random time, chosen uniformly over K slots (K > 1), where K is a parameter.  We showed that channel throughput increases with K and that only in the limit, as K goes to infinity, does it approach the 1/e value predicted by analysis based upon the Poisson process assumption.
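As a rough illustration (not code from the paper), a minimal Python sketch of this fixed-K policy might look as follows; the function name and the exact slot range (1 to K) are assumptions, since the protocol only requires a uniform choice over K slots:

    import random

    def fixed_k_backoff(K: int) -> int:
        # Illustrative sketch: the retransmission delay of a collided
        # packet is drawn uniformly over K slots, independent of how
        # many collisions the packet has already suffered.
        return random.randint(1, K)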

In ARPANET Satellite System Note 48 (NIC 17655), I presented simulation results showing that slotted ALOHA using a fixed K for backoff was unstable.  In simulations, the channel throughput quickly collapsed to zero even though the rate of new packet arrivals was 0.35, which is less than 1/e.  These observations led to several research contributions I made in 1973.  First, I developed a Markov chain model that characterizes ALOHA instability.  The equations and graphs of this model were reproduced in many textbooks, such as Schwartz (Addison-Wesley 1987), Kleinrock Vol. 2 (Wiley 1976), Rom & Sidi (Springer-Verlag 1990), and Bertsekas & Gallager (Prentice-Hall 1992).

During 1973, I also proposed several adaptive backoff algorithms and investigated their effectiveness, including one called Heuristic RCP.  In Chapter 6 of my 1974 dissertation, page 210 (also my 1975 paper in the National Computer Conference proceedings, page 150), I wrote about Algorithm 4 (Heuristic RCP) as follows:  For a backlogged packet with m previous collisions, the uniform retransmission randomization interval is taken to be K_m, where K_m is a monotone nondecreasing function of m.  When the channel traffic increases, the probability of channel collision increases.  As a result, the "effective" value of K increases.  If K_m is a steep enough function of m, we see that channel saturation will be prevented.
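As an illustration only (the names and the 1-to-K_m slot range are assumptions, not the dissertation's code), a minimal Python sketch of Heuristic RCP might pass K_m in as an arbitrary monotone nondecreasing function:

    import random
    from typing import Callable

    def heuristic_rcp_backoff(m: int, k_of_m: Callable[[int], int]) -> int:
        # k_of_m(m) plays the role of K_m: it should be monotone
        # nondecreasing in m, so that a longer collision history spreads
        # retransmissions over a wider interval and prevents saturation.
        return random.randint(1, max(1, k_of_m(m)))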

I evaluated Heuristic RCP using analyses and simulations for several choices of the function K_m and came to the conclusion that it is simple to implement and is capable of preventing channel saturation under temporary overload conditions.  Heuristic RCP was also described in our 1975 IEEE Transactions on Communications paper.

The binary exponential backoff algorithm used in Ethernet is a special case of Heuristic RCP with the function K_m chosen to be 2^m.
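In that framing, a sketch of the Ethernet choice is obtained by taking K_m = 2^m (illustrative only; actual Ethernet implementations also truncate the exponent and cap the number of attempts):

    import random

    def binary_exponential_backoff(m: int) -> int:
        # Heuristic RCP with K_m = 2**m: after m collisions the delay is
        # drawn uniformly over 2**m slots.
        return random.randint(1, 2 ** m)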
 

Selected publications