In the last issue of the SIGMOD Record, Jeff Ullman remembered the database course taught by Catriel Beeri at Princeton in 1977 or so. This paper (submitted to TODS in March of 1978) was one of the first products of the ferment generated by Catriel's course. The question addressed was: when does a set of functional dependencies guarantee that any relation that satisfies it can be decomposed without loss of information? The special case of decomposing one relation into two had been solved years before by Delobel and Casey and by Rissanen, and incorrect generalizations of this result, made by well-known researchers, were circulating in manuscript.
As a graduate student, reading an early version of what became known as the ABU paper, I was struck by several facts: that database theory was subtle enough that well-known researchers could make mistakes; that the mysterious phenomenon of ``the connection trap'' bandied about at the time could be cleanly formalized and analyzed; and that humour was allowed, so that Theorem 2 was called ``the Mickey Mouse Theorem'' for reasons that are obvious if you look at the accompanying figure (regrettably this name did not make it to the published version).
The simple and elegant algorithm for testing losslessness in this paper, which Jeff and Al Aho used to describe as ``chasing down dependencies,'' was the starting point for Shuky Sagiv, Dave Maier, and me, all graduate students at the time, for what became a whole body of work on the chase method, still an important theoretical tool today. In fact, at about the same time as this issue of the Record ships, a paper that applies the chase to the highly au courant topic of information integration is being presented at the ICDT'99 conference in Jerusalem, which is co-chaired by none other than Catriel Beeri.
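For readers unfamiliar with the test, the algorithm can be sketched in a few lines. This is the standard textbook tableau formulation of the lossless-join test, not a transcription of the paper's own presentation; all function and variable names here are my own. One builds a tableau with one row per subschema, using a distinguished symbol where the subschema contains the attribute, then ``chases'' the functional dependencies by equating symbols in rows that agree on a dependency's left-hand side; the decomposition is lossless exactly when some row becomes all-distinguished.

```python
def lossless(attrs, decomposition, fds):
    """Tableau (chase) test for lossless-join decomposition.
    attrs: list of attribute names;
    decomposition: list of attribute subsets (one per subschema);
    fds: list of (lhs, rhs) pairs of attribute sets.
    Returns True iff the join of the decomposition is lossless."""
    # One row per subschema: distinguished symbol 'a' where the
    # subschema contains the attribute, a unique b-symbol elsewhere.
    rows = [{A: 'a' if A in Ri else f'b{i}_{A}' for A in attrs}
            for i, Ri in enumerate(decomposition)]
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            for r in rows:
                for s in rows:
                    # If two rows agree on the left-hand side,
                    # equate their symbols on the right-hand side.
                    if all(r[A] == s[A] for A in lhs):
                        for A in rhs:
                            if r[A] != s[A]:
                                # Prefer the distinguished symbol.
                                keep = 'a' if 'a' in (r[A], s[A]) else r[A]
                                old = s[A] if s[A] != keep else r[A]
                                for t in rows:
                                    for B in attrs:
                                        if t[B] == old:
                                            t[B] = keep
                                changed = True
    # Lossless iff some row is entirely distinguished.
    return any(all(r[A] == 'a' for A in attrs) for r in rows)
```

For example, decomposing R(A, B, C) into {A, B} and {A, C} under the dependency A → B is lossless, while decomposing it into {A, B} and {B, C} under the same dependency is not.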
Copyright © 1999 by the author(s). Review published with permission.