# Dennis Ritchie's Lost Thesis

Many of you, dear readers, have heard of Dennis Ritchie. In the late 1960s, he left graduate studies in applied mathematics at Harvard for a position at Bell Telephone Laboratories, where he worked for the rest of his life. Soon after joining the Labs, Ritchie teamed up with Ken Thompson to create the foundational pair that spawned the digital world to come: the Unix operating system and the C programming language. Thompson led the development of the OS, and Ritchie created C, in which Thompson then rewrote Unix. Unix went on to become the basis for most of the operating systems on which our digital world is built, and C became, and remains, one of the most popular languages for writing the software that sets this world in motion.

*Unix creators Ken Thompson and Dennis Ritchie. Photo source unknown.*

On Ritchie’s personal web pages at the Labs (which Nokia, the current owner of Bell Labs, still maintains), he describes his path into the academic world of computer science in a characteristically dry, self-deprecating style:

“I... received a bachelor’s degree and advanced degrees from Harvard University, where as an undergraduate I concentrated in physics, and as a graduate student in applied mathematics... The subject of my 1968 doctoral thesis was subrecursive hierarchies of functions. My undergraduate experience convinced me that I was not smart enough to be a physicist, and that computers were quite neat. My graduate school experience convinced me that I was not smart enough to be an expert in the theory of algorithms, and also that I preferred procedural to functional languages.” ^1

Whatever the worth of this introspection, Dennis’s path certainly led him to the field and the environment in which he made an outstanding contribution.

## “All but the bound copy”

It may seem surprising that, despite Ritchie’s well-deserved fame in computing, his dissertation, the intellectual and biographical fork in the road that separated an academic career in computer science from the work at Bell Labs that produced C and Unix, was until recently *lost*. Lost? Yes, quite literally: it was never published, it was absent from public collections, and there was no record of it in the Harvard library catalog or in the dissertation databases.

After Dennis Ritchie’s death in 2011, his sister Lynn made a very thorough search for the official copy and for any records at Harvard. She found neither, but she did obtain a copy from the widow of Ritchie’s former thesis advisor. Until recently, then, perhaps only a dozen people had ever been able to read Ritchie’s dissertation. Why did this happen?

From Ritchie’s own description of his academic path, you may notice that he never explicitly says he received a PhD for his 1968 dissertation. That is because he did not receive the degree. Why? The reason, it seems, was his failure to meet the formal requirement of depositing his completed dissertation in the Harvard library. Albert Meyer, a professor at the Massachusetts Institute of Technology who went through the same Harvard graduate program as Ritchie, recounts the story in a recent interview:

“I heard this story from Pat Fischer [Ritchie’s and Meyer’s advisor at Harvard], and it really is true: in those days, under Harvard’s rules, you had to give the university a bound copy of your dissertation; to get the PhD, you needed a certificate from the library. According to Pat, Dennis submitted his dissertation. It was approved by the committee; he had the typewritten original ready to hand over, but then he learned that the library required a bound copy, and the cost of binding at the time was significant... not a crushing sum, but not a trivial one either. Pat said Dennis took the position: ‘If the Harvard library needs a bound copy for its collection, then let it pay for the binding, because I’m not going to!’ And evidently he never did, and as a result he never received his PhD. So he did not stop at the stage of ‘all but the dissertation’. He reached the stage of ‘all but the bound copy’.” ^2

Although Lynn Ritchie’s investigation confirmed that Dennis never submitted a bound copy of his dissertation and never received his PhD from Harvard, his brother John believes that Dennis’s motive was something other than irritation at the high price: he already held a prestigious research position at Bell Labs, and “he never liked dealing with the little details of life.” We will never know the true reasons; perhaps Ritchie himself never fully understood them. But we do know with certainty that, until recently, Dennis Ritchie’s dissertation had been lost for decades.

*Dennis Ritchie (right) around the time he joined Bell Laboratories, with two Labs colleagues: his father, Alistair Ritchie (left), and William Keister (center), a pioneer of electronic telephone switching. Photo from the collection of the Ritchie family.*

## In the collection

The Dennis Ritchie collection, recently donated to the Computer History Museum by Ritchie’s brother and sister, has yielded several historical treasures. One is a set of the earliest Unix source code, dating from 1970–71. Another is a faded and stained photocopy of Ritchie’s doctoral dissertation, *Program Structure and Computational Complexity*. The museum gladly took the opportunity to make a digital copy of Ritchie’s own copy of the dissertation (as well as a far more legible scan of the copy owned by Albert Meyer) and to publish them for the first time.

## Decoding Ritchie’s dissertation

*The first page of Dennis Ritchie’s previously lost dissertation, published here for the first time.*

It is one thing to recover a copy of Ritchie’s lost dissertation and publish it; it is another to understand it. To get a sense of what Ritchie’s dissertation is about, we need to go back to the beginning of the 20th century, to a period of creative ferment in which mathematicians, philosophers, and logicians puzzled over the deepest foundations of mathematics. In the centuries preceding this ferment, the qualitative character of mathematical knowledge, its exactness and certainty, had given it a special, sometimes divine, status. Although philosophical debates about the source or foundations of these qualities go back at least to Pythagoras and Plato, at the beginning of the 20th century influential mathematicians and philosophers began to regard formal logic, in which the rules and procedures of reasoning are expressed in systems of symbols, as the foundation of mathematics.

Throughout the 1920s, the German mathematician David Hilbert was an extremely influential advocate of grounding mathematics in formal logic. In particular, Hilbert believed that certain qualities of mathematics could be firmly established: for example, that mathematics is free of contradictions, and that the truth or falsity of any mathematical statement can be demonstrated by a proof in formal logic. The proofs Hilbert championed, which he called “finitary”, use simple, explicit, almost mechanical rules for manipulating the symbols of formal logic. Such proofs are built as rigorous constructions of strings of symbols, line by line, each derived from the ones before.

In the 1930s, in the search for such rules for the logical manipulation of symbols, mathematicians and philosophers found a connection to *computation*: the step-by-step, rigidly defined processes by which human “computers” and mechanical calculators performed mathematical operations. Kurt Gödel produced a proof bearing on this subject, but unfortunately for Hilbert and his allies, it showed the opposite of what they had hoped. Instead of proving that logic guarantees the provability of everything true in mathematics, Gödel’s logic demonstrated that mathematics is *incomplete*. To obtain this astounding result, Gödel based his reasoning on a special class of mathematical objects called *primitive recursive functions*. What mattered to Gödel about these recursive functions was that they are *computable*: they rely on “finite procedures” akin to the simple, almost mechanical rules that Hilbert had in mind.

*Left: student Kurt Gödel in 1925. Wikipedia/public domain. Right: David Hilbert, 1912. Wikipedia*

Soon after Gödel, the American mathematician and logician Alonzo Church used similar considerations of computability to formulate a proof that mathematics is not always *decidable*; that is, mathematics contains statements whose truth or falsity cannot be proved. Church’s proof rested on the notion of “effectively computable functions”, which built on Gödel’s recursive functions. At almost the same time, and independently of Church, the Briton Alan Turing proved the same result, but on the basis of “computability” defined by the operation of an abstract “computing machine”. This abstract Turing machine, capable of carrying out any computation, later became one of the most important foundations of theoretical computer science.

In the following decades, before computer science emerged as a recognized discipline, mathematicians, philosophers, and others began to explore the nature of computation itself, moving ever further from its connections with the foundations of mathematics. In an interview, Albert Meyer explains it this way:

“In the 1930s and 1940s, an understanding of what was computable and what was not was actively being worked out. Thanks to Gödel and Turing, there were logical limits on what can and cannot be computed. And the new idea that appeared in the early 1960s was this: ‘Let’s try to understand what can be done with computation.’ That is how the concept of computational complexity appeared: all kinds of things can be obtained by computation, but not all of them easily... How well can they be computed?”

Later, with the development of electronic digital computing, the question for many of these researchers was no longer what logical reasoning about computability could tell us about the nature of mathematics, but what such reasoning could tell us about the limits of computation itself. Once those limits had been well explored, researchers’ interest shifted to the nature of computation within them. What can we prove in the realm of *possible* computations?

*Professor Albert Meyer shares his memories of Ritchie in an interview in 2018.*

One of the few places where such research was under way in the mid-1960s, when Dennis Ritchie and Albert Meyer entered Harvard’s graduate school, was within departments of applied mathematics. Such departments were often also the places where electronic digital computing first entered educational institutions. Meyer recalls:

“Applied mathematics was a huge field in which the theory of computing occupied a tiny new part.”

Both Ritchie and Meyer were drawn to Harvard’s Department of Applied Mathematics by their undergraduate mathematics courses at the university, although Meyer does not recall knowing Ritchie as an undergraduate. In graduate school, both became deeply interested in the theory of computation, and both chose Pat Fischer as their advisor. Fischer had only recently received his own PhD and was at Harvard only during the crucial early years of Ritchie’s and Meyer’s studies, moving to Cornell University in 1965. (Later, in 1982, Fischer became one of the Unabomber’s targets.) Meyer recalls:

“Patrick was deeply interested in understanding the nature of computation: what made it easy, what made it hard, why things need to be done in different ways... What can different types of programs accomplish?”

## A summer problem and its solution

After their first year of graduate school, Fischer separately hired Ritchie and Meyer as his research assistants for the summer (Meyer, at least, was unaware that Ritchie had been hired). Meyer’s assignment: work on an “open problem” in the theory of computation chosen by Fischer, with a report due at the end of the summer. Fischer himself was away. Meyer spent a lonely summer working on the problem alone; in the end, he told Fischer that he had achieved only minor results. Soon afterward, while attending Fischer’s graduate seminar, Meyer was astonished to suddenly see the solution to the summer problem. He joyfully shared his discovery with Fischer, but “was surprised and a little disappointed by Pat’s report that Dennis had also solved the problem.” Fischer had set Ritchie and Meyer the same problem for the summer without telling either of them!

*Dennis Ritchie as a graduate student. His father, Alistair Ritchie (who also worked at Bell Labs), sits on the back of Dennis’s BSA 650 motorcycle. The photo belongs to the Ritchie family.*

Fischer’s summer problem was one approach to a broader question of computational complexity: the relative ease or speed with which different computations can be performed. Recall that Gödel used primitive recursive functions to exemplify computability by finite procedures, an essential element of his famous proof. In the 1950s, the Polish mathematician Andrzej Grzegorczyk defined a hierarchy of these same recursive functions based on how quickly or slowly the functions grow. Fischer asked Meyer and Ritchie to spend the summer investigating how this hierarchy of functions relates to computational complexity.

To Meyer’s credit, his disappointment over his own summer’s results did not keep him from appreciating Ritchie’s solution: *loop programs*. Meyer recalls: “...this concept of loop programs that Dennis invented... was so elegant and important, and it also became such a terrific research and intellectual vehicle for analyzing the subject, that I no longer minded that he had solved the problem and not I.”

Ritchie’s solution of Fischer’s summer problem in the form of loop programs became the foundation of his 1968 dissertation. Loop programs are very small, restricted computer programs that would look familiar to anyone who has ever used the FOR command to write loops in BASIC. In a loop program, you may set a variable to zero, add one to a variable, or copy the value of one variable into another. That is all. The only control structure available is... a simple loop, in which a sequence of instructions is repeated a fixed number of times. Crucially, loops can be “nested”: a loop may be placed inside another loop.
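This restricted model is simple enough to sketch in a few lines of Python. The encoding below (tuples of instruction names) is mine, purely for illustration; Ritchie’s own notation differs.

```python
# A minimal sketch of the loop-program model. A program is a list of
# instructions over named variables holding natural numbers:
#   ("zero", x)        x = 0
#   ("inc", x)         x = x + 1
#   ("copy", x, y)     x = y
#   ("loop", x, body)  repeat body x times

def run(program, env):
    for instr in program:
        op = instr[0]
        if op == "zero":
            env[instr[1]] = 0
        elif op == "inc":
            env[instr[1]] += 1
        elif op == "copy":
            env[instr[1]] = env[instr[2]]
        elif op == "loop":
            # The repetition count is fixed when the loop is entered;
            # the body cannot change how many times it runs.
            for _ in range(env[instr[1]]):
                run(instr[2], env)
    return env

# Addition needs no nesting: copy x into z, then repeat "inc z" y times.
add = [("copy", "z", "x"),
       ("loop", "y", [("inc", "z")])]

env = run(add, {"x": 3, "y": 4, "z": 0})
print(env["z"])  # 7
```

Note the deliberate weakness of the model: because each loop’s repetition count is fixed on entry, a loop program always halts, unlike a program with a general WHILE loop.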

In his dissertation, Ritchie showed that these loop programs, and exactly these, are what is needed to compute Gödel’s primitive recursive functions, the very functions that make up the Grzegorczyk hierarchy. Gödel had put forward his recursive functions as the model of effective computability, and Ritchie showed that loop programs are up to that job. Ritchie’s dissertation showed that the nesting depth of a loop program, how deeply its loops sit inside other loops, is a measure of its computational complexity, and also yields a bound on the time its computation requires. Moreover, he showed that classifying loop programs by the nesting depth of their loops is exactly equivalent to the Grzegorczyk hierarchy. The growth rate of primitive recursive functions is indeed connected to their computational complexity; in fact, the two classifications coincide.
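A rough illustration of how nesting depth tracks growth rate (these examples are mine, not drawn from the dissertation): each extra level of nesting buys a faster-growing function. Written as ordinary Python, but using only the loop-program moves (set to zero, add one, copy, and loops whose counts are fixed on entry):

```python
def add(x, y):
    # Depth 1: repeated increment; the result grows linearly.
    z = x
    for _ in range(y):
        z = z + 1
    return z

def mul(x, y):
    # Depth 2: repeated addition.
    z = 0
    for _ in range(x):
        for _ in range(y):
            z = z + 1
    return z

def power(x, y):
    # Depth 3: repeated multiplication; the result grows exponentially.
    z = 1
    for _ in range(y):
        w = 0                     # multiply z by x with two nested loops
        for _ in range(z):
            for _ in range(x):
                w = w + 1
        z = w                     # copy the product back into z
    return z

print(add(3, 4), mul(3, 4), power(2, 10))  # 7 12 1024
```

The running time of each function above is on the order of the value it computes, which is the flavor of Ritchie’s result: deeper nesting permits faster-growing functions, and the growth rate and the time complexity go hand in hand, level by level, with the Grzegorczyk hierarchy.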

Meyer recalls:

“Loop programs amounted to a very simple model that any researcher in the theory of computation could understand instantly... The traditional formulation in terms of primitive recursive hierarchies required a very elaborate logical notation with complicated syntax. And suddenly, instead, we had a three-or-four-line description of loop programs.”

Although, as we have seen, Ritchie’s invention of loop programs never reached the world of computer science through his dissertation, it did make its way into the literature through a 1967 paper written jointly with Albert Meyer.

Meyer recalls:

“Dennis was a very nice, good-natured, modest guy. Obviously very smart, but also laconic... So we talked a little and discussed the paper, which for the most part I wrote. I don’t think he wrote any of it, but he read it and gave his comments... and explained the concept of loop programs to me.”

The paper, “The Complexity of Loop Programs,” was published by the ACM in 1967; it was an important early step in Meyer’s productive and successful career in theoretical computer science ^3. But it also marked a parting of the ways for him and Ritchie. Meyer recalls:

“For me it was a disappointment. I would have loved to keep working together, because he seemed like a smart and pleasant person who was interesting to work with, but, you know, he was already doing other things. He would stay up all night playing Spacewar!”

At the beginning of this article, I quoted the joke from the biographical note on Ritchie’s website: “My graduate school experience convinced me that I was not smart enough to be an expert in the theory of algorithms, and also that I preferred procedural to functional languages.” His preference for procedural languages is undeniable, but our study of his lost dissertation shows that he was, after all, smart enough for theoretical computer science. More likely, his graduate school experience turned his fascination from theoretical work toward practical implementation, toward building new systems and new languages as his way of exploring the boundaries, nature, and possibilities of computation.

**Read Dennis Ritchie’s lost dissertation:**

## Notes

1. Dennis M. Ritchie bio, Bell Labs.
2. Oral history of Albert Meyer, interviewed by David C. Brock, September 5, 2018, CHM.
3. Albert R. Meyer and Dennis M. Ritchie, “The Complexity of Loop Programs,” in *Proceedings of the 1967 22nd National Conference* (Washington, DC: ACM Press, 1967), 465–69, doi.org/10.1145/800196.806014.

## About the Author

David C. Brock is a technology historian and director of the Computer History Museum Software History Center. He is interested in the history of computing, electronics and equipment, as well as oral history.
