
Oral History of Margaret Hamilton (2017)

Computer History Museum, April 13, 2017. Interview by David C. Brock. 3 hours, 3 minutes.


Margaret Hamilton was born in 1936 in Paoli, Indiana, into a family of teachers, ministers, and musicians. Her father was a philosophy and poetry professor and president of the Michigan Poetry Society. Her mother taught high school. Both sets of grandparents were ministers; one side was Quaker, speaking “thee” and “thou.”

The family moved frequently through Indiana, Michigan, and Ohio. Hamilton grew up in small towns — “it would be lucky if I could say it had 100 people” — and became accustomed to being the new kid. Music was everywhere: “always maybe three pianos being played at the same time by people in the family.”

“School for me was fun. I looked forward to it. And I loved every course — I shouldn’t say loved. That’s a strong word. I mean math was my favorite. And what I did not like was home ec because girls were supposed to take it. And, again, it was rebelling against what we had to do.”

She played on an all-boys baseball team (“we were in such a small town that they needed an extra player”), sang in choir, played in a band, and pursued “you name the hobby.”


Hamilton’s most formative pre-computing experience was managing the Arcadian Copper Mine tourist operation in Michigan’s Upper Peninsula, starting at age sixteen. What began as guiding a single family a day grew, over several summers, to more than 1,000 visitors daily. She hired and managed other guides (including her brother), ran the jewelry department, and handled the finances for the owner, Arvo Walitalo, who had never been past fourth grade.

“Over this several years I was running the Arcadian Copper Mine to take people on tours for quite a long time on summers during both high school and college. I also worked at Joe’s Chicken Basket at nighttime when I was running the copper mine. So you can see I got an idea of what you’re getting into when you’re going to work every day and for responsibility and what it means to have responsibility.”

When her guides went on strike, she negotiated raises for everyone — including herself. When she found a large piece of native copper, the owner proposed a Solomonic solution: he cut it in half.


Education: Michigan, Earlham, and Florence Long


Hamilton attended the University of Michigan first (“probably because I had a good scholarship”), then transferred to Earlham College, a Quaker-founded school where generations of her family had studied. She majored in mathematics and philosophy — convinced, along with her father, that “philosophy and mathematics were definitely connected.”

The decisive influence was Florence Long, head of Earlham’s math department:

“She was so good at it. But she was also a wonderful human being. She would invite all of her students, myself being the only woman and all of the guys, over to her house for cucumber sandwiches. I mean it was like a family kind of thing. But she was good. And I remember thinking, ‘I want to do what she’s doing. I want to teach the kind of mathematics she’s teaching.’”

Hamilton married in 1958 and had her daughter Lauren in late 1959. The couple agreed: one would work to support the family while the other pursued graduate school. Hamilton deferred her planned Ph.D. at Brandeis — a deferral that became permanent once she discovered computing.


First Computing: Edward Lorenz and the LGP-30


Hamilton’s introduction to computing was working for MIT meteorologist Edward Lorenz — the father of chaos theory — writing weather prediction software in hexadecimal on the LGP-30:

“He had an LGP-30 in his office. And he had at least two Ph.D.’s if not three. Neither — none of them, I should say — was in the field of computers and programming — and software — but he loved that computer.”

Lorenz taught her the hardware intimacies: how to skip drum locations for speed, how the machine worked at the lowest level. Hamilton’s innovation was hacking the binary paper tape directly — “I poked a hole in there. That was the way to get a 1 when it was a 0. And if I put Scotch Tape over it, I could go the opposite” — stretching the paper tape down the hallway of the meteorology department to make changes.
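Her physical patch maps cleanly onto bit operations. As a minimal sketch (the one-byte-per-frame layout here is hypothetical, not the LGP-30’s actual tape format): punching a hole forces a bit to 1, and taping over a hole forces it back to 0.

```python
# Sketch of Hamilton's tape patching as bit operations on a tape image.
# The layout (one frame per byte) is an assumption for illustration.

def punch(tape: bytearray, frame: int, bit: int) -> None:
    """Poke a hole: force the bit to 1 (irreversible on real paper tape)."""
    tape[frame] |= 1 << bit

def cover(tape: bytearray, frame: int, bit: int) -> None:
    """Scotch Tape over a hole: force the bit back to 0."""
    tape[frame] &= ~(1 << bit)

tape = bytearray([0b0101, 0b0011])
punch(tape, frame=0, bit=1)   # 0b0101 -> 0b0111
cover(tape, frame=1, bit=0)   # 0b0011 -> 0b0010
```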

She also worked on the PDP-1 at Project MAC, bringing her infant daughter along at night. The hackers considered her “the establishment” but also “an exception” — “I was a programmer. I was a serious programmer. I was there to build real systems.” In a memorable episode, she caught the hackers modifying the hardware (“there’s a chapter that they talk about my coming in there, you know, ruining their fun”).

Years later, she discovered Lorenz had thanked her for the programming in one of his famous papers on chaos theory: “I never knew he did that. That really meant the world to me.”


At Lincoln Labs, Hamilton worked on the SAGE air defense system, developing radar tracking software on the AN/FSQ-7 (XD-1). Getting hired required an unusual accommodation:

“He said to me, ‘We’re really interested in having an interview,’ he said, ‘But please forgive me.’ He said, ‘But I’ve only interviewed men and I do interviews in my hotel room and I can’t do that with you because you’re a girl.’ He said, ‘Would it be okay if we had an interview at the bar in the hotel?’”

Her first assignment was deciphering a departed programmer’s intentionally tricky, uncommented code — “all the comments were in Latin and Greek.” This experience convinced her to always write comments.

The SAGE computer’s crashes were public spectacles — “bells and whistles, really loud, everybody could hear it. Flashing lights. And then, it’s like, and you’re standing there like they caught you.” Hamilton’s solution: Polaroid pictures of each programmer posing with their bug.

Then came the seashore program:

“One of the things about my program, the one I spent the most time on, was whenever it ran, it sounded like the most beautiful seashore. And people would come listen to it. It was music.”

When a computer operator called at 4 AM saying “something terrible happened; your program no longer sounds like a seashore anymore,” Hamilton drove in and fixed it. This became one of her canonical stories about debugging by sound.


Hamilton learned about the Apollo program through the news and immediately went for an interview. She had two job offers on the same day — one for onboard flight software, one for ground support systems:

“I thought, ‘Oh, my God. I know which one I want but I don’t want to hurt anybody’s feelings. I’ll just tell them to decide.’ And they flipped a coin. And the one that won was the right one.”

Her early assignment was writing an abort program for unmanned missions — given to her because “she’s a beginner and it’s never going to go there anyway.” She named it “ForgetIt.” When a mission actually aborted and her code executed, “I became an overnight expert.”

She quickly moved into systems software, managing the interfaces between all the flight software modules:

“The engineers would throw their requirements over the wall and just expect everything was gonna start working. Well, I started working on the area where you worry about how these different algorithms, once becoming software or even before, interface with each other.”


At its peak, Hamilton’s group comprised about 100 software engineers managing code from 300-400 “guests” — engineers from other disciplines who submitted code to be integrated into the flight software. Multiple missions ran concurrently, each with its own “rope mothers” (Assembly Control Supervisors) for the Command Module and Lunar Module.

Hamilton learned from her own mistakes. After putting a systems software change directly into the daily release and breaking everyone’s programs:

“Everybody was standing outside my door saying, ‘My program’s not working. What did you do?’ So I came up with this new thing. It was called offline version — until you were happy that your change worked in the offline version, you don’t put it into the main.”

She also discovered the “augekugel method” — a term the engineers used constantly that she was embarrassed to ask about, until she learned it simply meant “eyeballing” in German.


One of Hamilton’s most-told stories involves her daughter Lauren, who crashed a hardware simulation by selecting the pre-launch program (P01) during flight:

“She had selected the pre-launch program when it was in flight, which meant that two programs were sharing the same erasable. So I came back and told people about it… I kept saying, ‘We’ve gotta put a fix in there.’ And the powers that be… they didn’t want to put that in because they were worried about extra code, and the astronauts would ‘never, ever make a mistake.’”

Hamilton argued persistently but was overruled. She managed to get a program note added: “do not select P01 during flight.” On Apollo 8, an astronaut did exactly that — entering P01 instead of “star one” during navigation. The program note told them what had happened and how to recover.
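The interview does not give the eventual fix, but the check she argued for amounts to rejecting a phase-inappropriate program selection instead of relying on a program note. A hedged sketch with hypothetical names (MissionPhase, select_program), not actual AGC code:

```python
from enum import Enum

class MissionPhase(Enum):
    PRELAUNCH = 0
    FLIGHT = 1

def select_program(program: str, phase: MissionPhase) -> str:
    # P01 initializes prelaunch navigation; starting it in flight would
    # leave two programs sharing the same erasable memory.
    if program == "P01" and phase is MissionPhase.FLIGHT:
        raise ValueError("P01 cannot be selected during flight")
    return f"running {program}"

select_program("P01", MissionPhase.PRELAUNCH)  # fine on the pad
# select_program("P01", MissionPhase.FLIGHT)   # raises: the Lauren bug, prevented
```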

“I was in the SCAMA room and we were poring over the listings. I said, ‘It’s the Lauren bug. I know it’s the Lauren bug.’”

After Apollo 8, she was finally allowed to put the protective fix into the software.


Around 1966, Hamilton began worrying about a fundamental gap: there was no way to interrupt the astronauts’ displays during an emergency. The flight software was asynchronous, but the human-machine interface was synchronous.

She convened a meeting with hardware and software engineers. The hardware team said it couldn’t be done — the display hardware wasn’t left on throughout the mission. Hamilton asked: “Why can’t it be left on?” They came back two days later: “We’ve decided to leave the hardware on.”

The systems software team raised a harder objection: true parallelism between the astronaut and the software. Hamilton went home, solved it overnight, and returned with the “five-second display” concept — count to five before the astronaut responds, ensuring the priority display is seen.
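As a rough illustration of the concept (the names and mechanics here are hypothetical, not the AGC’s), an emergency display preempts whatever the crew is viewing, and a deliberate hold guarantees it is seen before any response is accepted:

```python
import time

def show(text: str) -> None:
    print(f"[DSKY] {text}")

def priority_display(current: str, alarm: str, hold_seconds: float = 5.0) -> str:
    """Interrupt the current display with a priority display."""
    show(alarm)               # asynchronous interrupt of the crew's display
    time.sleep(hold_seconds)  # "count to five": ensure the alarm is actually seen
    return current            # the crew may then respond or resume

priority_display(current="P63 landing display", alarm="1202 PRIORITY DISPLAY")
```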

On Apollo 11, just before landing:

“All of the sudden guess what comes up: 1201 and 1202 priority displays telling them there’s an emergency. This is just before they land… The astronaut knew that he had put the switch in a position that had caused extra stuff affecting the computer, and he realized, ‘Oh, yeah,’ and he put it back in the right place and they landed.”

Jack Garman in Mission Control made the “Go” call — he had written down all possible alarm codes at his boss’s request just days before, after the same alarms appeared during simulation.
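The 1201 and 1202 alarms signaled executive overload: the software restarted, shed lower-priority work, and kept the critical jobs running, which is why the “Go” call was safe. A simplified sketch of that load-shedding idea (the job list and capacity are invented, and this is not the AGC executive):

```python
import heapq

def run_cycle(jobs: list[tuple[int, str]], capacity: int) -> list[str]:
    """Schedule at most `capacity` jobs; lower priority number = more important."""
    scheduled = heapq.nsmallest(capacity, jobs)
    shed = [job for job in jobs if job not in scheduled]
    if shed:
        print("ALARM 1202: overload, shedding", [name for _, name in shed])
    return [name for _, name in scheduled]

jobs = [(1, "guidance"), (2, "priority displays"), (5, "rendezvous radar data")]
print(run_cycle(jobs, capacity=2))  # guidance and displays survive; radar work is shed
```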

“Going through my mind had nothing to do with the mission. It was like, oh, my god. My software — the priority displays, that was the part I had personally written.”


Hamilton coined the term “software engineering” during the Apollo program, using it to distinguish the discipline of building flight software from the hardware engineering around it:

“Why don’t we call this one the hardware engineering part and this the software engineering part?” The engineers thought it was funny: “There goes Margaret with her software engineering.”

A turning point came when a respected hardware engineer stood up in a meeting: “You know, Margaret’s right. This is engineering, what you people are doing, just as much as the stuff we’re doing.”

At a 2009 Apollo reunion, Davy Hoag approached Hamilton: “Margaret, why are they just talking about hardware stuff? Where’s the software stuff? They should be getting you up there to talk about software.”


After Apollo, Hamilton secured Air Force funding for a systematic study of all the errors found during the software’s lifecycle. The findings were surprising:

“We found out that 73 percent of the errors were what we called interface problems. And 44 percent were found by eyeballing.”

From these errors, Hamilton’s team derived six axioms of control — a formal mathematical theory. The axioms led to patterns, the patterns to a language, and the language to an automation system:

“We took those errors and came up with this set of axioms… came up with patterns, that if you use the axioms but the patterns which could be derived from them, the problems that we had found would not exist. There would be, for example, no interface problems.”

The key philosophical shift was from “after the fact” (find bugs, fix bugs) to “before the fact” (design systems where bugs cannot exist). Hamilton drew a medical analogy: “after the fact” is like treating disease without washing hands; “before the fact” is prevention.
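A small modern echo of “before the fact” (an illustration in Python, not Hamilton’s USL): encode the interface contract in the type system so a mismatch cannot be written at all, rather than found by testing afterward. The unit types here are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Meters:
    value: float

@dataclass(frozen=True)
class Feet:
    value: float

def to_meters(f: Feet) -> Meters:
    return Meters(f.value * 0.3048)

def altitude_ok(altitude: Meters) -> bool:
    return altitude.value > 30.0

# altitude_ok(Feet(100))                  # a static checker rejects this interface error
print(altitude_ok(to_meters(Feet(100))))  # the contract forces the conversion
```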


Higher Order Software and Hamilton Technologies


Hamilton left Draper in the mid-1970s and founded Higher Order Software, Inc. The company grew to over 100 people with customers including Citibank and Scott Paper. Ross Perot’s EDS nearly acquired them for $37-38 million, but the deal collapsed when General Motors bought EDS and halted all acquisitions.

Investors pushed the company away from its engineering roots toward IBM commercial products. Hamilton decided to leave:

“I decided it was time to go home… to leave that company and think about taking our technology and evolving it further. I decided I would just go to this part of Italy and become a waitress and forget about this entire world.”

Friends from DOD talked her out of it, provided funding, and Hamilton Technologies was born. The technology expanded from software-only to full systems, culminating in the Universal Systems Language (USL) and its 001 Tool Suite.

In a Department of Defense “shootout” competition with 80 observing organizations, Hamilton’s system was the only one that could go from requirements all the way through to automatically generated code: “The people that were observing it were saying, ‘Stop it. We believe you.’”


On role models: “My father, my grandfather, my grandmother who was a journalist, and Florence Long, the mathematician, Professor Lorenz.”

On advice to young people:

“Don’t be afraid to question things and don’t be afraid to ask so-called stupid questions.”

“Never say never, and never give up. Just because people say it’s never gonna work, you know, that doesn’t mean you have to give up.”

On solving impossible problems: Unable to do the somersault needed for Earlham’s phys ed requirement, Hamilton realized she could do one in water — “it’s learning to think of solving a problem. If you can’t solve it put it in a different place.”

On women in computing, Hamilton pushed for institutional change — getting MIT’s credit union to stop requiring a husband’s signature for women’s loans, fighting pay inequities. She notes the problem has evolved: “I think it’s tougher now in some ways because of things like the internet where bullying is easier and it’s hidden.”

On the future of software: “The before the fact paradigm has a big chance I think in the future of taking off… if you’re dealing with a paradigm or a language in an environment which can handle any kind of system, then maybe some of the problems in AI that could be there because you’re using earlier paradigms might speed up more.”