
Oppenheimer, Ulam and Risk Analytics: The Legacy of WWII Scientists on Contemporary Computing

Sherry Fowler, Poole College professor of practice in information technology and business analytics, explores the connection between the earliest development of atomic weapons and modern-day data analytics.

Have you seen the movie Oppenheimer, released this summer? It is based on the true story of Dr. Robert Oppenheimer, the American theoretical physicist and scientific director of the secret Los Alamos Laboratory in New Mexico during World War II. This movie has sparked renewed interest in WWII and the role technology played in its outcome. The accounts of scientific and technological efforts like the Manhattan Project (the WWII endeavor of the United States to create an atomic bomb at the Los Alamos Laboratory) are once again on the big screen.

As compelling as Oppenheimer’s role was in the Oppenheimer film, several other scientists responsible for the development of the atomic bomb and the scientific computations behind it are not depicted in the movie. Still, they have a lasting legacy that extends beyond science and mathematics into other fields… including business.

Stan Ulam’s invention of the Monte Carlo method

One of those scientists was Dr. Stanisław Ulam, a Polish-American mathematician and nuclear physicist. Over his lifetime, he made notable contributions, including discovering cellular automata, creating a new design for thermonuclear weapons and advocating nuclear pulse propulsion. Interestingly, however, Ulam is perhaps best known for a chance realization, and the innovative approach that grew out of it, demonstrating the practical ability of computers to apply statistical methods to problems without known solutions.

This analytic approach, known as the Monte Carlo Simulation method, was central to the simulations required in the Manhattan Project. 

The Monte Carlo method is a standard and ubiquitous approach still used in many disciplines today, including physics and engineering. It is taught in many business schools because it allows the modeling of phenomena with significant uncertainty in multiple inputs to calculate risk. It is part of a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results and is categorized as a prescriptive data analytics approach. 

Prescriptive analytic methods indicate not only what could happen in the future as a result of a decision, but also what should be done to bring about a desired outcome.

What’s in a name?

The Monte Carlo method was secretly code-named after the Monte Carlo Casino in Monaco, where Ulam’s uncle liked to gamble. One of Ulam’s colleagues, Nicholas Metropolis (a Greek-American physicist recruited to work at Los Alamos by Oppenheimer), dubbed it the Monte Carlo method, and the name stuck.

Even more intriguing, the idea behind the method came from the card game solitaire. Here’s how …

The backstory

The scientists at Los Alamos were concurrently developing two bombs ─ the gun-type fission weapon and the more complicated implosion-type nuclear weapon, which used chemical explosives to compress fissile material, triggering a nuclear chain reaction and an enormous release of energy. John von Neumann, the famous Hungarian-American mathematician, physicist and early digital computer developer, had joined the Manhattan Project to work on the calculations needed to build an atomic bomb. He invited Ulam to come to the U.S., and by November 1943, Ulam too had been asked to join the Manhattan Project.

In early 1944, Oppenheimer had a decision to make that ultimately involved both von Neumann and Ulam. He decided to reorganize the Los Alamos Laboratory to focus solely on building an implosion-type weapon instead of the gun-type plutonium weapon (code-named “Thin Man”). Ulam helped design the explosive lens configurations that would produce a spherical implosion, improving its speed, and worked out the difficult hydrodynamic calculations needed to minimize asymmetries.

Working with primitive computing facilities, Ulam and von Neumann developed this design, which was ultimately used in the “Trinity” test bomb detonated for the Manhattan Project on July 16, 1945, and in the “Fat Man” plutonium implosion-type weapon detonated over Nagasaki, Japan on August 9, 1945, prompting Japan’s surrender six days later and ending WWII. The design also motivated Ulam and von Neumann to advocate for powerful computational capabilities at Los Alamos, which proved integral to the atomic bomb effort.

It’s all in how you play the game

After the war ended in 1945, Ulam left Los Alamos and became an associate professor at the University of Southern California in Los Angeles. In January 1946, however, he suffered an acute, life-threatening attack of encephalitis, which was relieved by emergency brain surgery. During his recuperation, Ulam began to “apply” mathematics to other subjects, like physics and biology, and he played solitaire to pass the time. He thought about playing hundreds of solitaire games to estimate, statistically, the probability of a successful outcome. Specifically, he asked what the chances were that a game of Canfield solitaire, laid out with 52 cards, would come out successfully.

At first, he tried to estimate each outcome with various combinations of calculations, but then he decided to think more abstractly and simply play the game one hundred times, counting the number of successful plays. He realized that the more games he played, the more accurately he could estimate the true probability of winning, without ever calculating it directly. The method seemed to work, and he would later use it in his calculations on thermonuclear weapons.
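
To see the idea in miniature, here is a small, hedged sketch in Python. It does not implement Canfield solitaire itself (the rules would take far more code); as a stand-in, each “game” shuffles a deck and counts as a win if no card lands back in its original position. The point is the estimate-by-counting approach Ulam described: run many random trials, tally the successes, and let the win fraction approximate the true probability.

```python
import random

def stand_in_game_wins(num_cards: int = 52) -> bool:
    """One toy 'game': shuffle a deck and call it a win if no card ends up
    in its original position. This is a stand-in for a single solitaire deal,
    chosen only because its rules fit in a few lines."""
    deck = list(range(num_cards))
    random.shuffle(deck)
    return all(card != position for position, card in enumerate(deck))

def estimate_win_probability(num_games: int) -> float:
    """Ulam's counting idea: play many random games and return the win fraction,
    which approaches the true probability as num_games grows."""
    wins = sum(stand_in_game_wins() for _ in range(num_games))
    return wins / num_games

if __name__ == "__main__":
    for n in (100, 1_000, 10_000):
        print(f"{n:>6} games -> estimated win probability {estimate_win_probability(n):.4f}")
```

For this particular stand-in game the estimate settles near 1/e (about 36.8%) as the number of trials grows; the same counting logic applies to any game whose exact probability is too hard to work out directly.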

The approach goes nuclear

When Ulam had mostly recovered from his illness, he attended a secret conference with von Neumann and others at Los Alamos to discuss thermonuclear weapons with Edward Teller, a Hungarian-American theoretical physicist known for his scientific ability and volatile disposition. 

Teller had been Ulam’s first boss at Los Alamos and had assigned him to work on the design of a hypothetical “Super” weapon based on nuclear fusion rather than a practical fission bomb. Soon, Ulam and Teller disagreed on the efficacy of Teller’s calculations, leading Ulam to move to another group. Later, at the conference, Ulam noticed that Teller’s focus was still on developing this “Super” weapon. However, it was hard for Teller to progress past the ideation stage because the calculations were exceedingly difficult, and there was no way to run small tests.

Eventually, the conference group decided to pursue Teller’s ideas, and a new position at Los Alamos was offered to Ulam. He accepted in 1946 and began to work on a thermonuclear weapon.

Using the Monte Carlo method in a game with higher stakes

Ulam and the scientists at Los Alamos eventually determined that Teller’s original “Super” design would not produce the necessary megaton yield, so Ulam and Teller instead began to collaborate on the Teller-Ulam design, which became the basis for all thermonuclear weapons.

Interestingly, Ulam’s experience playing cards during his recuperation led him to a solution for a complex problem that baffled the physicists at Los Alamos ─ how neutrons diffuse through the core of a nuclear weapon. Though the scientists knew the average distance a neutron would travel in a substance before colliding with an atomic nucleus, and the amount of energy the neutron was likely to emit after a collision, they could not solve the problem using deterministic mathematical methods.

Ulam proposed using random experiments instead, inventing the modern version of the Markov Chain Monte Carlo method. Ulam and co-author Metropolis later published the first unclassified paper on the Monte Carlo method in 1949. This approach later led to one of the first analog computers, known as the Fermi trolley (later renamed the FERMIAC), which performed a mechanical simulation of random diffusion of neutrons. Eventually, as computers improved in speed and programmability, these methods became even more useful.
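
The toy sketch below, written in Python with entirely made-up parameter values, hints at what such a random-experiment calculation looks like. It is a one-dimensional caricature, not the actual Los Alamos computation: each simulated neutron takes exponentially distributed flight steps through a slab of material, scatters or is absorbed at each collision, and the fraction reaching the far side is tallied over many trials.

```python
import random

# Illustrative values only; these are NOT real material properties.
MEAN_FREE_PATH = 1.0          # average distance between collisions (arbitrary units)
SLAB_THICKNESS = 5.0          # thickness of the material (same units)
ABSORPTION_PROBABILITY = 0.3  # chance a collision absorbs the neutron

def neutron_escapes() -> bool:
    """Follow one neutron through a 1-D slab: sample an exponential flight
    distance, then either escape, scatter in a random direction, or be absorbed."""
    position = 0.0
    direction = 1.0  # start by heading into the slab
    while True:
        position += direction * random.expovariate(1.0 / MEAN_FREE_PATH)
        if position >= SLAB_THICKNESS:
            return True    # passed through the far side
        if position <= 0.0:
            return False   # scattered back out the near side
        if random.random() < ABSORPTION_PROBABILITY:
            return False   # absorbed at the collision site
        direction = random.choice([-1.0, 1.0])  # crude isotropic scatter in 1-D

def estimate_transmission(num_neutrons: int = 50_000) -> float:
    """The Monte Carlo estimate: the fraction of simulated neutrons that escape."""
    escaped = sum(neutron_escapes() for _ in range(num_neutrons))
    return escaped / num_neutrons

if __name__ == "__main__":
    print(f"Estimated transmission fraction: {estimate_transmission():.4f}")
```

Each per-collision rule in the loop is simple and known in advance; it is the repetition over tens of thousands of random histories that turns those rules into a usable numerical answer, which was exactly the trade the Los Alamos scientists needed.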

The gamble and the wildcard solution

In early 1950, U.S. President Harry Truman ordered a program to develop a thermonuclear “super” weapon, after learning that the Soviet Union had tested its first fission bomb, built by duplicating the U.S. design for the “Fat Man” bomb. This new “super” weapon would release far more energy than an atomic bomb.

Ulam was asked to review the results of calculations directed toward the development of the bomb (from the Teller-Ulam design). Ulam had the breakthrough idea, and he and Teller developed it into the first workable design for this megaton-range hydrogen bomb, first proposed in a classified scientific paper on March 9, 1951. The “supercompression” idea was to separate the fission component of the weapon from the fusion component and use the radiation produced by the fission bomb to compress the fusion fuel and then ignite it. The idea worked and, importantly, was implemented with the assistance of a digital computing device. Interestingly, Oppenheimer was morally opposed to building the hydrogen bomb, but he considered the atomic bomb “technically sweet.”

The connection to the first digital computers

Ulam realized the importance of digital computing devices. He remembered that von Neumann had sponsored Metropolis in 1945 to run calculations on the ENIAC (Electronic Numerical Integrator and Computer), the first programmable, electronic, general-purpose digital computer. In 1951, Metropolis led a team of scientists to construct the MANIAC (Mathematical and Numerical Integrator and Calculator), which was smaller than the ENIAC but able to store programs. Teller and Ulam employed this digital device, built on a von Neumann computer architecture, to perform the engineering calculations for the hydrogen bomb. The first test of the resulting full-scale thermonuclear device (“Ivy Mike”) on November 1, 1952, was successful.

How the method helps companies analyze risk today

Today, the work of Ulam lives on. Monte Carlo calculations now run on everything from massively parallel supercomputers to personal computers, especially in business risk simulations, because they deliver fast and reasonably accurate results. These calculations have become popular in many disciplines and across a wide variety of industries and governmental organizations.

Monte Carlo risk simulation is featured in proprietary software such as SAS JMP, @Risk, Analytic Solver Platform and Crystal Ball, as well as open-source apps such as Cassandra Monte Carlo Software. Such software helps organizations analyze risk with the Monte Carlo method by providing two outputs for a modeled function: the possible outcomes of a calculation and the likelihood of each outcome occurring, both of which are important in determining and managing the risk associated with a business decision.

Instead of predicting a single set of fixed values, a Monte Carlo simulation predicts a range of outcomes by assigning a probability distribution to each uncertain input. It recalculates the result repeatedly (often thousands of times), each time drawing a different set of random values from those distributions (between each input’s minimum and maximum) to build up a picture of the likely outcomes.

For example, in a business situation where product price, demand and cost are all uncertain for an upcoming quarter, a Monte Carlo simulation could predict a quarterly profit (or loss) for the product and the likelihood of obtaining that profit (or loss). By using the appropriate probability distribution for each uncertain input and sampling thousands of values within each range for price, demand and cost, the simulation produces the expected value of the profit and the likelihood (e.g., 90%) of achieving it. This approach leads to better decision-making than a simple sensitivity analysis, which can only vary one or two inputs deterministically.
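
A minimal sketch of that kind of calculation, in Python, appears below. The distributions and numbers are hypothetical placeholders (a triangular price, a uniform unit cost, a normally distributed demand and a fixed overhead), not the output of any of the packages named above, but the structure is the same: draw each uncertain input from its distribution, compute profit, repeat thousands of times, then read off the expected value and the probabilities of the outcomes you care about.

```python
import random
import statistics

NUM_TRIALS = 10_000  # number of simulated quarters

def simulate_quarterly_profit() -> float:
    """One trial: draw the uncertain inputs from assumed (hypothetical)
    distributions, then compute profit = (price - unit cost) * demand - fixed costs."""
    price = random.triangular(low=8.0, high=12.0, mode=10.0)   # selling price per unit ($)
    unit_cost = random.uniform(5.0, 7.0)                       # cost per unit ($)
    demand = max(random.normalvariate(mu=10_000, sigma=2_000), 0.0)  # units sold, never negative
    fixed_costs = 25_000.0                                     # overhead for the quarter ($)
    return (price - unit_cost) * demand - fixed_costs

profits = [simulate_quarterly_profit() for _ in range(NUM_TRIALS)]

expected_profit = statistics.mean(profits)
prob_of_loss = sum(p < 0 for p in profits) / NUM_TRIALS
tenth_percentile = sorted(profits)[int(0.10 * NUM_TRIALS)]  # roughly 90% of trials exceed this

print(f"Expected quarterly profit: ${expected_profit:,.0f}")
print(f"Probability of a loss:     {prob_of_loss:.1%}")
print(f"90% chance profit exceeds: ${tenth_percentile:,.0f}")
```

Commercial tools typically present the same information as a histogram of simulated profit with percentile markers, but the underlying loop is little more than this.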

The rest of the story

Ulam’s work with the Monte Carlo method and the hydrogen bomb is depicted in the movie Adventures of a Mathematician, based on the 1983 book of the same name. However, the stories of other scientists from this era also deserve to be told. The Imitation Game recounts the story of Dr. Alan Turing, whose codebreaking work helped decide the outcome of WWII against Nazi Germany and whose Turing Machine was a precursor to modern computing, AI and the Turing Test. Another worthy film is Top Secret ‘Rosies’: The Female ‘Computers’ of WWII, a documentary telling the story of the female mathematicians who became human computers for the Army during WWII.

Moving forward, perhaps films will be released about other famous computational scientists from that era whose legacies survive them. Interestingly, if we used the Monte Carlo method to estimate the likelihood of that happening, there would probably be little to no risk.