In 1970, then-University of Delaware professor Arthur Hoerl and his colleague, UD alumnus Robert “Bob” Kennard, developed ridge regression, a now world-famous statistical methodology. This innovative method has withstood the test of time, revolutionizing statistical modeling and paving the way for many machine learning methods popular today.

Fifty years later, Arthur Hoerl’s son Roger, the Donald Brate and Stanley Peschel Associate Professor of Statistics at Union College, shared the story behind the development of the now famous ridge regression during UD’s Lerner College’s Institute for Financial Services Analytics and the College of Engineering’s Department of Electrical and Computer Engineering’s Distinguished Speaker Series in March. More than 70 people, including former students and colleagues of Arthur Hoerl, as well as other members of his family, attended the recorded webinar.

“Ridge analysis was the first real innovation that my father developed,” said Roger Hoerl, who earned his Ph.D. in statistics from UD in 1983. “He later applied ridge analysis to the multicollinearity problem, which is how he and Bob Kennard actually came up with ridge regression.”

Multicollinearity is a common problem in analyzing observational data, particularly in chemical engineering, other engineering fields and machine learning. It occurs when two or more predictors in a regression are so highly related to one another that they do not provide unique, independent information to the regression. Traditional models then struggle to separate the effects of the predictors, and the resulting findings can be misleading or at odds with subject-matter knowledge.
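The effect is easy to reproduce. The sketch below (an illustration using numpy and synthetic data, not from the original papers) builds two nearly identical predictors; ordinary least squares then spreads the true effect between them almost arbitrarily, even though their combined effect stays well determined.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Two predictors that are almost identical (highly collinear).
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
X = np.column_stack([x1, x2])

# True relationship: y depends on the shared signal with coefficient 2.
y = 2 * x1 + rng.normal(scale=0.5, size=n)

# Ordinary least squares: the individual coefficients are unstable --
# only their sum is pinned down by the data.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_ols)        # individual values can land far from (2, 0)
print(beta_ols.sum())  # but the sum stays close to 2
```

The near-singular matrix XᵀX is what makes the individual coefficients so erratic; its condition number here is enormous, which is exactly the symptom Hoerl and Kennard set out to treat.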

Arthur Hoerl and Kennard first introduced ridge regression in their *Technometrics* papers “Ridge regression: Biased estimation for nonorthogonal problems” and “Ridge regression: Applications to nonorthogonal problems.” The papers were the result of 10 years of research into ridge analysis and ridge regression. Today, these articles are among *Technometrics’* most cited published research.

Roshan Joseph, editor of *Technometrics*, published a special edition celebrating the 50th anniversary of ridge regression in 2020, which features a historical perspective by Roger Hoerl.

“It was 50 years ago that Arthur Hoerl and Robert Kennard published their breakthrough articles on ridge regression in *Technometrics*,” Joseph wrote in the special edition. “I am not sure at that time if they had realized the enormous impact their article would make in the field of statistics. Today, in modern statistics, we cannot imagine a world without ridge and its successors such as lasso and other regularization techniques. They have become indispensable tools in the hands of statisticians and data scientists.”

Arthur Hoerl earned his B.S. in mechanical engineering at the University of Southern California (USC) in 1944. He was drafted during the Second World War and initially slated for what is now referred to as the “Battle of the Bulge,” but due to his engineering degree and scores on the Army math aptitude test, he reported instead to Los Alamos National Laboratory to work on the Manhattan Project. After the war, he worked as an engineer solving problems that involved data analysis. He earned an M.S. in math from USC in 1950 and was then hired by DuPont as its first statistician. He left after 17 years to take a tenure-track position teaching statistics at UD, where he served on the faculty from 1967 until he retired in 1986.

“It is clear to me that my father’s background, becoming an engineer first, and then working on the Manhattan Project, had a tremendous impact on how he viewed statistical problems, including the multicollinearity problem in regression,” Roger Hoerl said.

A Delaware native, Kennard also served in World War II and, due to his math proficiency, was assigned to the Signal Corps. His unit broke the Japanese “Purple” code and intercepted and decoded messages from the Japanese high command. After the war, he earned a bachelor’s degree in physics in 1949 and a master’s degree in statistics in 1952, both at UD, and a doctoral degree in mathematical statistics at Carnegie Mellon University. He came to DuPont five years after Arthur Hoerl, where the two met, and although they were never at UD at the same time, they kept in close contact.

“Bob maintained a lifelong interest in physics and science in general,” Roger Hoerl said. “He was, in some sense, a scientist first and a statistician second.”

According to Roger Hoerl, the Hoerl-Kennard team brought engineering, scientific-method and mathematical-statistics viewpoints to the problem of multicollinearity, and all three were needed to develop ridge regression.

“A big part of their motivation was solving a real problem, which was the fact that they were looking at regression models where a coefficient was negative when they knew from subject matter knowledge it had to be positive,” Roger Hoerl said. “They tended to be looking at chemical or chemical engineering data at DuPont. Frequently they would say, ‘No, it’s gotta be positive, but it keeps coming up negative.’ So that led them to dig into the multicollinearity problem.”

“Ridge regression beautifully resolves the balance between the accuracy and the stability and robustness of a regression,” said Bintong Chen, professor of operations management and director of the Institute for Financial Services Analytics at UD’s Lerner College. “This philosophy and approach have inspired many techniques to follow, including those commonly used in machine learning.”

A multitude of fields, including econometrics, chemistry and engineering, use ridge regression to estimate the coefficients of multiple-regression models in scenarios where independent variables are highly correlated. The Lerner College courses in statistical learning, machine learning/data mining, fintech and data science all incorporate ridge regression in their curricula.

“In a variety of fields—finance, genetics, epidemiology, economics and others—it can be difficult to sort out the influence of one cause from another,” said Paul Laux, professor of finance and JPMC Senior Fellow at UD’s Lerner College. “Ridge regression provides a way to do this, by insisting that a cause be statistically important enough to justify the complication it adds to a model.

“By proposing a specific, interpretable, and computable way to measure the idea of ‘important enough,’ ridge regression has enabled more stable and dependable prediction in all these areas,” Laux continued. “This technique, created by two Blue Hens, has become one of the go-to tools in the machine learning toolkit.”