Learning from History

Design has a long and rich history. Humanity has been creating and designing tools, products, and systems for people since before we were able to record it.
Design as we have come to think of it started to take hold in the early 1900s and has grown by leaps and bounds since then. Many people, systems, and events have shaped the practice and helped it mature.

For example, most in the field would probably point to Xerox PARC as a seminal moment for user experience design. Many great products and concepts emanated from the work done there, and they helped shape some of the most well-known companies that exist today.

If your background is in Human Factors, you are probably familiar with the research coming out of the US Army (before the Air Force existed as a separate branch) as a pivotal moment in the history of design: researchers started figuring out that producing successful pilots was more a design issue than a personnel selection and training issue.

And with this broad history, there are a lot of lessons to be learned that all designers should carry forward. Yet in many cases it feels like we haven’t learned from our history at all. All too often it seems that we as an industry have failed to capture, teach, and learn the fundamental lessons that give design the potential we all talk about.

We as an industry have failed to capture, teach, and learn the fundamental lessons that give design the potential we all talk about.

I want to take some time here to highlight some key moments, groups, and people that are not commonly talked about in the history of design. These (and many others left unnamed) have provided valuable lessons that all designers could use, and they deserve more attention.

Three Mile Island Incident

In March of 1979, the Three Mile Island Nuclear Facility in Pennsylvania experienced an incident in one of its reactors (Unit 2). After a series of events, a valve became stuck open while its sensor reported that it had closed. As operations continued, some of the reactor coolant escaped. Because the indicator showed the valve as closed, operators were unable to identify the issue and took actions that made the problem worse — they could not form a good mental model of the state of the plant. It took several hours, and a fresh shift of operators, to assess the situation and resolve the issue.

Beyond instilling public fear of nuclear power as a viable energy source, the incident had a number of implications. A key outcome that affects us as designers is that it “stimulated an international and multidisciplinary process of inquiry about how complex systems fail and about cognitive work of operators handling evolving emergencies” (from David D. Woods’s chapter ‘On the Origins of Cognitive Systems Engineering: Personal Reflections’ in the book ‘Cognitive Systems Engineering: A Future for a Changing World’). Governments and industries began to better recognize that how people think about a domain or problem space affects how they work in that space. From this incident, combined with a thorough examination of system design practices and the early stages of AI development at large, the entire field of Cognitive Systems Engineering was born.