While serving as director of the Defense Advanced Research Projects Agency (DARPA) from 1975 to 1977, George Heilmeier crafted a set of questions to help Agency officials evaluate and select research programs with the potential to generate field-changing innovations. Known as the “Heilmeier Catechism,” these questions have guided the selection of “high-risk, high-reward” programs for many Advanced Research Projects Agencies (ARPAs), and continue to serve as an effective framework for assessing innovative program ideas today.
The Heilmeier Catechism (HC) consists of eight simple questions:
- What are you trying to do? Articulate your objectives using absolutely no jargon.
- How is it done today, and what are the limits of current practice?
- What is new in your approach and why do you think it will be successful?
- Who cares? If you are successful, what difference will it make?
- What are the risks?
- How much will it cost?
- How long will it take?
- What are the midterm and final “exams” to check for success?
Since the HC was introduced in the 1970s, some have considered whether it needs updating. A few months ago, our team facilitated a discussion in our online forum for ARPA experts (Reach out if you want to join!) on potential additions or revisions to the HC, and on whether it should be modified at all. In this post, we summarize a few key takeaways from that discussion.
A Timeless, Flexible Framework
Many agree that the HC provides clear, succinct guidance that has withstood the test of time. Former DARPA program manager Vincent Sabio describes the HC as “timeless in its simplicity.”
While the HC endures as a framework, it must not be rigid in its application. Joshua Elliott, a fellow at the Federation of American Scientists and former DARPA program manager, views the HC as a “dynamic tool, not a static product.” He explains, “Framing and thinking aids like the HC will never be perfectly suited to capturing every consideration for every type of program, problem or domain. But that’s not the point. The point is to get you thinking and to force you to iterate and edit your thoughts.” Similarly, Steven Buchsbaum, a former DARPA program manager and key leader in launching the Homeland Security Advanced Research Projects Agency (HSARPA), sees the HC as a “useful conversation starter” that inspires further discussions rather than a doctrine.
“Behind the use of Heilmeier’s questions is a philosophical mindset, and the mindset is much more important than the specific questions,” said Bob Douglass, the CTO and co-founder of Alta Montes, LLC and a former program manager and assistant office director at DARPA.
Drawing from his extensive experience at DARPA, Douglass shared that the HC is not an “‘eight commandments’ of sacred questions” but a useful first filter for evaluating research programs at the agency. Once a program passes the HC test, leaders can move on to further considerations for their specific field. In the field of defense, for example, this could include legal, ethical, safety, privacy, diplomatic, and public perception considerations.
What Might Some Modifications Look Like?
In keeping with the dynamic nature of the HC, several ARPAs have made modifications that reflect their specific mission and the population they serve. The fields of intelligence, public health, and education offer examples.
In the intelligence sector, the potential misuse and misperception of program outcomes are a prime consideration. While serving as director at the Intelligence Advanced Research Projects Activity (IARPA), Jason Matheny added to the HC, “How might this program be misperceived or misused (and how can we prevent that from happening)?” This addition has also been adopted by the Advanced Research Projects Agency – Health (ARPA-H).
This new question invites considerations such as: How might this program be misperceived by the public or foreign governments? How might its outcomes or products be used for malicious purposes? What are some unintended harmful consequences of this program? And importantly, how can these risks be mitigated?
In recent decades, equity has become a greater focus in program creation and scaling. That could mean anything from selecting a research sample that is representative of the general population, to testing and piloting an innovation with users from diverse communities, to ensuring easy and affordable access for all.
Specifically, ARPA-H has incorporated this consideration into its HC by posing this question: “To ensure equitable access for all people, how will cost, accessibility, and user experience be addressed?”
With the unique challenges and complexities of the U.S. education system, the HC should be modified when applied to education R&D, according to Russell Shilling, an innovation advisor and former DARPA program officer.
As an early advisor for the Advanced Education Research and Development Fund (AERDF), an education R&D nonprofit modeled after DARPA, Shilling is experienced in undertaking high-risk, high-reward projects in education. In his work, he has found two additions to the HC particularly useful:
- How does this solution scale while meeting the diversity and inclusion needs of a broad range of students?
- What are the ethical risks, including equity, privacy, and other sensitive issues?
Compared to many other fields, education is characterized by the diversity of students and their communities, as well as the need to safeguard student privacy. These traits necessitate extra attention in selecting and funding research programs.
Possible Cross-Sector Additions to the HC
Apart from sector-specific questions, a few experts also proposed cross-cutting questions that could be added to the HC.
Sabio noted the need to articulate the problem a research program is trying to address. He proposed adding: “What problem are you trying to solve? State the problem clearly and succinctly.” He explained that although the first HC question – “What are you trying to do? Articulate your objectives using absolutely no jargon.” – centers on program goals, it does not call for an explicit statement of the perceived problem. As a result, programs could drift off course and achieve outcomes that do not solve the specific problem – jeopardizing downstream adoption and application.
Another helpful question could be “What outcome(s) do you hope to achieve with your endeavor?” Sanj Sivapiragasam, who has worked on DARPA projects, explained how this is a new spin on “What problem are you trying to solve?” It prompts leaders to consider program impact as not solely about solving a specific problem but being driven by user needs and open to “enhancing something, or creating a capability to do something that humanity was unable to do before.”
Reflections That Embody The ARPA Spirit
A few of the reflections shared in the online discussion are windows into what makes ARPAs unique.
Adam Russell, former acting deputy director at ARPA-H and former DARPA program manager, drew a connection between the flexibility that is core to how ARPAs operate and the flexibility that should be applied to the HC. Russell argued, “Since the success of an ARPA model in any mission area depends on preserving as many degrees of freedom as possible for the organization, the Program Managers, support staff, and Directors, I think the ability for different ARPAs to add elucidating questions should be seen as another degree of freedom that should be embraced.” Russell noted that, once the HC is put into place (tailored or not), it should be adhered to in a principled way. He calls this being “catechommitted” and writes about it here.
Christian Macedonia, former DARPA program manager, suggested adding a question that embodies the non-conformist thinking that makes ARPAs so effective. Macedonia offered, “Is your program amenable to the Heilmeier framework or should it be considered exempt? If exempt, please explain.” This question makes room for programs that may not fit the mold of the HC but could still be worth pursuing. Macedonia asked, “Think of where we would be if people were never given the opportunity to think outside the old order?”