The Ad Hoc
Research Thinking Field Guide

Written by:

Alex Mack

This field guide is for digital services and technology leaders working at the federal, state, or local government level. It describes a new, advanced way of applying research approaches to strategic decision making across digital services.

Research is more than UX

Most U.S. government agencies do not yet use research to its full potential.

Of course, research is invaluable for informing user experience (UX) design, usability testing, and decisions about which features to build next. Forms and features that are tested deliver better digital experiences for constituents. This application of research is inherently valuable. Yet research’s biggest strategic contributions go beyond a “check the box” approach applied after the fact.

The real value of research lies in its ability to inform better decisions at every level, from how to lay out a dashboard to what priorities should drive long-term resource planning in digital services. Research enables you to clearly describe wider issues and identify potential solutions in a way that can be tested. It allows you to determine appropriate metrics with which to measure success, and to steer a course ever closer to that success. Research — or more accurately, the approach that we at Ad Hoc call Research Thinking — drives far better outcomes.

In the rest of this guide, you’ll learn:

  • What Research Thinking (ReThink) is and its core principles
  • The Research Thinking process, step by step
  • Common mistakes to avoid in agency situations
  • Specific guidance on applying Research Thinking to strategy, customer experience (CX), and risk reduction

At Ad Hoc, we believe user research can drive dramatically better decisions — if the research is applied with systematic, strategic thinking.

Agencies can see not just improved customer experience, but also decreased risk and an increased ability to deliver useful services quickly, at no additional cost.

Research Thinking turns research into outcomes for the direct benefit of the agency and its mission.

A tale of two approaches

Juan Doe is the new VP of technology at the Generic Federal Agency. The agency’s mission is to provide specific benefits for low-income citizens who qualify, and Juan believes deeply in that mission. His mother received these benefits at a critical time in her life, and they helped her build a better life for her family, including Juan.

The GFA’s benefits are truly life changing for recipients, but qualifying is a difficult process for both beneficiaries and the hardworking civil servants at the agency. The agency leadership knows that its systems need to modernize and has found the funds. Juan is in charge of digitizing a key qualification form and making it work with the new systems.

The goal for this project is to decrease the agency’s application backlog and make the application process easier for beneficiaries and employees alike.

Here our story diverges into two paths.

The conventional path

In the first path, the agency takes a conventional approach. Juan and his team predetermine the requirements of the new system down to the smallest detail, and request proposals accordingly. They select the vendor largely based on price, but Juan wants to do the project properly. He ensures that the vendor selected has staffed a visual designer to mock up the form and has included usability testing on the form as part of their proposal.

A year later, Juan and his team have the final form integrated into the agency’s new system and have seen the impact. The form is exactly what they asked for, but it hasn’t led to the outcomes he had hoped for. The backlog shrinks by barely ten percent, leaving applicants still waiting hundreds of days for a decision, and agency employees still rely on painstaking workarounds to make the process function. Juan got the job done, but he feels unsettled about the results.

What if he had taken another path?

The Research Thinking path

In the second path, Juan and his team select a vendor based on both price and a proposal focused on achieving better outcomes for the agency. The vendor’s process will take the same amount of time as their competitors’, but that time will be spent differently. They will spend the first few weeks in discovery, determining what they need to accomplish for beneficiaries, employees, and the agency, along with success criteria and metrics. They will describe their assumptions and identify what information will be needed. Then they will hit the ground running, doing detailed research on the agency’s existing systems and their impacts on people.

Together, the teams will then finalize a roadmap to reach the agency’s big-picture goals for the application in manageable steps. With this plan in place, the form migration will end up taking a fundamentally different technical approach than Juan had originally planned, but he feels profoundly good about the path. Even better, the form migration will be completed with better outcomes for constituents, in less time, and with the ability to adapt the technology as the agency and the public’s needs change.

A year later, Juan’s team continues to work diligently with the vendor to iterate toward the big picture on the roadmap. The redesigned (and legally compliant) form has been running seamlessly in the agency’s employee system for five months.

Based on the Research Thinking work, the team has also enabled applicants to track the status of their applications in a simple online portal, and continues to deliver new features that make an impact. Hardworking civil servants have stopped Juan in the halls to tell him how much easier their jobs have gotten. And countless beneficiaries have received funds for the first time.

The agency is featured in a major newspaper, and its turnaround is cited as a success by the White House. Juan’s team is making progress, bit by bit, to deliver the customer experience they’ve always hoped for.

What is Research Thinking?

At its core, Research Thinking is about learning strategically. ReThink considers what information is needed to inform a particular decision or decisions, and determines how to gather that data well.

ReThink includes standard research activities, such as writing interview guides, recruiting participants, interviewing people, developing surveys, and analyzing results. But, the approach does not do research for its own sake; it places those activities in the context of an agency’s larger goals. ReThink then goes further, seeking to understand the context of the people, policies, systems, and the overall environment involved, so that a decision can be made from a truly informed place.

Research done under a ReThink approach is not overly broad or unfocused; it does not go on forever, but always has a specific and actionable goal. Research is tightly connected to the questions the team needs answered in order to decide and take action. As a result, ReThink is cost effective, and has an outsized impact on positive decisions with good outcomes.

Focused on people. The Research Thinking process is driven by human-centered design principles, considering the needs of all of the humans in the system, from constituents to stakeholders and front-line civil servants. As a result, decisions made from a ReThink approach are based on reality and equity, and tend to lead to positive customer and employee experience, as well as stakeholder satisfaction.

Deeply knowledgeable about systems. In addition to traditional research activities, Research Thinking can include a wide range of additional information-gathering tasks and data analysis steps. These strategic activities build the broader context of policy, systems, and the overall environment in which products are delivered and used.

Useful for a variety of decisions. The types of decisions that Research Thinking can and should inform are broad. They include not just design and user experience decisions, but also decisions around what services to create, how to create them, and the priorities and roadmaps for developing them over time. Agencies wanting to make data-driven decisions will find Research Thinking approaches invaluable.

The central questions of Research Thinking are:

  • What decision do we need to make?
  • What do we need to know in order to make this decision?
  • Do we already have the right data to make this decision? If not, how do we gather that data?
  • What are our assumptions? Are they true in practice?
  • How can we use our resources to make a bigger impact in this situation?
  • How will this decision affect all of the people involved?

Notice that ReThink considers the big-picture questions about each decision, and ideally, the entire context the decision sits within. This is very different from the kind of user research that gets slotted in where convenient, often late in the process. That kind of research can be helpful, but Research Thinking goes much further, to think about the big picture: what we are building, why we are doing it, and what impacts it will have.

What you will need to do Research Thinking well

Research Thinking is an approach that can be used by professionals from many backgrounds. Many of its skills — such as framing decisions, determining outcomes, questioning assumptions, asking better questions, and applying data to inform decisions — can be applied by anyone.

However, we recommend that an experienced research practitioner lead the research and analysis. This ensures that the methods chosen are appropriate, the research is conducted ethically, and the analysis is thorough and well-connected to the decision at hand. If you do not have an advanced research skill set on your team already, you’ll need to add it, and work closely with your practitioner(s) throughout the project.


Principles of Research Thinking

Here are the most important philosophical pillars of a Research Thinking approach.

  1. Own assumptions.
  2. Look beyond just the humans.
  3. Think in outcomes, not artifacts.
  4. Remain adaptable.

Own assumptions

Most projects begin with a set of assumptions — this is a natural and necessary human tendency. To start planning, we all must rely on existing knowledge and best guesses. However, assumptions can be dangerous, and unspoken assumptions often lead to future problems and wasted work. To ensure that ReThink works as planned, everyone involved must be willing to identify and be honest about their assumptions from the beginning of the project and throughout it.

Identifying assumptions does not, of course, mean you have to explicitly research or test every one. The practice itself is still valuable. You acknowledge what you don’t yet know. You create a clear view of your existing knowledge so that you can prioritize future research. You also explicitly choose what work will not be done as part of the project, freeing resources to go where they are most helpful.

Identifying and owning assumptions is a key practice in ReThink. Even when you do not explicitly research an assumption, an identified assumption may be proven wrong. If that happens, you then have an opportunity to make a better decision based on the new information.

Look beyond just the humans

Like human-centered design (HCD), Research Thinking prioritizes understanding, designing for, and making decisions that serve the experiences of all of the people involved. However, Research Thinking goes further than HCD. ReThink asks questions to uncover the values, goals, and constraints that shape people’s behavior: their why.

Surface-level research might tell you that people are frustrated by long wait times and a lack of transparency when they apply for a benefit. This information tells you what the problem is, but it does not help you understand the factors that create it, nor how to begin to address it. ReThink will seek to understand those realities too. This means researchers may need to dig into what agency employees are doing, what systems are collecting and storing information, and even what processes have been put into place to ensure policy is being followed. Part of the skill of an advanced researcher is to decide how and when research on a topic is “done,” to fully answer the important question at hand, and no more.


The ReThink approach addresses: (1) the experience of all of the humans, but also (2) the full scope of an environment, to understand the broader context in which a service is delivered.

Think in outcomes, not artifacts

It is surprising how many contracts still prioritize artifacts over outcomes. While artifacts like service blueprints, personas, and customer journeys can be helpful tools for building understanding, by themselves they do not accomplish any outcomes. They do nothing inherently to advance an agency’s mission, serve its employees and stakeholders, uphold policy, or deliver services to people. The “thing” is not the result.

Research Thinking, on the other hand, starts with outcomes, and only adds artifacts if they will help those outcomes happen. That means research is not done simply to create personas, journey maps, or even user stories. Research is instead framed from the beginning with the end goals in mind. Work is tightly connected to the ultimate decisions or actions that must be taken, and the resulting insights apply directly to the agency’s goals.

This approach is more difficult in practice than it sounds. In fact, an outcome focus can feel emotionally unsatisfying to people used to a tangible “thing” that proves work has been done. ReThink does not produce as many of these as traditional “do-first” approaches, but the ones it does produce are far more useful.

Remain adaptable

A key aspect of Research Thinking is the willingness to learn and adapt along the way.

The ReThink approach holds knowledge lightly; research always brings up new information which will change the conceptualization of a product or service. This is normal, and productive. In fact, the most effective research is often done in concert with other technical experts who work iteratively. Information will spark testing, which will lead to new needs for information. Discovery should not be the end; there is always something new and useful to learn.

More than anything else, Research Thinking requires flexibility in approach and methods. A skilled research practitioner will match methodologies to what information is required for a decision or action, but this is not enough on its own. Often, the “best” methods are not possible, so reaching outcomes requires creativity and flexibility. “Best” must always balance time, resources, and priorities, and as such, non-research practitioners can provide valuable input and guidance.

By focusing on outcomes and remaining flexible, it’s possible to change course as many times as needed to effectively reach the goal.


Research Thinking provides the most value when brought in as early in the product journey as possible; it will provide a framework to focus and inform major decisions at that stage. However, it’s never too late to use a ReThink lens, and no decision is ever too small. There is always something relevant to learn.

If the entire Research Thinking approach feels too big, or too expensive at any point, step into it instead in bits and pieces. Even if the decision is as small as which form layout to use, approaching that decision from a ReThink framework can provide important benefits. The more you use Research Thinking, the more it will feel natural, and the better your results will get.


The Research Thinking process

Note that while ReThink is a roughly linear process, it can look messy in practice. There may be different lines of research starting at different points in the process and overlapping. This is normal, and to be expected.

Apply the process to your situation in the way that best fits your needs.

  1. Find your purpose and outcomes for the project.
  2. What big-picture decisions must be made?
  3. Determine what you don’t yet know.
  4. After you describe, prioritize.
  5. Research strategically.
  6. Translate data into action.
  7. Make the decision(s).

1. Find your purpose and outcomes for the project

Because Research Thinking values the big picture first, the process starts by understanding what the big picture is. That often means creating a dialogue among users, stakeholders, and the organization. Understanding what people want to achieve is important, since it can surface opportunities to add value or to reach goals by different means.

ReThink begins by finding the answers to three questions.

  • What is your purpose for the project? (In other words, why are you beginning this initiative now?)
  • What outcomes do you want to achieve from it?
  • What constraints are you working within?

For large-scale projects or decisions, determining these answers can take several complex discussions with many stakeholders! For small ones, such as which layout is easier for users to read, these answers can take less than five minutes with pen and paper. Either way, this step cannot be skipped; we find that a little time spent thinking critically can avoid weeks or months of wasted effort down the road.

Normally you will begin a new initiative with one of the first two questions partially answered; be sure you have full answers to all three questions before you move on.

Do not skip the conversation about constraints. While limitations such as policy, time, funding, and even strong stakeholder opinions may at times feel frustrating, they ultimately push you to creatively work through recommendations and solutions. Beginning this conversation early, in a collaborative fashion, makes sure as many players in the ecosystem as possible are invested in, and bought into, the approach or solution that results.


Defining outcomes

See the earlier “Principles of Research Thinking” section for important information about how ReThink approaches outcomes. These should always ultimately be framed in terms of every human touched by the project, as well as the processes and systems involved. The solution must work for the organization as a whole, the end users, stakeholders, and employees, if it will be sustainable long-term.

2. What big-picture decisions must be made for the project or program?

Moving from goals and outcomes to big-picture, concrete decisions seems like a straightforward step. In practice, it is often one of the hardest parts of the process. Yet, it is well worth the extra thought; research cannot improve the quality of decisions if the decisions are not made explicit.

Let’s take a project where the purpose is to “improve the physical and mental health of seniors.”

  • What mission-critical strategic decisions must be made next?
    • For example, you may need to choose between in-person interventions or digital ones. Which is likely to have more impact on seniors’ health? Is there a third option?
  • What are the criteria for making those strategic decisions?
    • For example, you may prioritize a digital or in-person intervention based on not only the impact to seniors’ health, but also cost, digital systems capabilities, and other key factors.
    • Your criteria will include the constraints you identified in the previous step, but will not be limited to them. Choose comprehensive criteria that you can return to throughout the project for a clear and unbiased assessment of progress.
    • Finally, based on the criteria, eliminate strategic options that clearly don’t qualify. Options whose costs run several times higher than your budget, for example, or that require more people than you have, are easy to eliminate.
  • For each of the strategies still in consideration, what big-picture tactical decisions must be made?
    • For example, for digital interventions, you may need to decide between a website, mobile-first site, or text campaign to begin. Are there any other options?
    • Repeat the elimination step for tactics, based on the same criteria.

For small-scale decisions, such as which layout is easier for users to read, this entire process may be less formal, and be worked through in the space of a meeting. However, even for small decisions it is important to be intentional about the fact that a decision needs to be made, and the basis for making it.
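The elimination step described above can be sketched informally in code. Everything in this sketch (the option names, the cost and staffing figures, and the budget and headcount thresholds) is a hypothetical stand-in for illustration, not real agency data:

```python
# Hypothetical sketch: eliminate strategic options that violate hard constraints.
# All numbers and option names below are invented for illustration.

options = [
    {"name": "in-person workshops", "est_cost": 4_000_000, "staff_needed": 25},
    {"name": "mobile-first site",   "est_cost": 900_000,   "staff_needed": 6},
    {"name": "text campaign",       "est_cost": 300_000,   "staff_needed": 3},
]

BUDGET = 1_000_000   # hard constraint identified in step 1
HEADCOUNT = 8        # hard constraint identified in step 1

# Keep only the options that pass every hard constraint.
viable = [
    o for o in options
    if o["est_cost"] <= BUDGET and o["staff_needed"] <= HEADCOUNT
]

for o in viable:
    print(o["name"])  # options that survive the elimination step
```

The point of the sketch is the shape of the reasoning, not the tool: making constraints explicit, then filtering options against them, works just as well on a whiteboard.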


Slow down and think decisions through

Many decisions in everyday life go unnoticed. People assume that a specific choice is the right one, and move forward without considering the other potential paths or approaches. However, it is always worth defining a decision rather than moving forward blindly. After all, not making a decision is a decision. Framing a contract in a specific way represents several decisions. By defining all decisions, including the “hidden” ones, you can explicitly explore creative options and do better, more actionable research.

3. Determine what you don’t yet know

Once you have your list of critical decisions, the next step is figuring out what you need to know in order to have confidence in making those decisions. This requires formally describing assumptions and gaps in your current knowledge.

Unstated assumptions can become the hidden killers of projects. For instance, if an agency has assumed that users complete an application or enrollment process in a single visit, it is crucial to confirm that this reflects reality. Otherwise, the agency may build a service with no ability to save progress. If users need to return several times but cannot save their work, the application may fail them, and the agency may be unable to provide services. That mission failure was preventable, since the underlying assumption could have been tested easily.

Not every assumption or gap in knowledge must be researched immediately, but all must be identified. Understanding the gaps and deciding which are most important allows you to spend your resources wisely, and keeps important outcomes from being derailed unnecessarily.

Examples of assumptions and unknowns

Assumptions that can impact outcomes
  • Users can complete an application or enrollment process in a single visit.
  • Mobile app ratings accurately reflect how well the service is being delivered via app.
  • A modernized digital service built on new technology should have exactly the same feature set as the current version, because the risks of removing an existing feature outweigh any potential gains from new features developed instead.
Unknowns that can impact outcomes
  • What other sources of information do people use to learn about the service and how to access it? What is the agency not providing that they need?
  • Who is not accessing the service that needs it?

Make note of what you know, what you don’t know, and what you assume in a format that feels as lightweight as possible. Our teams have used sticky notes and a whiteboard, a Figma board online, or a diagram with plain written notes. The format is less important than the thinking.

A sample known, unknown, assumed template

For each of the following rows, fill in three columns: what we know (and how we know it), what we assume (and why we assume it), and what we don’t know (and why we need to know it).

  • Users
  • Context of use
  • Impact on the overall ecosystem
  • Risks

Notice that we ask why to help identify the hidden “drivers” of reality and behavior. At every stage Research Thinking works to relate research to the larger outcomes and goals.
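For teams that prefer to keep the grid in a shared script rather than on a whiteboard, it could be captured as plain data. This is purely an illustration: the rows come from the template above, but every entry shown is an invented example.

```python
# Hypothetical sketch of the known / assumed / unknown grid as plain data.
# Each entry pairs a statement with its "how/why," as the template asks.

grid = {
    "Users": {
        "know":    [("Most applicants are on mobile", "site analytics")],
        "assume":  [("Applications finish in one visit", "no data; team habit")],
        "unknown": [("Who needs the service but never applies?",
                     "affects outreach priorities")],
    },
    "Risks": {
        "know": [], "assume": [], "unknown": [],
    },
}

# Print the grid as a flat review list for the team.
for row, cols in grid.items():
    for col, entries in cols.items():
        for statement, basis in entries:
            print(f"{row} | {col} | {statement} ({basis})")
```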

Question what you know

Take the time to question each piece of information you think you know, looking for hidden assumptions that could hurt the initiative. Ask, “How do we know that? What data do we have to support it?” “Do our users actually know how to do this? Is this important to them?” Questioning assumptions is always an uncomfortable exercise, but it is one that is critical to success.

Do a double-check

Once you’ve made a list of what you need to know and what you assume for each strategic and tactical decision option, do a double-check with your research practitioner. Is the information you need to obtain for a given decision researchable? If not, cross the option off.

Be prepared for existing knowledge to take work to assemble

Often, there is already a lot of existing data and knowledge, but it may be spread across different sources. If it is not yet consolidated or analyzed, or not analyzed in light of the current questions, make a note. There may be work that needs to be done to bring that information together early in the research phase, before traditional research is done. Consider using common Research Ops approaches to creating, maintaining, and governing repositories.

4. After you describe, prioritize

Once the knowns, unknowns, and assumptions are listed, there will be far more to know than can reasonably be addressed in a single research project. (This is normal, as every project must work within the reality of time and budget constraints.) The next step, then, is to prioritize with a critical eye. What must be learned now? What can safely wait, or not be researched at all?

Some questions to consider are:

  • What information is mission critical? What must we know to make the decision confidently?
  • What information is merely nice to have?
  • Which unknowns can potentially block decision making?
  • Which assumptions are least well supported by data?
  • And which of these, if wrong, would have negative effects on outcomes?

Decide on research priorities using the purpose, outcomes, decisions, and criteria you determined earlier. Your advanced research practitioner may be helpful here to inform what is and isn’t possible within the bounds of well-designed research.
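One informal way to work through these questions is to score each unknown or assumption by how badly a wrong answer would hurt outcomes and how little data currently supports it. The items and weights below are hypothetical, and the 1-to-5 scales are an invented convention, not part of the guide’s formal process:

```python
# Hypothetical sketch: rank research questions by risk.
# score = impact if wrong (1-5) * how unsupported the current belief is (1-5)

items = [
    {"q": "Can users finish an application in one visit?", "impact": 5, "unsupported": 4},
    {"q": "Do app-store ratings reflect service quality?",  "impact": 2, "unsupported": 3},
    {"q": "Which form layout reads best?",                  "impact": 2, "unsupported": 2},
]

for item in items:
    item["score"] = item["impact"] * item["unsupported"]

# Highest-risk items rise to the top of the research queue.
ranked = sorted(items, key=lambda i: i["score"], reverse=True)
print(ranked[0]["q"])  # research this first
```

The numbers are estimates, so treat the ranking as a conversation starter with your research practitioner rather than a verdict.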

Consider beginning with foundational assumptions

Often the first priority should be to research foundational assumptions, as these will affect the entirety of the product or service decisions moving forward.

For example, imagine an agency changing its online application processes. The agency relies on social services organizations to do direct outreach to beneficiaries, and plans to send the organizations a pre-recorded training about the new processes. Will this format be effective? If not, thousands of beneficiaries could ultimately become confused and go through the process incorrectly. This assumption must be researched first for the remainder of the work to succeed.

Go into the research phase (in the next step) with a good understanding of what you will need to learn, and in what order. You need not yet know exactly how you will learn this information.

5. Research strategically

Finally, it is time to research! Research Thinking means learning as much as possible as strategically as possible, within constraints. That means carefully matching research methods to your priorities and what you need to learn.

Research Thinking aims to do the least amount of research possible to make a specific decision well. However, it also ensures enough research is done to enable a long-term impact, improving your product, service, or outcomes iteratively over time. All research should therefore be continually designed and re-designed to help you reach your purpose, in the short and long term.

Work with a research practitioner

We strongly encourage you to work closely with an advanced research practitioner during this step. How to obtain the needed data efficiently and ethically, and which methods to use, are both questions that will require deep expertise to answer. The research itself can be done by a dedicated team of researchers, or can be democratized and conducted by people with a variety of backgrounds. However, the planning must be done by an experienced practitioner, to ensure that it results in actionable research.

Making research actionable

The purpose of research is to enable you to make decisions and take actions, no more and no less. Research that delivers on this promise is called “actionable.”

The most important part of making research actionable is ensuring that the methods chosen, and the way they are implemented, can actually provide the data needed for a specific decision. For instance, usability testing is invaluable for helping make decisions about page layout; it builds a solid understanding of how people understand and navigate the page. The same method tells you nothing about the overall experience of using a digital service or the outcomes of use, and would be unsuitable for decisions about high-level strategy. It is the match between decision and method that makes research actionable.

Be creative before, during, and after this research step, and remain flexible. There are often several alternate ways to achieve a single actionable end. What you will need to learn also often changes as you’re researching, and unexpected changes or obstacles arise. Resource constraints may mean reprioritizing work part way through. When something happens, keep your eye on your outcomes, work with your practitioner, and adjust accordingly.

The following are best practices in the research phase of the Research Thinking process.

Leverage existing research

One of the most common mistakes we see is beginning each research project from scratch, often unintentionally duplicating past efforts. Rather than wasting time recreating what is already known, we recommend beginning each research project with a formal step designed to locate and leverage existing research.

Mining existing sources of information should not be limited to reports and transcripts from user research studies. Even policy can be a source of research. (Not only does it tell us about the constraints and rules we are operating in, it provides insight into the people and the ecosystem in which decisions are being made.)

Places to look may include:

  • Reports from past research projects conducted inside your organization
  • Existing site metrics and data analyses
  • External reports by related organizations
  • Oversight reports in government
  • News stories that contain existing research

The range of existing data that can be useful is broad, and should be approached with a creative eye.

Employ a broad range of data and research types

When planning research, ReThink recommends using a creative mix of methods when possible, rather than relying on any single research method alone. One set of data helps “fill in” the gaps in another, adding clarity and confidence to conclusions.

Occasionally the various data and methods will give rise to seemingly contradictory information, but this too is beneficial. This is a signal that the problem may be more complex than predicted, and that in-depth attention will be needed in analysis (two steps from now) to identify the reasons for these divergences.

We recommend working closely with your advanced research practitioner throughout the planning process. With care, your methods will be able to not only answer needed questions, but also to address ethical considerations and program constraints.

Informing the decision cycle

Match your mix of methods to where you are in the decision making cycle. For example, some forms of research, like contextual interviews, are suited to inform high-level product, strategy, and design decisions. At the other end, small tactical decisions, such as whether specific design details are working for users, are usually a good match for usability testing.

Qualitative and quantitative methods

Use a combination of qualitative and quantitative methods, as they complement each other and lead to a more comprehensive view of the ecosystem than either approach alone.

Qualitative methods illuminate the why of user preferences and behaviors. The most common are variations of interviewing and observing individual users. These commonly include 1:1 structured interviews, observing participants as they try to accomplish their goals, diary studies where participants track their activities over time, feedback sessions, and task driven usability studies.

However, these interactive approaches can also be supplemented by other inputs. Our teams have drawn qualitative findings from sources such as:

  • Feedback surveys
  • Call center logs
  • App store reviews
  • Online forums

Quantitative methods point to the what of user behavior. They can also be drawn from a variety of sources, including site metrics, surveys, and unmoderated usability studies. These methods tell us more about how people use the tools we build, and the demographics of the groups using them.

It is not enough to surface statistics about numbers of clicks; the quantitative data must directly connect to the questions that need to be answered, such as whether users can successfully solve the problems they are seeking to solve.
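As a small illustration of connecting metrics to a guiding question, the sketch below computes a task success rate from event logs. The question ("Can users successfully submit the benefits form?"), event names, and sessions are all hypothetical, not drawn from a real system:

```python
# Hypothetical site metrics, reduced to per-session event lists.
sessions = [
    {"id": "s1", "events": ["form_start", "form_error", "form_submit"]},
    {"id": "s2", "events": ["form_start", "form_abandon"]},
    {"id": "s3", "events": ["form_start", "form_submit"]},
]

started = [s for s in sessions if "form_start" in s["events"]]
completed = [s for s in started if "form_submit" in s["events"]]

# Task success rate speaks to the guiding question;
# a raw count of clicks alone would not.
success_rate = len(completed) / len(started)
print(f"Task success rate: {success_rate:.0%}")  # -> Task success rate: 67%
```

The point of the sketch is the framing, not the arithmetic: the metric is chosen because it answers a question that a decision depends on.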


Research Thinking moves quickly

Consider using Research Thinking when research must happen on a fast-moving deadline; ReThink is particularly valuable for framing experimental, iterative approaches. Center the assumption or guess as a decision point, and go through the ReThink process with low-risk research studies to inform that decision. For example, you can test a possible "right answer" by designing a very basic, low-fidelity prototype to show to users, even something as simple as a sketch on cardboard. Or you could build basic functionality on a site in a few days to gather metrics and feedback on an assumption, informing the direction of next steps. Research Thinking frames these studies as learning exercises, with decision points based on what was learned.

Then, once you have made the immediate decision, repeat the process for the next decision point. You'll be surprised how well the process supports your need for velocity.

Consult a range of experts and perspectives

In the same way that we generally recommend using a range of research methods, we recommend seeking out a broad range of individual perspectives across your research wherever possible. This applies in four ways:

  • Involving a range of stakeholders and subject matter experts, to ensure a range of organizational needs are considered.
  • Conducting direct user research with a range of end users.
  • Including non-researchers in conducting research, as observers with different perspectives see new things.
  • Empowering participants as partners to co-create research that reflects the needs of the individuals impacted.

Seeking out a variety of perspectives is a research best practice. No single data source, stakeholder, or subject matter expert has a complete view of a complex environment. Neither is a single type of end user able to speak to the needs and experience of all users. By consulting a range of sources, you get a wider perspective on the system, and a more complete understanding than would otherwise be possible.

You will also naturally find and address many more potential risks and unintended consequences than you would otherwise be able to surface.

Make sure to consult users with a diversity of experiences and needs

Particular care should be taken to seek out end users with a range of experiences and needs. Some users will have differing goals or outcomes they want to achieve from the system or service, and others will access it differently, such as with a mobile device or screen reader. However, the range of perspectives should go further, to include people who may not traditionally be thought of as users, but who may still actively use or be directly impacted by a system or service.

For example, while the beneficiaries of a health care agency are the direct users of the service, their caregivers may be just as involved in negotiating the system. Employees and stakeholders may also be impacted, and should be given the opportunity to speak.

Democratizing research

Research Thinking should include ways for non-researchers to study people and develop insights. Often, this involves opportunities for non-researchers to observe or “ride along” with research sessions, with space for them to ask their own questions. While practitioners who have trained for years in research bring a unique skill set, learning only grows through inclusivity. Many researchers see their task as “make the familiar strange and the strange familiar,” and fresh eyes are one of the best ways of doing the former.

Participants can also be included in co-creating research design, or be given additional voice in participant-driven research methods such as diaries. This co-creation can result in new insights and areas for investigation, ensuring decisions and actions taken truly reflect the needs of the individuals impacted.

6. Interpret the data in the context of your decisions

Data do not speak for themselves. It is the meaning behind the data that brings the most value to decision making. The analysis phase is when the meaning is made.

Analysis brings together a variety of perspectives and voices and makes sense of them in the context of the criteria and decisions you determined. This level of analysis and interpretation is a complex skill best led by an experienced practitioner, as with the research itself.


Deliverables focused on outcomes

A good deliverable in the ReThink framework does not simply "deliver data"; rather, it focuses on answering the questions that were asked. It makes meaning by curating and prioritizing information, and it clearly tells the story that connects data to strategic outcomes.

Deliverables should always draw a direct line to recommended decisions and actions. The connection between the insights and the next steps should be clear, and the implications of decisions, to the extent that they are understood, should be articulated. This means curating what may be a large amount of data so that the interpretations are clear and are not obscured by excess data on other topics.

As counter-intuitive as it may seem, delivering more is not the same as delivering better. The best deliverables focus and clarify the path to the desired outcomes.

  1. In analysis, the first step in making sense of the data is to organize it. You will begin with a tangle of data reflecting what you heard and observed, and will need to bring order. A variety of methods for grouping and developing themes can be useful, from bottom-up coding to affinity diagramming and more formalized, structured frameworks. Whatever method you choose, you will need to ensure that the resulting themes reflect the guiding questions that drove the research.

    Excluding the irrelevant

One of the biggest challenges for growing practitioners is separating what is relevant from what is not. While all learning is valuable, detailed extraneous information can overwhelm you and your audience and is ultimately counterproductive.

    That is not to say that findings should be thrown out. Responsible Research Thinking ensures the data is usable for other projects in the future. All data should be available in a repository for cross reference and for reuse in future projects.

  2. Next, interpret what you have observed in light of the questions and decision at hand. Organization is not enough; a deeper analysis goes beyond the surface to understand the meaning and impact represented, and what that impact means for the specific initiative or service. Your work is not done until you create this meaning out of the data.

    Integrating multiple sources of data

    A ReThink approach to analysis integrates relevant data across multiple sources to inform a comprehensive understanding. This integration can be tricky, but it adds tremendous value.

First, organize and analyze each data set on its own. Once you have managed the individual data sets, you can assess how the findings relate to and inform each other. Analyze the distinctions between what people say and what they do. Pull themes from interviews and contextual observations, and combine them with quantitative data for a more complete understanding. When different data sources lead to differing conclusions, take the time to determine why, as the underlying reasons for the disparities are likely to be significant. (See the earlier section on employing a broad range of data and research types.)

  3. Find the why.

    Once you have integrated multiple sources of your data, you have a framework for sensemaking questions such as:

    • What does this mean?
    • Why is this happening?
    • Why does this matter?

The last question is particularly important. The results and analyses need to be translated so that it is clear how they can guide decisions and actions for the greatest impact.

  4. Finally, frame your findings so that they can be used by decision makers, even if the decision maker is you. Research Thinking analysis includes not just context, data, and insights, but also telling a coherent, relevant story about what you have learned.

    Storytelling is an important skill for anyone who works with data. A story is more than an account of incidents or events. It is a path to understanding. Good narrative structure can enable decision makers to both see the import of the data, and see the path to action.

    A narrative structure in this context does not have to involve any actual stories. Instead, it is a framework that relays what is important; rather than presenting isolated topics and ideas, a narrative structure centers key themes, makes clear what matters, and builds on itself. A good narrative structure also edits out what does not contribute to the themes and will distract from the main points, and unifies what remains.

    Create deliverables that tell the story of your data, focused on the decisions, actions, and outcomes that are needed.
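The organizing step above can be sketched in miniature. The example below performs a simple bottom-up coding pass, grouping tagged observations into candidate themes and ranking them by supporting evidence; the notes and codes are hypothetical, and real analysis involves far more judgment than any script can capture:

```python
from collections import defaultdict

# Hypothetical interview notes, each tagged with one or more codes
# during a bottom-up coding pass.
notes = [
    ("Couldn't find the login link on mobile", ["navigation", "mobile"]),
    ("Called the help line after the form timed out", ["errors", "support"]),
    ("Used a screen reader; field labels were missing", ["accessibility"]),
    ("Gave up online and visited an office in person", ["navigation", "support"]),
]

# Group notes under each code to form candidate themes.
themes = defaultdict(list)
for text, codes in notes:
    for code in codes:
        themes[code].append(text)

# Rank themes by the amount of supporting evidence, so the deliverable
# can lead with what matters most to the decision at hand.
for code, evidence in sorted(themes.items(), key=lambda kv: -len(kv[1])):
    print(f"{code}: {len(evidence)} supporting note(s)")
```

The ranking is only a starting point for interpretation: a theme with two notes may still matter more than one with ten, depending on the decision criteria.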

7. Make the decision(s)

Decision making has an entire field dedicated to it, and the frameworks for good decision making are beyond the scope of this guide. That being said, in many cases, looking at the data and the analysis together with the criteria you chose earlier will naturally result in a very small number of clear priorities. The decision will be straightforward.

Otherwise, a variety of frameworks exist to take a list of detailed options with good data and turn them into decisions and roadmaps. (One method is the gap scoring method Ad Hoc used on Search.gov.) The choice of framework will depend on the kinds of decisions you have to make. However you get there, the Research Thinking process ends when decision makers make the decision. Unless, of course, the decision leads to more questions that require more research; in that case, you will start again with Step One.
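As a generic illustration of a weighted-criteria framework (this is not the specific gap scoring method used on Search.gov; the criteria, weights, and options below are all hypothetical), scored options can be turned into a ranked draft roadmap:

```python
# Hypothetical decision criteria and weights, agreed on earlier in the process.
criteria = {"user_impact": 0.5, "feasibility": 0.3, "risk_reduction": 0.2}

# Candidate options scored 1-5 against each criterion using the research data.
options = {
    "Redesign eligibility screener": {"user_impact": 5, "feasibility": 3, "risk_reduction": 4},
    "Add status-tracking page": {"user_impact": 4, "feasibility": 5, "risk_reduction": 2},
    "Migrate legacy back end": {"user_impact": 2, "feasibility": 2, "risk_reduction": 5},
}

def weighted_score(scores):
    """Combine criterion scores into one number using the agreed weights."""
    return sum(criteria[c] * s for c, s in scores.items())

# Rank options into a draft roadmap; decision makers still weigh the trade-offs.
roadmap = sorted(options, key=lambda name: weighted_score(options[name]), reverse=True)
for name in roadmap:
    print(f"{weighted_score(options[name]):.1f}  {name}")
```

The value of such a framework is less the final numbers than the conversation it forces about which criteria matter and why.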


ReThink turns research into robust, informed decision making

Research Thinking is a way of approaching problems that can be infused throughout an agency and its partners, creating better outcomes at lower risk. ReThink gets far more value out of agency data and research, and provides better paths to agency outcomes.

Creating effective digital services, and services in general, with good CX requires more than technical know-how and delivery. It requires research to ensure that agencies are building the right things in the right ways to best meet the needs of people. The same is true of strategy and other decision needs; bringing data and a variety of perspectives to bear on decision making leads to better decisions, with generally more positive outcomes across agency work.

While Research Thinking doesn’t require more time or budget, it does involve more effort. Problems must be framed, and questions aimed at why and how rather than simply what. Research Thinking requires questioning assumptions, testing hypotheses, and consulting a variety of sources and stakeholders to investigate each. Data becomes most valuable when you can connect it to the outcomes and decisions needed.

ReThink broadens the definition of research. It goes far beyond targeted user studies, to create a more holistic view of the overall context of people (such as users, stakeholders, helpers, and others affected) and systems (such as policy, technical systems, and the overall environment). ReThink allows for a flexible, inclusive approach to understanding that will adapt to changing circumstances and project needs. The process empowers everyone involved to ask questions and look for answers, which naturally decreases risk as more potential issues and unintended consequences are raised during the process. Step by step, agencies can make more informed, effective decisions with Research Thinking.


ReThink resources

Want to learn more? Read our case studies to examine the far-reaching benefits of the ReThink approach, and then watch a webinar recording with ReThink author Alex Mack.

Get access to the resources