
How to Select an Evidence-Based Teen Pregnancy Prevention Program




So you want to implement an evidence-based teen pregnancy prevention program...

Excellent news! The Office of Adolescent Health (OAH) has developed this e-module to help you choose the program that is best suited to meet your specific needs and goals. Program selection does not occur in a vacuum; it is a process that involves thought, planning, and coordination. The steps laid out in this module will walk you through the process of program selection, the first in a series of steps involved in implementing evidence-based programs (EBPs). We also touch briefly on aspects of implementing an EBP in the section entitled "After Program Selection."

At the conclusion of this e-learning module, participants will be able to:

  1. Define the term "evidence-based";
  2. Discuss the importance of implementing evidence-based programs;
  3. Identify the four main steps in selecting the most appropriate evidence-based teen pregnancy prevention program for their particular needs; and
  4. Describe how program selection fits into a larger process of high quality program implementation.

Overview of the Steps

There are four main steps involved in selecting an EBP. The sections that follow cover each step in greater detail. You may want to download the Program Selection Checklist handout to assist you throughout the process of selecting an EBP; consult it regularly for reminders and guidance from start to finish.

1. Identify the problem(s)

Key Questions: What issue(s) or concern(s) do you want to address? What are your specific needs and what resources do you already have in place to begin to address them?

In this section, you will learn about the critical role that needs and resource assessments play in achieving sustainable impacts in teen pregnancy prevention, including why they are important and how they relate to the EBP selection process.

2. Develop a logic model

Key Questions: What process will you use to address this issue? What are the long-term outcomes you hope to eventually achieve? How do you plan to achieve them? What resources will you need in order to achieve those outcomes?

In this section you will learn about the role and importance of developing a logic model as part of your EBP selection process. The development and use of a logic model will provide you with a detailed roadmap for reaching your goals and outline the criteria by which you will judge the success of your initiative. Your logic model will also provide you with criteria you will use to judge the appropriateness of potential EBPs for teen pregnancy prevention.

3. Identify potential programs

Key Questions: What interventions could you implement as a part of your process for addressing your issue of concern?

There are quite a few EBPs for teen pregnancy prevention out there. How will you narrow your search to find the program that is right for you? This section will walk you through the process of using the U.S. Department of Health and Human Services (HHS)’s Evidence-Based Programs Database and accompanying implementation reports to identify potential programs. You will become familiar with information that is provided on each program, learn how to set search parameters, and sort through results.

4. Assess fit

Key Questions: Of the different interventions you have identified, which are most applicable to your population of interest and community?

Once you have identified programs that meet your basic criteria using the HHS Evidence-Based Programs Database, you must assess whether they are pertinent for your priority population and whether they are relevant and acceptable to both the specific implementation setting and the broader community. This section will teach you how to assess the various dimensions of population and environmental “fit” to find the program that is the best possible match for you.

It may be tempting to rush through the first two steps of this process so that you can jump right into reviewing programs. However, these steps are critical to ensuring that the decisions you make in steps 3 and 4 are well-informed and will best position you to successfully impact teen pregnancy prevention in your community. Organizations that don’t spend adequate time identifying the problem and developing a clear understanding of how they plan to address that problem among their target population may end up investing time and money in a program that looks great but isn’t actually a good fit for their community, organization, or target population.

The following video, developed by the ACT for Youth Center for Excellence at Cornell University, helps to demonstrate the importance of careful program selection. [This video was produced by the ACT for Youth Center for Excellence. HHS is not responsible for the content of this video and does not endorse or recommend any products, processes, services, manufacturers, or companies referenced therein.]

Before we get started with these steps, however, we should have a clear understanding of what it means to be an EBP and why we should limit our selection to programs that meet these criteria.

About EBPs



Regardless of background or professional role, we all want our efforts to improve the lives and trajectories of adolescents to be successful. If we spend time or money to address a problem, we want to ensure that we will see a return on our investments. This is where evidence-based programs (EBPs) come in. The teen pregnancy prevention programs in the HHS Evidence-Based Programs Database have been shown to reduce teenage pregnancy, the behavioral risks underlying teenage pregnancy, or other associated risk factors among teens ages 19 and younger. In other words, if you select an evidence-based teen pregnancy prevention program that is appropriate for your setting and population and implement it with fidelity, you are more likely to experience the same results as those described in the program evaluations.

Study Criteria

In order to understand the benefit of using an evidence-based program, it is helpful to know how programs become evidence-based. Not only will you have a better idea of what sort of information to look for when assessing the effectiveness of a particular program, but you will also be able to explain the benefits of using an EBP to your staff and other key partners, such as school administrators, teachers, funders, and other community partners.

A program is deemed evidence-based if:

  • Evaluation research shows that it produces positive results;
  • The results can be attributed primarily to the program itself, rather than to other extraneous factors or events.

The U.S. Department of Health and Human Services (HHS) maintains a list of evidence-based teen pregnancy prevention programs that is located in the Evidence-Based Programs Database. HHS considers the following criteria and others to determine if a program is evidence-based (for more information on the HHS Teen Pregnancy Prevention Evidence Review criteria, please refer to the systematic review by the HHS Office of the Assistant Secretary for Planning and Evaluation (ASPE) in the Resources Section):

Type and Number of Participants

Rigorous evaluations include a minimum number of participants in both the treatment and control groups. If a study has too few participants, it is difficult to say that the same results could be reliably achieved again. Evaluations of teen pregnancy prevention programs, specifically, must also limit their study participants to individuals ages 19 or younger. When a program is evaluated among a well-defined population, you can be more confident that comparable results will be achieved when implementing the program among individuals with similar characteristics.

Study Design

To be designated as “evidence-based,” a program must be evaluated using an experimental or quasi-experimental design and sound statistical methods. Experimental designs randomly assign participants to a treatment and a control group and assess whether the outcomes for the treatment group are statistically significantly different from those in the control group. Random assignment allows researchers to claim that the intervention is responsible for the difference, instead of other reasons (e.g., that people who choose to be in the study are more motivated to change). Quasi-experimental designs also divide participants into treatment and control groups, but do not do so randomly. Consequently, experimental studies provide stronger evidence for program success than do quasi-experimental studies, though both are considered more rigorous than non-experimental evaluations.
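The statistical comparison at the heart of an experimental evaluation can be sketched with a simple two-proportion z-test. This is only an illustrative example, not the HHS Evidence Review's actual methodology, and the numbers below are hypothetical:

```python
import math

def two_proportion_z(success_t, n_t, success_c, n_c):
    """Two-proportion z-test: is the treatment group's rate
    statistically different from the control group's rate?"""
    p_t = success_t / n_t                            # treatment group rate
    p_c = success_c / n_c                            # control group rate
    p_pool = (success_t + success_c) / (n_t + n_c)   # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_t + 1 / n_c))
    return (p_t - p_c) / se

# Hypothetical evaluation: 30 of 200 treatment-group teens vs. 50 of 200
# control-group teens initiated sex during the follow-up period.
z = two_proportion_z(30, 200, 50, 200)
print(round(z, 2))      # -2.5
print(abs(z) > 1.96)    # True: significant at the 5% level
```

Random assignment is what licenses the causal interpretation of a significant difference like this one; without it, the same arithmetic could be reflecting pre-existing differences between the groups.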

Key Outcomes

An EBP for teen pregnancy prevention will have demonstrated change on one or more of the following outcomes:

  1. Sexual activity
    • Delay sexual initiation. Compared to individuals in the control group, did adolescents who received the program wait longer to begin having sex? How much longer?
    • Decrease the frequency of sexual intercourse. Compared to individuals in the control group, did adolescents who received the program have sex less often? How much less often?
    • Decrease number of sexual partners/increase monogamy. Compared to individuals in the control group, did adolescents who received the program have fewer sexual partners? How many fewer partners?
  2. Increase use or consistency of use of contraception. Compared to individuals in the control group, did adolescents who received the program use contraception more frequently (count of overall usage) and/or more consistently (ratio of usage to sexual encounters)? How much more frequently/consistently?
  3. Sexually transmitted infections (STIs). Did the incidence (and prevalence) of STIs among adolescents significantly decrease following implementation of the program?
  4. Pregnancies. Did the incidence (and prevalence) of pregnancies among adolescents significantly decrease following implementation of the program?
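The distinction in item 2 between frequency (a count of overall usage) and consistency (a ratio of usage to sexual encounters) can be made concrete with a small sketch. The survey records below are hypothetical; real evaluations use validated survey measures:

```python
# Hypothetical follow-up survey records: for each respondent, the number
# of sexual encounters and the number of those encounters where
# contraception was used during the recall period.
records = [
    {"encounters": 4, "protected": 4},
    {"encounters": 10, "protected": 5},
    {"encounters": 0, "protected": 0},   # abstinent respondent
]

# Frequency: total count of protected encounters across respondents.
frequency = sum(r["protected"] for r in records)

# Consistency: ratio of protected encounters to all encounters,
# computed among respondents who reported any sexual activity.
active = [r for r in records if r["encounters"] > 0]
consistency = (sum(r["protected"] for r in active)
               / sum(r["encounters"] for r in active))

print(frequency)              # 9
print(round(consistency, 2))  # 0.64
```

A program could raise frequency without improving consistency (or vice versa), which is why evaluations may report both.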

The following video was developed by the ACT for Youth Center for Excellence at Cornell University and describes in a bit more detail what makes a program evidence-based. [This video was produced by the ACT for Youth Center for Excellence. HHS is not responsible for the content of this video and does not endorse or recommend any products, processes, services, manufacturers, or companies referenced therein.]

Funders have placed increased pressure on organizations to implement EBPs because of their increased likelihood of success. However, as the video shows, EBPs are not one-size-fits-all, and the growing number of EBP options can make identifying the right program a challenge. The remainder of this e-learning module will focus on the process of identifying and selecting the EBP that is right for you.

Testimonials from Practitioners

The following testimonials come from OAH TPP grantees with experience selecting and implementing EBPs; in them, the grantees discuss the benefits and challenges they have experienced:

Linda Rogers (Iredell-Statesville Schools)

“As an OAH grantee, we replicated two evidence-based programs: Making Proud Choices and Be Proud! Be Responsible! Be Protective! Using these programs enabled us to deliver a consistent, medically accurate, and age appropriate curriculum to over 3,500 middle and high school students in Iredell-Statesville Schools. We set our initial goals based on the research findings of these programs. We are proud to say that we have seen positive outcomes in our teen pregnancy rate, intentions on using birth control and condoms, and an increased intention to be abstinent. Without using these evidence-based programs, we do not believe we would have had these positive results.”


Francine Levin (Community Action Partnership of San Luis Obispo County)

“Providing our EBP with fidelity seemed like it would be nearly impossible in our conservative community at first. We weren't sure how the inclusion of a condom demonstration would be received by school boards and parents. We spoke directly to the program developer who insisted that the condom use skills were essential skill-building for program effectiveness, and that not including this important aspect was essentially a "deal-breaker." When approaching schools, and later garnering parent permission, our approach was total transparency. We provided the evidence that supported the program’s effectiveness, the importance of implementing the program with fidelity, and explicitly stated that a condom demonstration using an anatomically correct penis model was part of the program. Only two school districts opted out, and both have since decided to offer the program after learning about its success at other local schools. Of the permission slips returned by students, less than 1% of parents did not allow their child to participate. When students were asked which activity they liked best, the condom demo was listed more than any other activity. When students were asked what was the most important thing they learned, "correct condom use" was listed more than anything else. The requirement to follow the EBP with fidelity gave us the courage and credibility needed to advocate for a more comprehensive program in schools. To our surprise, the community not only saw the need but overwhelmingly gave support for a truly comprehensive approach.”


Melissa Peskin, Ph.D. (University of Texas School of Public Health)

“We are implementing an evidence-based sexual health curriculum that provides detailed step-by-step lessons about building skills, changing attitudes, and increasing knowledge about healthy relationships and adolescent sexual health, which may be challenging topics. The response from teachers, students, and parents to the curriculum has been overwhelmingly positive. Teachers have reported not only the successes achieved by their students, but also their own successes in becoming better all-around teachers. It is truly amazing to see the reach of the curriculum and the positive impact it is having on both students and teachers.”


Jasiel Fernandez (Vale Esperar)

“Implementing an evidence-based program has provided great learning opportunities for our grass-roots organization. It has allowed us to establish more consistent standards to ensure the medical-accuracy of program content as well as more effective delivery. Likewise, it has afforded leadership and staff clear metrics for development and improvement of our department. Training on our EBP model allowed for consistency in terms of increasing the capacity of community partners and their facilitators. It takes a bit of effort to manage the different elements at first, but it is well worth it.”


Andrea Gomez, RN (Tulare Community Health Clinic)

“[An] evidence based curriculum reduces the risk for adolescents; it helps decrease pregnancy and sexually transmitted disease (STDs) rates. Reducing the Risk (RTR) and Draw the Line/Respect the Line enhance the ability for students to comprehend the importance of making informed decisions. RTR emphasizes the importance of having a healthy relationship. It gives the students the opportunity to think about the future and their goals in life, and allows them to practice skills to get out of situations they are uncomfortable in. Students are encouraged to initiate a conversation with their parents and guardians regarding sexual health. Independent evaluation of our OAH TPP Grant has shown that students consistently (over the 4 year study) have a high increase in knowledge of STD/HIV after completing the lessons in the evidence based curricula. Our team has seen a positive impact in our students. The following are examples of the students’ ‘I learned’ statements completed on the last day of class: ‘I learned that there are diseases that come from having sex;’ ‘I learned that sexual intercourse has a lot of consequences and things that can affect you, your partner, and you and your partner’s body and future;’ ‘Abstinence is the only way to stay protected 100%;’ ‘I noticed that I wanna wait to have sex. I’m not trying to have a kid at 15.’”


Now that you have heard about some of the benefits and challenges that other OAH TPP grantees have experienced when implementing an EBP for teen pregnancy prevention, complete the Benefits and Challenges to Implementing an Evidence-Based TPP Program worksheet to help you start thinking about how your organization might benefit from implementing an EBP and what challenges you might face.

Step 1: Identify the Problem(s)


Key Questions

What problem(s) would you like to address? Think beyond simply “a high number of teen pregnancies in my school district/community.” Are there particular groups in which the high rates are more prominent (e.g., age groups, racial/ethnic groups, specific schools or neighborhoods)? Are sexually transmitted infections (STIs) a concern as well? Are adolescents simply uninformed about the existence of services or resources in their area, or is there a lack of relevant services? Do adolescents know what these services and resources are or what purposes they serve? Be as specific as possible. You can’t identify solutions until you understand your problem(s).

Needs and Resource Assessment

A needs and resource assessment is a systematic way of gathering information that describes, in detail, the needs and resources of the target population and larger community. A need is a lack of some resource, tool, or program that puts adolescents at a disadvantage or places them at risk for negative health or social outcomes, including teen pregnancy. Resources are types of support, services, or programs that are available in the community, such as reproductive health care clinics or out-of-school-time programs.

Conducting a needs and resource assessment provides a sound understanding of the needs and conditions of a priority population, which is critical in implementing a program that addresses those needs. Needs and resource assessments are helpful – even if you have already selected or are implementing a program – and should be conducted on a regular basis. Some of the benefits of conducting such an assessment include the following:

  • Identify a priority population by assessing the data
  • Learn more about suspected needs and possibly uncover new ones
  • Identify common sexual risk-taking behaviors
  • Identify the determinants (i.e., the risk and protective factors) of those behaviors
  • Design programs more strategically
  • Gather baseline data that can help with program planning and evaluation
  • Use resources (i.e., staff, funding, materials) strategically
  • Gain support from stakeholders through strategic planning
  • Update information about your priority population and program participants
  • Review for program improvement
  • Use for future program planning

The following are good sources of data for your needs and resource assessment (refer to the Best Practices for Conducting a Needs and Resource Assessment for additional sources of data):

Local Data

  • County or municipal public health reports
  • School district reports

State Data

National Data

Refer to the Needs and Resource Assessment Checklist to help you assess the comprehensiveness of your data collection plan. Remember, the more thorough your needs and resource assessment is, the better informed you will be when it comes to selecting an EBP that will truly make a meaningful difference in your community. Consider conducting focus groups with youth, parents, or community leaders to:

  • Gain insights about attitudes, values, and norms, and identify potential barriers to implementation;
  • Learn about existing and previous teen pregnancy, STI, and HIV/AIDS prevention efforts;
  • Assess the existence and accessibility of health services for teens in your specific area;
  • Ascertain the important determinants (i.e., risk and protective factors) that influence sexual risk-taking behaviors; and
  • Identify potential collaborations or partnerships you could leverage to support your efforts.

Example Needs and Resource Assessment

The following is an example of a community that was able to better target its teen pregnancy prevention efforts through a careful assessment of community needs and resources.

A community that is currently targeting high school students with its teen pregnancy prevention efforts is still experiencing high pregnancy and STI rates. A needs assessment reveals that many students are already engaging in risky behaviors in middle school, highlighting the need for starting teen pregnancy prevention efforts at earlier ages. In this example, conducting a needs assessment revealed why an existing intervention was not having its intended effect and helped identify more relevant foci for prevention efforts.

Now that we understand the importance of doing a careful and comprehensive needs and resource assessment, we will explore how that information can be used to develop a logic model that can help to guide the EBP selection process.

Step 2: Develop a Logic Model


What is a Logic Model?

A logic model is a graphical depiction of your desired outcomes and your plan for obtaining them. Logic models consist of four major components—inputs, activities, outputs, and outcomes—and serve two purposes.

First, program staff use logic models as tools to strategically, purposefully, and scientifically identify the causal pathways between goals and interventions. In other words, they allow program staff to make sure that there is scientific evidence and theory to support a link between a particular intervention and the outcomes that are being targeted.

Second, they also point program staff to the process and outcome indicators to be measured and evaluated, helping them to evaluate the fidelity of program implementation and make corrections along the way. This process of using data to inform program implementation is a key component of performance management. To learn more about performance management, you can check out the performance management resources on the Office of Adolescent Health’s Teen Pregnancy Prevention Resource Center.

Why are Logic Models Important?

Logic models are the foundation of all good program planning and implementation. They are an important planning tool that will help you identify all of the resources and actions necessary to achieve your goal(s). Developing a logic model facilitates the program selection process by homing in on the specific outcomes of interest, thinking critically about the actions necessary to attain those outcomes, and identifying the resources and capacities necessary to carry them out.

Selecting a program without first developing a logic model leaves organizations vulnerable to a number of difficulties and problems later on. For example, the individuals implementing the program may discover upon starting that they lack certain resources or capacities to carry out the program with fidelity. Alternatively, implementing a program with a different population than it was originally designed for may not yield similar results to those achieved in the evaluation study. Logic models help to prevent such pitfalls by establishing an informed framework for program selection. Logic models are also useful for evaluation of your efforts later on.


Working Backward

In developing a logic model, it is important to work backwards and begin at the end with your desired outcome(s).

Logic model: Inputs, Activities, Outputs, Outcomes


Outcomes are the benefits for participants during or after their involvement in a program. They should be directly related to the problem that you identified through your needs and resources assessment, as described earlier in this module. Examples include delay of sexual initiation, decreases in the frequency of sexual activity or number of partners, increases in condom/contraceptive use, and reductions in the incidence of teen pregnancy. Outcomes can be measured at different points, ranging from immediately following the conclusion of a program to several years later.

Remember, outcomes should be specific, measurable, achievable, relevant, and time-bound (SMART). For example, a “SMART” outcome might be to reduce the incidence of teen pregnancy in Smith County by 20% within six years. By denoting the amount of reduction desired, the outcome is specific. It is also measurable, as the information needed to assess it is available via vital statistics for Smith County. A 20% reduction over a period of six years is a realistic undertaking, making the outcome an achievable one. Assuming Smith County has a high teen pregnancy rate compared to neighboring counties or within the state, it is a relevant goal as well. Finally, specifying a six-year timeframe makes the outcome time-bound.
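The arithmetic behind a SMART outcome like this is worth making explicit, since it tells you both the target and the pace of change needed. The baseline rate below is hypothetical:

```python
# Hypothetical baseline: 48 teen pregnancies per 1,000 teens in Smith County.
baseline_rate = 48.0
reduction = 0.20     # the 20% reduction named in the SMART outcome
years = 6            # the six-year timeframe

# Target rate after the full reduction.
target_rate = baseline_rate * (1 - reduction)
print(round(target_rate, 1))   # 38.4 per 1,000 teens

# Annual pace needed if progress is spread evenly across the six years.
annual_drop = (baseline_rate - target_rate) / years
print(round(annual_drop, 1))   # 1.6 per 1,000 per year
```

Expressing the goal this way also gives you interim benchmarks to check against vital statistics each year, rather than waiting until year six to learn whether you are on track.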

Clearly identifying and articulating the teen pregnancy prevention outcomes that you hope to address will help you to narrow your search for TPP programs that are known to impact the outcome you have identified.

  • Short-term outcomes. Short-term outcomes are the immediate effects of a program and often focus on change in knowledge, attitudes, and skills. For example, an organization wishes to reduce teen pregnancy by delaying sex. A short-term outcome in this example would be an increase in teens’ positive attitudes about delaying sex.
  • Intermediate outcomes. Intermediate outcomes are achieved within 3-5 years of program initiation, and often include change in behavior, norms, or policies. In the example, an intermediate outcome would be adolescents’ delaying of sex (a behavior).
  • Long-term outcomes. Long-term outcomes are achieved within 4-6 years of program initiation and include changes in organizations and systems. In the example, the long-term outcome would be a reduction in the incidence of teen pregnancy (a systemic change).

Outputs, Activities & Inputs

Working Backward Continued: What Leads to Outcomes?

After deciding what you want your short-term, intermediate, and long-term outcomes to be, you can keep working backwards to see what will get you there:

Logic model: Inputs, Activities, Outputs, Outcomes


Outputs are the products of a program’s activities. While this is an important part of the logic model, you won’t be able to fill most of this part in until you have selected your program. However, you may be able to identify targets such as the number of youth you hope to reach and the number of locations you plan to implement the program. Examples of outputs include the number of classes taught, participants, or brochures distributed. Being able to identify expected outputs makes it easier for you to assess whether you need to make adjustments in how you are implementing your program. For example, if you intended to deliver the program to 50 participants but have only recruited 30 youth, you may need to adjust your recruitment strategies accordingly.


Activities are what a program does—the actual events that take place—to fulfill its mission. This section is where you will document the activities that are associated with the program that you select. Examples of activities include lessons and condom distribution. While you won’t know the specific activities until you have selected your program, you should note any restrictions to keep in mind. In particular, you should use this section to document community norms and values, such as parent attitudes about sexual health education or local policies about sexual health education in schools, that will influence which program activities you can conduct. For example, if your community requires abstinence-only education, you should not select a program that includes condom demonstrations. Similarly, if time constraints require you to limit class sessions to 50 minutes, you should not select a program with 90-minute sessions.


Inputs are resources a program needs to achieve its objectives. Examples include staff, volunteers, facilities, equipment, curricula, and money. This component is especially important because you will need to consider all of the resources that will be required, including training, in order to implement the program with fidelity. Once you have determined the necessary inputs, you must assess your current resources and capacities and identify where gaps or deficits occur. Doing a thorough job of assessing your current capacities will be critical as you move on to Step 3: Identifying Potential Programs.

For additional assistance in designing your own logic model, refer to the Logic Model Template and Logic Model Data worksheets.

Step 3: Identify Potential Programs


Identify Potential Programs

HHS’s searchable database of evidence-based teen pregnancy prevention programs provides an efficient way to filter through the myriad interventions by program type, setting, length, age, race/ethnicity, outcomes affected, and study rating. You may want to start broad, excluding only programs that you know don’t apply. For example, you should have already identified both an outcome and a target population, so you may be ready to exclude programs that don’t target your particular outcome or that are not designed for the age group you are targeting.

As you narrow down your search by including more parameters, you will obtain a shorter list of potential programs. If you make your search too specific, you may not end up with any programs that match all of your criteria. In that case, consider which of your criteria are absolutely necessary and which are merely preferences. Remember, the decisions about your search criteria – including things like age of participants, setting, and target outcomes – should be informed by the needs and resource assessment described in “Step 1: Identify the Problem(s).”
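The broad-then-narrow search strategy can be sketched in a few lines of code. The program records and field names below are hypothetical stand-ins for the database's actual filters, not its real contents:

```python
# Hypothetical program records mirroring the database's filter fields.
programs = [
    {"name": "Program A", "type": "Sexuality Education",
     "settings": ["In School: High School"], "ages": "14 to 17 years",
     "outcomes": ["contraceptive use"]},
    {"name": "Program B", "type": "Youth Development",
     "settings": ["Community-based Organization"], "ages": "14 to 17 years",
     "outcomes": ["sexual initiation"]},
    {"name": "Program C", "type": "Clinic-Based",
     "settings": ["Health Clinic"], "ages": "18 to 19 years",
     "outcomes": ["contraceptive use"]},
]

def matches(program, criteria):
    """True if a program satisfies every criterion supplied."""
    if "outcome" in criteria and criteria["outcome"] not in program["outcomes"]:
        return False
    if "ages" in criteria and program["ages"] != criteria["ages"]:
        return False
    if "setting" in criteria and criteria["setting"] not in program["settings"]:
        return False
    return True

# Start broad (outcome only), then narrow (outcome plus age group).
broad = [p["name"] for p in programs
         if matches(p, {"outcome": "contraceptive use"})]
narrow = [p["name"] for p in programs
          if matches(p, {"outcome": "contraceptive use",
                         "ages": "14 to 17 years"})]
print(broad)    # ['Program A', 'Program C']
print(narrow)   # ['Program A']
```

Each added criterion can only shrink the result list, which is why it pays to distinguish must-have criteria from preferences before you filter.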

Upon retrieving your search results, read through each program’s implementation report for more detailed information to further narrow your options and ultimately select a program. You can do this by clicking the program name in the list of search results. First, let’s turn to the database itself and walk through the various search criteria.


Using the Database

Filter Options

You can filter programs through the following options:

Program Type

What type of program are you looking to implement? Teen pregnancy prevention can be a goal of several types of programs, including abstinence programs, clinic-based programs, programs for special populations, sexuality education programs, or youth development programs.

Abstinence programs are those that focus on delaying sexual initiation. Clinic-based programs are designed to be implemented in clinical settings, such as health centers. Programs for special populations are those that target specialized groups, (e.g., expectant and parenting teens, youth in juvenile detention). Sexuality education programs generally include information on (or otherwise promote) both the benefits of abstinence and risk mitigation through condom and contraceptive use for sexually active adolescents. Finally, youth development programs combine elements of abstinence and sexuality education with broader services, such as mentoring, health services, or case management. You may want to select “all” to begin with, unless you are targeting a specific population.

  1. Abstinence
  2. Clinic-Based
  3. Sexuality Education
  4. Youth Development
Implementation Setting

Where are you looking to implement your program? Not all programs are designed to be implemented in all settings, though some can work in more than one. The database will allow you to choose one or more implementation settings. These settings include after school/community-based organizations (e.g., YMCA), schools (elementary, middle, and/or high schools), health clinics, or other specialized settings. If you have not already identified a particular setting, you may want to select all to begin with. However, if you know that you will be partnering with a school or a local after school program, you can specify that here.

  1. In School: Elementary School
  2. In School: Middle School
  3. In School: High School
  4. Alternative School
  5. After School
  6. Community-based Organization
  7. Health Clinic
  8. Home-based Case Management
  9. Correctional Facility
  10. Online
  11. Other
  12. Show me all types of settings
Age Group

What are the ages of the adolescents with whom you work? Here again, not all programs are designed for all youth. Younger adolescents (e.g., 13-year-olds) differ quite dramatically in terms of development, cognition, and physical attributes from older adolescents (e.g., 18-year-olds). Be sure to select the age range that most closely matches your participants in order to filter out programs that were not designed or evaluated with adolescents in that range. In general, you should consider specifying this criterion at the start of your search.

  1. 13 years or younger
  2. 14 to 17 years
  3. 18 to 19 years
  4. 20 years or older
  5. Show me all age groups

Target Population

Another important demographic characteristic to consider during program selection is your target population. Programs designed for and evaluated with African-American adolescents may not yield the same outcomes for Latino adolescents, for example. If your priority population includes adolescents from several of these groups, select each applicable option to limit your results to programs evaluated with those sub-populations.

  1. Female
  2. Male
  3. Latino
  4. African American
  5. White
  6. Asian
  7. Any race/ethnicity 
  8. Pregnant or parenting
  9. Incarcerated youth
  10. Foster care youth
  11. Homeless youth
  12. Sexually active
  13. LGBTQ
  14. Show me all target populations
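The narrowing logic described above, in which you apply your must-have criteria first and then use your preferences to rank what remains, can be illustrated with a minimal sketch. Note that the program records, field names, and criteria values below are hypothetical examples for illustration only; they do not reflect the actual HHS database schema or its contents.

```python
# Hypothetical sketch of the search-narrowing logic: keep only programs that
# satisfy every required criterion, then rank the survivors by how many
# optional preferences they also meet. All records below are illustrative.

programs = [
    {"name": "Program A", "type": "Sexuality Education",
     "settings": {"In School: Middle School", "After School"},
     "ages": {"13 years or younger", "14 to 17 years"}},
    {"name": "Program B", "type": "Abstinence",
     "settings": {"In School: High School"},
     "ages": {"14 to 17 years"}},
    {"name": "Program C", "type": "Youth Development",
     "settings": {"After School", "Community-based Organization"},
     "ages": {"14 to 17 years", "18 to 19 years"}},
]

def search(programs, required, preferred):
    """Filter by required criteria, then sort by number of preferences met."""
    def meets(program, criteria):
        setting, age = criteria.get("setting"), criteria.get("age")
        ok = True
        if setting is not None:
            ok = ok and setting in program["settings"]
        if age is not None:
            ok = ok and age in program["ages"]
        return ok

    matches = [p for p in programs if meets(p, required)]
    # Stable sort: programs meeting more preferences come first.
    return sorted(matches,
                  key=lambda p: -sum(meets(p, {k: v})
                                     for k, v in preferred.items()))

results = search(programs,
                 required={"setting": "After School"},
                 preferred={"age": "14 to 17 years"})
print([p["name"] for p in results])  # ['Program A', 'Program C']
```

If the `required` dictionary becomes too strict and `results` comes back empty, the remedy mirrors the guidance above: demote some criteria from `required` to `preferred` and search again.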

Implementation Reports

Each program has an associated implementation report that provides you with the following information:

  1. Developer(s): Name(s) of the individual(s) who initially authored the program
  2. Program description and overview: Brief and general introduction of the program
  3. Core components: Key program components, including content, instructional techniques (e.g., lecture, role plays, video), and implementation requirements (e.g., number of lessons, number of facilitators required)
  4. Target population: Specific population with which the program was evaluated as well as any other potential target populations identified by the developer for which the program may be applicable
  5. Program setting: Specific setting in which the program was evaluated as well as any other potential program settings identified by the developer that may be applicable
  6. Program duration: Time frame required to implement the program (i.e., number and length of sessions)
  7. Curriculum materials: All of the materials necessary to implement the program (e.g., manual, worksheets)
  8. Adaptations: Any allowable adaptations to the program authorized by the developer(s) and/or OAH that have been determined not to interfere with the program’s integrity or results; note that allowable adaptations are not available for all programs
  9. Program focus: Program’s type or approach (e.g., abstinence, sexual health, youth development)
  10. Research evidence: Citation for the article or report of the program evaluation, setting in which the program was evaluated, characteristics of the participants evaluated, study design utilized in the evaluation, strength of the evidence yielded by the evaluation, and the evaluation’s overall findings

You can access each program’s implementation report by selecting the program of interest in the “Find a Program” menu on the Evidence-Based TPP Programs page of the OAH website. The implementation reports will be immensely useful during your program selection decision-making process. The information they offer goes far beyond what a simple database search can yield. For instance, while some programs have only been evaluated in one setting, the program developers may have identified other settings in which the program could be implemented without interfering with its efficacy. While this information would not come up during your initial database search, you would be able to acquire it by reading through the program implementation reports.


Now it’s your turn to practice using the database to search for EBPs based on specific criteria. Consider the following examples:

Scenario 1: A community that has already established after-school teen pregnancy prevention initiatives now needs a program that can be implemented within its middle schools during the school day. In the HHS Evidence-Based Programs Database, use the search functions to narrow programs by implementation setting to obtain a list of programs that will meet this community’s new needs.

Scenario 2: A community-based organization partners with a school district to provide a teen pregnancy prevention program after school to middle school students. Based on time constraints, they must select a program that has no more than 15 sessions. Use the search functions in the HHS Evidence-Based Programs Database to identify programs that satisfy the implementation setting, age, and intervention length criteria.

Scenario 3: A school board authorizes the implementation of a teen pregnancy prevention initiative in its middle schools, but specifies that any program implemented must be abstinence-based. Use the HHS Evidence-Based Programs Database to sort by program type to view only abstinence programs.

Step 4: Assess Fit

Go to Section: Assess Fit > Population Fit > Environment Fit > Adaptations > Activities

Assess Fit

Once you have conducted your search and narrowed down your list of program options, it is time to assess the degree to which they fit your target population and the larger environment in which they are to be implemented. This section will provide you with a number of items to consider as you assess the fit of each potential program.

As discussed previously, the implementation reports developed by HHS for each program in the database present a range of information that will help you to assess how well each potential program fits with your population, community, and organization.

Before you read the implementation reports for the programs that you identified in Step 3, you should develop criteria based on the “Population Fit” and “Environment Fit” items presented here. Remember, the criteria you develop should be based on your needs and resource assessment and your logic model. It can be tempting to simply read through each program to see if any seem particularly relevant, but spending the time to develop your criteria will help you select the program best suited to your particular situation.

Population Fit

At this stage in the program selection process, it is important to verify that your program of interest is actually applicable to the population with which you are working. For example, implementing an EBP that was determined to be effective among low-income African American students in urban environments may not yield the same results for tribal youth in rural settings. How does your population compare to that in the study of that EBP? If there are differences, are they likely to compromise your results? (Remember, you can get more detailed information about the adolescents involved in the evaluations by referring to the implementation reports.) Assessing population fit includes consideration of the following:

  1. Age
  2. Race/ethnicity
  3. Sex
  4. Socioeconomic status
  5. Language
  6. Immigration status
  7. Sexual orientation
  8. Culture
  9. Other considerations (e.g., juvenile justice, parenting teens)

Environment Fit

In addition to assessing the extent to which an EBP fits your target population, you must also consider how it fits within the environment in which it is to be implemented, by answering the following questions:


Does this program fit within the organization’s overall mission? Examine the organization’s broader goals (often captured in a mission statement) and consider whether your potential program(s) will help work toward the attainment of these broader goals. If not (or if they are in some way counter to those goals), it may be best to return to your search until you find one that is a better fit.


Are there local laws, policies, or other norms that would be violated by certain components of this program? For example, are there laws prohibiting condom demonstrations in schools? Is it administratively feasible, given the policies and procedures of the implementing organization? Does the program align well with local norms and customs? Are there community cultural considerations you should take into account?


Is this the appropriate setting for this program? If you are planning to implement your program in a school but the program you are considering is designed to be delivered in an after school setting, you should make sure that it is also acceptable for delivery in a school. Consider whether any differences in setting are likely to compromise the intervention’s effectiveness. The program implementation reports will let you know if program developers have identified other potential settings.


Based on your needs and resource assessment, do you have the capacity to implement this program with fidelity? Four key capacity concerns include (1) training, (2) implementation requirements, (3) time, and (4) cost. For example, will you be able to obtain all of the resources required? Will you be able to provide adequate training for your staff? Do you have enough staff members to implement with fidelity, and if not, do you have the resources to hire more? For instance, if a program requires two facilitators and you know that your organization will not be able to hire more staff, you may need to consider a program with fewer staffing needs.

Be honest and realistic in your assessment of fit. Wishful thinking is just that: wishful. An accurate appraisal up front can save you the time and energy of hastily making retroactive modifications that would likely compromise the integrity of the program. To strengthen your odds of success, find a program suited to the population and setting you are actually working with. If the fit is not appropriate, you may have to go back and re-evaluate your program selection.

The following video, developed by the ACT for Youth Center for Excellence, describes some important elements of program implementation and demonstrates how important it is to thoroughly assess both population and environmental fit when selecting a program in order to avoid some of the challenges of implementation. [This video was produced by the ACT for Youth Center for Excellence. HHS is not responsible for the content of this video and does not endorse or recommend any products, processes, services, manufacturers, or companies referenced therein.]


A Note about Fidelity & Adaptations

Implementing EBPs with fidelity increases the likelihood that participants served by programs will experience similar outcomes to those found in the original evaluation study. Implementation with fidelity minimizes the need for adaptations, but does not mean never making adaptations.

Adaptations are changes made to the core components of a program, including its content and delivery. Adaptations are often proposed because the EBP selected for implementation is a poor fit for the needs of the target population, the implementation setting, and/or the capacity of the implementing organization. To reduce the need for adaptations, organizations should focus on selecting EBPs that are a good fit: programs that match the needs of the community and population to be served, the implementation setting, the capacity of the implementing organization, and the targeted outcomes.

Some adaptations are minor (i.e., they do not significantly change the core components) and may be necessary to make the program culturally relevant, current, and/or more engaging. Examples of minor adaptations (often referred to as green light adaptations) include:

  • Adding icebreakers, team-builders, energizers, or reflection activities
  • Adding a session on general reproductive anatomy
  • Providing updated statistics or information about local statistics
  • Providing information about local resources (e.g., teen-friendly health centers)
  • Adding implementation strategies to better engage your youth population (e.g., using more music)
  • Revising materials to ensure LGBTQ inclusivity (e.g., creating gender-neutral language in role plays)
  • Changing minor wording (e.g., the term “group rules” to “group agreement”)

Other adaptations are major and do significantly change the core components of an EBP. Major adaptations could compromise a program’s fidelity and might affect the intended outcomes. They should therefore be avoided whenever possible; if they cannot be avoided, they should be carefully considered and implemented with great caution. Examples of major adaptations (often referred to as yellow light or red light adaptations) include:

  • Omitting a lesson or activity such as a condom demonstration
  • Decreasing the number or length of sessions
  • Increasing student to teacher ratio
  • Shortening or eliminating program videos

Adaptations to extend the program to a new population or setting are unique and could be classified as either minor or major depending on the circumstances. Since it is impossible to evaluate all potential settings and populations for which a program might work, it is to be expected that some implementers will propose extending the program to a population and/or implementing it in a setting in which the program has not been tested. Implementing an EBP with a different population or in a different setting is considered a minor adaptation, as long as the developer has indicated that the EBP is appropriate for the population or setting.

In the event that adaptations may be necessary to proceed with program implementation, it is important to think about them ahead of time, rather than making unplanned adaptations. Resources available to help organizations in planning for adaptations are available in the Adaptation Section of the OAH TPP Resource Center. Program-specific adaptation kits are also available for select programs from ETR Associates.


Now that you are familiar with the process of selecting an evidence-based teen pregnancy prevention program – including how to use a needs and resource assessment and a logic model to guide your selection process – it is time to put your knowledge into action. Read through the following scenarios and see if you can identify how each organization should proceed, based on the information provided.

Population Fit: Other Considerations

An organization is charged with working with a special population, expectant and parenting teens. How should this organization proceed with its program search?

Population Fit: Sex

Question: An organization is interested in implementing the Aban Aya Youth Project program, which has only demonstrated impact on males, but the organization reaches both males and females. How should the organization proceed with its selection process?

Environment Fit: Context

An organization would like to implement Becoming a Responsible Teen (BART) in schools but only has 45-minute class periods instead of the 90-minute periods required for BART. Recognizing that some programs require more time than others (e.g., 45-minute sessions in Making Proud Choices versus 90-minute sessions in Becoming a Responsible Teen), what is the best course of action for this organization?

Environment Fit: Capacity

An organization shows interest in the program, Project AIM, but learns that the program requires two facilitators. To ensure environmental fit, the organization must assess whether they have the resources to implement a program that requires two facilitators. If they find they do not have the resources to provide a second facilitator, what is their best course of action?

After Selection

Go to Section: Evaluation and Monitoring > CQI and Sustainability

Evaluation and Monitoring

Program selection on its own will not lead to the outcomes envisioned in your logic model. It is important to view program selection in context: the most successful teen pregnancy prevention efforts are those that evaluate the implementation of the program as well as its outcomes; use evaluation data to continuously improve their programs; and plan in advance how they will sustain the program past its initial implementation.

Organizations may find it useful to use a program planning framework, like Getting to Outcomes (GTO), Communities that Care, SAMHSA’s Strategic Planning Framework (SPF), or PROSPER to guide their work. While each framework is slightly different (e.g., PROSPER is intended to provide guidance for University partnerships), they generally include the steps that were covered in this module, as well as activities and planning related to the following:

Process/Implementation Evaluation

Process evaluation is the ongoing assessment of the quality of program implementation. Process evaluations examine the “outputs” segment of the logic model and assess, among other things, (1) whether program activities were carried out in the manner prescribed (i.e., with fidelity); (2) levels of participant attendance, satisfaction, and retention; and (3) external circumstances that may have interfered with the quality of implementation. Process evaluation should take place throughout the implementation of the program and is useful both for informing future implementation and for making links between the program and its outcomes. You can check out these resources related to program implementation on the Office of Adolescent Health’s Teen Pregnancy Prevention Resource Center. You can also refer to OAH's Fidelity Monitoring Guidance for more information on monitoring your activities to ensure that you are implementing your selected program with fidelity.

Outcome Evaluation

Outcome evaluation is the process of assessing the success of the program in achieving its desired goals. As mentioned in the section on developing a logic model, outcomes are written to be SMART (specific, measurable, achievable, relevant, and time-bound). By framing your outcomes this way in advance, you are able to specify the indicators you will measure to determine whether or not they have been achieved. Conducting an outcome evaluation can provide you with results that you can share with community stakeholders and funders alike to increase interest in your program and obtain funding to maintain it. You can check out the evaluation resources on the Office of Adolescent Health website.

CQI and Sustainability

Continuous Quality Improvement

Continuous quality improvement (CQI) is the ongoing use of process and outcome evaluation data to strengthen your program over time. By regularly reviewing fidelity monitoring data, attendance and retention figures, participant feedback, and outcome indicators, you can identify what is working well and what needs adjustment, make planned improvements, and then reassess. In this way, the evaluation activities described above feed directly back into program delivery.


Sustainability refers to the plan of action to keep the program in place after its initial implementation. Unfortunately, a common reality we face is that even successful programs may not continue if/when their initial funding runs out. Planning for sustainability from the very beginning of this process will help ensure that you can continue to provide high quality programming for youth regardless of issues like financing. For more information, see OAH’s Built to Last: Planning Programmatic Sustainability tip sheet and Building Sustainable Programs: The Resource Guide.


Go to Section: Review > Resources


Reducing rates of teen pregnancy is no easy undertaking. Applaud yourself for taking the first steps to address this issue within your organization. Even the most effective programs will not make a difference by themselves, which is why forethought and planning are so critical. Taking the time and effort to (1) identify your specific needs and existing resources; (2) develop a strong plan of action with steps that are directly connected to your desired outcomes; (3) filter through program options based on your relevant characteristics; and (4) ensure that your final program selection is applicable to the adolescents you serve, makes sense, and is acceptable within the larger community puts you on the path to success.

The Program Selection Checklist handout will assist you throughout the process of selecting an EBP. Consult it regularly for reminders and guidance from start to finish.


Refer to the following resource list for additional information on any of the topics presented in this module:


What Does it Mean to be "Evidence-Based?"

Identify the Problem(s)

Develop a Logic Model

Identify Potential Programs

Assess Fit

After Program Selection


Go to Section: Quiz Intro > Quiz


You have nearly completed the How to Select an Evidence-Based TPP Program E-Learning Module!

To finish the module, you must correctly answer 8 of the following 10 questions.

Question 1

Which of the following is not a requirement of an EBP?

Question 2

True or false: As long as you implement an EBP, you can be guaranteed to see positive results.

Question 3

Which of these is not a component of a logic model?

Question 4

The component of a logic model that is most important to consider when selecting an EBP is:

Question 5

Which of the following is not one of the four steps to selecting an EBP that were discussed in this module?

Question 6

Which of the following is the best option if a program requires more time than you will have available?

Question 7

Which of the following is not a criterion that you can use to sort programs in the HHS database?

Question 8

Which type of adaptation can you make without affecting the core components of the underlying EBP?

Question 9

Which of the following should not be considered when selecting an evidence-based program?

Question 10

Which of the following is not an important aspect of implementing an evidence-based program?

Content created by Office of Adolescent Health
Content last reviewed on June 30, 2017