Request for Proposals: Independent Evaluation of Virtual Exchange Programs

Issue Date: June 26, 2024

Application Deadline: August 16, 2024

Submit proposals and questions to: StevensInitiative@aspeninstitute.org, ATTN: Andie Shafer, Assistant Director of Grants and Programs

Introduction

The Aspen Institute requests applications for the independent evaluation of the performance and effect of the J. Christopher Stevens Virtual Exchange Initiative (JCSVEI). The application should take the form of a monitoring and evaluation plan covering evaluation and analysis of the Aspen Institute’s approximately 10 JCSVEI grantees that will conduct virtual exchange programs for approximately 10,000 young people in the United States and across the Middle East and North Africa (MENA) during the period from October 2024 through November 2026. The awarding of the independent evaluation agreement is contingent upon receipt of funding from the U.S. government, which may impact the final period of performance of this agreement, and is anticipated to be finalized in the summer of 2024. 

The J. Christopher Stevens Virtual Exchange Initiative was launched in 2015 to continue the legacy of U.S. Ambassador to Libya J. Christopher Stevens, who devoted his life to building bridges through open and respectful dialogue and person-to-person diplomacy. He served the majority of his Foreign Service career in the Middle East and North Africa, a region he grew to love, and eventually served as U.S. Ambassador to Libya. The Initiative, conceived and developed in close partnership with Ambassador Stevens’ family, seeks to give a generation of young people the kind of meaningful international experience that shaped Ambassador Stevens as a young man. The Initiative is a U.S. Department of State Bureau of Educational and Cultural Affairs program administered by the Aspen Institute.

The Initiative builds on the pioneering work of the virtual exchange community, which uses technology to connect young people across continents and cultures. Virtual exchange provides opportunities to connect young people from diverse places using everyday technology for collaborative learning and interaction through sustained and facilitated engagement. While virtual exchange can cover any topic or subject matter and can vary substantially in the length of programming, technology or platform(s) used, learning outcomes and activity types, all virtual exchange includes a core component of cross-cultural connection and collaboration. Through virtual exchange, youth have access to a substantive international exchange experience by collaborating and learning with their peers abroad without having to leave their communities. Recognizing the impact that virtual exchange promises, and capitalizing on advances in technology, the aim is to make life-changing, cross-cultural experiences available to young people, and for the experience to be mutually beneficial to participants on all sides of the exchange. Learn more about virtual exchange here.

By summer 2025, the Stevens Initiative will reach nearly 80,000 young people around the world. Learn more about the Initiative, its activities, its funders, and the programs it has supported here.

Scope and Basis for This Evaluation

The evaluation sought by this request for proposals is intended to establish the plan for – and carry out the work of – collecting, analyzing, and reporting data required by the JCSVEI’s Performance Monitoring Plan (PMP), which is a component of the cooperative agreement between the Department of State and the Aspen Institute. The PMP is based on the MODE Framework developed and maintained by the Bureau of Educational and Cultural Affairs. The items that the Aspen Institute seeks to have covered by this evaluation are included in the Abridged JCSVEI Performance Monitoring Plan that is an annex to this request for proposals. That document includes a final section containing custom items that the Aspen Institute seeks the evaluator to include in their plans for designing and carrying out data collection, analysis, and reporting. Some items from the PMP agreed between the Department of State and the Aspen Institute are not included in the abridged PMP and are not expected to be collected, analyzed, or reported on by the evaluator because the Aspen Institute is able to collect and report them itself. 

Proposals should be structured to correspond to the following sections, which explain the competencies sought in the evaluator as well as the areas of work and the anticipated deliverables of the project. The applicant should make clear how they would approach working at this scale and with this range of partners serving young people in disparate contexts.

Section 1: Experience and Capacity in Relevant Contexts (25%)

Experience and Capacity on Projects of this Scale and Breadth (5%): This evaluation will cover virtual exchange programs conducted by approximately 10 grantees during the period from October 2024 through November 2026. These programs are anticipated to serve approximately 10,000 young people, each of whom is expected to complete a post-program survey as outlined in the MODE Framework. Some of the programs are conducted as part of primary school, secondary school, or higher education for-credit courses and others are conducted as extracurricular activities. Most programs include a combination of synchronous and asynchronous participant engagement and communication.

Experience and Capacity at the Primary/Secondary and Higher Education Levels (10%): Five of the programs will serve youth at the primary or secondary education level and six will serve young people at the higher education or young professional level (one program will engage both secondary and post-secondary-aged youth). 

Experience and Capacity in the U.S. and MENA Region (10%): The programs will be conducted across at least ten countries in the Middle East and North Africa and the Palestinian Territories and across geographies in the United States.

The evaluator should describe their experience working on projects of this scale, working in the primary/secondary and higher education contexts, and working in the United States and in the Middle East and North Africa. The applicant should make clear whether they would be hiring or contracting any additional individuals or entities to provide expertise in any of these areas.

Section 2: Data Collection (20%)

Participant Survey Preparation (10%): The evaluator will maintain an administrative account on an online survey management system and will create sub-accounts for the Aspen Institute and for each grantee organization. The evaluator will create a survey template including the items listed in the abridged PMP that are labeled as being collected via a post-program survey. The evaluator will implement survey logic or take other steps to ensure that respondents see the appropriate survey items and instructions. The evaluator should maintain the survey template in English as well as Arabic, retaining an Arabic language expert for this purpose if needed. The evaluator should proactively ensure each grantee is ready to administer the survey to its participants. In rare cases, grantees may request to administer surveys on separate platforms, in which case the evaluator will provide that grantee with a template document and the grantee will submit a draft survey for approval to ensure it complies with Aspen Institute and Department of State requirements. The Aspen Institute may ask the evaluator to add items to the survey template during the period of the contract, either as a result of the item being introduced by the Department of State or as an additional item that is unique to the JCSVEI. Grantees may also request support from the evaluator to add items to their survey that are unique to their program. The evaluator will add these items to the survey template or to the survey for the specific grantee as needed, consult with the grantee on the additions, and share with the Aspen Institute and Department of State for their respective review and approval.

Grantee-Reported Administrative Data (5%): Some program information – identified as having the data source of “administrative records/data” in the abridged PMP annex – is compiled and shared by grantees rather than collected via surveys. The Aspen Institute currently collects (and anticipates continuing to collect) this data from grantees via an evaluation reporting template that must be submitted after grantees complete each round of programming of their virtual exchange program. For example, a grantee that conducts a fall program that concludes in November is required to submit data disaggregated by participating institution, identifying which U.S. institutions are community colleges or Title I schools and which MENA institutions do not use English as the primary language of instruction. The evaluator is expected to analyze and include this data in its semi-annual reports.

Site Visits (5%): The evaluator will plan and conduct virtual site visits of four grantee programs representing a mix of program characteristics (age level, location, grant type, etc.); the specific programs and timing will be determined later. Site visits should be informed by a framework developed by the evaluator and should include observing virtual exchange activities while they are being conducted, focus groups with youth participants, and interviews with facilitators, program staff, and other relevant stakeholders such as institutional leaders to collect qualitative data from virtual exchange programs. The evaluator should account for the steps that may be required to obtain permission to conduct site visits, particularly when they involve educational institutions, minors, or other settings where advance permission may be required. Within 10 business days following each visit, the evaluator will submit a short report of no more than three pages to the Institute (shared also with the grantee whose program was visited), describing what occurred during the visit, any notable data that was collected, and recommendations for improving the program. If the evaluator encounters or observes any urgent or concerning issues during a site visit, the evaluator must notify the Institute immediately.

Section 3: Capacity Building (20%)

Grantee Training (10%): The evaluator will conduct an onboarding meeting with new grantees during fall 2024, offering two instances of the meeting for subsets of the grantees. Throughout the period of the contract, the evaluator will provide technical assistance to grantees one-on-one and in groups via email and meetings, addressing common and persistent challenges in data collection and reporting methods, with particular emphasis on helping grantees attain high survey response rates. This support and capacity building is intended to help grantees finalize their surveys, prepare their facilitators/partners to circulate the surveys, get reliable responses and high survey response rates, organize and host site visits, and refine their programs based on the lessons learned over the course of the grant.

Survey Technical Assistance (5%): The evaluator will work proactively with grantees to ensure grantees’ survey drafts are complete, up to date, consistent with the approved template (with any additions, subtractions, or changes approved by the Aspen Institute), and functioning properly, well in advance of the survey implementation date, which should be during the final week or session of a virtual exchange program. The Institute will communicate with grantees and share information with the evaluator about when grantees’ programs will begin and end; the evaluator will maintain and update as needed a database with this and other grantee evaluation-related information and share access to the information with the Aspen Institute. The evaluator will monitor survey responses and follow up with grantees to ensure survey administration is occurring appropriately, notifying the Aspen Institute of any significant issues or shortfalls in response rates. The evaluator will analyze survey data, checking and cleaning it consistent with standards in the field, and include it in the reports described below.

Grantee Community of Practice (5%): The evaluator will plan, convene, and facilitate annual videoconference meetings for all grantees to gather as a community of practice to share reflections on their experience conducting virtual exchange programs and evaluating their programs.

Section 4: Data Analysis, Reporting, and Dissemination (20%)

Data Analysis (5%): The proposal should address how the evaluator would meet the data analysis expectations laid out in the attached PMP and ECA reporting templates as well as any additional calculations and analyses the evaluator proposes to effectively and appropriately provide insights into how programs are being conducted and what effect they are having on participants.

Evaluation Reports (10%): The plan should include, for each of the two years of programming covered, a brief evaluation memo submitted in the fall following the spring academic term and a longer report on grantee evaluation submitted the following spring covering all programming from the prior calendar year. The memo should include participant demographic and survey data and does not need to include prose beyond a short introduction and explanations. The report should be no longer than ten pages and should include demographic, survey, and, if applicable, site visit data from the whole prior calendar year (the just-concluded fall term as well as the preceding spring and summer terms). The data should be presented disaggregated by the spring, summer, and fall terms as well as aggregated for the whole calendar year; for example, the first report would show data separately for the spring 2025, summer 2025, and fall 2025 terms, as well as aggregated across the entire grant portfolio for all of those terms. Reports should be written succinctly so they can be easily excerpted for external publication in communications and outreach or adapted for a general audience that is aware of virtual exchange but not expert in evaluation; they should be clear, concise, and readable and do not need to be produced with publication-ready layout or graphic design.

The survey data analysis that is part of each memo/report should be completed in the form of the data tables required for reporting on MODE indicators from the PMP to the Department of State (see templates included in the annex to this request for proposals). The evaluator should plan to add tabs to this template with grantee-specific information for each reporting period that can be shared with each grantee. Additional JCSVEI indicators that are part of the PMP but not part of the required MODE framework should be included in a separate data table submission or in the memo/report narrative.

Dissemination and Impact Sharing (5%): The evaluation plan should be built with public sharing and dissemination of evaluation findings in mind. While specific MODE reporting is required by the Department of State, the evaluation plan and evaluation reports should account for an audience of donors and program stakeholders, practitioners, leaders and administrators at education and exchange organizations, policymakers, and other relevant sectors. The Aspen Institute hopes to use and characterize data – and work closely and collaboratively with the evaluator in doing so – to publicly raise awareness of the impact of virtual exchange in an effort to generate interest and demand for virtual exchange as it endeavors to exponentially scale the practice of virtual exchange globally. The proposal should include the evaluator’s plans for structuring evaluation reports and collaborating with the Aspen Institute on public impact sharing.

Section 5: Risk and Safety (5%)

Risk and Safety (5%): The evaluator should demonstrate familiarity with the challenges specific to the contexts where this project will be carried out. This includes laws and norms surrounding data collection and storage (particularly demographic or personally identifiable information); laws and norms about adult/evaluator interaction, especially with minors; sensitivities about certain topics both in the United States and in the MENA region, especially at the primary/secondary education level; and other topics. Issues in these areas can pose risks to the health and safety of youth participants, facilitators, and others associated with the programs and they can harm institutional partnerships, grantee and other organizations, and the JCSVEI as a whole. The evaluator will be expected to work proactively with the Aspen Institute and grantees to solicit and address any feedback or concerns about survey questions, other data collection methods, or any other aspects of the evaluation.

Section 6: Cost-effectiveness (10%)

Cost-effectiveness (10%): The proposal should present a clear plan for carrying out the work and should correspond to an attached budget that includes all expenditures and an attached budget narrative that describes all expenditures in detail. This section of the proposal should include an overview of the staffing plan, with a description of the role each staff member would have in carrying out the project. All costs must be consistent with any relevant federal guidelines. The assessment of this criterion will consider not only whether the total requested budget and the items within it are competitive, but also whether they are realistically sufficient to successfully carry out the work.

Proposal Parameters and Guidance

Timeline: The proposal should have a period of performance from October 1, 2024, through November 30, 2026. The awarding of the independent evaluation agreement is contingent upon receipt of funding from the U.S. government, which may impact the final period of performance of this agreement, and is anticipated to be finalized in the summer of 2024. This contract may have the opportunity for up to two non-competitive renewals, contingent on receipt of future funding. 

  • October 2024: Inception and contractor onboarding; the evaluator collaborates with the Institute and Department of State to develop a draft evaluation framework.
  • November 2024: The evaluator begins engagement with grantees, refines the evaluation framework, and develops a survey template with grantee and ECA input.
  • January 1, 2025, through November 30, 2026: Conduct the evaluation of grantee virtual exchange programs. 
  • September 15, 2025: First evaluator memo is due (reporting on spring 2025 survey data).
  • March 15, 2026: First evaluator report is due (reporting on the whole preceding calendar year including spring 2025, summer 2025, and fall 2025).
  • September 15, 2026: Second evaluator memo is due (reporting on spring 2026).
  • February 2027*: Second evaluator report is due (reporting on the whole preceding calendar year including spring 2026, summer 2026, and fall 2026).

*Final report due date may change after the award of U.S. Government funds to the Aspen Institute is finalized. 

Proposal Length and Format: The proposal should be no longer than 20 pages, not counting attachments. It should be structured to correspond to the numbered sections above, which explain the competencies sought in the evaluator as well as the areas of work and the anticipated deliverables of the project.

Attachments: In addition to the proposal described above, please provide these attachments:

  1. A timeline, including major areas of activity and all milestones and deliverables;
  2. Budget, including international and domestic in-person site visits;
  3. Budget narrative, providing details and justification for all items in the budget;
  4. CVs for evaluator staff members, including at least one member with expertise in the Middle East and North Africa.

Submission: Submit questions and full proposal packages as a PDF, no later than 5:00 p.m. U.S. Eastern Time on August 16, 2024, to StevensInitiative@aspeninstitute.org. Subject Line: RFP – JCSVEI Evaluation. Attn: Andie Shafer, Assistant Director of Grants and Programs.

Selection Criteria:

  1. Experience and Capacity in Relevant Contexts (25%)
    • Experience and Capacity on Projects of this Scale and Breadth (5%)
    • Experience and Capacity at the Primary/Secondary and Higher Education Levels (10%)
    • Experience and Capacity in the U.S. and MENA Region (10%)
  2. Data Collection (20%)
    • Participant Survey Preparation (10%)
    • Grantee-Reported Administrative Data (5%)
    • Site Visits (5%)
  3. Capacity Building (20%)
    • Grantee Training (10%)
    • Survey Technical Assistance (5%)
    • Grantee Community of Practice (5%)
  4. Data Analysis, Reporting, and Dissemination (20%)
    • Data Analysis (5%)
    • Evaluation Reports (10%)
    • Dissemination and Impact Sharing (5%)
  5. Risk and Safety (5%)
  6. Cost-effectiveness (10%)
