Implementing Randomised Controlled Trials to evaluate the impact of outreach activity – Lessons learned

Article by Sarah Tazzyman and Lindsey Bowes

The NCOP provides an exciting opportunity to extend the evidence base on the impact of outreach in higher education. The national evaluation team (CFE Research and the BIT), together with four consortia, has run Randomised Controlled Trials (RCTs) to understand the impact of online mentoring, summer schools and text-based nudging on attitudes towards HE and the propensity to apply. The learning identified during the trials could usefully inform the design of future trials, further improving evaluation practice and the quality of the evidence produced.

To date, few RCTs have been carried out on outreach activity. RCTs provide an opportunity to improve the quality of the evidence base and to identify which outreach activities are most effective. Three RCTs have been completed as part of the NCOP national evaluation, and the analysis is currently being undertaken. The first trial, involving the SUN consortium and Brightside, considers the impact of online mentoring on students’ self-reported intentions to go to HE. A light-touch, text-based nudging approach with Year 13 learners has been delivered by the NEACO consortium to test whether it can encourage learners to apply to HE. A collaborative summer school RCT with Year 10 learners has been run by the GHWY, LiNCHigher and FORCE consortia to determine whether summer schools can increase students’ knowledge about HE, their motivation to attend and their belief that this is possible.

The national evaluation team has compiled the learning gained from implementing these trials into a series of recommendations designed to guide other consortia that may be planning to implement RCTs as part of their local evaluations:

  1. Ensure the appropriate skills and capacity are available to design and implement the trial. Evaluation expertise and knowledge/experience of experimental methods are essential, particularly for ensuring a robust trial protocol and data analysis. Seconding or buying in additional research and/or statistical support may be required. It is also important to ensure there are sufficient ‘boots on the ground’ to secure and maintain engagement in the trial, including data collection.
  2. Test the feasibility of utilising experimental evaluation designs. RCTs are resource intensive. A small-scale pilot of the planned methodology helps to identify budget and/or capacity issues as well as any problems that need to be addressed before a large-scale trial is implemented.
  3. Ensure strategic buy-in from all parties involved in the trial. Securing buy-in from delivery partners is essential to the successful implementation of an RCT. This is particularly important for activities that engage a range of providers across a wide area. In addition, it is important to be realistic about the likely appetite for the intervention under investigation. A wide range of outreach is being delivered as part of NCOP and wider institutional WP activities. Some schools/FECs may be at saturation point and may not have the capacity to engage in additional activities, especially under experimental conditions. This further emphasises the importance of undertaking feasibility testing.
  4. Anticipate and address ethical concerns and tensions that may arise as a result of pressure to achieve operational targets. Be mindful of the potential tension between maximising engagement with NCOP target learners and filling spaces on outreach activity on the one hand, and maintaining the integrity and quality of the trial on the other. Outreach staff need to be supported so that they feel confident to prioritise and maintain the design of the trial.
  5. Set realistic timelines. Do not underestimate the time involved in setting up and implementing a trial. Processes such as obtaining Ethics Committee approval, disseminating information about the intervention, securing agreement from schools/FECs to take part, and agreeing and disseminating data collection instruments are often time consuming, and this needs to be accounted for in the evaluation plan. Access to expertise in data protection and sufficient time to develop data security protocols are also essential.
  6. Maximise the response rate to trial outcome measures. Incentives, when used appropriately and within the confines of the trial ethics protocol, help to maximise response rates and ensure the data required to establish the outcomes and impacts of an intervention is obtained.
  7. Establish clear communication channels. Ensure there is a single point of contact at the schools/FECs taking part in the trial. This helps to ensure that momentum is maintained and that accurate information is communicated in a timely manner throughout the trial.
  8. Plan for attrition resulting from drop-out. Sustaining engagement in outreach interventions can be challenging for both the schools/FECs and the learners involved. Consider over-recruiting participants so that enough complete the trial to support the analysis (a simple illustration is given after this list). Having a named single point of contact in schools/FECs and making sure there is regular communication between the partners helps to minimise drop-out.
  9. Adhere to trial protocol. It is essential that the trial protocol is communicated to all parties involved and that adherence to the protocol is monitored throughout. Less experienced evaluators may need help to implement the trial in accordance with the protocol (in addition to support with trial design and analysis).
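
To make the over-recruitment point in recommendation 8 concrete, the sketch below shows one simple way the arithmetic and a basic allocation step could be handled. It is purely illustrative rather than the national evaluation team’s actual procedure: the completer target, expected attrition rate, learner identifiers and 1:1 allocation ratio are all hypothetical, and any real trial should follow its own agreed protocol and randomisation procedure.

```python
import math
import random

def recruitment_target(completers_needed: int, expected_attrition: float) -> int:
    """Number of learners to recruit so that, after the expected level of
    drop-out, roughly `completers_needed` remain at the outcome measurement point."""
    return math.ceil(completers_needed / (1 - expected_attrition))

def randomise(participant_ids: list[str], seed: int = 2019) -> dict[str, str]:
    """Simple 1:1 random allocation of participants to 'treatment' or 'control'.
    A fixed seed keeps the allocation reproducible and auditable."""
    rng = random.Random(seed)
    shuffled = participant_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {pid: ("treatment" if i < half else "control")
            for i, pid in enumerate(shuffled)}

if __name__ == "__main__":
    # Hypothetical figures: 200 completers needed, 25% attrition expected.
    print(recruitment_target(200, 0.25))  # -> recruit 267 learners
    allocation = randomise([f"learner_{n:03d}" for n in range(1, 21)])
    print(allocation)
```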

The national evaluation team is currently collating and reviewing findings from reports produced by consortia on their local evaluation activities over the course of the second year of the programme. Please continue to share your outputs with the team as they become available, and do let us know if you would be happy for us to upload them to the NCOP resource portal so that learning can be shared with other consortia. The team would also welcome documentation related to trial protocols and/or methodological outlines ahead of planned local evaluation activity. Documents can be sent to your CFE NCOP case manager or to NCOP@cfe.org.uk.