Using Semantic Similarity to Assess Adherence and Replicability of Intervention Delivery

Abstract

Researchers are rarely satisfied to learn only whether an intervention works; they also want to understand why and under what circumstances interventions produce their intended effects. These questions have led to increasing calls for implementation research to be included in evaluations. When an intervention protocol is highly standardized and delivered through verbal interactions with participants, a set of natural language processing techniques termed semantic similarity can be used to provide quantitative measures of how closely intervention sessions adhere to a standardized protocol, as well as how consistently the protocol is replicated across sessions. Given the intense methodological, budgetary, and logistical challenges of conducting implementation research, semantic similarity approaches have the benefit of being low-cost, scalable, and context agnostic. In this paper, we demonstrate the application of semantic similarity approaches in an experiment and discuss strengths and limitations, as well as the most appropriate contexts for applying this method.
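As a minimal illustration of the idea (not the paper's actual method, which may rely on richer document representations), the sketch below scores how similar a hypothetical session transcript is to a hypothetical protocol script using bag-of-words cosine similarity: a transcript that closely tracks the protocol scores higher than one that departs from it. All text strings and function names here are invented for the example.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity between two documents' token-count vectors (0 to 1)."""
    counts_a, counts_b = Counter(tokenize(doc_a)), Counter(tokenize(doc_b))
    shared = set(counts_a) & set(counts_b)
    dot = sum(counts_a[t] * counts_b[t] for t in shared)
    norm_a = math.sqrt(sum(v * v for v in counts_a.values()))
    norm_b = math.sqrt(sum(v * v for v in counts_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical protocol script and two hypothetical session transcripts.
protocol = "Welcome to the session. Today we will practice the breathing exercise together."
adherent_session = "Welcome to the session. Today we are going to practice the breathing exercise together."
divergent_session = "Let's just chat about your week and skip the planned activities for now."

adherence_high = cosine_similarity(protocol, adherent_session)
adherence_low = cosine_similarity(protocol, divergent_session)
```

The same pairwise scoring across all sessions yields a distribution of adherence scores, and session-to-session comparisons can proxy for replicability of delivery.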

Publication
Anglin, K. L., & Wong, V. C. (2020). Using Semantic Similarity to Assess Adherence and Replicability of Intervention Delivery. EdPolicyWorks Working Paper Series No. 73. October 2020. https://curry.virginia.edu/sites/default/files/uploads/epw/73_Semantic_Similarity_to_Assess_Adherence_and_Replicability_revised.pdf
Kylie L. Anglin
PhD Candidate

My research develops data science methods for conducting implementation research in field settings, as well as for improving the causal validity and replicability of impact estimates.