A participant who appeared to be wearing a wig sparked concerns among a group of Boston researchers that fraudulent actors were attempting to infiltrate their clinical trial.
The Boston ARCH Comorbidity Center was recruiting for two eHealth trials focused on ways of reducing chronic pain and unhealthy drinking among patients with HIV. A research assistant logged on to a screening call and had the feeling they’d seen the prospective participant before.
“A research assistant suspected they were conducting a videoconference informed consent for the Pain trial with the same person with whom they had completed a consent and baseline assessment 5 days prior,” the authors explained in Journal of Medical Internet Research.
“The person appeared to be wearing a wig during this second encounter.”
When the participant was asked to provide a prescription or medical record to confirm their HIV diagnosis, the research assistant noticed the image of the paper medical record presented looked identical to the one they had viewed five days ago for a supposedly different patient.
Because of concerns about the participant’s identity, baseline data were not collected and the person did not begin the study.
But the situation sparked a further review of patient recruitment, and the team found other patterns that were out of the ordinary. These included an increase in the number of prospective participants using first names as their full names (for example, ‘John Mark’), email addresses that followed a similar pattern, and people saying they were available for screening calls 24/7.
The researchers eventually identified 10 participants who were disenrolled from the trial, all of whom had connected to screening calls via virtual private networks but whose phone numbers ultimately belonged to carriers based in Nigeria.
“We discovered the attempted infiltration of these studies by what appeared to be an organised group of ineligible individuals from Nigeria, seeking study enrollment (and perhaps repeated participation) to receive financial remuneration,” the authors explained.
Fortunately, the situation was discovered early enough to avoid impacting the studies. The experience led the research team to develop a comprehensive list of behaviours which may signal a participant is not who they say they are online.
An under-appreciated threat
“As compensated research opportunities expand in the online setting, so too do opportunities for fraudulent participation, from participants misrepresenting their eligibility or enrolling multiple times, to automated bots attempting to complete online surveys,” study corresponding author Kara Magane said.
The authors modified their pre-screening processes based on their experience with their HIV studies, and from this developed a checklist that could help other researchers.
Their fraud detection strategies were divided into a pre-screening checklist and checks for initial calls and baseline interviews.
| Screening checklist – what to watch | Description |
| --- | --- |
| Google Voice numbers | Whether the participant used a Google Voice number |
| Mismatched locations | Cases where area codes or postcodes didn’t match purported locations |
| Urgent callbacks | When participants were overly eager to sign up for the study |
| Similar voices | Cases where a participant had a similar voice or accent to others |
| Quick answers | Participants who instantly answered screening questions without pausing |
| Following scripts | Responses to screening questions that resembled those of other participants or appeared scripted |
The researchers watched for similar patterns in names and email addresses (such as the use of a first name as a full name), cases where postcodes didn’t match purported locations, and patterns in availability, such as a prospective participant saying they were available for a screening call at literally any time.
The use of a Google Voice number, which can conceal a caller’s location, or multiple urgent calls from a prospective participant asking to be enrolled in the study were also red flags.
During initial screening calls, research assistants also monitored for those who appeared to be following scripts or racing through answers to clinical questions.
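For teams that handle recruitment data programmatically, checks like these could be encoded as a simple screening script. The sketch below is illustrative only: the record fields, the name list, and the email pattern are assumptions for demonstration, not details drawn from the Boston ARCH protocol.

```python
import re
from dataclasses import dataclass, field

# Hypothetical pre-screening record; the field names are illustrative
# assumptions, not the study's actual data model.
@dataclass
class PrescreenSubmission:
    name: str
    email: str
    phone_area_code: str                                     # e.g. "617"
    plausible_area_codes: set = field(default_factory=set)   # codes consistent with the reported location
    uses_google_voice: bool = False
    available_any_time: bool = False
    urgent_callback_requests: int = 0

# Illustrative list of common first names, used to spot "first name as surname" patterns.
COMMON_FIRST_NAMES = {"john", "mark", "james", "mary", "david", "paul"}

def red_flags(sub: PrescreenSubmission) -> list[str]:
    """Return the pre-screen red flags raised by a single submission."""
    flags = []
    # Full name built entirely from common first names (e.g. "John Mark")
    if all(part.lower() in COMMON_FIRST_NAMES for part in sub.name.split()):
        flags.append("first-names-as-full-name")
    # Email that follows a formulaic pattern such as name + digits
    if re.fullmatch(r"[a-z]+\d{2,}@[\w.]+", sub.email.lower()):
        flags.append("formulaic-email")
    # Phone area code inconsistent with the self-reported location
    if sub.plausible_area_codes and sub.phone_area_code not in sub.plausible_area_codes:
        flags.append("area-code-mismatch")
    if sub.uses_google_voice:
        flags.append("google-voice-number")
    if sub.available_any_time:
        flags.append("available-24-7")
    if sub.urgent_callback_requests >= 2:
        flags.append("repeated-urgent-callbacks")
    return flags

# Example: a submission with several overlapping red flags
sub = PrescreenSubmission(name="John Mark", email="johnmark99@example.com",
                          phone_area_code="201", plausible_area_codes={"617", "857"},
                          uses_google_voice=True, available_any_time=True)
print(red_flags(sub))
```

A score like this would not decide anything on its own; as in the study, flagged submissions would still go to a research assistant for manual review.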
At the baseline interview for the HIV studies, researchers also asked participants to hold up a government-issued photo ID during a video call to establish identity.
“While at times inelegant, the use of the checklist we developed to monitor prescreen submissions and inauthentic attempts to join the research study allowed us to identify presumably fraudulent participants who would have been missed earlier in the study.”
A mix of methods
The fight against fake identities required multiple approaches and tools, the researchers said.
Using a tool such as CAPTCHA could prevent bots from attempting to register for studies, but stopping human-driven activity required additional monitoring.
“Tracking suspicious activity, such as short response times on screening forms or surveys, can be effective,” they said.
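As a rough illustration of that kind of timing check (the two-minute threshold and the record format are assumptions, not figures from the study), implausibly fast completions can be flagged from timestamps that most survey platforms already record:

```python
from datetime import datetime

# Illustrative threshold: completions faster than this get a manual look.
MIN_PLAUSIBLE_SECONDS = 120

def flag_fast_completions(responses):
    """Yield the IDs of responses completed implausibly quickly.

    `responses` is assumed to be an iterable of dicts with 'id',
    'started_at' and 'submitted_at' ISO-8601 timestamps.
    """
    for r in responses:
        started = datetime.fromisoformat(r["started_at"])
        submitted = datetime.fromisoformat(r["submitted_at"])
        if (submitted - started).total_seconds() < MIN_PLAUSIBLE_SECONDS:
            yield r["id"]

sample = [
    {"id": "p01", "started_at": "2024-03-01T10:00:00", "submitted_at": "2024-03-01T10:01:10"},
    {"id": "p02", "started_at": "2024-03-01T10:00:00", "submitted_at": "2024-03-01T10:12:45"},
]
print(list(flag_fast_completions(sample)))  # ['p01']
```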

The use of telehealth to recruit participants opens studies up to fraudulent actors, the authors said.
While there are plenty of automated tools available to prevent fraud, manual processes should also be used depending on the nature of the study, they argued.
“Previous researchers found success through tracking characteristics and patterns of concern by creating a list of indicators that can categorize participation as fraudulent, suspicious, or authentic based on the number or type of indicators in each participant’s responses or data.”
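The quoted approach amounts to tallying indicators per participant. As a sketch only (the cut-offs and labels below are invented for illustration, not taken from the cited work), it might look like this:

```python
def categorise(indicator_count: int, has_decisive_indicator: bool = False) -> str:
    """Bucket a participant as fraudulent, suspicious, or authentic.

    `has_decisive_indicator` marks a single conclusive signal, such as a
    reused medical record; the thresholds here are illustrative assumptions.
    """
    if has_decisive_indicator or indicator_count >= 3:
        return "fraudulent"
    if indicator_count >= 1:
        return "suspicious"
    return "authentic"

print(categorise(2))                               # suspicious -> manual review
print(categorise(1, has_decisive_indicator=True))  # fraudulent
```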
While time-consuming, manually reviewing recruitment information to identify suspicious-looking patterns would allow researchers to adapt to the changing approaches of scammers, the authors noted.
Using a manual pre-screening checklist as well as phone and video screening, and potentially asking for photo ID, was beneficial for picking up anomalies in the HIV studies.
“By reviewing and protocolising prevention methods early and often, researchers can be better prepared to prevent and, if necessary, handle fraudulent participation when it happens,” the investigators said.