Introduction
The COVID-19 pandemic changed the ways in which people communicated: social distancing measures socialized and normalized many people across the world to communicating through online platforms, such as Zoom. These shifts increased the capacity for online qualitative research, allowing researchers to incorporate alternative forms of recruitment and data collection. This is not to say that these changes originated with the pandemic; however, the shift to digital platforms was notably expedited in order to minimize physical contact. For example, physical recruitment posters, often placed in public spaces such as grocery stores or coffee shops, largely disappeared. The transition to online recruitment was eased through the mailing lists of partnering community organizations and through participant referrals, which also saved time and money for researchers and was convenient for many participants. Despite these benefits of online research for both researchers and participants, this paper highlights some of the challenges faced by our research team and offers suggestions to mitigate these challenges in future research.
Since the declaration of the COVID-19 global pandemic in 2020, health regulations and social distancing policies have interrupted in-person research, thereby accelerating the shift to online research to protect the health and wellbeing of both researchers and participants and to ensure the continuation of necessary research activities. Both faculty and students had to adapt their projects quickly, sometimes with little time to fully grasp the uniqueness of online-based research approaches (
Roehl and Harland, 2022). As the third author, herself a graduate student, notes, for some graduate students most of their experience of conducting qualitative research has been based on online encounters. Considering that online-based approaches in qualitative research are becoming more widespread, further discussion is needed about how qualitative researchers can address the potential issues associated with these approaches. We therefore discuss the challenges we experienced in ensuring trustworthiness and research validity in the face of imposter participants. By “imposter participant,” we refer to “dishonest, fraudulent, fake, or false participants in qualitative research because they completely fake their identities or exaggerate their experiences in order to participate in qualitative studies” (
Roehl and Harland, 2022: 2470). In this paper, we share some reflections about the challenges and barriers in conducting online-based recruitment and data collection in a qualitative research project. We draw on our recent research experiences recruiting women labeled/with intellectual disabilities to learn about their views on feminism. As a qualitative study, it involved conducting in-depth interviews to gain a deeper understanding of their experiences and viewpoints. However, the recruitment and interviewing process was marked by numerous challenges, reflecting the unique nature of our participant group and the distinctive online environment in which we operated.
This paper weaves together storytelling, reflection, and team dialogue, all of which are key elements of our research process, validation, and analysis. As most researchers would attest, the research process often does not go as planned; flexibility is therefore key to a successful research design. We draw on feminist methodologies, which prioritize our participants’ voices and experiences, build connections through reciprocal exchanges throughout the interview process, and validate individual experiences by documenting embodied, lived experiences that acknowledge the participant as the “knower.” This paper is inspired by a question that the research team faced during the research process: how do we ethically navigate imposter participants in online research? We explore this question by situating our experience within the existing literature, describing our interactions with imposter participants, and closing with practical suggestions that we have integrated into our own research practices moving forward, which we hope can also serve as a set of suggestions for others.
Considerations and challenges with online research
With the increasing use of online-based approaches in qualitative research, the presence of people pretending to meet eligibility criteria for study participation is on the rise (
Ridge et al., 2023;
Roehl and Harland, 2022;
Woolfall, 2023). Whereas quantitative researchers have long discussed concern about how “research bots” can negatively affect surveys (
Griffin et al., 2022;
Pozzar et al., 2020), little is understood about the phenomenon of imposter participants in a qualitative research context (
Ridge et al., 2023;
Roehl and Harland, 2022). As
Ridge et al. (2023) appropriately note, “fraudulent participation in qualitative research is an apparently newer and potentially more complex endeavor than carrying out online survey response scams” (p. 942). This presents challenges for qualitative researchers in finding ways to identify imposter participants and to restrict them from entering research spaces, where they can have a detrimental effect on data collection. Scholars have drawn attention to multiple “red flags” that qualitative researchers need to be aware of when trying to identify and prevent fraudulent participation (e.g. blank subject lines in emails, use of non-personalized messages, unwillingness to turn the camera on during interviews) (
Ridge et al., 2023;
Roehl and Harland, 2022;
Woolfall, 2023).
Quantitative scholars have written about the challenges that bots and “fake” respondents pose to data quality (e.g.
Griffin et al., 2022;
Pozzar et al., 2020). For instance,
Pozzar et al. (2020) share their experience using social media for recruitment and online methods for data collection. Within hours of survey activation, they received a total of 271 survey responses but classified 94.5% of them as “fraudulent” and another 5.5% as “suspicious.” With few recent exceptions (see, e.g.
Ridge et al., 2023;
Roehl and Harland, 2022), the existing literature has focused primarily on online surveys and clinical trials, with less attention paid to the unique challenges and experiences of facing fraudulent participants in qualitative research methods involving interviews.
There is a body of scholarship exploring this seemingly growing phenomenon of persons misrepresenting themselves in web-based research recruitment for monetary rewards (see
Kramer et al., 2014;
Wessling et al., 2017). Recruiting participants through the internet means that eligibility usually relies on participants’ self-reporting. In some projects, eligibility can lead to forms of monetary compensation, which can open possibilities for participants’ misrepresentation. On the one hand, we were concerned that providing an honorarium to participants (CAD $25) could lead to forms of misrepresentation for monetary gain. On the other hand, we also thought it was important to compensate participants for their time, especially considering how people with disabilities in Canada are more likely to be living under the poverty line compared to non-disabled people (
Santinele Martino and Schormans, 2018). Based on the first author's prior experience conducting research with people labelled/with intellectual disabilities, he knew that many potential participants have come to expect an honorarium for their research participation, and thus that not providing compensation could mean having fewer research participants. Therefore, participants were compensated for their time. Alongside being careful with the size of the monetary incentive, we opted to offer incentives in the form of digital gift cards that could only be redeemed within Canada.
Focusing on our experience of conducting a research project with people labelled/with intellectual disabilities, in particular, raises some important questions. Our primary concern has been the reproduction of the so-called “fear” of “the disability con—popular perceptions of fraud and fakery” attributed to disabled people (
Dorfman, 2019: 1051). For instance, in a U.S.-based study, researchers found that at least 60% of disabled Americans reported feeling as though other people questioned their disability (
Dorfman, 2019). In everyday life, disabled people are indeed often asked to “prove” their disability to gain access to accommodation, such as disabled parking spots, academic accommodations, and access to financial supports (
Dorfman, 2019;
Price, 2021). A prior study also found increasing attention in mainstream media has been paid to the topics of disability benefits and fraud, accompanied by an increase in the use of pejorative language to describe those with disabilities (
Briant et al., 2013). With all this in mind, we wanted to avoid giving participants the impression that we were questioning their disability or labeling them as potentially fraudulent from the start. Of course, this proved to be a challenge once we started receiving emails from
actual imposter participants.
Interactions with imposter participants: who can you trust these days?
It is here that we shift our attention to our experience recruiting women labeled/with intellectual disabilities for interviews to learn about their views on feminism. To ensure the comfort and safety of both the researchers and participants during the pandemic, all recruitment and interviews were conducted online. As a recruitment strategy, an open call for participation was shared with community partners and via social media. The open call was shared with over 30 community organizations yet garnered little response in the first few months. Then, one morning, shortly after a community partner expressed his willingness to share the call for participants, the first author began receiving emails from potential participants: 15 separate emails arrived within minutes of each other (see also
Ridge et al., 2023). All the emails had the exact same subject line, “Participate,” and contained very similar, short text (random spacing between words and lack of punctuation have been purposely kept), such as:
“I am willing to participate in this study to share my experience with you.”
“I am available to participate in this study of feminism to share my experience with you.”
“I'm interested in participating in your study.”
“I'm interested in your research study.”
“I'm interested in your study.”
“I'd love to take part in your research study.”
What began as an exciting moment of receiving interest in the research project turned into doubts about which participants were “real.” After we sent follow-up emails to six participants, all respondents replied with signed consent forms within an hour. Among the six, the attached files fell into two clusters with identical file names within each cluster:
(4)6b1901975cf74686aa4b6791a9f50f27cf16f0e5_1667248232097.pdf
(28)130420474fac400b887152817b38856834979831_1667247245805.pdf.
With this in mind, we chose to identify these respondents as ineligible for participation. Nonetheless, we continued to receive emails from them seeking information from the research team about the next steps in the recruitment process. Even these follow-up emails seemed to follow a certain template:
“Hello am yet to hear from you, since the consent form I sent back I have not heard from you.”
“Hello am still expecting to hear from you.”
“Hoping to hear from you.”
After the excitement and frustration over the potential imposter participants, enough time seemed to have passed to move forward without worrying (too much) about the debacle. However, just over 3 weeks later, one of the community organizations that had shared the recruitment information contacted the principal investigator about a potential participant. The contact information was forwarded to the research assistant, who noticed this was one of the flagged imposter participants. Our immediate thoughts were: “Did we almost lose a legitimate participant? Are we excluding eligible participants due to our suspicion and creating barriers for participants who are so often questioned and silenced?” Surely, even this would be too much effort for an imposter. We decided to proceed with caution and went ahead with the interview.
Qualitative research and feminist methodologies focus on the complexities of the lived experience (
Brinkmann and Kvale, 2015;
Pole and Lampard, 2002). Every question in the interview guide was carefully crafted to explore how women labeled/with intellectual disabilities understand and experience feminism in their lives and the social issues that are important to them. Most, if not all, of the questions ask participants to share their personal, lived experiences. Unlike the other interviews, there was no casual conversation or introduction from the participant. Although this seemed odd, it is not unusual for participants to be cautious of researchers; it is through conversation that we are able to break down barriers and engage in meaningful dialogue. However, two points immediately stood out: (a) when asked how she learned about the project, she stated that she had been directly contacted by the principal investigator, and (b) when asked where she was located, her answer indicated a different province than the one where the community organization was based. One benefit of an email chain is that the lineage of communication can be traced, and in this case it could be used to verify information from the participant.
During the interview, the participant was unable to provide concrete examples to illustrate her lived experiences, gave very broad responses, and answered most questions in ways significantly different from previous participants. We understand that “thin” descriptions provided by participants should not automatically mean that we are dealing with an imposter participant (
Ridge et al., 2023;
Roehl and Harland, 2022). At times, the interviewer could hear quick keyboard typing and mouse clicking. When asked if she could turn on her camera, the participant said she felt uncomfortable to do so, which other colleagues have also experienced (see, for example,
Ridge et al., 2023). At one point, we could hear a voice in the background for a couple of seconds before the participant quickly muted herself. The participant even seemed to have a hard time answering very basic eligibility-checking questions, including her geographical location:
Researcher:
Where abouts are you located?
[6 seconds of silence]
Researcher:
And in which city?
[5 seconds of silence]
Participant:
Ontario [a province, not a city].
The participant's hesitancy in answering such basic questions was a significant red flag for us. Due to concerns about trustworthiness, the research team decided not to include this particular interview in the sample and to cancel another previously scheduled interview that also seemed suspicious. The team nonetheless provided the imposter participant with the honorarium promised for her time.
By this point, we had enough evidence to confirm our suspicions of imposter participants, but the next question was: why would people do this? After lengthy discussions, the research team concluded that it was probably about the honorarium. We could not avoid imagining that if we had, for example, 20 imposter interviews, this would amount to $500. In the context of community research, that is potentially $500 diverted from a marginalized population and from the already limited funds available to this type of research. Furthermore, participants should be allowed to stop interviews and focus group participation at any point without penalty and still be eligible to receive the honorarium. However, when interviews are not completed, imposter participants can undermine the validity of the data and limit researchers' opportunity to verify answers.
It eventually reached a point during the recruitment process when the authors simply did not know whom they could trust to be a legitimate participant. This generated confusion and even some level of distrust, which became significant barriers to recruitment and to building rapport with research participants. At one point, given our suspicions, we were unsure whether, and how, we should respond to some of the expressions of interest in research participation. We were concerned with ensuring the quality and validity of our data. The struggle throughout this process was in trying to decipher real from imposter participants, with a particular fear that legitimate participants could be left out or ignored.
The authors spent a lot of time as a research team going back and forth attempting to come up with a solution, sometimes reaching dead ends. The nuances of this challenge can be seen in the following dialogue between the authors of this paper:
Second author: There is something odd about these emails. I don’t know. They all have very short sentences, almost using the exact same wording from the call for participants.
First Author: But I know from previous experience that that's how some people labeled/with intellectual disabilities communicate a lot of times… some people copy and paste sentences to communicate.
Third Author: True. Verification answers? Questionnaires? Are there, like, ways so people can “prove” they are a real person and actually eligible as a participant?
First Author: Could we make it mandatory for participants to have their camera on? But would that even actually address the issue? Like, some people have invisible disabilities.
Second Author: Well, and, I mean, students in our classes have the option to leave their cameras off. Is it fair to require participants to turn cameras on?
Third Author: I am also concerned about the notion of making people “prove” they are legit. That can be problematic.
First Author: Disabled people so often have to “prove” their disability and, yeah, we can’t reproduce that.
[Researchers sigh, take a sip of coffee, and stare at each other in silence for a minute]
The first author went as far as sharing posts on social media asking other qualitative research scholars for suggestions on how to handle such instances; however, to his surprise, no one could offer any. Instead, a couple of colleagues reached out to him privately, sharing that they had faced similar issues but had no answers to this particular problem. We were told, for example, about interview participants who had a dark sky behind them in the video even though it was supposed to be midday in that location. Another colleague shared his concern about a case in which two different participants had shared essentially the same personal story in their interviews. One colleague suggested that we follow strategies from quantitative research, for example, using screening questions to verify the eligibility and legitimacy of potential participants. The first author, a critical disability studies scholar, was hesitant because it is known that many people labeled/with intellectual disabilities have had negative experiences with being tested.