Study design
This protocol is intended to determine the global acceptability and applicability of a survey instrument designed to assess sexual health-related practices, behaviours, and outcomes. The complete process consists of several steps, as outlined in Fig. 1.
The steps described in Fig. 1 will run simultaneously in the study sites within a single wave of 6–7 countries. A total of three ‘waves’ are envisioned, with 20 study sites in total. After all sites localize the survey instrument, the first wave will complete Steps 2–3. The revision (Step 3) will use data from all the sites in that wave. If necessary, one or more sites in that wave will begin Step 4 as the second wave begins Steps 2–3. The instrument will be finalized after three waves of countries have completed at least one round of cognitive interviewing (Steps 2–3). Figure 2 further describes this process.
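To make the staggered wave design concrete, a minimal sketch follows (in Python, for illustration only; the exact split of the 20 sites into waves of 7, 7, and 6, and the decision not to trigger the optional Step 4, are assumptions, not protocol requirements):

```python
# Minimal, illustrative sketch (not part of the protocol): how the three waves move
# through the steps. Wave sizes are assumptions consistent with "waves of 6-7 countries"
# and 20 study sites in total.

WAVE_SIZES = {"wave 1": 7, "wave 2": 7, "wave 3": 6}  # sums to 20 sites (assumed split)

def steps_for_wave(step4_triggered: bool) -> list[str]:
    """Steps every site in a wave completes after Step 1 (localization)."""
    steps = [
        "Step 2: first-round cognitive interviewing",
        "Step 3: revise instrument using data from all sites in the wave",
    ]
    if step4_triggered:  # optional; see the triggering conditions under Step 4 below
        steps.append("Step 4: second-round cognitive interviewing with the revised instrument")
    return steps

for wave, n_sites in WAVE_SIZES.items():
    print(f"{wave} ({n_sites} sites): {steps_for_wave(step4_triggered=False)}")
print("Step 5: finalize instrument once all three waves have completed Steps 2-3")
```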
Steps 1–5 are described briefly below (and in more detail in the relevant sections, as indicated).
1. Localize instruments:
Each participating site will translate the English-language core instrument into a local language. [See Data collection method/s section for details]
2. First round cognitive interviewing:
Cognitive interviewing is a qualitative method which enables researchers to investigate participants’ thought processes as they encounter and develop a response to a survey question (see Fig. 3, developed from Tourangeau, 1984 [12]). Through this process, researchers can determine whether questions are being interpreted by participants as intended by survey authors.[13] Similarly, through cognitive interviewing, researchers can identify potential sources of response error when the survey is administered, including complex design, as well as inappropriate or suboptimal wording, response options, and/or question order.[14] When survey instruments are meant to be available in multiple languages, cognitive interviewing also provides an opportunity to test whether problems arise from translation error, from factors influencing interpretation in different contexts, or from question design issues in the source instrument.[15, 16]
Finally, for instruments that are meant to be implemented cross-nationally, cognitive interviewing can help to establish the “cultural portability” of constructs across countries, and therefore the subsequent ability to compare data collected between countries.[14] As described by Willis, these so-called ‘cross-cultural cognitive interviews’ (CCCI)
determine whether the different questionnaire versions illustrate the key property of cross-cultural equivalence; that is, whether the range of interpretations associated with the evaluated items varies acceptably between cultural or language groups, given the survey measurement objectives. [17]
Conventionally, cognitive interviews are conducted using either ‘think aloud’ or ‘probing’ techniques to elucidate participants’ thought processes as they complete a given survey instrument. When using the ‘think aloud’ technique, an interviewer asks participants to verbalize their thoughts as they are formulating a response, with the interviewer intervening very little otherwise. In ‘probing’, the interviewer has a more active role, asking targeted questions and follow-up probes to understand a participant’s interpretation of the question and reason for their selected response.[13] The ‘think aloud’ technique has proven challenging to implement across certain cultural contexts and so verbal probing is more suited to CCCIs,[17] though participants spontaneously sharing their thought processes is never discouraged.
The first round of cognitive interviews will involve 24–32 individuals. [See Data collection method/s section for details]
3. Revise instrument:
The data produced by cognitive interviews may identify sources of error and bias, as well as provide insight into participants’ lives (which may factor into the error or bias in their responses).[18] This information feeds into the revision of the survey instrument.[17] If participants interpret a question as intended, this affirms the question’s validity.[18]
Revisions to the instrument will be based on the findings not only from a single study site but also through a comparison of findings across sites. [See Data analysis section for details]
4. [OPTIONAL] Second round cognitive interviewing:
This second round will be completed with up to 10 individuals using the revised instrument.
Step 4 remains optional, rather than required, as the iterative rounds of survey testing and refinement (described below and in Fig. 2), will ensure that revisions of the instrument continue to be tested in subsequent sites. A second round of cognitive interviews could be triggered in the case of any of the following:
- Step 2 reveals significant translation error, that is, problems with that site’s version of the survey instrument
- Step 2 reveals source survey instrument design issues, for example, Likert-type response options are not understood in that site,[19] so an alternate set of responses needs to be developed and retested in that site as well as in future sites
- Step 2 leaves certain question routes untested or undertested, e.g. those around having multiple sexual partners.
5. Finalize instrument:
Data will be analysed at country-level and then across countries. [See Data analysis section for details]
Study settings
It is important that cognitive testing for this survey instrument is conducted among the instrument’s intended target audience (the general population) across a variety of cultures. Geographically, up to 20 countries will participate, with 2–4 envisioned from WHO’s Americas Region; 4–8 total from WHO’s African Region and Eastern Mediterranean Region; 1–2 from WHO’s European Region; and 4–6 total from WHO’s Western Pacific Region and South-East Asia Region. Countries will be predominantly, but not exclusively, low- and middle-income countries (LMICs) due to the lack of existing data on sexual practices in LMICs as compared with high-income countries. Importantly, the collective set of countries will reflect a range of political and cultural openness regarding sexual activity and sexuality.
The success of cross-cultural cognitive interviews is facilitated when research processes take place simultaneously so that findings can be compared across sites. This provides insight as to whether a given issue is unique to a certain site or shared by people of similar demographics in different countries. Therefore, respecting the different speeds of research processes in different countries, including protocol development and approval, the 20 countries will be bundled into waves of 6–7, as depicted in Fig. 2.
This core, generic protocol will be adapted in each country for the specific sites. In addition to presenting rural/urban, socio-economic, and educational (e.g. average years of schooling) demographics of the country, site-specific protocols will describe the presence of sexual health-related laws and policies. This would include, for example, laws which: criminalize same-sex activity; enable/restrict young people’s ability to engage in sexual activity or access SRHR services; and criminalize intimate partner violence (including within marriage). Additionally, site-specific protocols will present sex-disaggregated statistics (where available) on average age at first sex, average age at first union, as well as contraceptive prevalence. This will provide additional, SRHR outcome-related insight.
It should be noted that many of the above statistics only exist at the national level. Cognitive interviewing, by comparison, will involve relatively few individuals recruited from a specific part of the country. As such, subnational indicators will be used where available.
Study participants and sampling
This study’s target population will be the general population, defined as those aged 15 years and over. Sexuality is an element of the human experience which exists from birth. However, given that this instrument focuses heavily on present/previous sexual activity, the proposed lower age limit of 15 should capture the majority (though not all) of sexually active persons. There is no upper age limit proposed for this protocol, as sexual activity continues throughout the life course.
Each site-specific protocol will include 34–42 interviews with the general population. Sites will conduct 24–32 interviews in the first round of cognitive testing. Within the target population, it is important to ensure heterogeneity in terms of characteristics that may affect the way in which the questions are understood, such as age or sex. Sites should therefore aim to obtain equal numbers of male and female participants across four general population age groups: 15–19, 20–24, 25–59, and 60+ (Table 1).
Table 1
Age/Sex matrix to obtain 24–32 ‘general population’ interviews

|         | 15–19 | 20–24 | 25–59 | 60+ |
| Males   | 3–4   | 3–4   | 3–4   | 3–4 |
| Females | 3–4   | 3–4   | 3–4   | 3–4 |
A minimum of 10 of these 24–32 participants should ideally be recruited from rural communities. Additionally, each site may choose to recruit 4–8 participants from population groups that are either 1) more difficult to reach, or for whom additional, special outreach may be required; or 2) too small to reach reliably through general recruitment measures. Examples of these specific population groups may include:
- Persons living with disabilities
- Lesbian, gay, bisexual, and/or transgender individuals
- Persons who have had more than one sexual partner in the last year
Rural and ‘subset population group’ participants can be distributed across the proposed age/sex matrix, shown in Table 1.
Participants must be literate. While, in some settings, a literacy requirement may limit the populations among which this instrument can be implemented, it is seen as necessary due to the mode of administration of some questions. Topics deemed to be particularly sensitive in this instrument may be self-administered, requiring individuals to read questions and write/enter a response. These self-administration sections are a direct recommendation from the global external experts who participated in the development and refinement of the instrument. They also reflect current practice in similar surveys.[20] Self-administered sections are perceived to minimize discomfort and social desirability bias on the part of respondents who may not feel comfortable verbally engaging with an interviewer on sensitive, sex-related questions, for example, ever having engaged in insertive or receptive anal sex.
Sampling in all sites will be purposive and, in line with what has been observed in several CCCI studies [17, 21], will rely on one or more site-specific recruitment channels, for example: newspaper or online advertisements, including on social media platforms; flyers, or in-person outreach at markets, community and/or health centres, etc. Each site will determine the method most appropriate for reaching their target populations.
In all sites, recruitment materials will include contact information for the study team. When a potential participant contacts the study team, whether in-person or by phone, messaging platform, or email, a study team member will screen the participant for eligibility and set a time and place for an interview.
Each site will determine whether interviews will take place in-person or virtually via videoconferencing software. These decisions will be based on the current status of COVID-19 pandemic restrictions and the general population’s access to electronic and mobile devices. ‘Virtual’ interviews will require video and voice (rather than voice alone), so that interviewers are able to show cue cards, and better gauge body language and responses.
Sites will also have the option to add self-administered web-based surveys: essentially, online surveys completed with no interaction with the research team. In some settings, web probing has been found to generate findings comparable to those of cognitive interviews.[22] However, the success of web probing is context specific (that is, it is best suited to sites where such survey research is common, and access to and comfort with web-based technology is widespread). Therefore, web-based surveys will supplement rather than replace virtual/in-person interviews. These web-based surveys are not included in the study participant numbers; they will be described in site-specific protocols, should a site decide to include them. Additional File 2 provides a COVID-19-inspired overview of approaches to in-person, virtual, and web-based cognitive interviewing and the relative strengths and weaknesses of adopting each of these approaches in a study taking place during a pandemic.
Prior to the start of the interview, the researcher will provide the participant with a consent form and explain each part of the form before obtaining written consent. For individuals under the age of majority, where appropriate and in line with local institutional review board requirements and laws, the participant’s written consent will be obtained with parental/guardian consent waived.[23]1 Participants will be provided with a copy of the consent form as well as a separate page containing a short description of the study, study team contact information, and links to relevant, local SRHR-related online resources and/or services, to take with them.
Following revision of the instrument (Step 3) based on the findings of all sites in that wave, a study site may choose to conduct up to an additional 10 interviews (Step 4). Eligibility criteria and recruitment procedures will remain as described above.
Sample size calculation
The sample size described above is an estimate of the number of interviews required to capture a wide range of reactions to the instrument. A general practice is for cognitive interviews to be conducted in iterative rounds of 5–15 interviews.[13] Factoring in the complexities of comparing findings across countries, the sample size increases. As such, each site protocol will have a total maximum sample size of 34–42. With 20 study sites envisioned, the global sample size will be between 680 and 840, which is in line with similar CCCI studies.[17]
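As an arithmetic check only, the short sketch below reproduces the per-site and global sample size figures from the age/sex matrix in Table 1, the optional second round, and the 20 envisioned sites:

```python
# Illustrative arithmetic check of the sample sizes described above.
cells = 2 * 4                                   # 2 sexes x 4 age groups (Table 1)
round1_min, round1_max = 3 * cells, 4 * cells   # 24-32 first-round interviews per site
optional_round2 = 10                            # up to 10 optional second-round interviews
site_min, site_max = round1_min + optional_round2, round1_max + optional_round2
sites = 20
print(round1_min, round1_max)                   # 24 32
print(site_min, site_max)                       # 34 42 per site
print(site_min * sites, site_max * sites)       # 680 840 global sample size
```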
Data collection method/s
Step 1. Localize Instruments
Each site will first translate the English-language draft survey instrument and the semi-structured cognitive interview guide into the local language, and then back-translate them into English. The individual(s) translating the core instrument will be different from the individual(s) back-translating it into English. The two versions will then be compared, and any discrepancies will be discussed by the two translators, with a third member of the team adjudicating.
Step 2. First round cognitive testing
Trained researchers will conduct cognitive interviews by administering the draft survey instrument to an individual and collecting verbal and nonverbal information about how the individual interprets the question and arrives at a response.[24] Researchers will use a semi-structured interview guide with suggested probes for each question being tested. Probing can take place after each question is answered or after all questions in a given section have been answered.[18] Probes are open-ended, with scope for interviewers to use unscripted elaborative and expansive probes to further explore participants’ understanding of the questions and reactions to them.[18] Probes may be answered verbally, or in writing (with paper provided by the interviewer), based on the participant’s comfort. Additionally, the interviewer can probe on specific questions if they note the participant looking confused, contemplative, uncomfortable or otherwise having a noticeable ‘reaction’ (verbal or physical) to a question.
Probes will explore:
- Comprehension of key terms
- Whether participants are able to recall the information requested and whether they constrained their thinking to the time period described
- Whether answer options are complete and used appropriately
- Whether participants feel that they (and others) could give an honest answer
- Whether participants perceived the questions to be phrased in a sensitive manner.
All cognitive interviews will be conducted in private and audio-recorded, with the participant’s consent. Interviews will be conducted in the presence of only the data collector, where possible. In previous cognitive interview studies, participants have been given a gift voucher or other modest ‘token of appreciation’ as a thank you for their time. [25] Each site will offer something similar, not intended to exceed the relative equivalent of USD 20.
[OPTIONAL] Step 4. Second round cognitive testing
In select sites, a revised instrument may undergo cognitive testing in a second round of interviews. The data collection procedure will be the same as described above.
Data analysis
Within-site data analysis will focus on summarizing the findings from each interview, adopting a pragmatic approach similar to that implemented as part of cognitive testing for Natsal-4.[20] All countries will be provided with an analysis matrix, which captures responses to each test question and corresponding probes for each individual. In this way, data from a given site can be read horizontally, as a complete summary of one participant’s interview, or vertically, capturing all participants’ responses to the same question/probes.
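For illustration only, the matrix could be represented as in the sketch below (e.g. using pandas); the question and probe columns shown are hypothetical placeholders, not actual instrument items:

```python
# Illustrative sketch of a within-site analysis matrix (hypothetical columns and content).
import pandas as pd

matrix = pd.DataFrame(
    {
        "participant_id": ["P001", "P002"],
        "Q1_response": ["Yes", "No"],
        "Q1_probe_summary": ["Understood 'partner' as spouse only", "Included casual partners"],
        "Q2_response": ["2", "0"],
        "Q2_probe_summary": ["Unsure whether to count the full time period", "No recall issues"],
    }
).set_index("participant_id")

print(matrix.loc["P001"])          # read horizontally: one participant's complete interview summary
print(matrix["Q1_probe_summary"])  # read vertically: all participants' responses to the same probe
```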
Each country will send its matrix to WHO/HRP, which will lead the cross-country review of the findings according to the Cross-National Error Source Typology (CNEST), developed as part of a similar multinational survey instrument design process.[16] CNEST categorizes error according to three classifications: 1) poor source question design; 2) translation problems resulting from either a) translator error or b) source question design (e.g. vague quantifiers); and 3) cultural portability (illustrated schematically after the list below). WHO/HRP will review findings across all countries in a ‘wave’ together and make preliminary identifications of questions that need to change in the source survey instrument. These findings will be discussed in a half-day joint analysis meeting (JAM) with the PIs and/or study coordinators of that wave. The JAM will cover:
- Findings which remain unclear or in conflict within or across sites
- Proposed modifications to the survey instrument
- The need for one or more sites in the wave to test the revised survey instrument
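Purely as an illustrative aid (the finding shown is hypothetical and not drawn from any site), the CNEST categories referred to above could be represented as follows:

```python
# Illustrative sketch of the CNEST error categories used in the cross-country review.
from enum import Enum

class CNESTCategory(Enum):
    SOURCE_QUESTION_DESIGN = "poor source question design"
    TRANSLATION_TRANSLATOR_ERROR = "translation problem: translator error"
    TRANSLATION_SOURCE_DESIGN = "translation problem: source question design (e.g. vague quantifiers)"
    CULTURAL_PORTABILITY = "cultural portability"

# Hypothetical example of tagging a cross-site finding with a category.
finding = {
    "question": "Q7",
    "sites_affected": ["site A", "site B"],
    "issue": "response scale interpreted inconsistently",
    "category": CNESTCategory.SOURCE_QUESTION_DESIGN,
}
print(finding["category"].value)
```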
Data management and data access
All study results will be kept confidential by the team in either password-protected files for electronic data, including audio files, or locked cabinets for interview notes on paper. Only approved team members will have access to study results.
Labelling data: A master list will be maintained that includes ID numbers that are uniquely assigned to each participant. Interview notes, audio files, consent forms, and other interview data will be labelled only with these ID numbers.
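For illustration only (the file name and helper function are assumptions, not prescribed by the protocol), ID assignment and the separation of the master list could be sketched as follows:

```python
# Illustrative sketch of study ID assignment; only the master list links IDs to identities.
import csv

def assign_ids(participant_names, master_list_path):
    """Assign sequential study IDs and write the identifying master list to a separate, secured file."""
    ids = {name: f"P{i:03d}" for i, name in enumerate(participant_names, start=1)}
    with open(master_list_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["participant_id", "name"])
        for name, pid in ids.items():
            writer.writerow([pid, name])
    return ids  # interview notes, audio files and consent forms are labelled with these IDs only
```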
Storing paper documents: Master ID lists and informed consent forms (which contain the participant’s identifying information) will be stored in a locked cabinet that is separate from any other study material. Any other hard copy documents that contain study results will be stored in a locked cabinet that is accessible only to key study personnel.
Storing digital data: Digital files (audio, data analysis, interview notes, completed survey instrument) will be stored securely on a password-protected computer and on password-protected cloud storage such as Dropbox. Access to files on cloud storage will only be granted to select research staff who will be participating in the data analysis. The original interview audio recordings will be destroyed after two years, while additional study materials will be destroyed after one additional year.
Each site will be responsible for maintaining the content of each interview: the audio file of the interview, any written notes, and a record of the completed survey instrument. Within-site analyses, as described above, will result in a completed matrix file which is shared with WHO/HRP. WHO/HRP will pool these files, as described above, generating one master matrix that contains data from all sites in a given wave. Site PIs will have access to this file for the purposes of the joint analysis meeting.
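For illustration only, pooling per-site matrix files into a wave-level master matrix could be sketched as follows (the file paths and the added ‘site’ column are assumptions):

```python
# Illustrative sketch of pooling site matrix files into one wave-level master matrix.
import pandas as pd

site_files = {"site_A": "site_A_matrix.csv", "site_B": "site_B_matrix.csv"}  # hypothetical paths

frames = []
for site, path in site_files.items():
    df = pd.read_csv(path)
    df["site"] = site              # record which site each row came from
    frames.append(df)

master_matrix = pd.concat(frames, ignore_index=True)   # one matrix for all sites in the wave
master_matrix.to_csv("wave_master_matrix.csv", index=False)
```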
Ethical considerations
Some specific ethical challenges that this protocol presents are described below, along with details as to how the research partners will address these.
First, the research subject matter (sexual practices and behaviours) is sensitive and may be considered a taboo research subject that could cause participants discomfort. This could be a concern for ethics review committees. In response, all site-specific protocols will make a clear case as to why this kind of research is important locally, and clearly indicate how participant comfort will be maintained (e.g. in addition to obtaining informed consent, repeating to participants that they may stop the interview at any time and are under no obligation to respond if uncomfortable, etc.).
Second, cognitive testing may reveal certain ongoing or past traumatic sex-related experiences on the part of participants. In response, all site-specific protocols will provide participants with information about how to access local counselling and/or support services. Consent forms will specify that interviewers can suggest referrals when they feel that a participant, or someone around them, may be at risk. In the event that reliable services are not available, interviews will not be conducted in that area.
Third, mandatory reporting laws may place researchers in a compromising position where legal reporting obligations conflict with their ethical obligations to put the welfare of the participant first. This could include reporting of activity criminalized in the country, including commercial sex work, or same-sex activity. It could also, however, include age-specific legislation which may require an adolescent participant desiring SRHR services to have the consent of a parent or guardian (a breach of the participant’s confidentiality).
In response, as part of protocol development, each site will identify any mandatory reporting laws. Each site will determine if exemptions from reporting for the purposes of research already exist or can be obtained.[26] If this is possible, site-specific protocols will specify this. In the event that this is NOT the case, research sites will consult with local ethics review committees and civil society and/or advocacy groups for advice on how to balance these obligations, while keeping the welfare and confidentiality of the participant a primary objective.[23, 27] Site-specific protocols will indicate the results of these discussions. In the event that a satisfactory, participant-centred solution is not possible, relevant questions will be dropped from testing in that site.
Finally, in some cases, even where mandatory reporting requirements do not exist/are waived, cognitive testing of some questions may put certain participants (particularly members of already-marginalized LGBTI groups) at risk in settings where specific sexual behaviours are criminalised. In response, as described above, each site will determine, in consultation with local groups and based on review of local laws and policies and social norms, whether asking any questions could have serious adverse consequences for participants. The presence of any relevant laws/policies will be clearly noted in the ‘study setting’ section of the site-specific protocol, and in the event that concerns (described above) cannot be mitigated, any relevant question will not be tested in these sites.
1 This is in line with adolescent sexual and reproductive health research considerations, which indicate that “…For example, for reasons of sensitivity, such as discussions about sexual activities, substance abuse, sexual abuse, physical abuse or neglect – it may be desirable and ethically justifiable for minors (especially minors aged 16 years and older) to choose independently (without parental assistance) whether to participate in research. In this regard, minors may be unwilling to participate in the proposed research if they are required to tell their parents or guardians about the nature of the research.”[22]