The ESOMAR 28 Questions, answered.
The primary aim of these 28 Questions is to increase transparency and raise awareness of the key issues for researchers to consider when deciding whether an online sampling approach is fit for their purpose. Put another way, the aim is to help researchers to ensure that what they receive meets their expectations. The questions are also designed to introduce consistent terminology for providers to state how they maintain quality, which enables buyers to compare the services of different sample suppliers. Notes on the context of the questions explain why the questions should be asked and which issues researchers should expect to be covered in the answer.
These new questions replace ESOMAR’s 26 Questions to help Research Buyers of Online Samples.
ESOMAR has updated the text to recognize the ongoing development of techniques. While some of the questions remain constant, new questions have been added to incorporate new techniques and new technology in this area. In particular, this revision recognizes the broad trend within the industry to build online samples from multiple sources rather than relying on a single panel. It should be noted that these 28 Questions focus on the questions that need to be asked by those buying online samples. If the sample provider is also hosting the data collection you will need to ask additional questions to ensure that your project is carried out in a way that satisﬁes your quality requirements.
The 28 Questions complement ESOMAR’s Guideline to Online Research which was revised in 2011 to add updated legal and ethical guidance and new sections on privacy notices, cookies, downloadable technology and interactive mobile.
1. What experience does your company have in providing online samples for market research?
Context: This answer might help you to form an opinion about the relevant experience of the sample provider. How long has the sample provider been providing this service and do they have for example a market research, direct marketing or more technological background? Are the samples solely provided for third party research, or does the company also conduct proprietary work using their panels?
Founders of InnovateMR, Matt Dusig and Gregg Lavin, enjoy a successful track record in the market research industry spanning over ﬁfteen years. During this time, Matt & Gregg have developed hundreds of online and mobile research panels as well as launched numerous advances in automated sampling technology, revolutionizing the industry.
- First Sample Publisher Network — the goZing Network in 2001
- First Sample Routing Solution for Publisher Traffic — 2003
- First Self-Serve Platform for Proprietary Panel Purchasing — 2010
- First APIs for Programmatic Sampling — 2011
- First Global Mobile Platform with Geo-Fencing — 2012
Leveraging their vast experience, Dusig & Lavin launched InnovateMR in 2014 and once again, they are taking the industry by storm, disrupting antiquated sampling practices in favor of a new model that truly engages survey participants and generates long-term retention. The InnovateMR panel offers clients a unique and highly engaged community representing a balanced cross-section of online and offline recruitment sources. Studies are custom-fitted based on project specifications and client preferences.
Clients may rely solely on InnovateMR’s proprietary panel of over 1 million respondents and/or extend sample reach using the ﬁrm’s carefully vetted network of partners.
SAMPLE SOURCES AND RECRUITMENT
2. Please describe and explain the type(s) of online sample sources from which you get respondents. Are these databases? Actively managed research panels? Direct marketing lists? Social networks? Web intercept (also known as river) samples?
Context: The description of the types of sources a provider uses for delivering an online sample will provide insight into the quality of the sample.
The InnovateMR panel was built using a blend of diverse sources representing various online and offline publishers. Many of these publishers have formed an exclusive relationship with InnovateMR, which affords clients a truly unique sourcing strategy for online and mobile studies. Examples include large-scale advertising networks which access millions of daily banner impressions, and hard-to-reach specialty websites which cater to key demographic groups such as young males, (un)acculturated Hispanics, mothers, technology decision-makers, and business professionals. The InnovateMR mobile community was built leveraging in-app banner advertising, social networks such as Facebook, and numerous web and SMS databases.
3. If you provide samples from more than one source: How are the different sample sources blended together to ensure validity? How can this be replicated over time to provide reliability? How do you deal with the possibility of duplication of respondents across sources?
Context: The variation in data coming from different sources has been well documented. Overlap between different panel providers can be signiﬁcant in some cases and de-duplication removes this source of error, and frustration for respondents.
InnovateMR’s blending strategy was born from the company’s extensive experience in online sampling and panel management. Upon joining, panel members complete a comprehensive demographic profile, and user participation metrics are carefully tracked to support various sample frame needs, all of which are predicated on client preferences. InnovateMR sample frames are balanced across the requested demographic distributions, source-type, panel activity levels and tenure. Additionally, participation and category exclusions are supported as requested. Duplication across panels as well as within panels is a very real concern in the Market Research industry. With this in mind, InnovateMR utilizes industry-leading digital fingerprinting technology, third-party validation as well as in-community algorithms. Our algorithmic approach is designed to identify uniqueness and measure respondent honesty on a longitudinal basis.
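The de-duplication step described above can be sketched roughly as follows. This is an illustrative Python sketch only, not InnovateMR’s actual technology; the fingerprint attribute set and hashing scheme are assumptions made for the example.

```python
import hashlib

# Illustrative attribute set; a production fingerprint draws on many more signals.
FINGERPRINT_KEYS = ["user_agent", "screen_resolution", "timezone", "os", "browser_language"]

def fingerprint(device: dict) -> str:
    """Hash a stable set of device/browser attributes into a single ID."""
    canonical = "|".join(f"{k}={device.get(k, '')}" for k in FINGERPRINT_KEYS)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def dedupe(respondents: list) -> list:
    """Keep only the first respondent seen for each device fingerprint."""
    seen, unique = set(), []
    for r in respondents:
        fp = fingerprint(r["device"])
        if fp not in seen:
            seen.add(fp)
            unique.append(r)
    return unique
```

Two entries from the same device collapse to one, while a genuinely different device passes through.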
4. Are your sample source(s) used solely for market research? If not, what other purposes are they used for?
Context: Combining respondents from sources set up primarily for different purposes (like direct marketing for example) may cause undesirable survey effects.
The InnovateMR survey panel was designed to engage members by offering a wide variety of survey-focused opportunities. We firmly believe that current industry standards point to a broken model, often leaving the survey respondent frustrated and conditioned to behave dishonestly. High industry disqualification rates with no reward generate negative behaviors which in turn compromise the survey data our industry reports. As such, our panel has been carefully managed to ensure respondents enjoy a balanced, engaging and rewarding experience.
At InnovateMR, we are revolutionizing the user model to ensure survey participants are adequately rewarded and retained. This goal is achieved by carefully managing low-incidence survey opportunities among panelists, extensive proﬁling for targeting relevant opportunities as well as rewarding respondents adequately for the time they invest in survey inventory.
5. How do you source groups that may be hard to reach on the internet?
Context: Ensuring the inclusion of hard-to-reach groups on the internet (like ethnic minority groups, young people, seniors etc.) may increase population coverage and improve the quality of the sample provided.
Drawing on our unique panel recruitment relationships, InnovateMR offers unparalleled access to hard-to-reach segments in our panel. This is demonstrated by the countless clients who rely on our team, technology and panel asset to accomplish survey work that other companies simply cannot satisfy.
Our technology allows us to remain ﬂexible. One such example centers around the vast network of recruitment partners that supply our panel. Our robust systems allow us to easily accommodate the various ﬁnancial arrangements that many recruitment campaigns require.
This ﬂexibility allows us to recruit from a larger cross-section of online, offline and mobile channels, yielding a diverse and more representative audience.
6. If, on a particular project, you need to supplement your sample(s) with sample(s) from other providers, how do you select those partners? Is it your policy to notify a client in advance when using a third party provider?
Context: Many providers work with third parties. This means that the quality of the sample is also dependent on the quality of sample providers that the buyer did not select. Transparency is essential in this situation. Overlap between different providers can be signiﬁcant in some cases and de-duplication removes this source of error, and frustration for respondents. Providers who observe process standards like the ISO standards are required to give you this information.
Quality sample and transparency are of critical importance in the Market Research industry. As veterans, we understand the vital role that consistent methodologies play in the overall success of a research project. As such, we carefully vet each and every partner that participates in our support network.
This vetting process begins with an extensive review of a partner’s recruitment methodology, sampling protocols, deduplication technology and incentive management.
Additionally, we continually evaluate the quality of each partner by testing each source across a number of demographic, attitudinal and behavioral benchmarks. This process helps our team achieve a balanced blended sample outgo and avoid common pitfalls that can be present when partner sample is introduced. Above all else, transparency is a non-negotiable tenet of our business practice. If partner sample is required, this information is provided to clients during the bid phase of a project so that a fully informed decision can be made regarding sample composition.
SAMPLING AND PROJECT MANAGEMENT
7. What steps do you take to achieve a representative sample of the target population?
Context: The sampling processes (i.e. how individuals are selected or allocated from the sample sources) used are the main factor in sample provision. A systematic approach based on market research fundamentals may increase sample quality.
Leveraging our long history of sampling best practices, the InnovateMR project team utilizes predefined census templates which allocate sample outgo based on the required distributions needed for a study, as specified by the client. Additionally, our sample frames are blended so that a representative cross-section of the panel is balanced by variables such as activity level, tenure and source-type.
Maintaining database hygiene is an additional component in this process. We carefully monitor panelist activity and harness advanced algorithms to track members longitudinally throughout their lifetime in the panel. This proactive methodology allows our team to take quick action on members who may be demonstrating behaviors that do not align with our expectations.
8. Do you employ a survey router?
Context: A survey router is a software system that allocates willing respondents to surveys for which they are likely to qualify. Respondents will have been directed to the router for different reasons, perhaps after not qualifying for another survey in which they had been directly invited to participate, or maybe as a result of a general invitation from the router itself. There is no consensus at present about whether and how the use of a router affects the responses that individuals give to survey questions.
When used appropriately, routing systems can generate several beneﬁts for our clients, respondents and our company. Our clients are quickly able to reach low-incidence populations in a cost-effective way, while our members don’t fall victim to routine disqualiﬁcation. The intention of routers is to match qualiﬁed respondents to the most relevant survey opportunity through a battery of pre-screening questions.
The key to an effective routing technology is to avoid practices that could introduce bias to the sample, such as conditioning respondents to falsely qualify or introducing non-response bias. Non-response bias can occur when particular studies are prioritized over others in a queue, thereby causing subsequent studies to be under-represented for key populations that may have been pulled into earlier, higher-priority studies. InnovateMR’s routing technology was developed to avoid these pitfalls by pre-screening respondents across a battery of questions, and then randomly allocating these members among a group of studies rather than a ‘priority project.’
Ensuring there is adequate sample capacity to meet the survey inventory demand is another critical component to responsible routing. When key groups are under-represented in a panel, the risk of overutilization is present; this outcome can be negative for members and for the projects which require a representative sample from these groups. Lastly, we understand that some clients prefer a random sample via email with no pre-targeting introduced. This sampling methodology is also supported at InnovateMR, however we believe our routing methodology mitigates against the concerns noted earlier.
9. If you use a router: Please describe the allocation process within your router. How do you decide which surveys might be considered for a respondent? On what priority basis are respondents allocated to surveys?
Context: Biases of varying severity may arise from the prioritization in choices of surveys to present to respondents and the method of allocation.
As noted earlier, we pre-screen respondents through a battery of questions and randomly allocate prequaliﬁed respondents among a large group of available studies. Additionally, we closely monitor our panel frame and inventory demand to ensure key groups are not over-utilized and subsequently underrepresented.
10. If you use a router: What measures do you take to guard against, or mitigate, any bias arising from employing a router? How do you measure and report any bias?
Context: If Person A is allocated to Survey X on the basis of some characteristic then they may not be allowed to also do Survey Y. The sample for Survey Y is potentially biased by the absence of people like Person A.
Random allocation is critical to avoid bias and as such is central to our methodology. We do not place priority over surveys to ensure equal and fair distribution of sample.
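The random-allocation principle described in the answers above can be sketched as follows. This is an illustrative Python sketch under assumed data structures (the screener format and eligibility logic are inventions for the example, not the production router): every study the respondent pre-qualifies for has an equal chance of selection, rather than a fixed-priority queue draining key populations from later studies.

```python
import random

def eligible_studies(profile: dict, studies: list) -> list:
    """Return every open study whose screener criteria the respondent meets."""
    return [s for s in studies
            if all(profile.get(key) in allowed for key, allowed in s["screeners"].items())]

def allocate(profile: dict, studies: list, rng=random):
    """Randomly pick ONE study from all studies the respondent pre-qualifies for,
    rather than walking a fixed-priority queue (which would starve later studies
    of respondents from key populations)."""
    pool = eligible_studies(profile, studies)
    return rng.choice(pool) if pool else None
```

A respondent who qualifies for nothing is returned to the pool (`None`) instead of being forced into an ill-fitting survey.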
11. If you use a router: Who in your company sets the parameters of the router? Is it a dedicated team or individual project managers?
Context: It may be necessary to try to replicate your project in the future with as many of the parameters as possible set to the same values. How difficult or easy will this be?
Our routing engine is managed by our experienced Operations leadership team. These specialized permissions ensure consistent methods and treatment are employed at all times. Furthermore, InnovateMR maintains a robust project management intelligence system. This system stores all sample study plans so that ongoing longitudinal research will be managed consistently each time a study is ﬁelded. Our robust technology allows us to initiate projects using a replicable sample frame using the ﬁlters noted earlier, i.e., a sample frame balanced by speciﬁc demographic distributions (nested quota design), and outgo balancing by variables such as activity and tenure.
12. What profiling data is held on respondents? How is it done? How does this differ across sample sources? How is it kept up-to-date? If no relevant profiling data is held, how are low incidence projects dealt with?
Context: The usefulness to your project of pre-proﬁled information will depend on the precise question asked and may depend on when it was asked. If real time proﬁling is used, what control do you have over what question is actually asked?
We collect proﬁling data at various stages of the panel lifecycle. At registration, several demographics are collected including name, address, email, age, gender, language, state/province, country, zip/postal code, household income, profession/industry, ethnicity, education, marital status, mobile phone number, mobile device and mobile operating system. As panelists participate, we dynamically proﬁle additional data-points. This approach allows our team to serve up relevant survey opportunities to our members; maximizing retention rates and shortening ﬁeld times for quick-turn studies.
When requested, we can easily accommodate custom question design for pre-screening needs identified by our clients as well as load third party data sources for segmentation purposes. Recognizing that all sample data has a specific shelf life, we dynamically profile our panel on an ongoing basis to help improve requalification rates and maintain accurate detail, especially among more time-sensitive respondent data. All of our data is housed in a central database for efficient sampling and consistent database hygiene standards.
13. Please describe your survey invitation process. What is the proposition that people are offered to take part in individual surveys? What information about the project itself is given in the process? Apart from direct invitations to speciﬁc surveys (or to a router), what other means of invitation to surveys are respondents exposed to? You should note that not all invitations to participate take the form of emails.
Context: The type of proposition (and associated rewards) could inﬂuence the type of people who agree to take part in speciﬁc projects and can therefore inﬂuence sample quality. The level of detail given about the project may also inﬂuence response.
At InnovateMR, we maintain a library of pre-defined HTML invitation templates as well as standard operating procedures related to invitation verbiage (in accordance with CAN-SPAM compliance). Our members are engaged via email invitation as well as one reminder within 24-48 hours after the initial contact. We typically send 3-5 invitations each week in order to keep members fully engaged; however, participants may elect to receive fewer invitations, if desired.
We utilize subject lines that are non-leading in nature to avoid conditioning respondents for false qualification. We maintain an extensive subject line library to ensure our projects only present approved language. All invitations are delivered in the preferred language indicated by our members during the registration process. Within the invitation, members are advised of the survey topic, length of interview, and the incentive offered.
Member relations are an integral element in the panel management process; as such, our members are presented with information related to their current membership status, our customer service helpdesk, and details related to unsubscribing and updating contact information.
Invitations are deployed based on the pre-defined demographic segments needed for a representative sample. Certain groups which may respond at a lower rate are over-sampled to ensure nested quotas are fulfilled properly. Additionally, our members may also participate via our member dashboard which is visible upon log-in to our panel website. Clients may elect to contact our members via email invitation, dashboard or a blend of both approaches. These contact preferences are defined during the bid phase of the project and any deviation from the sample plan must be approved by the client before changes are made.
As it relates to our mobile sample, we can engage respondents using a variety of different methodologies. Such examples include, SMS text message, push notiﬁcation via our mobile app, or mobile web (a shortened URL presented to members of both our online and mobile panel). Utilizing these various approaches affords InnovateMR deep reach among a mobile audience.
14. Please describe the incentives that respondents are offered for taking part in your surveys. How does this differ by sample source, by interview length, by respondent characteristics?
Context: The reward or incentive system may impact on the reasons why people participate in a speciﬁc project and these effects can cause bias to the sample.
Our staff is composed of seasoned industry veterans with experience in panel management, including incentive fulfilment. We maintain a standard incentive matrix, which allows our team to maintain a consistent treatment and offering to our members throughout their lifetime in our panel community.
Incentive amounts are based on the length of interview for a particular opportunity, along with the complexity of the task. We offer consistent incentives within a survey, as well as across our entire survey inventory, to avoid bias. This methodology is essential in panel management and helps to avoid inadvertently generating negative panel learning effects. Our panel earns a virtual currency for both qualifying and non-qualifying activities which may be redeemed at various levels. Reward levels offer a variety of online and mobile gift cards including virtual Visa and Amazon. PayPal and charitable donations are also available for members as a redemption option. Presenting respondents with a wide variety of rewards generates a catered, positive panel experience and combats attrition.
15. What information about a project do you need in order to give an accurate estimate of feasibility using your own resources?
Context: The “size” of any panel or source may not necessarily be an accurate indicator that your specific project can be completed, or completed within your desired time frame.

In order to appropriately price a project, our team must be advised of the following study details:
- Total number of completed interviews
- Incidence of the study population
- Length of interview
- Complexity of task, such as downloads required, or offline or mobile activities
- Geographic target
- Time in ﬁeld
- Category or past participation exclusions
- Speciﬁc quota requirements such as census (i.e., speciﬁc demographic distributions or click balancing)
16. Do you measure respondent satisfaction? Is this information made available to clients?
Context: Respondent satisfaction may be an indicator of willingness to take future surveys. Respondent reactions to your survey from self-reported feedback or from an analysis of suspend points might be very valuable to help understand survey results.
Respondent satisfaction is the central focus of our business. As a result, we continually measure respondent satisfaction through various feedback channels (ongoing satisfaction surveys and our world-class helpdesk operations team).
17. What information do you provide to debrief your client after the project?
Context: One should expect a full sample provider debrief report, including gross sample, start rate, participation rate, drop-out rate, the invitation/contact text, a description of the field work process, and so on. Sample providers should be able to list the standard reports and metrics that they make available.
Project performance KPIs are shared with our clients throughout the life of a project. Prior to launch, our team shares expected performance goals as well as communication details (invitation verbiage, etc.). During the initial soft launch of a project, our project management team provides visibility into a number of metrics such as open rate, click-through rate, drop-out rate, length of interview, incidence and other vital statistics required for proper visibility. Communication is provided on a frequent basis to ensure our clients are advised of project performance. At the close of field, our team provides a detailed summary report of the project’s execution along with any custom reporting as requested.
DATA QUALITY AND VALIDATION
18. Who is responsible for data quality checks? If it is you, do you have in place procedures to reduce or eliminate undesired within survey behaviors, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item non-response (e.g. “Don’t Know”) or (d) speeding (too rapid survey completion)? Please describe these procedures.
Context: The use of such procedures may increase the reliability and validity of the survey data.

InnovateMR has assembled the best and brightest sampling professionals to oversee our quality standards. Our team harnesses the power of automated technology to closely track respondent behaviors and proactively remove members who demonstrate actions that do not align with our standards. Advanced algorithms longitudinally track member performance and automatically deactivate users who exhibit poor quality.
Red herring questions are presented to participants to test engagement and panel honesty. Straight-lining and speedy completions are also monitored and logged inside our system. Additionally, we rely on geo-IP ﬂagging, third-party PII validation as well as digital ﬁngerprinting to prevent duplication.
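Within-survey checks like those described above can be sketched as follows. The flag names and thresholds here are illustrative assumptions for the example, not InnovateMR’s production rules.

```python
def quality_flags(answers: list, seconds: float, median_seconds: float,
                  red_herring_passed: bool) -> set:
    """Return the within-survey quality flags tripped by one respondent.
    Thresholds are illustrative, not production values."""
    flags = set()
    if len(answers) >= 5 and len(set(answers)) == 1:  # identical answers across a grid
        flags.add("straightline")
    if seconds < median_seconds / 3:                  # far faster than the typical respondent
        flags.add("speeder")
    if not red_herring_passed:                        # failed an attention-check question
        flags.add("red_herring")
    return flags
```

Flagged completes can then be logged and reviewed, or the member deactivated when flags accumulate over time.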
19. How often can the same individual be contacted to take part in a survey within a speciﬁed period whether they respond to the contact or not? How does this vary across your sample sources?
Context: Over-solicitation may have an impact on respondent engagement or on self-selection and non-response bias.

Over-utilization can threaten the quality of any panel; as such, we carefully balance our outreach through availability and contact rules programmed into our system. This approach ensures we respect a respondent’s time and deters negative or over-zealous behaviors from surfacing. Our members typically receive anywhere from 3-5 survey invitations each week, with a maximum of one reminder for each project.
Additionally, user preferences are recorded on our member proﬁle page so that we may always maintain an optimal experience for our panel community.
20. How often can the same individual take part in a survey within a speciﬁed period? How does this vary across your sample sources? How do you manage this within categories and/or time periods?
Context: Frequency of survey participation may increase the risk of undesirable conditioning effects or other potential biases.
As noted above, we do not exceed member contact outside of pre-deﬁned contact rules. Additionally, we carefully track category participation and maintain accurate performance metrics. As requested, we can ﬁlter outbound sample based on a speciﬁed category and/or historic participation data. Lastly, we can balance a project sample frame across a number of variables such as activity level, tenure and source. All of these details are outlined during the project kick-off to ensure the proper execution of a study.
21. Do you maintain individual level data such as recent participation history, date of entry, source, etc., on your survey respondents? Are you able to supply your client with a project analysis of such individual level data?
Context: This type of data per respondent, including how the total population is defined and how the sample was selected and drawn, may increase the possibilities for analysis of data quality.

We maintain hundreds of self-reported data points as well as performance metrics on our user base. Upon request, we are more than happy to append these variables to a data set.
22. Do you have a confirmation of respondent identity procedure? Do you have procedures to detect fraudulent respondents? Please describe these procedures as they are implemented at sample source registration and/or at the point of entry to a survey or router. If you offer B2B samples what are the procedures there, if any?
Context: Conﬁrmation of identity can increase quality by decreasing multiple entries, fraudulent panelists etc.
Leveraging a technological approach for quality respondent management is central to InnovateMR’s DNA. For over a decade, our team has been developing cutting-edge solutions that thwart even the most advanced online and mobile fraud. We have implemented over two dozen quality check-points in our registration process alone. This approach employs a scoring methodology that allocates points for various behaviors. In isolation, a point may not be indicative of a suspicious user, however when these points begin to accumulate, it becomes quickly apparent that a more nefarious user is attempting to join our panel. As such, quick action is taken against the prospective panelist by silently deactivating his/her account. As it relates to our B2B sample, we recognize this segment is an especially high demand target for malicious behavior. In response, we conduct an extensive proﬁling exercise which focuses on business-speciﬁc proﬁle questions; layered with red-herring and quality check-point questions. This approach has proven to be very successful in trapping fraudulent respondents who attempt to bypass our system-checks and quality procedures.
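A point-accumulation scoring approach of this kind can be sketched as follows. The signals, weights, and threshold below are illustrative assumptions only; the actual check-points are not disclosed in this document.

```python
# Illustrative signals and point weights; real check-points are proprietary.
SIGNAL_POINTS = {
    "proxy_ip": 3,
    "geo_ip_mismatch": 2,
    "browser_language_mismatch": 2,
    "duplicate_fingerprint": 4,
}
DEACTIVATION_THRESHOLD = 6  # assumed cut-off, for illustration only

def score_registration(observed_signals: set) -> tuple:
    """Accumulate points per suspicious behavior. One signal alone rarely
    crosses the threshold, but several together trigger silent deactivation."""
    score = sum(SIGNAL_POINTS.get(s, 0) for s in observed_signals)
    return score, score >= DEACTIVATION_THRESHOLD
```

A single proxy hit is tolerated, but a proxy combined with a duplicate fingerprint crosses the threshold and the account is quietly deactivated.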
POLICIES AND COMPLIANCE
23. Please describe the ‘opt-in for market research’ processes for all your online sample sources.
Context: The opt-in process indicates the respondents’ relationship with the sample source provider. The market generally makes a distinction between single and double opt-in. Double opt-in refers to the process by which a check is made to confirm that the person joining a panel or database wishes to be a member and understands what to expect in advance of participating in an actual survey for a paying client.
InnovateMR’s registration process strictly adheres to CAN-SPAM and COPPA protocols. As such, all registrants are sent a confirmation email upon joining our panel. Members may only participate once they have confirmed our double-opt-in email and answered additional screening questions designed to test respondent quality.
25. Please describe the measures you take to ensure data protection and data security.
Context: The sample provider usually stores sensitive and confidential information on panelists and clients in databases. These data need to be properly secured and backed up, as does any confidential information provided by the client. The sample provider should be able to provide you with the latest date at which their security has been evaluated by a credible third party.

Data security protocols are tightly maintained by our IT administration across all areas of our business.
Our internal and external systems are maintained on the Amazon Cloud with multi-region redundancy for maximum security and scalability. As it relates to our internal project management and panel database, user permissions are tightly managed using encrypted passwords. All employees and panel members must log-in using an encrypted password within our secure network. All PII and client details are stored within our encrypted database with limited permissions access. Our staff must follow standard operating procedures for handling and transferring sensitive information and authentication must be veriﬁed before access is granted.
As noted earlier in this document, our panelists are monitored on a longitudinal basis, beginning at registration where advanced algorithms and scoring methods are used to evaluate respondent quality.
Digital ﬁngerprinting technology is used to detect the use of proxy servers, and other variables indicative of fraudulent behavior (i.e., inconsistent browser and operating language use, mismatch of geography and IP address etc.). Our quality monitoring system performs dynamic scans to identify suspicious patterns within and across member accounts. Quick action is taken against any panel member that demonstrates behaviors that do not align with our high quality standards.
26. What practices do you follow to decide whether online research should be used to present commercially sensitive client data or materials to survey respondents?
Context: There are no foolproof methods for protecting audio, video, still images or concept descriptions in online surveys. In today’s social media world, clients should be aware that the combination of technology solutions and respondent conﬁdentiality agreements are “speed bumps” that mitigate but cannot guarantee that a client’s stimuli will not be shared or described in social media.
A successful panel registration is predicated on members’ reviewing our privacy terms and actively opting into our engagement agreements. When appropriate, we include additional conﬁdentiality language and capture informed consent in the survey so that panelists are advised of the sensitivity of the information presented. Additionally, participants are strictly advised that potential legal action may be taken if survey conﬁdentiality is breached by the member.
27. Are you certiﬁed to any speciﬁc quality system? If so, which one(s)?
Context: Being certiﬁed may require the supplier to perform tasks in a pre-determined manner and document procedures that should be followed.
InnovateMR founders Matt Dusig and Gregg Lavin have been long-standing members of several industry organizations including ESOMAR, CASRO, MRA, MMRA and both gentlemen have served on several advisory boards helping to shape our industry’s direction.
As it relates to speciﬁc sample validation, InnovateMR can leverage 3rd-party validations such as True Sample, Veratad, Verity and IDology, as requested by our clients.
28. Do you conduct online surveys with children and young people? If so, do you adhere to the standards that ESOMAR provides? What other rules or standards, for example COPPA in the United States, do you comply with?
Context: The ICC/ESOMAR International Code requires special permissions for interviewing children. These are described in ESOMAR Online Research Guideline. In the USA researchers must adhere to the requirements of the Children’s Online Privacy Act (COPPA). Further information on legislation and codes of practice can be found in Section 6 of ESOMAR’s Guideline for Online Research.
We rigorously adhere to all COPPA regulations, as well as comply with ESOMAR and CASRO research standards. In the US, we do not interview children under the age of 13, and follow country-speciﬁc regulations to ensure that we are compliant with regional legislation and privacy laws.
In the event that a client requires survey participation from minors, we recruit these participants through parents; ensuring parental permission is captured. Within our standard operating procedures, we avoid sensitive subject matters for participants under the age of 18.