
Encountering Your IRB 2.0: What Political Scientists Need to Know

Published online by Cambridge University Press: 20 April 2016

Dvora Yanow, Wageningen University
Peregrine Schwartz-Shea, University of Utah

Abstract

This essay corrects and updates one that was originally published in Qualitative & Multi-Method Research and, in a condensed version, in three other APSA Organized Section newsletters. Our research into IRB policy has shown that many political scientists are not familiar with some of its key provisions. The intent of the essay is to increase awareness of the existing policy’s impact on political scientific research and, in particular, on graduate students and junior faculty. We remain concerned that at present, faculty are leaving discussions of research ethics to IRBs (and their counterparts worldwide), whereas these Boards largely focus on complying with the regulatory details of governmental policy. Even though this essay seeks to clarify the latter, we remain convinced that research ethics ought to be vigorously taken up within disciplinary and departmental conversations.

Type: The Profession
Copyright © American Political Science Association 2016

Authors’ Note for the PS Edition

This essay is a corrected and updated version of one that was originally published in Qualitative & Multi-Method Research (Yanow and Schwartz-Shea 2014). Condensed versions of that essay also appeared in the newsletters of three other APSA Organized Sections (Law and Courts, The Political Methodologist, and Migration and Citizenship). We appreciate QMMR Newsletter editor Robert Adcock’s publishing the original essay. Noting that PS has engaged with Institutional Review Board (IRB) policies and practices both in the past and recently (Pool 1979; Bhattacharya 2014), we thank the editors of PS for picking up the initial essay and adding it to this history of engagement.

Our research into IRB policy emerged out of prior research on qualitative-interpretive methodologies and methods. With its origins in the experimental designs of (bio-)medical and psychological research, IRB policy—since its inception—has been out of touch with the methodological and ethical demands of field research (other than experiments). Our focus of late has shifted to questions concerning the ethical dimensions of political and other social science research conducted in a democracy, especially in today’s increasingly neoliberal-managerialist higher education regimes (Schwartz-Shea and Yanow 2016).

That research has led us to think that a wholesale ethical “reset” is needed for the social sciences, rather than continued tinkering at the margins of systems of research ethics. Such systems do not and, we think, cannot address the fundamental ethical issues that are raised when one considers the role of social scientists in a democracy. Nevertheless, we wrote the original essays to explain key provisions of current IRB policy because our research showed that many political scientists were not familiar with them. We want colleagues to be aware of the existing policy’s impact and of its lesser-known provisions that affect political scientific research.

Since the essays were published, the situation has shifted further, and IRB policy is, at this moment in time, in a state of flux. On September 8, 2015, the Office for Human Research Protections (OHRP), which oversees IRBs, published a Notice of Proposed Rulemaking (NPRM) suggesting changes to the existing policy for the first time since 1981 and inviting comments on those proposed revisions. Several of the requirements discussed in this essay might be revised, including allowing researchers themselves to determine whether their research is “exempt” (see discussion) and eliminating the exemption for research on public officials. The NPRM also introduces a new category, “excluded,” with respect to oral history, journalism, biography, and specific “historical scholarship activities,” which would no longer be subject to review.

We cannot predict what will result from OHRP’s review process. When the agency issued the Advance Notice of Proposed Rulemaking in 2011, more than 1,100 individual and organizational responses were filed by the deadline for comments. OHRP took four years to evaluate those responses and produce the actual proposed revisions in the 2015 NPRM. This time, it received over 2,000 comments. How long it will take to review those and produce the final revision for implementation is an open question. Until then, the concepts and issues we discuss in this essay will continue to hold; but it is clear that the landscape of IRB review of political and other social science research is changing.

Prescript [to the QMMR Newsletter essay]

After we had submitted the initial version of this essay to the APSA Section newsletters in which it appeared, a field experiment concerning voting for judges in California, Montana, and New Hampshire made it even more relevant. Because they came to trial and were widely publicized, events in Montana are clearest: three political scientists—one at Dartmouth, two at Stanford—mailed to potential voters 102,780 flyers that were marked with the state’s seal and contained information about the judges’ ideologies. (Flyers were also mailed to voters in California—143,000—and New Hampshire—66,000.) On May 11, 2015, a Montana court found that although the Dartmouth researcher had received his IRB’s approval for an “exempt” project (defined later in this essay) in New Hampshire, that project is not what was carried out in Montana, and the IRB process “was improperly engaged by the Dartmouth researcher and ignored completely by the Stanford researchers” (McCulloch v. Stanford and Dartmouth 2015, 5). Jeremy Johnson, the political scientist asked by the court to advise on the “vetting of the study,” found that the California study was also not submitted for IRB review at either university (idem, 1; for additional background on the experiment, see Michelson 2014 and Willis 2014). Still, the issue of IRB review does not engage what appear to be lapses in ethical judgment in designing the research. These include using the three states’ seals without permission (Asch 2014) and thereby creating the appearance of an official document. Media coverage also noted the possible conflict of interest between one of the Stanford researchers and the consulting firm he co-founded (e.g., Cowgirl 2014; Murphy 2014; Willis 2014) and that another had previously conducted possibly similarly unethical research (Bartlett 2014).

We find this a stellar example of a point we raise in the essay: the discipline’s lack of attention to research ethics, possibly due to the expectation that IRBs will take over that discussion. In our view, this reliance is misplaced, given that IRBs largely focus on complying with the regulatory details of the federal policy, fostering a thin compliance or checklist ethics rather than a more substantive engagement with issues arising in the actual conduct of political scientific, sociological, and other field research.

Continuing analysis of US Institutional Review Board (IRB) policies and practices concerning the protection of human “subjects” involved in research (Schwartz-Shea and Yanow 2014; Yanow and Schwartz-Shea 2008) shows that many political scientists lack crucial information about these matters. The policy clearly affects dissertating doctoral students—who may be denied their degrees if they conduct research involving human participants without obtaining IRB approval—and junior scholars, whose research and publication progress may be delayed by approval processes, thereby affecting their tenure prospects. Senior scholars who may be tempted to “fly under the radar”—avoiding IRB review for their own research (although they risk being caught later at the publication stage, as journals are increasingly requiring proof of IRB approval)—should nonetheless become familiar with IRB policies because of their impact on advisees and junior colleagues. To facilitate more effective interactions with IRB staff and Boards, we would like to share some insights gained from our research.

GENERAL BACKGROUND

University (and other) IRBs implement the federal policy that is based on the Belmont Report (US Department of Health, Education, and Welfare [HEW] 1979), adopted in 1981 by HEW and accepted in 1991 by multiple federal agencies and therefore known as the Common Rule. Monitored by the Department of Health and Human Services’ (HHS) Office for Human Research Protections (OHRP), Boards themselves usually comprise faculty colleagues (some of whom may be social scientists) plus a community member. Footnote 1 IRB office staff do not necessarily have a background in science (of any sort), and their training is oriented toward the language and evaluative criteria of the federal code. Indeed, administering an IRB has become a professional occupation, with its own training, certification, newsletters, conferences, and so forth. Footnote 2 IRBs review proposals to conduct “research” involving “human subjects” (regulatory terms explained below; “participants” is increasingly the preferred term in interactive field research). The review boards are charged with assessing a project’s potential risks to subjects in relation to its expected benefits, including the importance of the knowledge that might be gained, and with examining whether those risks have been minimized. They also assess the adequacy of researchers’ plans to secure informed consent, protect participants’ privacy, and maintain the confidentiality of collected data.


The federal policy was created to rest on local IRB decision making and implementation, leading to significant variations across campuses in its interpretation. Differences in practices often hinge on whether a university has a single IRB evaluating all forms of research or several IRBs (e.g., different ones for medical, social science, and educational research). Therefore, researchers need to know the IRB at their own institutions. In addition, they need to be familiar with key IRB policy provisions and terminologies. We offer a condensed explication of some of this “IRB-speak,” followed by observations on more general procedural matters. Among the latter we draw particular attention to items relevant to political science field researchers—those conducting interviews, participant-observation/ethnography, and/or surveys, whether domestically or overseas. We do not discuss political science laboratory or field experiments, which, unlike field research, generally fit the experimental research design on which IRB policy is modeled. Footnote 3

IRB-SPEAK: A PRIMER

IRBs review research that involves human participants and that is intended to contribute to knowledge. Part of what makes the review process potentially daunting is its specialized language. Discipline-based understandings of various “ordinary language” terms, such as “research” and “human subject,” often do not match regulatory definitions. While we cannot cover all the terminology, we highlight some aspects that are particularly germane to political science research involving human participants.

“Research”

Most researchers likely consider “research” to mean any systematic investigation of a topic, an understanding that includes everything from laboratory investigations to historical research. IRB regulations, however, tie its meaning to the philosophically contested idea of “generalizable knowledge” (US Code of Federal Regulations [CFR] 2009, 45 §46.102(d)). Some disciplines have used this definition to argue—without consistent success—that their studies should not be subject to IRB review (e.g., historians conducting oral history research; see Schrag 2010a, 154–9). Political scientists contemplating a similar argument concerning their empirical research should consider its ramifications in light of the primacy that many universities place on conducting research. That is, this strategy may imply second-class status for that research and for the discipline—and could even create difficulties in obtaining grants. Additionally, this definition explains why some campuses have chosen to excuse from IRB review course-related research or class exercises used to teach research methods involving human participants, on the grounds that neither contributes to generalizable knowledge.

“Human Subject”

A “human subject” is a “living individual” with whom the researcher interacts or intervenes to obtain data. Although “interaction” is defined as “communication or interpersonal contact between investigator and subject” (45 CFR §46.102(f)), living individuals are also considered “human subjects” if the researcher obtains “identifiable private information” without interaction, such as through the use of existing records. This definition renders some modes of Internet research subject to IRB review, an area in which the OHRP’s own interpretations are particularly unsettled (Schrag 2013), and it has brought increased IRB oversight to research using existing datasets (discussed below).

“Minimal Risk”

Based on their perceptions of the risks entailed, IRB staff members determine a proposed project’s “level of review” (described below). “Minimal risk” research means that “the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life or during the performance of routine physical or psychological examinations or tests” (45 CFR §46.102(i)). As risks encountered in daily life vary across subgroups in American society, not to mention worldwide, IRB judgments have been criticized for their reviewers’ lack of expertise in risk assessment and/or experience in other sociocultural milieus, leading them to misconstrue the risks associated with social scientific research (Schrag 2010a, 162–4; Stark 2012). Researchers need to be explicit, therefore, in explaining potential risks posed by their projects.

“Vulnerable Populations”

Six categories of research participants are identified as “vulnerable to coercion or undue influence” and are therefore subject to safeguards that surpass the basics: “children, prisoners, pregnant women, mentally disabled persons, or economically or educationally disadvantaged persons” (45 CFR §46.111(b)). For example, when proposed research involves prisoners, a prisoner or prisoner representative must serve on the Board reviewing that project. Researchers studying individuals from any of these groups should carefully consider protections when designing their projects. (See the specific code sections relevant to such research: 45 CFR §46.111(b); Subpart C for prisoners; Subpart D for children; and Subpart B for medical research with pregnant women, human fetuses, and neonates. There are no specific subparts for the remaining categories. See Orsini 2014, e.g., on the problematic designation of persons with autism as vulnerable.) Federal policy also enables universities to designate additional populations as “vulnerable” (e.g., staff, students, Native Americans) in keeping with local community values.

Levels of Review: “Exempt,” “Expedited,” and “Convened” (Full Board) Review

The decision concerning which level of review a research proposal requires most commonly lies with IRB staff. Based on the researcher’s description of the proposed study’s objectives, design, and procedures, including anticipated risks, staff members decide whether it is “exempt,” is subject to “expedited” review, or requires a “convened” or full Board review.

The ordinary language understanding of “exempt” has led to considerable confusion, as researchers often think it means they need not submit their research proposals for review. That is not the case. In IRB policy, the word has come to mean exemption from full Board review, with the result that a study’s “exempt” status can be determined only by an IRB assessment. Footnote 4


Proposed research is eligible for either “exempt” or “expedited” review designation only when it entails no greater than “minimal risk” to research participants. Such projects can undergo an expedited review process, assessed by either the IRB chairperson or “one or more experienced reviewers designated by the chairperson from among members of the IRB.” These reviewers may not disapprove the proposal, although they may require changes to its design (45 CFR §46.110(b)). Projects that entail greater than minimal risk require full (“convened”) Board review.

The next subsections take up three of the six categories of “exempt” research which are particularly relevant to political scientists: exempted methods; public officials; and existing data, documents, and records. (The other three categories are educational research, demonstration projects, and taste and food quality studies.) Because of limited space, the nine categories of “expedited” review–eligible research—a few of which are relevant to political science (see www.hhs.gov/ohrp/policy/expedited98.html)—are omitted from this discussion.

Exempt Category: Methods

Survey and interview research and observation of public behavior are exempt from full review if the data so obtained do not identify individuals and would not place them at risk of “criminal or civil liability or be damaging to the subjects’ financial standing, employability, or reputation” if their responses were to be revealed “outside of the research” (45 CFR §46.101(b)(2)(ii)).

One form of research that is central to many types of political science is observing public behavior as political events take place. Such research would normally qualify as “minimal risk” and therefore be eligible for “exempt” status. But normal IRB review is required to establish such status, and this may delay the start of a research project. This delay could inhibit a researcher’s ability to study rapidly unfolding events of political consequence (e.g., the “Occupy” movement). Some IRBs have an “Agreement for Public Ethnographic Studies” that would allow observation to begin almost immediately, perhaps subject to certain stipulations. For example, the researcher would have to self-identify as a university-affiliated researcher in all interactions, and/or no personally identifying information could be collected.

Exempt Category: Public Officials

IRB policy explicitly exempts surveys, interviews, and public observation involving “elected or appointed public officials or candidates for public office” (45 CFR §46.101(b)(3)) without, however, clarifying who, precisely, constitutes an appointed public official. This exemption is not something most IRB members and staff deal with regularly, and researchers may need to bring it to their attention.

The way in which the exemption is written means that researchers who use any of these three methods may conduct research—in complete compliance with the federal code—that might put public officials at risk for “criminal or civil liability” or that could damage their “financial standing, employability, or reputation” (45 CFR §46.101(b)(2)). Footnote 5 The regulatory language is consistent with general legal understandings that public figures bear different burdens than private citizens. Whether political scientists would wish to hold officials’ “feet to the fire” is another matter; but should they so desire, that research is not prohibited by federal IRB policy. (Some IRBs shy away from other research topics perceived as sensitive, such as sex [Irvine 2012] or criminal behavior [Schrag 2010a, 165–6], although there is nothing in the federal policy to prohibit these.)

Exempt Category: Existing Data, Documents, and Records

Federal policy exempts from full review “[r]esearch involving the collection or study of existing data, documents, [or] records…if these sources are publicly available or if the information is recorded by the investigator in such a manner that subjects cannot be identified, directly or through identifiers linked to the subjects” (45 CFR §46.101(b)(4)). However, there appears to be considerable variability across university IRBs in how they treat existing quantitative datasets, such as the Inter-University Consortium for Political and Social Research (ICPSR) collection (see www.icpsr.umich.edu/icpsrweb/ICPSR/irb). Perhaps due to the increasing concern over the confidentiality of medical records (subsequent to the 1996 Health Insurance Portability and Accountability Act legislation), some universities with large teaching hospitals now require all researchers to obtain IRB approval to use any dataset not on a preapproved list (even if that dataset includes a responsible use statement, as ICPSR does). This means that researchers using data from an existing, publicly available, “de-identified” dataset, which ordinarily would be exempt according to the federal code, can be in violation of their campus’s IRB policy if they proceed without IRB review when that dataset is not on its preapproved list. Again, it is IRB policy design, built on the principle of tying policy implementation to community values, that makes possible the local override of federal regulations—that is, mandating locally what federal policy exempts (Yanow and Schwartz-Shea 2008, 485; see also Stark 2012). Footnote 6

“Unchecking the Box”

The box in this phrase appears next to a statement in the Federal-Wide Assurance (FWA) form that universities must file with OHRP. Checking that box registers a university’s intention to apply IRB regulations to all human subjects research conducted by its employees and students, regardless of funding source. Leaving the box unchecked indicates that the university need not include in IRB review any research funded by sources other than HHS, thereby limiting OHRP jurisdiction over such studies (see, e.g., Schrag 2010b, 2010c). In other words, if a university has “unchecked the box” on the FWA, then human subjects research funded from a source other than HHS, or without any funding support, is—technically speaking—not subject to IRB review. However, even in this case, IRB administrators can and often do require proposals for non-HHS–funded (as well as unfunded) research to be submitted for review. (The “Flexibility Coalition,” which promotes more flexible review policies for unfunded research, advises that the first thing a university wishing to create a “Flexibility Policy Framework” needs to do is to uncheck the box; see https://oprs.usc.edu/initiatives/flex.)

PROCEDURAL MATTERS FOR NONEXPERIMENTAL FIELD RESEARCH

IRB policy emerged out of concerns about abusive experimentation (e.g., the Holmesburg prison and the Willowbrook State School medical experiments Footnote 7). But nonexperimental field researchers face particular challenges, due in no small part to the experimental research design model that informed IRB policy at its creation and that continues to be the design most familiar to policy makers and IRB members and staff. In fact, as Schrag (2010a) shows in his detailed policy history, IRB extension to social science research was largely an afterthought, something that Massachusetts Institute of Technology political scientist Ithiel de Sola Pool protested vociferously (e.g., Pool 1979). Although psychologists have played a prominent role in policy development and in local review (see van den Hoonaard 2011 on local review in the Canadian system), social scientists (i.e., in disciplines other than psychology) have had quite a limited role in policy development in the United States. This section addresses those political scientists engaged in nonexperimental field research.

IRBs and Research Design Orientations

The language of the forms and online application sites that have been developed for university (and other) IRB uses reflects the experimental research design of the policy’s history. Some of the standard questions, therefore, are not relevant for nonexperimental field research designs. This can be frustrating for researchers trying to fit such designs into those templates (e.g., asking for the number of participants to be “enrolled” in the study or for “inclusion” and “exclusion” criteria, features of laboratory and field experiments or medical randomized controlled clinical trials [RCTs]). Conforming to language that does not fit the methodology of a proposed project may seem expeditious, but it can lead field researchers to distort the character of their research, and it fails to educate Board members about other methods of inquiry, however much such an effort might feel like an additional burden.


Informed Consent

IRB policy generally requires researchers to inform potential participants—to “consent” them (in IRB-speak, this noun has transmogrified into a verb)—about the scope of both the research and its potential harms. The latter include not only physical or mental injury from the research activity itself, but also, as noted previously, possible harms to their “financial standing, employability, or reputation” were their identity and/or responses (which should be kept confidential) to be publicly revealed (45 CFR §46.101(b)(2)). Potential subjects may also need to be consented about possible identity revelations that could render them subject to criminal or civil prosecution (e.g., the unintentional public revelation of an undocumented worker’s identity). Central to the consent process is the concern that potential participants not be coerced into participating in the research and that they understand that they may stop their involvement at any time.

The federal code lists eight “basic” elements of informed consent, and Board reviewers or staff may insist that a consent form include all eight, resulting in forms that are unnecessarily complex, with elements clearly irrelevant to social science research (e.g., disclosure of “alternative procedures or courses of treatment” [45 CFR §46.116 (a)(4)]). For minimal risk research, the code allows IRBs the flexibility to approve forms without all eight elements “provided the alteration will not adversely affect the rights and welfare of the subjects” (45 CFR §46.116 (d))—something a campus IRB may not know.

In addition, it is not always feasible to consent everyone with whom the researcher interacts in a field site (nor is it always ethically necessary). Although the federal code initially states that “informed consent will be sought from each prospective subject” (45 CFR §46.111(a)(4)), a subsequent section allows a waiver of the consent process when the “research could not practicably be carried out without the waiver or alteration” (45 CFR §46.116(c)(2))—something else not always known or acknowledged (despite its presence in one of the CITI Footnote 8 training modules). Furthermore, a signed consent form may pose threats to its signer (e.g., if files are lost, subpoenaed, or commandeered). The federal code explicitly allows IRBs to waive the requirement that researchers obtain signed consent forms (45 CFR §46.117(c)). Boards may allow verbal consent for minimal risk studies, forgoing a signed form, although they may still require that a written summary or “information sheet” be given or read to potential participants.

When signed forms are not waived, IRBs may make standardized consent forms available, thinking these will assist researchers by simplifying the process. Although using such forms might seem a good way to signal Board members that the proposed research meets their requirements, what is beneficial for obtaining IRB approval may not be well suited to the research needs of a particular field study. Once a study is approved, the researcher is restricted to that consent language unless a request to revise the study design is filed with the IRB. Rather than modifying templates that were likely designed for experiments or RCTs, a better strategy might be to think through the language and formulations most appropriate to one’s intended research participants. This is in keeping with our general caution about adopting IRB terminology unreflectively, as language deriving from experimental and clinical research practices is not always a good fit with nonexperimental field research.

Permissions to Conduct Research Footnote 9

IRBs are also increasingly requiring authorities in a proposed research setting to provide formal permissions to conduct research—“approvals” or “letters of cooperation.” For example, the University of Northern Iowa IRB Manual (ND) states: “A letter of cooperation serves as documentation from the research site that the investigator has permission to conduct the research at that location. The letter typically must be from someone in authority at the organization, not a group counselor or teacher.” In some cases, the permission must be included with the researcher’s initial application; in other cases, the application may be assessed by the IRB without the permission, but it must be submitted later to obtain final approval.

This is one example of current practice reaching beyond the scope of the initial policy document. The Belmont Report focuses on the consent of individual research participants; it does not require that researchers gain gatekeepers’ approval to access research sites where the potential research participants are located (even if this is common practice in participant-observer and ethnographic research, covert research excepted). Such requests for documented access to a community, organization, or other field site at the outset of a field research project are part of what critics call “mission creep” among IRBs (Gunsalus et al. 2006). Requiring such documentation further complicates already fraught processes of negotiating access to research settings, adding a level of formality that could, in some cases, forestall or prevent actual access.

GENERAL PROCEDURAL MATTERS

The IRB review process can pose significant time delays to the start of a research project. We know of IRB reviews’ adding a year or more to a project’s timetable, usually at the front end (i.e., before the research itself can commence). Adding to the potential delay is the requirement at many universities that researchers (and, for graduate students, their supervisors as well) complete some form of training before they may submit their study for review. Such delay has implications not only for field researchers negotiating the start of a project (e.g., arranging “access”), but also for all empirical researchers with grant funds, since these will usually not be disbursed until IRB approval is secured. Researchers should find out the turnaround time for their campus’s IRB, including whether key staff might be on vacation when they intend to submit their proposals.

Three other circumstances can further complicate this situation. First, consider a single researcher whose research setting is an organization with its own IRB, such as a state agency or local hospital. Sometimes, the researcher’s university and the research setting will each require a review.

Second, when a project involves multiple researchers at different universities and/or research settings, the coordination and timing problems multiply. Some IRBs are content to have the lead researcher proceed through her campus IRB, drawing on federal code that explicitly allows a university to “rely upon the review of another qualified IRB…[to avoid] duplication of effort” (45 CFR §46.114). Other Boards insist that all participating investigators clear their own campuses’ IRBs—regardless of the provision for reliance agreements. Considering that Boards on different campuses meet at different frequencies—usually depending on the volume of research proposals requiring review—it can be difficult to achieve coordinated review for a jointly written proposal. Add to that the different Boards’ interpretations of what the code requires, and the result is a classic instance of organizational coordination gone awry. Footnote 10

Third, overseas research, whether solo or with foreign collaborators, is increasing. Federal policy recognizes and makes allowances for international variability in ethics regulation (45 CFR §46.101(h)); however, in practice, some US IRBs assume that all universities, worldwide, share the same institutional concerns and requirements. That is hardly the case. Whereas Canada, the United Kingdom, Australia, and New Zealand have national research oversight policies, most European states and their universities typically do not (yet), despite the European Union’s 2006 call to establish them; matters in the rest of the world vary. Footnote 11 These differences pose problems, potentially, for the US researcher whose IRB requires review by a foreign government, a local board, and/or the nonexistent IRB of a European colleague’s university. These requests for review have become more complicated with the addition of requirements for permissions letters from research settings, a move that does not take into account authoritarian regimes in which providing such documents might endanger the officials who issue them. Footnote 12 The University of Chicago’s IRB provides one fairly moderate example addressing the circumstances in which a researcher might be unable to obtain such permissions (and not only in international settings):

Where there is no equivalent board or group in the locality where you wish to do research, investigators should seek input from local experts or community leaders. If you do not obtain approval by an IRB or the local equivalent of an IRB in the country where you wish to do research, you will need to explain in your protocol submission why you did not do so (University of Chicago, Social and Behavioral Sciences Institutional Review Board, ND).

But not all IRBs are as sensitive to such social and political realities. With the upswing in international research, states and universities around the world are developing their own policies, often motivated by a desire to enable their researchers to meet US university demands to collaborate with international colleagues. Other difficulties lie just beyond the horizon: the incompatibility of European privacy protection laws—which require that all data be destroyed, usually within five years after the end of a research project—with the current US push to archive data, something that touches on IRB concerns.


Additionally, for research overseas, some IRBs are now requiring a letter from an independent expert attesting to the cultural appropriateness of the proposed research and/or the cultural expertise of the researcher. Such “independent experts” may be required to be entirely unaffiliated with the research project, which would preclude a doctoral student’s dissertation supervisor, for instance, from providing “expert knowledge” concerning either criterion.

One matter we have not taken up here concerns the applicability of IRB policy to research-active retired faculty. This issue has been discussed by IRB administrators, with no clear consensus as to whether emeriti still need to obtain IRB clearance.

CONCLUDING THOUGHTS

Knowing the federal regulations can put researchers on more solid footing in pointing to permitted research practices that may not be familiar to their local Boards. And knowing IRB-speak can enable clearer communications between researchers and Board members and staff. Consulting with campus colleagues using similar methods—whether in political science or other disciplines—who have successfully navigated the IRB process may also be helpful, especially as such discussions may alert a researcher to ethical concerns not yet considered, quite aside from issues of IRB compliance (see, e.g., Thomson 2013). Discussing questions about one’s design with IRB staff is also a possibility on some campuses. However, some caution may be called for in the latter case, as administrators’ expertise more commonly lies in regulatory compliance than in the subtleties of research design (or research ethics). Administrative staff may encourage changes to a research design that, from their perspective, ease the assessment of compliance with IRB policy and/or fit the expectations of the local campus Board, yet do not stand the research in good stead. This is not to say that staff do not mean well; rather, their organizational interests may not align with the design and ethical practices accepted within discipline-specific fields of study.

Whether concerning research design, consent practices, or research evaluative criteria, on many campuses political scientists doing fieldwork are faced with educating their IRB members and staff about the ways in which their methods differ from the experimental studies conducted in hospitals and laboratories. Although challenging, educating staff as well as IRB members potentially benefits all such researchers, particularly graduate students, some of whom have given up on field research due to IRB delays, which often are greater for research that does not fit the experimental model (van den Hoonaard 2011). Like Thomson (2013), we encourage colleagues, where possible, to exert the effort to educate their IRBs about the exigencies of political science field research, for the benefit of all, something that Eissenberg et al. (2004) and Lederman (2007) have also argued for in psychology and anthropology, respectively.

IRB review is no guarantee, however, that the ethical issues relevant to a particular research project will be engaged. Consider, for instance, Majic’s experience, in her research on the sex worker industry, that “form-driven consent procedures…invoke[d] a moment of fear for participants” (Majic 2014, 14; emphasis in original)—an ethical concern not raised in IRB review (see Varma 2014 for additional examples). Furthermore, IRB policy, by design, does not touch on ethical issues that might emerge in dissemination, such as when audiences like the Central Intelligence Agency use published work, especially without the author’s knowledge, let alone permission (Fujii 2012; see Salemink 2003, 3–5).

Moreover, significant ethical matters of particular concern to political science research are simply beyond the bounds of IRB policy, given its origins in the doctor–patient relationship (see Stark 2012 on the National Institutes of Health history) and its preoccupation with the ethics of researcher–researched interactions, rather than with broader matters of societal ethics and the role of social scientists in a democracy. One example of this is the lack of recognition of the importance to political science of “studying up” (i.e., studying societal elites and other power holders; Nader 1972) and of the ways in which current IRB policy makes this difficult (cf. Childress, Meslin, and Shapiro 2005).

Covert research practices comprise another area of concern. In IRB policy, the discussion of deception is largely shaped by its use in laboratory psychology experiments, and IRBs generally allow it provided that it is subsequently revealed to subjects in debriefing. Many nonexperimental social science field researchers assume that their IRBs would not be likely to approve researcher deception (e.g., as necessary to field studies of such politically relevant topics as police practices; see Leo 1995; cf. Erikson 1995). However, the most recent policy-related discussions recognize its use in the social sciences and remain open to its consideration. The Flexibility Coalition initiative referenced previously may make IRB review and approval of such studies more likely. Still, one of our primary concerns is that IRB administrative processes are diverting conversations about these and other research ethics topics that might otherwise (and, in our view, should) be part of departmental curricula, research colloquia, and discussions with supervisors and colleagues.

IRB review need not be an adversarial process (as Eissenberg et al. 2004 remark to their colleagues in psychology). Several researchers, in fact, have shared with us their pleased surprise that filling out IRB forms focused their attention on ethical aspects of their research which they had not yet considered. And variations across IRBs include differences in approach, with some being more explicitly inclined to support researchers. However, we would be remiss not to mention that many researchers experience various fears concerning IRB processes—that delay will kill their research projects or the funding for them, or even that Boards might retaliate in some fashion if they resist requests to change their research designs. Given the power of IRBs and the policy’s lack of a formal appeals process, these fears and experiences should not be dismissed lightly. They can lead scholars to self-censor or to advise their students to do so, whether through topic and method selection or in response to Board requests for proposal changes. Unfortunately, such self- or advisee-censorship does not facilitate organizational learning, either within or across IRBs, leaving many IRB members and staff with the impression that the policy works well for social scientists. Censorship of this sort poses a threat not only to field research; it is also potentially detrimental to the future of nonexperimental methods.

Much associational and disciplinary attention has been focused, rightly, on congressional efforts to curtail National Science Foundation funding of political science research. However, because IRB policy affects all research engaging human participants, it deserves equal attention. We urge colleagues to pay attention to IRB policy on their own campuses and, as important, to involve their colleagues and graduate students in engaging the ethical issues raised by political science research.

Footnotes

1. The Office for Human Research Protections website (available at www.hhs.gov/ohrp) includes detailed information on IRB requirements and policies—ed.

2. See, for instance, PRIM&R’s (Public Responsibility in Medicine and Research) Certified IRB Professional program (available at www.primr.org/certification/cip)—ed.

3. At times, the concerns of experimental and nonexperimental field research overlap. We do not have space here to explore this topic, but see Desposato (2013).

4. This understanding derives from OHRP’s predecessor, the Office for Protection from Research Risks, in its Report 95-02 (1995) statement that “investigators should not have the authority to make an independent determination that research involving human subjects is exempt,” adding, “Institutions may elect to review all research under the auspices of the institution even if the research qualifies for exemption under .46.101(b).”

5. This point is difficult to understand because this exemption rests on what precedes it: “The following categories are exempt from this policy: … (3) Research involving [these methods] that is not exempt under paragraph (b)(2) of this section, if: (i) the human subjects are elected or appointed public officials or candidates for office” (45 CFR §46.101(b)(2) and (3)).

6. Along with others, we remain skeptical that IRB decisions are always taken to reflect local community values, rather than to protect universities’ federal funding (see, e.g., Heimer and Petty 2010, 621; Klitzman 2011).

7. On Holmesburg, see Hornblum (1998); on Willowbrook and some of the other abuses, see Peckman (2001).

8. “CITI [Collaborative IRB Training Initiative], a web-based training program in human-subjects protection, is currently used at more than 1,130 institutions and facilities worldwide.” Available at https://oprs.usc.edu/education/citi. Accessed June 16, 2014.

9. This section has been added to this version of the essay to reflect developing trends in the field about which we learned after writing the QMMR and condensed versions.

10. This is one of the policy dimensions that has received extensive criticism, and OHRP seems to address it in the NPRM, calling for multisited research projects to have only a single IRB review the proposal. The provision also shifts responsibility, in the case that something goes wrong, from the various institutions to the reviewing IRB. It is not clear how this proposed change would affect a researcher with a single research site (such as a state agency or local hospital) that has its own IRB—the first example in this paragraph—if both the campus’s and the site’s IRBs were each to insist on its own review.

11. Although oriented toward biomedical experimentation, OHRP’s compilation of international policies and practices may provide useful links—in the “general” information row—to policies concerning nonexperimental social science research (Office for Human Research Protections 2015).

12. The rest of this paragraph and the subsequent one concerning culturally appropriate research have been added to this version of the essay to reflect recent developments in IRB practices.

REFERENCES

Asch, Joseph. 2014. “Moving to Montana?” Dartblog: Dartmouth’s Daily Blog, October 29. Available at www.dartblog.com/data/2014/10/011757.php. Accessed November 10, 2015.
Bartlett, Tom. 2014. “Dartmouth and Stanford Apologize after a Political-Science Experiment Gone Wrong.” Chronicle of Higher Education, October 29. Available at http://chronicle.com/article/DartmouthStanford/149687. Accessed November 6, 2015.
Bhattacharya, Srobana. 2014. “Institutional Review Board and International Field Research in Conflict Zones.” PS: Political Science & Politics 47 (4): 840–4.
Childress, James F., Meslin, Eric M., and Shapiro, Harold T. (eds.). 2005. Belmont Revisited. Washington, DC: Georgetown University Press.
Cowgirl. 2014. “Questions Emerge about Potential Conflict of Interest between Mailergate and Silicon Valley Start-Up.” Montana Cowgirl Blog, October 26. Available at http://mtcowgirl.com/2014/10/26/the-worm-turns-in-mailergate-heres-the-latest. Accessed November 6, 2015.
Desposato, Scott. 2013. Conference on “Ethics in Comparative Politics Experiments.” San Diego: University of California, May 1–2. Available at http://polisci2.ucsd.edu/polisciethics. Accessed September 20, 2014.
Eissenberg, Thomas, Panicker, Sangeeta, Berenbaum, Sheri, Epley, Norma, Fendrich, Michael, Kelso, Rosemary, Penner, Louis, and Simmerling, Mary. 2004. “IRBs and Psychological Science: Ensuring a Collaborative Relationship.” Available at www.apa.org/research/responsible/irbs-psych-science.aspx. Accessed September 18, 2014.
Erikson, Kai. 1995. “Commentary.” The American Sociologist 26 (2): 4–11.
Fujii, Lee Ann. 2012. “Research Ethics 101.” PS: Political Science & Politics 45 (4): 717–23.
Gunsalus, C. K., Bruner, Edward M., Burbules, Nicholas C., Dash, Leon, Finkin, Matthew, Goldberg, Joseph P., Greenough, William T., Miller, Gregory A., and Pratt, Michael G. 2006. “Mission Creep in the IRB World: Editorial.” Science 312 (5779): 1441.
Heimer, Carol A., and Petty, JuLeigh. 2010. “Bureaucratic Ethics: IRBs and the Legal Regulation of Human Subjects Research.” Annual Review of Law and Social Science 6: 601–26.
Hornblum, Allen M. 1998. Acres of Skin: Human Experiments at Holmesburg Prison, A True Story of Abuse and Exploitation in the Name of Medical Science. New York: Routledge.
Irvine, Janice M. 2012. “Can’t Ask, Can’t Tell: How Institutional Review Boards Keep Sex in the Closet.” Contexts 11 (2): 28–33.
Klitzman, Robert. 2011. “The Myth of Community Differences as the Cause of Variations among IRBs.” American Journal of Bioethics Primary Research 2 (2): 24–33.
Lederman, Rena. 2007. “Educate Your IRB: An Experiment in Cross-Disciplinary Communication.” Anthropology News 48 (6): 33–4.
Leo, Richard A. 1995. “Trial and Tribulations: Courts, Ethnography, and the Need for an Evidentiary Privilege for Academic Researchers.” The American Sociologist 26 (1): 113–34.
Majic, Samantha. 2014. “Policy Meets Practice: Qualitative Political Science Fieldwork and the IRB Process.” Presented at the Betty Glad Memorial Symposium, Field Research and US Institutional Review Board Policy, University of Utah, March 20–21.
McCulloch v. Stanford and Dartmouth. 2015. Before the Commissioner of Political Practices of the State of Montana. “Decision Finding Sufficient Facts to Demonstrate a Violation of Montana’s Campaign Practice Laws, No. COPP 2014 CFP-046.” May 11. Available at http://politicalpractices.mt.gov/content/2recentdecisions/McCullochvStanfordandDartmouthFinalDecision. Accessed November 6, 2015.
Michelson, Melissa R. 2014. “Messing with Montana.” The New West, official blog of the Western Political Science Association, October 25. Available at http://thewpsa.wordpress.com/2014/10/25/messing-with-montana-get-out-the-vote-experiment-raises-ethics-questions. Accessed November 3, 2014.
Murphy, Katy. 2014. “Stanford, Dartmouth Sidestep Legal Action against Researchers’ Election Mailers.” San Jose Mercury News, October 29. Available at www.mercurynews.com/education/ci_26824200/stanford-dartmouth-sidestep-legal-action-against-researchers-election. Accessed November 10, 2015.
Nader, Laura. 1972. “Up the Anthropologist: Perspectives Gained from Studying Up.” In Reinventing Anthropology, ed. Hymes, Dell, 284–311. New York: Pantheon.
Office for Human Research Protections, US Department of Health and Human Services. 2015. International Compilation of Human Research Standards, 2015 edition. Available at www.hhs.gov/ohrp/international. Accessed January 2, 2015.
Office for Protection from Research Risks, US Department of Health and Human Services. 1995. “Exempt Research and Research that May Undergo Expedited Review,” Report Number 95-02, May 5. Available at www.hhs.gov/ohrp/policy/hsdc95-02.html. Accessed August 18, 2014.
Orsini, Michael. 2014. “‘May I See Your Color-Coded Badge?’ Reflections on Research with ‘Vulnerable’ Communities.” In Interpretation and Method: Empirical Research Methods and the Interpretive Turn, eds. Yanow, Dvora and Schwartz-Shea, Peregrine, 406–20. Armonk, NY: ME Sharpe.
Peckman, Steven. 2001. “Local Institutional Review Boards.” Ethical and Policy Issues in Research Involving Human Participants, Vol. II: Commissioned Papers and Staff Analysis. Bethesda, MD: National Bioethics Advisory Commission (August), Appendix K. Available at http://onlineethics.org/cms/17212.aspx#t31. Accessed April 17, 2015.
Pool, Ithiel de Sola. 1979. “Protecting Human Subjects of Research: An Analysis of Proposed Amendments to HEW Policy.” PS: Political Science & Politics 12 (4): 452–5.
Salemink, Oscar. 2003. “Ethnography, Anthropology and Colonial Discourse: Introduction.” In The Ethnography of Vietnam’s Central Highlanders, 1–39. Honolulu: University of Hawai’i Press.
Schrag, Zachary M. 2010a. Ethical Imperialism: Institutional Review Boards and the Social Sciences, 1965–2009. Baltimore, MD: Johns Hopkins University Press.
Schrag, Zachary M. 2010b. “More Universities Uncheck Their Boxes.” Institutional Review Blog, August 6. Available at www.institutionalreviewblog.com/2010/08/more-universities-uncheck-their-boxes.html. Accessed June 6, 2014.
Schrag, Zachary M. 2010c. “Twenty-Six Percent of Boxes Go Unchecked.” Institutional Review Blog, March 19. Available at www.institutionalreviewblog.com/2010/03/twenty-six-percent-of-boxes-go.html. Accessed June 6, 2014.
Schrag, Zachary M. 2013. “SACHRP: Exempt Research May ‘Be Subject to IRB Review’.” Institutional Review Blog, June 11. Available at www.institutionalreviewblog.com/2013/06/sachrp-exempt-research-may-be-subject.html. Accessed June 10, 2014.
Schwartz-Shea, Peregrine, and Yanow, Dvora. 2014. Betty Glad Memorial Symposium on “Field Research and US Institutional Review Board Policy.” University of Utah, March 20–21. Available at http://poli-sci.utah.edu/2014-research-symposium.php.
Schwartz-Shea, Peregrine, and Yanow, Dvora (eds.). 2016. Policing Social Science: Regulating Researcher Ethics in a Democracy. Work in progress.
Stark, Laura. 2012. Behind Closed Doors: IRBs and the Making of Ethical Research. Chicago: University of Chicago Press.
Thomson, Susan. 2013. “Academic Integrity and Ethical Responsibilities in Post-Genocide Rwanda: Working with Research Ethics Boards to Prepare for Fieldwork with ‘Human Subjects’.” In Emotional and Ethical Challenges for Field Research in Africa, eds. Thomson, Susan, Ansoms, An, and Murison, Jude, 139–54. London: Palgrave Macmillan.
University of Chicago, Social and Behavioral Sciences Institutional Review Board. ND. Available at https://sbsirb.uchicago.edu/page/international-research. Accessed February 4, 2015.
University of Northern Iowa IRB Manual. ND. Available at www.uni.edu/rsp/irb-manual-irb-review-process-and-considerations. Accessed February 3, 2015.
US Code of Federal Regulations. 2009. Title 45, Public Welfare, Department of Health and Human Services, Part 46, Protection of Human Subjects. Available at www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html.
US Department of Health, Education, and Welfare. 1979. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Washington, DC: Office for Protection from Research Risks Reports.
van den Hoonaard, Will C. 2011. The Seduction of Ethics. Toronto: University of Toronto Press.
Varma, R. 2014. “Questioning Professional Autonomy in Qualitative Inquiry.” IEEE Technology and Society Magazine 33 (4): 57–64.
Willis, Derek. 2014. “Professors’ Research Project Stirs Political Outrage in Montana.” New York Times, October 28. Available at www.nytimes.com/2014/10/29/upshot/professors-research-project-stirs-political-outrage-in-montana.html?ref=us&_r=1&abt=0002&abg=1. Accessed November 3, 2014.
Yanow, Dvora, and Schwartz-Shea, Peregrine. 2008. “Reforming Institutional Review Board Policy.” PS: Political Science & Politics 41 (3): 484–94.
Yanow, Dvora, and Schwartz-Shea, Peregrine. 2014. “Encountering Your IRB: What Political Scientists Need to Know.” Qualitative & Multi-Method Research (Newsletter of the APSA Organized Section) 12 (2): 34–40.