
Will Open Access Get Me Cited? An Analysis of the Efficacy of Open Access Publishing in Political Science

Published online by Cambridge University Press:  31 December 2014

Amy Atchison, Valparaiso University
Jonathan Bull, Valparaiso University

Abstract

The digital revolution has made it easier for political scientists to share and access high-quality research online. However, many articles are stored in proprietary databases that some institutions cannot afford. High-quality, peer-reviewed, top-tier journal articles that have been made open access (OA) (i.e., freely available online) theoretically should be accessed and cited more easily than articles of similar quality that are available only to paying customers. Research into the efficacy of OA publishing thus far has focused mainly on the natural sciences, and the results have been mixed. Because OA has not been as widely adopted in the social sciences, disciplines such as political science have received little attention in the OA research. In this article, we seek to determine the efficacy of OA in political science. Our primary hypothesis is that OA articles will be cited at higher rates than articles that are toll access (TA), which means available only to paying customers. We test this hypothesis by analyzing the mean citation rates of OA and TA articles from eight top-ranked political science journals. We find that OA publication results in a clear citation advantage in political science publishing.

Type: The Profession
Copyright: © American Political Science Association 2015

As academic publishing transitions from print to electronic periodicals, the process by which political scientists share research has become increasingly easy. Considerable high-quality political science research is now available at the click of a button; however, the majority of published articles in the discipline are locked behind paywalls. Subscriptions to journals and scholarly databases are increasingly expensive, putting them out of reach for budget-conscious researchers and institutions. As a result, traditional publishers have developed a number of methods to make their journal articles freely available online; the industry refers to this as open access (OA) publishing. Many proponents of OA argue that this greater accessibility gives OA articles a citation advantage over toll access (TA) articles that are not freely available (Antelman 2004; Davis 2011; Gargouri et al. 2010).

It is important to note that the term “open access” signifies different things to different audiences. For many, “freely available” scholarship is associated with working papers, non-peer-reviewed work, and online journals with lax peer-review standards. In this article, however, we examine articles that were published in top-tier journals and were converted into OA scholarship because they were self-archived on authors’ personal websites and/or in institutional repositories.[1] This is what the industry refers to as “green” OA (Suber 2012).[2] In our study, these articles were made “green” (i.e., available to the general public free of charge) after rigorous peer review.[3] The eight journals included are American Political Science Review, American Journal of Political Science, Public Opinion Quarterly, Journal of Conflict Resolution, Political Analysis, Political Geography, Annual Review of Political Science, and Comparative Political Studies.

Our research question was straightforward: Will making an article OA increase the number of citations it receives? The social sciences have been slow adopters of OA (Calise et al. 2010). In contrast, mathematics, engineering, and the natural sciences were early adopters; therefore, the question has been examined in those disciplines. Results are mixed but generally positive regarding advantageous OA citation effects (OACE) in those disciplines (Doty 2013). However, to the best of our knowledge, political science publications never have been the sole focus of research into the OA citation advantage. This study contributes to the ongoing debate regarding OA efficacy and also serves as a starting point for serious research on OA in political science.

Our primary hypothesis is that OA articles in the discipline will be cited at higher rates than articles that are not freely available online. We test this hypothesis by examining the citation rates of all articles published in the eight journals during a two-year period. We first determine which articles have been made OA and which remain TA; we then compare the mean citation rates of the OA and TA articles to determine whether there is a citation advantage for the OA articles. The fact that some journals have permissive self-archiving policies whereas others are more restrictive adds variation to the data, as does the fact that authors may choose not to take advantage of OA policies and therefore will not self-archive their work. Given these parameters, we further hypothesize that the more permissive self-archiving policies of top publishers will result in a greater citation advantage. This article proceeds as follows. First, we articulate our argument in the context of the extant OA literature. This is followed by a discussion of our data and methodological considerations. Then, we present our results and discuss their implications for OA publishing in the discipline. We conclude by noting possibilities for continued research on the efficacy of OA publishing in political science.

OPEN ACCESS AND CITATION ADVANTAGE

Since at least the 1980s, subscription prices of US academic publications have been rising faster than the rate of inflation (Dingley 2005). As a result, academic institutions—particularly college and university libraries—have had to alter purchasing strategies for acquiring and/or renewing databases, journals, monographs (i.e., books), and other academic resources (Greco et al. 2006). This situation, commonly referred to as the “serials crisis,” has had an impact across the spectrum of academic publishing (Greco et al. 2007). In response to the crisis, some called for scholars to reconsider publishing with inflationary and overpriced journals (Parks 2002) and for academic libraries to make access to them as economically efficient as possible (Pascarelli 1990). However, no alternative has gained as much traction as OA publishing.

For the purposes of this study, we use the classic definition of OA, in which an article “is available online to be read for free by anyone, anytime, anywhere—as long as they have Internet access” (Crawford 2011, 1). Despite the broad definition, we have narrowed the OA field by focusing solely on articles published in top political science journals. None of these journals is freely available in toto (which is referred to as “gold” OA). By definition, this restricts our study to green OA, which means articles that have been self-archived by the author and/or a sponsoring institution.


OA advocates argue that disciplines will benefit from free access to information (Calise and De Rosa 2008; Papin-Ramcharan and Dawe 2006) and that the increased use of OA publishing will help to stem the serials crisis (Calise and De Rosa 2008; Papin-Ramcharan and Dawe 2006; Harnad and Brody 2004; May 2005). In addition, advocates argue that OA will increase research efficacy as measured by citation counts and/or the citation impact factor. Despite the budgetary imperative to reduce costs and the potential citation advantage for authors, buy-in for OA publishing has largely been limited to disciplines in the physical and natural sciences as well as engineering and mathematics. Because these disciplines have been heavily invested in OA for many years, they have been the focus of the preponderance of research into the OACE.

Although researchers generally have found a positive correlation between OA and research impact, the results are varied and the issue is not settled (Doty 2013). For example, Lawrence (2001) found a citation advantage for OA articles in computer science, whereas Anderson et al. (2001) found no citation advantage for OA articles in medicine. In subsequent research, Harnad and Brody (2004) reported a citation advantage for OA articles in mathematics and physics, whereas Kurtz et al. (2005) found no advantage in astrophysics. More recent research, however, has provided more support for the conventional wisdom that OA benefits citation counts. For example, a four-discipline study (i.e., economics, applied mathematics, ecology, and sociology) found that the OACE is positive across all four disciplines but that the degree varies among them (Norris et al. 2008). Also, in recent unpublished research, McCabe and Snyder (2014) found a citation advantage for science journals with OA publishing models, although this increase was possibly a “superstar effect” because it was found primarily in the higher-tier journals.[4]

Despite our previous reference to a study that included economics and sociology (Norris et al. 2008), only a few OACE studies have included the social sciences. This is commonly attributed to the low adoption of OA in these disciplines. However, recent scholarship indicates that the prevalence of OA publishing is increasing rapidly in the social sciences generally and political science specifically (Nentwich 2008). Indeed, a recent study (Gargouri et al. 2012, 5) estimates that between 1998 and 2006, approximately 28% of social science articles were OA; that average increased to 36% when the authors studied the data from 2005 to 2010. In addition, the number of OA political science journals listed in the Directory of Open Access Journals increased by more than 50% between 2010 and 2013 (Bjørnshauge et al. 2013; Calise et al. 2010).[5] Online self-archiving and archiving in institutional repositories complicate the efforts to quantify the volume of political science articles that are OA; the estimates vary from 5% (Hajjem et al. 2005) to 30% (Antelman 2006). Despite this ambiguity, it is generally accepted that between the increasing number of OA journals and the more liberal self-archiving policies that many journals are adopting, the percentage of political science articles that are OA is increasing (Nentwich 2008).

However, as in the natural sciences, the literature presents varied results on the efficacy of OA in different social science disciplines. For example, Antelman (2004, 375–76) found that whereas political science accounted for only 29% of the OA articles, those articles received the highest citation advantage in her study. Similarly, in a 10-discipline study that included political science, Hajjem, Harnad, and Gingras (2005) found that OA articles consistently have more citations than non-OA articles from the same journal and year. However, Evans and Reimer (2009) found that the social sciences receive negligible benefit from OA. Most recently, Xia and Nakanishi (2012) found that anthropologists receive a significant citation advantage from OA publishing independent of journal ranking (i.e., no superstar effect). Although mixed, the greater volume of positive findings in the social sciences leads us to our first hypothesis regarding OA in political science, as follows:

  • H1: OA articles will be cited at higher rates than articles that are TA only.

PUBLISHER SELF-ARCHIVING POLICIES AND OACE

As noted previously, the two primary approaches that academic publishers take to OA publishing are “gold OA” and “green OA.” Gold OA is provided by the journals themselves, often at a cost to researchers or sponsoring institutions. Conversely, green OA is provided by institutional and/or subject repositories or by individuals posting to their personal or academic websites (Suber 2012, 53). Recently, gold OA was the subject of a symposium at the International Studies Association, the results of which were published in International Studies Perspectives (Gleditsch 2012). In the symposium, Mehlum (2012) makes a strong case for gold OA, citing concerns regarding equality of access for researchers in developing countries.[6] He also discusses economic and pricing concerns, noting that research could be treated as a public good.

However, a strong counterargument is articulated by both Thompson (2012) and Gleditsch (2012). Although neither author is opposed to OA publishing per se, both note a considerable economic downside for professional organizations and editorial offices. A professional association may receive as much as one third of its operating budget from publishers’ payments for exclusive publication rights to its journal (Thompson 2012). In addition, editorial offices are kept afloat by publisher subsidies and royalties; if association-sponsored journals were to transition to gold OA, Gleditsch (2012) argued, those editorial offices and the quality-control service they provide would be endangered. Thus, gold OA poses serious concerns for associations and their flagship journals.


However, only a fraction of journals—approximately 5% in 2004—would fit the definition of gold OA (Harnad and Brody 2004). Indeed, only an estimated 2% of all OA articles published in 2011 were in gold OA journals; in the social sciences, the number decreased to less than 1% (Gargouri et al. 2012, 6). In contrast, an estimated 90% of journals were “green OA,” giving permission to authors in those journals to self-archive some version of their article (Harnad and Brody 2004). Within the green OA category, publishers have adopted a wide range of copyright and self-archiving policies. Among the larger publishers of political science journals (e.g., Wiley, Oxford, Cambridge, Elsevier, and Sage), the most common policies fall into either the permissive or the restrictive category. For our purposes, “permissive” publisher policies allow authors to self-archive any version of their paper, including the publisher’s PDF. “Restrictive” publisher policies allow authors to self-archive only the preprint (i.e., prior to peer review) version. When publishers explicitly allow self-archiving of the published version of an article, it is reasonable to expect that more authors and institutions will do so. This leads to our secondary hypothesis, as follows:

  • H2: Permissive copyright and self-archiving policies will lead to higher mean citation rates.

DATA AND METHODS

Data and Methodological Considerations

Antelman’s (2004) study of OA efficacy is considered a foundational article in OA citation-analysis research, and it provides the earliest evidence of the efficacy of OA publishing. Antelman’s work remains one of only a few articles that specifically address OA and political science. In her 2004 study, political science was studied in comparison with philosophy, engineering, and mathematics. As discussed previously, Antelman found that political science registered the highest effect of OA on mean citation rates. However, her N was relatively small (i.e., 299 political science articles) and her study did not control for journal influence. Along with the lack of fixed time periods and self-selection bias, this is one of the three most frequently cited methodological problems in the study of OA impact (Craig et al. 2007). We follow Antelman (2004) in testing our hypotheses by comparing mean citation rates; however, we address the outstanding methodological challenges.

First, when a researcher does not account for journal influence, articles published in more influential journals are compared to those published in less influential journals. We control for this issue by including only those journals that were consistently ranked among the top 20 political science journals in the Journal Citation Reports (JCR) between 2007 and 2011. This ensures that only journals with comparable levels of influence are included in the study. It is important to note that by including these journals, we make no assumptions about quality; we simply use the JCR ranking as a measure of research impact.

Second, in the absence of a fixed time period, a study compares older articles, which have had more time to accrue citations, with recently published articles; this can skew the citation counts and bias the results. We control for this by limiting the articles to those published only in 2007 and 2008, which ensures that the articles included in the study have had time to become widely circulated and cited. Another advantage of this approach is that it allows us to compare mean citation rates not only for OA versus TA articles from similarly ranked journals but also for OA and TA articles within the same journal for the same period (Harnad and Brody 2004).

The third major methodological criticism of the extant literature is that when an OA advantage is found, it may be the result of self-selection bias. Critics argue that the “biggest names” are more likely to self-archive and that all authors are more likely to archive their best papers (Hajjem et al. 2005). However, as several other scholars have pointed out, the self-selection–bias argument is flawed (Antelman 2004; Gargouri et al. 2010; Hajjem et al. 2005). First, Antelman dismisses the “biggest names” version of the argument, noting that authors tend either to self-archive all of their work or none of it. Second, if self-selection of “best papers” were the primary causal factor, it would be logical to expect self-archived OA articles to have higher citation rates than articles for which OA is mandatory.[7] However, Gargouri et al. (2010) tested the citation advantage for mandatory-OA articles versus voluntarily OA articles and found no self-selection bias. The third flaw in the self-selection–bias argument is that researchers must have access to articles to cite them; however, given the serials crisis in academic publishing, no research center can afford to purchase subscriptions to all journals (Hajjem et al. 2005). Thus, it is logical to conclude that OA, rather than self-selection, is a causal factor in citation-advantage findings.

Data Collection

We began by examining the JCR data to determine which journals were consistently ranked (by impact factor) in the top 20 political science journals between 2007 and 2011. Only eight journals are ranked in the top 20 in all five years (table 1). We use the JCR rankings to select our data sample because impact factor is an indicator of researchers’ use of a specific journal (Antelman 2004). In addition, we used only journals that were highly ranked during a five-year period because the JCR impact-factor score can be unduly influenced by a few highly cited articles (Seglen 1997). As a result, each year’s top 20 rankings included journals that benefited from a high number of citations for a small number of articles rather than a high number of citations for the journal as a whole. Given these issues in the impact-factor calculation, a high impact factor in one JCR report is not a guarantee that a journal is regularly consulted in the discipline. However, high impact factors during a five-year period are a reasonable indicator that political scientists consistently cited articles from the eight journals at high rates.
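
The journal-selection rule described above amounts to a five-way set intersection. The following is a minimal sketch of that step, assuming hypothetical per-year lists of the JCR top-20 political science journals (the variable name top20_by_year and the partial entries are illustrative placeholders, not the actual JCR data):

# Keep only journals that appear in the JCR top-20 political science list
# in every year from 2007 through 2011.
top20_by_year = {
    2007: {"American Political Science Review", "American Journal of Political Science"},  # placeholder subset
    2008: {"American Political Science Review", "American Journal of Political Science"},  # placeholder subset
    # ... entries for 2009, 2010, and 2011 would be filled in from the corresponding JCR reports ...
}

# A journal survives only if it is present in the top 20 in all listed years.
consistently_ranked = set.intersection(*top20_by_year.values())
print(sorted(consistently_ranked))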

Table 1 Annual Journal Citation Ranks, by Journal (2007–2011)

After comparing the 2007–2011 JCR reports to determine which journals were top-ranked in all five years, we gathered all of the articles published in them in 2007 and 2008. Then we used the Publish or Perish (PoP) software program (Harzing 2007) to query Google Scholar and retrieve the citation counts for each article in the dataset. We excluded book reviews, letters to the editor, conference programs and proceedings, presidential addresses, and membership meeting notes. We included replies to other authors because they are often cited in literature reviews. In the event that the PoP query returned duplicate citation-count records, the record with the lower count was excluded from the sample. We then used PoP’s built-in Google Scholar queries to determine whether each article in the sample is openly available, in any format, online. These formats, as defined in the Sherpa-RoMEO Publisher Copyright Policies & Self-Archiving Database, include the following:

  • Preprint: the version of the article before peer review

  • Postprint: the version of the article after peer review, with revisions made

  • Publisher’s postprint: the publisher’s PDF version (postpublication)

We then used Sherpa-RoMEO to determine whether the journal is subject to permissive or restrictive self-archiving policies.[8] The end result was a database of 727 observations, each representing a single article and its citation count.
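
The data-assembly logic can be summarized in a short sketch. The file name, column names, and exclusion labels below are assumptions for illustration only, not the authors’ actual code; PoP supplies the citation counts and Sherpa-RoMEO the policy classification:

import pandas as pd

# Illustrative columns: one row per PoP result, with the article title, its journal,
# its Google Scholar citation count, the OA version found (if any), and the
# Sherpa-RoMEO-based publisher-policy classification.
records = pd.read_csv("pop_results.csv")  # hypothetical export

# Drop excluded document types (book reviews, letters, proceedings, etc.).
excluded = {"book review", "letter", "proceedings", "presidential address", "meeting notes"}
records = records[~records["doc_type"].str.lower().isin(excluded)]

# When PoP returns duplicate records for the same article, keep the record with the
# higher citation count and discard the lower-count duplicate.
records = (records.sort_values("citations", ascending=False)
                  .drop_duplicates(subset=["title", "journal"], keep="first"))

# Flag open access: any freely available version (preprint, postprint, or publisher PDF)
# counts as OA; everything else is toll access (TA).
records["oa"] = records["oa_version"].notna()

print(len(records), "articles in the analysis dataset")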

Descriptive Statistics

Our results must be understood in the context of the data sample; therefore, it is important to include descriptive statistics. As noted previously, our sample size was 727 articles. Of those, 404 articles (55.5%) were OA in some form. This is surprising given that much of the extant literature indicates that the social sciences have not adopted OA publishing at high rates. Although it is true that no mainstream political science journals have converted to gold OA (Bjørnshauge et al. 2013), the data presented here indicate that individual political scientists are making publications OA at fairly high rates. Overall OA frequency is reported in table 2a.

Table 2a Open Access Frequency

Note: N = 727. Percentages may not total 100% due to rounding.


It is remarkable that whereas the majority (more than 75%) of OA articles in the sample were publisher PDFs, more than 45% of them were from restrictive publishers. This may indicate that authors are either ignorant of or indifferent to the provisions of their publisher copyright agreements. However, it also may indicate that because citations of unpublished work are uncommon in political science, authors self-archive the publisher PDF in the hope of more citations. Furthermore, of the 410 articles published by permissive publishers, 48% were not self-archived. This indicates that political scientists are not taking full advantage of author-friendly copyright agreements. Table 2b shows OA frequency by publisher-policy classification. Table 2c presents the variation in OA frequency and publisher-policy classification by journal.

Table 2b Open Access Frequency Rates and Publisher Policies

Notes: N = 727. Percentages may not total 100% due to rounding.

**P = Permissive; R = Restrictive.

Table 2c Open Access Frequency Rates and Publisher Policies by Journal

Note: *Percentages rounded to the nearest whole number. ** P = Permissive; R = Restrictive.

The descriptive statistics provide prima facie evidence that OA articles are cited more frequently than non-OA articles (figure 1 and table 3). As shown in table 3, the mean citation rate in the sample is 51. For OA articles, the mean citation rate is 70 and for TA articles, it is 28; thus, the mean number of citations for OA articles in this sample is approximately two and a half times that of TA articles. As shown in table 3, the mean citation rates for OA articles are higher not only across the full sample but also within each journal. The table also shows wide variation in the mean citation rates between journals, from an average difference of only 18 citations between OA and TA articles in Public Opinion Quarterly to an average difference of more than 100 citations in Annual Review of Political Science.
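
These descriptive comparisons reduce to grouped means. A brief sketch, continuing from the illustrative records data frame introduced earlier (column names remain assumptions):

# Mean citation rates: overall, by OA status, and by OA status within each journal.
print("Overall mean:", records["citations"].mean())
print(records.groupby("oa")["citations"].mean())               # OA vs. TA, full sample
print(records.groupby(["journal", "oa"])["citations"].mean())  # OA vs. TA within each journal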

Figure 1 Citation Rates by Open Access Status

Table 3 Comparison of Mean Citation Rates between Open Access and Toll Access Articles

RESULTS AND DISCUSSION

Because our sample cannot be assumed, a priori, to be normally distributed, we used the nonparametric Wilcoxon-Mann-Whitney (WMW) test (also known as the Mann-Whitney U) to test the difference of means (Großer and Schram 2006; Munck and Snyder 2007). We first tested our primary hypothesis: OA will lead to more citations than TA. The WMW results indicate that OA articles have significantly higher mean citation rates than TA articles. This holds across the full data sample as well as within each of the included journals; therefore, OA publication results in a clear and significant citation advantage. The results of these tests are presented in table 4. To confirm these findings, we log-transformed the citation counts and conducted an independent-samples t-test; those results also indicate that OA articles are cited at a significantly higher rate than TA articles, which again demonstrates an OA citation advantage (t = 11.5, p < 0.0001). Given the citation advantage, these results indicate that if political scientists want to be cited at higher rates, they should publish in journals that freely allow authors to self-archive or advocate for more restrictive journals to offer self-archiving.[9] Furthermore, our data indicate that almost half of the authors whose copyright agreements allow them to self-archive the final version (i.e., the publisher PDF) do not do so. The citation advantage provided by OA publishing indicates that authors should take advantage of the permissiveness of their copyright agreements if they want to be cited at higher rates.
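
Both tests described above are available in standard statistical libraries. The following sketch assumes the illustrative records data frame from the earlier sketches and uses SciPy’s Mann-Whitney U implementation; it is an illustration of the testing approach, not the authors’ original code:

import numpy as np
from scipy import stats

oa_counts = records.loc[records["oa"], "citations"]
ta_counts = records.loc[~records["oa"], "citations"]

# Nonparametric Wilcoxon-Mann-Whitney test of the OA vs. TA citation distributions.
u_stat, u_p = stats.mannwhitneyu(oa_counts, ta_counts, alternative="two-sided")

# Confirmatory check: independent-samples t-test on log-transformed citation counts
# (log1p handles articles with zero citations).
t_stat, t_p = stats.ttest_ind(np.log1p(oa_counts), np.log1p(ta_counts))

print(f"WMW: U={u_stat:.1f}, p={u_p:.4g}; t-test on logged counts: t={t_stat:.2f}, p={t_p:.4g}")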

Table 4 Wilcoxon-Mann-Whitney Difference of Means Results, by Data Source

Note: *** p < 0.01, ** p < 0.05.

Next, we discuss our secondary hypothesis: More permissive publisher self-archiving policies will lead to higher citation counts. The WMW analysis, conducted on the full sample and presented in table 5, suggests that permissive publisher policies are associated with higher citation rates. However, further data analysis contradicts this initial result. For example, the journal with the highest OA frequency (Political Analysis, 74%) is subject to restrictive policies, whereas the journal with the lowest OA frequency (Political Geography, 23%) is subject to permissive policies. This suggests that, at least superficially, publisher policies are not a causal factor in citation advantage.

This descriptive finding is confirmed by additional difference-of-means testing. We selected the OA population of 404 records and tested the mean citation rates (by self-archiving permissiveness) within that subset of the sample. The results, presented in table 5, indicate that although the difference is positive, there is no statistically significant citation advantage to publishing with a more permissive journal. This is counterintuitive and may be an artifact of the data. As noted previously, a large percentage (45%) of the OA articles in our sample were posted in violation of the publisher’s copyright and self-archiving policies, and a similarly large percentage (48%) of the articles subject to permissive policies were not made OA. It is equally likely, however, that a lack of education regarding self-archiving leads authors simply to sign the copyright agreement as offered.
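
The same difference-of-means machinery applies to the secondary hypothesis. Continuing the earlier sketch, this restricts the comparison to the OA subset and splits it by an assumed policy column (the column name and labels are illustrative):

# Within the OA articles only, compare citation rates under permissive vs. restrictive policies.
oa_only = records[records["oa"]]
permissive = oa_only.loc[oa_only["policy"] == "permissive", "citations"]
restrictive = oa_only.loc[oa_only["policy"] == "restrictive", "citations"]
u_stat, u_p = stats.mannwhitneyu(permissive, restrictive, alternative="two-sided")
print(f"Permissive vs. restrictive (OA subset): U={u_stat:.1f}, p={u_p:.4g}")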


Table 5 Wilcoxon-Mann-Whitney Difference of Means Results, by Publisher Policy

Note: *** p < 0.01, ** p < 0.05.

CONCLUSION

By using standard OACE research methods to evaluate citation rates for the top political science journals, this article marks an important step for the discipline. Our results have implications not only for authors but also for institutions and publishers. As the serials crisis persists, alternative publication models such as gold and green OA will continue to be developed, explored, and evaluated. For authors, wider acceptance or rejection of OA publishing models will hinge on the perceived and measured impact of those publications in their respective academic fields. This study provides evidence that OA is beneficial to political scientists: when researchers find the full-text version of a high-quality article without being prompted for payment, they are more likely to use it in their own research. However, the data indicate that although political scientists seem to be self-archiving at relatively high rates, many are not—even when permissive self-archiving policies allow it. Although our research indicates that publisher permissiveness does not result in a citation advantage, it does indicate an advantage for freely available articles. Thus, political scientists should be self-archiving whenever their copyright agreements permit.

This finding brings up two issues, however. First, researchers often are ill-informed about the details of their copyright agreements. Second, many researchers lack the technical skills or resources that would enable them to self-archive their work. These problems can be solved at the institutional level. Many institutions hire specialists to help authors with copyright agreements and have established repositories to make it easier for authors to self-archive. The benefit to an institution is that self-archiving can raise the research profiles of both the researcher and the institution.

Finally, the results presented in this article also have implications for publishers. As noted previously, we found that articles from more restrictive journals often are self-archived at higher rates than those subject to more permissive policies. This indicates that some political scientists may be self-archiving in violation of their copyright agreements, whether knowingly or not. Although publishers historically have ignored these violations, recent news indicates that some may be less likely to overlook them in the future. In late 2013, science-publishing giant Elsevier served takedown notices to dozens of institutions and websites, demanding that these entities remove articles that had been posted without Elsevier’s permission (Peterson 2013). If more publishers follow Elsevier’s lead, it may become more important that political scientists publish with journals that have permissive policies or push more restrictive publishers for self-archiving exceptions in their copyright agreements. Indeed, if publishers start enforcing copyright agreements more stringently, authors increasingly may favor OA to find a wider audience for their work.

The clear OA citation advantage found in this study indicates that political scientists can increase access to and use of their research by self-archiving. However, more research is needed. Although our dataset includes the largest sample of political science OA and TA articles tested to date, it remains a narrow sample. First, we restricted the time frame to two years of data to ensure that the observations were comparable in terms of opportunity to accrue citations. Second, there is a slightly wider spectrum of publisher policies than is represented in this study. The dataset includes only those journals with highly permissive publishing policies (i.e., authors can self-archive any version) and those with moderately restrictive policies (i.e., authors can self-archive preprints only); it therefore excludes publisher policies that prohibit all self-archiving. In theory, articles subject to such fully restrictive policies should be cited at much lower rates, but our findings suggest that this is not necessarily the case; the inclusion of observations from fully restrictive publications would be an interesting expansion of the study. Also, given the superstar effect found in recent research, it would be useful to investigate relative journal ranking as a factor in OA citation advantage. Finally, as the major associations in the discipline enforce their requirement that authors upload conference papers, it will be important to study the effects of this type of preprint on citation rates.

ACKNOWLEDGMENTS

The authors thank Jennifer M. Piscopo, Gregg B. Johnson, Patricia Boling, James Paul Old, the Purdue University PIRCAT workshop participants, and the anonymous reviewers for their helpful comments and suggestions.

Footnotes

1. “Institutional repository” is defined as a set of “digital collections capturing and preserving the intellectual output of a single or multi-university community” (Johnson 2002, n.p.).

2. In contrast, journals that have been made fully OA by the publishers are known as “gold” OA and are beyond the scope of this article.

3. This typically is done by the author and/or the supporting institution, although the final publisher’s PDF occasionally is posted to a course website to which the author is unconnected.

4. McCabe and Snyder (2014) describe the “superstar effect,” noting that “open access benefits higher-quality journals more than lower-quality.” In a recent conference paper on OACE in civil engineering, Koler-Povh, Turk, and Južnič (2013) found a similar effect.

5. Although none of these journals is considered “mainstream,” the increase indicates that OA is growing in the discipline.

6. He notes that there is a strong Developing Nations Initiative that ensures articles are free or low-cost for researchers in developing states.

7. Some universities require that all faculty and researcher publications be made OA.

8. Our classifications of “permissive” and “restrictive” are based on the Sherpa-RoMEO green (i.e., authors can upload any version) and yellow (i.e., authors can archive preprint/pre-refereed versions) classifications, respectively (Publisher Copyright Policies & Self-Archiving Database 2013).

9. The authors do not want to encourage violation of publishers’ copyright agreements. Authors who want to self-archive publications from more restrictive publishers can find more information on copyright agreements and OA addenda at http://www.sparc.arl.org/audience/authors.

REFERENCES

Anderson, Kent, Sack, John, Krauss, Lisa, and O’Keefe, Lori. 2001. “Publishing Online-Only Peer-Reviewed Biomedical Literature: Three Years of Citation, Author Perception, and Usage Experience.” Journal of Electronic Publishing 6 (3).
Antelman, Kristin. 2004. “Do Open-Access Articles Have a Greater Research Impact?” College & Research Libraries 65 (5): 372–382.
Antelman, Kristin. 2006. “Self-Archiving Practice and the Influence of Publisher Policies in the Social Sciences.” Learned Publishing 19 (2): 85–95.
Bjørnshauge, Lars, Brage, Rikard, Brage, Sonja, and Jørgensen, Lotte. 2013. “Directory of Open Access Journals.” Lund, Sweden. www.doaj.org.
Calise, Mauro, and De Rosa, Rosanna. 2008. “E-Research: An Introduction to On-line Political Science Sources for Beginners (and Skeptics).” International Political Science Review 29 (5): 595–618.
Calise, Mauro, de Rosa, Rosanna, and Marin, Xavier Fernandez. 2010. “Electronic Publishing, Knowledge Sharing and Open Access: A New Environment for Political Science.” European Political Science 9 (S1): S50–S60.
Craig, Iain D., Plume, Andrew M., McVeigh, Marie E., Pringle, James, and Amin, Mayur. 2007. “Do Open Access Articles Have Greater Citation Impact? A Critical Review of the Literature.” Journal of Informetrics 1 (3): 239–248.
Crawford, Walt. 2011. Open Access: What You Need to Know Now. Chicago: American Library Association.
Davis, Phillip M. 2011. “Open Access, Readership, Citations: A Randomized Controlled Trial of Scientific Journal Publishing.” The FASEB Journal 25 (7): 2129–2134.
Dingley, Brenda. 2005. “U.S. Periodical Prices - 2005.” American Library Association.
Doty, R. Christopher. 2013. “Tenure-Track Science Faculty and the ‘Open Access Citation Effect.’” Journal of Librarianship and Scholarly Communication 1 (3): 6.
Evans, James A., and Reimer, Jacob. 2009. “Open Access and Global Participation in Science.” Science 323 (5917): 1025.
Gargouri, Yassine, Hajjem, Chawki, Larivière, Vincent, Gingras, Yves, Carr, Les, Brody, Tim, and Harnad, Stevan. 2010. “Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research.” PLoS ONE 5 (10): 1–12.
Gargouri, Yassine, Larivière, Vincent, Gingras, Yves, Carr, Les, and Harnad, Stevan. 2012. “Green and Gold Open Access Percentages and Growth, by Discipline.” arXiv preprint arXiv:1206.3664.
Gleditsch, Nils Petter. 2012. “Open Access in International Relations: A Symposium.” International Studies Perspectives 13 (3): 211–215.
Greco, Albert N., Jones, Robert F., Wharton, Robert M., and Estelami, Hooman. 2007. “The Changing College and University Library Market for University Press Books and Journals: 1997–2004.” Journal of Scholarly Publishing 39 (1): 1–32.
Greco, Albert N., Wharton, Robert M., Estelami, Hooman, and Jones, Robert F. 2006. “The State of Scholarly Journal Publishing: 1981–2000.” Journal of Scholarly Publishing 37 (3): 155–214.
Großer, Jens, and Schram, Arthur. 2006. “Neighborhood Information Exchange and Voter Participation: An Experimental Study.” The American Political Science Review 100 (2): 235–248.
Hajjem, Chawki, Harnad, Stevan, and Gingras, Yves. 2005. “Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact.” Bulletin of the IEEE Computer Society Technical Committee on Data Engineering.
Harnad, Stevan, and Brody, Tim. 2004. “Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals.” D-Lib Magazine 10 (6).
Harzing, Anne-Wil. 2007. Publish or Perish. http://www.harzing.com/pop.htm.
Johnson, Richard K. 2002. “Partnering with Faculty to Enhance Scholarly Communication.” D-Lib Magazine 8 (11).
Koler-Povh, Teja, Turk, Goran, and Južnič, Primož. 2013. “Does the Open Access Business Model Have a Significant Impact on the Citation of Publications? Case Study in the Field of Civil Engineering.” Paper read at the 5th Belgrade International Open Access Conference 2012.
Kurtz, Michael J., Eichhorn, Guenther, Accomazzi, Alberto, Grant, Carolyn, Demleitner, Markus, Henneken, Edwin, and Murray, Stephen S. 2005. “The Effect of Use and Access on Citations.” Information Processing & Management 41 (6): 1395–1402.
Lawrence, Steve. 2001. “Free Online Availability Substantially Increases a Paper’s Impact.” Nature 411 (6837): 521.
May, Christopher. 2005. “The Academy’s New Electronic Order? Open Source Journals and Publishing Political Science.” European Political Science 4 (1): 14–24.
McCabe, Mark, and Snyder, Christopher M. 2014. “Identifying the Effect of Open Access on Citations Using a Panel of Science Journals.” Economic Inquiry.
Mehlum, Halvor. 2012. “The Case for Open Access Publishing.” International Studies Perspectives 13 (3): 216–223.
Munck, Gerardo L., and Snyder, Richard. 2007. “Debating the Direction of Comparative Politics: An Analysis of Leading Journals.” Comparative Political Studies 40 (1): 5–31.
Nentwich, Michael. 2008. “Political Science on the Web: Prospects and Challenges.” European Political Science 7 (2): 220–229.
Norris, Michael, Oppenheim, Charles, and Rowland, Fytton. 2008. “The Citation Advantage of Open-Access Articles.” Journal of the American Society for Information Science and Technology 59 (12): 1963–1972.
Papin-Ramcharan, Jennifer, and Dawe, Richard A. 2006. “Open Access Publishing: A Developing Country View.” First Monday 11 (6).
Parks, Robert P. 2002. “The Faustian Grip of Academic Publishing.” Journal of Economic Methodology 9 (3): 317–335.
Pascarelli, Anne M. 1990. “Coping Strategies for Libraries Facing the Serials Pricing Crisis.” Serials Review 16 (1): 75–80.
Peterson, Andrea. 2013. “How One Publisher Is Stopping Academics from Sharing Their Research.” The Washington Post, December 19.
Publisher Copyright Policies & Self-Archiving Database. 2013. University of Nottingham. http://www.sherpa.ac.uk/romeo/index.php?fIDnum=|&mode=simple&la=en.
Seglen, Per O. 1997. “Why the Impact Factor of Journals Should Not Be Used for Evaluating Research.” British Medical Journal 314 (7079).
Suber, Peter. 2012. Open Access. Cambridge, MA: MIT Press.
Thompson, William R. 2012. “Why Journal Editors Have Other and More Pressing Concerns.” International Studies Perspectives 13 (3): 224–227.
Xia, Jingfeng, and Nakanishi, Katie. 2012. “Self-Selection and the Citation Advantage of Open Access Articles.” Online Information Review 36 (1): 40–51.