Research Integrity


Integrity (from the Latin integritas, meaning whole or complete) refers in ethics to adherence to a code or, more generally, to a high standard of conduct. Research integrity thus means doing research in accord with the standards that properly inform and guide that activity, without deviation under inappropriate influences. Integrity in this sense is closely related to authenticity and accountability. Research integrity is also often considered the flip side of research misconduct. Whereas the topic of research misconduct concentrates on the definition, identification, adjudication, and consequences of malfeasance committed by scientists in the course of their research, research integrity concentrates on, in the words of the subtitle of the Institute of Medicine's 2002 report Integrity in Scientific Research, "creating an environment that promotes responsible conduct" of research (Institute of Medicine, p. x). Although it has received considerable public attention since the 1980s, research integrity remains a contested issue both within the scientific community and between that community and its patrons.


Public and Professional Tensions

Part of the conflict over research integrity occurs over identifying the appropriate code or standard. Sociologist Robert K. Merton (1973) described four norms of science—communalism (or communism), universalism, disinterestedness, and organized skepticism—that are often cited as antecedent to codes to which scientists are supposed to adhere. But other scholars argue that such norms are not well recognized among all scientists (Mitroff 1974), or that they are merely self-serving vocabularies of justification for scientific autonomy (Mulkay 1975), or that they might have served as guideposts historically but that they are being supplanted by counternorms that are more bureaucratic and commercially oriented (Ziman 1990).

Many professional societies have written or revised codes of ethics or guidelines for research integrity that encompass normative issues ranging from formal, regulatory definitions of research misconduct (for example, fabrication, falsification, and plagiarism) to more subtle professional behavior such as authorship practices and mentorship. In the early twenty-first century, professional bodies such as the Accreditation Board for Engineering and Technology (ABET) require training in ethics and research integrity for accredited undergraduate engineering programs. Scientific journals have also assumed an active role in defining integrity for their authors around topics such as credit for authorship, conflict of interest, and responsibility for corrections and retractions.

Research integrity is often connected not only with the attempt of the scientific community to encourage ethical behavior within its own ranks, but also with its attempt to maintain professional autonomy from public interference. As such, it is an aspect of the social contract for science in which the scientific community implicitly promised to maintain the integrity of its research in exchange for an unusual lack of oversight—despite public patronage. This tacit agreement was substantially reconfigured during the 1980s and 1990s, as both parties recognized that the promotion and assurance of research integrity must be a collaborative, rather than an autonomous, enterprise (Guston 2000).

The public patrons of research in liberal democracies have a special interest in research integrity not only because of the instrumental use of science and technology for public purposes (for example, only good science can lead to the promises of health, economic advancement, environmental quality, and military security, among others), but also because of the ideological support that good science offers the state by demonstrating its effectiveness and by reifying the concepts of representation and causality upon which representative government is based (Ezrahi 1990). In the United States, research integrity has become a pressing issue for the funding agencies and professional societies that mediate between public patrons and practicing scientists. A driving force for attention to research integrity was the promulgation of rules in 1990 by the National Institutes of Health (NIH) requiring institutions participating in training grants to provide training in the responsible conduct of research. Such training often includes discussions not only of misconduct, but also of whistle-blowing, the protection of human and animal research subjects, the mentoring relationship, and the consequences of recently emergent economic relations in research, including conflicts of interest and intellectual property rights. In 2000 the Office of Research Integrity (ORI) of the U.S. Public Health Service proposed more specific and broadly applicable rules for training in the responsible conduct of research, but as of 2004 these rules had not been implemented.

Because of the increasing recognition that the effects of research—for good or for ill—go beyond the scientific community, there is increasing attention as well to what some (particularly in engineering ethics) call macroethics, or the responsibility that scientists and engineers have to behave with integrity not just toward each other and toward their direct patrons but to society more broadly conceived (Herkert 2001). This agenda includes helping to craft private and public policies that make appropriate use of science and its products, assuring that the knowledge-based innovations to which they contribute are not only technically virtuous but socially benign, and even accepting greater involvement of non-scientists in some aspects of technical decision making. This agenda has historical roots, for example, in the characterization of activism by atomic physicists in nuclear weapons policy or molecular biologists in recombinant DNA policy as scientific responsibility.


Unresolved Questions

Despite increasing recognition of the importance of research integrity to both the scientific community and the broader society, and the consequent need for collaboration to assure it, several questions remain. One is whether the primary responsibility for assuring the integrity of research lies with individual researchers; research institutions such as universities, professional societies and the community of science; or public patrons of research. The Institute of Medicine (2002) concludes that research institutions should have the primary role, but that public patrons of research have an important oversight role and that individual integrity is still the backbone of the system.

A second question is why, given the importance of some institutional role in research integrity, so few such institutions exist. One of them, ORI, initially created to investigate allegations of research misconduct, has in the early twenty-first century been shifting its agenda toward encouraging training in research integrity and even sponsoring research on research integrity itself. The National Science Foundation (NSF) has also sponsored projects on research integrity, including the On-Line Ethics Center.

A third question is whether greater collaboration between science and society may legitimate increasingly malign political interference, rather than benign influence, on public science. The Waxman report, issued from the U.S. House of Representatives, and a similar 2004 report from the Union of Concerned Scientists, for example, claim to document dozens of threats to research integrity from the intrusion of political agendas into scientific and technical decision making in the bureaucracy.

A fourth question, which makes the others all the more difficult to manage, is—as the Institute of Medicine (2002) concluded—how to create reliable ways to assess the overall integrity of the research environment, as well as the efficacy of any particular interventions (including educational ones). The lack of empirical evidence means that the scientific community can legitimately call for additional research on research integrity, but it also means that political demands for action may be met with less than satisfactory responses.


DAVID H. GUSTON

SEE ALSO Accountability in Research; Ecological Integrity; Misconduct in Science: Overview; National Institutes of Health; Office of Research Integrity; Professional Engineering Organizations; Social Contract for Science.

BIBLIOGRAPHY

Ezrahi, Yaron. (1990). The Descent of Icarus: Science and the Transformation of Contemporary Democracy. Cambridge, MA: Harvard University Press. Scholarly account of the co-dependence of scientific and democratic ideologies.

Guston, David H. (2000). Between Politics and Science: Assuring the Integrity and the Productivity of Research. New York: Cambridge University Press. Includes detailed political and institutional account of research integrity in U.S. in the 1980s and 1990s.

Herkert, Joseph R. (2001). "Future Directions in Engineering Ethics Research: Microethics, Macroethics and the Role of Professional Societies." Science and Engineering Ethics 7: 403–414. Responsible research also includes broader societal responsibilities.

Institute of Medicine. Committee on Assessing Integrity in Research Environments. (2002). Integrity in Scientific Research: Creating an Environment That Promotes Responsible Conduct. Washington, DC: National Academy Press. The scientific establishment focuses on the research environment to assure integrity.

Merton, Robert K. (1973). "The Normative Structure of Science." In The Sociology of Science: Theoretical and Empirical Investigations, ed. Norman Storer. Chicago: University of Chicago Press. Originally published in 1942. The locus classicus for the norms of the scientific community.

Mitroff, Irving I. (1974). The Subjective Side of Science: A Philosophical Inquiry in the Psychology of the Apollo Moon Scientists. Amsterdam: Elsevier. A non-Mertonian perspective on scientific norms.

Mulkay, Michael J. (1975). "Norms and Ideology in Science." Social Science Information 15: 637–656. Challenges Merton's norms.

Union of Concerned Scientists. (2004). Scientific Integrity in Policymaking: An Investigation into the Bush Administration's Misuse of Science. Cambridge, MA: Author. Independent group of scientists alleges that U.S. science has been politicized.

Ziman, John. (1990). "Research As a Career." In The Research System in Transition, eds. Susan Cozzens, Peter Healey, Arie Rip, and John Ziman. Boston: Kluwer Academic Publishers. Traditional Mertonian norms have been altered by new practices, including commercial relations, in science.


INTERNET RESOURCE

Waxman, Henry A. (2003). Politics and Science in the Bush Administration. Washington, DC: U.S. House of Representatives, Committee on Government Reform, Minority Staff. Available from www.house.gov/reform/min/politicsandscience/pdfs/pdf_politics_and_science_rep.pdf. Politicians allege that U.S. science has been politicized.
