
What is Ethics in Research & Why is it Important?

by David B. Resnik, J.D., Ph.D.


When most people think of ethics (or morals), they think of rules for distinguishing between right and wrong, such as the Golden Rule ("Do unto others as you would have them do unto you"), a code of professional conduct like the Hippocratic Oath ("First of all, do no harm"), a religious creed like the Ten Commandments ("Thou shalt not kill..."), or wise aphorisms like the sayings of Confucius. This is the most common way of defining "ethics": norms for conduct that distinguish between acceptable and unacceptable behavior. Most people learn ethical norms at home, at school, in church, or in other social settings. Although most people acquire their sense of right and wrong during childhood, moral development occurs throughout life, and human beings pass through different stages of growth as they mature.

Ethical norms are so ubiquitous that one might be tempted to regard them as simple common sense. On the other hand, if morality were nothing more than common sense, then why are there so many ethical disputes and issues in our society? One plausible explanation of these disagreements is that all people recognize some common ethical norms but different individuals interpret, apply, and balance these norms in different ways in light of their own values and life experiences.

Most societies also have legal rules that govern behavior, but ethical norms tend to be broader and more informal than laws. Although most societies use laws to enforce widely accepted moral standards, and ethical and legal rules use similar concepts, it is important to remember that ethics and law are not the same. An action may be legal but unethical, or illegal but ethical. We can also use ethical concepts and principles to criticize, evaluate, propose, or interpret laws. Indeed, in the last century, many social reformers urged citizens to disobey laws in order to protest what they regarded as immoral or unjust laws. Peaceful civil disobedience is an ethical way of expressing political viewpoints.

Another way of defining 'ethics' focuses on the disciplines that study standards of conduct, such as philosophy, theology, law, psychology, or sociology. For example, a "medical ethicist" is someone who studies ethical standards in medicine. One may also define ethics as a method, procedure, or perspective for deciding how to act and for analyzing complex problems and issues. For instance, in considering a complex issue like global warming, one may take an economic, ecological, political, or ethical perspective on the problem. While an economist might examine the costs and benefits of various policies related to global warming, an environmental ethicist could examine the ethical values and principles at stake.

Many different disciplines, institutions, and professions have norms for behavior that suit their particular aims and goals. These norms also help members of the discipline to coordinate their actions or activities and to establish the public's trust in the discipline. For instance, ethical norms govern conduct in medicine, law, engineering, and business. Ethical norms also serve the aims or goals of research and apply to people who conduct scientific research or other scholarly or creative activities. There is even a specialized discipline, research ethics, which studies these norms.

There are several reasons why it is important to adhere to ethical norms in research. First, norms promote the aims of research, such as knowledge, truth, and avoidance of error. For example, prohibitions against fabricating, falsifying, or misrepresenting research data promote the truth and help avoid error. Second, since research often involves a great deal of cooperation and coordination among many different people in different disciplines and institutions, ethical standards promote the values that are essential to collaborative work, such as trust, accountability, mutual respect, and fairness. For example, many ethical norms in research, such as guidelines for authorship, copyright and patenting policies, data sharing policies, and confidentiality rules in peer review, are designed to protect intellectual property interests while encouraging collaboration. Most researchers want to receive credit for their contributions and do not want to have their ideas stolen or disclosed prematurely. Third, many of the ethical norms help to ensure that researchers can be held accountable to the public. For instance, federal policies on research misconduct, conflicts of interest, human subjects protections, and animal care and use are necessary in order to make sure that researchers who are funded by public money can be held accountable to the public. Fourth, ethical norms in research also help to build public support for research. People are more likely to fund a research project if they can trust the quality and integrity of the research. Finally, many of the norms of research promote a variety of other important moral and social values, such as social responsibility, human rights, animal welfare, compliance with the law, and health and safety. Ethical lapses in research can significantly harm human and animal subjects, students, and the public. For example, a researcher who fabricates data in a clinical trial may harm or even kill patients, and a researcher who fails to abide by regulations and guidelines relating to radiation or biological safety may jeopardize his or her health and safety or the health and safety of staff and students.

Codes and Policies for Research Ethics


Given the importance of ethics for the conduct of research, it should come as no surprise that many different professional associations, government agencies, and universities have adopted specific codes, rules, and policies relating to research ethics. Many government agencies, such as the National Institutes of Health (NIH), the National Science Foundation (NSF), the Food and Drug Administration (FDA), the Environmental Protection Agency (EPA), and the US Department of Agriculture (USDA), have ethics rules for funded researchers. Other influential research ethics policies include the Uniform Requirements for Manuscripts Submitted to Biomedical Journals (International Committee of Medical Journal Editors), the Chemist's Code of Conduct (American Chemical Society), the Code of Ethics (American Society for Clinical Laboratory Science), the Ethical Principles of Psychologists (American Psychological Association), the Statements on Ethics and Professional Responsibility (American Anthropological Association), the Statement on Professional Ethics (American Association of University Professors), the Nuremberg Code, and the Declaration of Helsinki (World Medical Association).

The following is a rough and general summary of some ethical principles that various codes address*:

Honesty: Strive for honesty in all scientific communications. Honestly report data, results, methods and procedures, and publication status. Do not fabricate, falsify, or misrepresent data. Do not deceive colleagues, granting agencies, or the public.

Objectivity: Strive to avoid bias in experimental design, data analysis, data interpretation, peer review, personnel decisions, grant writing, expert testimony, and other aspects of research where objectivity is expected or required. Avoid or minimize bias or self-deception. Disclose personal or financial interests that may affect research.

Integrity: Keep your promises and agreements; act with sincerity; strive for consistency of thought and action.

Carefulness: Avoid careless errors and negligence; carefully and critically examine your own work and the work of your peers. Keep good records of research activities, such as data collection, research design, and correspondence with agencies or journals.

Openness: Share data, results, ideas, tools, and resources. Be open to criticism and new ideas.

Respect for Intellectual Property: Honor patents, copyrights, and other forms of intellectual property. Do not use unpublished data, methods, or results without permission. Give credit where credit is due. Give proper acknowledgement or credit for all contributions to research. Never plagiarize.

Confidentiality: Protect confidential communications, such as papers or grants submitted for publication, personnel records, trade or military secrets, and patient records.

Responsible Publication: Publish in order to advance research and scholarship, not to advance just your own career. Avoid wasteful and duplicative publication.

Responsible Mentoring: Help to educate, mentor, and advise students. Promote their welfare and allow them to make their own decisions.

Respect for Colleagues: Respect your colleagues and treat them fairly.

Social Responsibility: Strive to promote social good and prevent or mitigate social harms through research, public education, and advocacy.

Non-Discrimination: Avoid discrimination against colleagues or students on the basis of sex, race, ethnicity, or other factors that are not related to their scientific competence and integrity.

Competence: Maintain and improve your own professional competence and expertise through lifelong education and learning; take steps to promote competence in science as a whole.

Legality: Know and obey relevant laws and institutional and governmental policies.

Animal Care: Show proper respect and care for animals when using them in research. Do not conduct unnecessary or poorly designed animal experiments.

Human Subjects Protection: When conducting research on human subjects, minimize harms and risks and maximize benefits; respect human dignity, privacy, and autonomy; take special precautions with vulnerable populations; and strive to distribute the benefits and burdens of research fairly.

* Adapted from Shamoo A and Resnik D. 2009. Responsible Conduct of Research, 2nd ed. (New York: Oxford University Press).

Ethical Decision Making in Research


Although codes, policies, and principles are very important and useful, like any set of rules they do not cover every situation, they often conflict, and they require considerable interpretation. It is therefore important for researchers to learn how to interpret, assess, and apply various research rules and how to make decisions and act in various situations. The vast majority of decisions involve the straightforward application of ethical rules. For example, consider the following case:

Case 1: The research protocol for a study of a drug on hypertension requires the administration of the drug at different doses to 50 laboratory mice, with chemical and behavioral tests to determine toxic effects. Tom has almost finished the experiment for Dr. Q. He has only 5 mice left to test. However, he really wants to finish his work in time to go to Florida on spring break with his friends, who are leaving tonight. He has injected the drug in all 50 mice but has not completed all of the tests. He therefore decides to extrapolate from the 45 completed results to produce the 5 additional results.

Many different research ethics policies would hold that Tom has acted unethically by fabricating data. If this study were sponsored by a federal agency, such as the NIH, his actions would constitute a form of research misconduct, which the government defines as "fabrication, falsification, or plagiarism" (or FFP). Actions that nearly all researchers classify as unethical are viewed as misconduct. It is important to remember, however, that misconduct occurs only when researchers intend to deceive: honest errors related to sloppiness, poor record keeping, miscalculations, bias, self-deception, and even negligence do not constitute misconduct. Also, reasonable disagreements about research methods, procedures, and interpretations do not constitute research misconduct. Consider the following case:

Case 2: Dr. T has just discovered a mathematical error in a paper that has been accepted for publication in a journal. The error does not affect the overall results of his research, but it is potentially misleading. The journal has just gone to press, so it is too late to catch the error before it appears in print. In order to avoid embarrassment, Dr. T decides to ignore the error.

Dr. T's error is not misconduct, nor is his decision to take no action to correct the error. Most researchers, as well as many different policies and codes, including ECU's policies, would say that Dr. T should tell the journal about the error and consider publishing a correction or erratum. Failing to publish a correction would be unethical because it would violate norms relating to honesty and objectivity in research.

There are many other activities that the government does not define as "misconduct" but which are still regarded by most researchers as unethical. These are called "other deviations" from acceptable research practices and include:

* Publishing the same paper in two different journals without telling the editors
* Submitting the same paper to different journals without telling the editors
* Not informing a collaborator of your intent to file a patent in order to make sure that you are the sole inventor
* Including a colleague as an author on a paper in return for a favor even though the colleague did not make a serious contribution to the paper
* Discussing with your colleagues confidential data from a paper that you are reviewing for a journal
* Trimming outliers from a data set without discussing your reasons in the paper
* Using an inappropriate statistical technique in order to enhance the significance of your research
* Bypassing the peer review process and announcing your results through a press conference without giving peers adequate information to review your work
* Conducting a review of the literature that fails to acknowledge the contributions of other people in the field or relevant prior work
* Stretching the truth on a grant application in order to convince reviewers that your project will make a significant contribution to the field
* Stretching the truth on a job application or curriculum vitae
* Giving the same research project to two graduate students in order to see who can do it the fastest
* Overworking, neglecting, or exploiting graduate or post-doctoral students
* Failing to keep good research records
* Failing to maintain research data for a reasonable period of time
* Making derogatory comments and personal attacks in your review of an author's submission
* Promising a student a better grade for sexual favors
* Using a racist epithet in the laboratory
* Making significant deviations from the research protocol approved by your institution's Animal Care and Use Committee or Institutional Review Board for Human Subjects Research without telling the committee or the board
* Not reporting an adverse event in a human research experiment

* Wasting animals in research
* Exposing students and staff to biological risks in violation of your institution's biosafety rules
* Rejecting a manuscript for publication without even reading it
* Sabotaging someone's work
* Stealing supplies, books, or data
* Rigging an experiment so you know how it will turn out
* Making unauthorized copies of data, papers, or computer programs
* Owning over $10,000 in stock in a company that sponsors your research and not disclosing this financial interest
* Deliberately overestimating the clinical significance of a new drug in order to obtain economic benefits

These actions would be regarded as unethical by most scientists and some might even be illegal. Most of these would also violate different professional ethics codes or institutional policies. However, they do not fall into the narrow category of actions that the government classifies as research misconduct. Indeed, there has been considerable debate about the definition of "research misconduct," and many researchers and policy makers are not satisfied with the government's narrow definition that focuses on FFP. However, given the huge list of potential offenses that might fall into the category of "other serious deviations," and the practical problems with defining and policing these other deviations, it is understandable why government officials have chosen to limit their focus.

Finally, situations frequently arise in research in which different people disagree about the proper course of action and there is no broad consensus about what should be done. In these situations, there may be good arguments on both sides of the issue and different ethical principles may conflict. These situations create difficult decisions for researchers, known as ethical dilemmas. Consider the following case:

Case 3: Dr. Wexford is the principal investigator of a large epidemiological study on the health of 5,000 agricultural workers. She has an impressive dataset that includes information on demographics, environmental exposures, diet, genetics, and various disease outcomes such as cancer, Parkinson's disease (PD), and ALS. She has just published a paper on the relationship between pesticide exposure and PD in a prestigious journal. She is planning to publish many other papers from her dataset. She receives a request from another research team that wants access to her complete dataset. They are interested in examining the relationship between pesticide exposures and skin cancer. Dr. Wexford was planning to conduct a study on this topic.

Dr. Wexford faces a difficult choice. On the one hand, the ethical norm of openness obliges her to share data with the other research team. Her funding agency may also have rules that obligate her to share data. On the other hand, if she shares data with the other team, they may publish results that she was planning to publish, thus depriving her (and her team) of recognition and priority. It seems that there are good arguments on both sides of this issue, and Dr. Wexford needs to take some time to think about what she should do. One possible option is to share data, provided that the investigators sign a data use agreement. The agreement could define allowable uses of the data, publication plans, authorship, etc.

The following are some steps that researchers, such as Dr. Wexford, can take to deal with ethical dilemmas in research:

* What is the problem or issue? It is always important to get a clear statement of the problem. In this case, the issue is whether to share information with the other research team.
* What is the relevant information? Many bad decisions are made as a result of poor information. To know what to do, Dr. Wexford needs to have more information concerning such matters as university or funding agency policies that may apply to this situation, the team's intellectual property interests, the possibility of negotiating some kind of agreement with the other team, whether the other team also has some information it is willing to share, etc. Will the public or science be better served by the additional research?
* What are the different options? People may fail to see different options due to a limited imagination, bias, ignorance, or fear. In this case, there may be another choice besides 'share' or 'don't share,' such as 'negotiate an agreement.'
* How do ethical codes or policies as well as legal rules apply to these different options? The university or funding agency may have policies on data management that apply to this case. Broader ethical rules, such as openness and respect for credit and intellectual property, may also apply to this case. Laws relating to intellectual property may be relevant.
* Are there any people who can offer ethical advice? It may be useful to seek advice from a colleague, a senior researcher, your department chair, or anyone else you can trust. In this case, Dr. Wexford might want to talk to her supervisor and research team before making a decision.

After considering these questions, a person facing an ethical dilemma may decide to ask more questions, gather more information, explore different options, or consider other ethical rules. However, at some point he or she will have to make a decision and then take action. Ideally, a person who makes a decision in an ethical dilemma should be able to justify his or her decision to himself or herself, as well as to colleagues, administrators, and other people who might be affected by the decision. He or she should be able to articulate reasons for his or her conduct and should consider the following questions in order to explain how he or she arrived at the decision:

* Which choice could stand up to further publicity and scrutiny?
* Which choice could you not live with?
* Think of the wisest person you know. What would he or she do in this situation?
* Which choice would be the most just, fair, or responsible?
* Which choice will probably have the best overall consequences?

After considering all of these questions, one still might find it difficult to decide what to do. If this is the case, then it may be appropriate to consider other ways of making the decision, such as going with one's gut feeling, seeking guidance through prayer or meditation, or even flipping a coin. Endorsing these methods in this context need not imply that ethical decisions are irrational or that these other methods should be used only as a last resort. The main point is that human reasoning plays a pivotal role in ethical decision-making, but there are limits to its ability to solve all ethical dilemmas in a finite amount of time.

Promoting Ethical Conduct in Science


Many of you may be wondering why you are required to have training in research ethics. You may believe that you are highly ethical and know the difference between right and wrong. You would never fabricate or falsify data or plagiarize. Indeed, you also may believe that most of your colleagues are highly ethical and that there is no ethics problem in research.

If you feel this way, relax. No one is accusing you of acting unethically. Indeed, the best evidence we have shows that misconduct is a very rare occurrence in research, although there is considerable variation among various estimates. The rate of misconduct has been estimated to range from as low as 0.01% of researchers per year (based on confirmed cases of misconduct in federally funded research) to as high as 1% of researchers per year (based on self-reports of misconduct on anonymous surveys). See Shamoo and Resnik (2009), cited above. Clearly, it would be useful to have more data on this topic, but so far there is no evidence that science has become ethically corrupt.

However, even if misconduct is rare, it can have a tremendous impact on research. Consider an analogy with crime: it does not take many murders or rapes in a town to erode the community's sense of trust and increase the community's fear and paranoia. The same is true of the most serious crimes in science, i.e., fabrication, falsification, and plagiarism. However, most of the misdeeds committed in science are probably not tantamount to murder or rape; they are the ethically significant "other deviations" from acceptable research practices described above. Moreover, there are many situations in research that pose genuine ethical dilemmas.

Will training and education in research ethics help reduce the rate of misconduct in science? It is too early to tell. The answer to this question depends, in part, on how one understands the causes of misconduct. There are two main theories about why researchers commit misconduct. According to the "bad apple" theory, most scientists are highly ethical. Only researchers who are morally corrupt, economically desperate, or psychologically disturbed commit misconduct. Moreover, only a fool would commit misconduct, because science's peer review system and self-correcting mechanisms will eventually catch those who try to cheat the system. In any case, a course in research ethics will have little impact on "bad apples," one might argue.

According to the "stressful" or "imperfect" environment theory, misconduct occurs because various institutional pressures, incentives, and constraints encourage people to commit misconduct, such as pressures to publish or obtain grants or contracts, career ambitions, the pursuit of profit or fame, poor supervision of students and trainees, and poor oversight of researchers. Moreover, defenders of the stressful environment theory point out that science's peer review system is far from perfect and that it is relatively easy to cheat the system. Erroneous or fraudulent research often enters the public record without being detected for years.

To the extent that the research environment is an important factor in misconduct, a course in research ethics is likely to help people get a better understanding of these stresses, sensitize people to ethical concerns, and improve ethical judgment and decision making. Misconduct probably results from both environmental and individual causes, i.e., when people who are morally weak, ignorant, or insensitive are placed in stressful or imperfect environments.

In any case, a course in research ethics is useful in helping to prevent deviations from norms even if it does not prevent misconduct. Many of the deviations that occur in research may occur because researchers simply do not know or have never thought seriously about some of the ethical norms of research. For example, some unethical authorship practices probably reflect years of tradition in the research community that have not been questioned seriously until recently. If the director of a lab is named as an author on every paper that comes from his lab, even if he does not make a significant contribution, what could be wrong with that? That's just the way it's done, one might argue. If a drug company uses ghostwriters to write papers "authored" by its physician-employees, what's wrong with this practice? Ghostwriters help write all sorts of books these days, so what's wrong with using ghostwriters in research?

Another example where there may be some ignorance or mistaken traditions is conflicts of interest in research. A researcher may think that a "normal" or "traditional" financial relationship, such as accepting stock or a consulting fee from a drug company that sponsors her research, raises no serious ethical issues. Or perhaps a university administrator sees no ethical problem in taking a large gift with strings attached from a pharmaceutical company. Maybe a physician thinks that it is perfectly appropriate to receive a $300 finder's fee for referring patients into a clinical trial.

If "deviations" from ethical conduct occur in research as a result of ignorance or a failure to reflect critically on problematic traditions, then a course in research ethics may help reduce the rate of serious deviations by improving the researcher's understanding of ethics and by sensitizing him or her to the issues. Finally, training in research ethics should be able to help researchers grapple with ethical dilemmas by introducing researchers to important concepts, tools, principles, and methods that can be useful in resolving these dilemmas. In fact, the issues have become so important that the NIH and NSF have mandated training in research ethics for graduate students.

David B. Resnik, JD, Ph.D. (http://www.niehs.nih.gov/research/resources/bioethics/bioethicist.cfm)
Bioethicist and NIEHS IRB Chair
Tel (919) 541-5658
Fax (919) 541-9854
resnikd@niehs.nih.gov

Five principles for research ethics


Cover your bases with these ethical strategies.
By DEBORAH SMITH, Monitor Staff
January 2003, Vol 34, No. 1
Print version: page 56

Not that long ago, academicians were often cautious about airing the ethical dilemmas they faced in their research and academic work, but that environment is changing today.

Psychologists in academe are more likely to seek out the advice of their colleagues on issues ranging from supervising graduate students to how to handle sensitive research data, says George Mason University psychologist June Tangney, PhD. "There has been a real change in the last 10 years in people talking more frequently and more openly about ethical dilemmas of all sorts," she explains.

Indeed, researchers face an array of ethical requirements: They must meet professional, institutional and federal standards for conducting research with human participants, often supervise students they also teach and have to sort out authorship issues, just to name a few.

Here are five recommendations APA's Science Directorate gives to help researchers steer clear of ethical quandaries:

1. Discuss intellectual property frankly

Academe's competitive "publish-or-perish" mindset can be a recipe for trouble when it comes to who gets credit for authorship. The best way to avoid disagreements about who should get credit and in what order is to talk about these issues at the beginning of a working relationship, even though many people often feel uncomfortable about such topics.

"It's almost like talking about money," explains Tangney. "People don't want to appear to be greedy or presumptuous."

APA's 2002 Ethics Code offers some guidance: It specifies that "faculty advisors discuss publication credit with students as early as feasible and throughout the research and publication process as appropriate." When researchers and students put such understandings in writing, they have a helpful tool to continually discuss and evaluate contributions as the research progresses.

However, even the best plans can result in disputes, which often occur because people look at the same situation differently. "While authorship should reflect the contribution," says APA Ethics Office Director Stephen Behnke, JD, PhD, "we know from social science research that people often overvalue their contributions to a project. We frequently see that in authorship-type situations. In many instances, both parties genuinely believe they're right."

APA's Ethics Code stipulates that psychologists take credit only for work they have actually performed or to which they have substantially contributed and that publication credit should accurately reflect the relative contributions: "Mere possession of an institutional position, such as department chair, does not justify authorship credit," says the 2002 code. "Minor contributions to the research or to the writing for publications are acknowledged appropriately, such as in footnotes or in an introductory statement."

The same rules apply to students. If they contribute substantively to the conceptualization, design, execution, analysis or interpretation of the research reported, they should be listed as authors. Contributions that are primarily technical don't warrant authorship. In the same vein, advisers should not expect ex-officio authorship on their students' work.

Matthew McGue, PhD, of the University of Minnesota, says his psychology department has instituted a procedure to avoid murky authorship issues. "We actually have a formal process here where students make proposals for anything they do on the project," he explains. The process allows students and faculty to more easily talk about research responsibility, distribution and authorship.

Psychologists should also be cognizant of situations where they have access to confidential ideas or research, such as reviewing journal manuscripts or research grants, or hearing new ideas during a presentation or informal conversation. While it's unlikely reviewers can purge all of the information in an interesting manuscript from their thinking, it's still unethical to take those ideas without giving credit to the originator.

"If you are a grant reviewer or a journal manuscript reviewer [who] sees someone's research [that] hasn't been published yet, you owe that person a duty of confidentiality and anonymity," says Gerald P. Koocher, PhD, editor of the journal Ethics and Behavior and co-author of "Ethics in Psychology: Professional Standards and Cases" (Oxford University Press, 1998).

Researchers also need to meet their ethical obligations once their research is published: If authors learn of errors that change the interpretation of research findings, they are ethically obligated to promptly correct the errors in a correction, retraction, erratum or by other means.

To be able to answer questions about study authenticity and allow others to reanalyze the results, authors should archive primary data and accompanying records for at least five years, advises University of Minnesota psychologist and researcher Matthew McGue, PhD. "Store all your data. Don't destroy it," he says. "Because if someone charges that you did something wrong, you can go back."

"It seems simple, but this can be a tricky area," says Susan Knapp, APA's deputy publisher. "The APA Publication Manual Section 8.05 has some general advice on what to retain and suggestions about things to consider in sharing data."

The 2002 APA Ethics Code requires psychologists to release their data to others who want to verify their conclusions, provided that participants' confidentiality can be protected and as long as legal rights concerning proprietary data don't preclude their release. However, the code also notes that psychologists who request data in these circumstances can only use the shared data for reanalysis; for any other use, they must obtain a prior written agreement.

2. Be conscious of multiple roles

APA's Ethics Code says psychologists should avoid relationships that could reasonably impair their professional performance or could exploit or harm others. But it also notes that many kinds of multiple relationships aren't unethical--as long as they're not reasonably expected to have adverse effects.

That notwithstanding, psychologists should think carefully before entering into multiple relationships with any person or group, such as recruiting students or clients as participants in research studies or investigating the effectiveness of a product of a company whose stock they own.

For example, when recruiting students from your Psychology 101 course to participate in an experiment, be sure to make clear that participation is voluntary. If participation is a course requirement, be sure to note that in the class syllabus, and ensure that participation has educative value by, for instance, providing a thorough debriefing to enhance students' understanding of the study. The 2002 Ethics Code also mandates in Standard 8.04b that students be given equitable alternatives to participating in research.

Perhaps one of the most common multiple roles for researchers is being both a mentor and lab supervisor to students they also teach in class.
Psychologists need to be especially cautious that they don't abuse the power differential between themselves and students, say experts. They shouldn't, for example, use their clout as professors to coerce students into taking on additional research duties.

By outlining the nature and structure of the supervisory relationship before supervision or mentoring begins, both parties can avoid misunderstandings, says George Mason University's Tangney. It's helpful to create a written agreement that includes both parties' responsibilities as well as authorship considerations, intensity of the supervision and other key aspects of the job.

"While that's the ideal situation, in practice we do a lot less of that than we ought to," she notes. "Part of it is not having foresight up front of how a project or research study is going to unfold."

That's why experts also recommend that supervisors set up timely and specific methods to give students feedback and keep a record of the supervision, including meeting times, issues discussed and duties assigned.

If psychologists do find that they are in potentially harmful multiple relationships, they are ethically mandated to take steps to resolve them in the best interest of the person or group while complying with the Ethics Code.

3. Follow informed-consent rules

When done properly, the consent process ensures that individuals are voluntarily participating in the research with full knowledge of relevant risks and benefits. "The federal standard is that the person must have all of the information that might reasonably influence their willingness to participate in a form that they can understand and comprehend," says Koocher, dean of Simmons College's School for Health Studies.

APA's Ethics Code mandates that psychologists who conduct research should inform participants about:

* The purpose of the research, expected duration and procedures.
* Participants' rights to decline to participate and to withdraw from the research once it has started, as well as the anticipated consequences of doing so.
* Reasonably foreseeable factors that may influence their willingness to participate, such as potential risks, discomfort or adverse effects.
* Any prospective research benefits.
* Limits of confidentiality, such as data coding, disposal, sharing and archiving, and when confidentiality must be broken.
* Incentives for participation.
* Who participants can contact with questions.

Experts also suggest covering the likelihood, magnitude and duration of harm or benefit of participation, emphasizing that their involvement is voluntary and discussing treatment alternatives, if relevant to the research.

Keep in mind that the 2002 Ethics Code, which goes into effect on June 1 this year, includes specific mandates for researchers who conduct experimental treatment research. Specifically, they must inform individuals about the experimental nature of the treatment, services that will or will not be available to the control groups, how participants will be assigned to treatments and control groups, available treatment alternatives and compensation or monetary costs of participation (see "What you need to know about the new code").

If research participants or clients are not competent to evaluate the risks and benefits of participation themselves--for example, minors or people with cognitive disabilities--then the person who's giving permission must have access to that same information, says Koocher.

Remember that a signed consent form doesn't mean the informing process can be glossed over, say ethics experts. In fact, the 2002 APA Ethics Code says psychologists can skip informed consent in two instances only: when permitted by law or federal or institutional regulations, or when the research would not reasonably be expected to distress or harm participants and involves one of the following:

* The study of normal educational practices, curricula or classroom management methods conducted in educational settings.
* Anonymous questionnaires, naturalistic observations or archival research for which disclosure of responses would not place participants at risk of criminal or civil liability or damage their financial standing, employability or reputation, and for which confidentiality is protected.
* The study of factors related to job or organization effectiveness conducted in organizational settings for which there is no risk to participants' employability, and confidentiality is protected.

If psychologists are precluded from obtaining full consent at the beginning--for example, if the protocol includes deception, recording spontaneous behavior or the use of a confederate--they should be sure to offer a full debriefing after data collection and provide people with an opportunity to reiterate their consent, advise experts.

The code also says psychologists should make reasonable efforts to avoid offering "excessive or inappropriate financial or other inducements for research participation when such inducements are likely to coerce participation."

4. Respect confidentiality and privacy

Upholding individuals' rights to confidentiality and privacy is a central tenet of every psychologist's work. However, many privacy issues are idiosyncratic to the research population, writes Susan Folkman, PhD, in "Ethics in Research with Human Participants" (APA, 2000).

For instance, researchers need to devise ways to ask whether participants are willing to talk about sensitive topics without putting them in awkward situations, say experts. That could mean they provide a set of increasingly detailed interview questions so that participants can stop if they feel uncomfortable.

And because research participants have the freedom to choose how much information about themselves they will reveal and under what circumstances, psychologists should be careful when recruiting participants for a study, says Sangeeta Panicker, PhD, director of the APA Science Directorate's Research Ethics Office. For example, it's inappropriate to obtain contact information of members of a support group to solicit their participation in research. However, you could give your colleague who facilitates the group a letter to distribute that explains your research study and provides a way for individuals to contact you, if they're interested.

Other steps researchers should take include:

* Discuss the limits of confidentiality. Give participants information about how their data will be used, what will be done with case materials, photos and audio and video recordings, and secure their consent.

* Know federal and state law. Know the ins and outs of state and federal law that might apply to your research. For instance, the Goals 2000: Education Act of 1994 prohibits asking children about religion, sex or family life without parental permission.

Another example is that, while most states only require licensed psychologists to comply with mandatory reporting laws, some laws also require researchers to report abuse and neglect. That's why it's important for researchers to plan for situations in which they may learn of such reportable offenses. Generally, research psychologists can consult with a clinician or their institution's legal department to decide the best course of action.

* Take practical security measures. Be sure confidential records are stored in a secure area with limited access, and consider stripping them of identifying information, if feasible. Also, be aware of situations where confidentiality could inadvertently be breached, such as having confidential conversations in a room that's not soundproof or putting participants' names on bills paid by accounting departments.

* Think about data sharing before research begins. If researchers plan to share their data with others, they should note that in the consent process, specifying how the data will be shared and whether the data will be anonymous. For example, researchers could have difficulty sharing sensitive data they've collected in a study of adults with serious mental illnesses because they failed to ask participants for permission to share the data. Or developmental data collected on videotape may be a valuable resource for sharing, but unless a researcher asked permission back then to share videotapes, it would be unethical to do so. When sharing, psychologists should use established techniques when possible to protect confidentiality, such as coding data to hide identities (a minimal illustration of this kind of coding follows this list). "But be aware that it may be almost impossible to entirely cloak identity, especially if your data include video or audio recordings or can be linked to larger databases," says Merry Bullock, PhD, associate executive director in APA's Science Directorate.

* Understand the limits of the Internet. Since Web technology is constantly evolving, psychologists need to be technologically savvy to conduct research online and cautious when exchanging confidential information electronically. If you're not an Internet whiz, get the help of someone who is. Otherwise, it may be possible for others to tap into data that you thought was properly protected.
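To make "coding data to hide identities" concrete, here is a minimal, hypothetical sketch in Python. It is not part of the APA guidance above; it simply assumes a CSV file of responses with made-up column names ("name", "email") and shows one common approach: replacing direct identifiers with codes derived from a keyed hash, with the key kept separate from the shared file.

```python
# Illustrative sketch only (assumed file layout and field names, not an APA-specified procedure).
import csv
import hashlib
import hmac

SECRET_KEY = b"keep-this-key-separate-from-the-shared-data"  # stored outside the shared dataset

def pseudonym(participant_id: str) -> str:
    """Return a stable, non-reversible code for a direct identifier."""
    return hmac.new(SECRET_KEY, participant_id.encode("utf-8"), hashlib.sha256).hexdigest()[:12]

def deidentify(in_path: str, out_path: str, id_field: str = "name",
               drop_fields: tuple = ("name", "email")) -> None:
    """Write a copy of a CSV with direct identifiers replaced by coded IDs."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        # Keep every column except the identifying ones, and add a coded ID up front.
        out_fields = ["participant_code"] + [f for f in reader.fieldnames if f not in drop_fields]
        writer = csv.DictWriter(dst, fieldnames=out_fields)
        writer.writeheader()
        for row in reader:
            coded = {"participant_code": pseudonym(row[id_field])}
            coded.update({k: v for k, v in row.items() if k not in drop_fields})
            writer.writerow(coded)

# Hypothetical usage: deidentify("raw_responses.csv", "shared_responses.csv")
```

Even with coding of this kind, as Bullock notes above, re-identification may remain possible when coded data can be linked to other records, so coding complements rather than replaces the consent and data-sharing planning described in this section.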

5. Tap into ethics resources

One of the best ways researchers can avoid and resolve ethical dilemmas is to know both what their ethical obligations are and what resources are available to them. "Researchers can help themselves make ethical issues salient by reminding themselves of the basic underpinnings of research and professional ethics," says Bullock. Those basics include:

* The Belmont Report. Released by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in 1979, the report provided the ethical framework for ensuing human participant research regulations and still serves as the basis for human participant protection legislation (see Further Reading).

* APA's Ethics Code, which offers general principles and specific guidance for research activities, available at www.apa.org/ethics.

Moreover, despite the sometimes tense relationship researchers can have with their institutional review boards (IRBs), these groups can often help researchers think about how to address potential dilemmas before projects begin, says Panicker. But psychologists must first give their IRBs the information they need to properly understand a research proposal.

"Be sure to provide the IRB with detailed and comprehensive information about the study, such as the consent process, how participants will be recruited and how confidential information will be protected," says Bullock. "The more information you give your IRB, the better educated its members will become about behavioral research, and the easier it will be for them to facilitate your research."

As cliché as it may be, says Panicker, thinking positively about your interactions with an IRB can help smooth the process for both researchers and the IRBs reviewing their work.

FURTHER READING

* American Psychological Association. (2002). Ethical principles of psychologists and code of conduct. American Psychologist, 57(12).
* Sales, B.D., & Folkman, S. (Eds.). (2000). Ethics in research with human participants. Washington, DC: American Psychological Association.

ON THE WEB

* APA's Research Ethics Office in the Science Directorate; e-mail; Web site: APA Science.
* The National Institutes of Health (NIH) offers an online tutorial, "Human Participants Protections Education for Research Teams," at http://cme.nci.nih.gov.
* NIH Bioethics Resources Web site: www.nih.gov/sigs/bioethics/index.html.
* The Department of Health and Human Services' (DHHS) Office of Research Integrity Web site: www.ori.hhs.gov.
* DHHS Office of Human Research Protections Web site: http://ohrp.osophs.dhhs.gov.
* The 1979 Belmont Report on protecting human subjects is at http://ohrp.osophs.dhhs.gov/humansubjects/guidance/belmont.htm.
* Association for the Accreditation of Human Research Protection Programs Web site: www.aahrpp.org.

Value of ethics in research


B.S. WARRIER

The basic tenet which we should honour is that our behaviour should never violate principles of integrity.

There are innumerable instances where dishonest means have been adopted by intellectuals of distinction. Many of them lost the honour and prestige they had built through years of hard work. A research scholar should never even dream of resorting to any unethical act. Achieving personal glory through deceit would never give us peace of mind.

The great German philosopher Immanuel Kant (1724-1804) said, "In law a man is guilty when he violates the rights of others. In ethics he is guilty if he only thinks of doing so." This may be read along with the pithy statement of Dr Einstein: "Relativity applies to physics, not ethics." He also said, "The ideals which have always shone before me and filled me with the joy of living are goodness, beauty, and truth. To make a goal of comfort or happiness has never appealed to me; a system of ethics built on this basis would be sufficient only for a herd of cattle."

Listening to these great men is quite fine. But we should know fairly well the norms of ethics in research. What would amount to unethical acts, which we should shun with all our might? The basic tenet which we should honour is that our professional behaviour with other researchers, past or present, should never violate principles of integrity.

Our research findings should be the product of our original work of a genuine nature. The studies should be transparent. The test results should be real, and not cooked up or fabricated. Communication to other researchers in the same field should be honest. We should keep our word. We should never attempt to mislead them. We should have an open mind while discussing our work with others in the field. Healthy criticism should not deter us from continued effort. Never disclose the secrets of your organisation or unpublished data belonging to others. Keep them confidential.

In research in the discipline of social sciences, we may have to gather data on diverse aspects of people. They may even share purely private and personal information with us. We have to protect their privacy. We should tell them beforehand the purpose of our study, as also the way in which data pertaining to them would be used by us. We should not publish any personal data without their permission. A medical researcher should never disclose the health information relating to a patient without his consent.

Human dignity should be respected at all times, especially while dealing with vulnerable populations. You might have heard of the notorious Tuskegee study of untreated syphilis in black males, conducted in Alabama in the U.S. from the 1930s. The scandal became public only in 1972. That was an instance of an unethical and outrageous experiment on human beings. Research on animals should be limited to properly designed experiments honouring principles of ethical handling.

Whatever we do in research should be done with extreme care and attention. There is no room for negligence or indifference. Persistence will help us to try again and again enthusiastically despite temporary setbacks. Further, we should be honest with ourselves when we analyse data. We should be totally free from bias or wishful thinking. The objective of our research should not be amassing illegal wealth of any kind. We should not violate any copyright, patent, or any other form of intellectual property right. Never use someone else's unpublished data without his permission. Give due credit whenever you use material from others. Never do anything that may damage your credibility.

Researchers in different fields may look at the same problem from different angles. For example, evolving more effective methods of using insecticides for enhancing agricultural output may be viewed from a purely technical standpoint by an agricultural scientist. It may be unethical if he is totally blind to the harmful effects of insecticides on farmers' health. Better crops should not be accompanied by sick farmers. Perhaps the second aspect may be highlighted by an environmentalist or a researcher in ecology and public health.

We should not publish the same material in the same form in two journals, hoping that the publishers may not discover the trick. A supervisor should give genuine guidance safeguarding the best interest of the research fellow, and desist from any attempt to exploit him.

Tricksters

There have been tricksters in search of name and fame as scientists, who have come forward with deceptive findings based on forgery or fabricated data. They represent an ominous weakening of the norm of scientific truthfulness. Such efforts that do not fall in the category of genuine research should be shunned.

You must have heard about the shame and tragedy that gripped the South Korean biomedical scientist Hwang Woo Suk of stem cell fame, who sprang up as a national hero and then rose to international stardom. Hwang's lab was the only one in the world that claimed remarkable breakthroughs in cloning human cells, making new human embryos from single adult cells. Cells cultivated from such embryos, called stem cells, could be crucial for studying diseases and medical treatment. His research papers appeared in the prestigious journal 'Science'. Later on his claims were found to be fraudulent. Hwang had to resign from Seoul National University in March 2006 and apologise for his actions.

There is another incident involving two British scientists. In 1912, Charles Dawson and Arthur Smith Woodward produced fragments of the skull of the so-called Piltdown Man, claimed to have been discovered by workmen in gravel pits in Sussex. They suggested that Piltdown Man represented an evolutionary missing link between ape and man. It was only after forty years that Piltdown Man was shown to be a composite forgery, made out of a medieval human skull, the 500-year-old lower jaw of an orangutan, and chimpanzee fossil teeth.

Another story relates to 'painting the mice'. In 1974, Dr. William Summerlin, a top-ranking transplantation immunologist at Sloan-Kettering Cancer Center in New York, used a marker pen to make black patches of fur on white mice in an attempt to prove his new skin graft technique. He claimed that there would be no rejection in his method. But he appeared on the front pages of newspapers, not as a discoverer but as a perpetrator of fraud. The expression 'painting the mice' has come to mean fraud in research.

A perpetual motion machine claimed to have been invented by Charles Redheffer drew large crowds in a paid exhibition in New York in 1813. The machine was a fraud.

Paul Kammerer (1880-1926), an Austrian scientist, brought forth his 'Lamarckian inheritance'. He claimed to have proved that organisms may acquire characteristics and pass them on to their offspring. His experiments with toads were a fabrication: black ink had been secretly injected into the hind legs of the toads. He was exposed, and he committed suicide, perhaps because of the humiliation.

Shinichi Fujimura, a Japanese archaeologist born in 1950, earned fame by claiming to have dug up stone artifacts dating back 500,000 years or more. Later it was found that he had buried the artifacts himself and then dug them up, presenting them as ancient treasures. In a public appearance, he bowed his head in shame and said, "I had been tempted by the devil."

Jan Hendrik Schön, a young researcher at Bell Laboratories, published a dozen papers on his discoveries in advanced electronics in journals including Nature at the turn of the century. The findings were later found to be a hoax.

Moral

These real stories have been related to establish the priceless value of ethics in research. Exaggeration of a result, disregarding counter-evidence, is as unhealthy as falsification of data or experiments. Deliberate misrepresentation of an inference, and its perpetuation, will certainly land the fraudulent person in trouble. 'Cutting corners', a euphemism for taking wrong shortcuts or deliberately distorting evidence, will meet the same fate. Honesty is the best policy in research as well.
