Workshop on Higher Education and Professional Responsibility in CBRN Applied Sciences and Technology across the Sub-Mediterranean Region
3-4 April 2012. Palazzo Zorzi, Venice
Session 1. Status - Culture of Safety and Security and Responsible Science
Science and Social Responsibility [John Crowley, UNESCO SHS, France]
DRAFT FOR PRESENTATION AT THE WORKSHOP ON
"HIGHER EDUCATION AND PROFESSIONAL RESPONSIBILITY IN CBRN
APPLIED SCIENCES AND TECHNOLOGY ACROSS THE SUB-MEDITERRANEAN REGION",
VENICE, 3-4 APRIL 2012
Science and Social Responsibility
John Crowley
UNESCO Division of Ethics, Science and Society1
No one, I suspect, would deny that scientists have responsibilities. Their primary
responsibility is to science itself: to search impartially for truth, with all the epistemological
and institutional implications that such a search entails. They also have civic responsibilities,
as citizens and as members of society, that are inevitably coloured by their specific status as
scientists – for instance because of the technical knowledge they may alone have access to.
It is less obvious that scientists might have additional responsibilities as scientists. Indeed,
the claim is often denied. Some see the idea of responsible science, or of science ethics, as
a restriction or an imposition on science – one that denies the fundamental character of
science as free inquiry, which implies the entitlement of scientists to follow a line of research
wherever it may lead.
The issue, therefore, is to give reasons for the very existence of social responsibility.
There is in fact a simple, straightforward and consensual basis in article 27 of the Universal
Declaration of Human Rights, which declares the human right "to share in scientific
advancement and its benefits". It follows logically from this human right, and from the need to
take it seriously, that scientists, and all those engaged in shaping the institutions and
practices of science, have a correlative duty to help create the conditions in which the right
can be realized. However, this particular human right has not been extensively developed,
either conceptually or legally, and its implications are perhaps not entirely clear. Furthermore,
it requires not simply assertion but also, if it is to be taken seriously as a basis for the
definition of responsibilities, justification.
A full justification would be beyond the scope of this presentation. However, there are two
basic ideas that enable the key point to be made fairly quickly.
The first is the universal nature of science, which has two implications, one "thin" and
incontrovertible, the other "thick" and debatable, but certainly plausible.
What can hardly be denied is that science is fundamentally impersonal. While many
discoveries, theorems, experiments, equations etc. bear the names of their discoverers, this
is purely a matter of courtesy. They are in fact inherently anonymous, radically detachable
from the conditions of their invention. Any piece of scientific knowledge – valid or invalid –
has the same content for anyone who can understand it. It follows that potential universal
access to the results of science is written into the logic of science itself. Failure to achieve
actual universality of access must therefore be ascribable to identifiable barriers:
inadequate training, wilful refusal, diffusion lags, and so on.
What is more debatable is whether science actually operates in this way – or whether it
should. The elementary "bits" of scientific knowledge may be radically universalizable, but
science is not just a collection of "bits". It is structured not just by institutions and practices,
but also by theories and by worldviews that give it meaning as well as content. The
connection between the understanding produced by science and human self-understanding,
1 The views expressed in this paper are those of the author and, except where specifically stated otherwise,
should not be regarded as official statements of a UNESCO position on the topics addressed.
Contact email: j.crowley@unesco.org.
as expressed in philosophy, literature, art and everyday life, is clearly much more complex
than the implications of operational method or even epistemology. Arguments are indeed
widespread in contemporary philosophy that science is not in fact universal in the "thick"
sense, and at a more practical level it is clear that the institutional organization of science is
not universalizable in the same sense as the elementary bits of knowledge are.
This point, it should be emphasized, does not qualify the human right "to share in scientific
advancement and its benefits". What it does call into question, however, is the extent to
which the right follows directly from the internal structure of science. If the right were a purely
external supplement, then it clearly would require an autonomous justification. Without going
into detail for present purposes, the language of the Universal Declaration of Human Rights
deserves emphasis in this regard. It does not refer only to the benefits of science, important
though they are, but also to sharing in "scientific advancement" – in other words in science
as a process. From the fact that this process is not inherently universal (unlike the
elementary "bits" of scientific knowledge), what follows is a responsibility to create the
conditions in which scientific advancement can actually be shared. Science, in other words,
is bundled up with worldviews, institutions and social practices that are not strictly speaking
"scientific" and that are acceptable only in so far as they are, in the relevant sense, "shared" –
which means, minimally, elaborated through mechanisms that are recognized by all
stakeholders to be equitable. This is a huge challenge, of course, but it expresses rather than
negates the essentially universal character of science.
Alongside universality, the second basic idea that can help to establish the validity of the
human rights framework for science is integrity.
It is sometimes assumed in public debate that integrity has an optional character, in the
sense that science can deliver results regardless of integrity – and perhaps deliver more
effectively when it ignores ethical considerations. There are clearly cases in which this can
be argued – for instance in certain areas of medical research, where strong bioethics
frameworks have been established precisely because violating principles such as informed
consent could, plausibly, enable results to be achieved faster. Generally speaking, however,
indifference to integrity tends to be positively correlated with indifference to methodology.
The most serious violations of scientific ethics in the 20th century – for instance, the Mengele
experiments in Nazi Germany – typically involved not just gross mistreatment of research
subjects but also routine fabrication and falsification of data, precisely because they were
responding to politically biased and scientifically arbitrary agendas. And of course, the results
were not peer-reviewed. There are reasonable grounds to believe, therefore, that unethical
science tends, for institutional and social reasons, to be bad science. Conversely, integrity,
which underwrites the availability of the results of science to all (though not their effective
sharing), provides an internal justification for the established human-rights-based
responsibility framework.
Assuming this framework to be adequately justified, it is important to consider what it actually
means. While the right to share in scientific advancement and its benefits is inadequately
developed in legal and conceptual terms, it is nonetheless possible to draw some reasonably
firm conclusions from a series of authoritative instruments that have been adopted by
UNESCO to address science ethics and the institutional structures of science. Reference is
made in this regard, inter alia, to the 1974 Recommendation on the Status of Scientific
Researchers, the 1999 Declaration on Science and the Use of Scientific Knowledge and the
2005 Universal Declaration on Bioethics and Human Rights. Drawing on these documents,
three important dimensions of the social responsibilities of scientists can be drawn out, which
build on but also expand the basic principles of universality and integrity: the duty to refrain
from harm; the duty to aim at social benefits; and the duty to ensure access not just to results
but also to equitable opportunities to participate in scientific processes.
These duties are very generic, of course. Each of them needs to be filled in before it can
be applied to specific cases. For instance, the kind of harm that can be inflicted by the
research process or by technological applications varies considerably between disciplines. In
biomedical research, the principle of informed consent serves to protect human subjects
against inappropriate protocols. A different approach is required – even if it can be related by
analogy to informed consent – to deal with, for instance, weapons research or environmental
impact. Similarly, the direct social benefits are typically much easier to identify if one is
working on, say, vaccine development or nanofiltration than if one's research deals with
prime numbers or ancient history.2 Furthermore, it is a tricky business to give weight to
potential social benefits in decisions about priority-setting and funding, since such benefits
depend not just on scientific success at the research level, but also on a range of other
factors such as effective development of applications and social take-up. Finally, the kind of
"access" that might be relevant in human rights terms is likely to vary considerably from case
to case. In some research areas, publication in reputable journals may suffice to ensure that
those who could benefit from the results have access to them. In other areas, it may be
necessary to overcome barriers to access relating to factors as diverse as the cost of
subscriptions, language and lack of capacity to develop applications. To take just one
example, equitable access to climate science for developing countries cannot be ensured
simply by giving them free access to raw satellite observation data, if the observations do not
have the right geographical focus and if the beneficiaries lack the human and numerical
capacity to input the data into models and other computational tools.
Nonetheless, while generic, the three duties do offer a reasonably clear basis for what
scientists and science institutions need to worry about and respond to. They also give a fairly
persuasive negative picture of what irresponsible science might look like. Very simply put,
indifference to harm, social benefit and equitable access is unethical. Concern does not of
course exhaust ethics. But without concern, ethics has nothing to build on.
This framework for the social responsibilities of scientists – which, while established, has
never been comprehensively applied – is not immediately operational. It
faces a series of challenges that point to the need to sharpen, to operationalize and perhaps
to update it. Some are very familiar; others are recent or even emerging.
Among the familiar challenges is the indeterminate character of the social impact of any
given area of scientific research. Technological development may of course be more
determinate – although even then significant uncertainties are likely to remain with regard to
social uses and social effects. Scientists cannot, therefore, be expected to assess their
actions by reference to detailed predictions of what would happen. Ethical concerns
typically operate in the realm of what might happen. They involve considerations of
uncertainty and risk and are framed by the very old philosophical notion of prudence – and
the much newer one of precaution.
Recent and emerging issues relate both to changing patterns of scientific organization and to
scientific developments that modify the character of the relation between science and
society.
The point is not that science has suddenly acquired a capacity to change the world that it
previously lacked. On the contrary, science has been world-changing ever since the notion of
science was first adumbrated – not just through the application of specific technologies,
important as they are, but also because science is a worldview that bears on human
2 Indeed, the general question of how ethics and social responsibility apply to the social and human sciences is a
complex issue that cannot be discussed in detail in this paper.
consciousness and self-understanding. However, the power of science and technology to
shape the very fabric of life sharply expanded in the 20th century, and was dramatized by the
destructive deployment of nuclear fission. In the 21st century, we are increasingly sensitive to
the implications of biotechnologies, broadly understood, including their interfaces and
possible convergence with nanotechnologies and computer sciences, which may over time
reshape what we mean by life and what we understand to be "human".
What this means, in practice, is an added level of complexity embedded in the question of
what scientists could or should be responsible for. The things that might happen are
increasingly diffuse, and the causal connections sometimes elusive.
Let me take just two examples.
The first is so-called "dual use", which in this context involves the malicious use by an
unrelated third party of results derived from bona fide research that is intended to have
beneficial applications or is not pursued with a view to application at all. The question is to
what extent scientists have responsibilities with respect to "dual use research of concern"3 –
in other words possible malicious use by hypothetical third parties of their research results.
The second is the set of concerns about converging technologies that has emerged in some
circles, notably in Europe, as a consequence of the reception of the 2002 NSF report by
Roco & Bainbridge on "enhancing human potential". These concerns led the European
Commission, in the code of conduct for responsible nanoscience that it published in 2008, to
propose banning public funding for research in nanoscience that could lead to
applications involving the "non-therapeutic enhancement of human potential".
In both cases, the difficult question is how such concerns might be operationalized. Clearly, it
is unrealistic and even unfair simply to state that individual scientists will be held responsible
for misuse of their research. At the very least, their ability to deal with such responsibilities
will depend on the institutional support provided to them, on the broader regulatory context,
on the actions of many others in the science and technology system, and of course on
adequate education and training.
It bears emphasizing that these areas of responsibility are, in the strong sense of the word,
social. First, because they fully correspond to the three core dimensions of social
responsibility, as traditionally constituted: do no harm, aim at social benefit, ensure access.
Secondly, because they relate to the social organization of science, and not simply to
science as an abstraction ("understanding the world") or as an individualized set of
professional practices.
However, there is also a broader category of social responsibilities that concern the social
organization of science as such, regardless of the specific content of scientific research and
technological development. It is a very familiar point that, in this area, considerable changes
have occurred and are still ongoing. Nonetheless, it remains helpful to highlight some of the
dynamics that are putting pressure on any comprehensive and practically applicable
approach to scientific responsibility.
In this respect, intellectual property obviously deserves special mention. The point is not that
intellectual property as such necessarily conflicts with the orientation of science towards
3 The phrase "dual use research of concern" (DURC) was developed by the US NSABB in order to define the
priority area for new science policy frameworks with respect to biosecurity. The objective is to operationalize the
concept in such a way that DURC (i.e. research giving rise to clearly identifiable potential concerns on the basis of
credible security information) can be singled out at the research funding or publication level and treated
appropriately by evaluators, editors and others with, in the broad sense, a "gatekeeping" mandate.
public benefit – though of course science and technological innovation long predate
intellectual property as elaborated in the 18th and 19th centuries. But there are certainly new
uses of intellectual property that raise questions with respect both to the orientation of
science and to access to its results. The relevant contrasts are very familiar. Jonas Salk
famously did not seek to patent the polio vaccine, though it would undoubtedly have been
admissible, because he felt it would have been inappropriate to do so. Few scientists would
do the same today. Indeed, in many countries, few scientists could, given how research is
funded and the institutional pressures under which scientists operate. Similarly, as far as I
know, no attempt was made to patent any of the transuranic elements, although it would
arguably have been possible. Yet technologically revealed fragments of the natural
world, such as genes, have been successfully patented in recent years.
Intellectual property does not exhaust the significance of commercial dynamics as they bear
on the social organization of science. There is also a partly separate set of issues connected
to technology choices and how they relate to social needs. There is no lack of evidence that
technology choices, notably for development, tend to be driven by supplier push rather than
by impartial needs assessment. The difficult question, which would require further
consideration, is what social responsibilities scientists might have, individually and through
their institutions, to foster an improved and more participatory approach to technology
assessment, especially in developing countries.
Commercial pressures are just one aspect of the context in which science and technology
currently operate. Of particular importance for the topic of this workshop is military research,
which influences research agendas, restricts access to research results, and of course
creates the risk that science and technology will be deployed deliberately for harmful
purposes. It is well known that military funding currently dominates a number of areas of
emerging research, notably in converging technologies. The implications of this situation
require careful consideration. At the very least, military research programmes require more
open and transparent discussion than they typically receive.
To conclude, I should like to emphasize that the various ethical or responsibility issues I have
referred to, and the others that could be added in the same vein, are not exclusively, and in
some respects not primarily, moral dilemmas for individual scientists. Of course, individual
scientists can make a difference, and need to be adequately equipped, through education,
training and institutional support, to cope with their responsibilities. But social responsibility is
first and foremost a social, and therefore institutional, issue. If scientists are to act
responsibly, policy-makers and institutional leaders also have duties to create the right kind
of framework â one that encourages, supports and ultimately normalizes ethical science and
technology.
Alongside education, awareness-raising and training, which are obviously essential,
UNESCO has a major role to play in shaping international and national frameworks. There
are three main levels in this regard.
First, providing a forum for discussion about the ethical challenges that are emerging and the
principles that can be elaborated to deal with them. As an intellectual agency, UNESCO is
naturally convinced that such discussions are valuable in themselves, regardless of the legal
or institutional follow-up given to them. We have established several expert advisory bodies
with the mandate precisely to foster independent and open-ended discussion in these areas.
Of particular relevance to this workshop are the International Bioethics Committee and the
World Commission on the Ethics of Scientific Knowledge and Technology. They can set their
own agenda with respect to emerging issues, conduct hearings, mobilize expertise, publish
reports, and thereby contribute to shaping international agendas, including the institutional
agenda of UNESCO itself.
Secondly, UNESCO is an intergovernmental setting within which, if our member states see
fit, legal commitments and political statements can be adopted. There is a range of possible
instruments, with different legal status and practical implications:
- declarations, which are statements by the General Conference, addressed to
humanity as a whole, expounding in political terms shared commitments – for
instance the 2005 Universal Declaration on Bioethics and Human Rights;
- recommendations, which are statements by the General Conference, addressed to
the member states of UNESCO, calling on them to do certain things and to report
periodically to the General Conference on implementation – for example the 1974
Recommendation on the Status of Scientific Researchers; and
- conventions, which are adopted by the General Conference but require subsequent
ratification by member states and, once they have entered into force, become legally
binding commitments between the states parties. There is no directly relevant
example in UNESCO, but the 2005 International Convention against Doping in Sport
does relate to some aspects of scientific responsibility, in so far as doping is
increasingly about deliberate and sophisticated misapplications of science.
Many valuable things are done within the international community without recourse to
international law, but the UN system is, essentially, a law-driven community. Consideration of
the possible merits of new legal instruments in specific areas is therefore an ongoing
process.
Thirdly, on the basis of agreed principles, and within the setting either of existing legal
commitments or of agreed programmes, UNESCO can provide technical support for the
development in each member state, and where appropriate at institutional level, of the
legislative, regulatory, advisory or aspirational frameworks that can give substance to
scientific responsibility. This level is of course essential. If ethical reflection stays exclusively
at the level of identification of issues and elaboration of principles, it risks being toothless.
The crucial challenge is to establish the mechanisms by which ethics – in this case the social
responsibilities of scientists – can be embedded in the routine practices of science and
technology at all levels.
In this respect, a culture of responsibility parallels the familiar and in many ways better
understood idea of a culture of safety. In the same way as safety cannot simply be added to
a pre-existing institution, but requires extensive re-engineering of processes, attitudes and
behaviour, a responsible institution does not simply happen to exhibit responsibility. It needs
to be configured – and usually, in practice, reconfigured – to become responsible.