Is interdisciplinary scientific research easily evaluable?

18 July 2012


André JC 1-2, Frochot C 2, Toméi F 3

1 INSIS-CNRS, 3 rue Michel Ange F75016 Paris - France
2 LRGP UPR 3349 CNRS - University of Lorraine – 1, rue Grandville F54000 Nancy - France
3 Department of Occupational Medicine, “Sapienza” University of Rome, Italy

Citation: André JC, Frochot C, Toméi F. Is interdisciplinary scientific research easily evaluable? Prevent Res 2012; 2 (3): 196-220. Available from: http://www.preventionandresearch.com/ . doi: 10.7362/2240-2594.033.2012


Key words: research, disciplines, society

Abstract
 
Introduction: Research is increasingly divided into different scientific disciplines and this tends to lead to difficulties in associating knowledge from different specialized fields in projects of public usefulness such as for medical or OSH research.
 
Objectives: This essay’s objective is to explore the problem of scientific evaluation in this important area both for readers of P&R and those with a more general interest. The topic chosen represents an important problem for the development of such activities facing decision-makers and, to a lesser extent, peers, involved in a form of conservatism of paradigms, associated with mono-disciplines. The paper will discuss the robustness of traditionally used indicators in comparison with the perceived reputation of researchers involved in interdisciplinary projects.
 
Methods: The study covers the topic of interdisciplinary actions using research results from different disciplinary fields. Most of the methodologies for assessing research performance nowadays are broadly based on bibliometric indicators. Advice on interdisciplinary activities is proposed and emphasis is placed on proposing better methods to assess researchers’ reputations in complex fields which associate several scientific disciplines.
 
Results: The paper does not provide direct results, but the authors consider that this kind of proposal may represent one of the better bottom-up approaches to promoting change in the culture of some research assessors, who need to take into account a new notion of excellence associated with an applied vision which must, of course, also benefit Society.
 
Conclusions: No final conclusion is proposed in this essay which nonetheless argues in favour of a better level of evaluation for (good) interdisciplinary research.

 
“What an odd word “disciplines”... An old word, or rather old words, as old as the European vernacular languages, and which drag behind them less Roman Latin than education during the medieval period. In both English and French, there are simultaneously the verb and the noun. The name which gave us interdisciplinarity consists of fields of study defined by their contents and by the Institution. But the verb implies castigation and punishment” (1).
 
“Interdisciplinarity, as such, can be approached via two questions: What are its finalities and what are its tools? Interdisciplinarity, at the first level, is inherent to all scientific work” (2).
 
“Technologies and the market draw us towards our future but our representations and our non-understanding keep us in our past. How can Man be so late with regard to that which he created? How could we have mastered our tools so well but not those who use them?” (3).
 
“Collaborations between disciplines and openness between fields of expertise do not necessarily mean breaking down boundaries” (4).

Background
“Since the knowledge of Man and Society was instituted into sciences, like the Natural Sciences, our relationship to telling the truth about the social world seems irremediably subject to the domination of specialization […]. Even inside the world of experts, specialization favors autism: economists tend to read (and, preferably, cite) only economists (those, preferably, of the same chapel), sociologists only read other sociologists, legal researchers only read the work of other legal researchers, and so on, right up to that paroxysm of decline represented by self-citation as the privileged mode of reference” (5). For decades, bibliometric indicators have been extensively used in research performance evaluation (6-10), with a great deal of bias (11, 12). In the case of interdisciplinary research work, as with classic basic science, Dumas et al (13) have shown that further research needs to be done to find better indicators of the perceived reputation of individual researchers. The aim of this paper is to discuss this kind of expertise.
Let us take as a hypothetical starting point the case of a grant submission for a project linking overlapping scientific disciplines, which has to be evaluated by independent experts from several disciplines who nonetheless have practically no experience of interdisciplinary projects. These experts are simply required to respect the procedures and criteria linked to the attribution of the grant. In this respect, how should they choose between a proposal which describes a future collaboration by researchers from different disciplines, consisting nonetheless of a juxtaposition of mono-disciplinary projects with clear objectives and deliverables, and a truly interdisciplinary proposal which is less well thought out and less likely to yield precise results? Should they prefer the first project, which entails less risk, or the second, which is less clear and involves a new risk derived from the need for a common language, “good” governance, etc? “How likely are those outstanding interdisciplinary proposals to emerge in such conditions? And aren’t most of […] the colleagues on the committee quite content with the status quo which allows disciplinary business to go on as usual at the relatively minor cost of some interdisciplinary rhetoric?” (14).
Research in Prevention is quite a complex subject, associating as it does both validated results from hard-science experiments and sometimes heuristic knowledge and expertise from the humanities and social sciences, legal sciences, etc. In this respect, this field, important for society, is a good example of interdisciplinary research work, and yet projects in this area are difficult to develop. This is because they may present little attractiveness for researchers; involve inherently difficult forecasting linked to a complex combination of social needs and demands; suffer from weak evaluation of the notion of scientific excellence in a field where science and society need to be reconciled; and run up against the force of inertia of established disciplines. This paper focuses on the difficult problem currently arising in modern society regarding the correct evaluation of interdisciplinary research work and is therefore of course relevant to the case of scientific research in the area of Occupational Safety and Health (OSH).
Of course, as Heintz and Origgi (15) explained, there are major problems inherent in interdisciplinary research projects, namely:
·Vocabulary: Each discipline has developed its own vocabulary, and interdisciplinary work requires time for translation in order to develop a common language for communication, with a view to attaining a mutual vision for fruitful cooperation;
·Methodologies: Each discipline uses its own methods of investigation. Time pressure is also a constraint if misunderstandings between members of the interdisciplinary research team are to be avoided;
·Cognitive constraints: It is difficult to find people able to cover the whole field of expertise in an interdisciplinary project; “Yet, deep knowledge of different disciplines is needed to carry out genuine interdisciplinary research” (15);
·Project duration: When an artifact needs to be developed, the interdisciplinary project will, following success, have attained its objectives and be stopped, unlike in mono-disciplines, where the work may need to be pursued further. From an epistemic point of view, the creation of knowledge does not follow a linear, time-resolved process, which makes it difficult to synchronize knowledge production from different disciplines;
·Institutional constraints: Research institutions are often organized according to disciplines, and evaluation procedures often depend on the power of the disciplines involved.
“Evaluation means thinking within and on behalf of a given professional community about what should be done in a given social situation, particularly regarding all the duties linked to the exercise of the profession in question” (16). One century ago, scientists, a true social elite, published their research results in national scientific journals without real prior evaluation, and the quality of their universities provided a guarantee of the quality of their work. The transition to “Mass Science” (there are between 1 and 3 million researchers in Europe!) has inevitably changed these practices: “today, scientific quality is assessed using figures” (17, 18), and there are hundreds of thousands of scientific journals around the world in which researchers can express their ideas. But Elster (19) teaches us that “the analysis of choice as the maximization of an objective such as usefulness or profit seems to promise the creation of the scientific ideal of a sole prediction”. It is a basic mathematical fact that a “well-behaved” function defined on a set which is also “well-behaved” will reach its maximum value for a unique value of the variable. The force of the theory of rational choice is inherent to its supposed capacity to produce accurate quantitative predictions… Ségalat (20) writes that a “bureaucracy does not care whether the tools which it uses are appropriate, only whether those tools suit its own purposes”!
However, would the developed countries, with their different cultures, all be subject to the same obsession? Science aims to be universal - would this be the same for States’ bureaucratic methods, or for the interests of intermediaries such as publishers of scientific journals (as an example, the budget of the American Chemical Society is apparently around 400 million dollars per year), or, finally, for scientists who have hemmed themselves in with elitist principles which do not comply with research management charters? Moreover, Hannah Arendt (21) tells us that “all judgments have hidden prejudice within. Only the individual will be judged, but not the criterion, even less its relevance as compared with what is evaluated”.
Having said all this, according to targets (decision makers, research institutes, citizens, industrialists), the classic leitmotiv is to take the least possible immediate risks when funding research. However if techno-scientific or interdisciplinary evaluation is too complex, how can one ever be sure of:
·Making the “right choice”?
·Making the right long-term investments?
·Limiting the losses caused by too many assessments, with low coherency levels, and too many proposals needing to be prepared and assessed, all of which is time- and energy-consuming and leads to reduced teamwork?
What should be thought of the requirement of competence on a subject alongside the need for independence when assessing a colleague or one of his/her proposals?
 
These rapidly sketched-out questions suffice for us to imagine the difficulty in harmonizing the views of the authorities and their researchers, of decision-makers and society, even if everybody agrees on the necessity of judging (correctly) the quality of scientific production for progress… The situation becomes still more complicated if we take into account Debord’s logic (22), which defines modern (postmodern?) society as a society of the spectacle, made up of representations, pictures and the appearance of small powers which characterize the social relationships between individuals. It is therefore the “quality” of this kind of permanent spectacle, depending on new artifacts, which would need to be evaluated, and not the general purpose of a society, which today seems to be defined only by the technological development of the “spectacle”, exploiting the knowledge and innovations provided by science.
 
The authors of scientific articles use scientific journals as the major vector for their communication with their disciplinary community and, probably to a lesser degree, with a broader public, provided they possess the right language to make their messages understood. Peer review and the impact factor of the scientific journal can give an idea of the importance of published work (even if critical information is available on the usefulness or otherwise of this kind of quantitative impact factor) (23). This “material” can undoubtedly be used as a basis for a discussion of how to judge the originality or rationality of scientific methods so as to achieve consensus, but what about the quality of targets and goals? What then does a measure of impact signify if we leave aside possible financial returns? Do we then risk knowing everything about “how” without having ever discussed “why”? Melman (24) writes: “Moreover it is rather logical that tyrannical minorities which over-promote such traits of their specific natures only see in the law the barrier it puts up to the expression of their uncriticisable and complete narcissism and to the manifestation of their desires”… So, between a hurried reduction and a conclusion regarding the complete inappropriateness of the system of scientific production for credible assessment, the procedure seems riddled with numerous traps which could lead to polemics (25).
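To make the quantitative reduction discussed above concrete, the widely used two-year journal impact factor boils a journal’s influence down to a single ratio: citations received in year Y to items published in the two preceding years, divided by the number of citable items published in those years. A minimal sketch follows; the figures are invented purely for illustration:

```python
def two_year_impact_factor(citations, items, year):
    """Impact factor for `year`: citations received in `year` to articles
    published in the two preceding years, divided by the number of
    citable items published in those two years."""
    cited = citations[year - 1] + citations[year - 2]
    published = items[year - 1] + items[year - 2]
    return cited / published

# Hypothetical journal (all figures invented for illustration):
citations_2012 = {2010: 150, 2011: 90}   # citations received in 2012
items = {2010: 60, 2011: 40}             # citable items published per year

print(two_year_impact_factor(citations_2012, items, 2012))  # → 2.4
```

The sketch makes the essay’s point visible: the ratio says nothing about why the articles were cited, by whom, or to what end - precisely the “why” that the measure of impact leaves aside.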
 
In these conditions, do researchers not also need to consider the idea that the requirement to justify their work, to be reviewed by peers and therefore to justify their own choices, pushes them into conformity through a strategy of risk minimization aimed at securing recognition of the work’s inherent quality? Bettman, Luce and Payne (26) suggest that this leads to too much stress being laid on the ease of justifying oneself, which allows us to avoid arbitration and possible “negative” emotions. The use of only the supposedly relevant quantitative criteria increases the likelihood that the consequence determines the cause, as in a tautology!
Moreover, to illustrate some “sliding effects” on the foundations of evaluation, it is useful to rapidly discuss the context of the evaluation of interdisciplinary research (with applied objectives) linked to the EU’s different framework programs (FWPs). During the Cold War these were based on innovation, security and scientific excellence, while today these framework programs are used for the development of economic competitiveness and for the creation of European research networks in a context which is increasingly concerned with social aspects. Originally, the military and security requirements of the countries of the Union (and Western countries as a whole) tended to provide direction for research. Things changed with the easing of East-West tensions, and evaluation became more associated with socio-economic considerations. Excellence is still essential, but now research must produce added financial and economic value for European society. Recently (6th and 7th FWPs), the evaluation procedures have been broadened to take better account of society’s willingness to participate in innovation (producers, users, major trends in EU society, and repair expenses for insurance companies). These different elements characterize the correlation between science, technology and society and of course need to be taken into consideration for the scientific evaluation of interdisciplinary activities, which also have to help open science up to broader society. A. de Tocqueville (27) suggests that “since works issued from intelligence had become the sources of force and wealth, every development of Science, all new knowledge and every new idea had to be viewed as a seed of power put within the reach of the people”, which would mean there are many links between the creation of scientific knowledge and society.
 
The notion of elitism probably differs when one considers, in an integrated manner, the role of social player which the development of innovations and their relations with the socio-economy enable. These comments therefore rightfully pose the essential question of the definition of the concept of excellence (and its accompanying modes of evaluation, which do not necessarily correspond to an “A” or an “A+” in the existing evaluation from AERES, the French national evaluation system for science) (28), which must be based on conventions and on a system of values compared subjectively with a stereotypical “picture-type”. We then move more towards forms of judgment derived from qualitative or semi-quantitative choices. This art of action (29) is defined by the expression of the “appropriate” relation between different heterogeneous criteria of evaluation according to a situation, either as it takes place or as it is envisaged. The decision-maker therefore has a real responsibility and cannot easily shelter behind stereotypical or blocking quantitative foundations (30, 31). To put it another way, we may also opt to follow Von Keyserling’s provocative statement (32) that “the scholar is the superficial Man par excellence; he is even principally obtuse!” Is he perhaps formatted to the quantitative “h factor”? (33). Or that of Salem (34), who develops his thinking along “ready-to-think” lines: “The French intelligentsia is ready for collaboration”? Or finally that of Fleury (35), who writes the following regarding the concept of the trap: “Performance, a caricature of personal accomplishment, is very certainly, in this perspective, one of the most entropic pathologies of current democracy”. It is therefore quite natural that there should be debate and controversy around the subject of how to judge disciplinary or object-orientated research credibly, and indeed what should be judged.
Within these contexts, interdisciplinary research is currently in an awkward position between research intentionally focused on an “object” and basic research which explores a subject further, which has led I. Stengers (36) to write: “All sciences which are not the product of a paradigm can only claim to be ideological”! Similarly, Lecourt (37) writes: “Fundamental research has no other finality than its own progress”. What, then, of interdisciplinary research work whose aim is to bring science and society closer? Researchers doing interdisciplinary work have tended to be subject to anxiety and a certain uncertainty regarding their own legitimacy. This inclination is conducive to a shrill lucidity which can sometimes translate into slightly pathological behavior and even a self-destructive way of thinking (38). And nonetheless, it will still be necessary to go beyond sterile debates about “pure” science, completely free of any social necessity, versus open science (or serfdom, for Bourdieu) enslaved to demand (39).

The quantitative aspect
In a recent paper, Larroussière (40) expresses the point of view of numerous scientists concerning quantitative procedures which have led to drastic reductions, considered by some to be indicators of research activity: “Numerous researchers refuse to calculate their own productivity as the CNRS asks them to [this remark also applies to other institutions]. The absence of robust tools is a moot point here”... He is supported by Kermarrec et al (41), who suggest that researchers may react emotionally, on principle, to the existence of indicators, because the influence of quantitative descriptors of activity will continually increase. More specifically, how could this “h” factor (10) provide a deeper indication of the quality of research work? (42-48). This situation is characterized by deficiencies, certain frauds, and work being done twice (41-47). These analyses lead, according to Zitt and Filliatreau (48), to the following advice regarding the biases and limits of indicators, which:
·Only convey information on part of the broader spectrum of activity;
·Have to be calibrated to “compare comparable things”;
·Are not particularly well-adapted to the observation of emerging domains of research;
·Require a diversity of angles of attack and levels of observation;
·Have to be supplemented, as part of an evaluation, by other elements, particularly the opinions of peers.
So, indicators should only be considered as elements which can partially inform a deeper analysis based on a responsible qualitative evaluation (which also goes further into the subject). Indeed, because of the reduction linked to the sole use of quantitative criteria, the existence of numerous biases does not give access to the whole wealth of scientific research work. Yet, without the time available to make a calm judgment, the trend is to rely on supposedly irrefutable factual databases which should nonetheless be used with precaution. Some examples given in the INRIA report (41) illustrate this idea. Moreover, according to Diouf (49) and Giget (50), there are around 164,000 scientific journals worldwide, plus 15,000 scientific articles written and 2,500 patents applied for every day by around 10 million researchers... The financial cost of scientific journals is supposed to have multiplied by 8 in 40 years, with spectacular situations like that of the “Journal of the American Chemical Society”, whose price rose from 50 US$ in 1970 to 2,053 US$ in 1998! (51). Questions can also be asked regarding the role of scientific “reference” journals which exploit a “maintained lack” in the individual performance of researchers and in the fame of the scientific journals...
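Since the “h” factor recurs throughout this debate, it is worth recalling how little information it actually encodes. Hirsch’s h-index is simply the largest h such that a researcher has at least h papers cited at least h times each. The sketch below (with citation counts invented for illustration) shows how two very different publication profiles collapse to the same number:

```python
def h_index(citation_counts):
    """Hirsch's h-index: the largest h such that at least h papers
    have been cited at least h times each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank       # this paper still clears the threshold
        else:
            break          # all later papers have fewer citations
    return h

# Two hypothetical records, one influential paper-heavy, one flat:
print(h_index([10, 8, 5, 2, 1]))  # → 3
print(h_index([3, 3, 3]))         # → 3
```

Both profiles score h = 3, even though one researcher has a highly cited paper and the other does not - an illustration of the “drastic reduction” the indicator performs.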
 
Should it therefore be considered that there is an economic will on the part of scientific journals to choose the best existing articles, presenting research results written by the best disciplinary specialists, which could lead to the strengthening of “dogma”, an increase in publishers’ power and weakened scientific openness? Wellcome (25) considers that commercial publishers of course try to publish the best articles for the broadest public possible in order to make a profit, and are far from having any altruistic motives… Moreover, De Pracontal (52) also points out that “even if this were not its only motive, it is indisputable that the evolution of the major scientific journals means that they increasingly look for media impact which allows them to “hook” a broader public than specialists alone”. Sensationalism “sells” in terms of advertising, brand image and, of course, citations, and therefore leads to an improvement in a journal’s quantitative impact factor! But in the current high-pressure environment, “how can the power of the performance dimension be avoided?” (53, 54). According to Salem (34), this rhetorical argument also serves to avoid moral dilemmas and gives proof of forms of resignation. Girard (55) suggests that it is based on initial cultural foundations (even if they are no longer fully adapted to researchers): “The majority of intellectuals of course claim they are not competing with others; at best, they aim just to excel in their respective domains. The spirit of competition only concerns others […]. As the intellectual world is devoid of hierarchy and therefore deprived of objective criteria, all researchers are inevitably subject to the indirect judgment of their peers”.
 
In any case, for Roux-Dufort (56), the quest for performance and credibility, whatever these may be, “leads to an ultimate simplification of modes of functioning, by concentrating attention on short-term reproductions of proven methods. It thus ruins the discernment of individuals and anaesthetizes any alertness not directly linked up with criteria of performance”. The exploration of principles of socially responsible research (57) has made aspects of reactivity coexist with immediate scientific performance on the one hand, and commitments in the medium or, when possible, long term on the other. And there is traditionally blind conformity/opportunism on the part of those being evaluated. But, in these conditions, wouldn’t the standardization role of peer-evaluated scientific publications (58) be strengthened? According to Gringras (59), “the existence and persistence of indicators and awards seem to respect a social law which says that “any number beats no number””. Scientific journals are representative of disciplines inside a “paradigm” (60) and therefore of a certain traditional form. These latter criteria define the nature or conditions of the power of the concepts conveyed by scientific disciplines. This elitist situation reinforces “the contempt of scientific “mandarins” towards the general public and popularization” (61); moreover, innovation derived from interdisciplinary research work cannot legitimately compare itself with the tradition of expertise, leading either to rejection through incomprehension or, more positively, to the emergence of controversies. The library council of the University Libre of Louvain (Belgium) (2003) writes: “The role of “referees” for scientific articles is therefore definitely to organize and promote controversy in order to establish the power of new scientific concepts. The “good” researcher is thus the one who wins such disputes and conquers the skepticism of the most competent researchers”. Is that really true?
 
We have entered a period of evaluation associated with a “qualitative need”, but the evaluation system develops measurements of science’s results with the aid of balanced numerical data (42, 62). Sometimes these criteria are not unequivocal, and this means that evaluation cannot be more than a sometimes debatable attempt at measuring subjectivity (59). As an example, the Thomas More Institute (63) published a very interesting document about the classification of universities according to distinct criteria, which gave particularly disparate results. For example, in 2008, the University of Cambridge was ranked 4th according to Shanghai, 35th for the “Professional Ranking of World Universities”, 3rd according to the “Times Higher Education Journal” and 10th according to the “Total MBA Ranking”. Similarly, the École Polytechnique in France was ranked respectively 300th, 15th and 34th, and nowhere in the fourth. As Lepori, Barré and Filliatreau pointed out (64), performance depends on the “right” combination of indicators… Also, in today’s world based on immediacy, it seems important to define well-founded quantitative criteria of classification which are as representative as possible of reality - but then again, of what kind of reality? (65-68). It is possible to compare these comments with the fact that a qualitative expertise of requirements lost its meaning at the same time as requirements lost their footing in Nature - but is this really possible otherwise? Berthoz (69) recalls that “to decide means choosing the appropriate information from the world in comparison with the purpose of action”. It is a principle of parsimony. It is therefore a question of reducing information to the essential through knowledge of the past, without taking any evolution over time into consideration.
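The dependence of rankings on the “right” combination of indicators, noted above, is easy to demonstrate: the same set of indicator scores yields different orderings under different weightings. A minimal sketch follows; the institution names, indicators and scores are entirely invented for illustration:

```python
def rank(scores, weights):
    """Rank institutions by a weighted sum of their indicator scores."""
    totals = {name: sum(w * s for w, s in zip(weights, vals))
              for name, vals in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical indicators: (research output, teaching, employability),
# each normalized to [0, 1]. All figures are invented.
scores = {"Univ. A": (0.9, 0.4, 0.5),
          "Univ. B": (0.5, 0.9, 0.6),
          "Univ. C": (0.6, 0.6, 0.9)}

print(rank(scores, (0.7, 0.2, 0.1)))  # research-heavy weighting
print(rank(scores, (0.2, 0.4, 0.4)))  # teaching/employability weighting
```

With the research-heavy weighting, Univ. A comes first; with the second weighting, it comes last. No university changed, only the evaluator’s choice of weights - which is exactly why Cambridge can be 3rd in one ranking and 35th in another.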
Numerous models have moreover been developed (70, 71) to assess innovations or interdisciplinary activities quantitatively; the refinement attained allows one, in principle, to take into account internal innovation, with or without the effect of local external interdependences, even if several authors doubt the pertinence of current modes of evaluation of the cultural and social impacts of scientific research (65, 72, 73). But, to our knowledge, this still does not capture the complicated correlations leading to the emergence of original and fruitful ideas. “In every field, the expertise of the measure of the subjective needs henceforth to occupy the front of the stage” (74). Godin (18) traces the origins of these modes of definition of quantitative criteria, claiming that the scientific publisher J. McKeen Cattell compiled the first statistics and that the system was reused by governments before the Second World War. At first, two important criteria were considered: the cult of world-famous scientists and economic aspects. Another, less readily admitted, reason, according to Cattell (75), was to find ways to attract top European scientists to the USA, and then, for others, to define an elitist classification of universities. Today, the OECD supervises the methodological norms framing government surveys of research. In short, the “absolute” accountant surveys all we produce and, of course, consume (76).
 
This means that the “best” researcher may be the one who publishes most in the best scientific journals with the highest “impact factors”, and indeed this is even partly comprehensible. It is a little easier to be among the best when, tautologically speaking, researchers are also reviewers for top scientific journals, members of editorial committees or, best of all, Editors-in-chief of journals. Nevertheless, the promotion of new ideas, particularly those of interdisciplinary origin, in a suitable context is still not always easy. Researchers often have to publish in less prestigious journals which cover broader scientific fields... Bachimont (77) tells us that “theoretical research cannot be interdisciplinary […]. So, the objects of study of a theoretical discipline are objects constructed from an ideal and abstract point of view […]. As a result, a discipline is formed only by [weakening] the others”. This is perhaps how, over several centuries, the methods of the disciplinary “exact” sciences acquired a real legitimacy, but the same is not true for interdisciplinary work, nor for the Humanities and Social Sciences, whose scientific character some people still dispute (58).
 
And, in numerous cases, the sciences of innovation, or rather those whose development is based on interdisciplinary foundations, belong to the social field, if only through their impact on society and on the environment. Are they therefore all cursed to be difficult to assess “well”? In effect, is it not possible that traditional modes of evaluation have imposed their own norms on interdisciplinarity, which obviously introduces bias? If, as Guillebaud claims (78), “science no longer obeys the ancient rules of being inventive and free and of academic validation”, then why try to define a universal framework for evaluation which is in fact rather badly adjusted to the context of interdisciplinary open-mindedness?
 
According to Solé (79), the search for efficient and sometimes heuristic models may lead to provisional proposals lacking universality. This could lead to worse evaluations for interdisciplinary projects than for purely mono-disciplinary approaches with more robust procedures. This point of view can be found in the work of Lubart (80), which links innovation with creativity, the latter defined by novelty and adaptation to the context in which it appears (81). For these authors, judgments strongly depend on the culture of the reviewers, who analyze a creation otherwise than by its character, its adaptability, its amplitude, etc. However, at the same time, the complexity of the systems explored by interdisciplinary actions can serve as an argument to justify a certain ignorance of fundamental knowledge... In these conditions, the debate continues... Solé (79) writes: “The complexity of research problems leads to multidisciplinary teams being formed. But, in a subtle inversion, complexity is now demanded by researchers in order to isolate themselves in cramped communities and create many sub-disciplines to acquire their legitimacy. The current crisis in the French research system seems to us to be linked to this context”. Which of course complicates the situation even further...
 
Hubert and Bonnemarie (82) discuss in depth the difficulty of evaluating interdisciplinary research, and they consider that numerous reviewers assimilate such research to “ordinary” mono-disciplinary research merely enriched by supplementary input from other fields. “To assess both the rigor and relevance of such research means analyzing and judging the products of research by confrontation in the contexts of disciplines and testing of contents and frontiers, and also from the point of view of categories and rules of action mobilized by the partners involved. It also means making the logic of the building and processing of research objects explicit”.
 
To develop this point further, if we examine the possibility of the emergence of a new idea or an original artifact, should we not consider the possibility of gaps existing between reviewers chosen for their disciplinary competence (therefore normally credible and legitimate for the scientific community as a whole) and the research object which has emerged? It is possible therefore that controversies, misunderstandings, disagreements and certain forms of corporatism may appear, which experience shows may translate into a low level of support. This was the case of stereo-photo-lithography, created long ago by J.C. André and his team (83) and awarded a National Prize for Creativity by the majority of decision-makers who, some years before, had refused to take risks, because it is far easier to support winners. Recognition often comes too weakly to ensure continuity and deepen research (except for some historical “unruly” researchers, who, perhaps, would no longer be able to find a permanent position?), because institutions, the guarantors of average quality, at the highest possible level however, remain on the sidelines or seem disorientated when faced with new developments in research. The theses of the “New Public Management” demand tangible and foreseeable results based on “just-in-time management” from Science, which is generally an unpredictable “long-term” activity. These theses adopt the hypothesis whereby science follows a normal law, that is to say that the variables studied can be represented by mean and dispersion parameters, which then means that the measured variables can be the object of an appropriate mathematical treatment…
 
Could we not also consider, like Bruner (84), that the sciences of statistics were only “invented” to conceal our weakness in predicting the future? The divergent and “revolutionary” changeable aspects are therefore not envisaged any more than aspects of self-government and long-term choices. “Contingency is the key word for current science with regard to its comprehension of temporal dynamics thus exceeding the former antinomy of hazard and necessity […]. A look back over the past few decades with hindsight shows this unpredictability” (85). The scientific system functions more or less by leaps forward, and it is precisely in these divergences that interdisciplinary power appears as a form of deterministic chaos and a factor of creativity…
It is true that an analysis of scientific works developed with minimal risk, linking several reviewers coming from different disciplinary origins, does not directly lead to support for interdisciplinarity, because such reviewers apparently cannot judge the correctness and effectiveness of what is proposed. As Niels Bohr said, when forecasting, the easiest factor to master is the past! Indeed, in this David and Goliath type relationship, orthodoxy is generally the strongest standpoint, which of course restricts the researcher’s freedom of action. This conclusion reminds us of a remark by Godard et al (86), published in another context, namely that “even when aimed at precaution, caution does not abolish all risks because many risks are worth running, others are unavoidable, or because some are unknown at the time when the decision is made”. This probably assumes a deep change in modes of evaluation in general, especially in the space of actual competition between research groups worldwide…
Deep down, what numerous researchers seem to feel is an image of freedom, but in a space where above all the first principle of thermodynamics is applied to make the most of financial support and personnel with a view to achieving a certain effectiveness based on the capacity to act to achieve a given objective. Conjuring up “Maxwell’s demons” moving randomly-moving molecules in a specific direction to reduce all possible risks (or entropy) is to treat the second principle of thermodynamics with disdain and ignore the views of Michel Serres (87) regarding the ore mines where the useless upper crust must be withdrawn at high cost to gain access to the required ore: “How can we get rid of this surface crust which prevents us from getting to that which is real to the point that we even doubt that it exists and may give convincing reasons to show that it does not exist?” These principles are to some extent the basis for his statement that “The clean is acquired and preserved by that which is dirty”!

To go a little farther
Thus the transition between “assured” sciences and interdisciplinary innovations does not allow us to rely on uniquely quantitative foundations, except perhaps for those possessing a modicum of well-adapted analysis, such as the number of patents in a given domain and for a given country (68). At the research team scale, Lesourne, Bravo and Randet (88) have attempted to define excellence on other foundations, such as the capacity to change socio-economic know-how in depth or to transform “academic” scientific knowledge into applications (67). It is therefore possible to make judgments and not just refer to accounting balance sheets. In effect, if we apply the ideas of Albarello (89) to certain interdisciplinary aspects, we may conclude that:
 
·Their objects are not strictly objects, but instead artifacts which are close to wider society and capable of influencing life in that society;
·Action is by necessity subjective because it is linked to its relations with society;
·The researcher is himself a subject, and in this sense part of the reality which s/he studies (57).
 
If this perception may seem new to some, this is because we are going through a period of transition from the 1945 to 1975 era (or, as the French call this period, the "Thirty Glorious Years") to the present “postmodernist” society, where everything expresses itself through complexity, even if many actions (still) rest on causality… In any case, at the very least, the current uncertainty leads to a visible multiplicity of points of view, some of which receive a great deal of media coverage. These points of view stem from different origins but require:
 
·To get closer to actors, or even to work with them on the resolution of concrete scientific problems;
·To acquire methods that are as simple and as rigorous as possible so that they are applied by those concerned (the problem of “true” transfer, whether it concerns technical, organizational, social or pedagogic innovations).
 
As was the case with sustainable development and Corporate Social Responsibility (CSR), these findings lead us to consider variables and integrative measurements taking into account correlations between the three “pillars”, namely scientific production (the economic pillar in CSR), the social aspect and the environment (90). This author offers contribution indicators (the contribution of one variable to the other two, even if not all are known or credible) and synthetic indicators (the reciprocal contribution of every pillar) (cf. also 91). The system still has faults, and companies still ask for quantitative dashboards which are easy to follow, to such an extent that the qualitative objective sometimes blends in with action… Stéphany has listed some of the more important elements when considering the difficulties met:
 
·Those involved may have different interests (researchers, firms, society, States, etc.);
·It may be necessary to start work without everything being clearly defined (92);
·The immediate satisfaction of those concerned is considered to be an essential performance factor (lack of long term outlook);
·It is necessary to have a common production of knowledge of what is efficient (an ill-conditioned system leading to several approaches) for the resolution of a question (93);
·The joint construction of scenarios depending on members of the group of scientific players; the difficulty of taking into account potential users or people exposed to the harmful effects of research results (93);
·The complexity of socio-technical systems linked to different temporalities leaves the way clear for interpretation;
·The criteria to be measured cannot all be determined, particularly from the quantitative standpoint;
·The weightings to be applied between different items are not easily determinable.
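The contribution and synthetic indicators evoked above for the three “pillars” (scientific production, the social aspect, the environment) can be illustrated by a toy sketch. The function names, scores and averaging rules below are hypothetical assumptions chosen for illustration only, not the indicators actually defined in the cited works:

```python
# Toy illustration (hypothetical) of three-pillar indicators:
# - a "contribution" index: what the other pillars bring to one target pillar;
# - a "synthetic" index: the reciprocal contribution of all pillars together.
# Scores are assumed to be normalized to the 0..1 range.

def contribution_index(pillars: dict, target: str) -> float:
    """Mean score of every pillar other than `target`."""
    others = [score for name, score in pillars.items() if name != target]
    return sum(others) / len(others)

def synthetic_index(pillars: dict) -> float:
    """Mean score over all pillars, as a crude reciprocal-contribution proxy."""
    return sum(pillars.values()) / len(pillars)

project = {"scientific": 0.8, "social": 0.5, "environment": 0.3}
print(round(contribution_index(project, "scientific"), 2))  # 0.4
print(round(synthetic_index(project), 2))                   # 0.53
```

Even this toy version exhibits the difficulty noted above: the choice of normalization and of the averaging rule already embeds a judgment about the relative weight of the pillars.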
 
The EREFIN report comes from a working group on the future of the Engineering Sciences chaired by Robert Chabbal (67). It applies structured thinking to this subject, allowing one to go further than the current system of evaluation by envisaging the exploration of the following topics:
 
·The activities of a research unit within the academic world: publications, cooperation, networking, development of new instruments;
·The activities with the socioeconomic world: partnerships, operational knowledge, methods, innovation, know-how, expertise, advice;
·Activity aimed at the public sector: knowledge for public decision, expertise, and advice; …
·Education and training by research;
·Relations with broader Society: analysis of social needs, technological monitoring, dissemination of culture, interaction between science and society;
·Internal activities in the research unit: building a common scientific goal for the unit, governance of the laboratory; …
·Etc.
 
This proposal enlarges the current modes of evaluation (AERES in France), which focus on:
 
·Production: quality, quantity, impact;
·Attractiveness: National, international;
·Strategy: management, emergence of young teams;
·Project: quality, opportunity.
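As a purely hypothetical sketch of how criteria of this kind might be aggregated, the example below combines weighted scores into a single grade; the weights, scores and thresholds are invented for illustration and imply nothing about the actual AERES procedure:

```python
# Hypothetical weighted-rubric sketch; not an official evaluation scheme.
CRITERIA = ["production", "attractiveness", "strategy", "project"]

def overall_grade(scores: dict, weights: dict) -> str:
    """Weighted sum of criterion scores (0..1) mapped to a coarse grade."""
    total = sum(scores[c] * weights[c] for c in CRITERIA)
    return "A" if total >= 0.75 else "B" if total >= 0.5 else "C"

scores = {"production": 0.9, "attractiveness": 0.6, "strategy": 0.7, "project": 0.8}
weights = {"production": 0.4, "attractiveness": 0.2, "strategy": 0.2, "project": 0.2}
print(overall_grade(scores, weights))  # "A" (weighted total 0.78)
```

The arbitrariness of the weights is precisely the point made in this paper: the same scores yield a different grade under a different, equally defensible weighting.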
 
To “nonetheless” continue with this difficult subject and attempt to remain coherent with the authors’ understanding of interdisciplinarity, it seems important to recall that researchers also have to address the following:
 
Creativity: The brain permanently invents hypotheses and suggests possible solutions in an ephemeral manner, though not always robust resolutions, all of which corresponds to useful but risky actions. However, knowledge of the real and the possible, the modes of evaluation, and financial support that is difficult to acquire can constitute barriers for researchers trying to leave the “beaten tracks” of existing research behind. Luis de Miranda (94) considers that “the space is too saturated with information for creative imagination to be able to work successfully […]. The more precise sampling […] is, the less our “soul” participates in filling up ellipses, in interfering creatively, erotically, in defaults”.
Forecasting: The prospective fields need to result from a societal request aiming at removing dysfunctions and/or anticipating this request as it is envisaged and imagined. Ad-hoc groups, circles of research, industrialists, policy makers, associative networks, etc. need to lead thinking about this and their work needs to be validated by “society” or its representatives and should correspond to what needs to be done to satisfy tomorrow’s demands. Anticipation and linked imagination therefore have to be dominant for this activity.
Strategy: Research units (probably) do not possess all the means necessary to respond to this whole question. This effective constraint can however be turned to advantage by clearly selecting the favored domains in which they choose to work. It is an important element in the creation of a responsible brand image. It is, in effect, a means to give sense to actions.
Tactics: Strategic thinking must of course cover the means to satisfy requirements (skills, critical mass, staff members, temporalities, financing, openness to others, etc.). In this sense, there is the obligation to achieve results as defined by ad-hoc committees. This element has medium-term advantages if researchers make the right choices. However, it moves away from any context of “false democracy” where nobody is criticized but nothing much happens except incidentally. For the authors, the scientist must work in favor of progress for the society which finances his or her research and must take the risk of acting (!). It is probably necessary to leave behind this intellectual heritage which prevents, or at least slows down, the apprehension of the complexity of choices by creating a filter of acceptability and immobility (95).
Interdisciplinarity: To work on/for artifacts means interacting with different working activities and scientific disciplines. One of the difficulties of association with a view to reaching a target partly depends on “the way in which attention, recognition and authority are [principally] canalized by the disciplinary institutions” (14). At the same time, the majority would present interdisciplinarity as a good thing (96). Would the overly extreme specialization of researchers be a bad thing then?
These considerations, integrated into research work defined as interdisciplinary projects, are not easy to quantify; the number of perceived parameters is high, with numerous interrelationships. This consideration therefore leads us to re-examine modes of action in research in an attempt to respect Baron’s idea (97), namely that “the intellectual worker [the researcher] has no chance of being permanently competitive if s/he does not know how s/he contributes to the produced value”. It is definitely a question of working on the respective modes of learning and the specific productions of the partners in a project, by respecting certain criteria of effectiveness and maintaining respect for the other partners. The work of researchers must then be analyzed by taking into account the tensions which exist between their disciplinary norms of action (scientific strictness, for instance) and the social relevance of their results. According to Schön (98), in the field of Science this amounts to a debate about human values.

How to build the research object?
It seems reasonable to think that the emergence of a research project is not so easy when we ourselves are directly involved. This leads to a desire to continue with scientific work that has already been a success, or to the pleasure of deepening one’s knowledge, and perhaps finally to avoiding thinking and questioning oneself. On this subject, Alain wrote that “to think is to disclaim what one believes”. Do we always wish to do this? That being so, how should the emergence of subjects accomplished in close association with societal action be structured and possibly brought forward in time (a nice potential alibi)? In principle, and this is at the crux of this paper, research has to produce knowledge and to develop understanding of society, such as it exists in daily life and as it will live in the future. In this respect, there are indeed two potentially opposing proposals:
 
- The first consists of anticipating social demand, corresponding to the optimization of scientific knowledge allowing progress and a higher level of well-being, linked with possible adverse effects (57). This anticipation must be translated, not only into research work, but also into an adapted communication strategy (the problem of transfer from scientific knowledge to society). In this kind of top-down operation, several disciplines need to act together. As in any human adventure, it is always difficult to play the game of interdisciplinarity, but the de facto “clients” demand it. The plan is not to content oneself with anomalous or biased answers, and even less with partial answers, because here we are dealing with serious problems where Man is involved. Research activities must also be credible for those who request them in the field of socio-economy. However, as has been pointed out, the activities of researchers with specialist knowledge are always in a position of uncertainty in relation to the element of the complex request which they translated or understood… In these conditions, to restrict this uncertainty factor, a diagnosis made through interdisciplinary action research can only be partly deductive. It uses knowledge collected in the development of its projects (practical knowledge or expertise exploiting inductive approaches), relational knowledge, and know-how in communication strategies… Open-mindedness is vital for this relational mode with Society.
 
However, how can one respond “well” to a social request if its anticipation was badly pre-defined? Apart from permanent problems, it is difficult to find an objective, quick and credible way of defining the fields to be studied. Besides, when actually carrying out the work, should researchers aim to satisfy the request resulting from what is desired, or act to cover all requirements, or even provoke requirements when these cannot be seen initially by the potential beneficiary? It is therefore easy to understand how such modes of action, which aim to be orientated towards technological innovations authorized by Science, may act in a climate of uncertainty. In any case, it is necessary to define targets of action as well as “pathways”, which means using the right communication strategies and constructing the right relationships with users to attain those aims…
- The second avoids uncertainty to a greater degree, along with the “risk management related to actions”, and takes into account the comparatively stable, present disciplinary structures in the academic world, which act in a bottom-up logic within their research. The hypothesis is that there will be at least one customer for every published work. Another visible “advantage” of this approach is that the complicated nature of research work can then, for some, be sidestepped by simplifying situations known to be complex or by reducing them to the analysis of one of their basic components. Other researchers would then need to take over these results to work on their possible future integration... This possibility could lead to a pedagogical method based on the “art” of logical demonstration, established step by step from simplified elements. This art of normalized and Cartesian discourse is, more or less, the perfect example of mono-disciplinary university education: a science of fair, validated reasoning and the establishment of proof. In this context, modes of communication are orientated towards the scientific community, linked to traditional judgment by peers, but do these fulfill the humanist mission of Science?
 
This second way of action largely avoids a form of social evaluation but, in certain situations, can be an element of the brand image of an organization which must “produce” internationally validated scientific messages (competition requires this). However, can researchers then be sure of the trust of the socio-economic players in these kinds of scientific work? This proposal enables projects to be managed with greater serenity, in principle, thanks to a strategy centered on disciplinary elitism, separated to a large extent from the necessity of finding links with an activity regulated by the definition of purposes – project – and by the mobilization and management of personnel and financing to attain improved effectiveness. It is also important to underline the point that disciplinary scientific excellence can also lead to innovation (cf. innovation via serendipity (99)); in this way “top-down” and “bottom-up” are not in opposition and can both work towards the same aims. The dilemma then is whether to be an actor for economic development in a useful social context, using means which will always be restricted and within one’s scientific mission, or to be the “respectable shop-window” with other forms of usefulness for a research organization covering all fields of Science! Reality, as always, is of course less Manichean. The “shades of grey” are numerous, like for example the centrifugal trends which result from complex correlations between science, technology, progress, risks, society and policy. These shades of grey were translated by de Ketele and Roegiers (100) into seven types of research:
 
1.Scientific, based on an inductive-hypothetical-deductive methodology;
2.Technological tools and solutions for the socio-economic players who require them or to aid with decision-making;
3.Evaluative with a projected and long-term perspective;
4.Work examining a situation from the point of view of participants;
5.An exploratory, heuristic phase aimed at generating hypotheses;
6.Descriptive, when description and classification are a precondition;
7.Speculative, centered on long-term function.
 
These different types converge to varying degrees in interdisciplinary research work, and it seems useful to define the common rules which must be respected, knowing that in all research situations social reality, social phenomena, and “social facts” in the sense of Durkheim (101) should always be taken into account. It would however be erroneous to think that a “scientific adventure” turned into an artifact will always be well assessed by socio-economic circles. Foray (102), discussing perpetuated habits in another context, wrote the following: “Older technology imposes its own norms of economic evaluation on new technology […] imposing a kind of bias at the time of the exercise linked to economic added value. The function of these norms of assessment is to define the nature of the performances on which competing technologies are compared”. This situation of non-neutrality observed in scientific evaluation can limit action, especially when a new technology appears which does not always possess the optimization of all its essential elements (particularly the most fragile elements derived from scientific research).
 
From the field of study to hypotheses
The question here is how to transform a perceived or anticipated “social questioning process”, as posed by social partners, into an object of scientific research. A six-step process can be proposed which progresses from general to more individual considerations. In practice, this does not always correspond to an “ordinary” progression of thought:
 
·Identification of the field of research;
·The choice of a priority disciplinary angle allowing an economy of means but which enables researchers to reach their desired progress targets;
·The selection of social objectives within this field;
·The specification of the theoretical frame adapted to this objective: abstraction and conceptualization;
·The definition or recapitulation of a certain number of hypotheses;
·The correct evaluation of the project’s time scale, personnel means and financial support.
 
This description does not, however, include any modelling or normative intent. These elements must be included even if revisions are necessary over the course of time. In any case, confusion and the non-satisfaction of these conditions are harmful and can lead to the failure of a project, particularly in interdisciplinary action research.
 
Field of study
The fragile and ephemeral structure of interdisciplinary projects is based on the low number of researchers who could be involved. In effect, given the distribution of scientific skills and an important potential demand, it is difficult to say with any credible accuracy that one field is more of a priority than another as long as it has some social usefulness (it is the old debate between greater interest for a few people versus average interest for a large part of the population).
 
Disciplinary approaches
A scientific evolution of technological origin which interests Society, whatever it may be, can be studied from various disciplinary points of view. All researchers must intervene in their own area of disciplinary competence. However, the association of points of view, through supplementary and/or synergistic approaches, must contribute the elements necessary for the progress objectives on time (management in project mode). Given the request, or the expression of this request, the researcher cannot act alone within his mono-discipline. This very general frame moves away from “traditional” university sciences in the sense that such scientific activity is in practice always multi-disciplinary, necessarily implicating “Man” and his environment. To enter this field of action, the disciplinary researcher needs to possess a “minimum” of general knowledge which can be shared with his or her partners. In effect, as has been pointed out, such research is in general complex because it is based on techniques, organizational aspects, perception and so forth. This correlation between Science, Society and the System of production, material or not, constitutes the privileged place of action of the Engineering Sciences (103).
 
Choice of a research objective
A precise objective in the field of action needs to be determined. “Having already studied a field or other more or less comparable fields […], [the researcher] will naturally concentrate on certain problems or certain situations which s/he can in a way assume from experience to constitute critical zones or conflicts the privileged analysis of which will allow him/her to advance faster in the understanding of the features of its field” (104). This vision which is comprehensible for the academic researcher involved in “unintentional” research may lead the researcher to think that s/he is capable of infinitely deepening knowledge of subjects but with a level of social usefulness which may diminish in correlation with the time invested in the project. Researchers can look for other subjects – it is of course necessary to learn to take risks! – or even accept training courses allowing them to re-situate themselves in the present economic and social reality.
 
Remark: How to specify objectives? With whom?
 
Before proposing new research work, it seems a good idea:
·To identify the dimensions of the proposal, its components, and the various ways of delimiting and “assessing” it;
·To perceive the various ways in which other actors may understand this topic (long before writing a response to a call for tender); do they see an interest in interdisciplinary actions in Science?
·To formulate “appropriate” working hypotheses;
·To show the different factors of influence within the proposal which may be used early on to explain possible later effects;
·To identify the possible consequences of the various modalities of the topic;
·To begin predicting the complexity linked to the proposal;
·To perform a serious bibliographic analysis to avoid copying and redundant work;
·To examine whether they are the best placed in the scientific community to work on the project. Should one involve people with other skill sets from outside the initial group, envisage mobility and so forth?
 
Theoretical context
Interdisciplinary activities all involve a fundamental interrelationship and correlation between empirical aspects on the one hand and scientific contributions on the other. These relations between theoretical foundations and empirical observation lead to the production of results, or even of more comparative concepts, linked to the crossroads between Science and Society (105). This means that questions need to be asked regarding the production of new theoretical means which will allow scientists to continue the construction of the object of research. This is the function fulfilled by the theoretical frame. According to Albarello (89), “the positioning of the method in a theoretical context does not occur in a mechanical or artificial manner. The theoretical context is neither an administrative burden nor an academic activity. Its function is not to complicate but to make the life of the researcher easier; it should allow researchers to open new directions and so […] introduce original concepts and suchlike into their work to enrich the whole method”. The theoretical frame is necessary because it allows researchers to enrich their study of the following elements: the way of defining the problems to be solved, the concepts to be used, the tools of the empirical approach and best practices. Indeed, choosing a theoretical frame means aiming to conduct research in a particular direction because it lays emphasis on specific aspects of a given field (the problem of positioning).
 
Hypothesis
A hypothesis is not easily found or unexpectedly discovered in the course of research: it is necessary to construct and support it! It tends to formulate a relation between facts and normally constitutes the guiding thread of research. In effect, it enables researchers to propose concepts and techniques as well as methodologies to be implemented. It comes in the form of a proposal to be tested, putting explicative variables in contact with the social facts which researchers try to understand better. What is the use of strictly technological, quality research work if it is rejected by society? Hypotheses must be conceived in verifiable terms using ad-hoc technology and then explained. Researchers must avoid searching for correlations without establishing links with possible causalities…
 
An inventory of means
This is obvious and yet… Do researchers have the time, money and particularly the “human resources” to carry out this step? This inventory is necessary at the beginning of the process because it conditions the choice of techniques/methodologies to be implemented. It has a major influence on the choice of specific objectives and on the identification of hypotheses. If researchers lack time, can answers be found rapidly? If they do not have sufficient human resources, will results be obtained too late? What should be done? Should researchers reduce the objectives of the project? How should this be explained? Should one try to answer in another way, by a collective expert study for instance? It is also a good idea in this context to examine how to involve other partners, both internally and as part of outside collaboration work. Time cannot be simply defined as the distance between the beginning and the possible end: a schedule of intermediate deliverables is necessary and of great importance in multidisciplinary operations which interest socio-economic players and society at large.
 
Approaches to research
Lecomte du Noüy (106) writes: “The strength and prestige of scientific arguments are founded on their rigor, their clarity and on trust in the events which serve the scientist as a foundation - scientifically observed facts or, if you prefer, facts which have been honestly tested and, as far as possible, are repeatable. But the number of people who know how to cover all these facts and are able, having criticized them, to draw legitimate conclusions by means of a logical reasoning process based on all previous personal experiments or not, is extremely limited”. In this idea, there are two supplementary elements linked to the quality of practices and of people. Whatever the project, action, quality and know-how are inevitable elements of trust. In practice, in scientific or technological publications it is hardly ever possible for readers to criticize results on a fundamental basis (unless they repeat the study). It is possible to question hypotheses and conclusions, or even to highlight criticisms of certain choices. As has previously been stated, it is necessary to manage heterogeneous networks – both within and outside the project team – which cross the borders between science, society and sometimes policy (of companies, of scientific organizations, etc.) and which can be potentially controversial (this situation, driven by the nearness of the socio-economy, differs a great deal from the situation noted for sciences considered by some as more “noble”…) (107).
 
The quality of research work (its excellence?) is certified by peer review when researchers publish in national or international scientific journals with scientific committees. This is an element of the elitist promotion which has already been mentioned. It is also a basic element of research activity. It is neither correct nor fair to repeat basic disciplinary research work which must, in principle, be carried out – in a more efficient manner – by other, more specialized researchers. Scientists working on interdisciplinary projects need to translate knowledge from other research disciplines in order to use it in a concrete fashion in applicable actions (except, of course, if none exists). In order to do that, researchers need to master the languages and concepts of the people they are collaborating with, which involves cooperative work and publications in scientific journals (cf. joint research units, networks, etc.) to avoid the risk of reduced relevance. What balance can be struck between “basic disciplinary” and “orientated” research? Where is the “neutral” balance between the creation of scientific knowledge and scientific work using the knowledge of others? Where should the line be drawn? Which time scale should be used? For the second step in communication, research works correspond to several types of action:
·The observation and measurement of phenomena: instrumentation creates a natural closeness between applications and scientific research, obtained by defining validated measurement techniques;
·Understanding society’s possible “reactions”: identification of advantages / disadvantages in relation to the existing situation; added value;
·Technological research which aims to provide products/artifacts/methods exploiting technical, socio-technical or social “methodologies” and thus allowing progress. In this context of multi/interdisciplinary actions, the aim is to use knowledge (or its absence) to find original ways of making progress;
·Work which typically concerns the incorporation of disciplinary knowledge into “object sciences”: energy, sustainable development, communications, cognition, techniques, well-being, health, OSH, security, etc.
 
For innovation (of interdisciplinary origin), knowledge created by research must not simply be collected in data banks and added to accumulated scientific knowledge. Nor is its only virtue the development of a personal ethic of knowledge; innovation should instead promote civic and human ethics. Links with the socio-economic world need to avoid partners being dispossessed of their right to a point of view to the advantage of the scientific expert, the “sole” possessor of decision rights by virtue of his or her competence! Moreover, Morin (108) writes: “Specialization does indeed bring progress, because it belongs to the organization of work which allows the development of knowledge, but it also produces a decline in knowledge, since non-communicating and fragmented knowledge always leads to mutilated knowledge; and mutilated knowledge always leads to a mutilating practice”. It is possible to show that the actors of technological progress can be there to reposition researchers, if need be, within their scientific mission.

A short return to the subject of quality control
Many traps lie in the way of integrating scientific and technological competences into the world of material production. It is nonetheless essential to be strict about the quality of what is measured and of data processing. These factors condition trust between partners with different scientific backgrounds. On this basis, should we not ensure traceability, avoiding the loss of “objectified traces” which can, a posteriori, constitute a true wealth allowing data to be re-worked (and more)? The “legal framework” applied to everything (cf. the European Union recommendation on a code of best practices in the field of nanosciences and nanotechnologies (109), an example revealing a view of the future which breaks with the “old” traditions of “irresponsibility”/complete independence of the researcher (the Galileo syndrome)) increasingly requires the existence of irrefutable proof.
 
Signal processing
It is obviously necessary to read the gathered information correctly. We should not, however, consider that information which corresponds to our hypothesis is part of the signal and that the rest is only noise. The processing method must be known, robust and validated. This situation, obvious for mono-disciplines, becomes more difficult to verify and validate in interdisciplinary approaches because there is an obligation to trust information coming from other disciplines.
 
At the end of the research project
This stage gives true sense to scientific work carried out to produce innovations. It must give sense to the analysis of data, avoiding a simply descriptive framework. To reach this target, it is necessary to confront the initial hypotheses with their theoretical framework. This implies a critical review of the methods and techniques used (and must everything be done again?). Also, and this is essential, how should results be “transferred/returned” to Society? How should we provide information on the evolution of practices and manners? How can research be linked better with reality and social usefulness? How can these necessary transfers be carried out? These different elements are sometimes neglected and, in any case, are comparatively little present in the majority of current evaluation procedures. For the authors, research is an “instant of distancing” from a given social reality, but it is also part of that same social reality! Unless research work remains confidential (!!), it will enable the scientific actors to gain greater knowledge of the field in which they worked. This connection with real or estimated “users” can bring out new research proposals of a mono-disciplinary nature to overcome scientific blockages.
 
However, in this framework opened up to Society, the researcher is probably not alone in deciding how his or her scientific production will be used. Researchers work alongside other players (with whom they will normally already have interacted, be it only in the early stages of a study) such as decision-makers, stakeholders and numerous interlocutors. “Research is part of the field itself and takes its place in a system of action. The question of its usefulness or its use will therefore be, according to the contents to which it leads, an object of negotiation and of various strategies, and will be part of the game” (89). Renard (110) writes on this subject: “The question of the usefulness of research is not so much that of the use of its products by actors-decision-makers; it is first that of the use of the information which these actors possess regarding their own actions, that of their capacity to become an object of observation, in this positive attitude of availability and openness, in critical analysis and in change”. This means there is a partial transfer of responsibility: the field has been “enlightened” and others must be able to act in full knowledge of the facts.
 
And tomorrow?
The points discussed above illustrate the fact that there is a “basis” for relations with Society. However, two comments lead to a necessary evolution: these elements correspond firstly to the definition of a strategy which is linked (or not) to a real or potential demand from the public based on technical progress, and secondly to the deepening, in the logic of demand, of scientific fields with a defined goal. In general it is possible to offer society applications linked to technical progress, but approaches to possibly harmful consequences tend to be overlooked (this naturally raises the question of whether such research could be the object of significant financing from decision-makers: in the West, less than 1% of financing for nanotechnologies is linked to studies of risks for society and the environment). To restore a better coupling of the advantages and disadvantages for society, the concept of socially responsible research (SRR) was developed (57). “Diversity, complexity, imperfection, vulnerability: here is the force of Ulysses, here is the force of Man. Ulysses does not try to steal their godhead from the Gods; he confronts them with his own humanity, which is the key to his freedom. […] The world which is being built, to be more human, but especially to survive, must draw inspiration from the education of Homer” (111). This is precisely the goal of SRR.
 
Adams (112) says however that “each citizen is a genuine expert on risks in the original sense of the term; we have all been trained by practice and experience in the management of risk. The processes of trial and error by which we learn to crawl, then walk and speak involve decisions in the face of uncertainty. In our development towards motor functions, we progressively refine our techniques of risk control”. The user is in principle concerned by a risk which he or she is supposed to know and master, but this is not so simple, for a lot of (good) reasons covered in this essay. It is therefore an important field which must absolutely be explored.
 
So, whether this knowledge is intuitive or revealed to each person, it must be “translated” into appropriate information which is as little distorted as possible (the problem of the popularization of scientific knowledge), a necessary condition for transparency and a condition for a useful democratic foundation (though probably difficult to achieve). However, mistrust of risk is the subject of too few research studies, which means there have sometimes been very contradictory results on a given topic. This tends to confuse a Society which requires clear answers without shades of grey.
 
This inability of the scientist to cover all fields, compounded by the multiplication of new information about which some people are alarmist, causes further collective anxiety. And, in many complex domains, science fumbles, and any available information (even biased) in the mass media can be exploited for sensationalist purposes. What should be done? First of all, within the elements of forecasting which can be acquired, it is necessary to negotiate the desired/required strategy with stakeholders. Not everything is possible in a world which functions under the pressure of time, according to human criteria, strategic choices in terms of research subjects, space, etc., and which demands rapid answers to often barely formulated questions. It is highly necessary for researchers to explain what they want to do, and how they intend to do it, to all citizens who feel concerned.
 
In the interdisciplinary domain, it is probably also necessary to explain what will not be done because of a lack of time, means and staff. This strategy of reduction needs to be shown and explained, so as to lead to a hierarchical organization of actions and to explicit and responsible choices. Dialogue, negotiation, alliance, trust and confidence, transparency and subsidiary choices are thus the key words of this essential communication, which is probably difficult to carry out, except through ordinary blind conformity, given that the field to be explored is vast and ever-changing… However, a cooperative and interdisciplinary approach is only possible if truly common work can be carried out aimed at operational effectiveness (i.e. not a simple “addition” of disciplinary knowledge). Indeed, it will be necessary to manage cultural differences, the different dynamics of actors from different scientific fields, and difficulties induced by distance, even if the internet is likely to make communication between partners easier. Will they attain their planned objectives? In this example, organization into a network can be seen as an essential element of transparency, even if effectiveness is not easy to optimize. In fact, the redeployment of activities implies local pertinence and the association of well-supported scientific fields. It is the only way to exploit useful knowledge from different fields through synergies (critical mass for the duration of a project).
 
So, the change of scale, the movement from the national state to the European Union, must be translated into the local strengthening of certain specializations (the notion of critical mass and competitiveness in relation to the stakes), an operation which cannot be achieved just by waving a “magic wand”. It should enable us to avoid mere quantitative growth in published scientific works, which could be a deviance of the system, or even an increase in its entropy... Another element of reflection in this negotiation of an (explicit or moral) “contract” explained to society, defined both by those working on a project and by broader Society, is the place of the “client”, or perhaps the “user”. The client/user cannot be considered a partner situated at the end of the chain who can sanction proposals and existing research activities in a binary manner. The Sciences of action do not have to try to define this contract as a guarantee, or even an alibi, but they do need to question the “customer’s partners” in order to assess scientific productions and to diminish, as far as possible, the distance in their relations with the world (the problem of successive evaluations, or of the “weak link” of a system).
 
This specific consideration corresponds to opening research up in order to bring it closer to social demands or requirements. Research work must be useful for society, today or tomorrow (otherwise what would be its real justification?) and must therefore have visible, quantifiable, validated aspects which are also sometimes conveyed by the media. Leaving aside the “stereotypical portrait” of the “consumer’s customers” (with feedback) for scientifically created knowledge, a “go-between/translator” is required to provide information in both directions (downstream and upstream) on how the response, often interdisciplinary, to a question which interests Society evolves. Beyond the (multi)disciplinary aspects, this mode of relations requires the resolution of human and organizational aspects and of management or governance… In their work, researchers are (and will increasingly be) led to control all the factors of influence which enable them to attain their objectives. The necessary incorporation of knowledge and work (including information) enables the provision of messages which are more useful because they are better directed towards an efficient answer. These developments, produced with an overall and strategic vision of cause and effect, will however no longer always be publishable in a disciplinary manner because they are less visible to the majority of scientific journals. Scientific and/or technological communication will be done in other forms, closer to researchers “in the field”.
 
However, “weighty” projects, such as those described above, require the cooperation of numerous operators from different disciplines… This implies a certain reduction, through strengthening and increased legibility, of the number of research subjects to be worked on, as well as the necessity of creating “team-work” centered on a “technological or scientific issue”. This evolution, linked to collaborative work at the national, European or even international levels, must be explained, or even negotiated, within the research team and/or with certain decision-makers. The fact that only partial coverage of the scientific field is possible in an interdisciplinary project must be accepted by all partners, and is probably an element which strengthens trust and confidence. This indeed corresponds to a “contract” between the sciences of innovation and Society, in a frame of openness and transparency, enlarging the concept of SRR (57).
 
And after tomorrow?
To move towards the concrete and real world of exploring phenomena linked to innovations cannot be only a good intention; it must correspond to a real and collective commitment. However, even if trust and confidence can exist locally, with the learning induced by the proactive “encounter” of researchers with the “reality” of their environment allowing correction, the long-term strategy to be constructed may be subject to terrible uncertainty. If, in principle, it feeds on the current knowledge of citizens, on their numerous and complex requests or wishes, which are often badly expressed, and on a certain social acceptance of its previous proposals, it may find its frontiers and an improved responsibility in the definition of a future linked to research leading to innovations and to major challenges for Society. First of all, the existence of breaks linked to programmed or accidental social events, or to political changes, can lead to transformations of the strategy of action of the domain. This strategy rests on research personnel, which may restrict the dynamics of evolution relative to that of society (a problem of reactivity, and of researchers being too specialized in a given area; but is it easy to be truly competent and expert in all areas?). Even if this forecasting is imperfect, defined by real knowledge, by inductive contributions, by elements linked haphazardly and by haphazard meetings, negotiations with society can always be envisaged. However, the definition of an agreement can only be based on principles and rigorous procedures allowing current interests to make themselves heard in order to be taken into consideration. Like Kant (113), we may be optimistic and follow the hypothesis that “in morality, human reason, even in the most common intelligence, can easily be carried to a high degree of certainty and perfection”. In this context, Pascal’s wager is always won and the proposed evolutions are always aimed at achieving long-term progress.
 
Indeed, the evolution of society is a very complex system which is particularly sensitive to initial conditions (cf. the butterfly-wing paradigm!) and, even while acting within an ethical frame validated periodically by others, the improvement of knowledge linked to innovation can, by a cumulative (memory) effect, have a very important impact – moving broadly away from what was envisaged and accepted at first – on the place of Man in this rapidly evolving world. Jonas (114), in his “Principle of Responsibility”, tells us that “the future is not represented by groups; it is not a force which they can throw into the debate. What does not exist has no lobby, and those who are not yet born are without power”. How can certain disordered, and sometimes difficult to perceive, evolutions of Society be followed in order to redefine research work which can satisfy its requirements? Will intuition about the evolution of values suffice to link the future with the present? In this sense, can they define an ethics of the future? The OECD (115) listed certain obstacles which can harm the development of interdisciplinarity and the sharing of ideas:
 
·Existence of university departments centered on disciplines;
·Mono-cultural scientific journals;
·Lack of communication between disciplines;
·High risk of failure of interdisciplinary operations because of a sub-critical mass of actors;
·Evaluation by peers and the difficulty of taking into account the added value of cooperation;
·Recognition of merit.
 
Camacho-Hübner (116) defined interdisciplinary research as follows: “Fields which, by mutual exchange, are on an unexplored, or even new, pathway could be considered interdisciplinary. However, it is important to point out that mere contact between two or more fields of investigation does not confer a true value of interdisciplinarity on research. Nor does any change of direction within the same discipline necessarily amount to an interdisciplinary opening, even if the taking over of a research object traditionally belonging to another discipline may have led, directly or indirectly, to this change” (117, 118, 119, 120). It is therefore necessary to search for bifurcations and synergies between disciplinary domains, which condition the object of action and its relation between Science, Technology and Society (121), “not completely foreseeable from previous knowledge”, as is demanded for the patentability of innovative works. It would moreover aim, for Gramaccia (122), “at the social incorporation of knowledge for more social and cultural mediations”. Nevertheless, it already falls to the existing interdisciplinary actors to choose: if they do not try to “acknowledge” their own deviant “children”, it will never be possible to define the concept of interdisciplinarity more precisely.
 
In any case, if, as has been underlined, communities which share the same disciplinary passion are innovative but tend to close in on themselves, the opportunity for the Sciences of action is, from a scientific point of view, to develop a multidisciplinary space open to Society, allowing a more collective functioning of the creation and exploitation of new concepts. So, by association with “ordinary” scientific disciplines on truly open subjects which concern the technological future of society, these will bring out discontinuities leading to reconstructions and adaptations. These elements need to constitute the specific nature of interdisciplinarity, aiming through new approaches to create links between Man and artifacts, between users and the system of production and consumption. This is why it is necessary to explain how research objects are studied by scientists and technicians and how research results are communicated to the rest of society. This means that techno-scientific disciplines have to be the object of an analysis which goes further than their possible relation with solely scientific “truth”, and need to engage with the horizon of social practices, of artifacts and of the relations to which they may be linked. This group of central questions links research with the economy and with social and political contexts, and creates a direct link between interdisciplinary research and Society.
 
“We are in the world of ‘Sunday drivers’”, as George Friedmann writes, “of Men who have never leaned over their engine and for whom things simply have a function and are a mystery” (123). In the context described in the present essay, the authors would like to go against the direction of this proposal, which corresponds in part to a vision of research that is often too clumsy and too conceptual because it answers aspirations adapted to a “classical evaluation” following traditional models, ultimately from another age or rather from another world… In this situation, it is necessary to share better and to refocus actions on the “basic” mission, and therefore to be more efficient; in this way it will be possible to be acknowledged as living actors pushing for the development of autonomous “undisciplined inter-disciplines” useful for Society. But as Henry (117) reminds us, “the evaluation of scientific production always has something to do with the ‘territory’ of a discipline. Each advances its own definition; through the definition of frontiers, one is ‘inside’ or one is ‘outside’, transfers being sometimes condemned, sometimes accepted, but within a logic of transfiguration to be imposed as legitimate”. This opinion is broadly confirmed by that of Larrère (124), who considers that the industrial paradigm establishes a division between forms of knowledge, between the sciences of Nature and the Human and social sciences, constituting an “anthropo-centered epistemic paradigm”. This context, when Sciences – Techniques – Society are brought closer together, must therefore be taken into consideration in order to attain more optimal forms of judgment of activities.
 
Indeed, self-evaluation is necessary, and the same is true of evaluation by external experts. In both situations, it is probably useful to define quantitative criteria, which must be taken into account for what they are worth (but probably no more). In particular, the desired intrusion of the “hard” sciences into the social field is a constraint on the usefulness of such solely quantitative criteria. This means it must be possible to obtain qualitative judgments on interdisciplinary operations in different ways:
 
·In general: evaluation of the general interest of the research action;
·Analytical: the most definite possible identification of the perceptible advantages/ disadvantages of the research work;
·Comparative: analysis of alternative solutions;
·Analogical: transfer of judgments previously made on scientific activities considered similar, or jurisprudence (125).
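As an illustration only, the four qualitative routes above (general, analytical, comparative, analogical) could be combined into a simple weighted summary score. The sketch below is a hypothetical aid to discussion, not a procedure proposed by the authors: the 0-5 scale, the equal default weights and all function names are assumptions.

```python
# Hypothetical sketch: aggregating the four qualitative judgment routes
# into one weighted mean. Scale, weights and names are illustrative
# assumptions, not part of the essay's proposal.

CRITERIA = ("general", "analytical", "comparative", "analogical")

def aggregate_judgments(scores, weights=None):
    """Combine per-criterion scores (assumed 0-5 scale) into a weighted mean.

    scores  -- dict mapping each criterion to a numeric score
    weights -- optional dict of relative weights (default: equal weights)
    """
    if weights is None:
        weights = {c: 1.0 for c in CRITERIA}
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"missing judgments for: {missing}")
    total_weight = sum(weights[c] for c in CRITERIA)
    return sum(scores[c] * weights[c] for c in CRITERIA) / total_weight

# Example: a project judged strong on general interest but weaker on the
# comparison with alternative solutions.
project = {"general": 4, "analytical": 3, "comparative": 2, "analogical": 3}
print(round(aggregate_judgments(project), 2))  # 3.0
```

Such a number would of course only summarize, never replace, the underlying qualitative judgments; the weights themselves would have to be negotiated with the partners concerned, in the spirit of the essay.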
 
 
 
 
“Clever people are often loath to look certain awkward problems in the face and to envisage cool-headedly certain solutions which contradict their firm beliefs or feelings. The majority try to be right, rather than trying to see clearly” (106).
 
“In the reign of thought, imprudence is a method” (126).
 
“It is within the transitional that Man is fulfilled, or never” (127).
 
 
 
 
 
 
References
1. Hacking I. (2008). Disciplinaire et satisfait. Available from: http://www.interdisciplines.org
2. Pestre D. (2003). The evolution of knowledge domains: Interdisciplinarity and core knowledge. Available from: http://www.interdisciplines.com
3. Fauconnier D. J'exerce un métier, donc je suis SERP Ed. Paris, France, 1998.
4. Hacking I. (2004). The complacent disciplinarian. Available from: http://www.interdisciplines.org
5. Sperber D. (2004). Pourquoi repenser l'interdisciplinarité ? Available from: http://www.interdisciplinarity/papers/1/language/fr.
6. Teubner G. Droit et réflexivité : l’autoréférence en droit et dans l’organisation LGDJ Ed. – Paris – France, 1994.
7. Bhowmick S. (2010). Computer science conference rankings. Available from: http://www3.ntu.edu.sg/home/assourav/crank.htm.
8. Australian Research Council (2010). Era - excellence in research for Australia initiative. Available from: http://www.arc.gov.au/era/era_2010.htm.
9. Egghe L. An improvement of the h-index: the g-index ISSI Newsletter 2006; 2: 8-9.
10. Hirsch JE. An index to quantify an individual's scientific research output Proc. Ntl. Acad. Sci., 2005 ; 102: 16569-16572.
11. Jensen J, Rouquier JB, Croissant Y. Testing bibliometric indicators by their prediction of scientists promotions Scientometrics 2009; 78: 467-479.
12. Shi X, Leskovec J, McFarland DA. Citing for High Impact Arxiv preprint arXiv, 2010 ; 1004-3351.
13. Dumas M, Kungas P,  Parra Trepowski C, Casati F, Garcia L,  Birukou A. (2010). On the correlation between bibliometric indicators and rankings of conferences and researchers Available from: http://www.interdisciplines.org/conferences/Workshop-on-Trust-and-Reputation
14. Sperber D. (2003). Why rethink interdisciplinarity. Available from: http://www.interdisciplines.org.
15. Heintz C, Origgi G. (2004). Rethinking interdisciplinarity; emergent issues. Available from: http://www.interdisciplines.org
16. Canto-Sperber M, Ogien R. La philosophie morale PUF Ed.  Paris, France, 2004.
17. Godin B. La science sous observation : cent ans de mesures sur les scientifiques 1906 - 2006 Presses de l'Université Laval Ed. Québec, Canada, 2005.
18. Godin B. (2006). Mesurer la Science. Available on line from: http://www.csiic.ca/PDF/PourLaScience.pdf .
19. Elster J. Le désintéressement : traité critique de l’Homme économique Seuil Ed. Paris, France, 2009.
20. Ségalat L. La science à bout de souffle? Seuil Ed. Paris, France, 2009.
21. Arendt H. Qu’est ce que la politique? Seuil Ed. Paris, France, 1995.
22. Debord G. La société du spectacle Gallimard-Folio Ed. Paris, France, 2008.
23. Seglen PO. Why the impact factor of journals should not be used for evaluating research Brit. Med. J. 1997; 314: 498-502.
24. Melman C. L’Homme sans gravité in E. Enriquez L’idéal type de l’Homme moderne : l’individu pervers?  ERES Ed. Paris, France, 2004.
25. Wellcome (2003). Analyse économique de l’Edition scientifique – Rapport commandé par Wellcome trust. Available on line from: http://www.wellcome.ac.uk/scipublishing
26. Bettman JR, Luce ML, Payne JW. Constructive consumer choice processes J. Consumer Res 1998; 25: 187-217.
27. De Tocqueville A. De la démocratie en Amérique Flammarion Ed. Paris, France, 1981.
28. Laurain B. (2009) Humoristic private communication – CNRS, Paris, France, 2009.
29. De Certeau M. L'invention au quotidien ; arts de faire Collection 10/18 Ed. Paris, France, 1980.
30. Karpik L. L'économie des singularités NRF – Gallimard Ed. Paris, France, 2007.
31. Goffman E. Les cadres de l'expérience Les Editions de Minuit, Paris, France, 1991.
32. Von Keyserling H. L'analyse spectrale de l'Europe C. de Bartillat Ed. Paris, France, 1990.
33. Chamayou G. (2010). Petits conseils aux enseignants-chercheurs qui voudront réussir leur évaluation. Available from: http://www.contretemps.eu/interventions/petits-conseils-enseignants-chercheurs-qui-voudront-reussir-leur-evaluation.
34. Salem J. Rideau de fer sur le Boul’mich : formatage et désinformation dans le monde libre Editions Dega, Paris,  France, 2009.
35. Fleury C. Les pathologies de la démocratie Fayard – Livre de poche Ed. Paris, France, 2005.
36. Stengers I. L’invention des sciences modernes Champs Sciences Ed. Paris, France, 1995.
37. Lecourt D. Contre la peur PUF Ed. Paris, France, 1999.
38. Berthelot JM, Martin O, Collinet C. Savoirs et savants ; les études sur la science en France PUF Ed. Paris, France, 2005.
39. Bourdieu P. Les usages sociaux de la science ; pour une sociologie clinique du champ scientifique INRA Ed. Paris,  France, 1997.
40. Larousserie D. La démesure de la science Sciences&Avenir, June 2008 ; 84-85.
41. Kermarrec AM, Faou E, Merlet JP, Robert P., Segoutin L. Que mesurent les indicateurs bibliométriques Document d'analyse de la commission d'évaluation de l'INRIA 34 pp. INRIA, Rocquencourt, France, 2007.
42. Van der Graaf M. (2004). A Report on the Functionality of Abstracts & Indexing Database Platforms: recent Developments, Library Policies and a new Evaluation Technique - Scopus white Paper Series 3. Available from : http://www.info.scopus.com/researchtrends/doc/RT5.pdf
43. Aeres. (2008). Charte de l'évaluation. Available from: http://www.aeres-evaluation.fr/charte-de-l-evaluation
44. Porter TM. Trust in number; the pursuit of objectivity in Science and Public Life Princeton University Press Ed. USA, 1995.
45. Schibany A, Streicher G. The European Innovation Scoreboard: drowning by numbers? Science and Public Policy 2008; 35: 717-732.
46. Green T. (2009). We Need Publishing Standards for Datasets and Data Tables - OECD Publishing White Paper, OECD Publishing. – Paris – France. Available from: http://dx.doi.org/10.1787/603233448430OECD
47. OCDE (2007) Séminaire de l'OCDE sur la fraude scientifique. Available from: http://www.bulletins-electroniques.com/actualites/041/41989_vi.htm
48. Zitt M, Filliatreau G. Bibliométrie et indicateurs : rôle de l'OST Rencontres 2005 des professionnels de l'IST – Nancy – France, 2005.
49. Diouf JP. (2006) Combler le fossé Nord-Sud dans le domaine de la communication sur l'Afrique : menaces et opportunités de l'ère du numérique. Available from: http://www.codesria.org/Links/conferences/electronic_publishing06/papers/Jean_Pierre_Diouf.pdf
50. Giget M. Du push technologique à la synthèse créative : vers un retour des valeurs de progrès in J. Attali, C. de Boissieu Ed. Un monde en mouvement ; enjeux et défis ESKA Ed. – Paris – France, 177-180, 2007.
51. Conseil des bibliothèques (2003) La publication scientifique ; problèmes et perspectives. Available from:  http://www.ucl.ac.be/cbib/pub_sc_nv91.pdf.
52. De Pracontal M. L’imposture scientifique en dix leçons Seuil Ed. – Paris – France, 2005.
53. Heilbrunn B. La performance ; une nouvelle idéologie ?“ La découverte Ed. – Paris – France, 2004.
54. Ehrenberg A. Le culte de la performance Calmann-Lévy Ed. – Paris – France, 1991.
55. Girard R. La voie méconnue du réel : une théorie des mythes archaïques et modernes Grasset – Livre de poche Ed. – Paris – France, 2002.
56. Roux-Dufort C. La performance, antichambre de la crise in B. Heilbrunn Ed. La performance ; une nouvelle idéologie ?“ La découverte Ed. - Paris – France, 144-162, 2004.
57. André JC. Vers le développement d'une recherche durable... ou vers une (ré)humanisation des sciences des artefacts Environnement, Risques et Santé 2008; 7: 47-54.
58. Stengers I, Schlanger J. Les concepts scientifiques Gallimard Ed. - Paris – France, 1991.
59. Gringras Y. Note de Recherche : la fièvre de l'évaluation de la recherche ; du mauvais usage de faux indicateurs 15pp. CIRST Ed. - Montréal – Canada, 2008.
60. Kuhn T. La structure des révolutions scientifiques Champs-Sciences Ed. - Paris – France, 2008.
61. Dubessy J, Lecointre G. Ed. Intrusions spiritualistes et impostures intellectuelles en Sciences“ Actes du colloque organisé sous l'égide de la libre pensée (2000) Syllepse Ed. – Paris – France, 2003.
62. Scopus (2008) Scopus http://www.info.scopus.com/
63. Thomas More Institute. (2009). Note de benchmarking 4 : Quel classement européen des universités ? Available from: http://www.institut-thomas-more.org
64. Lepori B, Barré R, Filliatreau G. New perspectives and challenges for the design and production of S&T indicators Research Evaluation 2008; 17: 33-44.
65. Kanninen S, Lemola T. Methods for evaluating the impact of basic research funding 99 pp. Academy of Finland Ed. – Helsinki – Finland, 2006.
66. ERiC project (2009) Evaluating research in context. Available from: http://www.eric-project.nl/nwohome.nsf/pages/NWOA_7RXCLC_Eng ; http://ec.europa.eu/research/science-society
67. EREFIN (2009) Un jeu de descripteurs quantitatifs pour une approche intégrative de la production et des résultats d’une unité de recherche. Available from: http://www.obs-ost.fr/en/la-cooperative/erefin.html
68. André JC. Proposition d'indicateurs d'efficience pour INSIS-CNRS – Internal report INSIS – CNRS, Paris – France.
69. Berthoz A. La simplexité O. Jacob Sciences Ed. – Paris – France, 2009.
70. Madiès T, Prager J.-C. (2008). Innovation et compétitivité des Régions. Available from: http://www.cae.gouv.fr:80/
71. Autant-Bernard C, Massard N. Disparités locales dans la production d'innovation : l'incidence du choix des indicateurs in 4èmes journées de la proximité Proximité, réseaux et communication, MIMEO – Stanford – USA, 2004.
72. De Turckheim E, Hubert B, Cerf M. L’évaluation des recherches partenariales ; quelle procédure, quels critères in P. Béguin et M. Cerf Ed. Dynamique des savoirs, dynamique des changements Octarès Ed. – Toulouse – France, 265-281, 2009.
73. Spaapen J, Dijstelbloem H, Wamelink F. Evaluating research in context, cf. ERiC project, 2007.
74. Laufer R, Paradeise C. Le prince bureaucrate Flammarion Ed. - Paris – France, 1982.
75. Cattell RB. The scientific use of factor analysis in behavioural and life sciences Plenum Press Ed. – New York – USA, 1978.
76. LOLF (2001) La LOLF, qu'est-ce que c'est ? Available from: http://www.education.gouv.fr/cid31/La-lolf-qu-est-ce-que-c-est.html
77. Bachimont B. La complexité audio-visuelle : enjeux pour une recherche interdisciplinaire in Les dossiers de l'audio-visuel 85 - La recherche en information et communication en France – Paris – France, 1999.
78. Guillebaud JC. Le commencement d'un monde Seuil Ed. – Paris – France, 2008.
79. Solé A. (2004) Critique de la complexité. Available from: http://forum.aceboard.net/63455-2145-3752-0-.htm.
80. Lubart T. Psychologie de la créativité A. Colin Ed. - Paris – France, 2005.
81. Ochse R. Before the gates of excellence: the determinants of creative genius Cambridge Univ. Press Ed. – New York – USA, 1990.
82. Hubert B, Bonnemaire J. La construction des objets dans la recherche interdisciplinaire finalisée : de nouvelles exigences pour l'évaluation EDP Sciences Ed. – Paris – France, 1993.
83. André JC, Le Mehauté A, De Witte O. Dispositif pour réaliser un modèle de pièce industrielle. French Patent n° 84 11 241 - 16.07.1984.
84. Bruner J. Pourquoi nous racontons nous des histoires ? Retz Ed. – Paris – France, 2002.
85. Lévy-Leblond JM. La vitesse de l’ombre ; aux limites de la science Seuil Ed. - Paris – France, 2006.
86. Godard O, Henry C, Lagadec P, Michel-Kerjan E. Traité des nouveaux risques Gallimard-Folio Ed. – Paris – France, 2002.
87. Serres M. Le mal propre : polluer pour s’approprier Le Pommier Ed. - Paris – France, 2008.
88. Lesourne J, Bravo A, Randet D. Avenirs de la recherche et de l'innovation en France La Documentation Française Ed. - Paris – France, 2004.
89. Albarello L. Apprendre à chercher. De Boeck Ed. – Brussels – Belgium, 2004.
90. Stéphany D. Développement durable et performance de l’entreprise Ed. Liaisons – Rueil-Malmaison – France, 2003.
91. Chesbrough H. Open innovation: the new imperative for creating and profiting from technology Harvard Business School Press – Boston – USA, 2003.
92. Moles A. Théorie de l'information et perception esthétique  Flammarion Ed. – Paris – France, 1990.
93. Chiffoleau Y. La sociologie des réseaux au service d'une recherche engagée : retour sur un travail d'équipe en viticulture languedocienne in P. Béguin et M. Cerf Ed. Dynamique des savoirs, dynamique des changements Octarès Ed. – Toulouse – France, 111-127, 2009.
94. De Miranda L. L’art d’être libre au temps des automates Max Milo Ed. – Paris – France, 2010.
95. De Geuser F, Fiol M. Faire face aux situations complexes: la blessure narcissique des managers in Moingeon B. Ed. Peut-on former les dirigeants ? L'apport de la recherche L'Harmattan Ed. – Paris – France, 99-125, 2004.
96. Weingart P. Interdisciplinarity: The paradoxical discourse in P. Weingart et N. Stehr Ed. Practising Interdisciplinarity Toronto Univ. Press Ed. - Toronto – Canada, 2000.
97. Baron X. Les conditions de la performance du travail intellectuel Entreprise et Personnel Ed. – Paris – France, 2002.
98. Schön DA. Knowing in action: the new scholarship requires a new epistemology Change 1995; 27: 27-34.
99. Van Andel P, Bourcier D. De la sérendipité dans la science, la technique, l'art et le droit ; leçons de l'inattendu L'ACTMEM "Libres Sciences" Ed. – Paris – France, 2009.
100. De Ketele JM, Roegiers X. Méthodologie du recueil d'information De Boeck Ed. – Brussels – Belgium, 1996.
101. Durkheim E. Les règles de la méthode sociologique PUF Ed. – Paris – France, 1937.
102. Foray D. Choix des techniques, rendements croissants et processus historiques : la nouvelle économie du changement technique in J. Prades Ed. La Techno-science : les fractures des discours L'Harmattan Ed. – Paris – France, 57-93, 1992.
103. Wolton D. Vieux problème – idées neuves Cahiers STS 1 – CNRS Ed. - Paris – France, 1984.
104. Crozier M, Friedberg E. L'acteur et le système Seuil Ed. – Paris – France, 1977.
105. Elias N. Engagement et distanciation Collection Agora-Fayard Ed. – Paris – France, 1983.
106. Lecomte du Noüy P. L'avenir de l'esprit Gallimard Ed. – Paris – France, 1941.
107. Devaquet A. L'amibe et l'étudiant O. Jacob Ed. – Paris – France, 1988.
108. Morin E. Science avec conscience Fayard Ed. - Paris – France, 1990.
109. Union Européenne. Recommandation de la Commission du 07 février 2008 concernant un code de bonne conduite pour une recherche responsable en nanosciences et nanotechnologies Journal Officiel de l'Union Européenne L 116/46-52 ; Ref. 2008/345/CE – Brussels – Belgium, 2008.
110. Renard R. Recherche scientifique et aide à la décision Recherches sociologiques, 1985; 16 : 143-169.
111. Léonetti J. Quand la science transformera l’humain Plon Ed. – Paris – France, 2010.
112. Adams J. Risk. UCL Press Ed. - London – UK, 1995.
113. Kant E. Fondements de la métaphysique des mœurs Delagrave Ed. – Paris – France, 1934.
114. Jonas H. Le principe Responsabilité Flammarion Ed. – Paris – France, 1990.
115. OCDE (1998) L'interdisciplinarité en sciences et technologies. Available from:  http://www.crsng.ca/professors_f.asp?nov=profnav&lbi=intre.
116. Camacho-Hübner E. (2007). De l'interdisciplinarité comme paradigme de recherche in Espaces temps.net Available from: http://espacetemps.net/document3842.html
117. Henry C. Le « je » intellectuel et le « jeu » interdisciplinaire Le Genre humain/Seuil Ed. 1998; 33: 155-170.
118. Alvarez-Pereyre F. L'exigence interdisciplinaire. MSH Ed. – Paris – France, 2003.
119. Noiriel G. Pour une approche subjective du social Annales 1989; 6: 1435-1459.
120. Vinck D. Pratiques de l'interdisciplinarité Presses Universitaires de Grenoble Ed. - Grenoble – France, 2000.
121. Ollivier B. Enjeux de l'interdisciplinarité L'année sociologique 2001/2; 51: 337-354.
122. Gramaccia G. Démocratie participative et communication territoriale. Vers la micro – représentativité L’Harmattan Ed. – Paris - France, 2008.
123. Baudrillard J. Le système des objets Gallimard Ed. – Paris – France, 2001.
124. Larrère L. Ordre biologique, ordre technologique; le cas de l'écologie in F. Tinland Ed. Ordre biologique, ordre technologique Champ-Vallon Ed. - Seyssel – France, 233-252, 1994.
125. Barthélémy P. Critères d'évaluation du processus de conception de produit in J. Perrin Ed. Pilotage et évaluation des processus de conception L'Harmattan Ed. – Paris – France, 41-67, 1999.
126. Bachelard G. Le rationalisme appliqué PUF Ed. – Paris – France, 1949.
127. De Beauvoir S. Pour une morale de l'ambiguïté Gallimard Ed. – Paris – France, 1947.
 

Corresponding Author: Jean-Claude André
INSIS-CNRS, 3 rue Michel Ange F75016 Paris - France
LRGP UPR 3349 CNRS - University of Lorraine – 1, rue Grandville F54000 Nancy - France
e-mail: info@preventionandresearch.com