In this study, we identify Russian “centers of excellence” and explore patterns of their collaboration with each other and with foreign partners. Highly cited papers serve as a proxy for “excellence,” and coauthored papers as a measure of collaborative effort. We find that research institutes (those of the Russian Academy of Sciences as well as others) currently remain the key players, despite recent government initiatives to stimulate university science. The contribution of the commercial sector to high-impact research is negligible. More than 90% of Russian highly cited papers involve international collaboration, and Russian institutions often do not play a dominant role. Partnership with U.S., German, U.K., and French scientists markedly increases the probability of a Russian paper becoming highly cited. Patterns of national (“intranational”) collaboration in world-class research differ significantly across types of organizations; the strongest ties are among three nuclear/particle physics centers. Finally, we draw a coauthorship map to visualize collaboration between Russian centers of excellence.
The study of interhuman communication requires a more complex framework than Claude E. Shannon's (1948) mathematical theory of communication, because “information” is defined in the latter as meaningless uncertainty. Assuming that meaning cannot be communicated, we extend Shannon's theory by defining mutual redundancy as a positional counterpart of the relational communication of information. Mutual redundancy indicates the surplus of meanings that can be provided to the exchanges in reflexive communications. This information is redundant because it is based on “pure sets” (i.e., without subtraction of mutual information in the overlaps). We show that in the three-dimensional case (e.g., a triple helix of university–industry–government relations), mutual redundancy is equal to mutual information (R_xyz = T_xyz); but when the dimensionality is even, the sign is different. We generalize to the measurement in N dimensions and proceed to the interpretation. Using Niklas Luhmann's (1984, 1995) social systems theory and/or Anthony Giddens's (1979, 1984) structuration theory, mutual redundancy can be provided with an interpretation in the sociological case: different meaning-processing structures code and decode with different algorithms. A surplus of (“absent”) options can then be generated that adds to the redundancy. Luhmann's “functional (sub)systems” of expectations or Giddens's “rule-resource sets” are positioned mutually, but coupled operationally in events or “instantiated” in actions. Shannon-type information is generated by the mediation, but the “structures” are (re-)positioned toward one another as sets of (potentially counterfactual) expectations. The structural differences among the coding and decoding algorithms provide a source of additional options in reflexive and anticipatory communications.
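The multidimensional mutual information underlying this relation can be computed by inclusion–exclusion over the entropies of the marginal distributions. The following sketch (our own illustration; the function names and toy distributions are not from the paper) computes the three-dimensional T_xyz = H_x + H_y + H_z − H_xy − H_xz − H_yz + H_xyz from a joint probability array; note that in three dimensions this quantity, unlike two-dimensional mutual information, can be negative:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a normalized probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information_3(pxyz):
    """Three-dimensional mutual information via inclusion-exclusion:
    T_xyz = H_x + H_y + H_z - H_xy - H_xz - H_yz + H_xyz."""
    hx = entropy(pxyz.sum(axis=(1, 2)))
    hy = entropy(pxyz.sum(axis=(0, 2)))
    hz = entropy(pxyz.sum(axis=(0, 1)))
    hxy = entropy(pxyz.sum(axis=2))
    hxz = entropy(pxyz.sum(axis=1))
    hyz = entropy(pxyz.sum(axis=0))
    hxyz = entropy(pxyz)
    return hx + hy + hz - hxy - hxz - hyz + hxyz

# Three independent binary variables: T_xyz = 0.
p_indep = np.full((2, 2, 2), 1 / 8)
t_indep = mutual_information_3(p_indep)

# XOR configuration (z = x XOR y, with x and y uniform and independent):
# every pairwise relation is uninformative, yet the triple is fully
# determined, which yields T_xyz = -1 bit.
p_xor = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p_xor[x, y, x ^ y] = 0.25
t_xor = mutual_information_3(p_xor)
```

The XOR case illustrates why the sign of such higher-order measures matters: negative values signal configurations in which the whole carries relations that no pair of variables reveals.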
Problem solving often requires crossing boundaries, such as those between disciplines. When policy-makers call for “interdisciplinarity,” however, they often mean “synergy.” Synergy is generated when the whole offers more possibilities than the sum of its parts. An increase in the number of options above the sum of the options in subsets can be measured as redundancy; that is, the number of not-yet-realized options. The number of options available to an innovation system for realization can be as decisive for the system's survival as the historically already-realized innovations. Unlike “interdisciplinarity,” “synergy” can also be generated in sectorial or geographical collaborations. The measurement of “synergy,” however, requires a methodology different from the measurement of “interdisciplinarity.” In this study, we discuss recent advances in the operationalization and measurement of “interdisciplinarity,” and propose a methodology for measuring “synergy” based on information theory. The sharing of meanings attributed to information from different perspectives can increase redundancy. Increasing redundancy reduces the relative uncertainty, for example, in niches. The operationalization of the two concepts—“interdisciplinarity” and “synergy”—as different and partly overlapping indicators allows for distinguishing between the effects and the effectiveness of science-policy interventions in research priorities.
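In Shannon's framework, redundancy can be operationalized as the difference between the maximum entropy of a system (all options equally available) and its observed entropy; the gap measures the not-yet-realized options. The following minimal sketch (our own illustration under that standard definition, not the paper's specific methodology) makes this concrete:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Redundancy: maximum entropy log2(N) minus observed entropy,
    i.e., the not-yet-realized options, measured in bits."""
    h_max = math.log2(len(probs))
    return h_max - shannon_entropy(probs)

# A uniform distribution realizes all options: redundancy is zero.
r_uniform = redundancy([0.25, 0.25, 0.25, 0.25])

# A skewed distribution leaves options unrealized: redundancy is positive.
r_skewed = redundancy([0.7, 0.1, 0.1, 0.1])
```

Relative uncertainty is then the observed entropy divided by the maximum, so increasing redundancy is, by construction, the same as reducing relative uncertainty.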
This article considers the relationships among meaning generation, selection, and the dynamics of discourse from a variety of perspectives, ranging from information theory and biology to sociology. Following Husserl's idea of a horizon of meanings in intersubjective communication, we propose a way in which, using Shannon's equations, the generation and selection of meanings from a horizon of possibilities can be considered probabilistically. The information-theoretic dynamics we articulate describe a process of meaning generation within cultural evolution: information is imbued with meaning, and through this process, the number of options for the selection of meaning in discourse proliferates. The redundancy of possible meanings contributes to a codification of expectations within the discourse. Unlike the hardwired code of DNA, the codes of nonbiological systems can coevolve with the variations. Spanning horizons of meaning, the codes structure the communications as selection environments that shape discourses. Discursive knowledge can be considered as meta-coded communication that enables us to translate among differently coded communications. The dynamics of discursive knowledge production can thus infuse the historical dynamics with a cultural evolution by adding options, that is, by increasing redundancy. A calculus of redundancy is presented as an indicator whereby these dynamics of discourse and meaning may be explored empirically.