POND III: Unity / Disunity of Science
Lisbon, 20–22 September 2018
Centre for Philosophy of Sciences of the University of Lisbon (CFCUL)
Faculty of Sciences and Faculty of Law, Lisbon University
|Thursday 20.09||Friday 21.09||Saturday 22.09|
|9.00 – 10.00
Chair – Jose Diez
Antonio Zilhão, Lisbon University, Does the hypothesis of the disunity of science really work? Commentary: Stavros Ioannidis, University of Athens
|9.00 – 10.00
Chair – İlke Ercan
İlke Ercan & Berna Kılınç, Bogazici University, Istanbul, Entropy, Information, and Their Relation to Energy: Implications in Science and Engineering. Commentary: Jose Diez, University of Barcelona
|10.00 – 11.00
Chair – Stephanie Ruphy
Ori Belkind, Tel Aviv University, Disunity in Inductive Inferences
|10.00 – 11.00
Chair – Stavros Ioannidis
Elina Pechlivanidi, Athens University, Acting powers, interveners, and the problem of the temporal gap
|11.00 – 11.30
|11.00 – 11.30
Chair – Berna Kılınç
Fernando Rua, CFCUL, Thought experiments as abstract artifactual entities: reinterpreting Heisenberg’s microscope experiment
Sónia Duarte, CFCUL, Maxwell’s valve and Thomson’s demon
Chair – Joshua Norton
Nathalie Gontier, CFCUL, Pattern Similarity and Explanatory Pluralism in Biological, Sociocultural and Linguistic Sciences
|Thursday 20.09||Friday 21.09||Saturday 22.09|
|15.00 – 15.30
Orly Shenker (POND President), Rui Moreira (CFCUL Coordinator) and Olga Pombo (Local Organizer)
|15.30 – 16.00
Cristina Bares, University of Sevilla, & Matthieu Fontaine, CFCUL, Disunity of Inferences in a Unified Framework
|15.00 – 16.00
Chair – Correia Jesuíno
|15.00 – 16.00
Chair – José Croca
|16.00 – 16.30
|16.00 – 16.30
|16.00 – 16.30
Chair – Orly Shenker
João Cordovil, CFCUL, Must OSR endorse Metaphysical Fundamentalism?
Gil C. Santos & João Pinheiro, CFCUL, The Relational-Integrative Model of Explanation
Chair – Elena Castellani
José Croca, University of Lisbon, The Principle of Eurhythmy. Origins
Chair – Lilia Gurova
Fernanda Palma, University of Lisbon, Psychiatrization of Law and Juridicization of Psychiatry
Jorge Correia Jesuíno, ISCTE-University Institute of Lisbon, Unity and Discontinuity in the Scientific Field of Psychology
17:00 POND SESSION
18:00 POND SESSION
|Fado Dinner||Farewell Dinner|
– Book of Abstracts –
Antonio Zilhão (University of Lisbon / CFCUL, Portugal)
Does the Hypothesis of the Disunity of Science Really Work?
Jerry Fodor passed away a few months ago. One of the most creative philosophical minds of the last quarter of the twentieth century, he published many famous writings in the course of his philosophical career. Among them is a paper whose topic matches this year’s POND conference theme: “Special Sciences (or: The Disunity of Science as a Working Hypothesis)” (Synthese 28: 97–115, 1974). In this paper, Fodor makes a plea for the disunity of science; this plea is, in turn, based upon a diagnosis of the development of the scientific enterprise into a diverse gamut of scientific disciplines. According to Fodor’s diagnosis, such a development entails, by its very nature, the inevitable disunity of science. Besides its intrinsic theoretical merits, Fodor’s paper also acted in the 1980s and 1990s as a trigger for an avalanche of books and papers in the philosophy of mind and metaphysics advocating the multiple-realizability view of so-called ‘non-reductive materialism’. In the meantime, matters have quieted down. The time thus seems ripe for revisiting both the main ideas Fodor put forth in his paper and the contentions and ambitions characterizing the ensuing flood of non-reductive materialist theorizing in metaphysics and the philosophy of mind. In doing so, I will focus, in particular, on whether Fodor’s diagnosis of, and plea for, the disunity of science was actually properly substantiated by his arguments.
Elina Pechlivanidi (Athens University, Greece)
Acting Powers, Interveners, and the Problem of the Temporal Gap
This paper offers a critique of Nancy Cartwright’s “patchwork pluralism”. The point of departure is common: the thesis that capacities are not only intimately connected with causation but are central to causal relations and causal explanations. On this view, an effect is an acting power (or what the contemporary discussion refers to as the “manifestation” of a power). Moreover, on both accounts real events are singular and are based on complex interactions that require time, i.e. the manifestation of a complex power is a process. However, like any process, this allows intervention; but then it is possible that an activated power does not manifest itself. This is the main objection against causal necessity based on powers, and it gives rise to the following problem: if powers are to ground laws (or in any case provide the regularity in nature) but do not involve necessity, how can one hold both the possibility of intervention and nomological necessity? Cartwright’s pluralism gives an answer by accepting laws only as regularities brought about by the nomological machines we create. While maintaining singular causation, regularities remain possible given the account of nomological machines, which involve a shielded environment such as that of a laboratory or an experiment. The negative thesis of this paper is that even if one accepts manifestation as a process, there is no genuine intervention that breaks down regularities and makes the laws of nature ‘lie’. I show that in order for intervention to apply, one is led to accept a temporal gap between cause and effect, a claim already posed and dismissed by Russell. Finally, by dismissing genuine intervention, one need not accept pluralism.
İlke Ercan and Berna Kılınç (Bogazici University Istanbul, Turkey)
Entropy, Information, and their Relation to Energy: Implications in Science and Engineering
Philosophers of science have usually paid attention to how general theoretical frameworks can bring unity to different scientific outlooks. A less well-known phenomenon is how applications of a theory can contribute to unity. Recent developments in the physics of information provide an illustration: the physical nature of information has been the subject of much debate since Szilard’s approach to the Maxwell’s demon problem. Questions around the physicality of information remained abstract until the emergence of nanotechnology. Measurements of quantum-scale precision enabled researchers to perform experimental tests of Landauer’s principle, which led to a dramatic turn in the debate. The results reveal the implications of certain misconceptions in interpreting entropy, information and their relation to energy. The literature reveals that scientists and engineers employ different approaches to assessing information-theoretic energy dissipation in nanosystems. However, in order to accurately assess the fundamental energy requirements of a complex computing paradigm, we need a unifying framework that enables the calculation of energy bounds based on entropy and information. In this paper, we provide an overview of common misconceptions in science and engineering in relating entropy and information to energy, and we illustrate how experimental practices clarify these misunderstandings and help develop a unifying framework. We emphasize the importance of such a framework and argue that it can significantly inform the strategic development as well as the performance assessment of emerging technologies, benefiting both researchers and funding agencies striving to take effective strides towards the realization of novel computing paradigms.
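[Editor’s illustration, not part of the abstract.] The energy bound the authors refer to is Landauer’s principle, which states that erasing one bit of information in a system at temperature T must dissipate at least

```latex
% Landauer bound: minimum heat dissipated when erasing one bit
E_{\min} = k_{B}\, T \ln 2 \approx 2.9 \times 10^{-21}\ \mathrm{J}
\qquad \text{at } T = 300\ \mathrm{K},
```

where $k_{B} \approx 1.38 \times 10^{-23}\ \mathrm{J/K}$ is Boltzmann’s constant; the nanoscale experiments mentioned above test how closely physical erasure operations can approach this bound.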
Joshua Norton (American University of Beirut, Lebanon)
Unification and Quantum Gravity
In this talk, I will raise four considerations relating to unification and the search for a quantum theory of gravity (QG). The first two considerations express the sense in which our current search for a quantum theory of gravity has relied upon unification as a guiding principle. This reliance is largely a result of there being little to no data which directly entails a need for a quantum theory of gravity. While there is plenty of data which can be used to fine-tune and guide our development of a quantum theory of gravity, there is next to no data which would count as anomalous from the perspective of general relativity (GR). In other words, there is no experimental evidence which suggests that GR is false. Given that we don’t have recalcitrant data pointing to the falsity of GR, research into quantum gravity has largely been motivated by the following two considerations: (a) a philosophical presupposition that the fundamental laws of the universe are unified, and (b) that, without quantization, gravitational phenomena would be conceptually and perhaps physically incompatible with the quantum matter of our other theories. In addition, I will argue against the standard requirement that any theory of quantum gravity ought to subsume general relativity. In particular, I will argue that we should not require of QG that we, the practitioners of the theory, be able to formally derive Einstein’s field equations, or certain metric structures from it. This “derivation” requirement has been treated as the North Star for quantum gravity programs to the extent that quantum theories of gravity are judged as more or less promising based on how well Einsteinian physics can be extracted from them. I will argue that we ought not require this. While a derivation of GR from QG would make the new theory practically useful, the physical world might not comply. 
We require our new theories to be empirically successful and to make novel predictions, but so much the worse if we are unable to derive a past theory from them. One might think of the requirement that new theories ought to formally include old theories as a type of conceptual unification across successive scientific paradigms. However, science can progress even without this form of unification. And finally, I will address the issue of retrodiction and unification in the context of certain radical quantum theories of gravity which do not include space or time as part of their ontology. Such a context provides a powerful tool for probing what we require of novel theories in terms of retrodiction when these novel theories do not include essential structures which are presupposed by the data of our old theories. For instance, most of our observable data is expressed in terms of local beables (à la John Bell): these are shapes and properties located (or locatable) in well-defined local regions of space. In the simplest case, we test theories by observing needles pointed at different numerical readouts. Yet what are we to require of a novel theory which putatively does not make contact with spatio-temporal structures, which does not include local beables as part of its fundamental ontology? What is required for retrodiction once local beables are excluded? With respect to unification, it is often claimed that once general relativity is quantized we will be in a better position to unify the theory with the other fundamental forces. However, I will argue that there are good reasons to see certain quantization programs as eliminating GR by providing a completely different framework for thinking about physical structures. According to these radical quantization schemes, unification might be a result of elimination rather than subsumption.
 A third reason can be added to these, though I will not address it in this talk: (c) noting that we gained new mastery over the world upon quantizing and unifying our other physical theories, induction would suggest that there are fruits to be gained in quantizing and unifying GR as well.
Ori Belkind (Tel Aviv University, Israel)
Disunity in Inductive Inferences
John Norton’s material theory of induction argues that there are no general or formal inductive schemas, and that inductive generalizations are grounded in facts established empirically. In essence, Norton’s material theory of induction “localizes” inductive inferences, given that each area of scientific research bases its inductive generalizations on empirical facts established prior to the inductive inference. In this presentation I provide further support for Norton’s negative claim that inductive generalizations lack any unified formal schemas. I argue that deductive rules of inference provide a false analogue for inductive schemas, given that deductive rules of inference can abstract from the material content of our theories while inductive inferences cannot. To substantiate this claim, I try to show that Early Modern thinkers (for example, Francis Bacon, Robert Boyle, and Isaac Newton) relied on distinct versions of the corpuscular hypothesis to underwrite their theories of induction. However, contrary to Norton’s positive account, I argue that the content that underwrites induction in these cases is not merely a set of empirical facts. The “material” basis for induction consists of general matter-based strategies for reducing observable phenomena to corpuscular elements, their primary qualities, and the configuration of the corpuscular parts. Thus, we can see how general theories of matter provide the necessary “local” structures that underwrite inductive inferences. Perhaps these insights into matter-based inductive strategies can be used to investigate the local nature of different areas of science.
Raphael Kunstler (Toulouse II University Jean Jaurès, France)
The two Relativisms Dilemma
Are the norms of scientific inquiry fixed? If not, does this involve relativism? Following Kuhn’s work (1962), these two questions have attracted much discussion. Larry Laudan (1984) tried to offer a model that accommodates both normative mobility and rationality. This was the occasion of a heated debate with John Worrall. Worrall (1988) claimed that Laudan did not manage to avoid relativism because, according to his «reticulated model», the same theory can be legitimately accepted in one social context and legitimately rejected in another. Laudan (1990) replied that the real relativist was Worrall, since his normative intuitionism implied renouncing any justification of the norms of scientific inquiry. We are therefore left with a choice between Laudan’s relativism and Worrall’s relativism. In my paper, I try to offer a solution to this dilemma.
Kuhn, T. (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
Laudan, L. (1984). Science and Values: The Aims of Science and Their Role in Scientific Debate. University of California Press.
Laudan, L. (1989). «If It Ain’t Broke, Don’t Fix It». The British Journal for the Philosophy of Science, 40 (3), 369–375.
Laudan, L. (1990). «Aim-less Epistemology?». Studies in History and Philosophy of Science Part A, 21 (2), 315–322.
Worrall, J. (1988). «The Value of a Fixed Methodology». The British Journal for the Philosophy of Science, 39 (2), 263–275.
Worrall, J. (1989). «Fix It and Be Damned: A Reply to Laudan». The British Journal for the Philosophy of Science, 40 (3), 376–388.
Fernanda Palma (University of Lisbon, Portugal)
Psychiatrization of Law and Juridicization of Psychiatry: reductionism or collaborative autonomy between scientific methodologies
1. The first question: the meaning and necessity of Law as Science, and the problems of responsibility and capacity; 2. Preliminary questions: a) the alleged specificity of the social sciences, from Dilthey’s dichotomy between human and natural sciences and the idea of a common normative parameter; b) the qualification of Law as a normative science in the analysis of the neo-Kantians; c) legal science and the relation between facticity and validity in contemporary legal thought: several answers; d) the approximation of the Science of Law to the social sciences and the understanding versus explanatory paradigms; 3. The meeting and mismatch between thinking about Law and Psychiatry on the issue of responsibility and capacity: a) the psychiatrization of Law as a historical experience and as a scientific question; b) the juridicization of Psychiatry as an experience of legal systems and as a scientific interpellation; 4. An interdisciplinary methodological cooperation between Law and Psychiatry on the criteria of criminal responsibility: soft and hard questions. 5. The map-and-territory metaphor in the relationship between Law and Psychiatry. Conclusions.
Fernando Rua (CFCUL, Portugal)
Thought experiments as abstract artifactual entities: reinterpreting Heisenberg’s microscope experiment
When quantum mechanics was taking its first steps towards building its conceptual framework, Heisenberg proposed a thought experiment known in the literature as “Heisenberg’s gamma-ray microscope”. In an endnote of the article “Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik” (Zeitschrift für Physik, March 1927), Heisenberg proposed an ideal experiment stating the impossibility of determining the position of a particle, paving the way to the formulation of the uncertainty relations, one of the fundamental dogmas of the orthodox interpretation of quantum mechanics. It is our endeavor to look at this particular thought experiment in its historical context, but also in its epistemological richness. We intend to reinterpret the gamma-ray thought experiment according to Amie Thomasson’s artifactual theory of fiction and to compare this interpretation with Platonic and argumentation views in order to better understand the epistemic and creative tasks of thought experiments.
Gil C. Santos & João Pinheiro (CFCUL, Portugal)
The Relational-Integrative Model of Explanation
Integrative pluralists [e.g. Bechtel 1986, Levin 1992, Kincaid 1997, Mitchell 2003, Bechtel & Hamilton 2007, and Brigandt 2013] depart from the observation that theories and models are idealizations, abstracting causes away from an actually concrete world, to argue that, partial descriptions notwithstanding, models need to be integrated so as not to make incompatible claims about the operation of each model’s ignored causes (contra radical interpretations of nomological pluralism [e.g. Dupré 1993 & 1996, and Cartwright 1996]). Simultaneously, while attending to the nature of more complex target systems, integrative pluralists recognize the role that emergent properties play in barring the way of strictly “low-level” models of explanation. Their claim is that unity by lower-level reduction, defined here as an explanation of the emergent properties of an object solely in terms of the intrinsic properties of the parts that constitute it plus their aggregativity [e.g. Kim 1999], fails in the face of relational properties [e.g. Hüttemann 1998, Lewontin 2000, Keller 2010, Hüttemann & Love 2011, Santos 2015 & 2016, and Humphreys 2016], «for surely the relational properties of a thing cannot supervene on the microphysical properties of that thing alone» [Dupré 1999:109], given that «relational properties are properties that each entity has and acquires due to its extrinsic relations with other entities in its environment» [Santos 2015:4]. Alternatively, we propose a relational-integrative model of explanation that accounts for strong (ontological) emergence by combining a lower-level analysis of relational properties, the local interactions of the system’s parts and their global organization at the system level, and, eventually, even a higher-level analysis for cases in which a systemic property is itself extrinsic (e.g. a systemic property of a molecule within, and relative to, a cell).
Jorge Correia Jesuíno (ISCTE-IUL / CFCUL, Portugal)
Unity and Discontinuity in the Scientific Field of Psychology
The issue of the unity/disunity of science is logically related to the epistemological criteria of demarcation, which aim at distinguishing science from non-science or pseudo-science. Even admitting that there exists a demarcation criterion, scientific disciplines are not equal in terms of epistemic dignity; some of them seem to be more equal than others, in accordance with a hierarchy leading to a pecking order. This same hierarchical differentiation can also be observed within the various disciplinary fields. The present paper attempts to illustrate such a controversial situation in the case of Psychology, a discipline that Ebbinghaus described as having a long past and a short history, at least in terms of its scientific status, divided as it is, in accordance with Dilthey, between the sciences of nature and the sciences of the spirit, the former allegedly grounded on explanation and the latter on comprehension. For Jean Piaget, Psychology is situated at the interface linking formal disciplines such as logic and mathematics with the life sciences, which introduces another level of epistemic complexity, in that mental processes would be made intelligible neither in terms of causal explanation nor in terms of empathic comprehension, but rather in terms of logical implication. Psychology could thus be illustrative of how paradigms and sub-paradigms, even within the same interdisciplinary field, become incommensurable, giving rise to controversies that are displaced to a metaphysical level beyond the Popperian criterion of falsifiability. It could be added that the very debate about the unity versus disunity of science is itself liable to the same kind of undecidability, partly due to the fuzziness of the no less metaphysical notion of unity. Probably science is like God, whose centre is everywhere and whose circumference is nowhere.
João Cordovil (CFCUL, Portugal)
Must OSR endorse Metaphysical Fundamentalism?
As Steven French puts it, Ontic Structural Realism (OSR) is motivated by “two sets of problems that ‘standard’ realism is seen to face. The first has to do with apparent ontological shifts associated with theory change that can be observed throughout the history of science. The second is associated with the implications – again ontological – of modern physics” (French, 2010). OSR’s literature stresses precisely the fact that modern physics implies the downfall of, or is at least incompatible with, the traditional metaphysics of objects – namely, with the individuality and ontological-independence conditions. In opposition to the traditional metaphysics of objects, OSR is often presented as the ontological view according to which, at the fundamental level of reality, there are either structures of relations (R-OSR) or, in OSR’s moderate version (M-OSR), structures of relations and objects. So, in OSR’s literature, all but one characteristic of the traditional metaphysics of objects has been rejected, revised or at least put in question: the assumption that there is a fundamental level of physical reality. But does OSR really need to be committed to metaphysical fundamentalism? More specifically, is there a naturalistic ground for the fundamentalist claim of R-OSR? Does physics entail Metaphysical Fundamentalism? Does R-OSR’s commitment to Metaphysical Fundamentalism depend on whether or not our best scientific theories are fundamentalist (and is it therefore a contingent assumption)? Is a non-fundamentalist Primitive Ontology that tackles issues in modern physics possible? And finally, can a non-fundamentalist OSR be developed? Drawing on the recent concerns with fundamentalism raised notably by Schaffer (2003, 2010), Markosian (2005), Cameron (2008), McKenzie (2012, 2013, 2014, 2015, 2016) and Tahko (2018), I will first show how current forms of OSR are committed to Fundamentalism.
Then I will argue that, with respect to both mereological and supervenience relations, OSR has severe difficulties in sustaining the fundamentalist thesis. I will then show that there are different non-fundamentalist accounts that can work within R-OSR, and that M-OSR in combination with a Primitive Ontology need not be committed to fundamentalism either. Finally, I will argue that non-fundamentalism is not only compatible with current forms of OSR but is also coherent with OSR’s initial motivations.
Cameron, R. (2008), “Turtles All the Way down: Regress, Priority and Fundamentality”, Philosophical Quarterly, 58 (230): 1-14.
Esfeld, M. and Lam, V. (2010), “Ontic Structural Realism as a Metaphysics of Objects” in Alisa and Peter Bokulich (eds.): Scientific structuralism. Dordrecht: Springer 2010, pp 143-159.
McKenzie, K. (2012), Physics without Fundamentality (unpublished Ph.D. thesis).
McKenzie, K. (2013), “Priority and Particle Physics: Ontic Structural Realism as a Fundamentality Thesis”, The British Journal for the Philosophy of Science, 65 (2): 353-380.
McKenzie, K. (2014), “On the Fundamentality of Symmetries”, Philosophy of Science 81, no. 5 (December 2014).
McKenzie, K. (2015), “Relativities of Fundamentality”, Studies in History and Philosophy of Modern Physics, 2015.
McKenzie, K. (2016), “Looking forward, not back: Supporting structuralism in the present”, Studies in History and Philosophy of Modern Physics, 2016.
Ladyman, J. and Ross, D. (2007), Everything must go: Metaphysics naturalized, Oxford: Oxford University Press.
Schaffer, J. (2003), “Is There a Fundamental Level?”, in Noûs 37.3 (2003), pp. 498-517.
Tahko, T. (2018), “Fundamentality and Ontological Minimality” in Ricki Bliss & Graham Priest (eds.): Reality and its Structure. Oxford University Press: 237-253 (2018).
José Croca (University of Lisbon / CFCUL, Portugal)
The Principle of Eurhythmy. Origins
It is amazing to recognize that most of the basic concepts of modern physics had early formulations long before. For instance, the organizational principle of eurhythmy, formulated at the beginning of this century, was in a certain way already proposed by many thinkers of the past. The principle of eurhythmy tells us that complex physical entities, in their becoming, interact reciprocally with the omnipresent medium in such a way that, on average, they tend to improve their degree of persistence. A first version of this principle may be found, for instance, in Aristotle. To explain the fall of bodies, he advanced the principle of the natural place. According to this principle, physical bodies tend to move towards their natural place, which, in this case, is the Earth. Also, Heron of Alexandria, Pierre de Fermat and many other scholars of the past advanced early versions of the principle of eurhythmy to explain observable physical phenomena.
J.R. Croca, “The Principle of Eurhythmy: A Key to the Unity of Physics”, Unity of Science: Nontraditional Approaches, Lisbon, October 25-28, 2006.
J.R. Croca, “The Principle of Eurhythmy: A Key to the Unity of Physics”, in Special Sciences and the Unity of Science, Pombo, O., Torres, J.M., Symons, J., Rahman, S. (Eds.), Springer, 2012.
J.R. Croca, Eurhythmic Physics, or Hyperphysics, The Unification of Physics, Lambert, Berlin, 2015.
Aristotle, Physics, Translated by Robin Waterfield, Ed. David Bostock, Oxford World’s Classics, Oxford, 1999.
Marco Pina (CFCUL, Portugal)
Reductionism, Determinism and the Sciences of Human Behavior – an Ontological and Epistemological Analysis of how Developments in Regulation of Gene Expression and Synaptic Plasticity are Contributing to the Debate
Reductionism and determinism are key issues for the unity/disunity of science problematics in the philosophy of biology, and more specifically for those who face the philosophical quandaries of the study of human behavior, at a time when it is being tackled from several scientific perspectives, from the biological to the psychological sciences (e.g., genetic vs environmental causes of mental illness). This multiplicity of theoretical and methodological approaches constitutes the scientific objectivation of the old ‘nature vs nurture’ ontological debate on the causal emergence of human behavior, currently put in the terms ‘genes vs environment’. This presentation aims to analyze this objectivation, specifically the contributions from molecular genetics, systems biology and synaptic plasticity that are advancing candidate biological mechanistic explanations of how environmental (psychosocial) factors – and not only genetic factors – determine behavior. Our main goal is to help determine the extent to which these findings can be said to lay deep foundations for an epigenetic (not fully genetically predetermined) explanatory model of the enormously complex gene vs psychosocial interactions at the roots of human behavior. The reductive nature of these and other scientific approaches to human behavior raises, however, another set of ontological and epistemological problems, summarized in the question of how complex behaviors can arise from the organization of molecules in bodies and brains. Although this is a very complex problem, an attempt at contributing to the discussion will be made, hand in hand with the work of some scientists and philosophers of biology and mind.
Matthieu Fontaine (CFCUL, Portugal) & Cristina Bares (University of Sevilla, Spain)
Disunity of Inferences in a Unified Framework
Logical pluralism is the thesis according to which different logics can be used. When dealing with argumentative interaction, it can be useful to have a unified framework for logical pluralism. Dialogical logic, in which the proof process is conceived in terms of a game between the proponent of a thesis and an opponent who challenges that thesis, constitutes such a pluralist framework. Up to now, dialogical logic has mainly been designed for deductive reasoning. Our aim is to consider dialogical logic as a tool to unify in one framework not only deductive logics but also different kinds of inference, deduction as well as abduction. One fundamental feature of abduction is that new information is introduced in the course of the dialogue. How can this information be introduced? How can it be challenged? How can it be defended? By means of rule definitions and worked examples, we will begin by explaining how dialogical logic can be used as a unified framework. Then we will explain the specificity of abduction in relation to Gabbay & Woods’s model, which emphasizes its pragmatic dimension. Finally, following Bares & Fontaine, we will define abduction in a dialogical framework in terms of a concession-problem.
BARES GOMEZ, C. & FONTAINE, M. 2017. “Argumentation and Abduction in Dialogical Logic”. In Magnani & Bertolotti (eds) Springer Handbook of Model-Based Science. Springer. Dordrecht: 295-314.
GABBAY, D. & WOODS, J. 2005. The Reach of Abduction – Insight and Trial. Elsevier. Amsterdam.
Nathalie Gontier (CFCUL, Portugal)
Pattern Similarity and Explanatory Pluralism in Biological, Sociocultural and Linguistic Sciences
Biological, linguistic and sociocultural sciences have a long tradition of borrowing theoretical frameworks and modelling techniques from one another. In the 19th century, for example, there was the Darwin-Haeckel-Schleicher connection, where parallels were drawn between language and species diversification by drawing phylogenetic trees for both and by understanding language evolution as a Darwinian process. In the 20th century, fields such as sociobiology, memetics and neural Darwinism took flight by searching for analogues of genes, which they found in memes or replicators that subsequently formed the basis of computational modelling. Currently, and in association with big-data mining, macroevolutionary phylogenetics is demonstrating that patterns of drift, reticulation and punctuated equilibria characterize trees and networks of sociocultural, linguistic and biological evolution. Such parallels are often taken to indicate the possibility of unification, but in practice the different patterns are often explained differently, indicating explanatory pluralism.
Gontier, N 2016 Time: The biggest pattern in natural history research. Evolutionary Biology 43: 604-637. (Special issue on converging patterns in the biological and sociocultural sciences.)
Gontier, N. 2018. Pattern similarity in biological, linguistic, and sociocultural evolution. In Cuskley, C., Flaherty, M., Little, H., McCrohon, L., Ravignani, A. & Verhoef, T. (Eds.): The Evolution of Language: Proceedings of the 12th International Conference (EVOLANGXII). (https://doi.org/10.12775/3991-1.035).
Sónia Duarte (CFCUL, Portugal)
Maxwell’s Valve and Thomson’s Demon
Scientific tradition has united, as if they were one and the same, two different thought experiments. In 1867, James Maxwell (1831-1879), while studying gas molecules, posited the violation of the second law of thermodynamics through the intercession of «the intelligence of a very observant and neat-fingered being». William Thomson/Lord Kelvin (1824-1907) elaborated on it and made several additions not present in the original. Afterwards, Maxwell himself warned against confusions between his project and the other’s: «Concerning demons – 1. Who gave them this name? Thomson. 2. What were they by nature? Very small but [Maxwell’s emphasis] lively beings incapable of doing work but able to open and shut valves (…) Call him no more a demon but a valve». Later commentators inconsistently brought into unity what was disunity from the start: Maxwell’s valve and Thomson’s demonic version. For the sake of adequacy, they should be distinguished in accordance with their originators, whose premises and assumptions explicitly differ, even though they worked in the common field of Natural Philosophy (the academic chair both occupied). More than a century later, Deleuze and Guattari described Maxwell’s experiment as a fight of science against scientific opinion, namely probabilistic evaluation. They add that what is in question here is the role of the intercessors, whether scientific or philosophical. This article will advance a comparison of the aforementioned conceptualizations of Maxwell and Thomson, put in relation with Deleuze and Guattari’s characterization of two distinct agents who intercede on divergent planes: the philosophical intercessor (Maxwell) and the scientific observer (Thomson).
Catarina Nabais & Sara Fuentes & Graça Correa & Adalberto Fernandes & David Santos & João Cao (CFCUL/SAP Lab, Portugal)
Science and Art Connections: Case Studies