6th FPMW (2014)

Sixth French Philosophy of Mathematics Workshop (FPMW6)

The sixth French Philosophy of Mathematics Workshop (FPMW6) will be held from Thursday 9 to Saturday 11 October 2014 at the Toulouse Mathematics Institute (Institut de Mathématiques de Toulouse). The workshop, organized by an international team of scholars, will feature both invited and contributed talks.

Venue: Institut de Mathématiques de Toulouse, Amphi Schwartz

Please note: the Thursday afternoon session will be held at the IRIT Auditorium.

Dates: 9-11 October 2014, from Thursday 9 October, 9:30 am, to Saturday 11 October, 12:45 pm

Contact : FPMW6[at]math.univ-toulouse.fr

 

Programme / Schedule 

Thursday 9 October

9:30-11:00 Bertrand Toën (CNRS, I3M), Solving polynomial equations up to homotopy: an introduction to derived algebraic geometry
11:15-12:45 Neil Barton (Univ. of London, Birkbeck College), What is a forcing extension of V?
Lunch
Please note: the Thursday afternoon session will be held at the IRIT Auditorium.
14:15-15:45 Mark Wilson (Univ. of Pittsburgh), The Greediness of Scales
16:00-17:30 Davide Crippa (Univ. Paris Diderot, SPHERE), Reflexive knowledge in mathematics: the case of impossibility results
Dinner

 

Friday 10 October

9:00-11:00 Marco Panza (CNRS, IHPST), What is (are) the challenge(s) of Benacerraf’s Dilemma? A Reinterpretation; Oswaldo Chateaubriand (PUC-Rio/CNPq), Is there really a dilemma?
11:15-12:45 Pascal Bertin (Univ. Paris Diderot, SPHERE), Hausdorff et le Raumproblem
Lunch
14:15-15:45 Marcus Rossberg (Univ. of Connecticut), Inferentialism and conservativeness: on the extension of second-order consequence
16:00-17:30 Dewi Trebaul (Aix-Marseille Univ., CEPERC), What theories of meaning for what formal languages? A comparison between the Fregean and the model-theoretical framework

 

Saturday 11 October

9:30-11:00 Michael Hallett (McGill Univ.), Frege and Hilbert: two opposing views of the nature of mathematics
11:15-12:45 Graham Leach-Krouse (Kansas State Univ.), Generalizing the Boolos-Heck theorem

 

Organizing committee

Brice Halimi (Paris 10) 
Sébastien Maronne (Toulouse)
Damian Rössler (Toulouse)

Scientific committee

Andrew Arana (Illinois) 
Mark van Atten (CNRS, Paris 4) 
Denis Bonnay (Paris 10) 
Paola Cantu (CNRS, Aix-Marseille) 
Gabriella Crocco (Aix-Marseille) 
Michael Detlefsen (Notre Dame) 
Jacques Dubucs (CNRS, Paris 4) 
Henri Galinon (Clermont-Ferrand) 
Sebastien Gandon (Clermont-Ferrand) 
Brice Halimi (Paris 10) 
Gerhard Heinzmann (Nancy) 
Paolo Mancosu (Berkeley) 
Sébastien Maronne (Toulouse) 
Philippe Nabonnand (Nancy) 
Jean-Philippe Narboux (Bordeaux) 
Marco Panza (CNRS, Paris 1) 
Fabrice Pataut (CNRS, Paris 4) 
Dominique Pradelle (Paris 4) 
David Rabouin (CNRS, Paris 7) 
Shahid Rahman (Lille) 
Andrei Rodin (Moscow) 
Ivahn Smadja (Paris 7) 
Stewart Shapiro (Ohio State) 
Jean-Jacques Szczeciniarz (Paris 7) 
Claudine Tiercelin (Collège de France) 
Sean Walsh (Irvine)

 

Résumés / Abstracts

Invited Talks

Oswaldo Chateaubriand (PUC-Rio/CNPq), Is there really a dilemma?

I shall argue that Benacerraf’s arguments in “Mathematical Truth” have a rather tentative character and although he raises some relevant questions, his main hypotheses concerning truth, knowledge and reference are basically flawed. I shall argue, moreover, that the attempts to use Benacerraf’s arguments as a weapon to question the existence of abstract entities are based on simple-minded philosophical prejudices.

 

Michael Hallett (McGill University), Frege and Hilbert: two opposing views of the nature of mathematics

This paper sets out the central differences between the views of Frege and Hilbert on the nature of mathematics. At the heart of the opposition between the two are their differing views on creation and existence. In Frege’s view, the mathematical world is not created by us, but pre-exists, and the purpose of mathematics is to describe this world. For Hilbert, mathematics properly speaking is a collection of axiomatised theories which do not describe a fixed subject matter, and which are largely our creation. The paper points out that Gödel’s views about mathematics really take elements from both Frege’s and Hilbert’s views.

 

Marco Panza (CNRS, IHPST), What is (are) the challenge(s) of Benacerraf’s Dilemma? A Reinterpretation

Despite its enormous influence, Benacerraf’s dilemma admits no standard, unanimously accepted version. My purpose is to return to the discussion of its interpretation and reformulation, with particular attention to Field’s reformulation of the problem, so as to identify two converging and quite basic challenges, respectively addressed by Benacerraf’s dilemma to a platonist and to a combinatorialist (in Benacerraf’s own sense) philosophy of mathematics. What I mean by dubbing these challenges ‘converging’ is both that they share a common kernel, which encompasses a crucial conundrum for any plausible philosophy of mathematics, and that they suggest (at least to me) a way out along similar lines. Roughing these lines out is the purpose of the last part of the talk.

 

Bertrand Toën (CNRS, I3M), Solving polynomial equations up to homotopy: an introduction to derived algebraic geometry

The purpose of this lecture is to introduce the ideas and concepts of derived algebraic geometry. For this, I will start by considering the general problem of intersection theory, namely “bad intersections” (or “non-generic intersections”), and explain through some basic examples how they appear naturally in central questions of algebraic geometry. I will review some approaches that have been introduced in order to deal with these bad intersections (intersection numbers, cohomological intersections, homotopical intersections…). In the second half of the talk, I will explain how the notion of “solution up to homotopy of polynomial equations” leads to the concept of “derived intersections” and more generally to derived schemes and derived stacks, the central objects of derived algebraic geometry.
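A minimal illustrative example, not taken from the abstract, may help fix the idea of a non-generic intersection. Intersecting the parabola $y = x^{2}$ with its tangent line $y = 0$ in the affine plane yields a single point set-theoretically, but the coordinate ring of the scheme-theoretic intersection,

\[
\mathbb{C}[x,y]/(y - x^{2}) \otimes_{\mathbb{C}[x,y]} \mathbb{C}[x,y]/(y) \;\cong\; \mathbb{C}[x]/(x^{2}),
\]

has length 2 and so records the tangency. In the derived setting the tensor product is replaced by the derived tensor product $\otimes^{\mathbf{L}}$, whose higher Tor groups retain information that the naive intersection discards in still more degenerate situations.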

 

Mark Wilson (University of Pittsburgh), The Greediness of Scales

The methodological problem to be discussed is an old one that has considerably influenced metaphysical thinking in the past (especially Leibniz). In modern form, it runs like this. Scientists know a lot about the internal structures of complex materials upon different length scales and have framed very effective models of the key behaviors witnessed. But these treatments all employ differential equations, which inherently operate upon an infinitesimal size scale, despite the fact that their targeted behaviors only emerge upon much longer characteristic lengths. Such modeling policies engender descriptive inconsistencies between the different treatments that prevent them from working together in a mutually beneficial way (upon a computer, say). Unfortunately, these clashes can’t be easily rectified without spoiling the utility of the models altogether. Recent advances in “multi-scalar methods” have uncovered policies that evade these syntactic inconsistencies by persuading the different models to “talk to one another” in strikingly novel ways. These innovations raise important philosophical questions about “truth value” in physical theory and how directly such accounts relate to the world they describe. As such, philosophy of science is returned to the basic concerns that Leibniz weighed in his writings on the “labyrinth of the continuum.” In consequence, modern metaphysicians should recognize that “determining the ontology of a theory” may not follow the simple contours suggested by Quine in “On What There Is” and may require a deeper engagement with the actual details of effective applied mathematics.

 

Contributed talks

Neil Barton (University of London, Birkbeck College), What is a forcing extension of V?

Recent debates in the philosophy of set theory often focus on how many universes of sets there are. Absolutists hold that there is just one maximal, definite universe of sets, while Multiversists hold that there are many universes of sets. Often, the practice of forcing over V is regarded as evidence against the Absolutist position. In this paper I clarify this debate by examining Absolutist interpretations of forcing. My strategy is as follows:

Section 1 outlines the main problem for the Absolutist. Section 2 then assesses and provides further criticism of the claim that the Absolutist can simply interpret forcing through the use of countable transitive models or Boolean-valued models. It is argued that these well-known interpretations of forcing are not satisfactory on the Absolutist’s position. Section 3 presents a recent interpretation of forcing (the Boolean ultrapower) and argues that it is satisfactory on the Absolutist’s view. It is argued, however, that this highlights a conflict of intuitions between the Absolutist and the Multiversist, namely in the attitudes of each to the role of simulation in mathematics.

It is concluded that while the Absolutist appears to have an interpretation of forcing over V, nonetheless philosophical research should focus on the ontological implications of simulating one kind of mathematical object with another.
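For orientation, the difficulty for the Absolutist can be put in one standard observation, which is not quoted from the abstract: if $\mathbb{P} \in V$ is an atomless forcing notion, then no filter $G \subseteq \mathbb{P}$ that is generic over $V$ can itself belong to $V$,

\[
G \text{ is } V\text{-generic} \;\Longrightarrow\; G \notin V,
\]

so if $V$ already contains all sets, it is unclear where the extension $V[G]$ is supposed to live.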

 

Pascal Bertin (Université Paris Diderot, SPHERE), Hausdorff et le Raumproblem

In this talk, we propose to study the development of Hausdorff’s Raumproblem, from the philosopher-mathematician’s first “Nietzscheo-Kantian” works to the relative completion of the Grundzüge der Mengenlehre in 1914.

We will focus more specifically on the crossing point between philosophical and mathematical investigations, which seems to lie in the gradual deepening of the method of “transcendental variations” developed, from the earliest texts, against realism (the latter resting on a confusion between the “world in itself” and the “phenomenal world”). This method is a kind of reductio ad absurdum which, in the case of the Raumproblem, is intended to relativize the primacy accorded to three-dimensional Euclidean space, and which consists in introducing empirically indiscriminable changes. The transition from philosophy to mathematics hangs precisely on these elements of variation, which, after allowing the “transcendent” outlook to be questioned, become objects of investigation for the description of empirical space.

The issue is then, as often with Hausdorff, a “reliable estimate” of our “leeway”, no longer considered in light of our transcendent powerlessness, but according to our descriptive possibilities. In such a framework, the investigations of continuity, as well as the fundamental concept of “topological space” that Hausdorff would later develop, make sense.

 

Davide Crippa (Université Paris Diderot, SPHERE), Reflexive knowledge in mathematics: the case of impossibility results

Proofs that certain geometric problems (like the duplication of the cube, the trisection of the angle or the quadrature of the circle) cannot be solved by ruler and compass date back to the mid-nineteenth century or the beginning of the twentieth century.

However, claims of the impossibility of solving such problems by given sets of means, and attempts to prove these claims, can occasionally be encountered earlier, especially from the seventeenth century onwards. How were such primitive impossibility arguments structured, and what was the role of impossibility claims within the mathematical practice in which they appeared?

I shall consider one broad case study: the attempts to prove the impossibility of squaring the central conic sections (the circle, the ellipse or the hyperbola), undertaken by J. Gregory and G.W. Leibniz.

I shall claim that they constituted salient examples of “reflexive knowledge”, namely thinking about mathematics carried out by means of mathematical resources.

 

Graham Leach-Krouse (Kansas State University), Generalizing the Boolos-Heck theorem

One deep result of Frege’s Grundgesetze is this: there exists an interpretation (a theorem-preserving translation) of second-order Peano Arithmetic PA into the theory HP, consisting of nothing but the axioms of second-order logic and Hume’s Principle. In the appendix to “Die Grundlagen der Arithmetik §§82–83”, Boolos and Heck offer a proof of the converse result: there exists an interpretation of HP in second-order PA. By combining these two results, it is possible to show that HP and PA are equivalent theories in an important sense. In this paper I apply structural abstraction principles (abstraction principles associating abstract first-order representatives to third-order isomorphism types of models) to show how similar results can be achieved for many more theories than is commonly supposed.
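For reference, and in a standard formulation rather than one quoted from the abstract, Hume’s Principle is the second-order abstraction principle

\[
\#F = \#G \;\longleftrightarrow\; F \approx G,
\]

where $F \approx G$ abbreviates the second-order definable claim that some relation correlates the Fs one-to-one with the Gs; Frege’s theorem, the first result recalled above, is that second-order logic together with this single principle interprets second-order Peano Arithmetic.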

 

Marcus Rossberg (University of Connecticut), Inferentialism and conservativeness: on the extension of second-order consequence

A difficulty for inferentialism is presented. It is shown that third-order logic is not conservative over second-order logic: there are sentences formulated in pure second-order logic that are theorems of third-order logic, but cannot be proven in pure second-order logic. This new incompleteness challenge is formulated proof-theoretically rather than by appeal to model-theoretic semantics. The impossibility of demonstrating the truth of such second-order sentences using the inference rules of second-order logic alone seems to refute the inferentialist’s claim that the meaning of the quantifiers is determined by their introduction and elimination rules: such sentences, being truths of third-order logic, should be true in virtue of the meaning of the logical vocabulary. I argue that the inferentialist can answer this challenge. In the course of the argument, previously neglected features of proof-theoretic and Henkin consequence will be elucidated.
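In the standard proof-theoretic sense at issue here, given in a textbook formulation rather than quoted from the abstract, a logic or theory $T'$ in an expanded language is conservative over $T$ when, for every sentence $\varphi$ of the language of $T$,

\[
T' \vdash \varphi \;\Longrightarrow\; T \vdash \varphi,
\]

and the claimed failure is witnessed by pure second-order sentences that are derivable with the third-order rules but not from the second-order rules alone.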

 

Dewi Trebaul (Aix-Marseille Université, CEPERC), What theories of meaning for what formal languages? A comparison between the Fregean and the model-theoretical framework

The question we would like to address in our talk is the following: can admitting a type of expression into a formal language invalidate the theory of meaning for that language? The particular case of non-logical constants will be our starting point. Demopoulos argued in his article “Frege, Hilbert and the conceptual structure of model theory” (1994) that the rigid bond between sense and reference in Frege’s theory of meaning makes it impossible to account for the use of such symbols, namely their need to be interpreted, which provides greater generality in the exposition of formal theories. Since they can be interpreted in various ways, they do not allow a specified sense to determine a unique reference. Such expressions seem better suited to the first-order languages of model theory.

We would like to defend the idea that what makes it impossible for Frege to admit such symbols is not his theory of meaning, but rather his view of the relationship between language and theory. The foundational character of the language dispenses with the need for truth to be relativized to a structure. A formal language is not to be conceived as a tool indifferent to the expression of one theory or another, but rather as a framework that has to be filled with a specific content.