PSA 1998 Abstracts

Abstracts of Contributed Papers




"Do We See Through a Social Microscope?: Credibility as Vicarious Selector"
Douglas Allchin, University of Texas at El Paso

Abstract:
Credibility in a scientific community (sensu Shapin) is a vicarious selector (sensu Campbell) for the reliability of reports by individual scientists or institutions. Similarly, images from a microscope (sensu Hacking) are vicarious selectors for studying specimens. Working at different levels, the process of indirect reasoning and checking indicates a unity to experimentalist and sociological perspectives, along with a resonance of strategies for assessing reliability.


"The Dogma of Isomorphism: A Case Study From Speech Perception"
Irene Appelbaum, University of Montana

Abstract:
It is a fundamental tenet of philosophy of the "special sciences" that an entity may be analyzed at multiple levels of organization. As a corollary, it is often assumed that the levels into which a system may be theoretically analyzed map straightforwardly onto real stages of processing. I criticize this assumption in a case study from the domain of speech science. I argue (i) that the dominant research framework in speech perception embodies the assumption that units of processing mirror units of conceptual structure, and (ii) that this assumption functions not as a falsifiable hypothesis, but as an entrenched dogma.


"The Curve Fitting Problem"
Prasanta S. Bandyopadhyay and Robert J. Boik, Montana State University

Abstract:
In the curve fitting problem, two conflicting desiderata, simplicity and goodness-of-fit, pull in opposite directions. To solve this problem, two proposals are discussed: the first based on a Bayes' theorem criterion (BTC) and the second, advocated by Forster and Sober, based on Akaike's Information Criterion (AIC). We show that AIC, which is frequentist in spirit, is logically equivalent to BTC, provided that a suitable choice of priors is made. We evaluate the charges against Bayesianism and contend that the AIC approach has shortcomings. We also discuss the relationship between Schwarz's Bayesian Information Criterion and BTC.
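
For reference, the two model-selection criteria compared here take the standard forms (stated for orientation; the abstract itself does not display them), where \(\hat{L}\) is a model's maximized likelihood, \(k\) its number of adjustable parameters, and \(n\) the sample size:

\[ \mathrm{AIC} = -2\ln\hat{L} + 2k, \qquad \mathrm{BIC} = -2\ln\hat{L} + k\ln n. \]

Each score rewards goodness-of-fit through the log-likelihood term and penalizes complexity through the term in \(k\), which is how the two desiderata mentioned above are traded off.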


"Bell's Theorem, Non-Separability and Space-Time Individuation in Quantum Mechanics"
Darrin W. Belousek, University of Notre Dame

Abstract:
We first examine Howard's analysis of the Bell factorizability condition in terms of "separability" and "locality" and then consider his claims that the violations of Bell's inequality by the statistical predictions of quantum mechanics should be interpreted in terms of "non-separability" rather than "non-locality," and that "non-separability" implies the failure of space-time as a principle of individuation for quantum-mechanical systems. I find his arguments for both claims to be lacking.


"Why Physical Symmetries?"
Elena Castellani, University of Florence

Abstract:
This paper is concerned with the meaning of "physical symmetries," i.e. the symmetries of the so-called laws of nature. The importance of such symmetries in contemporary science raises the question of understanding their very nature. After a brief review of the relevant physical symmetries, I first single out and discuss their most significant functions in today's physics. Then I explore the possible answers to the interpretation problem raised by the role of symmetries, and I argue that investigating the real nature of physical symmetries implies, in some sense, discussing the meaning and methods of physics itself.


"Problems with the Deductivist Image of Scientific Reasoning"
Philip E. Catton, University of Canterbury

Abstract:
There seem to be some very good reasons for a philosopher of science to be a deductivist about scientific reasoning. Deductivism is apparently connected with a demand for clarity and definiteness in the reconstruction of scientists' reasonings. And some philosophers even think that deductivism is the way round the problem of induction. But the deductivist image is challenged by cases of actual scientific reasoning, in which hard-to-state and thus discursively ill-defined elements of thought nonetheless significantly condition what practitioners accept as cogent argument. And arguably, these problem cases abound. For example, even geometry -- for most of its history -- was such a problem case, despite its exactness and rigor. It took a tremendous effort on the part of Hilbert and others to make geometry fit the deductivist image. Looking to the empirical sciences, the problems seem worse. Even the most exact and rigorous of empirical sciences -- mechanics -- is still the kind of problem case which geometry once was. In order for the deductivist image to fit mechanics, Hilbert's sixth problem (for mechanics) would need to be solved. This is a difficult, and perhaps ultimately impossible task, in which the success so far achieved is very limited. I shall explore some consequences of this for realism as well as for deductivism. Through discussing links between non-monotonicity, skills, meaning, globality in cognition, models, scientific understanding, and the ideal of rational unification, I argue that deductivists can defend their image of scientific reasoning only by trivializing it, and that for the adequate illumination of science, insights from anti-deductivism are needed as much as those which come from deductivism.


"Are GRW Tails as Bad as They Say?"
Alberto Cordero, Graduate Center and Queens College, CUNY

Abstract:
GRW models of the physical world are severely criticized in the literature for involving wave function 'tails', the allegation being that the latter create fatal interpretive problems and even compromise standard arithmetic. I find such objections both unfair and misguided. But not all is well with the GRW approach. The complaint I articulate in this paper does not have to do with tails as such but with the specific way in which past physical structures linger forever in the total GRW wave function. I argue that this feature, which is an artifact of a particular and ultimately optional genre of collapse mechanisms, yields a total picture that is too close to the "Many Worlds" type to deserve clear methodological acceptability, particularly in light of the effective empirical equivalence between the two objectivist approaches.


"Why Bayesian Psychology is Incomplete"
Frank Döring, University of Cincinnati

Abstract:
Bayesian psychology, in what is perhaps its most familiar version, is incomplete: Jeffrey conditionalization is sensitive to the order in which the evidence arrives. This order effect can be so pronounced as to call for a belief adjustment that cannot be understood as an assimilation of incoming evidence by Jeffrey's rule. Hartry Field's reparameterization of Jeffrey's rule avoids the order effect but fails as an account of how new evidence should be assimilated.
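
For readers unfamiliar with the rule at issue: given a partition \(\{E_i\}\) whose members receive new probabilities \(q_i\) as a result of experience, Jeffrey conditionalization sets

\[ P_{\mathrm{new}}(A) = \sum_i q_i \, P_{\mathrm{old}}(A \mid E_i). \]

The order effect described above arises because successive applications of this rule to two uncertain inputs need not commute: updating on one partition and then on another can leave a different final credence function than updating in the reverse order.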


"The Conserved Quantity Theory of Causation and Chance Raising"
Phil Dowe, University of Tasmania

Abstract:
In this paper I consider a problem for the Conserved Quantity Theory of causation, in its most recent version (Salmon 1997). The problem concerns cases where an event, which tends to prevent another, fails to do so on a particular occasion, and where the two are linked by causal processes. I call this the case of "connected non-causes." Chance-raising theories of causation, on the other hand, do account easily for this problem, but face the problem of chance-lowering causes. I show how the two approaches can be combined to solve both problems.


"Laudan's Naturalistic Axiology"
Karyn Freedman, University of Toronto

Abstract:
Doppelt (1986, 1990), Siegel (1990) and Rosenberg (1996) argue that the pivotal feature of Laudan's normative naturalism, namely his axiology, lacks a naturalistic foundation. In this paper I show that this objection is ill founded, and turns on an ambiguity in the notion of 'naturalism.' Specifically, I argue that there are two important senses of naturalism running through Laudan's work. Once these two strands are made explicit, the objection raised by Doppelt et al. simply evaporates.


"Is Pure R-Selection Really Selection?"
Bruce Glymour, Kansas State University

Abstract:
Lennox and Wilson (1994) critique modern accounts of selection on the grounds that such accounts will class evolutionary events as cases of selection whether or not the environment checks population growth. Lennox and Wilson claim that pure r-selection involves no environmental checks, and that accounts of natural selection ought to distinguish between the two sorts of cases. I argue that Lennox and Wilson are mistaken in claiming that pure r-selection involves no environmental checks, but suggest that two related cases support their substantive complaint, namely that modern accounts of selection have resources insufficient for making important distinctions in causal structure.


"Explanatory Pluralism in Paleobiology"
Todd A. Grantham, College of Charleston

Abstract:
This paper is a defense of "explanatory pluralism" (i.e., the view that some events can be correctly explained in two distinct ways). To defend pluralism, I argue that a certain class of macroevolutionary trends (what I call "asymmetrical passive trends") can be explained in two distinct but compatible ways. The first approach ("actual sequence explanation") is to trace out the particular forces that affect each species. The second approach treats the trend as "passive" or "random" diffusion from a boundary in morphological space. I argue that while these strategies are distinct, both kinds of explanation can be true of a single trend. Further, since neither strategy can be reduced or eliminated from paleobiology, we should accept that both strategies can provide correct explanations for a single trend.


"Theories as Complexes of Representational Media"
Robin F. Hendry, University of Durham, and Stathis Psillos, The London School of Economics

Abstract:
In this paper, we review two standard analyses of scientific theories in terms of linguistic and nonlinguistic structures respectively. We show that by focusing exclusively on either linguistic or extra-linguistic representational media, both of the standard views fail to capture correctly the complex nature of scientific theories. We argue primarily against strong versions of the two approaches and suggest that the virtues of their weaker versions can be brought together under our own interactions approach. As historical individuals, theories are complex consortia of different representational media: words, equations, diagrams, analogies and models of different kinds. To replace this complexity with a monistic account is to ignore the representational diversity that characterizes even the most abstract physical theory.


"Helmholtz's Naturalized Conception of Geometry and his Spatial Theory of Signs"
David J. Hyder, Max-Planck-Institut für Wissenschaftsgeschichte

Abstract:
I analyze the two main doctrines of Helmholtz's "The Facts in Perception," in which he argued that the axioms of Euclidean geometry are not, as his neo-Kantian opponents had argued, binding on any experience of the external world. This required two argumentative steps: (1) a new account of the structure of our representations which was consistent both with the experience of our (for him) Euclidean world and with experience of a non-Euclidean one, and (2) a demonstration of how geometric and mathematical propositions derived not from Kantian intuition, but from the contingent natures of the representations themselves. I show how (1) and (2) together provided a naturalized "physiological epistemology," a view that was adopted in many of its essential features by Einstein, Wittgenstein and the Vienna Circle.


"Use-Novelty, Gellerization, and Severe Tests"
Tetsuji Iseda, University of Maryland, College Park

Abstract:
This paper analyzes Deborah Mayo's recent criticism of the use-novelty requirement. She claims that her severity criterion captures actual scientific practice better than use-novelty, and that use-novelty is not a necessary condition for severity. Even though she is right in that there are certain cases in which evidence used for the construction of the hypothesis can test the hypothesis severely, I do not think that her severity criterion fits better with our intuition about good tests than use-novelty. I argue for this by showing a parallelism in terms of severity between the confidence interval case and what she calls "gellerization." To account for the difference between these cases, we need to take into account certain additional considerations like a systematic neglect of relevant alternatives.


"A Note on Nonlocality, Causation and Lorentz-Invariance"
Federico Laudisa, University of Florence

Abstract:
The status of a causal approach to EPR-Bell nonlocal correlations in terms of a counterfactual theory of causation is reviewed. The need to take into due account the spacetime structure of the events involved is emphasized. Furthermore, it is argued that adopting this approach entails the assumption of a privileged frame of reference, an assumption that seems even more in need of justification than the causal theory itself.


"Explaining the Emergence of Cooperative Phenomena"
Chuang Liu, University of Florida

Abstract:
Phase transitions, such as spontaneous magnetization in ferromagnetism, are the most fundamental cooperative phenomena, in which long-range orders emerge in a system under special conditions. Unlike collective phenomena, which are mere collections of processes (or actions) mostly accountable by statistical averages of the micro-constituents' properties, they are results of genuine cooperation in which unique properties emerge for the whole system that are not such simple averages. In this paper I investigate what the problem of phase transitions is, how it is solved by a mathematical maneuver, i.e. taking the thermodynamic limit, and whether the solution is sound and rigorous as claimed.


"Van Fraassen and Ruetsche on Preparation and Measurement"
Bradley Monton, Princeton University

Abstract:
Ruetsche (1996) has argued that van Fraassen's (1991) Copenhagen Variant of the Modal Interpretation (CVMI) gives unsatisfactory accounts of measurement and of state preparation. I defend the CVMI against Ruetsche's first argument by using decoherence to show that the CVMI does not need to account for the measurement scenario which Ruetsche poses. I then show, however, that there is a problem concerning preparation, and the problem is more serious than the one Ruetsche focuses on. The CVMI makes no substantive predictions for the everyday processes we take to be measurements.


"Applying Pure Mathematics"
Anthony Peressini, Marquette University

Abstract:
Much of the current thought concerning mathematical ontology and epistemology follows Quine and Putnam in looking to the indispensable application of mathematics in science. In particular, the Quine/Putnam indispensability approach is the inevitable staging point for virtually all contemporary discussions of mathematical ontology. Just recently serious challenges to the indispensability approach have begun appearing. At the heart of this debate is the notion of an indispensable application of (pure) mathematics in scientific theory. To date the discussion has focused on indispensability, while little has been said about the process of application itself. In this paper I focus on the process of applying (pure) mathematical theory in physical theory.


"Functional and Intentional Action Explanations"
Mark Risjord, Emory University

Abstract:
Functional explanation in the social sciences is the focal point for conflict between individualistic and social modes of explanation. While the agent may have thought she had reasons for action, the functional explanation reveals the hidden strings of the puppet master. This essay argues that the conflict is merely apparent. The erotetic model of explanation is used to analyze the forms of intentional action and functional explanations. One result is that there are two kinds of functional explanation, and only one is appropriate in the social sciences. While a functional explanation may have the same topic as an intentional action explanation, they are compatible.


"What Should a Normative Theory of Values in Science Accomplish?"
Kristina Rolin, University of Helsinki

Abstract:
This paper compares two different views about the role of values in scientific judgment, one proposed by Ernan McMullin and the other by Helen Longino. I argue first that McMullin's distinction between epistemic and non-epistemic values is based on a problematic understanding of the goals of science. Second, I argue that Longino offers a more adequate understanding of the goals of science but her concept of constitutive value is not sufficiently normative. Third, I conclude that a normative theory of values in science should address the normative concerns central to McMullin's work while integrating the more complex understanding of the goals of science emerging from Longino's work.


"Changing the Subject: Redei on Causal Dependenceand Screening Off in Relativistic Quantum Field Theory"
Laura Ruetsche and Rob Clifton, University of Pittsburgh

Abstract:
In a recent pair of articles (Redei 1996, 1997), Miklos Redei has taken enormous strides toward characterizing the conditions under which relativistic quantum field theory is a safe setting for the deployment of causal talk. Here, we challenge the adequacy of the accounts of causal dependence and screening off on which rests the relevance of Redei's theorems to the question of causal good behavior in the theory.


"Visual Prototypes"
Pauline Sargent, University of California, San Diego

Abstract:
In this paper I introduce the concept of "visual prototype" to capture one particular kind of work done by visual representation in the practice of science. I use an example from neuroscience; more particularly, the example of the visual representation of the pattern of convolutions (the gyri and sulci) of the human cortex as this pattern is constructed and used in the practice of mapping the functional human brain. I argue that this work, which is visual representation as distinct from linguistic representation, is an essential piece of the human brain mapping project.


"Selection and the Extent of Explanatory Unification"
Rob Skipper, University of Maryland at College Park

Abstract:
According to Philip Kitcher, scientific unification is achieved via the derivation of numerous scientific statements from economies of argument schemata. I demonstrate that the unification of selection phenomena across the domains in which it is claimed to occur, evolutionary biology, immunology and, speculatively, neurobiology, is unattainable on Kitcher's view. I then introduce an alternative method for rendering the desired unification based on the concept of a mechanism schema which can be integrated with Wesley Salmon's causal-mechanical model of explanation. I conclude that the gain in unification provided by the alternative account suggests that Kitcher's view is defective.


"Rhetoric, Narrative and Argument in Bruno Latour's Science in Action"
David G. Stern, University of Iowa

Abstract:
Why does Latour's Science in Action, which approaches science rhetorically, highlighting the persuasive and political work that must be done to establish a scientific or technological fact, not examine its own rhetoric? Latour acknowledges he is making use of the persuasive techniques he attributes to science, but to submit his own techniques to the demythologizing scrutiny he directs at philosophy of science would undercut his primary goal of recruiting readers to his colors. The paper scrutinizes three central figures in Science in Action: science as war, as network, and as Janus-faced, and examines its disciplinary and biographical context.


"The Failure of Equivariance for Real Ensembles of Bohmian Systems"
Jitendra Subramanyam, University of Maryland

Abstract:
The continuity equation has long been thought to secure the equivariance of ensembles of Bohmian systems. The statistical postulate then secures the empirical adequacy of Bohm's theory. This is true for ideal ensembles, those represented as continuous fluids in configuration space. But ensembles that confirm the experimental predictions of quantum mechanics are not ideal. For these non-ideal ensembles neither equivariance nor some approximation to equivariance follows from the axioms of Bohm's theory. Approximate equivariance (and hence the empirical adequacy of Bohm's theory) requires adding to Bohm's axioms or giving a detailed account of the behavior of Bohmian ensembles.
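
In the standard notation (supplied here for orientation, for the single-particle case; the many-particle case is analogous), the continuity equation and the equivariance property at issue read

\[ \frac{\partial |\psi|^2}{\partial t} + \nabla \cdot \left( |\psi|^2 \, v^{\psi} \right) = 0, \qquad v^{\psi} = \frac{\hbar}{m} \, \mathrm{Im}\, \frac{\nabla \psi}{\psi}, \]

so that an ensemble distributed as \(\rho_{t_0} = |\psi_{t_0}|^2\) at one time remains distributed as \(\rho_t = |\psi_t|^2\) at all later times. The argument above concerns whether this guarantee, stated for continuous distributions, transfers to the finite, non-ideal ensembles that actually confirm quantum predictions.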


"Reconsidering the Concept of Equilibrium in ClassicalStatistical Mechanics"
Janneke van Lith-van Dis, Utrecht University

Abstract:
In the usual procedure of deriving equilibrium thermodynamics from classical statistical mechanics, Gibbsian fine-grained entropy is taken as the analog of thermodynamical entropy. However, it is well known that the fine-grained entropy remains constant under the Hamiltonian flow. In this paper it is argued that we needn't search for alternatives for fine-grained entropy, nor do we have to give up Hamiltonian dynamics, in order to solve the problem of the constancy of fine-grained entropy and, more generally, account for the non-equilibrium part of the laws of thermodynamics. Rather, we have to weaken the requirement that equilibrium be identified with a stationary probability distribution.
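
The quantity whose constancy generates the problem is the Gibbs fine-grained entropy of the phase-space distribution \(\rho\),

\[ S_G[\rho] = -k \int \rho \ln \rho \, d\Gamma, \]

which Liouville's theorem shows to be invariant under the Hamiltonian flow and hence incapable of increasing as a system relaxes to equilibrium.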


"Who's Afraid of Undermining? Why the PrincipalPrinciple Need Not Contradict Humean Supervenience"
Peter B. Vranas, University of Michigan

Abstract:
The Principal Principle (PP) says that, for any proposition A, given any admissible evidence and the proposition that the chance of A is x%, one's conditional credence in A should be x%. Humean Supervenience (HS) claims that, among possible worlds like ours, no two differ without differing in the spacetime-point-by-spacetime-point arrangement of local properties. David Lewis (1986b, 1994) has argued that PP contradicts HS, and his argument has been accepted by Bigelow, Collins, and Pargetter (1993), Thau (1994), Hall (1994), Strevens (1995), Ismael (1996), and Hoefer (1997). Against this consensus, I argue that PP need not contradict HS.
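
In symbols (a standard rendering, stated here for orientation): where \(Cr\) is a reasonable initial credence function, \(E\) is any admissible evidence, and \(X\) is the proposition that the chance of \(A\) is \(x\), PP requires

\[ Cr(A \mid X \wedge E) = x. \]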


"Reasoning With the 'Rosetta Stones' of Biology:Experimental Systems and Doable Research"
Kevin Lattery, University of Minnesota

Abstract:
I compare the Rosetta Stone's function of making hieroglyphics research "doable" with the use of experimental systems. Although biologists routinely recognize the importance of these systems, philosophers have not. One reason for this is that philosophers typically conceive experimental tools as mere instruments for gathering evidence. I show why we need an account of experimental tools that conceives them beyond the context of experiments and evidential reasoning. More specifically, I describe an investigative reasoning with experimental systems to establish and extend "doable research" and argue that this investigative reasoning is central to justifying and structuring our knowledge in experimental biology.


"The Limited World of Science: A Tractarian Account of Objective Knowledge"
Alfred Nordmann, University of South Carolina

Abstract:
According to Wittgenstein's Tractatus, scientists determine the structure of the world by identifying the true propositions which describe it. This determination is possible because i) elementary propositions are contingent, i.e., their truth depends on nothing but agreement with what is the case, ii) completely general propositions delimit the degree of freedom which the totality of elementary propositions leaves to the structure of the world. Once such completely general propositions are adopted (e.g., the principle of conservation of weight), agreement among scientists can reduce to agreement among representations. On this account, Lavoisier's chemistry better promotes objective knowledge than Priestley's.


"The Likelihood Principle and the Reliability of Evidence"
Andrew Backe, University of Pittsburgh

Abstract:
The leading philosophical account of statistical inference relies on a principle which implies that only the observed outcome from an experiment matters to an inference. This principle is the likelihood principle, and it entails that information about the experimental arrangement itself is of no inferential value. In the present paper, I argue that adherence to the likelihood principle results in an account of inference that is insensitive to the capacity of observed evidence to distinguish a genuine phenomenon from a chance effect. This capacity, which is referred to as the reliability of the evidence, is dependent on the error properties of the experimental arrangement itself. Error properties incorporate outcomes not actually observed and, thus, cannot always be captured by likelihood functions. If my analysis is correct, then the central principle underlying the leading philosophical theory of inference should be rejected.
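
A standard statement of the principle under discussion: if two observed outcomes \(x\) and \(y\), possibly from different experimental arrangements, have likelihood functions that are proportional as functions of the parameter \(\theta\),

\[ P(x \mid \theta) = c \cdot P(y \mid \theta) \quad \text{for all } \theta \ \text{(with } c > 0 \text{ not depending on } \theta\text{)}, \]

then they support exactly the same inferences about \(\theta\). Error probabilities depend on outcomes that were not observed, so they cannot in general be recovered from the likelihood function alone, which is the feature exploited in the argument above.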


"Objects or Events? Towards an Ontology for QuantumField Theory"
Andreas Bartels, Universität Gesamthochschule Paderborn

Abstract:
Recently P. Teller and S. Auyang have suggested competing ersatz-ontologies which could account for the "loss of classical particles" in Quantum Field Theory (QFT): Field quanta vs. Field events. However, both ontologies suffer from serious defects. While quanta lack numerical identity, spatiotemporal localizability, and independence from basic representations, events, if understood as concrete measurement events, are related to the theory only statistically. I propose an alternative solution: The entities of QFT are events of the type "Quantum system S is in quantum state Y." The latter are not point events, but Davidsonian events, i.e. they can be identified by their location inside the causal net of the world.


"No One Knows the Date of the Hour: An UnorthodoxApplication of Rev. Bayes' Theorem"
Paul Bartha, University of British Columbia, and Christopher Hitchcock, Rice University

Abstract:
Carter and Leslie (1996) have argued, using Bayes' theorem, that our being alive now supports the hypothesis of an early "Doomsday." Unlike some critics (Eckhardt 1997), we accept their argument in part: given that we exist, our existence now indeed favors "Doom sooner" over "Doom later." The very fact of our existence, however, favors "Doom later." In simple cases, a hypothetical approach to the problem of "old evidence" shows that these two effects cancel out: our existence now yields no information about the coming of Doom. More complex cases suggest a move from countably additive to non-standard probability measures.
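
A schematic version of the Bayesian step (the uniform self-sampling assumption and the notation are supplied here for illustration, not quoted from the paper): let \(D_s\) and \(D_l\) be "Doom sooner" and "Doom later," with total human populations \(N_s < N_l\), and let \(R\) be the evidence that one's birth rank is some particular \(r \le N_s\). Treating oneself as a random sample from the total population gives

\[ \frac{P(D_s \mid R)}{P(D_l \mid R)} = \frac{1/N_s}{1/N_l} \cdot \frac{P(D_s)}{P(D_l)} = \frac{N_l}{N_s} \cdot \frac{P(D_s)}{P(D_l)}, \]

so the likelihood ratio \(N_l/N_s > 1\) shifts credence toward earlier Doom; the claim above is that conditioning on the fact that one exists at all produces a countervailing shift.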


"Can Experiments Help Us Choose Between the Bohm and Copenhagen Interpretations of Quantum Mechanics?"
Lon Becker, University of Illinois at Chicago

Abstract:
In this paper I will look at whether experiments can help us decide between the Bohm and Copenhagen interpretations of quantum mechanics. I will look at experiments which assume that the two interpretations are empirically indistinguishable but highlight features that might lead us to favor one interpretation over the other. I will argue that such experiments are interesting but ultimately not convincing. I will then sketch how it might be possible to create an experiment which could distinguish between the two interpretations by focusing on the presence or absence of collapse.


"Empiricism, Conservativeness and Quasi-Truth"
Otavio Bueno, University of Leeds

Abstract:
A first step is taken towards articulating a constructive empiricist philosophy of mathematics, thus extending van Fraassen's account to this domain. In order to do so, I adapt Field's nominalisation programme, making it compatible with an empiricist stance. Two changes are introduced: (a) Instead of taking conservativeness as the norm of mathematics, the empiricist countenances the weaker notion of quasi-truth (as formulated by da Costa and French), from which the formal properties of conservativeness are derived. (b) Instead of quantifying over space-time regions, the empiricist only admits quantification over occupied regions, since this is enough for his or her needs.


"Organization, Evolution and Cognition: Beyond Campbell's Evolutionary Epistemology"
Wayne Christensen and Clifford Hooker, University of Newcastle

Abstract:
Donald Campbell has long advocated a naturalist epistemology based on a general selection theory, with the scope of knowledge restricted to vicarious adaptive processes. But being a vicariant is problematic because it involves an unexplained epistemic relation. We argue that this relation is to be explicated organizationally in terms of the regulation of behavior and internal state by the vicariant, but that Campbell's selectionist account can give no satisfactory account of it because it is opaque to organization. We show how organizational constraints and capacities are crucial to understanding both evolution and cognition and conclude with a proposal for an enriched, generalized model of evolutionary epistemology that places high-order regulatory organization at the center.


"The Analysis of Singular Spacetimes"
Erik Curiel, University of Chicago

Abstract:
Much controversy surrounds the question of what ought to be the proper definition of 'singularity' in general relativity, and the question of whether the prediction of such entities leads to a crisis for the theory. I argue that a definition in terms of curve incompleteness is adequate, and in particular that the idea that singularities correspond to 'missing points' has insurmountable problems. I conclude that singularities per se pose no serious problems for the theory, but their analysis does bring into focus several problems of interpretation at the foundation of the theory often ignored in the philosophical literature.


"The Light at the End of the Tunneling: Observations and Underdetermination"
Michael Dickson, Indiana University

Abstract:
Some version(s) of the following two theses are commonly accepted. (1) Observations are somehow "theory-laden." (2) There are observationally equivalent (but otherwise distinct) theories. How can both be true? What is meant by "observational equivalence" if there are no "observations" that could be shared by theories? This paper is an attempt to untangle these issues by saying what "theory-ladenness" and "observational equivalence" amount to, and by considering an example. Having done so, the paper outlines a program for reconciling (1) and (2) within some sciences, based on a conception of theories as mathematical formalism plus interpretation.


"Inference to the Best Explanation is Coherent"
Igor Douven, Utrecht University

Abstract:
In his (1989) van Fraassen argues that Inference to the Best Explanation is incoherent in the sense that adopting it as a rule for belief change will make one susceptible to a dynamic Dutch book. The present paper argues against this. An epistemic strategy is described that allows us to infer to the best explanation free of charge.


"NIH Consensus Conferences: Resolving Medical Controversy with Data in a Science Court"
John H. Ferguson, Office of Medical Applications of Research, NIH

Abstract:
Although the word science connotes exactness and precision, especially when applied to the so-called hard sciences like mathematics, physics and chemistry, there is the need in all the sciences for interpretation of data. Interpretation of experiments and data may be even more essential in the softer sciences like biology, medicine and certainly the social sciences. As the courts must interpret the law, so too must the latest scientific findings be interpreted by someone, some group or institution to apply the findings in a useful or meaningful societal setting.

The NIH has had a program for the interpretation of new medical science for the past twenty years called the Consensus Development Program. This program was initiated to address some congressional concerns that new research funded by the NIH that was applicable to medical practice was not getting applied in the health care community. The NIH Consensus Conferences were designed to bridge this gap by assessing medical treatments and procedures that were newly derived from research and were now ready for application in the health care system, but where controversy regarding use or application remained — in other words — where unbiased interpretation was required. These conferences are a kind of court proceeding attempting to resolve these controversies with interpretation of the new medical science and health data by an impartial jury.


"The Plurality of Bayesian Measures of Confirmation and the Problem of Measure Sensitivity"
Branden Fitelson, University of Wisconsin-Madison

Abstract:
Contemporary Bayesian confirmation theorists measure degree of confirmation using a variety of non-equivalent relevance measures. As a result, a great many of the arguments surrounding quantitative Bayesian confirmation theory are implicitly sensitive to choice of measure of confirmation. Strictly speaking, such arguments are enthymematic, since they presuppose that some relevance measure (or class of relevance measures) is superior to other relevance measures that have been proposed and defended in the philosophical literature. I present a survey of this pervasive class of Bayesian confirmation-theoretic enthymemes, and a brief analysis of some recent attempts to resolve this problem of measure sensitivity.
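
Some of the non-equivalent relevance measures at issue (standard examples from this literature, listed here for orientation) are the difference, log-ratio, and log-likelihood-ratio measures:

\[ d(H,E) = P(H \mid E) - P(H), \qquad r(H,E) = \log \frac{P(H \mid E)}{P(H)}, \qquad l(H,E) = \log \frac{P(E \mid H)}{P(E \mid \neg H)}. \]

These agree on the qualitative question of whether \(E\) confirms \(H\), but they can order hypothesis-evidence pairs differently, which is what makes quantitative arguments sensitive to the choice of measure.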


"Moral Responsibility and the 'Ignorant Scientist'"
John Forge, Griffith University

Abstract:
The question addressed in this paper is whether a scientist who has engaged in pure research can be held morally responsible for outcomes of that research which affect others, most notably through technology, when these effects were not foreseen. The scientist was "ignorant" of these outcomes. It would appear that the answer must be that he or she could not be responsible, for surely one is only responsible for what one is, in the appropriate sense, in control of, and being genuinely ignorant would seem to preclude control. As against this, it will be claimed that a case can be made for saying that if the scientist can be held responsible for being ignorant, then he or she can be held (indirectly) responsible for the outcomes in question, and indeed there are circumstances in which scientists can be responsible for being ignorant. As it happens, it is only in the sense of blame, and not of praise, that responsibility has purchase here.


"Interpolation as Explanation"
Jaakko Hintikka, Boston University, and Ilpo Halonen, University of Helsinki

Abstract:
A (normalized) interpolant I in Craig's theorem is a kind of explanation why the consequence relation (from F to G) holds. This is because I is a summary of the interaction of the configurations specified by F and G, respectively, that shows how G follows from F. If explaining E means deriving it from a background theory T plus situational information A, and if among the concepts of E we can separate those occurring only in T or only in A, then the interpolation theorem applies in two different ways, yielding two different explanations and two different covering laws.
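
For reference, the theorem in its usual first-order form: if \(F \models G\) and \(F\) and \(G\) share at least one nonlogical symbol, then there is a sentence \(I\) whose nonlogical vocabulary occurs in both \(F\) and \(G\) such that

\[ F \models I \quad \text{and} \quad I \models G. \]

The proposal above is that a suitably normalized interpolant of this kind records just that part of the content of \(F\) that does the work in yielding \(G\).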


"Category Theory: The Language of Mathematics"
Elaine Landry, McGill University

Abstract:
In this paper, I set out to situate my claim that category theory provides the language for mathematical discourse. Against foundational approaches, I argue that there is no need to reduce either the content or structure of mathematical concepts and theories to either the universe of sets or the category of categories. I assign category theory the role of organizing what we say about the structure of both mathematical concepts and theories. Finally, I argue that category theory, seen as the language of mathematics, provides a framework for mathematical structuralism.


"Defending Abduction"
Ilkka Niiniluoto, University of Helsinki

Abstract:
Charles S. Peirce argued that, besides deduction and induction, there is a third mode of inference which he called "hypothesis" or "abduction." He characterized abduction as reasoning "from effect to cause," and as "the operation of adopting an explanatory hypothesis." Peirce's ideas about abduction, which are related also to historically earlier accounts of heuristic reasoning (the method of analysis), have been seen as providing a logic of scientific discovery. Inference to the best explanation (IBE) has been regarded as an important mode of justification in everyday life, detective stories, and science. In particular, scientific realism has been defended by an abductive no-miracle argument (Smart, Putnam, Boyd), while the critics of realism have attempted to show that this appeal to abduction is question-begging, circular, or incoherent (Fine, Laudan, van Fraassen). This paper approaches these issues by distinguishing weaker and stronger forms of abduction, and by showing how these types of inferences can be given Peircean and Bayesian probabilistic reconstructions.


"'Laws of Nature' as an Indexical Term: A Reinterpretation of Lewis's Best-System Analysis"
John Roberts, University of Pittsburgh

Abstract:
David Lewis's best-system analysis of laws of nature is perhaps the best known sophisticated regularity theory of laws. Its strengths are widely recognized, even by some of its ablest critics. Yet it suffers from what appears to be a glaring weakness: It seems to grant an arbitrary privilege to the standards of our own scientific culture. I argue that by reformulating, or reinterpreting, Lewis's exposition of the best-system analysis, we arrive at a view that is free of this weakness. The resulting theory of laws has the surprising consequence that the term "law of nature" is indexical.


"Scientific Objectivity and Psychiatric Nosology"
Patricia Ross, University of Minnesota

Abstract:
This paper challenges the traditional conception of objectivity in science by arguing that its singular focus on evidential relations and its search for aperspectivalism render it inadequate. Through an examination of psychiatric nosology, I argue that interesting and unique problems arise that challenge this conception of objectivity and that these challenges cannot be met by this account. However, a social practice account of objectivity provides a much more successful way of thinking about values and objectivity in psychiatric nosology. Moreover, it provides us with a far more successful account in general.


"The Semantic (Mis)Conception of Theories"
C. Wade Savage, University of Minnesota

Abstract:
On the traditional, "syntactic" conception, a scientific theory is a (preferably axiomatized) collection of statements. On the version of the opposing "semantic" conception advanced by Beatty and Giere, a theory is a definition plus an hypothesis that the definition has application. On the version advanced by Giere and van Fraassen, a theory is a non-linguistic model specified by a definition plus a hypothesis that the model applies to some empirical system. On the version advanced by van Fraassen, Suppes, and Suppe, a theory is a collection of mathematical models. The advantages claimed for these versions of the "semantic" conception are either spurious or can be obtained under a suitable version of the "syntactic" conception. This conclusion will be argued here for the first two conceptions.


"Functionalism and the Meaning of Social Facts"
Warren Schmaus, Illinois Institute of Technology

Abstract:
This paper defends a social functionalist interpretation, modeled on psychological functionalism, of the meanings of social facts. Social functionalism provides a better explanation of the possibility of interpreting other cultures than approaches that identify the meanings of social facts with either mental states or behavior. I support this claim through a functionalist reinterpretation of sociological accounts of the categories that identify them with their collective representations. Taking the category of causality as my example, I show that if we define it instead in terms of its functional relations to moral rules, it becomes easier to recognize in other cultures.


"Proper Function and Recent Selection"
Peter Schwartz, University of Pennsylvania

Abstract:
"Modern History" version of the etiological theory claim that in order for a trait X to have the proper function F, individuals with X must have been recently favored by natural selection for doing F (Godfrey-Smith 1994, Griffiths 1992, 1993). For many traits with prototypical proper functions, however, such recent selection may not have occurred: traits may have been maintained due to lack of variation or due to selection for other effects.I examine this flaw in Modern History accounts and offer an alternative etiological theory, the Continuing Usefulness account, which appears to avoid such problems.


"Does Science Undermine Religion?"
Neven Sesardic, Miyazaki International College

Abstract:
In this paper I first explain the difference between three kinds of questions (purely scientific, purely religious, and mixed). Second, I try to show that historical conflicts between religion and science were not accidental; there is an inherent logic that forces religion to make empirical claims, putting it on a collision course with science. Third, I argue that the defeat of religion in its conflict with science over a number of empirical issues eventually undermines the credibility of "purely religious" beliefs as well. Fourth, I discuss some consequences of these views for the relation between science and religion.


"Degrees of Freedom in the Social World"
Mariam Thalos, SUNY at Buffalo

Abstract:
Ever since Hobbes we have sought to explain such extraordinarily commonplace facts as that prudent people trust each other, keep promises and succeed by and large at coordinating so as not to collide in corridors and roadways. In fact, it is the success which meets our efforts to coordinate without communication so as not to collide in corridors, which is perhaps the most perplexing of all social phenomena, because it is enjoyed by creatures with the capacity to conceive of (inconveniently) many coordination schemes that achieve the same ends. And ever since Thomas Schelling we have been suspecting that traditional game theory, which as I shall show embraces a Kantian view of agency, cannot explain this success. I shall here propose an anti-Kantian account of agency which takes as its point of departure the kinds of coordinations that Schelling demonstrated are so difficult for traditional game theory to explain. My proposal shall reject the Bayesian foundations of game theory, which rest on this Kantian view of agency.


"Measured Realism and Statistical Inference: An Explanation for the Fast Progress of 'Hard' Psychology"
J.D. Trout, Loyola University of Chicago

Abstract:
The use of null hypothesis significance testing (NHST) in psychology has been under sustained attack, despite its reliable use in the notably successful, so-called "hard" areas of psychology, such as perception and cognition. I argue that, in contrast to merely methodological analyses of hypothesis testing (in terms of "test severity," or other confirmation-theoretic notions), only a patently metaphysical position can adequately capture the uneven but undeniable successes of theories in "hard psychology." I contend that Measured Realism satisfies this description and characterizes the role of NHST in hard psychology.


"The Principle of the Common Cause Faces the Bernstein Paradox"
Jos Uffink, Utrecht University

Abstract:
I consider the problem of extending Reichenbach's principle of the common cause to more than two events, vis-à-vis an example posed by Bernstein. It is argued that the only reasonable extension of Reichenbach's principle stands in conflict with a recent proposal due to Horwich. I also discuss prospects of the principle of the common cause in the light of these and other difficulties known in the literature and argue that a more viable version of the principle is the one provided by Penrose and Percival (1962).


"Inference to the Best Explanation and TheoreticalEntities"
Susan Vineberg, Wayne State University

Abstract:
Scientific entity realism has been challenged on the grounds that it depends on the controversial principle of inference to the best explanation. I defend the view against this challenge, by showing that the particular inferences needed by the entity realist do not have the objectionable features antirealists cite in questioning inference to the best explanation. Indeed, these inferences are either ones needed by empiricists in arguing for empirical adequacy, or ones no stronger than the realist needs for justifying his claims about ordinary objects.


"Gravity and Gauge Theory"
Steve Weinstein, Northwestern University

Abstract:
Gauge theories are theories which are invariant under a characteristic group of "gauge" transformations. General relativity is invariant under transformations of the diffeomorphism group. This has prompted many philosophers and physicists to treat general relativity as a gauge theory, and diffeomorphisms as gauge transformations. I argue that this approach is misguided.


"Defending Longino's Social Epistemology"
K. Brad Wray, University of Calgary

Abstract:
Though many agree social factors influence inquiry, developing a viable social epistemology has proved to be difficult. According to Longino, it is the processes that make inquiry possible that are social; they require a number of people to sustain them. These processes also ensure that the results of inquiry are not merely subjective opinions, and thus deserve the label "knowledge." I defend Longino's epistemology against charges of Kitcher, Schmitt, and Solomon. Longino rightly recognizes that different social factors have different effects on inquiry, and recommends reconceptualizing "knowledge," distinguishing knowledge from opinion by reference to a social standard.



Symposium: Philosophical Perspectives on Quantum Chaos

Organizer: Frederick M. Kronz, University of Texas at Austin
Chair: TBA

"Bohmian Insights into Quantum Chaos"
James T. Cushing, University of Notre Dame

Abstract:
The ubiquity of chaos in classical mechanics (CM), as opposed to the situation in standard quantum mechanics (QM), might be taken as speaking against QM being the fundamental theory of physical phenomena. Bohmian mechanics (BM), as a formulation of quantum theory, may clarify both the existence of chaos in the quantum domain and the nature of the classical limit. Two interesting possibilities are (i) that CM and classical chaos are included in and underwritten by quantum mechanics (BM) or (ii) that BM and CM simply possess a common region of (noninclusive) overlap. In the latter case, neither CM nor QM alone would be sufficient, even in principle, to account for all the physical phenomena we encounter. In this talk I shall summarize and discuss the implications of recent work on chaos and the classical limit within the framework of BM.


"Nonseparability and Quantum Chaos"
Frederick M. Kronz, The University of Texas at Austin

Abstract:
The clockwork image of classical physics has been shattered by the relatively recent discovery of chaotic classical models. The converse to this ironic situation seems to be developing in quantum physics. Conventional wisdom has it that chaotic behavior is either strongly suppressed or absent in quantum models. Some have concluded that these considerations serve to undermine the correspondence principle, thereby raising serious doubts about the adequacy of quantum mechanics. So, the quantum chaos question is a prime subject for philosophical analysis. The most significant reasons given for the absence or suppression of chaotic behavior in quantum models are the linearity of Schrödinger's equation and the unitarity of the time-evolution described by that equation. Both are shown in this essay to be irrelevant by demonstrating that the crucial feature for chaos is the nonseparability of the Hamiltonian. That demonstration indicates that quantum chaos is likely to be exhibited in models of open quantum systems. A measure for probing such models for chaotic behavior is developed and then used to show that quantum mechanics has chaotic models for systems having a continuous energy spectrum. The prospects of this result for vindicating the correspondence principle are then briefly examined.
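
In the sense standardly intended (inferred here from the usage, not quoted from the paper), a two-degree-of-freedom Hamiltonian is separable when it decomposes into noninteracting parts,

\[ H(q_1, p_1, q_2, p_2) = H_1(q_1, p_1) + H_2(q_2, p_2), \]

and nonseparable when an interaction term coupling the two degrees of freedom, say \(\lambda V(q_1, q_2)\), prevents any such decomposition.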


"Chaos and Fundamentalism"
Gordon Belot, Princeton University

Abstract:
I discuss the philosophical significance of the problem of quantum chaos. I concentrate on the most pressing form of this problem: the fact that there are many chaotic classical systems whose quantizations are non-chaotic, even periodic. I argue that this puts a strain on the correspondence principle, conceived of as requiring that quantum mechanics should be capable of accounting for the empirical successes of classical mechanics. I argue that even very weak versions of the correspondence principle are in some danger of being falsified in light of the problem of quantum chaos: the viability of weak versions of the correspondence principle depends upon rather delicate empirical considerations. This leaves us worrying about a question which was once thought to have been settled: the empirical and conceptual relationship between classical mechanics and quantum mechanics. I close with some comments concerning the philosophical import of the details of such limiting relations between theories.



Symposium: The Organism in Philosophical Focus

Organizer: Manfred D. Laubichler, Princeton University
Chair: TBA

"Fashioning a Descriptive Model of the Worm"
Rachel Ankeny, University of Pittsburgh

Abstract:
The nematode Caenorhabditis elegans frequently is touted as an exemplar of a model organism for genetics, development, and neurobiology. This paper analyzes the concept of C. elegans as a model organism against the background of a larger historical study. I argue that the early history of the worm project included a pre-explanatory stage that involved descriptive rather than explanatory models and accordingly lacked much emphasis on theory or explanations as they are traditionally viewed in the philosophy of science; therefore the applicability of previous philosophical views on modeling (e.g., the semantic conception of theories) is limited. As an alternative, I expand the concept of a descriptive model, and use it to understand how C. elegans may be viewed as a prototype of the metazoa against the backdrop of examples of the three types of modeling that occur with C. elegans: modeling of structures, of processes, and of information. In conclusion, I suggest that more investigation of descriptive models (such as those generated in the worm project) and their relation to explanatory models developed in later stages of a scientific research program must be done in order to capture important aspects of organism-based research.


"Behavior at the Organismal and Molecular Levels: The Case of C. elegans"
Kenneth Schaffner, George Washington University

Abstract:
Caenorhabditis elegans (C. elegans) is a tiny worm that has become the focus of a large number of world wide research projects examining its genetics, development, neuroscience, and behavior. Recently two groups of investigators (Lockery's and Rankin's labs) have begun to tie together the behavior of the organism and the underlying neural circuits and molecular processes implemented in those circuits. Behavior is quintessentially organismal -- it is the organism as a whole that moves and mates -- but the explanations are devised at the molecular and neurocircuit levels, and tested in populations using protocols that span many levels of aggregation. This paper presents a summary of key results in neural network analysis of the worm, and considers the prospects for generalizing those results, both methodologically and substantively, to other organisms. In particular, some comparisons and contrasts with behavioral and neuroscience work on the fruit fly, Drosophila melanogaster, will be briefly considered.


"Organism and Decomposition: Steps Towards an Integrative Theory of Biology"
Manfred D. Laubichler, Princeton University, and Günter P. Wagner, Yale University

Abstract:
The organism is the central reference frame for much of biological theory and practice. It is therefore all the more surprising that no well-defined organism concept exists, or even that the organism concept does not draw attention similar to that given other intensely debated biological concepts, such as the gene or the species concept. So far, the problem of the organism has been mostly discussed in the context of the problem of individuality and the unit of selection issue (see e.g., Buss 1987, Hull 1989, Ghiselin 1974).

Here we investigate the question whether there is an organism hidden in many successful biological theories and models. We propose a notion of the organism that would serve as the focus of integration for the many separate biological theories that make up the newly established fields of organismal and integrative biology. We argue that the structural weakness of many of these theories lies in the absence of a general theory of the organism that, together with an appropriate decomposition theorem, would account for the specific variables in these theories and models as properly individualized characters of an organism. We will investigate the formal properties of such a general organism concept and demonstrate the application of a decomposition theorem in the case of the unit of selection (see also Laubichler 1997, Wagner, Laubichler, and Bagheri 1997).


"Ontological Butchery: Distinguishing Organisms from Other Functionally Integrated Biological Entities"
Jack A. Wilson, Washington and Lee University

Abstract:
Biological entities of many kinds, such as cells, organisms, and colonies, are made up of physically continuous and causally integrated parts. Some are composed of parts which are themselves built from causally integrated parts, e.g., a colonial siphonophore and the zooids that compose it. Both have many of the properties commonly associated with an organism. The parts of these entities display diverse degrees of integration which makes it difficult to distinguish organisms from other kinds of living things. In this paper, I explore the history of the organism concept in biology and defend a demarcation criterion based on a position shared by J.S. Huxley and Richard Dawkins, that an organism is a living entity that is sufficiently heterogeneous in form to be rendered non-functional if cut in half.


"The Organism in Development"
Robert C. Richardson, University of Cincinnati

Abstract:
Developmental biology has resurfaced in recent years, but often apparently without a clear central role for the organism. The organism is pulled in divergent directions: on the one hand, there is an important body of work that emphasizes the role of the gene in development, as executing and controlling embryological changes (cf. Gilbert 1991); on the other hand, there are more theoretical approaches under which the organism disappears as little more than an instance for testing biological theories (cf. Brandon 1997; Schank and Wimsatt 1976, and Kauffman 1993). I press for the ineliminability of the organism in developmental biology. The disappearance of the organism is illusion. Heterochrony, or change of timing in development, assumes a central role in evolutionary developmental biology (Gould 1977; McKinney and McNamara 1991), whether or not it deserves the role often accorded to it as the central developmental mechanism (cf. Raff 1996). Genetic studies of the basis of heterochrony in C. elegans, for example, display early appearance of adult structures, and deletion of early stages, both heterochronic changes. They nonetheless display the global basis of heterochronies (Ambros and Moss 1994), and leave a central role to the organism. More theoretical approaches treat heterotopy, or spatial patterning in development, as the consequence of more general and abstract processes of development (Kauffman 1993), or understand changes in timing in terms of general dependency relations (Schank and Wimsatt 1976). These more global, "emergentist" approaches fit poorly with a broad range of cases in developmental biology. Classical transplantation experiments show a degree of epigenetic (systemic and local) control for normal growth (Bryant and Simpson 1984) which is not consistent either with more formal models or with developmental entrenchment. In studies of adult organisms (Maunz and German 1997), there are heterochronic patterns of growth which are displayed by particular organisms (in this case, rabbits), but which may not be general. There are many sources of heterochrony, and it is important to determine which are prevalent in a given case. Again, the organism turns out to be central in understanding the evolution of development.

Commentator: Jane Maienschein, Arizona State University



Symposium: Studies in the Interaction of Psychology and Neuroscience

Organizer: Gary Hatfield, University of Pennsylvania
Chair: TBA

"Mental Functions as Constraints on Neurophysiology: Biology and Psychology of Color Vision"
Gary Hatfield, University of Pennsylvania

Abstract:
The concept of function has been prominent in both philosophy of biology and philosophy of psychology. Philosophy of psychology, or philosophical analysis of psychological theory, reveals that rigorous functional analyses can be carried out in advance of physiological knowledge. Indeed, in the area of sensory perception, and color vision in particular, knowledge of psychological function leads the way in the individuation and investigation of visual neurophysiology. Psychological functions constrain biological investigation. This example is of general interest as an instance of the relation between biological and psychological functions and their "wet" realizations.


"Cognitive Science and the Neuroethology of Electroreception"
Brian L. Keeley, Washington University in St. Louis

Abstract:
The theoretical relationship between psychology and neurobiology has been a long-standing point of debate within the philosophy of cognitive science. Some, Fodor for example, have argued that psychology has a significant degree of theoretical autonomy from the facts of neurobiology. Others, such as the Churchlands, have responded with arguments purporting to show that neurobiology has a crucial and ineliminable role to play in cognitive theorizing at all levels. This paper takes as its starting point the science of neuroethology, a multidisciplinary investigation of animal behavior that shares many structural similarities with cognitive science. The neuro-ethological investigation which led to the discovery of electroreception (the ability of certain non-human animals to perceive the world via electricity) is used as a model for how a variety of disciplines should interact and cross-constrain one another. Such a model suggests that the fields that make up cognitive science ought to co-evolve and constrain one another in a mutually supportive fashion.


"Neuropsychology and Neurology"
William Hirstein, William Paterson University

Abstract:
Blind-sight and the various agnosias provide important windows on brain functioning. Through the study of deficits caused by brain damage, inferences can be made about the normal functioning of the brain. But these inferences require a conception of normal function cast in psychological terms. The methodology of neuropsychology sheds light on the cross-constraints between psychology and neuroscience.



Symposium: Toward a New Understanding of Scientific Success

Organizer: Janet A. Kourany, University of Notre Dame
Chair: TBA

"How Inevitable Are the Results of Successful Science?"
Ian Hacking, University of Toronto

Abstract:
There are innumerable ways in which successful sciences and their applications could have gone differently: different questions, different aims, different patrons, different talents, different resources, different interests. Yet there is a strong conviction, among many scientists, that any successful investigation of a definite subject matter (such as genetics, heat, or astrophysics) would produce equivalent results. But even some quite cautious authors, like Keller, doubt this. Genetics without the idea of a genetic program could, she seems to imply, have led to a non-equivalent but successful genetics. The strongest claim for inevitability is that any intelligent being who took up science, even an alien in outer space, would develop something equivalent to our fundamental physics, if that being was curious in the ways in which we are curious. In contrast, my claims about the forms of scientific knowledge seem to imply that had the physics of the past fifty years not been weapons-funded, non-equivalent physics would have resulted. Of course some workers with a social approach to knowledge make much stronger suggestions than these. Who is right? The inevitabilist or the contingentist? To answer, one needs first to understand what it is to make different investigations of the same field of inquiry. One needs to understand what it is for two bodies of knowledge to be "equivalent." Only when these conceptual issues are clarified can one even begin to assess claims to inevitability or contingency of successful fundamental scientific conclusions. My contribution to this symposium will be an attempted clarification of "equivalence," and a discussion of the implications for "inevitability."


"What Does 'Explanation' Mean in Developmental Biology?"
Evelyn Fox Keller, MIT

Abstract:
Three questions form the background to my remarks:
(1) What are the prevailing explanatory narratives in contemporary biological discussions of development?
(2) What are the scientific functions of these narratives?
(3) How are we to assess the productivity of an explanatory narrative?

By referring to explanations in terms of narrative, I mean to underscore two starting assumptions: (a) Explanations are rarely if ever either logically complete or robust; (b) Yet, whatever their purely logical shortcomings, they are nonetheless indispensable to the conduct of science. Thus, attempts to assess the productivity of explanatory narratives cannot rest solely on logical measures, but will also need to take other measures, including creativity and rhetorical power, into account. It is part of my aim to describe more appropriate (that is, more realistic) measures of explanatory narratives in developmental biology. It is also my hope that such an exercise might even be useful to working scientists in helping them to identify narratives that have outlived their usefulness.

As a case in point, I want to consider the notion of genetic program. Ever since its introduction in 1961, this notion has served as something of a rhetorical tour de force for molecular biology that has manifested tremendous productivity. Today, however, I will argue that this notion has outlived its usefulness, and may even have become counterproductive. I will illustrate by examining the specific case of the much publicized recent success of mammalian cloning.


"A Successor to the Realism-Anti-Realism Question"
Janet A. Kourany, University of Notre Dame

Abstract:
Ian Hacking and Evelyn Fox Keller have suggested, both in their papers for this symposium and in previous work, that our views of scientific assessment have to be enriched in certain ways. What follows with regard to our views of the results of such assessment? Traditional philosophy of science, for the most part, gives us only two options. Either scientific theories that are positively assessed are true (or approximately true, or the like), or they are only useful fictions (empirically adequate, or something of the sort). Of course, traditional philosophy of science then goes on to disclose the intractable problems associated with each of these options, leading at least some philosophers of science (e.g., Arthur Fine) to conclude that not only are both options indefensible, but so is the question that calls them forth. From my point of view, also, the traditional options are unacceptable, but for different reasons. They inspire the wrong kinds of attitudes toward the results of science. To say that a theory is true (or approximately true, etc.) is a discussion-stopper. However socially dangerous or unhelpful or sexist or racist or whatever the theory is, it is to say that that's the way things are and we have to accept them. Of course, we need not pursue such a theory, but that is a difficult course of action to adopt. After all, truth, it is said, has intrinsic value: it should not be suppressed. And at any rate truth, it is said, will doubtless prove useful in the long run, even if it does not seem so in the short run. To say, on the other hand, that a theory is only useful (or empirically adequate, etc.) is another discussion-stopper, though not as profound a one. However socially dangerous or unhelpful or sexist or racist or whatever the theory is, it is to say that it still has value, that it still can be used for purposes of prediction. But it leaves open the possibility of pursuing an alternative theory that is just as useful, but without the drawbacks. The problem is that it buries the question of what the theory is useful for, why the predictions are needed, and whether, in fact, the theory is really useful to us at all.

What changes occur when we enrich our views of theory assessment along the lines suggested by Hacking and Keller? I will argue that the realism-anti-realism question under these conditions changes significantly, and I will link up pursuit of this changed question with the descriptive-prescriptive program for philosophy of science I argue for elsewhere.



Symposium: Evidence, Data Generation, and Scientific Practice: Toward A Reliabilist Philosophy of Experiment

Organizer: Deborah G. Mayo, Virginia Polytechnic Institute and State University
Chair: Ian Hacking, University of Toronto

"Data, Phenomena and Reliability"
James F. Woodward, California Institute of Technology

Abstract:
This paper will explore, in the context of the distinction that I have introduced elsewhere between data and phenomena, how data serve as evidence for phenomena. Phenomena (e.g., the existence of neutral currents) are stable, repeatable effects or processes that are potential objects of explanation by general theories and which can serve as evidence for such theories. Data (e.g., bubble chamber photographs) serve as evidence for the existence of phenomena but are typically not explained or entailed by general theory, even in connection with background information.

In contrast to standard philosophical models, which invite us to think of evidential relationships as logical relationships of some kind, I will argue that the reliability of the processes by which data are produced is crucial to their evidential status. Reliability is roughly the notion invoked by the reliabilist tradition in epistemology: it is a matter of good error characteristics under repetition. On the reliabilist picture, it is this information about error characteristics, rather than the deductive or inductive logical relationships between data and phenomena claims, that determines whether the former support the latter.

This paper will describe several examples of data to phenomena reasoning in which the reliability of a detection process is at issue. I will then use these to motivate a number of more general claims about reliability and evidential support. For example, although the process by which evidence is produced is usually regarded as irrelevant by the standard models of confirmation, it is crucial on the reliabilist alternative. Given two qualitatively identical pieces of evidence (e.g., two indistinguishable bubble chamber photographs), one may be good evidence for the existence of neutral currents and the other may not be evidence at all, depending on the way in which they have been produced. Similarly, while most standard models of evidential support require that evidence be (or be representable as) sentential or propositional in form, the reliabilist model doesn't impose this requirement. Data which are non-propositional and which no one knows how to represent propositionally (e.g., X-ray photographs of possible tumors) can figure as the outcome of a reliable detection process. Moreover, various patterns of data to phenomena reasoning that are actually employed in science and that look circular or otherwise defective on standard accounts are readily reconstructable as legitimate on a reliabilist analysis. In all of these respects, the reliabilist account provides a better reconstruction of reasoning from data to phenomena than standard models of confirmation.


"Can Philosophical Theories of Evidence be Useful to Scientists?"
Peter Achinstein, Johns Hopkins University

Abstract:
Scientists frequently argue about whether, or to what extent, some putative evidence supports a given hypothesis. (Witness the recent debate concerning whether the Martian meteorite is evidence of past life on Mars.) The debates in question are not about whether the statements describing the evidence are true but about the relationship between these statements and the hypothesis they are supposed to support. Now there is an abundance of theories of evidence proposed by philosophers of science — probabilistic as well as non-probabilistic accounts — that attempt to say how to determine whether, or to what extent, evidence supports an hypothesis. Yet these theories are generally ignored by scientists in their disputes. Why is this so? Is it simply the result of a lack of communication between disciplines? Is it that philosophical theories of evidence are too abstract to be of any practical use? Is it that although methodological theories may describe actual practice, they have little if any effect in shaping or altering that practice? Perhaps some of each. But there is another factor that I want to consider, viz. that philosophers, particularly those espousing objective (by contrast to subjective) theories of evidence, generally assume that evidence is an a priori relationship. They assume that whether e, if true, supports h, and to what extent it does, can be determined by a priori calculation. I believe this assumption is mistaken. In general, whether e, if true, supports h, and to what extent, are empirical questions. If so, then scientists in a given discipline would seem to be in a better position to determine evidential relationships than philosophers.

In this paper I propose to do the following: (1) To argue that the relationship in question is empirical, not a priori. I will do so by showing that whether, and the extent to which, putative evidence supports an hypothesis depends crucially on how the evidence was gathered. Selection procedures can be flawed in ways that affect confirmation, and such flaws can only be known empirically. They can be totally unexpected, given what is known, yet they can be shown to affect confirmation because they impugn the reliability of the data gathering method. I will argue that this explains why certain controversies occur among scientists over whether the putative evidence supports an hypothesis, even when there is no disagreement over what the evidence is. It also explains why certain experiments are repeated, usually with significant modifications.

(2) To consider whether this makes philosophical accounts irrelevant in settling evidential disputes (if so, why; if not, why not?); and to consider, more generally, how philosophers can contribute to scientific debates about evidence. Carnap held the idea that because the relationship between evidence and hypothesis is a priori, in principle philosophers can help settle scientific disputes by performing a priori calculations. This goal, I think, must be abandoned. Nevertheless, I believe that philosophical contributions are possible. I will argue that scientists in actual practice operate with several different concepts of evidence that need to be distinguished and clarified. I will distinguish four such concepts, one subjective and the other three objective. All of them, I will argue, represent evidence as an empirical relationship. In the objective class one concept is relativized to a set of beliefs, while the other two are not. Of the latter, one requires the hypothesis itself to be true, the other does not. So there are fundamental differences.

By reference to actual examples, I will show how certain scientific disputes about the efficacy of the evidence arise because of empirical disagreements involving the testing procedure (something one would expect using concepts that represent evidence as an empirical relationship), while some disputes emerge from a use of the different concepts themselves. In the latter cases I will argue that attention to these differences and clarification of the concepts can aid in the resolution of the disputes.


"Experimental Practice and the Reliable Detection of Errors"
Deborah G. Mayo, Virginia Polytechnic Institute and State University

Abstract:
Given the growing trend to renounce "armchair" philosophy of science, it is surprising that current philosophies of evidence remain largely divorced from the problems of evidence in actual scientific practice. The source of the problem, I argue, is the assumption — retained from logical empiricist accounts of confirmation — that how well evidence supports a hypothesis is a matter of a logical calculation based on given statements of data and hypotheses. Such accounts lack the resources to address central scientific questions about evidence because these questions turn on empirical information about the overall reliability of the procedures for generating the data and selecting the test — on what may be called procedural reliability. Two pieces of evidence that would equally well warrant a given hypothesis H on logical measures of evidential relationship may in practice be regarded as differing greatly in their evidential value because of differences in how reliably each is able to rule out errors in affirming H. I believe it is time to remedy this situation. Drawing on the work of my fellow contributors, Jim Woodward and Peter Achinstein, as well as my own work (e.g., Mayo 1996), I will attempt to sketch the directions in which a new reliabilist account of evidence might move.



Symposium: Conceptual Foundations of Field Theories in Physics

Organizer: Andrew Wayne, Concordia University
Chair: TBA

"The Gauge Argument"
Paul Teller, University of California at Davis

Abstract:
Contemporary quantum field theory contains a remarkable argument for the existence of so-called "gauge particles," such as the photon and its generalizations. Starting from the assumption that the quantum field has certain very natural spatial symmetry properties, keeping the quantum field equations consistent appears to force the introduction of a field for the gauge particles. Physicists are careful not to claim that this argument provides a purely a priori deduction of the existence of gauge particles. But the argument does appear to require extraordinarily little to get it started. My effort will be to understand the nature of this argument as clearly as possible.


"Counterintuitive Features of Quantum Field Theory"
Gordon Fleming, The Pennsylvania State University

Abstract:
I will discuss the conceptual and structural aspects of several counterintuitive features of quantum field theory selected from the following topics: the Reeh-Schlieder property of the vacuum state; Haag's theorem and the existence of unitarily inequivalent representations of a quantum field theory; Rindler quanta and horizon radiation. All of these features were puzzling surprises to the physics community when they were discovered. All of them have long since been incorporated into the technical repertoire of quantum field theorists via widely accepted formal structures. None of them enjoys more than very indirect empirical support, if that. And none of them has yet been provided with better than tentatively plausible conceptual analysis and physical interpretation. This situation will not be rectified here, but some suggestions will be made.


"Description, Individuation, and Relation in Gauge Field Theory"
Sunny Auyang, Independent Scholar

Abstract:
A quantum field is best interpreted as the quantization of a classical field, a continuous system. Together with the gravitational field, the quantum fields make up the fundamental ontology of modern physics. Despite its importance, the philosophical implications of the field ontology remain largely unexplored. This paper argues, first, that the field ontology does not abandon the common sense notion of individual entities with definite properties but makes the notion precise; second, that as spatio-temporally structured matter it is incompatible with both the substantival and relational theories of spacetime.

Three types of entities are distinguished in field theories. Events, represented by local field operators, are the basic extensionless entities individuated spatio-temporally. The equivalents of enduring bodies, represented by paths, relate the events causally. Particles, which are the modes of field excitation, lack numerical identity and cannot be referred to individually. The primacy of local fields over particles implies that the significance of spacetime lies in individuating and identifying entities. As such it is built into the definition of the entities and exhausted by the definition. Spacetime does not exist independently and contain or support matter, as substantivalism maintains. Nor is it the relation among predefined entities, for there are no individual entities in the field without presupposing its spatio-temporal structure.

Commentator: Andrew Wayne, Concordia University



Symposium: Philosophy and the Social Aspects of Scientific Inquiry: Moving On From the Science Wars

Organizer: Noretta Koertge, Indiana University at Bloomington
Chair: Cassandra Pinnick, Western Kentucky University

"Reviving the Sociology of Science"
Philip Kitcher, University of California, San Diego

Abstract:
Despite the prevalence of the idea that sociology of science has come to play a major role in science studies, the discipline of sociology of science, begun by Merton and his successors, has been largely abandoned. Using comparisons with other areas of sociology, I shall try to show what a genuine sociology of science might be, arguing that it can make important contributions to our understanding of scientific practices. Those contributions would not be antithetical to philosophy of science but, rather, would enrich our investigations.


"The Recent Past and Possible Future of the Sociology of Science"
Stephen Cole, SUNY at Stony Brook

Abstract:
In the 1980s the social constructivist approach became the overwhelmingly dominant influence on the sociology of science. Contrary to the wide definition of constructivism used by some historians and philosophers of science, I see constructivism as having at its essence a relativistic epistemology. Without relativism, constructivism does not differ in many significant ways from the traditional Mertonian sociology of science. Mertonians, in fact, studied more than twenty years ago many of the problems which constructivists address.

Social constructivism is seen not only as an intellectual movement, but as a political movement. As an intellectual movement constructivism is in tatters, unable to defend itself successfully against the problem of reflexivity, against the countless cases brought to light in which evidence from the empirical world played determining roles in theory choice, and against attacks by natural scientists. As a political movement, constructivism retains its hegemony. All major disciplinary organizations and journals are dominated by or controlled by constructivists. It is not surprising that the number of people doing non-constructivist sociology of science has dwindled to a mere handful. A once promising discipline, with important policy implications, has essentially been killed.

There are many problems in the organization and doing of science which cry out for sociological analysis. The new sociologists of science will be informed by some of the work of constructivists, but will ultimately return to their disciplinary roots and use the theories and methods of sociology which made the earlier work of substantial utility. The paper will briefly outline some of the problems that are most in need of attention.


"Science, Values and the Value of Science"
Noretta Koertge, Indiana University at Bloomington

Abstract:
Many people who reject the strong claims of postmodernists about the inevitability of epistemic relativism, the tyranny of language games, and the determinative influence of social values and professional interests on the content of scientific knowledge, nevertheless share some sympathy with the political critiques of science, technology and science policy coming out of cultural studies. In this paper I demonstrate the influence of postmodernist perspectives on definitions of scientific literacy and new initiatives in math and science education. Philosophers of science need to articulate more clearly a normative model of the role of values in science that can supplant the simplistic alternatives of positivism and postmodernism. I contrast Longino's approach with one deriving from Popper's philosophy of social science.


"The Contexts of Scientific Practice"
Rose-Mary Sargent, Merrimack College

Abstract:
When those in science studies began to look at the particularities surrounding laboratory practices it was a good counter to what had been a rather simplistic account of experiment in the philosophical literature. But the trend to see practices as purely social and divorced from the realm of theoretical knowledge has gone too far. I argue that studies of the social, technological, and intellectual contexts of science must be integrated in order to capture the full complexity and variety of experimental activities. In addition, historical cases show that although experimenters have pursued different goals ranging from the purely epistemic to the primarily social, significant methodological standards have tended to remain constant across contexts.



Symposium: Philosophy of Chemistry

Organizer: Eric R. Scerri, Bradley University
Chair: TBA

"Agency of Multilevel Coherences: in Chemistry and in Other Sciences"
Joseph E. Earley, Sr., Georgetown University

Abstract:
Investigators in many fields routinely deal with compound individuals -- entities composed of parts that themselves have definite characteristics. Observable properties of composites generally depend in complex ways on features of the constituents. In the kinds of systems that chemists deal with, the agency of compound entities can adequately be understood in terms of the composition and structure of the coherence, and the properties of the components -- but that understanding is not always straightforward. This paper considers approaches to the functioning of compound individuals that are generally employed in contemporary chemistry, and examines whether such approaches might be useful in dealing with some currently open questions of philosophical interest, such as the validity of multi-level selection in evolutionary biology (cf. special issue of The American Naturalist, July 1997).


"Can We Exclude Nature from the Stability of Laboratory Research?"
Daniel Rothbart, George Mason University

Abstract:
How can we explain the stability of chemical research? In his Representing and Intervening, Ian Hacking argues that the technician's use of modern instruments is stabilized by the causal powers of certain entities, a position known as entity realism. But the conception of laboratory research that emerges from his more recent writings contradicts, by implication, entity realism, and suggests a kind of constructivist antirealism. We find in these writings an impoverished conception of research, one which fails to recognize the philosophical importance of the engineer's design of modern instruments. An exploration of modern spectrometers in chemistry illustrates this point. A functional realism is developed, according to which the experimenter is committed to the existence of causal processes as defined purposively from the engineer's design of the modern instrument.


"Putting Quantum Mechanics to Work in Chemistry"
Andrea I. Woody, The University of Chicago

Abstract:
This essay examines contemporary diagrammatic and graphical representations of molecular systems, specifically molecular orbital models and electron density graphs, derived from quantum mechanical models. Its aim is to suggest that recent dissatisfaction with reductive accounts of chemistry (see, e.g., Synthese, June 1997) may stem from the inability of standard analyses of reduction to incorporate the diverse forms of representation widely employed in the application of the stationary state Schrödinger equation to chemistry. After a brief sketch of the historical development of these techniques, in part to display the computational complexities inherent in corresponding quantum calculations, I compare algebraic and diagrammatic representations of a few specific molecules, demonstrating the strengths of each as tools for a variety of reasoning tasks prevalent in chemical practice. I subsequently provide a more general, and preliminary, analysis of the inferential resources of molecular orbital diagrams, arguing that certain characteristics of this two-dimensional representation scheme are particularly well-designed to promote the robustness of common forms of chemical inference.


"The Periodic System: Its Status and its Alleged Reduction"
Eric R. Scerri, Bradley University

Abstract:
One of the reasons why philosophers of science have not paid much attention to chemistry, until recently, has been the view that this field does not possess any profound theories or ideas like quantum mechanics, relativity or Darwin's theory of evolution. The purpose of this article is to suggest that, within chemistry and in terms of explanatory power, the periodic system possesses a comparable status to the above-mentioned theories.

The tendency to dismiss the periodic system's role seems to stem from the mistaken view that it has been reduced to quantum mechanics. It will be argued that all that can be strictly deduced from the principles of quantum mechanics is an explanation for the closing of electron shells, whereas the independent problem of the closing of the periods cannot be so deduced. In order to obtain the electron configurations of atoms it is necessary to use a number of ad hoc schemes, such as assuming the order of filling of electron shells by reference to experimental data. The periodic system exists as an autonomous and essentially chemical explanatory principle.



Symposium: The Coevolution of Language and Brain: Is there Anything New Under the Sun?

Organizer: Edward Manier, University of Notre Dame
Chair: Edward Manier, University of Notre Dame

"The Roots of Human Linguistic Competenceâin Apes"
Duane M. Rumbaugh, Georgia State University

Abstract:
Human competencies for language, symbolism, facile learning, prediction, mathematics, and so on, are natural expressions of processes that evolved notably within the order Primates. The evolution of large-bodied primates was basic to the evolution of a large brain. But particularly within the great ape and human species the process of encephalization contributed to an even larger brain, one within which the operations of intelligence played an increasingly important role in adaptation. From that point, it is proposed that evolution became directed more to the evolution of large brains, not large bodies. Even at that point in evolutionary history, the neurological bases for language and a broad range of cognitive skills were in place. From these natural endowments, our species has excelled as a natural projection of competencies that became refined in the hominids. Recent research with apes has revealed the critical importance of the logic-structure of early environment and patterns in interaction with brain size and complexity to the acquisition of language and relational learning processes in young apes. It will be argued that conventional views of learning that emphasize stimuli, responses, and reinforcement are insufficient to the challenge of accounting for the acquisition of such skills in infant apes. To account for them, the merits of a new class of learning phenomena termed Emergents will be advanced.


"Syntax Facit Saltum: Minimal Syntactic Computation and the Evolution of Language"
Robert C. Berwick, MIT

Abstract:
Ever since Darwin and long before, the species-specific human language faculty has captured the imagination of evolutionary thinkers. The reason is simple: For evolutionists, novelties or new traits in a single lineage -- autapomorphies -- have always posed a challenge. Whatever the evolutionary scenario, one must strike a balance between language's singularity and evolutionary continuity: How can one account for the striking specificity of human syntactic constraints and at the same time retain a Darwinian-style explanation for language's emergence?

In this talk we show how to resolve this discontinuity paradox in a new way -- appealing to the recent linguistic syntactic theory dubbed the "Minimalist Program" (Chomsky, 1995). We show that once the fundamental property of generativity emerges, then minimalism forces much of the rest of human syntax to follow. All we need in addition is what Deacon argues for: a pre-existing substrate of symbols/words (a lexicon). Coupled with the appearance of a single combinatorial operation of hierarchical concatenation, this leads directly to many of the distinguishing human syntactic properties: recursive generative capacity; basic grammatical relations like subject and object (and only these); observed locality constraints on natural language syntax; and the non-counting, structure dependence of natural language rules. Put another way, while it is surely true that natural language, like the vertebrate eye, is in some sense an "organ of extreme complexity and perfection" in Darwin's terms, we shall argue that one does not need to advance incremental, adaptationist arguments with intermediate steps to explain much of natural language's specific syntactic design.


"Evolutionary Engineering: How our Mental Capacities were Built from a Set of Primitives."
Marc Hauser, Harvard University

Abstract:
Studies within cognitive neuroscience illustrate how patients with particular forms of neurological damage can exhibit dissociations between implicit and explicit knowledge. Similarly, recent work in cognitive development shows that up to a certain age, infants have an implicit understanding of the physical and psychological world, but that this knowledge only becomes explicit with the emergence of other domains of expertise such as language. In this presentation, I would like to propose that the distinction between implicit and explicit knowledge can shed light on cognitive evolution, and in particular, the ways in which nonhuman animals are relatively constrained in their cognitive abilities. More specifically, I propose that in the domain of belief-desire psychology, nonhuman animals are limited to implicit understanding and that the evolutionary acquisition of language facilitated the transition from implicit to explicit understanding of the psychological world. To evaluate this hypothesis, I present results from experiments on nonhuman primates using tasks that tap into both implicit and explicit knowledge. Tests of implicit knowledge are modeled after those used in studies of prelinguistic infants. Specifically, we use the preferential looking time procedure to explore what nonhuman primates know about (i) invisible displacements, (ii) the causes of object contact and movement, (iii) the distinction between animate and inanimate objects, and (iv) objects with goals, desires and beliefs. We then use the results from tests of implicit knowledge to explore comparable problems at an explicit level. Results support the position that for some domain-specific systems of knowledge, nonhuman primates may be incapable of accessing, explicitly, what they know implicitly. Consequently, nonhuman primates may be frozen in an implicit state of understanding the world, in much the same way that some autistic children and prelinguistic human infants appear to be.


"Could Grammar Have Evolved?"
Terry Deacon, McLean Hospital, Harvard Medical School, and Boston University

Abstract:
One intention of many models of language evolution is to explain how grammatical analysis algorithms could have evolved to become innately represented in some form within the human mind (read brain). Though I think that the presumed necessity of postulating such a linguistic instinct to explain language learning is incorrect for other reasons (see Deacon, 1997), I intend in this paper to examine the claim that a biological evolutionary mechanism could produce such results. A biologically plausible model cannot merely postulate some big bang mutation that creates this unique and sophisticated set of interlinked algorithms by accident. It must instead either show how it is already present in other systems in a sort of preadaptationist argument (e.g. theories that grammar and syntax are entirely computationally parasitic on prior motor action programming systems) or show how it can be progressively instantiated in the brain's wetware by a form of Baldwinian evolution, which, by entirely Darwinian means, gradually allows patterns of language use to select for specific biological substrates that support more efficient performance.

Though I am sympathetic with the preadaptationist approach and the analogies between syntax and action planning are insightful, I do not think the strong form of this argument can succeed. The principal evidence against it is the fact that the corresponding preadaptation is present in quite elaborate form in most birds and mammals, without any correlated language ability. There are also well known linguistic challenges to many related functionalist accounts of grammar which cannot be entirely discounted.

Baldwinian models also provide very informative approaches and some have led to remarkable predictions about rates and constraints of this sort of coevolution. However, all Baldwinian models must make certain assumptions that I will show cannot be met, specifically in the case of language. These have to do with the requirement that for selection to take place the neural substrate of the linguistic computation in question must be essentially the same for all users in essentially all circumstances across long stretches of evolutionary time.

The principal problem is the identification of which sign stimuli will constitute which grammatical categories of language elements. If the assignment of these grammatical categories (or any surrogate categorical distinction) can be given by some universal invariants in the input or output stimuli, or in the computation itself, then there can be a universal neural substrate correlated with the grammatical computation enabling selection to take place. Such invariants are evident in the case of many aspects of language function (sound analysis, speech articulation, symbol learning, short-term verbal memory demands) and in much nonlanguage communication (alarm calls, honey bee foraging dances, human laughter and sobbing), but not with respect to the categorical distinctions that define the basic units of a grammar and syntax. Because these are symbolic rule systems applied to arbitrary and highly variable surface structures from phonemes to phrases, they are intrinsically not correlated with invariants present in the surface structure or use patterns, and rapidly change in very brief time spans. Grammatical and syntactic knowledge thus cannot evolve to be increasingly innate for essentially the same reasons that word reference cannot: the indirect nature of symbolic reference (that both are based on) necessarily undermines any potential invariance relationships. The interesting question is "What invariant relationships are left for explaining the unquestioned evolution of a uniquely human propensity for grammar and syntax?"


"Richard Rudner, Postmodern Separatism and The Symbolic Species"
David Rudner, Tufts University

Abstract:
The philosophy of science and the ethnography of science represent respectively prescriptive and descriptive approaches to a common topic. But neither is pure. Both disciplines poach in each other's waters. Both pass judgments about each other's competencies. Neither one can ignore the other. This paper will explore certain key elements in the post-modern ethnographic critique of science, a surprising convergence between these elements and parallel tenets in a pragmatic philosophy of science, and the prospects for a new science of humanity based on empirical findings reported in Terrence Deacon's book, The Symbolic Species. I begin this discussion by quoting a paper that my father, Richard Rudner, wrote twenty-five years ago. At the time, he had taken up the gauntlet against philosophers and social scientists who would deny the very idea of social science. At best, according to proponents of what my father termed the "separatist position," students of human behavior were engaged in what Peter Winch called "a misbegotten epistemology": an exercise in analysis, explication and definition, but an exercise that necessarily lacked any component of empirical verification. In this regard, the social sciences were held to be essentially distinct from the non-social or non-human or natural sciences.

The separatist arguments were diverse. Some were powerful and subtle; others not. My father addressed all of them and, as far as I'm concerned, demolished them. But his efforts predated the rise of postmodernism in the academy and, twenty-five years later, the separatists are back with a vengeance. Indeed, in an ironic twist, some of the more extreme separatist arguments now militate against the possibility of any science at all -- social or natural. In effect, these extremists have become proponents of a new unified view of human inquiry -- but a non-scientific view.

I believe my father would have welcomed the unification of systematic inquiry and even relished the irony in the separatists' self-inflicted alliance with the non-social sciences. I think he would also have welcomed many aspects of the postmodern critique of science. But I am also certain that he would have deconstructed that critique and made use of the deconstruction to forge a new vision of science that covered both human and non-human domains. My task in this paper will be to explore certain key elements in the postmodern critique of science, a surprising convergence between these elements and parallel tenets in a pragmatic philosophy of science, and the prospects for a new science of humanity based on empirical findings reported in Terrence Deacon's book, The Symbolic Species.

Contents:
The Gauntlet
Science as a Cultural System
Power, Hegemony and the Privileging of Science
Discursive Ascent
Indeterminacy, Plurality and Incoherence
The Scientist qua Scientist Makes Value Judgments
The Symbolic Species and the Evolution of Science



Symposium: The Developmental Systems Perspective in Philosophy of Biology

Organizer: Peter Godfrey-Smith, Stanford University
Chair: TBA

"Causal Symmetries and Developmental Systems Theory"
Peter Godfrey-Smith, Stanford University

Abstract:
Some of the central philosophical claims made by Oyama and other developmental systems theorists concern causation. In particular, it is argued that there is no way to apportion causal responsibility between genetic and environmental factors, when explaining particular biological traits. So developmental systems views oppose the idea that some traits are mostly genetic while others are mostly environmental.

The plausibility of this idea depends on how causal relations are understood. I will evaluate this part of the developmental systems view, making use of some new ideas about the nature of causation.


"Development, Culture, and the Units of Inheritance"
James R. Griesemer, University of California at Davis

Abstract:
Developmental systems theory (DST) expands the unit of replication from genes to whole systems of developmental resources, which DST interprets in terms of cycling developmental processes. Expansion seems required by DST's argument against privileging genes in evolutionary and developmental explanations of organic traits. DST and the expanded replicator brook no distinction between biological and cultural evolution. However, by endorsing a single expanded unit of inheritance and leaving the classical molecular notion of gene intact, DST achieves only a nominal reunification of heredity and development. I argue that an alternative conceptualization of inheritance denies the classical opposition of genetics and development while avoiding the singularity inherent in the replicator concept. It also yields a new unit -- "the reproducer" -- which genuinely integrates genetic and developmental perspectives. The reproducer concept articulates the non-separability of genetic and developmental roles in units of heredity, development, and evolution. DST reformulated in terms of reproducers rather than replicators leaves intact a conceptual basis for an empirically interesting distinction between cultural and biological evolution.


"Causal Democracy and Causal Contribution in DST"
Susan Oyama, John Jay College, CUNY

Abstract:
In reworking a variety of biological concepts, DST has made frequent use of parity of reasoning. We have done this to show that factors that have similar sorts of impact on a developing organism, for instance, tend nevertheless to be invested with quite different causal importance. We have made similar arguments about evolutionary processes. Together, these analyses have allowed DST not only to resolve some age-old muddles about the nature of development, but also to effect a long-delayed reintegration of developmental into evolutionary theory.

Our penchant for causal symmetry, however (or "causal democracy," as it has recently been termed), has sometimes been misunderstood. This paper shows that causal symmetry is neither a platitude about multiple influences nor a denial of useful distinctions, but a powerful way of exposing hidden assumptions and opening up traditional formulations to fruitful change.


"Development, Evolution, Adaptation"
Kim Sterelny, Victoria University, New Zealand

Abstract:
Developmental systems theorists argue that standard versions of neo-Darwinism need fundamental reorganization, not just repair. They argue, for example, that the notion of an innate trait depends on a dichotomous view of development that is unrescuably confused. They have argued that the Modern Synthesis downplayed the role of development and failed to integrate developmental biology within evolutionary biology. But they further suggest that developmental biology cannot be integrated within evolutionary biology without fundamentally rethinking neo-Darwinian ideas. In this paper I shall argue against these claims. The developmental systems theory critique of the received view of evolutionary theory has often been insightful, but these insights can, I argue, be incorporated within a modified version of that view.



Symposium: The Prospects for Presentism in Spacetime Theories

Organizer: Steven Savitt, The University of British Columbia
Chair: Lawrence Sklar, University of Michigan

"There's No Time Like the Present (in Minkowski Spacetime)"
Steven Savitt, The University of British Columbia

Abstract:
Mark Hinchliff concludes a recent paper, "The Puzzle of Change," with a section entitled "Is the Presentist Refuted by the Special Theory of Relativity?". His answer is "no." I respond by arguing that presentists face great difficulties in merely stating their position in Minkowski spacetime. I round up some likely candidates for the job and exhibit their deficiencies. Specifically I consider proposals that the surface of a point's past light cone (Godfrey Smith), its absolute elsewhere (Weingard), and only the point itself (Sklar?) be considered its present. I also consider the suggestion that the present be relativized to observers and reject it in (I think) a novel way.


"Special Relativity and the Present"
Mark Hinchliff, Reed College

Abstract:
This paper opens by responding to a set of stock objections to presentism (the view, roughly speaking, that only presently existing things exist) that arise from the special theory of relativity. It then considers two further STR-based objections that have received less attention. And it ends with a proposal for how to fit a view of time, presentism, having considerable intuitive support together with a scientific theory, STR, having considerable empirical support. Guiding the paper in many places is an analogy with the complexity and subtlety of fitting our intuitive view of the mind with our best scientific theories of the body.


"Is Presentism Worth Its Price?"
Craig Callender, The London School of Economics

Abstract:
This paper highlights some of the undesirable consequences of presentism and then asks whether the original reasons for believing in presentism are worth suffering these consequences. First, I review the state of presentism in a specially relativistic (SR) world. I argue that recent contributions which define "objective becoming" upon Minkowski spacetime (e.g., Dorato 1995, Hogarth and Clifton 1995, and Rakic 1997) are not really doing presentists a favor. By deflating tenses so that they are compatible with correctly representing the world as a 4-dimensional manifold, these recent contributions miss the point of presentism. With no escape through the deflated tenses program, I claim that we must agree with S. Savitt (see his contribution) that no plausible presentist position compatible with SR is in the offing.

I then briefly look at general relativity (GR), where matters are even worse. Here presentists must try to make sense of objective becoming in all sorts of pathological spacetimes, e.g., spacetimes without the ability to be temporally oriented, without global time functions, with closed timelike curves, and more. If SR makes one pessimistic about the chances of reconciling presentism with modern spacetime physics, GR causes one to despair.

One way out of all of these problems is to interpret relativity (SR and GR) instrumentally, à la J.S. Bell. Since this reading of relativity may be forced upon us by the nonlocalities of quantum mechanics, this move (for this reason) may be a respectable one. Understood either way, it is clear that presentism requires a radical re-thinking of contemporary spacetime physics. Given all the work one must do to sustain presentism, therefore, I would like to re-examine the original reasons for believing it in the second half of the paper. Are these reasons so compelling as to warrant radically re-conceiving contemporary physics? I think not. For even in its own territory, metaphysics, the case for presentism is far from convincing. Presentism just isn't worth its price.


"Presentism According to Relativity and Quantum Mechanics"
Simon Saunders, University of Oxford

Abstract:
Presentism is not, I claim, a viable metaphysical position, given the special or general theory of relativity as currently understood. For presentism is only plausible insofar as the present is taken to be an intersubjective spatial reality, rather than attaching to an individual person or worldline. From a 4-dimensional point of view, this is to specify a particular space-time foliation; but failing special matter distributions (and associated solutions to the Einstein equations), there is no such privileged foliation. An example, indeed, would have been the Lorentz ether; this is just the kind of object required by presentism (and, in general, by tensed theories of time). In renouncing the ether as a physical object, Einstein keyed the equations to space-time structure instead, and not to any special matter distribution. On the other hand we can surely define an effective dynamics, appropriate for the description of a single world-line. There is no claim that the resulting equations have universal applicability; they are as keyed to the specific as you like. But do they have to obey at least some of the fundamental symmetries? Would it matter if these equations turned out to be non-local, for instance?

The question has parallels in the case of quantum mechanics. In the De Broglie-Bohm theory there is likewise an effective dynamics (an effective state reduction); the same is true in the Everett relative-state theory. The latter, of course, purports to respect the space-time symmetries. Would it matter if these equations were non-local or non-covariant? I suggest that it would, and locality and covariance should function as a constraint even at this level. Locality and covariance cannot be renounced at the effective level, because, ultimately, that is the basis of the experience of each one of us. If these theories cannot deliver at that level, then any purported consistency of the fundamental (as opposed to effective) equations with relativity would seem little more than specious.


Commentator: Lawrence Sklar, University of Michigan

Symposium: Special Relativity and Ontology

Organizer: Yuri V. Balashov, University of Notre Dame
Chair: TBA

"Becoming and the Arrow of Causation"
Mauro Dorato, University of Rome

Abstract:
In a reply to Nicholas Maxwell, Stein (1991) has proved that Minkowski spacetime can leave room for the kind of indeterminateness required both by certain interpretations of quantum mechanics and by objective becoming. More recently, his result has been strengthened and extended to worldline-dependent becoming by Clifton and Hogarth (1995). In a recent paper, however, I have argued that by examining the consequences of outcome-dependence for the co-determinateness of spacelike-related events in Bell-type experiments, it turns out that the only becoming relation that is compatible with both causal and noncausal readings of nonlocality is the universal relation (Dorato 1996). Such a result will be discussed vis-à-vis recent claims to the effect that the relation of causation should be treated as time-symmetric (Price 1996), something which would rule out, on a different basis, Stein's attempt at defining objective becoming in terms of an asymmetric relation of causal connectibility. Other, more recent attempts (Rakic 1997) at extending Minkowski spacetime by introducing a non-invariant becoming relation will be examined and rejected on methodological grounds. Finally, aspects of the crucial problem of exporting current attempts at defining becoming in STR into GTR will be presented.


"QM and STR: Combining Quantum Mechanics With Relativity Theory"
Storrs McCall, McGill University

Abstract:
Stein (1991) and Clifton & Hogarth (1995) argue that Minkowski spacetime permits the notion of objective becoming to be defined, provided that an event may be said to have become only in relation to another event in its absolute future (Stein) or to a particular inertial worldline or observer (Clifton & Hogarth). Non-locality in quantum mechanics, on the other hand, would appear to call for a broader definition. Thus in the Bell-Aspect experiment if the outcome at A is h, the probability of h at B is p, whereas if the outcome at A is v the probability is p'. For the photon at B it is plausible to argue that the outcome at A has already become, or that the entangled quantum state has already partially collapsed, even though A lies outside the past light cone of B and hence has not 'become' in the Stein-Clifton-Hogarth sense.

A broader definition would make becoming and collapse relative not to particular events or worldlines but to coordinate frames. Thus in frame f the collapse at A affects the probability at B, while in frame f' B precedes A and the collapse at B affects the probability at A. Becoming and collapse are world-wide processes, though always relative to a frame, and causal influences in the EPR experiment are reciprocal rather than unidirectional. We end by showing that the apparently conflicting currents of world-wide becoming in different frames can be reconciled in a Lorentz-invariant branched spacetime structure defined in terms of formal constraints on an ordering relation — the Einstein causal relation.


"Relativity and Persistence"
Yuri Balashov, University of Notre Dame

Abstract:
The nature of persistence over time has been intensely debated in contemporary metaphysics. The two opposite views are widely known as "endurantism" (or "three-dimensionalism") and "perdurantism" ("four-dimensionalism"). According to the former, objects are extended in three spatial dimensions and persist through time by being wholly present at any moment at which they exist. On the rival account, objects are extended both in space and time and persist by having temporal parts, no part being present at more than one time.

Relativistic considerations seem highly relevant to this debate. But they have played little role in it so far. This paper seeks to remedy that situation. I argue that the four-dimensional ontology of perduring objects is required, and the rival endurantist ontology is ruled out, by the special theory of relativity (SR). My strategy is the following. I take the essential idea of endurantism, that objects are entirely present at single moments of time, and show that it commits one to unacceptable conclusions regarding coexistence, in the context of SR. I then propose and discuss a plausible and relativistically-invariant account of coexistence for perduring objects, which is free of these defects. The options available to the endurantist are as follows: (1) reject endurantism, (2) reject the idea that objects can coexist, (3) reject SR or (4) take an instrumentalist stance with regard to it. Of these options, (4) is surely the least implausible one for the endurantist. I briefly discuss what is at stake in taking it up.



Symposium: Kuhn, Cognitive Science, and Conceptual Change

Organizer: Nancy J. Nersessian, Georgia Institute of Technology
Chair: Nancy J. Nersessian, Georgia Institute of Technology

"Continuity Through Revolutions: A Frame-Based Account of Conceptual Change During Scientific Revolutions"
Xiang Chen, California Lutheran University, and Peter Barker, University of Oklahoma

Abstract:
According to Kuhn's original account, a revolutionary change in science is an episode in which one major scientific system replaces another in a discontinuous manner. The scientific change initiated by Copernicus in the 16th century has been used as a prototype of this kind of discontinuous scientific change. However, many recent historical studies indicate that this prototypical episode in fact exhibited strong continuity. We suggest that the difficulty in capturing continuity results, in part, from an unreflective use of the classical account of human concepts, which represents concepts by means of necessary and sufficient conditions.

A more satisfactory account has been developed in cognitive psychology over the last decade, using frames as the basic vehicle for representing concepts. Using frames to capture structural relations within concepts and the direct links between concept and taxonomy, we develop a model of conceptual change in science that more adequately reflects the current insight that episodes like the Copernican revolution are not always abrupt changes. We show that, when concepts are represented by frames, the transformation from one taxonomy to another can be achieved in a piecemeal fashion not preconditioned by a crisis stage, and that a new taxonomy can arise naturally out of the old frame instead of emerging separately from the existing conceptual system. This cognitive mechanism of continuous change demonstrates that anomaly and incommensurability are no longer liabilities, but play a constructive role in promoting the progress of science.


"Nomic Concepts, Frames, and Conceptual Change"
Hanne Andersen, University of Roskilde, Denmark, and Nancy J. Nersessian, Georgia Institute of Technology

Abstract:
In his last writings Thomas Kuhn introduced a distinction between normic concepts (concepts such as 'liquid', 'gas', and 'solid', which together form contrast sets) and nomic concepts (terms such as 'force', which form part of laws of nature, such as Newton's three laws of motion). We argue that family resemblance is the basis on which both normic and nomic concepts build, but that nomic concepts show an additional fine structure that cannot be explained by the family resemblance account alone. We develop a cognitive-historical account of concepts, drawing both on investigations from the cognitive sciences on how humans reason and represent concepts generally, and on fine-grained historical investigations of the cognitive practices employed in science. Our account shows that representing nomic concepts requires extending current cognitive science research on family resemblance concepts. We propose that nomic concepts can be represented in frame-like structures that capture such aspects of the concept in question as the ontological status, function, and causal power attributed to it in explanations of the problem situations in which it participates. Finally, we explore the implications of this account for conceptual change, specifically for the problem of the cognitive disequilibrium preceding conceptual change. On our account of concepts, conceptual change can be triggered by two different kinds of disequilibrium in the conceptual system: one related to changes in the similarity classes of problem situations, the other related to changes in the fine structure of the situation type.


"Kuhnian Puzzle Solving and Schema Theory"
Thomas J. Nickles, University of Nevada at Reno

Abstract:
In a recent article I showed how case-based reasoning (CBR) illuminates and corrects Kuhn's account of puzzle solving by direct modeling on exemplars. But CBR helps only up to a point, where it must yield to something like schema theory. After developing this theme and attending to some difficulties within schema theory, I consider the implications for methodology and philosophy of science. (1) A non-rules conception of inquiry obviously challenges traditional conceptions of method and of inquiry itself as a rationally intelligible, step-by-step procedure. (2) Kuhnian inquiry is rhetorical rather than logical in the senses that similarity, analogy, metaphor, etc., are the important cognitive relations; Kuhn's similarity metric is graded rather than all-or-nothing; and he introduces place (topos) as well as time (history) into his model of science. Rather than searching the space of all logical possibilities, scientists in a given historical period construct and explore local search spaces around the points they actually occupy. The resulting scientific development resembles the nearly continuous biological evolutionary pathways through design space more than the great leaps that have long tempted rationalistic philosophers. But what could methodology then look like?



Symposium: The Medical Consensus Conference

Organizer: Miriam Solomon, Temple University
Chair: TBA

"The Epistemic Contribution of Medical Consensus Conferences"
Paul Thagard, University of Waterloo

Abstract:
I recently completed a study of the development and acceptance of the bacterial theory of peptic ulcers (Thagard, forthcoming). A key event in the general acceptance of this theory, which was greeted with great skepticism when it was first proposed in 1983, was a Consensus Conference sponsored by NIH in 1994 that endorsed antibacterial treatment of ulcers. In the following three years, at least nine other countries held similar conferences and the American Digestive Health Foundation sponsored a second American consensus conference in February 1997. I attended the Canadian Helicobacter Pylori Consensus Conference that took place in Ottawa in April 1997. My talk will describe the aims and events of the conference, then focus on the general question: how do medical consensus conferences contribute to medical knowledge and practice? After a discussion of the nature of evidence-based medicine and the logic involved in decisions about treatment and testing, I extend and apply Alvin Goldman's (1992) epistemic standards to evaluate the scientific and practical benefits of consensus conferences.


Title: TBA
John H. Ferguson, Office of Medical Applications of Research, NIH

Abstract:
To be provided


"Are Consensus Conferences Political Instruments?"
Reidar Lie, University of Bergen, Norway

Abstract:
Consensus conferences have been adopted by health authorities in a number of countries as a method of technology assessment in medicine. The idea is that by having evidence from a variety of experts presented to a panel of neutral people, it is possible to reach a balanced decision concerning the appropriate use of a new technology. Critics have pointed out that the conclusion reached by the panel depends crucially on who is chosen as experts and panelists, and that it is possible to predict the decision if you know who these people are. There is also a problem when consensus conferences are organized in rapidly evolving fields of knowledge: the results of ongoing clinical trials may, for example, quickly make the decision of a consensus conference obsolete. For this reason one may suspect that consensus conferences are used as political instruments rather than as instruments of neutral assessment of new technologies. Two examples from Norway, the consensus conferences assessing ultrasound screening in pregnancy and estrogen replacement therapy, will be used to examine this claim.

Commentator: Miriam Solomon, Temple University



Symposium: Relations between Philosophy of Science and Sociology of Science in Central Europe, 1914-1945

Organizer: Alan Richardson, University of British Columbia
Chair: TBA

"On the Relations between Psychologism and Sociologism in Early Twentieth-Century German Philosophy"
Martin Kusch, Cambridge University, UK

Abstract:
In this paper I shall investigate one of the intellectual roots of the early sociology of knowledge in the German-speaking world (especially Fleck, Jerusalem, Mannheim, Scheler). This root is the debate over psychological naturalism. I shall show that the sociology of knowledge was informed and influenced both by psychologism and by the phenomenological critique of psychologism. I shall also focus on missed opportunities and on some important social influences upon the sociology of knowledge (e.g. one might say that a strong programme in the sociology of knowledge first surfaced in Scheler's war writings).


"Edgar Zilsel's 'Sociological Turn' in Philosophy"
Elisabeth Nemeth, University of Vienna, Austria

Abstract:
Edgar Zilsel's studies on the origins of modern science are well known to historians and sociologists of science. Philosophers, by contrast, have hardly taken note of his work. One reason might be that it is difficult to define his philosophical position on the periphery of the Vienna Circle; that is, it cannot be exclusively related to analytic philosophy or, for that matter, to any other twentieth-century philosophical tradition. It could also be that philosophers have never really appreciated the sociological and historical turn that Zilsel wanted to introduce in philosophy. That Zilsel is ignored as a philosopher is certainly unfortunate. First, his philosophical position is interesting in and of itself: his Kantian philosophy of science, in conjunction with a materialist theory of history and his very broad research interests in the history of philosophy, aesthetics, and science, brought forth his unique sort of empiricism. Second, Zilsel provides an interesting illustration of how a philosophical and epistemological concern with science can be fruitfully incorporated into a project of empirical research. My essay will sketch some of the characteristic features of this transposition of philosophy into sociology.


"Logical Empiricism and the Sociology of Knowledge: Rethinking the Relations of Philosophy and Sociology of Science"
Alan Richardson, University of British Columbia

Abstract:
Recent re-appraisals of logical empiricism have highlighted some hitherto seldom-noticed features of the received view of science and of related projects such as Popper's critical rationalism. A number of themes in this recent interpretative work invite curiosity about the relation of logical empiricism to the sociology of science, as practiced in the 1930s and 1940s and as practiced now. This essay will examine some themes in Carnap's philosophy of science with an eye toward their relations to historical and contemporary concerns with the social aspects of knowledge. The themes are: a concern for objectivity in the sense of intersubjectivity; pluralism, conventionalism, and voluntarism in the foundations of knowledge; practical principles for the adoption of linguistic frameworks; practical rationality; and the limits of epistemological explanation. It will explore features of Carnap's project in relation to Scheler's sociology of science (especially the rejection of transcendental philosophy of value as a constraint on practical rationality), Merton's normative sociology, and current projects in SSK, especially David Bloor's work on constructivism and conventionalism and Harry Collins's work on the experimenter's regress.


"Otto Neurath and the Sociology of Knowledge"
Thomas E. Uebel, The London School of Economics, UK

Abstract:
Neurath was the only one of the core members of the Vienna Circle who actively pursued an interest in the sociology of knowledge. This paper will consider several aspects of Neurath's involvement, partly to chronicle it historically and partly to assess its systematic place in his thought. It will show that, predisposed by his training in the social sciences, Neurath early on displayed an awareness of the ideological dimension of economic and social theories and, by the time of the Circle, of that of scientific metatheory and philosophy. Relevant texts include the Circle's manifesto as well as his critique of Mannheim. Moreover, his own version of naturalism, already controversial in the Circle itself, attempted to integrate sociological considerations within his epistemology. A brief comparison with some contemporary approaches will conclude the paper.



Symposium: Realism and Classification in the Social Sciences

Organizer: Michael Root, University of Minnesota
Chair: John Dupre, University of London

"How We Divide the World?"
Michael Root, University of Minnesota

Abstract:
Most systems of classification or kinds in the social sciences are invented rather than discovered, though, in most cases, invented by the subjects of the science and not by the social scientist. Some of these kinds are real and others only nominal. According to conventional wisdom, this cannot be, for a kind cannot be both constructed and real. I explain how kinds can be both and use the account to resolve a recent debate over the reality of race.


"The Politics of Taxonomy"
Richard W. Miller, Cornell University

Abstract:
In the natural sciences, taxonomies are justified by their role in facilitating the explanatory use of a unifying theory (for example, evolutionary theory in the case of biology). But even on the broadest understanding of "theory", no warranted unifying theory regulates the scientific explanation of any significant range of social phenomena. This distinctive disunity of the social sciences, along with the distinctive nature of social questions, sustains a different way of assessing classifications as marking genuine distinguishing properties, a form of taxonomic assessment in which moral evaluations and political contexts are directly relevant. I will show how such assessment should proceed in the judgment of the social scientific standing of classifications by ethnicity and nationality. Appealing to connections in natural science between classification, communication and interests in change, I will argue that the inherently political nature of social scientific taxonomy is not a barrier to objectivity. Rather, traditional questions about the boundary between facts and values become questions of the rational social division of labor in social inquiry.


"Race: Biological Reality or Social Construct?"
Robin Andreasen, University of Wisconsin, Madison

Abstract:
Race was once thought to be a real biological kind. Today, however, the dominant view is that biological races don't exist. Theorists are now debating the question of the social reality, or lack thereof, of race. I challenge the trend to reject the biological objectivity of race by arguing that cladism -- a branch of systematic biology that individuates taxa solely by appeal to common ancestry -- in conjunction with current work in human evolution, provides a new way to define race biologically. I then show how to reconcile the proposed biological conception with current social scientific theories about the nature of human racial categories. Specifically, I argue that social scientific conceptions of race and the cladistic concept are complementary; they should not be interpreted as being in competition.


"Local Realism and Global Arguments"
Harold Kincaid, University of Alabama at Birmingham

Abstract:
The paper looks at realism issues in the social sciences, taken as debates over whether basic predicates or kind terms in the social sciences should be interpreted realistically, i.e. as picking out mind-independent entities. The first half examines various general conceptual arguments given to show that social kinds should not be taken realistically. These arguments include (1) claims that the best explanation of poor empirical progress in the social sciences is that they lack natural kinds, (2) arguments based on the idea that social kinds don't have the right relation to the physical, (3) doubts based on the lack of any clear fact/value distinction in the social sciences, and (4) arguments pointing to the apparent multiplicity of competing ways of dividing up the social world. I argue that none of these considerations establishes that social predicates do not refer. At most, they provide important clues to potential obstacles. The second half of the paper then argues that the interesting issues are not ones that can be decided independently of substantive issues in the social sciences themselves, and that trying to decide realism issues in a global fashion for all social science is therefore misguided. I thus look in detail at some basic categories in economics -- in particular, the productive/nonproductive distinction, which goes to the heart of the discipline -- as a way of identifying some of the kinds of general empirical considerations that are at issue in deciding when social kinds should be taken realistically and when they should not. Crucial issues involve whether we can produce well-confirmed causal explanations, and I try to show how that is possible with social kinds despite the doubts that have been raised about them.



Symposium: The Structure of Scientific Theories Thirty Years On

Organizers: Steven French, Leeds University, and Nick Huggett, University of Illinois at Chicago
Chair: TBA

"Understanding Scientific Theories: An Assessment of Developments, 1969-1998"
Frederick Suppe, University of Maryland

Abstract:
The positivistic Received View construed scientific theories syntactically as axiomatic calculi in which theoretical terms were given a partial interpretation via correspondence rules connecting them to observation statements. This view came under increasing attack in the 1960s, and its death can be dated to March 26, 1969, when Carl G. Hempel opened the Illinois Symposium on Theories with an address repudiating the Received View and proposing an alternative analysis. This paper will assess what, with hindsight, seem the most important defects in the Received View; survey the main proposed successor analyses to the Received View — semantic reworkings of the Received View, various versions of the Semantic Conception of Theories, and the Structuralist Analysis of Theories; evaluate how well they avoid those defects; examine what new problems they face and where the most promising require further development or leave unanswered questions; and explore implications of recent work on models for understanding theories.


"Theories, Models and Structures: Thirty Years On"
Steven French, University of Leeds, and Newton da Costa, University of Sao Paulo

Abstract:
In The Structure of Scientific Theories, Suppe put forward the Semantic Approach as a promising alternative to the linguistic formulation approaches of the Received View and Weltanschauungen analyses. This approach has since entered the mainstream of philosophy of science through the work of van Fraassen, Giere and Suppe himself. Our aim in this paper is to survey the developments of the past thirty years from the perspective of Suppes' set-theoretic analysis and to suggest how recent work might overcome a series of objections to the Semantic Approach.


"The Emergence of Feminist Philosophy of Science"
Helen E. Longino, University of Minnesota

Abstract:
This paper will describe the emergence of feminist concerns in philosophy of science. Feminist philosophers and scientists have been concerned with explicitly sexist or androcentric aspects of scientific research programs and with the metaphoric gendering of various aspects of scientific practice and content. In the effort to identify and understand these features of the sciences, feminist scholars have advanced theses about the role of social values in scientific research, about the character of experiment, about evidential relations, and about science and power. In this they have made common cause with scholars in social and cultural studies of science, but with important differences. This paper will articulate the relationship of the work of feminists to other changes in philosophy of science in the last thirty years.


"Local Philosophies of Science"
Nick Huggett, University of Illinois at Chicago

Abstract:
One of the central characteristics of the received view was its tendency to treat questions in the philosophy of science as global: what is the nature of scientific theories? What are the units of significance in science? What is an explanation? And so on. Even problems in specific sciences were typically approached (for instance, by Reichenbach) with answers to the global questions in mind. Much contemporary work still seeks global answers, but there is also a widespread opinion that science has no essence, and that philosophical work should be concentrated on local questions that arise within particular sciences, independently of general theories about the nature of science. This idea can be identified, for instance, in the work of Nancy Cartwright, Arthur Fine, and Ian Hacking, as well as in contemporary philosophy of the individual sciences. This paper will discuss the "no essence" view and evaluate the prospects for local philosophical research programs.


"The Legacy of 'Weltanschauungen' Approaches in Philosophy of Science"
Alison Wylie, University of Western Ontario

Abstract:
When Suppe mapped the emerging terrain of post-positivist philosophy of science in 1968, what he called "Weltanschauungen" approaches figured prominently. As a loosely articulated family of views about science, these require that, in various senses and to varying degrees, the language and logic of science, the knowledge it produces, its preferred forms of practice, and even its defining epistemic ideals be contextualized. The central question dividing the advocates of such approaches is what constitutes context: is it restricted to the theoretical, conceptual assumptions that inform inquiry (aspects of context internal to science), or does it extend to various of the (external) human, social conditions that make the scientific enterprise possible?

In the intervening thirty years, "Weltanschauungen" approaches have proven to be the dominant legacy of the demise of positivism in science studies outside philosophy, and they have generated intense debate about the goals and strategies appropriate to philosophical studies of science. This is an important juncture at which to assess the implications of these developments for philosophy of science. Increasingly, even the most uncompromising constructivists acknowledge that sociologically reductive accounts of science are inadequate, while a growing number of philosophers recognize the need to broaden the scope of their contextualizing arguments. Although this rapprochement has been tentative and uneven, it has given rise to a fundamental reassessment of the traditional opposition between epistemic and social categories of analysis that is evident in recent philosophical work in a number of areas. I will focus here on arguments for rethinking objectivist ideals that recognize their pragmatic, socio-historic contingency. One central problem here, which feminist philosophers of science have taken particular initiative in addressing, is to explain how it is that the features of context typically seen as contaminants of scientific inquiry can serve as a source of critical insight that enhances the integrity, credibility and, indeed, the objectivity of scientific inquiry.



Symposium: Recent Advances in the Logic of Decision: A Symposium in Honor of Richard Jeffrey

Organizer: Allan Franklin, University of Colorado
Chair: Brian Skyrms, University of California, Irvine

"Why We Still Need the Logic of Decision"
James Joyce, University of Michigan

Abstract:
Richard Jeffrey's classic The Logic of Decision (LOD) defends a version of expected utility theory that requires agents to strive to secure evidence of desirable outcomes. As many philosophers have argued, Jeffrey's system fails as an account of rational choice because it is unable to properly distinguish the causal properties of acts from their merely evidential features. Only an appropriately causal utility theory can do the work of a true logic of decision. This notwithstanding, Jeffrey's system has two desirable traits that its causal competitors lack. Ethan Bolker's elegant representation theorem provides LOD with a theoretical foundation superior to that of any other utility theory. In addition, LOD assigns utilities to actions in a manner that does not depend on the way their consequences happen to be individuated. These advantages are so significant, in my view, that we should reject any decision theory that does not use Jeffrey's system as its underlying account of value. All value, I claim, is news value seen from an appropriate epistemic perspective, with different perspectives being appropriate for the evaluation of an agent's actions and for the evaluation of events that lie outside her control. The upshot of this is that causal decision theorists must employ Jeffrey's theory in order to properly formulate their own. I will show how this can be done consistently.

This dependence of causal decision theory on the system of LOD may seem to present a problem, however, since LOD is often compared unfavorably with other versions of decision theory because it cannot, except in recherché cases, extract unique probability/utility representations from facts about preferences (given a zero and unit for measuring utility). Fortunately, this shortcoming is not a serious one. Three points need to be made: First, all theories that claim to deliver unique representations resort to technical tricks of dubious merit. Second, reflection on the notion of expected utility theory suggests that unique representations should not be extractable from facts about preferences alone, but only from facts about preferences and beliefs. Third, unique representability can be obtained in LOD if we supplement the Jeffrey/Bolker axioms governing rational preference with independent constraints on rational (comparative) belief. This last point turns out to have significant theoretical implications because it lets us generalize Bolker's representation theorem so that it can be used as a foundation for a wide range of decision theories, causal decision theory among them.
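[For orientation, the partition-invariance Joyce appeals to -- that LOD's evaluation of an act does not depend on how its consequences are individuated -- can be sketched with Jeffrey's desirability equation (a standard formulation, supplied here rather than drawn from the abstract): for any proposition A with positive probability and any partition B_1, ..., B_n, $des(A) = \sum_i prob(B_i \mid A)\, des(A \wedge B_i)$, and the value of $des(A)$ is the same whichever partition is chosen, since it is fixed by the proposition A itself.]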


Title: TBA
Ethan Bolker, University of Massachusetts, Boston

Abstract:
To be supplied


"Conditionals and the Logic of Decision"
Richard Bradley, London School of Economics

Abstract:
Rational decision making is typically guided by practical reasoning of a hypothetical kind: the sort of reasoning an agent engages in when she supposes that she will perform some action or other under some set of conditions or other and then considers the consequences of her doing so under just these conditions. In reasoning hypothetically, agents show that they can take rational attitudes to the possibilities falling within the scope of their hypotheses: they can believe that if they jump out the window they will hurt themselves, can desire that if it snows tomorrow school will be cancelled, and can prefer that if beef is for dinner it be of French rather than British origin.

In this paper, I extend Richard Jeffrey's logic of decision by incorporating conditionals as objects of agents' attitudes. (A genuine augmentation is involved because I take conditionals to be non-propositional.) In the extended logic, probabilities of conditionals equal their conditional probabilities (in accordance with Adams' famous thesis), and the desirabilities of conditionals are weighted averages of the desirabilities of the mutually exclusive consequences consistent with their truth, where the weights are provided by the probabilities of the antecedently presupposed conditions.
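[In schematic form, transcribing the two conditions just described for orientation rather than reproducing the paper's own notation: Adams' thesis gives $P(A \to B) = P(B \mid A)$, and the desirability of a conditional takes the weighted-average form $Des(A \to B) = \sum_i w_i \, Des(C_i)$, where the $C_i$ are the mutually exclusive consequences consistent with the conditional's truth and each weight $w_i$ is the probability of the corresponding antecedently presupposed condition.]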

In similar fashion, Bolker's axioms are augmented to define the conditions satisfied by rational preferences for conditionals (roughly, a conditional a->b is preferable to another conditional c->d iff it is preferable to learn that b is the case if a is than to learn that d is the case if c is). Unlike in the Jeffrey-Bolker system, the assumption that these conditions are satisfied by an agent's preferences suffices for them to determine a unique representation of her degrees of belief and, up to a choice of scale, of her degrees of desire (consistent with the extended logic of decision). This shows, I suggest, that to obtain a complete and precise description of a decision-maker's state of mind it is necessary to identify the attitudes she takes to the kinds of possibilities expressed by conditionals.


"Subjective Thoughts on Subjective Probability: What did Dick Jeffrey know and when did he know it? and a Mathematician's Request for Help from the Philosophers"
Ethan Bolker, Department of Mathematics and Computer Science, University of Massachusetts, Boston

Abstract:
In this talk I provide some views of contemporary decision theory from my perspective as a professional mathematician, an amateur philosopher, and Dick Jeffrey's friend. I'll discuss what Dick and I learned from each other in Princeton in 1963, and what I've learned since as a decision theory observer. In my speculations on the usefulness of that theory I finesse the well-studied question "what is probability?" but will address the question "what is random?", with further thoughts on whether TCMITS will believe the answer. I conclude with my own interpretation of Jeffrey's radical probabilism.

Commentator: Richard Jeffrey, Princeton University