Eric Winsberg's Online Papers


Below are some papers I have selected to put online. Most are published (see my CV to find out where). Comments are always welcome.


Sanctioning Models: The Epistemology of Simulation (1999)

In its reconstruction of scientific practice, philosophy of science has traditionally placed scientific theories in a central role, and has reduced the problem of mediating between theories and the world to formal considerations. Many applications of scientific theories, however, involve complex mathematical models whose constitutive equations are analytically unsolvable. The study of these applications often consists in developing representations of the underlying physics on a computer, and using the techniques of computer simulation in order to learn about the behavior of these systems. In many instances, these computer simulations are not simple number-crunching techniques. They involve a complex chain of inferences that serve to transform theoretical structures into specific concrete knowledge of physical systems. In this paper I argue that this process of transformation has its own epistemology. I also argue that this kind of epistemology is unfamiliar to most philosophy of science, which has traditionally concerned itself with the justification of theories, not with their application. Finally, I urge that the nature of this epistemology suggests that the end results of some simulations do not bear a simple, straightforward relation to the theories from which they stem.

Simulated Experiments: Methodology for a Virtual World (2003)

This paper examines the relationship between simulation and experiment. Many discussions of simulation, and indeed the term “numerical experiments,” invoke a strong metaphor of experimentation. On the other hand, many simulations begin as attempts to apply scientific theories. This has led many to characterize simulation as lying between theory and experiment. The aim of the paper is to try to reconcile these two points of view—to understand what methodological and epistemological features simulation has in common with experimentation, while at the same time keeping a keen eye on simulation’s ancestry as a form of scientific theorizing. In so doing, it seeks to apply some of the insights of recent work on the philosophy of experiment to an aspect of theorizing that is of growing philosophical interest: the construction of local models.

Quantum Life: Interaction, Entanglement, and Separation (2003)

(together with Arthur Fine)

When quantum systems interact they become entangled in a way that many regard as inseparable, the sign of a puzzling quantum holism. This paper challenges a lemma in the central argument that is said to lead to this conclusion. By explicit constructions of local models it shows how to separate entangled systems, treating the “bad” cases that violate the Bell inequalities. It follows from these constructions that locality, even in these bad cases, does not entail holism. The paper then develops a line suggested by Einstein and argues that, independently of these constructions and of considerations about locality, all of the entangled cases – good and bad – are separable.


Can Conditioning on the Past Hypothesis Militate Against the Reversibility Objections? (2004)

In his recent book, Time and Chance, David Albert claims that by positing that there is a uniform probability distribution defined, on the standard measure, over the space of microscopic states that are compatible with both the current macrocondition of the world, and with what he calls the “past hypothesis”, we can explain the time asymmetry of all of the thermodynamic behavior in the world. The principal purpose of this paper is to dispute this claim. I argue that Albert’s proposal fails in his stated goal—to show how to use the time-reversible dynamics of Newtonian physics to “underwrite the actual content of our thermodynamic experience” (Albert 2000, 159). Albert’s proposal can satisfactorily explain why the overall entropy of the universe as a whole is increasing, but it does not and cannot explain the increasing entropy of relatively small, relatively short-lived systems in energetic isolation without making use of a principle that leads to reversibility objections.

Laws and Statistical Mechanics (2005)

This paper explores some connections between competing conceptions of scientific laws on the one hand, and a problem in the foundations of statistical mechanics on the other. I examine two proposals for understanding the time asymmetry of thermodynamic phenomena: David Albert’s recent proposal and a proposal that I outline based on Hans Reichenbach’s “branch systems”. I sketch an argument against the former, and mount a defense of the latter by showing how to accommodate statistical mechanics to recent developments in the philosophy of scientific laws.

Models of Success vs. the Success of Models: Reliability Without Truth (2006)

In computer simulations of physical systems, the construction of models is guided, but not determined, by theory. At the same time, simulation models are often constructed precisely because data are sparse. They are meant to replace experiments and observations as sources of data about the world; hence they cannot be evaluated simply by being compared to the world. So what can be the source of credibility for simulation models? I argue that the credibility of a simulation model comes not only from the credentials supplied to it by the governing theory, but also from the antecedently established credentials of the model-building techniques employed by the simulationists. In other words, there are certain sorts of model-building techniques which are taken, in and of themselves, to be reliable. Some of these model-building techniques, moreover, incorporate what are sometimes called “falsifications.” These are contrary-to-fact principles that are included in a simulation model and whose inclusion is taken to increase the reliability of the results. The example of a falsification that I consider, called artificial viscosity, is in widespread use in computational fluid dynamics. Artificial viscosity, I argue, is a principle that is successfully and reliably used across a wide domain of fluid dynamical applications, but it does not offer even an approximately “realistic” or true account of fluids. Artificial viscosity, therefore, is a counter-example to the principle that success implies truth—a principle at the foundation of scientific realism. It is an example of reliability without truth.

Handshaking Your Way to the Top: Simulation at the Nano-scale (2007)

Should we philosophers of science, those of us who are interested in methodological, epistemological, and metaphysical issues in the sciences, be paying any attention to developments in nano-scale research? Undoubtedly, it is too early to tell for sure. But arguably, the right prima facie intuition to have is that we should. The project of this paper is to look and see, and to try to give a more definitive answer. I look at techniques of model-building, especially methods of computer simulation, that are employed in nanoscience. What I find is that it does indeed look as if there are good prospects for philosophers of science to learn novel lessons, especially about the relations between different theories, and between theories and their models, by paying attention to developments in simulation at the nano-scale.

Laws and Chances in Statistical Mechanics (forthcoming)

Statistical mechanics involves probabilities. At the same time, most approaches to the foundations of statistical mechanics—programs whose goal is to understand the macroscopic laws of thermal physics from the point of view of microphysics—are classical; they begin with the assumption that the underlying dynamical laws that govern the microscopic furniture of the world are (or can without loss of generality be treated as) deterministic. This raises some potential puzzles about the proper interpretation of these probabilities.

A Tale of Two Methods (forthcoming)

Simulations (both digital and analog) and experiments share many features. But what essential features distinguish them? I discuss two proposals in the literature. On one proposal, experiments investigate nature directly, while simulations merely investigate models. On another proposal, simulations differ from experiments in that simulationists manipulate objects that bear only a formal (rather than material) similarity to the targets of their investigations. Both of these proposals are rejected. I argue that simulations fundamentally differ from experiments with regard to the background knowledge that is invoked to argue for the “external validity” of the investigation.

Holism and Entrenchment in Climate Models (WIP)

(together with Johannes Lenhard)


We argue for two central claims about complex computational models—with a particular emphasis on models of the earth’s climate. The first claim is about holism. We argue that recent efforts in the sphere of climate model inter-comparison reveal that modern, state-of-the-art climate models are what we call “analytically impenetrable.” The second claim is about entrenchment. In particular, we argue that entrenchment can be identified as one of the principal causes of holism. Climate models are, in interesting ways, products of their specific histories.