Frederic Wieber
conference paper description
This paper examines, as a case study, some modeling and simulation practices in protein chemistry. In this field, theorists try to grasp protein objects by constructing models of their structures and by simulating their dynamical properties. The kind of models they construct, and the necessity of performing simulations, are linked to the molecular complexity of proteins. Two main types of problems emerge from this complexity. First, experimental problems arise when scientists want to perform on (and adapt to) proteins certain physical experiments (X-ray crystallography, NMR, neutron scattering…) and try to interpret the experimental data thus produced. Second, theoretical problems of computational complexity arise with the application of quantum mechanics to these exceedingly large objects. While the first type of problem has historically called for the development of theoretical approaches (in order to refine experimental data and to gain access to certain properties of proteins that were very difficult to obtain experimentally), the second type, which is common to chemistry as a whole, has led protein scientists to develop a special kind of model, the so-called “empirical models” (in contrast to “ab initio calculations”). After 1960, the massive use of computers helped them construct these models and extend their use. In the 1970s, these computerized models were incorporated into a simulation method termed “Molecular Dynamics” (MD), developed in statistical physics. This has led to greater insight into experimentally inaccessible dynamical properties of proteins. The computer, as a technological instrument, has strongly influenced the form of the models that have been constructed. Its limited computational capacities have also influenced the way the MD simulation method has been applied in the case of proteins. This is why I refer to these modeling and simulation activities as “theoretical technologies”.
The development of these theoretical technologies must be understood in an “experimental” setting. To show this, I will first analyze the nature of the models actually constructed, in order to emphasize the work of assembling and estimating experimental data (necessitated by the experimental problems mentioned earlier) that this modeling activity requires. I will then examine the adaptation of the MD simulation method to proteins. For this adaptation, specialists in the MD method (from statistical physics) collaborated with protein theorists, notably, in Europe, within a particular institution. This collaboration was made possible by that institution's computing facilities. I will thus emphasize the way access to computers has fostered practical collaboration among scientists, and the importance of the tacit dimensions of simulation production during this period of first developments. A parallel between experimental practices (around big instruments) and simulation practices (around supercomputers) can then be proposed. While these two main lines of analysis indicate the potentially hybrid nature (between theory and experiment) of these modeling and simulation activities, they will also show how the technological nature of these practices affects the status of the results they produce. Finally, the impact of these technologies on the nature and status of experimental results in protein chemistry will be discussed.
