
Lecture 10


Readings for Mar 28 (cf. https://data-ppf.slack.com/archives/C3SJQ5FH9/p1490445526198352): these focus on some of the professional history-of-computing literature on our syllabus.

Starting with Lecture 11, we'll be examining how the development of AI (among mathematicians, cognitive scientists, and the nascent field of computer science) comes to collide with academic and industrial statistics. Useful to that end will be a dive into two works exploring the intellectual and cultural turmoil of bringing explicitly computational techniques, e.g., data assimilation and simulation, into an existing scientific community.

  • Paul Edwards, A Vast Machine (MIT Press, 2010), chs. 5-7. NOTE: THIS IS A DIFFERENT ASSIGNMENT than the one on the syllabus.

  • Peter Galison, ["Computer simulations and the trading zone"](http://www.medientheorie.com/doc/galison_simulation.pdf), in The Disunity of Science: Boundaries, Contexts, and Power, ed. Peter Galison and David J. Stump (Stanford, CA: Stanford University Press, 1996), 118-157.

The Edwards is an amazing account of data accumulation, modeling, and related practices for climate data. It's also important and theoretically rich, and will give us more ways to think about data-driven sciences and their infrastructure.

The Galison is about Monte Carlo methods, and a great follow-up to what we've been doing.
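For anyone who hasn't met the term before, here is a minimal sketch of the Monte Carlo idea Galison discusses: using repeated random sampling to approximate a quantity, in this case π. The specifics (Python, the dart-throwing estimator, the `estimate_pi` name) are our illustration, not anything drawn from the reading.

```python
import random

def estimate_pi(n_samples: int = 1_000_000, seed: int = 0) -> float:
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # (area of quarter circle) / (area of unit square) = pi / 4
    return 4.0 * inside / n_samples

if __name__ == "__main__":
    print(estimate_pi())  # converges toward 3.14159... as n_samples grows
```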
