Tag Archives: Alan Turing

Lunch & Learn: Computing at Princeton: Short observations and tall stories

von Neumann and the MANIAC

Few people know that Princeton University’s association with computers and computing predates the ENIAC. Jon goes back to the days of John von Neumann, Oswald Veblen, Alan Turing, and John Tukey, and winds his way forward through the memorable days of the mainframes to 1985, when Ira Fuchs arrived to create the University’s high-speed network and begin the drive toward ubiquitous access and use. His many stories all have one thing in common… they all used to be funny.

About the speaker: 

Jon Edwards graduated from Princeton in 1975 with a degree in history and earned his PhD in Ethiopian economic history from Michigan State University. After a three-year stint as Review Editor of Byte Magazine, he returned to Princeton in 1986 to serve as Assistant to the VP for Computing and Information Technology. He served as Coordinator of OIT Institutional Communications and Outreach until his retirement on November 11, 2010.

Listen to the podcast (.mp3)
Download the presentation slides (.pdf)
Video clip, featuring Serge Goldstein, Director of OIT Academic Services (.mp4)

Lunch & Learn: Why Your Humble iPod May Be Holding the Biggest Mystery in All of Science with Bernard Chazelle

In 1965, Intel co-founder Gordon Moore predicted that the number of transistors placed on an integrated circuit would double approximately every two years. That prediction, notes Bernard Chazelle, professor of computer science at Princeton, has if anything underestimated progress over the past half century, and it should hold for at least another decade. Moore’s Law, he posits, is responsible for most of the desktop and hip-pocket wonders of the computer age, notably remarkable improvements in processing speed, memory capacity, and network bandwidth.
Moore’s Law correctly predicted revolutionary technological and social change in the late 20th century. But by 2020, if not before, as transistor features shrink to the width of a few atoms, Moore’s Law will have run its course. New technologies may yet replace the integrated circuit and extend Moore’s Law for decades; regardless, Chazelle argues that the years ahead will usher in the era of the “Algorithm,” a notion that, he contends, may prove to be the most disruptive and revolutionary scientific paradigm since quantum mechanics.
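
To get a feel for the doubling arithmetic behind Chazelle’s point, here is a minimal back-of-the-envelope sketch (not from the talk itself). It assumes a 1971 baseline of roughly 2,300 transistors, the figure commonly cited for the Intel 4004, and a doubling period of two years; the function name and constants are illustrative assumptions.

```python
# Back-of-the-envelope sketch of Moore's Law: transistor counts
# doubling every two years. The 1971 baseline (~2,300 transistors,
# the figure commonly cited for the Intel 4004) is an illustrative
# assumption, not a number taken from the talk.

BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Project the transistor count for a given year under Moore's Law."""
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASE_TRANSISTORS * 2 ** doublings

for year in (1971, 1985, 2000, 2010, 2020):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run as written, the projection for 2020 comes out on the order of tens of billions of transistors per chip, which is the right ballpark for the largest processors of that era and shows how quickly a fixed doubling period compounds.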
