
The recent, remarkable successes of machine learning are due in part to the invention of machine learning methods (especially for deep learning), to the collection of datasets for tackling problems in many fields, and to the availability of powerful hardware, including CPUs, GPUs, and custom-designed ASICs. Software systems, too, are central to this progress.

This talk suggests that it is instructive and fruitful to think of these software systems from a programming-language perspective. It focuses on TensorFlow, a recent system for machine learning that operates at large scale and in heterogeneous environments. TensorFlow owes its generality to its programmability. In TensorFlow, models for machine learning are assembled from primitive operations by function composition and other simple, familiar constructs. Other aspects of TensorFlow, such as its support for automatic differentiation and its memory management, are less common in mainstream programming languages. TensorFlow enables the development of a wide variety of models, in both production and research. As examples, this talk briefly describes some recent research applications related to programming.
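The ideas sketched above, assembling models from primitive operations by function composition, and differentiating them automatically, can be illustrated with a minimal sketch in plain Python. This is an illustrative toy, not TensorFlow's actual API or implementation: the class and function names (`Var`, `add`, `mul`) are invented for this example.

```python
# Minimal sketch of a TensorFlow-style system: primitive operations are
# composed into a computation graph, and gradients are obtained by
# reverse-mode automatic differentiation. Illustrative only.

class Var:
    """A value in the computation graph, with a gradient slot."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # Each parent is an (input Var, local derivative) pair.
        self.parents = parents

    def backward(self, seed=1.0):
        # Propagate the incoming gradient back through recorded operations.
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

# Primitive operations record their inputs and local derivatives.
def add(a, b):
    return Var(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Var(a.value * b.value, [(a, b.value), (b, a.value)])

# Models are assembled by ordinary function composition.
x = Var(3.0)
y = mul(x, x)   # y = x^2
z = add(y, x)   # z = x^2 + x
z.backward()
print(z.value)  # 12.0
print(x.grad)   # dz/dx = 2x + 1 = 7.0
```

Real systems such as TensorFlow additionally optimize the graph, manage memory, and distribute execution across heterogeneous devices, but the compositional structure is the same.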

This talk is based on joint work with many people, primarily at Google Brain. More information on TensorFlow is available at tensorflow.org.

Mon 19 Jun

pldi-2017-keynotes
17:55 - 18:50: PLDI Invited Speakers - Joint Keynote – Martin Abadi at Auditorium, Vertex Building
Talk