Scaling TensorFlow using tf.distribute / Taylor Robie.

Author
Robie, Taylor
Format
Video/Projected medium
Language
English
Edition
1st edition.
Published/Created
O'Reilly Media, Incorporated, 2020.
Description
1 online resource.

Details

Series
Safari Books Online (Series)
Summary note
TensorFlow's tf.distribute library helps you scale your model from a single GPU to multiple GPUs, and on to multiple machines, using simple APIs that require very few changes to your existing code. Join Taylor Robie and Priya Gupta (Google) to learn how to use tf.distribute to scale your machine learning model on a variety of hardware platforms, from commercial cloud platforms to dedicated hardware, along with tools and tips for getting the best scaling for your training in TensorFlow.

Prerequisite knowledge: familiarity with TensorFlow.

What you'll learn: how to distribute TensorFlow training on a variety of equipment using TensorFlow 2.0 best practices.
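As a minimal sketch of the API the summary describes: tf.distribute.MirroredStrategy replicates a Keras model across all GPUs visible on one machine (falling back to a single CPU replica if none are present), and the only change to existing code is building the model inside the strategy's scope. The model architecture, shapes, and training data below are illustrative placeholders, not taken from the course.

```python
import numpy as np
import tensorflow as tf

# MirroredStrategy performs synchronous data-parallel training across all
# local GPUs; with no GPUs it runs as a single replica on CPU.
strategy = tf.distribute.MirroredStrategy()
print("Number of replicas:", strategy.num_replicas_in_sync)

# Variables created inside strategy.scope() are mirrored on every replica.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(8,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="sgd", loss="mse")

# Training code is unchanged; tf.distribute splits each batch across replicas.
x = np.random.rand(64, 8).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```

Scaling to multiple machines follows the same pattern: swap MirroredStrategy for tf.distribute.MultiWorkerMirroredStrategy and leave the model-building and training code as-is.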
Issuing body
Made available through: Safari, an O'Reilly Media Company.
Source of description
Online resource; title from title screen (viewed February 28, 2020).
Participant(s)/Performer(s)
Presenter, Taylor Robie, Priya Gupta.
OCLC
1143018256
Other standard number
  • 0636920373636
Statement on language in description
Princeton University Library aims to describe library materials in a manner that is respectful to the individuals and communities who create, use, and are represented in the collections we manage.
