
TensorFlow for Deep Learning

Book Description


Learn how to solve challenging machine learning problems with TensorFlow, Google's revolutionary new system for deep learning. If you have some background with basic linear algebra and calculus, this practical book shows you how to build—and when to use—deep learning architectures. You'll learn how to design systems capable of detecting objects in images, understanding human speech, analyzing video, and predicting the properties of potential medicines.
TensorFlow for Deep Learning teaches concepts through practical examples and builds understanding of deep learning foundations from the ground up. It’s ideal for practicing developers comfortable with designing software systems, but not necessarily with creating learning systems. This book is also useful for scientists and other professionals who are comfortable with scripting, but not necessarily with designing learning algorithms.
  • Gain in-depth knowledge of the TensorFlow API and primitives.
  • Understand how to train and tune machine learning systems with TensorFlow on large datasets.
  • Learn how to use TensorFlow with convolutional networks, recurrent networks, LSTMs, and reinforcement learning.
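To give a flavor of the primitives covered in Chapter 2 (constant tensors, random sampling, addition and scaling, matrix operations, and broadcasting), here is a rough sketch of those operations. It is written in NumPy rather than the book's TensorFlow API, since the exact TensorFlow calls depend on the version used; TensorFlow's tensor operations follow the same conventions.

```python
import numpy as np

# Constant and random tensors
# (cf. "Initializing Constant Tensors" and "Sampling Random Tensors")
ones = np.ones((2, 3))                  # rank-2 tensor filled with ones
noise = np.random.normal(size=(2, 3))   # samples drawn from N(0, 1)

# Tensor addition and scaling
scaled = 2.0 * ones + noise

# Matrix operations: (2, 3) @ (3, 2) -> (2, 2), each entry sums 3 ones
product = ones @ np.ones((3, 2))

# Broadcasting: a shape-(3,) vector is stretched across each row of (2, 3)
row = np.array([1.0, 2.0, 3.0])
broadcast_sum = ones + row              # result has shape (2, 3)
```

The broadcasting rules sketched in the last two lines are the same ones introduced in Chapter 2's "Introduction to Broadcasting" section.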

Table of Contents

  1. Preface
    1. Conventions Used in This Book
    2. Using Code Examples
    3. O’Reilly Safari
    4. How to Contact Us
    5. Acknowledgments
  2. 1. Introduction to Deep Learning
    1. Machine Learning Eats Computer Science
    2. Deep Learning Primitives
      1. Fully Connected Layer
      2. Convolutional Layer
      3. Recurrent Neural Network Layers
      4. Long Short-Term Memory Cells
    3. Deep Learning Architectures
      1. LeNet
      2. AlexNet
      3. ResNet
      4. Neural Captioning Model
      5. Google Neural Machine Translation
      6. One-Shot Models
      7. AlphaGo
      8. Generative Adversarial Networks
      9. Neural Turing Machines
    4. Deep Learning Frameworks
      1. Limitations of TensorFlow
    5. Review
  3. 2. Introduction to TensorFlow Primitives
    1. Introducing Tensors
      1. Scalars, Vectors, and Matrices
      2. Matrix Mathematics
      3. Tensors
      4. Tensors in Physics
      5. Mathematical Asides
    2. Basic Computations in TensorFlow
      1. Installing TensorFlow and Getting Started
      2. Initializing Constant Tensors
      3. Sampling Random Tensors
      4. Tensor Addition and Scaling
      5. Matrix Operations
      6. Tensor Types
      7. Tensor Shape Manipulations
      8. Introduction to Broadcasting
    3. Imperative and Declarative Programming
      1. TensorFlow Graphs
      2. TensorFlow Sessions
      3. TensorFlow Variables
    4. Review
  4. 3. Linear and Logistic Regression with TensorFlow
    1. Mathematical Review
      1. Functions and Differentiability
      2. Loss Functions
      3. Gradient Descent
      4. Automatic Differentiation Systems
    2. Learning with TensorFlow
      1. Creating Toy Datasets
      2. New TensorFlow Concepts
    3. Training Linear and Logistic Models in TensorFlow
      1. Linear Regression in TensorFlow
      2. Logistic Regression in TensorFlow
    4. Review
  5. 4. Fully Connected Deep Networks
    1. What Is a Fully Connected Deep Network?
    2. “Neurons” in Fully Connected Networks
      1. Learning Fully Connected Networks with Backpropagation
      2. Universal Convergence Theorem
      3. Why Deep Networks?
    3. Training Fully Connected Neural Networks
      1. Learnable Representations
      2. Activations
      3. Fully Connected Networks Memorize
      4. Regularization
      5. Training Fully Connected Networks
    4. Implementation in TensorFlow
      1. Installing DeepChem
      2. Tox21 Dataset
      3. Accepting Minibatches of Placeholders
      4. Implementing a Hidden Layer
      5. Adding Dropout to a Hidden Layer
      6. Implementing Minibatching
      7. Evaluating Model Accuracy
      8. Using TensorBoard to Track Model Convergence
    5. Review
  6. 5. Hyperparameter Optimization
    1. Model Evaluation and Hyperparameter Optimization
    2. Metrics, Metrics, Metrics
      1. Binary Classification Metrics
      2. Multiclass Classification Metrics
      3. Regression Metrics
    3. Hyperparameter Optimization Algorithms
      1. Setting Up a Baseline
      2. Graduate Student Descent
      3. Grid Search
      4. Random Hyperparameter Search
      5. Challenge for the Reader
    4. Review
  7. 6. Convolutional Neural Networks
    1. Introduction to Convolutional Architectures
      1. Local Receptive Fields
      2. Convolutional Kernels
      3. Pooling Layers
      4. Constructing Convolutional Networks
      5. Dilated Convolutions
    2. Applications of Convolutional Networks
      1. Object Detection and Localization
      2. Image Segmentation
      3. Graph Convolutions
      4. Generating Images with Variational Autoencoders
    3. Training a Convolutional Network in TensorFlow
      1. The MNIST Dataset
      2. Loading MNIST
      3. TensorFlow Convolutional Primitives
      4. The Convolutional Architecture
      5. Evaluating Trained Models
      6. Challenge for the Reader
    4. Review
  8. 7. Recurrent Neural Networks
    1. Overview of Recurrent Architectures
    2. Recurrent Cells
      1. Long Short-Term Memory (LSTM)
      2. Gated Recurrent Units (GRU)
    3. Applications of Recurrent Models
      1. Sampling from Recurrent Networks
      2. Seq2seq Models
    4. Neural Turing Machines
    5. Working with Recurrent Neural Networks in Practice
    6. Processing the Penn Treebank Corpus
      1. Code for Preprocessing
      2. Loading Data into TensorFlow
      3. The Basic Recurrent Architecture
      4. Challenge for the Reader
    7. Review
  9. 8. Reinforcement Learning
    1. Markov Decision Processes
    2. Reinforcement Learning Algorithms
      1. Q-Learning
      2. Policy Learning
      3. Asynchronous Training
    3. Limits of Reinforcement Learning
    4. Playing Tic-Tac-Toe
      1. Object Orientation
      2. Abstract Environment
      3. Tic-Tac-Toe Environment
      4. The Layer Abstraction
      5. Defining a Graph of Layers
    5. The A3C Algorithm
      1. The A3C Loss Function
      2. Defining Workers
      3. Training the Policy
      4. Challenge for the Reader
    6. Review
  10. 9. Training Large Deep Networks
    1. Custom Hardware for Deep Networks
    2. CPU Training
      1. GPU Training
      2. Tensor Processing Units
      3. Field Programmable Gate Arrays
      4. Neuromorphic Chips
    3. Distributed Deep Network Training
      1. Data Parallelism
      2. Model Parallelism
    4. Data Parallel Training with Multiple GPUs on CIFAR-10
      1. Downloading and Loading the Data
      2. Deep Dive on the Architecture
      3. Training on Multiple GPUs
      4. Challenge for the Reader
    5. Review
  11. 10. The Future of Deep Learning
    1. Deep Learning Outside the Tech Industry
      1. Deep Learning in the Pharmaceutical Industry
      2. Deep Learning in Law
      3. Deep Learning for Robotics
      4. Deep Learning in Agriculture
    2. Using Deep Learning Ethically
    3. Is Artificial General Intelligence Imminent?
    4. Where to Go from Here?
  12. Index
