Programming PyTorch for Deep Learning: Creating and Deploying Deep Learning Applications (English-Language Reprint Edition)

  • Author: Ian Pointer (US) | Editor: Zhang Ye
  • Publisher: Southeast University Press
  • ISBN: 9787564188795
  • Publication date: 2020/06/01
  • Binding: Paperback
  • Pages: 200
List price: RMB 79      Sale price:

Synopsis
    Take the next step into deep learning, the machine learning method that is transforming the world around us. With this practical reference, you will use Facebook's open source PyTorch framework to quickly get up to speed on the key ideas of deep learning and gain the latest skills you need to create your own neural networks.
    Ian Pointer first shows you how to set up PyTorch in a cloud-based environment, then walks you through building neural network architectures that operate on images, sound, text, and more, with a deep dive into each element. He also covers the critical concepts of applying transfer learning to images, debugging models, and running PyTorch in production.

About the Author
Ian Pointer (US) | Editor: Zhang Ye
    Ian Pointer is a data engineer who specializes in machine learning solutions (including deep learning techniques) for multiple Fortune 100 clients. Ian currently works at Lucidworks.

Table of Contents
Preface
1. Getting Started with PyTorch
Building a Custom Deep Learning Machine
GPU
CPU/Motherboard
RAM
Storage
Deep Learning in the Cloud
Google Colaboratory
Cloud Providers
Which Cloud Provider Should I Use?
Using Jupyter Notebook
Installing PyTorch from Scratch
Download CUDA
Anaconda
Finally, PyTorch! (and Jupyter Notebook)
Tensors
Tensor Operations
Tensor Broadcasting
Conclusion
Further Reading
2. Image Classification with PyTorch
Our Classification Problem
Traditional Challenges
But First, Data
PyTorch and Data Loaders
Building a Training Dataset
Building Validation and Test Datasets
Finally, a Neural Network!
Activation Functions
Creating a Network
Loss Functions
Optimizing
Training
Making It Work on the GPU
Putting It All Together
Making Predictions
Model Saving
Conclusion
Further Reading
3. Convolutional Neural Networks
Our First Convolutional Model
Convolutions
Pooling
Dropout
History of CNN Architectures
AlexNet
Inception/GoogLeNet
VGG
ResNet

Other Architectures Are Available!
Using Pretrained Models in PyTorch
Examining a Model's Structure
BatchNorm
Which Model Should You Use?
One-Stop Shopping for Models: PyTorch Hub
Conclusion
Further Reading
4. Transfer Learning and Other Tricks
Transfer Learning with ResNet
Finding That Learning Rate
Differential Learning Rates
Data Augmentation
Torchvision Transforms
Color Spaces and Lambda Transforms
Custom Transform Classes
Start Small and Get Bigger!
Ensembles
Conclusion
Further Reading
5. Text Classification
Recurrent Neural Networks
Long Short-Term Memory Networks
Gated Recurrent Units
biLSTM
Embeddings
torchtext
Getting Our Data: Tweets!
Defining Fields
Building a Vocabulary
Creating Our Model
Updating the Training Loop
Classifying Tweets
Data Augmentation
Random Insertion
Random Deletion
Random Swap
Back Translation
Augmentation and torchtext
Transfer Learning?
Conclusion
Further Reading
6. A Journey into Sound
Sound
The ESC-50 Dataset
Obtaining the Dataset
Playing Audio in Jupyter
Exploring ESC-50
SoX and LibROSA
torchaudio

Building an ESC-50 Dataset
A CNN Model for ESC-50
This Frequency Is My Universe
Mel Spectrograms
A New Dataset
A Wild ResNet Appears
Finding a Learning Rate
Audio Data Augmentation
torchaudio Transforms
SoX Effect Chains
SpecAugment
Further Experiments
Conclusion
Further Reading
7. Debugging PyTorch Models
It's 3 a.m. What Is Your Data Doing?
TensorBoard
Installing TensorBoard
Sending Data to TensorBoard
PyTorch Hooks
Plotting Mean and Standard Deviation
Class Activation Mapping
Flame Graphs
Installing py-spy
Reading Flame Graphs
Fixing a Slow Transformation
Debugging GPU Issues
Checking Your GPU
Gradient Checkpointing
Conclusion
Further Reading
8. PyTorch in Production
Model Serving
Building a Flask Service
Setting Up the Model Parameters
Building the Docker Container
Local Versus Cloud Storage
Logging and Telemetry
Deploying on Kubernetes
Setting Up on Google Kubernetes Engine
Creating a k8s Cluster
Scaling Services
Updates and Cleaning Up
TorchScript
Tracing
Scripting
TorchScript Limitations
Working with libTorch
Obtaining libTorch and Hello World
Importing a TorchScript Model

Conclusion
Further Reading
9. PyTorch in the Wild
Data Augmentation: Mixed and Smoothed
mixup
Label Smoothing
Computer, Enhance!
Introduction to Super-Resolution
An Introduction to GANs
The Forger and the Critic
Training a GAN
The Dangers of Mode Collapse
ESRGAN
Further Adventures in Image Detection
Object Detection
Faster R-CNN and Mask R-CNN
Adversarial Samples
Black-Box Attacks
Defending Against Adversarial Attacks
More Than Meets the Eye: The Transformer Architecture
Paying Attention
Attention Is All You Need
BERT
FastBERT
GPT-2
Generating Text with GPT-2
ULMFiT
What to Use?
Conclusion
Further Reading
Index
