Official source code repository for "Deep Learning with Swift for TensorFlow" book.
📖 @ Amazon (Paperback / Kindle) / Apple Books / Apress / SpringerLink
Discover deep learning algorithms with Swift for TensorFlow. The Swift language was designed by Apple for optimized performance and ease of development, whereas the TensorFlow library was designed by Google for advanced machine learning research. Swift for TensorFlow combines both, with support for modern hardware accelerators and more. This book covers deep learning concepts from the fundamentals to advanced research, and it also introduces the Swift language for programming beginners. It is well suited to newcomers and experts alike, in both programming and deep learning. After reading this book, you should be able to program various state-of-the-art deep learning algorithms yourself.
The book covers foundational concepts of machine learning and introduces the mathematics required to understand deep learning. The Swift language is presented so that beginners can learn programming and researchers can easily transition to Swift for TensorFlow. You will understand the nuts and bolts of building and training neural networks, and you will build advanced algorithms.
- Understand deep learning concepts
- Program various deep learning algorithms
- Run the algorithms in the cloud
- Newcomers to programming and/or deep learning, and experienced developers.
- Experienced deep learning practitioners and researchers who want to work in user space rather than library space, using the same programming language, without compromising speed.
- 1.1 Machine Learning
  - 1.1.1 Experience
  - 1.1.2 Task
  - 1.1.3 Performance Measure
- 1.2 Machine Learning Paradigms
  - 1.2.1 Supervised Learning
  - 1.2.2 Unsupervised Learning
  - 1.2.3 Semi-supervised Learning
  - 1.2.4 Reinforcement Learning
- 1.3 Maximum Likelihood Estimation
- 1.4 Elements of a Machine Learning Algorithm
  - 1.4.1 Data
  - 1.4.2 Models
  - 1.4.3 Loss Function
  - 1.4.4 Optimizer
  - 1.4.5 Regularizer
- 1.5 Bias and Variance Trade-Off
- 1.6 Why Deep Learning?
  - 1.6.1 Curse of Dimensionality
  - 1.6.2 Invalid Smoothness Assumption
  - 1.6.3 Deep Learning Advantages
- 1.7 Summary
- 2.1 Linear Algebra
  - 2.1.1 Matrices and Vectors
  - 2.1.2 Unary Matrix Operations
  - 2.1.3 Binary Matrix Operations
  - 2.1.4 Norms
- 2.2 Probability Theory
  - 2.2.1 Joint Probability
  - 2.2.2 Conditional Probability
  - 2.2.3 Elementary Rules
  - 2.2.4 Chain Rule
  - 2.2.5 Bayes Rule
- 2.3 Differential Calculus
  - 2.3.1 Function
  - 2.3.2 Differentiation of Univariate Function
  - 2.3.3 Differentiation of Multivariate Function
  - 2.3.4 Differentiation of Vector Function
  - 2.3.5 Differentiation of Matrix Function
- 2.4 Summary
- 3.1 Swift is Everywhere
- 3.2 Swift for TensorFlow
- 3.3 Algorithmic Differentiation
  - 3.3.1 Programming Approaches
  - 3.3.2 Accumulation Modes
  - 3.3.3 Implementation Approaches
- 3.4 Swift Language
  - 3.4.1 Values
  - 3.4.2 Collections
  - 3.4.3 Control Flow
  - 3.4.4 Closures and Functions
  - 3.4.5 Custom Types
  - 3.4.6 Modern Features
  - 3.4.7 Error Handling
  - 3.4.8 Advanced Operators
  - 3.4.9 Differentiation
- 3.5 Python Interoperability
- 3.6 Summary
- 4.1 Tensor
- 4.2 Dataset Loading
  - 4.2.1 Epochs and Batches
- 4.3 Defining Model
  - 4.3.1 Neural Network Protocols
  - 4.3.2 Sequence of Layers
- 4.4 Training and Testing
  - 4.4.1 Checkpointing
  - 4.4.2 Model Optimization
  - 4.4.3 TrainingLoop
- 4.5 From Scratch for Research
  - 4.5.1 Layer
  - 4.5.2 Activation Function
  - 4.5.3 Loss Function
  - 4.5.4 Optimizer
- 4.6 Summary
- 5.1 Gradient-Based Optimization
  - 5.1.1 Maxima, Minima, and Saddle Points
  - 5.1.2 Input Optimization
  - 5.1.3 Parameters Optimization
- 5.2 Linear Models
  - 5.2.1 Regression
  - 5.2.2 Classification
- 5.3 Deep Neural Network
  - 5.3.1 Dense Neural Network
- 5.4 Activation Functions
  - 5.4.1 Sigmoid
  - 5.4.2 Softmax
  - 5.4.3 ReLU
  - 5.4.4 ELU
  - 5.4.5 Leaky ReLU
  - 5.4.6 SELU
- 5.5 Loss Functions
  - 5.5.1 Sum of Squares
  - 5.5.2 Sigmoid Cross-Entropy
  - 5.5.3 Softmax Cross-Entropy
- 5.6 Optimization
  - 5.6.1 Gradient Descent
  - 5.6.2 Momentum
- 5.7 Regularization
  - 5.7.1 Dataset
  - 5.7.2 Architecture
  - 5.7.3 Loss Function
  - 5.7.4 Optimization
- 5.8 Summary
- 6.1 Convolutional Neural Network
  - 6.1.1 Convolution Layer
  - 6.1.2 Dimensions Calculation
  - 6.1.3 Pooling Layer
  - 6.1.4 Upsampling
- 6.2 Prominent Features
  - 6.2.1 Local Connectivity
  - 6.2.2 Parameter Sharing
  - 6.2.3 Translation Equivariance
- 6.3 Shortcut Connection
- 6.4 Image Recognition
- 6.5 Conclusion
- First, install the latest Swift for TensorFlow toolchain.
- To run only the differentiation-specific code that does not require deep learning features (for instance, the source code of the Differentiable Programming chapter), simply install the latest Swift toolchain snapshot from Swift.org under the Trunk Development (main) section. (Future differentiation feature updates will be posted on that website and will go through the standard Swift Evolution process.)
- Then select the newly installed toolchain in Xcode from Preferences (Command + ,) > Components > Toolchains > (Swift for TensorFlow or Swift Development Snapshot).
This Swift package offers various executable targets, which are listed in Package.swift.
List of all executable targets:
AdvancedOperators
AlgorithmicDifferentiation
Arrays
Classes
Closures
ConditionalStatements
ControlTransfer
Dictionaries
Differentiation
EarlyExit
Enumerations
ErrorHandling
Extensions
Generics
GlobalFunctions
Loops
NestedFunctions
Protocols
PythonInteroperability
Sets
Structures
Values
EpochAndBatches
FromScratchForResearch
ModelDefinition
TensorExplanation
TrainingAndTesting
TrainingLoopExample
InputOptimization
LinearRegression
ParametersOptimization
PolynomialRegression
ImageRecognition
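For orientation, executable targets like those above are declared in the package manifest. The following is a simplified, hypothetical sketch of such a manifest, not the repository's actual Package.swift, which also declares its real dependencies and tools version:

```swift
// swift-tools-version:5.3
// Simplified sketch: each executable target maps to a source
// directory of the same name under Sources/.
import PackageDescription

let package = Package(
    name: "deep-learning-with-swift-for-tensorflow-book",
    targets: [
        .target(name: "AlgorithmicDifferentiation"),
        .target(name: "Differentiation"),
        .target(name: "TensorExplanation"),
        // ... one entry per executable target listed above
    ]
)
```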
Select an executable target (for instance, AlgorithmicDifferentiation) from the scheme drop-down menu and then click the Run button. The results should be printed in the console.
To execute any of them from the terminal, first enter the package's root directory:

```shell
cd deep-learning-with-swift-for-tensorflow-book
```

Then run the following command:

```shell
swift run AlgorithmicDifferentiation
```

The following output will be displayed:

```
expression value: 30.0
expression derivative: 28.0
```

Just replace AlgorithmicDifferentiation with any executable target you wish to run.
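To illustrate the idea behind that output, here is a minimal forward-mode algorithmic differentiation sketch in plain Swift using dual numbers; it needs no special toolchain. The expression f(x) = x³ + x evaluated at x = 3 is a hypothetical example chosen only because it reproduces the numbers above; the book's actual expression may differ.

```swift
// Minimal forward-mode algorithmic differentiation via dual numbers.
// A Dual carries a value together with its derivative, and every
// arithmetic operation propagates both.
struct Dual {
    var value: Double       // f(x)
    var derivative: Double  // f'(x)

    static func variable(_ x: Double) -> Dual {
        Dual(value: x, derivative: 1)  // seed: dx/dx = 1
    }
    static func + (a: Dual, b: Dual) -> Dual {
        Dual(value: a.value + b.value,
             derivative: a.derivative + b.derivative)
    }
    static func * (a: Dual, b: Dual) -> Dual {
        // Product rule: (ab)' = a'b + ab'
        Dual(value: a.value * b.value,
             derivative: a.derivative * b.value + a.value * b.derivative)
    }
}

// Hypothetical expression f(x) = x^3 + x, evaluated at x = 3:
// f(3) = 30, f'(x) = 3x^2 + 1, so f'(3) = 28.
let x = Dual.variable(3)
let f = x * x * x + x
print("expression value: \(f.value)")           // 30.0
print("expression derivative: \(f.derivative)") // 28.0
```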
Note: This Swift package was tested with Xcode 12.5 running on macOS 11.3. It should also work out of the box on Linux distributions.
📫 [email protected]