You’ve run your first torch demo and acquainted yourself with the main actors (tensors, modules, and optimizers)? Then you’re ready to dive into applied examples. The list of examples keeps growing as the ecosystem evolves. What area are you interested in?
Bird classification is a multi-class classification task. In addition to being a blueprint for doing classification with torch, this introductory example shows how to load data, make use of pre-trained models, and benefit from learning rate schedulers.
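As a taste of what that example covers, here is a minimal sketch of the pattern: swap the head of a pretrained network for your own classifier and attach a learning rate scheduler (the class count and scheduler settings below are illustrative, not taken from the article):

```r
library(torch)
library(torchvision)

# Load a pretrained ResNet and replace its classification head
# (10 classes here is an arbitrary placeholder).
model <- model_resnet18(pretrained = TRUE)
model$fc <- nn_linear(model$fc$in_features, 10)

optimizer <- optim_adam(model$parameters, lr = 0.01)

# One-cycle learning rate scheduler, stepped once per batch.
scheduler <- lr_one_cycle(
  optimizer, max_lr = 0.01,
  steps_per_epoch = 100, epochs = 5
)
```

In the training loop, `scheduler$step()` is called after each `optimizer$step()`.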
Brain image segmentation builds a U-Net from scratch. This intermediate-level example is a great introduction to building your own modules, as well as custom datasets that perform data preprocessing and data augmentation for computer vision.
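The custom-dataset pattern that the example relies on can be sketched as follows; the field names and the `augment` callback are illustrative assumptions, not the article's actual code:

```r
library(torch)

# A dataset() that pairs images with masks and applies
# augmentation on the fly (names are illustrative).
segmentation_dataset <- dataset(
  name = "segmentation_dataset",
  initialize = function(images, masks, augment = NULL) {
    self$images <- images
    self$masks <- masks
    self$augment <- augment
  },
  .getitem = function(i) {
    x <- self$images[[i]]
    y <- self$masks[[i]]
    if (!is.null(self$augment)) {
      # e.g. a random flip applied to image and mask together
      xy <- self$augment(x, y)
      x <- xy[[1]]
      y <- xy[[2]]
    }
    list(x = x, y = y)
  },
  .length = function() length(self$images)
)
```

Instances of such a dataset are then wrapped in a `dataloader()` for batched training.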
Labeling poisonous mushrooms is a first introduction to handling a mix of numerical and categorical data, using embedding modules for the latter. It also provides a blueprint for creating torch models from scratch.
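The core idea, embedding categorical features and concatenating them with numeric ones, can be sketched like this (dimensions and names are illustrative assumptions):

```r
library(torch)

# Embed one categorical feature, concatenate with numeric inputs.
net <- nn_module(
  initialize = function(n_levels, embedding_dim, n_numeric) {
    self$embedding <- nn_embedding(n_levels, embedding_dim)
    self$fc <- nn_linear(embedding_dim + n_numeric, 1)
  },
  forward = function(x_cat, x_num) {
    e <- self$embedding(x_cat)  # integer codes -> dense vectors
    self$fc(torch_cat(list(e, x_num), dim = 2))
  }
)

model <- net(n_levels = 20, embedding_dim = 5, n_numeric = 3)
# Note: embedding indices are 1-based in torch for R.
x_cat <- torch_randint(1, 21, size = 4, dtype = torch_long())
x_num <- torch_randn(4, 3)
model(x_cat, x_num)
```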
torch, tidymodels, and high-energy physics introduces tabnet, a torch implementation of “TabNet: Attentive Interpretable Tabular Learning” that is fully integrated with the tidymodels framework. Thanks to tidymodels integration, both pre-processing and hyperparameter tuning need only a minimal amount of code.
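To illustrate how little code that integration takes, a hypothetical model specification might look like this (hyperparameter values are placeholders):

```r
library(tidymodels)
library(tabnet)

# tabnet as a parsnip model specification, ready to be tuned
# and combined with a recipe in a workflow().
spec <- tabnet(epochs = 10) %>%
  set_engine("torch") %>%
  set_mode("classification")
```

From here, a `recipe()` handles pre-processing and `tune_grid()` handles hyperparameter search, exactly as for any other tidymodels engine.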
Time series forecasting
Introductory time-series forecasting with torch is a thorough introduction to RNNs (GRUs and LSTMs), explaining usage and terminology. torch time series continued: A first go at multi-step prediction builds on this, widening the scope to multi-step prediction.
torch time series, take three: Sequence-to-sequence prediction and torch time series, final episode: Attention expand on the prior two articles, introducing more advanced concepts like sequence-to-sequence processing and attention.
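The recurrent building block underlying all four articles can be sketched as a minimal GRU forecaster (layer sizes are illustrative assumptions, not the articles' exact architecture):

```r
library(torch)

# Encode a sequence with a GRU, predict the next value
# from the last hidden state.
gru_model <- nn_module(
  initialize = function(input_size = 1, hidden_size = 32) {
    self$gru <- nn_gru(input_size, hidden_size, batch_first = TRUE)
    self$fc <- nn_linear(hidden_size, 1)
  },
  forward = function(x) {
    out <- self$gru(x)[[1]]         # (batch, seq_len, hidden)
    last <- out[, dim(out)[2], ]    # take the final time step
    self$fc(last)                   # one-step-ahead forecast
  }
)

m <- gru_model()
m(torch_randn(8, 24, 1))  # 8 series of length 24
```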
Convolutional LSTM for spatial forecasting is an intermediate-level example that shows how to build a convolutional LSTM from scratch.
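The central idea, replacing the LSTM's matrix multiplications with convolutions so the hidden state keeps a spatial layout, can be sketched as a single cell (this is an illustrative sketch, not the article's implementation):

```r
library(torch)

# One convolutional LSTM cell: all four gates come from
# a single 2d convolution over input and hidden state.
conv_lstm_cell <- nn_module(
  initialize = function(in_channels, hidden_channels, kernel_size = 3) {
    self$conv <- nn_conv2d(
      in_channels + hidden_channels, 4 * hidden_channels,
      kernel_size, padding = kernel_size %/% 2
    )
  },
  forward = function(x, h, c) {
    gates <- self$conv(torch_cat(list(x, h), dim = 2))
    g <- torch_chunk(gates, 4, dim = 2)
    i <- torch_sigmoid(g[[1]])  # input gate
    f <- torch_sigmoid(g[[2]])  # forget gate
    o <- torch_sigmoid(g[[3]])  # output gate
    u <- torch_tanh(g[[4]])     # candidate cell state
    c_new <- f * c + i * u
    h_new <- o * torch_tanh(c_new)
    list(h_new, c_new)
  }
)
```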
Simple audio classification with torch introduces torch’s audio processing framework.
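The typical first step in that framework, turning a waveform into a spectrogram for a classifier, might look like this (the file path is a placeholder):

```r
library(torchaudio)

# Load an audio file as a tensor, then compute a spectrogram,
# the usual input representation for audio classification.
wav_and_sr <- transform_to_tensor(torchaudio_load("speech.wav"))
spec <- transform_spectrogram()(wav_and_sr[[1]])
```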