Saving and loading models for inference in PyTorch
There are two approaches for saving and loading models for inference in PyTorch:
- The first is saving and loading the model's state_dict.
- The second is saving and loading the entire model.

Both approaches are sketched in the example after the steps below.
Steps
1. Build the Dataset
2. Define and initialize the neural network
3. Initialize the optimizer
4. Save and load the model via state_dict
5. Save and load the entire model
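As a concrete illustration of steps 4 and 5, here is a minimal sketch of both approaches. The toy Net class, the SGD hyperparameters, and the file names are assumptions made for illustration, not part of the original recipe.

```python
# Minimal sketch of both approaches. The Net class, hyperparameters,
# and file names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.optim as optim

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 2)

    def forward(self, x):
        return self.fc(x)

net = Net()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)

# Approach 1: save and load only the state_dict (the recommended approach).
torch.save(net.state_dict(), "model_state.pt")

model = Net()                                     # re-create the architecture first
model.load_state_dict(torch.load("model_state.pt"))
model.eval()                                      # put dropout/batch-norm layers in eval mode

# Approach 2: save and load the entire model (pickles the whole object).
torch.save(net, "model_full.pt")

# weights_only=False may be required on recent PyTorch versions to unpickle
# a full model; the Net class definition must be importable at load time.
model = torch.load("model_full.pt", weights_only=False)
model.eval()
```

Saving only the state_dict is generally preferred: saving the entire model pickles the class itself, so the exact class definition must be available when the file is loaded.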
Saving and loading a general checkpoint in PyTorch
Saving and loading a general checkpoint model for ==inference or resuming training== can be helpful for picking up where you last left off. When saving a general checkpoint, you must save more than just the model’s state_dict. ==It is important to also save the optimizer’s state_dict==, as this contains buffers and parameters that are updated as the model trains. Other items that you may want to save are the epoch you left off on, the latest recorded training loss, external torch.nn.Embedding layers, and more, based on your own algorithm.
Steps
1. Build the Dataset
2. Define and initialize the neural network
3. Initialize the optimizer
4. Save the general checkpoint
5. Load the general checkpoint
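A minimal sketch of steps 4 and 5 follows. The toy Net class is the same illustrative network as in the earlier sketch, and the file name, epoch, and loss values are placeholders.

```python
# Minimal sketch of saving and loading a general checkpoint.
import torch
import torch.nn as nn
import torch.optim as optim

class Net(nn.Module):            # same toy network as in the earlier sketch
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 2)
    def forward(self, x):
        return self.fc(x)

net = Net()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
epoch, loss = 5, 0.4             # placeholder values recorded during training

# Step 4: save the general checkpoint as one dictionary in one file.
torch.save({
    "epoch": epoch,
    "model_state_dict": net.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
    "loss": loss,
}, "checkpoint.pt")

# Step 5: load the general checkpoint and restore each piece of state.
model = Net()
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
epoch = checkpoint["epoch"]
loss = checkpoint["loss"]

model.eval()     # for inference
# model.train()  # or switch to train mode to resume training
```

Because the checkpoint is just a dictionary, you can add any other entries your training loop needs (for example, a learning-rate scheduler's state_dict) under keys of your choosing.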
Saving and loading multiple models in one file using PyTorch
Steps
1. Build the Dataset
2. Define and initialize the neural network
3. Initialize the optimizer
4. Save multiple models
5. Load multiple models
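A minimal sketch of steps 4 and 5: saving and loading two models and their optimizers in one file. Again, the toy Net class and all variable and file names are illustrative assumptions.

```python
# Minimal sketch of saving multiple models in a single file.
import torch
import torch.nn as nn
import torch.optim as optim

class Net(nn.Module):            # same toy network as in the earlier sketches
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 2)
    def forward(self, x):
        return self.fc(x)

netA, netB = Net(), Net()
optimizerA = optim.SGD(netA.parameters(), lr=0.001, momentum=0.9)
optimizerB = optim.SGD(netB.parameters(), lr=0.001, momentum=0.9)

# Step 4: save multiple models as one dictionary in one file.
torch.save({
    "modelA_state_dict": netA.state_dict(),
    "modelB_state_dict": netB.state_dict(),
    "optimizerA_state_dict": optimizerA.state_dict(),
    "optimizerB_state_dict": optimizerB.state_dict(),
}, "two_models.pt")

# Step 5: load multiple models by re-creating each one and restoring its state.
modelA, modelB = Net(), Net()
optimizerA = optim.SGD(modelA.parameters(), lr=0.001, momentum=0.9)
optimizerB = optim.SGD(modelB.parameters(), lr=0.001, momentum=0.9)

checkpoint = torch.load("two_models.pt")
modelA.load_state_dict(checkpoint["modelA_state_dict"])
modelB.load_state_dict(checkpoint["modelB_state_dict"])
optimizerA.load_state_dict(checkpoint["optimizerA_state_dict"])
optimizerB.load_state_dict(checkpoint["optimizerB_state_dict"])

modelA.eval()
modelB.eval()
```

This pattern is common when several networks are trained together, such as a GAN or an encoder-decoder pair, since it keeps all related state in a single checkpoint file.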