PyTorch uses its Autograd module to perform automatic differentiation. In short, a recorder logs the operations executed during the forward pass and then replays them in reverse to compute gradients during the backward pass. This saves time in the development of neural networks, since gradients are derived automatically rather than by hand. PyTorch's optim package lets a user define an optimizer that updates the model's weights automatically, and when users want to create their own custom model, they can subclass PyTorch's nn.Module. Through its various submodules, PyTorch lets you implement different types of layers, such as convolutional layers, recurrent layers, and linear layers, among others.
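The pieces above fit together in a few lines of code. The sketch below (the model name `TinyNet` and the layer sizes are illustrative, not from the original text) subclasses nn.Module, runs a forward pass that Autograd records, replays it backward to get gradients, and lets an optimizer from the optim package update the weights:

```python
import torch
import torch.nn as nn

# A minimal custom model built by subclassing nn.Module.
# The name TinyNet and the 3->1 layer shape are illustrative choices.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 1)  # a single linear layer

    def forward(self, x):
        return self.linear(x)

model = TinyNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(4, 3)        # a small batch of dummy inputs
target = torch.zeros(4, 1)   # dummy regression targets

# Forward pass: Autograd records each operation as it runs.
loss = ((model(x) - target) ** 2).mean()

# Backward pass: the recorded operations are replayed in reverse
# to populate a .grad tensor on every parameter.
optimizer.zero_grad()
loss.backward()

# The optimizer uses those gradients to update the weights in place.
optimizer.step()
```

Calling `loss.backward()` is what triggers the replay; before that call, `model.linear.weight.grad` is `None`, and afterward it holds the gradient of the loss with respect to that weight.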