Related projects
Below you can find other projects by the same author, also based on PyTorch:
torchlayers:
torchlayers is a library built on top of PyTorch that provides automatic shape and dimensionality inference for torch.nn layers, plus additional building blocks featured in current SOTA architectures (e.g. EfficientNet).
- Shape inference for most torch.nn modules (convolutional, recurrent, transformer, attention and linear layers)
- Dimensionality inference (e.g. torchlayers.Conv working as torch.nn.Conv1d/2d/3d based on input shape)
- Shape inference of user-created modules
- Additional Keras-like layers (e.g. torchlayers.Reshape or torchlayers.StandardNormalNoise)
- Additional SOTA layers, mostly from ImageNet competitions (e.g. PolyNet, Squeeze-and-Excitation, StochasticDepth)
- Useful defaults ("same" padding and default kernel_size=3 for Conv, dropout rates etc.)
- Zero overhead and TorchScript support
You can read the documentation at https://github.com/szymonmaszke/torchlayers.
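A minimal sketch of the shape-inference workflow described above, assuming the Conv/Linear modules and the build helper behave as summarized in the feature list (only output sizes are specified, input shapes are inferred from a sample batch); consult the linked documentation for the exact API.

```python
import torch
import torchlayers as tl

# Only output channels/features are given; input shapes are inferred later.
model = torch.nn.Sequential(
    tl.Conv(64),        # acts as Conv1d/2d/3d depending on the input's dimensionality
    torch.nn.ReLU(),
    tl.Conv(128),       # "same" padding and kernel_size=3 by default (per the feature list)
    torch.nn.ReLU(),
    torch.nn.Flatten(),
    tl.Linear(10),      # in_features inferred from the flattened input
)

# Running a sample batch through the build helper materializes the inferred shapes.
model = tl.build(model, torch.randn(1, 3, 32, 32))
print(model)
```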
torchdata:
torchdata extends torch.utils.data.Dataset and equips it with functionalities known from tensorflow.data, such as map or cache.
- Use map, apply, reduce or filter directly on Dataset objects
- Cache data in RAM/disk or via your own method (partial caching supported)
- Full PyTorch [Dataset](https://pytorch.org/docs/stable/data.html#torch.utils.data.Dataset) and [IterableDataset](https://pytorch.org/docs/stable/data.html#torch.utils.data.IterableDataset) support
- General torchdata.maps like Flatten or Select
- Extensible interface (your own cache methods, cache modifiers, maps etc.)
- Useful torchdata.datasets classes designed for general tasks (e.g. file reading)
- Support for torchvision datasets (e.g. ImageFolder, MNIST, CIFAR10) via td.datasets.WrapDataset
- Minimal overhead (single call to super().__init__())
You can read the documentation at https://szymonmaszke.github.io/torchdata.
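A short sketch of the wrap/map/cache usage summarized above, assuming td.datasets.WrapDataset and the chained map/cache calls work as listed; exact signatures and available cachers are in the linked documentation.

```python
import torchdata as td
import torchvision

# Wrap a regular torchvision dataset so it gains map/cache/apply/reduce etc.
dataset = (
    td.datasets.WrapDataset(torchvision.datasets.MNIST(root=".", download=True))
    # map is assumed to apply the function to every returned (image, label) sample
    .map(lambda sample: (torchvision.transforms.functional.to_tensor(sample[0]), sample[1]))
    .cache()  # cache samples in RAM after they are first read
)

image, label = dataset[0]
```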
torchlambda:
torchlambda is a tool for deploying PyTorch models on Amazon's AWS Lambda using the AWS SDK for C++ and a custom C++ runtime.
Thanks to statically compiled dependencies, the whole package is shrunk to only 30MB.
Due to the small size of the compiled source code, users can pass their models as AWS Lambda layers; services like Amazon S3 are no longer necessary to load your model.
torchlambda keeps its PyTorch & AWS dependencies always up to date thanks to a continuous deployment run at 03:00 a.m. every day.
You can read the project's wiki at https://github.com/szymonmaszke/torchlambda/wiki.
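Since torchlambda serves models through the C++ API, the model first has to be exported to TorchScript. The sketch below shows only that export step with plain torch.jit; the packaging and deployment commands themselves are covered in the wiki, and the output filename model.ptc is an arbitrary placeholder.

```python
import torch
import torchvision

# Any traceable/scriptable model can be exported; resnet18 (random weights here)
# is used purely as a placeholder.
model = torchvision.models.resnet18().eval()

# Trace with a representative input and save a TorchScript archive, which is the
# format a libtorch-based C++ runtime (such as torchlambda's) can load.
example_input = torch.randn(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)
traced.save("model.ptc")  # arbitrary name; see the wiki for the deployment steps
```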