
LazyLinear

class torch.nn.LazyLinear(out_features, bias=True)

A torch.nn.Linear module with lazy initialization.

In this module, the weight and bias are of torch.nn.UninitializedParameter class. They will be initialized after the first call to forward is done, and the module will become a regular torch.nn.Linear module. The in_features argument of the Linear is inferred from the last dimension of the input, i.e. input.shape[-1].

See torch.nn.modules.lazy.LazyModuleMixin for further documentation on lazy modules and their limitations.
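
A minimal usage sketch (the sizes below are illustrative, not from the original page): the parameters are placeholders until the first forward call, which infers in_features from the last dimension of the input and materializes them.

>>> import torch
>>> import torch.nn as nn
>>> layer = nn.LazyLinear(out_features=8)  # in_features is not known yet
>>> layer.weight
<UninitializedParameter>
>>> x = torch.randn(4, 32)                 # first forward infers in_features = 32
>>> y = layer(x)
>>> layer.weight.shape
torch.Size([8, 32])
>>> y.shape
torch.Size([4, 8])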

Parameters
  • out_features – size of each output sample

  • bias – If set to False, the layer will not learn an additive bias. Default: True

Variables
  • ~LazyLinear.weight – the learnable weights of the module of shape $(\text{out\_features}, \text{in\_features})$. The values are initialized from $\mathcal{U}(-\sqrt{k}, \sqrt{k})$, where $k = \frac{1}{\text{in\_features}}$

  • ~LazyLinear.bias – the learnable bias of the module of shape $(\text{out\_features})$. If bias is True, the values are initialized from $\mathcal{U}(-\sqrt{k}, \sqrt{k})$, where $k = \frac{1}{\text{in\_features}}$
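
A short sketch checking the initialization bound described above (the shapes are chosen arbitrarily for illustration): after materialization, the weight and bias lie within $[-\sqrt{k}, \sqrt{k}]$ with $k = 1/\text{in\_features}$.

>>> import math, torch
>>> import torch.nn as nn
>>> layer = nn.LazyLinear(16)
>>> _ = layer(torch.randn(2, 128))   # materialize with in_features = 128
>>> bound = math.sqrt(1.0 / 128)     # sqrt(k)
>>> layer.weight.shape
torch.Size([16, 128])
>>> bool(layer.weight.abs().max() <= bound) and bool(layer.bias.abs().max() <= bound)
True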

cls_to_become

alias of torch.nn.Linear: the class this module is converted to once its parameters have been materialized.
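
A brief sketch of the class swap implied by cls_to_become (input sizes are arbitrary): once the first forward pass has materialized the parameters, the module is an ordinary nn.Linear.

>>> import torch
>>> import torch.nn as nn
>>> layer = nn.LazyLinear(4)
>>> type(layer) is nn.Linear
False
>>> _ = layer(torch.randn(1, 10))   # first forward materializes the parameters
>>> type(layer) is nn.Linear
True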
