Dropout
equinox.nn.Dropout(equinox.Module)
Applies dropout.
Note that this layer behaves differently during training and inference. During
training, dropout is applied randomly; during inference, this layer does nothing.
Whether the model is in training or inference mode should be toggled using
equinox.nn.inference_mode.
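A minimal sketch of this toggling, assuming equinox and jax are available and imported as below; the array shape and key value are arbitrary:

```python
import jax
import jax.numpy as jnp
import equinox as eqx

dropout = eqx.nn.Dropout(p=0.5)
x = jnp.ones((4,))
key = jax.random.PRNGKey(0)

train_out = dropout(x, key=key)            # training mode: dropout applied, so a key is required
eval_dropout = eqx.nn.inference_mode(dropout)
eval_out = eval_dropout(x)                 # inference mode: identity, no key needed
```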
__init__(p: float = 0.5, inference: bool = False, *, deterministic: bool | None = None)
Arguments:
- p: The fraction of entries to set to zero. (On average.)
- inference: Whether to actually apply dropout at all. If True then dropout is not applied. If False then dropout is applied. This may be toggled with equinox.nn.inference_mode or overridden during equinox.nn.Dropout.__call__.
- deterministic: Deprecated alternative to inference.
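A short construction sketch for these arguments; the variable names are illustrative only:

```python
import equinox as eqx

drop_default = eqx.nn.Dropout()                    # p=0.5, dropout active
drop_light = eqx.nn.Dropout(p=0.1)                 # zero roughly 10% of entries
drop_off = eqx.nn.Dropout(p=0.5, inference=True)   # dropout disabled until toggled back
```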
__call__(x: Array, *, key: PRNGKeyArray | None = None, inference: bool | None = None, deterministic: bool | None = None) -> Array
Arguments:
- x: An any-dimensional JAX array to dropout.
- key: A jax.random.PRNGKey used to provide randomness for calculating which elements to dropout. (Keyword only argument.)
- inference: As per equinox.nn.Dropout.__init__. If True or False then it will take priority over self.inference. If None then the value from self.inference will be used.
- deterministic: Deprecated alternative to inference.
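A brief usage sketch of __call__, assuming a Dropout layer constructed as above; the input values and key are placeholders:

```python
import jax
import jax.numpy as jnp
import equinox as eqx

dropout = eqx.nn.Dropout(p=0.5)
x = jnp.arange(6.0).reshape(2, 3)
key = jax.random.PRNGKey(42)

y = dropout(x, key=key)              # stochastic: roughly half the entries zeroed
y_eval = dropout(x, inference=True)  # per-call override: dropout skipped, x returned unchanged
```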