register_full_backward_pre_hook
- UNet.register_full_backward_pre_hook(hook: Callable[[Module, Tuple[Tensor, ...] | Tensor], None | Tuple[Tensor, ...] | Tensor], prepend: bool = False) → RemovableHandle
- Registers a backward pre-hook on the module.
- The hook will be called every time the gradients for the module are computed. The hook should have the following signature:
- hook(module, grad_output) -> tuple[Tensor] or None
- The `grad_output` is a tuple. The hook should not modify its arguments, but it can optionally return a new gradient with respect to the output that will be used in place of `grad_output` in subsequent computations. Entries in `grad_output` will be `None` for all non-Tensor arguments.
- For technical reasons, when this hook is applied to a Module, its forward function will receive a view of each Tensor passed to the Module. Similarly the caller will receive a view of each Tensor returned by the Module's forward function.
- Warning
- Modifying inputs inplace is not allowed when using backward hooks and will raise an error.
- Parameters:
- hook (Callable) – The user-defined hook to be registered. 
- prepend (bool) – If true, the provided `hook` will be fired before all existing `backward_pre` hooks on this `torch.nn.modules.Module`. Otherwise, the provided `hook` will be fired after all existing `backward_pre` hooks on this `torch.nn.modules.Module`. Note that global `backward_pre` hooks registered with `register_module_full_backward_pre_hook()` will fire before all hooks registered by this method.
 
- Returns:
- a handle that can be used to remove the added hook by calling `handle.remove()`
- Return type:
- torch.utils.hooks.RemovableHandle
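A minimal sketch of registering and removing such a hook. A small `nn.Linear` stands in for the UNet above (any `nn.Module` works the same way); the hook names and the halving of the gradient are illustrative choices, not part of the API.

```python
import torch
import torch.nn as nn

# Stand-in module; the docs above describe the same method on a UNet.
model = nn.Linear(4, 2)

seen = []

def scale_grad_output(module, grad_output):
    # grad_output is a tuple of Tensors (entries are None for
    # non-Tensor outputs). Record the shapes without modifying it.
    seen.append(tuple(g.shape for g in grad_output))
    # Optionally return replacement gradients w.r.t. the output;
    # here we halve them, so downstream grads are halved too.
    return tuple(g * 0.5 for g in grad_output)

handle = model.register_full_backward_pre_hook(scale_grad_output)

x = torch.randn(3, 4, requires_grad=True)
model(x).sum().backward()  # hook fires before grads are propagated

handle.remove()  # detach the hook once it is no longer needed
```

Because the hook returns halved gradients, `x.grad` ends up at half the value it would have without the hook; after `handle.remove()`, further backward passes no longer invoke it.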