A Mixed Precision library for JAX in Python
Mixed precision training in JAX

Mixed precision training [0] is a technique that mixes the use of full and half precision floating point numbers during training to reduce the memory bandwidth requirements and improve the computational efficiency of a given model.

This library implements support for mixed precision training in JAX by providing two key abstractions: mixed precision "policies" and loss scaling. Neural network libraries (such as Haiku) can integrate with jmp and provide "Automatic Mixed Precision (AMP)" support, automating or simplifying the application of policies to modules.
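As a rough sketch of how a policy can be applied by hand: a policy names the dtypes to use for parameters, computation, and outputs, and provides casting helpers. The `get_policy` format string and the `cast_to_*` methods follow jmp's public API; the tiny forward function and parameter names below are made up for illustration.

```python
import jax.numpy as jnp
import jmp

# Keep parameters and outputs in full precision, run compute in half precision.
my_policy = jmp.get_policy("params=float32,compute=float16,output=float32")

def forward(params, x):
    # Cast params and inputs to the compute dtype before the heavy math...
    params = my_policy.cast_to_compute(params)
    x = my_policy.cast_to_compute(x)
    y = x @ params["w"] + params["b"]
    # ...and cast the result back to the output dtype.
    return my_policy.cast_to_output(y)

params = {"w": jnp.ones((4, 8)), "b": jnp.zeros((8,))}
out = forward(params, jnp.ones((2, 4)))
print(out.dtype)  # float32 output, even though the matmul ran in float16
```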
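Loss scaling keeps half-precision gradients from underflowing to zero: the loss is multiplied by a scale before differentiation, the gradients are divided by the same scale afterwards, and (with dynamic scaling) the scale is adjusted based on whether the gradients remained finite. Below is a minimal sketch of a training step using jmp's loss-scale objects; the scale/unscale/adjust/all_finite/select_tree calls follow the jmp README, while the toy loss function, parameters, and learning rate are hypothetical.

```python
import jax
import jax.numpy as jnp
import jmp

# Dynamic loss scaling: starts large, halves on overflow, periodically grows.
loss_scale = jmp.DynamicLossScale(jnp.float32(2 ** 15))

def loss_fn(params, x):
    # Toy loss purely for illustration.
    return jnp.mean((x @ params["w"]) ** 2)

def train_step(params, loss_scale, x):
    def scaled_loss(p):
        # Scale the loss so its gradients stay representable in half precision.
        return loss_scale.scale(loss_fn(p, x))

    grads = jax.grad(scaled_loss)(params)
    grads = loss_scale.unscale(grads)       # undo the scaling on the gradients
    grads_finite = jmp.all_finite(grads)    # detect inf/NaN caused by overflow
    loss_scale = loss_scale.adjust(grads_finite)
    # Only apply the update when every gradient is finite; otherwise keep the
    # previous parameters and retry with the reduced loss scale.
    params = jmp.select_tree(
        grads_finite,
        jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads),
        params,
    )
    return params, loss_scale

params = {"w": jnp.ones((4, 4))}
params, loss_scale = train_step(params, loss_scale, jnp.ones((2, 4)))
```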