from typing import Type

from torch import optim

from .functional_adadelta import _FunctionalAdadelta
from .functional_adagrad import _FunctionalAdagrad
from .functional_adam import _FunctionalAdam
from .functional_adamax import _FunctionalAdamax
from .functional_adamw import _FunctionalAdamW
from .functional_rmsprop import _FunctionalRMSprop
from .functional_rprop import _FunctionalRprop
from .functional_sgd import _FunctionalSGD

# Map a user-passed optimizer class to its functional counterpart defined in
# the distributed.optim package, so callers keep the familiar torch.optim API.
functional_optim_map = {
    optim.Adagrad: _FunctionalAdagrad,
    optim.Adam: _FunctionalAdam,
    optim.AdamW: _FunctionalAdamW,
    optim.SGD: _FunctionalSGD,
    optim.Adadelta: _FunctionalAdadelta,
    optim.RMSprop: _FunctionalRMSprop,
    optim.Rprop: _FunctionalRprop,
    optim.Adamax: _FunctionalAdamax,
}


def register_functional_optim(key, optim):
    """
    Interface to insert a new functional optimizer to functional_optim_map
    ``fn_optim_key`` and ``fn_optimizer`` are user defined. The optimizer and key
    need not be of :class:`torch.optim.Optimizer` (e.g. for custom optimizers)
    Example::
        >>> # import the new functional optimizer
        >>> # xdoctest: +SKIP
        >>> from xyz import fn_optimizer
        >>> from torch.distributed.optim.utils import register_functional_optim
        >>> fn_optim_key = "XYZ_optim"
        >>> register_functional_optim(fn_optim_key, fn_optimizer)
    """
    if key not in functional_optim_map:
        functional_optim_map[key] = optim


def as_functional_optim(optim_cls: Type, *args, **kwargs):
    try:
        functional_cls = functional_optim_map[optim_cls]
    except KeyError as e:
        raise ValueError(
            f"Optimizer {optim_cls} does not have a functional counterpart!"
        ) from e

    return _create_functional_optim(functional_cls, *args, **kwargs)


def _create_functional_optim(functional_optim_cls: Type, *args, **kwargs):
    return functional_optim_cls(
        *args,
        **kwargs,
        _allow_empty_param_list=True,
    )