from __future__ import annotations

import copy
from typing import TypeVar

import torch


__all__ = [
    "fuse_conv_bn_eval",
    "fuse_conv_bn_weights",
    "fuse_linear_bn_eval",
    "fuse_linear_bn_weights",
]

ConvT = TypeVar("ConvT", bound="torch.nn.modules.conv._ConvNd")
LinearT = TypeVar("LinearT", bound="torch.nn.Linear")


def fuse_conv_bn_eval(
    conv: ConvT,
    bn: torch.nn.modules.batchnorm._BatchNorm,
    transpose: bool = False,
) -> ConvT:
    r"""Fuse a convolutional module and a BatchNorm module into a single, new convolutional module.

    Args:
        conv (torch.nn.modules.conv._ConvNd): A convolutional module.
        bn (torch.nn.modules.batchnorm._BatchNorm): A BatchNorm module.
        transpose (bool, optional): If True, transpose the convolutional weight. Defaults to False.

    Returns:
        torch.nn.modules.conv._ConvNd: The fused convolutional module.

    .. note::
        Both ``conv`` and ``bn`` must be in eval mode, and ``bn`` must have its running buffers computed.
    """
    assert not (conv.training or bn.training), "Fusion only for eval!"
    fused_conv = copy.deepcopy(conv)

    assert bn.running_mean is not None and bn.running_var is not None
    fused_conv.weight, fused_conv.bias = fuse_conv_bn_weights(
        fused_conv.weight,
        fused_conv.bias,
        bn.running_mean,
        bn.running_var,
        bn.eps,
        bn.weight,
        bn.bias,
        transpose,
    )

    return fused_conv


def fuse_conv_bn_weights(
    conv_w: torch.Tensor,
    conv_b: torch.Tensor | None,
    bn_rm: torch.Tensor,
    bn_rv: torch.Tensor,
    bn_eps: float,
    bn_w: torch.Tensor | None,
    bn_b: torch.Tensor | None,
    transpose: bool = False,
) -> tuple[torch.nn.Parameter, torch.nn.Parameter]:
    r"""Fuse convolutional module parameters and BatchNorm module parameters into new convolutional module parameters.

    Args:
        conv_w (torch.Tensor): Convolutional weight.
        conv_b (Optional[torch.Tensor]): Convolutional bias.
        bn_rm (torch.Tensor): BatchNorm running mean.
        bn_rv (torch.Tensor): BatchNorm running variance.
        bn_eps (float): BatchNorm epsilon.
        bn_w (Optional[torch.Tensor]): BatchNorm weight.
        bn_b (Optional[torch.Tensor]): BatchNorm bias.
        transpose (bool, optional): If True, transpose the conv weight. Defaults to False.

    Returns:
        Tuple[torch.nn.Parameter, torch.nn.Parameter]: Fused convolutional weight and bias.
    """
    conv_weight_dtype = conv_w.dtype
    conv_bias_dtype = conv_b.dtype if conv_b is not None else conv_weight_dtype
    if conv_b is None:
        conv_b = torch.zeros_like(bn_rm)
    if bn_w is None:
        bn_w = torch.ones_like(bn_rm)
    if bn_b is None:
        bn_b = torch.zeros_like(bn_rm)
    bn_var_rsqrt = torch.rsqrt(bn_rv + bn_eps)

    if transpose:
        # Transposed convs carry channels in dim 1 of the weight.
        shape = [1, -1] + [1] * (len(conv_w.shape) - 2)
    else:
        shape = [-1, 1] + [1] * (len(conv_w.shape) - 2)

    fused_conv_w = (conv_w * (bn_w * bn_var_rsqrt).reshape(shape)).to(
        dtype=conv_weight_dtype
    )
    fused_conv_b = ((conv_b - bn_rm) * bn_var_rsqrt * bn_w + bn_b).to(
        dtype=conv_bias_dtype
    )

    return (
        torch.nn.Parameter(fused_conv_w, conv_w.requires_grad),
        torch.nn.Parameter(fused_conv_b, conv_b.requires_grad),
    )


def fuse_linear_bn_eval(
    linear: LinearT,
    bn: torch.nn.modules.batchnorm._BatchNorm,
) -> LinearT:
    r"""Fuse a linear module and a BatchNorm module into a single, new linear module.

    Args:
        linear (torch.nn.Linear): A Linear module.
        bn (torch.nn.modules.batchnorm._BatchNorm): A BatchNorm module.

    Returns:
        torch.nn.Linear: The fused linear module.

    .. note::
        Both ``linear`` and ``bn`` must be in eval mode, and ``bn`` must have its running buffers computed.
    """
    assert not (linear.training or bn.training), "Fusion only for eval!"
    fused_linear = copy.deepcopy(linear)

    # BN operates over its channel dim; to fold it into the linear weight/bias
    # without reshaping them, bn.num_features must broadcast against
    # linear.out_features: either they are equal, or bn.num_features == 1.
    assert (
        linear.out_features == bn.num_features or bn.num_features == 1
    ), "To fuse, linear.out_features == bn.num_features or bn.num_features == 1"

    assert bn.running_mean is not None and bn.running_var is not None
    fused_linear.weight, fused_linear.bias = fuse_linear_bn_weights(
        fused_linear.weight,
        fused_linear.bias,
        bn.running_mean,
        bn.running_var,
        bn.eps,
        bn.weight,
        bn.bias,
    )

    return fused_linear


def fuse_linear_bn_weights(
    linear_w: torch.Tensor,
    linear_b: torch.Tensor | None,
    bn_rm: torch.Tensor,
    bn_rv: torch.Tensor,
    bn_eps: float,
    bn_w: torch.Tensor,
    bn_b: torch.Tensor,
) -> tuple[torch.nn.Parameter, torch.nn.Parameter]:
    r"""Fuse linear module parameters and BatchNorm module parameters into new linear module parameters.

    Args:
        linear_w (torch.Tensor): Linear weight.
        linear_b (Optional[torch.Tensor]): Linear bias.
        bn_rm (torch.Tensor): BatchNorm running mean.
        bn_rv (torch.Tensor): BatchNorm running variance.
        bn_eps (float): BatchNorm epsilon.
        bn_w (torch.Tensor): BatchNorm weight.
        bn_b (torch.Tensor): BatchNorm bias.

    Returns:
        Tuple[torch.nn.Parameter, torch.nn.Parameter]: Fused linear weight and bias.
    """
    linear_weight_dtype = linear_w.dtype
    linear_bias_dtype = linear_b.dtype if linear_b is not None else linear_weight_dtype
    if linear_b is None:
        linear_b = torch.zeros_like(bn_rm)
    bn_scale = bn_w * torch.rsqrt(bn_rv + bn_eps)

    fused_w = (linear_w * bn_scale.unsqueeze(-1)).to(dtype=linear_weight_dtype)
    fused_b = ((linear_b - bn_rm) * bn_scale + bn_b).to(dtype=linear_bias_dtype)

    return (
        torch.nn.Parameter(fused_w, linear_w.requires_grad),
        torch.nn.Parameter(fused_b, linear_b.requires_grad),
    )
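
# ---------------------------------------------------------------------------
# Illustrative self-check (a sketch, not part of the module's public API):
# in eval mode, the fused conv should reproduce the conv -> bn pipeline up to
# floating-point rounding. The layer sizes below are arbitrary assumptions.
if __name__ == "__main__":
    conv = torch.nn.Conv2d(3, 8, kernel_size=3).eval()
    bn = torch.nn.BatchNorm2d(8).eval()
    # Populate the running buffers as if the pair had been trained.
    bn.running_mean.uniform_(-0.1, 0.1)
    bn.running_var.uniform_(0.9, 1.1)
    fused = fuse_conv_bn_eval(conv, bn)
    x = torch.randn(2, 3, 16, 16)
    with torch.no_grad():
        assert torch.allclose(bn(conv(x)), fused(x), atol=1e-5)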