from __future__ import annotations

import re
from dataclasses import dataclass
from typing import cast, TYPE_CHECKING

from torchgen import local
from torchgen.api import cpp
from torchgen.api.types import BaseCType, Binding, NamedCType, tensorListT
from torchgen.model import (
    BaseTy,
    BaseType,
    FunctionSchema,
    ListType,
    NativeFunction,
    NativeFunctionsViewGroup,
    SchemaKind,
    Type,
)
from torchgen.utils import IDENT_REGEX

if TYPE_CHECKING:
    from collections.abc import Sequence


# Represents a saved attribute involved in backward calculation.
# Note that it can be a derived property of an input argument, e.g.:
# we could save `other.scalar_type()` instead of the entire `other` tensor.
@dataclass(frozen=True)
class SavedAttribute:
    # The NamedCType holds the updated name and cpp type of the attribute.
    # The name is suffixed if it is a derived property, e.g. `other_scalar_type`.
    nctype: NamedCType

    # The expression to read the derived property at save time,
    # e.g. `other.scalar_type()`.
    expr: str


# Represents a backward formula that calculates derivatives for one
# or more tensors.
@dataclass(frozen=True)
class Derivative:
    # The formula string (a legal C++ expression).
    # Expressions against input arguments have been replaced with the
    # corresponding saved attributes.
    formula: str

    # The formula string before input argument replacement.
    original_formula: str

    # Names of the arguments for which this formula calculates derivatives.
    var_names: tuple[str, ...]

    # Saved inputs that are referenced by the formula.
    saved_inputs: tuple[SavedAttribute, ...]

    # Saved outputs that are referenced by the formula.
    saved_outputs: tuple[SavedAttribute, ...]

    # Gradients that are referenced by name in the formula.
    named_gradients: set[str]


# Represents a forward formula that calculates forward derivatives
# for one tensor.
@dataclass(frozen=True)
class ForwardDerivative:
    # The formula string (a legal C++ expression).
    formula: str

    # Names of the output arguments for which this formula calculates
    # forward derivatives.
    var_names: tuple[str, ...]

    # Types of those output arguments.
    var_types: tuple[Type, ...]

    # Inputs whose forward gradients are required by this formula.
    required_inputs_fw_grad: tuple[str, ...] | None

    # Inputs whose primal values are required by this formula.
    required_inputs_primal: tuple[str, ...] | None

    # Whether this formula requires the original value of self
    # (only used by in-place operations).
    required_original_self_value: bool

    # Whether we are re-using the out-of-place formula for an
    # in-place variant.
    is_reusing_outplace_formula: bool


# Represents differentiability info for a NativeFunction.
@dataclass(frozen=True)
class DifferentiabilityInfo:
    # The base name read from derivatives.yaml.
    name: str

    # The matching native function.
    func: NativeFunction

    # The name of the generated autograd function.
    op: str | None

    # The backward formulae for this function.
    derivatives: Sequence[Derivative]

    # The forward formulae for this function.
    forward_derivatives: Sequence[ForwardDerivative]

    # The union of 'saved_inputs' of all 'derivatives'.
    all_saved_inputs: Sequence[SavedAttribute]

    # The union of 'saved_outputs' of all 'derivatives'.
    all_saved_outputs: Sequence[SavedAttribute]

    # All named gradients that are available for use.
    available_named_gradients: Sequence[str]

    # The named gradients that are used in any of the derivatives.
    used_named_gradients: set[str]

    # The function's input arguments for which it calculates derivatives.
    args_with_derivatives: Sequence[Binding]

    # Names of arguments whose derivative formula is 'non_differentiable'.
    non_differentiable_arg_names: Sequence[str]

    # Raw data read from derivatives.yaml.
    output_differentiability: list[bool] | None
    output_differentiability_conditions: list[str] | None

    @property
    def has_derivatives(self) -> bool:
        return len(self.args_with_derivatives) > 0

    # Generates a new DifferentiabilityInfo using the exact same set of
    # derivative information, but with a new operator name. This is used when
    # generating "copy" variants of view ops, which can use the same
    # derivative formula as the original view op.
    def create_view_copy_from_view_derivative(
        self, g: NativeFunctionsViewGroup
    ) -> DifferentiabilityInfo | None:
        if g.view_copy is None:
            return None
        f = g.view_copy

        name_split_by_period = self.name.split(".", maxsplit=2)
        # Append "_copy" to the base name of the operator (but keep the
        # overload name the same).
        view_copy_name = f"{name_split_by_period[0]}_copy." + ".".join(
            name_split_by_period[1:]
        )
        view_copy_op_name = None if self.op is None else f"{self.op}_copy"

        return DifferentiabilityInfo(
            # Use the "_copy" version of name/func/op.
            name=view_copy_name,
            func=f,
            op=view_copy_op_name,
            # But keep all derivative info the same.
            derivatives=self.derivatives,
            forward_derivatives=self.forward_derivatives,
            all_saved_inputs=self.all_saved_inputs,
            all_saved_outputs=self.all_saved_outputs,
            available_named_gradients=self.available_named_gradients,
            used_named_gradients=self.used_named_gradients,
            args_with_derivatives=self.args_with_derivatives,
            non_differentiable_arg_names=self.non_differentiable_arg_names,
            output_differentiability=self.output_differentiability,
            output_differentiability_conditions=self.output_differentiability_conditions,
        )


def uses_ident(info: DifferentiabilityInfo | None, ident: str) -> bool:
    if info is None:
        return False
    for derivative in info.derivatives:
        formula = derivative.formula
        if re.search(IDENT_REGEX.format(ident), formula):
            return True
    return False


def uses_retain_variables(info: DifferentiabilityInfo | None) -> bool:
    return uses_ident(info, "retain_variables")


def uses_single_grad(info: DifferentiabilityInfo | None) -> bool:
    return uses_ident(info, "grad")


# Represents a differentiable `Argument`.
@dataclass(frozen=True)
class DifferentiableInput:
    name: str
    type: Type

    # Kept only for compatibility with the old codegen.
    cpp_type: str


# Represents a differentiable `Return`.
@dataclass(frozen=True)
class DifferentiableOutput:
    name: str
    type: Type

    # Kept only for compatibility with the old codegen.
    cpp_type: str


@dataclass(frozen=True)
class NativeFunctionWithDifferentiabilityInfo:
    func: NativeFunction
    info: dict[str, DifferentiabilityInfo] | None
    fw_derivatives: dict[str, Sequence[ForwardDerivative]] | None


def dispatch_strategy(fn: NativeFunctionWithDifferentiabilityInfo) -> str:
    """How are we going to call the underlying implementation of a
    declaration?  There are two strategies:

        - use_derived: we want to call the implementation on CPUDoubleType
          (or a similar, derived Type instance).  Because these derived
          instances deal in Tensors, not Variables (it's a completely different
          object, so it doesn't dispatch back to VariableType), code on
          this dispatch path needs to wrap/unwrap tensors.  If the
          derived implementation takes and returns tensors, the
          implementation is usually differentiable (although we also use
          the derived dispatch path for non-differentiable functions
          that we still want to dispatch on the derived Type instance;
          e.g., size())

        - use_type: we want to call the implementation on Type, because
          it is implemented concretely, and the functions it invokes will
          get dispatched back to VariableType (which will ensure that they
          are differentiable.)
    """
    if fn.func.is_abstract or (
        fn.info is not None and any(info.has_derivatives for info in fn.info.values())
    ):
        return "use_derived"
    else:
        return "use_type"


def is_foreach_func(f: NativeFunction) -> bool:
    return f.func.name.name.base.startswith("_foreach_")


# The in-place foreach functions whose backward is reused from the
# in-place reference, not the out-of-place one.
_foreach_with_inplace_ref = {"_foreach_zero_"}

_foreach_with_tensor_overload = {
    "_foreach_add.Tensor",
    "_foreach_div.Tensor",
    "_foreach_mul.Tensor",
}

# The foreach functions for which the argument-length check against the
# reference function is skipped.
_skip_argument_len_check = {
    "_foreach_add.Scalar",
    "_foreach_add_.Scalar",
    "_foreach_add.ScalarList",
    "_foreach_add_.ScalarList",
    "_foreach_sub.Scalar",
    "_foreach_sub_.Scalar",
    "_foreach_sub.ScalarList",
    "_foreach_sub_.ScalarList",
}


# Checks if `function_schema` is a native, non-foreach function which
# `f`, a foreach function, refers to.
def is_reference_for_foreach(
    f: NativeFunction,
    function_schema: FunctionSchema,
) -> bool:
    return (
        f.func.name.name.base.split("_foreach_")[-1] == function_schema.name.name.base
        and (
            not function_schema.name.name.inplace
            or str(f.func.name) in _foreach_with_inplace_ref
        )
        and (
            str(f.func.name) in _skip_argument_len_check
            or len(f.func.arguments.flat_non_out)
            == len(function_schema.arguments.flat_non_out)
        )
        and all(
            ref_arg.type in (arg.type, getattr(arg.type, "elem", None))
            for arg, ref_arg in zip(
                f.func.arguments.flat_non_out,
                function_schema.arguments.flat_non_out,
            )
        )
    )


def gen_foreach_derivativeinfo(
    foreach_function: NativeFunction,
    functional_info_by_signature: dict[
        FunctionSchema, dict[str, DifferentiabilityInfo]
    ],
    non_functional_info_by_signature: dict[
        FunctionSchema, dict[str, DifferentiabilityInfo]
    ],
    dispatch_key: str = "Default",
) -> tuple[DifferentiabilityInfo | None, bool]:
    """Generate DifferentiabilityInfo for out-place foreach function, return the existing one for in-place.

    The second return value indicates whether the info is generated in this function.
    """
    ref_diff_info: DifferentiabilityInfo | None = None

    for function_schema, diff_info in functional_info_by_signature.items():
        if not is_reference_for_foreach(foreach_function, function_schema):
            continue
        ref_diff_info = diff_info[dispatch_key]
        if ref_diff_info is not None:
            break
    if (
        ref_diff_info is None
        and foreach_function.func.kind() == SchemaKind.inplace
        and str(foreach_function.func.name) in _foreach_with_inplace_ref
    ):
        for function_schema, diff_info in non_functional_info_by_signature.items():
            if not is_reference_for_foreach(foreach_function, function_schema):
                continue
            ref_diff_info = diff_info[dispatch_key]
            if ref_diff_info is not None:
                break
    if ref_diff_info is None:
        return None, False

    # In-place foreach functions reuse the existing Derivative as-is.
    if foreach_function.func.kind() == SchemaKind.inplace:
        return ref_diff_info, False

    map_refarg2foreacharg, map_name2arg = {}, {}
    for i, (ref_arg, foreach_arg) in enumerate(
        zip(
            function_schema.arguments.flat_non_out,
            foreach_function.func.arguments.flat_non_out,
        )
    ):
        map_refarg2foreacharg[ref_arg.name] = foreach_arg.name
        map_name2arg[foreach_arg.name] = foreach_arg

    all_saved_inputs, all_saved_outputs, all_var_names = [], [], []
    modified_derivative_formulas = []
    for i, derivative in enumerate(ref_diff_info.derivatives):
        modified_formula = derivative.formula.replace("grad", "grads[i]").replace(
            "result", "result[i]"
        )
        saved_inputs, saved_outputs = [], []
        # This context is necessary to call `cpp.argument_type`.
        with local.parametrize(
            use_const_ref_for_mutable_tensors=foreach_function.use_const_ref_for_mutable_tensors,
            use_ilistref_for_tensor_lists=foreach_function.part_of_structured_group,
        ):
            for ref_input in derivative.saved_inputs:
                ref_input_jit_name = ref_input.expr.split(".")[0]
                mapped_name = map_refarg2foreacharg[ref_input_jit_name]
                if isinstance(map_name2arg[mapped_name].type, ListType):
                    mapped_expr = mapped_name + "[i]"
                else:
                    mapped_expr = mapped_name
                new_expr = ref_input.expr.replace(ref_input_jit_name, mapped_expr)
                modified_formula = modified_formula.replace(
                    cast(str, ref_input.nctype.name), new_expr
                )

                nctype = cpp.argument_type(
                    map_name2arg[mapped_name], binds=mapped_name
                )
                canonical_nctype = NamedCType(
                    nctype.name, nctype.type.remove_const_ref()
                )
                saved_inputs.append(
                    SavedAttribute(nctype=canonical_nctype, expr=mapped_name)
                )
            for ref_output in derivative.saved_outputs:
                if ref_output.nctype.name == "result":
                    saved_outputs.append(
                        SavedAttribute(
                            nctype=NamedCType(
                                name="result", type=BaseCType(tensorListT)
                            ),
                            expr="result",
                        )
                    )
                else:
                    raise RuntimeError("")
        var_names = [map_refarg2foreacharg[var] for var in derivative.var_names]
        all_var_names.extend(var_names)
        all_saved_inputs.extend(saved_inputs)
        all_saved_outputs.extend(saved_outputs)
        modified_derivative = Derivative(
            formula=modified_formula,
            original_formula=derivative.formula,
            var_names=tuple(var_names),
            saved_inputs=tuple(saved_inputs),
            saved_outputs=tuple(saved_outputs),
            named_gradients=set(),
        )
        modified_derivative_formulas.append(modified_derivative)

    with local.parametrize(
        use_const_ref_for_mutable_tensors=foreach_function.use_const_ref_for_mutable_tensors,
        use_ilistref_for_tensor_lists=foreach_function.part_of_structured_group,
    ):
        args_with_derivatives = [
            Binding(
                name=arg.name,
                nctype=cpp.argument_type(arg, binds=arg.name),
                argument=arg,
                default=None,
            )
            for arg in foreach_function.func.arguments.flat_non_out
            if arg.name in all_var_names
        ]

    forward_derivatives: list[ForwardDerivative] = []
    fw_derivative: ForwardDerivative
    for fw_derivative in ref_diff_info.forward_derivatives:
        var_names = list(fw_derivative.var_names)
        var_types = list(fw_derivative.var_types)
        required_inputs_fw_grad = []
        required_inputs_primal = []
        if fw_derivative.required_inputs_fw_grad is not None:
            required_inputs_fw_grad = list(fw_derivative.required_inputs_fw_grad)
        if fw_derivative.required_inputs_primal:
            required_inputs_primal = list(fw_derivative.required_inputs_primal)
        modified_formula = fw_derivative.formula

        # A foreach function's result is a TensorList.
        if "result" in modified_formula:
            modified_formula = fw_derivative.formula.replace("result", "result[i]")

        for foreach_arg, ref_arg in zip(
            foreach_function.func.arguments.flat_non_out,
            ref_diff_info.func.func.arguments.flat_non_out,
        ):
            # Modify the reference forward formula.
            if (
                isinstance(foreach_arg.type, ListType)
                and not foreach_arg.type.is_tensor_like()
            ):
                # Assuming ScalarList.
                modified_formula = modified_formula.replace(
                    ref_arg.name, foreach_arg.name + "[i]"
                )
            elif foreach_arg.type.is_tensor_like():
                # Assuming TensorList / Tensor.
                assert isinstance(foreach_arg.type, ListType) or (
                    foreach_arg.type == BaseType(BaseTy.Tensor)
                    and str(foreach_function.func.name)
                    in _foreach_with_tensor_overload
                ), f"{foreach_function.func.name}, {foreach_arg.type}"
                for suffix in ("_p", "_t"):
                    curr_expr = ref_arg.name + suffix
                    if curr_expr in modified_formula:
                        new_expr = foreach_arg.name + suffix
                        modified_formula = modified_formula.replace(
                            curr_expr, new_expr
                        )
            else:
                # Assuming Scalar.
                if foreach_arg.name != ref_arg.name:
                    modified_formula = modified_formula.replace(
                        ref_arg.name, foreach_arg.name
                    )

            for i, name in enumerate(var_names):
                if name == ref_arg.name:
                    var_names[i] = foreach_arg.name
                    var_types[i] = foreach_arg.type
            for i, name in enumerate(required_inputs_fw_grad):
                if name == ref_arg.name:
                    required_inputs_fw_grad[i] = foreach_arg.name
            for i, name in enumerate(required_inputs_primal):
                if name == ref_arg.name:
                    required_inputs_primal[i] = foreach_arg.name
        forward_derivatives.append(
            ForwardDerivative(
                formula=modified_formula,
                var_names=tuple(var_names),
                var_types=tuple(var_types),
                required_inputs_fw_grad=tuple(required_inputs_fw_grad),
                required_inputs_primal=tuple(required_inputs_primal),
                required_original_self_value=fw_derivative.required_original_self_value,
                is_reusing_outplace_formula=fw_derivative.is_reusing_outplace_formula,
            )
        )

    return (
        DifferentiabilityInfo(
            name=foreach_function.func.name.name.base,
            func=foreach_function,
            op=f"Foreach{ref_diff_info.op}{foreach_function.func.name.overload_name}",
            derivatives=modified_derivative_formulas,
            forward_derivatives=forward_derivatives,
            all_saved_inputs=tuple(set(all_saved_inputs)),
            all_saved_outputs=tuple(set(all_saved_outputs)),
            available_named_gradients=[],
            used_named_gradients=set(),
            args_with_derivatives=args_with_derivatives,
            non_differentiable_arg_names=[],
            output_differentiability=None,
            output_differentiability_conditions=None,
        ),
        True,
    )


def match_differentiability_info(
    native_functions: list[NativeFunction],
    differentiability_infos: dict[FunctionSchema, dict[str, DifferentiabilityInfo]],
) -> list[NativeFunctionWithDifferentiabilityInfo]:
    """Sets the "derivative" key on declarations to matching autograd function.

    In-place functions will use the out-of-place derivative definition if there
    is no in-place specific derivative.
    """

    functional_info_by_signature = {
        schema.signature(strip_default=True): info_dict
        for schema, info_dict in differentiability_infos.items()
        if schema.kind() == SchemaKind.functional
    }
    non_functional_info_by_signature = {
        schema.signature(strip_default=True): info_dict
        for schema, info_dict in differentiability_infos.items()
        if schema.kind() != SchemaKind.functional
    }

    def find_info(
        f: NativeFunction,
    ) -> tuple[dict[str, DifferentiabilityInfo] | None, bool]:
        # Don't bother matching info to generated out= variants.
        if "generated" in f.tags and f.func.kind() == SchemaKind.out:
            return None, False

        # (1) Check for an exact match.
        if f.func in differentiability_infos:
            return differentiability_infos[f.func], True

        # (2) If no exact match, check if the out-of-place variant of this
        # operator has a match, e.g. mul() for mul_() or mul_out().
        # In-place foreach functions use the backward defined for the
        # existing native functions instead of their out-of-place
        # counterparts, hence the foreach check.
        f_sig = f.func.signature(strip_default=True)
        if f_sig in functional_info_by_signature and not is_foreach_func(f):
            return functional_info_by_signature[f_sig], False

        # (3) Some operators have a derivative explicitly defined for the
        # mutable variant, but get a code-generated out-of-place variant
        # which does *not* come with a derivative formula.
        if "generated" in f.tags and f_sig in non_functional_info_by_signature:
            info_dict = non_functional_info_by_signature[f_sig]
            assert not any(
                any(
                    "self" in str(inpt.nctype.name)
                    for inpt in info.all_saved_inputs
                )
                for info in info_dict.values()
            ), (
                "Attempted to convert a derivative formula for a mutable "
                f'operator to be used automatically by its functional variant ("{str(f.func)}"). '
                "This is not currently supported (we'd need to fix up the formula in the codegen)."
            )
            return info_dict, False

        # (4) Generate derivative information for foreach functions if none
        # is defined in `derivatives.yaml`.
        if is_foreach_func(f):
            assert f.func not in differentiability_infos
            diff_info, is_generated = gen_foreach_derivativeinfo(
                f,
                functional_info_by_signature,
                non_functional_info_by_signature,
            )
            if diff_info is None:
                return None, False
            diff_info_dict = {"Default": diff_info}
            if is_generated:
                differentiability_infos[f.func] = diff_info_dict
                functional_info_by_signature[f.func] = diff_info_dict
            return diff_info_dict, is_generated

        return None, False

    result: list[NativeFunctionWithDifferentiabilityInfo] = []
    for f in native_functions:
        info_dict, is_exact_match = find_info(f)

        # The '.strides()' to 'strides_or_error' replacement does not support
        # 'self' derivatives of an in-place function, so we must check for
        # this case.
        if f.func.kind() == SchemaKind.inplace and (info_dict is not None):
            for info in info_dict.values():
                for derivative in info.derivatives:
                    if "self" in derivative.var_names:
                        for saved_input in derivative.saved_inputs:
                            assert "strides_or_error" not in saved_input.expr, (
                                "Calling '.strides()' in the 'self' derivative "
                                "formula of an in-place function is not "
                                f"supported: {f.func}"
                            )

        if not info_dict:
            result.append(
                NativeFunctionWithDifferentiabilityInfo(
                    func=f, info=None, fw_derivatives=None
                )
            )
            continue

        fw_derivative_dict: dict[str, Sequence[ForwardDerivative]] = {}
        for key, info in info_dict.items():
            if not info.forward_derivatives:
                fw_derivative_dict[key] = []
                continue

            forward_derivatives = info.forward_derivatives

            # For in-place functions that share a definition with their
            # out-of-place variant (like abs()), fix up the formula.
            if f.func.kind() == SchemaKind.inplace:
                # Only a single-output in-place function should exist here.
                assert len(info.forward_derivatives) == 1
                fw_info = info.forward_derivatives[0]
                formula = fw_info.formula

                def replace_self_with_original_self(
                    formula: str, postfix: str
                ) -> str:
                    def repl(m: re.Match[str]) -> str:
                        return f"{m.group(1)}original_self{postfix}{m.group(2)}"

                    return re.sub(IDENT_REGEX.format(f"self{postfix}"), repl, formula)

                if re.search(IDENT_REGEX.format("self_p"), formula):
                    if is_exact_match:
                        # For manually defined formulas, don't allow the
                        # original value of self to be used.
                        raise RuntimeError(
                            f'The formula for "{f.func.name}" is using the '
                            "original value of self that is being modified "
                            "inplace. This would lead to wrong forward "
                            'gradients. Please use "result" in the formula '
                            "only."
                        )
                    else:
                        # When re-using the out-of-place formula, replace
                        # "self_p"/"self_t" with "original_self_p"/
                        # "original_self_t", which are populated from a clone
                        # of the original input.
                        formula = replace_self_with_original_self(formula, "_p")
                        formula = replace_self_with_original_self(formula, "_t")

                # Replace "result" in the formula with "self_p": in the
                # in-place codegen the result is simply self.
                def repl(m: re.Match[str]) -> str:
                    return f"{m.group(1)}self_p{m.group(2)}"

                formula = re.sub(IDENT_REGEX.format("result"), repl, formula)

                required_primals = fw_info.required_inputs_primal
                if re.search(IDENT_REGEX.format("self_p"), formula):
                    required_primals = (
                        required_primals + ("self",)
                        if required_primals
                        else ("self",)
                    )

                if not is_exact_match:
                    # The out-of-place formula creates a new forward grad
                    # rather than modifying the existing one of self, so the
                    # result must be written back into self_t. When the
                    # formula is a single method call on self_t of the same
                    # op, e.g. "self_t.op(args)", do the update in place as
                    # "self_t.op_(args)" instead of copy_().
                    is_single_method_on_self_t = False
                    directly_do_inplace = False
                    op_name: str | None = None
                    between_parens: str | None = None
                    match = re.fullmatch(r"self_t.([\w]*)\((.*)\)", formula)
                    if match:
                        op_name, between_parens = match.group(1), match.group(2)

                        # Make sure the captured argument text really is a
                        # single call's argument list: the parenthesis
                        # nesting level must stay above zero until the end.
                        def check_parens_nest_level_gt_zero(s: str) -> bool:
                            level = 1
                            for ch in s:
                                if ch == ")":
                                    level -= 1
                                    if level == 0:
                                        return False
                                if ch == "(":
                                    level += 1
                            return True

                        is_single_method_on_self_t = (
                            check_parens_nest_level_gt_zero(between_parens)
                        )
                        directly_do_inplace = (
                            is_single_method_on_self_t and op_name == info.name
                        )

                    if directly_do_inplace:
                        assert op_name is not None
                        assert between_parens is not None
                        formula = (
                            f"self_t_raw.defined() ? self_t_raw."
                            f"{op_name}_({between_parens}) : {formula}"
                        )
                    else:
                        # Make sure the forward grad is modified in place
                        # when the original formula is out of place.
                        formula = (
                            f"self_t_raw.defined() ? self_t_raw.copy_({formula}) "
                            f": {formula}"
                        )

                required_original_self_value = bool(
                    re.search(IDENT_REGEX.format("original_self_p"), formula)
                ) or bool(re.search(IDENT_REGEX.format("original_self_t"), formula))

                forward_derivatives = [
                    ForwardDerivative(
                        formula=formula,
                        var_names=("self",),
                        var_types=fw_info.var_types,
                        required_inputs_fw_grad=fw_info.required_inputs_fw_grad,
                        required_inputs_primal=required_primals,
                        required_original_self_value=required_original_self_value,
                        is_reusing_outplace_formula=not is_exact_match,
                    ),
                ]

            fw_derivative_dict[key] = forward_derivatives

        result.append(
            NativeFunctionWithDifferentiabilityInfo(
                func=f, info=info_dict, fw_derivatives=fw_derivative_dict
            )
        )

    return result


def is_differentiable(
    name: str, type: Type, info: DifferentiabilityInfo | None
) -> bool:
    return type.is_tensor_like() and (
        info is None or name not in info.non_differentiable_arg_names
    )


def gen_differentiable_outputs(
    fn: NativeFunctionWithDifferentiabilityInfo, key: str = "Default"
) -> list[DifferentiableOutput]:
    f = fn.func
    info = fn.info[key] if fn.info else None
    outputs: list[DifferentiableOutput] = [
        DifferentiableOutput(
            name=name,
            type=ret.type,
            cpp_type=cpp.return_type(ret, symint=True).cpp_type(),
        )
        for name, ret in zip(cpp.return_names(f), f.func.returns)
    ]
    output_differentiability = info.output_differentiability if info else None
    if output_differentiability is not None:
        if len(output_differentiability) != len(outputs):
            raise RuntimeError(
                f"The length of output_differentiability "
                f"({len(output_differentiability)}), does not match the number "
                f"of outputs ({len(outputs)})."
            )
        differentiable_outputs: list[DifferentiableOutput] = []
        if False in output_differentiability and f.func.kind() == SchemaKind.inplace:
            raise RuntimeError(
                "output_differentiability=False for inplace operation "
                "(version_counter won't get updated)"
            )
        for differentiable, output in zip(output_differentiability, outputs):
            if differentiable:
                differentiable_outputs.append(output)
        return differentiable_outputs
    candidate_differentiable_outputs = list(
        filter(lambda r: is_differentiable(r.name, r.type, info), outputs)
    )
    if uses_single_grad(info):
        return candidate_differentiable_outputs[:1]
    else:
        return candidate_differentiable_outputs
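A note on how `uses_ident` matches names inside derivative formulas: it relies on `torchgen.utils.IDENT_REGEX` to do whole-identifier matching, so that `grad` matches `grad * self` but not `grads[i]` or `grad_output`. The following is a minimal self-contained sketch of that check; the `IDENT_REGEX` value here is an assumption mirroring `torchgen.utils`, and `formula_uses_ident` is a hypothetical stand-in for the `uses_ident` loop body.

```python
import re

# Assumed to mirror torchgen.utils.IDENT_REGEX: the identifier must be
# bounded by start/end of string or a non-word character.
IDENT_REGEX = r"(^|\W){}($|\W)"


def formula_uses_ident(formula: str, ident: str) -> bool:
    # Same check as the body of uses_ident(): a whole-identifier match,
    # so "grad" does not match inside "grads[i]" or "grad_output".
    return re.search(IDENT_REGEX.format(ident), formula) is not None
```

This whole-identifier behavior is why `uses_single_grad` can safely probe for the bare name `grad` even though generated foreach formulas rewrite it to `grads[i]`.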
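The in-place forward-AD optimization in `match_differentiability_info` only rewrites `self_t.op(args)` into `self_t_raw.op_(args)` when the text captured between the outer parentheses is truly a single call's argument list. The guard is `check_parens_nest_level_gt_zero`, sketched standalone below: the nesting level starts at 1 (we are inside the captured parentheses) and must never return to zero before the end, which would mean the regex capture spans more than one call.

```python
def check_parens_nest_level_gt_zero(s: str) -> bool:
    # s is the text captured between the outer parentheses of
    # "self_t.op( ... )". Walk the string tracking parenthesis depth;
    # if depth ever reaches zero, the capture straddles a ")...(" pair,
    # e.g. the "x).add_(y" inside "self_t.mul(x).add_(y)".
    level = 1
    for ch in s:
        if ch == ")":
            level -= 1
            if level == 0:
                return False
        if ch == "(":
            level += 1
    return True
```

For example, the capture from `self_t.mul_(other)` is `other` (level stays at 1, check passes), while `self_t.mul(x).add_(y)` captures `x).add_(y`, whose `)` drops the level to zero and rejects the in-place rewrite.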