import logging
import weakref
from typing import TYPE_CHECKING

import torch
from torch.autograd.graph import register_multi_grad_hook
from torch.nn.modules.module import (
    register_module_forward_hook,
    register_module_forward_pre_hook,
)
from torch.utils._pytree import tree_flatten

if TYPE_CHECKING:
    from torch.utils.hooks import RemovableHandle


logger = logging.getLogger(__name__)

__all__ = ["ModuleTracker"]


class ModuleTracker:
    """
    ``ModuleTracker`` is a context manager that tracks the nn.Module hierarchy during execution
    so that other systems can query which Module is currently being executed (or whose backward
    is being executed).

    You can access the ``parents`` attribute on this context manager to get the set of all the
    Modules currently being executed via their fqn (fully qualified name, also used as the key
    within the state_dict).
    You can access the ``is_bw`` attribute to know if you are currently running in backward or not.

    Note that ``parents`` is never empty and always contains the "Global" key. The ``is_bw`` flag
    will remain ``True`` after the forward until another Module is executed. If you need it to be
    more accurate, please submit an issue requesting this. Adding a map from fqn to the module
    instance is possible but not done yet, please submit an issue requesting this if you need it.

    Example usage

    .. code-block:: python

        mod = torch.nn.Linear(2, 2)

        with ModuleTracker() as tracker:
            # Access anything during the forward pass
            def my_linear(m1, m2, bias):
                print(f"Current modules: {tracker.parents}")
                return torch.mm(m1, m2.t()) + bias

            torch.nn.functional.linear = my_linear

            mod(torch.rand(2, 2))

    """

    parents: set[str]
    """
    A Set containing the fqn for each module currently running their forward
    """

    def __init__(self) -> None:
        self.parents = {"Global"}
        self._known_modules: weakref.WeakKeyDictionary = weakref.WeakKeyDictionary()
        self._seen_modules: weakref.WeakSet = weakref.WeakSet()
        self._has_callback = False
        self._hooks: list["RemovableHandle"] = []

    def _maybe_set_engine_callback(self):
        # This assumes no concurrent calls to backward
        if self._has_callback:
            return

        def callback():
            self.parents = {"Global"}
            self._has_callback = False

        torch.autograd.Variable._execution_engine.queue_callback(callback)
        self._has_callback = True

    @property
    def is_bw(self):
        """
        A boolean marking if this is currently running during the backward pass or not
        """
        return torch._C._current_graph_task_id() != -1

    def _get_mod_name(self, mod):
        if mod not in self._known_modules:
            self._known_modules[mod] = type(mod).__name__
        mod_name = self._known_modules[mod]
        if mod not in self._seen_modules:
            for name, submod in mod.named_children():
                self._known_modules[submod] = f"{mod_name}.{name}"
                self._get_mod_name(submod)
            self._seen_modules.add(mod)
        return mod_name

    def _get_append_fn(self, name, is_bw):
        def fn(*args):
            if is_bw:
                self._maybe_set_engine_callback()
            if name in self.parents:
                logger.info(
                    "The module hierarchy tracking seems to be broken as this Module was already entered. %s during %s",
                    name,
                    "backward" if is_bw else "forward",
                )
            self.parents.add(name)

        return fn

    def _get_pop_fn(self, name, is_bw):
        def fn(*args):
            if name in self.parents:
                self.parents.remove(name)
            else:
                logger.info(
                    "The Module hierarchy tracking is confused as we're exiting a Module that was never entered. %s during %s",
                    name,
                    "backward" if is_bw else "forward",
                )

        return fn

    def _fw_pre_hook(self, mod, input):
        name = self._get_mod_name(mod)
        self._get_append_fn(name, False)()

        args, _ = tree_flatten(input)
        tensors = [a for a in args if isinstance(a, torch.Tensor) and a.requires_grad]
        if tensors:
            self._hooks.append(
                register_multi_grad_hook(tensors, self._get_pop_fn(name, True))
            )

    def _fw_post_hook(self, mod, input, output):
        name = self._get_mod_name(mod)
        self._get_pop_fn(name, False)()

        args, _ = tree_flatten(output)
        tensors = [a for a in args if isinstance(a, torch.Tensor) and a.requires_grad]
        if tensors:
            self._hooks.append(
                register_multi_grad_hook(tensors, self._get_append_fn(name, True))
            )

    def __enter__(self):
        self._fw_pre_handle = register_module_forward_pre_hook(self._fw_pre_hook)
        self._fw_post_handle = register_module_forward_hook(self._fw_post_hook)
        return self

    def __exit__(self, *args):
        self._fw_pre_handle.remove()
        self._fw_post_handle.remove()
        for hook in self._hooks:
            hook.remove()
        self._hooks.clear()
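For illustration, here is a minimal, torch-free sketch of the naming scheme that `_get_mod_name` implements: a module's fqn is its class name, and each child found via `named_children()` gets the parent's fqn plus a dotted suffix, assigned recursively on first sight. `ToyModule` and `NameTracker` below are hypothetical stand-ins (plain dicts instead of the weakref containers used above), not part of this file.

```python
class ToyModule:
    """Hypothetical stand-in for nn.Module with just enough API for the sketch."""

    def __init__(self, **children):
        self._children = children

    def named_children(self):
        # Mirrors nn.Module.named_children(): yields (name, child) pairs.
        return self._children.items()


class NameTracker:
    """Toy version of ModuleTracker._get_mod_name: assigns dotted fqns."""

    def __init__(self):
        self._known = {}   # module -> fqn (the real code uses a WeakKeyDictionary)
        self._seen = set()  # modules whose children have already been named

    def get_mod_name(self, mod):
        if mod not in self._known:
            self._known[mod] = type(mod).__name__
        name = self._known[mod]
        if mod not in self._seen:
            # Name each child relative to this module, then recurse so
            # grandchildren are named before they are ever looked up.
            for child_name, child in mod.named_children():
                self._known[child] = f"{name}.{child_name}"
                self.get_mod_name(child)
            self._seen.add(mod)
        return name


leaf = ToyModule()
inner = ToyModule(fc=leaf)
root = ToyModule(block=inner)

t = NameTracker()
print(t.get_mod_name(root))   # ToyModule
print(t.get_mod_name(inner))  # ToyModule.block
print(t.get_mod_name(leaf))   # ToyModule.block.fc
```

Note that naming the root once is enough to name the whole subtree, which is why a later lookup of `leaf` returns its full dotted path without any extra traversal.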