import ast
import collections
import io
import sys
import token
import tokenize
from abc import ABCMeta
from ast import Module, expr, AST
from functools import lru_cache
from typing import (
  Callable, Dict, Iterable, Iterator, List, Optional, Tuple, Union, cast, Any, TYPE_CHECKING, Type,
)

if TYPE_CHECKING:  # pragma: no cover
  from .astroid_compat import NodeNG

  # Type used only by type checkers to widen AST with the extra fields set by this library.
  class EnhancedAST(AST):
    # Additional attributes set by mark_tokens
    first_token = ...  # type: Token
    last_token = ...  # type: Token
    lineno = 0

  AstNode = Union[EnhancedAST, NodeNG]

  TokenInfo = tokenize.TokenInfo


def token_repr(tok_type, string):
  # type: (int, Optional[str]) -> str
  """Returns a human-friendly representation of a token with the given type and string."""
  # repr() of a text string may carry a 'u' prefix; strip it for consistency.
  return '%s:%s' % (token.tok_name[tok_type], repr(string).lstrip('u'))


class Token(collections.namedtuple('Token', 'type string start end line index startpos endpos')):
  """
  TokenInfo is an 8-tuple containing the same 5 fields as the tokens produced by the tokenize
  module, and 3 additional ones useful for this module:

  - [0] .type     Token type (see token.py)
  - [1] .string   Token (a string)
  - [2] .start    Starting (row, column) indices of the token (a 2-tuple of ints)
  - [3] .end      Ending (row, column) indices of the token (a 2-tuple of ints)
  - [4] .line     Original line (string)
  - [5] .index    Index of the token in the list of tokens that it belongs to.
  - [6] .startpos Starting character offset into the input text.
  - [7] .endpos   Ending character offset into the input text.
  """
  def __str__(self):
    # type: () -> str
    return token_repr(self.type, self.string)


def match_token(token, tok_type, tok_str=None):
  # type: (Token, int, Optional[str]) -> bool
  """Returns true if token is of the given type and, if a string is given, has that string."""
  return token.type == tok_type and (tok_str is None or token.string == tok_str)


def expect_token(token, tok_type, tok_str=None):
  # type: (Token, int, Optional[str]) -> None
  """
  Verifies that the given token is of the expected type. If tok_str is given, the token string
  is verified too. If the token doesn't match, raises an informative ValueError.
  """
  if not match_token(token, tok_type, tok_str):
    raise ValueError(
      "Expected token %s, got %s on line %s col %s" % (
        token_repr(tok_type, tok_str), str(token),
        token.start[0], token.start[1] + 1))


def is_non_coding_token(token_type):
  # type: (int) -> bool
  """
  These are considered non-coding tokens, as they don't affect the syntax tree.
  """
  return token_type in (token.NL, token.COMMENT, token.ENCODING)


def generate_tokens(text):
  # type: (str) -> Iterator[tokenize.TokenInfo]
  """
  Generates standard library tokens for the given code.
  """
  return tokenize.generate_tokens(cast(Callable[[], str], io.StringIO(text).readline))


def iter_children_func(node):
  # type: (AST) -> Callable
  """
  Returns a function which yields all direct children of an AST node,
  skipping children that are singleton nodes.
  The function depends on whether ``node`` is from ``ast`` or from the ``astroid`` module.
  """
  return iter_children_astroid if hasattr(node, 'get_children') else iter_children_ast


def iter_children_astroid(node, include_joined_str=False):
  # type: (NodeNG, bool) -> Union[Iterator, List]
  if not include_joined_str and is_joined_str(node):
    return []

  return node.get_children()


SINGLETONS = {
  c for n, c in ast.__dict__.items()
  if isinstance(c, type) and issubclass(
    c, (ast.expr_context, ast.boolop, ast.operator, ast.unaryop, ast.cmpop))
}


def iter_children_ast(node, include_joined_str=False):
  # type: (AST, bool) -> Iterator[Union[AST, expr]]
  if not include_joined_str and is_joined_str(node):
    return

  if isinstance(node, ast.Dict):
    # Yield keys and values in source order (key1, value1, key2, value2, ...) rather than
    # all keys followed by all values.
    for (key, value) in zip(node.keys, node.values):
      if key is not None:
        yield key
      yield value
    return

  for child in ast.iter_child_nodes(node):
    # Skip singleton children; they don't reflect particular positions in the code and would
    # break the assumption that the tree consists of distinct nodes.
    if child.__class__ not in SINGLETONS:
      yield child


stmt_class_names = {n for n, c in ast.__dict__.items()
                    if isinstance(c, type) and issubclass(c, ast.stmt)}
expr_class_names = ({n for n, c in ast.__dict__.items()
                     if isinstance(c, type) and issubclass(c, ast.expr)} |
                    {'Const', 'DelAttr', 'DelName', 'AssignAttr', 'AssignName'})


# Checking class names (rather than isinstance) lets the same code handle both ast and astroid
# nodes without importing astroid.
def is_expr(node):
  # type: (AstNode) -> bool
  """Returns whether node is an expression node."""
  return node.__class__.__name__ in expr_class_names


def is_stmt(node):
  # type: (AstNode) -> bool
  """Returns whether node is a statement node."""
  return node.__class__.__name__ in stmt_class_names


def is_module(node):
  # type: (AstNode) -> bool
  """Returns whether node is a module node."""
  return node.__class__.__name__ == 'Module'


def is_joined_str(node):
  # type: (AstNode) -> bool
  """Returns whether node is a JoinedStr node, used to represent f-strings."""
  # Nodes inside a JoinedStr may have unreliable positions; see annotate_fstring_nodes() below.
  return node.__class__.__name__ == 'JoinedStr'
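
# Illustrative sketch, not part of the original module: shows how the token helpers above fit
# together. The helper name and the sample source string are hypothetical, for demonstration only.
def _example_first_name_token(source="total = price * quantity"):
  # generate_tokens() drives the standard tokenize module; match_token() only inspects the
  # .type/.string attributes, so it accepts plain tokenize.TokenInfo objects as well as Token.
  for tok in generate_tokens(source):
    if match_token(tok, token.NAME):
      return tok.string  # for the default source this is "total"
  return None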
def is_expr_stmt(node):
  # type: (AstNode) -> bool
  """Returns whether node is an `Expr` node, which is a statement that is an expression."""
  return node.__class__.__name__ == 'Expr'


CONSTANT_CLASSES: Tuple[Type, ...] = (ast.Constant,)
try:
  from astroid import Const
  CONSTANT_CLASSES += (Const,)
except ImportError:
  # astroid is not available
  pass


def is_constant(node):
  # type: (AstNode) -> bool
  """Returns whether node is a Constant node."""
  return isinstance(node, CONSTANT_CLASSES)


def is_ellipsis(node):
  # type: (AstNode) -> bool
  """Returns whether node is an Ellipsis node."""
  return is_constant(node) and node.value is Ellipsis


def is_starred(node):
  # type: (AstNode) -> bool
  """Returns whether node is a starred expression node."""
  return node.__class__.__name__ == 'Starred'


def is_slice(node):
  # type: (AstNode) -> bool
  """Returns whether node represents a slice, e.g. `1:2` in `x[1:2]`."""
  # Before Python 3.9 a tuple of slices is an ExtSlice; from 3.9 on it is a Tuple of Slices.
  return (
    node.__class__.__name__ in ('Slice', 'ExtSlice')
    or (
      node.__class__.__name__ == 'Tuple'
      and any(map(is_slice, cast(ast.Tuple, node).elts))
    )
  )


def is_empty_astroid_slice(node):
  # type: (AstNode) -> bool
  return (
    node.__class__.__name__ == 'Slice'
    and not isinstance(node, ast.AST)
    and node.lower is node.upper is node.step is None
  )


# Sentinel value used by visit_tree().
_PREVISIT = object()


def visit_tree(node, previsit, postvisit):
  # type: (Module, Callable, Optional[Callable]) -> Any
  """
  Scans the tree under the node depth-first using an explicit stack. It avoids implicit recursion
  via the function call stack to avoid hitting 'maximum recursion depth exceeded' error. It
  calls ``previsit()`` and ``postvisit()`` as follows:

  * ``previsit(node, par_value)`` - should return ``(par_value, value)``
        ``par_value`` is as returned from ``previsit()`` of the parent.

  * ``postvisit(node, par_value, value)`` - should return ``value``
        ``par_value`` is as returned from ``previsit()`` of the parent, and ``value`` is as
        returned from ``previsit()`` of this node itself. The return ``value`` is ignored except
        the one for the root node, which is returned from the overall ``visit_tree()`` call.

  For the initial node, ``par_value`` is None. ``postvisit`` may be None.
  """
  if not postvisit:
    postvisit = lambda node, pvalue, value: None

  iter_children = iter_children_func(node)
  done = set()
  ret = None
  stack = [(node, None, _PREVISIT)]
  while stack:
    current, par_value, value = stack.pop()
    if value is _PREVISIT:
      # Protect against infinite loops in case of a malformed tree.
      assert current not in done
      done.add(current)

      pvalue, post_value = previsit(current, par_value)
      stack.append((current, par_value, post_value))

      # Insert all children at a fixed position so that the first child ends up on top of the
      # stack and is processed first.
      ins = len(stack)
      for c in iter_children(current):
        stack.insert(ins, (c, pvalue, _PREVISIT))
    else:
      ret = postvisit(current, par_value, value)
  return ret


def walk(node, include_joined_str=False):
  # type: (AST, bool) -> Iterator[Union[Module, AstNode]]
  """
  Recursively yield all descendant nodes in the tree starting at ``node`` (including ``node``
  itself), using depth-first pre-order traversal (yielding parents before their children).

  This is similar to ``ast.walk()``, but with a different order, and it works for both ``ast`` and
  ``astroid`` trees. Also, as ``iter_children()``, it skips singleton nodes generated by ``ast``.

  By default, ``JoinedStr`` (f-string) nodes and their contents are skipped
  because they previously couldn't be handled. Set ``include_joined_str`` to True to include them.
  """
  iter_children = iter_children_func(node)
  done = set()
  stack = [node]
  while stack:
    current = stack.pop()
    # Protect against infinite loops in case of a malformed tree.
    assert current not in done
    done.add(current)

    yield current

    # Insert all children at a fixed position so that the first child ends up on top of the
    # stack and is yielded first.
    ins = len(stack)
    for c in iter_children(current, include_joined_str):
      stack.insert(ins, c)


def replace(text, replacements):
  # type: (str, Iterable[Tuple[int, int, str]]) -> str
  """
  Replaces multiple slices of text with new values. This is a convenience method for making code
  modifications of ranges e.g. as identified by ``ASTTokens.get_text_range(node)``. Replacements is
  an iterable of ``(start, end, new_text)`` tuples.

  For example, ``replace("this is a test", [(0, 4, "X"), (8, 9, "THE")])`` produces
  ``"X is THE test"``.
  """
  p = 0
  parts = []
  for (start, end, new_text) in sorted(replacements):
    parts.append(text[p:start])
    parts.append(new_text)
    p = end
  parts.append(text[p:])
  return ''.join(parts)


class NodeMethods(object):
  """
  Helper to get `visit_{node_type}` methods given a node's class and cache the results.
  """
  def __init__(self):
    # type: () -> None
    self._cache = {}  # type: Dict[Union[ABCMeta, type], Callable]

  def get(self, obj, cls):
    # type: (Any, Union[ABCMeta, type]) -> Callable
    """
    Using the lowercase name of the class as node_type, returns `obj.visit_{node_type}`,
    or `obj.visit_default` if the type-specific method is not found.
    """
    method = self._cache.get(cls)
    if not method:
      name = "visit_" + cls.__name__.lower()
      method = getattr(obj, name, obj.visit_default)
      self._cache[cls] = method
    return method
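
# Illustrative sketch, not part of the original module: a small demonstration of walk() and
# replace(). The helper name and the hard-coded offsets are hypothetical, for demonstration only.
def _example_walk_and_replace():
  tree = ast.parse("a = b + 1")
  # Depth-first pre-order traversal; singleton nodes such as operators and contexts are skipped.
  names = [type(node).__name__ for node in walk(tree)]
  # names == ['Module', 'Assign', 'Name', 'BinOp', 'Name', 'Constant']
  # Apply two non-overlapping character-offset edits, the kind of ranges that
  # ASTTokens.get_text_range() produces.
  edited = replace("a = b + 1", [(0, 1, "total"), (4, 5, "count")])
  # edited == "total = count + 1"
  return names, edited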
def patched_generate_tokens(original_tokens):
  # type: (Iterator[tokenize.TokenInfo]) -> Iterator[tokenize.TokenInfo]
  """
  Fixes tokens yielded by `tokenize.generate_tokens` to handle more non-ASCII characters in identifiers.
  Workaround for https://github.com/python/cpython/issues/68382.
  Should only be used when tokenizing a string that is known to be valid syntax,
  because it assumes that error tokens are not actually errors.
  Combines groups of consecutive NAME, NUMBER, and/or ERRORTOKEN tokens into a single NAME token.
  """
  group = []  # type: List[tokenize.TokenInfo]
  for tok in original_tokens:
    if (
      tok.type in (tokenize.NAME, tokenize.ERRORTOKEN, tokenize.NUMBER)
      # Only combine tokens that are adjacent, i.e. have no whitespace between them.
      and (not group or group[-1].end == tok.start)
    ):
      group.append(tok)
    else:
      for combined_token in combine_tokens(group):
        yield combined_token
      group = []
      yield tok
  for combined_token in combine_tokens(group):
    yield combined_token


def combine_tokens(group):
  # type: (List[tokenize.TokenInfo]) -> List[tokenize.TokenInfo]
  # Only combine a group that contains at least one ERRORTOKEN and sits on a single line.
  if not any(tok.type == tokenize.ERRORTOKEN for tok in group) or len({tok.line for tok in group}) != 1:
    return group
  return [
    tokenize.TokenInfo(
      type=tokenize.NAME,
      string="".join(t.string for t in group),
      start=group[0].start,
      end=group[-1].end,
      line=group[0].line,
    )
  ]


def last_stmt(node):
  # type: (ast.AST) -> ast.AST
  """
  If the given AST node contains multiple statements, return the last one.
  Otherwise, just return the node.
  """
  child_stmts = [
    child for child in iter_children_func(node)(node)
    if is_stmt(child) or type(child).__name__ in (
      "excepthandler",
      "ExceptHandler",
      "match_case",
      "MatchCase",
      "TryExcept",
      "TryFinally",
    )
  ]
  if child_stmts:
    return last_stmt(child_stmts[-1])
  return node


@lru_cache(maxsize=None)
def fstring_positions_work():
  # type: () -> bool
  """
  The positions attached to nodes inside f-string FormattedValues have some bugs
  that were fixed in Python 3.9.7 in https://github.com/python/cpython/pull/27729.
  This checks for those bugs more concretely without relying on the Python version.
  Specifically this checks:
   - Values with a format spec or conversion
   - Repeated (i.e. identical-looking) expressions
   - f-strings implicitly concatenated over multiple lines.
   - Multiline, triple-quoted f-strings.
  """
  source = """(
    f"a {b}{b} c {d!r} e {f:g} h {i:{j}} k {l:{m:n}}"
    f"a {b}{b} c {d!r} e {f:g} h {i:{j}} k {l:{m:n}}"
    f"{x + y + z} {x} {y} {z} {z} {z!a} {z:z}"
    f'''
    {s} {t}
    {u} {v}
    '''
  )"""
  tree = ast.parse(source)
  name_nodes = [node for node in ast.walk(tree) if isinstance(node, ast.Name)]
  name_positions = [(node.lineno, node.col_offset) for node in name_nodes]
  positions_are_unique = len(set(name_positions)) == len(name_positions)
  correct_source_segments = all(
    ast.get_source_segment(source, node) == node.id
    for node in name_nodes
  )
  return positions_are_unique and correct_source_segments


def annotate_fstring_nodes(tree):
  # type: (ast.AST) -> None
  """
  Add a special attribute `_broken_positions` to nodes inside f-strings
  if the lineno/col_offset cannot be trusted.
  """
  if sys.version_info >= (3, 12):
    # Since Python 3.12 (PEP 701), nodes inside f-strings have reliable positions,
    # so no annotation is needed.
    return
  for joinedstr in walk(tree, include_joined_str=True):
    if not isinstance(joinedstr, ast.JoinedStr):
      continue
    for part in joinedstr.values:
      # The positions of the FormattedValue/Constant parts span the full f-string.
      setattr(part, '_broken_positions', True)  # use setattr for mypy

      if isinstance(part, ast.FormattedValue):
        if not fstring_positions_work():
          for child in ast.walk(part.value):
            setattr(child, '_broken_positions', True)

        if part.format_spec:  # this is another JoinedStr
          # Again, the standard positions span the full f-string.
          setattr(part.format_spec, '_broken_positions', True)
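
# Illustrative sketch, not part of the original module: exercises the f-string helpers above
# when this file is run directly. The exact output depends on the Python version in use.
if __name__ == "__main__":
  _demo_tree = ast.parse('f"{value!r:>{width}}"')
  annotate_fstring_nodes(_demo_tree)
  _flagged = sorted({
    type(node).__name__
    for node in ast.walk(_demo_tree)
    if getattr(node, "_broken_positions", False)
  })
  print("fstring_positions_work():", fstring_positions_work())
  print("nodes flagged with _broken_positions:", _flagged)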