"""
Defines a variety of Pygments lexers for highlighting IPython code.

This includes:

    IPythonLexer, IPython3Lexer
        Lexers for pure IPython (python + magic/shell commands)

    IPythonPartialTracebackLexer, IPythonTracebackLexer
        The partial traceback lexer reads everything but the Python code
        appearing in a traceback.  The full lexer combines the partial lexer
        with an IPython lexer.

    IPythonConsoleLexer
        A lexer for IPython console sessions, with support for tracebacks.

    IPyLexer
        A friendly lexer which examines the first line of text and from it,
        decides whether to use an IPython lexer or an IPython console lexer.
        This is probably the only lexer that needs to be explicitly added
        to Pygments.

"""

__version__ = "1.1.1"

import re

from pygments.lexers import (
    BashLexer,
    HtmlLexer,
    JavascriptLexer,
    RubyLexer,
    PerlLexer,
    Python2Lexer,
    Python3Lexer,
    TexLexer,
)
from pygments.lexer import (
    Lexer,
    DelegatingLexer,
    RegexLexer,
    do_insertions,
    bygroups,
    using,
)
from pygments.token import (
    Generic,
    Keyword,
    Literal,
    Name,
    Operator,
    Other,
    Text,
    Error,
)

line_re = re.compile(".*?\n")

__all__ = [
    "IPython3Lexer",
    "IPythonLexer",
    "IPythonPartialTracebackLexer",
    "IPythonTracebackLexer",
    "IPythonConsoleLexer",
    "IPyLexer",
]

# Token rules for IPython magics and shell escapes.  These are prepended to
# the Python 3 "root" rules so that cell magics, line magics, help queries
# and "!" shell escapes are highlighted before plain Python takes over.
ipython_tokens = [
    (r"(?s)(\s*)(%%capture)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%debug)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?is)(\s*)(%%html)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(HtmlLexer))),
    (r"(?s)(\s*)(%%javascript)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(JavascriptLexer))),
    (r"(?s)(\s*)(%%js)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(JavascriptLexer))),
    (r"(?s)(\s*)(%%latex)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(TexLexer))),
    (r"(?s)(\s*)(%%perl)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(PerlLexer))),
    (r"(?s)(\s*)(%%prun)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%pypy)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%python2)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python2Lexer))),
    (r"(?s)(\s*)(%%python3)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%python)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%ruby)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(RubyLexer))),
    (r"(?s)(\s*)(%%timeit)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%time)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%writefile)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    (r"(?s)(\s*)(%%file)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(Python3Lexer))),
    # Any other cell magic: highlight the magic name as a keyword.
    (r"(?s)(\s*)(%%)(\w+)(.*)", bygroups(Text, Operator, Keyword, Text)),
    # A %%! cell: the whole body is shell code.
    (r"(?s)(^\s*)(%%!)([^\n]*\n)(.*)", bygroups(Text, Operator, Text, using(BashLexer))),
    # Help on a magic: %magic? / %%magic??
    (r"(%%?)(\w+)(\?\??)$", bygroups(Operator, Keyword, Operator)),
    (r"\b(\?\??)(\s*)$", bygroups(Operator, Text)),
    # Line magics that wrap shell commands.
    (r"(%)(sx|sc|system)(.*)(\n)", bygroups(Operator, Keyword, using(BashLexer), Text)),
    # Any other line magic.
    (r"(%)(\w+)(.*\n)", bygroups(Operator, Keyword, Text)),
    # Shell escapes: !! and !
    (r"^(!!)(.+)(\n)", bygroups(Operator, using(BashLexer), Text)),
    (r"(!)(?!=)(.+)(\n)", bygroups(Operator, using(BashLexer), Text)),
    # Object introspection with a leading or trailing ? / ??
    (r"^(\s*)(\?\??)(\s*%{0,2}[\w\.\*]*)", bygroups(Text, Operator, Text)),
    (r"(\s*%{0,2}[\w\.\*]*)(\?\??)(\s*)$", bygroups(Text, Operator, Text)),
]


class IPython3Lexer(Python3Lexer):
    """IPython code lexer (based on Python 3)"""

    name = "IPython"
    aliases = ["ipython", "ipython3"]

    tokens = Python3Lexer.tokens.copy()
    tokens["root"] = ipython_tokens + tokens["root"]


# The generic name is kept as an alias of the Python 3 based lexer.
IPythonLexer = IPython3Lexer


class IPythonPartialTracebackLexer(RegexLexer):
    """
    Partial lexer for IPython tracebacks.

    Handles all the non-python output.

    """

    name = "IPython Partial Traceback"

    tokens = {
        "root": [
            # Tracebacks for syntax errors have a different style.  For both
            # types of tracebacks, the first line is marked with
            # Generic.Traceback.  These two regexps also define how
            # IPythonConsoleLexer finds a traceback.
            #
            # Non-syntax traceback
            (r"^(\^C)?(-+\n)", bygroups(Error, Generic.Traceback)),
            # Syntax traceback
            (
                r"^(  File)(.*)(, line )(\d+\n)",
                bygroups(
                    Generic.Traceback,
                    Name.Namespace,
                    Generic.Traceback,
                    Literal.Number.Integer,
                ),
            ),
            # (Exception Identifier)(Whitespace)(Traceback Message)
            (
                r"(?u)(^[^\d\W]\w*)(\s*)(Traceback.*?\n)",
                bygroups(Name.Exception, Generic.Whitespace, Text),
            ),
            # (Module/Filename)(Text)(Callee)(Function Signature)
            (
                r"(.*)( in )(.*)(\(.*\)\n)",
                bygroups(Name.Namespace, Text, Name.Entity, Name.Tag),
            ),
            # Regular line: (Whitespace)(Line Number)(Python Code)
            (
                r"(\s*?)(\d+)(.*?\n)",
                bygroups(Generic.Whitespace, Literal.Number.Integer, Other),
            ),
            # Emphasized line: (Arrow)(Line Number)(Python Code)
            (
                r"(-*>?\s?)(\d+)(.*?\n)",
                bygroups(Name.Exception, Literal.Number.Integer, Other),
            ),
            # (Exception Identifier)(Message)
            (r"(?u)(^[^\d\W]\w*)(:.*?\n)", bygroups(Name.Exception, Text)),
            # Tag everything else as Other; it is handled by the delegating
            # traceback lexer below.
            (r".*\n", Other),
        ],
    }
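
# A minimal usage sketch: every class in this module is a regular Pygments
# lexer, so it plugs straight into the standard ``pygments.highlight``
# pipeline.  The helper below and its sample cell are illustrative only and
# are not part of this module's public API; the Pygments calls it makes
# (``highlight``, ``HtmlFormatter``, ``Lexer.get_tokens``) are the real API.
def _example_highlight_ipython(code="%timeit total = sum(range(10))\n!ls -l\n"):
    """Render a small IPython cell to HTML and also return its token stream."""
    from pygments import highlight
    from pygments.formatters import HtmlFormatter

    lexer = IPython3Lexer()
    html = highlight(code, lexer, HtmlFormatter())
    tokens = list(lexer.get_tokens(code))  # (token_type, text) pairs
    return html, tokens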
class IPythonTracebackLexer(DelegatingLexer):
    """
    IPython traceback lexer.

    For doctests, the tracebacks can be snipped as much as desired with the
    exception of the lines that designate a traceback.  For non-syntax error
    tracebacks, this is the line of hyphens.  For syntax error tracebacks,
    this is the line which lists the File and line number.

    """

    name = "IPython Traceback"
    aliases = ["ipythontb", "ipython3tb"]

    def __init__(self, **options):
        """
        A subclass of `DelegatingLexer` which delegates the traceback
        scaffolding to `IPythonPartialTracebackLexer` and the embedded Python
        code to `IPython3Lexer`.
        """
        # An explicit __init__ docstring is needed; otherwise the docstring is
        # inherited from DelegatingLexer, which breaks the documentation build.
        DelegatingLexer.__init__(
            self, IPython3Lexer, IPythonPartialTracebackLexer, **options
        )


class IPythonConsoleLexer(Lexer):
    """
    An IPython console lexer for IPython code-blocks and doctests, such as:

    .. code-block:: rst

        .. code-block:: ipythonconsole

            In [1]: a = 'foo'

            In [2]: a
            Out[2]: 'foo'

            In [3]: print(a)
            foo

    Support is also provided for IPython exceptions:

    .. code-block:: rst

        .. code-block:: ipythonconsole

            In [1]: raise Exception
            Traceback (most recent call last):
            ...
            Exception

    """

    name = "IPython console session"
    aliases = ["ipythonconsole", "ipython3console"]
    mimetypes = ["text/x-ipython-console"]

    # The regexps used to determine what is input and what is output.  The
    # default IPython prompts are ``In [#]: ``, ``   ...: `` and ``Out[#]: ``,
    # where '#' is the execution count and the number of dots matches its
    # width.
    in1_regex = r"In \[[0-9]+\]: "
    in2_regex = r"   \.\.+\.: "
    out_regex = r"Out\[[0-9]+\]: "

    #: The regex to determine when a traceback starts.
    ipytb_start = re.compile(r"^(\^C)?(-+\n)|^(  File)(.*)(, line )(\d+\n)")

    def __init__(self, **options):
        """Initialize the IPython console lexer.

        Parameters
        ----------
        in1_regex : RegexObject
            The compiled regular expression used to detect the start
            of inputs. Although the IPython configuration setting may have a
            trailing whitespace, do not include it in the regex. If `None`,
            then the default input prompt is assumed.
        in2_regex : RegexObject
            The compiled regular expression used to detect the continuation
            of inputs. Although the IPython configuration setting may have a
            trailing whitespace, do not include it in the regex. If `None`,
            then the default input prompt is assumed.
        out_regex : RegexObject
            The compiled regular expression used to detect outputs. If `None`,
            then the default output prompt is assumed.

        """
        in1_regex = options.get("in1_regex", self.in1_regex)
        in2_regex = options.get("in2_regex", self.in2_regex)
        out_regex = options.get("out_regex", self.out_regex)

        # So that we can work with input and output prompts which have been
        # rstrip'd (possibly by editors), we also need rstrip'd variants.  Any
        # whitespace belonging to a prompt is emitted with the prompt token,
        # so formatters can hide prompts cleanly.
        in1_regex_rstrip = in1_regex.rstrip() + "\n"
        in2_regex_rstrip = in2_regex.rstrip() + "\n"
        out_regex_rstrip = out_regex.rstrip() + "\n"

        # Compile and save them all.
        attrs = [
            "in1_regex",
            "in2_regex",
            "out_regex",
            "in1_regex_rstrip",
            "in2_regex_rstrip",
            "out_regex_rstrip",
        ]
        for attr in attrs:
            self.__setattr__(attr, re.compile(locals()[attr]))

        Lexer.__init__(self, **options)

        self.pylexer = IPython3Lexer(**options)
        self.tblexer = IPythonTracebackLexer(**options)

        self.reset()

    def reset(self):
        self.mode = "output"
        self.index = 0
        self.buffer = ""
        self.insertions = []

    def buffered_tokens(self):
        """
        Generator of unprocessed tokens after doing insertions and before
        changing to a new state.

        """
        if self.mode == "output":
            tokens = [(0, Generic.Output, self.buffer)]
        elif self.mode == "input":
            tokens = self.pylexer.get_tokens_unprocessed(self.buffer)
        else:  # traceback
            tokens = self.tblexer.get_tokens_unprocessed(self.buffer)

        for i, t, v in do_insertions(self.insertions, tokens):
            # All token indexes are relative to the buffer.
            yield self.index + i, t, v

        # Clear it all.
        self.index += len(self.buffer)
        self.buffer = ""
        self.insertions = []

    def get_mci(self, line):
        """
        Parses the line and returns a 3-tuple: (mode, code, insertion).

        `mode` is the next mode (or state) of the lexer, and is always equal
        to 'input', 'output', or 'tb'.

        `code` is a portion of the line that should be added to the buffer
        corresponding to the next mode and eventually lexed by another lexer.
        For example, `code` could be Python code if `mode` were 'input'.

        `insertion` is a 3-tuple (index, token, text) representing an
        unprocessed "token" that will be inserted into the stream of tokens
        that are created from the buffer once we change modes. This is usually
        the input or output prompt.

        In general, the next mode depends on current mode and on the contents
        of `line`.

        """
        # To reduce the number of regex match checks, we use multiple 'if'
        # blocks instead of an 'if/elif' chain.

        # Check for a possible end of input.
        in2_match = self.in2_regex.match(line)
        in2_match_rstrip = self.in2_regex_rstrip.match(line)
        if (in2_match and in2_match.group().rstrip() == line.rstrip()) or \
           in2_match_rstrip:
            end_input = True
        else:
            end_input = False
        if end_input and self.mode != "tb":
            # Only look for an end of input when not in tb mode, since an
            # ellipsis could appear within the traceback.
            mode = "output"
            code = ""
            insertion = (0, Generic.Prompt, line)
            return mode, code, insertion

        # Check for an output prompt.
        out_match = self.out_regex.match(line)
        out_match_rstrip = self.out_regex_rstrip.match(line)
        if out_match or out_match_rstrip:
            mode = "output"
            if out_match:
                idx = out_match.end()
            else:
                idx = out_match_rstrip.end()
            code = line[idx:]
            # Use the 'heading' token for the output prompt; Generic.Error
            # would conflict with exceptions.
            insertion = (0, Generic.Heading, line[:idx])
            return mode, code, insertion

        # Check for an input or continuation prompt (non-stripped version).
        in1_match = self.in1_regex.match(line)
        if in1_match or (in2_match and self.mode != "tb"):
            # New input, or (when not in tb mode) continued input.  Continued
            # input is not checked while in tb mode since it is allowable to
            # replace a long stack with an ellipsis.
            mode = "input"
            if in1_match:
                idx = in1_match.end()
            else:  # in2_match
                idx = in2_match.end()
            code = line[idx:]
            insertion = (0, Generic.Prompt, line[:idx])
            return mode, code, insertion

        # Check for an input or continuation prompt (stripped version).
        in1_match_rstrip = self.in1_regex_rstrip.match(line)
        if in1_match_rstrip or (in2_match_rstrip and self.mode != "tb"):
            # Same as above, but for prompts whose trailing space was removed.
            mode = "input"
            if in1_match_rstrip:
                idx = in1_match_rstrip.end()
            else:  # in2_match_rstrip
                idx = in2_match_rstrip.end()
            code = line[idx:]
            insertion = (0, Generic.Prompt, line[:idx])
            return mode, code, insertion

        # Check for the start of a traceback.
        if self.ipytb_start.match(line):
            mode = "tb"
            code = line
            insertion = None
            return mode, code, insertion

        # Everything else.
        if self.mode in ("input", "output"):
            # Assume all other text is output.  Multiline input that does not
            # use the continuation marker cannot be detected and is tagged as
            # output as well.
            mode = "output"
        else:
            mode = "tb"

        code = line
        insertion = None

        return mode, code, insertion

    def get_tokens_unprocessed(self, text):
        self.reset()
        for match in line_re.finditer(text):
            line = match.group()
            mode, code, insertion = self.get_mci(line)

            if mode != self.mode:
                # Yield buffered tokens before transitioning to the new mode.
                for token in self.buffered_tokens():
                    yield token
                self.mode = mode

            if insertion:
                self.insertions.append((len(self.buffer), [insertion]))
            self.buffer += code

        for token in self.buffered_tokens():
            yield token


class IPyLexer(Lexer):
    r"""
    Primary lexer for all IPython-like code.

    This is a simple helper lexer.  If the first line of the text begins with
    "In \[[0-9]+\]:", then the entire text is parsed with an IPython console
    lexer.  If not, then the entire text is parsed with an IPython lexer.

    The goal is to reduce the number of lexers that are registered
    with Pygments.

    """

    name = "IPy session"
    aliases = ["ipy", "ipy3"]

    def __init__(self, **options):
        """
        Create a new IPyLexer instance which dispatches to either an
        IPythonConsoleLexer (if ``In`` prompts are present) or an
        IPythonLexer (if they are not).
        """
        # An explicit __init__ docstring is needed for the docs to build.
        Lexer.__init__(self, **options)
        self.IPythonLexer = IPythonLexer(**options)
        self.IPythonConsoleLexer = IPythonConsoleLexer(**options)

    def get_tokens_unprocessed(self, text):
        # Search for the input prompt anywhere; this allows code blocks to
        # begin with comments as well.
        if re.match(r".*(In \[[0-9]+\]:)", text.strip(), re.DOTALL):
            lex = self.IPythonConsoleLexer
        else:
            lex = self.IPythonLexer
        for token in lex.get_tokens_unprocessed(text):
            yield token