"""Tokenization help for Python programs.

generate_tokens(readline) is a generator that breaks a stream of
text into Python tokens.  It accepts a readline-like method which is called
repeatedly to get the next line of input (or "" for EOF).  It generates
5-tuples with these members:

    the token type (see token.py)
    the token (a string)
    the starting (row, column) indices of the token (a 2-tuple of ints)
    the ending (row, column) indices of the token (a 2-tuple of ints)
    the original line (string)

It is designed to match the working of the Python tokenizer exactly, except
that it produces COMMENT tokens for comments and gives type OP for all
operators.

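For example, the COMMENT/OP behaviour described above matches the standard
library's tokenize.generate_tokens(), which yields the same five fields; a
minimal sketch using that stdlib analogue rather than this module:

```python
import io
import tokenize as std_tokenize  # stdlib analogue of this module's interface
from token import tok_name

source = "x = 1  # answer\n"

# generate_tokens() accepts any readline-like callable and yields
# 5-tuples: (type, string, (srow, scol), (erow, ecol), line).
names = [
    tok_name[tok_type]
    for tok_type, tok_string, start, end, line
    in std_tokenize.generate_tokens(io.StringIO(source).readline)
]
print(names)  # the '=' arrives as OP, the trailing comment as COMMENT
```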
Older entry points
    tokenize_loop(readline, tokeneater)
    tokenize(readline, tokeneater=printtoken)
are the same, except instead of generating tokens, tokeneater is a callback
function to which the 5 fields described above are passed as 5 arguments,
each time a new token is found."""

__author__ = 'Ka-Ping Yee <ping@lfw.org>'
__credits__ = \
    'GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, Skip Montanaro'

import string, re
from codecs import BOM_UTF8, lookup
from lib2to3.pgen2.token import *

from . import token
__all__ = [x for x in dir(token) if x[0] != '_'] + \
           ["tokenize", "generate_tokens", "untokenize"]
del token

try:
    bytes
except NameError:
    # Support bytes type in Python <= 2.5, so 2to3 turns itself into
    # valid Python 3 code.
    bytes = str

def group(*choices): return '(' + '|'.join(choices) + ')'
def any(*choices): return group(*choices) + '*'
def maybe(*choices): return group(*choices) + '?'
def _combinations(*l):
    return set(
        x + y for x in l for y in l + ("",) if x.casefold() != y.casefold()
    )

Whitespace = r'[ \f\t]*'
Comment = r'#[^\r\n]*'
Ignore = Whitespace + any(r'\\\r?\n' + Whitespace) + maybe(Comment)
Name = r'\w+'

Binnumber = r'0[bB]_?[01]+(?:_[01]+)*'
Hexnumber = r'0[xX]_?[\da-fA-F]+(?:_[\da-fA-F]+)*[lL]?'
Octnumber = r'0[oO]?_?[0-7]+(?:_[0-7]+)*[lL]?'
Decnumber = r'(?:0+(?:_0+)*|[1-9]\d*(?:_\d+)*)[lL]?'
Intnumber = group(Binnumber, Hexnumber, Octnumber, Decnumber)
Exponent = r'[eE][-+]?\d+(?:_\d+)*'
Pointfloat = group(r'\d+(?:_\d+)*\.(?:\d+(?:_\d+)*)?',
                   r'\.\d+(?:_\d+)*') + maybe(Exponent)
Expfloat = r'\d+(?:_\d+)*' + Exponent
Floatnumber = group(Pointfloat, Expfloat)
Imagnumber = group(r'\d+(?:_\d+)*[jJ]', Floatnumber + r'[jJ]')
Number = group(Imagnumber, Floatnumber, Intnumber)

# Tail end of ' string.
Single = r"[^'\\]*(?:\\.[^'\\]*)*'"
# Tail end of " string.
Double = r'[^"\\]*(?:\\.[^"\\]*)*"'
# Tail end of ''' string.
Single3 = r"[^'\\]*(?:(?:\\.|'(?!''))[^'\\]*)*'''"
# Tail end of """ string.
Double3 = r'[^"\\]*(?:(?:\\.|"(?!""))[^"\\]*)*"""'
_litprefix = r"(?:[uUrRbBfF]|[rR][bBfF]|[bBfF][rR])?"
Triple = group(_litprefix + "'''", _litprefix + '"""')
# Single-line ' or " string.
String = group(_litprefix + r"'[^\n'\\]*(?:\\.[^\n'\\]*)*'",
               _litprefix + r'"[^\n"\\]*(?:\\.[^\n"\\]*)*"')

# Because of leftmost-then-longest match semantics, be sure to put the
# longest operators first (e.g., if = came before ==, == would get
# recognized as two instances of =).
Operator = group(r"\*\*=?", r">>=?", r"<<=?", r"<>", r"!=",
                 r"//=?", r"->",
                 r"[+\-*/%&@|^=<>]=?",
                 r"~")

Bracket = '[][(){}]'
Special = group(r'\r?\n', r':=', r'[:;.,`@]')
Funny = group(Operator, Bracket, Special)

PlainToken = group(Number, Funny, String, Name)
Token = Ignore + PlainToken

# First (or only) line of ' or " string.
ContStr = group(_litprefix + r"'[^\n'\\]*(?:\\.[^\n'\\]*)*" +
                group("'", r'\\\r?\n'),
                _litprefix + r'"[^\n"\\]*(?:\\.[^\n"\\]*)*' +
                group('"', r'\\\r?\n'))
PseudoExtras = group(r'\\\r?\n', Comment, Triple)
PseudoToken = Whitespace + group(PseudoExtras, Number, Funny, ContStr, Name)

tokenprog, pseudoprog, single3prog, double3prog = map(
    re.compile, (Token, PseudoToken, Single3, Double3))

_strprefixes = (
    _combinations('r', 'R', 'f', 'F') |
    _combinations('r', 'R', 'b', 'B') |
    {'u', 'U', 'ur', 'uR', 'Ur', 'UR'}
)

endprogs = {"'": re.compile(Single), '"': re.compile(Double),
            "'''": single3prog, '"""': double3prog,
            **{f"{prefix}'''": single3prog for prefix in _strprefixes},
            **{f'{prefix}"""': double3prog for prefix in _strprefixes}}

triple_quoted = (
    {"'''", '"""'} |
    {f"{prefix}'''" for prefix in _strprefixes} |
    {f'{prefix}"""' for prefix in _strprefixes}
)
single_quoted = (
    {"'", '"'} |
    {f"{prefix}'" for prefix in _strprefixes} |
    {f'{prefix}"' for prefix in _strprefixes}
)

tabsize = 8

class TokenError(Exception): pass

class StopTokenizing(Exception): pass

def printtoken(type, token, start, end, line): # for testing
    (srow, scol) = start
    (erow, ecol) = end
    print("%d,%d-%d,%d:\t%s\t%s" %
          (srow, scol, erow, ecol, tok_name[type], repr(token)))

def tokenize(readline, tokeneater=printtoken):
    """
    The tokenize() function accepts two parameters: one representing the
    input stream, and one providing an output mechanism for tokenize().

    The first parameter, readline, must be a callable object which provides
    the same interface as the readline() method of built-in file objects.
    Each call to the function should return one line of input as a string.

    The second parameter, tokeneater, must also be a callable object. It is
    called once for each token, with five arguments, corresponding to the
    tuples generated by generate_tokens().
    """
    try:
        tokenize_loop(readline, tokeneater)
    except StopTokenizing:
        pass

# backwards compatible interface
def tokenize_loop(readline, tokeneater):
    for token_info in generate_tokens(readline):
        tokeneater(*token_info)

cookie_re = re.compile(r'^[ \t\f]*#.*?coding[:=][ \t]*([-\w.]+)', re.ASCII)
blank_re = re.compile(br'^[ \t\f]*(?:[#\r\n]|$)', re.ASCII)

if __name__ == '__main__':                     # testing
    import sys
    if len(sys.argv) > 1:
        tokenize(open(sys.argv[1]).readline)
    else:
        tokenize(sys.stdin.readline)