EGG-INFO/PKG-INFO

Metadata-Version: 1.0
Name: Pygments
Version: 0.10
Summary: Pygments is a syntax highlighting package written in Python.
Home-page: http://pygments.org/
Author: Georg Brandl
Author-email: g.brandl@gmx.net
License: BSD License
Description: Pygments
        ~~~~~~~~

        Pygments is a syntax highlighting package written in Python.

        It is a generic syntax highlighter for general use in all kinds of software
        such as forum systems, wikis or other applications that need to prettify
        source code. Highlights are:

        * a wide range of common languages and markup formats is supported
        * special attention is paid to details, increasing quality by a fair amount
        * support for new languages and formats is added easily
        * a number of output formats, presently HTML, LaTeX, RTF, SVG and ANSI sequences
        * it is usable as a command-line tool and as a library
        * ... and it highlights even Brainfuck!

        The `Pygments tip`_ is installable with ``easy_install Pygments==dev``.

        .. _Pygments tip:
           http://dev.pocoo.org/hg/pygments-main/archive/tip.tar.gz#egg=Pygments-dev

        :copyright: 2006-2007 by Georg Brandl, Armin Ronacher and others.
        :license: BSD, see LICENSE for more details.
Keywords: syntax highlighting
Platform: any
Classifier: License :: OSI Approved :: BSD License
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: End Users/Desktop
Classifier: Intended Audience :: System Administrators
Classifier: Development Status :: 5 - Production/Stable
Classifier: Programming Language :: Python
Classifier: Operating System :: OS Independent

EGG-INFO/dependency_links.txt

(empty file)

EGG-INFO/SOURCES.txt

AUTHORS CHANGES LICENSE MANIFEST.in Makefile TODO ez_setup.py setup.cfg setup.py
Pygments.egg-info/PKG-INFO Pygments.egg-info/SOURCES.txt Pygments.egg-info/dependency_links.txt
Pygments.egg-info/entry_points.txt Pygments.egg-info/not-zip-safe Pygments.egg-info/top_level.txt
docs/generate.py docs/pygmentize.1
docs/build/api.html docs/build/authors.html docs/build/changelog.html docs/build/cmdline.html
docs/build/filterdevelopment.html docs/build/filters.html docs/build/formatterdevelopment.html
docs/build/formatters.html docs/build/index.html docs/build/installation.html
docs/build/integrate.html docs/build/lexerdevelopment.html docs/build/lexers.html
docs/build/moinmoin.html docs/build/plugins.html docs/build/quickstart.html
docs/build/rstdirective.html docs/build/styles.html docs/build/tokens.html docs/build/unicode.html
docs/src/api.txt docs/src/authors.txt docs/src/changelog.txt docs/src/cmdline.txt
docs/src/filterdevelopment.txt docs/src/filters.txt docs/src/formatterdevelopment.txt
docs/src/formatters.txt docs/src/index.txt docs/src/installation.txt docs/src/integrate.txt
docs/src/lexerdevelopment.txt docs/src/lexers.txt docs/src/moinmoin.txt docs/src/plugins.txt
docs/src/quickstart.txt docs/src/rstdirective.txt docs/src/styles.txt docs/src/tokens.txt
docs/src/unicode.txt
external/markdown-processor.py external/moin-parser.py external/rst-directive.py
pygments/__init__.py pygments/cmdline.py pygments/console.py pygments/filter.py
pygments/formatter.py pygments/lexer.py pygments/plugin.py pygments/scanner.py
pygments/style.py pygments/token.py pygments/unistring.py pygments/util.py
pygments/filters/__init__.py
pygments/formatters/__init__.py pygments/formatters/_mapping.py pygments/formatters/bbcode.py
pygments/formatters/html.py pygments/formatters/img.py pygments/formatters/latex.py
pygments/formatters/other.py pygments/formatters/rtf.py pygments/formatters/svg.py
pygments/formatters/terminal.py pygments/formatters/terminal256.py
pygments/lexers/__init__.py pygments/lexers/_clbuiltins.py pygments/lexers/_luabuiltins.py
pygments/lexers/_mapping.py pygments/lexers/_phpbuiltins.py pygments/lexers/_vimbuiltins.py
pygments/lexers/agile.py pygments/lexers/asm.py pygments/lexers/compiled.py
pygments/lexers/dotnet.py pygments/lexers/functional.py pygments/lexers/math.py
pygments/lexers/other.py pygments/lexers/special.py pygments/lexers/templates.py
pygments/lexers/text.py pygments/lexers/web.py
pygments/styles/__init__.py pygments/styles/autumn.py pygments/styles/borland.py
pygments/styles/bw.py pygments/styles/colorful.py pygments/styles/default.py
pygments/styles/emacs.py pygments/styles/friendly.py pygments/styles/fruity.py
pygments/styles/manni.py pygments/styles/murphy.py pygments/styles/native.py
pygments/styles/pastie.py pygments/styles/perldoc.py pygments/styles/trac.py
pygments/styles/vim.py
scripts/check_sources.py scripts/count_loc.py scripts/epydoc.css scripts/find_codetags.py
scripts/find_error.py scripts/fix_epydoc_markup.py scripts/get_vimkw.py scripts/pylintrc
scripts/reindent.py scripts/vim2pygments.py
tests/run.py tests/test_basic_api.py tests/test_clexer.py tests/test_cmdline.py
tests/test_examplefiles.py tests/test_html_formatter.py tests/test_latex_formatter.py
tests/test_regexlexer.py tests/test_token.py tests/test_using_api.py tests/test_util.py
tests/dtds/HTML4-f.dtd tests/dtds/HTML4-s.dtd tests/dtds/HTML4.dcl tests/dtds/HTML4.dtd
tests/dtds/HTML4.soc tests/dtds/HTMLlat1.ent tests/dtds/HTMLspec.ent tests/dtds/HTMLsym.ent
tests/examplefiles/AlternatingGroup.mu tests/examplefiles/DancingSudoku.lhs
tests/examplefiles/Intro.java tests/examplefiles/Makefile tests/examplefiles/SmallCheck.hs
tests/examplefiles/Squeak.st tests/examplefiles/Sudoku.lhs tests/examplefiles/apache2.conf
tests/examplefiles/badcase.java tests/examplefiles/batchfile.bat tests/examplefiles/boot-9.scm
tests/examplefiles/ceval.c tests/examplefiles/classes.dylan tests/examplefiles/condensed_ruby.rb
tests/examplefiles/database.pytb tests/examplefiles/de.MoinMoin.po
tests/examplefiles/django_sample.html+django tests/examplefiles/dwarf.cw
tests/examplefiles/example.c tests/examplefiles/example.cpp tests/examplefiles/example.lua
tests/examplefiles/example.moo tests/examplefiles/example.pas tests/examplefiles/example.rb
tests/examplefiles/example.rhtml tests/examplefiles/example.weechatlog
tests/examplefiles/example.xhtml tests/examplefiles/example.xml tests/examplefiles/firefox.mak
tests/examplefiles/format.ml tests/examplefiles/fucked_up.rb tests/examplefiles/functional.rst
tests/examplefiles/genshi_example.xml+genshi tests/examplefiles/genshitext_example.genshitext
tests/examplefiles/html+php_faulty.php tests/examplefiles/jinjadesignerdoc.rst
tests/examplefiles/ltmain.sh tests/examplefiles/matlabsession_sample.txt
tests/examplefiles/moin_SyntaxReference.txt tests/examplefiles/multiline_regexes.rb
tests/examplefiles/numbers.c tests/examplefiles/perl5db.pl tests/examplefiles/perlfunc.1
tests/examplefiles/phpcomplete.vim tests/examplefiles/pleac.in.rb tests/examplefiles/py3_test.txt
tests/examplefiles/python25-bsd.mak tests/examplefiles/ruby_func_def.rb
tests/examplefiles/sample.m tests/examplefiles/simple.md tests/examplefiles/smarty_example.html
tests/examplefiles/source.lgt tests/examplefiles/sources.list tests/examplefiles/squid.conf
tests/examplefiles/string_delimiters.d tests/examplefiles/test.R tests/examplefiles/test.bas
tests/examplefiles/test.boo tests/examplefiles/test.cs tests/examplefiles/test.css
tests/examplefiles/test.d tests/examplefiles/test.erl tests/examplefiles/test.html
tests/examplefiles/test.java tests/examplefiles/test.jsp tests/examplefiles/test.moo
tests/examplefiles/test.myt tests/examplefiles/test.pas tests/examplefiles/test.php
tests/examplefiles/test.rb tests/examplefiles/test.rhtml tests/examplefiles/test.tcsh
tests/examplefiles/test.xsl tests/examplefiles/type.lisp tests/examplefiles/zmlrpc.f90

EGG-INFO/top_level.txt

pygments

EGG-INFO/not-zip-safe

(empty file)

EGG-INFO/entry_points.txt

[console_scripts]
pygmentize = pygments.cmdline:main
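The metadata above advertises Pygments as both a command-line tool (the ``pygmentize``
script wired up in ``entry_points.txt``) and a library. Below is a minimal sketch of the
library side, assuming the usual ``highlight()`` plus lexer plus formatter call pattern
suggested by the package layout in SOURCES.txt; the sample input string is invented for
illustration.

.. sourcecode:: python

    # Library-usage sketch: highlight() drives a lexer and a formatter.
    # The names (highlight, PythonLexer, HtmlFormatter) are taken from the
    # package layout above; the sample input is made up.
    from pygments import highlight
    from pygments.lexers import PythonLexer
    from pygments.formatters import HtmlFormatter

    code = 'def foo(bar):\n    pass\n'

    # Tokenize `code` as Python and render it as an HTML fragment
    # (by default wrapped in <div class="highlight">).
    html = highlight(code, PythonLexer(), HtmlFormatter())
    open('snippet.html', 'w').write(html)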
pygments/lexer.pyc  (compiled bytecode; only the strings recoverable from the dump are
summarized here)

Module docstring: pygments.lexer, "Base lexer classes."
:copyright: 2006-2007 by Georg Brandl.  :license: BSD, see LICENSE for more details.

Exported names visible in the bytecode: Lexer, RegexLexer, ExtendedRegexLexer,
DelegatingLexer, LexerContext, include, flags, bygroups, using, this, do_insertions,
plus the internal helpers LexerMeta, RegexLexerMeta, combined, _PseudoMatch and _This.

Recoverable docstrings:

* LexerMeta: automagically converts ``analyse_text`` methods into static methods which
  always return float values.
* Lexer.get_tokens: returns an iterable of (tokentype, value) pairs generated from
  ``text``; with ``unfiltered=True`` the filtering mechanism is bypassed even if filters
  are defined. It also preprocesses the text (tab expansion, stripping) and applies
  registered filters; the ``encoding`` option accepts ``"guess"`` and ``"chardet"``, the
  latter requiring the chardet library from http://chardet.feedparser.org/.
* Lexer.get_tokens_unprocessed: returns an iterable of (tokentype, value) pairs;
  subclasses should implement it as a generator to maximize effectiveness.
* Lexer.analyse_text: has to return a float between ``0`` and ``1`` that indicates if a
  lexer wants to highlight this text; used by ``guess_lexer``.
* DelegatingLexer: takes a root lexer and a language lexer; everything is first scanned
  with the language lexer, then all ``Other`` tokens are lexed with the root lexer. The
  lexers in the ``template`` lexer package use this base class.
* include: indicates that a state should include rules from another state.
* combined: indicates a state combined from multiple states.
* _PseudoMatch: a pseudo match object constructed from a string.
* bygroups: callback that yields multiple actions for each group in the match.
* this / _This: special singleton indicating the caller class; used by ``using``.
* RegexLexerMeta: metaclass for RegexLexer; creates the ``_tokens`` attribute from
  ``tokens`` on the first instantiation.
* RegexLexer: base for simple stateful regular-expression-based lexers; you need only
  provide a list of states and regular expressions. Its ``get_tokens_unprocessed`` splits
  ``text`` into (tokentype, text) pairs; ``stack`` is the initial state stack (default
  ``['root']``).
* LexerContext: a helper object that holds lexer position data.
* ExtendedRegexLexer: a RegexLexer that uses a context object to store its state.
* do_insertions: helper for lexers which must combine the results of several sublexers;
  ``insertions`` is a list of ``(index, itokens)`` pairs, and each ``itokens`` iterable is
  inserted at position ``index`` into the token stream given by the ``tokens`` argument,
  yielding a combined token stream.
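The recovered docstrings describe the core pattern: a ``RegexLexer`` subclass supplies a
``tokens`` table of states, each state being a list of rules pairing a regular expression
with a token type, and helpers such as ``bygroups`` assigning one token type per regex
group. Here is a minimal sketch built on that API; the mini-language, class name and
rules are invented, and only ``RegexLexer``, ``bygroups`` and the token types come from
the modules listed above.

.. sourcecode:: python

    # A minimal RegexLexer sketch for a hypothetical INI-like format.
    from pygments.lexer import RegexLexer, bygroups
    from pygments.token import Comment, Keyword, Name, Operator, Text

    class IniLikeLexer(RegexLexer):
        """Toy lexer for an INI-style format, for illustration only."""
        name = 'IniLike'
        aliases = ['inilike']
        filenames = ['*.inilike']

        # Each state maps to a list of (regex, token type) rules;
        # bygroups() assigns one token type per regex group.
        tokens = {
            'root': [
                (r'\s+', Text),
                (r';[^\n]*', Comment),
                (r'\[[^\]\n]*\]', Keyword),
                (r'(\w+)(\s*)(=)', bygroups(Name.Attribute, Text, Operator)),
                (r'[^\n]+', Text),
            ],
        }

An instance of such a class can be passed straight to ``highlight()`` as in the earlier
sketch.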
pygments/cmdline.pyc  (compiled bytecode; only the strings recoverable from the dump are
summarized here)

Module docstring: pygments.cmdline, "Command line interface."
:copyright: 2006-2007 by Georg Brandl.  :license: BSD, see LICENSE for more details.

The embedded usage text survives only in part; the option arguments were stripped along
with their original angle-bracket placeholders:

    Usage: %s [-l ] [-F [:]] [-f ] [-O ] [-P ] [-o ] [] %s -S ...
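``entry_points.txt`` wires the ``pygmentize`` console script to ``pygments.cmdline:main``.
The shim that setuptools generates for that entry point amounts to roughly the following;
the ``main(sys.argv)`` calling convention and the integer return value are assumptions,
not something this bytecode dump confirms.

.. sourcecode:: python

    #!/usr/bin/env python
    # Rough stand-in for the generated `pygmentize` console script.
    import sys

    from pygments.cmdline import main

    if __name__ == '__main__':
        # Assumed convention: main() parses argv-style options (-l, -F, -f,
        # -O, -P, -o, as in the usage string above) and returns an exit code.
        sys.exit(main(sys.argv))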

pygments/formatters/html.py  (member name inferred from its contents; the archive data
between the command-line module and this point, the member header, and all HTML markup
inside the template strings and the docstring were stripped. What follows is the
recoverable text, with the stripped tag names restored from context.)

Template string constants: the tail of a full-document header template, plus
DOC_HEADER_EXTERNALCSS and DOC_FOOTER. Only their ``%(title)s`` placeholders survive;
judging by the names, the two header variants emit a complete HTML document with either
inline or externally linked CSS, and DOC_FOOTER closes it.

class HtmlFormatter(Formatter), docstring:

    Format tokens as HTML 4 ``<span>`` tags within a ``<pre>`` tag, wrapped
    in a ``<div>`` tag. The ``<div>``'s CSS class can be set by the `cssclass`
    option.

    If the `linenos` option is set to ``"table"``, the ``<pre>`` is
    additionally wrapped inside a ``<table>`` which has one row and two
    cells: one containing the line numbers and one containing the code.
    Example:

    .. sourcecode:: html

        [the sample output markup was stripped from the dump; it showed line
        numbers ``1`` and ``2`` in the first table cell and the highlighted
        code ``def foo(bar): pass`` in the second]

    (whitespace added to improve clarity).

    Wrapping can be disabled using the `nowrap` option.

    With the `full` option, a complete HTML 4 document is output, including
    the style definitions inside a ``<style>`` tag. [The rest of the docstring
    and the body of the class are not recoverable from this dump.]
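A short sketch of driving the formatter that the docstring describes: ``linenos="table"``
and ``cssclass`` are options named above, while ``get_style_defs()`` and the exact call
pattern are assumptions about this release's API.

.. sourcecode:: python

    # Sketch: table-style line numbers plus a matching external stylesheet.
    from pygments import highlight
    from pygments.lexers import PythonLexer
    from pygments.formatters import HtmlFormatter

    code = 'def foo(bar):\n    pass\n'

    # Wrap the <pre> in a line-number table inside <div class="source">.
    formatter = HtmlFormatter(linenos='table', cssclass='source')
    open('example.html', 'w').write(highlight(code, PythonLexer(), formatter))

    # get_style_defs() (assumed available in this release) emits the CSS rules
    # for the chosen style, scoped to the wrapper class.
    open('example.css', 'w').write(formatter.get_style_defs('.source'))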
