EGG-INFO/PKG-INFO:

    Metadata-Version: 1.0
    Name: Pygments
    Version: 0.8.1
    Summary: Pygments is a syntax highlighting package written in Python.
    Home-page: http://pygments.org/
    Author: Georg Brandl
    Author-email: g.brandl@gmx.net
    License: BSD License
    Description: Pygments
        ~~~~~~~~

        Pygments is a syntax highlighting package written in Python. It is a
        generic syntax highlighter for general use in all kinds of software
        such as forum systems, wikis or other applications that need to
        prettify source code. Highlights are:

        * a wide range of common languages and markup formats is supported
        * special attention is paid to details, increasing quality by a fair
          amount
        * support for new languages and formats is added easily
        * a number of output formats, presently HTML, LaTeX, RTF and ANSI
          sequences
        * it is usable as a command-line tool and as a library
        * ... and it highlights even Brainfuck!

        The `Pygments trunk`__ is installable via *easy_install* with
        ``easy_install Pygments==dev``.

        :copyright: 2006-2007 by Georg Brandl, Armin Ronacher and others.
        :license: BSD, see LICENSE for more details.
    Keywords: syntax highlighting
    Platform: any
    Classifier: License :: OSI Approved :: BSD License
    Classifier: Intended Audience :: Developers
    Classifier: Intended Audience :: End Users/Desktop
    Classifier: Intended Audience :: System Administrators
    Classifier: Development Status :: 5 - Production/Stable
    Classifier: Programming Language :: Python
    Classifier: Operating System :: OS Independent
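The description above notes that Pygments is usable both as a command-line
tool and as a library. As a quick orientation, here is a minimal library-use
sketch (not part of the archive); it relies only on the ``highlight``,
``get_lexer_by_name`` and ``get_formatter_by_name`` names imported by the
compiled command-line module further below, written in the Python 2 style
this egg targets::

    # Highlight a small Python snippet as HTML using the public API.
    from pygments import highlight
    from pygments.lexers import get_lexer_by_name
    from pygments.formatters import get_formatter_by_name

    code = 'def foo(bar):\n    pass\n'
    lexer = get_lexer_by_name('python')
    formatter = get_formatter_by_name('html')

    # highlight() runs the lexer over the code and feeds the resulting
    # token stream to the formatter, returning the formatted string.
    print highlight(code, lexer, formatter)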
EGG-INFO/dependency_links.txt: (empty)

EGG-INFO/SOURCES.txt:

    AUTHORS CHANGES LICENSE MANIFEST.in Makefile TODO ez_setup.py pygmentize
    setup.cfg setup.py
    Pygments.egg-info/PKG-INFO Pygments.egg-info/SOURCES.txt
    Pygments.egg-info/dependency_links.txt Pygments.egg-info/not-zip-safe
    Pygments.egg-info/top_level.txt
    docs/generate.py docs/pygmentize.1
    docs/build/api.html docs/build/authors.html docs/build/changelog.html
    docs/build/cmdline.html docs/build/filterdevelopment.html
    docs/build/filters.html docs/build/formatterdevelopment.html
    docs/build/formatters.html docs/build/index.html
    docs/build/installation.html docs/build/integrate.html
    docs/build/lexerdevelopment.html docs/build/lexers.html
    docs/build/moinmoin.html docs/build/plugins.html
    docs/build/quickstart.html docs/build/rstdirective.html
    docs/build/styles.html docs/build/tokens.html docs/build/unicode.html
    docs/src/api.txt docs/src/authors.txt docs/src/changelog.txt
    docs/src/cmdline.txt docs/src/filterdevelopment.txt docs/src/filters.txt
    docs/src/formatterdevelopment.txt docs/src/formatters.txt
    docs/src/index.txt docs/src/installation.txt docs/src/integrate.txt
    docs/src/lexerdevelopment.txt docs/src/lexers.txt docs/src/moinmoin.txt
    docs/src/plugins.txt docs/src/quickstart.txt docs/src/rstdirective.txt
    docs/src/styles.txt docs/src/tokens.txt docs/src/unicode.txt
    external/moin-parser.py
    pygments/__init__.py pygments/cmdline.py pygments/console.py
    pygments/filter.py pygments/formatter.py pygments/lexer.py
    pygments/plugin.py pygments/scanner.py pygments/style.py
    pygments/token.py pygments/unistring.py pygments/util.py
    pygments/filters/__init__.py
    pygments/formatters/__init__.py pygments/formatters/_mapping.py
    pygments/formatters/bbcode.py pygments/formatters/html.py
    pygments/formatters/latex.py pygments/formatters/other.py
    pygments/formatters/rtf.py pygments/formatters/terminal.py
    pygments/lexers/__init__.py pygments/lexers/_luabuiltins.py
    pygments/lexers/_mapping.py pygments/lexers/_phpbuiltins.py
    pygments/lexers/_vimbuiltins.py pygments/lexers/agile.py
    pygments/lexers/compiled.py pygments/lexers/dotnet.py
    pygments/lexers/functional.py pygments/lexers/math.py
    pygments/lexers/other.py pygments/lexers/special.py
    pygments/lexers/templates.py pygments/lexers/text.py
    pygments/lexers/web.py
    pygments/styles/__init__.py pygments/styles/autumn.py
    pygments/styles/borland.py pygments/styles/colorful.py
    pygments/styles/default.py pygments/styles/emacs.py
    pygments/styles/friendly.py pygments/styles/fruity.py
    pygments/styles/manni.py pygments/styles/murphy.py
    pygments/styles/native.py pygments/styles/pastie.py
    pygments/styles/perldoc.py pygments/styles/trac.py
    scripts/check_sources.py scripts/count_loc.py scripts/epydoc.css
    scripts/find_codetags.py scripts/find_error.py
    scripts/fix_epydoc_markup.py scripts/get_vimkw.py scripts/pylintrc
    scripts/reindent.py scripts/vim2pygments.py
    tests/run.py tests/test_basic_api.py tests/test_clexer.py
    tests/test_cmdline.py tests/test_examplefiles.py
    tests/test_html_formatter.py tests/test_latex_formatter.py
    tests/test_token.py tests/test_using_api.py tests/test_util.py
    tests/dtds/HTML4-f.dtd tests/dtds/HTML4-s.dtd tests/dtds/HTML4.dcl
    tests/dtds/HTML4.dtd tests/dtds/HTML4.soc tests/dtds/HTMLlat1.ent
    tests/dtds/HTMLspec.ent tests/dtds/HTMLsym.ent
    tests/examplefiles/AlternatingGroup.mu tests/examplefiles/Intro.java
    tests/examplefiles/Makefile tests/examplefiles/SmallCheck.hs
    tests/examplefiles/apache2.conf tests/examplefiles/batchfile.bat
    tests/examplefiles/boot-9.scm tests/examplefiles/ceval.c
    tests/examplefiles/classes.dylan tests/examplefiles/condensed_ruby.rb
    tests/examplefiles/database.pytb
    tests/examplefiles/django_sample.html+django tests/examplefiles/dwarf.cw
    tests/examplefiles/example.c tests/examplefiles/example.cpp
    tests/examplefiles/example.pas tests/examplefiles/example.rb
    tests/examplefiles/example.rhtml tests/examplefiles/example.xhtml
    tests/examplefiles/example.xml tests/examplefiles/format.ml
    tests/examplefiles/fucked_up.rb tests/examplefiles/functional.rst
    tests/examplefiles/genshi_example.xml+genshi
    tests/examplefiles/genshitext_example.genshitext
    tests/examplefiles/html+php_faulty.php
    tests/examplefiles/jinjadesignerdoc.rst tests/examplefiles/ltmain.sh
    tests/examplefiles/moin_SyntaxReference.txt
    tests/examplefiles/multiline_regexes.rb tests/examplefiles/numbers.c
    tests/examplefiles/perl5db.pl tests/examplefiles/perlfunc.1
    tests/examplefiles/phpcomplete.vim tests/examplefiles/pleac.in.rb
    tests/examplefiles/simple.md tests/examplefiles/smarty_example.html
    tests/examplefiles/sources.list tests/examplefiles/test.bas
    tests/examplefiles/test.boo tests/examplefiles/test.cs
    tests/examplefiles/test.css tests/examplefiles/test.d
    tests/examplefiles/test.html tests/examplefiles/test.java
    tests/examplefiles/test.jsp tests/examplefiles/test.myt
    tests/examplefiles/test.pas tests/examplefiles/test.php
    tests/examplefiles/test.rb tests/examplefiles/test.rhtml

EGG-INFO/top_level.txt:

    pygments

EGG-INFO/not-zip-safe: (marker file)

EGG-INFO/scripts/pygmentize:

    #!/usr/bin/python2.3
    import sys, pygments.cmdline
    sys.exit(pygments.cmdline.main(sys.argv))
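The pygmentize script above simply hands ``sys.argv`` to
``pygments.cmdline.main()``, so the same front end can be driven
programmatically. A hedged sketch; the input file name ``example.py`` is only
a hypothetical placeholder, and the option letters are those listed in the
usage text of pygments/cmdline.pyc below::

    # Run the command-line front end from Python instead of the shell.
    # 'example.py' is a hypothetical input file used purely for illustration.
    import pygments.cmdline

    # main() parses the argument list (argv[0] is the program name, as in
    # the script above) and returns an exit status.
    argv = ['pygmentize', '-l', 'python', '-f', 'html',
            '-o', 'out.html', 'example.py']
    status = pygments.cmdline.main(argv)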
pygments/lexer.pyc: [compiled bytecode omitted]

    Byte-compiled form of pygments/lexer.py ("Base lexer classes",
    :copyright: 2006-2007 by Georg Brandl, :license: BSD). The embedded
    strings name the module's exports (Lexer, RegexLexer, ExtendedRegexLexer,
    DelegatingLexer, LexerContext, include, flags, bygroups, using, this) and
    a do_insertions helper for lexers that must combine the results of
    several sublexers. Lexer.get_tokens() returns an iterable of
    (tokentype, value) pairs, decoding the input (optionally guessing the
    encoding with the external chardet library), expanding tabs, stripping
    whitespace and applying any registered filters; subclasses implement
    get_tokens_unprocessed().

pygments/cmdline.pyc: [compiled bytecode omitted]

    Byte-compiled form of pygments/cmdline.py ("Command line interface").
    Its imports include highlight, get_lexer_by_name, get_formatter_by_name,
    get_style_by_name and the lexer/formatter/filter/style listing helpers,
    and it embeds the pygmentize usage text:

    Usage: %s [-l <lexer>] [-F <filter>[:<options>]] [-f <formatter>]
              [-O <options>] [-o <outfile>] [<infile>]

           %s -S <style> ...
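The ``get_tokens`` docstring embedded in the bytecode describes an iterable
of ``(tokentype, value)`` pairs. A small sketch of driving a lexer directly,
relying only on names visible above (``get_lexer_by_name``, the ``stripnl``
option and ``get_tokens``)::

    from pygments.lexers import get_lexer_by_name

    # stripnl is one of the preprocessing options handled in Lexer.get_tokens().
    lexer = get_lexer_by_name('python', stripnl=False)

    # Each item is a (tokentype, value) pair; the values concatenate back
    # to the (preprocessed) input text.
    for tokentype, value in lexer.get_tokens('def foo(bar):\n    pass\n'):
        print tokentype, repr(value)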

[from pygments/formatters/html.py: the module header and the HTML markup in
the DOC_HEADER, DOC_HEADER_EXTERNALCSS and DOC_FOOTER template strings were
stripped during extraction; of each header template only its two %(title)s
placeholders survive]

CSSFILE_TEMPLATE = '''\
td.linenos { background-color: #f0f0f0; padding-right: 10px; }
%(styledefs)s
'''

class HtmlFormatter(Formatter):
    r"""
    Format tokens as HTML 4 ``<span>`` tags within a ``<pre>`` tag, wrapped
    in a ``<div>`` tag. The ``<div>``'s CSS class can be set by the
    `cssclass` option.

    If the `linenos` option is set to ``"table"``, the ``<pre>`` is
    additionally wrapped inside a ``<table>`` which has one row and two
    cells: one containing the line numbers and one containing the code.
    Example:

    .. sourcecode:: html

        [example markup stripped during extraction; the table's first cell
        showed the line numbers "1" and "2", the second the highlighted
        code "def foo(bar): pass"]

    (whitespace added to improve clarity).

    Wrapping can be disabled using the `nowrap` option.

    With the `full` option, a complete HTML 4 document is output, including
    the style definitions inside a ``<style>``
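The options this docstring names (`cssclass`, `linenos`, `full`, `nowrap`)
are ordinary constructor keywords. A minimal usage sketch; ``get_style_defs``
is assumed here (it is not visible in this dump) as the method that produces
the CSS rules standing behind ``%(styledefs)s`` in CSSFILE_TEMPLATE above::

    from pygments import highlight
    from pygments.lexers import get_lexer_by_name
    from pygments.formatters import HtmlFormatter

    code = 'def foo(bar):\n    pass\n'

    # linenos='table' produces the two-cell table layout described above;
    # cssclass sets the CSS class of the wrapping <div>.
    formatter = HtmlFormatter(linenos='table', cssclass='highlight')
    html = highlight(code, get_lexer_by_name('python'), formatter)

    # CSS rules for the generated classes, scoped to the wrapper class
    # (get_style_defs is an assumption, not shown in the dump above).
    css = formatter.get_style_defs('.highlight')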

pygments/formatters/html.pyc: [compiled bytecode omitted; it embeds the same
templates and HtmlFormatter docstring reproduced above, truncated at the same
point]