{ "info": { "author": "Enrique P\u00e9rez Arnaud", "author_email": "enriquepablo@gmail.com", "bugtrack_url": null, "classifiers": [ "Development Status :: 4 - Beta", "Environment :: Console", "Intended Audience :: Developers", "License :: OSI Approved :: GNU General Public License v3 (GPLv3)", "Operating System :: POSIX :: Linux", "Programming Language :: Python :: 3", "Topic :: Scientific/Engineering :: Artificial Intelligence", "Topic :: Software Development :: Interpreters" ], "description": "The Terms knowledge store\r\n=========================\r\n\r\nTerms is a knowledge store.\r\nIt provides a declarative language to express and query that knowledge.\r\nThe main claim to usefulness that Terms has\r\nrelies on the Terms language:\r\nIt is purported to be very powerful and concise,\r\nand at the same time very readable,\r\nvery close to natural language.\r\n\r\nTerms is licensed under the GPLv3, and is hosted at\r\n`github `_.\r\n\r\nThe Terms language\r\n++++++++++++++++++\r\n\r\nHere I will describe the Terms language. \r\nIt is a declarative logic language. \r\nWith it you can:\r\n\r\n* define new words (nouns, verbs, and names);\r\n\r\n* build facts out of your defined words;\r\n\r\n* build rules that combine given facts to produce new facts;\r\n\r\n* perform complex queries.\r\n\r\nThe Terms language is similar to other logic languages,\r\nsuch as Prolog or CLIPS\r\n(it is nearer to CLIPS in that it is forward chaining,\r\nbased on a RETE network).\r\nBut in a certain sense it is more expressive,\r\nbecause all defined items (or words)\r\nhave the same category.\r\nIn Terms, you build sentences, or facts,\r\nwith a verb (i.e. 
a word) and any number of objects,\r\nand these objects can be any kind of word:\r\nnames, verbs, or nouns, or even other facts.\r\nIn contrast, to build facts in Prolog,\r\nyou use as verbs a special kind of item, a predicate,\r\nthat cannot be treated as an argument term\r\n(the Prolog equivalent of an object in Terms).\r\nIn Terms, a rule can have a logical variable\r\nthat ranges over any fact or term, including verbs,\r\nsomething that is not possible in (idiomatic) Prolog.\r\n\r\nI would say that that difference gives Terms\r\nenough of an edge to be generally useful.\r\n\r\nIn any case, Terms is based on a first order theory,\r\ninterpreted in a finite universe,\r\nso it might be implemented in Prolog;\r\nthat's why I specified \"idiomatic\".\r\n\r\nTo try the examples given below, if you have installed Terms,\r\nyou have to type \"terms\" in a terminal,\r\nand you will get a REPL where you can enter Terms constructs.\r\nTo install Terms, follow the instructions in INSTALL.rst.\r\n\r\nMore examples `can be found here `_\r\nand in the\r\n`github repository `_.\r\n\r\nWords\r\n-----\r\n\r\nThe main building blocks of Terms constructs are words.\r\n\r\nTo start with, there are a few predefined words:\r\n``word``, ``verb``, ``noun``, ``number``, ``thing``, and ``exist``.\r\n\r\nNew words are defined by relating them to existing words.\r\n\r\nThere are 2 relations that can be established between pairs of words.\r\n\r\nAs we shall see below,\r\nthese relations are formally similar to the set theory relations\r\n\"is an element of\" and \"is a subset of\".\r\n\r\nIn English, we express the first relation as \"is of type\",\r\nand in Terms it is expressed as::\r\n\r\n word1 is a word2.\r\n\r\nSo we would say that ``word1`` is of type ``word2``,\r\ndefining ``word1`` in terms of ``word2``\r\n(so ``word2`` must have been defined before, or be predefined).\r\nThe second relation is expressed in English as \"is subtype of\",\r\nand in Terms::\r\n\r\n a word1 is a 
word2.\r\n\r\nSo, we would say that ``word1`` is a subtype of ``word2``,\r\nalso defining ``word1`` in terms of ``word2``.\r\nAmong the predefined words, these relations are given::\r\n\r\n word is a word.\r\n verb is a word.\r\n a verb is a word.\r\n noun is a word.\r\n a noun is a word.\r\n thing is a noun.\r\n a thing is a word.\r\n exist is a verb.\r\n a exist is a word.\r\n number is a word.\r\n a number is a word.\r\n\r\nTo define a new word, you put it in relation to an existing word. For example::\r\n\r\n a person is a thing.\r\n a man is a person.\r\n a woman is a person.\r\n john is a man.\r\n sue is a woman.\r\n\r\nThese relations have consequences, given by 2 implicit rules::\r\n\r\n A is a B; a B is a C -> A is a C.\r\n a A is a B; a B is a C -> a A is a C.\r\n\r\nTherefore, from all the above, we have, for example, that::\r\n\r\n thing is a word.\r\n person is a word.\r\n person is a noun.\r\n john is a word.\r\n a man is a thing.\r\n john is a thing.\r\n sue is a person.\r\n ...\r\n\r\nWith words, we can build facts.\r\nA fact consists of a verb and any number of (labelled) objects.\r\n\r\nVerbs are special words, in that they determine\r\nthe objects of the facts built with them.\r\nThese objects are words, and are labeled.\r\nTo define a new verb,\r\nyou provide first an ancestor verb\r\n(or a series of ancestor verbs separated by colons),\r\nand then the types of words that can be objects for the verb in a fact,\r\nassociated with their labels.\r\nFor example::\r\n\r\n to love is to exist, subj a person, who a person.\r\n\r\nThat can be read as:\r\n``love`` is defined as a subtype of ``exist``,\r\nand when used in facts it can take a subject of type ``person``\r\nand an object labelled ``who`` also of type ``person``.\r\n\r\nThe primitive verb is ``exist``,\r\nwhich just defines a ``subj`` object of type ``word``.\r\nThere are more predefined verbs,\r\nthe use of which we shall see when we explain the treatment of time in 
Terms.\r\n\r\nFacts\r\n-----\r\n\r\nFacts are built with a verb and a number of objects.\r\nThey are given in parentheses. For example, we might have a fact such as::\r\n\r\n (love john, who sue).\r\n\r\nThe ``subj`` object is special: all verbs have it,\r\nand in facts it is not labelled with ``subj``,\r\nit just takes the place of the subject right after the verb.\r\n\r\nVerbs inherit the object types of their ancestors. The primitive ``exist`` verb\r\nonly takes one object, ``subj``, of type ``word``, inherited by all the rest of the verbs.\r\nSo, if we define a verb::\r\n\r\n to adore is to love.\r\n\r\nIt will have a ``who`` object of type ``person``. If ``adore`` had provided\r\na new object, it would have been added to the inherited ones.\r\nA new verb can override an inherited object type to provide a subtype of the original\r\nobject type\r\n(as we have done above with ``subj``, which is predefined to be of type ``word``).\r\n\r\nFacts are words,\r\n\"first class citizens\",\r\nand can be used wherever a word can be used.\r\nFacts are words of type ``exist``, and also of the type of the verb\r\nused to build the fact.\r\nSo the fact above is actually syntactic sugar for\r\n``(love john, who sue) is a love.``\r\n\r\nThe objects in a fact can be of any type (a ``word``, a ``verb``, a ``noun``, a ``thing``,\r\na ``number``). In addition, they can also be facts (type ``exist``).\r\nSo, if we define a verb like::\r\n\r\n to want is to exist, subj a person, what a exist.\r\n\r\nWe can then build facts like::\r\n\r\n (want john, what (love sue, who john)).\r\n\r\nAnd indeed::\r\n\r\n (want john, what (want sue, what (love sue, who john))).\r\n\r\nRules\r\n-----\r\n\r\nWe can build rules, which produce new facts out of existing (or newly added) ones.\r\nA rule has 2 sets of facts, the conditions (given first) and the consequences. 
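\r\nSchematically, a rule has this general shape (a sketch only; ``condition1``, ``condition2`` and ``consequence1`` are placeholders standing for any facts, not defined words)::\r\n\r\n condition1;\r\n condition2\r\n ->\r\n consequence1.\r\n\r\n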
The facts in each set of\r\nfacts are separated by semicolons (conjunctions), and the symbol ``->`` (implication) separates the conditions\r\nfrom the consequences.\r\nA simple rule might be::\r\n\r\n (love john, who sue)\r\n ->\r\n (love sue, who john).\r\n\r\nThe facts in the knowledge base are matched with the conditions of rules,\r\nand when all the conditions of a rule are matched by coherent facts,\r\nthe consequences are added to the knowledge base. The required coherence\r\namong matching facts concerns the variables in the conditions.\r\n\r\nWe can use variables in rules. They are logical variables, used only to match words,\r\nand with a scope limited to the rule where they are used. We build variables by\r\ncapitalizing the name of the type of words that they can match, and appending any number of\r\ndigits. So, for example, a variable ``Person1`` would match any person, such as\r\n``sue`` or ``john``. With variables, we may build a rule like::\r\n\r\n (love Person1, who Person2)\r\n ->\r\n (love Person2, who Person1).\r\n\r\nIf we have this rule, and also that ``(love john, who sue)``, the system will conclude\r\nthat ``(love sue, who john)``.\r\n\r\nVariables can match whole facts. 
For example, with the verbs we have defined, we could\r\nbuild a rule such as::\r\n\r\n (want john, what Exists1)\r\n ->\r\n (Exists1).\r\n\r\nWith this, and ``(want john, what (love sue, who john)).``, the system would conclude\r\nthat ``(love sue, who john)``.\r\n\r\nVariables that match verbs (or nouns) have a special form, in that they are prefixed by\r\nthe name of a verb (or a noun), so that they match verbs (or nouns) that are subtypes of the prefix verb (or noun).\r\nFor example, with the words we have from above, we might make a rule like::\r\n\r\n (LoveVerb1 john, who Person1)\r\n ->\r\n (LoveVerb1 Person1, who john).\r\n\r\nIn this case, ``LoveVerb1`` would match both ``love`` and ``adore``, so both\r\n``(love john, who sue)`` and ``(adore john, who sue)`` would produce a conclusion:\r\n``(love sue, who john)`` and ``(adore sue, who john)``, respectively.\r\n\r\nFor a more elaborate example we can define a new verb::\r\n\r\n to be-allowed is to exist, subj a person, to a verb.\r\n\r\nand a rule::\r\n\r\n (want Person1, what (LoveVerb1 Person1, who Person2));\r\n (be-allowed Person1, to LoveVerb1)\r\n ->\r\n (LoveVerb1 Person1, who Person2).\r\n\r\nThen, ``(be-allowed john, to adore)`` would allow john to adore but not to love.\r\n\r\nWe can use word variables, e.g. ``Word1``, that will match any word or fact.\r\n\r\nIn conditions, we may want to match a whole fact, and at the same time match some of\r\nits component words. To do this, we prepend the fact with the name\r\nof the fact variable, separated with a colon. 
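\r\nFor instance, reusing the words defined above, a condition with a fact variable could look like this (an illustrative pattern on its own, not a complete rule)::\r\n\r\n Love1:(love Person1, who Person2)\r\n\r\nHere ``Love1`` matches the whole fact, while ``Person1`` and ``Person2`` match words inside it.\r\n\r\n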
With this, the above rule would become::\r\n\r\n (want Person1, what Love1:(LoveVerb1 Person1, who Person2));\r\n (be-allowed Person1, to LoveVerb1)\r\n ->\r\n (Love1).\r\n\r\n\r\nIntegers\r\n--------\r\n\r\nIntegers are of type ``number``.\r\nWe don't define numbers, we just use them.\r\nAny sequence of characters that can be cast as an integer in Python\r\nis a number in Terms, e.g.: ``1``.\r\n\r\nNumber variables are composed just with a capital letter and an integer, like\r\n``N1``, ``P3``, or ``F122``.\r\n\r\nPythonic conditions\r\n-------------------\r\n\r\nIn rules, we can add a section where we test conditions with Python, or where we produce\r\nnew variables out of existing ones. This is primarily provided to test arithmetic conditions\r\nand to perform arithmetic operations. This section is placed after the conditions,\r\nbetween the symbols ``<-`` and ``->``. The result of the tests is placed in a\r\nPython variable named ``condition``, and if it evaluates to ``False``, the rule is not fired.\r\n\r\nTo give an example, let's imagine some new terms::\r\n\r\n to aged is to exist, age a number.\r\n a bar is a thing.\r\n club-momentos is a bar.\r\n to enters is to exist, where a bar.\r\n\r\nNow, we can build a rule such as::\r\n\r\n (aged Person1, age N1);\r\n (want Person1, what (enters Person1, where Bar1))\r\n <-\r\n condition = N1 >= 18\r\n ->\r\n (enters Person1, where Bar1).\r\n\r\nIf we have that::\r\n\r\n (aged sue, age 17).\r\n (aged john, age 19).\r\n (want sue, what (enters sue, where club-momentos)).\r\n (want john, what (enters john, where club-momentos)).\r\n\r\nThe system will (only) conclude that ``(enters john, where club-momentos)``.\r\n\r\nNegation\r\n--------\r\n\r\nWe can use 2 kinds of negation in Terms, classical negation and\r\nnegation by failure.\r\n\r\n**Classical negation**\r\n\r\nAny fact can be negated by prepending ``!`` to its verb::\r\n\r\n (!aged sue, age 17).\r\n\r\nA negated fact behaves the same as a non-negated one.\r\nOnly a 
negated fact can match a negated fact,\r\nand they can be asserted or used in rules.\r\nThe only special thing about negation is that\r\nthe system will not allow a fact and its negation\r\nin the same knowledge base: it will warn of a contradiction\r\nand will reject the offending fact.\r\n\r\n**Negation by failure**\r\n\r\nIn pythonic conditions, we can use a function ``runtime.count``\r\nwith a single string argument, a Terms fact (possibly with variables),\r\nwhich returns the number of facts in the db matching the given one.\r\nWe can use this to test for the absence of any given fact\r\nin the knowledge base, and thus have negation by failure.\r\n\r\nSome care must be taken with the ``count`` function.\r\nIf a fact is entered that might match a pythonic ``count`` condition,\r\nit will never by itself trigger any rule.\r\nRules are activated by facts matching normal conditions;\r\npythonic conditions can only allow or abort\r\nthose activations.\r\nIn other words, when a fact is added,\r\nit is tested against all normal conditions in all rules,\r\nand if it activates any rule, the pythonic conditions are tested.\r\nAn example of this behaviour can be seen\r\n`here `_.\r\nIf you examine the ontology in the previous link,\r\nyou will see that it is obviously wrong;\r\nthat's the reason I say that care must be taken.\r\nCounting happens in time,\r\nand it is not advisable to use it without activating time.\r\n\r\nTime\r\n----\r\n\r\nIn the monotonic classical logic we have depicted so far,\r\nit is very simple to represent physical time:\r\nyou only need to add a ``time`` object of type ``number``\r\nto any temporal verb.\r\nHowever, to represent the present time, the now,\r\ni.e., a changing distinguished instant of time,\r\nthis logic is not enough.\r\nWe need to use some non-monotonic tricks for that,\r\nwhich are implemented in Terms as a kind of temporal logic.\r\nThis temporal logic can be activated in the settings file::\r\n\r\n [mykb]\r\n dbms = 
postgresql://terms:terms@localhost\r\n dbname = mykb\r\n time = normal\r\n instant_duration = 60\r\n\r\nIf it is activated, several things happen.\r\n\r\nThe first is that the system starts tracking the present time:\r\nIt has an integer register whose value represents the current time.\r\nThis register is updated every ``config['instant_duration']`` seconds.\r\nThere are 3 possible values for the ``time``\r\nsetting:\r\nIf the setting is ``none``, nothing is done with time.\r\nIf the setting is ``normal``, the current time of the system is incremented by 1 when it is updated.\r\nIf the setting is ``real``, the current time of the system\r\nis updated with Python's ``import time; int(time.time())``.\r\n\r\nThe second thing that happens is that, rather than defining verbs extending ``exist``,\r\nwe use 2 new verbs, ``occur`` and ``endure``, both subtypes of ``exist``.\r\nThese new verbs have special ``number`` objects:\r\n``occur`` has an ``at_`` object, and ``endure`` has ``since_`` and ``till_`` objects.\r\n\r\nThe third is that the system starts keeping 2 different factsets,\r\none for the present and one for the past.\r\nAll reasoning occurs in the present factset.\r\nWhen we add a fact made with these verbs, the system automatically adds\r\nto ``occur`` an ``at_`` object and to ``endure`` a ``since_`` object,\r\nboth with the value of its \"present\" register.\r\nThe ``till_`` object of ``endure`` facts is left undefined.\r\nWe never explicitly set those objects.\r\nEach time the time is updated, all ``occur`` facts are removed from the present\r\nand added to the past factset, and thus stop producing consequences.\r\nQueries for ``occur`` facts go to the past factset if we specify an ``at_`` object in the query,\r\nand to the present if an ``at_`` object is not provided.\r\nThe same goes for ``endure`` facts, substituting ``at_`` with ``since_``.\r\nWe might say that the ``endure`` facts in the present factset are in\r\npresent continuous 
tense.\r\n\r\nThe fourth thing that happens when we activate the temporal logic\r\nis that we can use a new predicate in the consequences of our rules:\r\n``finish``. This verb is defined like this::\r\n\r\n to finish is to exist, subj a thing, what a exist.\r\n\r\nAnd when a rule with such a consequence is activated,\r\nit grabs the provided ``what`` fact from the present factset,\r\nadds a ``till_`` object to it with the present time as value,\r\nremoves it from the present factset,\r\nand adds it to the past factset.\r\n\r\nThere is also the temporal verb ``exclusive-endure``, subverb of ``endure``.\r\nThe peculiarity of ``exclusive-endure`` is that whenever a fact with\r\nsuch a verb is added to the knowledge base,\r\nany previous present facts with the same subject and verb are ``finish`` ed.\r\n\r\nA further verb, ``happen``, derived from ``occur``, has the singularity that\r\nwhen a fact built with a verb derived from ``happen`` is added as a\r\nconsequence of other facts, it is fed back through the pipeline to the\r\nuser who added the facts that are producing consequences.\r\n\r\n\r\nQuerying\r\n--------\r\n\r\nQueries are sets of facts separated by semicolons,\r\nwith or without variables.\r\nIf the query contains no variables, the answer will be ``true``\r\nfor presence of the asked facts or ``false`` for their absence.\r\nTo find out whether a fact is negated we must query its negation.\r\n\r\nIf we include variables in the query,\r\nwe will obtain all the variable substitutions\r\nthat would produce a ``true`` query,\r\nin the form of a json list of mappings of strings.\r\n\r\nHowever, we cannot add special constraints,\r\nlike we can in rules with pythonic conditions.\r\n\r\n\r\n**Miscellaneous technical notes.**\r\n\r\n* I have shown several different kinds of variables,\r\n for things, for verbs, for numbers, for facts.\r\n But the logic behind Terms is first order,\r\n there is only one kind of individual,\r\n and the proliferation of kinds of 
variables\r\n is just syntactic sugar.\r\n ``Person1`` would be equivalent to something like\r\n \"for all x, x is a person and x...\".\r\n ``LoveVerb1`` would be equivalent to something like\r\n \"for all x, a x is a love and x...\".\r\n\r\n* The design of the system is such that\r\n both adding new facts (with their consequences)\r\n and querying for facts should be independent of\r\n the size of the knowledge base.\r\n The only place where we depend on the size of the data\r\n is in arithmetic conditions,\r\n since at present number objects are not indexed as such.\r\n\r\n* The Python section of the rules is ``exec``-ed\r\n with a dict with the ``condition`` variable in locals\r\n and an empty dict as globals. We might add whatever we\r\n like as globals; for example, numpy.\r\n\r\n\r\nThe Terms Protocol\r\n++++++++++++++++++\r\n\r\nOnce you have a knowledge store in place and a kb daemon running::\r\n\r\n $ mkdir -p var/log\r\n $ mkdir -p var/run\r\n $ bin/kbdaemon start\r\n\r\nYou communicate with it through a TCP socket (e.g. 
telnet),\r\nwith a communication protocol that I shall describe here.\r\n\r\nA message from a client to the daemon, in this protocol, is a series of\r\nutf-8 encoded byte strings terminated by the string ``'FINISH-TERMS'``.\r\n\r\nThe daemon joins these strings and, depending on a header,\r\ndoes one of a few things.\r\nA header is a string of lower-case alphabetic characters,\r\nseparated from the rest of the message by a colon.\r\n\r\n* If there is no header, the message is assumed to be\r\n a series of constructs in the Terms language,\r\n and fed to the compiler.\r\n Depending on the type of constructs, the response can be different:\r\n\r\n * If the construct is a query, the response is a json string\r\n followed by the string ``'END'``;\r\n \r\n * If the constructs are definitions, facts and/or rules,\r\n the response consists of the series of facts that derive as\r\n consequences of the entered constructs and are constructed\r\n with a verb that ``is to happen``, terminated by the string ``'END'``.\r\n\r\n* If there is a ``lexicon:`` header, the response is a json string\r\n followed by the string ``'END'``. The contents of the json depend\r\n on a second header:\r\n \r\n * ``get-subwords:`` returns a list of word names that are subwords\r\n of the word whose name is given after the header.\r\n \r\n * ``get-words:`` returns a list of word names that are\r\n of the type of the word whose name is given after the header.\r\n \r\n * ``get-verb:`` returns a representation of the objects that the verb\r\n named after the header has. 
For each object, there is a list with\r\n 3 items:\r\n \r\n * A string with the name of the label;\r\n \r\n * A string with the name of the type of the object;\r\n \r\n * A boolean that signals whether the object must itself be a fact.\r\n\r\n* If there is a ``compiler:`` header:\r\n \r\n * If there is an ``exec_globals:`` header, the string that follows\r\n is assumed to be an exec_global, and fed to the knowledge store as such.\r\n \r\n * If there is a ``terms:`` header, what follows is assumed to be\r\n Terms constructs, and we go back to the first bullet point in this series.\r\n\r\nInstallation and usage\r\n======================\r\n\r\nInstallation with setuptools on a virtualenv\r\n++++++++++++++++++++++++++++++++++++++++++++\r\n\r\nYou don't need to use pythonbrew,\r\nbut you must make sure you are using python 3.3.0 or above::\r\n\r\n $ pythonbrew use 3.3.0\r\n\r\nMake a virtualenv, and install setuptools::\r\n\r\n $ pyvenv test-terms\r\n $ cd test-terms/\r\n $ . bin/activate\r\n $ wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python\r\n\r\nInstall Terms (in this case, with PostgreSQL support)::\r\n\r\n $ easy_install Terms[PG]\r\n\r\nInstallation with buildout on a clean debian machine\r\n++++++++++++++++++++++++++++++++++++++++++++++++++++\r\n\r\nI use this to develop Terms.\r\n\r\nStart with a clean basic debian 7.1 virtual machine,\r\nonly selecting the \"standard system utilities\" and\r\n\"ssh server\" software during installation.\r\n\r\nInstall some additional software, first to compile python-3.3::\r\n\r\n # aptitude install vim sudo build-essential libreadline-dev zlib1g-dev libpng++-dev libjpeg-dev libfreetype6-dev libncurses-dev libbz2-dev libcrypto++-dev libssl-dev libdb-dev\r\n $ wget http://www.python.org/ftp/python/3.3.2/Python-3.3.2.tgz\r\n $ tar xzf Python-3.3.2.tgz\r\n $ cd Python-3.3.2\r\n $ ./configure\r\n $ make\r\n $ sudo make install\r\n\r\nInstall git, and an RDBMS::\r\n\r\n $ sudo aptitude install git 
postgresql postgresql-client postgresql-server-dev-9.1\r\n\r\nAllow the \"trust\" method for all local connections for PostgreSQL, and create a \"terms\" user::\r\n\r\n $ sudo vim /etc/postgresql/9.1/main/pg_hba.conf\r\n $ sudo su - postgres\r\n $ psql\r\n postgres=# create role terms with superuser login;\r\n CREATE ROLE\r\n postgres=# \\q\r\n $ logout\r\n\r\nGet the buildout::\r\n\r\n $ git clone https://github.com/enriquepablo/terms-project.git\r\n\r\nMake a python-3.3.2 virtualenv::\r\n\r\n $ cd terms-project\r\n $ pyvenv env\r\n $ . env/bin/activate\r\n\r\nEdit the configuration file and run the buildout\r\n(if you ever change the configuration file,\r\nyou must re-run the buildout)::\r\n\r\n $ vim config.cfg\r\n $ python bootstrap.py\r\n $ bin/buildout\r\n\r\nNow we initialize the knowledge store::\r\n\r\n $ bin/initterms -c etc/terms.cfg\r\n\r\nNow, you can start the REPL and play with it::\r\n\r\n $ bin/terms -c etc/terms.cfg\r\n >> a man is a thing.\r\n man\r\n >> quit\r\n $\r\n\r\n\r\nInterfacing with Terms\r\n++++++++++++++++++++++\r\n\r\nOnce installed, you should have a ``terms`` script,\r\nwhich provides a REPL.\r\n\r\nIf you just type ``terms`` in the command line,\r\nyou will get a command line interpreter,\r\nbound to an in-memory sqlite database.\r\n\r\nIf you want to make your Terms knowledge store persistent,\r\nyou must edit the configuration file,\r\nand add a section for your knowledge store.\r\nIf you have installed Terms with easy_install,\r\nyou must create this configuration file in ``~/.terms.cfg``::\r\n\r\n [mykb]\r\n dbms = sqlite:////path/to/my/kbs\r\n dbname = mykb\r\n time = none\r\n\r\nThen you must initialize the knowledge store::\r\n\r\n $ initterms mykb\r\n\r\nAnd now you can start the REPL::\r\n\r\n $ terms mykb\r\n >>\r\n\r\nIn the configuration file you can put as many\r\nsections (e.g., ``[mykb]``) as you like,\r\none for each knowledge store.\r\n\r\nUsing PostgreSQL\r\n++++++++++++++++\r\n\r\nTo use 
PostgreSQL, you need the psycopg2 package,\r\nwhich you can get with easy_install. Of course,\r\nyou need PostgreSQL and its header files for that::\r\n\r\n $ sudo aptitude install postgresql postgresql-client postgresql-server-dev-9.1\r\n $ easy_install Terms[PG]\r\n\r\nThe database specified in the configuration file must exist if you use\r\npostgresql,\r\nand the user (specified in the config file in the dbms URL)\r\nmust be able to create and drop tables and indexes.\r\nYou would have a config file like::\r\n\r\n [mykb]\r\n dbms = postgresql://terms:terms@localhost\r\n dbname = testkb\r\n time = normal\r\n\r\nSo, for example, once you are set, open the REPL::\r\n\r\n eperez@calandria$ initterms mykb\r\n eperez@calandria$ terms mykb\r\n >> a person is a thing.\r\n >> to love is to exist, subj a person, who a person.\r\n >> john is a person.\r\n >> sue is a person.\r\n >> (love john, who sue).\r\n >> (love john, who sue)?\r\n true\r\n >> (love sue, who john)?\r\n false\r\n >> quit\r\n eperez@calandria$ terms mykb\r\n >> (love john, who sue)?\r\n true\r\n\r\nUsing the kbdaemon\r\n++++++++++++++++++\r\n\r\nTerms provides a daemon that listens on TCP port 1967.\r\nTo use the daemon, you must put your config in a section of the config file named \"default\"::\r\n\r\n [default]\r\n dbms = postgresql://terms:terms@localhost\r\n dbname = testkb\r\n time = normal\r\n\r\nNow you can start the daemon::\r\n\r\n $ bin/kbdaemon start\r\n kbdaemon started\r\n $\r\n\r\nAnd you can interface with it by making a TCP connection to port 1967 of the machine\r\nand using the protocol described at the end of the README.rst.\r\n\r\nSupport\r\n=======\r\n\r\nThere is a `mailing list `_ at google groups.\r\nYou can also open an issue in `the tracker `_.\r\nOr mail me .", "description_content_type": null, "docs_url": null, "download_url": "UNKNOWN", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "http://pypi.python.org/pypi/Terms", "keywords": "", 
"license": "GNU GENERAL PUBLIC LICENSE Version 3", "maintainer": "", "maintainer_email": "", "name": "enriquepablo", "package_url": "https://pypi.org/project/enriquepablo/", "platform": "UNKNOWN", "project_url": "https://pypi.org/project/enriquepablo/", "project_urls": { "Download": "UNKNOWN", "Homepage": "http://pypi.python.org/pypi/Terms" }, "release_url": "https://pypi.org/project/enriquepablo/0.1.0b4/", "requires_dist": null, "requires_python": null, "summary": "A smart knowledge store", "version": "0.1.0b4" }, "last_serial": 858494, "releases": { "0.1.0b4": [] }, "urls": [] }