{ "info": { "author": "Mohammadreza Alihoseiny", "author_email": "salam@alihoseiny.ir", "bugtrack_url": null, "classifiers": [], "description": "# WordCloudFa\n[![Downloads](https://pepy.tech/badge/wordcloud-fa)](https://pepy.tech/project/wordcloud-fa)\n![](https://img.shields.io/pypi/v/wordcloud-fa.svg?style=popout)\n\n\n![](https://github.com/alihoseiny/word_cloud_fa/raw/master/Examples/masked-example.png)\n\n\nThis module is an easy-to-use wrapper for [word_cloud module](https://github.com/amueller/word_cloud).\n\nThe original module doesn't support Farsi Texts. But by using **WordCloudFa** you can generate word clouds from \ntexts those are including Persian and English words.\n\nThis module is not only a wrapper, but it adds some features to the original module.\n\n\n\n- [How to Install](#how-to-install)\n- [How to Use](#how-to-use)\n * [Generating Word Cloud from Text](#generating-word-cloud-from-text)\n * [Generating Word Cloud from Frequencies](#generating-word-cloud-from-frequencies)\n * [Working with Stopwords](#working-with-stopwords)\n * [Mask Image](#mask-image)\n * [Reshaping words](#reshaping-words)\n- [Examples](#examples)\n- [Font](#font)\n- [Persian Tutorial](#persian-tutorial)\n- [Contribution](#contribution)\n- [There is any problem?](#there-is-any-problem)\n- [Citations](#citations)\n\n\n\n# How to Install\nFor installing this module, you can simply run \n\n`pip install wordcloud-fa`.\n\nThis module tested on `python 3`\n\n*WordCloudFa* depends on `numpy` and `pillow`.\n\nAlso you should have `Hazm` module. Normally, all of them will install automatically when you install this module using \n`pip` as described at the beginning of this section. \n\nTo save the wordcloud into a file, `matplotlib` can also be installed.\n\n**Attention**\n\nYou need to have `python-dev` for python3 on your system. 
If you don't have it, you can install it on operating systems \nthat use `apt` as the package manager (like Ubuntu) with this command:\n\n`sudo apt-get install python3-dev`\n\nOn operating systems that use `yum` as the package manager (like RedHat, Fedora, and so on), you can \nuse the following command:\n\n`sudo yum install python3-devel` \n\n# How to Use\nTo create a word cloud from a text, first import the class into your code:\n\n`from wordcloud_fa import WordCloudFa`\n\nThen you can create an instance of this class:\n\n`wordcloud = WordCloudFa()`\n\nYou can pass different parameters to the constructor. For their full documentation, see the \n[WordCloud documentation](https://amueller.github.io/word_cloud/).\n\nThere are two parameters that are not in the original class.\n\nThe first one is `persian_normalize`. If you pass this parameter as `True`, your data will be normalized by the \n[Hazm normalizer](https://github.com/sobhe/hazm). It's recommended to always pass this parameter: among other things, it replaces \nArabic letters with Persian ones.\nThe default value of this parameter is `False`.\n\n`wordcloud = WordCloudFa(persian_normalize=True)` \n\nThe second parameter is `include_numbers`, which is not in the published original module. If you set this parameter to `False`,\nall Persian, Arabic and English numbers will be removed from your data.\n\nThe default value of this parameter is `True`.\n\n`wordcloud = WordCloudFa(include_numbers=False)`\n\n## Generating Word Cloud from Text\nTo generate a word cloud from a string, simply call the `generate` method of your instance:\n\n```python\nwordcloud = WordCloudFa(persian_normalize=True)\nwc = wordcloud.generate(text)\nimage = wc.to_image()\nimage.show()\nimage.save('wordcloud.png')\n```\n\n## Generating Word Cloud from Frequencies\n\nYou can generate a word cloud from frequencies. 
You can use the output of the `process_text` method as frequencies.\nYou can also use any dictionary with the same shape.\n\n```python\nwordcloud = WordCloudFa()\nfrequencies = wordcloud.process_text(text)\nwc = wordcloud.generate_from_frequencies(frequencies)\n```\n\nThe `generate_from_frequencies` method in this module excludes stopwords, while the original module does not exclude them \nwhen you use this method. You can also use Persian words as keys in the frequencies dict without any problem.\n\n## Working with Stopwords\n\nStopwords are the words that we don't want to consider. If you don't pass any stopwords, the default words in the \n[stopwords](https://github.com/alihoseiny/word_cloud_fa/blob/master/wordcloud_fa/stopwords) file are used as \nstopwords.\n\nIf you don't want to use them at all and want to choose your own stopwords, simply set the `stopwords` parameter when \ncreating an instance of `WordCloudFa` and pass a `set` of words into it.\n\n```python\nstop_words = set(['\u06a9\u0644\u0645\u0647\u200c\u06cc \u0627\u0648\u0644', '\u06a9\u0644\u0645\u0647\u200c\u06cc \u062f\u0648\u0645'])\nwc = WordCloudFa(stopwords=stop_words)\n```\n\nIf you want to add words to the default stopwords, simply call the `add_stop_words` method on your \ninstance of `WordCloudFa` and pass an iterable (`list`, `set`, ...) into it.\n\n```python\nwc = WordCloudFa()\nwc.add_stop_words(['\u06a9\u0644\u0645\u0647\u200c\u06cc \u0627\u0648\u0644', '\u06a9\u0644\u0645\u0647\u200c\u06cc \u062f\u0648\u0645'])\n```\n\nYou can also add stopwords from a file. The file should contain the stopwords, one per line.\n\nFor that, use the `add_stop_words_from_file` method. Its only parameter is the \nrelative or absolute path to the stopwords file.\n\n```python\nwc = WordCloudFa()\nwc.add_stop_words_from_file(\"stopwords.txt\")\n```\n\n## Mask Image\n\nYou can mask the final word cloud with an image. 
For example, the first image in this document is a word cloud masked with an image \nof the map of Iran. To set a mask, pass the `mask` parameter.\n\nBut first, make sure you have a black-and-white image, because other images will not produce a good result.\n\nThen you should convert that image to a numpy array, like this:\n\n```python\nimport numpy as np\nfrom PIL import Image\n\nmask_array = np.array(Image.open(\"mask.png\"))\n```\n\nYou just need to add those two imports; you don't need to worry about installing them, because they are already \ninstalled as dependencies of this module.\n\nThen you can pass that array to the constructor of the `WordCloudFa` class for masking the result.\n\n```python\nwordcloud = WordCloudFa(mask=mask_array)\n```\n\nNow you can use your wordcloud instance as before.\n\n## Reshaping words\n\nWhen you pass your texts into an instance of this class, all words are reshaped so that they display properly, \navoiding the invalid shapes of Persian or Arabic words (split and reversed letters).\n\nIf you want to do the same thing outside of this module, you can call the `reshape_words` static method.\n\n```python\nreshaped_words = WordCloudFa.reshape_words(['\u06a9\u0644\u0645\u0647\u200c\u06cc \u0627\u0648\u0644', '\u06a9\u0644\u0645\u0647\u200c\u06cc \u062f\u0648\u0645'])\n```\n\nThis method takes an `Iterable` as input and returns a list of reshaped words.\n\n**DON'T FORGET THAT YOU SHOULD NOT PASS RESHAPED WORDS TO THE METHODS OF THIS CLASS; THIS STATIC METHOD IS ONLY FOR USE OUTSIDE OF THIS MODULE.**\n\n# Examples\nYou can see [Example codes in the Examples 
directory](https://github.com/alihoseiny/word_cloud_fa/tree/master/Examples).\n\n![](https://github.com/alihoseiny/word_cloud_fa/raw/master/Examples/english-example.png)\n![](https://github.com/alihoseiny/word_cloud_fa/raw/master/Examples/mixed-example.png)\n![](https://github.com/alihoseiny/word_cloud_fa/raw/master/Examples/persian-example.png)\n\n# Font\nThe default font is an unknown font that supports both Persian and English letters, so you don't need to pass a font to \nget results. But if you want to change the font, you can pass the `font_path` parameter.\n\n# Persian Tutorial\nIf you want to read a brief tutorial about how to use this package in Farsi (Persian), you can \n[click on this link](https://blog.alihoseiny.ir/%da%86%da%af%d9%88%d9%86%d9%87-%d8%a8%d8%a7-%d9%be%d8%a7%db%8c%d8%aa%d9%88%d9%86-%d8%a7%d8%a8%d8%b1-%da%a9%d9%84%d9%85%d8%a7%d8%aa-%d9%81%d8%a7%d8%b1%d8%b3%db%8c-%d8%a8%d8%b3%d8%a7%d8%b2%db%8c%d9%85%d8%9f/?utm_source=github&utm_medium=readme&utm_campaign=wordcloudfa).\n\n# Contribution\nWe want to keep this library fresh and useful for all Iranian developers, so we need your help adding new features, fixing bugs and improving the documentation.\n\nWondering how you can contribute to this project? Here is a list of what you can do:\n\n1. Are the documents not enough? You can help us by adding more documentation.\n2. Could the current code be better? You can make it cleaner or faster.\n3. Do you think a useful feature is missing? You can open an issue and tell us about it.\n4. Did you find a good open and free font that supports Farsi and English? You can notify us with a pull request or by opening an issue.\n\n# There is any problem?\nIf you have a question, find a bug or need a feature, you can open an issue and tell us. For some strange reason this is not possible? 
Then contact me by email: `salam@alihoseiny.ir`.\n\n# Citations\nTexts in the `Examples` directory are taken from [this Persian](https://fa.wikipedia.org/wiki/%D8%A7%DB%8C%D8%B1%D8%A7%D9%86) and \n[this English](https://en.wikipedia.org/wiki/Iran) Wikipedia page.", "description_content_type": "text/markdown", "docs_url": null, "download_url": "https://github.com/alihoseiny/word_cloud_fa/archive/0.1.3.tar.gz", "downloads": { "last_day": -1, "last_month": -1, "last_week": -1 }, "home_page": "https://github.com/alihoseiny/word_cloud_fa", "keywords": "wordcloud,word cloud,Farsi,persian,Iran,nlp,National Language Processing,text processing,data visualization", "license": "MIT", "maintainer": "", "maintainer_email": "", "name": "wordcloud-fa", "package_url": "https://pypi.org/project/wordcloud-fa/", "platform": "", "project_url": "https://pypi.org/project/wordcloud-fa/", "project_urls": { "Download": "https://github.com/alihoseiny/word_cloud_fa/archive/0.1.3.tar.gz", "Homepage": "https://github.com/alihoseiny/word_cloud_fa" }, "release_url": "https://pypi.org/project/wordcloud-fa/0.1.4/", "requires_dist": null, "requires_python": "", "summary": "A wrapper for wordcloud module for creating persian (and other rtl languages) word cloud.", "version": "0.1.4" }, "last_serial": 5601030, "releases": { "0.1": [ { "comment_text": "", "digests": { "md5": "0efeee0c2b79612c574820a6a81b761f", "sha256": "d1f4ee32a65823d0c704813721aa55c25aa7ef0de20b230d226b809e2b401f01" }, "downloads": -1, "filename": "wordcloud_fa-0.1.tar.gz", "has_sig": false, "md5_digest": "0efeee0c2b79612c574820a6a81b761f", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 3093, "upload_time": "2019-05-17T18:46:02", "url": "https://files.pythonhosted.org/packages/3e/a6/0abce579a2b35bbe1905fdbbf3cd38d74850aaf60b21025e43c6f13fc4b4/wordcloud_fa-0.1.tar.gz" } ], "0.1.1": [ { "comment_text": "", "digests": { "md5": "4b81bbe7552cac83ae9cc7818405dbe6", "sha256": 
"aba34b0fedf6edb692db417c1235cc1117cb070e67e0b4f890a64824480cb263" }, "downloads": -1, "filename": "wordcloud_fa-0.1.1.tar.gz", "has_sig": false, "md5_digest": "4b81bbe7552cac83ae9cc7818405dbe6", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 64602, "upload_time": "2019-05-17T19:00:39", "url": "https://files.pythonhosted.org/packages/bb/8b/bdc580bb48c842723f837ace7f8ac0855b9bebe68449297ef02cb9f03965/wordcloud_fa-0.1.1.tar.gz" } ], "0.1.2": [ { "comment_text": "", "digests": { "md5": "c709dd1e1cb1b5f1d5a936ebb5aba731", "sha256": "444a87e770014359a7d7852a84ec35d35f743d2429a71f964cac9e5656f05644" }, "downloads": -1, "filename": "wordcloud_fa-0.1.2.tar.gz", "has_sig": false, "md5_digest": "c709dd1e1cb1b5f1d5a936ebb5aba731", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 64726, "upload_time": "2019-05-18T05:03:42", "url": "https://files.pythonhosted.org/packages/10/cd/fb7f3524811710bc5fc180bb734540627148f7989a555109e8e15996f91d/wordcloud_fa-0.1.2.tar.gz" } ], "0.1.3": [ { "comment_text": "", "digests": { "md5": "8de1616fd73d1cad8aae3c54c3472fe9", "sha256": "2fd2bbaeeceecec36732066bc748a4905b25849514a12c28babd11d217c3b0db" }, "downloads": -1, "filename": "wordcloud_fa-0.1.3.tar.gz", "has_sig": false, "md5_digest": "8de1616fd73d1cad8aae3c54c3472fe9", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 72687, "upload_time": "2019-06-06T07:42:09", "url": "https://files.pythonhosted.org/packages/fe/2b/4e765f7d2fe0df1966b5cd8d4b63fd76ba7235f98b970e25b4ebf181e687/wordcloud_fa-0.1.3.tar.gz" } ], "0.1.4": [ { "comment_text": "", "digests": { "md5": "856c051dd66265c644c6d0b3e11d0ade", "sha256": "ba47a4ea37773698626e20870a551470cb95ae8f7cbf768fbac315efd42470ba" }, "downloads": -1, "filename": "wordcloud_fa-0.1.4.tar.gz", "has_sig": false, "md5_digest": "856c051dd66265c644c6d0b3e11d0ade", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 
73635, "upload_time": "2019-07-29T18:33:27", "url": "https://files.pythonhosted.org/packages/49/bb/6904c325a1adba4a98cf8561f452308a8527318bc0e4383c7bc4b9131171/wordcloud_fa-0.1.4.tar.gz" } ] }, "urls": [ { "comment_text": "", "digests": { "md5": "856c051dd66265c644c6d0b3e11d0ade", "sha256": "ba47a4ea37773698626e20870a551470cb95ae8f7cbf768fbac315efd42470ba" }, "downloads": -1, "filename": "wordcloud_fa-0.1.4.tar.gz", "has_sig": false, "md5_digest": "856c051dd66265c644c6d0b3e11d0ade", "packagetype": "sdist", "python_version": "source", "requires_python": null, "size": 73635, "upload_time": "2019-07-29T18:33:27", "url": "https://files.pythonhosted.org/packages/49/bb/6904c325a1adba4a98cf8561f452308a8527318bc0e4383c7bc4b9131171/wordcloud_fa-0.1.4.tar.gz" } ] }