Started by GitHub push by severinsimmler
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision ef0e8680a784a1c74f8b4c164eef6709ace2ecb2 (refs/remotes/origin/testing)
Commit message: "Update __init__.py"
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ef0e8680a784a1c74f8b4c164eef6709ace2ecb2
 > git rev-list c3c24d1bd1dfaacb849753308063479f4e485422 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
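The `pip install -U -r requirements-dev.txt` step that follows resolves two chained requirements files. Judging from pip's "(from -r … (line N))" annotations in the output, they are laid out roughly as sketched here. This is a reconstruction, not the repository's actual files; in particular, line 1 of requirements-dev.txt is inferred:

```
# requirements-dev.txt (sketch inferred from the pip log)
-r requirements.txt    # line 1 (inferred): chains to the runtime requirements
nose                   # line 2
nosexcover             # line 3
jupyter                # line 4
sphinx                 # line 5
nbsphinx               # line 6
recommonmark           # line 7

# requirements.txt (sketch inferred from the pip log)
-e .                   # line 1: editable install of dariah-topics from the workspace
```

The `-e .` entry explains both the "Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics" line and the later "Running setup.py develop" step.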
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda4659662574398070357.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
  Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
  Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 5))
  Using cached Sphinx-1.6.4-py2.py3-none-any.whl
Collecting nbsphinx (from -r requirements-dev.txt (line 6))
  Using cached nbsphinx-0.2.14-py2.py3-none-any.whl
Collecting recommonmark (from -r requirements-dev.txt (line 7))
  Using cached recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pandas-0.20.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting lda>=1.0.5 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached numpy-1.13.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting lxml>=3.6.4 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached lxml-4.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting flask>=0.11.1 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached Flask-0.12.2-py2.py3-none-any.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached matplotlib-2.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting bokeh>=0.12.6 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
  Using cached coverage-4.4.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
  Using cached qtconsole-4.3.1-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipywidgets-7.0.1-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
  Using cached nbconvert-5.3.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
  Using cached notebook-5.1.0-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipykernel-4.6.1-py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached alabaster-0.7.10-py2.py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached Jinja2-2.9.6-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 5))
  Using cached sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached docutils-0.14-py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from sphinx->-r requirements-dev.txt (line 5))
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting six>=1.5 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached six-1.11.0-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 5))
  Using cached imagesize-0.7.1-py2.py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached Babel-2.5.1-py2.py3-none-any.whl
Collecting nbformat (from nbsphinx->-r requirements-dev.txt (line 6))
  Using cached nbformat-4.4.0-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 6))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 7))
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pytz-2017.2-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached scipy-0.19.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting Werkzeug>=0.7 (from flask>=0.11.1->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached Werkzeug-0.12.2-py2.py3-none-any.whl
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.0,!=2.0.4,!=2.1.2,!=2.1.6,>=1.5.6 (from matplotlib>=1.5.3->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting jupyter-core (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting jupyter-client>=4.1 (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_client-5.1.0-py2.py3-none-any.whl
Collecting ipython-genutils (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting ipython>=4.0.0; python_version >= "3.3" (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython-6.2.1-py3-none-any.whl
Collecting widgetsnbextension~=3.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached widgetsnbextension-3.0.3-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached prompt_toolkit-1.0.15-py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting mistune>=0.7.4 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached mistune-0.7.4-py2.py3-none-any.whl
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached bleach-2.1.1-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 5))
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached idna-2.6-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached certifi-2017.7.27.1-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbsphinx->-r requirements-dev.txt (line 6))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 6))
  Using cached decorator-4.1.2-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached boto-2.48.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting pyzmq>=13 (from jupyter-client>=4.1->qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0; python_version >= "3.3"->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0; python_version >= "3.3"->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Collecting jedi>=0.10 (from ipython>=4.0.0; python_version >= "3.3"->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached jedi-0.11.0-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0; python_version >= "3.3"->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached html5lib-1.0b10-py2.py3-none-any.whl
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting parso==0.1.0 (from jedi>=0.10->ipython>=4.0.0; python_version >= "3.3"->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached parso-0.1.0-py2.py3-none-any.whl
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Installing collected packages: nose, coverage, nosexcover, Pygments, decorator, six, ipython-genutils, traitlets, jupyter-core, pyzmq, python-dateutil, jupyter-client, ptyprocess, pexpect, simplegeneric, parso, jedi, pickleshare, wcwidth, prompt-toolkit, ipython, tornado, ipykernel, qtconsole, jsonschema, nbformat, entrypoints, mistune, webencodings, html5lib, bleach, testpath, MarkupSafe, Jinja2, pandocfilters, nbconvert, terminado, notebook, widgetsnbextension, ipywidgets, jupyter-console, jupyter, alabaster, urllib3, idna, certifi, chardet, requests, sphinxcontrib-websupport, docutils, snowballstemmer, imagesize, pytz, babel, sphinx, nbsphinx, commonmark, recommonmark, numpy, pandas, regex, scipy, boto, bz2file, smart-open, gensim, pbr, lda, lxml, Werkzeug, click, itsdangerous, flask, cycler, pyparsing, matplotlib, PyYAML, bokeh,
dariah-topics
Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.6 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Werkzeug-0.12.2 alabaster-0.7.10 babel-2.5.1 bleach-2.1.1 bokeh-0.12.9 boto-2.48.0 bz2file-0.98 certifi-2017.7.27.1 chardet-3.0.4 click-6.7 commonmark-0.5.4 coverage-4.4.1 cycler-0.10.0 dariah-topics decorator-4.1.2 docutils-0.14 entrypoints-0.2.3 flask-0.12.2 gensim-3.0.0 html5lib-1.0b10 idna-2.6 imagesize-0.7.1 ipykernel-4.6.1 ipython-6.2.1 ipython-genutils-0.2.0 ipywidgets-7.0.1 itsdangerous-0.24 jedi-0.11.0 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.1.0 jupyter-console-5.2.0 jupyter-core-4.3.0 lda-1.0.5 lxml-4.0.0 matplotlib-2.0.2 mistune-0.7.4 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.2.14 nose-1.3.7 nosexcover-1.0.11 notebook-5.1.0 numpy-1.13.3 pandas-0.20.3 pandocfilters-1.4.2 parso-0.1.0 pbr-3.1.1 pexpect-4.2.1 pickleshare-0.7.4 prompt-toolkit-1.0.15 ptyprocess-0.5.2 pyparsing-2.2.0 python-dateutil-2.6.1 pytz-2017.2 pyzmq-16.0.2 qtconsole-4.3.1 recommonmark-0.4.0 regex-2017.9.23 requests-2.18.4 scipy-0.19.1 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.3 snowballstemmer-1.2.1 sphinx-1.6.4 sphinxcontrib-websupport-1.0.1 terminado-0.6 testpath-0.3.1 tornado-4.5.2 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.0.3
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing requirements to dariah_topics.egg-info/requires.txt
writing dariah_topics.egg-info/PKG-INFO
writing top-level names to dariah_topics.egg-info/top_level.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.3.0.dev1
creating dariah_topics-0.3.0.dev1/dariah_topics
creating dariah_topics-0.3.0.dev1/dariah_topics.egg-info
creating dariah_topics-0.3.0.dev1/test
copying files to dariah_topics-0.3.0.dev1...
copying README.md -> dariah_topics-0.3.0.dev1
copying setup.cfg -> dariah_topics-0.3.0.dev1
copying setup.py -> dariah_topics-0.3.0.dev1
copying dariah_topics/__init__.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.3.0.dev1/test
Writing dariah_topics-0.3.0.dev1/setup.cfg
Creating tar archive
removing 'dariah_topics-0.3.0.dev1' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/postprocessing.py -> build/lib/dariah_topics
copying dariah_topics/__init__.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.3.0.dev1-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.3.0.dev1.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-0.3.0.dev1-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/doclist.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/mallet.py'
adding 'dariah_topics/meta.py'
adding 'dariah_topics/model_creation.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-0.3.0.dev1.dist-info/DESCRIPTION.rst'
adding 'dariah_topics-0.3.0.dev1.dist-info/metadata.json'
adding 'dariah_topics-0.3.0.dev1.dist-info/top_level.txt'
adding 'dariah_topics-0.3.0.dev1.dist-info/WHEEL'
adding 'dariah_topics-0.3.0.dev1.dist-info/METADATA'
adding 'dariah_topics-0.3.0.dev1.dist-info/RECORD'
+ ./setup.py build_sphinx -a
running build_sphinx
Running Sphinx v1.6.4
loading pickled environment...
done
[autosummary] generating autosummary for: CONTRIBUTING.md, Introducing_MALLET.ipynb, Introducing_gensim.ipynb, Introducing_lda.ipynb, README.md, demonstrator/README.md, docs/gen/dariah_topics.doclist.rst, docs/gen/dariah_topics.evaluation.rst, docs/gen/dariah_topics.mallet.rst, docs/gen/dariah_topics.meta.rst, docs/gen/dariah_topics.preprocessing.rst, docs/gen/dariah_topics.rst, docs/gen/dariah_topics.visualization.rst, docs/gen/modules.rst, index.rst, presentation_topic_over_time.ipynb
WordCloud functions not available, they require the wordcloud module
building [mo]: all of 0 po files
building [html]: all source files
updating environment: 0 added, 1 changed, 0 removed
reading sources... [100%] docs/gen/dariah_topics
looking for now-outdated files... none found
pickling environment... done
checking consistency...
/mnt/data/jenkins/workspace/DARIAH-Topics/presentation_topic_over_time.ipynb: WARNING: document isn't included in any toctree
done
preparing documents... done
writing output... [ 6%] CONTRIBUTING
writing output... [ 12%] Introducing_MALLET
writing output... [ 18%] Introducing_gensim
writing output... [ 25%] Introducing_lda
writing output... [ 31%] README
writing output... [ 37%] demonstrator/README
writing output... [ 43%] docs/gen/dariah_topics
writing output... [ 50%] docs/gen/dariah_topics.doclist
writing output... [ 56%] docs/gen/dariah_topics.evaluation
writing output... [ 62%] docs/gen/dariah_topics.mallet
writing output... [ 68%] docs/gen/dariah_topics.meta
writing output... [ 75%] docs/gen/dariah_topics.preprocessing
writing output... [ 81%] docs/gen/dariah_topics.visualization
writing output... [ 87%] docs/gen/modules
writing output... [ 93%] index
writing output...
[100%] presentation_topic_over_time
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.full_path:: WARNING: 'any' reference target not found: Path
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.full_path:10: WARNING: 'any' reference target not found: Path()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.get_docs:6: WARNING: 'any' reference target not found: _get_item(self, index)
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.full_path:: WARNING: 'any' reference target not found: Path
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.full_path:10: WARNING: 'any' reference target not found: Path()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.with_segment_files:5: WARNING: 'any' reference target not found: strings.Formatter
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_binary:: WARNING: 'any' reference target not found: stoplist
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_binary:: WARNING: 'any' reference target not found: replacements_files
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_binary:: WARNING: 'any' reference target not found: gram_sizes=1,2
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: output_model
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: output_state
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: output_model
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: output_state
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: output_topic_docs
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: output_doc_topics
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.create_mallet_model:: WARNING: 'any' reference target not found: num_iterations = 0
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:3: WARNING: 'any' reference target not found: documents
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:7: WARNING: more than one target found for 'any' cross-reference 'segments': could be :py:meth: or :py:meth:
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:7: WARNING: 'any' reference target not found: segments=True
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:3: WARNING: 'any' reference target not found: documents
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:7: WARNING: more than one target found for 'any' cross-reference 'segments': could be :py:meth: or :py:meth:
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:7: WARNING: 'any' reference target not found: segments=True
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: (?pattern)
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: _
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: author
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: title
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_doc_term_matrix:4: WARNING: 'any' reference target not found: _wordcounts()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: _create_large_counter()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: _create_sparse_index()`
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: get_labels()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: doc_labels
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: doc_tokens
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: type_dictionary
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: doc_ids
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: type_dictionary
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.create_sparse_bow:4: WARNING: 'any' reference target not found: doc_dictionary
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.find_hapax:4: WARNING: 'any' reference target not found: create_sparse_matrix()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.find_stopwords:4: WARNING: 'any' reference target not found: create_sparse_matrix()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.make_doc2bow_list:4: WARNING: 'any' reference target not found: get_document_topics()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.read_from_csv:4: WARNING: 'any' reference target not found: create_document_list()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.read_from_tei:4: WARNING: 'any' reference target not found: create_document_list()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.read_from_txt:4: WARNING: 'any' reference target not found: create_document_list()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.segment:1: WARNING: 'any' reference target not found: segment_size
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.tokenize:4: WARNING: 'any' reference target not found: lower
generating indices... genindex py-modindex
highlighting module code... [ 16%] dariah_topics.meta
highlighting module code... [ 33%] dariah_topics.mallet
highlighting module code... [ 50%] dariah_topics.preprocessing
highlighting module code... [ 66%] dariah_topics.evaluation
highlighting module code... [ 83%] dariah_topics.doclist
highlighting module code... [100%] dariah_topics.visualization
writing additional pages... search
copying images... [ 12%] build/sphinx/doctrees/nbsphinx/Introducing_MALLET_93_1.png
copying images... [ 25%] build/sphinx/doctrees/nbsphinx/presentation_topic_over_time_18_2.png
copying images... [ 37%] build/sphinx/doctrees/nbsphinx/Introducing_gensim_88_1.png
copying images... [ 50%] build/sphinx/doctrees/nbsphinx/Introducing_lda_71_1.png
copying images... [ 62%] build/sphinx/doctrees/nbsphinx/Introducing_lda_72_0.png
copying images... [ 75%] build/sphinx/doctrees/nbsphinx/presentation_topic_over_time_20_0.png
copying images... [ 87%] build/sphinx/doctrees/nbsphinx/presentation_topic_over_time_16_2.png
copying images... [100%] build/sphinx/doctrees/nbsphinx/presentation_topic_over_time_17_2.png
copying static files...
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/sphinx/application.py:445: RemovedInSphinx17Warning: app.status_iterator() is now deprecated. Use sphinx.util.status_iterator() instead.
  RemovedInSphinx17Warning)
WARNING: html_static_path entry '/mnt/data/jenkins/workspace/DARIAH-Topics/docs/_static' does not exist
done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory...
done
build succeeded, 46 warnings.
+ nosetests --cover-html
FFFFFFFFFFFF.......................
======================================================================
FAIL: Doctest: dariah_topics.postprocessing._
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 254, in _

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 272, in dariah_topics.postprocessing._
Failed example:
    sparse_bow = create_sparse_bow(doc_labels, doc_tokens, type_dictionary, doc_dictionary)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        sparse_bow = create_sparse_bow(doc_labels, doc_tokens, type_dictionary, doc_dictionary)
    NameError: name 'create_sparse_bow' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 279, in dariah_topics.postprocessing._
Failed example:
    make_doc2bow_list(sparse_bow)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        make_doc2bow_list(sparse_bow)
    NameError: name 'make_doc2bow_list' is not defined
-------------------- >> begin captured logging << --------------------
gensim.models.doc2vec: DEBUG: Fast version of gensim.models.doc2vec is being used
summa.preprocessing.cleaner: INFO: 'pattern' package not found; tag filters are not available for English
gensim.corpora.dictionary: INFO: adding document #0 to Dictionary(0 unique tokens: [])
gensim.corpora.dictionary: INFO: built Dictionary(4 unique tokens: ['for', 'test', 'corpus', 'testing']) from 2 documents (total 4 corpus positions)
gensim.models.ldamodel: INFO: using symmetric alpha at 1.0
gensim.models.ldamodel: INFO: using symmetric eta at 0.25
gensim.models.ldamodel: INFO: using serial LDA version on this node
gensim.models.ldamodel: INFO: running online (single-pass) LDA training, 1 topics, 1 passes over the supplied corpus of 2 documents, updating model once every 2 documents, evaluating perplexity every 2 documents, iterating 1x with a convergence threshold of 0.001000
gensim.models.ldamodel: WARNING: too few updates, training might not converge; consider increasing the number of passes or iterations to improve accuracy
gensim.models.ldamodel: DEBUG: bound: at document #0
gensim.models.ldamodel: INFO: -1.689 per-word bound, 3.2 perplexity estimate based on a held-out corpus of 2 documents with 4 words
gensim.models.ldamodel: INFO: PROGRESS: pass 0, at document #2/2
gensim.models.ldamodel: DEBUG: performing inference on a chunk of 2 documents
gensim.models.ldamodel: DEBUG: 0/2 documents converged within 1 iterations
gensim.models.ldamodel: DEBUG: updating topics
gensim.models.ldamodel: INFO: topic #0 (1.000): 0.250*"test" + 0.250*"corpus" + 0.250*"for" + 0.250*"testing"
gensim.models.ldamodel: INFO: topic diff=0.283778, rho=1.000000
--------------------- >> end captured logging << ---------------------
======================================================================
FAIL: Doctest: dariah_topics.postprocessing._save_matrix_market
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._save_matrix_market
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 107, in _save_matrix_market

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 132, in dariah_topics.postprocessing._save_matrix_market
Failed example:
    document_term_matrix = preprocessing.create_document_term_matrix(tokenized_corpus, document_labels, large_corpus=True)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        document_term_matrix = preprocessing.create_document_term_matrix(tokenized_corpus, document_labels, large_corpus=True)
    AttributeError: module 'dariah_topics.preprocessing' has no attribute 'create_document_term_matrix'
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 133, in dariah_topics.postprocessing._save_matrix_market
Failed example:
    _save_matrix_market(document_term_matrix, path)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        _save_matrix_market(document_term_matrix, path)
    NameError: name 'document_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 134, in dariah_topics.postprocessing._save_matrix_market
Failed example:
    with open(os.path.join(path, 'document_term_matrix.mm'), 'r', encoding='utf-8') as file:
        print(file.read()) #doctest: +NORMALIZE_WHITESPACE
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        with open(os.path.join(path, 'document_term_matrix.mm'), 'r', encoding='utf-8') as
file: FileNotFoundError: [Errno 2] No such file or directory: 'tmp/document_term_matrix.mm' ====================================================================== FAIL: Doctest: dariah_topics.postprocessing._show_gensim_doc_topics ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python3.5/doctest.py", line 2190, in runTest raise self.failureException(self.format_failure(new.getvalue())) AssertionError: Failed doctest test for dariah_topics.postprocessing._show_gensim_doc_topics File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 506, in _show_gensim_doc_topics ---------------------------------------------------------------------- File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 531, in dariah_topics.postprocessing._show_gensim_doc_topics Failed example: doc_topic = visualization.create_doc_topic(corpus, model, doc_labels) Exception raised: Traceback (most recent call last): File "/usr/lib/python3.5/doctest.py", line 1321, in __run compileflags, 1), test.globs) File "", line 1, in doc_topic = visualization.create_doc_topic(corpus, model, doc_labels) NameError: name 'visualization' is not defined ---------------------------------------------------------------------- File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 532, in dariah_topics.postprocessing._show_gensim_doc_topics Failed example: len(doc_topic.T) == 2 Exception raised: Traceback (most recent call last): File "/usr/lib/python3.5/doctest.py", line 1321, in __run compileflags, 1), test.globs) File "", line 1, in len(doc_topic.T) == 2 NameError: name 'doc_topic' is not defined ---------------------------------------------------------------------- File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 533, in dariah_topics.postprocessing._show_gensim_doc_topics Failed example: True Expected nothing 
Got:
    True
-------------------- >> begin captured logging << --------------------
gensim.corpora.mmcorpus: INFO: storing corpus in Matrix Market format to /tmp/corpus.mm
gensim.matutils: INFO: saving sparse matrix to /tmp/corpus.mm
gensim.matutils: INFO: PROGRESS: saving document #0
gensim.matutils: INFO: saved 2x2 matrix, density=25.000% (1/4)
gensim.matutils: DEBUG: closing /tmp/corpus.mm
gensim.matutils: DEBUG: closing /tmp/corpus.mm
gensim.corpora.indexedcorpus: INFO: saving MmCorpus index to /tmp/corpus.mm.index
gensim.corpora.indexedcorpus: INFO: loaded corpus index from /tmp/corpus.mm.index
gensim.matutils: INFO: initializing corpus reader from /tmp/corpus.mm
gensim.matutils: INFO: accepted corpus with 2 documents, 2 features, 1 non-zero entries
gensim.models.ldamodel: INFO: using symmetric alpha at 1.0
gensim.models.ldamodel: INFO: using symmetric eta at 0.5
gensim.models.ldamodel: INFO: using serial LDA version on this node
gensim.models.ldamodel: INFO: running online (single-pass) LDA training, 1 topics, 1 passes over the supplied corpus of 2 documents, updating model once every 2 documents, evaluating perplexity every 2 documents, iterating 50x with a convergence threshold of 0.001000
gensim.models.ldamodel: WARNING: too few updates, training might not converge; consider increasing the number of passes or iterations to improve accuracy
gensim.models.ldamodel: DEBUG: bound: at document #0
gensim.models.ldamodel: INFO: -1.151 per-word bound, 2.2 perplexity estimate based on a held-out corpus of 2 documents with 0 words
gensim.models.ldamodel: INFO: PROGRESS: pass 0, at document #2/2
gensim.models.ldamodel: DEBUG: performing inference on a chunk of 2 documents
gensim.models.ldamodel: DEBUG: 2/2 documents converged within 50 iterations
gensim.models.ldamodel: DEBUG: updating topics
gensim.models.ldamodel: INFO: topic #0 (1.000): 0.600*"corpus" + 0.400*"test"
gensim.models.ldamodel: INFO: topic diff=0.455046, rho=1.000000
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.postprocessing._show_gensim_topics
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._show_gensim_topics
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 329, in _show_gensim_topics

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 352, in dariah_topics.postprocessing._show_gensim_topics
Failed example:
    isinstance(gensim2dataframe(model, 4), pd.DataFrame)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        isinstance(gensim2dataframe(model, 4), pd.DataFrame)
    NameError: name 'gensim2dataframe' is not defined
-------------------- >> begin captured logging << --------------------
gensim.corpora.dictionary: INFO: adding document #0 to Dictionary(0 unique tokens: [])
gensim.corpora.dictionary: INFO: built Dictionary(4 unique tokens: ['for', 'test', 'corpus', 'testing']) from 2 documents (total 4 corpus positions)
gensim.models.ldamodel: INFO: using symmetric alpha at 1.0
gensim.models.ldamodel: INFO: using symmetric eta at 0.25
gensim.models.ldamodel: INFO: using serial LDA version on this node
gensim.models.ldamodel: INFO: running online (single-pass) LDA training, 1 topics, 1 passes over the supplied corpus of 2 documents, updating model once every 2 documents, evaluating perplexity every 2 documents, iterating 1x with a convergence threshold of 0.001000
gensim.models.ldamodel: WARNING: too few updates, training might not converge; consider increasing the number of passes or iterations to improve accuracy
gensim.models.ldamodel: DEBUG: bound: at document #0
gensim.models.ldamodel: INFO: -1.684 per-word bound, 3.2 perplexity estimate based on a held-out corpus of 2 documents with 4 words
gensim.models.ldamodel: INFO: PROGRESS: pass 0, at document #2/2
gensim.models.ldamodel: DEBUG: performing inference on a chunk of 2 documents
gensim.models.ldamodel: DEBUG: 0/2 documents converged within 1 iterations
gensim.models.ldamodel: DEBUG: updating topics
gensim.models.ldamodel: INFO: topic #0 (1.000): 0.250*"test" + 0.250*"corpus" + 0.250*"for" + 0.250*"testing"
gensim.models.ldamodel: INFO: topic diff=0.223775, rho=1.000000
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.postprocessing._show_lda_doc_topics
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._show_lda_doc_topics
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 402, in _show_lda_doc_topics

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 423, in dariah_topics.postprocessing._show_lda_doc_topics
Failed example:
    doc_term_matrix = create_doc_term_matrix(corpus, ['doc1', 'doc2'])
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        doc_term_matrix = create_doc_term_matrix(corpus, ['doc1', 'doc2'])
    NameError: name 'create_doc_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 424, in dariah_topics.postprocessing._show_lda_doc_topics
Failed example:
    vocab = doc_term_matrix.columns
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        vocab = doc_term_matrix.columns
    NameError: name 'doc_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 426, in dariah_topics.postprocessing._show_lda_doc_topics
Failed example:
    model.fit(doc_term_matrix.as_matrix().astype(int))
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        model.fit(doc_term_matrix.as_matrix().astype(int))
    NameError: name 'doc_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 427, in dariah_topics.postprocessing._show_lda_doc_topics
Failed example:
    topics = lda2dataframe(model, vocab)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        topics = lda2dataframe(model, vocab)
    NameError: name 'lda2dataframe' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 428, in dariah_topics.postprocessing._show_lda_doc_topics
Failed example:
    doc_topic = lda_doc_topic(model, vocab, ['doc1', 'doc2'])
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        doc_topic = lda_doc_topic(model, vocab, ['doc1', 'doc2'])
    NameError: name 'lda_doc_topic' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 429, in dariah_topics.postprocessing._show_lda_doc_topics
Failed example:
    len(doc_topic.T) == 2
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        len(doc_topic.T) == 2
    NameError: name 'doc_topic' is not defined

======================================================================
FAIL: Doctest: dariah_topics.postprocessing._show_lda_topics
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._show_lda_topics
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 290, in _show_lda_topics

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 310, in dariah_topics.postprocessing._show_lda_topics
Failed example:
    doc_term_matrix = create_doc_term_matrix(corpus, ['doc1', 'doc2'])
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        doc_term_matrix = create_doc_term_matrix(corpus, ['doc1', 'doc2'])
    NameError: name 'create_doc_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 311, in dariah_topics.postprocessing._show_lda_topics
Failed example:
    vocab = doc_term_matrix.columns
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        vocab = doc_term_matrix.columns
    NameError: name 'doc_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 313, in dariah_topics.postprocessing._show_lda_topics
Failed example:
    model.fit(doc_term_matrix.as_matrix().astype(int))
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        model.fit(doc_term_matrix.as_matrix().astype(int))
    NameError: name 'doc_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 314, in dariah_topics.postprocessing._show_lda_topics
Failed example:
    df = lda2dataframe(model, vocab, num_keys=1)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        df = lda2dataframe(model, vocab, num_keys=1)
    NameError: name 'lda2dataframe' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 315, in dariah_topics.postprocessing._show_lda_topics
Failed example:
    len(df) == 1
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        len(df) == 1
    NameError: name 'df' is not defined

======================================================================
FAIL: Doctest: dariah_topics.postprocessing._show_mallet_doc_topics
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._show_mallet_doc_topics
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 452, in _show_mallet_doc_topics

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 463, in dariah_topics.postprocessing._show_mallet_doc_topics
Failed example:
    df = show_doc_topic_matrix(outfolder)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        df = show_doc_topic_matrix(outfolder)
    NameError: name 'show_doc_topic_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 464, in dariah_topics.postprocessing._show_mallet_doc_topics
Failed example:
    len(df.T)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        len(df.T)
    NameError: name 'df' is not defined

======================================================================
FAIL: Doctest: dariah_topics.postprocessing._show_mallet_topics
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing._show_mallet_topics
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 363, in _show_mallet_topics

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 377, in dariah_topics.postprocessing._show_mallet_topics
Failed example:
    df = show_topics_keys(outfolder, num_topics=10)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        df = show_topics_keys(outfolder, num_topics=10)
    NameError: name 'show_topics_keys' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 378, in dariah_topics.postprocessing._show_mallet_topics
Failed example:
    len(df)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        len(df)
    NameError: name 'df' is not defined

======================================================================
FAIL: Doctest: dariah_topics.postprocessing.save_document_term_matrix
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing.save_document_term_matrix
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 40, in save_document_term_matrix

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 81, in dariah_topics.postprocessing.save_document_term_matrix
Failed example:
    document_term_matrix = preprocessing.create_document_term_matrix(tokenized_corpus, document_labels)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        document_term_matrix = preprocessing.create_document_term_matrix(tokenized_corpus, document_labels)
    AttributeError: module 'dariah_topics.preprocessing' has no attribute 'create_document_term_matrix'
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 82, in dariah_topics.postprocessing.save_document_term_matrix
Failed example:
    save_document_term_matrix(document_term_matrix, path)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        save_document_term_matrix(document_term_matrix, path)
    NameError: name 'document_term_matrix' is not defined
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 83, in dariah_topics.postprocessing.save_document_term_matrix
Failed example:
    preprocessing.read_document_term_matrix(os.path.join(path, 'document_term_matrix.csv')) #doctest +NORMALIZE_WHITESPACE
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        preprocessing.read_document_term_matrix(os.path.join(path, 'document_term_matrix.csv')) #doctest +NORMALIZE_WHITESPACE
    AttributeError: module 'dariah_topics.preprocessing' has no attribute 'read_document_term_matrix'

======================================================================
FAIL: Doctest: dariah_topics.postprocessing.save_model
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing.save_model
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 195, in save_model
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 215, in dariah_topics.postprocessing.save_model
Failed example:
    with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
        print(file.read())
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
    FileNotFoundError: [Errno 2] No such file or directory: 'tmp/*.txt'
-------------------- >> begin captured logging << --------------------
dariah_topics.postprocessing: INFO: Saving tokenized corpus to tmp ...
dariah_topics.postprocessing: INFO: Creating directory tmp ...
dariah_topics.postprocessing: DEBUG: Current file: document_label
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.postprocessing.save_tokenized_corpus
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing.save_tokenized_corpus
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 156, in save_tokenized_corpus

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 179, in dariah_topics.postprocessing.save_tokenized_corpus
Failed example:
    with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
        print(file.read())
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
    FileNotFoundError: [Errno 2] No such file or directory: 'tmp/*.txt'
-------------------- >> begin captured logging << --------------------
dariah_topics.postprocessing: INFO: Saving tokenized corpus to tmp ...
dariah_topics.postprocessing: DEBUG: Current file: document_label
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.postprocessing.show_topics
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.postprocessing.show_topics
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 222, in show_topics

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 242, in dariah_topics.postprocessing.show_topics
Failed example:
    with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
        print(file.read())
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
    FileNotFoundError: [Errno 2] No such file or directory: 'tmp/*.txt'
-------------------- >> begin captured logging << --------------------
dariah_topics.postprocessing: INFO: Saving tokenized corpus to tmp ...
dariah_topics.postprocessing: DEBUG: Current file: document_label
--------------------- >> end captured logging << ---------------------

Name                               Stmts   Miss  Cover
-----------------------------------------------------
dariah_topics/__init__.py              0      0   100%
dariah_topics/doclist.py              93      8    91%
dariah_topics/postprocessing.py      149    116    22%
dariah_topics/preprocessing.py       223     17    92%
dariah_topics/visualization.py       176     98    44%
-----------------------------------------------------
TOTAL                                641    239    63%
----------------------------------------------------------------------
Ran 35 tests in 92.992s

FAILED (failures=12)

Build step 'Virtualenv Builder' marked build as failure
Recording test results
[Cobertura] Publishing Cobertura coverage report...
Publishing Cobertura coverage results...
Cobertura coverage report found.
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@6591ff22[description=A python library for topic modeling.,homepage=,name=Topics,license=,fork=true,size=186599,milestones={},language=Jupyter Notebook,commits={},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:ef0e868) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/ef0e8680a784a1c74f8b4c164eef6709ace2ecb2
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done.
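Editor's note: most of the twelve failures above are NameErrors. The doctest examples in dariah_topics.postprocessing still call helpers such as create_sparse_bow, gensim2dataframe, and show_topics_keys that are not (or no longer) defined in the module namespace doctest executes them in. A minimal sketch of the mechanism, with made-up names and data that are not from the project:

```python
import doctest

# Hypothetical doctest source mirroring the failing pattern in the log:
# the example calls a helper that is not defined in the test namespace.
source = """
>>> sparse_bow = create_sparse_bow(['doc1'], [['token']])
"""

parser = doctest.DocTestParser()
runner = doctest.DocTestRunner(verbose=False)

# With empty globals the example raises NameError and counts as a failure,
# exactly like the doctest reports in this build.
test_missing = parser.get_doctest(source, {}, 'missing_name', None, 0)
failures_missing = runner.run(test_missing, out=lambda s: None).failed


# Providing the helper under the name the example uses makes it pass.
def create_sparse_bow(labels, tokens):  # stand-in, not the project's function
    return dict(zip(labels, tokens))


test_fixed = parser.get_doctest(source, {'create_sparse_bow': create_sparse_bow},
                                'fixed_name', None, 0)
failures_fixed = runner.run(test_fixed, out=lambda s: None).failed
```

Here failures_missing is 1 and failures_fixed is 0; either importing the helpers into the docstring examples or updating the examples to the current function names would clear these failures.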
0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: FAILURE
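Editor's note: the remaining failures share a second pattern. The examples in save_model, save_tokenized_corpus, and show_topics call open(os.path.join(path, '*.txt')), but open() does not expand wildcards, so Python looks for a file literally named *.txt and raises FileNotFoundError. The pattern must be expanded with glob.glob first. A small self-contained sketch (file name and contents are invented for illustration):

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as path:
    # Create one file the wildcard is meant to match.
    with open(os.path.join(path, 'document_label.txt'), 'w', encoding='utf-8') as f:
        f.write('this is a tokenized corpus')

    # open() treats '*' literally and fails, as in the build log above.
    try:
        open(os.path.join(path, '*.txt'), 'r', encoding='utf-8')
        literal_open_failed = False
    except FileNotFoundError:
        literal_open_failed = True

    # glob.glob expands the pattern to real paths before opening.
    contents = []
    for match in glob.glob(os.path.join(path, '*.txt')):
        with open(match, 'r', encoding='utf-8') as f:
            contents.append(f.read())
```

Rewriting the doctest examples to iterate over glob.glob(...) results (or to open a concrete file name) would remove these three failures.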