Started by GitHub push by severinsimmler
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 6ec37f39282d8aac7f264429e050520eedc05295 (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6ec37f39282d8aac7f264429e050520eedc05295
Commit message: "Fix bug"
 > git rev-list 02324ace1ee238dbeeff63b2d381f70514f32430 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda7686282758526010855.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
  Using cached pytest-3.3.2-py2.py3-none-any.whl
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
  Using cached pytest_cov-2.5.1-py2.py3-none-any.whl
Collecting pytest-nbsmoke (from -r requirements-dev.txt (line 6))
  Downloading pytest_nbsmoke-0.1.6-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
  Using cached Sphinx-1.6.6-py2.py3-none-any.whl
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
  Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-build-l4f1u1jn/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
  Using cached recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pyinstaller (from -r requirements-dev.txt (line 15))
Collecting pandas>=0.19.2 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pandas-0.22.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached gensim-3.2.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lda>=1.0.5 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached numpy-1.14.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lxml>=3.6.4 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached lxml-4.1.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting flask>=0.11.1 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached Flask-0.12.2-py2.py3-none-any.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached matplotlib-2.1.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting bokeh>=0.12.6 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting wordcloud>=1.3.1 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->-r requirements-dev.txt (line 4))
Collecting attrs>=17.2.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached attrs-17.4.0-py2.py3-none-any.whl
Collecting pluggy<0.7,>=0.5 (from pytest->-r requirements-dev.txt (line 4))
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached six-1.11.0-py2.py3-none-any.whl
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached py-1.5.2-py2.py3-none-any.whl
Collecting coverage>=3.7.1 (from pytest-cov->-r requirements-dev.txt (line 5))
  Using cached coverage-4.4.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting jupyter-client (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_client-5.2.1-py2.py3-none-any.whl
Collecting ipykernel (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipykernel-4.8.0-py3-none-any.whl
Collecting nbconvert (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbconvert-5.3.1-py2.py3-none-any.whl
Collecting pyflakes (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pyflakes-1.6.0-py2.py3-none-any.whl
Collecting nbformat (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbformat-4.4.0-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
  Using cached jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
  Using cached qtconsole-4.3.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
  Using cached notebook-5.3.1-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
  Using cached ipywidgets-7.1.0-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached docutils-0.14-py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Jinja2-2.10-py2.py3-none-any.whl
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Babel-2.5.3-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
  Using cached imagesize-0.7.1-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached alabaster-0.7.10-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
  Using cached sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
Collecting macholib>=1.8 (from pyinstaller->-r requirements-dev.txt (line 15))
  Using cached macholib-1.9-py2.py3-none-any.whl
Collecting pefile>=2017.8.1 (from pyinstaller->-r requirements-dev.txt (line 15))
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pytz-2017.3-py2.py3-none-any.whl
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached scipy-1.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting Werkzeug>=0.7 (from flask>=0.11.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached Werkzeug-0.14.1-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting pillow (from wordcloud>=1.3.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached Pillow-5.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting jupyter-core (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_core-4.4.0-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pyzmq-16.0.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting ipython>=4.0.0 (from ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython-6.2.1-py3-none-any.whl
Collecting mistune>=0.7.4 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached mistune-0.8.3-py2.py3-none-any.whl
Collecting testpath (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
Collecting entrypoints>=0.2.2 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting bleach (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached bleach-2.1.2-py2.py3-none-any.whl
Collecting ipython-genutils (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached prompt_toolkit-1.0.15-py3-none-any.whl
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached terminado-0.8.1-py2.py3-none-any.whl
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached Send2Trash-1.4.2-py3-none-any.whl
Collecting widgetsnbextension~=3.1.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
  Using cached widgetsnbextension-3.1.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached idna-2.6-py2.py3-none-any.whl
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
  Using cached decorator-4.2.1-py2.py3-none-any.whl
Collecting altgraph>=0.15 (from macholib>=1.8->pyinstaller->-r requirements-dev.txt (line 15))
  Using cached altgraph-0.15-py2.py3-none-any.whl
Collecting future (from pefile>=2017.8.1->pyinstaller->-r requirements-dev.txt (line 15))
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached boto-2.48.0-py2.py3-none-any.whl
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached boto3-1.5.18-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pexpect-4.3.1-py2.py3-none-any.whl
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jedi-0.11.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached html5lib-1.0.1-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting botocore<1.9.0,>=1.8.32 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached botocore-1.8.32-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached s3transfer-0.1.12-py2.py3-none-any.whl
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached jmespath-0.9.3-py2.py3-none-any.whl
Collecting parso==0.1.1 (from jedi>=0.10->ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached parso-0.1.1-py2.py3-none-any.whl
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Installing collected packages: attrs, pluggy, six, py, pytest, coverage, pytest-cov, ipython-genutils, decorator, traitlets, jupyter-core, pyzmq, python-dateutil, jupyter-client, wcwidth, prompt-toolkit, simplegeneric, ptyprocess, pexpect, parso, jedi, pickleshare, Pygments, ipython, tornado, ipykernel, mistune, testpath, jsonschema, nbformat, pandocfilters, entrypoints, webencodings, html5lib, bleach, MarkupSafe, Jinja2, nbconvert, pyflakes, pytest-nbsmoke, jupyter-console, qtconsole, terminado, Send2Trash, notebook, widgetsnbextension, ipywidgets, jupyter, docutils, urllib3, chardet, certifi, idna, requests, pytz, babel, imagesize, alabaster, sphinxcontrib-websupport, snowballstemmer, sphinx, nbsphinx, commonmark, recommonmark, altgraph, macholib, future, pefile, pyinstaller, numpy, pandas, regex, scipy, bz2file, boto, jmespath, botocore, s3transfer, boto3, smart-open, gensim, pbr, lda, lxml, Werkzeug, itsdangerous, click, flask, pyparsing, cycler, matplotlib, PyYAML, bokeh, pillow, wordcloud, dariah-topics
  Running setup.py install for nbsphinx: started
    Running setup.py install for nbsphinx: finished with status 'done'
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Send2Trash-1.4.2 Werkzeug-0.14.1 alabaster-0.7.10 altgraph-0.15 attrs-17.4.0 babel-2.5.3 bleach-2.1.2 bokeh-0.12.13 boto-2.48.0 boto3-1.5.18 botocore-1.8.32 bz2file-0.98 certifi-2018.1.18 chardet-3.0.4 click-6.7 commonmark-0.5.4 coverage-4.4.2 cycler-0.10.0 dariah-topics decorator-4.2.1 docutils-0.14 entrypoints-0.2.3 flask-0.12.2 future-0.16.0 gensim-3.2.0 html5lib-1.0.1 idna-2.6 imagesize-0.7.1 ipykernel-4.8.0 ipython-6.2.1 ipython-genutils-0.2.0 ipywidgets-7.1.0 itsdangerous-0.24 jedi-0.11.1 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.1 jupyter-console-5.2.0 jupyter-core-4.4.0 lda-1.0.5 lxml-4.1.1 macholib-1.9 matplotlib-2.1.1 mistune-0.8.3 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.3.1 notebook-5.3.1 numpy-1.14.0 pandas-0.22.0 pandocfilters-1.4.2 parso-0.1.1 pbr-3.1.1 pefile-2017.11.5 pexpect-4.3.1 pickleshare-0.7.4 pillow-5.0.0 pluggy-0.6.0 prompt-toolkit-1.0.15 ptyprocess-0.5.2 py-1.5.2 pyflakes-1.6.0 pyinstaller-3.3.1 pyparsing-2.2.0 pytest-3.3.2 pytest-cov-2.5.1 pytest-nbsmoke-0.1.6 python-dateutil-2.6.1 pytz-2017.3 pyzmq-16.0.3 qtconsole-4.3.1 recommonmark-0.4.0 regex-2018.1.10 requests-2.18.4 s3transfer-0.1.12 scipy-1.0.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.6
snowballstemmer-1.2.1 sphinx-1.6.6 sphinxcontrib-websupport-1.0.1 terminado-0.8.1 testpath-0.3.1 tornado-4.5.3 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.1.0 wordcloud-1.3.1
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing requirements to dariah_topics.egg-info/requires.txt
writing dariah_topics.egg-info/PKG-INFO
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.4.0.dev1
creating dariah_topics-0.4.0.dev1/dariah_topics
creating dariah_topics-0.4.0.dev1/dariah_topics.egg-info
creating dariah_topics-0.4.0.dev1/test
copying files to dariah_topics-0.4.0.dev1...
copying README.md -> dariah_topics-0.4.0.dev1
copying setup.cfg -> dariah_topics-0.4.0.dev1
copying setup.py -> dariah_topics-0.4.0.dev1
copying dariah_topics/__init__.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/doc_meta_list.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.4.0.dev1/test
Writing dariah_topics-0.4.0.dev1/setup.cfg
Creating tar archive
removing 'dariah_topics-0.4.0.dev1' (and everything under it)
running bdist_wheel
running build
running build_py
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doc_meta_list.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.4.0.dev1-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.4.0.dev1.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-0.4.0.dev1-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/doc_meta_list.py'
adding 'dariah_topics/doclist.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/mallet.py'
adding 'dariah_topics/meta.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-0.4.0.dev1.dist-info/DESCRIPTION.rst'
adding 'dariah_topics-0.4.0.dev1.dist-info/metadata.json'
adding 'dariah_topics-0.4.0.dev1.dist-info/top_level.txt'
adding 'dariah_topics-0.4.0.dev1.dist-info/WHEEL'
adding 'dariah_topics-0.4.0.dev1.dist-info/METADATA'
adding 'dariah_topics-0.4.0.dev1.dist-info/RECORD'
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.5.3, pytest-3.3.2, py-1.5.2, pluggy-0.6.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: nbsmoke-0.1.6, cov-2.5.1
collected 61 items

IntroducingGensim.ipynb . [ 1%]
IntroducingLda.ipynb . [ 3%]
IntroducingMallet.ipynb . [ 4%]
Visualizations.ipynb . [ 6%]
dariah_topics/mallet.py ...... [ 16%]
dariah_topics/postprocessing.py ........... [ 34%]
dariah_topics/preprocessing.py ........................... [ 78%]
dariah_topics/visualization.py .. [ 81%]
test/demonstrator_test.py .. [ 85%]
test/test_fuzzy_segmenting.py ......... [100%]

--- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ----

----------- coverage: platform linux, python 3.5.3-final-0 -----------
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml

=============================== warnings summary ===============================
dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._create_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._hapax_legomena_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._remove_features_from_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._stopwords_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.create_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_hapax_legomena
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_stopwords
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.remove_features
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
-- Docs: http://doc.pytest.org/en/latest/warnings.html
=================== 61 passed, 10 warnings in 177.16 seconds ===================
+ ./setup.py build_sphinx -a
running build_sphinx
Running Sphinx v1.6.6
loading pickled environment... done
[autosummary] generating autosummary for: IntroducingGensim.ipynb, IntroducingLda.ipynb, IntroducingMallet.ipynb, README.md, Visualizations.ipynb, demonstrator/README.md, docs/CONTRIBUTING.md, docs/gen/dariah_topics.doclist.rst, docs/gen/dariah_topics.evaluation.rst, docs/gen/dariah_topics.mallet.rst, docs/gen/dariah_topics.meta.rst, docs/gen/dariah_topics.preprocessing.rst, docs/gen/dariah_topics.rst, docs/gen/dariah_topics.visualization.rst, docs/gen/modules.rst, index.rst
building [mo]: all of 0 po files
building [html]: all source files
updating environment: 0 added, 3 changed, 0 removed
reading sources... [ 33%] Visualizations
reading sources... [ 66%] demonstrator/README
reading sources... [100%] docs/gen/dariah_topics.visualization
looking for now-outdated files... none found
pickling environment... done
checking consistency... /mnt/data/jenkins/workspace/DARIAH-Topics/Visualizations.ipynb: WARNING: document isn't included in any toctree
done
preparing documents... done
writing output... [ 6%] IntroducingGensim
writing output... [ 12%] IntroducingLda
writing output... [ 18%] IntroducingMallet
writing output... [ 25%] README
writing output... [ 31%] Visualizations
writing output... [ 37%] demonstrator/README
writing output... [ 43%] docs/CONTRIBUTING
writing output... [ 50%] docs/gen/dariah_topics
writing output... [ 56%] docs/gen/dariah_topics.doclist
writing output... [ 62%] docs/gen/dariah_topics.evaluation
writing output... [ 68%] docs/gen/dariah_topics.mallet
writing output... [ 75%] docs/gen/dariah_topics.meta
writing output... [ 81%] docs/gen/dariah_topics.preprocessing
writing output... [ 87%] docs/gen/dariah_topics.visualization
writing output... [ 93%] docs/gen/modules
writing output... [100%] index
/mnt/data/jenkins/workspace/DARIAH-Topics/docs/CONTRIBUTING.md: WARNING: Could not lex literal_block as "json". Highlighting skipped.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.full_path:: WARNING: 'any' reference target not found: Path /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.full_path:10: WARNING: 'any' reference target not found: Path() /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.get_docs:6: WARNING: 'any' reference target not found: _get_item(self, index) /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.full_path:: WARNING: 'any' reference target not found: Path /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.full_path:10: WARNING: 'any' reference target not found: Path() /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.with_segment_files:5: WARNING: 'any' reference target not found: strings.Formatter /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/evaluation.py:docstring of dariah_topics.evaluation.read_dictionary:4: WARNING: 'any' reference target not found: create_dictionary() /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/evaluation.py:docstring of dariah_topics.evaluation.read_sparse_bow:4: WARNING: 'any' reference target not found: create_sparse_bow() /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.import_tokenized_corpus:: WARNING: 'any' reference target not found: tokenized_document /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.import_tokenized_corpus:: WARNING: 'any' reference target not found: tokenized_corpus /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.import_tokenized_corpus:: 
WARNING: 'any' reference target not found: replacements_files /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:3: WARNING: 'any' reference target not found: documents /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:7: WARNING: more than one target found for 'any' cross-reference 'segments': could be :py:meth: or :py:meth: /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:7: WARNING: 'any' reference target not found: segments=True /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:3: WARNING: 'any' reference target not found: documents /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:7: WARNING: more than one target found for 'any' cross-reference 'segments': could be :py:meth: or :py:meth: /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:7: WARNING: 'any' reference target not found: segments=True /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: (?<name>pattern) /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: _ /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: author /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:: WARNING: 'any' reference target not found: title generating indices... genindex py-modindex highlighting module code... 
highlighting module code... [ 16%] dariah_topics.preprocessing
highlighting module code... [ 33%] dariah_topics.visualization
highlighting module code... [ 50%] dariah_topics.mallet
highlighting module code... [ 66%] dariah_topics.meta
highlighting module code... [ 83%] dariah_topics.doclist
highlighting module code... [100%] dariah_topics.evaluation
writing additional pages... search
copying images... [ 11%] build/sphinx/doctrees/nbsphinx/IntroducingMallet_60_0.png
copying images... [ 22%] build/sphinx/doctrees/nbsphinx/Visualizations_27_0.png
copying images... [ 33%] build/sphinx/doctrees/nbsphinx/Visualizations_20_0.png
copying images... [ 44%] build/sphinx/doctrees/nbsphinx/IntroducingGensim_51_0.png
copying images... [ 55%] build/sphinx/doctrees/nbsphinx/IntroducingLda_58_0.png
copying images... [ 66%] build/sphinx/doctrees/nbsphinx/Visualizations_29_0.png
copying images... [ 77%] build/sphinx/doctrees/nbsphinx/Visualizations_18_0.png
copying images... [ 88%] demonstrator/screenshot.png
copying images... [100%] build/sphinx/doctrees/nbsphinx/Visualizations_25_0.png
copying static files... WARNING: html_static_path entry '/mnt/data/jenkins/workspace/DARIAH-Topics/docs/_static' does not exist
done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory... done
build succeeded, 24 warnings.
[DocLinks] Copying Documentation to 1 ...
Recording test results
[Cobertura] Publishing Cobertura coverage report...
[Cobertura] Publishing Cobertura coverage results...
[Cobertura] Cobertura coverage report found.
[Set GitHub commit status (universal)] SUCCESS on repos [GHRepository@5241268a[description=A Python library for Topic Modeling.,homepage=,name=Topics,license=<null>,fork=true,size=38362,milestones={},language=Jupyter Notebook,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Fri, 19 Jan 2018 02:13:22 GMT], ETag=[W/"0ae81deff6aa38e279d3b06e3db905c4"], Last-Modified=[Mon, 18 Dec 2017 12:03:31 GMT], OkHttp-Received-Millis=[1516328002341], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1516328002151], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[86CE:54F0:17558E9:2A5AA9E:5A615441], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4999], X-RateLimit-Reset=[1516331602], X-Runtime-rack=[0.084298], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:6ec37f3) with context: DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/6ec37f39282d8aac7f264429e050520eedc05295
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] scored: 1.0
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: SUCCESS