Started by GitHub push by severinsimmler
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 4aaf5544d729e70a1b9cb4fc4e79bd68ecd3b5d7 (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4aaf5544d729e70a1b9cb4fc4e79bd68ecd3b5d7
Commit message: "Add link to help page"
 > git rev-list 4242df4cd6b230a5c1b0e8fd28d475bb464c15a2 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda2397263892146078136.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
  Using cached pytest-3.3.2-py2.py3-none-any.whl
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
  Using cached pytest_cov-2.5.1-py2.py3-none-any.whl
Collecting pytest-nbsmoke (from -r requirements-dev.txt (line 6))
  Using cached pytest_nbsmoke-0.1.6-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
  Using cached Sphinx-1.6.6-py2.py3-none-any.whl
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
  Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-build-q2ved019/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
  Using cached recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pyinstaller (from -r requirements-dev.txt (line 15))
Collecting pandas>=0.19.2 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pandas-0.22.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached gensim-3.2.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lda>=1.0.5 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached numpy-1.14.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lxml>=3.6.4 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached lxml-4.1.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting flask>=0.11.1 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached Flask-0.12.2-py2.py3-none-any.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached matplotlib-2.1.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting bokeh>=0.12.6 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting wordcloud>=1.3.1 (from dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting pluggy<0.7,>=0.5 (from pytest->-r requirements-dev.txt (line 4))
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached py-1.5.2-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->-r requirements-dev.txt (line 4))
Collecting attrs>=17.2.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached attrs-17.4.0-py2.py3-none-any.whl
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached six-1.11.0-py2.py3-none-any.whl
Collecting coverage>=3.7.1 (from pytest-cov->-r requirements-dev.txt (line 5))
  Using cached coverage-4.4.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting jupyter-client (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_client-5.2.1-py2.py3-none-any.whl
Collecting nbconvert (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbconvert-5.3.1-py2.py3-none-any.whl
Collecting ipykernel (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipykernel-4.8.0-py3-none-any.whl
Collecting pyflakes (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pyflakes-1.6.0-py2.py3-none-any.whl
Collecting nbformat (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbformat-4.4.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
  Using cached qtconsole-4.3.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
  Using cached notebook-5.3.1-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
  Using cached ipywidgets-7.1.0-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
  Using cached jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Babel-2.5.3-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
  Using cached sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached docutils-0.14-py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
  Using cached imagesize-0.7.1-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached alabaster-0.7.10-py2.py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Jinja2-2.10-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
Collecting pefile>=2017.8.1 (from pyinstaller->-r requirements-dev.txt (line 15))
Collecting macholib>=1.8 (from pyinstaller->-r requirements-dev.txt (line 15))
  Using cached macholib-1.9-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pytz-2017.3-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached scipy-1.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting Werkzeug>=0.7 (from flask>=0.11.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached Werkzeug-0.14.1-py2.py3-none-any.whl
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting pillow (from wordcloud>=1.3.1->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached Pillow-5.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting pyzmq>=13 (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pyzmq-16.0.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting jupyter-core (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_core-4.4.0-py2.py3-none-any.whl
Collecting mistune>=0.7.4 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached mistune-0.8.3-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
Collecting bleach (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached bleach-2.1.2-py2.py3-none-any.whl
Collecting testpath (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting ipython>=4.0.0 (from ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython-6.2.1-py3-none-any.whl
Collecting ipython-genutils (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached terminado-0.8.1-py2.py3-none-any.whl
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached Send2Trash-1.4.2-py3-none-any.whl
Collecting widgetsnbextension~=3.1.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
  Using cached widgetsnbextension-3.1.0-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached prompt_toolkit-1.0.15-py3-none-any.whl
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached idna-2.6-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached certifi-2018.1.18-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
  Using cached decorator-4.2.1-py2.py3-none-any.whl
Collecting future (from pefile>=2017.8.1->pyinstaller->-r requirements-dev.txt (line 15))
Collecting altgraph>=0.15 (from macholib>=1.8->pyinstaller->-r requirements-dev.txt (line 15))
  Using cached altgraph-0.15-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached boto-2.48.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached boto3-1.5.18-py2.py3-none-any.whl
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached html5lib-1.0.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pexpect-4.3.1-py2.py3-none-any.whl
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jedi-0.11.1-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached jmespath-0.9.3-py2.py3-none-any.whl
Collecting botocore<1.9.0,>=1.8.32 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached botocore-1.8.32-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.4.0.dev1->-r requirements.txt (line 1))
  Using cached s3transfer-0.1.12-py2.py3-none-any.whl
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Collecting parso==0.1.1 (from jedi>=0.10->ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached parso-0.1.1-py2.py3-none-any.whl
Installing collected packages: pluggy, py, attrs, six, pytest, coverage, pytest-cov, python-dateutil, ipython-genutils, decorator, traitlets, pyzmq, jupyter-core, jupyter-client, Pygments, jsonschema, nbformat, mistune, pandocfilters, webencodings, html5lib, bleach, testpath, MarkupSafe, Jinja2, entrypoints, nbconvert, tornado, wcwidth, prompt-toolkit, pickleshare, ptyprocess, pexpect, parso, jedi, simplegeneric, ipython, ipykernel, pyflakes, pytest-nbsmoke, qtconsole, terminado, Send2Trash, notebook, widgetsnbextension, ipywidgets, jupyter-console, jupyter, pytz, babel, sphinxcontrib-websupport, docutils, imagesize, urllib3, idna, certifi, chardet, requests, snowballstemmer, alabaster, sphinx, nbsphinx, commonmark, recommonmark, future, pefile, altgraph, macholib, pyinstaller, numpy, pandas, regex, boto, bz2file, jmespath, botocore, s3transfer, boto3, smart-open, scipy, gensim, pbr, lda, lxml, Werkzeug, click, itsdangerous, flask, cycler, pyparsing, matplotlib, PyYAML, bokeh, pillow, wordcloud, dariah-topics
  Running setup.py install for nbsphinx: started
    Running setup.py install for nbsphinx: finished with status 'done'
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Send2Trash-1.4.2 Werkzeug-0.14.1 alabaster-0.7.10 altgraph-0.15 attrs-17.4.0 babel-2.5.3 bleach-2.1.2 bokeh-0.12.13 boto-2.48.0 boto3-1.5.18 botocore-1.8.32 bz2file-0.98 certifi-2018.1.18 chardet-3.0.4 click-6.7 commonmark-0.5.4 coverage-4.4.2 cycler-0.10.0 dariah-topics decorator-4.2.1 docutils-0.14 entrypoints-0.2.3 flask-0.12.2 future-0.16.0 gensim-3.2.0 html5lib-1.0.1 idna-2.6 imagesize-0.7.1 ipykernel-4.8.0 ipython-6.2.1 ipython-genutils-0.2.0 ipywidgets-7.1.0 itsdangerous-0.24 jedi-0.11.1 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.1 jupyter-console-5.2.0 jupyter-core-4.4.0 lda-1.0.5 lxml-4.1.1 macholib-1.9 matplotlib-2.1.1 mistune-0.8.3 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.3.1 notebook-5.3.1 numpy-1.14.0 pandas-0.22.0 pandocfilters-1.4.2 parso-0.1.1 pbr-3.1.1 pefile-2017.11.5 pexpect-4.3.1 pickleshare-0.7.4 pillow-5.0.0 pluggy-0.6.0 prompt-toolkit-1.0.15 ptyprocess-0.5.2 py-1.5.2 pyflakes-1.6.0 pyinstaller-3.3.1 pyparsing-2.2.0 pytest-3.3.2 pytest-cov-2.5.1 pytest-nbsmoke-0.1.6 python-dateutil-2.6.1 pytz-2017.3 pyzmq-16.0.3 qtconsole-4.3.1 recommonmark-0.4.0 regex-2018.1.10 requests-2.18.4 s3transfer-0.1.12 scipy-1.0.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.6
snowballstemmer-1.2.1 sphinx-1.6.6 sphinxcontrib-websupport-1.0.1 terminado-0.8.1 testpath-0.3.1 tornado-4.5.3 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.1.0 wordcloud-1.3.1
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing requirements to dariah_topics.egg-info/requires.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing dariah_topics.egg-info/PKG-INFO
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.4.0.dev1
creating dariah_topics-0.4.0.dev1/dariah_topics
creating dariah_topics-0.4.0.dev1/dariah_topics.egg-info
creating dariah_topics-0.4.0.dev1/test
copying files to dariah_topics-0.4.0.dev1...
copying README.md -> dariah_topics-0.4.0.dev1
copying setup.cfg -> dariah_topics-0.4.0.dev1
copying setup.py -> dariah_topics-0.4.0.dev1
copying dariah_topics/__init__.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/doc_meta_list.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.4.0.dev1/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.4.0.dev1/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.4.0.dev1/test
Writing dariah_topics-0.4.0.dev1/setup.cfg
Creating tar archive
removing 'dariah_topics-0.4.0.dev1' (and everything under it)
running bdist_wheel
running build
running build_py
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doc_meta_list.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.4.0.dev1-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.4.0.dev1.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-0.4.0.dev1-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/doc_meta_list.py'
adding 'dariah_topics/doclist.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/mallet.py'
adding 'dariah_topics/meta.py'
adding 'dariah_topics/model_creation.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-0.4.0.dev1.dist-info/DESCRIPTION.rst'
adding 'dariah_topics-0.4.0.dev1.dist-info/metadata.json'
adding 'dariah_topics-0.4.0.dev1.dist-info/top_level.txt'
adding 'dariah_topics-0.4.0.dev1.dist-info/WHEEL'
adding 'dariah_topics-0.4.0.dev1.dist-info/METADATA'
adding 'dariah_topics-0.4.0.dev1.dist-info/RECORD'
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.5.3, pytest-3.3.2, py-1.5.2, pluggy-0.6.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: nbsmoke-0.1.6, cov-2.5.1
collected 61 items

IntroducingGensim.ipynb . [ 1%]
IntroducingLda.ipynb . [ 3%]
IntroducingMallet.ipynb . [ 4%]
Visualizations.ipynb . [ 6%]
dariah_topics/mallet.py ...... [ 16%]
dariah_topics/postprocessing.py ........... [ 34%]
dariah_topics/preprocessing.py ........................... [ 78%]
dariah_topics/visualization.py .. [ 81%]
test/demonstrator_test.py F. [ 85%]
test/test_fuzzy_segmenting.py .........
[100%]

--- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ----
----------- coverage: platform linux, python 3.5.3-final-0 -----------
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml
=================================== FAILURES ===================================
____________________ DemonstratorTestCase.test_render_index ____________________

self = <demonstrator_test.DemonstratorTestCase testMethod=test_render_index>

    def test_render_index(self):
>       rv = self.app.get('/')

test/demonstrator_test.py:47:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py:830: in get
    return self.open(*args, **kw)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/testing.py:127: in open
    follow_redirects=follow_redirects)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py:803: in open
    response = self.run_wsgi_app(environ, buffered=buffered)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py:716: in run_wsgi_app
    rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py:923: in run_wsgi_app
    app_rv = app(environ, start_response)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1997: in __call__
    return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1985: in wsgi_app
    response = self.handle_exception(e)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1540: in handle_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py:33: in reraise
    raise value
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1982: in wsgi_app
    response = self.full_dispatch_request()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1614: in full_dispatch_request
    rv = self.handle_user_exception(e)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1517: in handle_user_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py:33: in reraise
    raise value
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1612: in full_dispatch_request
    rv = self.dispatch_request()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1598: in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
demonstrator/webapp.py:59: in index
    return render_template('index.html')
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/templating.py:134: in render_template
    context, ctx.app)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/templating.py:116: in _render
    rv = template.render(context)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/jinja2/environment.py:1008: in render
    return self.environment.handle_exception(exc_info, True)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/jinja2/environment.py:780: in handle_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/jinja2/_compat.py:37: in reraise
    raise value.with_traceback(tb)
demonstrator/templates/index.html:41: in top-level template code
    <a href="{{ url_for('help') }}"><i class="icon-question-sign icon-white"></i> Help</a>
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/helpers.py:333: in url_for
    return appctx.app.handle_url_build_error(error, endpoint, values)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py:1805: in handle_url_build_error
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py:33: in reraise
    raise value
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/helpers.py:323: in url_for
    force_external=external)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <werkzeug.routing.MapAdapter object at 0x7f21dc080e10>, endpoint = 'help'
values = {}, method = None, force_external = False, append_unknown = True

    def build(self, endpoint, values=None, method=None, force_external=False, append_unknown=True):
        """Building URLs works pretty much the other way round.  Instead of
        `match` you call `build` and pass it the endpoint and a dict of
        arguments for the placeholders.

        The `build` function also accepts an argument called `force_external`
        which, if you set it to `True` will force external URLs. Per default
        external URLs (include the server name) will only be used if the
        target URL is on a different subdomain.

        >>> m = Map([
        ...     Rule('/', endpoint='index'),
        ...     Rule('/downloads/', endpoint='downloads/index'),
        ...     Rule('/downloads/<int:id>', endpoint='downloads/show')
        ... ])
        >>> urls = m.bind("example.com", "/")
        >>> urls.build("index", {})
        '/'
        >>> urls.build("downloads/show", {'id': 42})
        '/downloads/42'
        >>> urls.build("downloads/show", {'id': 42}, force_external=True)
        'http://example.com/downloads/42'

        Because URLs cannot contain non ASCII data you will always get
        bytestrings back.  Non ASCII characters are urlencoded with the
        charset defined on the map instance.

        Additional values are converted to unicode and appended to the URL as
        URL querystring parameters:

        >>> urls.build("index", {'q': 'My Searchstring'})
        '/?q=My+Searchstring'

        When processing those additional values, lists are furthermore
        interpreted as multiple values (as per
        :py:class:`werkzeug.datastructures.MultiDict`):

        >>> urls.build("index", {'q': ['a', 'b', 'c']})
        '/?q=a&q=b&q=c'

        If a rule does not exist when building a `BuildError` exception is
        raised.

        The build method accepts an argument called `method` which allows you
        to specify the method you want to have an URL built for if you have
        different methods for the same endpoint specified.

        .. versionadded:: 0.6
           the `append_unknown` parameter was added.

        :param endpoint: the endpoint of the URL to build.
        :param values: the values for the URL to build.  Unhandled values are
                       appended to the URL as query parameters.
        :param method: the HTTP method for the rule if there are different
                       URLs for different methods on the same endpoint.
        :param force_external: enforce full canonical external URLs. If the URL
                               scheme is not provided, this will generate
                               a protocol-relative URL.
        :param append_unknown: unknown parameters are appended to the generated
                               URL as query string argument.  Disable this
                               if you want the builder to ignore those.
        """
        self.map.update()
        if values:
            if isinstance(values, MultiDict):
                valueiter = iteritems(values, multi=True)
            else:
                valueiter = iteritems(values)
            values = dict((k, v) for k, v in valueiter if v is not None)
        else:
            values = {}
        rv = self._partial_build(endpoint, values, method, append_unknown)
        if rv is None:
>           raise BuildError(endpoint, values, method, self)
E           werkzeug.routing.BuildError: Could not build url for endpoint 'help'. Did you mean 'modeling' instead?

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/routing.py:1776: BuildError
----------------------------- Captured stderr call -----------------------------
INFO webapp: Rendering index.html ...
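The failure above boils down to the template calling `url_for('help')` while the Flask app has no view function registered under the endpoint name `help` (consistent with the commit "Add link to help page" adding the link before the route). A minimal sketch of the mechanism, with hypothetical routes rather than the demonstrator's actual `webapp.py`:

```python
# Minimal sketch (hypothetical routes, not the demonstrator's actual code):
# url_for() raises werkzeug.routing.BuildError when the requested endpoint
# name has no registered view, which is exactly what the traceback shows.
from flask import Flask, url_for
from werkzeug.routing import BuildError

app = Flask(__name__)

@app.route('/')
def index():
    return 'index'

with app.test_request_context():
    try:
        url_for('help')  # no 'help' view registered yet -> BuildError
    except BuildError as error:
        print('BuildError for endpoint:', error.endpoint)

# Registering a view under that endpoint name makes the lookup succeed:
@app.route('/help')
def help():
    return 'help page'

with app.test_request_context():
    print(url_for('help'))  # -> '/help'
```

By default Flask uses the view function's name as the endpoint, so defining a `help` view (or passing `endpoint='help'` to `route()`) is what the template's `url_for('help')` expects.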
------------------------------ Captured log call -------------------------------
webapp.py 58 INFO Rendering index.html ...
=============================== warnings summary ===============================
dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._create_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._hapax_legomena_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._remove_features_from_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._stopwords_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.create_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_hapax_legomena
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_stopwords
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.remove_features
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:739: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

-- Docs: http://doc.pytest.org/en/latest/warnings.html
============== 1 failed, 60 passed, 10 warnings in 123.05 seconds ==============
Build step 'Virtualenv Builder' marked build as failure
Recording test results
[Cobertura] Publishing Cobertura coverage report...
[Cobertura] Publishing Cobertura coverage results...
[Cobertura] Cobertura coverage report found.
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@d23dc96[description=A Python library for Topic Modeling.,homepage=,name=Topics,license=<null>,fork=true,size=38362,milestones={},language=Jupyter Notebook,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Fri, 19 Jan 2018 03:01:37 GMT], ETag=[W/"850c317b6cc634cee8ea41b9ba4491a2"], Last-Modified=[Mon, 18 Dec 2017 12:03:31 GMT], OkHttp-Received-Millis=[1516330897924], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1516330897741], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked],
Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[87D5:54EF:E5D2AE:1CF1CA5:5A615F91], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4990], X-RateLimit-Reset=[1516331602], X-Runtime-rack=[0.084640], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:4aaf554) with context:DARIAH-Topics Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/4aaf5544d729e70a1b9cb4fc4e79bd68ecd3b5d7 [BFA] Scanning build for known causes... [BFA] No failure causes found [BFA] Done. 0s Started calculate disk usage of build Finished Calculation of disk usage of build in 0 seconds Started calculate disk usage of workspace Finished Calculation of disk usage of workspace in 0 seconds Notifying upstream projects of job completion [ci-game] evaluating rule: Build result [ci-game] scored: -10.0 [ci-game] evaluating rule: Increased number of passed tests [ci-game] evaluating rule: Decreased number of passed tests [ci-game] evaluating rule: Increased number of failed tests [ci-game] evaluating rule: Decreased number of failed tests [ci-game] evaluating rule: Increased number of skipped tests [ci-game] evaluating rule: Decreased number of skipped tests [ci-game] evaluating rule: Open HIGH priority tasks [ci-game] evaluating rule: Open NORMAL priority tasks [ci-game] evaluating rule: Open LOW priority tasks [ci-game] evaluating rule: Changed number of compiler warnings Finished: FAILURE
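[Editor's note] All ten FutureWarnings above come from a single call site, preprocessing.py:739, which uses the `DataFrame.set_value` method that pandas deprecated in 0.21. The warning itself names the replacement: the `.at[]` (label-based) or `.iat[]` (position-based) scalar accessors. A sketch of the substitution — the toy matrix here only mirrors the call signature in the warning, `set_value((document_id, type_id), 0, ...)`, i.e. a (document_id, type_id) MultiIndex with a single column 0; the project's real document-term matrix is not shown in this log:

```python
import pandas as pd

# Toy document-term matrix shaped like the one in the warning:
# rows keyed by a (document_id, type_id) MultiIndex, one column named 0.
index = pd.MultiIndex.from_tuples([(0, 0), (0, 1), (1, 0)],
                                  names=["document_id", "type_id"])
document_term_matrix = pd.DataFrame(0, index=index, columns=[0])

# Deprecated form (emits the FutureWarning seen in the log):
#   document_term_matrix.set_value((0, 1), 0, 5)

# Replacement: the .at accessor takes the same label pair and
# performs the same fast scalar assignment.
document_term_matrix.at[(0, 1), 0] = 5
print(document_term_matrix.loc[(0, 1), 0])  # 5
```

`.at` is the drop-in choice here because the call site already addresses the cell by labels; `.iat` would instead require integer positions.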