Started by GitHub push by MHuberFaust
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
> git --version # timeout=10
using GIT_ASKPASS to set credentials
> git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
> git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 04b857dce07574e2b2ed31d85669b3f3af59bd3f (refs/remotes/origin/testing)
> git config core.sparsecheckout # timeout=10
> git checkout -f 04b857dce07574e2b2ed31d85669b3f3af59bd3f
Commit message: "next try..."
> git rev-list 78d2aed7a6f67f30f98b924ffe814f683f41f5ec # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda4166750899877375268.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
  Using cached pytest-3.4.0-py2.py3-none-any.whl
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
  Using cached pytest_cov-2.5.1-py2.py3-none-any.whl
Collecting pytest-nbsmoke (from -r requirements-dev.txt (line 6))
  Using cached pytest_nbsmoke-0.1.6-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
  Downloading Sphinx-1.7.0-py2.py3-none-any.whl (1.9MB)
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
  Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-build-c_1abadb/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
  Using cached recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pyinstaller (from -r requirements-dev.txt (line 15))
Collecting pandas>=0.19.2 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached pandas-0.22.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Downloading regex-2018.02.08.tar.gz (620kB)
Collecting gensim>=0.13.2 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Downloading gensim-3.3.0-cp35-cp35m-manylinux1_x86_64.whl (22.5MB)
Collecting lda>=1.0.5 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached numpy-1.14.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lxml>=3.6.4 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached lxml-4.1.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting flask>=0.11.1 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached Flask-0.12.2-py2.py3-none-any.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached matplotlib-2.1.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting bokeh>=0.12.6 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Downloading bokeh-0.12.14.tar.gz (15.4MB)
Collecting wordcloud>=1.3.1 (from dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
Collecting attrs>=17.2.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached attrs-17.4.0-py2.py3-none-any.whl
Collecting pluggy<0.7,>=0.5 (from pytest->-r requirements-dev.txt (line 4))
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached py-1.5.2-py2.py3-none-any.whl
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached six-1.11.0-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->-r requirements-dev.txt (line 4))
Collecting coverage>=3.7.1 (from pytest-cov->-r requirements-dev.txt (line 5))
  Downloading coverage-4.5.1-cp35-cp35m-manylinux1_x86_64.whl (202kB)
Collecting jupyter-client (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_client-5.2.2-py2.py3-none-any.whl
Collecting pyflakes (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pyflakes-1.6.0-py2.py3-none-any.whl
Collecting nbconvert (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbconvert-5.3.1-py2.py3-none-any.whl
Collecting ipykernel (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading ipykernel-4.8.1-py3-none-any.whl (108kB)
Collecting nbformat (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbformat-4.4.0-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
  Using cached jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
  Using cached notebook-5.4.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
  Using cached qtconsole-4.3.1-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
  Using cached ipywidgets-7.1.1-py2.py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached docutils-0.14-py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Babel-2.5.3-py2.py3-none-any.whl
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
  Using cached sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Jinja2-2.10-py2.py3-none-any.whl
Collecting packaging (from sphinx->-r requirements-dev.txt (line 10))
  Using cached packaging-16.8-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached alabaster-0.7.10-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
  Using cached imagesize-0.7.1-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
Collecting pefile>=2017.8.1 (from pyinstaller->-r requirements-dev.txt (line 15))
Collecting macholib>=1.8 (from pyinstaller->-r requirements-dev.txt (line 15))
  Using cached macholib-1.9-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Downloading pytz-2018.3-py2.py3-none-any.whl (509kB)
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached scipy-1.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
Collecting Werkzeug>=0.7 (from flask>=0.11.1->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached Werkzeug-0.14.1-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
Collecting pillow (from wordcloud>=1.3.1->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached Pillow-5.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting jupyter-core (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_core-4.4.0-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading pyzmq-17.0.0-cp35-cp35m-manylinux1_x86_64.whl (3.1MB)
Collecting entrypoints>=0.2.2 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
Collecting testpath (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting mistune>=0.7.4 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached mistune-0.8.3-py2.py3-none-any.whl
Collecting bleach (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached bleach-2.1.2-py2.py3-none-any.whl
Collecting ipython>=4.0.0 (from ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython-6.2.1-py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting ipython-genutils (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached prompt_toolkit-1.0.15-py3-none-any.whl
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached Send2Trash-1.4.2-py3-none-any.whl
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached terminado-0.8.1-py2.py3-none-any.whl
Collecting widgetsnbextension~=3.1.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
  Using cached widgetsnbextension-3.1.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached certifi-2018.1.18-py2.py3-none-any.whl
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached idna-2.6-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
  Using cached decorator-4.2.1-py2.py3-none-any.whl
Collecting future (from pefile>=2017.8.1->pyinstaller->-r requirements-dev.txt (line 15))
Collecting altgraph>=0.15 (from macholib>=1.8->pyinstaller->-r requirements-dev.txt (line 15))
  Using cached altgraph-0.15-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Downloading boto3-1.5.27-py2.py3-none-any.whl (128kB)
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached boto-2.48.0-py2.py3-none-any.whl
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached html5lib-1.0.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading pexpect-4.4.0-py2.py3-none-any.whl (56kB)
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jedi-0.11.1-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached jmespath-0.9.3-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Using cached s3transfer-0.1.12-py2.py3-none-any.whl
Collecting botocore<1.9.0,>=1.8.41 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.5.0.dev1->-r requirements.txt (line 1))
  Downloading botocore-1.8.41-py2.py3-none-any.whl (4.1MB)
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Collecting parso==0.1.1 (from jedi>=0.10->ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached parso-0.1.1-py2.py3-none-any.whl
Building wheels for collected packages: regex, bokeh
  Running setup.py bdist_wheel for regex: started
  Running setup.py bdist_wheel for regex: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/bc/9f/af/f1f6e517b4f9f4f80aefaedbbff1d6843882295bc17b0e8555
  Running setup.py bdist_wheel for bokeh: started
  Running setup.py bdist_wheel for bokeh: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/e1/7c/57/11a9006179bd0fa4c26b964644c862c2862d011278c67f2ee8
Successfully built regex bokeh
Installing collected packages: attrs, pluggy, py, six, pytest, coverage, pytest-cov, python-dateutil, decorator, ipython-genutils, traitlets, jupyter-core, pyzmq, tornado, jupyter-client, pyflakes, entrypoints, Pygments, jsonschema, nbformat, pandocfilters, testpath, mistune, webencodings, html5lib, bleach, MarkupSafe, Jinja2, nbconvert, wcwidth, prompt-toolkit, pickleshare, simplegeneric, ptyprocess, pexpect, parso, jedi, ipython, ipykernel, pytest-nbsmoke, jupyter-console, Send2Trash, terminado, notebook, qtconsole, widgetsnbextension, ipywidgets, jupyter, snowballstemmer, docutils, pytz, babel, certifi, urllib3, chardet, idna, requests, sphinxcontrib-websupport, pyparsing, packaging, alabaster, imagesize, sphinx, nbsphinx, commonmark, recommonmark, future, pefile, altgraph, macholib, pyinstaller, numpy, pandas, regex, bz2file, jmespath, botocore, s3transfer, boto3, boto, smart-open, scipy, gensim, pbr, lda, lxml, click, itsdangerous, Werkzeug, flask, cycler, matplotlib, PyYAML, bokeh, pillow, wordcloud, dariah-topics
  Running setup.py install for nbsphinx: started
  Running setup.py install for nbsphinx: finished with status 'done'
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Send2Trash-1.4.2 Werkzeug-0.14.1 alabaster-0.7.10 altgraph-0.15 attrs-17.4.0 babel-2.5.3 bleach-2.1.2 bokeh-0.12.14 boto-2.48.0 boto3-1.5.27 botocore-1.8.41 bz2file-0.98 certifi-2018.1.18 chardet-3.0.4 click-6.7 commonmark-0.5.4 coverage-4.5.1 cycler-0.10.0 dariah-topics decorator-4.2.1 docutils-0.14 entrypoints-0.2.3 flask-0.12.2 future-0.16.0 gensim-3.3.0 html5lib-1.0.1 idna-2.6 imagesize-0.7.1 ipykernel-4.8.1 ipython-6.2.1 ipython-genutils-0.2.0 ipywidgets-7.1.1 itsdangerous-0.24 jedi-0.11.1 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.2 jupyter-console-5.2.0 jupyter-core-4.4.0 lda-1.0.5 lxml-4.1.1 macholib-1.9 matplotlib-2.1.2 mistune-0.8.3 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.3.1 notebook-5.4.0 numpy-1.14.0 packaging-16.8 pandas-0.22.0 pandocfilters-1.4.2 parso-0.1.1 pbr-3.1.1 pefile-2017.11.5 pexpect-4.4.0 pickleshare-0.7.4 pillow-5.0.0 pluggy-0.6.0 prompt-toolkit-1.0.15 ptyprocess-0.5.2 py-1.5.2 pyflakes-1.6.0 pyinstaller-3.3.1 pyparsing-2.2.0 pytest-3.4.0 pytest-cov-2.5.1 pytest-nbsmoke-0.1.6 python-dateutil-2.6.1 pytz-2018.3 pyzmq-17.0.0 qtconsole-4.3.1 recommonmark-0.4.0 regex-2018.2.8 requests-2.18.4 s3transfer-0.1.12 scipy-1.0.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.6 snowballstemmer-1.2.1 sphinx-1.7.0 sphinxcontrib-websupport-1.0.1 terminado-0.8.1 testpath-0.3.1 tornado-4.5.3 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.1.3 wordcloud-1.3.1
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing top-level names to dariah_topics.egg-info/top_level.txt
writing dariah_topics.egg-info/PKG-INFO
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing requirements to dariah_topics.egg-info/requires.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.5.0.dev1
creating dariah_topics-0.5.0.dev1/dariah_topics
creating dariah_topics-0.5.0.dev1/dariah_topics.egg-info
creating dariah_topics-0.5.0.dev1/test
copying files to dariah_topics-0.5.0.dev1...
copying README.md -> dariah_topics-0.5.0.dev1
copying setup.cfg -> dariah_topics-0.5.0.dev1
copying setup.py -> dariah_topics-0.5.0.dev1
copying dariah_topics/__init__.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/doc_meta_list.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.5.0.dev1/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.5.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.5.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.5.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.5.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.5.0.dev1/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.5.0.dev1/test
Writing dariah_topics-0.5.0.dev1/setup.cfg
Creating tar archive
removing 'dariah_topics-0.5.0.dev1' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/visualization.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doc_meta_list.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.5.0.dev1-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.5.0.dev1.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-0.5.0.dev1-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/doc_meta_list.py'
adding 'dariah_topics/doclist.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/mallet.py'
adding 'dariah_topics/meta.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-0.5.0.dev1.dist-info/DESCRIPTION.rst'
adding 'dariah_topics-0.5.0.dev1.dist-info/metadata.json'
adding 'dariah_topics-0.5.0.dev1.dist-info/top_level.txt'
adding 'dariah_topics-0.5.0.dev1.dist-info/WHEEL'
adding 'dariah_topics-0.5.0.dev1.dist-info/METADATA'
adding 'dariah_topics-0.5.0.dev1.dist-info/RECORD'
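[Editor's note] The "running check" step above warns "check: missing required meta-data: url", so the sdist and wheel are built without a project URL in their metadata. A minimal sketch of how the warning could be silenced, assuming the metadata is declared via setup() in the project's setup.py; name and version are taken from the log, every other argument here is an assumption rather than the actual file contents:

    # Hypothetical excerpt of setup.py; only the added `url` argument addresses the
    # "missing required meta-data: url" warning reported by `setup.py check`.
    from setuptools import setup, find_packages

    setup(
        name='dariah_topics',
        version='0.5.0.dev1',
        packages=find_packages(exclude=['test']),   # assumed package discovery
        url='https://github.com/DARIAH-DE/Topics',  # repository URL seen in the SCM step
    )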
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.5.3, pytest-3.4.0, py-1.5.2, pluggy-0.6.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: nbsmoke-0.1.6, cov-2.5.1
collected 59 items

IntroducingGensim.ipynb . [ 1%]
IntroducingLda.ipynb . [ 3%]
IntroducingMallet.ipynb . [ 5%]
Visualizations.ipynb . [ 6%]
dariah_topics/mallet.py ...... [ 16%]
dariah_topics/postprocessing.py ........... [ 35%]
dariah_topics/preprocessing.py ........................... [ 81%]
dariah_topics/visualization.py .. [ 84%]
test/test_fuzzy_segmenting.py ......... [100%]

--- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ----

----------- coverage: platform linux, python 3.5.3-final-0 -----------
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml

=============================== warnings summary ===============================
dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._create_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._hapax_legomena_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._remove_features_from_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing._stopwords_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.create_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_hapax_legomena
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_stopwords
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/preprocessing.py::dariah_topics.preprocessing.remove_features
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
-- Docs: http://doc.pytest.org/en/latest/warnings.html
=================== 59 passed, 10 warnings in 183.42 seconds ===================
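[Editor's note] All ten warnings point to the same call at preprocessing.py:736, which still uses the deprecated DataFrame.set_value(). A minimal sketch of the replacement the FutureWarning suggests, using a toy document-term matrix with a (document_id, type_id) MultiIndex inferred from the call shown in the warning; the real frame layout inside dariah_topics is an assumption here, not read from the library code:

    import pandas as pd

    # Toy stand-in for the document-term matrix built in preprocessing.py; the
    # index/column layout is an assumption based only on the warning text.
    index = pd.MultiIndex.from_tuples([(0, 0), (0, 1), (1, 0)],
                                      names=['document_id', 'type_id'])
    document_term_matrix = pd.DataFrame(0, index=index, columns=[0])

    document_id, type_id, count = 0, 1, 3

    # Deprecated call reported by the FutureWarning:
    #   document_term_matrix.set_value((document_id, type_id), 0, count)
    # Equivalent label-based assignment with the .at accessor:
    document_term_matrix.at[(document_id, type_id), 0] = count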
+ ./setup.py build_sphinx -a
running build_sphinx
Running Sphinx v1.7.0
loading pickled environment... done
[autosummary] generating autosummary for: IntroducingGensim.ipynb, IntroducingLda.ipynb, IntroducingMallet.ipynb, README.md, Visualizations.ipynb, docs/CONTRIBUTING.md, docs/gen/dariah_topics.doclist.rst, docs/gen/dariah_topics.evaluation.rst, docs/gen/dariah_topics.mallet.rst, docs/gen/dariah_topics.meta.rst, docs/gen/dariah_topics.preprocessing.rst, docs/gen/dariah_topics.rst, docs/gen/dariah_topics.visualization.rst, docs/gen/modules.rst, index.rst
loading intersphinx inventory from https://radimrehurek.com/gensim/objects.inv...
loading intersphinx inventory from http://pandas.pydata.org/pandas-docs/stable/objects.inv...
loading intersphinx inventory from https://matplotlib.org/objects.inv...
loading intersphinx inventory from http://docs.python.org/3/objects.inv...
intersphinx inventory has moved: http://docs.python.org/3/objects.inv -> https://docs.python.org/3/objects.inv
building [mo]: all of 0 po files
building [html]: all source files
updating environment: 0 added, 6 changed, 0 removed
reading sources... [ 16%] README
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/importlib/_bootstrap.py:222: RuntimeWarning: numpy.dtype size changed, may indicate binary incompatibility. Expected 96, got 88
  return f(*args, **kwds)
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/importlib/_bootstrap.py:222: RuntimeWarning: numpy.dtype size changed, may indicate binary incompatibility. Expected 96, got 88
  return f(*args, **kwds)

Exception occurred:
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/docutils/transforms/universal.py", line 263, in apply
    lc_smartquotes = self.document.settings.smartquotes_locales
AttributeError: 'Values' object has no attribute 'smartquotes_locales'
The full traceback has been saved in /tmp/sphinx-err-xtn91l_f.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at . Thanks!
+ echo WARNING: Building the documentation failed
WARNING: Building the documentation failed
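[Editor's note] The Sphinx step reuses a cached environment ("loading pickled environment... done") and then aborts in docutils' SmartQuotes transform because the settings object it encounters lacks smartquotes_locales, an attribute introduced in docutils 0.14. Stale cached state (or a parser extension that builds its own docutils settings) is a plausible culprit, but the log alone does not settle the cause. As a diagnostic, the build could be repeated without the cached environment; the sketch below does this programmatically, and the source, conf, and output paths are assumptions rather than values taken from the job configuration:

    # Diagnostic sketch, not part of the Jenkins job: rebuild the HTML docs with a
    # fresh Sphinx environment to rule out the stale pickled environment.
    from sphinx.application import Sphinx

    app = Sphinx(
        srcdir='.',                          # index.rst appears to sit in the repository root
        confdir='.',                         # assumed location of conf.py
        outdir='build/sphinx/html',          # assumed output directory
        doctreedir='build/sphinx/doctrees',  # assumed doctree cache directory
        buildername='html',
        freshenv=True,                       # ignore the pickled environment instead of reloading it
    )
    app.build(force_all=True)                # same effect as the -a flag used above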
[DocLinks] Copying Documentation to 1 ...
Recording test results
[Cobertura] Publishing Cobertura coverage report...
[Cobertura] Publishing Cobertura coverage results...
[Cobertura] Cobertura coverage report found.
[Set GitHub commit status (universal)] SUCCESS on repos [GHRepository@494f02b6[description=A Python library for Topic Modeling.,homepage=,name=Topics,license=,fork=true,size=38768,milestones={},language=Jupyter Notebook,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Tue, 13 Feb 2018 08:42:08 GMT], ETag=[W/"612dfc99ec5b96704d5611d39563d62b"], Last-Modified=[Fri, 02 Feb 2018 01:31:32 GMT], OkHttp-Received-Millis=[1518511328604], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1518511328412], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[9441:7EAC:2CD72D:4C94B5:5A82A4E0], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4999], X-RateLimit-Reset=[1518514928], X-Runtime-rack=[0.094359], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:04b857d) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/04b857dce07574e2b2ed31d85669b3f3af59bd3f
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] scored: 1.0
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: SUCCESS