Console Output

Started by GitHub push by severinsimmler
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials 
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 375c0180d49816b1d3007d2d505b8e3ef8c0633d (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 375c0180d49816b1d3007d2d505b8e3ef8c0633d
Commit message: "Tune version number"
 > git rev-list dd78cc060fa09a3533153d94d84796596b81ccb7 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda2673691041278186507.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
  Downloading pytest-3.4.1-py2.py3-none-any.whl (188kB)
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
  Using cached pytest_cov-2.5.1-py2.py3-none-any.whl
Collecting pytest-nbsmoke (from -r requirements-dev.txt (line 6))
  Using cached pytest_nbsmoke-0.1.6-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
  Using cached Sphinx-1.7.0-py2.py3-none-any.whl
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
  Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-build-uefof9_9/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
  Using cached recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pyinstaller (from -r requirements-dev.txt (line 15))
  Using cached PyInstaller-3.3.1.tar.gz
Collecting pandas>=0.19.2 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading pandas-0.22.0-cp36-cp36m-manylinux1_x86_64.whl (26.2MB)
Collecting regex>=2017.01.14 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading regex-2018.02.21.tar.gz (620kB)
Collecting gensim>=0.13.2 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading gensim-3.3.0-cp36-cp36m-manylinux1_x86_64.whl (22.5MB)
Collecting lda>=1.0.5 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading lda-1.0.5-cp36-cp36m-manylinux1_x86_64.whl (510kB)
Collecting numpy>=1.3 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading numpy-1.14.1-cp36-cp36m-manylinux1_x86_64.whl (12.2MB)
Collecting lxml>=3.6.4 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading lxml-4.1.1-cp36-cp36m-manylinux1_x86_64.whl (5.6MB)
Collecting matplotlib>=1.5.3 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading matplotlib-2.1.2-cp36-cp36m-manylinux1_x86_64.whl (15.0MB)
Collecting bokeh>=0.12.6 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached bokeh-0.12.14.tar.gz
Collecting wordcloud>=1.3.1 (from dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached wordcloud-1.3.1.tar.gz
Collecting attrs>=17.2.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached attrs-17.4.0-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages (from pytest->-r requirements-dev.txt (line 4))
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached six-1.11.0-py2.py3-none-any.whl
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached py-1.5.2-py2.py3-none-any.whl
Collecting pluggy<0.7,>=0.5 (from pytest->-r requirements-dev.txt (line 4))
Collecting coverage>=3.7.1 (from pytest-cov->-r requirements-dev.txt (line 5))
  Downloading coverage-4.5.1-cp36-cp36m-manylinux1_x86_64.whl (202kB)
Collecting ipykernel (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading ipykernel-4.8.2-py3-none-any.whl (108kB)
Collecting nbformat (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbformat-4.4.0-py2.py3-none-any.whl
Collecting jupyter-client (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_client-5.2.2-py2.py3-none-any.whl
Collecting nbconvert (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached nbconvert-5.3.1-py2.py3-none-any.whl
Collecting pyflakes (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pyflakes-1.6.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
  Using cached qtconsole-4.3.1-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
  Using cached jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
  Downloading ipywidgets-7.1.2-py2.py3-none-any.whl (68kB)
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
  Using cached notebook-5.4.0-py2.py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Babel-2.5.3-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
  Using cached sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
  Downloading imagesize-1.0.0-py2.py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Jinja2-2.10-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached docutils-0.14-py3-none-any.whl
Collecting packaging (from sphinx->-r requirements-dev.txt (line 10))
  Using cached packaging-16.8-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached alabaster-0.7.10-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
  Using cached CommonMark-0.5.4.tar.gz
Collecting pefile>=2017.8.1 (from pyinstaller->-r requirements-dev.txt (line 15))
  Using cached pefile-2017.11.5.tar.gz
Collecting macholib>=1.8 (from pyinstaller->-r requirements-dev.txt (line 15))
  Using cached macholib-1.9-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached pytz-2018.3-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached smart_open-1.5.6.tar.gz
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading scipy-1.0.0-cp36-cp36m-manylinux1_x86_64.whl (50.0MB)
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached PyYAML-3.12.tar.gz
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached tornado-4.5.3.tar.gz
Collecting pillow (from wordcloud>=1.3.1->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading Pillow-5.0.0-cp36-cp36m-manylinux1_x86_64.whl (5.9MB)
Collecting ipython>=4.0.0 (from ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython-6.2.1-py3-none-any.whl
Collecting ipython-genutils (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting jupyter-core (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jupyter_core-4.4.0-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading pyzmq-17.0.0-cp36-cp36m-manylinux1_x86_64.whl (3.1MB)
Collecting mistune>=0.7.4 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached mistune-0.8.3-py2.py3-none-any.whl
Collecting bleach (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached bleach-2.1.2-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pandocfilters-1.4.2.tar.gz
Collecting testpath (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached prompt_toolkit-1.0.15-py3-none-any.whl
Collecting widgetsnbextension~=3.1.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
  Downloading widgetsnbextension-3.1.4-py2.py3-none-any.whl (2.2MB)
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Downloading Send2Trash-1.5.0-py3-none-any.whl
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached terminado-0.8.1-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
  Using cached MarkupSafe-1.0.tar.gz
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached idna-2.6-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached certifi-2018.1.18-py2.py3-none-any.whl
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
  Using cached decorator-4.2.1-py2.py3-none-any.whl
Collecting future (from pefile>=2017.8.1->pyinstaller->-r requirements-dev.txt (line 15))
  Using cached future-0.16.0.tar.gz
Collecting altgraph>=0.15 (from macholib>=1.8->pyinstaller->-r requirements-dev.txt (line 15))
  Using cached altgraph-0.15-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached boto-2.48.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached bz2file-0.98.tar.gz
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading boto3-1.5.33-py2.py3-none-any.whl (128kB)
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached simplegeneric-0.8.1.zip
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached jedi-0.11.1-py2.py3-none-any.whl
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pexpect-4.4.0-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached html5lib-1.0.1-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting botocore<1.9.0,>=1.8.47 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading botocore-1.8.47-py2.py3-none-any.whl (4.1MB)
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Using cached jmespath-0.9.3-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.6.0.dev1->-r requirements.txt (line 1))
  Downloading s3transfer-0.1.13-py2.py3-none-any.whl (59kB)
Collecting parso==0.1.1 (from jedi>=0.10->ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached parso-0.1.1-py2.py3-none-any.whl
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Building wheels for collected packages: pyinstaller, regex, bokeh, wordcloud, commonmark, pefile, smart-open, PyYAML, tornado, pandocfilters, MarkupSafe, future, bz2file, simplegeneric
  Running setup.py bdist_wheel for pyinstaller: started
  Running setup.py bdist_wheel for pyinstaller: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/2f/f8/8e/d8ee9b359f487a8488380f0b522d81fa5ee01956161d41b72e
  Running setup.py bdist_wheel for regex: started
  Running setup.py bdist_wheel for regex: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/63/9b/ce/be27920ad7bd9d2a69038ff983cd9b0a053da42332f4e86361
  Running setup.py bdist_wheel for bokeh: started
  Running setup.py bdist_wheel for bokeh: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/e1/7c/57/11a9006179bd0fa4c26b964644c862c2862d011278c67f2ee8
  Running setup.py bdist_wheel for wordcloud: started
  Running setup.py bdist_wheel for wordcloud: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/d9/4c/ac/e63c45f2ce09860e9459a410953039c30296e89d9f7234675f
  Running setup.py bdist_wheel for commonmark: started
  Running setup.py bdist_wheel for commonmark: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/fd/3a/ea/9ead7944d8ba3771888487ca4f6ef39bcde9fd4e986c32f442
  Running setup.py bdist_wheel for pefile: started
  Running setup.py bdist_wheel for pefile: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/30/34/1d/5f4e14feee63c615a1ae25c211b21237a6a74c1c4fb6639842
  Running setup.py bdist_wheel for smart-open: started
  Running setup.py bdist_wheel for smart-open: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/36/48/35/97efc2bd1b233627131c9a936c9de23681846db707b907d353
  Running setup.py bdist_wheel for PyYAML: started
  Running setup.py bdist_wheel for PyYAML: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/2c/f7/79/13f3a12cd723892437c0cfbde1230ab4d82947ff7b3839a4fc
  Running setup.py bdist_wheel for tornado: started
  Running setup.py bdist_wheel for tornado: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/0c/21/02/8cdc6a381450df92b449ea7c57be653dd7aa80ba42c716212c
  Running setup.py bdist_wheel for pandocfilters: started
  Running setup.py bdist_wheel for pandocfilters: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/08/5b/5b/66b3cde6f8215f8345479ef3699d6ddbb860f6ea7072008f8b
  Running setup.py bdist_wheel for MarkupSafe: started
  Running setup.py bdist_wheel for MarkupSafe: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/88/a7/30/e39a54a87bcbe25308fa3ca64e8ddc75d9b3e5afa21ee32d57
  Running setup.py bdist_wheel for future: started
  Running setup.py bdist_wheel for future: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/c2/50/7c/0d83b4baac4f63ff7a765bd16390d2ab43c93587fac9d6017a
  Running setup.py bdist_wheel for bz2file: started
  Running setup.py bdist_wheel for bz2file: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/31/9c/20/996d65ca104cbca940b1b053299b68459391c01c774d073126
  Running setup.py bdist_wheel for simplegeneric: started
  Running setup.py bdist_wheel for simplegeneric: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/7b/31/08/c85e74c84188cbec6a6827beec4d640f2bd78ae003dc1ec09d
Successfully built pyinstaller regex bokeh wordcloud commonmark pefile smart-open PyYAML tornado pandocfilters MarkupSafe future bz2file simplegeneric
Installing collected packages: attrs, six, py, pluggy, pytest, coverage, pytest-cov, decorator, ipython-genutils, traitlets, tornado, python-dateutil, pyzmq, jupyter-core, jupyter-client, simplegeneric, parso, jedi, ptyprocess, pexpect, pickleshare, Pygments, wcwidth, prompt-toolkit, ipython, ipykernel, jsonschema, nbformat, mistune, webencodings, html5lib, bleach, entrypoints, pandocfilters, testpath, MarkupSafe, Jinja2, nbconvert, pyflakes, pytest-nbsmoke, qtconsole, jupyter-console, Send2Trash, terminado, notebook, widgetsnbextension, ipywidgets, jupyter, pytz, babel, sphinxcontrib-websupport, imagesize, snowballstemmer, chardet, idna, certifi, urllib3, requests, docutils, pyparsing, packaging, alabaster, sphinx, nbsphinx, commonmark, recommonmark, future, pefile, altgraph, macholib, pyinstaller, numpy, pandas, regex, boto, bz2file, jmespath, botocore, s3transfer, boto3, smart-open, scipy, gensim, pbr, lda, lxml, cycler, matplotlib, PyYAML, bokeh, pillow, wordcloud, dariah-topics
  Running setup.py install for nbsphinx: started
    Running setup.py install for nbsphinx: finished with status 'done'
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Send2Trash-1.5.0 alabaster-0.7.10 altgraph-0.15 attrs-17.4.0 babel-2.5.3 bleach-2.1.2 bokeh-0.12.14 boto-2.48.0 boto3-1.5.33 botocore-1.8.47 bz2file-0.98 certifi-2018.1.18 chardet-3.0.4 commonmark-0.5.4 coverage-4.5.1 cycler-0.10.0 dariah-topics decorator-4.2.1 docutils-0.14 entrypoints-0.2.3 future-0.16.0 gensim-3.3.0 html5lib-1.0.1 idna-2.6 imagesize-1.0.0 ipykernel-4.8.2 ipython-6.2.1 ipython-genutils-0.2.0 ipywidgets-7.1.2 jedi-0.11.1 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.2 jupyter-console-5.2.0 jupyter-core-4.4.0 lda-1.0.5 lxml-4.1.1 macholib-1.9 matplotlib-2.1.2 mistune-0.8.3 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.3.1 notebook-5.4.0 numpy-1.14.1 packaging-16.8 pandas-0.22.0 pandocfilters-1.4.2 parso-0.1.1 pbr-3.1.1 pefile-2017.11.5 pexpect-4.4.0 pickleshare-0.7.4 pillow-5.0.0 pluggy-0.6.0 prompt-toolkit-1.0.15 ptyprocess-0.5.2 py-1.5.2 pyflakes-1.6.0 pyinstaller-3.3.1 pyparsing-2.2.0 pytest-3.4.1 pytest-cov-2.5.1 pytest-nbsmoke-0.1.6 python-dateutil-2.6.1 pytz-2018.3 pyzmq-17.0.0 qtconsole-4.3.1 recommonmark-0.4.0 regex-2018.2.21 requests-2.18.4 s3transfer-0.1.13 scipy-1.0.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.6 snowballstemmer-1.2.1 sphinx-1.7.0 sphinxcontrib-websupport-1.0.1 terminado-0.8.1 testpath-0.3.1 tornado-4.5.3 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.1.4 wordcloud-1.3.1
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dariah_topics.egg-info/PKG-INFO
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing requirements to dariah_topics.egg-info/requires.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
warning: check: missing required meta-data: url
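The `check` warning above means `setup.py` passes no `url` keyword to `setup()`. A minimal sketch of the fix, assuming the repository URL from this build's checkout (this is not the project's actual `setup.py`, only an illustrative fragment; `script_args=["check"]` is added here just so the fragment runs the same `check` command standalone):

```python
# Illustrative setup.py fragment, not the project's real one.
# Supplying `url` silences "warning: check: missing required meta-data: url".
from setuptools import setup

setup(
    name="dariah_topics",
    version="0.6.0.dev1",
    # Assumed from the repository this build fetched from:
    url="https://github.com/DARIAH-DE/Topics",
    author="DARIAH-DE",                    # placeholder metadata
    author_email="info@example.org",       # placeholder metadata
    script_args=["check"],                 # run `setup.py check` when executed
)
```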

creating dariah_topics-0.6.0.dev1
creating dariah_topics-0.6.0.dev1/dariah_topics
creating dariah_topics-0.6.0.dev1/dariah_topics.egg-info
creating dariah_topics-0.6.0.dev1/test
copying files to dariah_topics-0.6.0.dev1...
copying README.md -> dariah_topics-0.6.0.dev1
copying setup.cfg -> dariah_topics-0.6.0.dev1
copying setup.py -> dariah_topics-0.6.0.dev1
copying dariah_topics/__init__.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/doc_meta_list.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.6.0.dev1/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.6.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.6.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.6.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.6.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.6.0.dev1/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.6.0.dev1/test
Writing dariah_topics-0.6.0.dev1/setup.cfg
Creating tar archive
removing 'dariah_topics-0.6.0.dev1' (and everything under it)
running bdist_wheel
running build
running build_py
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doc_meta_list.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.6.0.dev1-py3.6.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.6.0.dev1.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-0.6.0.dev1-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/doc_meta_list.py'
adding 'dariah_topics/doclist.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/mallet.py'
adding 'dariah_topics/meta.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-0.6.0.dev1.dist-info/DESCRIPTION.rst'
adding 'dariah_topics-0.6.0.dev1.dist-info/metadata.json'
adding 'dariah_topics-0.6.0.dev1.dist-info/top_level.txt'
adding 'dariah_topics-0.6.0.dev1.dist-info/WHEEL'
adding 'dariah_topics-0.6.0.dev1.dist-info/METADATA'
adding 'dariah_topics-0.6.0.dev1.dist-info/RECORD'
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.6.4, pytest-3.4.1, py-1.5.2, pluggy-0.6.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: nbsmoke-0.1.6, cov-2.5.1
collected 59 items

IntroducingGensim.ipynb .                                                [  1%]
IntroducingLda.ipynb .                                                   [  3%]
IntroducingMallet.ipynb .                                                [  5%]
Visualizations.ipynb .                                                   [  6%]
dariah_topics/mallet.py ......                                           [ 16%]
dariah_topics/postprocessing.py ...........                              [ 35%]
dariah_topics/preprocessing.py ...........................               [ 81%]
dariah_topics/visualization.py ..                                        [ 84%]
test/test_fuzzy_segmenting.py .........                                  [100%]

--- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ----

----------- coverage: platform linux, python 3.6.4-final-0 -----------
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml

=============================== warnings summary ===============================
dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._create_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._hapax_legomena_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._remove_features_from_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._stopwords_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.create_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_hapax_legomena
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_stopwords
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.remove_features
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:736: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

-- Docs: http://doc.pytest.org/en/latest/warnings.html
=================== 59 passed, 10 warnings in 181.55 seconds ===================
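All ten warnings above point at the same deprecated call in `preprocessing.py:736`: `DataFrame.set_value`, which pandas replaced with the `.at[]`/`.iat[]` accessors. A hedged sketch of the migration, using a toy stand-in for the document-term matrix (the real one lives in `dariah_topics/preprocessing.py`; the data here is invented):

```python
# Sketch of replacing the deprecated DataFrame.set_value call flagged above
# with the .at accessor, on a toy document-term matrix.
import pandas as pd

# Toy stand-in: rows indexed by (document_id, type_id) pairs,
# a single column 0 holding token counts.
index = pd.MultiIndex.from_tuples(
    [(0, 0), (0, 1), (1, 0)], names=["document_id", "type_id"]
)
document_term_matrix = pd.DataFrame(0, index=index, columns=[0])

# Deprecated form (triggers the FutureWarning seen in this build):
#   document_term_matrix.set_value((0, 1), 0, 5)
# Recommended replacement with label-based scalar access:
document_term_matrix.at[(0, 1), 0] = 5

print(document_term_matrix.loc[(0, 1), 0])  # 5
```

`.at[]` takes the same row label (here the `(document_id, type_id)` tuple) and column label, so the call site changes mechanically without touching surrounding logic.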
+ ./setup.py build_sphinx -a
running build_sphinx
Running Sphinx v1.7.0
loading pickled environment... done
[autosummary] generating autosummary for: IntroducingGensim.ipynb, IntroducingLda.ipynb, IntroducingMallet.ipynb, README.md, Visualizations.ipynb, docs/CONTRIBUTING.md, docs/gen/dariah_topics.doclist.rst, docs/gen/dariah_topics.evaluation.rst, docs/gen/dariah_topics.mallet.rst, docs/gen/dariah_topics.meta.rst, docs/gen/dariah_topics.preprocessing.rst, docs/gen/dariah_topics.rst, docs/gen/dariah_topics.visualization.rst, docs/gen/modules.rst, index.rst
loading intersphinx inventory from http://docs.python.org/3/objects.inv...
intersphinx inventory has moved: http://docs.python.org/3/objects.inv -> https://docs.python.org/3/objects.inv
loading intersphinx inventory from http://pandas.pydata.org/pandas-docs/stable/objects.inv...
loading intersphinx inventory from https://matplotlib.org/objects.inv...
loading intersphinx inventory from https://radimrehurek.com/gensim/objects.inv...
building [mo]: all of 0 po files
building [html]: all source files
updating environment: [config changed] 15 added, 0 changed, 0 removed
reading sources... [  6%] IntroducingGensim
reading sources... [ 13%] IntroducingLda
reading sources... [ 20%] IntroducingMallet
reading sources... [ 26%] README
reading sources... [ 33%] Visualizations
reading sources... [ 40%] docs/CONTRIBUTING
reading sources... [ 46%] docs/gen/dariah_topics
reading sources... [ 53%] docs/gen/dariah_topics.doclist
reading sources... [ 60%] docs/gen/dariah_topics.evaluation
reading sources... [ 66%] docs/gen/dariah_topics.mallet
reading sources... [ 73%] docs/gen/dariah_topics.meta
reading sources... [ 80%] docs/gen/dariah_topics.preprocessing
reading sources... [ 86%] docs/gen/dariah_topics.visualization
reading sources... [ 93%] docs/gen/modules
reading sources... [100%] index

/mnt/data/jenkins/workspace/DARIAH-Topics/IntroducingMallet.ipynb:848: WARNING: Title level inconsistent:

Path to MALLET folder
^^^^^^^^^^^^^^^^^^^^^
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.with_segment_files:12: WARNING: Unexpected indentation.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.train_topics:56: WARNING: Definition list ends without a blank line; unexpected unindent.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.forall:8: WARNING: Inline strong start-string without end-string.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.forall:8: WARNING: Inline emphasis start-string without end-string.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.forall:16: WARNING: Definition list ends without a blank line; unexpected unindent.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.forall:8: WARNING: Inline strong start-string without end-string.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.forall:8: WARNING: Inline emphasis start-string without end-string.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.forall:16: WARNING: Definition list ends without a blank line; unexpected unindent.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.read_from_pathlist:12: WARNING: Inline literal start-string without end-string.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:docstring of dariah_topics.preprocessing.read_model:3: WARNING: Unknown interpreted text role "module".
/mnt/data/jenkins/workspace/DARIAH-Topics/index.rst:9: WARNING: toctree contains reference to nonexisting document 'demonstrator/README'
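Several of the docstring warnings above ("Unexpected indentation", "Definition list ends without a blank line; unexpected unindent") come down to missing blank lines around reST definition lists in NumPy-style docstrings. A hypothetical before/after sketch, not the actual dariah_topics docstrings:

```python
def well_formed_docstring():
    """Show the blank-line rule Sphinx is complaining about.

    Returns
    -------
    bool
        A definition-list item. The blank line *after* this block,
        before the next paragraph, is what the warning asks for;
        omitting it produces "Definition list ends without a blank
        line; unexpected unindent".

    The paragraph following the list must start at the left margin,
    separated by that blank line.
    """
    return True
```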
looking for now-outdated files... none found
pickling environment... done
checking consistency... /mnt/data/jenkins/workspace/DARIAH-Topics/Visualizations.ipynb: WARNING: document isn't included in any toctree
done
preparing documents... done
writing output... [  6%] IntroducingGensim
writing output... [ 13%] IntroducingLda
writing output... [ 20%] IntroducingMallet
writing output... [ 26%] README
writing output... [ 33%] Visualizations
writing output... [ 40%] docs/CONTRIBUTING
writing output... [ 46%] docs/gen/dariah_topics
writing output... [ 53%] docs/gen/dariah_topics.doclist
writing output... [ 60%] docs/gen/dariah_topics.evaluation
writing output... [ 66%] docs/gen/dariah_topics.mallet
writing output... [ 73%] docs/gen/dariah_topics.meta
writing output... [ 80%] docs/gen/dariah_topics.preprocessing
writing output... [ 86%] docs/gen/dariah_topics.visualization
writing output... [ 93%] docs/gen/modules
writing output... [100%] index

/mnt/data/jenkins/workspace/DARIAH-Topics/docs/CONTRIBUTING.md: WARNING: Could not lex literal_block as "json". Highlighting skipped.
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.full_path:5: WARNING: 'any' reference target not found: Path
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.full_path:10: WARNING: 'any' reference target not found: Path()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.BaseDocList.get_docs:6: WARNING: 'any' reference target not found: _get_item(self, index)
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.full_path:5: WARNING: 'any' reference target not found: Path
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.full_path:10: WARNING: 'any' reference target not found: Path()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.get_docs:6: WARNING: 'any' reference target not found: _get_item(self, index)
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/doclist.py:docstring of dariah_topics.doclist.PathDocList.with_segment_files:5: WARNING: 'any' reference target not found: strings.Formatter
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/evaluation.py:docstring of dariah_topics.evaluation.read_dictionary:4: WARNING: 'any' reference target not found: create_dictionary()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/evaluation.py:docstring of dariah_topics.evaluation.read_sparse_bow:4: WARNING: 'any' reference target not found: create_sparse_bow()
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.import_tokenized_corpus:8: WARNING: 'any' reference target not found: tokenized_document
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.import_tokenized_corpus:8: WARNING: 'any' reference target not found: tokenized_corpus
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/mallet.py:docstring of dariah_topics.mallet.Mallet.import_tokenized_corpus:41: WARNING: 'any' reference target not found: replacements_files
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:3: WARNING: 'any' reference target not found: documents
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:7: WARNING: more than one target found for 'any' cross-reference 'segments': could be :py:meth:`dariah_topics.doclist.BaseDocList.segments` or :py:meth:`dariah_topics.doclist.PathDocList.segments`
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.AbstractCorpus.flatten_segments:7: WARNING: 'any' reference target not found: segments=True
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:3: WARNING: 'any' reference target not found: documents
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:7: WARNING: more than one target found for 'any' cross-reference 'segments': could be :py:meth:`dariah_topics.doclist.BaseDocList.segments` or :py:meth:`dariah_topics.doclist.PathDocList.segments`
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.TableCorpus.flatten_segments:7: WARNING: 'any' reference target not found: segments=True
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:5: WARNING: 'any' reference target not found: (?<name>pattern)
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:5: WARNING: 'any' reference target not found: _
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:5: WARNING: 'any' reference target not found: author
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/meta.py:docstring of dariah_topics.meta.fn2metadata:5: WARNING: 'any' reference target not found: title
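The long run of "'any' reference target not found" warnings above is characteristic of a Sphinx build with `default_role = 'any'`: single-backtick text such as `Path()` is interpreted as a cross-reference and Sphinx tries to resolve it. Wrapping literals in double backticks renders them as inline code and skips the lookup. A hedged illustration — the docstring below is an assumption, not read from the repository, and whether this project actually sets `default_role` would need checking in its `conf.py`:

```python
def full_path():
    """Return the resolved path.

    Bad:  with ``default_role = 'any'``, single backticks as in
    `Path()` become a cross-reference Sphinx must resolve.

    Good: double backticks as in ``Path()`` render as an inline
    literal and produce no warning.
    """
```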
generating indices... genindex py-modindex
highlighting module code... [ 16%] dariah_topics.doclist
highlighting module code... [ 33%] dariah_topics.evaluation
highlighting module code... [ 50%] dariah_topics.mallet
highlighting module code... [ 66%] dariah_topics.meta
highlighting module code... [ 83%] dariah_topics.preprocessing
highlighting module code... [100%] dariah_topics.visualization

writing additional pages... search
copying images... [ 12%] build/sphinx/doctrees/nbsphinx/IntroducingGensim_51_0.png
copying images... [ 25%] build/sphinx/doctrees/nbsphinx/IntroducingLda_58_0.png
copying images... [ 37%] build/sphinx/doctrees/nbsphinx/IntroducingMallet_60_0.png
copying images... [ 50%] build/sphinx/doctrees/nbsphinx/Visualizations_18_0.png
copying images... [ 62%] build/sphinx/doctrees/nbsphinx/Visualizations_20_0.png
copying images... [ 75%] build/sphinx/doctrees/nbsphinx/Visualizations_25_0.png
copying images... [ 87%] build/sphinx/doctrees/nbsphinx/Visualizations_27_0.png
copying images... [100%] build/sphinx/doctrees/nbsphinx/Visualizations_29_0.png

copying static files... WARNING: html_static_path entry '/mnt/data/jenkins/workspace/DARIAH-Topics/docs/_static' does not exist
done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory... done
build succeeded, 37 warnings.

The HTML pages are in build/sphinx/html.
[DocLinks] Copying Documentation to 1 ...
Recording test results
[Cobertura] Publishing Cobertura coverage report...

[Cobertura] Publishing Cobertura coverage results...

[Cobertura] Cobertura coverage report found.

[Set GitHub commit status (universal)] SUCCESS on repos [GHRepository@3ab6a544[description=A Python library for Topic Modeling.,homepage=,name=Topics,license=<null>,fork=true,size=38790,milestones={},language=Jupyter Notebook,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/octet-stream], Date=[Wed, 21 Feb 2018 09:35:35 GMT], ETag=["b2348a40d24fc9d904d886abd08499d6"], Last-Modified=[Fri, 02 Feb 2018 01:31:32 GMT], OkHttp-Received-Millis=[1519205735961], OkHttp-Response-Source=[CONDITIONAL_CACHE 304], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1519205735759], Server=[GitHub.com], Status=[304 Not Modified], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[E790:5073:6E8EB2:C8C498:5A8D3D67], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4972], X-RateLimit-Reset=[1519207805], X-Runtime-rack=[0.090151], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:375c018) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/375c0180d49816b1d3007d2d505b8e3ef8c0633d
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] scored: 1.0
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: SUCCESS