Started by GitHub push by severinsimmler
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
> git --version # timeout=10
using GIT_ASKPASS to set credentials
> git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
> git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 9cf5f1c3ef9f95064e9429fcd96289d764ba2723 (refs/remotes/origin/testing)
> git config core.sparsecheckout # timeout=10
> git checkout -f 9cf5f1c3ef9f95064e9429fcd96289d764ba2723
> git rev-list f468e2afeaeca91e329200f3eb5cb89d226a4e90 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda1803528996566293056.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached pandas-0.19.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting matplotlib==1.5.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached matplotlib-1.5.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached numpy-1.12.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting scipy>=0.7 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached scipy-0.19.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting werkzeug>=0.11.15 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached Werkzeug-0.12.1-py2.py3-none-any.whl
Collecting flask>=0.11.1 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached Flask-0.12.1-py2.py3-none-any.whl
Collecting wikipedia>=1.4.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting lxml>=3.6.4 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached lxml-3.7.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting pyLDAvis>=2.0.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
Using cached coverage-4.3.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
Using cached ipywidgets-6.0.0-py2.py3-none-any.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
Using cached nbconvert-5.1.1-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
Using cached ipykernel-4.6.1-py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
Using cached jupyter_console-5.1.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
Using cached qtconsole-4.3.0-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
Using cached notebook-5.0.0-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached pytz-2017.2-py2.py3-none-any.whl
Requirement already up-to-date: six>=1.5.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: pyparsing!=2.0.0,!=2.0.4,!=2.1.2,>=1.5.6 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting cycler (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached click-6.7-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting Jinja2>=2.4 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached Jinja2-2.9.6-py2.py3-none-any.whl
Collecting beautifulsoup4 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached beautifulsoup4-4.5.3-py3-none-any.whl
Collecting requests<3.0.0,>=2.0.0 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached requests-2.13.0-py2.py3-none-any.whl
Collecting pytest (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached pytest-3.0.7-py2.py3-none-any.whl
Collecting joblib>=0.8.4 (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached joblib-0.11-py2.py3-none-any.whl
Collecting future (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting numexpr (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached numexpr-2.6.2-cp35-cp35m-manylinux1_x86_64.whl
Requirement already up-to-date: wheel>=0.23.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting funcy (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting ipython>=4.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached ipython-5.3.0-py3-none-any.whl
Collecting traitlets>=4.3.1 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting nbformat>=4.2.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached nbformat-4.3.0-py2.py3-none-any.whl
Collecting widgetsnbextension~=2.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached widgetsnbextension-2.0.0-py2.py3-none-any.whl
Collecting pygments (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached testpath-0.3-py2.py3-none-any.whl
Collecting mistune!=0.6 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached mistune-0.7.4-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached entrypoints-0.2.2-py2.py3-none-any.whl
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached bleach-2.0.0-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting jupyter-core (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting jupyter-client (from ipykernel->jupyter->-r requirements-dev.txt (line 4))
Using cached jupyter_client-5.0.1-py2.py3-none-any.whl
Collecting tornado>=4.0 (from ipykernel->jupyter->-r requirements-dev.txt (line 4))
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached prompt_toolkit-1.0.14-py3-none-any.whl
Collecting ipython-genutils (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached boto-2.46.1-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting MarkupSafe>=0.23 (from Jinja2>=2.4->flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting py>=1.4.29 (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached py-1.4.33-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting decorator (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached decorator-4.0.11-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat>=4.2.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting html5lib>=0.99999999 (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached html5lib-0.999999999-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->ipykernel->jupyter->-r requirements-dev.txt (line 4))
Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
Using cached ptyprocess-0.5.1-py2.py3-none-any.whl
Requirement already up-to-date: appdirs>=1.4.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: packaging>=16.8 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting webencodings (from html5lib>=0.99999999->bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached webencodings-0.5.1-py2.py3-none-any.whl
Installing collected packages: nose, coverage, nosexcover, pygments, decorator, ipython-genutils, traitlets, simplegeneric, wcwidth, prompt-toolkit, ptyprocess, pexpect, pickleshare, ipython, python-dateutil, jupyter-core, pyzmq, jupyter-client, tornado, ipykernel, jsonschema, nbformat, testpath, mistune, MarkupSafe, Jinja2, entrypoints, webencodings, html5lib, bleach, pandocfilters, nbconvert, terminado, notebook, widgetsnbextension, ipywidgets, jupyter-console, qtconsole, jupyter, numpy, pytz, pandas, regex, scipy, requests, boto, bz2file, smart-open, gensim, cycler, matplotlib, werkzeug, click, itsdangerous, flask, beautifulsoup4, wikipedia, lxml, py, pytest, joblib, future, numexpr, funcy, pyLDAvis, dariah-topics
Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.6 MarkupSafe-1.0 beautifulsoup4-4.5.3 bleach-2.0.0 boto-2.46.1 bz2file-0.98 click-6.7 coverage-4.3.4 cycler-0.10.0 dariah-topics decorator-4.0.11 entrypoints-0.2.2 flask-0.12.1 funcy-1.7.3 future-0.16.0 gensim-2.0.0 html5lib-0.999999999 ipykernel-4.6.1 ipython-5.3.0 ipython-genutils-0.2.0 ipywidgets-6.0.0 itsdangerous-0.24 joblib-0.11 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.0.1 jupyter-console-5.1.0 jupyter-core-4.3.0 lxml-3.7.3 matplotlib-1.5.3 mistune-0.7.4 nbconvert-5.1.1 nbformat-4.3.0 nose-1.3.7 nosexcover-1.0.11 notebook-5.0.0 numexpr-2.6.2 numpy-1.12.1 pandas-0.19.2 pandocfilters-1.4.1 pexpect-4.2.1 pickleshare-0.7.4 prompt-toolkit-1.0.14 ptyprocess-0.5.1 py-1.4.33 pyLDAvis-2.1.1 pygments-2.2.0 pytest-3.0.7 python-dateutil-2.6.0 pytz-2017.2 pyzmq-16.0.2 qtconsole-4.3.0 regex-2017.4.5 requests-2.13.0 scipy-0.19.0 simplegeneric-0.8.1 smart-open-1.5.2 terminado-0.6 testpath-0.3 tornado-4.5 traitlets-4.3.2 wcwidth-0.1.7 webencodings-0.5.1 werkzeug-0.12.1 widgetsnbextension-2.0.0 wikipedia-1.4.0
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing dariah_topics.egg-info/PKG-INFO
writing requirements to dariah_topics.egg-info/requires.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:334: UserWarning: Normalizing '0.2.0dev0' to '0.2.0.dev0'
  normalized_version,
warning: sdist: standard file not found: should have one of README, README.rst, README.txt
running check
warning: check: missing required meta-data: url
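The two packaging warnings above are straightforward to address: sdist expects a README, README.rst, or README.txt next to setup.py, and check wants a url field in the package metadata. A minimal sketch of the relevant setup() arguments, assuming the GitHub repository built by this job is the intended homepage and that a README.rst is added to the project root:

import io
from setuptools import setup, find_packages

setup(
    name='dariah_topics',
    version='0.2.0.dev0',
    # silences "warning: check: missing required meta-data: url"
    url='https://github.com/DARIAH-DE/Topics',
    packages=find_packages(exclude=['test']),
    # a README.rst at the project root silences the sdist warning;
    # exposing it as the long description is optional
    long_description=io.open('README.rst', encoding='utf-8').read(),
)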
creating dariah_topics-0.2.0.dev0
creating dariah_topics-0.2.0.dev0/dariah_topics
creating dariah_topics-0.2.0.dev0/dariah_topics.egg-info
creating dariah_topics-0.2.0.dev0/test
copying files to dariah_topics-0.2.0.dev0...
copying setup.cfg -> dariah_topics-0.2.0.dev0
copying setup.py -> dariah_topics-0.2.0.dev0
copying dariah_topics/__init__.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/model_creation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.2.0.dev0/test
Writing dariah_topics-0.2.0.dev0/setup.cfg
Creating tar archive
removing 'dariah_topics-0.2.0.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0.dist-info/WHEEL
+ nosetests
........F.................................
======================================================================
FAIL: Doctest: dariah_topics.preprocessing.gensim2dataframe
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.gensim2dataframe
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 755, in gensim2dataframe
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 779, in dariah_topics.preprocessing.gensim2dataframe
Failed example:
    isinstance(gensim2dataframe(model, 2), pd.DataFrame)
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "<doctest dariah_topics.preprocessing.gensim2dataframe[6]>", line 1, in <module>
        isinstance(gensim2dataframe(model, 2), pd.DataFrame)
      File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 789, in gensim2dataframe
        topics_df.loc[idx] = temp
      File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/indexing.py", line 141, in __setitem__
        self._setitem_with_indexer(indexer, value)
      File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/indexing.py", line 579, in _setitem_with_indexer
        value=value)
      File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/internals.py", line 3168, in setitem
        return self.apply('setitem', **kwargs)
      File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/internals.py", line 3056, in apply
        applied = getattr(b, f)(**kwargs)
      File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/internals.py", line 740, in setitem
        values[indexer] = value
    ValueError: cannot copy sequence with size 4 to array axis with dimension 2
-------------------- >> begin captured logging << --------------------
gensim.corpora.dictionary: INFO: adding document #0 to Dictionary(0 unique tokens: [])
gensim.corpora.dictionary: INFO: built Dictionary(4 unique tokens: ['corpus', 'test', 'for', 'testing']) from 2 documents (total 4 corpus positions)
gensim.models.ldamodel: INFO: using symmetric alpha at 1.0
gensim.models.ldamodel: INFO: using symmetric eta at 0.25
gensim.models.ldamodel: INFO: using serial LDA version on this node
gensim.models.ldamodel: INFO: running online LDA training, 1 topics, 1 passes over the supplied corpus of 2 documents, updating model once every 2 documents, evaluating perplexity every 2 documents, iterating 1x with a convergence threshold of 0.001000
gensim.models.ldamodel: WARNING: too few updates, training might not converge; consider increasing the number of passes or iterations to improve accuracy
gensim.models.ldamodel: DEBUG: bound: at document #0
gensim.models.ldamodel: INFO: -1.688 per-word bound, 3.2 perplexity estimate based on a held-out corpus of 2 documents with 4 words
gensim.models.ldamodel: INFO: PROGRESS: pass 0, at document #2/2
gensim.models.ldamodel: DEBUG: performing inference on a chunk of 2 documents
gensim.models.ldamodel: DEBUG: 0/2 documents converged within 1 iterations
gensim.models.ldamodel: DEBUG: updating topics
gensim.models.ldamodel: INFO: topic #0 (1.000): 0.250*"corpus" + 0.250*"test" + 0.250*"for" + 0.250*"testing"
gensim.models.ldamodel: INFO: topic diff=0.279796, rho=1.000000
--------------------- >> end captured logging << ---------------------
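The ValueError above is a plain pandas shape mismatch: gensim2dataframe apparently creates topics_df with two columns and then assigns a four-element row to it via .loc, so the values cannot be copied into the row. A minimal sketch reproducing that behaviour without gensim (column names and row values are illustrative only):

import pandas as pd

# Two-column frame with a pre-created row, as the traceback suggests
# gensim2dataframe sets up before filling it.
topics_df = pd.DataFrame(index=range(1), columns=['word_0', 'word_1'])

# Four values for a two-slot row, e.g. all tokens of the single trained topic.
temp = ['corpus', 'test', 'for', 'testing']

# On pandas 0.19.x this raises
#   ValueError: cannot copy sequence with size 4 to array axis with dimension 2
# (newer pandas versions report the mismatch with different wording).
topics_df.loc[0] = temp

Making the number of columns match the number of values per topic, or trimming the row to the frame's width, would avoid the error.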
Name                              Stmts   Miss  Cover
-----------------------------------------------------
dariah_topics.py                      0      0   100%
dariah_topics/doclist.py             89     14    84%
dariah_topics/evaluation.py         100     80    20%
dariah_topics/mallet.py             216    173    20%
dariah_topics/model_creation.py      60     43    28%
dariah_topics/preprocessing.py      202      8    96%
dariah_topics/visualization.py      165    136    18%
-----------------------------------------------------
TOTAL                               832    454    45%
----------------------------------------------------------------------
Ran 42 tests in 18.178s
FAILED (failures=1)
Build step 'Virtualenv Builder' marked build as failure
Recording test results
Skipping Cobertura coverage report as build was not UNSTABLE or better ...
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@4e190373[description=A python library for topic modeling.,homepage=,name=Topics,license=<null>,fork=true,size=96681,milestones={},language=Jupyter Notebook,commits={},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:9cf5f1c) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/9cf5f1c3ef9f95064e9429fcd96289d764ba2723
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
Finished: FAILURE