Console Output

Started by GitHub push by Philipduerholt
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials 
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 07a4caa3504af1636f35798bf13cdcb5a64139dc (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 07a4caa3504af1636f35798bf13cdcb5a64139dc
 > git rev-list 556ec1b65885f4e42422beb88e3bb31ea8ece16a # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda1585547556436106302.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
  Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
  Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pandas-0.19.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting matplotlib==1.5.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached matplotlib-1.5.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached numpy-1.12.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting scipy>=0.7 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached scipy-0.18.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting werkzeug>=0.11.15 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Werkzeug-0.11.15-py2.py3-none-any.whl
Collecting flask>=0.11.1 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Flask-0.12-py2.py3-none-any.whl
Collecting wikipedia>=1.4.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting lxml>=3.6.4 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached lxml-3.7.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pyLDAvis>=2.0.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
  Using cached coverage-4.3.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
  Using cached nbconvert-5.1.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
  Using cached notebook-4.4.1-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipywidgets-5.2.2-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_console-5.1.0-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipykernel-4.5.2-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
  Using cached qtconsole-4.2.1-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pytz-2016.10-py2.py3-none-any.whl
Requirement already up-to-date: six>=1.5.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting cycler (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Requirement already up-to-date: pyparsing!=2.0.0,!=2.0.4,!=2.1.2,>=1.5.6 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting Jinja2>=2.4 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Jinja2-2.9.5-py2.py3-none-any.whl
Collecting beautifulsoup4 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached beautifulsoup4-4.5.3-py3-none-any.whl
Collecting requests<3.0.0,>=2.0.0 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached requests-2.13.0-py2.py3-none-any.whl
Collecting joblib>=0.8.4 (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached joblib-0.10.3-py2.py3-none-any.whl
Collecting pytest (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pytest-3.0.6-py2.py3-none-any.whl
Requirement already up-to-date: wheel>=0.23.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting numexpr (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached numexpr-2.6.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting future (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting funcy (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting jupyter-core (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached entrypoints-0.2.2-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached testpath-0.3-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached bleach-1.5.0-py2.py3-none-any.whl
Collecting pygments (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting traitlets>=4.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached traitlets-4.3.1-py2.py3-none-any.whl
Collecting nbformat (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached nbformat-4.2.0-py2.py3-none-any.whl
Collecting mistune!=0.6 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached mistune-0.7.3-py2.py3-none-any.whl
Collecting tornado>=4 (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting jupyter-client (from notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_client-4.4.0-py2.py3-none-any.whl
Collecting ipython-genutils (from notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython_genutils-0.1.0-py2.py3-none-any.whl
Collecting ipython>=4.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython-5.2.2-py3-none-any.whl
Collecting widgetsnbextension>=1.2.6 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached widgetsnbextension-1.2.6-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached prompt_toolkit-1.0.13-py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached boto-2.45.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.4->flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting py>=1.4.29 (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached py-1.4.32-py2.py3-none-any.whl
Collecting html5lib!=0.9999,!=0.99999,<0.99999999,>=0.999 (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting decorator (from traitlets>=4.2->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached decorator-4.0.11-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached ptyprocess-0.5.1-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Requirement already up-to-date: packaging>=16.8 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: appdirs>=1.4.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Installing collected packages: nose, coverage, nosexcover, decorator, ipython-genutils, traitlets, jupyter-core, entrypoints, testpath, pandocfilters, html5lib, bleach, pygments, MarkupSafe, Jinja2, jsonschema, nbformat, mistune, nbconvert, tornado, ptyprocess, terminado, pyzmq, jupyter-client, wcwidth, prompt-toolkit, pexpect, pickleshare, simplegeneric, ipython, ipykernel, notebook, widgetsnbextension, ipywidgets, jupyter-console, qtconsole, jupyter, python-dateutil, numpy, pytz, pandas, regex, scipy, bz2file, boto, requests, smart-open, gensim, cycler, matplotlib, werkzeug, itsdangerous, click, flask, beautifulsoup4, wikipedia, lxml, joblib, py, pytest, numexpr, future, funcy, pyLDAvis, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.5 MarkupSafe-0.23 beautifulsoup4-4.5.3 bleach-1.5.0 boto-2.45.0 bz2file-0.98 click-6.7 coverage-4.3.4 cycler-0.10.0 dariah-topics decorator-4.0.11 entrypoints-0.2.2 flask-0.12 funcy-1.7.3 future-0.16.0 gensim-0.13.4.1 html5lib-0.9999999 ipykernel-4.5.2 ipython-5.2.2 ipython-genutils-0.1.0 ipywidgets-5.2.2 itsdangerous-0.24 joblib-0.10.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-4.4.0 jupyter-console-5.1.0 jupyter-core-4.3.0 lxml-3.7.2 matplotlib-1.5.3 mistune-0.7.3 nbconvert-5.1.1 nbformat-4.2.0 nose-1.3.7 nosexcover-1.0.11 notebook-4.4.1 numexpr-2.6.2 numpy-1.12.0 pandas-0.19.2 pandocfilters-1.4.1 pexpect-4.2.1 pickleshare-0.7.4 prompt-toolkit-1.0.13 ptyprocess-0.5.1 py-1.4.32 pyLDAvis-2.1.1 pygments-2.2.0 pytest-3.0.6 python-dateutil-2.6.0 pytz-2016.10 pyzmq-16.0.2 qtconsole-4.2.1 regex-2017.2.8 requests-2.13.0 scipy-0.18.1 simplegeneric-0.8.1 smart-open-1.4.0 terminado-0.6 testpath-0.3 tornado-4.4.2 traitlets-4.3.1 wcwidth-0.1.7 werkzeug-0.11.15 widgetsnbextension-1.2.6 wikipedia-1.4.0
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing requirements to dariah_topics.egg-info/requires.txt
writing dariah_topics.egg-info/PKG-INFO
writing top-level names to dariah_topics.egg-info/top_level.txt
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:333: UserWarning: Normalizing '0.2.0dev0' to '0.2.0.dev0'
  normalized_version,
warning: sdist: standard file not found: should have one of README, README.rst, README.txt

running check
warning: check: missing required meta-data: url
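A minimal setup.py sketch (not taken from the repository; the project's actual setup.py may differ) of the metadata that the two warnings above and the version-normalization UserWarning point to. Adding a README.rst next to setup.py would address the sdist warning.

    from setuptools import setup, find_packages

    setup(
        name='dariah_topics',
        version='0.2.0.dev0',                        # already PEP-440 normalized, avoids the UserWarning
        url='https://github.com/DARIAH-DE/Topics',   # supplies the "missing required meta-data: url"
        packages=find_packages(),                    # find_packages() is an assumption about the layout
    )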

creating dariah_topics-0.2.0.dev0
creating dariah_topics-0.2.0.dev0/dariah_topics
creating dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying files to dariah_topics-0.2.0.dev0...
copying setup.cfg -> dariah_topics-0.2.0.dev0
copying setup.py -> dariah_topics-0.2.0.dev0
copying dariah_topics/__init__.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/model_creation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
Writing dariah_topics-0.2.0.dev0/setup.cfg
Creating tar archive
removing 'dariah_topics-0.2.0.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/testing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0.dist-info/WHEEL
+ nosetests
.EE..
======================================================================
ERROR: Failure: RuntimeError (Working outside of request context.

This typically means that you attempted to use functionality that needed
an active HTTP request.  Consult the documentation on testing for
information about how to avoid this problem.)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/failure.py", line 39, in runTest
    raise self.exc_val.with_traceback(self.tb)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/plugins/manager.py", line 154, in generate
    for r in result:
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/plugins/doctests.py", line 228, in loadTestsFromModule
    tests = self.finder.find(module)
  File "/usr/lib/python3.5/doctest.py", line 924, in find
    self._find(tests, obj, name, module, source_lines, globs, {})
  File "/usr/lib/python3.5/doctest.py", line 983, in _find
    if ((inspect.isroutine(inspect.unwrap(val))
  File "/usr/lib/python3.5/inspect.py", line 470, in unwrap
    while _is_wrapper(func):
  File "/usr/lib/python3.5/inspect.py", line 464, in _is_wrapper
    return hasattr(f, '__wrapped__')
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/local.py", line 343, in __getattr__
    return getattr(self._get_current_object(), name)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/local.py", line 302, in _get_current_object
    return self.__local()
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/globals.py", line 37, in _lookup_req_object
    raise RuntimeError(_request_ctx_err_msg)
RuntimeError: Working outside of request context.

This typically means that you attempted to use functionality that needed
an active HTTP request.  Consult the documentation on testing for
information about how to avoid this problem.
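A minimal sketch of the Flask pattern this error message refers to, not code from the project: flask.request is a werkzeug LocalProxy, and touching it outside an active request (as nose's doctest finder does in the traceback above, when hasattr(f, '__wrapped__') triggers the proxy lookup) raises exactly this RuntimeError; inside app.test_request_context() the proxy resolves normally.

    from flask import Flask, request

    app = Flask(__name__)

    with app.test_request_context('/?q=topics'):
        # inside this block the proxy is bound to a real request object
        print(request.args.get('q'))   # -> 'topics'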

======================================================================
ERROR: Integration test notebook (via Jupyter)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/integration_test.py", line 20, in jupyter_integration_test
    stderr=STDOUT, universal_newlines=True)
  File "/usr/lib/python3.5/subprocess.py", line 316, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 398, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['jupyter-nbconvert', '--execute', '--log-level=ERROR', '--ExecutePreprocessor.iopub_timeout=30', '--ExecutePreprocessor.timeout=None', '/mnt/data/jenkins/workspace/DARIAH-Topics/IntegrationTest_v01.ipynb']' returned non-zero exit status 1
-------------------- >> begin captured logging << --------------------
root: ERROR: An error occurred while executing the following cell:
------------------
#num_docs = max(sparse_bow.index.get_level_values("doc_id"))
#num_types = max(sparse_bow.index.get_level_values("token_id"))
#sum_counts = sum(sparse_bow[0])

#header_string = str(num_docs) + " " + str(num_types) + " " + str(sum_counts) + "\n"

#with open("gb_plain.mm", 'w', encoding = "utf-8") as f:
#    pass

#with open("gb_plain.mm", 'a', encoding = "utf-8") as f:
#    f.write("%%MatrixMarket matrix coordinate real general\n")
#    f.write(header_string)
#    sparse_bow.to_csv( f, sep = ' ', header = None)

pre.save_bow_mm(sparse_bow)
------------------

TypeError: save_bow_mm() missing 1 required positional argument: 'output_path'


--------------------- >> end captured logging << ---------------------
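A sketch of the corrected notebook call implied by the TypeError above, assuming the intended target file is the "gb_plain.mm" used in the commented-out lines of the cell; nothing beyond the required 'output_path' argument of save_bow_mm is shown in the log.

    # pass the output path that the commented-out code wrote by hand
    pre.save_bow_mm(sparse_bow, output_path="gb_plain.mm")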

Name                              Stmts   Miss  Cover
-----------------------------------------------------
dariah_topics.py                      0      0   100%
dariah_topics/evaluation.py         100     75    25%
dariah_topics/mallet.py             133    105    21%
dariah_topics/model_creation.py      60     43    28%
dariah_topics/preprocessing.py      151     59    61%
dariah_topics/visualization.py      146    119    18%
-----------------------------------------------------
TOTAL                               590    401    32%
----------------------------------------------------------------------
Ran 5 tests in 13.218s

FAILED (errors=2)
Build step 'Virtualenv Builder' marked build as failure
Recording test results
Skipping Cobertura coverage report as build was not UNSTABLE or better ...
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@25eb58c9[description=<null>,homepage=<null>,name=Topics,license=<null>,fork=true,size=31530,milestones={},language=Jupyter Notebook,commits={},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:07a4caa) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/07a4caa3504af1636f35798bf13cdcb5a64139dc
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
Finished: FAILURE