Started by GitHub push by severinsimmler
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision e49c66582d4f2a403adbf9ac2de6722beb6185c9 (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e49c66582d4f2a403adbf9ac2de6722beb6185c9
Commit message: "Update IntroducingMallet.ipynb"
 > git rev-list cf1e1a212a7450a0b32830747125168b590e2dc5 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda8019793859000043606.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
  Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
  Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 5))
  Using cached Sphinx-1.6.4-py2.py3-none-any.whl
Collecting nbsphinx (from -r requirements-dev.txt (line 6))
  Using cached nbsphinx-0.2.14-py2.py3-none-any.whl
Collecting recommonmark (from -r requirements-dev.txt (line 7))
  Using cached recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pandas-0.20.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting lda>=1.0.5 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached numpy-1.13.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting lxml>=3.6.4 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached lxml-4.1.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting flask>=0.11.1 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached Flask-0.12.2-py2.py3-none-any.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached matplotlib-2.1.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting bokeh>=0.12.6 (from dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
  Using cached coverage-4.4.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
  Using cached nbconvert-5.3.1-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
  Using cached qtconsole-4.3.1-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipykernel-4.6.1-py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
  Using cached notebook-5.2.0-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
  Downloading ipywidgets-7.0.2-py2.py3-none-any.whl (66kB)
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached docutils-0.14-py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached Jinja2-2.9.6-py2.py3-none-any.whl
Collecting six>=1.5 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached six-1.11.0-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 5))
  Using cached imagesize-0.7.1-py2.py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached snowballstemmer-1.2.1-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from sphinx->-r requirements-dev.txt (line 5))
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached Babel-2.5.1-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 5))
  Using cached alabaster-0.7.10-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 5))
  Using cached sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 6))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting nbformat (from nbsphinx->-r requirements-dev.txt (line 6))
  Using cached nbformat-4.4.0-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 7))
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pytz-2017.2-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached scipy-0.19.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting Werkzeug>=0.7 (from flask>=0.11.1->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached Werkzeug-0.12.2-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting mistune>=0.7.4 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached mistune-0.7.4-py2.py3-none-any.whl
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached bleach-2.1.1-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting jupyter-core (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting ipython-genutils (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting jupyter-client>=4.1 (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_client-5.1.0-py2.py3-none-any.whl
Collecting ipython>=4.0.0 (from ipykernel->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython-6.2.1-py3-none-any.whl
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached prompt_toolkit-1.0.15-py3-none-any.whl
Collecting widgetsnbextension~=3.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Downloading widgetsnbextension-3.0.4-py2.py3-none-any.whl (2.5MB)
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached certifi-2017.7.27.1-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 5))
  Using cached idna-2.6-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 5))
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 6))
  Using cached decorator-4.1.2-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbsphinx->-r requirements-dev.txt (line 6))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
  Using cached boto-2.48.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.3.0.dev1->-r requirements.txt (line 1))
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached html5lib-1.0b10-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client>=4.1->qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pickleshare (from ipython>=4.0.0->ipykernel->jupyter->-r requirements-dev.txt (line 4))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->jupyter->-r requirements-dev.txt (line 4))
  Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->jupyter->-r requirements-dev.txt (line 4))
  Using cached jedi-0.11.0-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->jupyter->-r requirements-dev.txt (line 4))
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Collecting parso==0.1.0 (from jedi>=0.10->ipython>=4.0.0->ipykernel->jupyter->-r requirements-dev.txt (line 4))
  Using cached parso-0.1.0-py2.py3-none-any.whl
Installing collected packages: nose, coverage, nosexcover, entrypoints, pandocfilters, decorator, six, ipython-genutils, traitlets, jsonschema, jupyter-core, nbformat, MarkupSafe, Jinja2, mistune, Pygments, webencodings, html5lib, bleach, testpath, nbconvert, pyzmq, python-dateutil, jupyter-client, tornado, pickleshare, ptyprocess, pexpect, parso, jedi, simplegeneric, wcwidth, prompt-toolkit, ipython, ipykernel, qtconsole, terminado, notebook, jupyter-console, widgetsnbextension, ipywidgets, jupyter, certifi, chardet, urllib3, idna, requests, docutils, imagesize, snowballstemmer, pytz, babel, alabaster, sphinxcontrib-websupport, sphinx, nbsphinx, commonmark, recommonmark, numpy, pandas, regex, boto, bz2file, smart-open, scipy, gensim, pbr, lda, lxml, click, Werkzeug, itsdangerous, flask, pyparsing, cycler, matplotlib, PyYAML, bokeh, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.6 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Werkzeug-0.12.2 alabaster-0.7.10 babel-2.5.1 bleach-2.1.1 bokeh-0.12.10 boto-2.48.0 bz2file-0.98 certifi-2017.7.27.1 chardet-3.0.4 click-6.7 commonmark-0.5.4 coverage-4.4.1 cycler-0.10.0 dariah-topics decorator-4.1.2 docutils-0.14 entrypoints-0.2.3 flask-0.12.2 gensim-3.0.1 html5lib-1.0b10 idna-2.6 imagesize-0.7.1 ipykernel-4.6.1 ipython-6.2.1 ipython-genutils-0.2.0 ipywidgets-7.0.2 itsdangerous-0.24 jedi-0.11.0 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.1.0 jupyter-console-5.2.0 jupyter-core-4.3.0 lda-1.0.5 lxml-4.1.0 matplotlib-2.1.0 mistune-0.7.4 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.2.14 nose-1.3.7 nosexcover-1.0.11 notebook-5.2.0 numpy-1.13.3 pandas-0.20.3 pandocfilters-1.4.2 parso-0.1.0 pbr-3.1.1 pexpect-4.2.1 pickleshare-0.7.4 prompt-toolkit-1.0.15 ptyprocess-0.5.2 pyparsing-2.2.0 python-dateutil-2.6.1 pytz-2017.2 pyzmq-16.0.2 qtconsole-4.3.1 recommonmark-0.4.0 regex-2017.9.23 requests-2.18.4 scipy-0.19.1 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.3 snowballstemmer-1.2.1 sphinx-1.6.4 sphinxcontrib-websupport-1.0.1 terminado-0.6 testpath-0.3.1 tornado-4.5.2 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.0.4
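The resolver above pins the full scientific stack for dariah-topics 0.3.0.dev1 into the fresh virtualenv. A minimal smoke check of that environment (not part of this job; a hedged sketch, the package names are taken from the "Successfully installed" line above) could look like:

    # smoke_check.py -- hypothetical helper, not part of the repository
    import importlib

    # Import a few of the pinned packages and report their versions;
    # an ImportError here would mean the virtualenv install is broken.
    for name in ("numpy", "pandas", "gensim", "lda", "flask", "bokeh"):
        module = importlib.import_module(name)
        print(name, getattr(module, "__version__", "unknown"))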
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing top-level names to dariah_topics.egg-info/top_level.txt
writing requirements to dariah_topics.egg-info/requires.txt
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing dariah_topics.egg-info/PKG-INFO
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.3.0.dev1
creating dariah_topics-0.3.0.dev1/dariah_topics
creating dariah_topics-0.3.0.dev1/dariah_topics.egg-info
creating dariah_topics-0.3.0.dev1/test
copying files to dariah_topics-0.3.0.dev1...
copying README.md -> dariah_topics-0.3.0.dev1
copying setup.cfg -> dariah_topics-0.3.0.dev1
copying setup.py -> dariah_topics-0.3.0.dev1
copying dariah_topics/__init__.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.3.0.dev1/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.3.0.dev1/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.3.0.dev1/test
Writing dariah_topics-0.3.0.dev1/setup.cfg
Creating tar archive
removing 'dariah_topics-0.3.0.dev1' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
copying dariah_topics/meta.py -> build/lib/dariah_topics
copying dariah_topics/mallet.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.3.0.dev1-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.3.0.dev1.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-0.3.0.dev1-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/doclist.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/mallet.py'
adding 'dariah_topics/meta.py'
adding 'dariah_topics/model_creation.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-0.3.0.dev1.dist-info/DESCRIPTION.rst'
adding 'dariah_topics-0.3.0.dev1.dist-info/metadata.json'
adding 'dariah_topics-0.3.0.dev1.dist-info/top_level.txt'
adding 'dariah_topics-0.3.0.dev1.dist-info/WHEEL'
adding 'dariah_topics-0.3.0.dev1.dist-info/METADATA'
adding 'dariah_topics-0.3.0.dev1.dist-info/RECORD'
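The sdist check above warns that the package metadata lacks a URL ("warning: check: missing required meta-data: url"). A hedged sketch of how the setup() call in setup.py could declare it; only the name and version are confirmed by this log, the URL is the repository this job fetches from, and the remaining keywords are assumptions:

    # setup.py (sketch) -- only name/version appear in this log; everything else is assumed
    from setuptools import setup, find_packages

    setup(
        name="dariah_topics",
        version="0.3.0.dev1",
        url="https://github.com/DARIAH-DE/Topics",  # declaring this silences the metadata warning
        packages=find_packages(exclude=["test"]),
    )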
+ nosetests --cover-html
.E....................E
======================================================================
ERROR: test_topic_modeling (demonstrator_test.DemonstratorTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/demonstrator_test.py", line 57, in test_topic_modeling
    resp = self.app.post('/upload', data=data)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 801, in post
    return self.open(*args, **kw)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/testing.py", line 127, in open
    follow_redirects=follow_redirects)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 764, in open
    response = self.run_wsgi_app(environ, buffered=buffered)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 677, in run_wsgi_app
    rv = run_wsgi_app(self.application, environ, buffered=buffered)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 884, in run_wsgi_app
    app_rv = app(environ, start_response)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1997, in __call__
    return self.wsgi_app(environ, start_response)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1985, in wsgi_app
    response = self.handle_exception(e)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1540, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1982, in wsgi_app
    response = self.full_dispatch_request()
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1614, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1517, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1612, in full_dispatch_request
    rv = self.dispatch_request()
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1598, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "demonstrator/demonstrator.py", line 131, in upload_file
    doc_term_matrix = preprocessing.create_doc_term_matrix(
AttributeError: module 'dariah_topics.preprocessing' has no attribute 'create_doc_term_matrix'
-------------------- >> begin captured stdout << ---------------------
Accessing user input ...
1 text files.
1 topics.
1 iterations.
Using external stopwords list.
Tokenizing <FileStorage: 'document.txt' ('text/plain')> ...
--------------------- >> end captured stdout << ----------------------
-------------------- >> begin captured logging << --------------------
dariah_topics.preprocessing: INFO: Tokenizing document ...
dariah_topics.preprocessing: DEBUG: Lowering all characters ...
--------------------- >> end captured logging << ---------------------
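The first failure is an API mismatch: demonstrator/demonstrator.py line 131 calls preprocessing.create_doc_term_matrix, but the installed dariah_topics.preprocessing no longer exposes that name. A hedged regression-test sketch that fails fast on such renames (the expected attribute name is taken from the traceback above, not from the library's documentation):

    # hypothetical test, not part of the suite run in this build
    import unittest

    from dariah_topics import preprocessing


    class DemonstratorApiTest(unittest.TestCase):
        def test_expected_entry_point_exists(self):
            # demonstrator/demonstrator.py line 131 expects this function (see traceback above)
            self.assertTrue(
                hasattr(preprocessing, "create_doc_term_matrix"),
                "dariah_topics.preprocessing no longer provides create_doc_term_matrix; "
                "update demonstrator.py or restore the function")


    if __name__ == "__main__":
        unittest.main()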
======================================================================
ERROR: segment convenience wrapper
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/test_fuzzy_segmenting.py", line 104, in test_segment
    text = path.read_text(encoding='utf-8')
  File "/usr/lib/python3.5/pathlib.py", line 1164, in read_text
    with self.open(mode='r', encoding=encoding, errors=errors) as f:
  File "/usr/lib/python3.5/pathlib.py", line 1151, in open
    opener=self._opener)
  File "/usr/lib/python3.5/pathlib.py", line 1005, in _opener
    return self._accessor.open(self, flags, mode)
  File "/usr/lib/python3.5/pathlib.py", line 371, in wrapped
    return strfunc(str(pathobj), *args)
FileNotFoundError: [Errno 2] No such file or directory: '/mnt/data/jenkins/workspace/DARIAH-Topics/grenzboten_sample/Grenzboten_1844_Tagebuch_56.txt'
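The second failure is environmental: the grenzboten_sample corpus is not checked out in this workspace, so test_segment errors on a missing file. A hedged sketch of guarding the test so an absent fixture is reported as a skip rather than an error (only the path and the pathlib usage come from the traceback; the rest of the test body is assumed):

    # test/test_fuzzy_segmenting.py -- sketch of a guard, hypothetical
    from pathlib import Path
    from unittest import SkipTest

    SAMPLE = Path("grenzboten_sample/Grenzboten_1844_Tagebuch_56.txt")


    def test_segment():
        if not SAMPLE.exists():
            # nose reports a raised SkipTest as a skip instead of an error
            raise SkipTest("grenzboten_sample corpus not available in this workspace")
        text = SAMPLE.read_text(encoding="utf-8")
        assert text  # placeholder for the real segmenting assertions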
Name                              Stmts   Miss  Cover
-----------------------------------------------------
dariah_topics/__init__.py             7      0   100%
dariah_topics/doclist.py             93      8    91%
dariah_topics/meta.py                57     31    46%
dariah_topics/preprocessing.py      224     34    85%
dariah_topics/visualization.py      176     98    44%
-----------------------------------------------------
TOTAL                               557    171    69%
----------------------------------------------------------------------
Ran 23 tests in 21.468s

FAILED (errors=2)
Build step 'Virtualenv Builder' marked build as failure
Recording test results
[Cobertura] Publishing Cobertura coverage report...
Publishing Cobertura coverage results...
Cobertura coverage report found.
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@3b990ed[description=A Python library for Text Mining and Topic Modeling.,homepage=,name=Topics,license=<null>,fork=true,size=187110,milestones={},language=Jupyter Notebook,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Fri, 20 Oct 2017 13:57:09 GMT], ETag=[W/"7eb0f60a802f0f946a1ee4e7a3f09f6f"], Expect-CT=[max-age=2592000; report-uri="https://api.github.com/_private/browser/errors"], Last-Modified=[Tue, 17 Oct 2017 15:04:29 GMT], OkHttp-Received-Millis=[1508507829134], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1508507828974], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[91FE:6DF1:3C0AE3:802838:59EA00B4], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4967], X-RateLimit-Reset=[1508509240], X-Runtime-rack=[0.068323], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:e49c665) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/e49c66582d4f2a403adbf9ac2de6722beb6185c9
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] scored: -10.0
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: FAILURE