Started by GitHub push by severinsimmler
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 89b181c13292d351797ba58276f25b1285d47762 (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 89b181c13292d351797ba58276f25b1285d47762
 > git rev-list 6752569ec2ae787828807148ba57cacd0f51d24e # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda6732915251495898458.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
  Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
  Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pandas-0.19.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting matplotlib==1.5.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached matplotlib-1.5.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached numpy-1.12.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting scipy>=0.7 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached scipy-0.19.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting werkzeug>=0.11.15 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Werkzeug-0.12.1-py2.py3-none-any.whl
Collecting flask>=0.11.1 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Flask-0.12.1-py2.py3-none-any.whl
Collecting wikipedia>=1.4.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting lxml>=3.6.4 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached lxml-3.7.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting pyLDAvis>=2.0.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
  Using cached coverage-4.3.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipywidgets-6.0.0-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_console-5.1.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
  Using cached qtconsole-4.3.0-py2.py3-none-any.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
  Using cached nbconvert-5.1.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
  Using cached notebook-5.0.0-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipykernel-4.6.0-py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pytz-2017.2-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: six>=1.5.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting cycler (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Requirement already up-to-date: pyparsing!=2.0.0,!=2.0.4,!=2.1.2,>=1.5.6 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting Jinja2>=2.4 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Jinja2-2.9.6-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting beautifulsoup4 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached beautifulsoup4-4.5.3-py3-none-any.whl
Collecting requests<3.0.0,>=2.0.0 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached requests-2.13.0-py2.py3-none-any.whl
Collecting numexpr (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached numexpr-2.6.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting future (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: wheel>=0.23.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting joblib>=0.8.4 (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached joblib-0.11-py2.py3-none-any.whl
Collecting pytest (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pytest-3.0.7-py2.py3-none-any.whl
Collecting funcy (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting widgetsnbextension~=2.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached widgetsnbextension-2.0.0-py2.py3-none-any.whl
Collecting traitlets>=4.3.1 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting nbformat>=4.2.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached nbformat-4.3.0-py2.py3-none-any.whl
Collecting ipython>=4.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython-5.3.0-py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached prompt_toolkit-1.0.14-py3-none-any.whl
Collecting jupyter-client (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_client-5.0.1-py2.py3-none-any.whl
Collecting pygments (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting ipython-genutils (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting jupyter-core (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting mistune!=0.6 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached mistune-0.7.4-py2.py3-none-any.whl
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached bleach-2.0.0-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached entrypoints-0.2.2-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached testpath-0.3-py2.py3-none-any.whl
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting tornado>=4 (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached boto-2.46.1-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.4->flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting py>=1.4.29 (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached py-1.4.33-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting decorator (from traitlets>=4.3.1->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached decorator-4.0.11-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat>=4.2.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting html5lib>=0.99999999 (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached html5lib-0.999999999-py2.py3-none-any.whl
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached ptyprocess-0.5.1-py2.py3-none-any.whl
Requirement already up-to-date: appdirs>=1.4.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: packaging>=16.8 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting webencodings (from html5lib>=0.99999999->bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Installing collected packages: nose, coverage, nosexcover, ipython-genutils, decorator, traitlets, jupyter-core, pyzmq, python-dateutil, jupyter-client, jsonschema, nbformat, tornado, ptyprocess, terminado, pandocfilters, mistune, webencodings, html5lib, bleach, entrypoints, testpath, pygments, MarkupSafe, Jinja2, nbconvert, wcwidth, prompt-toolkit, simplegeneric, pexpect, pickleshare, ipython, ipykernel, notebook, widgetsnbextension, ipywidgets, jupyter-console, qtconsole, jupyter, pytz, numpy, pandas, regex, scipy, bz2file, requests, boto, smart-open, gensim, cycler, matplotlib, werkzeug, click, itsdangerous, flask, beautifulsoup4, wikipedia, lxml, numexpr, future, joblib, py, pytest, funcy, pyLDAvis, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.6 MarkupSafe-1.0 beautifulsoup4-4.5.3 bleach-2.0.0 boto-2.46.1 bz2file-0.98 click-6.7 coverage-4.3.4 cycler-0.10.0 dariah-topics decorator-4.0.11 entrypoints-0.2.2 flask-0.12.1 funcy-1.7.3 future-0.16.0 gensim-2.0.0 html5lib-0.999999999 ipykernel-4.6.0 ipython-5.3.0 ipython-genutils-0.2.0 ipywidgets-6.0.0 itsdangerous-0.24 joblib-0.11 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.0.1 jupyter-console-5.1.0 jupyter-core-4.3.0 lxml-3.7.3 matplotlib-1.5.3 mistune-0.7.4 nbconvert-5.1.1 nbformat-4.3.0 nose-1.3.7 nosexcover-1.0.11 notebook-5.0.0 numexpr-2.6.2 numpy-1.12.1 pandas-0.19.2 pandocfilters-1.4.1 pexpect-4.2.1 pickleshare-0.7.4 prompt-toolkit-1.0.14 ptyprocess-0.5.1 py-1.4.33 pyLDAvis-2.1.1 pygments-2.2.0 pytest-3.0.7 python-dateutil-2.6.0 pytz-2017.2 pyzmq-16.0.2 qtconsole-4.3.0 regex-2017.4.5 requests-2.13.0 scipy-0.19.0 simplegeneric-0.8.1 smart-open-1.5.1 terminado-0.6 testpath-0.3 tornado-4.4.3 traitlets-4.3.2 wcwidth-0.1.7 webencodings-0.5.1 werkzeug-0.12.1 widgetsnbextension-2.0.0 wikipedia-1.4.0
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
writing requirements to dariah_topics.egg-info/requires.txt
writing dariah_topics.egg-info/PKG-INFO
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:334: UserWarning: Normalizing '0.2.0dev0' to '0.2.0.dev0'
  normalized_version,
warning: sdist: standard file not found: should have one of README, README.rst, README.txt
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.2.0.dev0
creating dariah_topics-0.2.0.dev0/dariah_topics
creating dariah_topics-0.2.0.dev0/dariah_topics.egg-info
creating dariah_topics-0.2.0.dev0/test
copying files to dariah_topics-0.2.0.dev0...
copying setup.cfg -> dariah_topics-0.2.0.dev0
copying setup.py -> dariah_topics-0.2.0.dev0
copying dariah_topics/__init__.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/model_creation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.2.0.dev0/test
Writing dariah_topics-0.2.0.dev0/setup.cfg
Creating tar archive
removing 'dariah_topics-0.2.0.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/testing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0.dist-info/WHEEL
+ nosetests
.FF.FFF..../mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/util.py:453: DeprecationWarning: inspect.getargspec() is deprecated, use inspect.signature() instead
  inspect.getargspec(func)
..........E...........
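The DeprecationWarning above is raised inside nose itself (nose/util.py:453), not in dariah_topics, and does not affect the test results. The replacement the warning points to looks roughly like the following sketch; the helper name and the tokenize example are illustrative, not project code:
------------------
import inspect

def positional_arg_names(func):
    # inspect.getargspec() (used by nose/util.py above) is deprecated on
    # Python 3; inspect.signature() is the documented replacement and also
    # understands keyword-only arguments and annotations.
    params = inspect.signature(func).parameters
    return [name for name, p in params.items()
            if p.kind in (p.POSITIONAL_ONLY, p.POSITIONAL_OR_KEYWORD)]

def tokenize(text, lowercase=True):
    return text.lower().split() if lowercase else text.split()

print(positional_arg_names(tokenize))  # ['text', 'lowercase']
------------------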
======================================================================
ERROR: Integration test notebook (via Jupyter)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/integration_test.py", line 20, in jupyter_integration_test
    stderr=STDOUT, universal_newlines=True)
  File "/usr/lib/python3.5/subprocess.py", line 316, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 398, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['jupyter-nbconvert', '--execute', '--log-level=ERROR', '--ExecutePreprocessor.iopub_timeout=30', '--ExecutePreprocessor.timeout=None', '/mnt/data/jenkins/workspace/DARIAH-Topics/IntegrationTest_v01.ipynb']' returned non-zero exit status 1
-------------------- >> begin captured logging << --------------------
root: ERROR: An error occurred while executing the following cell:
------------------
doc_tokens = pre.filter_POS_tags(corpus_csv)
list(doc_tokens)[0][:5]
------------------
AttributeError: module 'dariah_topics.preprocessing' has no attribute 'filter_POS_tags'
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.preprocessing.create_document_list
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.create_document_list
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 37, in create_document_list
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 55, in dariah_topics.preprocessing.create_document_list
Failed example:
    create_document_list('corpus_txt') # doctest: +NORMALIZE_WHITESPACE
Expected:
    ['corpus_txt/Doyle_AScandalinBohemia.txt', 'corpus_txt/Doyle_AStudyinScarlet.txt',
     'corpus_txt/Doyle_TheHoundoftheBaskervilles.txt', 'corpus_txt/Doyle_TheSignoftheFour.txt',
     'corpus_txt/Howard_GodsoftheNorth.txt', 'corpus_txt/Howard_SchadowsinZamboula.txt',
     'corpus_txt/Howard_ShadowsintheMoonlight.txt', 'corpus_txt/Howard_TheDevilinIron.txt',
     'corpus_txt/Kipling_TheEndofthePassage.txt', 'corpus_txt/Kipling_TheJungleBook.txt',
     'corpus_txt/Kipling_ThyServantaDog.txt', 'corpus_txt/Lovecraft_AttheMountainofMadness.txt',
     'corpus_txt/Lovecraft_TheShunnedHouse.txt', 'corpus_txt/Poe_EurekaAProsePoem.txt',
     'corpus_txt/Poe_TheCaskofAmontillado.txt', 'corpus_txt/Poe_TheMasqueoftheRedDeath.txt',
     'corpus_txt/Poe_ThePurloinedLetter.txt']
Got:
    ['corpus_txt/Poe_EurekaAProsePoem.txt', 'corpus_txt/Howard_GodsoftheNorth.txt',
     'corpus_txt/Poe_ThePurloinedLetter.txt', 'corpus_txt/Poe_TheCaskofAmontillado.txt',
     'corpus_txt/Howard_TheDevilinIron.txt', 'corpus_txt/Howard_ShadowsintheMoonlight.txt',
     'corpus_txt/Doyle_AScandalinBohemia.txt', 'corpus_txt/Lovecraft_TheShunnedHouse.txt',
     'corpus_txt/Doyle_AStudyinScarlet.txt', 'corpus_txt/Kipling_ThyServantaDog.txt',
     'corpus_txt/Doyle_TheHoundoftheBaskervilles.txt', 'corpus_txt/Doyle_TheSignoftheFour.txt',
     'corpus_txt/Poe_TheMasqueoftheRedDeath.txt', 'corpus_txt/Howard_SchadowsinZamboula.txt',
     'corpus_txt/Kipling_TheJungleBook.txt', 'corpus_txt/Lovecraft_AttheMountainofMadness.txt',
     'corpus_txt/Kipling_TheEndofthePassage.txt']
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Creating document list from TXT files ...
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.preprocessing.filter_pos_tags
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.filter_pos_tags
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 412, in filter_pos_tags
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 433, in dariah_topics.preprocessing.filter_pos_tags
Failed example:
    list(filter_pos_tags(df))[0] # doctest: +NORMALIZE_WHITESPACE
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "<doctest dariah_topics.preprocessing.filter_pos_tags[1]>", line 1, in <module>
        list(filter_pos_tags(df))[0] # doctest: +NORMALIZE_WHITESPACE
      File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 441, in filter_pos_tags
        df = df.loc[df['CPOS'] == pos]
    UnboundLocalError: local variable 'df' referenced before assignment
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Accessing ['ADJ', 'V', 'NN'] ...
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.preprocessing.read_from_csv
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.read_from_csv
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 155, in read_from_csv
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 188, in dariah_topics.preprocessing.read_from_csv
Failed example:
    list(read_from_csv(doclist))[0][:4] # doctest: +NORMALIZE_WHITESPACE
Expected:
       ParagraphId  TokenId    Lemma  CPOS NamedEntity
    0            0        0        a   ART           _
    1            0        1  scandal    NP           _
    2            0        2       in    PP           _
    3            0        3  bohemia    NP           _
Got:
       ParagraphId  TokenId   Lemma  CPOS NamedEntity
    0            0        0  eureka    NP           _
    1            0        1       :  PUNC           _
    2            0        2       a   ART           _
    3            0        3   prose    NP           _
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Accessing CSV documents ...
preprocessing: INFO: Creating document list from CSV files ...
preprocessing: INFO: Accessing CSV documents ...
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.preprocessing.read_from_tei
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.read_from_tei
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 119, in read_from_tei
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 139, in dariah_topics.preprocessing.read_from_tei
Failed example:
    list(read_from_tei('corpus_tei/Schnitzler_Amerika.xml'))[0][146:163]
Expected:
    'Arthur Schnitzler'
Got:
    'ur Schnitzler\n '
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 142, in dariah_topics.preprocessing.read_from_tei
Failed example:
    list(read_from_tei(doclist))[0][142:159]
Expected:
    'Arthur Schnitzler'
Got:
    ' \nArthur Schnitz'
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Accessing TEI XML documents ...
preprocessing: INFO: Creating document list from XML files ...
preprocessing: INFO: Accessing TEI XML documents ...
--------------------- >> end captured logging << ---------------------

======================================================================
FAIL: Doctest: dariah_topics.preprocessing.read_from_txt
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.read_from_txt
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 83, in read_from_txt
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 106, in dariah_topics.preprocessing.read_from_txt
Failed example:
    list(read_from_txt(doclist))[0][:20]
Expected:
    'A SCANDAL IN BOHEMIA'
Got:
    '\t\t\t\tEUREKA:\n '
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Accessing TXT documents ...
preprocessing: INFO: Creating document list from TXT files ...
preprocessing: INFO: Accessing TXT documents ...
--------------------- >> end captured logging << ---------------------

Name                              Stmts   Miss  Cover
-----------------------------------------------------
dariah_topics.py                      0      0   100%
dariah_topics/doclist.py             89     14    84%
dariah_topics/evaluation.py         100     75    25%
dariah_topics/mallet.py             216    150    31%
dariah_topics/model_creation.py      60     43    28%
dariah_topics/preprocessing.py      209     67    68%
dariah_topics/visualization.py      166    135    19%
-----------------------------------------------------
TOTAL                               840    484    42%
----------------------------------------------------------------------
Ran 33 tests in 9.857s

FAILED (errors=1, failures=5)
Build step 'Virtualenv Builder' marked build as failure
Recording test results
Skipping Cobertura coverage report as build was not UNSTABLE or better ...
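Three of the five doctest failures (create_document_list, read_from_csv, read_from_txt) report the same corpus files, just in a different order: the expected output starts with corpus_txt/Doyle_AScandalinBohemia.txt, while this build node returned corpus_txt/Poe_EurekaAProsePoem.txt first. That pattern points to an unsorted directory listing rather than wrong content (the read_from_tei offsets are a separate whitespace issue). A minimal sketch of the usual fix, assuming the function simply globs the corpus directory; the real body of create_document_list is not part of this log:
------------------
from pathlib import Path

def create_document_list(path, ext='txt'):
    # Sketch only: the essential point is sorted(). glob()/listdir() order
    # depends on the filesystem, which is why a doctest that expects
    # 'corpus_txt/Doyle_AScandalinBohemia.txt' first saw a Poe text instead
    # on this node.
    return sorted(str(p) for p in Path(path).glob('*.' + ext))

print(create_document_list('corpus_txt')[:2])
# ['corpus_txt/Doyle_AScandalinBohemia.txt', 'corpus_txt/Doyle_AStudyinScarlet.txt']
------------------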
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@4499698d[description=A python library for topic modeling.,homepage=,name=Topics,license=<null>,fork=true,size=96597,milestones={},language=Jupyter Notebook,commits={},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:89b181c) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/89b181c13292d351797ba58276f25b1285d47762
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
Finished: FAILURE
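The remaining two problems sit in dariah_topics.preprocessing.filter_pos_tags: the integration notebook IntegrationTest_v01.ipynb still calls the old camel-case name pre.filter_POS_tags, which no longer exists in the module, and the function itself rebinds a local df before it is first assigned (preprocessing.py, line 441), which raises UnboundLocalError in the doctest. A hedged sketch of what a fix could look like; the parameter names, the default tag list and the generator shape are assumptions, since the actual function body does not appear in this log:
------------------
def filter_pos_tags(doc_csv, pos_tags=('ADJ', 'V', 'NN')):
    # Sketch under assumptions: doc_csv is an iterable of DataFrames with
    # 'CPOS' and 'Lemma' columns, as in the read_from_csv doctest output.
    # Filtering into a new variable avoids rebinding a name before it is
    # first read, which is what raised UnboundLocalError at
    # preprocessing.py, line 441 ("df = df.loc[df['CPOS'] == pos]").
    for df in doc_csv:
        filtered = df.loc[df['CPOS'].isin(pos_tags)]
        yield filtered['Lemma']

# Hypothetical backwards-compatible alias: the notebook still calls
# pre.filter_POS_tags(corpus_csv), which is why nbconvert aborted with
# AttributeError. Either rename the call in the notebook or keep an alias:
filter_POS_tags = filter_pos_tags
------------------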