Started by GitHub push by severinsimmler
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
> git --version # timeout=10
using GIT_ASKPASS to set credentials
> git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
> git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 31155e9e43b97b8c463efa654499f443d10fdaba (refs/remotes/origin/testing)
> git config core.sparsecheckout # timeout=10
> git checkout -f 31155e9e43b97b8c463efa654499f443d10fdaba
> git rev-list 6ffba9a6d883b81b9fa8c0c334afd27284fad5f6 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda5658701270486118645.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached pandas-0.19.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting matplotlib==1.5.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached matplotlib-1.5.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached numpy-1.12.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting scipy>=0.7 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached scipy-0.19.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting werkzeug>=0.11.15 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached Werkzeug-0.12.1-py2.py3-none-any.whl
Collecting flask>=0.11.1 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached Flask-0.12.1-py2.py3-none-any.whl
Collecting wikipedia>=1.4.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting lxml>=3.6.4 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached lxml-3.7.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting pyLDAvis>=2.0.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
Using cached coverage-4.3.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
Using cached jupyter_console-5.1.0-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
Using cached ipykernel-4.6.0-py3-none-any.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
Using cached nbconvert-5.1.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
Using cached notebook-5.0.0-py2.py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
Using cached ipywidgets-6.0.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
Using cached qtconsole-4.3.0-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached pytz-2017.2-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: six>=1.5.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: pyparsing!=2.0.0,!=2.0.4,!=2.1.2,>=1.5.6 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting cycler (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached click-6.7-py2.py3-none-any.whl
Collecting Jinja2>=2.4 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached Jinja2-2.9.6-py2.py3-none-any.whl
Collecting requests<3.0.0,>=2.0.0 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached requests-2.13.0-py2.py3-none-any.whl
Collecting beautifulsoup4 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached beautifulsoup4-4.5.3-py3-none-any.whl
Requirement already up-to-date: wheel>=0.23.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting future (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting funcy (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting joblib>=0.8.4 (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached joblib-0.11-py2.py3-none-any.whl
Collecting pytest (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached pytest-3.0.7-py2.py3-none-any.whl
Collecting numexpr (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached numexpr-2.6.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached prompt_toolkit-1.0.14-py3-none-any.whl
Collecting pygments (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting jupyter-client (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached jupyter_client-5.0.1-py2.py3-none-any.whl
Collecting ipython (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached ipython-5.3.0-py3-none-any.whl
Collecting traitlets>=4.1.0 (from ipykernel->jupyter->-r requirements-dev.txt (line 4))
Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting tornado>=4.0 (from ipykernel->jupyter->-r requirements-dev.txt (line 4))
Collecting nbformat (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached nbformat-4.3.0-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached entrypoints-0.2.2-py2.py3-none-any.whl
Collecting mistune!=0.6 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached mistune-0.7.4-py2.py3-none-any.whl
Collecting jupyter-core (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached testpath-0.3-py2.py3-none-any.whl
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached bleach-2.0.0-py2.py3-none-any.whl
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting ipython-genutils (from notebook->jupyter->-r requirements-dev.txt (line 4))
Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting widgetsnbextension~=2.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
Using cached widgetsnbextension-2.0.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached boto-2.46.1-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.4->flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting py>=1.4.29 (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Using cached py-1.4.33-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pexpect; sys_platform != "win32" (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting decorator (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached decorator-4.0.11-py2.py3-none-any.whl
Collecting pickleshare (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting html5lib>=0.99999999 (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached html5lib-0.999999999-py2.py3-none-any.whl
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
Using cached ptyprocess-0.5.1-py2.py3-none-any.whl
Requirement already up-to-date: packaging>=16.8 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Requirement already up-to-date: appdirs>=1.4.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from setuptools->pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting webencodings (from html5lib>=0.99999999->bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
Using cached webencodings-0.5.1-py2.py3-none-any.whl
Installing collected packages: nose, coverage, nosexcover, wcwidth, prompt-toolkit, pygments, ipython-genutils, decorator, traitlets, tornado, pyzmq, python-dateutil, jupyter-core, jupyter-client, ptyprocess, pexpect, pickleshare, simplegeneric, ipython, ipykernel, jupyter-console, jsonschema, nbformat, pandocfilters, entrypoints, mistune, MarkupSafe, Jinja2, testpath, webencodings, html5lib, bleach, nbconvert, terminado, notebook, widgetsnbextension, ipywidgets, qtconsole, jupyter, numpy, pytz, pandas, regex, requests, bz2file, boto, smart-open, scipy, gensim, cycler, matplotlib, werkzeug, itsdangerous, click, flask, beautifulsoup4, wikipedia, lxml, future, funcy, joblib, py, pytest, numexpr, pyLDAvis, dariah-topics
Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.6 MarkupSafe-1.0 beautifulsoup4-4.5.3 bleach-2.0.0 boto-2.46.1 bz2file-0.98 click-6.7 coverage-4.3.4 cycler-0.10.0 dariah-topics decorator-4.0.11 entrypoints-0.2.2 flask-0.12.1 funcy-1.7.3 future-0.16.0 gensim-2.0.0 html5lib-0.999999999 ipykernel-4.6.0 ipython-5.3.0 ipython-genutils-0.2.0 ipywidgets-6.0.0 itsdangerous-0.24 joblib-0.11 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.0.1 jupyter-console-5.1.0 jupyter-core-4.3.0 lxml-3.7.3 matplotlib-1.5.3 mistune-0.7.4 nbconvert-5.1.1 nbformat-4.3.0 nose-1.3.7 nosexcover-1.0.11 notebook-5.0.0 numexpr-2.6.2 numpy-1.12.1 pandas-0.19.2 pandocfilters-1.4.1 pexpect-4.2.1 pickleshare-0.7.4 prompt-toolkit-1.0.14 ptyprocess-0.5.1 py-1.4.33 pyLDAvis-2.1.1 pygments-2.2.0 pytest-3.0.7 python-dateutil-2.6.0 pytz-2017.2 pyzmq-16.0.2 qtconsole-4.3.0 regex-2017.4.5 requests-2.13.0 scipy-0.19.0 simplegeneric-0.8.1 smart-open-1.5.1 terminado-0.6 testpath-0.3 tornado-4.4.3 traitlets-4.3.2 wcwidth-0.1.7 webencodings-0.5.1 werkzeug-0.12.1 widgetsnbextension-2.0.0 wikipedia-1.4.0
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing top-level names to dariah_topics.egg-info/top_level.txt
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing dariah_topics.egg-info/PKG-INFO
writing requirements to dariah_topics.egg-info/requires.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:334: UserWarning: Normalizing '0.2.0dev0' to '0.2.0.dev0'
normalized_version,
warning: sdist: standard file not found: should have one of README, README.rst, README.txt
running check
warning: check: missing required meta-data: url
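The two `sdist` warnings above point at missing packaging metadata. A minimal sketch of the fix, assuming the repository URL shown elsewhere in this log: add a `README.rst` next to `setup.py` and pass a `url` to `setup()`. The `METADATA` dict below is a hypothetical excerpt, not the project's actual `setup.py`.

```python
# Hypothetical setup.py excerpt: the 'url' key addresses
# "warning: check: missing required meta-data: url"; shipping a README.rst
# in the project root addresses the "standard file not found" warning.
METADATA = dict(
    name='dariah_topics',
    version='0.2.0.dev0',  # setuptools normalizes '0.2.0dev0' to this form
    url='https://github.com/DARIAH-DE/Topics',
)
# In setup.py this would be passed as: setup(**METADATA)
```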
creating dariah_topics-0.2.0.dev0
creating dariah_topics-0.2.0.dev0/dariah_topics
creating dariah_topics-0.2.0.dev0/dariah_topics.egg-info
creating dariah_topics-0.2.0.dev0/test
copying files to dariah_topics-0.2.0.dev0...
copying setup.cfg -> dariah_topics-0.2.0.dev0
copying setup.py -> dariah_topics-0.2.0.dev0
copying dariah_topics/__init__.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/model_creation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.2.0.dev0/test
Writing dariah_topics-0.2.0.dev0/setup.cfg
Creating tar archive
removing 'dariah_topics-0.2.0.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/testing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0.dist-info/WHEEL
+ nosetests
.FF..F...../mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/util.py:453: DeprecationWarning: inspect.getargspec() is deprecated, use inspect.signature() instead
  inspect.getargspec(func)
..........E...........
======================================================================
ERROR: Integration test notebook (via Jupyter)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/integration_test.py", line 20, in jupyter_integration_test
    stderr=STDOUT, universal_newlines=True)
  File "/usr/lib/python3.5/subprocess.py", line 316, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 398, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['jupyter-nbconvert', '--execute', '--log-level=ERROR', '--ExecutePreprocessor.iopub_timeout=30', '--ExecutePreprocessor.timeout=None', '/mnt/data/jenkins/workspace/DARIAH-Topics/IntegrationTest_v01.ipynb']' returned non-zero exit status 1
-------------------- >> begin captured logging << --------------------
root: ERROR: An error occurred while executing the following cell:
------------------
doc_tokens = pre.filter_pos_tags(corpus_csv)
list(doc_tokens)[0][:5]
------------------
AttributeError: 'generator' object has no attribute 'loc'
--------------------- >> end captured logging << ---------------------
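The `AttributeError` above suggests the notebook (or a helper it calls) treats the generator returned by `pre.filter_pos_tags` as a DataFrame and calls `.loc` on it directly. A sketch of the intended pattern follows; `filter_pos_tags_sketch` is a hypothetical stand-in for the project's function, with the `CPOS`/`Lemma` column names assumed from the doctest traceback elsewhere in this log.

```python
# Sketch, not the project's implementation: the function yields one pandas
# Series per document, so callers must iterate the generator first --
# calling .loc on the generator itself raises the AttributeError above.
import pandas as pd

def filter_pos_tags_sketch(frames, pos_tags=('ADJ', 'V', 'NN')):
    """Yield the 'Lemma' values of rows whose CPOS tag is in pos_tags."""
    for df in frames:
        yield df.loc[df['CPOS'].isin(pos_tags), 'Lemma']

doc = pd.DataFrame({'CPOS': ['NN', 'ART', 'V'],
                    'Lemma': ['house', 'the', 'run']})
doc_tokens = filter_pos_tags_sketch([doc])   # a generator, no .loc here
first = list(doc_tokens)[0]                  # materialize, then index
print(first.tolist())                        # ['house', 'run']
```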
======================================================================
FAIL: Doctest: dariah_topics.preprocessing.create_document_list
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.create_document_list
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 37, in create_document_list
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 55, in dariah_topics.preprocessing.create_document_list
Failed example:
create_document_list('corpus_txt')[0]
Expected:
'corpus_txt/Doyle_AScandalinBohemia.txt'
Got:
'corpus_txt/Poe_EurekaAProsePoem.txt'
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Creating document list from TXT files ...
--------------------- >> end captured logging << ---------------------
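The doctest failure above (`Doyle_…` expected, `Poe_…` returned) looks like filesystem-order nondeterminism: an unsorted directory listing makes index `[0]` machine-dependent. A sketch of a deterministic variant; `create_document_list_sketch` is a hypothetical stand-in for the project's `create_document_list`.

```python
# Sketch: glob order is filesystem-dependent, so a doctest that indexes
# into the result (e.g. [0]) is only reproducible if the list is sorted.
from pathlib import Path

def create_document_list_sketch(path, suffix='txt'):
    """Return the paths of all *.suffix files under `path`, sorted."""
    return sorted(str(p) for p in Path(path).glob('*.' + suffix))
```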
======================================================================
FAIL: Doctest: dariah_topics.preprocessing.filter_pos_tags
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.filter_pos_tags
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 386, in filter_pos_tags
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 407, in dariah_topics.preprocessing.filter_pos_tags
Failed example:
list(filter_pos_tags(df))[0] # doctest: +NORMALIZE_WHITESPACE
Exception raised:
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/indexes/base.py", line 2134, in get_loc
    return self._engine.get_loc(key)
  File "pandas/index.pyx", line 132, in pandas.index.IndexEngine.get_loc (pandas/index.c:4433)
  File "pandas/index.pyx", line 154, in pandas.index.IndexEngine.get_loc (pandas/index.c:4279)
  File "pandas/src/hashtable_class_helper.pxi", line 732, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:13742)
  File "pandas/src/hashtable_class_helper.pxi", line 740, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:13696)
KeyError: 'Lemma'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 1321, in __run
    compileflags, 1), test.globs)
  File "<doctest dariah_topics.preprocessing.filter_pos_tags[1]>", line 1, in <module>
    list(filter_pos_tags(df))[0] # doctest: +NORMALIZE_WHITESPACE
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 416, in filter_pos_tags
    yield doc_csv.loc[doc_csv['CPOS'] == pos]['Lemma']
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/frame.py", line 2059, in __getitem__
    return self._getitem_column(key)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/frame.py", line 2066, in _getitem_column
    return self._get_item_cache(key)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/generic.py", line 1386, in _get_item_cache
    values = self._data.get(item)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/internals.py", line 3543, in get
    loc = self.items.get_loc(item)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/indexes/base.py", line 2136, in get_loc
    return self._engine.get_loc(self._maybe_cast_indexer(key))
  File "pandas/index.pyx", line 132, in pandas.index.IndexEngine.get_loc (pandas/index.c:4433)
  File "pandas/index.pyx", line 154, in pandas.index.IndexEngine.get_loc (pandas/index.c:4279)
  File "pandas/src/hashtable_class_helper.pxi", line 732, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:13742)
  File "pandas/src/hashtable_class_helper.pxi", line 740, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:13696)
KeyError: 'Lemma'
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Accessing ['ADJ', 'V', 'NN'] ...
--------------------- >> end captured logging << ---------------------
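The `KeyError: 'Lemma'` above indicates that the DataFrame built in the doctest has no `Lemma` column (for example because the CSV was read with different header settings than the function expects). A hedged sketch of an upfront check that turns the bare `KeyError` into an actionable message; `check_columns` is hypothetical and not part of `dariah_topics`.

```python
# Hypothetical guard for filter_pos_tags-style functions: fail fast with a
# clear message when an expected column is missing, instead of raising a
# bare KeyError deep inside pandas.
import pandas as pd

REQUIRED_COLUMNS = {'CPOS', 'Lemma'}

def check_columns(df):
    """Return df unchanged, or raise ValueError naming missing columns."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError('missing column(s): ' + ', '.join(sorted(missing)))
    return df
```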
======================================================================
FAIL: Doctest: dariah_topics.preprocessing.read_from_tei
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for dariah_topics.preprocessing.read_from_tei
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 100, in read_from_tei
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 120, in dariah_topics.preprocessing.read_from_tei
Failed example:
list(read_from_tei('corpus_tei/Schnitzler_Amerika.xml'))[0][143:160]
Expected:
'Arthur Schnitzler'
Got:
'rthur Schnitzler\n'
----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 123, in dariah_topics.preprocessing.read_from_tei
Failed example:
list(read_from_tei(doclist))[0][142:159]
Expected:
'Arthur Schnitzler'
Got:
' \nArthur Schnitz'
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Accessing TEI XML documents ...
preprocessing: INFO: Creating document list from XML files ...
preprocessing: INFO: Accessing TEI XML documents ...
--------------------- >> end captured logging << ---------------------
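Both `read_from_tei` failures above are whitespace/offset mismatches (`'rthur Schnitzler\n'`, `' \nArthur Schnitz'`): the extracted TEI text still contains layout newlines, so hard-coded character slices in the doctests drift. One hedged option is to normalize whitespace before slicing; this is a sketch, not the project's code.

```python
# Sketch: collapse runs of whitespace in text extracted from TEI XML so
# that character offsets used in doctests stay stable across parsers and
# source formatting.
import re

def normalize_ws(text):
    """Collapse all whitespace runs to single spaces and strip the ends."""
    return re.sub(r'\s+', ' ', text).strip()

print(normalize_ws('  \nArthur\nSchnitzler \n'))  # 'Arthur Schnitzler'
```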
Name                               Stmts   Miss  Cover
------------------------------------------------------
dariah_topics.py                       0      0   100%
dariah_topics/doclist.py              89     14    84%
dariah_topics/evaluation.py          100     75    25%
dariah_topics/mallet.py              216    150    31%
dariah_topics/model_creation.py       60     43    28%
dariah_topics/preprocessing.py       208     62    70%
dariah_topics/visualization.py       166    135    19%
------------------------------------------------------
TOTAL                                839    479    43%
----------------------------------------------------------------------
Ran 33 tests in 4.710s
FAILED (errors=1, failures=3)
Build step 'Virtualenv Builder' marked build as failure
Recording test results
Skipping Cobertura coverage report as build was not UNSTABLE or better ...
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@36e2d928[description=A python library for topic modeling.,homepage=,name=Topics,license=<null>,fork=true,size=96597,milestones={},language=Jupyter Notebook,commits={},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:31155e9) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/31155e9e43b97b8c463efa654499f443d10fdaba
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
Finished: FAILURE