Started by user Thorsten Vitt
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
Cloning the remote Git repository
Cloning repository https://github.com/DARIAH-DE/Topics
> /usr/bin/git init /mnt/data/jenkins/workspace/DARIAH-Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
> /usr/bin/git --version # timeout=10
using GIT_ASKPASS to set credentials
> /usr/bin/git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> /usr/bin/git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
> /usr/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> /usr/bin/git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
using GIT_ASKPASS to set credentials
> /usr/bin/git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> /usr/bin/git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
> /usr/bin/git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 08f95941bf3af56e68d7b61296131eed6676b1d5 (refs/remotes/origin/testing)
> /usr/bin/git config core.sparsecheckout # timeout=10
> /usr/bin/git checkout -f 08f95941bf3af56e68d7b61296131eed6676b1d5
Commit message: "changed decimal places in doc_topics"
> /usr/bin/git rev-list --no-walk 1e2a12693eefd1e8ed1c266edca3970b45fbdab8 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
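The ShiningPanda step above creates a fresh, isolated Python environment for the job. A minimal local equivalent, sketched with the stdlib `venv` module (the Jenkins paths are specific to the build node; `venv` stands in here for the bundled `virtualenv.py`, and `env_dir` is an illustrative throwaway directory):

```shell
# Create an isolated environment comparable to the one ShiningPanda builds above.
env_dir="$(mktemp -d)/venv"
python3 -m venv "$env_dir"
# The job then runs all later steps (pip, setup.py, pytest) with this
# environment's own interpreter rather than the system Python:
"$env_dir/bin/python" --version
```

As in the log, everything after this point runs inside that environment, so the job's dependencies never touch the node's system site-packages.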
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda649695069040023291.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (10.0.1)
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
  Downloading https://files.pythonhosted.org/packages/62/59/950a805f90587d6e2f3692cf43700becb7cdf6c16b06d84e7516b199236b/pytest-3.6.0-py2.py3-none-any.whl (194kB)
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/30/7d/7f6a78ae44a1248ee28cc777586c18b28a1df903470e5d34a6e25712b8aa/pytest_cov-2.5.1-py2.py3-none-any.whl
Collecting pytest-nbsmoke (from -r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/ea/fc/59a5a78686afdec342c783399e21aec0c721998be2a04b5e9ff49fe43a29/pytest_nbsmoke-0.1.6-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/83/df/0f5dd132200728a86190397e1ea87cd76244e42d39ec5e88efd25b2abd7e/jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/1b/9f/98d67201c5a6e1aececed03a44a819d0e32adba81414081e303cfaf8c54c/Sphinx-1.7.5-py2.py3-none-any.whl (1.9MB)
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
  Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-install-1bfz0l4y/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
  Downloading https://files.pythonhosted.org/packages/df/a5/8ee4b84af7f997dfdba71254a88008cfc19c49df98983c9a4919e798f8ce/recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/a5/c1/43966a4ce89d0c64111f46c6364ed57d6d87e6fab7d685dca06197a19cf7/pandas-0.23.0-cp35-cp35m-manylinux1_x86_64.whl (11.6MB)
Collecting regex>=2017.01.14 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/a2/51/c39562cfed3272592c60cfd229e5464d715b78537e332eac2b695422dc49/regex-2018.02.21.tar.gz (620kB)
Collecting gensim>=0.13.2 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/b8/ed/89f9fa9c3a290ebc454249df90891c804c3760cf054d54c5f701f2675122/gensim-3.4.0-cp35-cp35m-manylinux1_x86_64.whl (22.6MB)
Collecting lda>=1.0.5 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/31/95/91c8de340a9d00322d9b2d81ef9ffac9afc116e8a19ac501c7df55fa0d73/lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/7b/61/11b05cc37ccdaabad89f04dbdc2a02905cf6de6f9b05816dba843beed328/numpy-1.14.3-cp35-cp35m-manylinux1_x86_64.whl (12.1MB)
Collecting lxml>=3.6.4 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/30/65/6dcc7a1a0ec3bbc10a1316b3610f9997ca132183a5f5345c5b88fc1eaf79/lxml-4.2.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/81/31/4e261379e0cd4e9bbacfc96b124ebac0706b44374bd1d34ef899796f741b/matplotlib-2.2.2-cp35-cp35m-manylinux1_x86_64.whl (12.6MB)
Collecting bokeh>=0.12.6 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/cd/47/201408029628164342e65a4552ee00abc79ea7be1b64031281b81b0e2f4d/bokeh-0.12.16.tar.gz (14.7MB)
Collecting metadata_toolbox (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/20/f6/30218493f87bcd084cc9a259ca2be65ca21a32efb4cc505a953374fcef54/metadata_toolbox-0.1.0.dev0.tar.gz
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Collecting pluggy<0.7,>=0.5 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/ba/65/ded3bc40bbf8d887f262f150fbe1ae6637765b5c9534bd55690ed2c0b0f7/pluggy-0.6.0-py3-none-any.whl
Collecting more-itertools>=4.0.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/85/40/90c3b0393e12b9827381004224de8814686e3d7182f9d4182477f600826d/more_itertools-4.2.0-py3-none-any.whl
Collecting attrs>=17.4.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/41/59/cedf87e91ed541be7957c501a92102f9cc6363c623a7666d69d51c78ac5b/attrs-18.1.0-py2.py3-none-any.whl
Collecting atomicwrites>=1.0 (from pytest->-r requirements-dev.txt (line 4))
  Downloading https://files.pythonhosted.org/packages/0a/e8/cd6375e7a59664eeea9e1c77a766eeac0fc3083bb958c2b41ec46b95f29c/atomicwrites-1.1.5-py2.py3-none-any.whl
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/67/a5/f77982214dd4c8fd104b066f249adea2c49e25e8703d284382eb5e9ab35a/py-1.5.3-py2.py3-none-any.whl
Requirement not upgraded as not directly required: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->-r requirements-dev.txt (line 4)) (39.2.0)
Collecting coverage>=3.7.1 (from pytest-cov->-r requirements-dev.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/2c/c0/8047b7cbbcdbd7d21f8d68126196b7915da892c5af3d1a99dba082d33ec0/coverage-4.5.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting ipykernel (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/ab/3f/cd624c835aa3336a9110d0a99e15070f343b881b7d651ab1375ef226a3ac/ipykernel-4.8.2-py3-none-any.whl (108kB)
Collecting nbformat (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/da/27/9a654d2b6cc1eaa517d1c5a4405166c7f6d72f04f6e7eea41855fe808a46/nbformat-4.4.0-py2.py3-none-any.whl (155kB)
Collecting nbconvert (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/39/ea/280d6c0d92f8e3ca15fd798bbcc2ea141489f9539de7133d8fe10ea4b049/nbconvert-5.3.1-py2.py3-none-any.whl (387kB)
Collecting jupyter-client (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/94/dd/fe6c4d683b09eb05342bd2816b7779663f71762b4fa9c2d5203d35d17354/jupyter_client-5.2.3-py2.py3-none-any.whl (89kB)
Collecting pyflakes (from pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/44/98/af7a72c9a543b1487d92813c648cb9b9adfbc96faef5455d60f4439aa99b/pyflakes-2.0.0-py2.py3-none-any.whl (53kB)
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/7d/24/fabc09ad81c6071159a4d12d5bfbddcbea69bd9e3b16c3250ef300c0285f/ipywidgets-7.2.1-py2.py3-none-any.whl (106kB)
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/c0/66/cfed59f574d03ca5f1b7c5281485a9cc9a0f21342d24e0f057572316dae5/notebook-5.5.0-py2.py3-none-any.whl (8.4MB)
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/90/ff/047e0dca2627b162866920e7aa93f04523c0ae81e5c67060eec85701992d/qtconsole-4.3.1-py2.py3-none-any.whl (108kB)
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/77/82/6469cd7fccf7958cbe5dce2e623f1e3c5e27f1bb1ad36d90519bc2d5d370/jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/36/fa/08e9e6e0e3cbd1d362c3bbee8d01d0aedb2155c4ac112b19ef3cae8eed8d/docutils-0.14-py3-none-any.whl (543kB)
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/b8/ad/c6f60602d3ee3d92fbed87675b6fb6a6f9a38c223343ababdb44ba201f10/Babel-2.6.0-py2.py3-none-any.whl (8.1MB)
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/2e/c3/9b7dcd8548cf2c00531763ba154e524af575e8f36701bacfe5bcadc67440/alabaster-0.7.10-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/e9/79/31cc1c2e0daf575f8fd2b581e2975e6a6938bd439581f766b79c50479521/imagesize-1.0.0-py2.py3-none-any.whl
Collecting requests>=2.0.0 (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/02/ee/b6e02dc6529e82b75bb06823ff7d005b141037cb1416b10c6f00fc419dca/Pygments-2.2.0-py2.py3-none-any.whl (841kB)
Collecting packaging (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/ad/c2/b500ea05d5f9f361a562f089fc91f77ed3b4783e13a08a3daf82069b1224/packaging-17.1-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/56/0f/3ee19ca5e5a1d9751cf4bbeb372d40a46421c4321fe55a4703ba66d0bafb/sphinxcontrib_websupport-1.0.1-py2.py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/d4/6c/8a935e2c7b54a37714656d753e4187ee0631988184ed50c0cf6476858566/snowballstemmer-1.2.1-py2.py3-none-any.whl (64kB)
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/93/d6/abcb22de61d78e2fc3959c964628a5771e47e7cc60d53e9342e21ed6cc9a/traitlets-4.3.2-py2.py3-none-any.whl (74kB)
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
  Downloading https://files.pythonhosted.org/packages/4d/93/3808cbcebe94d205f55a9a32857df733a603339d32c46cd32669d808d964/CommonMark-0.5.4.tar.gz (120kB)
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/dc/83/15f7833b70d3e067ca91467ca245bae0f6fe56ddc7451aa0dc5606b120f2/pytz-2018.4-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas>=0.19.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/cd/32/5196b64476bd41d596a8aba43506e2403e019c90e1a3dfc21d51b83db5a6/scipy-1.1.0-cp35-cp35m-manylinux1_x86_64.whl (33.1MB)
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/4b/69/c92661a333f733510628f28b8282698b62cdead37291c8491f3271677c02/smart_open-1.5.7.tar.gz
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/2d/9d/7bfab757977067556c7ca5fe437f28e8b8843c95564fca504de79df63b25/pbr-4.0.3-py2.py3-none-any.whl
Collecting kiwisolver>=1.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/7e/31/d6fedd4fb2c94755cd101191e581af30e1650ccce7a35bddb7930fed6574/kiwisolver-1.0.1-cp35-cp35m-manylinux1_x86_64.whl (949kB)
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/6a/8a/718fd7d3458f9fab8e67186b00abdd345b639976bc7fb3ae722e1b026a50/pyparsing-2.2.0-py2.py3-none-any.whl (56kB)
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/4a/85/db5a2df477072b2902b0eb892feb37d88ac635d36245a72a6a69b23b383a/PyYAML-3.12.tar.gz
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/cf/d1/3be271ae5eba9fb59df63c9891fdc7d8044b999e8ac145994cdbfd2ae66a/tornado-5.0.2.tar.gz
Collecting parse>=1.8.2 (from metadata_toolbox->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/79/e1/522401e2cb06d09497f2f56baa3b902116c97dec6f448d02b730e63b44a8/parse-1.8.4.tar.gz
Collecting ipython>=4.0.0 (from ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/b1/7f/91d50f28af3e3a24342561983a7857e399ce24093876e6970b986a0b6677/ipython-6.4.0-py3-none-any.whl (750kB)
Collecting ipython-genutils (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/fa/bc/9bd3b5c2b4774d5f33b2d544f1460be9df7df2fe42f352135381c347c69a/ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/77/de/47e35a97b2b05c2fadbec67d44cfcdcd09b8086951b331d82de90d2912da/jsonschema-2.6.0-py2.py3-none-any.whl
Collecting jupyter-core (from nbformat->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/1d/44/065d2d7bae7bebc06f1dd70d23c36da8c50c0f08b4236716743d706762a8/jupyter_core-4.4.0-py2.py3-none-any.whl (126kB)
Collecting entrypoints>=0.2.2 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/cc/8b/4eefa9b47f1910b3d2081da67726b066e379b04ca897acfe9f92bac56147/entrypoints-0.2.3-py2.py3-none-any.whl
Collecting testpath (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/15/19/d7bc380c479a184e4a5a9ce516e4e2a773165f89b445f7cdced83d94de25/testpath-0.3.1-py2.py3-none-any.whl (161kB)
Collecting pandocfilters>=1.4.1 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz
Collecting bleach (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/30/b6/a8cffbb9ab4b62b557c22703163735210e9cd857d533740c64e1467d228e/bleach-2.1.3-py2.py3-none-any.whl
Collecting mistune>=0.7.4 (from nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/c8/8c/87f4d359438ba0321a2ae91936030110bfcc62fef752656321a72b8c1af9/mistune-0.8.3-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/4a/9c/c736101a42da5947985ff297c2837795f55a1c016cebcc3b49f31ce75c67/pyzmq-17.0.0-cp35-cp35m-manylinux1_x86_64.whl (3.1MB)
Collecting widgetsnbextension~=3.2.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/ff/fa/64acc09fc845a6b2dc0724d6f3f81e829b778ed5e9a7559567b4f19a3f4b/widgetsnbextension-3.2.1-py2.py3-none-any.whl (2.2MB)
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/2e/20/a26211a24425923d46e1213b376a6ee60dc30bcdf1b0c345e2c3769deb1c/terminado-0.8.1-py2.py3-none-any.whl
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/49/46/c3dc27481d1cc57b9385aff41c474ceb7714f7935b1247194adae45db714/Send2Trash-1.5.0-py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/04/d1/c6616dd03701e7e2073f06d5c3b41b012256e42b72561f16a7bd86dd7b43/prompt_toolkit-1.0.15-py3-none-any.whl (247kB)
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sphinx->-r requirements-dev.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/4d/de/32d741db316d8fdb7680822dd37001ef7a448255de9699ab4bfcbdf4172b/MarkupSafe-1.0.tar.gz
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/bc/bb/a24838832ba35baf52f32ab1a49b906b5f82fb7c76b2f6a7e35e140bac30/decorator-4.3.0-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/bd/b7/a88a67002b1185ed9a8e8a6ef15266728c2361fcb4f1d02ea331e4c7741d/boto-2.48.0-py2.py3-none-any.whl (1.4MB)
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/61/39/122222b5e85cd41c391b68a99ee296584b2a2d1d233e7ee32b4532384f2d/bz2file-0.98.tar.gz
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/61/e3/79e80d130ee3dfbb576bd0adb1544ea27d14bfb7bdbaac415c73abc92385/boto3-1.7.29-py2.py3-none-any.whl (128kB)
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/e7/42/074192a165622e645ed4aeade63e76e56b3496a044569b3c6cae3a918352/jedi-0.12.0-py2.py3-none-any.whl (172kB)
Collecting backcall (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/84/71/c8ca4f5bb1e08401b916c68003acf0a0655df935d74d93bf3f3364b310e0/backcall-0.1.0.tar.gz
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/89/e6/b5a1de8b0cc4e07ca1b305a4fcc3f9806025c1b651ea302646341222f88b/pexpect-4.6.0-py2.py3-none-any.whl (57kB)
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip
Collecting pickleshare (from ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/9f/17/daa142fc9be6b76f26f24eeeb9a138940671490b91cb5587393f297c8317/pickleshare-0.7.4-py2.py3-none-any.whl
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/a5/62/bbd2be0e7943ec8504b517e62bab011b4946e1258842bc159e5dfde15b96/html5lib-1.0.1-py2.py3-none-any.whl (117kB)
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/ff/4e/fa4a73ccfefe2b37d7b6898329e7dbcd1ac846ba3a3b26b294a78a3eb997/ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/d7/14/2a0004d487464d120c9fb85313a75cd3d71a7506955be458eebfe19a6b1d/s3transfer-0.1.13-py2.py3-none-any.whl (59kB)
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/b7/31/05c8d001f7f87f0f07289a5fc0fc3832e9a57f2dbd4d3b0fee70e0d51365/jmespath-0.9.3-py2.py3-none-any.whl
Collecting botocore<1.11.0,>=1.10.29 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/2c/f7/b2bd70e45a6efb008c9995d895ab00ef54cf2af35a84fc16625e020455b1/botocore-1.10.29-py2.py3-none-any.whl (4.3MB)
Collecting parso>=0.2.0 (from jedi>=0.10->ipython>=4.0.0->ipykernel->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/cd/3e/5908f9577dbd1e5df53e64349bfd11e46b726c1e4d8cd676bbe8aa4de316/parso-0.2.1-py2.py3-none-any.whl (91kB)
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->pytest-nbsmoke->-r requirements-dev.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl
Building wheels for collected packages: nbsphinx, regex, bokeh, metadata-toolbox, commonmark, smart-open, PyYAML, tornado, parse, pandocfilters, MarkupSafe, bz2file, backcall, simplegeneric
  Running setup.py bdist_wheel for nbsphinx: started
  Running setup.py bdist_wheel for nbsphinx: finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-il320lqg/wheels/a3/7b/ff/b71160a2e088926eb13abcf8e0cde94c0aa8400c67d3df57f5
  Running setup.py bdist_wheel for regex: started
  Running setup.py bdist_wheel for regex: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/6b/c9/cf/230425cdd343d6b98e8da5a5841c3dab1e0c8aaa134e29edb0
  Running setup.py bdist_wheel for bokeh: started
  Running setup.py bdist_wheel for bokeh: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/ff/28/51/22e8d08e9d5383ee1de981aaa8ff7bc53c7d65022e5101400f
  Running setup.py bdist_wheel for metadata-toolbox: started
  Running setup.py bdist_wheel for metadata-toolbox: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/b5/2f/fb/bab1240c4d6756f31265919deefed0908b81fc62246bf93dbb
  Running setup.py bdist_wheel for commonmark: started
  Running setup.py bdist_wheel for commonmark: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/a0/f4/35/019d917f6875107ade3aad634c982f5c6b604c5631cddf20ac
  Running setup.py bdist_wheel for smart-open: started
  Running setup.py bdist_wheel for smart-open: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/b1/9e/7d/bb3d3b55c597e72617140a0638c06382a5f17283881eae163e
  Running setup.py bdist_wheel for PyYAML: started
  Running setup.py bdist_wheel for PyYAML: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/03/05/65/bdc14f2c6e09e82ae3e0f13d021e1b6b2481437ea2f207df3f
  Running setup.py bdist_wheel for tornado: started
  Running setup.py bdist_wheel for tornado: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/29/8c/cf/6a5a8f6e35d877c0cb72b109d21c34981504897ce9a605e599
  Running setup.py bdist_wheel for parse: started
  Running setup.py bdist_wheel for parse: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/d9/25/e7/8f6f6a0e923b019d09148f5ea711333b94fab03f585e990eee
  Running setup.py bdist_wheel for pandocfilters: started
  Running setup.py bdist_wheel for pandocfilters: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/39/01/56/f1b08a6275acc59e846fa4c1e1b65dbc1919f20157d9e66c20
  Running setup.py bdist_wheel for MarkupSafe: started
  Running setup.py bdist_wheel for MarkupSafe: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/33/56/20/ebe49a5c612fffe1c5a632146b16596f9e64676768661e4e46
  Running setup.py bdist_wheel for bz2file: started
  Running setup.py bdist_wheel for bz2file: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/81/75/d6/e1317bf09bf1af5a30befc2a007869fa6e1f516b8f7c591cb9
  Running setup.py bdist_wheel for backcall: started
  Running setup.py bdist_wheel for backcall: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/98/b0/dd/29e28ff615af3dda4c67cab719dd51357597eabff926976b45
  Running setup.py bdist_wheel for simplegeneric: started
  Running setup.py bdist_wheel for simplegeneric: finished with status 'done'
  Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/a9/28/53/f24776b4c5bcbe91aaf1f1e247bd6fadd17191aa12fac63902
Successfully built nbsphinx regex bokeh metadata-toolbox commonmark smart-open PyYAML tornado parse pandocfilters MarkupSafe bz2file backcall simplegeneric
Installing collected packages: six, pluggy, more-itertools, attrs, atomicwrites, py, pytest, coverage, pytest-cov, tornado, pyzmq, python-dateutil, ipython-genutils, decorator, traitlets, jupyter-core, jupyter-client, Pygments, parso, jedi, wcwidth, prompt-toolkit, backcall, ptyprocess, pexpect, simplegeneric, pickleshare, ipython, ipykernel, jsonschema, nbformat, entrypoints, MarkupSafe, Jinja2, testpath, pandocfilters, webencodings, html5lib, bleach, mistune, nbconvert, pyflakes, pytest-nbsmoke, terminado, Send2Trash, notebook, widgetsnbextension, ipywidgets, qtconsole, jupyter-console, jupyter, docutils, pytz, babel, alabaster, imagesize, urllib3, idna, certifi, chardet, requests, pyparsing, packaging, sphinxcontrib-websupport, snowballstemmer, sphinx, nbsphinx, commonmark, recommonmark, numpy, pandas, regex, scipy, boto, bz2file, jmespath, botocore, s3transfer, boto3, smart-open, gensim, pbr, lda, lxml, kiwisolver, cycler, matplotlib, PyYAML, bokeh, parse, metadata-toolbox, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Send2Trash-1.5.0 alabaster-0.7.10 atomicwrites-1.1.5 attrs-18.1.0 babel-2.6.0 backcall-0.1.0 bleach-2.1.3 bokeh-0.12.16 boto-2.48.0 boto3-1.7.29 botocore-1.10.29 bz2file-0.98 certifi-2018.4.16 chardet-3.0.4 commonmark-0.5.4 coverage-4.5.1 cycler-0.10.0 dariah-topics decorator-4.3.0 docutils-0.14 entrypoints-0.2.3 gensim-3.4.0 html5lib-1.0.1 idna-2.6 imagesize-1.0.0 ipykernel-4.8.2 ipython-6.4.0 ipython-genutils-0.2.0 ipywidgets-7.2.1 jedi-0.12.0 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.3 jupyter-console-5.2.0 jupyter-core-4.4.0 kiwisolver-1.0.1 lda-1.0.5 lxml-4.2.1 matplotlib-2.2.2 metadata-toolbox-0.1.0.dev0 mistune-0.8.3 more-itertools-4.2.0 nbconvert-5.3.1 nbformat-4.4.0 nbsphinx-0.3.3 notebook-5.5.0 numpy-1.14.3 packaging-17.1 pandas-0.23.0 pandocfilters-1.4.2 parse-1.8.4 parso-0.2.1 pbr-4.0.3 pexpect-4.6.0 pickleshare-0.7.4 pluggy-0.6.0 prompt-toolkit-1.0.15 ptyprocess-0.5.2 py-1.5.3 pyflakes-2.0.0 pyparsing-2.2.0 pytest-3.6.0 pytest-cov-2.5.1 pytest-nbsmoke-0.1.6 python-dateutil-2.7.3 pytz-2018.4 pyzmq-17.0.0 qtconsole-4.3.1 recommonmark-0.4.0 regex-2018.2.21 requests-2.18.4 s3transfer-0.1.13
scipy-1.1.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.7 snowballstemmer-1.2.1 sphinx-1.7.5 sphinxcontrib-websupport-1.0.1 terminado-0.8.1 testpath-0.3.1 tornado-5.0.2 traitlets-4.3.2 urllib3-1.22 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.2.1
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dariah_topics.egg-info/PKG-INFO
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
writing requirements to dariah_topics.egg-info/requires.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
creating dariah_topics-1.0.0.dev0
creating dariah_topics-1.0.0.dev0/dariah_topics
creating dariah_topics-1.0.0.dev0/dariah_topics.egg-info
creating dariah_topics-1.0.0.dev0/test
copying files to dariah_topics-1.0.0.dev0...
copying README.md -> dariah_topics-1.0.0.dev0
copying setup.cfg -> dariah_topics-1.0.0.dev0
copying setup.py -> dariah_topics-1.0.0.dev0
copying dariah_topics/__init__.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics/modeling.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics/utils.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-1.0.0.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-1.0.0.dev0/test
Writing dariah_topics-1.0.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'dariah_topics-1.0.0.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/dariah_topics
copying dariah_topics/modeling.py -> build/lib/dariah_topics
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
copying dariah_topics/utils.py -> build/lib/dariah_topics
copying dariah_topics/postprocessing.py -> build/lib/dariah_topics
copying dariah_topics/evaluation.py -> build/lib/dariah_topics
copying dariah_topics/__init__.py -> build/lib/dariah_topics
copying dariah_topics/visualization.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/modeling.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/utils.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-1.0.0.dev0-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-1.0.0.dev0.dist-info/WHEEL
creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-1.0.0.dev0-py3-none-any.whl' and adding '.' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/modeling.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/utils.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-1.0.0.dev0.dist-info/top_level.txt'
adding 'dariah_topics-1.0.0.dev0.dist-info/WHEEL'
adding 'dariah_topics-1.0.0.dev0.dist-info/METADATA'
adding 'dariah_topics-1.0.0.dev0.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:398: UserWarning: Normalizing '1.0.0.dev' to '1.0.0.dev0'
  normalized_version,
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.5.3, pytest-3.6.0, py-1.5.3, pluggy-0.6.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: nbsmoke-0.1.6, cov-2.5.1
collected 57 items

dariah_topics/postprocessing.py ........... [ 19%]
dariah_topics/preprocessing.py ........................... [ 66%]
dariah_topics/utils.py ...... [ 77%]
notebooks/IntroducingGensim.ipynb . [ 78%]
notebooks/IntroducingLda.ipynb F [ 80%]
notebooks/IntroducingMallet.ipynb F [ 82%]
notebooks/Visualizations.ipynb F [ 84%]
test/test_fuzzy_segmenting.py ........F [100%]

=================================== FAILURES ===================================
_______________________________________ _______________________________________
self =
func = .
at 0x7fdd00498950> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > return CallInfo(lambda: ihook(item=item, **kwds), when=when, treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb")) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:179: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_HookCaller 'pytest_runtest_call'>, args = () kwargs = {'item': } notincall = set() def __call__(self, *args, **kwargs): if args: raise TypeError("hook calling supports only keyword arguments") assert not self.is_historic() if self.argnames: notincall = set(self.argnames) - set(['__multicall__']) - set( kwargs.keys()) if notincall: warnings.warn( "Argument(s) {} which are declared in the hookspec " "can not be found in this hook call" .format(tuple(notincall)), stacklevel=2, ) > return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_pytest.config.PytestPluginManager object at 0x7fdd2e3c82b0> hook = <_HookCaller 'pytest_runtest_call'> methods = [, , ] kwargs = {'item': } def _hookexec(self, hook, methods, kwargs): # called from all hookcaller instances. 
# enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [, , ] kwargs = {'item': } self._inner_hookexec = lambda hook, methods, kwargs: \ hook.multicall( methods, kwargs, > firstresult=hook.spec_opts.get('firstresult'), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [, , ] caller_kwargs = {'item': } firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) 
_raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. """ __tracebackhide__ = True if self._excinfo is None: return self._result else: ex = self._excinfo if _py3: > raise ex[1].with_traceback(ex[2]) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [, , ] caller_kwargs = {'item': } firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). 
""" __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: > res = hook_impl.function(*args) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ item = def pytest_runtest_call(item): _update_current_test_var(item, 'call') sys.last_type, sys.last_value, sys.last_traceback = (None, None, None) try: > item.runtest() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:110: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def runtest(self): self._skip() with io.open(self.name,encoding='utf8') as nb: notebook = nbformat.read(nb, as_version=4) # TODO: which kernel? run in pytest's or use new one (make it option) _timeout = self.parent.parent.config.getini('cell_timeout') kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300, allow_errors=False, # or sys.version_info[1] ? kernel_name='python') ep = ExecutePreprocessor(**kwargs) with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right? 
> ep.preprocess(notebook,{}) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pytest_nbsmoke.py:136: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = nb = {'metadata': {'git': {'suppress_outputs': True}, 'language_info': {'nbconvert_exporter': 'python', 'pygments_lexer': '...: 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'metadata': {}, 'cell_type': 'code'}]} resources = {} def preprocess(self, nb, resources): """ Preprocess notebook executing each code cell. The input argument `nb` is modified in-place. Parameters ---------- nb : NotebookNode Notebook being executed. resources : dictionary Additional resources used in the conversion process. For example, passing ``{'metadata': {'path': run_path}}`` sets the execution path to ``run_path``. Returns ------- nb : NotebookNode The executed notebook. resources : dictionary Additional resources used in the conversion process. """ path = resources.get('metadata', {}).get('path', '') if path == '': path = None # clear display_id map self._display_id_map = {} # from jupyter_client.manager import start_new_kernel def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs): km = self.kernel_manager_class(kernel_name=kernel_name) km.start_kernel(**kwargs) kc = km.client() kc.start_channels() try: kc.wait_for_ready(timeout=startup_timeout) except RuntimeError: kc.stop_channels() km.shutdown_kernel() raise return km, kc kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python') if self.kernel_name: kernel_name = self.kernel_name self.log.info("Executing notebook with kernel: %s" % kernel_name) self.km, self.kc = start_new_kernel( startup_timeout=self.startup_timeout, kernel_name=kernel_name, extra_arguments=self.extra_arguments, cwd=path) self.kc.allow_stdin = False self.nb = nb try: > nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources) 
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = nb = {'metadata': {'git': {'suppress_outputs': True}, 'language_info': {'nbconvert_exporter': 'python', 'pygments_lexer': '...: 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'metadata': {}, 'cell_type': 'code'}]} resources = {} def preprocess(self, nb, resources): """ Preprocessing to apply on each notebook. Must return modified nb, resources. If you wish to apply your preprocessing to each cell, you might want to override preprocess_cell method instead. Parameters ---------- nb : NotebookNode Notebook being converted resources : dictionary Additional resources used in the conversion process. Allows preprocessors to pass variables into the Jinja engine. """ for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = cell = {'execution_count': 6, 'outputs': [{'evalue': "'PosixPath' object has no attribute 'rfind'", 'output_type': 'error', '...[:5]' to the variable, only the first 5 elements will be printed", 'metadata': {'scrolled': True}, 'cell_type': 'code'} resources = {}, cell_index = 16 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. 
""" if cell.cell_type != 'code': return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs if not self.allow_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')]) E meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E ------------------ E E --------------------------------------------------------------------------- E AttributeError Traceback (most recent call last) E  in () E ----> 1 meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')]) E  2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E E  in (.0) E ----> 1 meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')]) E  2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/metadata_toolbox/utils.py in fname2metadata(fname, pattern) E  60 """ E  61 log.debug("Extracting metadata from filename '{0}' with pattern '{1}' ...".format(fname, pattern)) E ---> 62 basename, _ = os.path.splitext(os.path.basename(fname)) E  63 metadata = parse(pattern, basename) E  64 if metadata is not None: E E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/posixpath.py in basename(p) E  137 """Returns the final component of a pathname""" E  138 sep = _get_sep(p) E --> 139 i = p.rfind(sep) + 1 E  140 return p[i:] E  141  E E AttributeError: 'PosixPath' object has no attribute 'rfind' E AttributeError: 'PosixPath' object has no attribute 'rfind' 
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError _______________________________________ _______________________________________ self = func = . at 0x7fdcff52dbf8> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > return CallInfo(lambda: ihook(item=item, **kwds), when=when, treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb")) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:179: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_HookCaller 'pytest_runtest_call'>, args = () kwargs = {'item': } notincall = set() def __call__(self, *args, **kwargs): if args: raise TypeError("hook calling supports only keyword arguments") assert not self.is_historic() if self.argnames: notincall = set(self.argnames) - set(['__multicall__']) - set( kwargs.keys()) if notincall: warnings.warn( "Argument(s) {} which are declared in the hookspec " "can not be found in this hook call" .format(tuple(notincall)), stacklevel=2, ) > return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_pytest.config.PytestPluginManager object at 0x7fdd2e3c82b0> hook = <_HookCaller 'pytest_runtest_call'> methods = [, , ] kwargs = {'item': } def _hookexec(self, hook, methods, kwargs): # called from 
all hookcaller instances. # enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [, , ] kwargs = {'item': } self._inner_hookexec = lambda hook, methods, kwargs: \ hook.multicall( methods, kwargs, > firstresult=hook.spec_opts.get('firstresult'), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [, , ] caller_kwargs = {'item': } firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). 
""" __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) _raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. 
""" __tracebackhide__ = True if self._excinfo is None: return self._result else: ex = self._excinfo if _py3: > raise ex[1].with_traceback(ex[2]) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [, , ] caller_kwargs = {'item': } firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: > res = hook_impl.function(*args) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ item = def pytest_runtest_call(item): _update_current_test_var(item, 'call') sys.last_type, sys.last_value, sys.last_traceback = (None, None, None) try: > item.runtest() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:110: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def runtest(self): self._skip() with io.open(self.name,encoding='utf8') as nb: notebook = nbformat.read(nb, as_version=4) # TODO: which kernel? 
run in pytest's or use new one (make it option) _timeout = self.parent.parent.config.getini('cell_timeout') kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300, allow_errors=False, # or sys.version_info[1] ? kernel_name='python') ep = ExecutePreprocessor(**kwargs) with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right? > ep.preprocess(notebook,{}) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pytest_nbsmoke.py:136: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = nb = {'metadata': {'git': {'suppress_outputs': True}, 'language_info': {'nbconvert_exporter': 'python', 'pygments_lexer': '...: 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'metadata': {}, 'cell_type': 'code'}]} resources = {} def preprocess(self, nb, resources): """ Preprocess notebook executing each code cell. The input argument `nb` is modified in-place. Parameters ---------- nb : NotebookNode Notebook being executed. resources : dictionary Additional resources used in the conversion process. For example, passing ``{'metadata': {'path': run_path}}`` sets the execution path to ``run_path``. Returns ------- nb : NotebookNode The executed notebook. resources : dictionary Additional resources used in the conversion process. 
""" path = resources.get('metadata', {}).get('path', '') if path == '': path = None # clear display_id map self._display_id_map = {} # from jupyter_client.manager import start_new_kernel def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs): km = self.kernel_manager_class(kernel_name=kernel_name) km.start_kernel(**kwargs) kc = km.client() kc.start_channels() try: kc.wait_for_ready(timeout=startup_timeout) except RuntimeError: kc.stop_channels() km.shutdown_kernel() raise return km, kc kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python') if self.kernel_name: kernel_name = self.kernel_name self.log.info("Executing notebook with kernel: %s" % kernel_name) self.km, self.kc = start_new_kernel( startup_timeout=self.startup_timeout, kernel_name=kernel_name, extra_arguments=self.extra_arguments, cwd=path) self.kc.allow_stdin = False self.nb = nb try: > nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = nb = {'metadata': {'git': {'suppress_outputs': True}, 'language_info': {'nbconvert_exporter': 'python', 'pygments_lexer': '...: 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'metadata': {}, 'cell_type': 'code'}]} resources = {} def preprocess(self, nb, resources): """ Preprocessing to apply on each notebook. Must return modified nb, resources. If you wish to apply your preprocessing to each cell, you might want to override preprocess_cell method instead. Parameters ---------- nb : NotebookNode Notebook being converted resources : dictionary Additional resources used in the conversion process. Allows preprocessors to pass variables into the Jinja engine. 
""" for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = cell = {'execution_count': 6, 'outputs': [{'evalue': "'PosixPath' object has no attribute 'rfind'", 'output_type': 'error', '...5] # by adding '[:5]' to the variable, only the first 5 elements will be printed", 'metadata': {}, 'cell_type': 'code'} resources = {}, cell_index = 16 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. """ if cell.cell_type != 'code': return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs if not self.allow_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')]) E meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E ------------------ E E --------------------------------------------------------------------------- E AttributeError Traceback (most recent call last) E  in () E ----> 1 meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')]) E  2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E E  in (.0) E ----> 1 meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')]) E  2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E E 
~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/metadata_toolbox/utils.py in fname2metadata(fname, pattern) E  60 """ E  61 log.debug("Extracting metadata from filename '{0}' with pattern '{1}' ...".format(fname, pattern)) E ---> 62 basename, _ = os.path.splitext(os.path.basename(fname)) E  63 metadata = parse(pattern, basename) E  64 if metadata is not None: E E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/posixpath.py in basename(p) E  137 """Returns the final component of a pathname""" E  138 sep = _get_sep(p) E --> 139 i = p.rfind(sep) + 1 E  140 return p[i:] E  141  E E AttributeError: 'PosixPath' object has no attribute 'rfind' E AttributeError: 'PosixPath' object has no attribute 'rfind' ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError _______________________________________ _______________________________________ self = func = . at 0x7fdcff53e730> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > return CallInfo(lambda: ihook(item=item, **kwds), when=when, treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb")) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:179: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_HookCaller 'pytest_runtest_call'>, args = () kwargs = {'item': } notincall = set() def __call__(self, *args, **kwargs): if args: raise TypeError("hook calling supports only keyword arguments") 
assert not self.is_historic() if self.argnames: notincall = set(self.argnames) - set(['__multicall__']) - set( kwargs.keys()) if notincall: warnings.warn( "Argument(s) {} which are declared in the hookspec " "can not be found in this hook call" .format(tuple(notincall)), stacklevel=2, ) > return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_pytest.config.PytestPluginManager object at 0x7fdd2e3c82b0> hook = <_HookCaller 'pytest_runtest_call'> methods = [, , ] kwargs = {'item': } def _hookexec(self, hook, methods, kwargs): # called from all hookcaller instances. # enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [, , ] kwargs = {'item': } self._inner_hookexec = lambda hook, methods, kwargs: \ hook.multicall( methods, kwargs, > firstresult=hook.spec_opts.get('firstresult'), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [, , ] caller_kwargs = {'item': } firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). 
""" __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) _raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. 
        """
        __tracebackhide__ = True
        if self._excinfo is None:
            return self._result
        else:
            ex = self._excinfo
            if _py3:
>               raise ex[1].with_traceback(ex[2])

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook_impls = [, , ]
caller_kwargs = {'item': }
firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).

        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,))
                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
>                       res = hook_impl.function(*args)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

item = 

    def pytest_runtest_call(item):
        _update_current_test_var(item, 'call')
        sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
        try:
>           item.runtest()

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:110:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def runtest(self):
        self._skip()
        with io.open(self.name,encoding='utf8') as nb:
            notebook = nbformat.read(nb, as_version=4)
        # TODO: which kernel?
        # run in pytest's or use new one (make it option)
        _timeout = self.parent.parent.config.getini('cell_timeout')
        kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
                      allow_errors=False,
                      # or sys.version_info[1] ?
                      kernel_name='python')
        ep = ExecutePreprocessor(**kwargs)
        with cwd(os.path.dirname(self.name)):  # jupyter notebook always does this, right?
>           ep.preprocess(notebook,{})

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pytest_nbsmoke.py:136:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
nb = {'metadata': {'language_info': {'nbconvert_exporter': 'python', 'pygments_lexer': 'ipython3', 'mimetype': 'text/x-pyth...show(interactive_barchart_per_document, notebook_handle=True)", 'metadata': {'scrolled': False}, 'cell_type': 'code'}]}
resources = {}

    def preprocess(self, nb, resources):
        """
        Preprocess notebook executing each code cell.

        The input argument `nb` is modified in-place.

        Parameters
        ----------
        nb : NotebookNode
            Notebook being executed.
        resources : dictionary
            Additional resources used in the conversion process. For example,
            passing ``{'metadata': {'path': run_path}}`` sets the
            execution path to ``run_path``.

        Returns
        -------
        nb : NotebookNode
            The executed notebook.
        resources : dictionary
            Additional resources used in the conversion process.
        """
        path = resources.get('metadata', {}).get('path', '')
        if path == '':
            path = None

        # clear display_id map
        self._display_id_map = {}

        # from jupyter_client.manager import start_new_kernel
        def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs):
            km = self.kernel_manager_class(kernel_name=kernel_name)
            km.start_kernel(**kwargs)
            kc = km.client()
            kc.start_channels()
            try:
                kc.wait_for_ready(timeout=startup_timeout)
            except RuntimeError:
                kc.stop_channels()
                km.shutdown_kernel()
                raise
            return km, kc

        kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python')
        if self.kernel_name:
            kernel_name = self.kernel_name
        self.log.info("Executing notebook with kernel: %s" % kernel_name)
        self.km, self.kc = start_new_kernel(
            startup_timeout=self.startup_timeout,
            kernel_name=kernel_name,
            extra_arguments=self.extra_arguments,
            cwd=path)
        self.kc.allow_stdin = False
        self.nb = nb

        try:
>           nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
nb = {'metadata': {'language_info': {'nbconvert_exporter': 'python', 'pygments_lexer': 'ipython3', 'mimetype': 'text/x-pyth...show(interactive_barchart_per_document, notebook_handle=True)", 'metadata': {'scrolled': False}, 'cell_type': 'code'}]}
resources = {}

    def preprocess(self, nb, resources):
        """
        Preprocessing to apply on each notebook.

        Must return modified nb, resources.

        If you wish to apply your preprocessing to each cell, you might want
        to override preprocess_cell method instead.

        Parameters
        ----------
        nb : NotebookNode
            Notebook being converted
        resources : dictionary
            Additional resources used in the conversion process. Allows
            preprocessors to pass variables into the Jinja engine.
        """
        for index, cell in enumerate(nb.cells):
>           nb.cells[index], resources = self.preprocess_cell(cell, resources, index)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
cell = {'execution_count': 3, 'outputs': [{'evalue': "'PosixPath' object has no attribute 'rfind'", 'output_type': 'error', '...5] # by adding '[:5]' to the variable, only the first 5 elements will be printed", 'metadata': {}, 'cell_type': 'code'}
resources = {}, cell_index = 2

    def preprocess_cell(self, cell, resources, cell_index):
        """
        Executes a single code cell. See base.py for details.

        To execute all cells see :meth:`preprocess`.
        """
        if cell.cell_type != 'code':
            return cell, resources

        reply, outputs = self.run_cell(cell, cell_index)
        cell.outputs = outputs

        if not self.allow_errors:
            for out in outputs:
                if out.output_type == 'error':
>                   raise CellExecutionError.from_cell_and_msg(cell, out)
E                   nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell:
E                   ------------------
E                   meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')])
E                   meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed
E                   ------------------
E
E                   ---------------------------------------------------------------------------
E                   AttributeError                            Traceback (most recent call last)
E                    in ()
E                   ----> 1 meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')])
E                         2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed
E
E                    in (.0)
E                   ----> 1 meta = pd.concat([metadata.fname2metadata(path, pattern=pattern) for path in path_to_corpus.glob('*.txt')])
E                         2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed
E
E                   ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/metadata_toolbox/utils.py in fname2metadata(fname, pattern)
E                        60     """
E                        61     log.debug("Extracting metadata from filename '{0}' with pattern '{1}' ...".format(fname, pattern))
E                   ---> 62     basename, _ = os.path.splitext(os.path.basename(fname))
E                        63     metadata = parse(pattern, basename)
E                        64     if metadata is not None:
E
E                   ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/posixpath.py in basename(p)
E                       137     """Returns the final component of a pathname"""
E                       138     sep = _get_sep(p)
E                   --> 139     i = p.rfind(sep) + 1
E                       140     return p[i:]
E                       141
E
E                   AttributeError: 'PosixPath' object has no attribute 'rfind'
E                   AttributeError: 'PosixPath' object has no attribute 'rfind'

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError
_________________________________ test_segment _________________________________

    def test_segment():
        """segment convenience wrapper"""
        path = project_path.joinpath('grenzboten_sample', 'Beck_1844_Tagebuch_56.txt')
>       text = path.read_text(encoding='utf-8')

test/test_fuzzy_segmenting.py:102:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

/usr/lib/python3.5/pathlib.py:1164: in read_text
    with self.open(mode='r', encoding=encoding, errors=errors) as f:
/usr/lib/python3.5/pathlib.py:1151: in open
    opener=self._opener)
/usr/lib/python3.5/pathlib.py:1005: in _opener
    return self._accessor.open(self, flags, mode)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

pathobj = PosixPath('/mnt/data/jenkins/workspace/DARIAH-Topics/grenzboten_sample/Beck_1844_Tagebuch_56.txt')
args = (524288, 438)

    @functools.wraps(strfunc)
    def wrapped(pathobj, *args):
>       return strfunc(str(pathobj), *args)
E       FileNotFoundError: [Errno 2] No such file or directory: '/mnt/data/jenkins/workspace/DARIAH-Topics/grenzboten_sample/Beck_1844_Tagebuch_56.txt'
/usr/lib/python3.5/pathlib.py:371: FileNotFoundError

--- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ----

----------- coverage: platform linux, python 3.5.3-final-0 -----------
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml

=============================== warnings summary ===============================
dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_lda_document_topics
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead.
    """

dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_lda_topics
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead.
    """

dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._create_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._hapax_legomena_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._remove_features_from_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing._stopwords_large_corpus_model
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.create_document_term_matrix
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_hapax_legomena
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_stopwords
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

dariah_topics/preprocessing.py::dariah_topics.preprocessing.remove_features
  /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
    document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))

-- Docs: http://doc.pytest.org/en/latest/warnings.html
============== 4 failed, 53 passed, 12 warnings in 55.07 seconds ===============
Build step 'Virtualenv Builder' marked build as failure
Recording test results
[Cobertura] Publishing Cobertura coverage report...
[Cobertura] Publishing Cobertura coverage results...
[Cobertura] Cobertura coverage report found.
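The notebook cell failures above share one root cause: `metadata_toolbox.utils.fname2metadata` hands a `pathlib.PosixPath` (yielded by `path_to_corpus.glob('*.txt')`) to `os.path.basename`, and on Python 3.5 the `os.path` functions accept only strings (path-protocol support arrived with PEP 519 in Python 3.6). A minimal sketch of a backward-compatible fix; `basename_compat` is a name chosen here for illustration, not the library's:

```python
import os
from pathlib import Path


def basename_compat(fname):
    """Return the filename without directory and extension.

    Accepts both str and pathlib paths. On Python 3.5,
    os.path.basename() rejects path objects with
    "'PosixPath' object has no attribute 'rfind'", so we
    coerce to str first; on 3.6+ this is equally safe.
    """
    basename, _ = os.path.splitext(os.path.basename(str(fname)))
    return basename
```

With this guard, `basename_compat(Path('corpus/Beck_1844_Tagebuch_56.txt'))` yields `'Beck_1844_Tagebuch_56'`, and the subsequent `parse(pattern, basename)` step in `fname2metadata` could run unchanged.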
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@43925ee7[description=A Python library for topic modeling.,homepage=https://dariah-de.github.io/Topics,name=Topics,fork=true,size=42253,milestones={},language=Python,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Wed, 30 May 2018 12:57:40 GMT], ETag=[W/"71601962f23e9582e1b2390e8fd4de38"], Last-Modified=[Tue, 22 May 2018 20:11:01 GMT], OkHttp-Received-Millis=[1527685060085], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1527685059893], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[A6D3:1616:40CF4D1:80DBA9A:5B0E9FC3], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4975], X-RateLimit-Reset=[1527687335], X-Runtime-rack=[0.092824], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:08f9594) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/08f95941bf3af56e68d7b61296131eed6676b1d5
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done.
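`test_segment`, by contrast, fails on missing test data rather than a code bug: the `grenzboten_sample` corpus is not present in the Jenkins workspace. One conventional guard is to skip the test when the fixture file is absent. This sketch assumes a `project_path` pointing at the repository root; the real test module in `test/test_fuzzy_segmenting.py` derives it differently:

```python
from pathlib import Path

import pytest

# Hypothetical: the actual test module computes project_path itself.
project_path = Path('.')
sample = project_path / 'grenzboten_sample' / 'Beck_1844_Tagebuch_56.txt'


@pytest.mark.skipif(not sample.exists(),
                    reason='grenzboten_sample corpus not in workspace')
def test_segment():
    """segment convenience wrapper (skipped when the corpus is missing)."""
    text = sample.read_text(encoding='utf-8')
    assert text
```

Alternatively, the sample corpus could simply be committed to the repository so CI checkouts always contain it; skipping merely keeps missing data from masquerading as a regression.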
0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] scored: -10.0
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: FAILURE
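The 12 FutureWarnings in the summary above name two pandas APIs that were deprecated and later removed: `DataFrame.as_matrix()` (replaced by `.values`) and `DataFrame.set_value()` (replaced by the `.at`/`.iat` accessors). A sketch of the replacements on a toy document-term matrix; the frame here is illustrative, not the one built in `preprocessing.py`:

```python
import pandas as pd

# Toy document-term matrix standing in for the one in preprocessing.py.
dtm = pd.DataFrame(0, index=['doc1', 'doc2'], columns=['word1', 'word2'])

# Deprecated: dtm.set_value('doc1', 'word1', 5)
dtm.at['doc1', 'word1'] = 5       # label-based scalar assignment

# Deprecated: matrix = dtm.as_matrix()
matrix = dtm.values               # plain NumPy array
```

For the MultiIndexed matrix at `preprocessing.py:732`, the equivalent of `set_value((document_id, type_id), 0, n)` would presumably be `document_term_matrix.at[(document_id, type_id), 0] = n`, with the tuple addressing the MultiIndex row label.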