Started by GitHub push by ThoraHagen
Building remotely on build3 (digibib) in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
Cloning the remote Git repository
Cloning repository git://github.com/DARIAH-DE/Topics
 > /usr/bin/git init /mnt/data/jenkins/workspace/DARIAH-Topics # timeout=10
Fetching upstream changes from git://github.com/DARIAH-DE/Topics
 > /usr/bin/git --version # timeout=10
using GIT_ASKPASS to set credentials
 > /usr/bin/git fetch --tags --progress git://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > /usr/bin/git config remote.origin.url git://github.com/DARIAH-DE/Topics # timeout=10
 > /usr/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > /usr/bin/git config remote.origin.url git://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from git://github.com/DARIAH-DE/Topics
using GIT_ASKPASS to set credentials
 > /usr/bin/git fetch --tags --progress git://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > /usr/bin/git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > /usr/bin/git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision fa8b872c7b9d265332c411fa666db2371d256e62 (refs/remotes/origin/testing)
 > /usr/bin/git config core.sparsecheckout # timeout=10
 > /usr/bin/git checkout -f fa8b872c7b9d265332c411fa666db2371d256e62
Commit message: "added DKProWrapper notebook"
 > /usr/bin/git rev-list --no-walk c642d76502236177ea01170dc42737ed78dceffd # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda6488513194368952292.sh + pip install -U pip Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages (18.1) + pip install -U -r requirements-dev.txt Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1)) Collecting pytest (from -r requirements-dev.txt (line 4)) Downloading https://files.pythonhosted.org/packages/80/2a/2995cbec008f624e1dd8ddc5350535a5d2f33c10b82c06037298e6c52bee/pytest-3.9.2-py2.py3-none-any.whl (214kB) Collecting pytest-cov (from -r requirements-dev.txt (line 5)) Downloading https://files.pythonhosted.org/packages/30/0a/1b009b525526cd3cd9f52f52391b426c5a3597447be811a10bcb1f6b05eb/pytest_cov-2.6.0-py2.py3-none-any.whl Collecting nbsmoke (from -r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/36/be/1a91e27aa140a859322182d56787c0e55e0d417537cb1ec9c781a772f9c5/nbsmoke-0.2.7-py2.py3-none-any.whl Collecting jupyter (from -r requirements-dev.txt (line 9)) Using cached https://files.pythonhosted.org/packages/83/df/0f5dd132200728a86190397e1ea87cd76244e42d39ec5e88efd25b2abd7e/jupyter-1.0.0-py2.py3-none-any.whl Collecting sphinx (from -r requirements-dev.txt (line 10)) Downloading https://files.pythonhosted.org/packages/35/e0/e9e83b244eaa382ba21896dda6172617e47aff0be225eb72782cca105d3c/Sphinx-1.8.1-py2.py3-none-any.whl (3.1MB) Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11)) Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-install-bzfglbsn/nbsphinx Collecting recommonmark (from -r requirements-dev.txt (line 12)) Using cached https://files.pythonhosted.org/packages/df/a5/8ee4b84af7f997dfdba71254a88008cfc19c49df98983c9a4919e798f8ce/recommonmark-0.4.0-py2.py3-none-any.whl Collecting pandas>=0.19.2 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/e1/d8/feeb346d41f181e83fba45224ab14a8d8af019b48af742e047f3845d8cff/pandas-0.23.4-cp36-cp36m-manylinux1_x86_64.whl Collecting regex>=2017.01.14 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/2a/0a/944977367c8a6cfcfa6fcb8ac6b1f0f9a667c1f34194091c766b5d7c44d7/regex-2018.08.29.tar.gz (643kB) Collecting gensim>=0.13.2 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/27/a4/d10c0acc8528d838cda5eede0ee9c784caa598dbf40bd0911ff8d067a7eb/gensim-3.6.0-cp36-cp36m-manylinux1_x86_64.whl (23.6MB) Collecting lda>=1.0.5 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/fd/27/d62628d914bff7f048e2b433c3adea9e7072fa20028f1d4194999051cd9d/lda-1.1.0-cp36-cp36m-manylinux1_x86_64.whl (348kB) Collecting numpy>=1.3 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/16/21/2e88568c134cc3c8d22af290865e2abbd86efa58a1358ffcb19b6c74f9a3/numpy-1.15.3-cp36-cp36m-manylinux1_x86_64.whl (13.9MB) Collecting lxml>=3.6.4 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/03/a4/9eea8035fc7c7670e5eab97f34ff2ef0ddd78a491bf96df5accedb0e63f5/lxml-4.2.5-cp36-cp36m-manylinux1_x86_64.whl (5.8MB) Collecting matplotlib>=1.5.3 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading 
https://files.pythonhosted.org/packages/ed/89/dd823436a5f8d5ca9304b51b554863bfd366ca84708d5812f5ee87c923bc/matplotlib-3.0.0-cp36-cp36m-manylinux1_x86_64.whl (12.8MB) Collecting bokeh>=0.12.6 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Collecting metadata_toolbox (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Collecting cophi_toolbox>=0.1.1.dev0 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Collecting more-itertools>=4.0.0 (from pytest->-r requirements-dev.txt (line 4)) Using cached https://files.pythonhosted.org/packages/79/b1/eace304ef66bd7d3d8b2f78cc374b73ca03bc53664d78151e9df3b3996cc/more_itertools-4.3.0-py3-none-any.whl Collecting atomicwrites>=1.0 (from pytest->-r requirements-dev.txt (line 4)) Downloading https://files.pythonhosted.org/packages/3a/9a/9d878f8d885706e2530402de6417141129a943802c084238914fa6798d97/atomicwrites-1.2.1-py2.py3-none-any.whl Collecting pluggy>=0.7 (from pytest->-r requirements-dev.txt (line 4)) Downloading https://files.pythonhosted.org/packages/1c/e7/017c262070af41fe251401cb0d0e1b7c38f656da634cd0c15604f1f30864/pluggy-0.8.0-py2.py3-none-any.whl Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4)) Downloading https://files.pythonhosted.org/packages/3e/c7/3da685ef117d42ac8d71af525208759742dd235f8094221fdaafcd3dba8f/py-1.7.0-py2.py3-none-any.whl (83kB) Collecting attrs>=17.4.0 (from pytest->-r requirements-dev.txt (line 4)) Downloading https://files.pythonhosted.org/packages/3a/e1/5f9023cc983f1a628a8c2fd051ad19e76ff7b142a0faf329336f9a62a514/attrs-18.2.0-py2.py3-none-any.whl Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4)) Using cached https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl Requirement already satisfied, skipping upgrade: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages (from pytest->-r requirements-dev.txt (line 4)) (40.4.3) Collecting coverage>=4.4 (from pytest-cov->-r requirements-dev.txt (line 5)) Using cached https://files.pythonhosted.org/packages/3d/a0/b12090c40e0b8196b973962ec71c1c541a6c04af58ba5ad85683b3de251a/coverage-4.5.1-cp36-cp36m-manylinux1_x86_64.whl Collecting jupyter-client (from nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/94/dd/fe6c4d683b09eb05342bd2816b7779663f71762b4fa9c2d5203d35d17354/jupyter_client-5.2.3-py2.py3-none-any.whl Collecting requests (from nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/f1/ca/10332a30cb25b627192b4ea272c351bce3ca1091e541245cccbace6051d8/requests-2.20.0-py2.py3-none-any.whl (60kB) Collecting ipykernel (from nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/d8/b0/f0be5c5ab335196f5cce96e5b889a4fcf5bfe462eb0acc05cd7e2caf65eb/ipykernel-5.1.0-py3-none-any.whl (113kB) Collecting pyflakes (from nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/44/98/af7a72c9a543b1487d92813c648cb9b9adfbc96faef5455d60f4439aa99b/pyflakes-2.0.0-py2.py3-none-any.whl Collecting beautifulsoup4 (from nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/21/0a/47fdf541c97fd9b6a610cb5fd518175308a7cc60569962e776ac52420387/beautifulsoup4-4.6.3-py3-none-any.whl (90kB) Collecting nbconvert (from nbsmoke->-r requirements-dev.txt (line 6)) Downloading 
https://files.pythonhosted.org/packages/b5/bb/94c493051d60e5b9c0f7f9a368b324201818c1b1c4cae85d1e49a41846c7/nbconvert-5.4.0-py2.py3-none-any.whl (405kB) Collecting nbformat (from nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/da/27/9a654d2b6cc1eaa517d1c5a4405166c7f6d72f04f6e7eea41855fe808a46/nbformat-4.4.0-py2.py3-none-any.whl Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/30/9a/a008c7b1183fac9e52066d80a379b3c64eab535bd9d86cdc29a0b766fd82/ipywidgets-7.4.2-py2.py3-none-any.whl (111kB) Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/cb/ee/6374ae8c21b7d0847f9c3722dcdfac986b8e54fa9ad9ea66e1eb6320d2b8/jupyter_console-6.0.0-py2.py3-none-any.whl Collecting notebook (from jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/44/16/9f108b675828c4117cfe72d8d0f97094163c40584e40c46ec48a1e862693/notebook-5.7.0-py2.py3-none-any.whl (9.0MB) Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/37/22/0d8474f78a8c421d485ac2339de7c871d535160f09f170de90c8185b87c4/qtconsole-4.4.2-py2.py3-none-any.whl (112kB) Collecting packaging (from sphinx->-r requirements-dev.txt (line 10)) Downloading https://files.pythonhosted.org/packages/89/d1/92e6df2e503a69df9faab187c684585f0136662c12bb1f36901d426f3fab/packaging-18.0-py2.py3-none-any.whl Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10)) Using cached https://files.pythonhosted.org/packages/52/69/3c2fbdc3702358c5b34ee25e387b24838597ef099761fc9a42c166796e8f/sphinxcontrib_websupport-1.1.0-py2.py3-none-any.whl Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10)) Downloading https://files.pythonhosted.org/packages/fc/b6/aef66b4c52a6ad6ac18cf6ebc5731ed06d8c9ae4d3b2d9951f261150be67/imagesize-1.1.0-py2.py3-none-any.whl Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10)) Using cached https://files.pythonhosted.org/packages/02/ee/b6e02dc6529e82b75bb06823ff7d005b141037cb1416b10c6f00fc419dca/Pygments-2.2.0-py2.py3-none-any.whl Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10)) Using cached https://files.pythonhosted.org/packages/b8/ad/c6f60602d3ee3d92fbed87675b6fb6a6f9a38c223343ababdb44ba201f10/Babel-2.6.0-py2.py3-none-any.whl Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10)) Downloading https://files.pythonhosted.org/packages/10/ad/00b090d23a222943eb0eda509720a404f531a439e803f6538f35136cae9e/alabaster-0.7.12-py2.py3-none-any.whl Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10)) Using cached https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10)) Using cached https://files.pythonhosted.org/packages/d4/6c/8a935e2c7b54a37714656d753e4187ee0631988184ed50c0cf6476858566/snowballstemmer-1.2.1-py2.py3-none-any.whl Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10)) Using cached https://files.pythonhosted.org/packages/36/fa/08e9e6e0e3cbd1d362c3bbee8d01d0aedb2155c4ac112b19ef3cae8eed8d/docutils-0.14-py3-none-any.whl Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11)) Using cached 
https://files.pythonhosted.org/packages/93/d6/abcb22de61d78e2fc3959c964628a5771e47e7cc60d53e9342e21ed6cc9a/traitlets-4.3.2-py2.py3-none-any.whl Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12)) Collecting python-dateutil>=2.5.0 (from pandas>=0.19.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/52/8b/876c5745f617630be90cfb8fafe363c6d7204b176dc707d1805d1e9a0a35/pytz-2018.6-py2.py3-none-any.whl (507kB) Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/4b/1f/6f27e3682124de63ac97a0a5876da6186de6c19410feab66c1543afab055/smart_open-1.7.1.tar.gz Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/a8/0b/f163da98d3a01b3e0ef1cab8dd2123c34aee2bafbb1c5bffa354cc8a1730/scipy-1.1.0-cp36-cp36m-manylinux1_x86_64.whl Collecting pbr<4,>=0.6 (from lda>=1.0.5->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/0c/5d/b077dbf309993d52c1d71e6bf6fe443a8029ea215135ebbe0b1b10e7aefc/pbr-3.1.1-py2.py3-none-any.whl (99kB) Collecting kiwisolver>=1.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/69/a7/88719d132b18300b4369fbffa741841cfd36d1e637e1990f27929945b538/kiwisolver-1.0.1-cp36-cp36m-manylinux1_x86_64.whl Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/2b/4a/f06b45ab9690d4c37641ec776f7ad691974f4cf6943a73267475b05cbfca/pyparsing-2.2.2-py2.py3-none-any.whl (57kB) Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/e6/78/6e7b5af12c12bdf38ca9bfe863fcaf53dc10430a312d0324e76c1e5ca426/tornado-5.1.1.tar.gz (516kB) Collecting parse>=1.8.2 (from metadata_toolbox->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/b6/98/809e53e5778c59c4af9eb920605e7a8ab439407efbe89a6d51a46efd1937/parse-1.9.0.tar.gz Collecting jupyter-core (from jupyter-client->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/1d/44/065d2d7bae7bebc06f1dd70d23c36da8c50c0f08b4236716743d706762a8/jupyter_core-4.4.0-py2.py3-none-any.whl Collecting pyzmq>=13 (from jupyter-client->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/48/93/59592cb294761aaa40589b544eaa5175446d687ff95beeeb666de60f3274/pyzmq-17.1.2-cp36-cp36m-manylinux1_x86_64.whl (998kB) Collecting chardet<3.1.0,>=3.0.2 (from requests->nbsmoke->-r 
requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl Collecting certifi>=2017.4.17 (from requests->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/56/9d/1d02dd80bc4cd955f98980f28c5ee2200e1209292d5f9e9cc8d030d18655/certifi-2018.10.15-py2.py3-none-any.whl (146kB) Collecting urllib3<1.25,>=1.21.1 (from requests->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/8c/4b/5cbc4cb46095f369117dcb751821e1bef9dd86a07c968d8757e9204c324c/urllib3-1.24-py2.py3-none-any.whl (117kB) Collecting idna<2.8,>=2.5 (from requests->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/4b/2a/0276479a4b3caeb8a8c1af2f8e4355746a97fab05a372e4a2c6a6b876165/idna-2.7-py2.py3-none-any.whl Collecting ipython>=5.0.0 (from ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/a0/27/29d66ed395a5c2c3a912332d446a54e2bc3277c36b0bbd22bc71623e0193/ipython-7.0.1-py3-none-any.whl (760kB) Collecting testpath (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/be/a4/162f9ebb6489421fe46dcca2ae420369edfee4b563c668d93cb4605d12ba/testpath-0.4.2-py2.py3-none-any.whl (163kB) Collecting entrypoints>=0.2.2 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/cc/8b/4eefa9b47f1910b3d2081da67726b066e379b04ca897acfe9f92bac56147/entrypoints-0.2.3-py2.py3-none-any.whl Collecting bleach (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/d4/0d/4696373c3b714f6022d668fbab619690a42050dbeacede6d10ed97fbd3e2/bleach-3.0.2-py2.py3-none-any.whl (148kB) Collecting defusedxml (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/87/1c/17f3e3935a913dfe2a5ca85fa5ccbef366bfd82eb318b1f75dadbf0affca/defusedxml-0.5.0-py2.py3-none-any.whl Collecting mistune>=0.8.1 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/09/ec/4b43dae793655b7d8a25f76119624350b4d65eb663459eb9603d7f1f0345/mistune-0.8.4-py2.py3-none-any.whl Collecting pandocfilters>=1.4.1 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/77/de/47e35a97b2b05c2fadbec67d44cfcdcd09b8086951b331d82de90d2912da/jsonschema-2.6.0-py2.py3-none-any.whl Collecting ipython-genutils (from nbformat->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/fa/bc/9bd3b5c2b4774d5f33b2d544f1460be9df7df2fe42f352135381c347c69a/ipython_genutils-0.2.0-py2.py3-none-any.whl Collecting widgetsnbextension~=3.4.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/8a/81/35789a3952afb48238289171728072d26d6e76649ddc8b3588657a2d78c1/widgetsnbextension-3.4.2-py2.py3-none-any.whl (2.2MB) Collecting prompt-toolkit<2.1.0,>=2.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/06/6f/03f10d2206e34a81934b10fb43975a29028e0fe2ae330f9f000bbc0780b7/prompt_toolkit-2.0.6-py3-none-any.whl (336kB) Collecting terminado>=0.8.1 (from 
notebook->jupyter->-r requirements-dev.txt (line 9)) Using cached https://files.pythonhosted.org/packages/2e/20/a26211a24425923d46e1213b376a6ee60dc30bcdf1b0c345e2c3769deb1c/terminado-0.8.1-py2.py3-none-any.whl Collecting prometheus-client (from notebook->jupyter->-r requirements-dev.txt (line 9)) Downloading https://files.pythonhosted.org/packages/61/84/9aa657b215b04f21a72ca8e50ff159eef9795096683e4581a357baf4dde6/prometheus_client-0.4.2.tar.gz Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9)) Using cached https://files.pythonhosted.org/packages/49/46/c3dc27481d1cc57b9385aff41c474ceb7714f7935b1247194adae45db714/Send2Trash-1.5.0-py3-none-any.whl Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10)) Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11)) Using cached https://files.pythonhosted.org/packages/bc/bb/a24838832ba35baf52f32ab1a49b906b5f82fb7c76b2f6a7e35e140bac30/decorator-4.3.0-py2.py3-none-any.whl Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/23/10/c0b78c27298029e4454a472a1919bde20cb182dab1662cec7f2ca1dcc523/boto-2.49.0-py2.py3-none-any.whl Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/9b/97/6de72ccee57194621f95738535ad2b61f72f981cd3c45a44a1f53cc21341/boto3-1.9.30-py2.py3-none-any.whl (128kB) Collecting backcall (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Collecting jedi>=0.10 (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/7a/1a/9bd24a185873b998611c2d8d4fb15cd5e8a879ead36355df7ee53e9111bf/jedi-0.13.1-py2.py3-none-any.whl (177kB) Collecting pexpect; sys_platform != "win32" (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/89/e6/b5a1de8b0cc4e07ca1b305a4fcc3f9806025c1b651ea302646341222f88b/pexpect-4.6.0-py2.py3-none-any.whl Collecting pickleshare (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Downloading https://files.pythonhosted.org/packages/9a/41/220f49aaea88bc6fa6cba8d05ecf24676326156c23b991e80b3f2fc24c77/pickleshare-0.7.5-py2.py3-none-any.whl Collecting simplegeneric>0.8 (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Collecting webencodings (from bleach->nbconvert->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl Collecting wcwidth (from prompt-toolkit<2.1.0,>=2.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9)) Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9)) Using cached https://files.pythonhosted.org/packages/d1/29/605c2cc68a9992d18dada28206eeada56ea4bd07a239669da41674648b6f/ptyprocess-0.6.0-py2.py3-none-any.whl Collecting jmespath<1.0.0,>=0.7.1 (from 
boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/b7/31/05c8d001f7f87f0f07289a5fc0fc3832e9a57f2dbd4d3b0fee70e0d51365/jmespath-0.9.3-py2.py3-none-any.whl Collecting botocore<1.13.0,>=1.12.30 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Downloading https://files.pythonhosted.org/packages/d8/8e/8d8f131be049fde5c925c9cf7524791f3601e761b66113892a5dd430a6d7/botocore-1.12.30-py2.py3-none-any.whl (4.7MB) Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1)) Using cached https://files.pythonhosted.org/packages/d7/14/2a0004d487464d120c9fb85313a75cd3d71a7506955be458eebfe19a6b1d/s3transfer-0.1.13-py2.py3-none-any.whl Collecting parso>=0.3.0 (from jedi>=0.10->ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6)) Using cached https://files.pythonhosted.org/packages/09/51/9c48a46334be50c13d25a3afe55fa05c445699304c5ad32619de953a2305/parso-0.3.1-py2.py3-none-any.whl Building wheels for collected packages: nbsphinx, regex, smart-open, tornado, parse, prometheus-client Running setup.py bdist_wheel for nbsphinx: started Running setup.py bdist_wheel for nbsphinx: finished with status 'done' Stored in directory: /tmp/pip-ephem-wheel-cache-9ijul47t/wheels/a3/7b/ff/b71160a2e088926eb13abcf8e0cde94c0aa8400c67d3df57f5 Running setup.py bdist_wheel for regex: started Running setup.py bdist_wheel for regex: finished with status 'done' Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/a4/8e/5e/cb6fef2a27a1a25f66505ccd87497fbebdd87222ae1168d970 Running setup.py bdist_wheel for smart-open: started Running setup.py bdist_wheel for smart-open: finished with status 'done' Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/23/00/44/e5b939f7a80c04e32297dbd6d96fa3065af89ecf57e2b5f89f Running setup.py bdist_wheel for tornado: started Running setup.py bdist_wheel for tornado: finished with status 'done' Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/6d/e1/ce/f4ee2fa420cc6b940123c64992b81047816d0a9fad6b879325 Running setup.py bdist_wheel for parse: started Running setup.py bdist_wheel for parse: finished with status 'done' Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/ef/db/c6/18568a2cc574848f3996ac4552241fbec046b7be29feb2077d Running setup.py bdist_wheel for prometheus-client: started Running setup.py bdist_wheel for prometheus-client: finished with status 'done' Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/b9/96/bc/e2acadc6bbfe57a1f631a34ca4ce6dd057af059b8d6878202b Successfully built nbsphinx regex smart-open tornado parse prometheus-client nbsphinx 0.3.5 has requirement nbconvert!=5.4, but you'll have nbconvert 5.4.0 which is incompatible. 
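One detail worth noting in the resolver output above: pip reports that nbsphinx 0.3.5 requires nbconvert!=5.4, yet nbconvert 5.4.0 is installed anyway, so the documentation build may misbehave in whatever way prompted nbsphinx to exclude that release. Adding an explicit nbconvert constraint to the project's own requirements (which file should carry it is an assumption) would let pip pick a compliant release; the conflict can also be re-checked from inside the virtualenv with a minimal sketch like the following (pkg_resources ships with setuptools, which is already installed above):

# Sketch: re-check the nbsphinx/nbconvert conflict that pip reported above.
import pkg_resources

nbconvert = pkg_resources.get_distribution("nbconvert")
for requirement in pkg_resources.get_distribution("nbsphinx").requires():
    if requirement.project_name == "nbconvert" and nbconvert.version not in requirement:
        print("conflict:", nbconvert.version, "violates", str(requirement))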
Installing collected packages: six, more-itertools, atomicwrites, pluggy, py, attrs, pytest, coverage, pytest-cov, python-dateutil, tornado, ipython-genutils, decorator, traitlets, jupyter-core, pyzmq, jupyter-client, chardet, certifi, urllib3, idna, requests, backcall, parso, jedi, ptyprocess, pexpect, wcwidth, prompt-toolkit, pickleshare, Pygments, simplegeneric, ipython, ipykernel, pyflakes, beautifulsoup4, testpath, entrypoints, webencodings, bleach, defusedxml, jsonschema, nbformat, mistune, pandocfilters, MarkupSafe, Jinja2, nbconvert, nbsmoke, terminado, prometheus-client, Send2Trash, notebook, widgetsnbextension, ipywidgets, jupyter-console, qtconsole, jupyter, pyparsing, packaging, sphinxcontrib-websupport, imagesize, pytz, babel, alabaster, snowballstemmer, docutils, sphinx, nbsphinx, commonmark, recommonmark, numpy, pandas, regex, boto, bz2file, jmespath, botocore, s3transfer, boto3, smart-open, scipy, gensim, pbr, lda, lxml, kiwisolver, cycler, matplotlib, PyYAML, bokeh, parse, metadata-toolbox, cophi-toolbox, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.13 Pygments-2.2.0 Send2Trash-1.5.0 alabaster-0.7.12 atomicwrites-1.2.1 attrs-18.2.0 babel-2.6.0 backcall-0.1.0 beautifulsoup4-4.6.3 bleach-3.0.2 bokeh-0.13.0 boto-2.49.0 boto3-1.9.30 botocore-1.12.30 bz2file-0.98 certifi-2018.10.15 chardet-3.0.4 commonmark-0.5.4 cophi-toolbox-0.1.1.dev0 coverage-4.5.1 cycler-0.10.0 dariah-topics decorator-4.3.0 defusedxml-0.5.0 docutils-0.14 entrypoints-0.2.3 gensim-3.6.0 idna-2.7 imagesize-1.1.0 ipykernel-5.1.0 ipython-7.0.1 ipython-genutils-0.2.0 ipywidgets-7.4.2 jedi-0.13.1 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.3 jupyter-console-6.0.0 jupyter-core-4.4.0 kiwisolver-1.0.1 lda-1.1.0 lxml-4.2.5 matplotlib-3.0.0 metadata-toolbox-0.1.0.dev0 mistune-0.8.4 more-itertools-4.3.0 nbconvert-5.4.0 nbformat-4.4.0 nbsmoke-0.2.7 nbsphinx-0.3.5 notebook-5.7.0 numpy-1.15.3 packaging-18.0 pandas-0.23.4 pandocfilters-1.4.2 parse-1.9.0 parso-0.3.1 pbr-3.1.1 pexpect-4.6.0 pickleshare-0.7.5 pluggy-0.8.0 prometheus-client-0.4.2 prompt-toolkit-2.0.6 ptyprocess-0.6.0 py-1.7.0 pyflakes-2.0.0 pyparsing-2.2.2 pytest-3.9.2 pytest-cov-2.6.0 python-dateutil-2.7.3 pytz-2018.6 pyzmq-17.1.2 qtconsole-4.4.2 recommonmark-0.4.0 regex-2018.8.29 requests-2.20.0 s3transfer-0.1.13 scipy-1.1.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.7.1 snowballstemmer-1.2.1 sphinx-1.8.1 sphinxcontrib-websupport-1.1.0 terminado-0.8.1 testpath-0.4.2 tornado-5.1.1 traitlets-4.3.2 urllib3-1.24 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.4.2
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dariah_topics.egg-info/PKG-INFO
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing requirements to dariah_topics.egg-info/requires.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
creating dariah_topics-1.0.2.dev0
creating dariah_topics-1.0.2.dev0/dariah_topics
creating dariah_topics-1.0.2.dev0/dariah_topics.egg-info
creating dariah_topics-1.0.2.dev0/test
copying files to dariah_topics-1.0.2.dev0...
copying README.md -> dariah_topics-1.0.2.dev0
copying setup.cfg -> dariah_topics-1.0.2.dev0
copying setup.py -> dariah_topics-1.0.2.dev0
copying dariah_topics/__init__.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/modeling.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/utils.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-1.0.2.dev0/test
Writing dariah_topics-1.0.2.dev0/setup.cfg
creating dist
Creating tar archive
removing 'dariah_topics-1.0.2.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/dariah_topics
copying dariah_topics/postprocessing.py -> build/lib/dariah_topics
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
copying dariah_topics/visualization.py -> build/lib/dariah_topics
copying dariah_topics/utils.py -> build/lib/dariah_topics
copying dariah_topics/__init__.py -> build/lib/dariah_topics
copying dariah_topics/modeling.py -> build/lib/dariah_topics
copying dariah_topics/evaluation.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/utils.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/modeling.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-1.0.2.dev0-py3.6.egg-info
running install_scripts
adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
creating build/bdist.linux-x86_64/wheel/dariah_topics-1.0.2.dev0.dist-info/WHEEL
creating 'dist/dariah_topics-1.0.2.dev0-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/modeling.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/utils.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-1.0.2.dev0.dist-info/LICENSE'
adding 'dariah_topics-1.0.2.dev0.dist-info/METADATA'
adding 'dariah_topics-1.0.2.dev0.dist-info/WHEEL'
adding 'dariah_topics-1.0.2.dev0.dist-info/top_level.txt'
adding 'dariah_topics-1.0.2.dev0.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/setuptools/dist.py:398: UserWarning: Normalizing '1.0.2.dev' to '1.0.2.dev0'
  normalized_version,
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.6.6, pytest-3.9.2, py-1.7.0, pluggy-0.8.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: cov-2.6.0, nbsmoke-0.2.7
collected 58 items

dariah_topics/postprocessing.py ........... [ 18%]
dariah_topics/preprocessing.py ........................... [ 65%]
dariah_topics/utils.py ...... [ 75%]
notebooks/IntroducingGensim.ipynb . [ 77%]
notebooks/IntroducingLda.ipynb . [ 79%]
notebooks/IntroducingLda_DKProWrapper.ipynb F [ 81%]
notebooks/IntroducingMallet.ipynb . [ 82%]
notebooks/Visualizations.ipynb . [ 84%]
test/test_fuzzy_segmenting.py ......... [100%]

=================================== FAILURES ===================================
_______________________________________ _______________________________________

self = <CallInfo when='call' exception: An error occurred while executing the following cell:
------------------
meta = pd.co...0m: ValueError: No objects to concatenate
ValueError: No objects to concatenate >
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f5465d18048>
when = 'call', treat_keyboard_interrupt_as_exception = False

    def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False):
        #: context of invocation: one of "setup", "call",
        #: "teardown", "memocollect"
        self.when = when
        self.start = time()
        try:
>           self.result = func()

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/_pytest/runner.py:206:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>       lambda: ihook(item=item, **kwds),
        when=when,
        treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
    )

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/_pytest/runner.py:188:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>}
notincall = set()

    def __call__(self, *args, **kwargs):
        if args:
            raise TypeError("hook calling supports only keyword arguments")
        assert not self.is_historic()
        if self.spec and self.spec.argnames:
            notincall = (
                set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
            )
            if notincall:
                warnings.warn(
                    "Argument(s) {} which are declared in the hookspec "
                    "can not be found in this hook call".format(tuple(notincall)),
                    stacklevel=2,
                )
>       return self._hookexec(self, self.get_hookimpls(), kwargs)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pluggy/hooks.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_pytest.config.PytestPluginManager object at 0x7f54a898ec88>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...54a3d73b00>>, <HookImpl plugin_name='logging-plugin',
plugin=<_pytest.logging.LoggingPlugin object at 0x7f54a3d11160>>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>} def _hookexec(self, hook, methods, kwargs): # called from all hookcaller instances. # enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pluggy/manager.py:67: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...54a3d73b00>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f54a3d11160>>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>} self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall( methods, kwargs, > firstresult=hook.spec.opts.get("firstresult") if hook.spec else False, ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pluggy/manager.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...54a3d73b00>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f54a3d11160>>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,) ) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) _raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pluggy/callers.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <pluggy.callers._Result object at 0x7f546621edd8> def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. 
""" __tracebackhide__ = True if self._excinfo is None: return self._result else: ex = self._excinfo if _py3: > raise ex[1].with_traceback(ex[2]) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pluggy/callers.py:80: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...54a3d73b00>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f54a3d11160>>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,) ) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: > res = hook_impl.function(*args) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pluggy/callers.py:187: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'> def pytest_runtest_call(item): _update_current_test_var(item, "call") sys.last_type, sys.last_value, sys.last_traceback = (None, None, None) try: > item.runtest() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/_pytest/runner.py:116: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'> def runtest(self): self._skip() with io.open(self.name,encoding='utf8') as nb: notebook = nbformat.read(nb, as_version=4) # TODO: which kernel? run in pytest's or use new one (make it option) _timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout') kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300, allow_errors=False, # or sys.version_info[1] ? kernel_name='python') ep = ExecutePreprocessor(**kwargs) with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right? > ep.preprocess(notebook,{}) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/nbsmoke/__init__.py:274: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f5466273ba8> nb = {'cells': [{'cell_type': 'markdown', 'metadata': {}, 'source': "# Topics – Easy Topic Modeling in Python\n\nThe text m...'nbconvert_exporter': 'python', 'pygments_lexer': 'ipython3', 'version': '3.5.1'}}, 'nbformat': 4, 'nbformat_minor': 1} resources = {}, km = None def preprocess(self, nb, resources, km=None): """ Preprocess notebook executing each code cell. The input argument `nb` is modified in-place. 
Parameters ---------- nb : NotebookNode Notebook being executed. resources : dictionary Additional resources used in the conversion process. For example, passing ``{'metadata': {'path': run_path}}`` sets the execution path to ``run_path``. km: KernelManager (optional) Optional kernel manager. If none is provided, a kernel manager will be created. Returns ------- nb : NotebookNode The executed notebook. resources : dictionary Additional resources used in the conversion process. """ with self.setup_preprocessor(nb, resources, km=km): self.log.info("Executing notebook with kernel: %s" % self.kernel_name) > nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/nbconvert/preprocessors/execute.py:361: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f5466273ba8> nb = {'cells': [{'cell_type': 'markdown', 'metadata': {}, 'source': "# Topics – Easy Topic Modeling in Python\n\nThe text m...'nbconvert_exporter': 'python', 'pygments_lexer': 'ipython3', 'version': '3.5.1'}}, 'nbformat': 4, 'nbformat_minor': 1} resources = {} def preprocess(self, nb, resources): """ Preprocessing to apply on each notebook. Must return modified nb, resources. If you wish to apply your preprocessing to each cell, you might want to override preprocess_cell method instead. Parameters ---------- nb : NotebookNode Notebook being converted resources : dictionary Additional resources used in the conversion process. Allows preprocessors to pass variables into the Jinja engine. """ for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f5466273ba8> cell = {'cell_type': 'code', 'execution_count': 7, 'metadata': {'collapsed': False, 'scrolled': True}, 'outputs': [{'output_t...ath_to_corpus.glob('*.csv')])\nmeta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed"} resources = {}, cell_index = 19 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. 
""" if cell.cell_type != 'code' or not cell.source.strip(): return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs cell_allows_errors = (self.allow_errors or "raises-exception" in cell.metadata.get("tags", [])) if self.force_raise_errors or not cell_allows_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E meta = pd.concat([metadata.fname2metadata(str(path), pattern=pattern) for path in path_to_corpus.glob('*.csv')]) E meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E ------------------ E E --------------------------------------------------------------------------- E ValueError Traceback (most recent call last) E <ipython-input-7-30c5c181683a> in <module> E ----> 1 meta = pd.concat([metadata.fname2metadata(str(path), pattern=pattern) for path in path_to_corpus.glob('*.csv')]) E  2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed E E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pandas/core/reshape/concat.py in concat(objs, axis, join, join_axes, ignore_index, keys, levels, names, verify_integrity, sort, copy) E  223 keys=keys, levels=levels, names=names, E  224 verify_integrity=verify_integrity, E --> 225 copy=copy, sort=sort) E  226 return op.get_result() E  227  E E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/pandas/core/reshape/concat.py in __init__(self, objs, axis, join, join_axes, keys, levels, names, ignore_index, verify_integrity, copy, sort) E  257  E  258 if len(objs) == 0: E --> 259 raise ValueError('No objects to concatenate') E  260  E  261 if keys is None: E E ValueError: No objects to concatenate E ValueError: No objects to concatenate ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/nbconvert/preprocessors/execute.py:385: CellExecutionError --- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ---- ----------- coverage: platform linux, python 3.6.6-final-0 ----------- Coverage HTML written to dir htmlcov Coverage XML written to file coverage.xml =============================== warnings summary =============================== /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/utils.py:290: DeprecationWarning: invalid escape sequence \p """ /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/scipy/sparse/sparsetools.py:21: DeprecationWarning: `scipy.sparse.sparsetools` is deprecated! scipy.sparse.sparsetools is a private module for scipy.sparse, and should not be used. _deprecated() /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:657: DeprecationWarning: invalid escape sequence \p """ /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead. 
  score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
  score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
  score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
  score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead.
  """
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead.
  """
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/cophi_toolbox/preprocessing.py:625: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
  document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/cophi_toolbox/preprocessing.py:625: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
  document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.6/site-packages/jupyter_client/manager.py:72: DeprecationWarning: KernelManager._kernel_name_changed is deprecated in traitlets 4.1: use @observe and @unobserve instead.
  def _kernel_name_changed(self, name, old, new):
source:409: DeprecationWarning: invalid escape sequence \.
-- Docs: https://docs.pytest.org/en/latest/warnings.html
============== 1 failed, 57 passed, 13 warnings in 86.01 seconds ===============
Build step 'Virtualenv Builder' marked build as failure
Recording test results
[Cobertura] Publishing Cobertura coverage report...
[Cobertura] Publishing Cobertura coverage results...
[Cobertura] Cobertura coverage report found.
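Most of the warnings above originate in third-party packages, but a few point at this repository's own modules: the invalid escape sequence \p reported for dariah_topics/utils.py:290 and dariah_topics/preprocessing.py:657 comes from a backslash inside an ordinary (non-raw) docstring, and dariah_topics/postprocessing.py still relies on the deprecated DataFrame.as_matrix(). A minimal sketch of the corresponding fixes; only the module locations are taken from the warnings, the function name and data are illustrative assumptions:

import pandas as pd

def find_tokens(text):
    r"""Raw docstring: patterns such as \p{L} no longer trigger the
    "invalid escape sequence" DeprecationWarning."""
    ...

df = pd.DataFrame({"token": ["a", "b"], "count": [3, 1]})
matrix = df.values  # replaces the deprecated df.as_matrix()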
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@77f168e0[description=A Python library for topic modeling.,homepage=https://dariah-de.github.io/Topics,name=Topics,fork=true,size=42286,milestones={},language=Python,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Wed, 24 Oct 2018 13:46:18 GMT], ETag=[W/"4a884231fceb92fe7f52265bd3eb6e4f"], Last-Modified=[Thu, 04 Oct 2018 17:51:13 GMT], OkHttp-Received-Millis=[1540388778930], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1540388778721], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[ECE0:4347:3C0C1EF:866462B:5BD077AA], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4983], X-RateLimit-Reset=[1540391454], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:fa8b872) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/fa8b872c7b9d265332c411fa666db2371d256e62
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] scored: -10.0
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: FAILURE
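The failure itself is confined to the cell of notebooks/IntroducingLda_DKProWrapper.ipynb quoted in the traceback: path_to_corpus.glob('*.csv') matches no files in the Jenkins workspace, so pd.concat() receives an empty list and pandas raises ValueError: No objects to concatenate. A sketch of a more defensive version of that cell, assuming path_to_corpus, pattern, and the metadata module are defined in earlier cells exactly as in the notebook; the error message wording is an assumption:

import pandas as pd

# path_to_corpus, pattern and metadata come from earlier notebook cells,
# as shown in the traceback above.
csv_paths = sorted(path_to_corpus.glob('*.csv'))
if not csv_paths:
    # Fail with an explicit hint instead of pandas' generic ValueError.
    raise FileNotFoundError("No .csv files found in {}; generate the DKPro "
                            "output before running this cell.".format(path_to_corpus))
meta = pd.concat([metadata.fname2metadata(str(path), pattern=pattern)
                  for path in csv_paths])
meta[:5]  # by adding '[:5]' to the variable, only the first 5 elements will be printed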