Started by GitHub push by SKrywinski
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > /usr/bin/git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > /usr/bin/git config remote.origin.url git://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from git://github.com/DARIAH-DE/Topics
 > /usr/bin/git --version # timeout=10
using GIT_ASKPASS to set credentials
 > /usr/bin/git fetch --tags --progress git://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > /usr/bin/git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > /usr/bin/git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 6951841891e07d6dd5eadeb09c897aa3b405cfdb (refs/remotes/origin/testing)
 > /usr/bin/git config core.sparsecheckout # timeout=10
 > /usr/bin/git checkout -f 6951841891e07d6dd5eadeb09c897aa3b405cfdb
Commit message: "import preprocessing from cophi_toolbox instead of dariah_topics; renames from cophi_toolbox; Issues #5, #9, #13"
 > /usr/bin/git rev-list --no-walk 694096a9989b2e90dc9bf567b3fe9d941fbe523b # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda229209115333750503.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (10.0.1)
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/d3/75/e79b66c9fe6166a90004bb8fb02bab06213c3348e93f3be41d7eaf625554/pytest-3.6.1-py2.py3-none-any.whl
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/30/7d/7f6a78ae44a1248ee28cc777586c18b28a1df903470e5d34a6e25712b8aa/pytest_cov-2.5.1-py2.py3-none-any.whl
Collecting nbsmoke (from -r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/36/be/1a91e27aa140a859322182d56787c0e55e0d417537cb1ec9c781a772f9c5/nbsmoke-0.2.7-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/83/df/0f5dd132200728a86190397e1ea87cd76244e42d39ec5e88efd25b2abd7e/jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/1b/9f/98d67201c5a6e1aececed03a44a819d0e32adba81414081e303cfaf8c54c/Sphinx-1.7.5-py2.py3-none-any.whl
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
  Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-install-9qz8cps9/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
  Using cached https://files.pythonhosted.org/packages/df/a5/8ee4b84af7f997dfdba71254a88008cfc19c49df98983c9a4919e798f8ce/recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/5a/9f/cea5f2fdf962724c30b306a9bdb60f1800b9a14e0388584540b4a80c7ae4/pandas-0.23.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/b8/ed/89f9fa9c3a290ebc454249df90891c804c3760cf054d54c5f701f2675122/gensim-3.4.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lda>=1.0.5 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/31/95/91c8de340a9d00322d9b2d81ef9ffac9afc116e8a19ac501c7df55fa0d73/lda-1.0.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/43/17/cd9fa14492dbef2aaf22622db79dba087c10f125473e730cda2f2019c40b/numpy-1.14.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting lxml>=3.6.4 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/30/65/6dcc7a1a0ec3bbc10a1316b3610f9997ca132183a5f5345c5b88fc1eaf79/lxml-4.2.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/81/31/4e261379e0cd4e9bbacfc96b124ebac0706b44374bd1d34ef899796f741b/matplotlib-2.2.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting bokeh>=0.12.6 (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting metadata_toolbox (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting cophi_toolbox (from dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/67/a5/f77982214dd4c8fd104b066f249adea2c49e25e8703d284382eb5e9ab35a/py-1.5.3-py2.py3-none-any.whl
Collecting attrs>=17.4.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/41/59/cedf87e91ed541be7957c501a92102f9cc6363c623a7666d69d51c78ac5b/attrs-18.1.0-py2.py3-none-any.whl
Collecting pluggy<0.7,>=0.5 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/ba/65/ded3bc40bbf8d887f262f150fbe1ae6637765b5c9534bd55690ed2c0b0f7/pluggy-0.6.0-py3-none-any.whl
Collecting atomicwrites>=1.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/0a/e8/cd6375e7a59664eeea9e1c77a766eeac0fc3083bb958c2b41ec46b95f29c/atomicwrites-1.1.5-py2.py3-none-any.whl
Requirement not upgraded as not directly required: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->-r requirements-dev.txt (line 4)) (39.2.0)
Collecting more-itertools>=4.0.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/85/40/90c3b0393e12b9827381004224de8814686e3d7182f9d4182477f600826d/more_itertools-4.2.0-py3-none-any.whl
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Collecting coverage>=3.7.1 (from pytest-cov->-r requirements-dev.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/2c/c0/8047b7cbbcdbd7d21f8d68126196b7915da892c5af3d1a99dba082d33ec0/coverage-4.5.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting nbformat (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/da/27/9a654d2b6cc1eaa517d1c5a4405166c7f6d72f04f6e7eea41855fe808a46/nbformat-4.4.0-py2.py3-none-any.whl
Collecting jupyter-client (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/94/dd/fe6c4d683b09eb05342bd2816b7779663f71762b4fa9c2d5203d35d17354/jupyter_client-5.2.3-py2.py3-none-any.whl
Collecting nbconvert (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/39/ea/280d6c0d92f8e3ca15fd798bbcc2ea141489f9539de7133d8fe10ea4b049/nbconvert-5.3.1-py2.py3-none-any.whl
Collecting requests (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/cc/15/e1c318dbc20032ffbe5628837ca0de2d5b116ffd1b849c699634010f6a5d/requests-2.19.0-py2.py3-none-any.whl
Collecting beautifulsoup4 (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/9e/d4/10f46e5cfac773e22707237bfcd51bbffeaf0a576b0a847ec7ab15bd7ace/beautifulsoup4-4.6.0-py3-none-any.whl
Collecting pyflakes (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/44/98/af7a72c9a543b1487d92813c648cb9b9adfbc96faef5455d60f4439aa99b/pyflakes-2.0.0-py2.py3-none-any.whl
Collecting ipykernel (from nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/ab/3f/cd624c835aa3336a9110d0a99e15070f343b881b7d651ab1375ef226a3ac/ipykernel-4.8.2-py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/7d/24/fabc09ad81c6071159a4d12d5bfbddcbea69bd9e3b16c3250ef300c0285f/ipywidgets-7.2.1-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/77/82/6469cd7fccf7958cbe5dce2e623f1e3c5e27f1bb1ad36d90519bc2d5d370/jupyter_console-5.2.0-py2.py3-none-any.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/90/ff/047e0dca2627b162866920e7aa93f04523c0ae81e5c67060eec85701992d/qtconsole-4.3.1-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/c0/66/cfed59f574d03ca5f1b7c5281485a9cc9a0f21342d24e0f057572316dae5/notebook-5.5.0-py2.py3-none-any.whl
Collecting packaging (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/ad/c2/b500ea05d5f9f361a562f089fc91f77ed3b4783e13a08a3daf82069b1224/packaging-17.1-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/02/ee/b6e02dc6529e82b75bb06823ff7d005b141037cb1416b10c6f00fc419dca/Pygments-2.2.0-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/52/69/3c2fbdc3702358c5b34ee25e387b24838597ef099761fc9a42c166796e8f/sphinxcontrib_websupport-1.1.0-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/36/fa/08e9e6e0e3cbd1d362c3bbee8d01d0aedb2155c4ac112b19ef3cae8eed8d/docutils-0.14-py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/d4/6c/8a935e2c7b54a37714656d753e4187ee0631988184ed50c0cf6476858566/snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/2e/c3/9b7dcd8548cf2c00531763ba154e524af575e8f36701bacfe5bcadc67440/alabaster-0.7.10-py2.py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/b8/ad/c6f60602d3ee3d92fbed87675b6fb6a6f9a38c223343ababdb44ba201f10/Babel-2.6.0-py2.py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
  Using cached https://files.pythonhosted.org/packages/e9/79/31cc1c2e0daf575f8fd2b581e2975e6a6938bd439581f766b79c50479521/imagesize-1.0.0-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
  Using cached https://files.pythonhosted.org/packages/93/d6/abcb22de61d78e2fc3959c964628a5771e47e7cc60d53e9342e21ed6cc9a/traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
Collecting python-dateutil>=2.5.0 (from pandas>=0.19.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/dc/83/15f7833b70d3e067ca91467ca245bae0f6fe56ddc7451aa0dc5606b120f2/pytz-2018.4-py2.py3-none-any.whl
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/cd/32/5196b64476bd41d596a8aba43506e2403e019c90e1a3dfc21d51b83db5a6/scipy-1.1.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting pbr>=0.6 (from lda>=1.0.5->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/b3/5d/c196041ffdf3e34ba206db6d61d1f893a75e1f3435699ade9bd65e089a3d/pbr-4.0.4-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/6a/8a/718fd7d3458f9fab8e67186b00abdd345b639976bc7fb3ae722e1b026a50/pyparsing-2.2.0-py2.py3-none-any.whl
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl
Collecting kiwisolver>=1.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/7e/31/d6fedd4fb2c94755cd101191e581af30e1650ccce7a35bddb7930fed6574/kiwisolver-1.0.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting parse>=1.8.2 (from metadata_toolbox->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/77/de/47e35a97b2b05c2fadbec67d44cfcdcd09b8086951b331d82de90d2912da/jsonschema-2.6.0-py2.py3-none-any.whl
Collecting jupyter-core (from nbformat->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/1d/44/065d2d7bae7bebc06f1dd70d23c36da8c50c0f08b4236716743d706762a8/jupyter_core-4.4.0-py2.py3-none-any.whl
Collecting ipython-genutils (from nbformat->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/fa/bc/9bd3b5c2b4774d5f33b2d544f1460be9df7df2fe42f352135381c347c69a/ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/4a/9c/c736101a42da5947985ff297c2837795f55a1c016cebcc3b49f31ce75c67/pyzmq-17.0.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting mistune>=0.7.4 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/c8/8c/87f4d359438ba0321a2ae91936030110bfcc62fef752656321a72b8c1af9/mistune-0.8.3-py2.py3-none-any.whl
Collecting bleach (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/30/b6/a8cffbb9ab4b62b557c22703163735210e9cd857d533740c64e1467d228e/bleach-2.1.3-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/cc/8b/4eefa9b47f1910b3d2081da67726b066e379b04ca897acfe9f92bac56147/entrypoints-0.2.3-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Collecting testpath (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/15/19/d7bc380c479a184e4a5a9ce516e4e2a773165f89b445f7cdced83d94de25/testpath-0.3.1-py2.py3-none-any.whl
Collecting urllib3<1.24,>=1.21.1 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/bd/c9/6fdd990019071a4a32a5e7cb78a1d92c53851ef4f56f62a3486e6a7d8ffb/urllib3-1.23-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl
Collecting idna<2.8,>=2.5 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/4b/2a/0276479a4b3caeb8a8c1af2f8e4355746a97fab05a372e4a2c6a6b876165/idna-2.7-py2.py3-none-any.whl
Collecting ipython>=4.0.0 (from ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/b1/7f/91d50f28af3e3a24342561983a7857e399ce24093876e6970b986a0b6677/ipython-6.4.0-py3-none-any.whl
Collecting widgetsnbextension~=3.2.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/ff/fa/64acc09fc845a6b2dc0724d6f3f81e829b778ed5e9a7559567b4f19a3f4b/widgetsnbextension-3.2.1-py2.py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/04/d1/c6616dd03701e7e2073f06d5c3b41b012256e42b72561f16a7bd86dd7b43/prompt_toolkit-1.0.15-py3-none-any.whl
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/49/46/c3dc27481d1cc57b9385aff41c474ceb7714f7935b1247194adae45db714/Send2Trash-1.5.0-py3-none-any.whl
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/2e/20/a26211a24425923d46e1213b376a6ee60dc30bcdf1b0c345e2c3769deb1c/terminado-0.8.1-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
  Using cached https://files.pythonhosted.org/packages/bc/bb/a24838832ba35baf52f32ab1a49b906b5f82fb7c76b2f6a7e35e140bac30/decorator-4.3.0-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/bd/b7/a88a67002b1185ed9a8e8a6ef15266728c2361fcb4f1d02ea331e4c7741d/boto-2.48.0-py2.py3-none-any.whl
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/2f/e0/b9e3b23e100e0df42d3646e1d126a0f5e3d738921be142a2f0133ed502a5/boto3-1.7.37-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
Collecting html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre (from bleach->nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/a5/62/bbd2be0e7943ec8504b517e62bab011b4946e1258842bc159e5dfde15b96/html5lib-1.0.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=4.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/9f/17/daa142fc9be6b76f26f24eeeb9a138940671490b91cb5587393f297c8317/pickleshare-0.7.4-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython>=4.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Collecting jedi>=0.10 (from ipython>=4.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/e7/42/074192a165622e645ed4aeade63e76e56b3496a044569b3c6cae3a918352/jedi-0.12.0-py2.py3-none-any.whl
Collecting backcall (from ipython>=4.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Collecting pexpect; sys_platform != "win32" (from ipython>=4.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/89/e6/b5a1de8b0cc4e07ca1b305a4fcc3f9806025c1b651ea302646341222f88b/pexpect-4.6.0-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
  Using cached https://files.pythonhosted.org/packages/ff/4e/fa4a73ccfefe2b37d7b6898329e7dbcd1ac846ba3a3b26b294a78a3eb997/ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/d7/14/2a0004d487464d120c9fb85313a75cd3d71a7506955be458eebfe19a6b1d/s3transfer-0.1.13-py2.py3-none-any.whl
Collecting botocore<1.11.0,>=1.10.37 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/0b/97/60eba2ca866d346459bfb6f95ff5b2b611d7b1b844d0733acb436d59c893/botocore-1.10.37-py2.py3-none-any.whl
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.0.dev0->-r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/b7/31/05c8d001f7f87f0f07289a5fc0fc3832e9a57f2dbd4d3b0fee70e0d51365/jmespath-0.9.3-py2.py3-none-any.whl
Collecting webencodings (from html5lib!=1.0b1,!=1.0b2,!=1.0b3,!=1.0b4,!=1.0b5,!=1.0b6,!=1.0b7,!=1.0b8,>=0.99999999pre->bleach->nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl
Collecting parso>=0.2.0 (from jedi>=0.10->ipython>=4.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/cd/3e/5908f9577dbd1e5df53e64349bfd11e46b726c1e4d8cd676bbe8aa4de316/parso-0.2.1-py2.py3-none-any.whl
Building wheels for collected packages: nbsphinx
  Running setup.py bdist_wheel for nbsphinx: started
  Running setup.py bdist_wheel for nbsphinx: finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-pj6rv26g/wheels/a3/7b/ff/b71160a2e088926eb13abcf8e0cde94c0aa8400c67d3df57f5
Successfully built nbsphinx
Installing collected packages: py, attrs, pluggy, atomicwrites, six, more-itertools, pytest, coverage, pytest-cov, jsonschema, decorator, ipython-genutils, traitlets, jupyter-core, nbformat, pyzmq, tornado, python-dateutil, jupyter-client, mistune, webencodings, html5lib, bleach, Pygments, entrypoints, pandocfilters, MarkupSafe, Jinja2, testpath, nbconvert, urllib3, chardet, certifi, idna, requests, beautifulsoup4, pyflakes, pickleshare, simplegeneric, parso, jedi, backcall, ptyprocess, pexpect, wcwidth, prompt-toolkit, ipython, ipykernel, nbsmoke, Send2Trash, terminado, notebook, widgetsnbextension, ipywidgets, jupyter-console, qtconsole, jupyter, pyparsing, packaging, sphinxcontrib-websupport, docutils, snowballstemmer, alabaster, pytz, babel, imagesize, sphinx, nbsphinx, commonmark, recommonmark, numpy, pandas, regex, scipy, boto, jmespath, botocore, s3transfer, boto3, bz2file, smart-open, gensim, pbr, lda, lxml, cycler, kiwisolver, matplotlib, PyYAML, bokeh, parse, metadata-toolbox, cophi-toolbox, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.0 PyYAML-3.12 Pygments-2.2.0 Send2Trash-1.5.0 alabaster-0.7.10 atomicwrites-1.1.5 attrs-18.1.0 babel-2.6.0 backcall-0.1.0 beautifulsoup4-4.6.0 bleach-2.1.3 bokeh-0.12.16 boto-2.48.0 boto3-1.7.37 botocore-1.10.37 bz2file-0.98 certifi-2018.4.16 chardet-3.0.4 commonmark-0.5.4 cophi-toolbox-0.1.0.dev0 coverage-4.5.1 cycler-0.10.0 dariah-topics decorator-4.3.0 docutils-0.14 entrypoints-0.2.3 gensim-3.4.0 html5lib-1.0.1 idna-2.7 imagesize-1.0.0 ipykernel-4.8.2 ipython-6.4.0 ipython-genutils-0.2.0 ipywidgets-7.2.1 jedi-0.12.0 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.3 jupyter-console-5.2.0 jupyter-core-4.4.0 kiwisolver-1.0.1 lda-1.0.5 lxml-4.2.1 matplotlib-2.2.2 metadata-toolbox-0.1.0.dev0 mistune-0.8.3 more-itertools-4.2.0 nbconvert-5.3.1 nbformat-4.4.0 nbsmoke-0.2.7 nbsphinx-0.3.3 notebook-5.5.0 numpy-1.14.5 packaging-17.1 pandas-0.23.1 pandocfilters-1.4.2 parse-1.8.4 parso-0.2.1 pbr-4.0.4 pexpect-4.6.0 pickleshare-0.7.4 pluggy-0.6.0 prompt-toolkit-1.0.15 ptyprocess-0.5.2 py-1.5.3 pyflakes-2.0.0 pyparsing-2.2.0 pytest-3.6.1 pytest-cov-2.5.1 python-dateutil-2.7.3 pytz-2018.4 pyzmq-17.0.0 qtconsole-4.3.1 recommonmark-0.4.0 regex-2018.6.9 requests-2.19.0 s3transfer-0.1.13 scipy-1.1.0 simplegeneric-0.8.1 six-1.11.0 smart-open-1.5.7 snowballstemmer-1.2.1 sphinx-1.7.5 sphinxcontrib-websupport-1.1.0 terminado-0.8.1 testpath-0.3.1 tornado-5.0.2 traitlets-4.3.2 urllib3-1.23 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.2.1
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing top-level names to dariah_topics.egg-info/top_level.txt
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing requirements to dariah_topics.egg-info/requires.txt
writing dariah_topics.egg-info/PKG-INFO
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
creating dariah_topics-1.0.0.dev0
creating dariah_topics-1.0.0.dev0/dariah_topics
creating dariah_topics-1.0.0.dev0/dariah_topics.egg-info
creating dariah_topics-1.0.0.dev0/test
copying files to dariah_topics-1.0.0.dev0...
copying README.md -> dariah_topics-1.0.0.dev0 copying setup.cfg -> dariah_topics-1.0.0.dev0 copying setup.py -> dariah_topics-1.0.0.dev0 copying dariah_topics/__init__.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics/evaluation.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics/modeling.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics/postprocessing.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics/preprocessing.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics/utils.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics/visualization.py -> dariah_topics-1.0.0.dev0/dariah_topics copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info copying dariah_topics.egg-info/requires.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info copying dariah_topics.egg-info/top_level.txt -> dariah_topics-1.0.0.dev0/dariah_topics.egg-info copying test/test_fuzzy_segmenting.py -> dariah_topics-1.0.0.dev0/test Writing dariah_topics-1.0.0.dev0/setup.cfg Creating tar archive removing 'dariah_topics-1.0.0.dev0' (and everything under it) running bdist_wheel running build running build_py copying dariah_topics/preprocessing.py -> build/lib/dariah_topics copying dariah_topics/postprocessing.py -> build/lib/dariah_topics copying dariah_topics/__init__.py -> build/lib/dariah_topics installing to build/bdist.linux-x86_64/wheel running install running install_lib creating build/bdist.linux-x86_64/wheel creating build/bdist.linux-x86_64/wheel/dariah_topics copying build/lib/dariah_topics/modeling.py -> build/bdist.linux-x86_64/wheel/dariah_topics copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics copying 
build/lib/dariah_topics/utils.py -> build/bdist.linux-x86_64/wheel/dariah_topics copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics running install_egg_info Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-1.0.0.dev0-py3.5.egg-info running install_scripts creating build/bdist.linux-x86_64/wheel/dariah_topics-1.0.0.dev0.dist-info/WHEEL creating '/mnt/data/jenkins/workspace/DARIAH-Topics/dist/dariah_topics-1.0.0.dev0-py3-none-any.whl' and adding '.' to it adding 'dariah_topics/__init__.py' adding 'dariah_topics/evaluation.py' adding 'dariah_topics/modeling.py' adding 'dariah_topics/postprocessing.py' adding 'dariah_topics/preprocessing.py' adding 'dariah_topics/utils.py' adding 'dariah_topics/visualization.py' adding 'dariah_topics-1.0.0.dev0.dist-info/top_level.txt' adding 'dariah_topics-1.0.0.dev0.dist-info/WHEEL' adding 'dariah_topics-1.0.0.dev0.dist-info/METADATA' adding 'dariah_topics-1.0.0.dev0.dist-info/RECORD' removing build/bdist.linux-x86_64/wheel /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:398: UserWarning: Normalizing '1.0.0.dev' to '1.0.0.dev0' normalized_version, + pytest --nbsmoke-run ============================= test session starts ============================== platform linux -- Python 3.5.3, pytest-3.6.1, py-1.5.3, pluggy-0.6.0 rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg plugins: cov-2.5.1, nbsmoke-0.2.7 collected 57 items dariah_topics/postprocessing.py ........F.. [ 19%] dariah_topics/preprocessing.py ........................... [ 66%] dariah_topics/utils.py ...... 
[ 77%] notebooks/IntroducingGensim.ipynb F [ 78%] notebooks/IntroducingLda.ipynb F [ 80%] notebooks/IntroducingMallet.ipynb F [ 82%] notebooks/Visualizations.ipynb F [ 84%] test/test_fuzzy_segmenting.py ......... [100%] =================================== FAILURES =================================== ______________ [doctest] dariah_topics.postprocessing.save_model _______________ 157 158 Returns: 159 None. 160 161 Example: 162 >>> from lda import LDA 163 >>> from gensim.models import LdaModel 164 >>> from cophi_toolbox import preprocessing 165 >>> save_model(LDA, 'model.pickle') 166 >>> preprocessing.read_model('model.pickle') == LDA UNEXPECTED EXCEPTION: AttributeError("module 'cophi_toolbox.preprocessing' has no attribute 'read_model'",) Traceback (most recent call last): File "/usr/lib/python3.5/doctest.py", line 1321, in __run compileflags, 1), test.globs) File "<doctest dariah_topics.postprocessing.save_model[4]>", line 1, in <module> AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_model' /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:166: UnexpectedException _______________________________________ _______________________________________ self = <CallInfo when='call' exception: An error occurred while executing the following cell: ------------------ corpus = lis...ing' has no attribute 'read_files' AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' > func = <function call_runtest_hook.<locals>.<lambda> at 0x7f323a2fbea0> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    lambda: ihook(item=item, **kwds),
    when=when,
    treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:181:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
notincall = set()

    def __call__(self, *args, **kwargs):
        if args:
            raise TypeError("hook calling supports only keyword arguments")
        assert not self.is_historic()
        if self.argnames:
            notincall = set(self.argnames) - set(['__multicall__']) - set(
                kwargs.keys())
            if notincall:
                warnings.warn(
                    "Argument(s) {} which are declared in the hookspec "
                    "can not be found in this hook call"
                    .format(tuple(notincall)),
                    stacklevel=2,
                )
>       return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_pytest.config.PytestPluginManager object at 0x7f3268364390>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}

    def _hookexec(self, hook, methods, kwargs):
        # called from all hookcaller instances.
        # enable_tracing will set its own wrapping function at self._inner_hookexec
>       return self._inner_hookexec(hook, methods, kwargs)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook = <_HookCaller 'pytest_runtest_call'>
methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}

        self._inner_hookexec = lambda hook, methods, kwargs: \
            hook.multicall(
                methods, kwargs,
>               firstresult=hook.spec_opts.get('firstresult'),
            )

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).

        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,))
                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
                        res = hook_impl.function(*args)
                        if res is not None:
                            results.append(res)
                            if firstresult:  # halt further impl calls
                                break
            except BaseException:
                excinfo = sys.exc_info()
        finally:
            if firstresult:  # first result hooks return a single value
                outcome = _Result(results[0] if results else None, excinfo)
            else:
                outcome = _Result(results, excinfo)

            # run all wrapper post-yield blocks
            for gen in reversed(teardowns):
                try:
                    gen.send(outcome)
                    _raise_wrapfail(gen, "has second yield")
                except StopIteration:
                    pass

>           return outcome.get_result()

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <pluggy.callers._Result object at 0x7f323a2d16d8>

    def get_result(self):
        """Get the result(s) for this hook call.

        If the hook was marked as a ``firstresult`` only a single value
        will be returned otherwise a list of results.
        """
        __tracebackhide__ = True
        if self._excinfo is None:
            return self._result
        else:
            ex = self._excinfo
            if _py3:
>               raise ex[1].with_traceback(ex[2])

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).

        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,))
                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
>                       res = hook_impl.function(*args)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>

    def pytest_runtest_call(item):
        _update_current_test_var(item, "call")
        sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
        try:
>           item.runtest()

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:109:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>

    def runtest(self):
        self._skip()
        with io.open(self.name,encoding='utf8') as nb:
            notebook = nbformat.read(nb, as_version=4)

            # TODO: which kernel? run in pytest's or use new one (make it option)
            _timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout')
            kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
                          allow_errors=False,
                          # or sys.version_info[1] ?
                          kernel_name='python')
            ep = ExecutePreprocessor(**kwargs)
            with cwd(os.path.dirname(self.name)):  # jupyter notebook always does this, right?
>               ep.preprocess(notebook,{})

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f32643c30b8>
nb = {'metadata': {'git': {'suppress_outputs': True}, 'kernelspec': {'language': 'python', 'name': 'python3', 'display_name...: None, 'source': 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()'}], 'nbformat_minor': 1}
resources = {}

    def preprocess(self, nb, resources):
        """
        Preprocess notebook executing each code cell.

        The input argument `nb` is modified in-place.

        Parameters
        ----------
        nb : NotebookNode
            Notebook being executed.
        resources : dictionary
            Additional resources used in the conversion process. For example,
            passing ``{'metadata': {'path': run_path}}`` sets the execution path
            to ``run_path``.

        Returns
        -------
        nb : NotebookNode
            The executed notebook.
        resources : dictionary
            Additional resources used in the conversion process.
        """
        path = resources.get('metadata', {}).get('path', '')
        if path == '':
            path = None

        # clear display_id map
        self._display_id_map = {}

        # from jupyter_client.manager import start_new_kernel
        def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs):
            km = self.kernel_manager_class(kernel_name=kernel_name)
            km.start_kernel(**kwargs)
            kc = km.client()
            kc.start_channels()
            try:
                kc.wait_for_ready(timeout=startup_timeout)
            except RuntimeError:
                kc.stop_channels()
                km.shutdown_kernel()
                raise
            return km, kc

        kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python')
        if self.kernel_name:
            kernel_name = self.kernel_name
        self.log.info("Executing notebook with kernel: %s" % kernel_name)
        self.km, self.kc = start_new_kernel(
            startup_timeout=self.startup_timeout,
            kernel_name=kernel_name,
            extra_arguments=self.extra_arguments,
            cwd=path)
        self.kc.allow_stdin = False
        self.nb = nb

        try:
>           nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)

../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f32643c30b8>
nb = {'metadata': {'git': {'suppress_outputs': True}, 'kernelspec': {'language': 'python', 'name': 'python3', 'display_name...: None, 'source': 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()'}], 'nbformat_minor': 1}
resources = {}

    def preprocess(self, nb, resources):
        """
        Preprocessing to apply on each notebook.

        Must return modified nb, resources.

        If you wish to apply your preprocessing to each cell, you might want
        to override preprocess_cell method instead.

        Parameters
        ----------
        nb : NotebookNode
            Notebook being converted
        resources : dictionary
            Additional resources used in the conversion process. Allows
            preprocessors to pass variables into the Jinja engine.
""" for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f32643c30b8> cell = {'cell_type': 'code', 'metadata': {}, 'outputs': [{'output_type': 'error', 'ename': 'AttributeError', 'evalue': "modul...list(preprocessing.read_files(meta.index))\ncorpus[0][:255] # printing the first 255 characters of the first document'} resources = {}, cell_index = 17 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. """ if cell.cell_type != 'code': return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs if not self.allow_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E corpus = list(preprocessing.read_files(meta.index)) E corpus[0][:255] # printing the first 255 characters of the first document E ------------------ E E [0;31m---------------------------------------------------------------------------[0m E [0;31mAttributeError[0m Traceback (most recent call last) E [0;32m<ipython-input-7-1004dfe5ae0a>[0m in [0;36m<module>[0;34m()[0m E [0;32m----> 1[0;31m [0mcorpus[0m [0;34m=[0m [0mlist[0m[0;34m([0m[0mpreprocessing[0m[0;34m.[0m[0mread_files[0m[0;34m([0m[0mmeta[0m[0;34m.[0m[0mindex[0m[0;34m)[0m[0;34m)[0m[0;34m[0m[0m E [0m[1;32m 2[0m [0mcorpus[0m[0;34m[[0m[0;36m0[0m[0;34m][0m[0;34m[[0m[0;34m:[0m[0;36m255[0m[0;34m][0m [0;31m# printing the first 255 characters of the first document[0m[0;34m[0m[0m E E [0;31mAttributeError[0m: 
module 'cophi_toolbox.preprocessing' has no attribute 'read_files' E AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError _______________________________________ _______________________________________ self = <CallInfo when='call' exception: An error occurred while executing the following cell: ------------------ corpus = lis...ing' has no attribute 'read_files' AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' > func = <function call_runtest_hook.<locals>.<lambda> at 0x7f323935d268> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > lambda: ihook(item=item, **kwds), when=when, treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:181: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_HookCaller 'pytest_runtest_call'>, args = () kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>} notincall = set() def __call__(self, *args, **kwargs): if args: raise TypeError("hook calling supports only keyword arguments") assert not self.is_historic() if self.argnames: notincall = set(self.argnames) - set(['__multicall__']) - set( kwargs.keys()) if notincall: warnings.warn( "Argument(s) {} which are declared in the hookspec " "can not be found in this hook call" 
.format(tuple(notincall)), stacklevel=2, ) > return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_pytest.config.PytestPluginManager object at 0x7f3268364390> hook = <_HookCaller 'pytest_runtest_call'> methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>} def _hookexec(self, hook, methods, kwargs): # called from all hookcaller instances. # enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>} self._inner_hookexec = lambda hook, methods, kwargs: \ hook.multicall( methods, kwargs, > firstresult=hook.spec_opts.get('firstresult'), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, 
firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) _raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <pluggy.callers._Result object at 0x7f323a278b70> def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. 
""" __tracebackhide__ = True if self._excinfo is None: return self._result else: ex = self._excinfo if _py3: > raise ex[1].with_traceback(ex[2]) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: > res = hook_impl.function(*args) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'> def pytest_runtest_call(item): _update_current_test_var(item, "call") sys.last_type, sys.last_value, sys.last_traceback = (None, None, None) try: > item.runtest() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'> def runtest(self): self._skip() with io.open(self.name,encoding='utf8') as nb: notebook = nbformat.read(nb, as_version=4) # TODO: which kernel? run in pytest's or use new one (make it option) _timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout') kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300, allow_errors=False, # or sys.version_info[1] ? kernel_name='python') ep = ExecutePreprocessor(**kwargs) with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right? > ep.preprocess(notebook,{}) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f32393fe6d8> nb = {'metadata': {'git': {'suppress_outputs': True}, 'kernelspec': {'language': 'python', 'name': 'python3', 'display_name...: None, 'source': 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()'}], 'nbformat_minor': 1} resources = {} def preprocess(self, nb, resources): """ Preprocess notebook executing each code cell. The input argument `nb` is modified in-place. Parameters ---------- nb : NotebookNode Notebook being executed. resources : dictionary Additional resources used in the conversion process. For example, passing ``{'metadata': {'path': run_path}}`` sets the execution path to ``run_path``. Returns ------- nb : NotebookNode The executed notebook. resources : dictionary Additional resources used in the conversion process. 
""" path = resources.get('metadata', {}).get('path', '') if path == '': path = None # clear display_id map self._display_id_map = {} # from jupyter_client.manager import start_new_kernel def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs): km = self.kernel_manager_class(kernel_name=kernel_name) km.start_kernel(**kwargs) kc = km.client() kc.start_channels() try: kc.wait_for_ready(timeout=startup_timeout) except RuntimeError: kc.stop_channels() km.shutdown_kernel() raise return km, kc kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python') if self.kernel_name: kernel_name = self.kernel_name self.log.info("Executing notebook with kernel: %s" % kernel_name) self.km, self.kc = start_new_kernel( startup_timeout=self.startup_timeout, kernel_name=kernel_name, extra_arguments=self.extra_arguments, cwd=path) self.kc.allow_stdin = False self.nb = nb try: > nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f32393fe6d8> nb = {'metadata': {'git': {'suppress_outputs': True}, 'kernelspec': {'language': 'python', 'name': 'python3', 'display_name...: None, 'source': 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()'}], 'nbformat_minor': 1} resources = {} def preprocess(self, nb, resources): """ Preprocessing to apply on each notebook. Must return modified nb, resources. If you wish to apply your preprocessing to each cell, you might want to override preprocess_cell method instead. Parameters ---------- nb : NotebookNode Notebook being converted resources : dictionary Additional resources used in the conversion process. Allows preprocessors to pass variables into the Jinja engine. 
""" for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f32393fe6d8> cell = {'cell_type': 'code', 'metadata': {}, 'outputs': [{'output_type': 'error', 'ename': 'AttributeError', 'evalue': "modul...list(preprocessing.read_files(meta.index))\ncorpus[0][:255] # printing the first 255 characters of the first document'} resources = {}, cell_index = 18 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. """ if cell.cell_type != 'code': return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs if not self.allow_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E corpus = list(preprocessing.read_files(meta.index)) E corpus[0][:255] # printing the first 255 characters of the first document E ------------------ E E [0;31m---------------------------------------------------------------------------[0m E [0;31mAttributeError[0m Traceback (most recent call last) E [0;32m<ipython-input-7-1004dfe5ae0a>[0m in [0;36m<module>[0;34m()[0m E [0;32m----> 1[0;31m [0mcorpus[0m [0;34m=[0m [0mlist[0m[0;34m([0m[0mpreprocessing[0m[0;34m.[0m[0mread_files[0m[0;34m([0m[0mmeta[0m[0;34m.[0m[0mindex[0m[0;34m)[0m[0;34m)[0m[0;34m[0m[0m E [0m[1;32m 2[0m [0mcorpus[0m[0;34m[[0m[0;36m0[0m[0;34m][0m[0;34m[[0m[0;34m:[0m[0;36m255[0m[0;34m][0m [0;31m# printing the first 255 characters of the first document[0m[0;34m[0m[0m E E [0;31mAttributeError[0m: 
module 'cophi_toolbox.preprocessing' has no attribute 'read_files' E AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError _______________________________________ _______________________________________ self = <CallInfo when='call' exception: An error occurred while executing the following cell: ------------------ corpus = lis...ing' has no attribute 'read_files' AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' > func = <function call_runtest_hook.<locals>.<lambda> at 0x7f323932ba60> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > lambda: ihook(item=item, **kwds), when=when, treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:181: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_HookCaller 'pytest_runtest_call'>, args = () kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>} notincall = set() def __call__(self, *args, **kwargs): if args: raise TypeError("hook calling supports only keyword arguments") assert not self.is_historic() if self.argnames: notincall = set(self.argnames) - set(['__multicall__']) - set( kwargs.keys()) if notincall: warnings.warn( "Argument(s) {} which are declared in the hookspec " "can not be found in this hook call" 
.format(tuple(notincall)), stacklevel=2, ) > return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_pytest.config.PytestPluginManager object at 0x7f3268364390> hook = <_HookCaller 'pytest_runtest_call'> methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>} def _hookexec(self, hook, methods, kwargs): # called from all hookcaller instances. # enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>} self._inner_hookexec = lambda hook, methods, kwargs: \ hook.multicall( methods, kwargs, > firstresult=hook.spec_opts.get('firstresult'), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, 
firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) _raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <pluggy.callers._Result object at 0x7f3239231160> def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. 
""" __tracebackhide__ = True if self._excinfo is None: return self._result else: ex = self._excinfo if _py3: > raise ex[1].with_traceback(ex[2]) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: > res = hook_impl.function(*args) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'> def pytest_runtest_call(item): _update_current_test_var(item, "call") sys.last_type, sys.last_value, sys.last_traceback = (None, None, None) try: > item.runtest() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'> def runtest(self): self._skip() with io.open(self.name,encoding='utf8') as nb: notebook = nbformat.read(nb, as_version=4) # TODO: which kernel? run in pytest's or use new one (make it option) _timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout') kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300, allow_errors=False, # or sys.version_info[1] ? kernel_name='python') ep = ExecutePreprocessor(**kwargs) with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right? > ep.preprocess(notebook,{}) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3239231518> nb = {'metadata': {'git': {'suppress_outputs': True}, 'kernelspec': {'language': 'python', 'name': 'python3', 'display_name...: None, 'source': 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()'}], 'nbformat_minor': 1} resources = {} def preprocess(self, nb, resources): """ Preprocess notebook executing each code cell. The input argument `nb` is modified in-place. Parameters ---------- nb : NotebookNode Notebook being executed. resources : dictionary Additional resources used in the conversion process. For example, passing ``{'metadata': {'path': run_path}}`` sets the execution path to ``run_path``. Returns ------- nb : NotebookNode The executed notebook. resources : dictionary Additional resources used in the conversion process. 
""" path = resources.get('metadata', {}).get('path', '') if path == '': path = None # clear display_id map self._display_id_map = {} # from jupyter_client.manager import start_new_kernel def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs): km = self.kernel_manager_class(kernel_name=kernel_name) km.start_kernel(**kwargs) kc = km.client() kc.start_channels() try: kc.wait_for_ready(timeout=startup_timeout) except RuntimeError: kc.stop_channels() km.shutdown_kernel() raise return km, kc kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python') if self.kernel_name: kernel_name = self.kernel_name self.log.info("Executing notebook with kernel: %s" % kernel_name) self.km, self.kc = start_new_kernel( startup_timeout=self.startup_timeout, kernel_name=kernel_name, extra_arguments=self.extra_arguments, cwd=path) self.kc.allow_stdin = False self.nb = nb try: > nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3239231518> nb = {'metadata': {'git': {'suppress_outputs': True}, 'kernelspec': {'language': 'python', 'name': 'python3', 'display_name...: None, 'source': 'static_heatmap = PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()'}], 'nbformat_minor': 1} resources = {} def preprocess(self, nb, resources): """ Preprocessing to apply on each notebook. Must return modified nb, resources. If you wish to apply your preprocessing to each cell, you might want to override preprocess_cell method instead. Parameters ---------- nb : NotebookNode Notebook being converted resources : dictionary Additional resources used in the conversion process. Allows preprocessors to pass variables into the Jinja engine. 
""" for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3239231518> cell = {'cell_type': 'code', 'metadata': {}, 'outputs': [{'output_type': 'error', 'ename': 'AttributeError', 'evalue': "modul...list(preprocessing.read_files(meta.index))\ncorpus[0][:255] # printing the first 255 characters of the first document'} resources = {}, cell_index = 18 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. """ if cell.cell_type != 'code': return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs if not self.allow_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E corpus = list(preprocessing.read_files(meta.index)) E corpus[0][:255] # printing the first 255 characters of the first document E ------------------ E E [0;31m---------------------------------------------------------------------------[0m E [0;31mAttributeError[0m Traceback (most recent call last) E [0;32m<ipython-input-7-1004dfe5ae0a>[0m in [0;36m<module>[0;34m()[0m E [0;32m----> 1[0;31m [0mcorpus[0m [0;34m=[0m [0mlist[0m[0;34m([0m[0mpreprocessing[0m[0;34m.[0m[0mread_files[0m[0;34m([0m[0mmeta[0m[0;34m.[0m[0mindex[0m[0;34m)[0m[0;34m)[0m[0;34m[0m[0m E [0m[1;32m 2[0m [0mcorpus[0m[0;34m[[0m[0;36m0[0m[0;34m][0m[0;34m[[0m[0;34m:[0m[0;36m255[0m[0;34m][0m [0;31m# printing the first 255 characters of the first document[0m[0;34m[0m[0m E E [0;31mAttributeError[0m: 
module 'cophi_toolbox.preprocessing' has no attribute 'read_files' E AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError _______________________________________ _______________________________________ self = <CallInfo when='call' exception: An error occurred while executing the following cell: ------------------ corpus = lis...ing' has no attribute 'read_files' AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' > func = <function call_runtest_hook.<locals>.<lambda> at 0x7f323935d378> when = 'call', treat_keyboard_interrupt_as_exception = False def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False): #: context of invocation: one of "setup", "call", #: "teardown", "memocollect" self.when = when self.start = time() try: > self.result = func() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > lambda: ihook(item=item, **kwds), when=when, treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:181: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_HookCaller 'pytest_runtest_call'>, args = () kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>} notincall = set() def __call__(self, *args, **kwargs): if args: raise TypeError("hook calling supports only keyword arguments") assert not self.is_historic() if self.argnames: notincall = set(self.argnames) - set(['__multicall__']) - set( kwargs.keys()) if notincall: warnings.warn( "Argument(s) {} which are declared in the hookspec " "can not be found in this hook call" 
.format(tuple(notincall)), stacklevel=2, ) > return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:617: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <_pytest.config.PytestPluginManager object at 0x7f3268364390> hook = <_HookCaller 'pytest_runtest_call'> methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>} def _hookexec(self, hook, methods, kwargs): # called from all hookcaller instances. # enable_tracing will set its own wrapping function at self._inner_hookexec > return self._inner_hookexec(hook, methods, kwargs) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook = <_HookCaller 'pytest_runtest_call'> methods = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>} self._inner_hookexec = lambda hook, methods, kwargs: \ hook.multicall( methods, kwargs, > firstresult=hook.spec_opts.get('firstresult'), ) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/__init__.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, 
firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: res = hook_impl.function(*args) if res is not None: results.append(res) if firstresult: # halt further impl calls break except BaseException: excinfo = sys.exc_info() finally: if firstresult: # first result hooks return a single value outcome = _Result(results[0] if results else None, excinfo) else: outcome = _Result(results, excinfo) # run all wrapper post-yield blocks for gen in reversed(teardowns): try: gen.send(outcome) _raise_wrapfail(gen, "has second yield") except StopIteration: pass > return outcome.get_result() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <pluggy.callers._Result object at 0x7f323940c898> def get_result(self): """Get the result(s) for this hook call. If the hook was marked as a ``firstresult`` only a single value will be returned otherwise a list of results. 
""" __tracebackhide__ = True if self._excinfo is None: return self._result else: ex = self._excinfo if _py3: > raise ex[1].with_traceback(ex[2]) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ hook_impls = [<pluggy.HookImpl object at 0x7f32661d6e80>, <pluggy.HookImpl object at 0x7f32630c7358>, <pluggy.HookImpl object at 0x7f32630570f0>] caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>} firstresult = False def _multicall(hook_impls, caller_kwargs, firstresult=False): """Execute a call into multiple python functions/methods and return the result(s). ``caller_kwargs`` comes from _HookCaller.__call__(). """ __tracebackhide__ = True results = [] excinfo = None try: # run impl and wrapper setup functions in a loop teardowns = [] try: for hook_impl in reversed(hook_impls): try: args = [caller_kwargs[argname] for argname in hook_impl.argnames] except KeyError: for argname in hook_impl.argnames: if argname not in caller_kwargs: raise HookCallError( "hook call must provide argument %r" % (argname,)) if hook_impl.hookwrapper: try: gen = hook_impl.function(*args) next(gen) # first yield teardowns.append(gen) except StopIteration: _raise_wrapfail(gen, "did not yield") else: > res = hook_impl.function(*args) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'> def pytest_runtest_call(item): _update_current_test_var(item, "call") sys.last_type, sys.last_value, sys.last_traceback = (None, None, None) try: > item.runtest() ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'> def runtest(self): self._skip() with io.open(self.name,encoding='utf8') as nb: notebook = nbformat.read(nb, as_version=4) # TODO: which kernel? run in pytest's or use new one (make it option) _timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout') kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300, allow_errors=False, # or sys.version_info[1] ? kernel_name='python') ep = ExecutePreprocessor(**kwargs) with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right? > ep.preprocess(notebook,{}) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f323940cfd0> nb = {'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}, 'language_info': {'... width=800)\nshow(interactive_barchart_per_document, notebook_handle=True)"}], 'nbformat_minor': 2} resources = {} def preprocess(self, nb, resources): """ Preprocess notebook executing each code cell. The input argument `nb` is modified in-place. Parameters ---------- nb : NotebookNode Notebook being executed. resources : dictionary Additional resources used in the conversion process. For example, passing ``{'metadata': {'path': run_path}}`` sets the execution path to ``run_path``. Returns ------- nb : NotebookNode The executed notebook. resources : dictionary Additional resources used in the conversion process. 
""" path = resources.get('metadata', {}).get('path', '') if path == '': path = None # clear display_id map self._display_id_map = {} # from jupyter_client.manager import start_new_kernel def start_new_kernel(startup_timeout=60, kernel_name='python', **kwargs): km = self.kernel_manager_class(kernel_name=kernel_name) km.start_kernel(**kwargs) kc = km.client() kc.start_channels() try: kc.wait_for_ready(timeout=startup_timeout) except RuntimeError: kc.stop_channels() km.shutdown_kernel() raise return km, kc kernel_name = nb.metadata.get('kernelspec', {}).get('name', 'python') if self.kernel_name: kernel_name = self.kernel_name self.log.info("Executing notebook with kernel: %s" % kernel_name) self.km, self.kc = start_new_kernel( startup_timeout=self.startup_timeout, kernel_name=kernel_name, extra_arguments=self.extra_arguments, cwd=path) self.kc.allow_stdin = False self.nb = nb try: > nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:262: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f323940cfd0> nb = {'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}, 'language_info': {'... width=800)\nshow(interactive_barchart_per_document, notebook_handle=True)"}], 'nbformat_minor': 2} resources = {} def preprocess(self, nb, resources): """ Preprocessing to apply on each notebook. Must return modified nb, resources. If you wish to apply your preprocessing to each cell, you might want to override preprocess_cell method instead. Parameters ---------- nb : NotebookNode Notebook being converted resources : dictionary Additional resources used in the conversion process. Allows preprocessors to pass variables into the Jinja engine. 
""" for index, cell in enumerate(nb.cells): > nb.cells[index], resources = self.preprocess_cell(cell, resources, index) ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f323940cfd0> cell = {'cell_type': 'code', 'metadata': {}, 'outputs': [{'output_type': 'error', 'ename': 'AttributeError', 'evalue': "modul...list(preprocessing.read_files(meta.index))\ncorpus[0][:255] # printing the first 255 characters of the first document'} resources = {}, cell_index = 3 def preprocess_cell(self, cell, resources, cell_index): """ Executes a single code cell. See base.py for details. To execute all cells see :meth:`preprocess`. """ if cell.cell_type != 'code': return cell, resources reply, outputs = self.run_cell(cell, cell_index) cell.outputs = outputs if not self.allow_errors: for out in outputs: if out.output_type == 'error': > raise CellExecutionError.from_cell_and_msg(cell, out) E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell: E ------------------ E corpus = list(preprocessing.read_files(meta.index)) E corpus[0][:255] # printing the first 255 characters of the first document E ------------------ E E [0;31m---------------------------------------------------------------------------[0m E [0;31mAttributeError[0m Traceback (most recent call last) E [0;32m<ipython-input-4-1004dfe5ae0a>[0m in [0;36m<module>[0;34m()[0m E [0;32m----> 1[0;31m [0mcorpus[0m [0;34m=[0m [0mlist[0m[0;34m([0m[0mpreprocessing[0m[0;34m.[0m[0mread_files[0m[0;34m([0m[0mmeta[0m[0;34m.[0m[0mindex[0m[0;34m)[0m[0;34m)[0m[0;34m[0m[0m E [0m[1;32m 2[0m [0mcorpus[0m[0;34m[[0m[0;36m0[0m[0;34m][0m[0;34m[[0m[0;34m:[0m[0;36m255[0m[0;34m][0m [0;31m# printing the first 255 characters of the first document[0m[0;34m[0m[0m E E [0;31mAttributeError[0m: 
module 'cophi_toolbox.preprocessing' has no attribute 'read_files' E AttributeError: module 'cophi_toolbox.preprocessing' has no attribute 'read_files' ../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:286: CellExecutionError --- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ---- ----------- coverage: platform linux, python 3.5.3-final-0 ----------- Coverage HTML written to dir htmlcov Coverage XML written to file coverage.xml =============================== warnings summary =============================== dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_lda_document_topics /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead. """ dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_lda_topics /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead. """ dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/cophi_toolbox/preprocessing.py:624: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/cophi_toolbox/preprocessing.py:624: FutureWarning: set_value is deprecated and will be removed in a future release. 
Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing._create_large_corpus_model /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing._hapax_legomena_large_corpus_model /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing._list_mfw_large_corpus_model /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing._remove_features_from_large_corpus_model /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing.create_document_term_matrix /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. 
Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing.find_hapax_legomena /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing.list_mfw /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) dariah_topics/preprocessing.py::dariah_topics.preprocessing.remove_features /mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py:732: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id])) -- Docs: http://doc.pytest.org/en/latest/warnings.html ============== 5 failed, 52 passed, 12 warnings in 130.57 seconds ============== Build step 'Virtualenv Builder' marked build as failure Recording test results [Cobertura] Publishing Cobertura coverage report... [Cobertura] Publishing Cobertura coverage results... [Cobertura] Cobertura coverage report found. 
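Editor's note: the warnings summary above flags two pandas deprecations, `DataFrame.set_value` (replace with the `.at[]`/`.iat[]` accessors) and `.as_matrix()` (replace with `.values`). A minimal sketch of the replacement calls; the frame shape and labels below are illustrative, not taken from the build:

```python
import pandas as pd

# Toy document-term matrix standing in for the one built in
# dariah_topics/preprocessing.py (index/column labels are illustrative).
document_term_matrix = pd.DataFrame(0, index=["doc1", "doc2"],
                                    columns=["topic", "model"])

# Deprecated, triggers the FutureWarning seen in the log:
#   document_term_matrix.set_value("doc1", "topic", 3)
# Replacement: the label-based scalar setter .at[].
document_term_matrix.at["doc1", "topic"] = 3

# Deprecated: document_term_matrix.as_matrix()
# Replacement suggested by the warning itself: the .values attribute.
matrix = document_term_matrix.values

print(matrix.shape)  # (2, 2)
```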
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@51ee05b5[description=A Python library for topic modeling.,homepage=https://dariah-de.github.io/Topics,name=Topics,fork=true,size=42271,milestones={},language=Python,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Wed, 13 Jun 2018 15:12:11 GMT], ETag=[W/"31f7150108d94b8dd6a57beccf0e4832"], Last-Modified=[Mon, 04 Jun 2018 10:20:02 GMT], OkHttp-Received-Millis=[1528902731741], OkHttp-Response-Source=[CONDITIONAL_CACHE 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1528902731535], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[9AA0:0FDE:3720D9:7EFF29:5B21344B], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4997], X-RateLimit-Reset=[1528906075], X-Runtime-rack=[0.108915], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:6951841) with context:DARIAH-Topics Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/6951841891e07d6dd5eadeb09c897aa3b405cfdb [BFA] Scanning build for known causes... [BFA] No failure causes found [BFA] Done. 
0s Started calculate disk usage of build Finished Calculation of disk usage of build in 0 seconds Started calculate disk usage of workspace Finished Calculation of disk usage of workspace in 0 seconds Notifying upstream projects of job completion [ci-game] evaluating rule: Build result [ci-game] scored: -10.0 [ci-game] evaluating rule: Increased number of passed tests [ci-game] evaluating rule: Decreased number of passed tests [ci-game] evaluating rule: Increased number of failed tests [ci-game] evaluating rule: Decreased number of failed tests [ci-game] evaluating rule: Increased number of skipped tests [ci-game] evaluating rule: Decreased number of skipped tests [ci-game] evaluating rule: Open HIGH priority tasks [ci-game] evaluating rule: Open NORMAL priority tasks [ci-game] evaluating rule: Open LOW priority tasks [ci-game] evaluating rule: Changed number of compiler warnings Finished: FAILURE
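Editor's note on the root cause: both failing notebooks call `preprocessing.read_files`, and the checked-out commit ("import preprocessing from cophi_toolbox instead of dariah_topics / renames from cophi_toolbox") points them at a `cophi_toolbox.preprocessing` that no longer exposes that name. The renamed function is not visible in this log, so the sketch below uses a purely hypothetical stand-in name, `read_texts`, only to demonstrate a defensive lookup the notebooks could use until they are updated:

```python
import types

# Hypothetical stand-in for cophi_toolbox.preprocessing after the rename.
# The real module's new function name is NOT shown in the log; 'read_texts'
# is an assumption used only for illustration.
preprocessing = types.ModuleType("preprocessing")

def read_texts(paths):
    """Toy reader: yields one string per path (real code would open files)."""
    for path in paths:
        yield "text of %s" % path

preprocessing.read_texts = read_texts

# Defensive lookup: prefer the old name if it still exists, fall back to the
# (assumed) new one, and fail loudly naming both if neither is present.
reader = (getattr(preprocessing, "read_files", None)
          or getattr(preprocessing, "read_texts", None))
if reader is None:
    raise AttributeError("preprocessing has neither 'read_files' nor 'read_texts'")

corpus = list(reader(["a.txt", "b.txt"]))
```

This would turn the notebooks' AttributeError into either a working call or an error message that names both candidate functions, which is easier to act on in CI output.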