Started by GitHub push by ThoraHagen
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
Cloning the remote Git repository
Cloning repository git://github.com/DARIAH-DE/Topics
> /usr/bin/git init /mnt/data/jenkins/workspace/DARIAH-Topics # timeout=10
Fetching upstream changes from git://github.com/DARIAH-DE/Topics
> /usr/bin/git --version # timeout=10
using GIT_ASKPASS to set credentials
> /usr/bin/git fetch --tags --progress git://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> /usr/bin/git config remote.origin.url git://github.com/DARIAH-DE/Topics # timeout=10
> /usr/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> /usr/bin/git config remote.origin.url git://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from git://github.com/DARIAH-DE/Topics
using GIT_ASKPASS to set credentials
> /usr/bin/git fetch --tags --progress git://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
> /usr/bin/git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
> /usr/bin/git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision b3714747ccf8cde3e96a5efca4b6918aa1fe1863 (refs/remotes/origin/testing)
> /usr/bin/git config core.sparsecheckout # timeout=10
> /usr/bin/git checkout -f b3714747ccf8cde3e96a5efca4b6918aa1fe1863
Commit message: "updated paths in cmd command"
> /usr/bin/git rev-list --no-walk b3714747ccf8cde3e96a5efca4b6918aa1fe1863 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda7121193602667143410.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (18.1)
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting pytest (from -r requirements-dev.txt (line 4))
Downloading https://files.pythonhosted.org/packages/bb/d5/7601c468ded9a59478dcb39d21e24d58bb375681c64a06fbb629d2bc2ac3/pytest-4.0.0-py2.py3-none-any.whl (217kB)
Collecting pytest-cov (from -r requirements-dev.txt (line 5))
Using cached https://files.pythonhosted.org/packages/30/0a/1b009b525526cd3cd9f52f52391b426c5a3597447be811a10bcb1f6b05eb/pytest_cov-2.6.0-py2.py3-none-any.whl
Collecting nbsmoke (from -r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/36/be/1a91e27aa140a859322182d56787c0e55e0d417537cb1ec9c781a772f9c5/nbsmoke-0.2.7-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/83/df/0f5dd132200728a86190397e1ea87cd76244e42d39ec5e88efd25b2abd7e/jupyter-1.0.0-py2.py3-none-any.whl
Collecting sphinx (from -r requirements-dev.txt (line 10))
Downloading https://files.pythonhosted.org/packages/ff/d5/3a8727d6f890b1ae45da72a55bf8449e9f2c535a444923b338c3f509f203/Sphinx-1.8.2-py2.py3-none-any.whl (3.1MB)
Collecting nbsphinx from git+https://github.com/spatialaudio/nbsphinx#egg=nbsphinx (from -r requirements-dev.txt (line 11))
Cloning https://github.com/spatialaudio/nbsphinx to /tmp/pip-install-j8ykeblx/nbsphinx
Collecting recommonmark (from -r requirements-dev.txt (line 12))
Using cached https://files.pythonhosted.org/packages/df/a5/8ee4b84af7f997dfdba71254a88008cfc19c49df98983c9a4919e798f8ce/recommonmark-0.4.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/5d/d4/6e9c56a561f1d27407bf29318ca43f36ccaa289271b805a30034eb3a8ec4/pandas-0.23.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/16/07/ee3e02770ed456a088b90da7c9b1e9aa227e3c956d37b845cef2aab93764/regex-2018.11.22.tar.gz (648kB)
Collecting gensim>=0.13.2 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/03/0a/02c7ac51565a0a5b05a07936e5559a635d5d2e8cf19801e0f00204df5ece/gensim-3.6.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting lda>=1.0.5 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/54/4c/e387febde4dfffa6bd81d1feff7c3764816c8030561e5676a3f4ac635447/lda-1.1.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/86/04/bd774106ae0ae1ada68c67efe89f1a16b2aa373cc2db15d974002a9f136d/numpy-1.15.4-cp35-cp35m-manylinux1_x86_64.whl (13.8MB)
Collecting lxml>=3.6.4 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/eb/e2/02d18a1b3021b65409dd860f91cf0d68d79900f172bb3cc93cff21c3c951/lxml-4.2.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting matplotlib>=1.5.3 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/ad/4c/0415f15f96864c3a2242b1c74041a806c100c1b21741206c5d87684437c6/matplotlib-3.0.2-cp35-cp35m-manylinux1_x86_64.whl (12.9MB)
Collecting bokeh>=0.12.6 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/68/e1/4cb565f6186b31be1917e26372c0fab24683cd6e0faa542321a01460748e/bokeh-1.0.1.tar.gz (16.1MB)
Collecting metadata_toolbox (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting cophi_toolbox>=0.1.1.dev0 (from dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting pluggy>=0.7 (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/1c/e7/017c262070af41fe251401cb0d0e1b7c38f656da634cd0c15604f1f30864/pluggy-0.8.0-py2.py3-none-any.whl
Collecting attrs>=17.4.0 (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/3a/e1/5f9023cc983f1a628a8c2fd051ad19e76ff7b142a0faf329336f9a62a514/attrs-18.2.0-py2.py3-none-any.whl
Collecting more-itertools>=4.0.0 (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/79/b1/eace304ef66bd7d3d8b2f78cc374b73ca03bc53664d78151e9df3b3996cc/more_itertools-4.3.0-py3-none-any.whl
Requirement already satisfied, skipping upgrade: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->-r requirements-dev.txt (line 4)) (40.6.2)
Collecting pathlib2>=2.2.0; python_version < "3.6" (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/66/a7/9f8d84f31728d78beade9b1271ccbfb290c41c1e4dc13dbd4997ad594dcd/pathlib2-2.3.2-py2.py3-none-any.whl
Collecting atomicwrites>=1.0 (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/3a/9a/9d878f8d885706e2530402de6417141129a943802c084238914fa6798d97/atomicwrites-1.2.1-py2.py3-none-any.whl
Collecting six>=1.10.0 (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Collecting py>=1.5.0 (from pytest->-r requirements-dev.txt (line 4))
Using cached https://files.pythonhosted.org/packages/3e/c7/3da685ef117d42ac8d71af525208759742dd235f8094221fdaafcd3dba8f/py-1.7.0-py2.py3-none-any.whl
Collecting coverage>=4.4 (from pytest-cov->-r requirements-dev.txt (line 5))
Downloading https://files.pythonhosted.org/packages/fa/7c/e728df519842d537b2a4498553e2396867f50120fe303cd2b45e54b7e323/coverage-4.5.2-cp35-cp35m-manylinux1_x86_64.whl (205kB)
Collecting nbformat (from nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/da/27/9a654d2b6cc1eaa517d1c5a4405166c7f6d72f04f6e7eea41855fe808a46/nbformat-4.4.0-py2.py3-none-any.whl
Collecting jupyter-client (from nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/94/dd/fe6c4d683b09eb05342bd2816b7779663f71762b4fa9c2d5203d35d17354/jupyter_client-5.2.3-py2.py3-none-any.whl
Collecting nbconvert (from nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/b5/bb/94c493051d60e5b9c0f7f9a368b324201818c1b1c4cae85d1e49a41846c7/nbconvert-5.4.0-py2.py3-none-any.whl
Collecting pyflakes (from nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/44/98/af7a72c9a543b1487d92813c648cb9b9adfbc96faef5455d60f4439aa99b/pyflakes-2.0.0-py2.py3-none-any.whl
Collecting requests (from nbsmoke->-r requirements-dev.txt (line 6))
Downloading https://files.pythonhosted.org/packages/ff/17/5cbb026005115301a8fb2f9b0e3e8d32313142fe8b617070e7baad20554f/requests-2.20.1-py2.py3-none-any.whl (57kB)
Collecting ipykernel (from nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/d8/b0/f0be5c5ab335196f5cce96e5b889a4fcf5bfe462eb0acc05cd7e2caf65eb/ipykernel-5.1.0-py3-none-any.whl
Collecting beautifulsoup4 (from nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/21/0a/47fdf541c97fd9b6a610cb5fd518175308a7cc60569962e776ac52420387/beautifulsoup4-4.6.3-py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 9))
Downloading https://files.pythonhosted.org/packages/a2/5d/d1907cd32ac00b5ead56f6e61d9794fa60ef105a22ac5da6e7556011580f/notebook-5.7.2-py2.py3-none-any.whl (9.0MB)
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 9))
Downloading https://files.pythonhosted.org/packages/e0/7a/8aefbc0ed078dec7951ac9a06dcd1869243ecd7bcbce26fa47bf5e469a8f/qtconsole-4.4.3-py2.py3-none-any.whl (113kB)
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/30/9a/a008c7b1183fac9e52066d80a379b3c64eab535bd9d86cdc29a0b766fd82/ipywidgets-7.4.2-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/cb/ee/6374ae8c21b7d0847f9c3722dcdfac986b8e54fa9ad9ea66e1eb6320d2b8/jupyter_console-6.0.0-py2.py3-none-any.whl
Collecting docutils>=0.11 (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/36/fa/08e9e6e0e3cbd1d362c3bbee8d01d0aedb2155c4ac112b19ef3cae8eed8d/docutils-0.14-py3-none-any.whl
Collecting snowballstemmer>=1.1 (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/d4/6c/8a935e2c7b54a37714656d753e4187ee0631988184ed50c0cf6476858566/snowballstemmer-1.2.1-py2.py3-none-any.whl
Collecting alabaster<0.8,>=0.7 (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/10/ad/00b090d23a222943eb0eda509720a404f531a439e803f6538f35136cae9e/alabaster-0.7.12-py2.py3-none-any.whl
Collecting packaging (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/89/d1/92e6df2e503a69df9faab187c684585f0136662c12bb1f36901d426f3fab/packaging-18.0-py2.py3-none-any.whl
Collecting Pygments>=2.0 (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/02/ee/b6e02dc6529e82b75bb06823ff7d005b141037cb1416b10c6f00fc419dca/Pygments-2.2.0-py2.py3-none-any.whl
Collecting sphinxcontrib-websupport (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/52/69/3c2fbdc3702358c5b34ee25e387b24838597ef099761fc9a42c166796e8f/sphinxcontrib_websupport-1.1.0-py2.py3-none-any.whl
Collecting Jinja2>=2.3 (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Collecting imagesize (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/fc/b6/aef66b4c52a6ad6ac18cf6ebc5731ed06d8c9ae4d3b2d9951f261150be67/imagesize-1.1.0-py2.py3-none-any.whl
Collecting babel!=2.0,>=1.3 (from sphinx->-r requirements-dev.txt (line 10))
Using cached https://files.pythonhosted.org/packages/b8/ad/c6f60602d3ee3d92fbed87675b6fb6a6f9a38c223343ababdb44ba201f10/Babel-2.6.0-py2.py3-none-any.whl
Collecting traitlets (from nbsphinx->-r requirements-dev.txt (line 11))
Using cached https://files.pythonhosted.org/packages/93/d6/abcb22de61d78e2fc3959c964628a5771e47e7cc60d53e9342e21ed6cc9a/traitlets-4.3.2-py2.py3-none-any.whl
Collecting commonmark<=0.5.4 (from recommonmark->-r requirements-dev.txt (line 12))
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/f8/0e/2365ddc010afb3d79147f1dd544e5ee24bf4ece58ab99b16fbb465ce6dc0/pytz-2018.7-py2.py3-none-any.whl (506kB)
Collecting python-dateutil>=2.5.0 (from pandas>=0.19.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/74/68/d87d9b36af36f44254a8d512cbfc48369103a3b9e474be9bdfe536abfc45/python_dateutil-2.7.5-py2.py3-none-any.whl (225kB)
Collecting scipy>=0.18.1 (from gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/cd/32/5196b64476bd41d596a8aba43506e2403e019c90e1a3dfc21d51b83db5a6/scipy-1.1.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting pbr<4,>=0.6 (from lda>=1.0.5->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/0c/5d/b077dbf309993d52c1d71e6bf6fe443a8029ea215135ebbe0b1b10e7aefc/pbr-3.1.1-py2.py3-none-any.whl
Collecting kiwisolver>=1.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/7e/31/d6fedd4fb2c94755cd101191e581af30e1650ccce7a35bddb7930fed6574/kiwisolver-1.0.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting cycler>=0.10 (from matplotlib>=1.5.3->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib>=1.5.3->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/71/e8/6777f6624681c8b9701a8a0a5654f3eb56919a01a78e12bf3c73f5a3c714/pyparsing-2.3.0-py2.py3-none-any.whl (59kB)
Collecting PyYAML>=3.10 (from bokeh>=0.12.6->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting pillow>=4.0 (from bokeh>=0.12.6->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/bc/cc/b6e47b0075ca4267855d77850af7ea4194d2fc591664f1d70e5151b50637/Pillow-5.3.0-cp35-cp35m-manylinux1_x86_64.whl (2.0MB)
Collecting tornado>=4.3 (from bokeh>=0.12.6->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting parse>=1.8.2 (from metadata_toolbox->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/77/de/47e35a97b2b05c2fadbec67d44cfcdcd09b8086951b331d82de90d2912da/jsonschema-2.6.0-py2.py3-none-any.whl
Collecting jupyter-core (from nbformat->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/1d/44/065d2d7bae7bebc06f1dd70d23c36da8c50c0f08b4236716743d706762a8/jupyter_core-4.4.0-py2.py3-none-any.whl
Collecting ipython-genutils (from nbformat->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/fa/bc/9bd3b5c2b4774d5f33b2d544f1460be9df7df2fe42f352135381c347c69a/ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/9a/f9/6d7d3d1c83159997e2e7bc74d11e84a83aa1e7f85e6f028341414e7c2141/pyzmq-17.1.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting mistune>=0.8.1 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/09/ec/4b43dae793655b7d8a25f76119624350b4d65eb663459eb9603d7f1f0345/mistune-0.8.4-py2.py3-none-any.whl
Collecting testpath (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/be/a4/162f9ebb6489421fe46dcca2ae420369edfee4b563c668d93cb4605d12ba/testpath-0.4.2-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Collecting defusedxml (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/87/1c/17f3e3935a913dfe2a5ca85fa5ccbef366bfd82eb318b1f75dadbf0affca/defusedxml-0.5.0-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/cc/8b/4eefa9b47f1910b3d2081da67726b066e379b04ca897acfe9f92bac56147/entrypoints-0.2.3-py2.py3-none-any.whl
Collecting bleach (from nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/d4/0d/4696373c3b714f6022d668fbab619690a42050dbeacede6d10ed97fbd3e2/bleach-3.0.2-py2.py3-none-any.whl
Collecting idna<2.8,>=2.5 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/4b/2a/0276479a4b3caeb8a8c1af2f8e4355746a97fab05a372e4a2c6a6b876165/idna-2.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/56/9d/1d02dd80bc4cd955f98980f28c5ee2200e1209292d5f9e9cc8d030d18655/certifi-2018.10.15-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
Downloading https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl (118kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting ipython>=5.0.0 (from ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Downloading https://files.pythonhosted.org/packages/1b/e2/ffb8c1b574f972cf4183b0aac8f16b57f1e3bbe876b31555b107ea3fd009/ipython-7.1.1-py3-none-any.whl (764kB)
Collecting Send2Trash (from notebook->jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/49/46/c3dc27481d1cc57b9385aff41c474ceb7714f7935b1247194adae45db714/Send2Trash-1.5.0-py3-none-any.whl
Collecting prometheus-client (from notebook->jupyter->-r requirements-dev.txt (line 9))
Collecting terminado>=0.8.1 (from notebook->jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/2e/20/a26211a24425923d46e1213b376a6ee60dc30bcdf1b0c345e2c3769deb1c/terminado-0.8.1-py2.py3-none-any.whl
Collecting widgetsnbextension~=3.4.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/8a/81/35789a3952afb48238289171728072d26d6e76649ddc8b3588657a2d78c1/widgetsnbextension-3.4.2-py2.py3-none-any.whl
Collecting prompt-toolkit<2.1.0,>=2.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 9))
Downloading https://files.pythonhosted.org/packages/d1/e6/adb3be5576f5d27c6faa33f1e9fea8fe5dbd9351db12148de948507e352c/prompt_toolkit-2.0.7-py3-none-any.whl (338kB)
Collecting MarkupSafe>=0.23 (from Jinja2>=2.3->sphinx->-r requirements-dev.txt (line 10))
Downloading https://files.pythonhosted.org/packages/3e/a5/e188980ef1d0a4e0788b5143ea933f9afd760df38fec4c0b72b5ae3060c8/MarkupSafe-1.1.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting decorator (from traitlets->nbsphinx->-r requirements-dev.txt (line 11))
Using cached https://files.pythonhosted.org/packages/bc/bb/a24838832ba35baf52f32ab1a49b906b5f82fb7c76b2f6a7e35e140bac30/decorator-4.3.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Collecting boto3 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/a0/d8/8b000ffeba218d47cd81fbd0bd0b2790742f81ffe116964a298be785a8a4/boto3-1.9.50-py2.py3-none-any.whl (128kB)
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/23/10/c0b78c27298029e4454a472a1919bde20cb182dab1662cec7f2ca1dcc523/boto-2.49.0-py2.py3-none-any.whl
Collecting webencodings (from bleach->nbconvert->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl
Collecting pickleshare (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/9a/41/220f49aaea88bc6fa6cba8d05ecf24676326156c23b991e80b3f2fc24c77/pickleshare-0.7.5-py2.py3-none-any.whl
Collecting backcall (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Collecting pexpect; sys_platform != "win32" (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/89/e6/b5a1de8b0cc4e07ca1b305a4fcc3f9806025c1b651ea302646341222f88b/pexpect-4.6.0-py2.py3-none-any.whl
Collecting jedi>=0.10 (from ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/7a/1a/9bd24a185873b998611c2d8d4fb15cd5e8a879ead36355df7ee53e9111bf/jedi-0.13.1-py2.py3-none-any.whl
Collecting ptyprocess; os_name != "nt" (from terminado>=0.8.1->notebook->jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/d1/29/605c2cc68a9992d18dada28206eeada56ea4bd07a239669da41674648b6f/ptyprocess-0.6.0-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.1.0,>=2.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 9))
Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/d7/14/2a0004d487464d120c9fb85313a75cd3d71a7506955be458eebfe19a6b1d/s3transfer-0.1.13-py2.py3-none-any.whl
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/b7/31/05c8d001f7f87f0f07289a5fc0fc3832e9a57f2dbd4d3b0fee70e0d51365/jmespath-0.9.3-py2.py3-none-any.whl
Collecting botocore<1.13.0,>=1.12.50 (from boto3->smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==1.0.2.dev0->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/82/c1/a26012b4dbca2e2ae06d7b24dc6b4378c0b0544d25f3e3f216612db37033/botocore-1.12.50-py2.py3-none-any.whl (4.9MB)
Collecting parso>=0.3.0 (from jedi>=0.10->ipython>=5.0.0->ipykernel->nbsmoke->-r requirements-dev.txt (line 6))
Using cached https://files.pythonhosted.org/packages/09/51/9c48a46334be50c13d25a3afe55fa05c445699304c5ad32619de953a2305/parso-0.3.1-py2.py3-none-any.whl
Building wheels for collected packages: nbsphinx, regex, bokeh
Running setup.py bdist_wheel for nbsphinx: started
Running setup.py bdist_wheel for nbsphinx: finished with status 'done'
Stored in directory: /tmp/pip-ephem-wheel-cache-gvbsin4o/wheels/a3/7b/ff/b71160a2e088926eb13abcf8e0cde94c0aa8400c67d3df57f5
Running setup.py bdist_wheel for regex: started
Running setup.py bdist_wheel for regex: finished with status 'done'
Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/56/b8/60/93cb6f51554e529246d89c994c1cba7b64d768ff1680062661
Running setup.py bdist_wheel for bokeh: started
Running setup.py bdist_wheel for bokeh: finished with status 'done'
Stored in directory: /mnt/data/jenkins/.cache/pip/wheels/04/dc/69/0fe79e7fa6a44f344d5be532011ff43a362ed60060d82589be
Successfully built nbsphinx regex bokeh
nbsphinx 0.3.5 has requirement nbconvert!=5.4, but you'll have nbconvert 5.4.0 which is incompatible.
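(The warning above comes from the unpinned nbconvert dependency resolving to 5.4.0, which nbsphinx 0.3.5 excludes. One way to avoid it, sketched here only as a hypothetical pin and not something this build does, would be an explicit constraint in requirements-dev.txt:)

    nbconvert<5.4   # hypothetical pin; keeps the resolved version compatible with nbsphinx 0.3.5's "nbconvert!=5.4"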
Installing collected packages: pluggy, attrs, six, more-itertools, pathlib2, atomicwrites, py, pytest, coverage, pytest-cov, jsonschema, decorator, ipython-genutils, traitlets, jupyter-core, nbformat, python-dateutil, pyzmq, tornado, jupyter-client, mistune, MarkupSafe, Jinja2, Pygments, testpath, pandocfilters, defusedxml, entrypoints, webencodings, bleach, nbconvert, pyflakes, idna, certifi, urllib3, chardet, requests, pickleshare, backcall, wcwidth, prompt-toolkit, ptyprocess, pexpect, parso, jedi, ipython, ipykernel, beautifulsoup4, nbsmoke, Send2Trash, prometheus-client, terminado, notebook, qtconsole, widgetsnbextension, ipywidgets, jupyter-console, jupyter, docutils, snowballstemmer, alabaster, pyparsing, packaging, sphinxcontrib-websupport, imagesize, pytz, babel, sphinx, nbsphinx, commonmark, recommonmark, numpy, pandas, regex, scipy, bz2file, jmespath, botocore, s3transfer, boto3, boto, smart-open, gensim, pbr, lda, lxml, kiwisolver, cycler, matplotlib, PyYAML, pillow, bokeh, parse, metadata-toolbox, cophi-toolbox, dariah-topics
Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.10 MarkupSafe-1.1.0 PyYAML-3.13 Pygments-2.2.0 Send2Trash-1.5.0 alabaster-0.7.12 atomicwrites-1.2.1 attrs-18.2.0 babel-2.6.0 backcall-0.1.0 beautifulsoup4-4.6.3 bleach-3.0.2 bokeh-1.0.1 boto-2.49.0 boto3-1.9.50 botocore-1.12.50 bz2file-0.98 certifi-2018.10.15 chardet-3.0.4 commonmark-0.5.4 cophi-toolbox-0.1.1.dev0 coverage-4.5.2 cycler-0.10.0 dariah-topics decorator-4.3.0 defusedxml-0.5.0 docutils-0.14 entrypoints-0.2.3 gensim-3.6.0 idna-2.7 imagesize-1.1.0 ipykernel-5.1.0 ipython-7.1.1 ipython-genutils-0.2.0 ipywidgets-7.4.2 jedi-0.13.1 jmespath-0.9.3 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.2.3 jupyter-console-6.0.0 jupyter-core-4.4.0 kiwisolver-1.0.1 lda-1.1.0 lxml-4.2.5 matplotlib-3.0.2 metadata-toolbox-0.1.0.dev0 mistune-0.8.4 more-itertools-4.3.0 nbconvert-5.4.0 nbformat-4.4.0 nbsmoke-0.2.7 nbsphinx-0.3.5 notebook-5.7.2 numpy-1.15.4 packaging-18.0 pandas-0.23.4 pandocfilters-1.4.2 parse-1.9.0 parso-0.3.1 pathlib2-2.3.2 pbr-3.1.1 pexpect-4.6.0 pickleshare-0.7.5 pillow-5.3.0 pluggy-0.8.0 prometheus-client-0.4.2 prompt-toolkit-2.0.7 ptyprocess-0.6.0 py-1.7.0 pyflakes-2.0.0 pyparsing-2.3.0 pytest-4.0.0 pytest-cov-2.6.0 python-dateutil-2.7.5 pytz-2018.7 pyzmq-17.1.2 qtconsole-4.4.3 recommonmark-0.4.0 regex-2018.11.22 requests-2.20.1 s3transfer-0.1.13 scipy-1.1.0 six-1.11.0 smart-open-1.7.1 snowballstemmer-1.2.1 sphinx-1.8.2 sphinxcontrib-websupport-1.1.0 terminado-0.8.1 testpath-0.4.2 tornado-5.1.1 traitlets-4.3.2 urllib3-1.24.1 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-3.4.2
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing requirements to dariah_topics.egg-info/requires.txt
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
writing dariah_topics.egg-info/PKG-INFO
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
running check
creating dariah_topics-1.0.2.dev0
creating dariah_topics-1.0.2.dev0/dariah_topics
creating dariah_topics-1.0.2.dev0/dariah_topics.egg-info
creating dariah_topics-1.0.2.dev0/test
copying files to dariah_topics-1.0.2.dev0...
copying README.md -> dariah_topics-1.0.2.dev0
copying setup.cfg -> dariah_topics-1.0.2.dev0
copying setup.py -> dariah_topics-1.0.2.dev0
copying dariah_topics/__init__.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/modeling.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/postprocessing.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/utils.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-1.0.2.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-1.0.2.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-1.0.2.dev0/test
Writing dariah_topics-1.0.2.dev0/setup.cfg
creating dist
Creating tar archive
removing 'dariah_topics-1.0.2.dev0' (and everything under it)
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/dariah_topics
copying dariah_topics/modeling.py -> build/lib/dariah_topics
copying dariah_topics/preprocessing.py -> build/lib/dariah_topics
copying dariah_topics/utils.py -> build/lib/dariah_topics
copying dariah_topics/postprocessing.py -> build/lib/dariah_topics
copying dariah_topics/evaluation.py -> build/lib/dariah_topics
copying dariah_topics/__init__.py -> build/lib/dariah_topics
copying dariah_topics/visualization.py -> build/lib/dariah_topics
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/modeling.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/utils.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/postprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-1.0.2.dev0-py3.5.egg-info
running install_scripts
adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
creating build/bdist.linux-x86_64/wheel/dariah_topics-1.0.2.dev0.dist-info/WHEEL
creating 'dist/dariah_topics-1.0.2.dev0-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
adding 'dariah_topics/__init__.py'
adding 'dariah_topics/evaluation.py'
adding 'dariah_topics/modeling.py'
adding 'dariah_topics/postprocessing.py'
adding 'dariah_topics/preprocessing.py'
adding 'dariah_topics/utils.py'
adding 'dariah_topics/visualization.py'
adding 'dariah_topics-1.0.2.dev0.dist-info/LICENSE'
adding 'dariah_topics-1.0.2.dev0.dist-info/METADATA'
adding 'dariah_topics-1.0.2.dev0.dist-info/WHEEL'
adding 'dariah_topics-1.0.2.dev0.dist-info/top_level.txt'
adding 'dariah_topics-1.0.2.dev0.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:470: UserWarning: Normalizing '1.0.2.dev' to '1.0.2.dev0'
normalized_version,
+ pytest --nbsmoke-run
============================= test session starts ==============================
platform linux -- Python 3.5.3, pytest-4.0.0, py-1.7.0, pluggy-0.8.0
rootdir: /mnt/data/jenkins/workspace/DARIAH-Topics, inifile: setup.cfg
plugins: cov-2.6.0, nbsmoke-0.2.7
collected 58 items
dariah_topics/postprocessing.py ........... [ 18%]
dariah_topics/preprocessing.py ........................... [ 65%]
dariah_topics/utils.py ...... [ 75%]
notebooks/IntroducingGensim.ipynb F [ 77%]
notebooks/IntroducingLda.ipynb F [ 79%]
notebooks/IntroducingLda_DKProWrapper.ipynb F [ 81%]
notebooks/IntroducingMallet.ipynb F [ 82%]
notebooks/Visualizations.ipynb F [ 84%]
test/test_fuzzy_segmenting.py ......... [100%]
=================================== FAILURES ===================================
_______________________________________ _______________________________________
self = <CallInfo when='call' exception: An error occurred while executing the following cell:
------------------
PlotDocument...e, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f904b899bf8>
when = 'call', treat_keyboard_interrupt_as_exception = False
def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False):
#: context of invocation: one of "setup", "call",
#: "teardown", "memocollect"
self.when = when
self.start = time()
try:
> self.result = func()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> lambda: ihook(item=item, **kwds),
when=when,
treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:193:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
notincall = set()
def __call__(self, *args, **kwargs):
if args:
raise TypeError("hook calling supports only keyword arguments")
assert not self.is_historic()
if self.spec and self.spec.argnames:
notincall = (
set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
)
if notincall:
warnings.warn(
"Argument(s) {} which are declared in the hookspec "
"can not be found in this hook call".format(tuple(notincall)),
stacklevel=2,
)
> return self._hookexec(self, self.get_hookimpls(), kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/hooks.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_pytest.config.PytestPluginManager object at 0x7f908ec88390>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
def _hookexec(self, hook, methods, kwargs):
# called from all hookcaller instances.
# enable_tracing will set its own wrapping function at self._inner_hookexec
> return self._inner_hookexec(hook, methods, kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
methods,
kwargs,
> firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
res = hook_impl.function(*args)
if res is not None:
results.append(res)
if firstresult: # halt further impl calls
break
except BaseException:
excinfo = sys.exc_info()
finally:
if firstresult: # first result hooks return a single value
outcome = _Result(results[0] if results else None, excinfo)
else:
outcome = _Result(results, excinfo)
# run all wrapper post-yield blocks
for gen in reversed(teardowns):
try:
gen.send(outcome)
_raise_wrapfail(gen, "has second yield")
except StopIteration:
pass
> return outcome.get_result()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <pluggy.callers._Result object at 0x7f904bd9c0f0>
def get_result(self):
"""Get the result(s) for this hook call.
If the hook was marked as a ``firstresult`` only a single value
will be returned otherwise a list of results.
"""
__tracebackhide__ = True
if self._excinfo is None:
return self._result
else:
ex = self._excinfo
if _py3:
> raise ex[1].with_traceback(ex[2])
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
> res = hook_impl.function(*args)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>
def pytest_runtest_call(item):
_update_current_test_var(item, "call")
sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
try:
> item.runtest()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb'>
def runtest(self):
self._skip()
with io.open(self.name,encoding='utf8') as nb:
notebook = nbformat.read(nb, as_version=4)
# TODO: which kernel? run in pytest's or use new one (make it option)
_timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout')
kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
allow_errors=False,
# or sys.version_info[1] ?
kernel_name='python')
ep = ExecutePreprocessor(**kwargs)
with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right?
> ep.preprocess(notebook,{})
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f908ac82198>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}, km = None
def preprocess(self, nb, resources, km=None):
"""
Preprocess notebook executing each code cell.
The input argument `nb` is modified in-place.
Parameters
----------
nb : NotebookNode
Notebook being executed.
resources : dictionary
Additional resources used in the conversion process. For example,
passing ``{'metadata': {'path': run_path}}`` sets the
execution path to ``run_path``.
km: KernelManager (optional)
Optional kernel manager. If none is provided, a kernel manager will
be created.
Returns
-------
nb : NotebookNode
The executed notebook.
resources : dictionary
Additional resources used in the conversion process.
"""
with self.setup_preprocessor(nb, resources, km=km):
self.log.info("Executing notebook with kernel: %s" % self.kernel_name)
> nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f908ac82198>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}
def preprocess(self, nb, resources):
"""
Preprocessing to apply on each notebook.
Must return modified nb, resources.
If you wish to apply your preprocessing to each cell, you might want
to override preprocess_cell method instead.
Parameters
----------
nb : NotebookNode
Notebook being converted
resources : dictionary
Additional resources used in the conversion process. Allows
preprocessors to pass variables into the Jinja engine.
"""
for index, cell in enumerate(nb.cells):
> nb.cells[index], resources = self.preprocess_cell(cell, resources, index)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f908ac82198>
cell = {'cell_type': 'code', 'metadata': {'scrolled': False}, 'source': 'PlotDocumentTopics = visualization.PlotDocumentTopic...r, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale"}], 'execution_count': 22}
resources = {}, cell_index = 51
def preprocess_cell(self, cell, resources, cell_index):
"""
Executes a single code cell. See base.py for details.
To execute all cells see :meth:`preprocess`.
"""
if cell.cell_type != 'code' or not cell.source.strip():
return cell, resources
reply, outputs = self.run_cell(cell, cell_index)
cell.outputs = outputs
cell_allows_errors = (self.allow_errors or "raises-exception"
in cell.metadata.get("tags", []))
if self.force_raise_errors or not cell_allows_errors:
for out in outputs:
if out.output_type == 'error':
> raise CellExecutionError.from_cell_and_msg(cell, out)
E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics)
E show(PlotDocumentTopics.interactive_heatmap(), notebook_handle=True)
E ------------------
E
E ---------------------------------------------------------------------------
E AttributeError                            Traceback (most recent call last)
E <ipython-input-22-3c545f10dd80> in <module>
E       1 PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics)
E ----> 2 show(PlotDocumentTopics.interactive_heatmap(), notebook_handle=True)
E
E ~/workspace/DARIAH-Topics/dariah_topics/visualization.py in interactive_heatmap(self, palette, reverse_palette, tools, width, height, x_axis_location, toolbar_location, sizing_mode, line_color, grid_line_color, axis_line_color, major_tick_line_color, major_label_text_font_size, major_label_standoff, major_label_orientation, colorbar)
E     283 tools=tools, toolbar_location=toolbar_location,
E     284 sizing_mode=sizing_mode,
E --> 285 logo=None)
E     286 fig.rect(x='Documents', y='Topics', source=source, width=1, height=1,
E     287 fill_color={'field': 'Distributions', 'transform': mapper},
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in figure(**kwargs)
E    1021
E    1022 def figure(**kwargs):
E -> 1023     return Figure(**kwargs)
E    1024 figure.__doc__ = Figure.__doc__
E    1025
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in __init__(self, *arg, **kw)
E     187 kw['title'] = Title(text=title)
E     188
E --> 189 super(Figure, self).__init__(*arg, **kw)
E     190
E     191 self.x_range = _get_range(opts.x_range)
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/model.py in __init__(self, **kwargs)
E     259 self._document = None
E     260 self._temp_document = None
E --> 261 super(Model, self).__init__(**kwargs)
E     262 default_theme.apply_to_model(self)
E     263
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __init__(self, **properties)
E     252
E     253 for name, value in properties.items():
E --> 254     setattr(self, name, value)
E     255
E     256 def __setattr__(self, name, value):
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __setattr__(self, name, value)
E     287
E     288 raise AttributeError("unexpected attribute '%s' to %s, %s attributes are %s" %
E --> 289     (name, self.__class__.__name__, text, nice_join(matches)))
E     290
E     291 def __str__(self):
E
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:385: CellExecutionError
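(The notebook failures in this run appear to share one root cause: the open-ended pin bokeh>=0.12.6 resolved to bokeh 1.0.1, and Bokeh 1.0 no longer accepts a `logo` keyword on `figure()`, which `dariah_topics/visualization.py` line 285 still passes. Below is a minimal, hypothetical sketch of the post-1.0 idiom; the figure dimensions and the rect glyph are illustrative only and are not the project's code.)

    # Hypothetical sketch for Bokeh >= 1.0: figure() no longer takes logo=None.
    # The toolbar logo is controlled via the Toolbar object after the figure exists.
    from bokeh.plotting import figure, show

    fig = figure(plot_width=400, plot_height=300, toolbar_location='above')
    fig.toolbar.logo = None  # replaces the old logo=None keyword argument
    fig.rect(x=[1, 2, 3], y=[2, 3, 1], width=1, height=1)  # illustrative glyph
    show(fig)

(Alternatively, pinning bokeh below 1.0 in the requirements would keep the existing logo=None call working; that is an assumption about intent, not something this log shows.)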
_______________________________________ _______________________________________
self = <CallInfo when='call' exception: An error occurred while executing the following cell:
------------------
PlotDocument...e, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f90622377b8>
when = 'call', treat_keyboard_interrupt_as_exception = False
def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False):
#: context of invocation: one of "setup", "call",
#: "teardown", "memocollect"
self.when = when
self.start = time()
try:
> self.result = func()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> lambda: ihook(item=item, **kwds),
when=when,
treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:193:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>}
notincall = set()
def __call__(self, *args, **kwargs):
if args:
raise TypeError("hook calling supports only keyword arguments")
assert not self.is_historic()
if self.spec and self.spec.argnames:
notincall = (
set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
)
if notincall:
warnings.warn(
"Argument(s) {} which are declared in the hookspec "
"can not be found in this hook call".format(tuple(notincall)),
stacklevel=2,
)
> return self._hookexec(self, self.get_hookimpls(), kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/hooks.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_pytest.config.PytestPluginManager object at 0x7f908ec88390>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>}
def _hookexec(self, hook, methods, kwargs):
# called from all hookcaller instances.
# enable_tracing will set its own wrapping function at self._inner_hookexec
> return self._inner_hookexec(hook, methods, kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>}
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
methods,
kwargs,
> firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
res = hook_impl.function(*args)
if res is not None:
results.append(res)
if firstresult: # halt further impl calls
break
except BaseException:
excinfo = sys.exc_info()
finally:
if firstresult: # first result hooks return a single value
outcome = _Result(results[0] if results else None, excinfo)
else:
outcome = _Result(results, excinfo)
# run all wrapper post-yield blocks
for gen in reversed(teardowns):
try:
gen.send(outcome)
_raise_wrapfail(gen, "has second yield")
except StopIteration:
pass
> return outcome.get_result()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <pluggy.callers._Result object at 0x7f904b83d9e8>
def get_result(self):
"""Get the result(s) for this hook call.
If the hook was marked as a ``firstresult`` only a single value
will be returned otherwise a list of results.
"""
__tracebackhide__ = True
if self._excinfo is None:
return self._result
else:
ex = self._excinfo
if _py3:
> raise ex[1].with_traceback(ex[2])
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
> res = hook_impl.function(*args)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>
def pytest_runtest_call(item):
_update_current_test_var(item, "call")
sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
try:
> item.runtest()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda.ipynb'>
def runtest(self):
self._skip()
with io.open(self.name,encoding='utf8') as nb:
notebook = nbformat.read(nb, as_version=4)
# TODO: which kernel? run in pytest's or use new one (make it option)
_timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout')
kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
allow_errors=False,
# or sys.version_info[1] ?
kernel_name='python')
ep = ExecutePreprocessor(**kwargs)
with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right?
> ep.preprocess(notebook,{})
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f908b56e9b0>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}, km = None
def preprocess(self, nb, resources, km=None):
"""
Preprocess notebook executing each code cell.
The input argument `nb` is modified in-place.
Parameters
----------
nb : NotebookNode
Notebook being executed.
resources : dictionary
Additional resources used in the conversion process. For example,
passing ``{'metadata': {'path': run_path}}`` sets the
execution path to ``run_path``.
km: KernelManager (optional)
Optional kernel manager. If none is provided, a kernel manager will
be created.
Returns
-------
nb : NotebookNode
The executed notebook.
resources : dictionary
Additional resources used in the conversion process.
"""
with self.setup_preprocessor(nb, resources, km=km):
self.log.info("Executing notebook with kernel: %s" % self.kernel_name)
> nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f908b56e9b0>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}
def preprocess(self, nb, resources):
"""
Preprocessing to apply on each notebook.
Must return modified nb, resources.
If you wish to apply your preprocessing to each cell, you might want
to override preprocess_cell method instead.
Parameters
----------
nb : NotebookNode
Notebook being converted
resources : dictionary
Additional resources used in the conversion process. Allows
preprocessors to pass variables into the Jinja engine.
"""
for index, cell in enumerate(nb.cells):
> nb.cells[index], resources = self.preprocess_cell(cell, resources, index)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f908b56e9b0>
cell = {'cell_type': 'code', 'metadata': {}, 'source': 'PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics...r, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale"}], 'execution_count': 23}
resources = {}, cell_index = 56
def preprocess_cell(self, cell, resources, cell_index):
"""
Executes a single code cell. See base.py for details.
To execute all cells see :meth:`preprocess`.
"""
if cell.cell_type != 'code' or not cell.source.strip():
return cell, resources
reply, outputs = self.run_cell(cell, cell_index)
cell.outputs = outputs
cell_allows_errors = (self.allow_errors or "raises-exception"
in cell.metadata.get("tags", []))
if self.force_raise_errors or not cell_allows_errors:
for out in outputs:
if out.output_type == 'error':
> raise CellExecutionError.from_cell_and_msg(cell, out)
E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics)
E show(PlotDocumentTopics.interactive_heatmap(), notebook_handle=True)
E ------------------
E
E ---------------------------------------------------------------------------
E AttributeError                            Traceback (most recent call last)
E <ipython-input-23-3c545f10dd80> in <module>
E       1 PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics)
E ----> 2 show(PlotDocumentTopics.interactive_heatmap(), notebook_handle=True)
E
E ~/workspace/DARIAH-Topics/dariah_topics/visualization.py in interactive_heatmap(self, palette, reverse_palette, tools, width, height, x_axis_location, toolbar_location, sizing_mode, line_color, grid_line_color, axis_line_color, major_tick_line_color, major_label_text_font_size, major_label_standoff, major_label_orientation, colorbar)
E     283                      tools=tools, toolbar_location=toolbar_location,
E     284                      sizing_mode=sizing_mode,
E --> 285                      logo=None)
E     286         fig.rect(x='Documents', y='Topics', source=source, width=1, height=1,
E     287                  fill_color={'field': 'Distributions', 'transform': mapper},
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in figure(**kwargs)
E    1021
E    1022 def figure(**kwargs):
E -> 1023     return Figure(**kwargs)
E    1024 figure.__doc__ = Figure.__doc__
E    1025
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in __init__(self, *arg, **kw)
E     187             kw['title'] = Title(text=title)
E     188
E --> 189         super(Figure, self).__init__(*arg, **kw)
E     190
E     191         self.x_range = _get_range(opts.x_range)
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/model.py in __init__(self, **kwargs)
E     259         self._document = None
E     260         self._temp_document = None
E --> 261         super(Model, self).__init__(**kwargs)
E     262         default_theme.apply_to_model(self)
E     263
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __init__(self, **properties)
E     252
E     253         for name, value in properties.items():
E --> 254             setattr(self, name, value)
E     255
E     256     def __setattr__(self, name, value):
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __setattr__(self, name, value)
E     287
E     288             raise AttributeError("unexpected attribute '%s' to %s, %s attributes are %s" %
E --> 289                 (name, self.__class__.__name__, text, nice_join(matches)))
E     290
E     291     def __str__(self):
E
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:385: CellExecutionError
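Note on the failure above: the installed Bokeh no longer accepts a `logo` keyword on `figure()`; in Bokeh 0.13/1.x the toolbar logo is a property of the Toolbar object. A minimal sketch of how the call in dariah_topics/visualization.py could be adapted (plot size, tools and toolbar location here are illustrative, not the module's actual values):

    # Minimal sketch, assuming Bokeh 0.13/1.x; width/height/tools are illustrative.
    from bokeh.plotting import figure

    fig = figure(plot_width=800, plot_height=550,
                 tools="hover,pan,reset",
                 toolbar_location="above")
    # Replaces the removed `logo=None` keyword argument:
    fig.toolbar.logo = None

The remainder of interactive_heatmap (the rect glyphs and the color mapper shown in the traceback) would presumably stay unchanged.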
_______________________________________ _______________________________________
self = <CallInfo when='call' exception: An error occurred while executing the following cell:
------------------
meta = pd.co...
ValueError: No objects to concatenate
ValueError: No objects to concatenate
>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f90620f56a8>
when = 'call', treat_keyboard_interrupt_as_exception = False
def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False):
#: context of invocation: one of "setup", "call",
#: "teardown", "memocollect"
self.when = when
self.start = time()
try:
> self.result = func()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> lambda: ihook(item=item, **kwds),
when=when,
treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:193:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>}
notincall = set()
def __call__(self, *args, **kwargs):
if args:
raise TypeError("hook calling supports only keyword arguments")
assert not self.is_historic()
if self.spec and self.spec.argnames:
notincall = (
set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
)
if notincall:
warnings.warn(
"Argument(s) {} which are declared in the hookspec "
"can not be found in this hook call".format(tuple(notincall)),
stacklevel=2,
)
> return self._hookexec(self, self.get_hookimpls(), kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/hooks.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_pytest.config.PytestPluginManager object at 0x7f908ec88390>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>}
def _hookexec(self, hook, methods, kwargs):
# called from all hookcaller instances.
# enable_tracing will set its own wrapping function at self._inner_hookexec
> return self._inner_hookexec(hook, methods, kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>}
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
methods,
kwargs,
> firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
res = hook_impl.function(*args)
if res is not None:
results.append(res)
if firstresult: # halt further impl calls
break
except BaseException:
excinfo = sys.exc_info()
finally:
if firstresult: # first result hooks return a single value
outcome = _Result(results[0] if results else None, excinfo)
else:
outcome = _Result(results, excinfo)
# run all wrapper post-yield blocks
for gen in reversed(teardowns):
try:
gen.send(outcome)
_raise_wrapfail(gen, "has second yield")
except StopIteration:
pass
> return outcome.get_result()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <pluggy.callers._Result object at 0x7f904b83d4e0>
def get_result(self):
"""Get the result(s) for this hook call.
If the hook was marked as a ``firstresult`` only a single value
will be returned otherwise a list of results.
"""
__tracebackhide__ = True
if self._excinfo is None:
return self._result
else:
ex = self._excinfo
if _py3:
> raise ex[1].with_traceback(ex[2])
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
> res = hook_impl.function(*args)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>
def pytest_runtest_call(item):
_update_current_test_var(item, "call")
sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
try:
> item.runtest()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingLda_DKProWrapper.ipynb'>
def runtest(self):
self._skip()
with io.open(self.name,encoding='utf8') as nb:
notebook = nbformat.read(nb, as_version=4)
# TODO: which kernel? run in pytest's or use new one (make it option)
_timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout')
kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
allow_errors=False,
# or sys.version_info[1] ?
kernel_name='python')
ep = ExecutePreprocessor(**kwargs)
with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right?
> ep.preprocess(notebook,{})
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f904b83def0>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}, km = None
def preprocess(self, nb, resources, km=None):
"""
Preprocess notebook executing each code cell.
The input argument `nb` is modified in-place.
Parameters
----------
nb : NotebookNode
Notebook being executed.
resources : dictionary
Additional resources used in the conversion process. For example,
passing ``{'metadata': {'path': run_path}}`` sets the
execution path to ``run_path``.
km: KernelManager (optional)
Optional kernel manager. If none is provided, a kernel manager will
be created.
Returns
-------
nb : NotebookNode
The executed notebook.
resources : dictionary
Additional resources used in the conversion process.
"""
with self.setup_preprocessor(nb, resources, km=km):
self.log.info("Executing notebook with kernel: %s" % self.kernel_name)
> nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f904b83def0>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}
def preprocess(self, nb, resources):
"""
Preprocessing to apply on each notebook.
Must return modified nb, resources.
If you wish to apply your preprocessing to each cell, you might want
to override preprocess_cell method instead.
Parameters
----------
nb : NotebookNode
Notebook being converted
resources : dictionary
Additional resources used in the conversion process. Allows
preprocessors to pass variables into the Jinja engine.
"""
for index, cell in enumerate(nb.cells):
> nb.cells[index], resources = self.preprocess_cell(cell, resources, index)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f904b83def0>
cell = {'cell_type': 'code', 'metadata': {'collapsed': False, 'scrolled': True}, 'source': "meta = pd.concat([metadata.fname2...0m: No objects to concatenate'], 'output_type': 'error', 'evalue': 'No objects to concatenate'}], 'execution_count': 7}
resources = {}, cell_index = 19
def preprocess_cell(self, cell, resources, cell_index):
"""
Executes a single code cell. See base.py for details.
To execute all cells see :meth:`preprocess`.
"""
if cell.cell_type != 'code' or not cell.source.strip():
return cell, resources
reply, outputs = self.run_cell(cell, cell_index)
cell.outputs = outputs
cell_allows_errors = (self.allow_errors or "raises-exception"
in cell.metadata.get("tags", []))
if self.force_raise_errors or not cell_allows_errors:
for out in outputs:
if out.output_type == 'error':
> raise CellExecutionError.from_cell_and_msg(cell, out)
E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E meta = pd.concat([metadata.fname2metadata(str(path), pattern=pattern) for path in path_to_corpus.glob('*.csv')])
E meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed
E ------------------
E
E ---------------------------------------------------------------------------
E ValueError                                Traceback (most recent call last)
E <ipython-input-7-30c5c181683a> in <module>
E ----> 1 meta = pd.concat([metadata.fname2metadata(str(path), pattern=pattern) for path in path_to_corpus.glob('*.csv')])
E       2 meta[:5] # by adding '[:5]' to the variable, only the first 5 elements will be printed
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/reshape/concat.py in concat(objs, axis, join, join_axes, ignore_index, keys, levels, names, verify_integrity, sort, copy)
E     223                        keys=keys, levels=levels, names=names,
E     224                        verify_integrity=verify_integrity,
E --> 225                        copy=copy, sort=sort)
E     226     return op.get_result()
E     227
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/reshape/concat.py in __init__(self, objs, axis, join, join_axes, keys, levels, names, ignore_index, verify_integrity, copy, sort)
E     257
E     258         if len(objs) == 0:
E --> 259             raise ValueError('No objects to concatenate')
E     260
E     261         if keys is None:
E
E ValueError: No objects to concatenate
E ValueError: No objects to concatenate
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:385: CellExecutionError
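Note on the failure above: pd.concat raises this ValueError when it is handed an empty list, i.e. path_to_corpus.glob('*.csv') matched no files in the notebook's working directory. A minimal sketch of a guard around the failing cell, assuming the notebook's existing names (path_to_corpus, pattern, metadata.fname2metadata) are defined in earlier cells:

    # Minimal sketch; path_to_corpus, pattern and metadata.fname2metadata are
    # the notebook's own names and are assumed to exist at this point.
    import pandas as pd

    frames = [metadata.fname2metadata(str(path), pattern=pattern)
              for path in sorted(path_to_corpus.glob('*.csv'))]
    if not frames:
        raise FileNotFoundError("no *.csv files found in {}".format(path_to_corpus))
    meta = pd.concat(frames)
    meta[:5]

The guard turns the opaque concat error into an explicit message about the corpus path, which is the value that actually needs fixing here.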
_______________________________________ _______________________________________
self = <CallInfo when='call' exception: An error occurred while executing the following cell:
------------------
PlotDocument...e, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f90620f57b8>
when = 'call', treat_keyboard_interrupt_as_exception = False
def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False):
#: context of invocation: one of "setup", "call",
#: "teardown", "memocollect"
self.when = when
self.start = time()
try:
> self.result = func()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> lambda: ihook(item=item, **kwds),
when=when,
treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:193:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>}
notincall = set()
def __call__(self, *args, **kwargs):
if args:
raise TypeError("hook calling supports only keyword arguments")
assert not self.is_historic()
if self.spec and self.spec.argnames:
notincall = (
set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
)
if notincall:
warnings.warn(
"Argument(s) {} which are declared in the hookspec "
"can not be found in this hook call".format(tuple(notincall)),
stacklevel=2,
)
> return self._hookexec(self, self.get_hookimpls(), kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/hooks.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_pytest.config.PytestPluginManager object at 0x7f908ec88390>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>}
def _hookexec(self, hook, methods, kwargs):
# called from all hookcaller instances.
# enable_tracing will set its own wrapping function at self._inner_hookexec
> return self._inner_hookexec(hook, methods, kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>}
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
methods,
kwargs,
> firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
res = hook_impl.function(*args)
if res is not None:
results.append(res)
if firstresult: # halt further impl calls
break
except BaseException:
excinfo = sys.exc_info()
finally:
if firstresult: # first result hooks return a single value
outcome = _Result(results[0] if results else None, excinfo)
else:
outcome = _Result(results, excinfo)
# run all wrapper post-yield blocks
for gen in reversed(teardowns):
try:
gen.send(outcome)
_raise_wrapfail(gen, "has second yield")
except StopIteration:
pass
> return outcome.get_result()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <pluggy.callers._Result object at 0x7f90621bf748>
def get_result(self):
"""Get the result(s) for this hook call.
If the hook was marked as a ``firstresult`` only a single value
will be returned otherwise a list of results.
"""
__tracebackhide__ = True
if self._excinfo is None:
return self._result
else:
ex = self._excinfo
if _py3:
> raise ex[1].with_traceback(ex[2])
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
> res = hook_impl.function(*args)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>
def pytest_runtest_call(item):
_update_current_test_var(item, "call")
sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
try:
> item.runtest()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingMallet.ipynb'>
def runtest(self):
self._skip()
with io.open(self.name,encoding='utf8') as nb:
notebook = nbformat.read(nb, as_version=4)
# TODO: which kernel? run in pytest's or use new one (make it option)
_timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout')
kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
allow_errors=False,
# or sys.version_info[1] ?
kernel_name='python')
ep = ExecutePreprocessor(**kwargs)
with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right?
> ep.preprocess(notebook,{})
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f90621bf278>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}, km = None
def preprocess(self, nb, resources, km=None):
"""
Preprocess notebook executing each code cell.
The input argument `nb` is modified in-place.
Parameters
----------
nb : NotebookNode
Notebook being executed.
resources : dictionary
Additional resources used in the conversion process. For example,
passing ``{'metadata': {'path': run_path}}`` sets the
execution path to ``run_path``.
km: KernelManager (optional)
Optional kernel manager. If none is provided, a kernel manager will
be created.
Returns
-------
nb : NotebookNode
The executed notebook.
resources : dictionary
Additional resources used in the conversion process.
"""
with self.setup_preprocessor(nb, resources, km=km):
self.log.info("Executing notebook with kernel: %s" % self.kernel_name)
> nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f90621bf278>
nb = {'nbformat_minor': 1, 'metadata': {'kernelspec': {'language': 'python', 'name': 'python3', 'display_name': 'Python 3'}...= PlotDocumentTopics.static_heatmap()\nstatic_heatmap.show()', 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}
def preprocess(self, nb, resources):
"""
Preprocessing to apply on each notebook.
Must return modified nb, resources.
If you wish to apply your preprocessing to each cell, you might want
to override preprocess_cell method instead.
Parameters
----------
nb : NotebookNode
Notebook being converted
resources : dictionary
Additional resources used in the conversion process. Allows
preprocessors to pass variables into the Jinja engine.
"""
for index, cell in enumerate(nb.cells):
> nb.cells[index], resources = self.preprocess_cell(cell, resources, index)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f90621bf278>
cell = {'cell_type': 'code', 'metadata': {'collapsed': True}, 'source': 'PlotDocumentTopics = visualization.PlotDocumentTopic...r, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale"}], 'execution_count': 25}
resources = {}, cell_index = 60
def preprocess_cell(self, cell, resources, cell_index):
"""
Executes a single code cell. See base.py for details.
To execute all cells see :meth:`preprocess`.
"""
if cell.cell_type != 'code' or not cell.source.strip():
return cell, resources
reply, outputs = self.run_cell(cell, cell_index)
cell.outputs = outputs
cell_allows_errors = (self.allow_errors or "raises-exception"
in cell.metadata.get("tags", []))
if self.force_raise_errors or not cell_allows_errors:
for out in outputs:
if out.output_type == 'error':
> raise CellExecutionError.from_cell_and_msg(cell, out)
E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics)
E show(PlotDocumentTopics.interactive_heatmap(), notebook_handle=True)
E ------------------
E
E ---------------------------------------------------------------------------
E AttributeError                            Traceback (most recent call last)
E <ipython-input-25-3c545f10dd80> in <module>
E       1 PlotDocumentTopics = visualization.PlotDocumentTopics(document_topics)
E ----> 2 show(PlotDocumentTopics.interactive_heatmap(), notebook_handle=True)
E
E ~/workspace/DARIAH-Topics/dariah_topics/visualization.py in interactive_heatmap(self, palette, reverse_palette, tools, width, height, x_axis_location, toolbar_location, sizing_mode, line_color, grid_line_color, axis_line_color, major_tick_line_color, major_label_text_font_size, major_label_standoff, major_label_orientation, colorbar)
E     283                      tools=tools, toolbar_location=toolbar_location,
E     284                      sizing_mode=sizing_mode,
E --> 285                      logo=None)
E     286         fig.rect(x='Documents', y='Topics', source=source, width=1, height=1,
E     287                  fill_color={'field': 'Distributions', 'transform': mapper},
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in figure(**kwargs)
E    1021
E    1022 def figure(**kwargs):
E -> 1023     return Figure(**kwargs)
E    1024 figure.__doc__ = Figure.__doc__
E    1025
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in __init__(self, *arg, **kw)
E     187             kw['title'] = Title(text=title)
E     188
E --> 189         super(Figure, self).__init__(*arg, **kw)
E     190
E     191         self.x_range = _get_range(opts.x_range)
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/model.py in __init__(self, **kwargs)
E     259         self._document = None
E     260         self._temp_document = None
E --> 261         super(Model, self).__init__(**kwargs)
E     262         default_theme.apply_to_model(self)
E     263
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __init__(self, **properties)
E     252
E     253         for name, value in properties.items():
E --> 254             setattr(self, name, value)
E     255
E     256     def __setattr__(self, name, value):
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __setattr__(self, name, value)
E     287
E     288             raise AttributeError("unexpected attribute '%s' to %s, %s attributes are %s" %
E --> 289                 (name, self.__class__.__name__, text, nice_join(matches)))
E     290
E     291     def __str__(self):
E
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:385: CellExecutionError
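Note on the failure above: this is the same missing-`logo` error as in IntroducingLda.ipynb; the toolbar-logo sketch shown after that traceback applies here as well. To reproduce any of these notebook failures outside Jenkins, the nbsmoke runtest shown in the traceback can be approximated directly with nbconvert. A minimal sketch, assuming the notebook path below and an available 'python' kernelspec:

    # Minimal sketch of what nbsmoke's RunNb.runtest does for a single notebook.
    import io, os
    import nbformat
    from nbconvert.preprocessors import ExecutePreprocessor

    path = "notebooks/IntroducingMallet.ipynb"        # assumed path, relative to the repo root
    with io.open(path, encoding="utf8") as fh:
        nb = nbformat.read(fh, as_version=4)

    ep = ExecutePreprocessor(timeout=300, allow_errors=False, kernel_name="python")
    os.chdir(os.path.dirname(os.path.abspath(path)))  # notebooks expect their own directory as cwd
    ep.preprocess(nb, {})                             # raises CellExecutionError on the failing cell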
_______________________________________ _______________________________________
self = <CallInfo when='call' exception: An error occurred while executing the following cell:
------------------
interactive_...e, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f9062120ae8>
when = 'call', treat_keyboard_interrupt_as_exception = False
def __init__(self, func, when, treat_keyboard_interrupt_as_exception=False):
#: context of invocation: one of "setup", "call",
#: "teardown", "memocollect"
self.when = when
self.start = time()
try:
> self.result = func()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> lambda: ihook(item=item, **kwds),
when=when,
treat_keyboard_interrupt_as_exception=item.config.getvalue("usepdb"),
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:193:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_HookCaller 'pytest_runtest_call'>, args = ()
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>}
notincall = set()
def __call__(self, *args, **kwargs):
if args:
raise TypeError("hook calling supports only keyword arguments")
assert not self.is_historic()
if self.spec and self.spec.argnames:
notincall = (
set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
)
if notincall:
warnings.warn(
"Argument(s) {} which are declared in the hookspec "
"can not be found in this hook call".format(tuple(notincall)),
stacklevel=2,
)
> return self._hookexec(self, self.get_hookimpls(), kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/hooks.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_pytest.config.PytestPluginManager object at 0x7f908ec88390>
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>}
def _hookexec(self, hook, methods, kwargs):
# called from all hookcaller instances.
# enable_tracing will set its own wrapping function at self._inner_hookexec
> return self._inner_hookexec(hook, methods, kwargs)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>}
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
methods,
kwargs,
> firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/manager.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
res = hook_impl.function(*args)
if res is not None:
results.append(res)
if firstresult: # halt further impl calls
break
except BaseException:
excinfo = sys.exc_info()
finally:
if firstresult: # first result hooks return a single value
outcome = _Result(results[0] if results else None, excinfo)
else:
outcome = _Result(results, excinfo)
# run all wrapper post-yield blocks
for gen in reversed(teardowns):
try:
gen.send(outcome)
_raise_wrapfail(gen, "has second yield")
except StopIteration:
pass
> return outcome.get_result()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <pluggy.callers._Result object at 0x7f906218d240>
def get_result(self):
"""Get the result(s) for this hook call.
If the hook was marked as a ``firstresult`` only a single value
will be returned otherwise a list of results.
"""
__tracebackhide__ = True
if self._excinfo is None:
return self._result
else:
ex = self._excinfo
if _py3:
> raise ex[1].with_traceback(ex[2])
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/mnt/data/jenkins/shiningpanda/jobs/62c67c92/vi...9089a05390>>, <HookImpl plugin_name='logging-plugin', plugin=<_pytest.logging.LoggingPlugin object at 0x7f9089997f98>>]
caller_kwargs = {'item': <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>}
firstresult = False
def _multicall(hook_impls, caller_kwargs, firstresult=False):
"""Execute a call into multiple python functions/methods and return the
result(s).
``caller_kwargs`` comes from _HookCaller.__call__().
"""
__tracebackhide__ = True
results = []
excinfo = None
try: # run impl and wrapper setup functions in a loop
teardowns = []
try:
for hook_impl in reversed(hook_impls):
try:
args = [caller_kwargs[argname] for argname in hook_impl.argnames]
except KeyError:
for argname in hook_impl.argnames:
if argname not in caller_kwargs:
raise HookCallError(
"hook call must provide argument %r" % (argname,)
)
if hook_impl.hookwrapper:
try:
gen = hook_impl.function(*args)
next(gen) # first yield
teardowns.append(gen)
except StopIteration:
_raise_wrapfail(gen, "did not yield")
else:
> res = hook_impl.function(*args)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pluggy/callers.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
item = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>
def pytest_runtest_call(item):
_update_current_test_var(item, "call")
sys.last_type, sys.last_value, sys.last_traceback = (None, None, None)
try:
> item.runtest()
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/runner.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <RunNb '/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/Visualizations.ipynb'>
def runtest(self):
self._skip()
with io.open(self.name,encoding='utf8') as nb:
notebook = nbformat.read(nb, as_version=4)
# TODO: which kernel? run in pytest's or use new one (make it option)
_timeout = self.parent.parent.config.getini('nbsmoke_cell_timeout')
kwargs = dict(timeout=int(_timeout) if _timeout!='' else 300,
allow_errors=False,
# or sys.version_info[1] ?
kernel_name='python')
ep = ExecutePreprocessor(**kwargs)
with cwd(os.path.dirname(self.name)): # jupyter notebook always does this, right?
> ep.preprocess(notebook,{})
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbsmoke/__init__.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f906218dc18>
nb = {'nbformat_minor': 2, 'metadata': {'language_info': {'mimetype': 'text/x-python', 'nbconvert_exporter': 'python', 'cod...how(interactive_barchart_per_document, notebook_handle=True)", 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}, km = None
def preprocess(self, nb, resources, km=None):
"""
Preprocess notebook executing each code cell.
The input argument `nb` is modified in-place.
Parameters
----------
nb : NotebookNode
Notebook being executed.
resources : dictionary
Additional resources used in the conversion process. For example,
passing ``{'metadata': {'path': run_path}}`` sets the
execution path to ``run_path``.
km: KernelManager (optional)
Optional kernel manager. If none is provided, a kernel manager will
be created.
Returns
-------
nb : NotebookNode
The executed notebook.
resources : dictionary
Additional resources used in the conversion process.
"""
with self.setup_preprocessor(nb, resources, km=km):
self.log.info("Executing notebook with kernel: %s" % self.kernel_name)
> nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f906218dc18>
nb = {'nbformat_minor': 2, 'metadata': {'language_info': {'mimetype': 'text/x-python', 'nbconvert_exporter': 'python', 'cod...how(interactive_barchart_per_document, notebook_handle=True)", 'outputs': [], 'execution_count': None}], 'nbformat': 4}
resources = {}
def preprocess(self, nb, resources):
"""
Preprocessing to apply on each notebook.
Must return modified nb, resources.
If you wish to apply your preprocessing to each cell, you might want
to override preprocess_cell method instead.
Parameters
----------
nb : NotebookNode
Notebook being converted
resources : dictionary
Additional resources used in the conversion process. Allows
preprocessors to pass variables into the Jinja engine.
"""
for index, cell in enumerate(nb.cells):
> nb.cells[index], resources = self.preprocess_cell(cell, resources, index)
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/base.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f906218dc18>
cell = {'cell_type': 'code', 'metadata': {}, 'source': 'interactive_heatmap = PlotDocumentTopics.interactive_heatmap(width=80...r, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale"}], 'execution_count': 21}
resources = {}, cell_index = 26
def preprocess_cell(self, cell, resources, cell_index):
"""
Executes a single code cell. See base.py for details.
To execute all cells see :meth:`preprocess`.
"""
if cell.cell_type != 'code' or not cell.source.strip():
return cell, resources
reply, outputs = self.run_cell(cell, cell_index)
cell.outputs = outputs
cell_allows_errors = (self.allow_errors or "raises-exception"
in cell.metadata.get("tags", []))
if self.force_raise_errors or not cell_allows_errors:
for out in outputs:
if out.output_type == 'error':
> raise CellExecutionError.from_cell_and_msg(cell, out)
E nbconvert.preprocessors.execute.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E interactive_heatmap = PlotDocumentTopics.interactive_heatmap(width=800,
E height=550,
E colorbar=False)
E show(interactive_heatmap, notebook_handle=True)
E ------------------
E
E ---------------------------------------------------------------------------
E AttributeError                            Traceback (most recent call last)
E <ipython-input-21-0ec82b4d408d> in <module>
E       1 interactive_heatmap = PlotDocumentTopics.interactive_heatmap(width=800,
E       2                                                              height=550,
E ----> 3                                                              colorbar=False)
E       4 show(interactive_heatmap, notebook_handle=True)
E
E ~/workspace/DARIAH-Topics/dariah_topics/visualization.py in interactive_heatmap(self, palette, reverse_palette, tools, width, height, x_axis_location, toolbar_location, sizing_mode, line_color, grid_line_color, axis_line_color, major_tick_line_color, major_label_text_font_size, major_label_standoff, major_label_orientation, colorbar)
E     283                           tools=tools, toolbar_location=toolbar_location,
E     284                           sizing_mode=sizing_mode,
E --> 285                           logo=None)
E     286         fig.rect(x='Documents', y='Topics', source=source, width=1, height=1,
E     287                  fill_color={'field': 'Distributions', 'transform': mapper},
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in figure(**kwargs)
E    1021
E    1022 def figure(**kwargs):
E -> 1023     return Figure(**kwargs)
E    1024 figure.__doc__ = Figure.__doc__
E    1025
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/plotting/figure.py in __init__(self, *arg, **kw)
E     187             kw['title'] = Title(text=title)
E     188
E --> 189         super(Figure, self).__init__(*arg, **kw)
E     190
E     191         self.x_range = _get_range(opts.x_range)
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/model.py in __init__(self, **kwargs)
E     259         self._document = None
E     260         self._temp_document = None
E --> 261         super(Model, self).__init__(**kwargs)
E     262         default_theme.apply_to_model(self)
E     263
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __init__(self, **properties)
E     252
E     253         for name, value in properties.items():
E --> 254             setattr(self, name, value)
E     255
E     256     def __setattr__(self, name, value):
E
E ~/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/bokeh/core/has_props.py in __setattr__(self, name, value)
E     287
E     288             raise AttributeError("unexpected attribute '%s' to %s, %s attributes are %s" %
E --> 289                 (name, self.__class__.__name__, text, nice_join(matches)))
E     290
E     291     def __str__(self):
E
E AttributeError: unexpected attribute 'logo' to Figure, possible attributes are above, aspect_scale, background_fill_alpha, background_fill_color, below, border_fill_alpha, border_fill_color, css_classes, disabled, extra_x_ranges, extra_y_ranges, h_symmetry, height, hidpi, inner_height, inner_width, js_event_callbacks, js_property_callbacks, layout_height, layout_width, left, lod_factor, lod_interval, lod_threshold, lod_timeout, match_aspect, min_border, min_border_bottom, min_border_left, min_border_right, min_border_top, name, outline_line_alpha, outline_line_cap, outline_line_color, outline_line_dash, outline_line_dash_offset, outline_line_join, outline_line_width, output_backend, plot_height, plot_width, renderers, right, sizing_mode, subscribed_events, tags, title, title_location, toolbar, toolbar_location, toolbar_sticky, v_symmetry, width, x_range, x_scale, y_range or y_scale
../../shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nbconvert/preprocessors/execute.py:385: CellExecutionError
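Note on the failure above: the Visualizations.ipynb cell dies because interactive_heatmap() in dariah_topics/visualization.py still passes logo=None straight into bokeh.plotting.figure(), and the Bokeh release installed into this virtualenv no longer accepts logo as a Figure keyword; the toolbar logo is now controlled through the figure's Toolbar model. A minimal sketch of the likely fix, assuming a recent Bokeh (1.x); the concrete size and tool values below are illustrative, only the logo handling is the point:

    from bokeh.plotting import figure

    # Build the heatmap figure without the removed logo= keyword ...
    fig = figure(plot_width=800, plot_height=550,
                 tools='hover,pan,reset,save,wheel_zoom',
                 toolbar_location='above',
                 sizing_mode='fixed')

    # ... and hide the Bokeh logo through the Toolbar model instead.
    fig.toolbar.logo = None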
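To reproduce the notebook failure outside Jenkins, the nbsmoke runtest() shown in the traceback can be mirrored with nbconvert directly. This is only a sketch of a local setup: the path comes from the failing test item, the timeout and allow_errors settings match nbsmoke's defaults, and 'python3' stands in for the kernel name (nbsmoke itself passes kernel_name='python'):

    import os
    import nbformat
    from nbconvert.preprocessors import ExecutePreprocessor

    nb_path = 'notebooks/Visualizations.ipynb'           # notebook from the failing test item
    with open(nb_path, encoding='utf8') as fh:
        notebook = nbformat.read(fh, as_version=4)

    # Same settings nbsmoke uses: 300 s per cell, stop on the first cell error.
    ep = ExecutePreprocessor(timeout=300, allow_errors=False, kernel_name='python3')

    os.chdir(os.path.dirname(os.path.abspath(nb_path)))  # nbsmoke also runs from the notebook's directory
    ep.preprocess(notebook, {})                          # raises CellExecutionError on the Bokeh cell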
--- generated xml file: /mnt/data/jenkins/workspace/DARIAH-Topics/tests.xml ----
----------- coverage: platform linux, python 3.5.3-final-0 -----------
Coverage HTML written to dir htmlcov
Coverage XML written to file coverage.xml
=============================== warnings summary ===============================
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/scipy/sparse/sparsetools.py:21
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/scipy/sparse/sparsetools.py:21: DeprecationWarning: `scipy.sparse.sparsetools` is deprecated!
scipy.sparse.sparsetools is a private module for scipy.sparse, and should not be used.
_deprecated()
dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_gensim_document_topics
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_gensim_topics
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/gensim/models/ldamodel.py:1077: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.from_iter(generator)) or the python sum builtin instead.
score += np.sum(cnt * logsumexp(Elogthetad + Elogbeta[:, int(id)]) for id, cnt in doc)
dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_lda_document_topics
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead.
"""
dariah_topics/postprocessing.py::dariah_topics.postprocessing._show_lda_topics
/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py:1: FutureWarning: Method .as_matrix will be removed in a future version. Use .values instead.
"""
dariah_topics/postprocessing.py::dariah_topics.postprocessing.doc2bow
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/cophi_toolbox/preprocessing.py:625: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
dariah_topics/postprocessing.py::dariah_topics.postprocessing.save_document_term_matrix
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/cophi_toolbox/preprocessing.py:625: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
document_term_matrix.set_value((document_id, type_id), 0, int(bag_of_words[document_id][type_id]))
notebooks/IntroducingGensim.ipynb::/mnt/data/jenkins/workspace/DARIAH-Topics/notebooks/IntroducingGensim.ipynb
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/jupyter_client/manager.py:72: DeprecationWarning: KernelManager._kernel_name_changed is deprecated in traitlets 4.1: use @observe and @unobserve instead.
def _kernel_name_changed(self, name, old, new):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
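Note on the pandas FutureWarnings above: DataFrame.as_matrix() and DataFrame.set_value() are being removed from pandas; the replacements are .values and the .at[] indexer. A minimal sketch of the substitutions, using a small stand-in document_term_matrix (the real one is built by cophi_toolbox.preprocessing; everything here besides the deprecated and replacement method names is illustrative):

    import pandas as pd

    # Stand-in for the (document_id, type_id)-indexed document-term matrix.
    index = pd.MultiIndex.from_tuples([(0, 0), (0, 1)], names=['document_id', 'type_id'])
    document_term_matrix = pd.DataFrame(0, index=index, columns=[0])

    # Deprecated: document_term_matrix.set_value((0, 1), 0, 5)
    document_term_matrix.at[(0, 1), 0] = 5     # .at[] replaces set_value for scalar writes

    # Deprecated: document_term_matrix.as_matrix()
    matrix = document_term_matrix.values       # .values replaces as_matrix()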
============== 5 failed, 53 passed, 10 warnings in 163.31 seconds ==============
Build step 'Virtualenv Builder' marked build as failure
Recording test results
[Cobertura] Publishing Cobertura coverage report...
[Cobertura] Publishing Cobertura coverage results...
[Cobertura] Cobertura coverage report found.
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@6cfcf1a6[description=A Python library for topic modeling.,homepage=https://dariah-de.github.io/Topics,name=Topics,fork=true,size=42294,milestones={},language=Python,commits={},responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/octet-stream], Date=[Thu, 22 Nov 2018 12:20:04 GMT], ETag=["000d500583338110c48ab48d6bffc9e1"], Last-Modified=[Thu, 08 Nov 2018 02:39:54 GMT], OkHttp-Received-Millis=[1542889204987], OkHttp-Response-Source=[CONDITIONAL_CACHE 304], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1542889204665], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[304 Not Modified], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[E7AF:061E:94B601:1393D8A:5BF69EF4], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4998], X-RateLimit-Reset=[1542892716], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:b371474) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/b3714747ccf8cde3e96a5efca4b6918aa1fe1863
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: FAILURE