Started by GitHub push by severinsimmler
Building remotely on Rechenknecht in workspace /mnt/data/jenkins/workspace/DARIAH-Topics
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/DARIAH-DE/Topics # timeout=10
Fetching upstream changes from https://github.com/DARIAH-DE/Topics
 > git --version # timeout=10
using GIT_ASKPASS to set credentials
 > git fetch --tags --progress https://github.com/DARIAH-DE/Topics +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/testing^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/testing^{commit} # timeout=10
Checking out Revision 4e871a26989d53a446078afa1164ea5e3e8f6071 (refs/remotes/origin/testing)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4e871a26989d53a446078afa1164ea5e3e8f6071
 > git rev-list a6996f585d361611f89802670c3e57000778eee9 # timeout=10
[DARIAH-Topics] $ /usr/bin/python3 /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenv.py /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9
Using base prefix '/usr'
New python executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python3
Also creating executable in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/bin/python
Installing setuptools, pip, wheel...done.
[DARIAH-Topics] $ /bin/sh -xe /tmp/shiningpanda3667867648081616999.sh
+ pip install -U pip
Requirement already up-to-date: pip in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages
+ pip install -U -r requirements-dev.txt
Obtaining file:///mnt/data/jenkins/workspace/DARIAH-Topics (from -r requirements.txt (line 1))
Collecting nose (from -r requirements-dev.txt (line 2))
  Using cached nose-1.3.7-py3-none-any.whl
Collecting nosexcover (from -r requirements-dev.txt (line 3))
  Using cached nosexcover-1.0.11-py2.py3-none-any.whl
Collecting jupyter (from -r requirements-dev.txt (line 4))
  Using cached jupyter-1.0.0-py2.py3-none-any.whl
Collecting pandas>=0.19.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pandas-0.20.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting regex>=2017.01.14 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting gensim>=0.13.2 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting matplotlib==1.5.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached matplotlib-1.5.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting numpy>=1.3 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached numpy-1.13.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting scipy>=0.7 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached scipy-0.19.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting flask>=0.11.1 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Flask-0.12.2-py2.py3-none-any.whl
Collecting wikipedia>=1.4.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting lxml>=3.6.4 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached lxml-3.8.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting wordcloud>=1.3.1 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting pyLDAvis>=2.0.0 (from dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting coverage>=3.4 (from nosexcover->-r requirements-dev.txt (line 3))
  Using cached coverage-4.4.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting qtconsole (from jupyter->-r requirements-dev.txt (line 4))
  Using cached qtconsole-4.3.0-py2.py3-none-any.whl
Collecting nbconvert (from jupyter->-r requirements-dev.txt (line 4))
  Using cached nbconvert-5.2.1-py2.py3-none-any.whl
Collecting jupyter-console (from jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_console-5.1.0-py2.py3-none-any.whl
Collecting notebook (from jupyter->-r requirements-dev.txt (line 4))
  Using cached notebook-5.0.0-py2.py3-none-any.whl
Collecting ipykernel (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipykernel-4.6.1-py3-none-any.whl
Collecting ipywidgets (from jupyter->-r requirements-dev.txt (line 4))
  Using cached ipywidgets-6.0.0-py2.py3-none-any.whl
Collecting python-dateutil>=2 (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached python_dateutil-2.6.0-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas>=0.19.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pytz-2017.2-py2.py3-none-any.whl
Collecting six>=1.5.0 (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached six-1.10.0-py2.py3-none-any.whl
Collecting smart-open>=1.2.1 (from gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting pyparsing!=2.0.0,!=2.0.4,!=2.1.2,>=1.5.6 (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Collecting cycler (from matplotlib==1.5.3->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached cycler-0.10.0-py2.py3-none-any.whl
Collecting click>=2.0 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached click-6.7-py2.py3-none-any.whl
Collecting Werkzeug>=0.7 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Werkzeug-0.12.2-py2.py3-none-any.whl
Collecting Jinja2>=2.4 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Jinja2-2.9.6-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting beautifulsoup4 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached beautifulsoup4-4.6.0-py3-none-any.whl
Collecting requests<3.0.0,>=2.0.0 (from wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached requests-2.18.1-py2.py3-none-any.whl
Collecting pillow (from wordcloud>=1.3.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached Pillow-4.1.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting pytest (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached pytest-3.1.2-py2.py3-none-any.whl
Requirement already up-to-date: wheel>=0.23.0 in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting joblib>=0.8.4 (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached joblib-0.11-py2.py3-none-any.whl
Collecting numexpr (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached numexpr-2.6.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting funcy (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting future (from pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting ipython-genutils (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython_genutils-0.2.0-py2.py3-none-any.whl
Collecting traitlets (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached traitlets-4.3.2-py2.py3-none-any.whl
Collecting jupyter-core (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_core-4.3.0-py2.py3-none-any.whl
Collecting jupyter-client>=4.1 (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached jupyter_client-5.1.0-py2.py3-none-any.whl
Collecting pygments (from qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached Pygments-2.2.0-py2.py3-none-any.whl
Collecting bleach (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached bleach-2.0.0-py2.py3-none-any.whl
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
Collecting mistune!=0.6 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached mistune-0.7.4-py2.py3-none-any.whl
Collecting entrypoints>=0.2.2 (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached entrypoints-0.2.3-py2.py3-none-any.whl
Collecting testpath (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached testpath-0.3.1-py2.py3-none-any.whl
Collecting nbformat (from nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached nbformat-4.3.0-py2.py3-none-any.whl
Collecting ipython (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached ipython-6.1.0-py3-none-any.whl
Collecting prompt-toolkit<2.0.0,>=1.0.0 (from jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached prompt_toolkit-1.0.14-py3-none-any.whl
Collecting tornado>=4 (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting terminado>=0.3.3; sys_platform != "win32" (from notebook->jupyter->-r requirements-dev.txt (line 4))
Collecting widgetsnbextension~=2.0.0 (from ipywidgets->jupyter->-r requirements-dev.txt (line 4))
  Using cached widgetsnbextension-2.0.0-py2.py3-none-any.whl
Collecting boto>=2.32 (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached boto-2.47.0-py2.py3-none-any.whl
Collecting bz2file (from smart-open>=1.2.1->gensim>=0.13.2->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting MarkupSafe>=0.23 (from Jinja2>=2.4->flask>=0.11.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting idna<2.6,>=2.5 (from requests<3.0.0,>=2.0.0->wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached idna-2.5-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.0.0->wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached certifi-2017.4.17-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests<3.0.0,>=2.0.0->wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3<1.22,>=1.21.1 (from requests<3.0.0,>=2.0.0->wikipedia>=1.4.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached urllib3-1.21.1-py2.py3-none-any.whl
Collecting olefile (from pillow->wordcloud>=1.3.1->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting py>=1.4.33 (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
  Using cached py-1.4.34-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages (from pytest->pyLDAvis>=2.0.0->dariah-topics==0.2.0.dev0->-r requirements.txt (line 1))
Collecting decorator (from traitlets->qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached decorator-4.0.11-py2.py3-none-any.whl
Collecting pyzmq>=13 (from jupyter-client>=4.1->qtconsole->jupyter->-r requirements-dev.txt (line 4))
  Using cached pyzmq-16.0.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting html5lib>=0.99999999 (from bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached html5lib-0.999999999-py2.py3-none-any.whl
Collecting jsonschema!=2.5.0,>=2.4 (from nbformat->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached jsonschema-2.6.0-py2.py3-none-any.whl
Collecting jedi>=0.10 (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached jedi-0.10.2-py2.py3-none-any.whl
Collecting pexpect; sys_platform != "win32" (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached pexpect-4.2.1-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
Collecting pickleshare (from ipython->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached pickleshare-0.7.4-py2.py3-none-any.whl
Collecting wcwidth (from prompt-toolkit<2.0.0,>=1.0.0->jupyter-console->jupyter->-r requirements-dev.txt (line 4))
  Using cached wcwidth-0.1.7-py2.py3-none-any.whl
Collecting ptyprocess (from terminado>=0.3.3; sys_platform != "win32"->notebook->jupyter->-r requirements-dev.txt (line 4))
  Using cached ptyprocess-0.5.2-py2.py3-none-any.whl
Collecting webencodings (from html5lib>=0.99999999->bleach->nbconvert->jupyter->-r requirements-dev.txt (line 4))
  Using cached webencodings-0.5.1-py2.py3-none-any.whl
Installing collected packages: nose, coverage, nosexcover, ipython-genutils, six, decorator, traitlets, jupyter-core, pyzmq, python-dateutil, jupyter-client, wcwidth, prompt-toolkit, jedi, ptyprocess, pexpect, pygments, simplegeneric, pickleshare, ipython, tornado, ipykernel, qtconsole, webencodings, html5lib, bleach, MarkupSafe, Jinja2, pandocfilters, mistune, entrypoints, testpath, jsonschema, nbformat, nbconvert, jupyter-console, terminado, notebook, widgetsnbextension, ipywidgets, jupyter, numpy, pytz, pandas, regex, idna, certifi, chardet, urllib3, requests, boto, bz2file, smart-open, scipy, gensim, pyparsing, cycler, matplotlib, click, Werkzeug, itsdangerous, flask, beautifulsoup4, wikipedia, lxml, olefile, pillow, wordcloud, py, pytest, joblib, numexpr, funcy, future, pyLDAvis, dariah-topics
  Running setup.py develop for dariah-topics
Successfully installed Jinja2-2.9.6 MarkupSafe-1.0 Werkzeug-0.12.2 beautifulsoup4-4.6.0 bleach-2.0.0 boto-2.47.0 bz2file-0.98 certifi-2017.4.17 chardet-3.0.4 click-6.7 coverage-4.4.1 cycler-0.10.0 dariah-topics decorator-4.0.11 entrypoints-0.2.3 flask-0.12.2 funcy-1.8 future-0.16.0 gensim-2.2.0 html5lib-0.999999999 idna-2.5 ipykernel-4.6.1 ipython-6.1.0 ipython-genutils-0.2.0 ipywidgets-6.0.0 itsdangerous-0.24 jedi-0.10.2 joblib-0.11 jsonschema-2.6.0 jupyter-1.0.0 jupyter-client-5.1.0 jupyter-console-5.1.0 jupyter-core-4.3.0 lxml-3.8.0 matplotlib-1.5.3 mistune-0.7.4 nbconvert-5.2.1 nbformat-4.3.0 nose-1.3.7 nosexcover-1.0.11 notebook-5.0.0 numexpr-2.6.2 numpy-1.13.0 olefile-0.44 pandas-0.20.2 pandocfilters-1.4.1 pexpect-4.2.1 pickleshare-0.7.4 pillow-4.1.1 prompt-toolkit-1.0.14 ptyprocess-0.5.2 py-1.4.34 pyLDAvis-2.1.1 pygments-2.2.0 pyparsing-2.2.0 pytest-3.1.2 python-dateutil-2.6.0 pytz-2017.2 pyzmq-16.0.2 qtconsole-4.3.0 regex-2017.6.23 requests-2.18.1 scipy-0.19.1 simplegeneric-0.8.1 six-1.10.0 smart-open-1.5.3 terminado-0.6 testpath-0.3.1 tornado-4.5.1 traitlets-4.3.2 urllib3-1.21.1 wcwidth-0.1.7 webencodings-0.5.1 widgetsnbextension-2.0.0 wikipedia-1.4.0 wordcloud-1.3.1
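The resolver output above implies a two-layer requirements setup: requirements-dev.txt pulls in the runtime dependencies via requirements.txt, whose line 1 is an editable install of the project itself (hence "Obtaining file:///..." and "Running setup.py develop"), and then adds the test and notebook tooling on lines 2-4. A minimal sketch consistent with this log; the exact file contents are an assumption:

    # requirements.txt (sketch; line 1 installs the package itself in editable mode)
    -e .

    # requirements-dev.txt (sketch)
    -r requirements.txt   # line 1: runtime dependencies from above
    nose                  # line 2: test runner
    nosexcover            # line 3: Cobertura-style coverage plugin for nose
    jupyter               # line 4: needed for the notebook integration test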
+ ./setup.py sdist bdist_wheel
running sdist
running egg_info
writing dariah_topics.egg-info/PKG-INFO
writing dependency_links to dariah_topics.egg-info/dependency_links.txt
writing top-level names to dariah_topics.egg-info/top_level.txt
writing requirements to dariah_topics.egg-info/requires.txt
reading manifest file 'dariah_topics.egg-info/SOURCES.txt'
writing manifest file 'dariah_topics.egg-info/SOURCES.txt'
/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/setuptools/dist.py:336: UserWarning: Normalizing '0.2.0dev0' to '0.2.0.dev0'
  normalized_version,
warning: sdist: standard file not found: should have one of README, README.rst, README.txt
running check
warning: check: missing required meta-data: url
creating dariah_topics-0.2.0.dev0
creating dariah_topics-0.2.0.dev0/dariah_topics
creating dariah_topics-0.2.0.dev0/dariah_topics.egg-info
creating dariah_topics-0.2.0.dev0/test
copying files to dariah_topics-0.2.0.dev0...
copying setup.cfg -> dariah_topics-0.2.0.dev0
copying setup.py -> dariah_topics-0.2.0.dev0
copying dariah_topics/__init__.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/doclist.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/evaluation.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/mallet.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/meta.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/preprocessing.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics/visualization.py -> dariah_topics-0.2.0.dev0/dariah_topics
copying dariah_topics.egg-info/PKG-INFO -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/SOURCES.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/dependency_links.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/requires.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying dariah_topics.egg-info/top_level.txt -> dariah_topics-0.2.0.dev0/dariah_topics.egg-info
copying test/test_fuzzy_segmenting.py -> dariah_topics-0.2.0.dev0/test
Writing dariah_topics-0.2.0.dev0/setup.cfg
Creating tar archive
removing 'dariah_topics-0.2.0.dev0' (and everything under it)
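The sdist step emits three packaging warnings: the version string '0.2.0dev0' is not in PEP 440 form, no top-level README/README.rst/README.txt exists, and the 'url' metadata field is missing. A hedged sketch of setup.py metadata that would silence them; everything beyond the warnings themselves (field values, find_packages usage) is an assumption:

    # setup.py (sketch, not the project's actual file)
    from setuptools import setup, find_packages

    setup(
        name='dariah_topics',
        version='0.2.0.dev0',  # PEP 440 form; avoids the '0.2.0dev0' normalization warning
        url='https://github.com/DARIAH-DE/Topics',  # silences "missing required meta-data: url"
        packages=find_packages(exclude=['test']),
    )
    # Adding a top-level README.rst (or README/README.txt) would remove the
    # "standard file not found" sdist warning.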
running bdist_wheel
running build
running build_py
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/preprocessing.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/model_creation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/meta.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/mallet.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/evaluation.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/__init__.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/doclist.py -> build/bdist.linux-x86_64/wheel/dariah_topics
copying build/lib/dariah_topics/visualization.py -> build/bdist.linux-x86_64/wheel/dariah_topics
running install_egg_info
Copying dariah_topics.egg-info to build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0-py3.5.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/dariah_topics-0.2.0.dev0.dist-info/WHEEL
+ nosetests --cover-html
..............................E.........
======================================================================
ERROR: Integration test notebook (via Jupyter)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/integration_test.py", line 59, in jupyter_integration_test
    stderr=STDOUT, universal_newlines=True)
  File "/usr/lib/python3.5/subprocess.py", line 316, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 398, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['jupyter-nbconvert', '--execute', '--log-level=ERROR', '--ExecutePreprocessor.iopub_timeout=30', '--ExecutePreprocessor.timeout=None', '/mnt/data/jenkins/workspace/DARIAH-Topics/Mallet.ipynb']' returned non-zero exit status 255
-------------------- >> begin captured logging << --------------------
root: ERROR: This application is used to convert notebook files (*.ipynb) to various other formats.

WARNING: THE COMMANDLINE INTERFACE MAY CHANGE IN FUTURE RELEASES.

Options
-------
Arguments that take values are actually convenience aliases to full Configurables, whose aliases are listed on the help line. For more information on full configurables, see '--help-all'.

--execute
    Execute the notebook prior to export.
--no-prompt
    Exclude input and output prompts from converted document.
--stdout
    Write notebook output to stdout instead of files.
--allow-errors
    Continue notebook execution even if one of the cells throws an error and include the error message in the cell output (the default behaviour is to abort conversion). This flag is only relevant if '--execute' was specified, too.
--debug
    set log level to logging.DEBUG (maximize logging output)
--inplace
    Run nbconvert in place, overwriting the existing notebook (only relevant when converting to notebook format)
--stdin
    read a single notebook file from stdin. Write the resulting notebook with default basename 'notebook.*'
--generate-config
    generate default config file
-y
    Answer yes to any questions instead of prompting.
--template=<Unicode> (TemplateExporter.template_file)
    Default: ''
    Name of the template file to use
--writer=<DottedObjectName> (NbConvertApp.writer_class)
    Default: 'FilesWriter'
    Writer class used to write the results of the conversion
--nbformat=<Enum> (NotebookExporter.nbformat_version)
    Default: 4
    Choices: [1, 2, 3, 4]
    The nbformat version to write. Use this to downgrade notebooks.
--reveal-prefix=<Unicode> (SlidesExporter.reveal_url_prefix)
    Default: ''
    The URL prefix for reveal.js. This can be a a relative URL for a local copy of reveal.js, or point to a CDN. For speaker notes to work, a local reveal.js prefix must be used.
--config=<Unicode> (JupyterApp.config_file)
    Default: ''
    Full path of a config file.
--output=<Unicode> (NbConvertApp.output_base)
    Default: ''
    overwrite base name use for output files. can only be used when converting one notebook at a time.
--log-level=<Enum> (Application.log_level)
    Default: 30
    Choices: (0, 10, 20, 30, 40, 50, 'DEBUG', 'INFO', 'WARN', 'ERROR', 'CRITICAL')
    Set the log level by value or name.
--output-dir=<Unicode> (FilesWriter.build_directory)
    Default: ''
    Directory to write output(s) to. Defaults to output to the directory of each notebook. To recover previous default behaviour (outputting to the current working directory) use . as the flag value.
--to=<Unicode> (NbConvertApp.export_format)
    Default: 'html'
    The export format to be used, either one of the built-in formats, or a dotted object name that represents the import path for an `Exporter` class
--post=<DottedOrNone> (NbConvertApp.postprocessor_class)
    Default: ''
    PostProcessor class used to write the results of the conversion

To see all available configurables, use `--help-all`

Examples
--------
The simplest way to use nbconvert is

> jupyter nbconvert mynotebook.ipynb

which will convert mynotebook.ipynb to the default format (probably HTML).

You can specify the export format with `--to`. Options include ['asciidoc', 'custom', 'html', 'latex', 'markdown', 'notebook', 'pdf', 'python', 'rst', 'script', 'slides']

> jupyter nbconvert --to latex mynotebook.ipynb

Both HTML and LaTeX support multiple output templates. LaTeX includes 'base', 'article' and 'report'. HTML includes 'basic' and 'full'. You can specify the flavor of the format used.

> jupyter nbconvert --to html --template basic mynotebook.ipynb

You can also pipe the output to stdout, rather than a file

> jupyter nbconvert mynotebook.ipynb --stdout

PDF is generated via latex

> jupyter nbconvert mynotebook.ipynb --to pdf

You can get (and serve) a Reveal.js-powered slideshow

> jupyter nbconvert myslides.ipynb --to slides --post serve

Multiple notebooks can be given at the command line in a couple of different ways:

> jupyter nbconvert notebook*.ipynb
> jupyter nbconvert notebook1.ipynb notebook2.ipynb

or you can specify the notebooks list in a config file, containing::

    c.NbConvertApp.notebooks = ["my_notebook.ipynb"]

> jupyter nbconvert --config mycfg.py
--------------------- >> end captured logging << ---------------------

Name                             Stmts   Miss  Cover
----------------------------------------------------
dariah_topics/__init__.py            0      0   100%
dariah_topics/doclist.py            90      7    92%
dariah_topics/evaluation.py        107     52    51%
dariah_topics/mallet.py            263    206    22%
dariah_topics/meta.py               57     38    33%
dariah_topics/preprocessing.py     226     16    93%
dariah_topics/visualization.py     204    171    16%
----------------------------------------------------
TOTAL                              947    490    48%
----------------------------------------------------------------------
Ran 40 tests in 49.515s

FAILED (errors=1)
Build step 'Virtualenv Builder' marked build as failure
Recording test results
Skipping Cobertura coverage report as build was not UNSTABLE or better ...
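The traceback points at test/integration_test.py line 59, where the notebook is executed through subprocess.check_output. A minimal sketch of such a test, reconstructed from the failing command in the CalledProcessError above; the actual test body is an assumption:

    # integration_test.py (sketch) -- reconstructed from the log, not the real file
    from subprocess import check_output, STDOUT

    def jupyter_integration_test():
        """Integration test notebook (via Jupyter)."""
        # check_output raises CalledProcessError on a non-zero exit status,
        # which nose then reports as the ERROR above.
        check_output(['jupyter-nbconvert', '--execute',
                      '--log-level=ERROR',
                      '--ExecutePreprocessor.iopub_timeout=30',
                      '--ExecutePreprocessor.timeout=None',
                      'Mallet.ipynb'],
                     stderr=STDOUT, universal_newlines=True)

The captured logging shows nbconvert printing its usage text before exiting with status 255, which is what it does when it cannot parse its command line; the literal '--ExecutePreprocessor.timeout=None' is a plausible suspect under nbconvert 5.2.1, though the log does not pinpoint the rejected option.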
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@5b7f28cc[description=A python library for topic modeling.,homepage=,name=Topics,license=<null>,fork=true,size=111994,milestones={},language=Jupyter Notebook,commits={},url=https://api.github.com/repos/DARIAH-DE/Topics,id=69341969]] (sha:4e871a2) with context:DARIAH-Topics
Setting commit status on GitHub for https://github.com/DARIAH-DE/Topics/commit/4e871a26989d53a446078afa1164ea5e3e8f6071
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Notifying upstream projects of job completion
[ci-game] evaluating rule: Build result
[ci-game] evaluating rule: Increased number of passed tests
[ci-game] evaluating rule: Decreased number of passed tests
[ci-game] evaluating rule: Increased number of failed tests
[ci-game] evaluating rule: Decreased number of failed tests
[ci-game] evaluating rule: Increased number of skipped tests
[ci-game] evaluating rule: Decreased number of skipped tests
[ci-game] evaluating rule: Open HIGH priority tasks
[ci-game] evaluating rule: Open NORMAL priority tasks
[ci-game] evaluating rule: Open LOW priority tasks
[ci-game] evaluating rule: Changed number of compiler warnings
Finished: FAILURE
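To debug the exit-status-255 failure outside Jenkins, the same notebook execution can be driven through nbconvert's Python API rather than the CLI, so parsing and execution errors surface as ordinary tracebacks. A minimal sketch; the working directory and kernel name are assumptions:

    # Run Mallet.ipynb the way `jupyter-nbconvert --execute` does, but from Python.
    import nbformat
    from nbconvert.preprocessors import ExecutePreprocessor

    nb = nbformat.read('Mallet.ipynb', as_version=4)
    ep = ExecutePreprocessor(timeout=None, iopub_timeout=30, kernel_name='python3')
    ep.preprocess(nb, {'metadata': {'path': '.'}})  # raises on the first failing cell

Note that timeout=None is unproblematic here because it is passed as a Python value, not parsed from a command-line string.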