Failed
dariah_topics.postprocessing.save_tokenized_corpus (from nosetests)
Error Message
Failed doctest test for dariah_topics.postprocessing.save_tokenized_corpus
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 156, in save_tokenized_corpus

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 179, in dariah_topics.postprocessing.save_tokenized_corpus
Failed example:
    with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
        print(file.read())
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "<doctest dariah_topics.postprocessing.save_tokenized_corpus[4]>", line 1, in <module>
        with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
    FileNotFoundError: [Errno 2] No such file or directory: 'tmp/*.txt'
-------------------- >> begin captured logging << --------------------
dariah_topics.postprocessing: INFO: Saving tokenized corpus to tmp ...
dariah_topics.postprocessing: DEBUG: Current file: document_label
--------------------- >> end captured logging << ---------------------
Stacktrace
  File "/usr/lib/python3.5/unittest/case.py", line 59, in testPartExecutor
    yield
  File "/usr/lib/python3.5/unittest/case.py", line 601, in run
    testMethod()
  File "/usr/lib/python3.5/doctest.py", line 2190, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
Failed doctest test for dariah_topics.postprocessing.save_tokenized_corpus
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 156, in save_tokenized_corpus

----------------------------------------------------------------------
File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/postprocessing.py", line 179, in dariah_topics.postprocessing.save_tokenized_corpus
Failed example:
    with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
        print(file.read())
Exception raised:
    Traceback (most recent call last):
      File "/usr/lib/python3.5/doctest.py", line 1321, in __run
        compileflags, 1), test.globs)
      File "<doctest dariah_topics.postprocessing.save_tokenized_corpus[4]>", line 1, in <module>
        with open(os.path.join(path, '*.txt'), 'r', encoding='utf-8') as file:
    FileNotFoundError: [Errno 2] No such file or directory: 'tmp/*.txt'
-------------------- >> begin captured logging << --------------------
dariah_topics.postprocessing: INFO: Saving tokenized corpus to tmp ...
dariah_topics.postprocessing: DEBUG: Current file: document_label
--------------------- >> end captured logging << ---------------------
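The root cause of the failure: Python's builtin `open()` does not expand shell wildcards, so `open(os.path.join(path, '*.txt'))` looks for a file literally named `*.txt` and raises `FileNotFoundError`. A minimal sketch of how the doctest could be rewritten using `glob.glob` to expand the pattern instead; the filename `document_label.txt` is an assumption inferred from the captured `DEBUG: Current file: document_label` log line, not taken from the actual doctest:

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as path:
    # Stand-in for the file that save_tokenized_corpus would write;
    # the real name comes from the document label.
    with open(os.path.join(path, 'document_label.txt'), 'w', encoding='utf-8') as file:
        file.write('example token\n')

    # glob.glob expands the wildcard into a list of matching paths,
    # which open() can then handle one at a time.
    for filename in glob.glob(os.path.join(path, '*.txt')):
        with open(filename, 'r', encoding='utf-8') as file:
            print(file.read())
```

Alternatively, if the doctest knows the exact label, it can open `os.path.join(path, 'document_label.txt')` directly and avoid the wildcard entirely.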