
Failed

demonstrator_test.DemonstratorTestCase.test_topic_modeling (from nosetests)

Failing for the past 6 builds (Since Failed #366)
Took 13 ms.

Error Message

10 columns passed, passed data had 2 columns
-------------------- >> begin captured stdout << ---------------------
Accessing user input ...
1 text files.
1 topics.
1 iterations.
Using external stopwords list.
Tokenizing <FileStorage: 'document.txt' ('text/plain')> ...
Accessing external stopwords list ...
Determining hapax legomena ...
Removing stopwords and hapax legomena from corpus ...
Accessing corpus vocabulary ...
LDA training ...
Accessing topics ...

--------------------- >> end captured stdout << ----------------------
-------------------- >> begin captured logging << --------------------
preprocessing: INFO: Finding hapax legomena ...
preprocessing: INFO: Removing features ...
lda: INFO: n_documents: 1
lda: INFO: vocab_size: 2
lda: INFO: n_words: 100
lda: INFO: n_topics: 1
lda: INFO: n_iter: 1
lda: INFO: <0> log likelihood: -75
lda: INFO: <0> log likelihood: -75
--------------------- >> end captured logging << ---------------------
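
The error comes from the pandas DataFrame constructor: 10 column labels are requested ("Key 1" … "Key 10"), but the captured logging shows the test corpus ends up with vocab_size: 2, so each topic row only has 2 entries. A minimal sketch reproducing the mismatch with hypothetical values (the exact exception type depends on the pandas version):

    import pandas as pd

    # One topic row with only 2 keys, because the test vocabulary has 2 words.
    topics = [['word_a', 'word_b']]
    # Ten column labels are requested regardless of the row width.
    columns = ['Key ' + str(n + 1) for n in range(10)]

    pd.DataFrame(topics, index=['Topic 1'], columns=columns)
    # -> "10 columns passed, passed data had 2 columns"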

Stacktrace

  File "/usr/lib/python3.5/unittest/case.py", line 59, in testPartExecutor
    yield
  File "/usr/lib/python3.5/unittest/case.py", line 601, in run
    testMethod()
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/test/demonstrator_test.py", line 39, in test_topic_modeling
    resp = self.app.post('/upload', data=data)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 801, in post
    return self.open(*args, **kw)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/testing.py", line 127, in open
    follow_redirects=follow_redirects)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 764, in open
    response = self.run_wsgi_app(environ, buffered=buffered)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 677, in run_wsgi_app
    rv = run_wsgi_app(self.application, environ, buffered=buffered)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/werkzeug/test.py", line 884, in run_wsgi_app
    app_rv = app(environ, start_response)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1997, in __call__
    return self.wsgi_app(environ, start_response)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1985, in wsgi_app
    response = self.handle_exception(e)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1540, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1982, in wsgi_app
    response = self.full_dispatch_request()
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1614, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1517, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1612, in full_dispatch_request
    rv = self.dispatch_request()
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/flask/app.py", line 1598, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "demonstrator/demonstrator.py", line 110, in upload_file
    topics = preprocessing.lda2dataframe(model, corpus_vocabulary)
  File "/mnt/data/jenkins/workspace/DARIAH-Topics/dariah_topics/preprocessing.py", line 846, in lda2dataframe
    return pd.DataFrame(topics, index=['Topic ' + str(n+1) for n in range(len(topics))], columns=['Key ' + str(n+1) for n in range(num_keys)])
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/frame.py", line 314, in __init__
    arrays, columns = _to_arrays(data, columns, dtype=dtype)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/frame.py", line 5639, in _to_arrays
    dtype=dtype)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/frame.py", line 5696, in _list_to_arrays
    coerce_float=coerce_float)
  File "/mnt/data/jenkins/shiningpanda/jobs/62c67c92/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pandas/core/frame.py", line 5755, in _convert_object_array
    'columns' % (len(columns), len(content)))
10 columns passed, passed data had 2 columns
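
From the trace, preprocessing.lda2dataframe builds one DataFrame row per topic but labels the columns with a fixed num_keys (here 10); with a 2-word vocabulary each row holds only 2 entries, hence the mismatch. Below is a hedged sketch of one possible guard that caps the number of key columns at the vocabulary size. The function name, parameters, default value, and the use of the lda package's model.topic_word_ attribute are assumptions for illustration, not the project's actual implementation:

    import pandas as pd

    def lda2dataframe_sketch(model, vocab, num_keys=10):
        # Never request more key columns than the vocabulary can supply,
        # so the column count always matches the width of each topic row.
        num_keys = min(num_keys, len(vocab))
        topics = []
        for dist in model.topic_word_:            # one word distribution per topic
            # indices of the num_keys most probable words for this topic
            top = dist.argsort()[::-1][:num_keys]
            topics.append([vocab[i] for i in top])
        return pd.DataFrame(
            topics,
            index=['Topic ' + str(n + 1) for n in range(len(topics))],
            columns=['Key ' + str(n + 1) for n in range(num_keys)])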
