AttributeError: module 'dask' has no attribute 'delayed'
When I run the following code:
ens_model = EnsembleTopics(n_components=20, n_starts=8, n_jobs=2).fit(data_vec)
I get the error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<timed exec> in <module>
d:\pycharmprojects\biclustering\venv\lib\site-packages\enstop\enstop_.py in fit(self, X, y)
719 self
720 """
--> 721 self.fit_transform(X)
722 return self
723
d:\pycharmprojects\biclustering\venv\lib\site-packages\enstop\enstop_.py in fit_transform(self, X, y)
763 self.alpha,
764 self.solver,
--> 765 self.random_state,
766 )
767 self.components_ = V
d:\pycharmprojects\biclustering\venv\lib\site-packages\enstop\enstop_.py in ensemble_fit(X, estimated_n_topics, model, init, min_samples, min_cluster_size, n_starts, n_jobs, parallelism, topic_combination, n_iter, n_iter_per_test, tolerance, e_step_thresh, lift_factor, beta_loss, alpha, solver, random_state)
507 alpha=alpha,
508 solver=solver,
--> 509 random_state=random_state,
510 )
511
d:\pycharmprojects\biclustering\venv\lib\site-packages\enstop\enstop_.py in ensemble_of_topics(X, k, model, n_jobs, n_runs, parallelism, **kwargs)
181
182 if parallelism == "dask":
--> 183 dask_topics = dask.delayed(create_topics)
184 staged_topics = [dask_topics(X, k, **kwargs) for i in range(n_runs)]
185 topics = dask.compute(*staged_topics, scheduler="threads", num_workers=n_jobs)
AttributeError: module 'dask' has no attribute 'delayed'
data_vec is the sparse document-term matrix produced by CountVectorizer:
data_vec = CountVectorizer().fit_transform(data)
I cannot run any version of EnsembleTopics.
Could you please help? I am using Python 3.7.5 x64 on Windows 10.
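For reference, here is a self-contained sketch of the reproduction. The corpus below is a placeholder (the original data is not shown in the report), and the import paths are the usual ones for scikit-learn and enstop:

from sklearn.feature_extraction.text import CountVectorizer
from enstop import EnsembleTopics

# Placeholder corpus; replace with your own documents. With a working dask
# install you would also want a corpus large enough to support 20 topics.
data = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "dogs and cats make good pets",
]

data_vec = CountVectorizer().fit_transform(data)

# This is the call that raises the AttributeError in the traceback above.
ens_model = EnsembleTopics(n_components=20, n_starts=8, n_jobs=2).fit(data_vec)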
I saw the same error (AttributeError: module 'dask' has no attribute 'delayed'). Running pip install dask[complete] seems to move me past it, but now I am seeing: Terminating: Nested parallel kernel launch detected, the workqueue threading layer does not supported nested parallelism. Try the TBB threading layer.
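If you want to keep parallelism, that second error is Numba complaining about its default workqueue threading layer. A minimal sketch of the suggested switch to TBB, assuming the tbb package is installed (pip install tbb), is to select the threading layer before enstop (and therefore Numba) is imported:

import os

# Ask Numba to use the TBB threading layer instead of workqueue.
# This must be set before any Numba-compiled code runs, so do it
# before importing enstop.
os.environ["NUMBA_THREADING_LAYER"] = "tbb"

from enstop import EnsembleTopics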
My job was small, so I just switched off parallelism with parallelism="none".
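A sketch of that workaround, applied to the call from the original report (parallelism is the argument visible in the ensemble_fit signature in the traceback):

# Turn off enstop's internal parallelism so the dask and nested-Numba
# code paths are never taken.
ens_model = EnsembleTopics(
    n_components=20, n_starts=8, n_jobs=2, parallelism="none"
).fit(data_vec)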