ValueError: unsupported pickle protocol: 5 #33

@AlexanderAivazidis

Hi,

I get the following error whenever I run CellDancer with more than one job (n_jobs > 1). I have already verified that enough memory is available, so I don't think memory pressure is the cause:


--------------------------------------------------------------------------------
LokyProcess-1 failed with traceback: 
--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/site-packages/joblib/externals/loky/backend/popen_loky_posix.py", line 197, in <module>
    prep_data = pickle.load(from_parent)
ValueError: unsupported pickle protocol: 5


--------------------------------------------------------------------------------


--------------------------------------------------------------------------------
LokyProcess-2 failed with traceback: 
--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/site-packages/joblib/externals/loky/backend/popen_loky_posix.py", line 197, in <module>
    prep_data = pickle.load(from_parent)
ValueError: unsupported pickle protocol: 5


--------------------------------------------------------------------------------
---------------------------------------------------------------------------
TerminatedWorkerError                     Traceback (most recent call last)
/tmp/ipykernel_2290520/1755510333.py in <module>
     15                                        gene_list= np.array(input_data['gene_name']),
     16                                        permutation_ratio=0.125,
---> 17                                        n_jobs=2)
     18     # compute cell velocity
     19     cellDancer_df=cd.compute_cell_velocity(cellDancer_df=cellDancer_df, projection_neighbor_choice='gene', 

/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/site-packages/celldancer/velocity_estimation.py in velocity(cell_type_u_s, gene_list, max_epoches, check_val_every_n_epoch, patience, learning_rate, dt, n_neighbors, permutation_ratio, speed_up, norm_u_s, norm_cell_distribution, loss_func, n_jobs, save_path)
    787             save_path=save_path,
    788             norm_u_s=norm_u_s)
--> 789         for data_index in range(0,len(gene_list_buring)))
    790 
    791     # clean directory

/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/site-packages/joblib/parallel.py in __call__(self, iterable)
   1054 
   1055             with self._backend.retrieval_context():
-> 1056                 self.retrieve()
   1057             # Make sure that we get a last message telling us we are done
   1058             elapsed_time = time.time() - self._start_time

/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/site-packages/joblib/parallel.py in retrieve(self)
    933             try:
    934                 if getattr(self._backend, 'supports_timeout', False):
--> 935                     self._output.extend(job.get(timeout=self.timeout))
    936                 else:
    937                     self._output.extend(job.get())

/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/site-packages/joblib/_parallel_backends.py in wrap_future_result(future, timeout)
    540         AsyncResults.get from multiprocessing."""
    541         try:
--> 542             return future.result(timeout=timeout)
    543         except CfTimeoutError as e:
    544             raise TimeoutError from e

/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
    433                 raise CancelledError()
    434             elif self._state == FINISHED:
--> 435                 return self.__get_result()
    436             else:
    437                 raise TimeoutError()

/nfs/team283/aa16/software/miniconda3/envs/cellDancer/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
    382     def __get_result(self):
    383         if self._exception:
--> 384             raise self._exception
    385         else:
    386             return self._result

TerminatedWorkerError: A worker process managed by the executor was unexpectedly terminated. This could be caused by a segmentation fault while calling the function or by an excessive memory usage causing the Operating System to kill the worker.

The exit codes of the workers are {EXIT(1)}
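For context, this looks like a pickle protocol mismatch rather than a memory problem: protocol 5 (PEP 574) was only added to the standard library in Python 3.8, while the traceback shows a Python 3.7 environment, where the worker's stdlib `pickle.load` tops out at protocol 4. If the parent process serializes with protocol 5 (e.g. because the `pickle5` backport or a newer cloudpickle is picked up), a 3.7 worker cannot read the payload. A minimal sketch of the version dependence (the sample dict is just an illustration, not CellDancer data):

```python
import pickle
import sys

# Protocol 5 (out-of-band buffers, PEP 574) entered the stdlib in Python 3.8.
# On Python 3.7 the stdlib supports at most protocol 4, so a 3.7 worker
# deserializing with plain pickle.load cannot read a protocol-5 payload.
data = {"gene": "Abc", "values": [1.0, 2.0]}

# Protocol 4 round-trips on any Python >= 3.4, including the 3.7 workers.
payload_v4 = pickle.dumps(data, protocol=4)
print(pickle.loads(payload_v4) == data)

# Protocol 5 only round-trips where the interpreter (or the pickle5
# backport) understands it.
if sys.version_info >= (3, 8):
    payload_v5 = pickle.dumps(data, protocol=5)
    print(pickle.loads(payload_v5) == data)
else:
    print("stdlib pickle on this interpreter stops at protocol",
          pickle.HIGHEST_PROTOCOL)
```

If this diagnosis matches, possible workarounds (untested here) would be upgrading the conda environment to Python ≥ 3.8, removing the `pickle5` backport from the parent environment so joblib falls back to protocol 4, or running with n_jobs=1 as a stopgap.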

