Confirming that CHOLMOD runs on GPU #945
Hi! I'm trying to use CHOLMOD as a linear solver with GPU support. However, when I run the demo scripts on input matrices with over 1,000,000 non-zeros, I don't see any corresponding GPU activity in nvidia-smi.
Is there something I might be missing in the setup, or an additional step needed to enable GPU usage?
Replies: 1 comment
Yes, it does use the GPU, but not for the int (32-bit) version (which is what cholmod_di_simple uses). It's only used for the version with 64-bit integers.