Large tables can fail in typst due to resource use #13921
aghaynes started this conversation in Show and Tell
Description
I just had a case where a large table caused typst to fail. The dataset had approximately 1500 rows of text (a line listing of adverse events in a clinical trial). I had been using the same code for several months, and today it failed: while typst was compiling the .typ file to .pdf, the computer slowed down, the fans ran at full speed, and eventually RStudio crashed. Running it again with an eye on resource usage, typst was consuming practically everything the computer had available. It was typst itself (typst.exe) using the resources, not RStudio and not quarto.
I was using tinytable to produce the code; gt had already proven difficult for such large tables in typst.
Solution: the workaround I used was to split the table into two smaller chunks.
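The splitting step can be sketched in base R along these lines (the data frame name `ae` and the two-way split are assumptions for illustration; the post does not show its code):

```r
# Work around the typst resource blow-up by splitting a large data frame
# into two halves and rendering each half as its own table.
# `ae` stands in for the adverse-event line listing (name is an assumption).
ae <- data.frame(id = seq_len(1500), event = paste("event", seq_len(1500)))

half <- ceiling(nrow(ae) / 2)
chunks <- list(
  ae[seq_len(half), ],
  ae[(half + 1):nrow(ae), ]
)

# Each chunk can then be passed to tinytable separately,
# e.g. tinytable::tt(chunks[[1]]) and tinytable::tt(chunks[[2]]),
# so no single typst table gets the full 1500 rows.
```

Together the chunks still cover every row; only the per-table size typst has to lay out is reduced.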
While googling the issue, I came across this thread on the typst forum, which suggests that part of the problem may lie in how typst handles its own code. Although that thread concerns a dataset with far more rows than mine, clearly the same (or a very similar) issue can occur with considerably smaller datasets (perhaps mine had more formatting, or the typst code generated by tinytable carries more overhead than the other user's code). The thread also suggests another potential solution: passing a CSV to typst and letting it do the formatting.
Presumably, that would entail a workflow where typst loads the CSV itself and builds the table from the raw cells.
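On the typst side, a minimal sketch of that approach uses the built-in `csv()` loader, which returns an array of rows, each row an array of string cells (the file name and the header handling below are assumptions, not taken from the thread):

```typst
// Load the raw data; csv() returns an array of rows,
// each row an array of string cells (file name is an assumption).
#let data = csv("adverse_events.csv")

// Use the first row as the header and the rest as the body,
// spreading the cells into the table with `..`.
#table(
  columns: data.first().len(),
  table.header(..data.first()),
  ..data.slice(1).flatten(),
)
```

Whether this avoids the resource blow-up presumably depends on typst generating lighter internal code for spread cells than for 1500 rows of pre-generated table markup.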
But that, of course, involves writing native typst code rather than R...