If a user starts running a lot of big-memory computations whose results are being cached on the Python side (as Pandas DataFrames), memory usage will quickly blow up. In order to keep running computations, it will likely be necessary to free up some of this used memory.
One way to do this would be to globally track all existing DataFrames and their memory usage, and evict their cached results as total usage approaches the system's limit. Of course, we could still keep a cached copy of the results in SQLite for future use (see the sketch below).
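A minimal sketch of what such a memory-aware cache could look like. The class name `DataFrameCache`, the 1 GiB budget, the LRU eviction order, and the `cache.sqlite` path are all illustrative assumptions, not part of any existing code:

```python
import sqlite3
from collections import OrderedDict
from typing import Optional

import pandas as pd


class DataFrameCache:
    """Tracks cached DataFrames and evicts the least recently used
    ones once their combined memory footprint exceeds a budget."""

    def __init__(self, budget_bytes: int = 1 << 30,   # hypothetical 1 GiB cap
                 db_path: str = "cache.sqlite"):      # hypothetical spill target
        self._budget = budget_bytes
        self._entries: "OrderedDict[str, pd.DataFrame]" = OrderedDict()
        self._db_path = db_path

    def put(self, key: str, df: pd.DataFrame) -> None:
        self._entries[key] = df
        self._entries.move_to_end(key)      # mark as most recently used
        self._evict_if_needed()

    def get(self, key: str) -> Optional[pd.DataFrame]:
        df = self._entries.get(key)
        if df is not None:
            self._entries.move_to_end(key)  # refresh LRU order
        return df

    def _total_bytes(self) -> int:
        # deep=True also counts object-dtype columns (e.g. strings)
        return sum(df.memory_usage(deep=True).sum()
                   for df in self._entries.values())

    def _evict_if_needed(self) -> None:
        while self._entries and self._total_bytes() > self._budget:
            key, df = self._entries.popitem(last=False)  # drop oldest entry
            # Spill the evicted result to SQLite so it can be reloaded
            # later instead of recomputed.
            with sqlite3.connect(self._db_path) as conn:
                df.to_sql(key, conn, if_exists="replace", index=False)
```

Instead of a fixed budget, the eviction threshold could also be derived from actual system memory (e.g. via `psutil.virtual_memory()`), which would better match the goal of reacting to the system's limit.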