I often push up against the size limit for the direct solvers. I've found that the limit in SOLIDWORKS 2020 is higher than in 2018, but I don't know what it is without iterating on mesh size.

Is the problem size limit documented anywhere?

Also, RAM is relatively cheap today. Can't the limit be pushed up dramatically in the compiled code, or made a run-time flag set by the user (with understood trade-offs)? I remember at my first job often going back to the lead developer to ask for custom compiled builds, pre-allocating more fixed space to one matrix over another - but that code was in FORTRAN 77 and our workstations had a 'generous' 32 MB of RAM.

It has nothing to do with RAM. The number of rows and columns allowed in an array is limited by the maximum value of the integer type used for indexing, typically a signed 32-bit integer: 2^31 - 1, which is 2,147,483,647.
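To make the overflow concrete, here is a small hypothetical sketch (plain Python, not SOLIDWORKS internals) of how the non-zero count of a sparse factorization can exceed a signed 32-bit index even when the matrix itself fits in RAM. The DOF count and fill-in figure are illustrative assumptions:

```python
import ctypes

INT32_MAX = 2**31 - 1  # 2,147,483,647: largest signed 32-bit integer

# Illustrative assumptions, not actual SOLIDWORKS solver figures:
dof = 3_000_000          # degrees of freedom in the model
fill_per_row = 800       # assumed average non-zeros per row of the factor

nnz_factor = dof * fill_per_row      # non-zeros the solver must index
print(f"factor non-zeros:  {nnz_factor:,}")
print(f"exceeds int32 max: {nnz_factor > INT32_MAX}")

# If the solver stored this count in a 32-bit integer, it would wrap:
wrapped = ctypes.c_int32(nnz_factor).value
print(f"after 32-bit wrap: {wrapped:,}")
```

The point is that the matrix order (rows and columns) can be well under the limit while a derived quantity, such as the fill-in of the factor, silently blows past it.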

From the Knowledge Base:

S-062259

Question: Why do I receive a "Problem is too large for Direct Sparse solver. Use the Iterative Solver" error message despite having sufficient RAM?

Answer: If the problem becomes too large, the Direct Sparse solver may overflow some integer limits while performing its calculations. Therefore, even if you have enough RAM, you may not be able to run your simulation using the Direct Sparse solver.

The number of degrees of freedom that triggers the error is not a fixed number.

This number depends on many factors: the element type, whether the structure is bulky or slender, and the version of SOLIDWORKS you are using.

The error can occur when a problem grows beyond roughly 1.5 to 6 million degrees of freedom.
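As a rough way to see where a given mesh lands relative to that range, you can estimate DOF from the node count. This is a hand-rolled sketch assuming 3 translational DOF per node, which is typical for solid elements:

```python
def estimate_dof(num_nodes: int, dof_per_node: int = 3) -> int:
    """Back-of-envelope DOF estimate: nodes x DOF per node.

    3 DOF/node applies to solid elements (translations only);
    shell elements also carry rotational DOF, so use 6 for those.
    """
    return num_nodes * dof_per_node

# A solid mesh with ~700,000 nodes already sits inside the 1.5-6 M window:
print(estimate_dof(700_000))
```

Constraints such as fixtures reduce the active DOF somewhat, so treat this strictly as an order-of-magnitude check.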

Possible solutions to this problem are:

1. Using a different solver, such as FFEPlus or the Large Problem Direct Sparse solver.

2. Reducing the number of degrees of freedom in the model.