My graphing code had a few bugs in it, so I essentially started from scratch to clean it up. While doing so, I noticed some weird behavior in the matrix solver.
To start with, I was curious how the time to solve increases with n. Theoretically, if we double n, we get 4 times as many elements and have to perform twice as many row reductions, so we're already at 2^3 = 8 times longer, i.e., O(n^3) element operations. Of course, with a larger matrix, the size of the operands increases as well (the bottom-right corner contains n^(n-1)). Looking solely at the bottom-right element, the size of the operands in bits grows as n*log(n). Additions and subtractions will therefore take n*log(n) times longer. Multiplications take longer still: depending on the math library, multiplying two b-bit numbers can take up to b^2 time, though b^1.585 (Karatsuba) or roughly b*log(b) (FFT-based) is possible. I'm not sure what SAGE uses. And if it's b^2, then the per-multiplication cost is really (n*log(n))^2, because the operand size is n*log(n). Anyway, I was hoping SAGE's library was at least b^1.5-ish, if not faster.
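As a quick sanity check of that operand-size claim, here's a bit of plain Python (nothing Sage-specific) that just evaluates (n-1)*log2(n), the bit length of n^(n-1):

```python
import math

# Bit length of the bottom-right entry, n^(n-1): about (n-1)*log2(n) bits,
# i.e., operand size grows roughly like n*log(n).
for n in (50, 100, 200, 400):
    bits = (n - 1) * math.log2(n)
    print(f"n = {n:4d}: ~{bits:8.0f} bits")
```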
Stringing it all together (n^3 element operations, with each multiplication costing around (n*log(n))^1.5 under the optimistic assumption), I was expecting something in the ballpark of O(n^4.5) time, ignoring log factors. Interestingly, it looks like it's at least O(n^5), possibly slightly more.
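One way to pin down the empirical exponent is to fit a line to log(time) versus log(n); the slope is the exponent. Here's a sketch of that fit in plain Python, with made-up placeholder timings rather than my actual measurements:

```python
import math

# Hypothetical (n, seconds) pairs -- substitute real measurements.
timings = [(50, 1.2), (100, 41.0), (200, 1350.0)]

xs = [math.log(n) for n, _ in timings]
ys = [math.log(t) for _, t in timings]
xbar = sum(xs) / len(xs)
ybar = sum(ys) / len(ys)

# Least-squares slope of log(t) vs. log(n): the empirical exponent.
num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
den = sum((x - xbar) ** 2 for x in xs)
print(f"empirical exponent ~ {num / den:.2f}")  # ~5 would mean O(n^5)
```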
But even more interestingly, for n approximately 10+45*k (55, 100, 145, 190, 235, ...), the solve time suddenly jumps by a significant amount. Increasing n by 1 can suddenly make the solve time increase by 30-40%.
I was so perplexed the first time I noticed this that I spent several hours playing around with it, trying to figure out the pattern. I'm still not sure if this is an artifact of SAGE internals, or if it's a property of the matrix method itself. Andrew found repetitive ratios every 7th term, if I recall correctly, so perhaps there's another pattern every 44th or 45th term.
So I graphed the fifth root of the solve times (because a linear graph would mean O(n^5) solve time). Here's what I got:
Notice the jumps at roughly evenly spaced intervals. The graph stops around n = 280 because beyond that point the solve times were too long for me to verify that the jumps continue regularly.
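For anyone who wants to make the same kind of plot, here's a minimal matplotlib sketch; the `results` values are placeholders, not my measurements:

```python
import matplotlib.pyplot as plt

# Hypothetical {n: seconds} data -- substitute real solve times.
results = {50: 1.5, 55: 2.1, 56: 2.9, 100: 40.5, 101: 55.0, 145: 610.0}

ns = sorted(results)
roots = [results[n] ** 0.2 for n in ns]  # fifth root of each solve time

plt.plot(ns, roots, "o-")
plt.xlabel("n")
plt.ylabel("(solve time)^(1/5)")
plt.title("Fifth root of solve time vs. n")
plt.show()
```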
Now, here's a detailed graph of each data point. There are gaps where I judged, heuristically, that the results wouldn't be interesting. In the transition ranges, I ran every value of n, each up to 10 times, to get stable timings for comparison (a sketch of the timing loop follows).
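The timing loop itself is simple: repeat the solve and keep the best time, which filters out noise from background load. The `solve_system` function below is a hypothetical stand-in; in practice it would build the actual matrix and solve it in SAGE:

```python
import time

def solve_system(n):
    # Hypothetical stand-in for the real work; replace with the actual
    # matrix construction and solve in SAGE.
    sum(i * i for i in range(n * n))

def time_solve(n, repeats=10):
    """Best (minimum) wall-clock time over several runs."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        solve_system(n)
        best = min(best, time.perf_counter() - t0)
    return best

print(time_solve(100))
```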
The first thing to notice is how chaotic it is. Near the transition points (55, 100, 145, 189, etc.), the time might increase by 30% when I add a term, then decrease by 15% when I add another. I ran multiple tests, so it wasn't just a random effect of processor load or whatever.
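To spot the jumps programmatically, one can compare the ratio of consecutive solve times against what smooth O(n^5) growth would predict, namely ((n+1)/n)^5. A sketch with placeholder data:

```python
# Hypothetical {n: seconds} data -- substitute real solve times.
times = {54: 1.90, 55: 2.05, 56: 2.85, 57: 2.50}

for n in sorted(times)[:-1]:
    if n + 1 not in times:
        continue
    ratio = times[n + 1] / times[n]
    smooth = ((n + 1) / n) ** 5  # expected ratio for smooth O(n^5) growth
    flag = "  <-- jump?" if ratio > 1.2 * smooth else ""
    print(f"n={n:3d} -> {n+1:3d}: ratio {ratio:.2f} (smooth ~{smooth:.2f}){flag}")
```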
~ Jay Daniel Fox

