Deciding whether a given cut is minimal (i.e., breaks the graph into exactly two components) is the slow part of the above algorithm; the test uses ConnectedComponents, which must search through the graph. In the description of the above algorithm, two cases were mentioned where this hard test could be avoided. One was that the fundamental cuts are minimal by construction. The second was for ring sums of two fundamental cuts: if the two edge sets had no common edges, the ring sum was not minimal and the hard test was not required.
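As a sketch of the setup (in Python rather than Maple; the graph representation and helper names are my own, not the code under discussion), the ring sum is the symmetric difference of two edge sets, and the hard test removes the cut's edges and counts the remaining connected components:

```python
def components_after_removal(vertices, edges, cut):
    """Count connected components of the graph with the cut's edges removed."""
    remaining = [e for e in edges if e not in cut]
    parent = {v: v for v in vertices}   # union-find over the vertices

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    for u, w in remaining:
        parent[find(u)] = find(w)
    return len({find(v) for v in vertices})

def is_minimal_cut(vertices, edges, cut):
    """The 'hard test': a cut is minimal iff its removal leaves exactly two components."""
    return components_after_removal(vertices, edges, cut) == 2

def ring_sum(a, b):
    """Ring sum of two edge sets: edges in exactly one of the two sets."""
    return a ^ b
```

For two edge sets with no common edges the ring sum is just their union, which disconnects the graph into more than two pieces, so `is_minimal_cut` would return False without needing to be called.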
I speculated that a similar simplification might be possible for ring sums of more than two cuts. The following algorithm accumulates the ring sums incrementally, and at each stage tests for the absence of common edges.
We need to test all 2^r subsets of the set of r fundamental cuts (except the empty set). Let's code the subsets by a list of integers that index the fundamental cuts. For example, the index [1,2,4,5] means evaluating the ring sum of the first, second, fourth and fifth fundamental cuts. The subsets iterator in the combinat package can iterate through all subsets of the integers in the list [1,2,3,...,r], thereby successively generating every subset. More importantly, it does so in an order that lets us use previously evaluated ring sums to find new ones: all subsets with zero elements, then one element, then two elements, and so on, in numerical order within each size. For r=3 this is [], [1], [2], [3], [1,2], [1,3], [2,3], [1,2,3]. So as an example (for r>4), when we come to evaluate the ring sum for the subset [1,2,3,7,9], the ring sum for [1,2,3,7] has already been found, so we can take the ring sum of the [1,2,3,7] result with fundamental cut [9] and store the result in cutsets[1,2,3,7,9] for later use. If the [1,2,3,7] and [9] edge sets have no common edges, the result is not minimal and we can avoid the hard test in this case.
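A Python sketch of this incremental bookkeeping (hypothetical names; the original uses a Maple table and the combinat subsets iterator):

```python
from itertools import combinations

def accumulate_ring_sums(fundamental):
    """Accumulate ring sums of all nonempty subsets of the fundamental cuts.

    fundamental: dict mapping 1..r to an edge set (a Python set).
    Returns (cutsets, needs_hard_test): cutsets maps each index tuple to its
    ring sum; needs_hard_test lists the tuples whose two constituent edge
    sets shared an edge, so minimality must still be checked the hard way.
    """
    r = len(fundamental)
    cutsets = {}
    needs_hard_test = []
    for k in range(1, r + 1):
        # combinations() yields the size-k subsets in numerical order, so
        # sub[:-1] was always produced in the previous (size k-1) pass
        for sub in combinations(range(1, r + 1), k):
            if k == 1:
                # fundamental cuts are minimal by construction: no test
                cutsets[sub] = fundamental[sub[0]]
                continue
            prev = cutsets[sub[:-1]]         # previously stored ring sum
            last = fundamental[sub[-1]]
            cutsets[sub] = prev ^ last       # ring sum = symmetric difference
            if prev & last:
                needs_hard_test.append(sub)  # common edges: may be minimal
            # if disjoint, the ring sum cannot be minimal: skip the hard test
    return cutsets, needs_hard_test
```

Storing every subset's ring sum in `cutsets` is what creates the 2^r memory requirement discussed below.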
This was implemented using a Maple table, cutsets, indexed by the lists of integers. All 2^r cutsets need to be stored, so the memory requirement is larger. Profiling the algorithm on @Icz's graph with 20 vertices, 72 edges and 314415 minimal cuts showed that the time saved by doing the ring sums incrementally was approximately offset by the extra complexity of the algorithm. For this graph at least, although many more hard tests were avoided, these were still a small fraction of the total:
                          previous  incremental
fundamental                     19           19
easy (no common edges)         104        25063
hard tests                  524164       499205
total = 2^19-1              524287       524287
The total times were approximately the same: 342 of the 370 seconds (92%) were spent on hard tests, so those are still limiting here. The number of hard tests was reduced by only about 5%.
Edit: This analysis suggests that testing all pairs out of the 2^r subsets may be the way to go, or perhaps a brute-force method could be avoided altogether using the much more complicated algorithm of @Icz's ref .
Update: testing all pairs is much, much slower, so the mixed strategy here or above seems to be optimal for a brute-force method.