My personal impression is that the question is not that well-posed:
- if you assume infinite memory, you don't actually need a GC (and for any input you can tweak the GC settings to make sure no collection ever happens; see the sketch after this list)
- if you assume finite memory, the notion of asymptotic behaviour breaks down unless you can prove that your algorithm's peak memory consumption stays below that bound
- this is independent of whether or not your language uses a GC; if your algorithm constantly allocates heap memory, you will run into fragmentation issues even with malloc/free, which may degrade performance relative to the theoretical analysis
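
To make the first point concrete: one way to "tweak the GC so no collection happens" (assuming a HotSpot JVM, JDK 11 or later, and enough memory for the whole run) is the no-op Epsilon collector, which allocates but never reclaims, so the only failure mode is exhausting the heap bound. A minimal sketch; the class name and allocation sizes are just for illustration:

```java
// Run with:
//   java -XX:+UnlockExperimentalVMOptions -XX:+UseEpsilonGC -Xmx4g NoGcDemo
// Epsilon is a no-op collector: it allocates but never reclaims memory,
// so GC pauses cannot affect the measured running time. If live + dead
// data exceed -Xmx, the JVM throws OutOfMemoryError instead of collecting.
public class NoGcDemo {
    public static void main(String[] args) {
        long allocated = 0;
        for (int i = 0; i < 1_000_000; i++) {
            byte[] chunk = new byte[1024];  // ~1 KiB per iteration, ~1 GiB of garbage total
            allocated += chunk.length;
        }
        System.out.println("Allocated " + allocated + " bytes with zero collections");
    }
}
```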
Neelk's answer in the link I gave above is that if you can tune your GC so that collections happen infrequently enough, you can make collection amortized constant-time. But that means you have to accept some wasted memory (the runtime using significantly more memory than the size of the actually live data), possibly proportional to the running time of your program.
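
Here is a toy cost model of that amortization argument (not a real collector; the 2x-growth trigger, the unit costs, and the class name are assumptions chosen for illustration): if a collection costing time proportional to the live data is triggered only once the heap has doubled since the last collection, that cost is spread over at least as many allocations, so the amortized cost per allocation stays constant while the heap holds up to twice the live data.

```java
// Toy cost model for "collect only after the heap doubles" (illustrative only).
// Each allocation costs 1 unit; a collection costs `live` units (it touches
// every live object). A collection fires only after `live` new allocations,
// so its cost amortizes to O(1) per allocation -- at the price of the heap
// holding up to 2x the live data between collections.
public class AmortizedGcModel {
    public static void main(String[] args) {
        long live = 1_000;            // assumed steady-state amount of live data
        long heap = live;             // objects currently on the heap (live + garbage)
        long totalCost = 0;
        long allocations = 10_000_000;

        for (long i = 0; i < allocations; i++) {
            heap++;                   // allocate one object (it immediately becomes garbage)
            totalCost++;              // O(1) allocation cost
            if (heap >= 2 * live) {   // collect once the heap has doubled
                totalCost += live;    // tracing cost proportional to live data
                heap = live;          // garbage reclaimed, live data survives
            }
        }
        System.out.printf("amortized cost per allocation: %.3f units%n",
                          (double) totalCost / allocations);
    }
}
```

Running this prints roughly 2.0 units per allocation regardless of how long the loop runs, which is the amortized-constant-time claim; lowering the trigger threshold toward the live size would reduce the wasted memory but raise the amortized cost.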