I'd love to try it, but pretty much all my repos are over 10 MB. It's not because there's that much code, but because I'm doing bioinformatics and the test files (for the unit tests) inflate the repo size. It would be great if there were a way to test it on just one large repo for a week or so, because I balk at the idea of spending $20 a month on something I don't even know works well.
This is important because I'm not deeply familiar with public projects, so I can't accurately assess if the tool is worthwhile. Whereas with one of my repos, I'd be able to tell quality pretty quickly.
As an aside, I suggest using DVC. It stores only the metadata of large files in the Git repository while maintaining a separate data repository. The data repository can also be synced with cloud storage like AWS S3. It really helps keep the code repo manageable without the raw data.
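If it helps, the core workflow is only a few commands -- this is a rough sketch, and the file path, remote name, and bucket are just placeholders:

    pip install "dvc[s3]"                  # DVC with S3 support
    dvc init                               # set up DVC in an existing Git repo
    dvc add tests/data/big.fastq           # track the large file; writes big.fastq.dvc
    git add tests/data/big.fastq.dvc tests/data/.gitignore
    git commit -m "Track test data with DVC"
    dvc remote add -d storage s3://my-bucket/dvc-store   # hypothetical bucket
    dvc push                               # upload the data to the remote

After that, Git only ever sees the small .dvc pointer files, and a fresh clone pulls the actual data with `dvc pull`.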
That's funny, we actually do ignore those files, but I'm realizing now that we don't account for that in our calculation of codebase size, since we just get that via the GH API.
This comes from a place of love and empathy -- I've been here, am here, and will be here again -- but this realization reads like the first step in the long, long road from prototype to product.