
Here is the magnet link for posterity: magnet:?xt=urn:btih:ZXXDAUWYLRUXXBHUYEMS6Q5CE5WA3LVA&dn=LLaMA


Thanks, but it's not working for me...

Not that I could run it if I downloaded it.


Great, now how do I run it? Do I need a GPU with over 65GB RAM?


Try this, it's for running LLMs that won't fit in the GPU: https://github.com/FMInference/FlexGen
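The core trick behind that kind of tool is offloading: the full set of weights stays in CPU RAM (or on disk), and each layer is streamed into GPU memory only while it executes. A toy sketch of the idea, with a hypothetical `Layer` class for illustration (FlexGen's real implementation is much more sophisticated):

```python
# Toy sketch of layer-by-layer offloading. Weights live on the "cpu",
# get moved to the "gpu" just long enough to run, then are evicted,
# so peak accelerator memory is ~one layer rather than the whole model.

class Layer:
    """Hypothetical stand-in for a transformer layer (not FlexGen's API)."""
    def __init__(self, weight):
        self.weight = weight
        self.device = "cpu"        # weights start in host RAM

    def to(self, device):
        self.device = device       # real code would copy tensors here
        return self

    def __call__(self, x):
        return [v * self.weight for v in x]   # stand-in for the real math

def offloaded_forward(layers, x, gpu="cuda"):
    """Run layers one at a time, holding only one layer's weights on the GPU."""
    for layer in layers:
        layer.to(gpu)      # stream this layer's weights in
        x = layer(x)
        layer.to("cpu")    # evict to make room for the next layer
    return x
```

The price is bandwidth: every forward pass re-copies the weights over PCIe, which is why offloaded inference is far slower than keeping everything resident in VRAM.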


Currently it looks like it only supports Facebook's OPT and Galactica models, though they do appear to plan to add support for more.


Generally, you need to multiply the model size by two to get the required amount of video RAM (FP16 weights take about 2 bytes per parameter). There are 4 sizes, so you might get away with a smaller GPU for, say, the 13B model.
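That rule of thumb works out as below; a minimal sketch, assuming FP16 weights at ~2 bytes per parameter and the commonly cited LLaMA size labels (7B, 13B, 33B, 65B). Note this counts weights only; activations and the KV cache push real requirements higher.

```python
# Back-of-the-envelope VRAM needed just to hold FP16 model weights.

def weights_vram_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """GB of memory for the weights alone: parameters x bytes per parameter."""
    return params_billion * bytes_per_param

for size in (7, 13, 33, 65):
    print(f"{size}B model: ~{weights_vram_gb(size):.0f} GB for weights")
```

So the 65B model needs on the order of 130 GB in FP16 before any inference overhead, which is why it won't fit on a single consumer GPU.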


Nope, more like 111 GB



