Hacker News | new | past | comments | ask | show | jobs | submit

Imagine for a moment doing AI work

In many ways it's still the same. Transformers use matrix multiplication as their main operation, and the underlying matrix multiplication libraries have mostly seen only incremental performance improvements over the last two decades or so. Most other ops in, e.g., core PyTorch are implemented using C++ templates and would be largely familiar to a 2008 C++ programmer. Most of my work is still C++/Python/Cython, as it has been for the last decade or two. Sure, the machine learning models have changed, but those are relatively easy to pick up.
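To illustrate the point about matrix multiplication being the core operation: a minimal, dependency-free sketch of scaled dot-product attention (the heart of a transformer layer) is just three matmuls and a softmax. The function names here are my own for illustration; real implementations dispatch these same operations to the optimized BLAS-style libraries mentioned above.

```python
import math

def matmul(a, b):
    # Naive matrix multiply: the core op behind transformer layers.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def softmax(row):
    # Numerically stable softmax over one row of scores.
    m = max(row)
    e = [math.exp(v - m) for v in row]
    s = sum(e)
    return [v / s for v in e]

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q @ K^T / sqrt(d)) @ V.
    d = len(q[0])
    k_t = [list(col) for col in zip(*k)]            # K^T
    scores = matmul(q, k_t)                         # Q @ K^T
    weights = [softmax([s / math.sqrt(d) for s in row]) for row in scores]
    return matmul(weights, v)                       # weights @ V
```

Everything a production kernel adds on top (batching, fused kernels, mixed precision) is an optimization of this same pattern.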


