> Remove "and Python 3.11" from title. Python used only for converting model to llama.cpp project format, 3.10 or whatever is fine.
As @rnosov notes elsewhere in the thread, this post has a workaround for the PyTorch issue with Python 3.11, which is why the "and Python 3.11" qualification is there.
In this particular case that doesn't matter, because the only time you run Python is for a one-off conversion against the model files.
The conversion takes at most a minute, and once it's done you never need to run it again: actual llama.cpp inference uses compiled C++ code with no Python involved at all.
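For anyone unfamiliar, a rough sketch of that workflow; the script and binary names here (convert.py, quantize, main) track the llama.cpp repo as of this writing and may have been renamed since, so check the repo's README for the current equivalents:

```shell
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp && make                          # compiles the C++ inference binaries
python3 -m pip install -r requirements.txt    # Python is needed only for this step
python3 convert.py models/7B/                 # one-off conversion of the model weights
./quantize models/7B/ggml-model-f16.bin models/7B/ggml-model-q4_0.bin q4_0
./main -m models/7B/ggml-model-q4_0.bin -p "Hello"   # inference: no Python involved
```

Everything after the convert.py step is the compiled C++ side, so the Python version only matters for that one command.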