We've considered using a local LLM, but the problem is that, for a better user experience, we add the user's new vocabulary list and then inject words based on that list, and it's hard to do this locally.
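For context, "inject words based on the list" means roughly the sketch below; names and prompt wording are illustrative assumptions, not our actual implementation. The user's saved vocabulary is kept on our side, and we build prompts around it, which is the part that's awkward to replicate with a purely local model.

```python
# Minimal sketch (hypothetical names): weave the user's saved vocabulary
# words into a prompt so the generated text reuses them.

def build_prompt(passage: str, vocab_words: list[str]) -> str:
    """Ask the model to rewrite a passage so it naturally reuses the
    user's saved vocabulary words."""
    word_list = ", ".join(vocab_words)
    return (
        "Rewrite the following passage so that it naturally includes "
        f"these vocabulary words the user is learning: {word_list}.\n\n"
        f"Passage:\n{passage}"
    )

# Example usage (made-up data):
prompt = build_prompt(
    "The weather was nice, so we went for a walk.",
    ["stroll", "pleasant", "breeze"],
)
print(prompt)
```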
That said, we will seriously consider supporting a local LLM, since this would also let more users make use of our basic functions.