Hacker News | rprtr258's comments

how did you manage to put

- no bloat

- built-in AI co-pilot with ollama (local) or openai/anthropic

into one list?


Pretty easily, as it happens[0]. Just implement the RESTful methods as appropriate; Ollama supports the OpenAI API spec[1].

[0]: https://github.com/clidey/whodb/blob/main/core/src/llm/llm_c...

[1]: https://github.com/openai/openai-openapi


A simple process manager for Linux: https://github.com/rprtr258/pm. It is inspired by pm2, but much simpler, without any JS or JS integrations.

