That looks like RAG rather than training.
If you want to install (or build) vLLM or Ollama and your Mac is new enough (Apple Silicon), you can run models locally for free. I have a few Mistral models I can switch between that I'm testing with.
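Roughly what that looks like from the Java side, as a sketch: it assumes Ollama is serving on its default port (11434) and that you've already pulled a "mistral" tag; depending on the langchain4j version the single-string convenience call is generate(...) (0.x) or chat(...) (1.x).

```java
import dev.langchain4j.model.ollama.OllamaChatModel;

public class LocalMistralSmokeTest {

    public static void main(String[] args) {
        // Ollama serves on localhost:11434 by default; "mistral" is whatever
        // tag was pulled with `ollama pull` - swap the name to switch models.
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("mistral")
                .build();

        System.out.println(model.chat("Say hello in one short sentence."));
    }
}
```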
It's really easy (and free) to use Elasticsearch as the vector database.
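Indexing a doc chunk into Elasticsearch goes something like this, again as a sketch: "support-docs" and the all-MiniLM in-process embedding model are just example choices, and the exact builder options and package names shift a bit between langchain4j versions.

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.embedding.onnx.allminilml6v2.AllMiniLmL6V2EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.elasticsearch.ElasticsearchEmbeddingStore;

public class IndexSupportDocs {

    public static void main(String[] args) {
        // Local, in-process embedding model - no API key needed.
        EmbeddingModel embeddingModel = new AllMiniLmL6V2EmbeddingModel();

        // Elasticsearch running locally as the vector store;
        // "support-docs" is just an example index name.
        EmbeddingStore<TextSegment> store = ElasticsearchEmbeddingStore.builder()
                .serverUrl("http://localhost:9200")
                .indexName("support-docs")
                .build();

        TextSegment segment = TextSegment.from(
                "To reset your password, use the 'Forgot password' link on the login page.");
        Embedding embedding = embeddingModel.embed(segment).content();
        store.add(embedding, segment);
    }
}
```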
I'm doing this as an R&D project, using langchain4j for the integration layer (I'm rather better with Java than Python because I use it a lot more), to augment Mistral so it can act as a user-facing agent doing L1 support.
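The RAG wiring that turns the local Mistral into the L1 assistant is roughly this; a sketch, not the finished thing: the SupportAssistant interface and the sample question are made up for illustration, and the builder method is chatLanguageModel(...) on 0.x releases (renamed on newer ones).

```java
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.embedding.onnx.allminilml6v2.AllMiniLmL6V2EmbeddingModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.rag.content.retriever.ContentRetriever;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.elasticsearch.ElasticsearchEmbeddingStore;

public class L1SupportAgent {

    // langchain4j generates the implementation of this interface.
    interface SupportAssistant {
        String answer(String userQuestion);
    }

    public static void main(String[] args) {
        EmbeddingModel embeddingModel = new AllMiniLmL6V2EmbeddingModel();

        // Same Elasticsearch index the docs were embedded into.
        EmbeddingStore<TextSegment> store = ElasticsearchEmbeddingStore.builder()
                .serverUrl("http://localhost:9200")
                .indexName("support-docs")
                .build();

        // Retriever pulls the top-matching chunks into the prompt.
        ContentRetriever retriever = EmbeddingStoreContentRetriever.builder()
                .embeddingStore(store)
                .embeddingModel(embeddingModel)
                .maxResults(3)
                .build();

        // Local Mistral served by Ollama.
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("mistral")
                .build();

        SupportAssistant assistant = AiServices.builder(SupportAssistant.class)
                .chatLanguageModel(model)
                .contentRetriever(retriever)
                .build();

        System.out.println(assistant.answer("How do I reset my password?"));
    }
}
```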