Is there a plan to support local LLM for AI Search functionality? I am concerned about privacy.
@mdubakov Hi, is this request possible?
Not in the near future, unfortunately. We want to do it at some point, though.
Thank you for the information. I look forward to seeing it on the Fibery Roadmap someday.
Dear Lod,
maybe not full compensation, but it should be possible to link certain entities in your databases with a locally run model, for example via Ollama. You'd need automation software like make.com or n8n. n8n can be run locally for free, but it is trickier to set up, as you'd need a creative way to parse and send data via webhooks or other methods. I think make.com is the better approach for that, but it can be pricey, depending on your use case.
You can then use the local model to perform certain tasks. The possibilities are endless when it comes to that.
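To give a rough idea of the last step, here is a minimal sketch of sending entity text to a locally running Ollama instance via its HTTP API. It assumes Ollama is running on its default port (11434) and that you have already pulled a model; the model name `llama3` and the example prompt are placeholders, and the webhook/parsing side in make.com or n8n is left out entirely.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes a default install on the same machine)
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint.

    stream=False asks Ollama to return one complete JSON response
    instead of a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama instance and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires Ollama running locally with the model pulled, e.g. `ollama pull llama3`
    print(ask_local_model("llama3", "Summarize this Fibery entity: ..."))
```

An automation scenario would then just POST the entity's fields to a small script like this (or call the Ollama endpoint directly from an HTTP module) and write the model's answer back to Fibery via its API.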