Hi Fibery team!
It would be very helpful to have the ability to choose which AI model is used for Fibery AI responses.
In many cases we ask simple questions or perform lightweight tasks where a fast, inexpensive model would be perfectly sufficient. Currently, however, even simple questions sometimes take noticeably long to answer, which suggests the system may be using a heavier model than necessary.
Suggestion:
Allow users to select the AI model, for example:
- Fast / lightweight model (for quick questions and simple tasks)
- More advanced model (for complex analysis or generation)
This would allow teams to balance speed, cost, and quality depending on the task.