Share your Fibery MCP Use Cases

Great overview! It’s exciting to see the continuous improvements and what’s coming next for Fibery.

Feedback on Fibery MCP

Since you asked for feedback on the MCP server: I use it frequently to demonstrate how Fibery can serve as the central backbone for company AI initiatives. Once you have substantial company data in Fibery, you can ask analytical questions and run various analyses through AI.

Practical Examples:

  • Sales insights: We store insights from sales call transcripts in Fibery. Through MCP, we can ask questions like “What are the latest trends for solution X?” or “Give me customer quotes about licensing.”

  • CRM analysis: Our sales opportunities data lives in Fibery, allowing our head of sales to generate ad-hoc reports like “Create an activity report for Salesperson X showing top activities last month and major new opportunities.”

  • Team onboarding: MCP is also a great way to introduce colleagues to Fibery (and to AI & MCPs) by showing them the power of well-organized company data.

Suggestion for Improvement: The main limitation I see is that MCP doesn’t remember context between interactions. We always have to start from scratch (checking space structure, loading company products into memory, etc.).

A potential solution: Create a technical “AI Contexts” database that could be automatically loaded into MCP at the start of each interaction. This could contain high-level information about space organization, database purposes, and other contextual details that users want MCP to remember.
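To make the idea concrete, here is a rough sketch of what that could look like. Everything in it is an assumption for illustration: a hypothetical "Tech/AI Context" database with plain-text "Name" and "Description" fields, a read-only API token, and the FIBERY_HOST / FIBERY_TOKEN environment variables. The idea is that the MCP server (or a thin wrapper around it) pulls these entries once per session and prepends them as context before the first question:

```python
"""Sketch: preload a hypothetical "AI Contexts" database at the start of an MCP session.

Assumptions (not part of the current Fibery MCP server):
- a database "Tech/AI Context" with plain-text fields "Tech/Name" and "Tech/Description"
- a Fibery API token with read access, exposed via environment variables
"""
import os

import requests

FIBERY_HOST = os.environ["FIBERY_HOST"]    # e.g. "mycompany.fibery.io"
FIBERY_TOKEN = os.environ["FIBERY_TOKEN"]  # read-only API token


def load_ai_contexts() -> str:
    """Query the hypothetical Tech/AI Context database and build a context preamble."""
    command = [{
        "command": "fibery.entity/query",
        "args": {
            "query": {
                "q/from": "Tech/AI Context",                    # hypothetical database
                "q/select": ["Tech/Name", "Tech/Description"],  # hypothetical fields
                "q/limit": 50,
            }
        },
    }]
    resp = requests.post(
        f"https://{FIBERY_HOST}/api/commands",
        headers={"Authorization": f"Token {FIBERY_TOKEN}"},
        json=command,
        timeout=30,
    )
    resp.raise_for_status()
    entities = resp.json()[0]["result"]
    # Concatenate the entries into one block the assistant sees before the first question.
    return "\n\n".join(
        f"## {e['Tech/Name']}\n{e['Tech/Description']}" for e in entities
    )


if __name__ == "__main__":
    print(load_ai_contexts())
```

In practice rich-text fields would need Fibery's documents API rather than a plain query, but even a handful of short plain-text entries (space overview, database purposes, product names) would save the repeated "start from scratch" step.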


My MCP feedback is that we've brought our email, references, meeting notes (still a bit of a struggle, but where a huge amount of context lives), knowledge base, and legal case management all into Fibery. We use Fibery's AI predominantly, but we're looking at building an integration with chatbots and LLMs (like ChatGPT Enterprise or Copilot Enterprise) that would let us merge context from another MCP source of legal data and reach insights we never thought were possible without a colossal amount of context. We're moving slowly because a) we've been focusing on using Fibery first, and b) we're very cautious about opening our sensitive Fibery data to any LLM until we're confident it is secure.
