Another day, another issue with an unhelpful error message – probably related to having “too many” entities?
If there are underlying limits on the total number of entities, etc., they really need to be stated explicitly, so we can choose a platform that can handle our data requirements.
I see now the error shows the limit is 200,000 records. Are there any plans to extend this? I know reports aren't in focus at the moment, but this is quite needed for the use case I'm building.
Thanks!
Btw, the table view seems to be taking it like a champ. A bit slow to load and to toggle fields on and off, but once it's up, it's just like any other DB. Fantastic.
No plans.
Usually, we recommend that you apply source filters if you have more than 200k records. Fibery isn’t engineered to be able to do rapid analytics on large data sets.
Thanks for the heads up. Yeah, maybe we’ll just have weekly and monthly reports by client instead of everything in one.
But I will note that if it does get more optimised for data analytics, it would unlock a whole market segment: you could integrate data into Fibery just for the use case of reporting and ask the AI questions about it. (Though I'm also not sure the AI is suited to such huge datasets.)
I know it's not Fibery's starting point, but as any business grows it will have more and more data and will need a tool to properly analyze large datasets. Either they'll need to reach for something new and integrate Fibery with it, or big-data handling gets better within Fibery.
Yes, it is likely that the database sizes will increase over time, but the same principles would still apply, namely that the reports ought to be focussed on a relevant subset, e.g. this month’s data.
Fibery will never compete with a tool like Tableau for analytics on large data sets (and in fact, we ourselves use Tableau for these kinds of reports).
Currently no reason to do so.
The amount of dev resources (= $$$) we would need to spend to achieve anything close to Tableau-level performance is more than we would spend on Tableau for the next N years.
I will just note, I tried asking the AI about a big data sample, and it also chokes; it doesn't properly query all the data.
Another idea I just thought of: it may be possible to aggregate the data using auto-linking (does this have tech limits??) and then build reports on the formula fields instead of the raw data. For things like a yearly report, it might be doable.
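To sketch what I mean (the database names `Client` and `Invoice` and the fields `Invoices`, `Amount`, `Year` are hypothetical, and I'm not 100% sure of the exact formula syntax), the rollup formulas on an auto-linked `Client` could look something like:

```
// On the Client database, assuming Invoices is the auto-linked collection:
Invoices.Sum(Amount)

// Yearly rollup, assuming each Invoice has a numeric Year field to filter on:
Invoices.Filter(Year = 2024).Sum(Amount)
```

The report would then chart these pre-aggregated formula fields (one row per client) instead of the full 200k+ record set.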