⚙️ Help: Custom App Limitations?

Hello! Long time fibery user :wave:

Sent this to support, but figured I’d post here to see if anyone already knows, and to potentially provide clarity on this topic for others in the future.

Since Fibery doesn’t offer database/Postgres syncing, we are trying to build something ourselves. However, we’re a bit unsure about the rate limits and what it can handle.

Example: We have a Supabase instance with over 9M rows. Thousands of rows are added per hour.

To get a better understanding of what the capabilities are, we’re hoping to answer a few questions:

  • Can Fibery handle millions of rows?

  • Is there a rate limit for custom apps?

  • How often can we run a manual sync?

  • Can we trigger syncs automatically?

Looking forward to finding out more!

Thanks,
Josiah


It would be great if all of this sort of info about Fibery were explicit and easy to find.

It could also be useful to have a different membership tier that allows more intensive usage, which most users don’t need.


Hello, @Illusory

Thanks for the good questions.

  • Can Fibery handle millions of rows?

Sorry, we can’t say exactly. We have tested up to 500K records at most, since we didn’t anticipate an integration use case involving 9M rows in Fibery.

  • Is there a rate limit for custom apps?

The minimum automatic sync interval is 1 hour per integration. We don’t have rate limits, but if an integration takes more than two hours, it will be cancelled and aborted. Likewise, if any individual request to the app takes more than 5 minutes, the integration will be cancelled and aborted. For huge data sets, the app will probably need to implement filters, paging, and a delta sync strategy.
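For anyone building such an app, a delta sync with a timestamp cursor plus paging might look roughly like this. A minimal Python sketch, assuming a hypothetical `fetch_page` data source (the actual Fibery connector contract is not shown here):

```python
from datetime import datetime, timezone

# Hypothetical source data: rows with an "updated_at" timestamp.
ROWS = [
    {"id": i, "updated_at": datetime(2024, 1, 1, i, tzinfo=timezone.utc)}
    for i in range(10)
]

def fetch_page(since, offset, limit):
    """Stand-in for a Postgres query like:
    SELECT * FROM logs WHERE updated_at > %s
    ORDER BY updated_at LIMIT %s OFFSET %s
    """
    changed = sorted(
        (r for r in ROWS if r["updated_at"] > since),
        key=lambda r: r["updated_at"],
    )
    return changed[offset:offset + limit]

def delta_sync(last_synced, page_size=3):
    """Yield only rows changed since the previous sync, page by page,
    so each request stays well under the 5-minute per-request limit."""
    offset = 0
    while True:
        page = fetch_page(last_synced, offset, page_size)
        if not page:
            break
        yield from page
        offset += page_size

cursor = datetime(2024, 1, 1, 4, tzinfo=timezone.utc)
changed = list(delta_sync(cursor))  # only rows updated after the cursor
```

The key idea is that after each successful sync the app persists the newest `updated_at` it has seen and uses it as the next cursor, so 9M historical rows only have to be paged through once.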

  • How often can we run a manual sync?

The Sync button can be pushed as soon as the previous sync completes.

  • Can we trigger syncs automatically?

Unfortunately, we don’t have a public, documented API for that at the moment.

Could you please describe your use case for the 9M-row integration? Maybe it isn’t necessary to store all of this info in Fibery, and it may be enough to sync some aggregated data from your Postgres database.
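To illustrate the aggregation idea: instead of syncing millions of raw log rows, the app could sync one pre-aggregated row per device per day. A rough sketch, using `sqlite3` as a stand-in for Postgres and hypothetical table/column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE process_logs (device_id TEXT, day TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO process_logs VALUES (?, ?, ?)",
    [
        ("srv-1", "2024-01-01", "error"),
        ("srv-1", "2024-01-01", "ok"),
        ("srv-1", "2024-01-01", "error"),
        ("srv-2", "2024-01-01", "ok"),
    ],
)

# One row per device per day instead of one row per log event.
rows = conn.execute(
    """
    SELECT device_id, day,
           COUNT(*) AS events,
           SUM(CASE WHEN status = 'error' THEN 1 ELSE 0 END) AS errors
    FROM process_logs
    GROUP BY device_id, day
    ORDER BY device_id
    """
).fetchall()
```

With thousands of events per hour collapsing into a handful of daily rollup rows per device, the synced dataset stays far below the tested 500K-record range.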

Thanks,
Oleg


Hi @Oleg, thanks for your response and assistance.

We have a few things we want to sync:

  1. Product Inventory (Bare Metal Servers & Proxies)
  2. Custom Process Log Events (for said inventory; also the bulk of our records)
  3. Account Data (user, org, contact)
  4. Subscription Data (plans, invoices)

This is a rough map:

We have bare metal servers / proxies that send custom process logs, plus customer data (user profiles, subscription/payment/invoice data, and the hardware inventory linked to them).

Being able to store our product inventory and run actions is extremely valuable.

Being able to pull in product process logs allows us to create relations to the hardware, identify anomalies or bugs through Fibery reports, and create issues in Fibery for said hardware. We can also go further and relate these hardware action logs directly to customer entities for further analysis, and create tasks based on the available data (manual review, hardware update, subscription extension, etc.). Right now we use tools like Retool, but what’s missing there is relational functionality and Fibery’s robust customization features.

Use Case Example: A device is failing to connect intermittently, and our logs have recorded 20 records/entities over the last 24 hours. We can create a formula that counts incidents within the last 24 hours and automatically creates a Fibery task for the product when the count reaches a specific threshold (20). From there, we can see each event (entity) that took place and run further custom actions (notify the customer, flag the hardware for inspection, create another task to credit the user’s subscription, etc.).
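The thresholding logic in that example can be sketched in plain Python with hypothetical event timestamps (in Fibery itself this would be a formula plus an automation rather than code):

```python
from datetime import datetime, timedelta, timezone

THRESHOLD = 20  # create a task once this many incidents occur within 24h

now = datetime(2024, 1, 2, 12, tzinfo=timezone.utc)
# Hypothetical connection-failure events for one device, one per hour.
events = [now - timedelta(hours=h) for h in range(30)]

# Count only incidents from the last 24 hours.
recent = [e for e in events if now - e <= timedelta(hours=24)]
should_create_task = len(recent) >= THRESHOLD
```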

This is just one of many examples… I think this type of thing would make Fibery extremely powerful and cater to larger enterprises with lots of data for a variety of use cases.

I agree with @Matt_Blais as well with respect to different types of memberships for more intensive use cases. I’d be willing to pay more, and I’m sure others would too.

@helloitse any thoughts here?
