Hello. In our company there is a weekly scenario where we truncate table data and import new data from a CSV. To truncate the table, I currently iterate over all entities and delete them one by one in a loop. Is there a better way to remove all entities via a single API call?
Also, I cannot find a way to trigger a CSV import via the API. Am I missing something?
What do you mean by truncating table data? Do you mean deleting all entities in a specific database? Or perhaps a subset based on creation date? Or …
Where is your CSV data coming from? It sounds like you might benefit from a custom app to solve this whole flow…
By truncate I mean deleting all entities while keeping the table and its related automations.
The data comes from different sources. Some CSVs are exported from other online systems, while others are generated by SQL queries against data stored in PostgreSQL.
Frankly speaking, I would like to avoid a custom app in order to reduce the number of moving parts in our data infrastructure. CSV imports work well (putting aside performance issues), but we would like to run them on a schedule without manual work.
I am not sure that I totally understand the issue, but perhaps I can help. Has your issue already been solved?
You can use the API to send batch commands. I'd first make a call that queries the database and returns all results, then write a script to generate the delete payload. The fibery.schema/batch command takes an array of commands as its input. See more here.
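For anyone landing here later, here is a rough Python sketch of that query-then-batch-delete idea. The workspace URL, token and database name are placeholders, and the command names (fibery.entity/query, fibery.entity/delete) are taken from Fibery's public API docs, so please double-check everything against your own workspace before running it on real data.

```python
# Rough sketch: query all entity IDs, then batch-delete them in one API call.
# FIBERY_HOST, FIBERY_TOKEN and DATABASE are placeholders for your workspace.
import requests

FIBERY_HOST = "https://YOUR-ACCOUNT.fibery.io"   # placeholder workspace URL
FIBERY_TOKEN = "YOUR-API-TOKEN"                  # placeholder API token
DATABASE = "Imports/Weekly Data"                 # placeholder "Space/Database" name

HEADERS = {
    "Authorization": f"Token {FIBERY_TOKEN}",
    "Content-Type": "application/json",
}

def run_commands(commands):
    """POST an array of commands to the commands endpoint in a single request."""
    resp = requests.post(f"{FIBERY_HOST}/api/commands", headers=HEADERS, json=commands)
    resp.raise_for_status()
    return resp.json()

# 1. Query the IDs of all entities in the database
#    (paginate with q/offset if you have more rows than the limit).
query_result = run_commands([{
    "command": "fibery.entity/query",
    "args": {"query": {
        "q/from": DATABASE,
        "q/select": ["fibery/id"],
        "q/limit": 1000,
    }},
}])
ids = [row["fibery/id"] for row in query_result[0]["result"]]

# 2. Build one delete command per entity and send them all in one batch request.
delete_commands = [{
    "command": "fibery.entity/delete",
    "args": {"type": DATABASE, "entity": {"fibery/id": entity_id}},
} for entity_id in ids]

if delete_commands:
    run_commands(delete_commands)
print(f"Deleted {len(delete_commands)} entities from {DATABASE}")
```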
Not that I’ve looked into this recently, but I’ve always done CSV ETL in a third-party app. I don’t believe Fibery has an API endpoint for CSV ingestion. In the past I’ve used Make to connect CSV files stored in SharePoint to Fibery with great success.
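If you would rather script the import yourself than go through Make, something along these lines might work: read the CSV and send a batch of fibery.entity/create commands through the same commands endpoint. The database and field names below are made-up placeholders, so treat this as a sketch of the approach rather than a drop-in solution.

```python
# Rough sketch: create one entity per CSV row via a single batch of create commands.
# FIBERY_HOST, FIBERY_TOKEN, DATABASE, CSV_PATH and the column-to-field mapping
# are all placeholders -- adjust them to your workspace and schema.
import csv
import requests

FIBERY_HOST = "https://YOUR-ACCOUNT.fibery.io"   # placeholder workspace URL
FIBERY_TOKEN = "YOUR-API-TOKEN"                  # placeholder API token
DATABASE = "Imports/Weekly Data"                 # placeholder "Space/Database" name
CSV_PATH = "export.csv"                          # placeholder CSV file

HEADERS = {
    "Authorization": f"Token {FIBERY_TOKEN}",
    "Content-Type": "application/json",
}

with open(CSV_PATH, newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Map CSV columns to Fibery fields; the field names below are examples only.
create_commands = [{
    "command": "fibery.entity/create",
    "args": {
        "type": DATABASE,
        "entity": {
            "Imports/Name": row["name"],
            "Imports/Amount": row["amount"],
        },
    },
} for row in rows]

resp = requests.post(f"{FIBERY_HOST}/api/commands", headers=HEADERS, json=create_commands)
resp.raise_for_status()
print(f"Created {len(create_commands)} entities in {DATABASE}")
```

Combined with the delete sketch above and a cron job (or any scheduler you already run), this would cover the weekly truncate-and-reload flow without manual work.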