Make native bulk editing actions more reliable and faster for large datasets

Selecting 20,000 entities and deleting them results in the error: "The upstream server is timing out."

I accidentally imported the wrong CSV with 80,000 entities, and now deleting them all is a pain… I have to do it 10,000 at a time, each batch takes at least 5 minutes, and it usually crashes every time, even though the deletion does go through in the backend.
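Just to illustrate the batching pattern I'm stuck with, here is a minimal sketch. It assumes a hypothetical `bulkDelete(ids)` call that deletes a list of entity IDs; the function name, batch size, and logging are placeholders, not the real API.

```typescript
// Hypothetical: bulkDelete stands in for whatever call currently deletes a list of entity IDs.
declare function bulkDelete(ids: string[]): Promise<void>;

// Delete in chunks so no single request exceeds the ~10,000-entity limit
// that seems to be the practical ceiling today.
async function deleteInBatches(ids: string[], batchSize = 10_000): Promise<void> {
  for (let i = 0; i < ids.length; i += batchSize) {
    const batch = ids.slice(i, i + batchSize);
    await bulkDelete(batch); // run batches sequentially to avoid upstream timeouts
    console.log(`Deleted ${Math.min(i + batchSize, ids.length)} of ${ids.length}`);
  }
}
```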

EDIT: Temporary solution:
The AI Build Mode does a pretty good job of running multiple batch actions one after another. Works pretty well.

I hope this will be addressed next year to some degree.


I was thinking about this as I need it more and more. As far as I can tell, a bulk action currently sends the IDs of all the entities that need to be edited/deleted. One solution might be to add a "bulk" endpoint that performs the button action on the whole table, narrowed by the current filters, and have it be called whenever all entities in a view are selected (rough sketch below). Just an idea.
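To make the idea concrete, here is a minimal sketch of what such a request could look like, assuming a REST-style API. The endpoint path, payload shape, and filter format are all assumptions for illustration, not the platform's actual API.

```typescript
// Hypothetical payload: instead of enumerating every entity ID,
// the client describes the action and the filter that defines the selection.
interface BulkActionRequest {
  action: "delete" | "update";        // the button action to apply
  filters?: Record<string, unknown>;  // current view filters; omitted = whole table
  patch?: Record<string, unknown>;    // field updates when action === "update"
}

// The base URL and "/bulk" route are placeholders.
async function runBulkAction(collection: string, body: BulkActionRequest): Promise<void> {
  const res = await fetch(`https://api.example.com/collections/${collection}/bulk`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) {
    throw new Error(`Bulk action failed: ${res.status}`);
  }
}

// Example: delete everything matched by the view's filter
// without sending 80,000 individual IDs over the wire.
runBulkAction("entities", { action: "delete", filters: { source: "bad-import.csv" } });
```

The nice part of this approach is that the server resolves the selection itself, so the request stays small and the work can be queued or chunked on the backend instead of tied to one long-running HTTP call.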