Make native bulk editing actions more reliable and faster for large datasets

Selecting 20,000 entities and deleting them results in an error: "The upstream server is timing out."

I accidentally imported the wrong CSV with 80,000 entities, and now deleting them all is a pain… I have to do it 10,000 at a time, each batch takes at least 5 minutes, and it usually crashes every time, even though the deletion does go through in the backend.
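For anyone stuck doing the same thing, here's a rough sketch of scripting the chunked deletion instead of clicking through the UI. It assumes a generic REST backend; the base URL, endpoint paths, and response shape are placeholders, not the actual API, so adapt them to whatever your backend exposes.

```python
# Sketch of chunked bulk deletion against a hypothetical REST API.
# BASE_URL, the endpoint paths, and the response fields are assumptions.
import time
import requests

BASE_URL = "https://example.com/api"   # placeholder API root
BATCH_SIZE = 10_000                    # largest batch that didn't time out for me

def delete_in_batches(collection: str) -> None:
    while True:
        # Fetch the next batch of entity IDs (placeholder endpoint).
        resp = requests.get(
            f"{BASE_URL}/{collection}",
            params={"limit": BATCH_SIZE, "fields": "id"},
            timeout=60,
        )
        resp.raise_for_status()
        ids = [item["id"] for item in resp.json().get("items", [])]
        if not ids:
            break  # nothing left to delete

        # Bulk-delete the batch (placeholder endpoint), retrying on timeouts.
        for attempt in range(3):
            try:
                del_resp = requests.post(
                    f"{BASE_URL}/{collection}/bulk-delete",
                    json={"ids": ids},
                    timeout=300,
                )
                del_resp.raise_for_status()
                break
            except requests.RequestException:
                if attempt == 2:
                    raise
                time.sleep(10)  # back off before retrying

        print(f"Deleted {len(ids)} entities")

if __name__ == "__main__":
    delete_in_batches("entities")
```

It's still slow, but at least it keeps retrying on its own instead of me babysitting each 10,000-entity batch.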

EDIT: Temporary solution:
The AI Build Mode does a pretty good job of running multiple batch actions one after another. Works pretty well.

I hope this will be addressed next year to some degree.
