Unable to Delete Large Space Due to 'Too many open files' Error
Platform Notice: Data Center - This article applies to Atlassian products on the Data Center platform.
Note that this knowledge base article was created for the Data Center version of the product. Data Center knowledge base articles for non-Data Center-specific features may also work for Server versions of the product; however, they have not been tested. Support for Server* products ended on February 15, 2024. If you are running a Server product, you can visit the Atlassian Server end of support announcement to review your migration options.
*Except Fisheye and Crucible
Symptoms
Attempting to delete a large space fails, and a "Too many open files" error appears in the application log file.
Cause
If the "Too many open files" error is produced, the OS doesn't have enough open file handles to deal with the amount of files that Confluence needs to have open concurrently when deleting a space.
Workaround
- On Linux, raise the open file limit for the user Confluence runs as, for example ulimit -n 10000, and make sure the new limit is applied in the environment that starts Confluence (see the example after this list).
- Increase the total number of file descriptors in your OS (raising either the per-process limit or the system-wide limit can work). The way to increase file descriptors varies by operating system, so consult your OS documentation for detailed instructions.
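One possible way to make the higher limit persistent on Linux, assuming Confluence runs as a dedicated user named "confluence" (adjust the username for your environment), is to add the following entries to /etc/security/limits.conf and then restart Confluence from a fresh login session:

# /etc/security/limits.conf - username "confluence" is an assumption
confluence  soft  nofile  10000
confluence  hard  nofile  10000

Note that if Confluence is started by a systemd service, limits.conf is not applied to it; in that case, set LimitNOFILE=10000 in the service unit file instead and reload systemd before restarting the service.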
After increasing the file handle limit, you may find that Confluence hangs while trying to delete the space. In that case, you also need to increase the number of database pool connections. Refer to the Confluence documentation on configuring the database connection pool size for details.
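As a hedged pointer only: for a direct JDBC connection, the pool size is typically controlled by the hibernate.c3p0.max_size property in confluence.cfg.xml; the exact property and recommended value depend on your Confluence version and on whether you use a JNDI datasource, so verify against the documentation before changing it. For example, with Confluence stopped:

# Check the current pool size (<confluence-home> is a placeholder for your Confluence home directory)
grep "hibernate.c3p0.max_size" <confluence-home>/confluence.cfg.xml

# Example of the line to edit - increase the value, e.g. to 60:
#   <property name="hibernate.c3p0.max_size">60</property>

Restart Confluence after saving the change, and make sure your database server allows at least that many connections per node.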