Spark Error: Too many open files

This is a typical Spark error that happens on Ubuntu (and probably other Linux distributions too). To resolve it, one could do the following:

Edit the file /etc/security/limits.conf and add:

* soft nofile 55000
* hard nofile 55000

55000 is just the value I use as an example; you could choose a larger or smaller number. It means each process is allowed to have as many as 55000 files open at once.

After saving the changes, you will need to REBOOT to make them effective.
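
Once rebooted, you can verify that the new limit is actually in effect from inside a process. Below is a minimal sketch using Python's standard resource module (Linux/Unix only); the expected 55000 values assume the limits.conf change above:

import resource

# Read this process's file-descriptor limits.
# soft is the limit actually enforced; hard is the ceiling
# that soft can be raised to without extra privileges.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft nofile limit: {soft}")  # expect 55000 after the change
print(f"hard nofile limit: {hard}")  # expect 55000 after the change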

One note: I recommend not going crazy with this number. For example, I once set it to 1,000,000, and Spark generated so many temporary files that my hard disk had a very hard time deleting them all.
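
If you are unsure what limit is reasonable, you can watch how many files a process actually has open before raising it further. A quick sketch for Linux (it simply counts the entries in the /proc/self/fd directory, which lists this process's open file descriptors):

import os

# Count the file descriptors currently open in this process
# by listing /proc/self/fd (Linux-specific).
open_fds = len(os.listdir('/proc/self/fd'))
print(f"open file descriptors: {open_fds}")

The same idea applied to /proc/<pid>/fd of a Spark driver or executor process shows how close it gets to the nofile limit under load.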
