From: paul DOT english AT sg DOT adisys DOT com DOT au (Paul English)
Subject: "Too many open files", any way to achieve ulimit -n?
Date: 10 Jun 1998 05:18:01 -0700
Message-ID: <13693.53758.616000.258854.cygnus.gnu-win32@GUMBLE>
Reply-To: paul DOT english AT technologist DOT com
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
To: gnu-win32 AT cygnus DOT com

I have an issue with some perl scripts, and occasionally some link steps,
which terminate with a "Too many open files" message when running under
bash (bash 2.01.1(2) on Cygwin32 19.1 under Windows NT 4.0 SP3). This
only happens on a filesystem which is exported by Samba from a Solaris
server.

Our MIS department have suggested increasing the maximum number of file
descriptors which can be open, e.g. ulimit -n 256. Unfortunately
ulimit -n seems to be fixed at 32. Is there any way of increasing this
limit? I'd even consider building bash from source if necessary.

Note that the file descriptors in the perl script are explicitly closed
after use, but they appear to remain in use for some time after being
closed, allowing the number of open descriptors to mount up. A sleep 0.7
command immediately after each close stops the problem from occurring,
but slows the script significantly (sleep 0.5 is insufficient). Running
the same script on the same file tree works fine from a Solaris
workstation accessing the server via NFS.

Unfortunately there is nothing I can do about the links failing, at least
not while using make. Fortunately this is relatively rare, occurring only
when the network is slow and/or the server load is high.

Has anyone any ideas for a fix/workaround?

Thanks in advance,
Paul.

-- 
            , ,
 ("\''/").___..--''"`-._           Paul English
 `9_ 9  )   `-.  (     ).`-.__.')  paul DOT english AT technologist DOT com
 (_Y_.)'  ._   )  `._ `. ``-..-'
   _..`--'_..-_/  /--'_.' .'
  (il).-''  ((i).'  ((!.-'

-
For help on using this list (especially unsubscribing), send a message
to "gnu-win32-request AT cygnus DOT com" with one line of text: "help".
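
A quick way to pin down where the 32-descriptor ceiling actually lives is
to probe it from perl itself. The following is only a sketch, under
assumptions the post does not confirm: that Cygwin's perl provides a
working POSIX::sysconf and a /dev/null device.

    #!/usr/bin/perl -w
    # Sketch: open /dev/null until open() fails, then compare the count
    # (plus the three standard descriptors) with what sysconf() claims.
    use strict;
    use POSIX qw(sysconf _SC_OPEN_MAX);
    use FileHandle;

    my $claimed = sysconf(_SC_OPEN_MAX);
    printf "sysconf(_SC_OPEN_MAX): %s\n",
           defined $claimed ? $claimed : "unknown";

    my @handles;
    while (my $fh = FileHandle->new("< /dev/null")) {
        push @handles, $fh;
    }
    printf "opened %d extra handles before failure: %s\n",
           scalar @handles, $!;
    $_->close for @handles;

If the loop stops at around 29 extra handles no matter what ulimit -n
reports, the limit is presumably being imposed inside the Cygwin layer
rather than by bash, in which case rebuilding bash from source would be
unlikely to help.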
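
Until the limit itself can be raised, the flat 0.7-second sleep could be
confined to the case that actually fails: retry the open only when it
reports EMFILE ("Too many open files"). A sketch along those lines;
open_with_retry is a hypothetical helper, not anything from the original
script, and it assumes the Errno module is available:

    #!/usr/bin/perl -w
    # Sketch: back off and retry only when open() fails with EMFILE,
    # instead of sleeping unconditionally after every close.
    use strict;
    use FileHandle;
    use Errno qw(EMFILE);

    sub open_with_retry {
        my ($spec, $tries) = @_;
        $tries = 10 unless defined $tries;
        for my $attempt (1 .. $tries) {
            my $fh = FileHandle->new($spec);
            return $fh if defined $fh;
            die "open $spec: $!\n" unless $! == EMFILE;
            # sleep() takes whole seconds only; the 4-argument select()
            # is the usual idiom for a fractional pause.
            select(undef, undef, undef, 0.2 * $attempt);
        }
        die "open $spec: still EMFILE after $tries tries\n";
    }

    # Usage: drop in wherever the script currently does a bare open().
    my $fh = open_with_retry("< some-input.txt");   # hypothetical file
    print while <$fh>;
    $fh->close;

Because the pause only happens when the descriptor table is actually
exhausted, the common case runs at full speed, and the escalating
back-off covers the slow-network periods where a flat 0.5 seconds was
not enough.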