.net - Parallel Foreach Memory Issue


I have a file collection (3000 files) in fileInfoCollection. I want to process the files by applying some logic which is independent of each file (i.e. can be executed in parallel).

 FileInfo[] fileInfoCollection = directory.GetFiles();
 Parallel.ForEach(fileInfoCollection, ProcessWorkerItem);

But after processing around 700 files I get an out of memory error. I used a thread pool before and it gave the same error. If I execute without threading (parallel processing), it works fine.

in "processworkeritem" running algorithm based on string data of file. additionally use log4net logging , there lot of communications sql server in method.

Here is some more info: the files are 1-2 KB XML files. I read the files and the processing depends on the content of each file. It is identifying keywords in the string and generating another XML format. The keywords are stored in a SQL Server database (nearly 2000 words).
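A minimal sketch of what such a per-file method could look like (the method name ProcessWorkerItem comes from the question; the class name, keyword lookup, and XML output details are assumptions, not the asker's actual code). The idea is to load the ~2000 keywords from SQL Server once and reuse them across files, and to hold only one small file's content in memory at a time:

 using System.Collections.Generic;
 using System.IO;
 using System.Text;

 class KeywordProcessor
 {
     // Loaded once instead of per file (assumption: the ~2000 keywords fit easily in memory).
     static readonly HashSet<string> Keywords = LoadKeywordsFromDatabase();

     static HashSet<string> LoadKeywordsFromDatabase()
     {
         // Placeholder: in the real method this would query SQL Server.
         return new HashSet<string> { "example", "keyword" };
     }

     public static void ProcessWorkerItem(FileInfo file)
     {
         // Files are only 1-2 KB, so reading the whole file is fine;
         // the important thing is not to hold many of them at once.
         string content = File.ReadAllText(file.FullName);

         var output = new StringBuilder("<result>");
         foreach (string word in content.Split(' ', '\t', '\r', '\n'))
         {
             if (Keywords.Contains(word))
                 output.Append("<keyword>").Append(word).Append("</keyword>");
         }
         output.Append("</result>");

         // The generated XML would then be logged / written to SQL Server in the real method.
     }
 }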

Well, what does ProcessWorkerItem do? You may be able to change it to use less memory (e.g. stream the data instead of loading it all in at once), or you may want to explicitly limit the degree of parallelism using the overload that takes ParallelOptions and setting ParallelOptions.MaxDegreeOfParallelism. Basically you want to avoid trying to process all 3000 files at once :) IIRC, Parallel Extensions will "notice" if your tasks appear to be IO-bound, and allow more than the normal number to execute at once - which isn't what you want here, as you're memory-bound as well.
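As a rough illustration of that overload (the directory path and the limit of 4 are example values, not from the question; ProcessWorkerItem is a stub here):

 using System.IO;
 using System.Threading.Tasks;

 class Program
 {
     static void Main()
     {
         var directory = new DirectoryInfo(@"C:\data\xmlfiles");   // assumed path
         FileInfo[] fileInfoCollection = directory.GetFiles();

         var options = new ParallelOptions
         {
             // Cap how many files are processed concurrently; tune for your machine.
             MaxDegreeOfParallelism = 4
         };

         Parallel.ForEach(fileInfoCollection, options, ProcessWorkerItem);
     }

     static void ProcessWorkerItem(FileInfo file)
     {
         // Per-file work from the question goes here.
     }
 }

Capping MaxDegreeOfParallelism keeps the number of files in flight (and the per-file data held in memory) bounded, instead of letting the scheduler run extra tasks when the work looks IO-bound.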

