Fastest Way to Parse a Large File in Ruby
I have a simple text file that is ~150MB. My code reads each line, and if it matches certain regexes, the line gets written to an output file. Right now it takes a long time (several minutes) just to iterate through all of the lines of the file doing
File.open(filename).each do |line|
  # do stuff
end
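For concreteness, a minimal sketch of that workflow; the filenames and the regex below are placeholders, not values from the original post:

# Stream the input line by line and copy matching lines to the output file.
# "input.txt", "output.txt", and the pattern are illustrative placeholders.
pattern = /some regex/
File.open("output.txt", "w") do |out|
  File.foreach("input.txt") do |line|
    out.write(line) if line =~ pattern
  end
end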
I know that looping through the lines of the file is what's taking the time, because even if I do nothing with the data in "# do stuff", it still takes a long time.
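One quick way to check that the iteration itself, rather than the per-line work, dominates is to time an empty pass over the file against a regex pass with Ruby's standard Benchmark library; the filename and pattern below are placeholders:

require 'benchmark'

# Compare a pass that does nothing per line with one that applies the regex.
pattern = /some regex/
Benchmark.bm(12) do |x|
  x.report("empty loop") { File.foreach("input.txt") { |line| } }
  x.report("regex loop") { File.foreach("input.txt") { |line| line =~ pattern } }
end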
I know that some unix programs can parse large files like this almost instantly (like grep), so I'm wondering why Ruby (MRI 1.9) takes so long to read the file, and is there a way to make it faster?
File.readlines(filename).each do |line|
  # do stuff with each line
end
will read the whole file into one array of lines. It should be a lot faster, but it takes more memory.
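A sketch of that slurp-then-iterate variant, again with placeholder filenames and regex:

# Read every line of the file into memory at once, then filter.
# Memory use is roughly the size of the file (~150MB here).
pattern = /some regex/
lines = File.readlines("input.txt")
File.open("output.txt", "w") do |out|
  lines.each { |line| out.write(line) if line =~ pattern }
end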