[MacRuby-devel] [MacRuby] #578: Garbage Collector in 0.6

MacRuby ruby-noreply at macosforge.org
Thu Jan 28 00:00:29 PST 2010


#578: Garbage Collector in 0.6
---------------------------+------------------------------------------------
 Reporter:  jsn@…          |       Owner:  lsansonetti@…        
     Type:  defect         |      Status:  new                  
 Priority:  major          |   Milestone:  MacRuby 0.6          
Component:  MacRuby        |    Keywords:                       
---------------------------+------------------------------------------------
 Reporting this as a bug, since I'm fairly sure this is not how it's
 intended to work.

 Running the following code on a bunch of 600 MB - 2 GB files will
 (eventually) cause a SIGSEGV when memory allocation fails.

 {{{
    def get_digest(file)
      digest = Digest::MD5.new
      fil = File.open(file, 'r')
      while (l = fil.read(READ_BUFFER_SIZE)) != nil
        digest << l
      end
      fil.close
      digest.hexdigest
    end
 }}}

 If I run it on a smaller set of files (1.8 GB in total, to be exact), the
 memory allocated to the process jumps to 700-900 MB and just stays there.
 I let the process sit for 30 minutes without noticing any drop in
 allocated memory.

 I've tried various sizes for READ_BUFFER_SIZE, from 16 KB to 32 MB, and
 also tried File.read(); it all behaves the same way, although the higher
 the read buffer, the faster memory usage shoots up.
 Setting "l = nil" after every read operation and/or "digest = nil" before
 returning doesn't seem to make a difference.
 Using Digest::MD5.file(fname).hexdigest results in "writing to non-
 bytestrings is not supported at this time."

 I'm not sure if this is a "string cache" thing gone wrong, or if it's
 simply not collecting garbage.
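
 For reference, here is a minimal self-contained sketch of the same
 chunked-digest pattern under stock CRuby (not MacRuby), checking that it
 matches the stdlib one-liner Digest::MD5.file that fails above. The
 READ_BUFFER_SIZE value and the block-form File.open are my choices for
 the sketch, not part of the original report:

 {{{
 require 'digest'
 require 'tempfile'

 READ_BUFFER_SIZE = 16 * 1024  # 16 KB, one of the sizes tried above

 # Same algorithm as the reported code, but block-form File.open
 # closes the handle even if an exception is raised mid-read.
 def get_digest(file)
   digest = Digest::MD5.new
   File.open(file, 'rb') do |f|
     while (chunk = f.read(READ_BUFFER_SIZE))
       digest << chunk
     end
   end
   digest.hexdigest
 end

 Tempfile.create('md5demo') do |tf|
   tf.write('a' * 100_000)
   tf.flush
   # On CRuby both paths agree; on MacRuby 0.6 the .file form raises.
   puts get_digest(tf.path) == Digest::MD5.file(tf.path).hexdigest
 end
 }}}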

-- 
Ticket URL: <http://www.macruby.org/trac/ticket/578>
MacRuby <http://macruby.org/>
