Bazaar uses too much memory
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| Bazaar | New | Undecided | Unassigned | |
Bug Description
I created a repository using "bzr svn-import" and it did not create a working tree. So I went into the directory and did a "bzr co", which eventually died with a MemoryError. Looking in top, here is the line just after the error. The memory usage was already going down, so I'm guessing it topped out at about 2 GB, silly 32-bit OS.
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
8637 ted 20 0 2969m 1.8g 3636 D 1 90.4 1:32.65 bzr
I would like to commend Bazaar on its efficiency. Filling 2 GB of RAM with only 1.5 min of CPU time is very effective :)
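For reference, the reproduction was roughly the following sequence; the repository URL and branch directory shown here are just placeholders, not the actual ones used:

$ bzr svn-import http://svn.example.org/project project   # placeholder URL
$ cd project/trunk                                         # placeholder branch path
$ bzr co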
Here is the backtrace if required:
$ bzr co
bzr: ERROR: exceptions.MemoryError
Traceback (most recent call last):
File "/usr/lib/
return run_bzr(argv)
File "/usr/lib/
ret = run(*run_argv)
File "/usr/lib/
return self.run(
File "/usr/lib/
source.
File "/usr/lib/
accelerator
File "/usr/lib/
hardlink=
File "/usr/lib/
delta_
File "/usr/lib/
accelerator
File "/usr/lib/
new_
File "/usr/lib/
for result in self._repositor
File "/usr/lib/
for record in self.texts.
File "/usr/lib/
needed_
File "/usr/lib/
record_map = self._get_
File "/usr/lib/
self.
File "/usr/lib/
izip(
File "/usr/lib/
for names, read_func in reader.
File "/usr/lib/
for record in self._iter_
File "/usr/lib/
record_kind = self.reader_func(1)
File "/usr/lib/
return self._source.
File "/usr/lib/
self._next()
File "/usr/lib/
length, data = self.readv_
File "/usr/lib/
data = fp.read(
MemoryError
bzr 1.6.1 on python 2.5.2 (linux2)
arguments: ['/usr/bin/bzr', 'co']
encoding: 'UTF-8', fsenc: 'UTF-8', lang: 'en_US.UTF-8'
plugins:
bzr_notification /home/ted/
gtk /usr/lib/
launchpad /usr/lib/
power_management /home/ted/
pqm /usr/lib/
rebase /usr/lib/
search /home/ted/
svn /usr/lib/
*** Bazaar has encountered an internal error.
Please report a bug at https:/
including this traceback, and a description of what you
were doing when the error occurred.
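The traceback bottoms out in a plain fp.read() of the requested byte range. On a 32-bit process the usable address space is roughly 2-3 GB, so a single read of a multi-gigabyte region can raise MemoryError even when the machine still has memory free. A minimal sketch of the difference (illustrative only, not bzrlib's actual code; the function names and chunk size are made up):

def read_all_at_once(fp, length):
    # One allocation for the whole range: on 32-bit Python this is what
    # fails with MemoryError once 'length' approaches the address-space limit.
    return fp.read(length)

def read_in_chunks(fp, length, chunk_size=4 * 1024 * 1024):
    # Yield fixed-size pieces so peak memory stays bounded regardless of 'length'.
    remaining = length
    while remaining > 0:
        chunk = fp.read(min(chunk_size, remaining))
        if not chunk:
            break
        remaining -= len(chunk)
        yield chunk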
What sort of size was the tree that had to be created? Thousands of files, many gigabytes of data?