Comment 7 for bug 125067

John A Meinel (jameinel) wrote:

Well, even the fastest format will still take a while to download 60 MB of historical data (or, for something like the Mozilla tree, 2 GB).

This may be something we actually want to think about. It is nice to be able to download information incrementally, so that the whole download is not wasted if your connection times out.

We could do it in a variety of ways, such as downloading packs 100 revisions at a time, or even 1000. Then, as long as you have completed at least one chunk, you are better off than starting from scratch. (You could even update the branch pointer at that point, so it stays reasonably valid.)
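To make that concrete, here is a minimal sketch of what I mean. The names (fetch_revisions, commit_pack, set_last_revision) are hypothetical, not the real bzrlib API, and it assumes the missing revision ids arrive in topological order, oldest first, so that every completed prefix is a self-consistent history:

    CHUNK_SIZE = 100  # revisions per atomic chunk; could be 1000

    def chunked_fetch(source, target, missing_revision_ids):
        """Fetch history in chunks so an interrupted transfer keeps
        everything up to the last completed chunk."""
        for start in range(0, len(missing_revision_ids), CHUNK_SIZE):
            chunk = missing_revision_ids[start:start + CHUNK_SIZE]
            pack = source.fetch_revisions(chunk)  # the network transfer
            target.commit_pack(pack)              # one atomic local write
            # Advance the branch pointer now, so if the connection dies
            # on the next chunk the branch still points at valid history.
            target.set_last_revision(chunk[-1])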

I know the basic plan is to have pack files that build up over time, and possibly to collapse them into larger packs. I think it could be better to also cap how many revisions we put between each atomic operation.
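On the write side that cap might look something like the following, again a rough sketch assuming the same hypothetical commit_pack helper rather than any real bzrlib interface:

    MAX_REVISIONS_PER_PACK = 1000  # cap between atomic operations

    class CappedPackWriter:
        """Accumulates revisions and writes a pack atomically whenever
        the cap is hit, so an interrupted operation loses at most one
        partial pack's worth of work."""

        def __init__(self, repo):
            self.repo = repo
            self.pending = []

        def add_revision(self, rev):
            self.pending.append(rev)
            if len(self.pending) >= MAX_REVISIONS_PER_PACK:
                self.flush()

        def flush(self):
            if self.pending:
                # commit_pack is assumed to be atomic (e.g. write to a
                # temp file, then rename into place).
                self.repo.commit_pack(self.pending)
                self.pending = []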