I have a rather large project (13 GB) and a poor internet connection. I brought a copy of my working directory home on a flash drive - but without the .svn directory which would have doubled the size.
I was hoping to copy the flash drive contents and then turn it into a proper working directory. Essentially a checkout, but with the 13 GB of files provided by my flash drive.
This clever idea doesn't work, for various reasons: I get either "E155017: Checksum mismatch" or "An obstructing working copy was found".
With most of these ideas/experiments, SVN tries to download the whole thing over my connection again (possibly because it can't compute checksums to compare local files with the remote?).
Sorry for the long explanation, but I think it's necessary for the whole picture.
Has anybody else tried to bring copied local files into sync with an existing SVN repository?
The .svn directory contains a "clean" copy of each of the versioned source files, which is what makes it so big. It also contains metadata, though, which would be very difficult to reconstruct. In contrast, the working files can easily be reconstructed from the clean copies in .svn. So an alternative approach would be to copy only the .svn directory, and then run "svn revert -R ." to restore the working files with no further network traffic.
A working copy without the .svn directory is, unfortunately, not a working copy at all. You might be able to avoid copying the text bases (or "pristines", as they're called starting in Subversion 1.7), but you can't just disregard the .svn directory altogether.
Some possible remedies:
You might want to try using a DVCS as a mechanism for having a disconnected working copy, because distributed systems are somewhat more resilient to copying; git-svn comes to mind.
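A rough sketch of the git-svn route (the repository URL is a placeholder; run the initial clone on the good connection, since it downloads the history once):

```shell
# One-time, on the fast connection: mirror the SVN repository into git.
git svn clone http://svn.example.com/repo/trunk project-git

# The entire history now lives inside project-git/.git, so copying the
# whole directory (flash drive, tar, whatever) yields a fully working
# repository -- no server contact needed to browse, diff, or commit locally.

# Later, from any connection, sync with the SVN server:
cd project-git
git svn rebase      # fetch new SVN revisions and replay local work on top
git svn dcommit     # push local git commits back to SVN
```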
You could also just tar up the working copy to a file on your flash drive, then untar it to a disk at home.
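A minimal sketch of the tar approach (paths are examples); the key point is that the archive includes .svn, so the metadata survives the trip intact:

```shell
# At work: archive the working copy, .svn directory included.
tar -czf /media/flashdrive/project-wc.tar.gz -C /path/to project

# At home: extract it; the result is a complete, valid working copy.
tar -xzf /media/flashdrive/project-wc.tar.gz -C ~/projects
```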