A simple little trick I learned to do exactly what you are doing.
Use the tar command:
# tar cvf - /home/xxx/public_html | (cd /opt1/backup ; tar xpf -)
tar cvf - creates (c) the archive, with a verbose (v) listing showing you what is actually being copied from your public_html dir; the "f" names the archive file, which in this case is "-", meaning standard output. The | pipes that stream into the second half of the command, a subshell that changes into the destination directory and runs "tar xpf -". There, tar xpf - extracts (x) the files, preserving (p) permissions, and reads from the file "-", which on this side means standard input.
One note: no file named "-" is ever actually created on disk; the dash just tells tar to write to standard output on one side and read from standard input on the other, so the data streams straight through the pipe. If you also want an archive file you can keep as a backup, give tar a real filename instead of "-".
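If you do want a real archive file to hang on to, the same idea works with a named file instead of the pipe. Here is a small self-contained sketch using throwaway temp directories standing in for your real source and destination (substitute your own paths):

```shell
# Throwaway directories standing in for /home/xxx and /opt1/backup
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/public_html"
echo "hello" > "$src/public_html/index.html"

# Same archive-and-extract in one pipe as above
(cd "$src" && tar cf - public_html) | (cd "$dst" && tar xpf -)

# Verify the two trees really match
diff -r "$src/public_html" "$dst/public_html" && echo "copies match"
```

Running diff -r afterwards is a cheap sanity check that the copy is complete before you delete anything.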
You might also try doing this after a fresh boot; it always seems to help for something intensive like that. Really, though, 10 gigs may sound like a lot, but on a relatively new machine any method you use shouldn't take that long or be that heavy on your system, especially if you aren't trying to do several other things at once. It should only take a few minutes at most.
Good luck. Hope that helps.
Check the syntax to make sure it suits what you are trying to do (especially folder names).
One other thing you might consider: drop down to init 1 and do it without KDE or GNOME running. That will free up a lot of resources and should make this fly.