The best tool for creating a local copy of all or part of a website is wget. It is command-line based, so there is a learning curve before it does exactly what you want ("wget -r -np http://somesite.com/somepage/" is a good starting point), but once you learn it, it can handle just about anything in this area.
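For example, if the goal is a copy you can browse offline, a combination like this is a common starting point (the URL is just a placeholder):

    wget -r -np -k -p http://somesite.com/somepage/

Here -r recurses through links, -np keeps wget from wandering above the starting directory, -k converts links in the downloaded pages so they work locally, and -p pulls in the page requisites (images, stylesheets) each page needs.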
I use this Windows build. It is not the latest version, but it works just fine. If you use this package, you will want to edit the PATH environment variable and add ";%PROGRAMFILES%\GnuWin32\bin" at the end (";%PROGRAMFILES(X86)%\GnuWin32\bin" on 64-bit Windows) so you can run wget from any directory.
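If you prefer a Command Prompt over the System Properties dialog, setx can make that change persistent. One caveat: %PATH% in this command expands to the combined user and machine PATH, which setx then writes back as your user PATH, so the GUI editor is the tidier option.

    setx PATH "%PATH%;%PROGRAMFILES%\GnuWin32\bin"

Either way, open a new Command Prompt afterward and run "wget --version" to confirm the shell can find it.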
Other than that, the DownThemAll Firefox extension works well for simpler tasks. It won't recursively mirror a site, but it is fine for grabbing all the images on a page or all the linked files of a certain type.
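That said, the "all linked files of a certain type" job is a one-liner in wget too, using an accept list (the extensions here are just an example):

    wget -r -l 1 -np -A jpg,png,pdf http://somesite.com/somepage/

The -l 1 option limits recursion to links on the starting page, and -A keeps only files whose names match the listed suffixes.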