I need a program that I can give a list of URLs (either pasted in or read from a file) like the one below, and that will then crawl those links and save files of a certain type, such as images. I have tried a few spiders but not had any luck.

Currently, the only way I can download everything is to open each link and use the "DownThemAll!" Firefox plugin, which selects all the images (or any file type) on the page and downloads them. This works page by page, but I need something similar that works on a whole list of URLs.

Does anyone have any suggestions?
Thanks a lot.

PS: Ideally it would be something fairly easy to use, with a half-decent user interface, that doesn't run from the command line. Thanks.

    http://www.someUrl.com/fhg/xbox/1/index.php?ccbill=123456
    http://www.someUrl.com/fhg/allison_gf/1/index.php?ccbill=123456
    http://www.someUrl.com/fhg/cookin/1/index.php?ccbill=123456
    http://www.someUrl.com/fhg/blackjacket/1/index.php?ccbill=123456
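In case a small script turns out to be acceptable despite the GUI preference, here is a minimal sketch of the task using only the Python standard library: fetch each page from a list of URLs, collect any `<img>` sources or `<a>` links that point at image files, and download them into a folder. The file-extension list and the `downloads` folder name are assumptions, not anything from the original post.

```python
import os
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen, urlretrieve

# Assumed set of image extensions to grab; adjust for other file types.
IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif")

class ImageLinkParser(HTMLParser):
    """Collects absolute URLs of <img> sources and <a> targets that end in an image extension."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img":
            target = attrs.get("src")
        elif tag == "a":
            target = attrs.get("href")
        else:
            target = None
        if target and target.lower().endswith(IMAGE_EXTENSIONS):
            # Resolve relative links against the page URL.
            self.links.append(urljoin(self.base_url, target))

def extract_image_links(html, base_url):
    """Return the absolute image URLs found in one page's HTML."""
    parser = ImageLinkParser(base_url)
    parser.feed(html)
    return parser.links

def download_images(page_urls, dest="downloads"):
    """Fetch each page in the list and save every linked image into `dest`."""
    os.makedirs(dest, exist_ok=True)
    for page_url in page_urls:
        html = urlopen(page_url).read().decode("utf-8", errors="replace")
        for img_url in extract_image_links(html, page_url):
            # Strip the query string before deriving a local filename.
            filename = os.path.basename(img_url.split("?")[0])
            urlretrieve(img_url, os.path.join(dest, filename))
```

Usage would be something like `download_images(open("urls.txt").read().split())`. It is only a sketch: it has no retry logic, no duplicate-name handling, and no recursion beyond the pages you list.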