download - What is the curl equivalent to wget -x?

2014-07-08
  • Alex I

    I want to use curl to save multiple files with the same local paths that wget -x would produce (i.e., cleaned-up versions of the full URLs), but I can't find the right option in the curl docs. Is this possible?

    The curl docs say of --remote-name: "Only the file part of the remote file is used, the path is cut off."
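    To illustrate the limitation (the URL here is hypothetical, just showing which part of it -O keeps):

```shell
# What --remote-name / -O does to the saved filename:
# it keeps only the last path component and cuts off the rest.
url="https://example.com/path/to/file.html"
# curl -O "$url"     # would save as ./file.html, not ./path/to/file.html
echo "${url##*/}"    # the part -O keeps: file.html
```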

  • Answers
  • Renju Chandran chingath

    curl --create-dirs

    The man page says:

    When used in conjunction with the -o option, curl will create the necessary local directory hierarchy as needed. This option creates the dirs mentioned with the -o option, nothing else. If the -o file name uses no dir or if the dirs it mentions already exist, no dir will be created.
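    Putting that together, here is a sketch of how --create-dirs plus -o can approximate wget -x's local layout. The URL and the scheme-stripping step are my assumptions, not from the man page; wget -x also does extra cleanup (e.g. of query strings) that this sketch skips:

```shell
# Sketch: approximate wget -x's local directory layout with curl.
# The URL is hypothetical. ${url#*://} strips the scheme, leaving
# host/path, which is roughly what wget -x uses as the local path.
url="https://example.com/path/to/file.html"
path="${url#*://}"                    # example.com/path/to/file.html
curl --create-dirs -o "$path" "$url"  # creates the directories as needed
```

    For a batch of URLs you would loop over them and repeat the same derivation; URLs containing query strings or characters invalid in filenames would need additional cleanup.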


  • Related Question

    linux - Download Speed cURL
  • Ibn Ar-Rashid

    I am having a bit of a problem using cURL to download over FTP; it seems to be rather slow, if it's working at all. I usually use wget, and one file which I downloaded earlier with wget started and finished in under 10 minutes. The size was about 200 MB, a Linux distribution. I tried it later with cURL, first time using it, and all I did was type the FTP address after the curl command as I would with wget. I started seeing the source code and everything, but it's been about an hour and it still hasn't finished. Is this normal? My connection seems fine, as fast as it's supposed to be. I would appreciate it if someone who uses cURL or knows about it extensively could explain the matter.

    -

    Currently using: Ubuntu 9.10/Windows 7, Crunchbang 9.04/Windows XP.


  • Related Answers
  • quack quixote

    wget saves to a file by default; cURL by default writes to STDOUT (meaning your screen). You need to tell it to write to a file instead, either with the -o (aka --output) switch or with shell redirection:

    # the -o switch to the curl command
    curl ftp://someserver.com/path/to/file -o output.filename.here
    
    # shell redirection
    curl ftp://someserver.com/path/to/file > output.filename.here
    

    The slowness is probably due more to the output going to the terminal screen than to your network speed; once you redirect the output to a file, you should see download speeds roughly equivalent to what you got with wget.

  • John T

    The reason you're seeing source code is probably because you haven't specified an output file as ~quack has mentioned.

    I've never had speed issues with cURL over FTP. You may also have picked a bad server in terms of distance and speed. Sometimes a site provides a single download link that actually picks a mirror dynamically when you access it; you may have gotten a different, slower server the second time around.