Some Advanced Uses of cURL
Posted By : Shiv Pujan Maurya | 25-Feb-2018
Let’s discuss curl first.
DESCRIPTION (taken from the curl man page):
curl is a tool to transfer data from or to a server, using one of the supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP). The command is designed to work without user interaction.
Let’s go through some examples to understand the options and use them effectively.
curl -o : This saves the output to the filename you specify
curl -o myfile http://myfile.com/data.txt
curl -O : This saves the output to a file named after the one in the URL
curl -O https://mytextfile.com/f1.txt
curl -L : This is a powerful option: if you hit a URL and the response says "moved permanently", -L makes curl follow the new location and download the requested web page.
curl http://www.amazon.com # the response says 301 Moved Permanently
curl -L http://www.amazon.com # the redirect is followed and the web page is downloaded
curl -C : This option resumes an interrupted download
curl -O http://releases.ubuntu.com/16.04/ubuntu-16.04.3-desktop-amd64.iso # press Ctrl + C
Now resume it ('-C -' tells curl to work out the resume offset from the already-downloaded part of the file):
curl -C - -O http://releases.ubuntu.com/16.04/ubuntu-16.04.3-desktop-amd64.iso
curl --limit-rate : if a download is eating your full bandwidth, cap the transfer rate. The value is in bytes/second unless suffixed with K, M, or G.
curl --limit-rate 500K -O http://releases.ubuntu.com/16.04/ubuntu-16.04.3-desktop-amd64.iso
curl -z : This option is very useful when you poll a resource repeatedly and want to download it only when it has been modified. It works for both HTTP and FTP.
curl -z 25-Feb-18 -O http://releases.ubuntu.com/16.04/ubuntu-16.04.3-desktop-amd64.iso
# the file is downloaded only if it was modified after the given date
curl -z -25-Feb-18 -O http://releases.ubuntu.com/16.04/ubuntu-16.04.3-desktop-amd64.iso
# the file is downloaded only if it was modified before the given date; note the leading (-) sign used for "before"
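A variant worth knowing: -z also accepts a filename, in which case curl uses that file's modification time as the reference date. Below is a self-contained sketch against a local Python test server; the directory, port, and file names are invented for the demo:

```shell
# Serve a directory over HTTP (Python 3.7+ http.server answers
# If-Modified-Since requests, which is what -z relies on).
mkdir -p /tmp/zdemo && cd /tmp/zdemo
echo "remote content" > data.txt
python3 -m http.server 8031 >/dev/null 2>&1 &
SRV=$!
sleep 1

# ref.txt is given a mtime far older than data.txt, so curl downloads.
touch -t 200001010000 ref.txt
curl -s -z ref.txt -o fetched.txt http://127.0.0.1:8031/data.txt
cat fetched.txt    # remote content

kill $SRV
```

Had ref.txt been newer than the served file, the server would have answered 304 Not Modified and curl would have skipped the download.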
curl -u : This option supplies a username and password when the server requires authentication before the download
curl -u mysiteuser:password http://testsite.com
curl -v : This option gives verbose output of the process
curl -v http://www.amazon.com
* Rebuilt URL to: http://www.amazon.com/
* Trying 52.222.186.8...
* TCP_NODELAY set
* Connected to www.amazon.com (52.222.186.8) port 80 (#0)
> GET / HTTP/1.1
> Host: www.amazon.com
> User-Agent: curl/7.55.1
> Accept: */*
>
< HTTP/1.1 301 Moved Permanently
< Server: CloudFront
< Date: Sun, 25 Feb 2018 13:56:06 GMT
< Content-Type: text/html
< Content-Length: 183
< Connection: keep-alive
< Location: https://www.amazon.com/
< X-Cache: Redirect from cloudfront
* Connection #0 to host www.amazon.com left intact
curl --trace <file name> : If you ever need to save a full trace of everything curl sends and receives, use this option
curl --trace log.text http://www.google.com # the full trace is saved to log.text
# cat log.text
curl --interface : This option can be used when you want your request to go out through a specific network interface.
curl -v --interface wlan0 https://www.github.com
curl --max-redirs <num> : This option controls how many redirections -L will follow; the default limit is 50. To make it unlimited, use -1.
curl -L --max-redirs 2 http://www.google.com
curl -R : This option makes curl fetch the timestamp of the remote file (when available) and give the downloaded file the same timestamp.
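Since -R has no example above, here is a self-contained sketch using a local Python test server (the directory, port, and file names are invented for the demo); any server that sends a Last-Modified header behaves the same way:

```shell
# Give the "remote" file a mtime in the past, then download it with -R.
mkdir -p /tmp/rdemo && cd /tmp/rdemo
echo "hello" > source.txt
touch -t 201501011200 source.txt
python3 -m http.server 8032 >/dev/null 2>&1 &
SRV=$!
sleep 1

curl -s -R -o copy.txt http://127.0.0.1:8032/source.txt
ls -l copy.txt     # the mtime shows Jan 2015, not the time of the download

kill $SRV
```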
curl -Y, --speed-limit <speed> : This option aborts a download if the transfer rate drops below the specified limit. It is measured in bytes/second and the default is 30.
curl -y, --speed-time <seconds> : This option sets how long the transfer rate may stay below the -Y limit before the download is aborted; the default is 30 seconds.
curl -Y 50000 -y 150 -O http://releases.ubuntu.com/16.04/ubuntu-16.04.3-desktop-amd64.iso
curl -2 (force SSLv2), -3 (force SSLv3) : These options help when a specific SSL version is failing. Modern servers reject these obsolete protocols, as the output below shows.
curl -v -3 https://github.com
* Rebuilt URL to: https://github.com/
* Trying 192.30.253.112...
* TCP_NODELAY set
* Connected to github.com (192.30.253.112) port 443 (#0)
* ALPN, offering http/1.1
* Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
* successfully set certificate verify locations:
* CAfile: /home/shivpujan/anaconda3/ssl/cacert.pem
CApath: none
* SSLv3 (OUT), TLS handshake, Client hello (1):
* SSLv3 (IN), TLS alert, Server hello (2):
* error:14094410:SSL routines:ssl3_read_bytes:sslv3 alert handshake failure
* Closing connection 0
curl: (35) error:14094410:SSL routines:ssl3_read_bytes:sslv3 alert handshake failure
curl --tls-max <version> : Limits the highest TLS version curl will negotiate.
curl -v --tls-max 1.3 https://github.com
curl -v --trace-time : This option adds a timestamp to each trace/verbose line
curl -v --trace-time http://github.com
23:56:50.746965 * Rebuilt URL to: http://github.com/
23:56:50.759466 * Trying 192.30.253.113...
23:56:50.759590 * TCP_NODELAY set
23:56:51.571973 * Connected to github.com (192.30.253.113) port 80 (#0)
23:56:51.572330 > GET / HTTP/1.1
23:56:51.572330 > Host: github.com
23:56:51.572330 > User-Agent: curl/7.55.1
23:56:51.572330 > Accept: */*
23:56:51.572330 >
23:56:52.390564 < HTTP/1.1 301 Moved Permanently
23:56:52.390996 < Content-length: 0
23:56:52.391149 < Location: https://github.com/
23:56:52.391303 <
23:56:52.391473 * Connection #0 to host github.com left intact
EXIT CODES
Last but not least, some exit codes that help when things go wrong.
Exit code 1: unsupported protocol.
Exit code 2: failed to initialize.
Exit code 3: URL malformed (incorrect syntax).
Exit code 5: couldn't resolve the proxy host.
Exit code 6: couldn't resolve the remote host.
Exit code 7: failed to connect to the host.
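The exit code lands in the shell's $?, so scripts can branch on it. A minimal sketch; the foo:// URL is deliberately bogus so the failure happens locally, without any network:

```shell
# foo:// is not a supported protocol, so curl fails with exit code 1.
curl -s -o /dev/null foo://bar
status=$?
case $status in
  0) msg="download ok" ;;
  1) msg="protocol not supported" ;;
  6) msg="could not resolve host" ;;
  7) msg="connection failed" ;;
  *) msg="curl failed with exit code $status" ;;
esac
echo "$msg"    # protocol not supported
```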
Thank you for your time.
About Author
Shiv Pujan Maurya
Shiv is a Red Hat certified engineer with a B.Tech in Computer Science & Engineering. He is passionate about new technologies, especially DevOps and Artificial Intelligence.