Command: curl
In Linux, curl is a command-line tool for transferring files using URL syntax. It is a very powerful HTTP command-line tool that supports both file upload and download; although it is a comprehensive transfer tool, it is traditionally thought of as a URL download tool.
Syntax:# curl [option] [url]
Common parameters:
-A/--user-agent <string>    set the User-Agent string sent to the server
-b/--cookie <name=string/file>    cookie string, or file to read cookies from
-c/--cookie-jar <file>    write cookies to this file after the operation
-C/--continue-at <offset>    resume a transfer at the given byte offset
-D/--dump-header <file>    write the response headers to this file
-e/--referer    source URL (the Referer header)
-f/--fail    fail silently on HTTP errors (no error page output)
-o/--output    write output to this file
-O/--remote-name    write output to a file named after the remote file
-r/--range <range>    retrieve only a byte range from an HTTP/1.1 or FTP server
-s/--silent    silent mode; don't output anything
-T/--upload-file <file>    upload a file
-u/--user <user[:password]>    set the server user and password
-w/--write-out [format]    what to output after completion
-x/--proxy <host[:port]>    use an HTTP proxy on the given host and port
-#/--progress-bar    display a progress bar for the current transfer
example:
1. Basic usage
# curl http://www.linux.com
After execution, the HTML of www.linux.com is displayed on the screen
PS: servers are often installed without a desktop and therefore without a browser, so this method is frequently used to test whether a server can reach a website
2. Save visited web pages
2.1: save using the redirection function of linux
# curl http://www.linux.com >> linux.html
2.2: use curl's built-in option -o (lowercase) to save the page
# curl -o linux.html http://www.linux.com
After execution, output like the following is displayed. If it reaches 100%, the save succeeded:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k
2.3: use curl's built-in option -O (uppercase) to save a file from a web page
Note that the URL here must point to a specific file, otherwise nothing can be fetched
# curl -O http://www.linux.com/hello.sh
3. Test web page return value
# curl -o /dev/null -s -w %{http_code} www.linux.com
PS: this is very common in scripts, to test whether a site is working properly
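Building on the command above, a script can branch on the returned status code. A minimal sketch, assuming bash; the helper names are hypothetical, and only the curl line itself comes from the example above:

```shell
#!/usr/bin/env bash
# classify_code: map an HTTP status code to a short verdict (helper name is hypothetical)
classify_code() {
    case "$1" in
        2??) echo "up" ;;
        3??) echo "redirect" ;;
        *)   echo "down ($1)" ;;
    esac
}

# check_site: fetch the status code exactly as in the example above, then classify it
check_site() {
    local code
    code=$(curl -o /dev/null -s -w '%{http_code}' "$1")
    classify_code "$code"
}

# usage (requires network): check_site http://www.linux.com
```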
4. Specify the proxy server and its port
Often you need a proxy server to get online (for example, when browsing through a proxy, or when a site has blocked your IP because you accessed it with curl). Fortunately, curl supports proxies via the built-in option -x
# curl -x 192.168.100.100:1080 http://www.linux.com
5. Cookies
Some websites use cookies to record session information. Browsers such as Chrome handle cookie information easily, but curl can handle cookies just as easily by adding the relevant options
5.1: save the cookies from the http response. Built-in option: -c (lowercase)
# curl -c cookiec.txt http://www.linux.com
After execution, the cookie information is saved in cookiec.txt
5.2: save the header information in the http response. Built in option: -D
# curl -D cookied.txt http://www.linux.com
After execution, the header information (including cookies) is saved in cookied.txt
Note that the cookies produced by -c (lowercase) differ in format from those in the -D output.
5.3: using cookies
Many websites monitor your cookies to determine whether you are visiting them according to their rules, so we need to send back the saved cookie information. Built-in option: -b
# curl -b cookiec.txt http://www.linux.com
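Steps 5.1 and 5.3 are typically combined: capture the session once with -c, then replay it with -b. A sketch that only assembles the flags so they can be inspected before running; the helper name and file names are hypothetical:

```shell
#!/usr/bin/env bash
# session_args: build the curl flags for reusing a saved cookie jar
# (-b reads the cookies back; -c keeps the jar updated if the server rotates them)
session_args() {
    local jar=$1 url=$2
    printf '%s\n' -b "$jar" -c "$jar" "$url"
}

# usage (requires network):
#   curl -c cookiec.txt http://www.linux.com                 # capture the session
#   curl $(session_args cookiec.txt http://www.linux.com)    # replay it
```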
6. Mimic browser
Some websites require a specific browser, or even a specific browser version, for access. curl's built-in option -A lets us specify a browser (User-Agent) with which to visit the site
# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com
This way, the server will believe it is being visited by IE 8.0
7. Faking the Referer (hotlinking)
Many servers check the Referer of HTTP requests to control access. For example: you first visit the home page and then open the mailbox page from it, so the Referer of the mailbox request is the home page's address. If the server finds that the Referer of a mailbox request is not the home page address, it concludes the request is hotlinked
curl's built-in option -e lets us set the Referer
# curl -e "www.linux.com" http://mail.linux.com
This will make the server think you have clicked a link from www.linux.com
8. Download File
8.1: download files using curl.
Use built-in option: -o (lowercase)
# curl -o dodo1.jpg http://www.linux.com/dodo1.JPG
Use built-in option: - O (uppercase)
# curl -O http://www.linux.com/dodo1.JPG
This saves the file locally with the name on the server
8.2: Circular Download
Sometimes the leading part of the downloaded file names is the same and only the trailing part differs
# curl -O http://www.linux.com/dodo[1-5].JPG
In this way, all dodo1, dodo2, dodo3, dodo4 and dodo5 will be saved
8.3: Download rename
# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG
Because the files under both hello and bb are named dodo1 through dodo5, the second download would overwrite the first, so the files need to be renamed.
# curl -o #1_#2.JPG http://www.linux.com/{hello,bb}/dodo[1-5].JPG
This way, the file downloaded from hello/dodo1.JPG becomes hello_dodo1.JPG, and so on, which effectively prevents files from being overwritten
8.4: block Download
Sometimes downloads are large; in that case we can download them in segments. Use the built-in option: -r
# curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 101-200 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 201- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG
This reassembles dodo1.JPG so its contents can be viewed normally
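The hard-coded ranges above generalize: given the total size (for example, from the Content-Length header fetched with -I), you can compute the -r ranges for any number of segments. A bash sketch; the function name is hypothetical:

```shell
#!/usr/bin/env bash
# ranges: print non-overlapping byte ranges that split SIZE bytes into PARTS
# segments, each suitable for curl -r (function name is hypothetical)
ranges() {
    local size=$1 parts=$2 step i start end
    step=$(( (size + parts - 1) / parts ))   # ceiling division
    for (( i = 0; i < parts; i++ )); do
        start=$(( i * step ))
        end=$(( start + step - 1 ))
        (( end >= size )) && end=$(( size - 1 ))   # clamp the last segment
        echo "${start}-${end}"
    done
}

# usage (requires network), e.g. for a 300-byte file in 3 parts:
#   for r in $(ranges 300 3); do curl -r "$r" -o "part_$r" http://www.linux.com/dodo1.JPG; done
#   cat part_* > dodo1.JPG
```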
8.5: download files through ftp
curl can download files over FTP, and it provides two syntaxes for doing so
# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:password@www.linux.com/dodo1.JPG
8.6: display download progress bar
# curl -# -O http://www.linux.com/dodo1.JPG
8.7: do not display download progress information
# curl -s -O http://www.linux.com/dodo1.JPG
9. Breakpoint continuation
On Windows we can use software like Xunlei to resume interrupted downloads. curl achieves the same through the built-in option -C (use "-C -" to let curl work out the resume offset by itself)
If the connection drops while downloading dodo1.JPG, you can resume the transfer as follows
# curl -C - -O http://www.linux.com/dodo1.JPG
10. Upload file
curl can not only download files but also upload them. This is done through the built-in option -T
# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/
This uploads the file dodo1.JPG to the FTP server
11. Show fetch errors
# curl -f http://www.linux.com/error
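With -f, an HTTP error (status 400 or above) surfaces as curl exit code 22, which scripts can branch on. A minimal sketch, assuming bash; the helper name is hypothetical, while the exit-code meanings are curl's documented ones:

```shell
#!/usr/bin/env bash
# explain_exit: turn a curl exit status into a short message
# (helper name is hypothetical; 0 = success, 22 = HTTP error when -f is used)
explain_exit() {
    case "$1" in
        0)  echo "ok" ;;
        22) echo "server returned an HTTP error (>= 400)" ;;
        *)  echo "curl failed with exit code $1" ;;
    esac
}

# usage (requires network):
#   curl -f -s -o /dev/null http://www.linux.com/error
#   explain_exit $?
```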
Other parameters:
-a/--append    append to the target file when uploading
--anyauth    allow any authentication method
--basic    use HTTP Basic authentication
-B/--use-ascii    use ASCII/text transfer
-d/--data <data>    send data with HTTP POST
--data-ascii <data>    POST data as ASCII
--data-binary <data>    POST data as binary
--negotiate    use HTTP Negotiate authentication
--digest    use HTTP Digest authentication
--disable-eprt    disable use of EPRT or LPRT
--disable-epsv    disable use of EPSV
--egd-file <file>    EGD socket path for random data (SSL)
--tcp-nodelay    use the TCP_NODELAY option
-E/--cert <cert[:passwd]>    client certificate file and password (SSL)
--cert-type <type>    certificate file type (DER/PEM/ENG) (SSL)
--key <key>    private key file name (SSL)
--key-type <type>    private key file type (DER/PEM/ENG) (SSL)
--pass <pass>    private key password (SSL)
--engine <eng>    crypto engine to use (SSL); "--engine list" for a list
--cacert <file>    CA certificate (SSL)
--capath <directory>    CA directory (made using c_rehash) to verify peer against (SSL)
--ciphers <list>    SSL ciphers
--compressed    request a compressed response (using deflate or gzip)
--connect-timeout <seconds>    maximum time allowed for the connection
--create-dirs    create the local directory hierarchy
--crlf    convert LF to CRLF on upload
--ftp-create-dirs    create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd]    control the use of CWD
--ftp-pasv    use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip    when using PASV, ignore the IP address it returns
--ftp-ssl    attempt SSL/TLS for the FTP transfer
--ftp-ssl-reqd    require SSL/TLS for the FTP transfer
-F/--form <name=content>    simulate an HTTP form submission
--form-string <name=string>    simulate an HTTP form submission (literal string)
-g/--globoff    disable URL sequences and ranges using {} and []
-G/--get    send data with GET
-h/--help    help
-H/--header <line>    pass a custom header to the server
--ignore-content-length    ignore the HTTP Content-Length header
-i/--include    include the protocol headers in the output
-I/--head    show document information (headers) only
-j/--junk-session-cookies    ignore session cookies when reading files
--interface <interface>    use the specified network interface/address
--krb4 <level>    use krb4 with the specified security level
-k/--insecure    allow connections to SSL sites without certificates
-K/--config    read configuration from the specified file
-l/--list-only    list only the file names in an FTP directory
--limit-rate <rate>    limit the transfer speed
--local-port <NUM>    force a local port number
-m/--max-time <seconds>    maximum transfer time
--max-redirs <num>    maximum number of redirects to follow
--max-filesize <bytes>    maximum size of a file to download
-M/--manual    display the full manual
-n/--netrc    read user name and password from the netrc file
--netrc-optional    use .netrc or the URL; overrides -n
--ntlm    use HTTP NTLM authentication
-N/--no-buffer    disable output buffering
-p/--proxytunnel    tunnel through the HTTP proxy
--proxy-anyauth    allow any proxy authentication method
--proxy-basic    use Basic authentication on the proxy
--proxy-digest    use Digest authentication on the proxy
--proxy-ntlm    use NTLM authentication on the proxy
-P/--ftp-port <address>    use the given port address instead of PASV
-Q/--quote <cmd>    send a command to the server before the file transfer
--random-file <file>    read random data from the given file (SSL)
-R/--remote-time    set the local file's timestamp from the remote file
--retry <num>    number of retries when the transfer fails
--retry-delay <seconds>    delay between retries when the transfer fails
--retry-max-time <seconds>    maximum total time for retries when the transfer fails
-S/--show-error    show errors
--socks4 <host[:port]>    use a SOCKS4 proxy on the given host and port
--socks5 <host[:port]>    use a SOCKS5 proxy on the given host and port
-t/--telnet-option <OPT=val>    set a telnet option
--trace <file>    write a debug trace to the specified file
--trace-ascii <file>    like --trace, but without hex output
--trace-time    add timestamps to trace/verbose output
--url <URL>    set the URL to work with
-U/--proxy-user <user[:password]>    set the proxy user name and password
-V/--version    display version information
-X/--request <command>    specify the request command (method) to use
-y/--speed-time    time below the speed limit before aborting; default is 30
-Y/--speed-limit    stop the transfer if slower than this for speed-time seconds
-z/--time-cond    transfer based on a time condition
-0/--http1.0    use HTTP 1.0
-1/--tlsv1    use TLSv1 (SSL)
-2/--sslv2    use SSLv2 (SSL)
-3/--sslv3    use SSLv3 (SSL)
--3p-quote    like -Q, for the source URL of a third-party transfer
--3p-url    URL for a third-party transfer
--3p-user    user name and password for a third-party transfer
-4/--ipv4    use IPv4
-6/--ipv6    use IPv6
Reprinted from: http://www.linuxdiyf.com/linux/2800.html