
what is linux curl


In Linux, curl is a very practical tool for transferring data to and from a server. It is a command-line file transfer tool that works with URL syntax; it supports both uploading and downloading files, making it a comprehensive transfer tool. curl provides many very useful features, including proxy access, user authentication, FTP upload and download, HTTP POST, SSL connections, cookie support, resuming interrupted transfers, and more.


The operating environment of this tutorial: Linux 7.3 system, Dell G3 computer.

In Linux, curl is a file transfer tool that works with URL syntax on the command line. It can be considered a very powerful HTTP command-line tool. It supports uploading and downloading files and is a comprehensive transfer tool, although by convention it is usually called a download tool.

curl is a very practical tool for transferring data to and from servers. Supported protocols include DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP. curl is designed to work without user interaction, and it provides many very useful features, including proxy access, user authentication, FTP upload and download, HTTP POST, SSL connections, cookie support, resuming interrupted transfers, and more.
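
As a quick illustration, a single command can combine several of these features; this is only a sketch, and the URL, proxy address and credentials below are hypothetical placeholders:

# Follow redirects (-L), resume an interrupted download (-C -), go through an HTTP proxy (-x),
# and authenticate with a username and password (-u)
curl -L -C - -x http://proxy.example.com:8080 -u user:password -O http://example.com/big-file.iso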

1. The most commonly used curl command

1. Send a GET request

curl URL
curl "URL?a=1&b=nihao"    # quote the URL so the shell does not interpret "&"

2. Send a POST request

curl -X POST -d 'a=1&b=nihao' URL

3. Send a request with a JSON body:

curl -H "Content-Type: application/json" -X POST -d '{"abc":123,"bcd":"nihao"}' URL
curl -H "Content-Type: application/json" -X POST -d @test.json URL

Here, -H sets a request header, -X specifies the request method (POST/GET/HEAD/DELETE/PUT/PATCH), and -d specifies the data to send. These are the most commonly used options.
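
The other request methods mentioned above work the same way with -X; a small sketch, where the example.com endpoints are placeholders:

# PUT a JSON body to update a resource
curl -X PUT -H "Content-Type: application/json" -d '{"name":"new"}' http://example.com/items/1
# DELETE a resource
curl -X DELETE http://example.com/items/1
# HEAD request (headers only); -I is the usual shortcut
curl -I http://example.com/items/1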

View all curl options: man curl or curl -h
Request header: -H, -A, -e
Response header: -I, -i, -D
Cookie: -b, -c, -j
Transfer: -F (POST), -G (GET), -T (PUT), -X
Output: -o, -O, -w
Resume/partial transfer: -r
Debugging: -v, --trace, --trace-ascii, --trace-time
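
A rough sketch combining a few of these option groups in one call (the URL and file names are placeholders):

# -A sets the User-Agent, -e sets the Referer, -o writes the body to a file,
# and -w prints the HTTP status code after the transfer
curl -A "my-agent/1.0" -e "http://example.com/" -o page.html -w "%{http_code}\n" http://example.com/page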

2. Detailed explanation of curl command syntax and curl command parameters

1. curl command syntax

curl [options] [URL...]

2. Detailed explanation of curl command parameters

Because Linux curl is very powerful, it has a large number of command-line parameters. The table below covers only a selection of them (chosen by aiezu.com); run the "man curl" command to see the rest.


Selected curl parameters (run "curl -h" or "man curl" for the complete list):

URL:
url
One or more URLs to fetch; multiple URLs can be expressed with wildcard patterns, for example:
1. http://{www,ftp,mail}.aiezu.com;
2. http://aiezu.com/images/[001-999].jpg;
3. http://aiezu.com/images/[1-999].html;
4. ftp://aiezu.com/file[a-z].txt
Request header:
-H "name: value"
--header "name: value"
(HTTP) Add an HTTP request header;
-H "name:"
--header "name:"
(HTTP) Remove an HTTP request header;
-A "string"
--user-agent "string"
(HTTP) Set the "User-Agent" request header. The server can use "User-Agent" to determine the client's browser and operating system; forging this value can mislead the server.
This header can also be set with the "-H"/"--header" option;
-e <URL>
--referer <URL>
(HTTP) Set the referring page, telling the HTTP server which page the request came from;
-e "http://aiezu.com" is equivalent to -H "Referer: http://aiezu.com";
Response header:
-I
--head
(HTTP) Output only the response headers, not the body (HTTP/FTP/FILE).
For HTTP services, this fetches the page's HTTP headers;
(e.g.: curl -I http://aiezu.com)
For FTP/FILE, it fetches the file size and last modification time;
(e.g.: curl -I file://test.txt)
-i
--include
(HTTP) Output the HTTP headers together with the returned content;
-D <file>
--dump-header <file>
(HTTP) Dump the HTTP response headers to the specified file;
Cookie:
-b name=data
--cookie name=data
(HTTP) Send cookie data to the HTTP server, in the format "NAME1=VALUE1; NAME2=VALUE2";
If the value contains no "=", it is treated as a cookie file name;
This cookie data typically comes from the server's "Set-Cookie:" response header;
-c filename
--cookie-jar filename
(HTTP) After the operation completes, save the cookies returned by the server to the specified file;
If the value is "-", the cookies are written to standard output (e.g. the console);
-j
--junk-session-cookies
(HTTP) Tell curl to discard all "session cookies";
roughly equivalent to restarting the browser;
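
A minimal sketch of a typical login flow using these cookie options (the URLs, form fields and file name are hypothetical):

# Log in and store the session cookies the server sets
curl -c cookies.txt -d "user=alice&pass=secret" http://example.com/login
# Reuse the saved cookies on a later request
curl -b cookies.txt http://example.com/profile
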
Proxy:
-x host:port
-x [protocol://[user:pwd@]host[:port]]
--proxy [protocol://[user:pwd@]host[:port]]
Use the specified proxy; if no port is given, port 1080 is assumed;
The protocol defaults to http; other possible values include https, socks4, socks4a and socks5;
For example:
--proxy 8.8.8.8:8080;
-x "http://aiezu:123@aiezu.com:80"
-p
--proxytunnel
Tunnel non-HTTP protocols (such as FTP) through the proxy given with "-x";
--socks4 <host[:port]>
--socks4a <host[:port]>
--socks5 <host[:port]>
Use a SOCKS4/SOCKS4a/SOCKS5 proxy on the given host and port;
--proxy-user <user:password>
Set the proxy username and password;
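
For example, assuming a SOCKS5 proxy at 127.0.0.1:1080 and an authenticating HTTP proxy (both hypothetical):

# Go through a local SOCKS5 proxy
curl --socks5 127.0.0.1:1080 http://example.com/
# HTTP proxy that requires authentication
curl -x http://proxy.example.com:3128 --proxy-user alice:secret http://example.com/
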
Data transfer:
-G
--get
Send the data specified with "-d/--data" or "--data-binary" appended to the URL and issue a GET request;
-d "string"
--data "string"
--data-ascii "string"
--data-binary "string"
--data-urlencode "string"
(HTTP) Send "key/value pair" data with an HTTP POST, equivalent to a browser form with method="POST" and enctype="application/x-www-form-urlencoded":
-d, --data: HTTP POST data;
--data-ascii: HTTP POST ASCII data;
--data-binary: HTTP POST binary data;
--data-urlencode: HTTP POST data, URL-encoded;
If the data starts with "@" followed by a file name, the content of that file is posted;
-F name=@file
-F name=content
--form name=content
(HTTP) POST data as multipart/form-data, emulating a browser form with a file upload; a value starting with "@" is read from the named file;
-C <offset>
--continue-at <offset>
Resume a transfer, continuing the download/upload from the given offset in the file;
If the offset is "-", curl automatically works out where to resume from;
-r <range>
--range <range>
(HTTP/FTP/SFTP/FILE) Transfer only the specified byte range(s):
0-499: the first 500 bytes;
-500: the last 500 bytes;
9500-: everything from byte 9500 onwards;
0-0,-1: the first and the last byte;
100-199,500-599: two separate 100-byte ranges;



Authentication:
--basic    (HTTP) Tell curl to use HTTP Basic authentication; this is the default method;
--ntlm    (HTTP) Use NTLM authentication; generally used for IIS sites that use NTLM;
--digest    (HTTP) Use HTTP Digest authentication; combined with "-u/--user" it prevents the password from being sent in clear text;
--negotiate    (HTTP) Use GSS-Negotiate authentication; its main purpose is to support Kerberos 5 authentication;
--anyauth    (HTTP) Tell curl to automatically pick the most secure authentication method the server supports;
-u user:password
--user user:password
Authenticate with a username and password. This option overrides the "-n", "--netrc" and "--netrc-optional" options;

If only a username is given, curl will prompt for the password;

With a curl library built with "SSPI", "NTLM" authentication can use the credentials of the currently logged-in user by passing "-u :" with no username or password;

This option is equivalent to setting the "Authorization:" HTTP header;
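
A small sketch of these authentication options against a protected URL (the URL and credentials are placeholders):

# Let curl pick the most secure method the server offers
curl --anyauth -u alice:secret http://example.com/protected/
# Force Digest authentication so the password is not sent in clear text
curl --digest -u alice:secret http://example.com/protected/
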
Certificates:
-E <certificate[:password]>
--cert <certificate[:password]>
(SSL) Specify the client certificate file (in "PEM" format) and its password;
--cert-type <type>    (SSL) Tell curl the type of the provided certificate: PEM, DER, ENG, etc.; the default is "PEM";
--cacert <CA certificate>    (SSL) Tell curl which CA certificate file to use; it must be in "PEM" format;
--capath <CA certificate directory>    (SSL) Tell curl to use the CA certificates in the specified directory for verification; these certificates must be in "PEM" format;
--crlfile <file>    (HTTPS/FTPS) Provide a PEM-format file listing revoked certificates;
-k
--insecure
(SSL) Allow insecure SSL connections and transfers without verifying the certificate.
SSL (other):
--ciphers <list of ciphers>    (SSL) Specify the ciphers SSL should use, e.g. "aes_256_sha_256";
--engine <name>    Use the given OpenSSL crypto engine for cipher operations;
use "curl --engine list" to see the supported engines;
--random-file <file>    (SSL) Specify a file containing random data, used to seed the random number generator for SSL connections;
--egd-file <file>    (SSL) Specify the path name of the EGD (Entropy Gathering Daemon) socket used to seed the random number generator;
-1, --tlsv1    (SSL) Use TLS version 1.x when communicating with the remote server;
--tlsv1.0    (SSL) Use TLS version 1.0;
--tlsv1.1    (SSL) Use TLS version 1.1;
--tlsv1.2    (SSL) Use TLS version 1.2;
-2, --sslv2    (SSL) Use SSL version 2;
-3, --sslv3    (SSL) Use SSL version 3;
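
For example, to force a specific TLS version (the URL is a placeholder):

curl --tlsv1.2 https://example.com/
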
Private/public key:
--key <key>    (SSL/SSH) Specify a private key file; if not given, curl automatically tries "~/.ssh/id_rsa", "~/.ssh/id_dsa", "./id_rsa", "./id_dsa";
--key-type <type>    (SSL) Specify the private key file type: DER, PEM or ENG; the default is PEM;
--pass <pass phrase>    (SSL/SSH) Specify the pass phrase for the private key;
--pubkey <key>    (SSH) Use the public key provided in the specified file;
FTP and speed:
-P <address>
--ftp-port <address>
(FTP) Use active (PORT) mode, telling the server to connect back to the given address or interface;
--limit-rate <speed>
Limit the maximum bandwidth curl uses; without a unit the value is in bytes/second, and units such as "K", "M" and "G" may be appended, e.g. "--limit-rate 1m" caps the transfer at 1m bytes/second;
-y
--speed-time <time>
If a download is slower than speed-limit bytes per second during a speed-time period, the download gets aborted. If speed-time is used, the default speed-limit will be 1 unless set with -Y.
This option controls transfers and thus will not affect slow connects etc. If this is a concern for you, try the --connect-timeout option.
-Y
--speed-limit <speed>
If a download is slower than this given speed (in bytes per second) for speed-time seconds it gets aborted. speed-time is set with -y and is 30 if not set.
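
For example, the following (with a placeholder URL) aborts the download if the speed stays below 1000 bytes/second for 30 seconds:

curl -Y 1000 -y 30 -O http://example.com/big-file.iso
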
Other options:
-0, --http1.0    (HTTP) Force curl to use HTTP 1.0 instead of the default HTTP 1.1;
--interface <name>    Use the specified network interface or address for the request:
curl --interface eth0 http://aiezu.com
curl --interface 10.0.0.101 http://aiezu.com
-X <command>
--request <command>
(HTTP) Specify the request method to use when talking to the server, e.g. GET, PUT, POST, DELETE; the default is GET;
--keepalive-time <seconds>    Set the keepalive time;
--no-keepalive    Disable keepalive;
--no-buffer    Disable buffering of the output stream;
--buffer    Enable buffering of the output stream;
-L
--location
(HTTP/HTTPS) Follow the "Location:" response header and fetch the page it points to;
(used when the HTTP response code is 3XX, e.g. a 301 or 302 redirect)
--location-trusted    (HTTP/HTTPS) Like "--location", but also send the username and password to the host(s) curl is redirected to;
--compressed    (HTTP) Request a compressed response and decompress it automatically; curl supports gzip;
--connect-timeout <seconds>    Specify the maximum time allowed for the connection to be established, in seconds;
-m seconds
--max-time seconds
Limit the maximum time for the whole curl operation, in seconds;
-s
--silent
Silent mode: do not show a progress meter or error messages;
-#
--progress-bar
Show a simple progress bar;
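
A sketch combining several of these options (placeholder URL): follow redirects, request compression, limit the total time, and stay quiet except for errors:

curl -sS -L --compressed -m 30 -o page.html http://example.com/
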
Error handling:
-f
--fail
(HTTP) On server errors (HTTP status 400 and above), do not output the error page; fail with curl exit code 22 instead;
--retry <num>    Number of times to retry after a transient failure;
--retry-delay <seconds>    Interval between retries;
--retry-max-time <seconds>    Maximum total time to spend retrying;
-S, --show-error    Show error messages even in silent (-s) mode;
--stderr <file>    Write error output to the given file instead of stderr;
Output:
-o file
--output file
Write the returned content to the given file instead of standard output.
When fetching multiple URLs with wildcards, "#" followed by a number can be used in the file name and curl replaces it with the matched value, for example:
curl "http://aiezu.com/{a,b}.txt" -o "#1.txt"
will be saved as "a.txt" and "b.txt";

curl "http://aiezu.com/{a,b}_[1-3].txt" -o "#1#2.txt"
will be saved as a1.txt, a2.txt, a3.txt, b1.txt, b2.txt, b3.txt;

To create the saving directories according to a rule, see "--create-dirs";

Using "-" as the file name sends the output to standard output (e.g. the console);
-O
--remote-name
Write the returned content to a file in the current directory, named after the file part of the URL (the directory part is stripped);
--create-dirs
Used together with "-o" to create the necessary local directory hierarchy;
-w format
--write-out format
After the operation completes, append the specified content to the output; the content can be a string, read from a file with "@filename", or read from standard input with "@-".
The format may reference response variables as %{variable_name}, e.g. %{content_type}, %{http_code}, %{local_ip}; see "man curl" for more variables;

Escape sequences such as "\n", "\r" and "\t" may also be used in the format;
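
For example, -w can report status and timing after a silent request (placeholder URL; %{http_code} and %{time_total} are standard write-out variables):

curl -s -o /dev/null -w "status=%{http_code} time=%{time_total}s\n" http://example.com/
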

Debugging and config:
--trace <file>    Dump all incoming and outgoing data to the given file, including descriptive information;
--trace-ascii <file>    Like "--trace", but dump only the ASCII part, which is easier to read; this option overrides any previous "-v", "--verbose" or "--trace" options;
--trace-time    Add timestamps to the trace or verbose output;
-K <config file>
--config <config file>
Read parameters from a configuration file; reference: http://curl.haxx.se/docs/;
-v
--verbose
Show more detailed information; useful for debugging;
-M
--manual
Show the complete help manual;
-h
--help
Show curl usage help;
Can use "any" authentication method##--basic                                                                                                           /--use-ascii                                                                                     -d/--data 1d029f6197b5a3eb8a3fdf0a088ddf55 ##--data-ascii 1d029f6197b5a3eb8a3fdf0a088ddf55                             Post data in ascii format--data-binary < ;data> ##-Digest Use digital authentication -Disable-EPRT ##-Egd-File & LT; File & GT; --tcp-nodelay      #Certificate file type (DER/PEM/ENG) (SSL)##--key-type 7674b22ef33c73b930516fd6bc30b7a3                                - -pass "--engine list" for list--connect-timeout 0e72eeb93e8222d25f37582edca0e1bc           --create -dirs              --crlf                                                                                                                                                                      --crlf                         --ftp-create-dirs                                                                                                                                                                                                                                     --ftp-method [multicwd/nocwd/singlecwd ]--ftp-pasv                                                                              ​##--ftp-skip-pasv-ip                                                                                                                                                                                                                                              through Use SSL/TLS for ftp data transmission-F/--form 26ce86e700de121bc743b24a0dfee1c0                                                                                                                                                                                      ##-H/--header 15a73cc5312745b1b00671f6e690e36a --ignore-content-length                                                                                                                                                                                                                                                    since Include protocol header information when output-I/--head        Only display document information-j/- -junk-session-cookies                                                                                                                                                                                                      through --krb4 ##-K/--config   Set transfer speed--local-port7a16b29f015c60d23685bc81cbbd96c2                                                                                                        m/--max-time 93c5c842594fce928d1d02d2ebbf05e2              Set the maximum total number of downloaded files-M/--manual                                                                                                                                                                                                                                                                                                 since ##-N/--no-buffer                                                                                          ## —— Proxy-Anyauth --proxy-basic                                                                                                                                                                                                                                                                                      #Use digital authentication 
on the proxy--proxy-ntlm                                                                -P/--ftp-port
--retry-max-time 0e72eeb93e8222d25f37582edca0e1bc    -S/--show-error                                                                                                                                                                                                                                                               since Use socks4 to proxy the given host and portUse socks5 to proxy the given host and portDebug the specified file##--trace-ascii 28897b20adb25fbae118a3f80f538dec             Like --trace but no hex output--trace-time                                                                                                                                                                                                                                                                  What command to specify-y/--speed-time                                                                                                                                .The default is 30 ## stop to stop transmission speed, speed time 'second -1/--tlsv1                                                                                                                                                                              Using SSLv2 (SSL) -3/--sslv3                Using SSLv3 (SSL)-- 3p-quote              like -Q for the source URL for 3rd party transfer##--3p-url                                                     --3p-user                                                                                                                               Use IP4-6/--ipv6                                                                                        

3. Linux curl command exit code

The following are the exit codes of the Linux curl command and the corresponding error descriptions, which you may encounter when something goes wrong.

Exit code    Error description
1 Unsupported protocol. This build of curl has no support for this protocol.
2 Failed to initialize.
3 URL malformed. The syntax was not correct.
5 Couldn't resolve proxy. The given proxy host could not be resolved.
6 Couldn't resolve host. The given remote host was not resolved.
7 Failed to connect to host.
8 FTP weird server reply. The server sent data curl couldn't parse.
9 FTP access denied. The server denied login or denied access to the particular resource or directory you wanted to reach. Most often you tried to change to a directory that doesn't exist on the server.
11 FTP weird PASS reply. Curl couldn't parse the reply sent to the PASS request.
13 FTP weird PASV reply, Curl couldn't parse the reply sent to the PASV request.
14 FTP weird 227 format. Curl couldn't parse the 227-line the server sent.
15 FTP can't get host. Couldn't resolve the host IP we got in the 227-line.
17 FTP couldn't set binary. Couldn't change transfer method to binary.
18 Partial file. Only a part of the file was transferred.
19 FTP couldn't download/access the given file, the RETR (or similar) command failed.
21 FTP quote error. A quote command returned error from the server.
22 HTTP page not retrieved. The requested url was not found or returned another error with the HTTP error code being 400 or above. This return code only appears if -f/--fail is used.
23 Write error. Curl couldn't write data to a local filesystem or similar.
25 FTP couldn't STOR file. The server denied the STOR operation, used for FTP uploading.
26 Read error. Various reading problems.
27 Out of memory. A memory allocation request failed.
28 Operation timeout. The specified time-out period was reached according to the conditions.
30 FTP PORT failed. The PORT command failed. Not all FTP servers support the PORT command, try doing a transfer using PASV instead!
31 FTP couldn't use REST. The REST command failed. This command is used for resumed FTP transfers.
33 HTTP range error. The range "command" didn't work.
34 HTTP post error. Internal post-request generation error.
35 SSL connect error. The SSL handshaking failed.
36 FTP bad download resume. Couldn't continue an earlier aborted download.
37 FILE couldn't read file. Failed to open the file. Permissions?
38 LDAP cannot bind. LDAP bind operation failed.
39 LDAP search failed.
41 Function not found. A required LDAP function was not found.
42 Aborted by callback. An application told curl to abort the operation.
43 Internal error. A function was called with a bad parameter.
45 Interface error. A specified outgoing interface could not be used.
47 Too many redirects. When following redirects, curl hit the maximum amount.
48 Unknown TELNET option specified.
49 Malformed telnet option.
51 The peer's SSL certificate or SSH MD5 fingerprint was not ok.
52 The server didn't reply anything, which here is considered an error.
53 SSL crypto engine not found.
54 Cannot set SSL crypto engine as default.
55 Failed sending network data.
56 Failure in receiving network data.
58 Problem with the local certificate.
59 Couldn't use specified SSL cipher.
60 Peer certificate cannot be authenticated with known CA certificates.
61 Unrecognized transfer encoding.
62 Invalid LDAP URL.
63 Maximum file size exceeded.
64 Requested FTP SSL level failed.
65 Sending the data requires a rewind that failed.
66 Failed to initialize SSL Engine.
67 The user name, password, or similar was not accepted and curl failed to log in.
68 File not found on TFTP server.
69 Permission problem on TFTP server.
70 Out of disk space on TFTP server.
71 Illegal TFTP operation.
72 Unknown TFTP transfer ID.
73 File already exists (TFTP).
74 No such user (TFTP).
75 Character conversion failed.
76 Character conversion functions required.
77 Problem with reading the SSL CA cert (path? access rights?).
78 The resource referenced in the URL does not exist.
79 An unspecified error occurred during the SSH session.
80 Failed to shut down the SSL connection.
82 Could not load CRL file, missing or wrong format (added in 7.19.0).
83 Issuer check failed (added in 7.19.0).
XX More error codes will appear here in future releases. The existing ones are meant to never change.
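
A minimal sketch of using these exit codes in a shell script (the URL is a placeholder):

curl -fsS -o /dev/null http://example.com/health
rc=$?
if [ $rc -ne 0 ]; then
    echo "curl failed with exit code $rc"    # e.g. 7 = could not connect, 22 = HTTP error (with -f)
fi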

4. Common usage

1. Download (option: -o or -O)

1.1. Download a single file:

curl -o dodo1.jpg http://www.linux.com/dodo1.JPG

# Note that the URL after -O must point to a specific file, otherwise nothing will be downloaded
curl -O http://www.linux.com/dodo1.JPG

1.2: Loop download

Sometimes the downloaded files share the same name and only the trailing number differs. The following saves all of dodo1 through dodo5:
curl -O http://www.linux.com/dodo[1-5].JPG

1.3: Rename while downloading

The downloaded files are saved under names built from the matched parts of the URL (hello_1.JPG, ..., bb_5.JPG), which effectively avoids files overwriting each other:
curl -o "#1_#2.JPG" "http://www.linux.com/{hello,bb}/dodo[1-5].JPG"

Because the file names under hello and bb are all dodo1 ... dodo5, a plain download like the following would overwrite the earlier files, which is why the rename above is needed:

curl -O "http://www.linux.com/{hello,bb}/dodo[1-5].JPG"

1.4: Download in chunks (option: -r)

curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
curl -r 100-200 -o dodo1_part2.JPG http://www.linux.com/dodo1. JPG
curl -r 200- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
cat dodo1_part* > dodo1.JPG    # merge the pieces back into the complete dodo1.JPG

1.5: Download files through ftp (option: -u)

curl can download files over FTP; it offers two syntaxes for this:
curl -O -u username:password ftp://www.linux.com/dodo1.JPG
curl -O ftp://username:password@www.linux.com/dodo1.JPG

1.6: Download with a progress bar (option: -#) or without any progress output (option: -s)

curl -# -O http://www.linux.com/dodo1.JPG
curl -s -O http://www.linux.com/dodo1.JPG

1.7. Download with resume (-C offset)

Resume a transfer, continuing the download/upload from the given offset in the file; if the offset value is "-", curl automatically works out where to resume from:
curl -# -o centos6.8.iso -C - http://mirrors.aliyun.com/centos/6.8/isos/x86_64/CentOS-6.8-x86_64-minimal.iso
curl -C - -O http://www.linux.com/dodo1.JPG

2. Upload a file (option: -T)

curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/

3. Forge the source page | fake referer | anti-hotlinking (option: -e)

Many servers check the Referer of HTTP requests in order to control access. For example, you visit the home page first and then open the mail page from it; the Referer of the mail request is then the home page address. If the server finds that the Referer of the mail page is not the home page address, it concludes the request is hotlinked.

# This makes the server think you clicked a link on www.linux.com
curl -e "www.linux.com" http://mail.linux.com
# Tell aiezu.com that the request came from Baidu
curl -e http://baidu.com http://aiezu.com

4. Fake the User-Agent (imitate a browser)

Some websites require a specific browser, or even a specific browser version, to access them. curl's -A option lets us specify the browser string used for the request:

curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com
# Tell aiezu.com that I am the Google crawler spider (when actually I am the curl command)
curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; http://www.google.com/bot.html)" http://aiezu.com
# Tell aiezu.com that I am using WeChat's built-in browser
curl -A "Mozilla/5.0 AppleWebKit/600 Mobile MicroMessenger/6.0" http://aiezu.com

5. Set http request

5.1. Set an HTTP request header (option: -H or --header)

curl -H "Cache-Control:no-cache" http://aiezu.com

5.2. Specify a proxy server and its port (option: -x)

# A proxy is often needed to reach the Internet (for example when you browse through a proxy server, or when your IP address has been blocked by the target site); curl supports this with the -x option
curl -x 192.168.100.100:1080 http://www.linux.com

6. HTTP response header

6.1. View the HTTP response headers (option: -I)
# See what the http header of this site looks like
curl -I http://aiezu.com
Output:
HTTP/1.1 200 OK
Date: Fri, 25 Nov 2016 16:45:49 GMT
Server: Apache
Set-Cookie: rox__Session=abdrt8vesprhnpc3f63p1df7j4; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Vary : Accept-Encoding
Content-Type: text/html; charset=utf-8

6.2. Save the headers of the HTTP response (option: -D)
curl -D cookied.txt http://www.linux.com
After execution, the response headers (including cookie headers) are saved in cookied.txt.
Note: the cookie file produced by -c (lowercase) is not the same as the header dump produced by -D.

7. Send form data

curl -F "pic=@logo.png" -F "site=aiezu" http:/ /aiezu.com/

8. Cookies

8.1. Send cookies (option: -b)
# Some websites use cookies to record session information. Browsers such as Chrome handle cookies automatically, and with curl cookies can be handled just as easily by adding the relevant options
curl -b "domain=aiezu.com" http://aiezu.com
# Many websites check your cookies to decide whether you are accessing them according to the rules, so we often need to send previously saved cookie information with -b
curl -b cookiec.txt http://www.linux.com

8.2. Save the cookies from the HTTP response (option: -c)
curl -c cookiec.txt http://www.linux.com
After execution, the cookies from the HTTP response are saved in cookiec.txt.

9. Test a URL

9.1. Test whether a URL is reachable
curl -v http://www.linux.com

9.2. Test the return value of the web page (option: -w [format])
curl -o /dev/null -s -w %{http_code} www.linux.com
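
Building on this, a small sketch that checks the status code of several URLs in a loop (the URLs are placeholders):

for url in http://www.linux.com http://aiezu.com; do
    code=$(curl -o /dev/null -s -w "%{http_code}" "$url")
    echo "$url -> $code"
done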

10. Save the visited web page (>>)

10.1: Save using Linux output redirection
curl http://www.linux.com >> linux.html

11. Request method

curl -i -v -H '' -X POST -d '' http://www.test.com/a/b
Roughly speaking: -X GET -d is equivalent to -G -d (the data is sent as a GET query string); -X POST -d is an ordinary form-encoded POST (use -F for a multipart form); and a PUT upload can also be done with -T.

Take a POST request as an example:
11.1. -X POST -d
(11.1.1), POST application/x-www-form-urlencoded
application/x-www-form-urlencoded is the default
curl -X POST -d "param1=value1&param2=value2" http://localhost:3000/data
Equivalent to
curl -H "Content-Type:application/x-www-form-urlencoded" -X POST -d "param1=value1&param2=value2" http://localhost:3000/data
Using a data file
curl -X POST -d "@data.txt" http://localhost:3000/data
where the content of data.txt is: param1=value1&param2=value2

(11.1.2), POST application/json
curl -H "Content-Type:application/json" -X POST -d '{"key1":"value1","key2":"value2"}' http://localhost:3000/data
If using a data file:
curl -X POST -d "@data.json" http://localhost:3000/data
The content of data.json is: {"key1":"value1","key2":"value2"}
Another example:
curl -H "Content-type:application/json" -X POST -d "{\"app_key\":\"$appKey\",\"time_stamp\":\"$time\"}" http://www.test.com.cn/a/b

11.2. -F
curl -v -H "token: 222" -F "file=@/Users/funglio/Downloads/401.png" localhost:8000/api/v1/upimg


11.3. Other examples

(11.3.1),

curl -X POST "http://www.test.com/e/f" -H "Content-Type:application/x-www-form-urlencoded;charset=UTF-8" \
-d "a=b" \
-d "c=d" \
-d "e=f" \
-d "g=h"

(11.3.2), Wrong: curl -i -G -d "a=b#1&c=d" http://www.test.com/e/f
Correct: URL-encode parameter values that contain special characters:
curl -i -G -d "a=b%231&c=d" http://www.test.com/e/f

12. Debugging

curl -v shows the whole process of an HTTP exchange, including the port connection and the HTTP request headers.
If that is still not enough, the following commands show the communication in even more detail:
curl --trace output.txt www.baidu.com  or  curl --trace-ascii output.txt www.baidu.com
After running them, open output.txt to inspect the result.
curl --trace output.txt  http://www.baidu.com
curl --trace-ascii output2.txt  http://www.baidu.com
curl --trace output3.txt --trace-time http://www.baidu.com
curl --trace-ascii output4.txt --trace-time http://www.baidu.com 

Example: suppose you need to request http://www.test.com/a/b every 5 minutes and generate a log file, and you want a month's worth of requests (both successful and failed) written to a single log file:
day=`date +%F`
logfile='/var/logs/www.test.com_'`date +%Y%m`'.log'
/usr/bin/echo -e "\n\n[${day}] Start request \n " >> ${logfile}
/bin/curl -v "http://www.test.com/a/b" -d "ccccc" 1>> ${logfile} 2>> ${logfile} --trace-time
/usr/bin/echo -e "\n\n[${day}] End request\n" >> ${logfile}

13. Display fetch errors

curl -f http://www.linux.com/error


