Command line tools for web developers: writing HTTP requests with curl
We often have to download files from the Internet: executables, scripts, source archives. This doesn't always need to be done through a browser; in many situations it is much easier to do everything from the terminal, because that way the process can be automated. Webmasters, too, periodically need to test website availability, inspect sent and received headers, and much more.
For tasks like these you can use the curl utility. It handles a much wider range of problems, up to and including simulating user actions on a site. In this article we will look at what curl is, why it is needed, and how to use it.
In fact, curl is more than just a command line utility for Linux or Windows. It is a set of libraries implementing the basic capabilities of working with URLs and transferring files. The library supports the FTP, FTPS, HTTP, HTTPS, TFTP, SCP, SFTP, Telnet, DICT and LDAP protocols, as well as POP3, IMAP and SMTP. It is well suited for simulating user actions on pages and other operations with URLs.
Support for the curl library has been added to many programming languages and platforms. The curl utility is an independent wrapper around this library, and it is this utility that we will focus on in this article.
curl command
Before moving on to how the curl command can be used on Linux, let's look at the utility itself and the main options we will need. The syntax of the utility is very simple:
$ curl [options] [URL]
Now let's look at the main options:
- -# - display a simple progress bar during the transfer;
- -0 - use the HTTP 1.0 protocol;
- -1 - use the TLSv1 encryption protocol;
- -2 - use SSLv2;
- -3 - use SSLv3;
- -4 - use IPv4;
- -6 - use IPv6;
- -A - set your USER_AGENT string;
- -b - send cookies to the server from a file;
- -c - save cookies received from the server to a file;
- -C - continue downloading a file from the break point or specified offset;
- -m - maximum time to wait for a response from the server;
- -d - send data using the POST method;
- -D - save headers returned by the server to a file;
- -e - set the Referer field, indicating which site the user came from;
- -E - use an external SSL certificate;
- -f - fail silently, without outputting server error pages;
- -F - send data in the form of a form;
- -G - transmit the data specified in the -d option using the GET method instead;
- -H - pass headers to the server;
- -I - receive only the HTTP headers and ignore the page content;
- -j - discard session cookies when reading them from a file;
- -J - save the file under the name suggested by the server's Content-Disposition header;
- -L - accept and process redirects;
- --max-redirs - maximum number of Location redirects to follow;
- -o - output page content to a file;
- -O - save content to a file named after the page or file on the server;
- -x - use a proxy;
- --proto - indicate the protocol to be used;
- -R - save the last modification time of the remote file;
- -s - silent mode, display a minimum of information about errors;
- -S - display error messages;
- -T - upload a file to the server;
- -v - the most detailed output;
- -y - time window for the -Y minimum-speed check;
- -Y - minimum allowed speed, below which the transfer is aborted;
- -z - download the file only if it was modified later than the specified time;
- -V - display the version.
This is by no means all of the options of curl on Linux, but it covers the basics you will need.
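Several of these options are commonly combined in scripts. The sketch below wraps curl in a small download function; the flag set (-f, -s, -S, -L) comes from the list above, while the function name and the file:// demo URL are assumptions made so the script runs without network access:

```shell
#!/bin/sh
# fetch URL OUTPUT -- download URL to OUTPUT, failing on server errors (-f),
# staying quiet except for real errors (-sS) and following redirects (-L).
fetch() {
    url=$1
    out=$2
    curl -fsSL -o "$out" "$url"
}

# Demo against a local file:// URL so it works offline; any scheme curl
# supports would do.
src=$(mktemp)
printf 'hello from curl\n' > "$src"
fetch "file://$src" /tmp/fetch_demo.txt
cat /tmp/fetch_demo.txt
```

For a real download you would call it as, for example, `fetch https://example.com/file.tar.gz file.tar.gz`.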
How to use curl?
We've covered the theory of working with the curl utility; now it's time to move on to practice and look at examples of the curl command.
The most common task is downloading files, and it is very simple. Just pass the file or page URL to the utility:
curl https://raw.githubusercontent.com/curl/curl/master/README.md
But here a surprise awaits you: the entire contents of the file will be sent to standard output. To write it to a file instead, use the -o option:
curl -o readme.txt https://raw.githubusercontent.com/curl/curl/master/README.md
And if you want the resulting file to be named the same as the file on the server, use the -O option:
curl -O https://raw.githubusercontent.com/curl/curl/master/README.md
If the download is interrupted, you can resume it from the break point with the "-C -" option, and the -# option will show a simple progress bar:

curl -# -C - -O https://cdn.kernel.org/pub/linux/kernel/v4.x/testing/linux-4.11-rc7.tar.xz
If necessary, you can download several files with one command:
curl -O https://raw.githubusercontent.com/curl/curl/master/README.md -O https://raw.githubusercontent.com/curl/curl/master/README
Another thing that may be useful for an administrator is to only download a file if it has been modified:
curl -z 21-Dec-17 -O https://raw.githubusercontent.com/curl/curl/master/README.md
Speed Limit
You can limit the download speed so as not to saturate the network, using the --limit-rate option:
curl --limit-rate 50K -O https://cdn.kernel.org/pub/linux/kernel/v4.x/testing/linux-4.11-rc7.tar.xz
Here you specify the number of kilobytes per second that may be downloaded. You can also have curl drop the connection if the speed is too low; use the -Y option for this (the value is in bytes per second):
curl -Y 100 -O https://raw.githubusercontent.com/curl/curl/master/README.md
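--limit-rate applies to any protocol curl speaks, so its effect can be sketched offline with a file:// URL (the temporary paths below are stand-ins, not from the article):

```shell
#!/bin/sh
# Create ~100 KB of local data, then copy it through curl with the
# transfer rate capped at 50 KB/s.
src=$(mktemp)
head -c 100000 /dev/zero > "$src"
curl -s --limit-rate 50K -o /tmp/limited.bin "file://$src"
# The cap slows the transfer down, but the data arrives intact:
wc -c < /tmp/limited.bin
```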
Transferring files
To upload a file to a server, use the -T option. For example, to send login.txt to a public test FTP server:

curl -T login.txt ftp://speedtest.tele2.net/upload/
Or send a file via HTTP; there is a special test service for this:
curl -T ~/login.txt http://posttestserver.com/post.php
In the response, the utility will tell you where you can find the uploaded file.
Sending POST data
You can send not only files, but any data, using the POST method. Let me remind you that this method is used to submit the data of various forms. Such requests are made with the -d option. For testing we will use the same service:

curl -d "field1=val&field2=val1" http://posttestserver.com/post.php
If you are not happy with this submission option, you can pretend to submit the form. There is an option for this -F:
curl -F "password=@pass;type=text/plain" http://posttestserver.com/post.php
Here we pass the password field of the form as plain text; in the same way you can pass several parameters.
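Since posttestserver.com may no longer be reachable, the difference between -d and -F can be sketched against a throwaway local endpoint; the port number and the python3 helper below are assumptions for the demo, not part of the original examples:

```shell
#!/bin/sh
# Start a tiny local HTTP endpoint that echoes back the Content-Type it
# received plus the POST body. Assumes python3 is installed.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class Echo(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        body = self.rfile.read(length)
        reply = (self.headers.get('Content-Type', '') + '\n').encode() + body
        self.send_response(200)
        self.end_headers()
        self.wfile.write(reply)
    def log_message(self, *args):
        pass

HTTPServer(('127.0.0.1', 8912), Echo).serve_forever()
EOF
SRV=$!
sleep 1

# -d sends application/x-www-form-urlencoded...
resp_d=$(curl -s -d "field1=val&field2=val1" http://127.0.0.1:8912/post)
echo "$resp_d"

# ...while -F builds a multipart/form-data request, like a browser form:
resp_f=$(curl -s -F "password=secret;type=text/plain" http://127.0.0.1:8912/post | head -n 1)
echo "$resp_f"

kill $SRV
```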
Sending and receiving cookies
Cookies are used by websites to store certain information on the user's side. This may be necessary, for example, for authentication. You can accept and send Cookies using curl. To save the received Cookies to a file, use the -c option:
curl -c cookie.txt http://posttestserver.com/post.php
You can then send those cookies back to the server:
curl -b cookie.txt http://posttestserver.com/post.php
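The whole round-trip (-c saves what the server sets, -b sends it back) can be sketched against a throwaway local endpoint; the port and the python3 helper are assumptions for the demo:

```shell
#!/bin/sh
# A local endpoint: /set replies with a Set-Cookie header, any other
# path echoes back the Cookie header it received. Assumes python3.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        if self.path == '/set':
            self.send_header('Set-Cookie', 'session=abc123')
            self.end_headers()
            self.wfile.write(b'cookie set')
        else:
            self.end_headers()
            self.wfile.write(self.headers.get('Cookie', 'none').encode())
    def log_message(self, *args):
        pass

HTTPServer(('127.0.0.1', 8915), CookieDemo).serve_forever()
EOF
SRV=$!
sleep 1

# Step 1: -c stores the cookie the server sets into a jar file.
curl -s -c /tmp/cookie_demo.txt http://127.0.0.1:8915/set > /dev/null
# Step 2: -b reads the jar and sends the cookie back.
sent=$(curl -s -b /tmp/cookie_demo.txt http://127.0.0.1:8915/check)
echo "$sent"
kill $SRV
```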
Header transmission and analysis
We don't always need the content of the page; sometimes only the headers are interesting. To display only them, there is the -I option:
curl -I https://site
And the -H option allows you to send one or several headers to the server. For example, you can pass the If-Modified-Since header so that the page is returned only if it has been modified.
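A sketch of passing such a header with -H, demonstrated against a throwaway local endpoint that replies with the If-Modified-Since value it received (the port and python3 helper are assumptions for the demo):

```shell
#!/bin/sh
# Local endpoint that echoes back the If-Modified-Since request header.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class ShowHeader(BaseHTTPRequestHandler):
    def do_GET(self):
        value = self.headers.get('If-Modified-Since', 'none')
        self.send_response(200)
        self.end_headers()
        self.wfile.write(value.encode())
    def log_message(self, *args):
        pass

HTTPServer(('127.0.0.1', 8913), ShowHeader).serve_forever()
EOF
SRV=$!
sleep 1

# -H adds the header verbatim to the request:
seen=$(curl -s -H "If-Modified-Since: Thu, 21 Dec 2017 00:00:00 GMT" \
    http://127.0.0.1:8913/)
echo "$seen"
kill $SRV
```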
curl authentication
If the server requires one of the common types of authentication, such as HTTP Basic or FTP, curl handles this very easily. To specify credentials, simply pass them separated by a colon to the -u option:
curl -u ftpuser:ftppass -T - ftp://ftp.testserver.com/myfile_1.txt
Authentication on HTTP servers will be performed in the same way.
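For HTTP, what -u actually does is add an Authorization: Basic header containing base64("user:password"). This can be observed with a throwaway local endpoint (the port and python3 helper are assumptions for the demo):

```shell
#!/bin/sh
# Local endpoint that echoes back the Authorization request header.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class ShowAuth(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(self.headers.get('Authorization', 'none').encode())
    def log_message(self, *args):
        pass

HTTPServer(('127.0.0.1', 8914), ShowAuth).serve_forever()
EOF
SRV=$!
sleep 1

# curl sends Basic credentials preemptively, base64-encoded:
auth=$(curl -s -u user:password http://127.0.0.1:8914/)
echo "$auth"
kill $SRV
```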
Using a proxy
If you need to use a proxy server to download files, that is also very simple: just specify the proxy server address in the -x option:

curl -x proxyserver.test.com:3128 http://google.co.in
Conclusions
In this article we looked at how to use curl, why this utility is needed, and its main capabilities. Despite its similarity to wget, the two are very different: the curl command is designed more for analyzing and simulating various actions on the server, while wget is better suited for downloading files and crawling sites.
cURL is a very useful command line tool for transferring data from or to a server. curl supports various protocols, such as FILE, HTTP, HTTPS, IMAP, IMAPS, LDAP, DICT, LDAPS, TELNET, FTPS, GOPHER, RTMP, RTSP, SCP, SFTP, POP3, POP3S, SMB, SMBS, SMTP, SMTPS, and TFTP.
cURL can be used in a variety of different and interesting ways. With this tool you can download, upload and manage files, check your email, or even update your status on some social media websites or check the weather. In this article, we will look at the five most useful and basic uses of the cURL tool on any system.
1. Check the URL
One of the most common and simplest uses of cURL is printing the command itself followed by the URL you want to check:

curl https://domain.ru

This command will display the contents of the URL in your terminal.
2. Save the URL output to a file
curl -o website https://domain.ru
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 41793    0 41793    0     0   275k      0 --:--:-- --:--:-- --:--:-- 2.9M
In this example, the output will be saved to a file named 'website' in the current working directory.
3. Downloading files using cURL
You can download files with curl by adding the -O option to the command. It saves the file locally with the same name it has on the remote server:

curl -O https://domain.ru/file.zip
In this example, the archive 'file.zip' will be downloaded to the current working directory.
You can also save the file under a different name by using the -o option:

curl -o archive.zip https://domain.ru/file.zip

This way the archive 'file.zip' will be downloaded and saved as 'archive.zip'.
cURL can also be used to download multiple files at once, as shown in the example below:

curl -O https://domain.ru/file.zip -O https://domain.com/file2.zip
curl can also be used to download files securely over SSH (SFTP) using the following command:

curl -u user sftp://server.domain.ru/path/to/file

Note that you must use the full path to the file you want to download.
4. Take information from the website's HTTP header
You can easily get the HTTP headers of any website by adding the -I option to cURL:

curl -I http://domain.ru
HTTP/1.1 200 OK
Date: Sun, 16 Oct 2016 23:37:15 GMT
Server: Apache/2.4.23 (Unix)
X-Powered-By: PHP/5.6.24
Connection: close
Content-Type: text/html; charset=UTF-8
5. Access to FTP server
To access an FTP server using curl, use the following command:

curl ftp://ftp.domain.ru --user username:password

curl will connect to the FTP server and list all files and directories in the user's home directory.
You can download a file over FTP:

curl ftp://ftp.domain.ru/file.zip --user username:password

and upload a file to the FTP server:

curl -T file.zip ftp://ftp.domain.ru/ --user username:password
You can check the curl man page to see all available cURL options and functionality:

man curl
curl(1)

NAME
curl - transfer a URL

SYNOPSIS
curl [options] [URL...]

DESCRIPTION
curl is a tool to transfer data from or to a server, using one of the supported protocols (HTTP, HTTPS, FTP, FTPS, TFTP, DICT, TELNET, LDAP or FILE). The command is designed to work without user interaction.

curl offers a busload of useful tricks like proxy support, user authentication, ftp upload, HTTP post, SSL (https:) connections, cookies, file transfer resume and more. As you will see below, the amount of features will make your head spin!

curl is powered by libcurl for all transfer-related features. See libcurl(3) for details.
URL
The URL syntax is protocol dependent. You'll find a detailed description in RFC 3986.

You can specify multiple URLs or parts of URLs by writing part sets within braces, as in:

http://site.{one,two,three}.com

or you can get sequences of alphanumeric series by using [] as in:

ftp://ftp.numericals.com/file[1-100].txt
ftp://ftp.letters.com/file[a-z].txt

No nesting of the sequences is supported at the moment, but you can use several ones next to each other:

http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html

You can specify any amount of URLs on the command line. They will be fetched in a sequential manner in the specified order.

Since curl 7.15.1 you can also specify a step counter for the ranges, so that you can get every Nth number or letter:

http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt

If you specify a URL without a protocol:// prefix, curl will attempt to guess what protocol you might want. It will then default to HTTP but try other protocols based on often-used host name prefixes. For example, for host names starting with "ftp." curl will assume you want to speak FTP.
PROGRESS METER
curl normally displays a progress meter during operations, indicating the amount of transferred data, transfer speeds and estimated time left, etc. However, since curl displays data to the terminal by default, if you invoke curl to do an operation and it is about to write data to the terminal, it disables the progress meter, as otherwise it would mess up the output by mixing the progress meter and response data.

If you want a progress meter for HTTP POST or PUT requests, you need to redirect the response output to a file, using shell redirect (>), -o or similar.

This is not the case for FTP upload, as that operation does not spit out any response data to the terminal.
If you prefer a progress "bar" instead of the regular meter, -# is your friend.
OPTIONS
-a/--append
(FTP) When used in an FTP upload, this will tell curl to append to the target file instead of overwriting it. If the file doesn't exist, it will be created. If this option is used twice, the second one will disable append mode again.

-A/--user-agent <agent string>
(HTTP) Specify the User-Agent string to send to the HTTP server. This can also be set with the -H/--header option of course. If this option is set more than once, the last one will be the one that's used.

--anyauth
(HTTP) Tells curl to figure out the authentication method by itself, and use the most secure one the remote site claims it supports. This is done by first doing a request and checking the response headers, thus inducing an extra network round-trip. This is used instead of setting a specific authentication method, which you can do with --basic, --digest, --ntlm, and --negotiate.
If this option is used several times, the following occurrences make no difference.
-b/--cookie <name=data>
(HTTP) Pass the data to the HTTP server as a cookie. It is supposedly the data previously received from the server in a "Set-Cookie:" line. The data should be in the format "NAME1=VALUE1; NAME2=VALUE2". If no "=" letter is used in the line, it is treated as a filename to read previously stored cookie lines from, which should be used in this session if they match. Using this method also activates the "cookie parser", which will make curl record incoming cookies too, which may be handy if you're using this in combination with the -L/--location option. The file format of the file to read cookies from should be plain HTTP headers or the Netscape/Mozilla cookie file format.
NOTE that the file specified with -b/--cookie is only used as input. No cookies will be stored in the file. To store cookies, use the -c/--cookie-jar option, or you could even save the HTTP headers to a file using -D/--dump-header!
If this option is set more than once, the last one will be the one that's used.

-B/--use-ascii
Enable ASCII transfer when using FTP or LDAP. For FTP, this can also be enforced by using a URL that ends with ";type=A". This option causes data sent to stdout to be in text mode for win32 systems. If this option is used twice, the second one will disable ASCII usage.

--basic
(HTTP) Tells curl to use HTTP Basic authentication. This is the default, and this option is usually pointless unless you use it to override a previously set option that sets a different authentication method (such as --digest, --ntlm or --negotiate). If this option is used several times, the following occurrences make no difference.
--ciphers <list of ciphers>
(SSL) Specifies which ciphers to use in the connection. The list must consist of valid ciphers. Read up on SSL cipher list details in the OpenSSL documentation.
If this option is used several times, the last one will be used.

-c/--cookie-jar <file name>
Specify to which file you want curl to write all cookies after a completed operation. Curl writes all cookies previously read from a specified file as well as all cookies received from remote server(s). If no cookies are known, no file will be written. The file will be written using the Netscape cookie file format. If you set the file name to a single dash, "-", the cookies will be written to stdout.
If the cookie jar can't be created or written to, the whole curl operation won't fail or even report an error clearly. Using -v will get a warning displayed, but that is the only visible feedback you get about this possibly lethal situation.
-C/--continue-at <offset>
Continue/Resume a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped, counted from the beginning of the source file, before it is transferred to the destination. If used with uploads, the FTP server command SIZE will not be used by curl.
Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.
If this option is used several times, the last one will be used.

--create-dirs
When used in conjunction with the -o option, curl will create the necessary local directory hierarchy as needed. This option creates the dirs mentioned with the -o option, nothing else. If the -o file name uses no dir, or if the dirs it mentions already exist, no dir will be created.
To create remote directories when using FTP, try --ftp-create-dirs.

--crlf
(FTP) Convert LF to CRLF in upload. Useful for MVS (OS/390). If this option is used several times, the following occurrences make no difference.

-d/--data <data>
(HTTP) Sends the specified data in a POST request to the HTTP server. If this option is used more than once on the same command line, the data pieces specified will be merged together with a separating &-letter. Thus, using "-d name=daniel -d skill=lousy" would generate a post chunk that looks like "name=daniel&skill=lousy".
If you start the data with the letter @, the rest should be a file name to read the data from, or - if you want curl to read the data from stdin. The contents of the file must already be url-encoded. Multiple files can also be specified. Posting data from a file named "foobar" would thus be done with "--data @foobar".
To post data purely binary, you should instead use the --data-binary option.
-d/--data is the same as --data-ascii.
If this option is used several times, the ones following the first will append data.
--data-ascii <data>
(HTTP) This is an alias for the -d/--data option. If this option is used several times, the ones following the first will append data.

--data-binary <data>
(HTTP) This posts data in a similar manner as --data-ascii does, although when using this option the entire context of the posted data is kept as-is. If you want to post a binary file without the strip-newlines feature of the --data-ascii option, this is for you. If this option is used several times, the ones following the first will append data.

--digest
(HTTP) Enables HTTP Digest authentication. This is an authentication scheme that prevents the password from being sent over the wire in clear text. Use this in combination with the normal -u/--user option to set user name and password. See also --ntlm, --negotiate and --anyauth for related options.
If this option is used several times, the following occurrences make no difference.
-D/--dump-header <file>
Write the protocol headers to the specified file.
This option is handy to use when you want to store the headers that an HTTP site sends to you. Cookies from the headers could then be read in a second curl invocation by using the -b/--cookie option! The -c/--cookie-jar option is however a better way to store cookies.
When used on FTP, the FTP server response lines are considered being "headers" and thus are saved there.
If this option is used several times, the last one will be used.
-e/--referer <URL>
(HTTP) Sends the "Referer Page" information to the HTTP server. You can append ";auto" to the --referer URL to make curl automatically set the previous URL when it follows a Location: header. The ";auto" string can be used alone, even if you don't set an initial --referer.
If this option is used several times, the last one will be used.
--engine <name>
Select the OpenSSL crypto engine to use for cipher operations. Use "--engine list" to print a list of build-time supported engines.

--egd-file <file>
(HTTPS) Specify the path name to the Entropy Gathering Daemon socket. The socket is used to seed the random engine for SSL connections. See also the --random-file option.
-E/--cert <certificate[:password]>
(HTTPS) Tells curl to use the specified certificate file when getting a file with HTTPS. The certificate must be in PEM format. If the optional password isn't specified, it will be queried for on the terminal. Note that this certificate is the private key and the private certificate concatenated!
If this option is used several times, the last one will be used.

--cacert <CA certificate>
(HTTPS) Tells curl to use the specified certificate file to verify the peer. The file may contain multiple CA certificates.
The windows version of curl will automatically look for a CA certs file named "curl-ca-bundle.crt", either in the same directory as curl.exe, or in the Current Working Directory, or in any folder along your PATH.
If this option is used several times, the last one will be used.
-f/--fail
(HTTP) Fail silently (no output at all) on server errors. This is mostly done to better enable scripts etc. to deal with failed attempts. In normal cases when an HTTP server fails to deliver a document, it returns an HTML document stating so (which often also describes why, and more). This flag will prevent curl from outputting that and return error 22 instead.

--ftp-create-dirs
(FTP) When an FTP URL/operation uses a path that doesn't currently exist on the server, the standard behavior of curl is to fail. Using this option, curl will instead attempt to create missing directories. If this option is used twice, the second will again disable directory creation.
--ftp-method <method>
(FTP) Control what method curl should use to reach a file on an FTP(S) server. The method argument should be one of the following alternatives:
multicwd - curl does a single CWD operation for each path part in the given URL. For deep hierarchies this means very many commands. This is how RFC 1738 says it should be done. This is the default but the slowest behavior.
nocwd - curl does no CWD at all. curl will do SIZE, RETR, STOR etc. and give a full path to the server for all these commands. This is the fastest behavior.

--ftp-pasv
(FTP) Use PASV when transferring. PASV is the internal default behavior, but using this option can be used to override a previous --ftp-port option. (Added in 7.11.0)
If this option is used several times, the following occurrences make no difference.
--ftp-alternative-to-user <command>
(FTP) If authenticating with the USER and PASS commands fails, send this command. When connecting to Tumbleweed's Secure Transport server over FTPS using a client certificate, using "SITE AUTH" will tell the server to retrieve the username from the certificate. (Added in 7.15.5)

--ftp-skip-pasv-ip
(FTP) Tell curl to not use the IP address the server suggests in its response to curl's PASV command when curl connects the data connection. Instead curl will re-use the same IP address it already uses for the control connection. (Added in 7.14.2)
-F/--form <name=content>
(HTTP) This lets curl emulate a filled-in form in which a user has pressed the submit button. To force the content part to be read from a file, prefix the file name with an @ sign. For example, to send your password file to the server, where "password" is the name of the form-field to which /etc/passwd will be the input:

curl -F password=@/etc/passwd www.mypasswords.com

To read the file's content from stdin instead of a file, use - where the file name should've been. This goes for both @ and < constructs.

You can also tell curl what Content-Type to use by appending ";type=", in a manner similar to:

curl -F "web=@index.html;type=text/html" url.com
curl -F "name=daniel;type=text/foo" url.com

You can also explicitly change the name field of a file upload part by setting filename=, like this:

curl -F "file=@localfile;filename=nameinpost" url.com

See further examples and details in the MANUAL.
This option can be used multiple times.
-G/--get
When used, this option will make all data specified with -d/--data or --data-binary be used in an HTTP GET request instead of the POST request that would otherwise be used. The data will be appended to the URL with a "?" separator.
If this option is used twice, the second one will again disable GET usage.

-H/--header <header>
(HTTP) Extra header to use when getting a web page. You may specify any number of extra headers. curl will make sure that each header you add/replace gets sent with the proper end-of-line marker; you should thus not add that as part of the header content: do not add newlines or carriage returns, they will only mess things up for you.
See also the -A/--user-agent and -e/--referer options.
This option can be used multiple times to add/replace/remove multiple headers.
--ignore-content-length
(HTTP) Ignore the Content-Length header. This is particularly useful for servers running Apache 1.x, which will report incorrect Content-Length for files larger than 2 gigabytes.
-i/--include
(HTTP) Include the HTTP header in the output. The HTTP header includes things like server name, date of the document, HTTP version and more. If this option is used twice, the second will again disable header include.
--interface <name>
Perform the operation using a specified interface. You can enter an interface name, IP address or host name. If this option is used several times, the last one will be used.

-k/--insecure
(SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers. All SSL connections are attempted to be made secure by using the CA certificate bundle installed by default. This makes all connections considered "insecure" fail unless -k/--insecure is used.
If this option is used twice, the second time will disable it again.
--key <key>
(SSL) Private key file name. Allows you to provide your private key in this separate file.
If this option is used several times, the last one will be used.

--key-type <type>
(SSL) Private key file type. Specify which type your --key provided private key is. DER, PEM and ENG are supported. If this option is used several times, the last one will be used.

--krb4 <level>
(FTP) Enable kerberos4 authentication and use. The level must be entered and should be one of "clear", "safe", "confidential" or "private". Should you use a level that is not one of these, "private" will instead be used.
This option requires that the library was built with kerberos4 support. This is not very common. Use -V/--version to see if your curl supports it.
If this option is used several times, the last one will be used.

-K/--config <config file>
Specify which config file to read curl arguments from. The config file is a text file in which command line arguments can be written which then will be used as if they were written on the actual command line. Options and their parameters must be specified on the same config file line. If the parameter is to contain white spaces, the parameter must be enclosed within quotes. If the first column of a config line is a "#" character, the rest of the line will be treated as a comment.
Specify the filename as "-" to make curl read the file from stdin. Note that to be able to specify a URL in the config file, you need to specify it using the --url option.
1) curl tries to find the "home dir": It first checks for the CURL_HOME and then the HOME environment variables. Failing that, it uses getpwuid() on unix-like systems (which returns the home dir given the current user in your system). On Windows, it then checks for the APPDATA variable, or as a last resort the "%USERPROFILE%Application Data".
2) On windows, if there is no _curlrc file in the home dir, it checks for one in the same dir the executable curl is placed. On unix-like systems, it will simply try to load .curlrc from the determined home dir.
--limit-rate <speed>
Specify the maximum transfer rate you want curl to use. This feature is useful if you have a limited pipe and you'd like your transfer not to use your entire bandwidth.
The given speed is measured in bytes/second, unless a suffix is appended. Appending "k" or "K" will count the number as kilobytes, "m" or "M" makes it megabytes, while "g" or "G" makes it gigabytes. Examples: 200K, 3m and 1G.
If you are also using the -Y/--speed-limit option, that option will take precedence and might cripple the rate-limiting slightly, to help keep the speed-limit logic working.
If this option is used several times, the last one will be used.
-l/--list-only
(FTP) When listing an FTP directory, this switch forces a name-only view. Especially useful if you want to machine-parse the contents of an FTP directory, since the normal directory view doesn't use a standard look or format.
-L/--location
(HTTP/HTTPS) If the server reports that the requested page has moved to a different location (indicated with a Location: header and a 3XX response code), this option will make curl redo the request on the new place. If this option is used twice, the second will again disable location following.

--location-trusted
(HTTP/HTTPS) Like -L/--location, but will allow sending the name + password to all hosts that the site may redirect to. This may or may not introduce a security breach if the site redirects you to a site to which you'll send your authentication info (which is plaintext in the case of HTTP Basic authentication).
--max-filesize <bytes>
Specify the maximum size (in bytes) of a file to download. If the file requested is larger than this value, the transfer will not start and curl will return with exit code 63.

-m/--max-time <seconds>
Maximum time in seconds that you allow the whole operation to take. This is useful for preventing your batch jobs from hanging for hours due to slow networks or links going down. See also the --connect-timeout option.
If this option is used several times, the last one will be used.

-M/--manual
Manual. Display the huge help text.

-n/--netrc
Makes curl scan the .netrc file in the user's home directory for login name and password. This is typically used for FTP on unix. If used with HTTP, curl will enable user authentication. See netrc(4) for details on the file format. Curl will not complain if that file doesn't have the right permissions (it should not be world nor group readable). The environment variable "HOME" is used to find the home directory.
A quick and very simple example of how to set up a .netrc to allow curl to FTP to the machine host.domain.com with user name "myself" and password "secret" should look similar to:
machine host.domain.com login myself password secret
--negotiate
(HTTP) Enables GSS-Negotiate authentication. The GSS-Negotiate method was designed by Microsoft and is used in their web applications. It is primarily meant as a support for Kerberos5 authentication, but may also be used along with other authentication methods. For more information see IETF draft draft-brezak-spnego-http-04.txt.
This option requires that the library was built with GSSAPI support. This is not very common. Use -V/--version to see if your version supports GSS-Negotiate.
When using this option, you must also provide a fake -u/--user option to activate the authentication code properly. Sending a "-u:" is enough, as the user name and password from the -u option aren't actually used.
If this option is used several times, the following occurrences make no difference.
-N/--no-buffer
Disables the buffering of the output stream. In normal work situations, curl will use a standard buffered output stream that will have the effect that it will output the data in chunks, not necessarily exactly when the data arrives. Using this option will disable that buffering.
If this option is used twice, the second will again switch on buffering.
--ntlm
(HTTP) Enables NTLM authentication. The NTLM authentication method was designed by Microsoft and is used by IIS web servers. It is a proprietary protocol, reverse-engineered by clever people and implemented in curl based on their efforts. This kind of behavior should not be endorsed; you should encourage everyone who uses NTLM to switch to a public and documented authentication method instead, such as Digest.
If you want to enable NTLM for your proxy authentication, then use --proxy-ntlm.
If this option is used several times, the last one will be used.
-o/--output <file>
Write output to <file> instead of stdout. If you are using {} or [] to fetch multiple documents, you can use "#" followed by a number in the <file> specifier. That variable will be replaced with the current string for the URL being fetched. Like in:
curl http://{one,two}.site.com -o "file_#1.txt"
You may use this option as many times as you have number of URLs.
See also the --create-dirs option.
(SSL) Pass phrase for the private key
If this option is used several times, the last one will be used.
--anyauth (HTTP) Tells curl to figure out the authentication method by itself, and use the most secure one the remote site claims it supports. This is done by first doing a request and checking the response headers, thus inducing an extra network round-trip. This is used instead of setting a specific authentication method, which you can do with --basic, --digest, --ntlm or --negotiate.
--proxy-anyauth Tells curl to pick a suitable authentication method when communicating with the given proxy. This will cause an extra request/response round-trip. (Added in 7.13.2) If this option is used twice, the second will again disable the proxy use-any authentication.
--basic (HTTP) Tells curl to use HTTP Basic authentication with the remote host.
--proxy-basic Tells curl to use HTTP Basic authentication when communicating with the given proxy. Use --basic for enabling HTTP Basic with a remote host. Basic is the default authentication method curl uses with proxies. If this option is used twice, the second will again disable proxy HTTP Basic authentication.
--digest (HTTP) Tells curl to use HTTP Digest authentication with the remote host.
--proxy-digest Tells curl to use HTTP Digest authentication when communicating with the given proxy. Use --digest for enabling HTTP Digest with a remote host. If this option is used twice, the second will again disable proxy HTTP Digest.
--proxy-ntlm Tells curl to use HTTP NTLM authentication when communicating with the given proxy. Use --ntlm for enabling NTLM with a remote host. If this option is used twice, the second will again disable proxy HTTP NTLM.
-p/--proxytunnel When an HTTP proxy is used (-x/--proxy), this option will cause non-HTTP protocols to attempt to tunnel through the proxy instead of simply using it to do HTTP-like operations. The tunnel approach is made with the HTTP proxy CONNECT request and requires that the proxy allows direct connect to the remote port number curl wants to tunnel through to. If this option is used twice, the second will again disable proxy tunnel.
-q If used as the first parameter on the command line, the curlrc config file will not be read and used. See the -K/--config option for details on the default config file.
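For reference, the CONNECT request that -p/--proxytunnel makes curl send to the proxy has roughly the shape below; ftp.example.com:21 is a placeholder target, and the printf simply prints that request rather than performing it:

```shell
# Print the shape of the HTTP CONNECT request curl sends to a proxy
# when tunneling a non-HTTP protocol (placeholder host and port):
host=ftp.example.com
port=21
printf 'CONNECT %s:%s HTTP/1.1\r\nHost: %s:%s\r\n\r\n' "$host" "$port" "$host" "$port"
```

If the proxy answers with a 2xx status, curl then speaks the tunneled protocol (FTP here) through the open connection.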
-Q/--quote <command> (FTP) Send an arbitrary command to the remote FTP server before the transfer takes place.
-r/--range <range> (HTTP/FTP) Retrieve a byte range (i.e. a partial document). For example:
9500- specifies the bytes from offset 9500 and forward
0-0,-1 specifies the first and last byte only(*)(H)
500-700,600-799 specifies 300 bytes from offset 500(H)
100-199,500-599 specifies two separate 100-byte ranges(*)(H)
(*) = NOTE that this will cause the server to reply with a multipart response!
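HTTP byte ranges are inclusive, so a request covering offsets 500 through 799 is exactly the "300 bytes from offset 500" case above; a quick arithmetic check:

```shell
# Inclusive range: last - first + 1 bytes, so offsets 500..799 = 300 bytes.
echo $((799 - 500 + 1))
```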
When curl is about to retry a transfer, it will first wait one second and then for all upcoming retries it will double the waiting time until it reaches 10 minutes which then will be the delay between the rest of the retries. By using --retry-delay you disable this exponential backoff algorithm. See also --retry-max-time to limit the total time allowed for retries. (Added in 7.12.3)
If this option is used multiple times, the last occurrence decides the amount.
--retry-delay <seconds> Make curl sleep this amount of time between each retry when a transfer has failed with a transient error. This option is only interesting if --retry is also used. Setting this delay to zero will make curl use the default backoff time. (Added in 7.12.3)
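The default backoff described above (wait one second, then double the delay each retry, capping at 10 minutes) can be sketched as a small loop; the 11-retry count is just an example:

```shell
# Compute the default retry delays: 1s, doubling per retry, capped at 600s.
delay=1
schedule=""
retry=1
while [ "$retry" -le 11 ]; do
  schedule="${schedule:+$schedule }$delay"   # append current delay
  delay=$((delay * 2))                       # double for the next retry
  if [ "$delay" -gt 600 ]; then delay=600; fi  # cap at 10 minutes
  retry=$((retry + 1))
done
echo "$schedule"   # -> 1 2 4 8 16 32 64 128 256 512 600
```

Once the cap is hit, every further retry waits the full 600 seconds, which is why --retry-max-time exists to bound the total time spent retrying.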
--retry-max-time <seconds> The retry timer is reset before the first transfer attempt. Retries will be done as usual (see --retry) as long as the timer hasn't reached this given limit. Notice that if the timer hasn't reached the limit, the request will be made and while performing, it may take longer than this given time period. To limit a single request's maximum time, use -m/--max-time. Set this option to zero to not timeout retries. (Added in 7.12.3)
If this option is used multiple times, the last occurrence decides the amount.
-s/--silent Silent mode. Don't show the progress meter or error messages. If this option is used twice, the second will again disable silent mode.
-S/--show-error When used with -s, it makes curl show an error message if it fails. If this option is used twice, the second will again disable show error.
--socks4 <host[:port]> Use the specified SOCKS4 proxy. If the port number is not specified, it is assumed at port 1080. (Added in 7.15.2)
If this option is used several times, the last one will be used.
--tcp-nodelay Turn on the TCP_NODELAY option. See the curl_easy_setopt(3) man page for details about this option. (Added in 7.11.2)
-t/--telnet-option <OPT=val> Pass options to the telnet protocol. Supported options are:
TTYPE=<term> Sets the terminal type.
XDISPLOC=<X display> Sets the X display location.
NEW_ENV=<var,val> Sets an environment variable.
-T/--upload-file <file>
This transfers the specified local file to the remote URL. If there is no file part in the specified URL, Curl will append the local file name. NOTE that you must use a trailing / on the last directory to really prove to Curl that there is no file name or curl will think that your last directory name is the remote file name to use. That will most likely cause the upload operation to fail. If this is used on a http(s) server, the PUT command will be used.
Use the file name "-" (a single dash) to use stdin instead of a given file. You can specify one -T for each URL on the command line. Each -T + URL pair specifies what to upload and to where. curl also supports "globbing" of the -T argument, meaning that you can upload multiple files to a single URL by using the same URL globbing style supported in the URL, like this:
curl -T "{file1,file2}" http://www.uploadtothissite.com
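The remote-name rule for -T can be sketched as a tiny helper; this is a hypothetical function mirroring the documented behavior, not part of curl, and example.com is a placeholder host:

```shell
# Hypothetical helper mirroring curl's documented -T naming rule:
# a trailing slash means "append the local file name", otherwise the
# URL's last path segment becomes the remote file name.
remote_target() {
  url=$1
  local_file=$2
  case $url in
    */) echo "${url}${local_file}" ;;  # trailing slash: append local name
    *)  echo "$url" ;;                 # no slash: last segment is the name
  esac
}
remote_target "ftp://example.com/dir/" report.txt   # appends report.txt
remote_target "ftp://example.com/dir/data" report.txt  # uploads as "data"
```

This is exactly why forgetting the trailing slash makes curl treat your last directory name as the remote file name.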
FILES
~/.curlrc Default config file.
ENVIRONMENT
https_proxy [protocol://]<host>[:port] Sets the proxy server to use for HTTPS.
There exists a bunch of different error codes and corresponding error messages that may appear during bad conditions. At the time of this writing, the exit codes are (the existing ones are meant to never change):
1 Unsupported protocol. This build of curl has no support for this protocol.
2 Failed to initialize.
3 URL malformat. The syntax was not correct.
4 URL user malformatted. The user-part of the URL syntax was not correct.
5 Couldn't resolve proxy. The given proxy host could not be resolved.
6 Couldn't resolve host. The given remote host was not resolved.
7 Failed to connect to host.
8 FTP weird server reply. The server sent data curl couldn't parse.
9 FTP access denied. The server denied login or denied access to the particular resource or directory you wanted to reach. Most often you tried to change to a directory that doesn't exist on the server.
10 FTP user/password incorrect. Either one or both were not accepted by the server.
Installing curl on Windows
Find curl.exe in your downloaded package; it's probably under bin\ .
Select a location on your hard drive that will serve as a permanent home for curl:
- If you want to give curl its own folder, C:\Program Files\curl\ or C:\curl\ will do.
- If you have a lot of loose executables and don't want to add many separate folders to your PATH, use a single folder for this purpose, such as C:\Program Files\tools\ or C:\tools\ .
Place curl.exe in that folder, and never move the folder or its contents.
Then make curl available anywhere on the command line by adding the folder to PATH:
- Click the Windows 10 Start menu and start typing "environment".
- You will see the search result "Edit the system environment variables". Choose it.
- The System Properties window opens. Click the Environment Variables button at the bottom.
- Select the "Path" variable in the "System variables" section (bottom field) and click Edit.
- Click Add and paste the path to the folder where curl.exe is located.
- Click OK as needed. Close any open console windows and reopen them so they pick up the new PATH.
Now enjoy typing curl on any command line. Time to have fun!
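Scripts can branch on the numeric exit codes listed earlier; here is a minimal helper mapping a few of them to messages (the message strings follow the list above, the function name is our own):

```shell
# Minimal mapper from curl exit codes to messages (subset of the list).
explain_curl_exit() {
  case $1 in
    0) echo "OK" ;;
    6) echo "Couldn't resolve host" ;;
    7) echo "Failed to connect to host" ;;
    *) echo "curl failed with exit code $1" ;;
  esac
}
explain_curl_exit 7
```

In a real script you would call it as explain_curl_exit $? immediately after a curl invocation.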
To run curl from the command line
a) Right click on the My Computer icon
b) Select "Properties"
c) Go to the "Advanced" tab and click the "Environment Variables" button
d) Under "System variables" select "Path" and click "Edit"
e) Add a semicolon and then the path to where you placed your curl.exe (e.g. D:\software\curl)
You can now run from the command line by typing:
curl www.google.com
Starting with Windows 10 version 1803 (and earlier, with Insider build 17063), you no longer need to install curl. Windows ships its own curl.exe (and tar.exe) in C:\Windows\System32\ , which you can access directly from a regular CMD.
C:\Users\vonc> C:\Windows\System32\curl.exe --version
curl 7.55.1 (Windows) libcurl/7.55.1 WinSSL
Release-Date:
Protocols: dict file ftp ftps http https imap imaps pop3 pop3s smtp smtps telnet tftp
Features: AsynchDNS IPv6 Largefile SSPI Kerberos SPNEGO NTLM SSL
C:\Users\vonc> C:\Windows\System32\tar.exe --version
bsdtar 3.3.2 - libarchive 3.3.2 zlib/1.2.5.f-ipp
It's probably worth noting that PowerShell v3 and later includes the Invoke-WebRequest cmdlet, which has some curl-like capabilities. The New-WebServiceProxy and Invoke-RestMethod cmdlets are probably worth mentioning as well.
I'm not sure whether they will suit you, but even though I'm not a Windows user, I have to say that I find the object-based approach PowerShell takes much easier to work with than utilities like curl, wget, etc. They might be worth a look.
You can build the latest versions of curl, openssl, libssh2 and zlib in 3 easy steps by following this tutorial.
curl is built statically, so you don't have to ship the dynamic runtime along with it.
You can also download the pre-built version (x86 and x64) from
I was looking for how to download curl, and everywhere they said to copy curl.exe to System32, but they didn't provide a direct link. Here you can get curl.exe easily: just unzip the package, then go to the bin folder, where you will find the exe file.
I thought I would write exactly what I did (Windows 10, 64-bit version):
Select the curl executable.
Select Win64.
Choose universal.
Choose any one.
curl version: 7.53.1 - SSL enabled, SSH enabled. Credit: Viktor Szakats. This package is a curl executable. The link will get you a pre-compiled curl binary (or, in some cases, point you to a page with the information you need to get one). The binary may or may not include libcurl as a shared library/DLL. The file is packaged using 7zip, a file archiving format.
Click download.
You should have curl-7.53.1-win64-mingw.7z file in your downloads folder.
Install 7-Zip if you don't have it.
Right click, 7-Zip, Extract here. Copy and paste the extracted file somewhere like Z:\Tools\
If you look in the bin folder you will see curl.exe. If you double click on it, the window will quickly flash and disappear. To run it, you need to use the command line. Go to your bin folder and enter curl followed by your options to make the request. You must use double quotes. Single quotes will not work with curl on Windows.
Now you need to add curl to the user Path variable so that you don't have to navigate to the correct folder to run the program. Go to This PC, Computer, System Properties, Advanced System Settings, log in as an administrator (you are not an administrator, right? Right?). Environment Variables, System Variables, look at the list and select Path, then Edit, then New, then e.g.
Z:\Tools\curl-7.53.1-win64-MinGW\Bin
You can add a backslash if you want, I don't think it matters. Press the move up button until it is at the top of the list and you can easily see it from the previous screen. Click OK, OK, OK, then open the command prompt and you can run curl by typing curl from any folder as any user. Don't forget your double quotes.
This is the answer I would like to receive.
This installer made it easy for me http://www.confusedbycode.com/curl/
What is cURL for?
- cURL is great for simulating user actions in a browser.
Real practical example: you need to reboot your router (modem) to change the IP address. To do this, you need to: log in to the router, go to the maintenance page and click the “Reboot” button. If this action needs to be performed several times, then the procedure must be repeated. Agree, you don’t want to do this routine manually every time. cURL allows you to automate all of this. With just a few cURL commands you can achieve authorization and complete the task on the router.
- cURL is useful for retrieving data from websites on the command line.
Another practical example: we want to implement the display of general statistics for several sites. If we use cURL, then this becomes a completely trivial task: using cURL we authenticate on the statistics collection service (if required), then (again using cURL commands) we obtain the necessary pages, parse the data we need; the procedure is repeated for all our sites, then we add and display the final result.
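The aggregation idea can be tried offline; below, the printf calls stand in for pages fetched with curl -s, and the stats format and numbers are fabricated sample data:

```shell
# Offline sketch: parse a number out of each "page" and sum the results.
# Each string stands in for the output of a curl -s fetch of a stats page.
total=0
for page in "visits: 120" "visits: 305" "visits: 75"; do
  n=$(printf '%s\n' "$page" | grep -E -o '[0-9]+')  # extract the number
  total=$((total + n))
done
echo "$total"   # -> 500
```

In the real task, each loop iteration would be a curl request (with authentication where required) followed by the same parsing step.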
That is, the use cases for cURL are quite real, although for the most part cURL is needed by programmers, who use it in their programs.
cURL supports many protocols and authorization methods, can transfer files, works correctly with cookies, supports SSL certificates, proxies and much more.
cURL in PHP and command line
We can use cURL in two main ways: in PHP scripts and on the command line.
To enable cURL in PHP on the server, you need to uncomment the line in the php.ini file
extension=php_curl.dll
and then restart the web server.
On Linux you need to install the curl package.
On Debian, Ubuntu or Linux Mint:
$ sudo apt-get install curl
On Fedora, CentOS or RHEL:
$ sudo yum install curl
To clearly see the difference in use in PHP and on the command line, we will perform the same tasks twice: first in the PHP script, and then on the command line. Let's try not to get confused.
Retrieving data using cURL
Retrieving data using cURL in PHP
Example in PHP:
<?php
$target_url = "http://mi-al.ru";
$ch = curl_init($target_url);
curl_exec($ch);
curl_close($ch);
?>
Everything is very simple:
$target_url — the address of the site that interests us. After the site address, you can put a colon and add the port number (if the port differs from the standard one).
curl_init— initializes a new session and returns a descriptor, which in our example is assigned to a variable $ch.
We then execute the request with the cURL function curl_exec, to which a descriptor is passed as a parameter.
Everything is very logical, but when this script is executed, the contents of the site will be displayed on our page. But what if we don’t want to display the content, but want to write it to a variable (for subsequent processing or parsing).
Let's add a little to our script:
<?php
$target_url = "http://mi-al.ru";
$ch = curl_init($target_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response_data = curl_exec($ch);
if (curl_errno($ch) > 0) {
    echo "Curl error: " . curl_error($ch);
}
curl_close($ch);
?>
We have added the line curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);.
curl_setopt— sets options. A complete list of options can be found on this page: http://php.net/manual/ru/function.curl-setopt.php
$response_data = curl_exec($ch);
Now the response is assigned to the $response_data variable, with which further operations can be performed. For example, you can display its contents.
The lines
if (curl_errno($ch) > 0) { echo "Curl error: " . curl_error($ch); }
serve for debugging in case errors occur.
Retrieving data using cURL on the command line
On the command line, just type
curl mi-al.ru
where you replace mi-al.ru with your own website address.
If you need to copy data into a variable, rather than display the result on the screen, then do this:
temp=`curl mi-al.ru`
However, some data is still displayed:
To prevent it from being displayed, add the -s option:
temp=`curl -s mi-al.ru`
You can see what has been recorded:
echo $temp | less
Basic and HTTP authentication
Authentication, simply put, is entering a username and password.
Basic authentication is authentication performed by the server. For this, two files are created: .htaccess and .htpasswd
The contents of the .htaccess file are something like this
AuthName "For registered users only!"
AuthType Basic
require valid-user
AuthUserFile /home/freeforum.biz/htdocs/.htpasswd
The contents of the .htpasswd file are something like this:
mial:CRdiI.ZrZQRRc
That is, a login and a password hash.
When you try to access a password-protected folder, the browser will display something like this:
HTTP authentication is the case when we enter a login and password into a form on the site. It is this authentication that is used when logging into mail, forums, etc.
Basic cURL authentication (PHP)
There is a site http://62.113.208.29/Update_FED_DAYS/ that requires us to log in:
Let's try our initial script:
<?php
$target_url = "http://62.113.208.29/Update_FED_DAYS/";
$ch = curl_init($target_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response_data = curl_exec($ch);
if (curl_errno($ch) > 0) {
    echo "Curl error: " . curl_error($ch);
} else {
    echo $response_data;
}
curl_close($ch);
?>
Although the script believes that there is no error, we don’t like the output result at all:
Add two lines:
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, "ru-board:ru-board");
In the first line we set the authentication type to basic. The second line contains the name and password separated by a colon (in our case, the name and password are the same: ru-board). It turned out like this:
<?php
$target_url = "http://62.113.208.29/Update_FED_DAYS/";
$ch = curl_init($target_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, "ru-board:ru-board");
$response_data = curl_exec($ch);
if (curl_errno($ch) > 0) {
    echo "Curl error: " . curl_error($ch);
} else {
    echo $response_data;
}
curl_close($ch);
?>
Basic cURL authentication (on the command line)
The same thing can be achieved on the command line with one line:
curl -u ru-board:ru-board http://62.113.208.29/Update_FED_DAYS/
I didn't forget to specify the authentication type; it's just that Basic is cURL's default authentication type.
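Under the hood, Basic authentication is nothing more than a base64-encoded "login:password" pair in the Authorization header; you can reproduce the value -u would send (mial:secret is a placeholder credential pair):

```shell
# The value curl -u user:pass puts after "Authorization: Basic ".
printf 'mial:secret' | base64   # -> bWlhbDpzZWNyZXQ=
```

This is also why Basic authentication must only be used over HTTPS: base64 is an encoding, not encryption, and is trivially reversed.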
Everything worked so quickly on the command line that I got carried away and wrote the following little program. It connects to the site and downloads the latest update:
temp=`curl -s -u ru-board:ru-board http://62.113.208.29/Update_FED_DAYS/ | grep -E -o "Update_FED_201{1}.{2}.{2}.7z" | uniq | tail -n 1`; curl -o $temp -u ru-board:ru-board http://62.113.208.29/Update_FED_DAYS/$temp
With just a few more commands you can add:
- unpacking the archive into the specified directory;
- launching ConsultantPlus updates (these are updates for it);
- you can check whether the latest available update has already been downloaded or whether a new one has appeared;
- add it all to Cron for daily updates.
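The name-selection pipeline in the one-liner above can be tried offline; the listing variable below is fabricated sample HTML, and the regex is a simplified stand-in for the one in the script:

```shell
# Offline sketch: grep the archive names out of a directory listing,
# drop adjacent duplicates, and take the newest (last) entry.
listing='<a href="Update_FED_2014.01.10.7z">old</a>
<a href="Update_FED_2014.01.10.7z">dup</a>
<a href="Update_FED_2014.01.17.7z">new</a>'
latest=$(printf '%s\n' "$listing" | grep -E -o 'Update_FED_201[0-9.]+7z' | uniq | tail -n 1)
echo "$latest"   # -> Update_FED_2014.01.17.7z
```

In the real script, the listing comes from the authenticated curl -s fetch, and the selected name is then passed to a second curl call with -o to download the archive.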
cURL HTTP authentication
cURL HTTP authentication in PHP
We need to know:
- address where to send authentication data
- sending method GET or POST
- login
- password
Sometimes this data is not enough. Let's figure it out.
The address where you need to send the data can be taken from the authentication form. For example: