dig wget curl


dig

dig @202.106.0.20 www.chinacache.com    # the @ specifies which DNS server to query

dig chinacache.com NS          # query the NS records to see the authoritative name servers; note the name queried is the domain (zone) chinacache.com, not the host www.chinacache.com
dig www.chinacache.com +trace  # use +trace to follow the recursive resolution step by step
dig chinacache.com MX          # query the MX records (mail exchangers); again the name queried is chinacache.com
dig chinacache.com ANY         # use ANY to ask for all record types
dig chinacache.com AAAA        # query the AAAA (IPv6 address) records
dig www.chinacache.com +short  # print the most concise answer possible, handy in scripts
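
A minimal sketch of feeding the +short answer into a script variable; the hostname is just the example used above, and if the name is a CNAME the first line may be the alias target rather than an IP:

IP=$(dig +short www.chinacache.com | head -n 1)
if [ -n "$IP" ]; then
    echo "www.chinacache.com resolves to $IP"
else
    echo "resolution failed" >&2
fi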
wget

* -b run in the background; by default the log is written to wget-log in the current directory
* -o file write the wget log to file
* -a file append the wget log to file
* -d print debug output, including the request and response headers
* -q quiet mode
* -i file read a list of URLs to download from file
* -O file save the download under the given name
* -c resume an interrupted download
* -S print the response headers
* -U specify the User-Agent
* --header=header-line send an extra request header, e.g. --header='Host: www.baidu.com'
* --referer=url set the Referer header, e.g. --referer=http://www.baidu.com
* --limit-rate limit the download speed, e.g. --limit-rate=100k
* -e http_proxy=123.123.123.123 set a proxy (-e passes a .wgetrc-style command on the command line)
* wget -U "Mozilla/5.0 (Windows NT 6.1; WOW64)" (in wget the User-Agent switch is -U/--user-agent, not -A as in curl)
* wget --referer=www.baidu.com
* wget --user-agent="Mozilla/5.0"
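
A combined sketch of the options above; the download URL is a placeholder, the other values are the examples already used in this list:

# run in the background, resume a partial file, limit the rate, and log to download.log
wget -b -c --limit-rate=100k \
     -U "Mozilla/5.0 (Windows NT 6.1; WOW64)" \
     --referer=http://www.baidu.com \
     -o download.log \
     http://example.com/big-file.iso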

curl

* curl http://www.baidu.com print the response body straight to the screen
* -A set the User-Agent, e.g. -A chinacache
* -e set the Referer, e.g. -e www.baidu.com
* -H send an extra header, e.g. -H "From: chinacache"
* -I send a HEAD request
* --limit-rate limit the transfer speed, e.g. --limit-rate 100K
* -m time timeout for the whole transfer, in seconds, e.g. -m 120
* -o save the output to a file
* -r send a Range request, e.g. -r 100-200
* -s silent mode
* -x use a proxy, e.g. -x 8.8.8.8:80
* -0/--http1.0 make the request with HTTP/1.0
* -v verbose output: print the request and response headers
* --compressed request a compressed response (the option is --compressed, not --compress)
* -w/--write-out define what curl prints after the transfer, such as the HTTP status code, TCP connect time, DNS resolution time, handshake time, and time to first byte; very powerful (see the examples below)
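
A combined sketch of several of the options above; the proxy address is the same illustrative value used in the list and will not work as a real proxy:

# verbose request through a proxy with a custom User-Agent, Referer and extra header,
# a 120-second overall timeout, discarding the response body
curl -v -A chinacache -e www.baidu.com -H "From: chinacache" \
     -x 8.8.8.8:80 -m 120 -o /dev/null http://www.baidu.com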

Usage examples:
curl -o /dev/null -s -w "%{http_code}" "http://www.baidu.com"    # print the returned HTTP status code
curl -o /dev/null -s -w "time_total: %{time_total}\n" "http://www.baidu.com"    # print the total response time
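
Several timing variables can also be combined in a single format string; a sketch:

curl -o /dev/null -s -w "dns: %{time_namelookup}\nconnect: %{time_connect}\nttfb: %{time_starttransfer}\ntotal: %{time_total}\n" "http://www.baidu.com"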

    * url_effective
    * http_code
    * http_connect
    * time_total
    * time_namelookup
    * time_connect
    * time_pretransfer
    * time_redirect
    * time_starttransfer
    * size_download
    * size_upload
    * size_header
    * size_request
    * speed_download
    * speed_upload
    * content_type
    * num_connects
    * num_redirects
    * ftp_entry_path

The available variable names are described below:

 -w, --write-out
 The variables below are printed in whatever format curl considers appropriate. Write a variable as %{variable_name}; to output a literal % character, double it (%%). \n produces a newline, \r a carriage return, and \t a tab.
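
For example, a format string mixing literal text, a %% escape and a variable (a sketch only):

curl -s -o /dev/null -w "done 100%%, status %{http_code}\n" "http://www.baidu.com"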

url_effective The URL that was fetched last. This is most meaningful if you've told curl to follow location: headers.

filename_effective The ultimate filename that curl writes out to. This is only meaningful if curl is told to write to a file with the --remote-name or --output option. It's most useful in combination with the --remote-header-name option. (Added in 7.25.1)

http_code The numerical response code found in the last retrieved HTTP(S) or FTP(S) transfer, e.g. 200 OK, 301 redirect, 404 not found, 500 server error. (In 7.18.2 the alias response_code was added to show the same info.)

http_connect The numerical code that was found in the last response (from a proxy) to a curl CONNECT request. (Added in 7.12.4)

time_total The total time, in seconds, that the full operation lasted, displayed with millisecond resolution.

time_namelookup The time, in seconds, from the start until name (DNS) resolving was completed.

time_connect The time, in seconds, from the start until the TCP connect to the remote host (or proxy) was completed. This includes the DNS resolution above; to get the pure connect time, subtract time_namelookup from time_connect. The same reasoning applies to the timings below.

time_appconnect The time, in seconds, from the start until the SSL/SSH/etc connect/handshake to the remote host was completed. (Added in 7.19.0)

time_pretransfer The time, in seconds, from the start until the file transfer was just about to begin. This includes all pre-transfer commands and negotiations specific to the protocol(s) involved.

time_redirect The time, in seconds, taken by all redirection steps, including name lookup, connect, pretransfer and transfer, before the final transaction was started; it shows the complete execution time for multiple redirections. (Added in 7.12.3)

time_starttransfer Time to first byte: the time, in seconds, from the start until the first byte was just about to be transferred, i.e. how long the web server takes to return the first byte of the response after the request is sent. This includes time_pretransfer and also the time the server needed to calculate the result.

size_download The total number of bytes that were downloaded.

size_upload The total number of bytes that were uploaded.

size_header The total number of bytes of the downloaded headers.

size_request The total number of bytes that were sent in the HTTP request.

speed_download The average download speed curl measured for the complete download, in bytes per second.

speed_upload The average upload speed curl measured for the complete upload, in bytes per second.

content_type The Content-Type of the requested document, if there was any; for example, a request to my blog's home page returns text/html; charset=UTF-8.

num_connects Number of new connects made in the recent transfer. (Added in 7.12.3)

num_redirects Number of redirects that were followed in the request. (Added in 7.12.3)

redirect_url When a HTTP request was made without -L to follow redirects, this variable will show the actual URL a redirect would take you to. (Added in 7.18.2)

ftp_entry_path The initial path libcurl ended up in when logging on to the remote FTP server. (Added in 7.15.4)

ssl_verify_result The result of the SSL peer certificate verification that was requested; 0 means the verification was successful. (Added in 7.19.0)
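
As noted under time_connect, the pure TCP connect time is time_connect minus time_namelookup. A small sketch of that arithmetic, assuming bash and bc are available; the URL is just the example used above:

out=$(curl -o /dev/null -s -w "%{time_namelookup} %{time_connect} %{time_starttransfer} %{time_total}" "http://www.baidu.com")
read dns connect ttfb total <<< "$out"                       # split the four space-separated timings
echo "pure tcp connect:    $(echo "$connect - $dns" | bc)"   # connect time without DNS resolution
echo "wait for first byte: $(echo "$ttfb - $connect" | bc)"  # server think time before the transfer starts
echo "total:               $total"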

If the -w option is used several times, the last one will be used.
Reproduced from DigDeeply's Blog: measuring a site's various response times with cURL (DNS resolution time, response time, transfer time).
