Using skipfish, the web security scanner, on Kali Linux

Posted by 我超怕的


0x00. skipfish overview

skipfish is an open-source web application security assessment tool released by Google.

skipfish highlights: low CPU usage, fast scanning (it comfortably handles 2,000 requests per second), and a low false-positive rate.

1x00. Using skipfish

1x01 Help output


root@kali:~# skipfish --help
    skipfish web application scanner - version 2.10b
    Usage: skipfish [ options ... ] -W wordlist -o output_dir start_url [ start_url2 ... ]

    Authentication and access options:

      -A user:pass      - use specified HTTP authentication credentials
      -F host=IP        - pretend that host resolves to IP
      -C name=val       - append a custom cookie to all requests
      -H name=val       - append a custom HTTP header to all requests
      -b (i|f|p)        - use headers consistent with MSIE / Firefox / iPhone
      -N                - do not accept any new cookies
      --auth-form url   - form authentication URL
      --auth-user user  - form authentication user
      --auth-pass pass  - form authentication password
      --auth-verify-url -  URL for in-session detection

    Crawl scope options:

      -d max_depth     - maximum crawl tree depth (16)
      -c max_child     - maximum children to index per node (512)
      -x max_desc      - maximum descendants to index per branch (8192)
      -r r_limit       - max total number of requests to send (100000000)
      -p crawl%        - node and link crawl probability (100%)
      -q hex           - repeat probabilistic scan with given seed
      -I string        - only follow URLs matching string
      -X string        - exclude URLs matching string
      -K string        - do not fuzz parameters named string
      -D domain        - crawl cross-site links to another domain
      -B domain        - trust, but do not crawl, another domain
      -Z               - do not descend into 5xx locations
      -O               - do not submit any forms
      -P               - do not parse html, etc, to find new links

    Reporting options:

      -o dir          - write output to specified directory (required)
      -M              - log warnings about mixed content / non-SSL passwords
      -E              - log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
      -U              - log all external URLs and e-mails seen
      -Q              - completely suppress duplicate nodes in reports
      -u              - be quiet, disable realtime progress stats
      -v              - enable runtime logging (to stderr)

    Dictionary management options:

      -W wordlist     - use a specified read-write wordlist (required)
      -S wordlist     - load a supplemental read-only wordlist
      -L              - do not auto-learn new keywords for the site
      -Y              - do not fuzz extensions in directory brute-force
      -R age          - purge words hit more than age scans ago
      -T name=val     - add new form auto-fill rule
      -G max_guess    - maximum number of keyword guesses to keep (256)

      -z sigfile      - load signatures from this file

    Performance settings:

      -g max_conn     - max simultaneous TCP connections, global (40)
      -m host_conn    - max simultaneous connections, per target IP (10)
      -f max_fail     - max number of consecutive HTTP errors (100)
      -t req_tmout    - total request response timeout (20 s)
      -w rw_tmout     - individual network I/O timeout (10 s)
      -i idle_tmout   - timeout on idle HTTP connections (10 s)
      -s s_limit      - response size limit (400000 B)
      -e              - do not keep binary responses for reporting

    Other settings:

      -l max_req      - max requests per second (0.000000)
      -k duration     - stop scanning after the given duration h:m:s
      --config file   - load the specified configuration file

    Send comments and complaints to <[email protected]>.
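The form-login flags listed above (`--auth-form`, `--auth-user`, `--auth-pass`, `--auth-verify-url`) are typically combined into one command. A minimal sketch with hypothetical URLs and credentials; the command is assembled and echoed so the snippet is safe to run as-is, and you drop the `echo` to launch the real scan:

```shell
#!/bin/sh
# Hypothetical lab target: a login form plus a page only visible in-session,
# which skipfish polls (--auth-verify-url) to detect when the session dies.
LOGIN_URL="http://192.168.56.101/login.php"
VERIFY_URL="http://192.168.56.101/dashboard.php"

# -W needs a writable wordlist; copy a shipped dictionary if present,
# otherwise start from an empty one.
cp /usr/share/skipfish/dictionaries/medium.wl dict.wl 2>/dev/null || touch dict.wl

CMD="skipfish -o auth-scan --auth-form $LOGIN_URL --auth-user admin --auth-pass secret --auth-verify-url $VERIFY_URL -W dict.wl http://192.168.56.101/"

echo "$CMD"   # remove the echo and run the command directly to start scanning
```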

 

1x02 Common commands

skipfish -o test [url]                           # test is the output directory for the report
skipfish -o test @url.txt                        # read the list of target URLs/IPs from a file
skipfish -o test -S complete.wl -W abc.wl [url]  # -S: supplemental read-only wordlist; -W: read-write wordlist (required)
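A concrete first run might look like the sketch below. The target URL is hypothetical; `-o` should point at a directory that does not yet exist, and `-W` wants a writable wordlist, so a shipped read-only dictionary is copied first:

```shell
#!/bin/sh
TARGET="http://192.168.56.101/"            # hypothetical lab target -- only scan hosts you are authorized to test
OUTDIR="sf-report-$(date +%Y%m%d-%H%M%S)"  # a fresh, not-yet-existing output directory for -o

# Copy a dictionary shipped with skipfish into a writable file for -W
# (falls back to an empty wordlist if the dictionaries live elsewhere).
cp /usr/share/skipfish/dictionaries/minimal.wl my.wl 2>/dev/null || touch my.wl

echo skipfish -o "$OUTDIR" -W my.wl "$TARGET"   # drop the echo to launch the scan
```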

-I        only crawl URLs containing 'string'
-X        exclude URLs containing 'string'
-K        do not fuzz the named parameter
-D        crawl cross-site links into another domain
-l        max requests per second
-m        max simultaneous connections per target IP
--config  load the specified configuration file
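The scope and throttling flags above combine naturally into one command. A hedged sketch with hypothetical paths: restrict the crawl to `/app/`, exclude logout links so the session cookie survives, and rate-limit the scanner:

```shell
#!/bin/sh
touch my.wl   # writable wordlist required by -W

# -I: only follow URLs containing /app/      (hypothetical application path)
# -X: skip anything matching "logout" so the crawler does not end its own session
# -l/-m: cap at 50 requests per second and 5 connections per target IP
CMD="skipfish -o scoped-scan -I /app/ -X logout -l 50 -m 5 -W my.wl http://192.168.56.101/app/"

echo "$CMD"   # drop the echo and run the command to start scanning
```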
