Fixing requests.exceptions.ReadTimeout & http.client.RemoteDisconnected: Remote end closed connection

Posted by zstar-_

A note on a small bug hit while web scraping:
requests.exceptions.ReadTimeout

The original code snippet that triggered the error:

with requests.get(url=url, headers=header, timeout=3) as html:
    html.encoding = 'utf-8'
    htmlCode = html.text
    # Parse the page
    soup = BeautifulSoup(htmlCode, 'html.parser')
    # Return the parsed page content
    return soup

Cause:
timeout is set to 3 seconds; when the server fails to respond within those 3 seconds, the exception is raised.

Fix:
Catch the error with try. Timeouts, refused connections, and the other request failures all inherit from requests.exceptions.RequestException, so a single except clause covers them.
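That inheritance relationship can be checked directly (a quick sanity check, not part of the crawler itself):

```python
import requests

# ReadTimeout, ConnectTimeout and ConnectionError are all subclasses of
# RequestException, so one except clause catches every request failure.
assert issubclass(requests.exceptions.ReadTimeout, requests.exceptions.RequestException)
assert issubclass(requests.exceptions.ConnectTimeout, requests.exceptions.RequestException)
assert issubclass(requests.exceptions.ConnectionError, requests.exceptions.RequestException)
```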
After catching the exception, retry the connection up to 3 times:

    # On timeout, retry the connection up to three times
    reconnect = 0
    while reconnect < 3:
        try:
            with requests.post(url=url, data=data, headers=header, stream=True, timeout=20) as rep:
                # The response came back as garbled Chinese; the page declares utf-8
                rep.encoding = 'utf-8'
                # Parse the page
                soup = BeautifulSoup(rep.text, 'html.parser')
                return soup
        except (requests.exceptions.RequestException, ValueError):
            reconnect += 1
    return "Failed to connect after 3 attempts"
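Instead of a manual retry loop, requests can also retry automatically through urllib3's Retry mounted on a Session adapter. A minimal sketch (the retry counts and status codes here are illustrative choices, not from the original post):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(retries=3, backoff=1):
    """Build a Session whose requests are retried automatically on failure."""
    session = requests.Session()
    retry = Retry(
        total=retries,                          # at most `retries` attempts
        backoff_factor=backoff,                 # wait 1s, 2s, 4s ... between attempts
        status_forcelist=[500, 502, 503, 504],  # also retry on these server errors
    )
    adapter = HTTPAdapter(max_retries=retry)
    session.mount("http://", adapter)
    session.mount("https://", adapter)
    return session
```

With this, `session.post(url, data=data, timeout=20)` retries transparently and only raises once all attempts are exhausted, so the calling code stays a single try/except.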

One more bug encountered along the way:
http.client.RemoteDisconnected: Remote end closed connection without response
This exception suggests the crawler was sending a single fixed request header, which the server recognized and blocked.

Fix: use multiple user agents and pick one at random for each request:

import random

import requests
from bs4 import BeautifulSoup

def getHtml(url):
    # Halfway through the crawl this error appeared:
    # http.client.RemoteDisconnected: Remote end closed connection without response
    # The fixed request header had been blocked, so rotate through several headers,
    # picking one at random each time, to keep the server from flagging the crawler
    user_agent_list = [
        "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36",
        "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36",
        "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.186 Safari/537.36",
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.62 Safari/537.36",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
        "Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.5; en-US; rv:1.9.2.15) Gecko/20110303 Firefox/3.6.15",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
        "Mozilla/4.0 (compatible; MSIE 6.0; ) Opera/UCWEB7.0.2.37/28/999",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows Phone OS 7.5; Trident/5.0; IEMobile/9.0; HTC; Titan)"
    ]
    header = {'User-Agent': random.choice(user_agent_list)}
    data = {
        'col': 1,
        'webid': '1',
        'path': 'http://www.jinan.gov.cn/',
        'columnid': '1861',  # 1861 corresponds to the "Key Focus" column
        'sourceContentType': '1',
        'unitid': '543813',
        'webname': '%E6%B5%8E%E5%8D%97%E5%B8%82%E4%BA%BA%E6%B0%91%E6%94%BF%E5%BA%9C',
        'permissiontype': 0
    }
    # On timeout, retry the connection up to three times
    reconnect = 0
    while reconnect < 3:
        try:
            with requests.post(url=url, data=data, headers=header, stream=True, timeout=20) as rep:
                # The response came back as garbled Chinese; the page declares utf-8
                rep.encoding = 'utf-8'
                # Parse the page
                soup = BeautifulSoup(rep.text, 'html.parser')
                return soup
        except (requests.exceptions.RequestException, ValueError):
            reconnect += 1
    return []
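The rotation logic above can also be factored into a small helper, so every call site gets a fresh random header without repeating the list. A sketch with a shortened placeholder list (in practice, use the fuller list from getHtml):

```python
import random

# Shortened placeholder list of user agents
USER_AGENT_LIST = [
    "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.62 Safari/537.36",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
]

def random_header():
    """Return a headers dict with a randomly chosen User-Agent."""
    return {"User-Agent": random.choice(USER_AGENT_LIST)}
```

Each call then passes `headers=random_header()` to requests, so consecutive requests no longer share one identifiable header.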
