Using a Crawler to Get the Rainfall for Each City in Sichuan Province

Posted by 及时行樂_


First, make sure you can follow this earlier post: Using a crawler to get provincial rainfall data and generate a JSON file.
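As a quick refresher on the extraction technique from that post: each city page on www.weather.com.cn embeds its hourly observations in the page source, and the hourly precipitation values appear in fields named "od26". The snippet below is a minimal sketch of that extraction run against a made-up fragment (the fragment is illustrative only, not copied from the real page).

import re

# Illustrative fragment only; the real page embeds a much larger data block,
# but the precipitation values appear in "od26" fields like these.
sample_html = '"od26":"0.0","od26":"0.3","od26":"0.0"'

# The same pattern used in getCityInfo() below: capture every od26 value as a string.
hourly_precip = re.findall('"od26":"(.*?)"', sample_html)
print(hourly_precip)  # ['0.0', '0.3', '0.0']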

With that post understood, let's look at how to get the rainfall for each city in Sichuan. As before, we start with the code.

import requests
import json
import re

headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (Khtml, like Gecko) Chrome/86.0.4240.183 Safari/537.36',
}
def getCityInfo(url: str) -> list:
    # Fetch the city's weather page and pull out the hourly precipitation
    # values, which appear in the embedded data under the key "od26".
    # The values are returned as strings.
    response = requests.get(url, headers=headers)
    response.encoding = response.apparent_encoding
    _precip = re.findall('"od26":"(.*?)"', response.text)
    _precip.reverse()  # reverse the extracted values (the order does not affect the sum later)
    return _precip

def getcityname(cityid):
    # Map the loop index to the prefecture name (same order as the station codes).
    names = ['成都', '攀枝花', '自贡', '绵阳', '南充', '达州', '遂宁',
             '广安', '巴中', '泸州', '宜宾', '内江', '资阳', '乐山',
             '眉山', '凉山', '雅安', '甘孜', '阿坝', '德阳', '广元']
    if 0 <= cityid < len(names):
        return names[cityid]
    return '未知'
if __name__ == "__main__":
    city_dict = []
    for i in range(21):
        # Chengdu's station code is 101270101; each subsequent prefecture's
        # code is 100 higher, so digits 6-7 of the code run from 01 to 21.
        city_sign = 101270101 + i * 100
        url = 'http://www.weather.com.cn/html/weather/%s.shtml' % city_sign
        list_precip = getCityInfo(url)
        # Convert the string values to floats (skip any empty matches) and sum them
        splitstring = [float(s) for s in list_precip if s]
        total = sum(splitstring)
        temp = {}
        # temp['cityid'] = city_sign
        temp['name'] = getcityname(i)
        temp['count'] = total
        city_dict.append(temp)
    print(city_dict)
    # # Convert the list of dicts to a JSON string
    # str_json = json.dumps(city_dict)
    # print(type(str_json))
    # Dump the result to a local JSON file
    with open('./sichuan_precip.json', 'w', encoding='utf-8') as f:
        # ensure_ascii=False keeps the Chinese names readable; indent=2 pretty-prints
        f.write(json.dumps(city_dict, ensure_ascii=False, indent=2))
    # print(precip)

Everything else is much the same as the provincial version; the key difference is the loop. Administratively, Sichuan has 21 prefecture-level divisions, so we only need to loop 21 times. Every city's station code ends the same way; only the 6th and 7th digits differ, running from 01 up to 21, so adding 100 on each iteration steps through all the cities.
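To make the code arithmetic concrete, here is a tiny sketch (the same calculation as the loop above) that just prints the 21 generated station codes and their URLs:

# Enumerate the 21 station codes, starting from Chengdu's 101270101;
# each step adds 100, so digits 6-7 of the code run from 01 to 21.
for i in range(21):
    city_sign = 101270101 + i * 100
    print(city_sign, 'http://www.weather.com.cn/html/weather/%s.shtml' % city_sign)
# 101270101 http://www.weather.com.cn/html/weather/101270101.shtml
# 101270201 http://www.weather.com.cn/html/weather/101270201.shtml
# ...
# 101272101 http://www.weather.com.cn/html/weather/101272101.shtml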

As in the provincial post, the rainfall recorded at each prefecture's main city station is used to represent the rainfall of the whole prefecture.

As before, let's take a look at the resulting JSON file.

[
  {
    "name": "成都",
    "count": 0.0
  },
  {
    "name": "攀枝花",
    "count": 0.0
  },
  {
    "name": "自贡",
    "count": 0.0
  },
  {
    "name": "绵阳",
    "count": 0.0
  },
  {
    "name": "南充",
    "count": 0.0
  },
  {
    "name": "达州",
    "count": 0.0
  },
  {
    "name": "遂宁",
    "count": 0.0
  },
  {
    "name": "广安",
    "count": 0.0
  },
  {
    "name": "巴中",
    "count": 0.1
  },
  {
    "name": "泸州",
    "count": 0.0
  },
  {
    "name": "宜宾",
    "count": 0.0
  },
  {
    "name": "内江",
    "count": 0.0
  },
  {
    "name": "资阳",
    "count": 0.0
  },
  {
    "name": "乐山",
    "count": 0.0
  },
  {
    "name": "眉山",
    "count": 0.0
  },
  {
    "name": "凉山",
    "count": 0.0
  },
  {
    "name": "雅安",
    "count": 0.0
  },
  {
    "name": "甘孜",
    "count": 0.0
  },
  {
    "name": "阿坝",
    "count": 0.0
  },
  {
    "name": "德阳",
    "count": 0.0
  },
  {
    "name": "广元",
    "count": 0.0
  }
]
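As a quick sanity check on the output, the file can be read back and the prefectures sorted by accumulated precipitation. This is only a small sketch, assuming the file was written as ./sichuan_precip.json by the script above (UTF-8, since it was dumped with ensure_ascii=False):

import json

# Load the file produced by the crawler above
with open('./sichuan_precip.json', 'r', encoding='utf-8') as f:
    data = json.load(f)

# Print the prefectures from wettest to driest
for item in sorted(data, key=lambda d: d['count'], reverse=True):
    print(item['name'], item['count'])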

That covers the main points of using a crawler to get the rainfall for each city in Sichuan province. If it doesn't solve your problem, see the related post linked at the top: Using a crawler to get provincial rainfall data and generate a JSON file.
