Scraping data from a table and storing it in a CSV file
I want to scrape the data from this website and store it in a CSV file in this layout.
But when I try to scrape the data, it is not stored in the expected format: all of the data ends up in column 1. I don't know how to handle this.
Link: https://pce.ac.in/students/bachelors-students/
Code:
import csv  # file operations
from bs4 import BeautifulSoup as soup  # lib for pulling data from html/xml sites
from urllib.request import urlopen as uReq  # lib for sending and receiving info over HTTP

Url = 'https://pce.ac.in/students/bachelors-students/'
pageHtml = uReq(Url)
soup = soup(pageHtml, "html.parser")  # parse the html

table = soup.find_all("table", {"class": "tablepress tablepress-id-10 tablepress-responsive-phone"})

f = csv.writer(open('BEPillaiDepart.csv', 'w'))
f.writerow(['Choice Code', 'Course Name', 'Year of Establishment', 'Sanctioned Strength'])  # headers

for x in table:
    data = ""
    table_body = x.find('tbody')  # find tbody tag
    rows = table_body.find_all('tr')  # find all tr tags
    for tr in rows:
        cols = tr.find_all('td')  # find all td tags
        for td in cols:
            data = data + "\n" + td.text.strip()
        f.writerow([data])
        # print(data)
Answer
Create the data variable inside each tr tag; you can try it like this:
import csv  # file operations
from bs4 import BeautifulSoup as soup  # lib for pulling data from html/xml sites
from urllib.request import urlopen as uReq  # lib for sending and receiving info over HTTP

Url = 'https://pce.ac.in/students/bachelors-students/'
pageHtml = uReq(Url)
soup = soup(pageHtml, "html.parser")  # parse the html

table = soup.find_all("table", {"class": "tablepress tablepress-id-10 tablepress-responsive-phone"})

with open('BEPillaiDepart.csv', 'w', newline='') as csvfile:
    f = csv.writer(csvfile)
    f.writerow(['Choice Code', 'Course Name', 'Year of Establishment', 'Sanctioned Strength'])  # headers
    for x in table:
        table_body = x.find('tbody')  # find tbody tag
        rows = table_body.find_all('tr')  # find all tr tags
        for tr in rows:
            data = []
            cols = tr.find_all('td')  # find all td tags
            for td in cols:
                data.append(td.text.strip())
            f.writerow(data)
            print(data)
Another answer
If you look up what CSV means, you will find it stands for comma-separated values, yet the text you append to the file contains no commas at all: you build one long string and pass it to writerow as a single item, so everything lands in one column.
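A minimal standalone sketch of that point (the row values below are hypothetical, not taken from the actual site): csv.writer only inserts commas between the items of the list passed to writerow, so a one-element list becomes a single column, while a list of cell texts becomes one column per cell.

import csv

# Hypothetical cell values for one table row (for illustration only).
row_cells = ['PILLAI', 'Computer Engineering', '1999', '120']

with open('demo.csv', 'w', newline='') as csvfile:
    writer = csv.writer(csvfile)

    # One-element list: the whole accumulated string becomes a single column.
    writer.writerow(['\n'.join(row_cells)])

    # List of cell values: each cell becomes its own comma-separated column.
    writer.writerow(row_cells)

# demo.csv then contains one quoted multi-line cell for the first row,
# and 'PILLAI,Computer Engineering,1999,120' for the second.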