Scrapyd configuration file

Posted by walkonmars

Configuration file

Scrapyd searches for configuration files in the following locations and parses them in order, with later files taking priority over earlier ones:

  • /etc/scrapyd/scrapyd.conf (Unix)
  • c:\scrapyd\scrapyd.conf (Windows)
  • /etc/scrapyd/conf.d/* (in alphabetical order, Unix)
  • scrapyd.conf
  • ~/.scrapyd.conf (user's home directory)
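This precedence can be illustrated with Python's standard configparser module: when several files are passed to read(), values from files read later override values from earlier ones. A minimal sketch using two throwaway files, not Scrapyd's actual loading code:

```python
import configparser
import os
import tempfile

# Two config fragments: the second one (read later) wins for http_port.
system_conf = "[scrapyd]\nhttp_port = 6800\nbind_address = 127.0.0.1\n"
local_conf = "[scrapyd]\nhttp_port = 6801\n"

tmpdir = tempfile.mkdtemp()
paths = []
for name, text in [("system.conf", system_conf), ("local.conf", local_conf)]:
    path = os.path.join(tmpdir, name)
    with open(path, "w") as f:
        f.write(text)
    paths.append(path)

config = configparser.ConfigParser()
config.read(paths)  # later files take priority, as in Scrapyd's search order

print(config.get("scrapyd", "http_port"))     # 6801 (overridden by local.conf)
print(config.get("scrapyd", "bind_address"))  # 127.0.0.1 (system.conf survives)
```

Options that only appear in an earlier file carry through unchanged; only keys that are repeated in a later file are overridden.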


Example configuration file

[scrapyd]
eggs_dir    = eggs
logs_dir    = logs
items_dir   =
jobs_to_keep = 5
dbs_dir     = dbs
max_proc    = 0
max_proc_per_cpu = 4
finished_to_keep = 100
poll_interval = 5.0
bind_address = 127.0.0.1
http_port   = 6800
debug       = off
runner      = scrapyd.runner
application = scrapyd.app.application
launcher    = scrapyd.launcher.Launcher
webroot     = scrapyd.website.Root

[services]
schedule.json     = scrapyd.webservice.Schedule
cancel.json       = scrapyd.webservice.Cancel
addversion.json   = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json  = scrapyd.webservice.ListSpiders
delproject.json   = scrapyd.webservice.DeleteProject
delversion.json   = scrapyd.webservice.DeleteVersion
listjobs.json     = scrapyd.webservice.ListJobs
daemonstatus.json = scrapyd.webservice.DaemonStatus
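With the daemon running, each entry in [services] is exposed as an HTTP endpoint under bind_address:http_port (127.0.0.1:6800 in the example above). A minimal sketch of building a schedule.json request; the project and spider names (myproject, somespider) are placeholders, not values from this document:

```python
from urllib.parse import urlencode


def schedule_request(host, project, spider, **settings):
    """Build the POST target URL and form body for Scrapyd's schedule.json."""
    url = f"http://{host}/schedule.json"
    body = urlencode({"project": project, "spider": spider, **settings})
    return url, body


url, body = schedule_request("localhost:6800", "myproject", "somespider")
print(url)   # http://localhost:6800/schedule.json
print(body)  # project=myproject&spider=somespider
```

The same request can be sent directly from the command line, e.g. `curl http://localhost:6800/schedule.json -d project=myproject -d spider=somespider`.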
