green-spider/config/__init__.py
Marian Steinbach c59db691a0
Repair and cleanup of the job execution (#340)
* Update jq URL

* Improve docker compose setup

* Script makeover: only one spider job, debian 11, add git clone

* Update image name

* Add some docs

* Pin click to v7 due to problems with rq

* Newline

* Improve manager code

* Add make target venv

* Remove obsolete 'spider' command from cli

* Remove git clone from manager code

* Remove worker functions from spider code

* Let 'make jobs' execute git clone and use docker compose

* Add 'spider' make target

* Update .dockerignore

* Add dryrun target to spider a URL without storing results

* Remove unused config entry
2024-03-04 17:18:37 +01:00


# connection timeout for website checks (seconds)
CONNECT_TIMEOUT = 5
# response timeout for website checks (seconds)
READ_TIMEOUT = 10
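
A minimal sketch of how these two values are typically consumed: the `requests` library accepts a `(connect, read)` tuple for its `timeout` parameter. The `REQUEST_TIMEOUT` name is an assumption for illustration, not necessarily what the spider code uses.

```python
# Hedged sketch: combining the two config values into the
# (connect, read) timeout tuple understood by requests.
CONNECT_TIMEOUT = 5   # seconds to wait for the TCP connection
READ_TIMEOUT = 10     # seconds to wait between response bytes

REQUEST_TIMEOUT = (CONNECT_TIMEOUT, READ_TIMEOUT)

# A call such as requests.get(url, timeout=REQUEST_TIMEOUT) would then
# abort slow connections after 5 s and stalled reads after 10 s.
```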
# folder within the green-directory repo that holds the data
GREEN_DIRECTORY_DATA_PATH = 'data/countries/de'
# local folder into which the repo is cloned
GREEN_DIRECTORY_LOCAL_PATH = './cache/green-directory'
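
These two paths are presumably joined to locate the dataset inside the local clone; a small sketch under that assumption (the `data_dir` variable is illustrative, not from the spider code):

```python
import os

GREEN_DIRECTORY_DATA_PATH = 'data/countries/de'
GREEN_DIRECTORY_LOCAL_PATH = './cache/green-directory'

# Full local path to the German dataset inside the cloned repo.
data_dir = os.path.join(GREEN_DIRECTORY_LOCAL_PATH,
                        GREEN_DIRECTORY_DATA_PATH)
```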
# IP address of the verdigado GCMS server
GCMS_IP = "194.29.234.123"
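
A plausible use of this constant is checking whether a candidate site resolves to the verdigado GCMS server. The helper below is a hypothetical sketch; the spider's actual check may differ (e.g. it may compare multiple resolved addresses).

```python
import socket

GCMS_IP = "194.29.234.123"

def hosted_on_gcms(hostname: str) -> bool:
    """Hypothetical helper: True if hostname resolves to the GCMS IP."""
    try:
        return socket.gethostbyname(hostname) == GCMS_IP
    except socket.gaierror:
        # Unresolvable hosts are treated as not hosted on the GCMS.
        return False
```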
# Datastore kind name for spider job entities
JOB_DATASTORE_KIND = 'spider-jobs'
# folder where generated Kubernetes job manifests are written
K8S_JOBS_PATH = './k8s-jobs'
# template used to render individual job manifests
K8S_JOB_TEMPLATE = './manager/job_template.yaml'
# number of spider jobs bundled into one batch
K8S_JOB_BATCH_SIZE = 10
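
The batch size suggests spider jobs are grouped before manifests are rendered from the template. A generic chunking sketch under that assumption (the `batched` helper is illustrative, not part of the spider code):

```python
K8S_JOB_BATCH_SIZE = 10

def batched(items, size=K8S_JOB_BATCH_SIZE):
    """Hypothetical helper: yield successive chunks of at most `size`
    items, as jobs might be grouped into Kubernetes job manifests."""
    for i in range(0, len(items), size):
        yield items[i:i + size]
```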