class Spider(object_ref): class documentation

Base class for Scrapy spiders. All spiders must inherit from this class.
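
For orientation, a minimal subclass sketch; the spider name, start URL and CSS selector below are placeholders chosen for illustration:

    import scrapy

    class QuotesSpider(scrapy.Spider):
        name = "quotes"                               # unique spider name
        start_urls = ["https://quotes.toscrape.com"]  # crawling entry points

        def parse(self, response, **kwargs):
            # Default callback for the responses to the start requests.
            yield {"page_title": response.css("title::text").get()}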

Class Method from_crawler Create a spider instance bound to the given crawler; this is how Scrapy instantiates spiders.
Class Method handles_request Return whether the given request's URL is considered to belong to this spider.
Class Method update_settings Merge the spider's custom_settings into the given settings object.
Static Method close Invoke the spider's closed() hook, if defined, when the spider is closed.
Method __init__ Set the spider's name and apply keyword arguments as instance attributes.
Method __repr__ Return a debugging representation including the spider class and name.
Method log Log the given message at the given log level.
Method parse Default callback for downloaded responses; must be overridden by subclasses.
Method start_requests Return the spider's first requests, built from start_urls by default.
Class Variable custom_settings Per-spider settings that override the project-wide settings.
Instance Variable crawler The Crawler this spider is bound to.
Instance Variable name The unique name identifying this spider.
Instance Variable settings The Settings instance the spider is run with.
Instance Variable start_urls URLs the spider starts crawling from when no specific requests are given.
Property logger Python logger created with the spider's name.
Method _parse Internal helper that delegates to parse().
Method _set_crawler Bind the spider to a crawler and connect the close() hook to the spider_closed signal.

Inherited from object_ref:

Method __new__ Undocumented
Class Variable __slots__ Undocumented
@classmethod
def from_crawler(cls, crawler, *args, **kwargs):

Create a spider instance from the given crawler and bind it to that crawler (setting its crawler and settings attributes). This is how Scrapy instantiates spiders; override it to access the crawler at construction time.
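
A common override pattern, shown here as a sketch (the handler name on_spider_closed is arbitrary): let the base implementation create and bind the spider, then use the crawler object, for example its signal bus:

    import scrapy
    from scrapy import signals

    class SignalAwareSpider(scrapy.Spider):
        name = "signal_aware"  # placeholder

        @classmethod
        def from_crawler(cls, crawler, *args, **kwargs):
            # The base class builds the spider and sets .crawler/.settings.
            spider = super().from_crawler(crawler, *args, **kwargs)
            crawler.signals.connect(spider.on_spider_closed, signal=signals.spider_closed)
            return spider

        def on_spider_closed(self, spider):
            self.logger.info("spider closed: %s", spider.name)

        def parse(self, response, **kwargs):
            pass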

@classmethod
def handles_request(cls, request):

Return whether the given request's URL is considered to belong to this spider.
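
A usage sketch, assuming the default implementation matches the request URL against the spider's name and allowed_domains (the domain below is a placeholder):

    from scrapy import Request, Spider

    class ShopSpider(Spider):
        name = "shop"                       # placeholder
        allowed_domains = ["shop.example"]  # placeholder

    ShopSpider.handles_request(Request("https://shop.example/item/1"))  # expected: True
    ShopSpider.handles_request(Request("https://other.example/"))       # expected: False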

@classmethod
def update_settings(cls, settings):

Merge the spider's custom_settings into the given settings object with spider-level priority.
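
A sketch of the merge behaviour; the setting name and values are placeholders, and the Settings object stands in for the project settings:

    import scrapy
    from scrapy.settings import Settings

    class ThrottledSpider(scrapy.Spider):
        name = "throttled"                        # placeholder
        custom_settings = {"DOWNLOAD_DELAY": 2.0}

    settings = Settings({"DOWNLOAD_DELAY": 0.0})  # stand-in for project settings
    ThrottledSpider.update_settings(settings)     # folds custom_settings in
    assert settings.getfloat("DOWNLOAD_DELAY") == 2.0  # spider-level value wins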

@staticmethod
def close(spider, reason):

Called when the spider is closed; invokes the spider's closed(reason) method, if one is defined.
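
A sketch of the corresponding spider-side hook (the spider name is a placeholder):

    import scrapy

    class CleanupSpider(scrapy.Spider):
        name = "cleanup"  # placeholder

        def closed(self, reason):
            # reason is a short string such as "finished", "cancelled" or "shutdown".
            self.logger.info("spider closed: %s", reason)

        def parse(self, response, **kwargs):
            pass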

def __init__(self, name=None, **kwargs):

Set the spider's name and apply any remaining keyword arguments as instance attributes.
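
Spider arguments arrive here as keyword arguments, for example from scrapy crawl products -a category=shoes; a sketch with placeholder spider and argument names:

    import scrapy

    class ProductSpider(scrapy.Spider):
        name = "products"  # placeholder

        def __init__(self, name=None, category=None, **kwargs):
            super().__init__(name, **kwargs)
            # Keyword arguments not captured explicitly are applied as
            # instance attributes by the base initializer.
            self.category = category

        def parse(self, response, **kwargs):
            pass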

def __repr__(self):

Return a debugging representation that includes the spider class and its name.

def log(self, message, level=logging.DEBUG, **kw):

Log the given message at the given log level. This helper wraps a log call to the spider's logger, but you can use the logger directly (e.g. Spider.logger.info('msg')) or any other Python logger.
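
A sketch showing both spellings emitting through the same spider-named logger (the spider name is a placeholder):

    import logging

    import scrapy

    class LoggingSpider(scrapy.Spider):
        name = "logging_example"  # placeholder

        def parse(self, response, **kwargs):
            self.logger.info("parsed %s", response.url)
            self.log("parsed %s" % response.url, level=logging.INFO)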

def parse(self, response, **kwargs):

Default callback used to process downloaded responses when no other callback is specified. Subclasses must override it; the base implementation raises NotImplementedError.
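
A typical override yields scraped items and/or follow-up requests; the URL and CSS selectors below are illustrative only:

    import scrapy

    class CatalogSpider(scrapy.Spider):
        name = "catalog"                              # placeholder
        start_urls = ["https://books.toscrape.com/"]  # placeholder

        def parse(self, response, **kwargs):
            for title in response.css("h3 a::attr(title)").getall():
                yield {"title": title}  # scraped item
            next_page = response.css("li.next a::attr(href)").get()
            if next_page is not None:
                # Unspecified callbacks default back to parse().
                yield response.follow(next_page, callback=self.parse)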

custom_settings: Optional[dict]

A dict of settings that override the project-wide settings when running this spider.

settings

The Settings instance this spider is run with, populated when the spider is bound to a crawler.
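
A sketch of reading a setting at runtime; USER_AGENT is just an example key, and self.settings is only available once the spider has been bound to a crawler:

    import scrapy

    class ConfiguredSpider(scrapy.Spider):
        name = "configured"  # placeholder

        def parse(self, response, **kwargs):
            agent = self.settings.get("USER_AGENT")
            self.logger.debug("running with USER_AGENT=%r", agent)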

start_urls: list

URLs the spider starts crawling from when no particular requests are specified; the default start_requests() builds a Request for each of them.
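
By default start_requests() turns each of these URLs into a Request; when that is not enough, override start_requests() instead, as in this sketch (URL and form fields are placeholders):

    import scrapy

    class LoginSpider(scrapy.Spider):
        name = "login"  # placeholder

        def start_requests(self):
            # Build the initial requests by hand instead of using start_urls.
            yield scrapy.FormRequest(
                "https://example.com/login",
                formdata={"user": "demo", "pass": "demo"},
                callback=self.parse,
            )

        def parse(self, response, **kwargs):
            pass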

def _set_crawler(self, crawler: Crawler):

Bind the spider to the given crawler: store the crawler and its settings on the spider and connect the close() hook to the spider_closed signal.