
#1 Sept. 2, 2013 21:30:13

admin
Registered: 2012-03-15
Posts: 289

How to avoid crash when scraping big sites

After running for a long time, the program may crash. This is almost inevitable, especially for software built on a browser core. When a scrape has been running for a few hours and has opened tens of thousands of pages, it may crash without an obvious cause.

FMiner can resume extraction without missing any pages, so you can simply restart it and continue after a crash. For big sites, you can use a separate tool to monitor the program and restart and resume it automatically whenever it crashes, as in the sketch below. See the tutorial here: http://www.fminer.com/faq/
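As one way to do this, here is a minimal watchdog sketch in Python that relaunches a command whenever it exits abnormally. The command and paths are placeholders, not FMiner's actual command line; replace them with however you launch your scraper and the project you want to resume.

```python
# watchdog.py -- relaunch a scraper whenever it crashes.
# COMMAND is a placeholder; substitute the real command used to
# start the scraper with the project it should resume.
import subprocess
import time

COMMAND = ["path/to/scraper", "path/to/project"]  # placeholder command
RESTART_DELAY = 10  # seconds to wait before relaunching

while True:
    print("Starting:", " ".join(COMMAND))
    exit_code = subprocess.call(COMMAND)
    if exit_code == 0:
        # A zero exit code means the run finished normally.
        print("Process finished normally; stopping the watchdog.")
        break
    print("Process exited with code %d; restarting in %ds." % (exit_code, RESTART_DELAY))
    time.sleep(RESTART_DELAY)
```

Dedicated process-monitoring tools can do the same job with more features (logging, e-mail alerts, scheduled restarts), which is the approach the tutorial linked above describes.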

PS: In an older version we added a feature that let the program restart itself automatically after a crash, but we removed it later, because we want to put all of our energy into the core of the program, making it easy to use and powerful enough to scrape all kinds of sites. That feature was rarely used, and external tools are better at this job anyway.

Edited by admin (Sept. 2, 2013 21:39:53)
