Environment: Nutch 0.9 + Fedora 5 + Tomcat 6 + JDK 6
Tomcat and the JDK are already installed.
2: nutch-0.9.tar.gz
Extract the downloaded tar.gz package into /opt and rename the resulting directory (the later steps assume /opt/nutch):
#tar zxf nutch-0.9.tar.gz -C /opt
#mv /opt/nutch-0.9 /opt/nutch
To check that the environment is set up correctly, run /opt/nutch/bin/nutch and see whether it prints its command usage; if it does, everything is fine.
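If bin/nutch complains that it cannot find Java instead of printing its usage, the JDK location usually has to be exported first. A minimal sketch, assuming the JDK is installed under /usr/java/jdk1.6.0 (adjust to your own path):
#export JAVA_HOME=/usr/java/jdk1.6.0
#export PATH=$JAVA_HOME/bin:$PATH
#/opt/nutch/bin/nutch
The last command should now print the list of available nutch sub-commands.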
The crawl procedure:
#cd /opt/nutch
#mkdir urls
#vi urls/nutch.txt and enter the seed URL http://www.aicent.net/
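The same seed file can also be created in one line, without opening an editor (file name and URL exactly as above):
#echo "http://www.aicent.net/" > urls/nutch.txt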
#vi conf/crawl-urlfilter.txt and add the lines below; the regular expression restricts the crawl to the target site's URLs.
# accept hosts in MY.DOMAIN.NAME
+^http://([a-z0-9]*\.)*aicent.net/
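Before crawling, the regular expression can be sanity-checked against a couple of URLs with grep (a quick test, assuming GNU grep as shipped with Fedora):
#echo "http://www.aicent.net/index.html" | grep -E '^http://([a-z0-9]*\.)*aicent.net/'
#echo "http://www.example.com/" | grep -E '^http://([a-z0-9]*\.)*aicent.net/'
The first command should echo the URL back (aicent.net and its subdomains are accepted); the second should print nothing.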
#vi conf/nutch-site.xml (this gives your crawler a name); set it as follows:
<configuration>
  <property>
    <name>http.agent.name</name>
    <value>test/unique</value>
  </property>
</configuration>
Start the crawl: #bin/nutch crawl urls -dir crawl -depth 5 -threads 10 >& crawl.log
Wait a while; the time needed depends on the size of the site and the crawl depth you set.
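Once the command returns, a quick look at the output tells you whether anything was actually fetched; a small check using the paths from the steps above (readdb -stats is part of the standard nutch command set):
#tail crawl.log
#ls crawl
#bin/nutch readdb crawl/crawldb -stats
After a successful crawl the crawl directory normally contains crawldb, linkdb, segments, indexes and index, and the -stats output reports how many URLs were fetched.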
3: apache-tomcat
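The search front end comes from the nutch-0.9.war shipped inside the Nutch distribution; if it has not been deployed yet, copying the war into Tomcat's webapps directory is the usual step. A sketch, assuming the paths used elsewhere in this article (the app may be deployed as ROOT or under its own context name such as nutch-0.9):
#cp /opt/nutch/nutch-0.9.war /usr/local/tomcat/webapps/
Tomcat expands the war automatically the next time it starts.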
At this point, if every search returns 0 pages, a setting needs to be changed: the search directory configured for Nutch inside Tomcat is pointing at the wrong place.
#vi /usr/local/tomcat/webapps/ROOT/WEB-INF/classes/nutch-site.xml
<property>
  <name>searcher.dir</name>
  <value>/opt/nutch/crawl</value>  <!-- the directory holding the crawl output -->
  <description>My path to nutch's searcher dir.</description>
</property>
#/opt/tomcat/bin/startup.sh
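Once Tomcat is back up, one quick request confirms the application answers before trying a real search (8080 is Tomcat's default port; the context path depends on how the war was deployed):
#curl -I http://127.0.0.1:8080/nutch-0.9/
An HTTP 200 response means the search page is being served.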
OK, that's it.
Troubleshooting notes:
Run: sh ./bin/nutch crawl urls -dir crawl -depth 3 -threads 60 -topN 100 >& ./logs/nutch_log.log
1. Exception in thread "main" java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
at org.apache.nutch.crawl.Injector.inject(Injector.java:162)
at org.apache.nutch.crawl.Crawl.main(Crawl.java:115)
Posts online said this was a JDK version problem and that JDK 1.6 could not be used, so I installed 1.5. The same error still occurred, which was odd.
So I kept googling and found the following possible cause:
Injector: Converting injected urls to crawl db entries.
Exception in thread "main" java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
at org.apache.nutch.crawl.Injector.inject(Injector.java:162)
at org.apache.nutch.crawl.Crawl.main(Crawl.java:115)
Explanation: this is usually a configuration problem in crawl-urlfilter.txt; for example, the filter rule should be
+^http://www.ihooyo.com, but if it is written as just http://www.ihooyo.com, the error above is raised.
But my own configuration had no such problem.
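A quick grep makes it easy to double-check, since lines starting with + are the accept rules (path as used in this install):
#grep '^+' /opt/nutch/conf/crawl-urlfilter.txt
The +^http://([a-z0-9]*\.)*aicent.net/ rule added earlier should show up in the output.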
In the logs directory, besides nutch_log.log, another log file is generated automatically: hadoop.log.
It showed the following error:
2009-07-22 22:20:55,501 INFO crawl.Crawl - crawl started in: crawl
2009-07-22 22:20:55,501 INFO crawl.Crawl - rootUrlDir = urls
2009-07-22 22:20:55,502 INFO crawl.Crawl - threads = 60
2009-07-22 22:20:55,502 INFO crawl.Crawl - depth = 3
2009-07-22 22:20:55,502 INFO crawl.Crawl - topN = 100
2009-07-22 22:20:55,603 INFO crawl.Injector - Injector: starting
2009-07-22 22:20:55,604 INFO crawl.Injector - Injector: crawlDb: crawl/crawldb
2009-07-22 22:20:55,604 INFO crawl.Injector - Injector: urlDir: urls
2009-07-22 22:20:55,605 INFO crawl.Injector - Injector: Converting injected urls to crawl db entries.
2009-07-22 22:20:56,574 INFO plugin.PluginRepository - Plugins: looking in: /opt/nutch/plugins
2009-07-22 22:20:56,720 INFO plugin.PluginRepository - Plugin Auto-activation mode: [true]
2009-07-22 22:20:56,720 INFO plugin.PluginRepository - Registered Plugins:
2009-07-22 22:20:56,720 INFO plugin.PluginRepository - the nutch core extension points (nutch-extensionpoints)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Basic Query Filter (query-basic)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Basic URL Normalizer (urlnormalizer-basic)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Basic Indexing Filter (index-basic)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Html Parse Plug-in (parse-html)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Basic Summarizer Plug-in (summary-basic)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Site Query Filter (query-site)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - HTTP Framework (lib-http)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Text Parse Plug-in (parse-text)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Regex URL Filter (urlfilter-regex)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Pass-through URL Normalizer (urlnormalizer-pass)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Http Protocol Plug-in (protocol-http)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Regex URL Normalizer (urlnormalizer-regex)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - OPIC Scoring Plug-in (scoring-opic)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - CyberNeko HTML Parser (lib-nekohtml)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - JavaScript Parser (parse-js)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - URL Query Filter (query-url)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Regex URL Filter Framework (lib-regex-filter)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Registered Extension-Points:
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Nutch Summarizer (org.apache.nutch.searcher.Summarizer)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Nutch URL Normalizer (org.apache.nutch.net.URLNormalizer)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Nutch Protocol (org.apache.nutch.protocol.Protocol)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Nutch Analysis (org.apache.nutch.analysis.NutchAnalyzer)
2009-07-22 22:20:56,721 INFO plugin.PluginRepository - Nutch URL Filter (org.apache.nutch.net.URLFilter)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - Nutch Indexing Filter (org.apache.nutch.indexer.IndexingFilter)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - Nutch Online Search Results Clustering Plugin (org.apache.nutch.clustering.OnlineClusterer)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - HTML Parse Filter (org.apache.nutch.parse.HtmlParseFilter)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - Nutch Content Parser (org.apache.nutch.parse.Parser)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - Nutch Scoring (org.apache.nutch.scoring.ScoringFilter)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - Nutch Query Filter (org.apache.nutch.searcher.QueryFilter)
2009-07-22 22:20:56,722 INFO plugin.PluginRepository - Ontology Model Loader (org.apache.nutch.ontology.Ontology)
2009-07-22 22:20:56,786 WARN regex.RegexURLNormalizer - can't find rules for scope 'inject', using default
2009-07-22 22:20:56,829 WARN mapred.LocalJobRunner - job_2319eh
java.lang.RuntimeException: java.net.UnknownHostException: jackliu: jackliu
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:617)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:591)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:364)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:390)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.startPartition(MapTask.java:294)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpillToDisk(MapTask.java:355)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$100(MapTask.java:231)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:180)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:126)
Caused by: java.net.UnknownHostException: jackliu: jackliu
at java.net.InetAddress.getLocalHost(InetAddress.java:1353)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:614)
... 8 more
In other words, the hostname configuration was wrong, so:
Add the following to your /etc/hosts file
127.0.0.1 jackliu
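After editing /etc/hosts it is worth confirming that the machine's hostname really resolves now (jackliu is the hostname from the error above; yours will differ):
#hostname
#ping -c 1 jackliu
The name printed by hostname must appear in /etc/hosts, and the ping should resolve to 127.0.0.1.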
Running the crawl again after this change, it succeeded!
2. http://127.0.0.1:8080/nutch-0.9
Entering "nutch" as a search query returned the following error:
HTTP Status 500 -
type Exception report
message
description The server encountered an internal error () that prevented it from fulfilling this request.
exception
org.apache.jasper.JasperException: /search.jsp(151,22) Attribute value language + "/include/header.html" is quoted with " which must be escaped when used within the value
org.apache.jasper.compiler.DefaultErrorHandler.jspError(DefaultErrorHandler.java:40)
org.apache.jasper.compiler.ErrorDispatcher.dispatch(ErrorDispatcher.java:407)
org.apache.jasper.compiler.ErrorDispatcher.jspError(ErrorDispatcher.java:198)
org.apache.jasper.compiler.Parser.parseQuoted(Parser.java:299)
org.apache.jasper.compiler.Parser.parseAttributeValue(Parser.java:249)
org.apache.jasper.compiler.Parser.parseAttribute(Parser.java:211)
org.apache.jasper.compiler.Parser.parseAttributes(Parser.java:154)
org.apache.jasper.compiler.Parser.parseInclude(Parser.java:867)
org.apache.jasper.compiler.Parser.parseStandardAction(Parser.java:1134)
org.apache.jasper.compiler.Parser.parseElements(Parser.java:1461)
org.apache.jasper.compiler.Parser.parse(Parser.java:137)
org.apache.jasper.compiler.ParserController.doParse(ParserController.java:255)
org.apache.jasper.compiler.ParserController.parse(ParserController.java:103)
org.apache.jasper.compiler.Compiler.generateJava(Compiler.java:170)
org.apache.jasper.compiler.Compiler.compile(Compiler.java:332)
org.apache.jasper.compiler.Compiler.compile(Compiler.java:312)
org.apache.jasper.compiler.Compiler.compile(Compiler.java:299)
org.apache.jasper.JspCompilationContext.compile(JspCompilationContext.java:586)
org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:317)
org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:342)
org.apache.jasper.servlet.JspServlet.service(JspServlet.java:267)
javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
note The full stack trace of the root cause is available in the Apache Tomcat/6.0.20 logs.
Analysis: looking at search.jsp in the root directory of the Nutch web application, this is a quote-matching problem.
<jsp:include page="<%= language + "/include/header.html"%>"/> //line 152 search.jsp
The attribute's opening quote is matched against the first quote that follows it, not against the last quote on the line, and that is what breaks the page.
Solution:
Change that line to: <jsp:include page="<%= language + urlsuffix %>"/>
Here we define a String named urlsuffix, declared right after the definition of the language String:
String language =                                  // line 116 of search.jsp
    ResourceBundle.getBundle("org.nutch.jsp.search", request.getLocale())
        .getLocale().getLanguage();
String urlsuffix = "/include/header.html";
After making the change, restart Tomcat so it takes effect; searching no longer reports the error.
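If the old error page keeps coming back after a plain restart, clearing Tomcat's work directory forces the edited JSPs to be recompiled; a sketch, assuming Tomcat lives at /opt/tomcat as in the startup command above:
#/opt/tomcat/bin/shutdown.sh
#rm -rf /opt/tomcat/work/Catalina/localhost/*
#/opt/tomcat/bin/startup.sh
The rm step is optional, but it guarantees no stale compiled copy of search.jsp is left behind.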
3. No search results?
Comparing my nutch_log.log against what is described online, the output differed, and the crawl directory contained only two folders, segments and crawldb. I ran the crawl again,
and this time everything was fine; oddly, I don't know why the first crawl failed.
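When the crawl output looks incomplete like this, the simplest recovery is to delete the output directory and re-run the same crawl command; a sketch using the paths from above:
#rm -rf /opt/nutch/crawl
#cd /opt/nutch
#sh ./bin/nutch crawl urls -dir crawl -depth 3 -threads 60 -topN 100 >& ./logs/nutch_log.log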
4. cached.jsp, explain.jsp and a few other pages have the same quoting error as in item 2 above; applying the same fix makes them work.
5. It took a whole morning and half an afternoon today, but the Nutch installation and configuration is finally done. More study tomorrow.