<?xml version="1.0" encoding="utf-8" standalone="yes"?>
http://www.tkk7.com/bacoo/category/35981.html
Keep the future in mind, and create the future!
zh-cn | Wed, 07 Jan 2009 23:15:44 GMT

Learning InputFormat
http://www.tkk7.com/bacoo/archive/2009/01/07/250221.html
by so true | Wed, 07 Jan 2009 01:40:00 GMT

InputFormat exists so that a JobConf can be turned into a collection of splits (InputSplit[]), and so that this collection can then be paired with a suitable RecordReader (getRecordReader) to read the data inside each split.

InputSplit extends the Writable interface, so an InputSplit effectively exposes four methods: the read and write pair (readFields and write); getLength, which reports how much data the split covers; and getLocations, which tells you which hosts the split lives on (blkLocations[blkIndex].getHosts()). Note that a block maps to either one split or several splits, so every split can take its host information from the block it belongs to. I would also guess that the block size has to be an integer multiple of the split size, since otherwise a split could straddle two blocks.
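As a toy illustration of those four methods, here is a Python sketch of a split-like record that can serialize itself, report its length, and report its hosts. The class name, field names, and byte format are my own inventions for illustration, not Hadoop's API:

```python
import struct

class ToySplit:
    """Toy analogue of InputSplit: serializable, knows its length and hosts."""
    def __init__(self, path="", start=0, length=0, hosts=()):
        self.path, self.start, self.length = path, start, length
        self.hosts = list(hosts)

    def get_length(self):
        # like InputSplit.getLength(): size of the data in this split
        return self.length

    def get_locations(self):
        # like InputSplit.getLocations(): hosts holding this split's block
        return list(self.hosts)

    def write(self):
        # like Writable.write(): serialize to a length-prefixed byte string
        blob = "|".join([self.path, str(self.start), str(self.length)]
                        + self.hosts).encode()
        return struct.pack(">I", len(blob)) + blob

    @classmethod
    def read_fields(cls, data):
        # like Writable.readFields(): rebuild the split from bytes
        (n,) = struct.unpack(">I", data[:4])
        parts = data[4:4 + n].decode().split("|")
        return cls(parts[0], int(parts[1]), int(parts[2]), parts[3:])
```

The round trip through write/read_fields is the point: a split must survive being shipped from the JobTracker to a TaskTracker.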

As for RecordReader, the interface mainly exists to maintain a stream of <K,V> key-value pairs. Any class implementing it must have a constructor of the form "(Configuration conf, Class<? extends InputSplit> split)", because a RecordReader is targeted: it works against one particular kind of split and therefore has to be bound to it. The most important method in the interface is next: to read a K and a V, you first create the K and V objects via createKey and createValue, then pass them to next as arguments, and next modifies the data members of those parameter objects in place.
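This create-then-mutate calling convention can be mimicked in a few lines. The holder classes and the reader below are illustrative stand-ins named after Hadoop's types, not the real ones; they just show how next fills in objects the caller passes:

```python
class LongWritable:
    """Toy mutable holder for a long (byte offset), like Hadoop's LongWritable."""
    def __init__(self): self.value = 0

class Text:
    """Toy mutable holder for a string, like Hadoop's Text."""
    def __init__(self): self.value = ""

class ToyLineReader:
    """Toy line-record reader: key = byte offset of the line, value = line text."""
    def __init__(self, data):
        self.lines = data.splitlines(keepends=True)
        self.pos = 0      # byte offset of the next line
        self.index = 0    # index of the next line

    def create_key(self):   return LongWritable()
    def create_value(self): return Text()

    def next(self, key, value):
        """Fill key/value in place; return False once input is exhausted."""
        if self.index >= len(self.lines):
            return False
        line = self.lines[self.index]
        key.value = self.pos
        value.value = line.rstrip("\n")
        self.pos += len(line)
        self.index += 1
        return True
```

Usage matches the pattern in the paragraph above: create the key and value once, then call next repeatedly until it returns False.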

A file (FileStatus) is stored as multiple blocks (BlockLocation[]), each of a fixed size (file.getBlockSize()). The required size of each split is computed (computeSplitSize(goalSize, minSize, blockSize)), and the file, whose length is length (file.getLen()), is then carved into splits; the final piece that is smaller than a whole split gets a split of its own. The method finally returns the result of splitting the file (return splits.toArray(new FileSplit[splits.size()])).
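The arithmetic can be sketched directly. compute_split_size mirrors the max/min clamp used by FileInputFormat, and split_file follows the description above, giving the undersized tail its own split (the real code additionally tolerates a final chunk up to about 10% oversize via SPLIT_SLOP; this sketch keeps the simpler rule):

```python
def compute_split_size(goal_size, min_size, block_size):
    # Clamp the per-split goal between min_size and the block size,
    # in the spirit of FileInputFormat.computeSplitSize.
    return max(min_size, min(goal_size, block_size))

def split_file(length, split_size):
    """Chop a file of `length` bytes into (offset, size) splits; the final
    remainder smaller than split_size gets a split of its own."""
    splits, offset = [], 0
    while length - offset >= split_size:
        splits.append((offset, split_size))
        offset += split_size
    if offset < length:                  # leftover tail
        splits.append((offset, length - offset))
    return splits
```

For example, a 250-byte file with a 100-byte split size yields splits of 100, 100, and 50 bytes.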

A job reads its input path setting (conf.get("mapred.input.dir", "")), from which it derives a Path[]. For each Path it obtains a filesystem (FileSystem fs = p.getFileSystem(job)) and then a FileStatus[] (FileStatus[] matches = fs.globStatus(p, inputFilter)). Each FileStatus in that array is examined: if it is a directory, the job iterates over FileStatus stat : fs.listStatus(globStat.getPath(), inputFilter) and adds each stat to the final result set result; if it is a plain file, it goes straight into the result set. Put briefly, a job collects every file under input.dir, with each file recorded as a FileStatus.
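That one-level directory expansion can be sketched as follows. The function and filter names are my own, and the filesystem calls are injectable here purely so the logic can be exercised without a real HDFS; the filter skips names starting with "." or "_", in the spirit of Hadoop's default hidden-file filter:

```python
import os

def hidden_file_filter(path):
    """Skip dotfiles and '_'-prefixed entries (e.g. _logs)."""
    name = os.path.basename(path)
    return not (name.startswith(".") or name.startswith("_"))

def expand_inputs(paths, listdir=os.listdir, isdir=os.path.isdir,
                  input_filter=hidden_file_filter):
    """Expand each input path one level: directories contribute their
    direct children, plain files are taken as-is; the filter is applied
    to every candidate."""
    result = []
    for p in paths:
        if isdir(p):
            for child in sorted(listdir(p)):
                full = os.path.join(p, child)
                if input_filter(full):
                    result.append(full)
        elif input_filter(p):
            result.append(p)
    return result
```

Note the expansion is one level only: a directory inside an input directory is not recursed into here, matching the flat listing described above.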

The official description of MultiFileSplit is: "A sub-collection of input files. Unlike {@link FileSplit}, MultiFileSplit class does not represent a split of a file, but a split of input files into smaller sets. The atomic unit of split is a file." A MultiFileSplit holds several small files, each of which should belong to just one block; getLocations then returns the getHosts of the blocks backing all those small files, and getLength returns the total size of all the files.

For MultiFileInputFormat, getSplits returns a collection of MultiFileSplits, that is, a series of small-file clusters. A simple example makes this clear. Suppose the job has five small files of sizes 2, 3, 5, 4, and 1, and we want 3 splits in total. First double avgLengthPerSplit = ((double)totLength) / numSplits is computed, which comes out to 5; the files are then partitioned, yielding three clusters: {files 1 and 2}, {file 3}, {files 4 and 5}. With a different set of sizes for the same five files, the partition could instead come out as four clusters: {file 1}, {file 2}, {files 3 and 4}, {file 5}. Also, getRecordReader is still an abstract method in this class, so subclasses must implement it.
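The clustering in the example can be reproduced with a greedy pass that closes a cluster once it reaches avgLengthPerSplit. This is a sketch of the example's arithmetic, not the exact MultiFileInputFormat implementation, which differs in detail:

```python
def group_files(lengths, num_splits):
    """Greedily pack whole files into clusters: accumulate files until a
    cluster reaches the average target size, then start a new one.
    Files are never cut across clusters."""
    if not lengths:
        return []
    avg = sum(lengths) / num_splits
    clusters, current, acc = [], [], 0
    for n in lengths:
        current.append(n)
        acc += n
        if acc >= avg:          # cluster is full; close it
            clusters.append(current)
            current, acc = [], 0
    if current:                 # leftover files form a final cluster
        clusters.append(current)
    return clusters
```

With sizes [2, 3, 5, 4, 1] and 3 requested splits the average is 5, and the pass produces exactly the clusters from the example: {2, 3}, {5}, {4, 1}.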

so true | 2009-01-07 09:40 | Post a comment
Things to note about ssh when setting up distributed Hadoop
http://www.tkk7.com/bacoo/archive/2008/11/15/240625.html
by so true | Fri, 14 Nov 2008 17:25:00 GMT

Configuring passwordless ssh access:
姣斿錛孉鏄痵erver錛孊鏄痗lient錛岀幇鍦˙甯屾湜閫氳繃ssh鏃犲瘑鐮佽闂瓵錛岄偅涔堝氨闇瑕佹妸B鐨勫叕鍖欐斁鍒癆鐨刟uthorized_keys鏂囦歡涓?/p>

1銆傞鍏堥渶瑕丄鏀寔榪欑璁塊棶妯″紡錛?br /> 閰嶇疆A鐨?etc/ssh/sshd_config錛屽皢榪欎袱欏硅緗涓嬶細
RSAAuthentication yes
PubkeyAuthentication yes

2. B generates id_rsa.pub, and the contents of that file are appended (with ">>") to the end of A's authorized_keys file.

3. On B, ssh <A's ip or A's hostname> now logs into A without a password.

But this only works under a precondition that many people overlook, wasting a great deal of effort without success. I was much the same: it took me a long time to find the problem.
Both machine A and machine B have many accounts. When you type the ssh command on B without specifying which account on A to connect to, what is the default rule? The account you are currently using on B (say its name is haha) becomes the account you are expected to connect to on A. You can name the target account explicitly with ssh -l haha [hostname] or ssh haha@[hostname]; under the implicit rule, the system simply takes the account you are currently using on B as the account to connect to on A.
So the precondition for passwordless access is: A and B must have accounts with exactly the same name, matching completely, including case. (This drove me crazy: I was connecting from cygwin on Windows to a Linux box, the first letter of the Windows account was uppercase while the first letter of the Linux account was lowercase, and it took me ages to spot the cause.) This is also exactly why, when configuring distributed Hadoop, every machine is required to have one completely identical user name.

Since we are on these caveats, one more reminder: in step 2 of the three steps above, the id_rsa.pub file must be generated under that same matching account; otherwise it still will not work.

so true | 2008-11-15 01:25 | Post a comment
涓涓畝鍗晄hell鑴氭湰http://www.tkk7.com/bacoo/archive/2008/11/15/240624.htmlso trueso trueFri, 14 Nov 2008 17:23:00 GMThttp://www.tkk7.com/bacoo/archive/2008/11/15/240624.htmlhttp://www.tkk7.com/bacoo/comments/240624.htmlhttp://www.tkk7.com/bacoo/archive/2008/11/15/240624.html#Feedback0http://www.tkk7.com/bacoo/comments/commentRss/240624.htmlhttp://www.tkk7.com/bacoo/services/trackbacks/240624.html浠婂ぉ鑳藉啓鍑鴻繖鏍蜂竴涓猻hell鑴氭湰錛屽叾瀹炲茍娌℃湁璐瑰お澶у姏姘旓紝鍥犳騫朵笉鏄鎴戝嚑緇忓懆鎶樼粓鏈夌粨鏋滆屽叴濂嬶紝鑰屾槸瑙夊緱鑷繁鐜板湪緇堜簬鍙互韙忓疄涓嬫潵鍋氳嚜宸卞枩嬈㈠仛鐨勪簨鎯咃紝鑳藉涓撴敞鐨勫幓瀛﹁瀛︾殑涓滆タ鑰屽叴濂嬨備箣鍓嶅浜嗗緢澶氭潅涓冩潅鍏殑涓滆タ錛屽洜涓虹洰鏍囦笉鏄庣‘錛屽緢鐥涜嫤錛岀┒鍏舵牴鏈紝鏄洜涓轟笉鐭ラ亾鑷繁灝嗕粠浜嬩粈涔堣亴涓氾紝鍙煡閬撹嚜宸辨兂浠庝簨IT榪欒錛屼絾鍏蜂綋鐨勫伐浣滄柟鍚戝嵈涓嶇煡閬擄紝鍥犳鍟ラ兘瑕佸涔狅紝榪欎釜榪囩▼瀵逛簬鎴戞潵璇村緢鐥涜嫤銆傚洜涓烘垜鏄竴涓瘮杈冨枩嬈㈣笍韙忓疄瀹炲仛浜嬬殑浜猴紝涓嶅仛灝變笉鍋氾紝鍋氬氨瑕佸仛寰楀緢濂姐傛垜涔嬪墠鐪嬭繃涓綃囧叧浜庤榪扮▼搴忓憳嫻簛鐨勬枃绔狅紝鍐欏緱澶簿褰╀簡銆傝岄噷闈㈡彁鍒扮殑寰堝嫻簛鐨勫仛娉曢兘鍦ㄦ垜韜笂寰楀埌浜嗗嵃璇侊紝榪欒鎴戝緢閮侀椃銆傜幇鍦紝宸ヤ綔瀹氫簡錛屾垜鐭ラ亾璇ュ鐐瑰暐浜嗭紝鐩爣涓撴敞浜嗭紝澶編濂戒簡銆?/p>

To borrow Steve Jobs's words:

The only way to be truly satisfied is to do what you believe is great work, and the only way to do great work is to love what you do!

鎴戣寰椾竴涓漢鑳藉仛鍒拌繖涓姝ワ紝鐪熺殑寰堝垢紱忥紝鑷繁鍘誨姫鍔涳紝鍘繪嫾鎼忥紝鍘誨疄鐜拌嚜宸辯殑浠峰鹼紝璁╄嚜宸卞鑷繁鐨勮〃鐜版弧鎰忥紝榪欐槸鎴戠粡甯稿鑷繁璇寸殑涓鍙ヨ瘽銆?/p>

Now my job is settled, and my girlfriend, that is, my future wife, is settled too. All I need to do is strive, work hard, and fight.

I am deeply grateful to have met such a wife, who supports me and cares about me. I do not know whether I will be very successful, but I know that with such a good partner behind me, whatever I do, I will do it solidly. I know that with her I am very happy, and I will certainly bring her happiness too. I promise!

 

All right, here's the code, heh:

#!/bin/sh
# Merge every Hadoop *.log file in /hadoop/logs into one file, tag each
# line with its node type, and sort the result chronologically.

cd /hadoop/logs || exit 1

file=log_name.txt

# Start from a clean output file.
if [ -e "$file" ]; then
 rm "$file"
fi

for cur in *.log
do
 # Log names look like hadoop-<user>-<nodetype>-<host>.log,
 # so the third '-'-separated field is the node type.
 name=`echo "$cur" | cut -d'-' -f3`

 # Keep only timestamped lines (they begin with the year) and
 # append [nodetype] to each one.
 grep '^2008' "$cur" | sed "s/^.*$/&[$name]/" >> "$file"
done

# The timestamp leads every line, so a plain sort orders them in time.
cp "$file" __temp.txt
sort __temp.txt > "$file"
rm __temp.txt

榪愯鐨勭粨鏋滄槸錛?/p>

2008-11-14 10:08:47,671 INFO org.apache.hadoop.dfs.NameNode: STARTUP_MSG: [namenode]
2008-11-14 10:08:48,140 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000[namenode]
2008-11-14 10:08:48,171 INFO org.apache.hadoop.dfs.NameNode: Namenode up at: bacoo/192.168.1.34:9000[namenode]
2008-11-14 10:08:48,171 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null[namenode]
2008-11-14 10:08:48,234 INFO org.apache.hadoop.dfs.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.dfs.FSNamesystemMetrics: Initializing FSNamesystemMeterics using context object:org.apache.hadoop.metrics.spi.NullContext[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=Zhaoyb,None,root,Administrators,Users,Debugger,Users[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup[namenode]
2008-11-14 10:08:48,890 INFO org.apache.hadoop.fs.FSNamesystem: Registered FSNamesystemStatusMBean[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Edits file edits of size 4 edits # 0 loaded in 0 seconds.[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Image file of size 80 loaded in 0 seconds.[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Number of files = 0[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Number of files under construction = 0[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.fs.FSNamesystem: Finished loading FSImage in 657 msecs[namenode]
2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* Leaving safe mode after 0 secs.[namenode]
2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes[namenode]
2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks[namenode]
2008-11-14 10:08:49,609 INFO org.mortbay.util.Credential: Checking Resource aliases[namenode]
2008-11-14 10:08:50,015 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[namenode]
2008-11-14 10:08:50,015 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][namenode]
2008-11-14 10:08:50,015 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][namenode]
2008-11-14 10:08:54,656 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@17f11fb[namenode]
2008-11-14 10:08:55,453 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][namenode]
2008-11-14 10:08:55,468 INFO org.apache.hadoop.fs.FSNamesystem: Web-server up at: 0.0.0.0:50070[namenode]
2008-11-14 10:08:55,468 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50070[namenode]
2008-11-14 10:08:55,468 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@61a907[namenode]
2008-11-14 10:08:55,484 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting[namenode]
2008-11-14 10:08:55,484 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9000: starting[namenode]
2008-11-14 10:08:56,015 INFO org.apache.hadoop.dfs.NameNode.Secondary: STARTUP_MSG: [secondarynamenode]
2008-11-14 10:08:56,156 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=SecondaryNameNode, sessionId=null[secondarynamenode]
2008-11-14 10:08:56,468 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory \tmp\hadoop-SYSTEM\dfs\namesecondary is added.[secondarynamenode]
2008-11-14 10:08:56,546 INFO org.mortbay.util.Credential: Checking Resource aliases[secondarynamenode]
2008-11-14 10:08:56,609 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[secondarynamenode]
2008-11-14 10:08:56,609 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][secondarynamenode]
2008-11-14 10:08:56,609 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][secondarynamenode]
2008-11-14 10:08:56,953 INFO org.mortbay.jetty.servlet.XMLConfiguration: No WEB-INF/web.xml in file:/E:/cygwin/hadoop/webapps/secondary. Serving files and default/dynamic servlets only[secondarynamenode]
2008-11-14 10:08:56,953 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@b1a4e2[secondarynamenode]
2008-11-14 10:08:57,062 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][secondarynamenode]
2008-11-14 10:08:57,078 INFO org.apache.hadoop.dfs.NameNode.Secondary: Secondary Web-server up at: 0.0.0.0:50090[secondarynamenode]
2008-11-14 10:08:57,078 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50090[secondarynamenode]
2008-11-14 10:08:57,078 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@18a8ce2[secondarynamenode]
2008-11-14 10:08:57,078 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint Period   :3600 secs (60 min)[secondarynamenode]
2008-11-14 10:08:57,078 WARN org.apache.hadoop.dfs.NameNode.Secondary: Log Size Trigger    :67108864 bytes (65536 KB)[secondarynamenode]
2008-11-14 10:08:59,828 INFO org.apache.hadoop.mapred.JobTracker: STARTUP_MSG: [jobtracker]
2008-11-14 10:09:00,015 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=JobTracker, port=9001[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9001: starting[jobtracker]
2008-11-14 10:09:00,125 INFO org.mortbay.util.Credential: Checking Resource aliases[jobtracker]
2008-11-14 10:09:01,703 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[jobtracker]
2008-11-14 10:09:01,703 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][jobtracker]
2008-11-14 10:09:01,703 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][jobtracker]
2008-11-14 10:09:02,312 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@1cd280b[jobtracker]
2008-11-14 10:09:08,359 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][jobtracker]
2008-11-14 10:09:08,375 INFO org.apache.hadoop.mapred.JobTracker: JobTracker up at: 9001[jobtracker]
2008-11-14 10:09:08,375 INFO org.apache.hadoop.mapred.JobTracker: JobTracker webserver: 50030[jobtracker]
2008-11-14 10:09:08,375 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=[jobtracker]
2008-11-14 10:09:08,375 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50030[jobtracker]
2008-11-14 10:09:08,375 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@16a9b9c[jobtracker]
2008-11-14 10:09:12,984 INFO org.apache.hadoop.mapred.JobTracker: Starting RUNNING[jobtracker]
2008-11-14 10:09:56,894 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG: [datanode]
2008-11-14 10:10:02,516 INFO org.apache.hadoop.mapred.TaskTracker: STARTUP_MSG: [tasktracker]
2008-11-14 10:10:08,768 INFO org.apache.hadoop.dfs.Storage: Formatting ...[datanode]
2008-11-14 10:10:08,768 INFO org.apache.hadoop.dfs.Storage: Storage directory /hadoop/hadoopfs/data is not formatted.[datanode]
2008-11-14 10:10:11,343 INFO org.apache.hadoop.dfs.DataNode: Registered FSDatasetStatusMBean[datanode]
2008-11-14 10:10:11,347 INFO org.apache.hadoop.dfs.DataNode: Opened info server at 50010[datanode]
2008-11-14 10:10:11,352 INFO org.apache.hadoop.dfs.DataNode: Balancing bandwith is 1048576 bytes/s[datanode]
2008-11-14 10:10:16,430 INFO org.mortbay.util.Credential: Checking Resource aliases[tasktracker]
2008-11-14 10:10:17,976 INFO org.mortbay.util.Credential: Checking Resource aliases[datanode]
2008-11-14 10:10:20,068 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[datanode]
2008-11-14 10:10:20,089 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][datanode]
2008-11-14 10:10:20,089 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][datanode]
2008-11-14 10:10:20,725 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[tasktracker]
2008-11-14 10:10:20,727 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][tasktracker]
2008-11-14 10:10:20,727 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][tasktracker]
2008-11-14 10:10:27,078 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/localhost[jobtracker]
2008-11-14 10:10:32,171 INFO org.apache.hadoop.dfs.StateChange: BLOCK* NameSystem.registerDatanode: node registration from 192.168.1.167:50010 storage DS-1556534590-127.0.0.1-50010-1226628640386[namenode]
2008-11-14 10:10:32,187 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/192.168.1.167:50010[namenode]
2008-11-14 10:13:57,171 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory \tmp\hadoop-SYSTEM\dfs\namesecondary is added.[secondarynamenode]
2008-11-14 10:13:57,187 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 5 Total time for transactions(ms): 0 Number of syncs: 3 SyncTimes(ms): 4125 [namenode]
2008-11-14 10:13:57,187 INFO org.apache.hadoop.fs.FSNamesystem: Roll Edit Log from 192.168.1.34[namenode]
2008-11-14 10:13:57,953 INFO org.apache.hadoop.dfs.NameNode.Secondary: Downloaded file fsimage size 80 bytes.[secondarynamenode]
2008-11-14 10:13:57,968 INFO org.apache.hadoop.dfs.NameNode.Secondary: Downloaded file edits size 288 bytes.[secondarynamenode]
2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=Zhaoyb,None,root,Administrators,Users,Debugger,Users[secondarynamenode]
2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true[secondarynamenode]
2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup[secondarynamenode]
2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Edits file edits of size 288 edits # 5 loaded in 0 seconds.[secondarynamenode]
2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Number of files = 0[secondarynamenode]
2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Number of files under construction = 0[secondarynamenode]
2008-11-14 10:13:58,718 INFO org.apache.hadoop.dfs.Storage: Image file of size 367 saved in 0 seconds.[secondarynamenode]
2008-11-14 10:13:58,796 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0 Number of syncs: 0 SyncTimes(ms): 0 [secondarynamenode]
2008-11-14 10:13:58,921 INFO org.apache.hadoop.dfs.NameNode.Secondary: Posted URL 0.0.0.0:50070putimage=1&port=50090&machine=192.168.1.34&token=-16:145044639:0:1226628551796:1226628513000[secondarynamenode]
2008-11-14 10:13:59,078 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0 Number of syncs: 0 SyncTimes(ms): 0 [namenode]
2008-11-14 10:13:59,078 INFO org.apache.hadoop.fs.FSNamesystem: Roll FSImage from 192.168.1.34[namenode]
2008-11-14 10:13:59,265 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint done. New Image Size: 367[secondarynamenode]
2008-11-14 10:29:02,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 0 time(s).[secondarynamenode]
2008-11-14 10:29:04,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 1 time(s).[secondarynamenode]
2008-11-14 10:29:06,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 2 time(s).[secondarynamenode]
2008-11-14 10:29:08,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 3 time(s).[secondarynamenode]
2008-11-14 10:29:10,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 4 time(s).[secondarynamenode]
2008-11-14 10:29:11,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 5 time(s).[secondarynamenode]
2008-11-14 10:29:13,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 6 time(s).[secondarynamenode]
2008-11-14 10:29:15,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 7 time(s).[secondarynamenode]
2008-11-14 10:29:17,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 8 time(s).[secondarynamenode]
2008-11-14 10:29:19,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 9 time(s).[secondarynamenode]
2008-11-14 10:29:21,078 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint: [secondarynamenode]
2008-11-14 10:29:21,171 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.io.IOException: Call failed on local exception[secondarynamenode]
2008-11-14 10:34:23,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 0 time(s).[secondarynamenode]
2008-11-14 10:34:25,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 1 time(s).[secondarynamenode]
2008-11-14 10:34:27,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 2 time(s).[secondarynamenode]
2008-11-14 10:34:29,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 3 time(s).[secondarynamenode]
2008-11-14 10:34:31,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 4 time(s).[secondarynamenode]
2008-11-14 10:34:32,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 5 time(s).[secondarynamenode]
2008-11-14 10:34:34,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 6 time(s).[secondarynamenode]
2008-11-14 10:34:36,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 7 time(s).[secondarynamenode]
2008-11-14 10:34:38,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 8 time(s).[secondarynamenode]
2008-11-14 10:34:40,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 9 time(s).[secondarynamenode]
2008-11-14 10:34:41,468 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint: [secondarynamenode]
2008-11-14 10:34:41,468 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.io.IOException: Call failed on local exception[secondarynamenode]
2008-11-14 10:38:43,359 INFO org.apache.hadoop.dfs.NameNode.Secondary: SHUTDOWN_MSG: [secondarynamenode]

I believe this sorts the generated logs nicely into chronological order, and every line now also carries its corresponding node type at the end.

so true | 2008-11-15 01:23 | Post a comment