
Analyzing a Hadoop Example Run - 创新互联

This article explains how to run one of the example jobs bundled with Hadoop. The material is straightforward and easy to follow; work through the steps below in order to see a complete MapReduce job from submission to verified output.


1. Locate the examples jar
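On a Hadoop 2.x install the examples jar lives under share/hadoop/mapreduce. A quick way to locate it (the install root /hadoop_soft/hadoop-2.7.2 matches the path used in the job command later in this article; adjust it for your own setup):

```shell
# Search the Hadoop install tree for the bundled examples jar.
find /hadoop_soft/hadoop-2.7.2/share/hadoop/mapreduce \
    -name 'hadoop-mapreduce-examples-*.jar'
```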

2. Create the input and output directories
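With HDFS running, the input directory can be created with hdfs dfs. Note that wordcount creates the output directory itself and fails if it already exists, so only the input directory needs to be made by hand (a sketch, using the /wc_input and /wc_output paths from the job command):

```shell
# Create the HDFS input directory; the job creates /wc_output itself.
hdfs dfs -mkdir /wc_input
# Only needed when re-running: clear a leftover output directory first.
# hdfs dfs -rm -r /wc_output
```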

3. Upload the files to be counted into the wc_input directory
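The local text files are pushed into HDFS with hdfs dfs -put. The file names here are placeholders; the job log below shows that two input files were processed ("Total input paths to process : 2"):

```shell
# Upload the local text files into the HDFS input directory.
# file1.txt and file2.txt are illustrative names.
hdfs dfs -put file1.txt file2.txt /wc_input/
```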

4. View the uploaded files
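A quick listing confirms the files landed in HDFS:

```shell
# List the contents of the input directory to confirm the upload.
hdfs dfs -ls /wc_input
```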

5. Submit the wordcount job (the full command and its output follow):

 [root@hadoop input]# hadoop jar /hadoop_soft/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar wordcount /wc_input/* /wc_output/
17/08/15 10:25:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/15 10:25:25 INFO client.RMProxy: Connecting to ResourceManager at /192.168.1.120:18040
17/08/15 10:25:27 INFO input.FileInputFormat: Total input paths to process : 2
17/08/15 10:25:27 INFO mapreduce.JobSubmitter: number of splits:2
17/08/15 10:25:28 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1502762082449_0001
17/08/15 10:25:28 INFO impl.YarnClientImpl: Submitted application application_1502762082449_0001
17/08/15 10:25:29 INFO mapreduce.Job: The url to track the job: http://hadoop:18088/proxy/application_1502762082449_0001/
17/08/15 10:25:29 INFO mapreduce.Job: Running job: job_1502762082449_0001
17/08/15 10:25:48 INFO mapreduce.Job: Job job_1502762082449_0001 running in uber mode : true
17/08/15 10:25:48 INFO mapreduce.Job:  map 0% reduce 0%
17/08/15 10:25:50 INFO mapreduce.Job:  map 100% reduce 0%
17/08/15 10:25:51 INFO mapreduce.Job:  map 100% reduce 100%
17/08/15 10:25:51 INFO mapreduce.Job: Job job_1502762082449_0001 completed successfully
17/08/15 10:25:52 INFO mapreduce.Job: Counters: 52
 File System Counters
  FILE: Number of bytes read=276
  FILE: Number of bytes written=545
  FILE: Number of read operations=0
  FILE: Number of large read operations=0
  FILE: Number of write operations=0
  HDFS: Number of bytes read=798
  HDFS: Number of bytes written=398613
  HDFS: Number of read operations=66
  HDFS: Number of large read operations=0
  HDFS: Number of write operations=23
 Job Counters
  Launched map tasks=2
  Launched reduce tasks=1
  Other local map tasks=2
  Total time spent by all maps in occupied slots (ms)=1972
  Total time spent by all reduces in occupied slots (ms)=803
  TOTAL_LAUNCHED_UBERTASKS=3
  NUM_UBER_SUBMAPS=2
  NUM_UBER_SUBREDUCES=1
  Total time spent by all map tasks (ms)=1972
  Total time spent by all reduce tasks (ms)=803
  Total vcore-milliseconds taken by all map tasks=1972
  Total vcore-milliseconds taken by all reduce tasks=803
  Total megabyte-milliseconds taken by all map tasks=2019328
  Total megabyte-milliseconds taken by all reduce tasks=822272
 Map-Reduce Framework
  Map input records=5
  Map output records=11
  Map output bytes=111
  Map output materialized bytes=109
  Input split bytes=210
  Combine input records=11
  Combine output records=8
  Reduce input groups=7
  Reduce shuffle bytes=109
  Reduce input records=8
  Reduce output records=7
  Spilled Records=16
  Shuffled Maps =2
  Failed Shuffles=0
  Merged Map outputs=2
  GC time elapsed (ms)=637
  CPU time spent (ms)=1820
  Physical memory (bytes) snapshot=830070784
  Virtual memory (bytes) snapshot=8998096896
  Total committed heap usage (bytes)=500510720
 Shuffle Errors
  BAD_ID=0
  CONNECTION=0
  IO_ERROR=0
  WRONG_LENGTH=0
  WRONG_MAP=0
  WRONG_REDUCE=0
 File Input Format Counters
  Bytes Read=70
 File Output Format Counters
  Bytes Written=57

6. View the job output
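The reducer writes its results to a part file under /wc_output; an empty _SUCCESS marker file indicates the job finished cleanly. A sketch of inspecting them:

```shell
# _SUCCESS marks successful completion; part-r-00000 holds the counts.
hdfs dfs -ls /wc_output
hdfs dfs -cat /wc_output/part-r-00000
```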

7. Verify the result data
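As a sanity check, the same word counts can be reproduced locally with standard Unix tools and compared against the part file. This mirrors what the job computes, so the totals should line up with the log's counters (for instance, Reduce output records equals the number of distinct words). A sketch, with sample.txt standing in for the uploaded input:

```shell
# Reproduce wordcount locally: split on whitespace, sort, count runs.
printf 'hello world\nhello hadoop\n' > sample.txt
tr -s ' \t' '\n' < sample.txt | sort | uniq -c | awk '{print $2"\t"$1}'
```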

Thanks for reading. That concludes this walkthrough of running a Hadoop example job. Hopefully it has given you a clearer picture of the process; the best way to cement it is to try the steps on your own cluster. This is 创新互联; more articles on related topics are on the way, so stay tuned!

新宁县| 新蔡县| 秀山| 通州市| 白银市| 葵青区| 平度市| 文登市| 秦皇岛市| 金寨县| 仁寿县| 那曲县| 莲花县| 屏东市| 泸水县| 拉孜县| 建湖县| 清涧县| 吉林市| 民乐县| 清丰县| 准格尔旗| 隆子县| 漳平市| 壤塘县| 廉江市| 雷波县| 新龙县| 上虞市| 滨州市| 施秉县| 北安市| 肇东市| 固安县| 永春县| 兴国县| 东明县| 麦盖提县| 莱阳市| 双桥区| 即墨市|