
Hadoop Standalone Installation

Test environment:


Operating system: Ubuntu 16.04 LTS

Hadoop version: Hadoop 2.7.1

1. Install the Java environment

dblab@dblab-VirtualBox:/$ sudo apt-get install default-jre default-jdk
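On Ubuntu, the default-jdk package provides the /usr/lib/jvm/default-java path used below. If you want to confirm which JDK it actually resolves to before setting JAVA_HOME (an optional check, not part of the original steps), you can follow the symlink:

dblab@dblab-VirtualBox:/$ readlink -f /usr/lib/jvm/default-java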

dblab@dblab-VirtualBox:/$ vim ~/.bashrc

export JAVA_HOME=/usr/lib/jvm/default-java    # add this line at the end of ~/.bashrc

dblab@dblab-VirtualBox:/$ source ~/.bashrc    # make the variable settings take effect

dblab@dblab-VirtualBox:/$ echo $JAVA_HOME

/usr/lib/jvm/default-java

dblab@dblab-VirtualBox:/$ java -version

openjdk version "1.8.0_131"

OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)

OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
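Since default-jdk was installed alongside default-jre, the Java compiler should also be available; a quick optional check (not in the original steps) is to run javac and confirm it reports a matching 1.8.0 version:

dblab@dblab-VirtualBox:/$ javac -version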

2. Install Hadoop

dblab@dblab-VirtualBox:/$ cd ~/下載    # the Downloads directory

dblab@dblab-VirtualBox:~/下載$ wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz

dblab@dblab-VirtualBox:~/下載$ sudo tar -zxvf hadoop-2.7.1.tar.gz -C /usr/local

dblab@dblab-VirtualBox:~/下載$ cd /usr/local

dblab@dblab-VirtualBox:/usr/local$ sudo mv ./hadoop-2.7.1/ ./hadoop    # rename the folder

dblab@dblab-VirtualBox:/usr/local$ sudo chown -R hadoop ./hadoop/    # change the folder's ownership to the user that will run Hadoop
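To confirm the ownership change took effect (an optional check, not part of the original steps), list the directory:

dblab@dblab-VirtualBox:/usr/local$ ls -ld ./hadoop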

dblab@dblab-VirtualBox:/usr/local$ cd ./hadoop

dblab@dblab-VirtualBox:/usr/local/hadoop$ ./bin/hadoop version    # display version information

Hadoop 2.7.1

Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 15ecc87ccf4a0228f35af08fc56de536e6ce657a

Compiled by jenkins on 2015-06-29T06:04Z

Compiled with protoc 2.5.0

From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a

This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.7.1.jar
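Optionally (this is a convenience, not part of the original steps), you can append Hadoop's bin and sbin directories to your PATH in ~/.bashrc so that hadoop can be invoked without the ./bin/ prefix; the steps below keep the explicit ./bin/ form either way. A minimal sketch, assuming the /usr/local/hadoop install path used above:

export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Remember to run source ~/.bashrc afterwards, as with JAVA_HOME above.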

3. Hadoop standalone configuration

Hadoop's default mode is non-distributed (local/standalone) mode, which requires no further configuration: it runs as a single Java process and reads from and writes to the local filesystem directly, without HDFS.

To get a feel for how Hadoop runs, you can execute one of the many example programs it ships with. Running the examples jar without arguments lists them:

dblab@dblab-VirtualBox:/usr/local/hadoop$ ./bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar

An example program must be given as the first argument.

Valid program names are:

aggregatewordcount: An Aggregate based map/reduce program that counts the words in the input files.

aggregatewordhist: An Aggregate based map/reduce program that computes the histogram of the words in the input files.

bbp: A map/reduce program that uses Bailey-Borwein-Plouffe to compute exact digits of Pi.

dbcount: An example job that count the pageview counts from a database.

distbbp: A map/reduce program that uses a BBP-type formula to compute exact bits of Pi.

grep: A map/reduce program that counts the matches of a regex in the input.

join: A job that effects a join over sorted, equally partitioned datasets

multifilewc: A job that counts words from several files.

pentomino: A map/reduce tile laying program to find solutions to pentomino problems.

pi: A map/reduce program that estimates Pi using a quasi-Monte Carlo method.

randomtextwriter: A map/reduce program that writes 10GB of random textual data per node.

randomwriter: A map/reduce program that writes 10GB of random data per node.

secondarysort: An example defining a secondary sort to the reduce.

sort: A map/reduce program that sorts the data written by the random writer.

sudoku: A sudoku solver.

teragen: Generate data for the terasort

terasort: Run the terasort

teravalidate: Checking results of terasort

wordcount: A map/reduce program that counts the words in the input files.

wordmean: A map/reduce program that counts the average length of the words in the input files.

wordmedian: A map/reduce program that counts the median length of the words in the input files.

wordstandarddeviation: A map/reduce program that counts the standard deviation of the length of the words in the input files.
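Invoking an individual example without its arguments should print that example's own usage message (an optional check, not part of the original steps); for instance:

dblab@dblab-VirtualBox:/usr/local/hadoop$ ./bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar grep

Here we run the grep example, which searches the input files for strings matching a regular expression and counts the occurrences, using Hadoop's own configuration files as the input: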

$ cd /usr/local/hadoop

$ sudo mkdir ./input

$ sudo cp ./etc/hadoop/*.xml ./input    # use the configuration files as the input

$ ./bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar grep ./input ./output 'dfs[a-z.]+'

$ cat ./output/*    # view the results

dblab@dblab-VirtualBox:/usr/local/hadoop$ cat ./output/*

1       dfsadmin
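Note that Hadoop will not overwrite an existing output directory: if you want to rerun the example, delete ./output first, otherwise the job aborts with a FileAlreadyExistsException.

dblab@dblab-VirtualBox:/usr/local/hadoop$ rm -r ./output    # use sudo if permissions require it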
