To open this Hadoop series, we follow the usual routine: build the source and import it into Eclipse, so that later, whenever we want to dig into a component or something breaks, we can go straight to the code.
Building Hadoop 2.4.1 (and the 3.0.0-SNAPSHOT trunk actually built below) requires protoc 2.5.0, so download that first; I used protobuf-2.5.0.tar.bz2.
Before building protoc, install a few dependency packages: gcc, gcc-c++, and make (plus cmake, openssl-devel, and ncurses-devel for Hadoop's native build). Skip any that are already installed:
yum install gcc
yum install gcc-c++
yum install make
yum install cmake
yum install openssl-devel
yum install ncurses-devel
Install protoc:
tar -xvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure --prefix=/opt/protoc/
make && make install
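Because protoc was installed under a non-standard prefix (/opt/protoc/), the Hadoop build will not find it automatically. A minimal sketch of the environment setup, assuming the prefix used above (HADOOP_PROTOC_PATH is the variable Hadoop's build reads to locate a specific protoc binary):

```shell
# Assumes the /opt/protoc prefix passed to ./configure above.
export PATH=/opt/protoc/bin:$PATH
export HADOOP_PROTOC_PATH=/opt/protoc/bin/protoc

# Verify before building -- this should report: libprotoc 2.5.0
protoc --version
```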
On Linux, run the build with (note the spaces between the options; eclipse:eclipse additionally generates Eclipse project files):
mvn install eclipse:eclipse -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
Here -Pdist,native activates the distribution and native-library profiles, -DskipTests skips the unit tests, -Dtar produces the distribution tarball, and -Dmaven.javadoc.skip=true skips javadoc generation.
When the build finishes, look at the hadoop-dist/target folder:
[root@localhost target]# ll
total 153824
drwxr-xr-x. 2 root root 4096 Jul 9 17:00 antrun
-rw-r--r--. 1 root root 4809 Jul 9 17:00 dist-layout-stitching.sh
-rw-r--r--. 1 root root 666 Jul 9 17:01 dist-tar-stitching.sh
drwxr-xr-x. 9 root root 4096 Jul 9 17:00 hadoop-3.0.0-SNAPSHOT
-rw-r--r--. 1 root root 157482988 Jul 9 17:01 hadoop-3.0.0-SNAPSHOT.tar.gz
-rw-r--r--. 1 root root 3445 Jul 9 17:01 hadoop-dist-3.0.0-SNAPSHOT.jar
drwxr-xr-x. 2 root root 4096 Jul 9 17:01 maven-archiver
drwxr-xr-x. 2 root root 4096 Jul 9 17:00 test-dir
[root@localhost target]# pwd
/home/fish/hadoop/hadoop-dist/target
Check the Hadoop version:
[root@localhost bin]# cd /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin
[root@localhost bin]# ./hadoop version
Hadoop 3.0.0-SNAPSHOT
Source code repository https://github.com/apache/hadoop.git -r e0febce0e74ec69597376774f771da46834c42b1
Compiled by root on 2015-07-09T08:53Z
Compiled with protoc 2.5.0
From source with checksum d69dd13fde158d22d95a263a0f12bc8
This command was run using /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/share/hadoop/common/hadoop-common-3.0.0-SNAPSHOT.jar
[root@localhost bin]# pwd
/home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin
Inspect the native libraries that were built:
[root@localhost hadoop-3.0.0-SNAPSHOT]# file lib//native/*
lib//native/libhadoop.a: current ar archive
lib//native/libhadooppipes.a: current ar archive
lib//native/libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
lib//native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libhadooputils.a: current ar archive
lib//native/libhdfs.a: current ar archive
lib//native/libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
lib//native/libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libnativetask.a: current ar archive
lib//native/libnativetask.so: symbolic link to `libnativetask.so.1.0.0'
lib//native/libnativetask.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
[root@localhost hadoop-3.0.0-SNAPSHOT]# pwd
/home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
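As a cross-check of the file output above, Hadoop also ships a built-in subcommand that reports whether each native library actually loads; a sketch, run from the same hadoop-3.0.0-SNAPSHOT directory:

```shell
# checknative -a lists each native component (hadoop, zlib, snappy, ...)
# and whether the freshly built .so files can be loaded.
./bin/hadoop checknative -a
```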
Build problems
Problem 1:
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:3.0.0-SNAPSHOT: Failure to find org.apache.hadoop:hadoop-auth:jar:tests:3.0.0-SNAPSHOT in http
://10.0.1.88:8081/nexus/content/repositories/thirdparty/ was cached in the local repository, resolution will not be reattempted until the update interval of thirdparty has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
Maven cached the failed lookups as .lastUpdated files in the local repository; work around this by renaming the stale files in .m2:
mv /root/.m2/repository/org/apache/hadoop/hadoop-auth/3.0.0-SNAPSHOT/hadoop-auth-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-auth/3.0.0-SNAPSHOT/hadoop-auth-3.0.0-SNAPSHOT-tests.jar
mv /root/.m2/repository/org/apache/hadoop/hadoop-kms/3.0.0-SNAPSHOT/hadoop-kms-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-kms/3.0.0-SNAPSHOT/hadoop-kms-3.0.0-SNAPSHOT-tests.jar
mv /root/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar
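If more artifacts are affected, a small loop saves typing out the full paths. fix_last_updated is a hypothetical helper (not a Maven command), and it makes the same assumption as the mv commands above, namely that the cached file contents are usable:

```shell
# Hypothetical helper: strip the ".lastUpdated" suffix from every stale
# file under the given directory, restoring the expected jar names.
fix_last_updated() {
    find "$1" -name '*.lastUpdated' -print | while IFS= read -r f; do
        mv "$f" "${f%.lastUpdated}"
    done
}

# e.g.: fix_last_updated /root/.m2/repository/org/apache/hadoop
```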
Problem 2:
Some errors report that a jar could not be downloaded. In that case, check the official repository at http://search.maven.org/ to see whether the artifact actually exists; if it does, the failure was most likely a network problem, and re-running the build a few times usually resolves it.
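Instead of simply re-running and waiting out the repository's update interval, Maven can be told to force a re-check of cached and previously failed downloads with the -U (--update-snapshots) flag:

```shell
# Re-run the same build, forcing Maven to re-check snapshots and
# previously failed artifact downloads instead of using the cache.
mvn install eclipse:eclipse -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -U
```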
Problem 3:
[root@localhost bin]# ./hadoop
: No such file or directory
The error comes from Windows (CRLF) line endings in the script; convert the hadoop command to the Linux format:
dos2unix /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin/hadoop
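If dos2unix is not installed, the same conversion (stripping the trailing carriage returns that cause the ": No such file or directory" error) can be done with GNU sed; a sketch on a throwaway demo file:

```shell
# Create a demo script with Windows (CRLF) line endings.
printf '#!/bin/sh\r\necho hello\r\n' > /tmp/crlf-demo.sh

# Strip the trailing \r from every line, converting CRLF to LF in place.
sed -i 's/\r$//' /tmp/crlf-demo.sh
```

On the real file this would be: sed -i 's/\r$//' /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin/hadoop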