A while back I looked into CarbonData, a columnar store on HDFS that reportedly performs well and supports complex queries. That is exactly what our business needs right now, so I wanted to give it a try.
1. Download the source
Downloaded the source from the official site. There are no prebuilt binaries, so you have to compile it yourself. What a pain.
1) Install Maven
This one is easy: download it and set MAVEN_HOME:
export MAVEN_HOME=/ssd/bigdata/software/apache-maven-3.3.9
and add its bin directory to PATH, and that's it.
2) Install Java
Already installed, skipping.
3) Install Thrift 0.9.3
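For completeness, the shell setup boils down to something like this (a minimal sketch; the mvn -version check is only there to confirm Maven is picked up):
export PATH=$MAVEN_HOME/bin:$PATH   # put Maven's bin directory on PATH
mvn -version                        # should report Apache Maven 3.3.9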
./configure JAVAC=/ssd/bigdata/software/jdk/bin/javac
This fails with:
configure: error: Bison version 2.5 or higher must be installed on the system!
So Bison 3.0 has to be built first; configure it with an install prefix:
./configure JAVAC=/ssd/bigdata/software/jdk/bin/javac --prefix=/ssd/bigdata/software/bison-3.0/install
安裝完成后喜庞,在環(huán)境變量的path前面加上安裝路徑诀浪。
再次安裝thrift
make
然后檢查 make check
然后用root用戶進(jìn)行make install
4) Build CarbonData
mvn clean -DskipTests -Pbuild-with-format -Pspark-2.1 install
No access to the external network from this machine, so the build cannot pull dependencies. - -
Switching over to my own virtual machine to try compiling there.
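The rough plan, assuming the VM can reach the internet (the host name and paths below are made up), is to resolve everything there and copy the local Maven repository back:
# on the VM with internet: download all dependencies into ~/.m2
mvn -Pbuild-with-format -Pspark-2.1 dependency:go-offline
# ship the populated local repository to the offline machine
rsync -a ~/.m2/repository/ user@offline-host:~/.m2/repository/
# on the offline machine, build without touching the network
mvn clean install -DskipTests -Pbuild-with-format -Pspark-2.1 -o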
[ERROR] Failed to execute goal on project carbondata-core: Could not resolve dependencies for project org.apache.carbondata:carbondata-core:jar:1.0.0-incubating: The following artifacts could not be resolved: pentaho-kettle:kettle-engine:jar:4.4.0-stable, pentaho-kettle:kettle-core:jar:4.4.0-stable, pentaho-kettle:kettle-db:jar:4.4.0-stable, org.apache.spark:spark-core_2.11:jar:2.1.0, org.apache.spark:spark-catalyst_2.11:jar:2.1.0: Failure to find pentaho-kettle:kettle-engine:jar:4.4.0-stable in
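The pentaho-kettle artifacts are apparently not available from the repositories this build can reach. If the jars can be obtained some other way, one workaround is to drop them into the local repository by hand, roughly like this (a sketch for one of the three jars; the file is assumed to have been downloaded separately):
mvn install:install-file \
  -Dfile=kettle-engine-4.4.0-stable.jar \
  -DgroupId=pentaho-kettle \
  -DartifactId=kettle-engine \
  -Dversion=4.4.0-stable \
  -Dpackaging=jar
# repeat for kettle-core and kettle-db, then rerun the CarbonData build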