This article describes how to enable LZO compression in Spark. The walkthrough is fairly detailed, so readers facing the same setup may find it a useful reference.

Enabling LZO compression in Spark takes the following steps:
1. Configure spark-env.sh

Add the native-library and classpath entries below. The paths are from this article's environment; adjust them to match your own installation. Note that `SPARK_CLASSPATH` is deprecated in Spark 2.x in favor of `spark.driver.extraClassPath` / `spark.executor.extraClassPath`, but it still works and Spark will only log a warning.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/app/hadoop-2.6.0-cdh6.7.0/lib/native
export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:/app/hadoop-2.6.0-cdh6.7.0/lib/native
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/yarn/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/yarn/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/hdfs/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/hdfs/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/mapreduce/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/mapreduce/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/tools/lib/*:/app/spark-2.2.0-bin-2.6.0-cdh6.7.0/jars/*
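The long `SPARK_CLASSPATH` line above follows a regular pattern: each Hadoop component contributes a `<component>/*` and a `<component>/lib/*` entry. The sketch below rebuilds it from that pattern, which is less error-prone to maintain than one hand-written line. The `HADOOP_HOME`/`SPARK_HOME` values are the paths from this article and are assumptions about your layout:

```shell
# Sketch: rebuild the SPARK_CLASSPATH line from its component pattern.
# Adjust these two roots to your own installation.
HADOOP_HOME=/app/hadoop-2.6.0-cdh6.7.0
SPARK_HOME=/app/spark-2.2.0-bin-2.6.0-cdh6.7.0

CP=""
# Each Hadoop component contributes its jars plus its lib/ jars.
for c in yarn common hdfs mapreduce; do
  CP="$CP:$HADOOP_HOME/share/hadoop/$c/*:$HADOOP_HOME/share/hadoop/$c/lib/*"
done
# Tools libs and Spark's own jars come last.
CP="$CP:$HADOOP_HOME/share/hadoop/tools/lib/*:$SPARK_HOME/jars/*"

export SPARK_CLASSPATH="$SPARK_CLASSPATH$CP"
echo "$SPARK_CLASSPATH"
```

Running this produces the same ten classpath entries as the single-line version in spark-env.sh.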
2. Fixing "LzopCodec class not found"
2.1 Error message:
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzopCodec not found.
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:135)
at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175)
at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzopCodec not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1980)
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
2.2 Fix: add the following to spark-defaults.conf in Spark's conf directory:
spark.driver.extraClassPath /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar
spark.executor.extraClassPath /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar
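If you would rather not edit spark-defaults.conf globally, the same class paths can be supplied per job at submit time. This is a sketch: the hadoop-lzo jar path is the one from this article and must match your installation, and `your_app.py` is a placeholder for your application:

```shell
spark-submit \
  --driver-class-path /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar \
  --conf spark.executor.extraClassPath=/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar \
  your_app.py
```

`--driver-class-path` is shorthand for `spark.driver.extraClassPath`; values passed on the command line take precedence over spark-defaults.conf.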
That covers configuring and enabling LZO compression in Spark. Hopefully it helps; if you found this article useful, feel free to share it with others.