【1】 Expected only partition pruning predicates
Solution: set spark.sql.hive.metastorePartitionPruning=false
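A minimal Scala sketch of applying this setting when the session is created (the app name is illustrative); with metastore-side partition pruning disabled, Spark fetches all partitions and filters them itself instead of pushing the predicate to the Hive metastore:

```scala
import org.apache.spark.sql.SparkSession

// Disable Hive metastore-side partition pruning to work around
// "Expected only partition pruning predicates".
val spark = SparkSession.builder()
  .appName("example")  // illustrative name
  .enableHiveSupport()
  .config("spark.sql.hive.metastorePartitionPruning", "false")
  .getOrCreate()
```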
【2】 Error in query: Detected cartesian product for INNER join between logical plans
Project-Join condition is missing or trivial.
Use the CROSS JOIN syntax to allow cartesian products between these relations
Solution: set spark.sql.crossJoin.enabled=true
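Two ways to resolve this, sketched below in Scala (table names t1/t2 are placeholders): either allow implicit cartesian products for the session, or keep the guard on and make the cartesian product explicit with CROSS JOIN as the error message suggests.

```scala
// Option 1: allow implicit cartesian products for this session.
spark.conf.set("spark.sql.crossJoin.enabled", "true")

// Option 2: keep the guard enabled and state the cartesian product explicitly.
spark.sql("SELECT * FROM t1 CROSS JOIN t2")
```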
【3】 ERROR ApplicationMaster: User class threw exception: java.util.concurrent.TimeoutException: Futures timed out after [300 seconds]
Solution: set spark.sql.autoBroadcastJoinThreshold to -1, i.e. try disabling broadcast joins
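The 300-second limit is the default spark.sql.broadcastTimeout; setting the broadcast threshold to -1 turns automatic broadcast hash joins off so Spark falls back to shuffle-based joins and no longer waits on the broadcast future. A minimal sketch:

```scala
// Disable automatic broadcast hash joins entirely (-1 = never broadcast),
// avoiding the 300s broadcast timeout seen in the ApplicationMaster log.
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")
```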
【4】 org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: No suitable driver found for
When packaging bigquery, a spark-1.0.3 artifact was produced and used to start the thriftserver. When its logic accessed MySQL, it failed with the "No suitable driver found for" error; judging from the message, the MySQL URL was never picked up. Inspecting the jar showed that the configuration files under the common module's resources had not been packaged in: the spark module's pom.xml did not include the common module's resources at packaging time.
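A minimal pom.xml sketch of one way to fix this, assuming a standard multi-module Maven layout where common is a sibling of the spark module (the relative path is an assumption): declare the common module's resource directory in the spark module's build so its config files land in the packaged jar.

```xml
<!-- spark module pom.xml: also package the common module's resources.
     ../common/src/main/resources is an assumed sibling-module path. -->
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
    <resource>
      <directory>../common/src/main/resources</directory>
    </resource>
  </resources>
</build>
```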