I want to process data with Spark and insert it into HBase. I am using the Hbase-Spark (Apache HBase) library (https://mvnrepository.com/artifact/org.apache.hbase/hbase-spark/2.0.0-alpha4).
I see…

I am using Spark 2.2.x and ran into the same problem. I don't remember exactly which of these dependencies fixed it, but check which one is missing from your configuration and try adding it. It should work. I believe it was spark-streaming:
<dependency>
    <groupId>commons-logging</groupId>
    <artifactId>commons-logging</artifactId>
    <version>1.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>${hbase.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>${hbase-spark.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-mapreduce</artifactId>
    <version>${hbase-spark.version}</version>
</dependency>
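Once those dependencies resolve, a minimal write path with the hbase-spark connector looks roughly like this. This is a sketch, not your exact code: the table name `my_table`, column family `cf`, qualifier `col`, and the sample records are placeholders, and `HBaseConfiguration.create()` assumes an `hbase-site.xml` on the classpath.

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.{SparkConf, SparkContext}

object SparkToHBase {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("spark-to-hbase"))

    // Picks up hbase-site.xml (ZooKeeper quorum, etc.) from the classpath
    val hbaseConf = HBaseConfiguration.create()
    val hbaseContext = new HBaseContext(sc, hbaseConf)

    // Placeholder records: (rowKey, value) pairs
    val rdd = sc.parallelize(Seq(("row1", "v1"), ("row2", "v2")))

    // bulkPut maps each RDD element to a Put and writes it to the table
    hbaseContext.bulkPut[(String, String)](
      rdd,
      TableName.valueOf("my_table"),
      { case (rowKey, value) =>
        val put = new Put(Bytes.toBytes(rowKey))
        put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
        put
      })

    sc.stop()
  }
}
```

If any of the dependencies above is missing, this is typically where you see `NoClassDefFoundError`/`ClassNotFoundException` at runtime, since the connector pulls classes from hbase-client, hbase-mapreduce, and spark-streaming.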