
How do I resolve these Scala compilation errors in IntelliJ?

I am trying to run this project. I added the dependencies to my sbt file, which looks like this:


name := "HelloScala"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

resolvers += Resolver.bintrayRepo("salesforce", "maven")

libraryDependencies += "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"

Then I copied the Helloworld folder from their repository, but I get a lot of errors:


Information:10/09/18, 12:01 PM - Compilation completed with 88 errors and 0 warnings in 15 s 624 ms

Error:scalac: missing or invalid dependency detected while loading class file 'package.class'.

Could not access type Vector in value org.apache.spark.ml.linalg,

because it (or its dependencies) are missing. Check your build definition for

missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)

A full rebuild may help if 'package.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.

Error:scalac: missing or invalid dependency detected while loading class file 'OPVector.class'.

Could not access type Vector in value org.apache.spark.ml.linalg,

because it (or its dependencies) are missing. Check your build definition for

missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)

A full rebuild may help if 'OPVector.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.

Error:scalac: missing or invalid dependency detected while loading class file 'OpEvaluatorBase.class'.


I searched for these errors and found they are probably caused by a version mismatch, but if that is the case, I don't know which versions to use. However, the project does work when I run it from the command line:


cd helloworld

./gradlew compileTestScala installDist

./gradlew -q sparkSubmit -Dmain=com.salesforce.hw.OpTitanicSimple -Dargs="\

`pwd`/src/main/resources/TitanicDataset/TitanicPassengersTrainData.csv"

It just doesn't work in IntelliJ. How can I fix this?


猛跑小猪
1 Answer

呼啦一阵风

Two dependencies are missing from your build.sbt: spark-mllib and spark-sql.

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)

This will clear the first block of errors.
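For reference, a complete build.sbt with the missing modules merged into the original settings might look like the sketch below (assuming you keep the same project name, Scala version, and Bintray resolver; spark-mllib is the module that pulls in the org.apache.spark.ml.linalg classes the compiler could not find):

```scala
// build.sbt (sketch) -- original settings plus the two missing Spark modules
name := "HelloScala"

version := "0.1"

scalaVersion := "2.11.8"

resolvers += Resolver.bintrayRepo("salesforce", "maven")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.3.1",
  // spark-mllib provides org.apache.spark.ml.linalg (Vector, etc.)
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  // spark-sql is needed by TransmogrifAI's DataFrame-based APIs
  "org.apache.spark" %% "spark-sql"   % "2.3.1",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)
```

After editing build.sbt, re-import the sbt project in IntelliJ (or run "Reload sbt project") so the IDE picks up the new classpath.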
