
In a Scala project, how do I add dependency jar details to the manifest file?


I am new to Scala and am writing a simple file read/write program for AWS S3 in Scala. With Maven, the configuration below is used to add the dependency jar details (a `Class-Path` entry) to the manifest file. I need the equivalent command/configuration for a Scala (sbt) project to add the jar details to the manifest file.

<manifest>
     <addClasspath>true</addClasspath>
     <classpathPrefix>lib/</classpathPrefix>
</manifest>

build.sbt

enablePlugins(PackPlugin)
name := "FileReadAWS"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.scala-lang" % "scala-library" % scalaVersion.value
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.4"
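In sbt, the effect of Maven's `addClasspath`/`classpathPrefix` can be approximated by appending a `Class-Path` attribute to the jar's manifest via `packageOptions`. A sketch, assuming sbt 1.x slash syntax and that the dependency jars end up under `lib/` next to the project jar (which matches the `sbt pack` layout); this is a plain sbt setting, not an sbt-pack feature:

```scala
// build.sbt (addition): write a Class-Path entry into the jar manifest,
// listing each dependency jar under the lib/ prefix.
Compile / packageBin / packageOptions += Package.ManifestAttributes(
  java.util.jar.Attributes.Name.CLASS_PATH ->
    (Compile / dependencyClasspath).value
      .map(entry => "lib/" + entry.data.getName) // entry.data is the jar File
      .mkString(" ")
)
```

After re-running `sbt pack`, the project jar's `META-INF/MANIFEST.MF` should contain a `Class-Path` line alongside `Main-Class`.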

/project/plugin.sbt

addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.13")

When I execute the `sbt pack` command, it creates the project jar and adds/downloads the dependency jars into `<project location>\target\pack\lib`.

The lib folder also contains the project jar; its manifest file has the main class details, but no dependency jar details.

Manifest-Version: 1.0
Implementation-Title: FileReadAWS
Implementation-Version: 0.1
Specification-Vendor: default
Specification-Title: FileReadAWS
Implementation-Vendor-Id: default
Specification-Version: 0.1
Implementation-Vendor: default
Main-Class: spark.file.io.FileReader
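For reference, a manifest like the one above can be built and inspected with the JDK's `java.util.jar` API. A self-contained sketch that round-trips an in-memory jar; the `Class-Path` value here is illustrative, not what sbt would actually generate:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import java.util.jar.{Attributes, JarInputStream, JarOutputStream, Manifest}

object ManifestCheck {
  // Build a jar in memory with Main-Class and Class-Path attributes,
  // then read the manifest back, as one would to verify a packaged jar.
  def roundTripManifest(): Attributes = {
    val mf   = new Manifest()
    val main = mf.getMainAttributes
    // Manifest-Version must be set, or the attributes are not written out
    main.put(Attributes.Name.MANIFEST_VERSION, "1.0")
    main.put(Attributes.Name.MAIN_CLASS, "spark.file.io.FileReader")
    // Illustrative value; a real build would list all dependency jars
    main.put(Attributes.Name.CLASS_PATH,
      "lib/spark-core_2.11-2.2.1.jar lib/hadoop-aws-2.7.4.jar")

    val buf = new ByteArrayOutputStream()
    new JarOutputStream(buf, mf).close() // writes META-INF/MANIFEST.MF
    val in = new JarInputStream(new ByteArrayInputStream(buf.toByteArray))
    try in.getManifest.getMainAttributes finally in.close()
  }

  def main(args: Array[String]): Unit = {
    val attrs = roundTripManifest()
    println("Main-Class: " + attrs.getValue("Main-Class"))
    println("Class-Path: " + attrs.getValue("Class-Path"))
  }
}
```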