sbt-assembly not found when building Spark 0.5
I am trying to build the 0.5 branch of Spark, but it raises this error:
sbt.ResolveException: unresolved dependency: com.eed3si9n#sbt-assembly;0.8.3: not found
Hence, I downloaded the ivy files and jars manually from dl.bintray.com and put them into my local .ivy folder. To be specific, I created an sbt-assembly directory under com.eed3si9n, and I renamed the files as:
However, this does not work. What is the correct solution?
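For reference, the layout I was aiming for under my local .ivy folder is roughly the following; the scala_/sbt_ path segments are my guess at sbt's plugin cross-version convention for sbt 0.11.3 on Scala 2.9.1, and the exact file names may differ:
~/.ivy2/local/
  com.eed3si9n/
    sbt-assembly/
      scala_2.9.1/          # guessed Scala cross-version segment
        sbt_0.11.3/         # guessed sbt cross-version segment
          0.8.3/
            ivys/ivy.xml
            jars/sbt-assembly.jar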
scala apache-spark sbt
asked Nov 22 at 6:38
chenzhongpu
1 Answer
Spark branch-0.5 uses sbt 0.11.3 according to project/build.properties, so it's pretty old.
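For reference, that file presumably just pins the launcher version, along these lines (paraphrased rather than quoted from the branch):
# project/build.properties on branch-0.5 (paraphrased)
sbt.version=0.11.3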
sbt community repository location
There's a bug in project/plugins.sbt. It points to scalasbt.artifactoryonline.com, but it should point to repo.scala-sbt.org.
$ git diff
diff --git a/project/plugins.sbt b/project/plugins.sbt
index 63d789d0c1..70dcfdba00 100644
--- a/project/plugins.sbt
+++ b/project/plugins.sbt
@@ -1,7 +1,7 @@
resolvers ++= Seq(
"sbt-idea-repo" at "http://mpeltonen.github.com/maven/",
Classpaths.typesafeResolver,
- Resolver.url("sbt-plugin-releases", new URL("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
+ Resolver.url("sbt-plugin-releases", new URL("http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
)
JDK 1.6
To run an older version of sbt, it's necessary to use an older JDK, in this case JDK 1.6. On macOS, however, there's an issue with JLine under JDK 1.6, so I had to disable JLine.
$ jenv shell 1.6
$ java -version
java version "1.6.0_65"
...
$ sbt/sbt -Djline.terminal=jline.UnsupportedTerminal
This should get the sbt shell started. Once it comes up, type in:
> package
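If you just want to confirm that the resolver fix took effect before compiling anything, you can also run update from the same shell first; the unresolved sbt-assembly error should be gone:
> update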
answered Nov 22 at 7:38
Eugene Yokota
Thanks. Since the source code of the 0.5 branch is much smaller in scale, I think it is good to read the source code based on that branch. BTW, how can I determine the correct JDK version for different versions of Spark? Does it mainly depend on the Scala version used?
– chenzhongpu, Nov 22 at 8:57
Basically. You can sort of guess the stable JDK version based on the date. JDK 1.7 came out in 2011, so maybe it works too, but JDK 1.6 would be a safer bet.
– Eugene Yokota, Nov 22 at 10:18