
librarymanagement's Introduction

librarymanagement module for sbt

scala> import java.io.File
import java.io.File

scala> import sbt.librarymanagement._, syntax._
import sbt.librarymanagement._
import syntax._

scala> val log = sbt.util.LogExchange.logger("test")
log: sbt.internal.util.ManagedLogger = sbt.internal.util.ManagedLogger@c439b0f

scala> val lm = {
         import sbt.librarymanagement.ivy._
         val ivyConfig = InlineIvyConfiguration().withLog(log)
         IvyDependencyResolution(ivyConfig)
       }
lm: sbt.librarymanagement.DependencyResolution = sbt.librarymanagement.DependencyResolution@6a9b40f8

scala> val module = "commons-io" % "commons-io" % "2.5"
module: sbt.librarymanagement.ModuleID = commons-io:commons-io:2.5

scala> lm.retrieve(module, scalaModuleInfo = None, new File("target"), log)
res0: Either[sbt.librarymanagement.UnresolvedWarning,Vector[java.io.File]] = Right(Vector(target/jars/commons-io/commons-io/commons-io-2.5.jar, target/jars/commons-io/commons-io/commons-io-2.5.jar, target/jars/commons-io/commons-io/commons-io-2.5.jar))
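
The result is an Either: a Left carries an UnresolvedWarning, a Right the retrieved files. A minimal sketch of consuming it, with the sbt-specific types stood in by plain strings so the snippet is self-contained:

```scala
// Sketch: fold the Either returned by lm.retrieve. In real code the type is
// Either[UnresolvedWarning, Vector[java.io.File]].
def classpathOrEmpty(result: Either[String, Vector[String]]): Vector[String] =
  result.fold(
    warning => { Console.err.println(s"resolution failed: $warning"); Vector.empty },
    identity
  )
```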

librarymanagement's People

Contributors

adpi2, adriaanm, ajsquared, andreatp, armanbilge, bigwheel, briantopping, dpratt, duhemm, dwijnand, eed3si9n, gkossakowski, harrah, havocp, indrajitr, izharahmd, jedesah, jsuereth, jvican, mdedetrich, milessabin, olegych, pdalpra, rtyley, ruippeixotog, smarter, swaldman, tanishiking, willb, xuwei-k


librarymanagement's Issues

Allow users to define the changing pattern in ivy

The notion of SNAPSHOT may vary, as sbt-dynver's versioning proves. Companies also have different notions of what a SNAPSHOT is, and the same semantics may apply to different changing patterns.

Currently, librarymanagement special-cases -SNAPSHOT as the changing pattern. However, I think this pattern should be exposed to users so that they can modify it if they want.
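
Ivy itself already exposes this knob per resolver; a hypothetical ivysettings.xml fragment (resolver name, root and pattern are illustrative) might look like:

```xml
<!-- Hypothetical: treat any *-nightly revision as changing, not just -SNAPSHOT -->
<resolvers>
  <ibiblio name="company-snapshots" m2compatible="true"
           root="https://repo.example.com/snapshots/"
           changingPattern=".*-nightly" checkmodified="true"/>
</resolvers>
```

Exposing an equivalent setting through librarymanagement would let builds override the hard-coded -SNAPSHOT special case.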

False positive on eviction warning for sbt 1.x modules

[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] 	* org.scala-sbt:io_2.12:1.1.1 is selected over 1.0.2
[warn] 	    +- org.scala-sbt:command_2.12:1.0.3-SNAPSHOT          (depends on 1.1.1)
[warn] 	    +- org.scala-sbt:zinc-classpath_2.12:1.1.0-RC1        (depends on 1.1.1)
[warn] 	    +- org.scala-sbt:completion_2.12:1.0.3-SNAPSHOT       (depends on 1.1.1)
[warn] 	    +- org.scala-sbt:librarymanagement-core_2.12:1.0.4    (depends on 1.0.2)
[warn] 	    +- org.scala-sbt:util-cache_2.12:1.0.3                (depends on 1.0.2)
[warn] 	* org.scala-sbt:util-logging_2.12:1.1.0 is selected over 1.0.3
[warn] 	    +- org.scala-sbt:librarymanagement-core_2.12:1.0.4    (depends on 1.0.3)
[warn] 	    +- org.scala-sbt:command_2.12:1.0.3-SNAPSHOT          (depends on 1.0.3)
[warn] 	    +- org.scala-sbt:protocol_2.12:1.0.3-SNAPSHOT         (depends on 1.0.3)
[warn] 	* org.scala-sbt:util-position_2.12:1.1.0 is selected over 1.0.3
[warn] 	    +- org.scala-sbt:librarymanagement-core_2.12:1.0.4    (depends on 1.0.3)
[warn] 	    +- org.scala-sbt:collections_2.12:1.0.3-SNAPSHOT      (depends on 1.0.3)
[warn] Run 'evicted' to see detailed eviction warnings

IllegalArgumentException: compiler-bridge.jar not compiled when "sbt.ManagedChecksums" is null

I have a build using an externalIvySettings file, which, when first run with an uncompiled compiler-bridge_2.12:1.0.5 jar, will fail with the stack trace below:

[info] Attempting to fetch org.scala-sbt:compiler-bridge_2.12:1.0.5.
[error] ## Exception when compiling 41 sources to C:\Users\ian.gabriel\workspace\avalanche\play\target\scala-2.12\classes
[error] For input string: "null"
[error] scala.collection.immutable.StringLike.parseBoolean(StringLike.scala:327)
[error] scala.collection.immutable.StringLike.toBoolean(StringLike.scala:286)
[error] scala.collection.immutable.StringLike.toBoolean$(StringLike.scala:286)
[error] scala.collection.immutable.StringOps.toBoolean(StringOps.scala:29)
[error] sbt.internal.librarymanagement.IvyActions$.retrieve(IvyActions.scala:385)
[error] sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$6(IvyActions.scala:213)

It is pretty much the same bug as #110, just in a different location. I am using Scala 2.12.4 and sbt 1.0.4.

After adding <property name="sbt.managedChecksums" value="False"/> to my ivySettings.xml, I was able to work around this problem. I am not sure what the default value for this key should be when it is not present, but I don't believe it should be null : - )

Here is my unanswered Stack Overflow question:

https://stackoverflow.com/questions/48081406/externalivysettingsurl-with-sbt-1-0-4-play-framework

Thanks for your time!

Improve macro error message

steps

Use config inline.

problem

config must be directly assigned to a val, such as `val x = config`.

This is typically not how you define a config.

expectation

config must be directly assigned to a val, such as `val Tooling = config("tooling")`

CrossVersion.Disabled is not a subtype of CrossVersion

In sbt 0.13, CrossVersion.Disabled was a case object extending CrossVersion. In sbt 1 it's a contraband type, which means it's not a case object but a plain class, so CrossVersion.Disabled is an alias for its companion object, which is not a subtype of CrossVersion.
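
A consequence for builds migrating from 0.13: where a value of type CrossVersion is expected, the lowercase `disabled` accessor should be used instead of the companion object. A sketch, assuming the standard sbt 1.x API:

```scala
// build.sbt (sbt 1.x) sketch: `CrossVersion.disabled` is typed as CrossVersion,
// whereas the `CrossVersion.Disabled` companion object no longer is.
crossVersion := CrossVersion.disabled
```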

OfflineModeSpec is flaky

Just saw this fail in a completely unrelated PR, #221:

[info] OfflineModeSpec:
[info] Offline update configuration
[info] - should reuse the caches when offline is enabled *** FAILED ***
[info]   149 was not less than or equal to 147.29999999999998 Offline resolution took more than 15% of normal resolution's running time. (OfflineModeSpec.scala:50)

(link: https://travis-ci.org/sbt/librarymanagement/jobs/355537630, but I'm going to restart it)

"Choosing local for com.example#x_2.12;a.b.c-SNAPSHOT" warning

steps

Use SNAPSHOTs in the graph.

problem

[warn] Choosing local for com.example#x_2.12;a.b.c-SNAPSHOT

The warning is not really clear or actionable.

expectations

Clarify wording to say something like:

[warn] Resolving a snapshot version. It's going to be slow unless you use `updateOptions := updateOptions.value.withLatestSnapshots(false)` options.
[info] Out of 2 candidates we found for com.example#x_2.12;a.b.c-SNAPSHOT in local and foo, we are choosing local.
[info] Out of 2 candidates we found for com.example#y_2.12;a.b.c-SNAPSHOT in local and foo, we are choosing local.

Unimplemented JsonFormats are a potential runtime bomb

[warn] /d/sbt-lm/librarymanagement/src/main/scala/sbt/internal/librarymanagement/formats/DependencyResolverFormat.scala:7: dead code following this construct
[warn]   implicit lazy val DependencyResolverFormat: JsonFormat[DependencyResolver] = ???
[warn]                                                                                ^
[warn] /d/sbt-lm/librarymanagement/src/main/scala/sbt/internal/librarymanagement/formats/GlobalLockFormat.scala:7: dead code following this construct
[warn]   implicit lazy val GlobalLockFormat: JsonFormat[GlobalLock] = ???
[warn]                                                                ^
[warn] /d/sbt-lm/librarymanagement/src/main/scala/sbt/internal/librarymanagement/formats/LoggerFormat.scala:7: dead code following this construct
[warn]   implicit lazy val LoggerFormat: JsonFormat[Logger] = ???
[warn]                                                        ^

These types are transitively required by datatypes we define with contraband. At any point there could be code that compiles that then explodes at runtime when it hits these code paths.

Inconsistent log level of eviction warning summary

Summary

The eviction warning summary is shown at [warn] log level even when only binary compatible conflicts are present.

Detail

Eviction warning summary is:

[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.

This is introduced in sbt 1.2.0:

Displays only the eviction warning summary by default, and make it configurable using ThisBuild / evictionWarningOptions. lm211 and #3947 by @exoego

My test

I tested the update and evicted commands in sbt 1.1.6 and 1.2.8.

source code to reproduce: https://github.com/bigwheel/sbt-warn-eviction-summary-inconsistency-behavior

Result:

|                                                  | 1.1.6, b-compatible | 1.1.6, b-incompatible | 1.2.8, b-compatible | 1.2.8, b-incompatible |
|--------------------------------------------------|---------------------|-----------------------|---------------------|-----------------------|
| Eviction warning summary shows in update command | no                  | no                    | yes                 | yes                   |
| Log level of evicted command                     | info                | warn                  | info                | warn                  |

Problem

I understand that the evicted command prints binary compatible conflicts at [info] level and binary incompatible conflicts at [warn] level, because binary compatible conflicts are not a problem in many cases.
However, when there are only binary compatible conflicts, the eviction warning summary is still shown at [warn] level, in contrast to the evicted command, which prints only at [info] level.

Solution

I can see a few possible solutions.

1. Show the eviction warning summary only when there are binary incompatible conflicts

https://github.com/sbt/librarymanagement/blob/96a3293c/core/src/main/scala/sbt/librarymanagement/EvictionWarning.scala#L317

if ((a.options.warnEvictionSummary || a.reportedEvictions.nonEmpty) && a.allEvictions.nonEmpty) {

to

if (a.options.warnEvictionSummary && a.reportedEvictions.nonEmpty) {

I think this is the best fix.
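
The behavioral difference between the two guards can be sketched with plain booleans standing in for the EvictionWarning fields (hypothetical helper names):

```scala
// Current guard: warn when the summary option is on OR something was reported,
// as long as any eviction exists at all.
def currentGuard(warnSummary: Boolean, reportedNonEmpty: Boolean, allNonEmpty: Boolean): Boolean =
  (warnSummary || reportedNonEmpty) && allNonEmpty

// Proposed guard: warn only when the option is on AND a binary-incompatible
// eviction was actually reported.
def proposedGuard(warnSummary: Boolean, reportedNonEmpty: Boolean): Boolean =
  warnSummary && reportedNonEmpty

// The divergent case is compatible-only conflicts: evictions exist but none is
// reported as incompatible. The current guard still warns; the proposed one
// stays quiet.
```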

2. Show the eviction warning summary at info level

This is not an easy fix, because it seems we cannot select the log level there.

3. Change default EvictionWarningOptions from summary to empty

Easy fix.

Conclusion

Please tell me how to fix this.
After that, I will create a pull request for it.

Related to

scala - addSbtPlugin(...) cause eviction warning but 'evicted' command shows nothing - Stack Overflow

URLHandlerRegistry conflict with fm-sbt-s3-resolver plugin in SBT 1.0

I'm trying to track down why fm-sbt-s3-resolver publishing is broken in SBT 1.0: tpunder/fm-sbt-s3-resolver#38

It seems that my URLHandlerRegistry.setDefault call:

https://github.com/frugalmechanic/fm-sbt-s3-resolver/blob/ab0188fd3439ee0346c173efae8b4ca462cbea2e/src/main/scala/fm/sbt/S3ResolverPlugin.scala#L98-L113

might be getting wiped out by a URLHandlerRegistry.setDefault call here:

if (configuration.updateOptions.gigahorse) URLHandlerRegistry.setDefault(gigahorseUrlHandler)
else URLHandlerRegistry.setDefault(basicUrlHandler)

Does this seem plausible? I've verified this is the problem.

Match errors due to changes in MavenRepository and MavenCache class hierarchy

I'm getting these when running scripted in sbt:

[info] scala.MatchError: cache:publish-m2-local: /Users/dnw/.m2/repository (of class sbt.librarymanagement.MavenCache)
[info] 	at sbt.internal.librarymanagement.ConvertResolver$$anonfun$defaultConvert$1.applyOrElse(ConvertResolver.scala:117)
[info] 	at sbt.internal.librarymanagement.ConvertResolver$$anonfun$defaultConvert$1.applyOrElse(ConvertResolver.scala:115)
[info] 	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
[info] 	at sbt.internal.librarymanagement.ConvertResolver$.apply(ConvertResolver.scala:112)
[info] 	at sbt.internal.librarymanagement.IvySbt$$anonfun$mapResolvers$1$1.apply(Ivy.scala:279)
[info] 	at sbt.internal.librarymanagement.IvySbt$$anonfun$mapResolvers$1$1.apply(Ivy.scala:279)
[info] 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[info] 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[info] 	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
[info] 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
[info] 	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
[info] 	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
[info] 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
[info] 	at scala.collection.AbstractTraversable.map(Traversable.scala:104)
[info] 	at sbt.internal.librarymanagement.IvySbt$.mapResolvers$1(Ivy.scala:279)

The reason is that in the current (85e7cf6) version of lm MavenCache is no longer a subclass of MavenRepository, which is what defaultConvert is expecting.

Here's the class hierarchy in v0.13.13:

/** An instance of a remote maven repository.  Note:  This will use Aether/Maven to resolve artifacts. */
sealed case class MavenRepository(name: String, root: String) extends Resolver {
  override def toString = s"$name: $root"
  def isCache: Boolean = false
  def localIfFile: Boolean = true
  def withLocalIfFile(value: Boolean) = new MavenRepository(name, root) { override def localIfFile = value }
}

/**
 * An instance of maven CACHE directory.  You cannot treat a cache directory the same as a a remote repository because
 * the metadata is different (see Aether ML discussion).
 */
final class MavenCache(name: String, val rootFile: File) extends MavenRepository(name, rootFile.toURI.toURL.toString) {
  override val toString = s"cache:$name: ${rootFile.getAbsolutePath}"
  override def isCache: Boolean = true
}
object MavenCache {
  def apply(name: String, rootFile: File): MavenCache = new MavenCache(name, rootFile)
}

The class hierarchy that @Duhemm created:

        {
          "name": "IMavenRepository",
          "namespace": "sbt.librarymanagement",
          "target": "Scala",
          "type": "interface",
          "doc": "An instance of a remote maven repository.  Note:  This will use Aether/Maven to resolve artifacts.",
          "fields": [
            { "name": "root",        "type": "String"                                       },
            { "name": "localIfFile", "type": "boolean", "default": "true", "since": "0.0.1" }
          ],
          "types": [
            {
              "name": "MavenRepository",
              "namespace": "sbt.librarymanagement",
              "target": "Scala",
              "type": "record",
              "toString": "s\"$name: $root\""
            },
            {
              "name": "MavenCache",
              "namespace": "sbt.librarymanagement",
              "target": "Scala",
              "type": "record",
              "doc": [
                "An instance of maven CACHE directory.  You cannot treat a cache directory the same as a a remote repository because",
                "the metadata is different (see Aether ML discussion)."
              ],
              "fields": [
                { "name": "rootFile", "type": "java.io.File" }
              ],
              "extra": "def this(name: String, rootFile: java.io.File) = this(name, rootFile.toURI.toURL.toString, true, rootFile)",
              "toString": "s\"cache:$name: ${rootFile.getAbsolutePath}\"",
              "extraCompanion": "def apply(name: String, rootFile: java.io.File): MavenCache = new MavenCache(name, rootFile)"
            }
          ]
        },

i.e. an IMavenRepository parent abstract class, with MavenRepository and MavenCache subclasses

There are 3 options:

  1. Keep this hierarchy, change all the code that expected MavenRepository to now expect IMavenRepository.
  2. Make MavenRepository the parent abstract class, and MavenRepo/MavenCache the leaves, change all invocations of new MavenRepository(.. to either new MavenRepo(.. or MavenRepository(..
  3. Model it in Scala, not in JSON, creating a concrete sealed MavenRepository class and a leaf final MavenCache class.

@eed3si9n what do you prefer?

jsch and being intransitive

We're currently getting the following warning:

[warn] Found intransitive dependency (com.jcraft:jsch:0.1.46) while publishMavenStyle is true, but Maven repositories
[warn]   do not support intransitive dependencies. Use exclusions instead so transitive dependencies
[warn]   will be correctly excluded in dependent projects.

I don't know the history here. Looks like jsch's only (non-test) dependency is:

<dependency>
  <groupId>com.jcraft</groupId>
  <artifactId>jzlib</artifactId>
  <version>1.0.7</version>
  <optional>true</optional>
</dependency>

We're now configuring sbt (modules) to publish to Maven, so it looks like we have to change how we do the same thing. Except it's not clear to me exactly how to transform intransitive() by "using exclusions".
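
One reading of "use exclusions instead" is to drop intransitive() and instead exclude jsch's single optional dependency explicitly. A build.sbt sketch (untested against this build; coordinates taken from the pom above):

```scala
// Sketch: exclude jzlib rather than marking jsch intransitive, so the
// published pom remains valid for Maven consumers.
libraryDependencies += ("com.jcraft" % "jsch" % "0.1.46")
  .exclude("com.jcraft", "jzlib")
```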

Clarify licensing

Hi,

I'd like to package sbt-librarymanagement in my distro (as a dependency of sbt). We only package free software, and you don't seem to declare a license. I have seen the NOTICE file in the core directory, so I assume the source present in that directory is under the ASL 2.0 license.

I'm not sure if I need them yet, but none of the other directories declare a license. There doesn't seem to be a license declaration in source files either. As such, I can only assume the rest of this repository is non-free by default. Could you clarify the terms of the license for building, using and distributing the rest of this repo?

Thank you!

The `config()` macro should validate the ID it captures from the enclosing `val`

The config() macro captures the name of the enclosing val to determine the "id" of the configuration. In this snippet:

val fooBar = config("foo-bar")

it will capture the id fooBar. The macro does not complain.

Instead, at run-time (I mean, build load time), I get the following obscure error:

...
[debug]   Load.loadUnit(file:/localhome/doeraene/projects/scalajs/sbt-plugin-test/referencedCrossProject/, ...) took 5034.22581ms

[debug] Load.apply: load took 19346.912803ms

[debug] Load.apply: resolveProjects took 0.660934ms

[debug] Load.apply: finalTransforms took 140.090565ms

[debug] Load.apply: config.delegates took 1.482166ms

[error] java.lang.IllegalArgumentException: requirement failed: id must be capitalized: scalaJSProjectBaseSettings

[error]         at scala.Predef$.require(Predef.scala:277)

[error] java.lang.IllegalArgumentException: requirement failed: id must be capitalized: scalaJSProjectBaseSettings

[error] Use 'last' for the full log.

[debug] > Exec(load-failed, None, None)

[debug] > Exec(last, None, None)

Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q

Note that this is the output with last. In my case scalaJSProjectBaseSettings is the fooBar, and it was very obscure because the config() call was deep within the rhs of scalaJSProjectBaseSettings (which was a Seq[Setting[_]]).

The config() macro should really validate the id at compile-time. There is no point in delaying this error until run-time, when there is not even a stack trace to figure out what's wrong.

Cannot use FileCredentials - xxx not specified in credentials file

Hi,
hope you can help me to understand the following behaviour

When I use sbt I've got a wall of messages like these:

[warn] realm not specified in credentials file: C:\Users\<user>\.ivy2\.credentials
[warn] host not specified in credentials file: C:\Users\<user>\.ivy2\.credentials
[warn] user not specified in credentials file: C:\Users\<user>\.ivy2\.credentials
[warn] password not specified in credentials file: C:\Users\<user>\.ivy2\.credentials

I am using a file for the credentials

credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")

.credentials looks like

realm=Some Nexus Repository Manager
host=<host>
user=<nexus-user>
password=<nexus-user-pass>

I found the following in the code used for sbt 0.13.x and also for 1.x, and I ask myself:
which properties will be read? Is there something implicit?
Line 70 in the Credentials.scala file:

private[this] def read(from: File): Map[String, String] = {
  val properties = new java.util.Properties
  IO.load(properties, from)
  properties.asScala.map { case (k, v) => (k.toString, v.toString.trim) }.toMap
}

My first thought on this code was: what about using the RealmKeys, HostKeys, UserKeys and PasswordKeys defined a few lines above as the properties to read from the file?

Do FileCredentials really work or is something broken with my configuration?
If I use Credentials("Some Nexus Repository Manager", "<host>", "<nexus-user>", "<nexus-user-pass>") everything works fine.
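
For reference, the file is parsed with java.util.Properties, so each line must be a plain `key=value` pair using exactly the keys realm, host, user and password. A self-contained sketch (not the actual Credentials code) that checks a file the same way:

```scala
import java.io.{File, FileInputStream}
import java.util.Properties
import scala.jdk.CollectionConverters._

// Load a credentials file via java.util.Properties (as sbt does) and report
// which of the expected keys are missing, mirroring the warnings above.
def missingKeys(file: File): Seq[String] = {
  val props = new Properties
  val in = new FileInputStream(file)
  try props.load(in) finally in.close()
  val loaded = props.stringPropertyNames().asScala
  Seq("realm", "host", "user", "password").filterNot(loaded.contains)
}
```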

Thanks in advance,
Dennis

Race condition due to parallel download

@jameskoch2 wrote in #90 (comment):

It looks to me like there's a small amount of mutable state inside most stock Resolvers' download() methods, by virtue of their reliance upon BasicResolver.download.

I haven't seen this cause any actual issues in my testing, but thought I'd mention it. It looks like this would only impact logging of failed download attempts (based on quick read of unfamiliar code).

There is in fact odd behavior, reported by @cunei, where a fresh ivy cache plus a repository override (missing some parent poms) causes an unexplained NPE.

Update README to match 1.0

README is outdated and mentions sbt 0.13. It could also be clearer about what this sbt module contains, and be more welcoming to contributors.

NullPointerException in GigahorseUrlHandler : 54 (Content Type null)

Hi,

SBT version 1.0.2.

I'm getting NullPointerExceptions in GigahorseUrlHandler.scala line 54:

            BasicURLHandler.getCharSetFromContentType(response.body().contentType().toString)

It assumes that the content type will ALWAYS be present. However, when performing HEAD requests, okhttp does NOT return content type information: contentType is null. You can run this to verify:

import java.net.URL
import okhttp3.{OkHttpClient, Request}

object test {
  def main(args: Array[String]) = {
    val okHttpClient = new OkHttpClient()

    val url = new URL("http://your.url.com/");

    val head = new Request.Builder().url(url).head()
    val responseHead = okHttpClient.newCall(head.build()).execute()
    // Will be null
    println("Content Type for HEAD =" + responseHead.body().contentType())

    val get = new Request.Builder().url(url).get()
    val responseGet = okHttpClient.newCall(get.build()).execute()
    println("Content Type for GET =" + responseGet.body().contentType())
  }
}
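
A null guard along these lines would avoid the NPE. This is a sketch, not the actual GigahorseUrlHandler fix; the fallback charset is an assumption modeled on Ivy's BasicURLHandler default:

```scala
// Tolerate a null contentType (as okhttp returns for HEAD responses) instead
// of calling .toString on it unconditionally.
def charsetOf(contentType: String /* may be null */): String =
  Option(contentType)
    .flatMap(_.split(';').map(_.trim).collectFirst {
      case param if param.toLowerCase.startsWith("charset=") =>
        param.drop("charset=".length)
    })
    .getOrElse("ISO-8859-1") // assumed fallback, as in Ivy's BasicURLHandler
```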

Sincerely,
Teofilis Martisius

Access Denied error could have better error message

Hi, I get this error when we don't have the password configured for my company's Nexus server; sbt then keeps retrying. Adding the password resolves the problem.

[error] Server access Error: Too many follow-up requests: 21 url=https://nexus.mycompany.com/nexus/content/groups/public/org/slf4j/slf4j-parent/1.7.23/slf4j-parent-1.7.23.jar

If I curl the server, it definitely returns HTTP/1.1 401 Unauthorized.

I think retrying makes sense for other http errors but not the 401. In this case it would be better if the error clearly said access denied and maybe prompted the user to configure a password. This would help other users debug their problems.

I think the error comes from here, and we could add another case. https://github.com/sbt/librarymanagement/blob/1.x/ivy/src/main/scala/sbt/internal/librarymanagement/ivyint/GigahorseUrlHandler.scala#L65

PS. I'd be interested in contributing to this, but I was unable to import the project into IntelliJ; I get unresolved dependency: com.thoughtworks.paranamer#paranamer;2.8.

Explore the use of `force` in resolver

From the ivy docs:

Any standard resolver can be used in force mode, which is used mainly to handle local development builds. In force mode, the resolver attempts to find a dependency whatever the requested revision is (internally it replace the requested revision by 'latest.integration'), and if it finds one, it forces this revision to be returned, even when used in a chain with returnFirst=false.

By using such a resolver at the beginning of a chain, you can be sure that Ivy will pick up whatever module is available in this resolver (usually a private local build) instead of the real requested revision. This allows to handle use case like a developer working on modules A and C, where A -> B -> C, and pick up the local build for C without having to publish a local version of B.
since 2.0

It looks like this could be beneficial for some scenarios. I'm not sure what the status quo is.

Requests for nonexistent jar artifacts when packaging is pom

When working behind proxies, trying to resolve artifacts that do not exist can slow things down unbearably. Currently sbt v1.2.8 seems to request jar artifacts even when the pom.xml says packaging=pom, so resolution can stall for minutes trying to fetch jar files that the pom indicates are not there. Can we please fix this, or add a property that toggles it, so sbt can be more widely adopted?

ModuleResolversTest is flaky

Just saw a flaky test failure in a trivial PR, #226:

NORMAL RESOLUTION TIME 3003
FASTER RESOLUTION TIME 2716
[info] ModuleResolversTest:
[info] The direct resolvers in update options
[info] - should skip the rest of resolvers *** FAILED ***
[info]   2716 was not less than or equal to 2402.4 (ModuleResolversTest.scala:48)

Robustify ivy resolution

From pants' pantsbuild/pants#1779:

One common problem is that by default, ivy does not treat resolution of a pom and a jar as an atomic unit. To turn this on you need to add descriptor="required" to each resolver you want to act atomically; see the common resolver attributes doc. With this resolver attribute turned on, a failure to download a pom for a jar dependency will fail the resolve and cache nothing in the ivy cache. You simply re-run the resolve until success. Without the attribute, the pom download failure would be silent and then the jar might download successfully, but with no transitive deps walked. This state would be cached by ivy and further resolves of the jar would continue to fail to add in transitive deps (only quicker now since the failure is cached!).

It seems that we can make our ivy resolution more robust by setting the descriptor attribute to required (http://ant.apache.org/ivy/history/latest-milestone/settings/resolvers.html#common). At first sight, I see no issue with enabling this. It's been successfully used at Pants and the change makes sense.
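
A hypothetical settings fragment with the attribute turned on (resolver name illustrative):

```xml
<!-- Hypothetical: fail the whole resolve if the pom (descriptor) cannot be
     fetched, instead of silently caching a jar with no transitive deps -->
<ibiblio name="central" m2compatible="true" descriptor="required"/>
```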

This change is small. I am happy to do this, but I'd like to know the maintainers' opinion on this issue since they are the ones who are more familiar with ivy semantics.

lm is broken in 1.1.x

lm 1.1.x is currently broken.

It was fine in 4c5e1b9, failed in 67d97fe.

The commits that intervened in between are here: 4c5e1b9...67d97fe

The error message when calling lmCore/packageSrc is:

[info] Packaging /home/cunei/activities/clones/librarymanagement/core/target/scala-2.12/librarymanagement-core_2.12-1.1.3-sources.jar ...
[error] java.util.zip.ZipException: duplicate entry: sbt/librarymanagement/MakePomConfiguration.scala
[error] 	at java.util.zip.ZipOutputStream.putNextEntry(ZipOutputStream.java:232)
[error] 	at java.util.jar.JarOutputStream.putNextEntry(JarOutputStream.java:109)
[error] 	at sbt.io.IO$.addFileEntry$1(IO.scala:550)
[error] 	at sbt.io.IO$.$anonfun$writeZip$3(IO.scala:559)
[error] 	at sbt.io.IO$.$anonfun$writeZip$3$adapted(IO.scala:559)
[error] 	at scala.collection.Iterator.foreach(Iterator.scala:929)
[error] 	at scala.collection.Iterator.foreach$(Iterator.scala:929)
[error] 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
[error] 	at scala.collection.IterableLike.foreach(IterableLike.scala:71)
[error] 	at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
[error] 	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
[error] 	at sbt.io.IO$.writeZip(IO.scala:559)
[error] 	at sbt.io.IO$.$anonfun$archive$1(IO.scala:513)
[error] 	at sbt.io.IO$.$anonfun$archive$1$adapted(IO.scala:510)
[error] 	at sbt.io.IO$.$anonfun$withZipOutput$1(IO.scala:595)
[error] 	at sbt.io.IO$.$anonfun$withZipOutput$1$adapted(IO.scala:584)
[error] 	at sbt.io.Using.apply(Using.scala:22)
[error] 	at sbt.io.IO$.withZipOutput(IO.scala:584)
[error] 	at sbt.io.IO$.archive(IO.scala:510)
[error] 	at sbt.io.IO$.jar(IO.scala:489)
[error] 	at sbt.Package$.makeJar(Package.scala:121)
...

external ivy setting file causes IllegalArgumentException: For input string: "null"

steps

Run the scripted ivy-settings-b test on sbt/sbt.

problem

> update
[info] Updating {xxx}ivy-settings-b...
java.lang.IllegalArgumentException: For input string: "null"
	at scala.collection.immutable.StringLike.parseBoolean(StringLike.scala:327)
	at scala.collection.immutable.StringLike.toBoolean(StringLike.scala:286)
	at scala.collection.immutable.StringLike.toBoolean$(StringLike.scala:286)
	at scala.collection.immutable.StringOps.toBoolean(StringOps.scala:29)
	at sbt.internal.librarymanagement.ConvertResolver$$anonfun$defaultConvert$lzycompute$1.applyOrElse(ConvertResolver.scala:158)
	at sbt.internal.librarymanagement.ConvertResolver$$anonfun$defaultConvert$lzycompute$1.applyOrElse(ConvertResolver.scala:156)
	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:34)
	at sbt.internal.librarymanagement.ConvertResolver$.apply(ConvertResolver.scala:153)
	at sbt.internal.librarymanagement.IvySbt$.$anonfun$addResolvers$1(Ivy.scala:407)
	at sbt.internal.librarymanagement.IvySbt$.$anonfun$addResolvers$1$adapted(Ivy.scala:405)
	at scala.collection.Iterator.foreach(Iterator.scala:929)
	at scala.collection.Iterator.foreach$(Iterator.scala:929)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
	at scala.collection.IterableLike.foreach(IterableLike.scala:71)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at sbt.internal.librarymanagement.IvySbt$.addResolvers(Ivy.scala:405)
	at sbt.internal.librarymanagement.IvySbt.settings$lzycompute(Ivy.scala:97)
	at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$settings(Ivy.scala:85)
	at sbt.internal.librarymanagement.IvySbt.ivyLockFile$lzycompute(Ivy.scala:167)
	at sbt.internal.librarymanagement.IvySbt.ivyLockFile(Ivy.scala:167)
	at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:72)
	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:176)
	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:173)
	at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:201)
	at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:227)

notes

val managedChecksums = settings.getVariable(ManagedChecksums).toBoolean

The above can return null, so it needs to be guarded with Option.
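
A minimal sketch of the Option guard (hypothetical helper, not the actual fix):

```scala
// settings.getVariable can return null, so wrap it before calling toBoolean.
def variableAsBoolean(raw: String /* may be null */, default: Boolean = false): Boolean =
  Option(raw).fold(default)(_.toBoolean)
```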

Tests jars are no longer published on Maven

Since librarymanagement 1.0.X, no tests jar artifact is published to Maven repositories.

I used to extend some specs for unit tests when I used librarymanagement 0.1.X.
You can take a look at https://repo1.maven.org/maven2/org/scala-sbt/librarymanagement_2.10/0.1.0-M9/ , which has a tests jar published.

And now https://repo1.maven.org/maven2/org/scala-sbt/librarymanagement-core_2.12/1.2.2/ no longer has a tests jar.

Thanks for your help

InlineConfiguration is renamed to ModuleConfiguration but doesn't extend ModuleSettings

Hi,

While fixing the SettingsHelper to be compatible with 1.x, I was confused by the 1.0.0 release notes. They state:

InlineConfiguration is renamed to ModuleConfiguration

In the 0.13.x SettingsHelper I have this line

moduleSettings := InlineConfiguration(projectID.value, projectInfo.value, Seq.empty)

which I tried to turn into

moduleSettings := ModuleConfiguration(projectID.value, projectInfo.value)

The issue was that ModuleConfiguration doesn't extend ModuleSettings. However ModuleDescriptorConfiguration does and seems to work:

moduleSettings := ModuleDescriptorConfiguration(projectID.value, projectInfo.value)
          .withScalaModuleInfo(scalaModuleInfo.value)

I'm not familiar enough with the Ivy module implementation details to tell whether this is a documentation bug, an implementation bug, or whether I'm doing something wrong ;)

Expose a few APIs currently in internal

I'd like to refactor sbt-license-report to make a core license reporting library that can be used in other projects. It would need this librarymanagement library, but not the rest of sbt.

It currently uses these three APIs, which live in the internal package, so it'd be nice to move them out:

IvyRetrieve.toModuleID
ResolveException
IvySbt#Module

Fine tune gigahorse options

As of 1.0.0, the Gigahorse backend is enabled for all HTTP connections. We should have a look at the defaults in Gigahorse (http://eed3si9n.com/gigahorse/configuration.html) and fine-tune them for sbt resolution. At first sight they look sane, but I think we could optimize resolution a little more by tweaking some knobs and experimenting.

JDK9 Warning: Illegal reflective access

Hello.

I've installed JDK9 (MacOS) and using sbt 1.0.2 (via sbt-extras) I see the following when I start sbt:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by sbt.internal.librarymanagement.ivyint.ErrorMessageAuthenticator$ (file:/Users/richard/.sbt/boot/scala-2.12.3/org.scala-sbt/sbt/1.0.2/librarymanagement-ivy_2.12-1.0.2.jar) to field java.net.Authenticator.theAuthenticator
WARNING: Please consider reporting this to the maintainers of sbt.internal.librarymanagement.ivyint.ErrorMessageAuthenticator$
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

It doesn't cause me any problems but as requested in the output I'm "reporting this to the maintainers" (assuming I've reached the right place!).

Thank you.

"unresolved dependencies" should not log a stack trace

[warn] 	Note: Some unresolved dependencies have extra attributes.  Check that these dependencies exist with the requested attributes.
[warn] 		com.eed3si9n:sbt-buildinfo:0.6.1 (scalaVersion=2.12, sbtVersion=1.0)
[warn]
[warn] 	Note: Unresolved dependencies path:
[warn] 		com.eed3si9n:sbt-buildinfo:0.6.1 (scalaVersion=2.12, sbtVersion=1.0) (/Users/jz/code/scala/project/project/plugins.sbt#L1-2)
[warn] 		  +- default:scala-build-build:0.1.0-SNAPSHOT (scalaVersion=2.12, sbtVersion=1.0)
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.eed3si9n#sbt-buildinfo;0.6.1: not found
[error] 	at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:331)
[error] 	at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:205)
[error] 	at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:243)
[error] 	at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] 	at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] 	at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] 	at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] 	at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] 	at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] 	at xsbt.boot.Using$.withResource(Using.scala:10)
[error] 	at xsbt.boot.Using$.apply(Using.scala:9)
[error] 	at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] 	at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] 	at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] 	at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] 	at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] 	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] 	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] 	at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:242)
[error] 	at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:190)
[error] 	at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] 	at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] 	at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:46)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:99)
[error] 	at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:112)
[error] 	at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:112)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:95)
[error] 	at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] 	at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:126)
[error] 	at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2383)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] 	at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] 	at sbt.Execute.work(Execute.scala:271)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] 	at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException:

This is an everyday error, and the user is not interested in a stack trace of sbt internals.

Fix for full SemVer

Add a test for 1.1.0-M1; it should be compatible with 1.0.

  /**
   * Returns sbt binary interface x.y API compatible with the given version string v.
   * RCs for x.y.0 are considered API compatible.
   * Compatible versions include 0.12.0-1 and 0.12.0-RC1 for Some(0, 12).
   */
  private[sbt] def sbtApiVersion(v: String): Option[(Long, Long)] = v match {
    case ReleaseV(x, y, _, _)                     => Some(sbtApiVersion(x.toLong, y.toLong))
    case CandidateV(x, y, _, _)                   => Some(sbtApiVersion(x.toLong, y.toLong))
    case NonReleaseV_n(x, y, z, _) if z.toInt > 0 => Some(sbtApiVersion(x.toLong, y.toLong))
    case _                                        => None
  }
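To make the doc-comment examples concrete, here is a self-contained sketch of the rule. The regexes below are illustrative stand-ins for sbt's `ReleaseV` / `CandidateV` / `NonReleaseV_n` patterns (which are not shown above), and the result is simplified to return `(x, y)` directly rather than going through the two-argument `sbtApiVersion` helper:

```scala
// Sketch only: simplified regexes and mapping, matching the doc comment's
// examples ("0.12.0-1" and "0.12.0-RC1" are compatible with (0, 12)).
object SbtApiVersionSketch {
  private val ReleaseV    = """(\d+)\.(\d+)\.(\d+)(-\d+)?""".r   // e.g. 0.12.0, 0.12.0-1
  private val CandidateV  = """(\d+)\.(\d+)\.(\d+)(-RC\d+)""".r  // e.g. 0.12.0-RC1
  private val NonReleaseV = """(\d+)\.(\d+)\.(\d+)(-\w+)""".r    // e.g. 0.12.1-M1

  def sbtApiVersion(v: String): Option[(Long, Long)] = v match {
    case ReleaseV(x, y, _, _)                   => Some((x.toLong, y.toLong))
    case CandidateV(x, y, _, _)                 => Some((x.toLong, y.toLong))
    case NonReleaseV(x, y, z, _) if z.toInt > 0 => Some((x.toLong, y.toLong))
    case _                                      => None
  }
}
```

With this sketch, RCs for x.y.0 are accepted, while other pre-releases are only accepted when the patch number is greater than zero, as in the guard above.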

cached resolution: mockito tests broken

I have a set of tests that use mockito, which work just fine on sbt 1.2.4 and lower. Merely bumping the sbt version to 1.2.6, 1.2.7, or 1.2.8 makes the tests fail with the following stack trace:

[info] com.telefonica.baikal.i18n.provisioners.PiScopeI18nProvisionerTest *** ABORTED ***
[info]   java.lang.NoSuchMethodError: org.mockito.internal.invocation.ArgumentsProcessor.expandArgs(Lorg/mockito/internal/invocation/MockitoMethod;[Ljava/lang/Object;)[Ljava/lang/Object;
[info]   at org.mockito.internal.invocation.InterceptedInvocation.<init>(InterceptedInvocation.java:44)
[info]   at org.mockito.internal.invocation.DefaultInvocationFactory.createInvocation(DefaultInvocationFactory.java:42)
[info]   at org.mockito.internal.creation.bytebuddy.MockMethodInterceptor.doIntercept(MockMethodInterceptor.java:63)
[info]   at org.mockito.internal.creation.bytebuddy.MockMethodInterceptor.doIntercept(MockMethodInterceptor.java:49)
[info]   at org.mockito.internal.creation.bytebuddy.MockMethodInterceptor$DispatcherDefaultingToRealMethod.interceptAbstract(MockMethodInterceptor.java:128)
[info]   at com.telefonica.baikal.i18n.services.I18nTermsService$MockitoMock$1879364830.publish(Unknown Source)
[info]   at com.telefonica.baikal.i18n.provisioners.PiScopeI18nProvisionerTest.$anonfun$new$1(PiScopeI18nProvisionerTest.scala:39)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   ...
[error] Uncaught exception when running com.telefonica.baikal.i18n.provisioners.PiScopeI18nProvisionerTest: java.lang.NoSuchMethodError: org.mockito.internal.invocation.ArgumentsProcessor.expandArgs(Lorg/mockito/internal/invocation/MockitoMethod;[Ljava/lang/Object;)[Ljava/lang/Object;
[error] sbt.ForkMain$ForkError: java.lang.NoSuchMethodError: org.mockito.internal.invocation.ArgumentsProcessor.expandArgs(Lorg/mockito/internal/invocation/MockitoMethod;[Ljava/lang/Object;)[Ljava/lang/Object;
[error]         at org.mockito.internal.invocation.InterceptedInvocation.<init>(InterceptedInvocation.java:44)
[error]         at org.mockito.internal.invocation.DefaultInvocationFactory.createInvocation(DefaultInvocationFactory.java:42)
[error]         at org.mockito.internal.creation.bytebuddy.MockMethodInterceptor.doIntercept(MockMethodInterceptor.java:63)
[error]         at org.mockito.internal.creation.bytebuddy.MockMethodInterceptor.doIntercept(MockMethodInterceptor.java:49)
[error]         at org.mockito.internal.creation.bytebuddy.MockMethodInterceptor$DispatcherDefaultingToRealMethod.interceptAbstract(MockMethodInterceptor.java:128)
[error]         at com.telefonica.baikal.i18n.services.I18nTermsService$MockitoMock$1879364830.publish(Unknown Source)
[error]         at com.telefonica.baikal.i18n.provisioners.PiScopeI18nProvisionerTest.$anonfun$new$1(PiScopeI18nProvisionerTest.scala:39)
[error]         at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[error]         at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[error]         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[error]         at org.scalatest.Transformer.apply(Transformer.scala:22)
[error]         at org.scalatest.Transformer.apply(Transformer.scala:20)
[error]         at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1682)
[error]         at org.scalatest.TestSuite.withFixture(TestSuite.scala:196)
[error]         at org.scalatest.TestSuite.withFixture$(TestSuite.scala:195)
[error]         at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1685)
[error]         at org.scalatest.FlatSpecLike.invokeWithFixture$1(FlatSpecLike.scala:1680)
[error]         at org.scalatest.FlatSpecLike.$anonfun$runTest$1(FlatSpecLike.scala:1692)
[error]         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
[error]         at org.scalatest.FlatSpecLike.runTest(FlatSpecLike.scala:1692)
[error]         at org.scalatest.FlatSpecLike.runTest$(FlatSpecLike.scala:1674)
[error]         at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1685)
[error]         at org.scalatest.FlatSpecLike.$anonfun$runTests$1(FlatSpecLike.scala:1750)
[error]         at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:396)
[error]         at scala.collection.immutable.List.foreach(List.scala:392)
[error]         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
[error]         at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:379)
[error]         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
[error]         at org.scalatest.FlatSpecLike.runTests(FlatSpecLike.scala:1750)
[error]         at org.scalatest.FlatSpecLike.runTests$(FlatSpecLike.scala:1749)
[error]         at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1685)
[error]         at org.scalatest.Suite.run(Suite.scala:1147)
[error]         at org.scalatest.Suite.run$(Suite.scala:1129)
[error]         at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1685)
[error]         at org.scalatest.FlatSpecLike.$anonfun$run$1(FlatSpecLike.scala:1795)
[error]         at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
[error]         at org.scalatest.FlatSpecLike.run(FlatSpecLike.scala:1795)
[error]         at org.scalatest.FlatSpecLike.run$(FlatSpecLike.scala:1793)
[error]         at org.scalatest.FlatSpec.run(FlatSpec.scala:1685)
[error]         at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[error]         at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
[error]         at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:304)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error]         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error]         at java.lang.Thread.run(Thread.java:748)

The offending line is this one:

given(termsService.publish(contextName, toPublish)).willReturn(Future.successful(()))

Here is the definition of each variable:

private val termsService = mock[I18nTermsService]
private val contextName: String = "PiScope-event123"
private val toPublish: Map[String, String] = Map("title" -> title, "description" -> description)

I am running on Scala 2.12.8, using "org.mockito" % "mockito-core" % "2.23.4" and "org.scalatest" %% "scalatest" % "3.0.5".

The sbt definition of the project with the failing tests is the following:

lazy val i18n = (project in file ("i18n"))
  .enablePlugins(PlayScala)
  .enablePlugins(BuildInfoPlugin)
  .settings(buildInfoSettings)
  .settings(
    name := "i18n",
    libraryDependencies ++= i18nDependencies,
    parallelExecution in Test := false, // mockito and multithreading aren't friends
  )
  .dependsOn(common % "compile->compile;test->test")
  .dependsOn(slickExtensions)
  .dependsOn(services)

lazy val root = (project in file("."))
  .settings(
    inThisBuild(
      List(
        organization := "com.telefonica.baikal",
        scalaVersion := "2.12.8",
        fork in Test := true,
        updateOptions := updateOptions.value.withCachedResolution(true)
      )
    )
  )
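Since the issue title points at cached resolution, a possible workaround worth trying (an assumption based on the title, not a confirmed fix) is to turn that option off in the build above:

```scala
// build.sbt — hypothetical workaround: disable cached resolution, which the
// issue title identifies as the suspect, leaving the rest of the build as-is.
updateOptions := updateOptions.value.withCachedResolution(false)
```

If the tests pass again with this setting, that would confirm cached resolution is what changes the mockito classpath between 1.2.4 and 1.2.6+.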

Apparent unbounded parallel download of all artifacts at once

Apparently, at least according to the logs, sbt is attempting to download all artifacts at once:

[info] Updating ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-server_2.11/2.6.20/play-server_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/twirl-api_2.11/1.3.15/twirl-api_2.11-1.3.15.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/filters-helpers_2.11/2.6.20/filters-helpers_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-logback_2.11/2.6.20/play-logback_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-akka-http-server_2.11/2.6.20/play-akka-http-server_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play_2.11/2.6.20/play_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-specs2_2.11/2.6.20/play-specs2_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-streams_2.11/2.6.20/play-streams_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-actor_2.11/2.5.17/akka-actor_2.11-2.5.17.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-slf4j_2.11/2.5.17/akka-slf4j_2.11-2.5.17.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-json_2.11/2.6.10/play-json_2.11-2.6.10.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-stream_2.11/2.5.17/akka-stream_2.11-2.5.17.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-protobuf_2.11/2.5.17/akka-protobuf_2.11-2.5.17.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/ssl-config-core_2.11/0.2.4/ssl-config-core_2.11-0.2.4.jar ...
[info] downloading https://repo1.maven.org/maven2/org/scala-lang/modules/scala-parser-combinators_2.11/1.1.1/scala-parser-combinators_2.11-1.1.1.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-functional_2.11/2.6.10/play-functional_2.11-2.6.10.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-http-core_2.11/10.0.14/akka-http-core_2.11-10.0.14.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-parsing_2.11/10.0.14/akka-parsing_2.11-10.0.14.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-test_2.11/2.6.20/play-test_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-guice_2.11/2.6.20/play-guice_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-docs_2.11/2.6.20/play-docs_2.11-2.6.20.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-omnidoc_2.11/2.6.20/play-omnidoc_2.11-2.6.20.jar ...
[info] 	[SUCCESSFUL ] com.typesafe.play#play-logback_2.11;2.6.20!play-logback_2.11.jar (1310ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-slf4j_2.11;2.5.17!akka-slf4j_2.11.jar (3077ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-server_2.11;2.6.20!play-server_2.11.jar (3948ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#filters-helpers_2.11;2.6.20!filters-helpers_2.11.jar (4012ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#twirl-api_2.11;1.3.15!twirl-api_2.11.jar (4911ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-specs2_2.11;2.6.20!play-specs2_2.11.jar (5128ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-guice_2.11;2.6.20!play-guice_2.11.jar (6232ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-akka-http-server_2.11;2.6.20!play-akka-http-server_2.11.jar (6587ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-test_2.11;2.6.20!play-test_2.11.jar (8105ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-streams_2.11;2.6.20!play-streams_2.11.jar (9654ms)
[info] 	[SUCCESSFUL ] com.typesafe#ssl-config-core_2.11;0.2.4!ssl-config-core_2.11.jar(bundle) (15307ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-functional_2.11;2.6.10!play-functional_2.11.jar (18882ms)
[info] 	[SUCCESSFUL ] org.scala-lang.modules#scala-parser-combinators_2.11;1.1.1!scala-parser-combinators_2.11.jar(bundle) (18890ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-json_2.11;2.6.10!play-json_2.11.jar (20075ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-protobuf_2.11;2.5.17!akka-protobuf_2.11.jar (20059ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-parsing_2.11;10.0.14!akka-parsing_2.11.jar (20733ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play_2.11;2.6.20!play_2.11.jar (49001ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-actor_2.11;2.5.17!akka-actor_2.11.jar (52025ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-http-core_2.11;10.0.14!akka-http-core_2.11.jar (53297ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-stream_2.11;2.5.17!akka-stream_2.11.jar (56741ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-omnidoc_2.11;2.6.20!play-omnidoc_2.11.jar (64364ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-docs_2.11;2.6.20!play-docs_2.11.jar (66601ms)

The "downloading" messages are logged all at once, and the reported time for each artifact appears to be measured from when all of the downloads started rather than from when that artifact's own download began. Alternatively, the logger and the download timer may both be anchored to the start of the whole batch instead of the start of each artifact.

I don't think downloading all artifacts at once is a good idea: it is likely to trigger connection limits, potentially get a client temporarily blocked, and possibly flood the network, causing TCP connection drops when TCP ACKs are lost.
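A bounded alternative could look like the following sketch. This is illustrative only: `downloadUrl` is a hypothetical placeholder for the real transfer logic, and sbt's downloader is not actually structured this way.

```scala
import java.util.concurrent.Executors
import scala.concurrent.{ Await, ExecutionContext, Future }
import scala.concurrent.duration.Duration

object BoundedDownloadSketch {
  // Cap concurrent transfers with a fixed-size thread pool instead of
  // starting every download at once.
  def downloadAllBounded(urls: Seq[String], maxParallel: Int = 6): Seq[String] = {
    val pool = Executors.newFixedThreadPool(maxParallel)
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)
    // Hypothetical stand-in for the real transfer; here it just echoes the URL.
    def downloadUrl(url: String): String = url
    try {
      val futures = urls.map(u => Future(downloadUrl(u)))
      Await.result(Future.sequence(futures), Duration.Inf) // preserves input order
    } finally pool.shutdown()
  }
}
```

With `maxParallel = 6`, at most six transfers are in flight at any moment, and a new one starts as soon as a slot frees up, avoiding both the all-at-once burst and the fixed-batch stalls described above.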

This appears to be a regression between 1.1.x and 1.2.x, here's some output from sbt 1.1.0:

[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-stream_2.12/2.5.16/akka-stream_2.12-2.5.16.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play_2.12/2.6.19/play_2.12-2.6.19.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-actor_2.12/2.5.16/akka-actor_2.12-2.5.16.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-test_2.12/2.6.19/play-test_2.12-2.6.19.jar ...
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-omnidoc_2.12/2.6.19/play-omnidoc_2.12-2.6.19.jar ...
[info] 	[SUCCESSFUL ] com.typesafe.play#play-test_2.12;2.6.19!play-test_2.12.jar (3578ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play_2.12;2.6.19!play_2.12.jar (20262ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-stream_2.12;2.5.16!akka-stream_2.12.jar (23555ms)
[info] downloading https://repo1.maven.org/maven2/com/typesafe/akka/akka-protobuf_2.12/2.5.16/akka-protobuf_2.12-2.5.16.jar ...
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-actor_2.12;2.5.16!akka-actor_2.12.jar (26000ms)
[info] 	[SUCCESSFUL ] com.typesafe.akka#akka-protobuf_2.12;2.5.16!akka-protobuf_2.12.jar (3735ms)
[info] 	[SUCCESSFUL ] com.typesafe.play#play-omnidoc_2.12;2.6.19!play-omnidoc_2.12.jar (43070ms)
[info] downloading https://repo1.maven.org/maven2/com/typesafe/play/play-docs_2.12/2.6.19/play-docs_2.12-2.6.19.jar ...
[info] 	[SUCCESSFUL ] com.typesafe.play#play-docs_2.12;2.6.19!play-docs_2.12.jar (38244ms)

In this case it appears that it's batching them in predefined groups, which also isn't ideal, since as you can see it doesn't start downloading play-docs until it finishes downloading play-omnidoc.
