
Slick


Slick is an advanced, comprehensive database access library for Scala with strongly-typed, highly composable APIs.

Slick makes it easy to use your database in a way that's natural to it. It allows you to work with relational databases almost as if you were using Scala collections, while at the same time giving you full control over when the database is accessed and how much data is transferred. And by writing your queries in Scala you can benefit from compile-time safety and great compositionality, while retaining the ability to drop down to raw SQL when necessary for custom or advanced database features.

Its features include:

  • Query API inspired by Scala's collections API
  • Full control over how the database schema is seen by Slick, by using an explicit representation that can be code-generated from the actual database
  • Asynchronous API using Future for complete results, and a streaming API that conforms to the Reactive Streams interface, for easy integration with any streaming library, such as Akka Streams, FS2, or ZIO.
  • Composability at many levels
    • Compose actions (steps to run in a database context) with for comprehensions
    • Compose queries with for comprehensions or combinators
    • Compose row expressions (column sets, predicates, and column mappings)
  • A database metadata introspection API
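The action-composition point can be previewed without a database: DBIO actions sequence in a for comprehension in the same shape as Scala's own Future. A minimal sketch using plain Futures as stand-ins (insertUser, findUserId and insertOrder are hypothetical steps for illustration, not Slick API):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Stand-ins for database steps; with Slick these would be DBIO actions.
def insertUser(name: String): Future[Int]   = Future(1)   // pretend: rows inserted
def findUserId(name: String): Future[Long]  = Future(42L) // pretend: looked-up key
def insertOrder(userId: Long): Future[Int]  = Future(1)

// The steps compose with a for comprehension, exactly the shape used for actions.
val workflow: Future[Int] = for {
  _      <- insertUser("Ada")
  userId <- findUserId("Ada")
  rows   <- insertOrder(userId)
} yield rows

val inserted = Await.result(workflow, 5.seconds)
println(inserted) // 1
```

With Slick the same for comprehension would build a single composite DBIO action that is handed to the database in one `run` call.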

Slick features an advanced query compiler which can generate SQL for a variety of different database engines from the same Scala code, allowing you to focus on application logic without worrying about database-specific syntax and quirks.


Example

As a simple example we will create a Scala case class Coffee and a table for storing instances of it in the database:

import slick.jdbc.PostgresProfile.api._

// First declare our Scala object
final case class Coffee(name: String, price: Double)

// Next define how Slick maps from a database table to Scala objects
class Coffees(tag: Tag) extends Table[Coffee](tag, "COFFEES") {
  def name  = column[String]("NAME")
  def price = column[Double]("PRICE")
  def * = (name, price).mapTo[Coffee]
}

// The `TableQuery` object gives us access to Slick's rich query API
val coffees = TableQuery[Coffees]

// Inserting is done by appending to our query object
// as if it were a regular Scala collection
// SQL: insert into COFFEES (NAME, PRICE) values ('Latte', 2.50)
coffees += Coffee("Latte", 2.50)

// Fetching data is also done using the query object
// SQL: select NAME from COFFEES
coffees.map(_.name)

// More complex queries can be chained together
// SQL: select NAME, PRICE from COFFEES where PRICE < 10.0 order by NAME
coffees.filter(_.price < 10.0).sortBy(_.name)
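To see why the query API feels like the collections API, the same filter/sortBy/map pipeline can be run on a plain Scala List (no database involved); with Slick, the identical combinators instead produce SQL:

```scala
final case class Coffee(name: String, price: Double)

// In-memory stand-in for the COFFEES table
val coffees = List(
  Coffee("Latte", 2.50),
  Coffee("Kona", 11.50),
  Coffee("Espresso", 2.00)
)

// Same shape as the Slick query: filter, sortBy, map
val cheapNames = coffees.filter(_.price < 10.0).sortBy(_.name).map(_.name)
println(cheapNames) // List(Espresso, Latte)
```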

Database support

The following databases are directly supported by Slick, and are currently covered by a large suite of automated tests to ensure compatibility:

Database          JDBC Driver                                                    Tested server version
PostgreSQL        "org.postgresql" % "postgresql" % "42.5.0"                     Latest
MySQL             "com.mysql" % "mysql-connector-j" % "8.0.33"                   Latest
SQL Server        "com.microsoft.sqlserver" % "mssql-jdbc" % "7.2.2.jre11"       2022
Oracle            "com.oracle.database.jdbc.debug" % "ojdbc8_g" % "21.6.0.0.1"   11g
DB2               "com.ibm.db2.jcc" % "db2jcc" % "db2jcc4"                       11.5.7.0
Derby/JavaDB      "org.apache.derby" % "derby" % "10.14.2.0"
H2                "com.h2database" % "h2" % "1.4.200"
HSQLDB/HyperSQL   "org.hsqldb" % "hsqldb" % "2.5.2"
SQLite            "org.xerial" % "sqlite-jdbc" % "3.39.2.1"

Other database systems can be accessed as well, although possibly with a reduced feature set.

Contributing

Slick is community-maintained: pull requests are very welcome, and we ask that all contributors abide by the Lightbend Community Code of Conduct.

Lightbend staff (such as @SethTisue) may be able to assist with administrative issues.

Slick's People

Contributors

alexfrankfurt, ashleymercer, atkinschang, cvogt, d6y, electricwound, he-pin, hvesalai, japgolly, jkugiya, jkutner, kuppuswamy, marcospereira, mdedetrich, mergify[bot], nafg, octonato, olegych, osleonard, renovate[bot], robstoll, scala-steward, sethtisue, smootoo, stewsquared, szeiger, tminglei, trevorsibanda, wellingr, xuwei-k


Slick's Issues

Support for not quoting table & column names in generated SQL.

When I run FooTable.ddl.create, the SQL CREATE statements have quotes around the table and column names. With H2 DB, this causes all following queries to be required to also quote those names for them to be recognized. This messes up queries coming from other sources, e.g. ActiveRecord. Would it be much work to stop this from being done? I'm using this to generate the schema to run tests against, and I could either change that behavior here, or get ActiveRecord in Rails to start adding quotes to all table names in the generated SQL.

AND Condition for JOIN query

Hello,
I'm trying to build a combined query with inner and left joins.
However, I'm facing the problem that the condition always ends up in the global WHERE clause of the final query and is never appended to the ON clause:

SELECT * FROM readings AS r
JOIN parameters AS p
LEFT JOIN sensorvalues AS sv ON sv.parameter_id=p.id AND sv.reading_id=r.id

I tried something like this

val readings = for {
  all <- Readings join Parameters leftJoin SensorValues on (_._2.id is _.parameter_id)
         if all._1._1.id === all._2.reading_id
} yield (all._1._1, all._1._2, all._2)

And it ends up with this query:

SELECT
    x2.x3,
    x2.x4,
    x2.x5,
    x2.x6,
    x2.x7,
    x2.x8,
    x9.x10,
    x9.x11,
    x9.x12,
    x9.x13
FROM
    (
        SELECT
            x14.x15 AS x3,
            x14.x16 AS x4,
            x14.x17 AS x5,
            x18.x19 AS x6,
            x18.x20 AS x7,
            x18.x21 AS x8
        FROM
            (
                SELECT
                    x22.`id` AS x15,
                    x22.`platform_id` AS x16,
                    x22.`date` AS x17
                FROM
                    `readings` x22
            )x14
        INNER JOIN(
            SELECT
                x23.`id` AS x19,
                x23.`name` AS x20,
                x23.`units` AS x21
            FROM
                `parameters` x23
        )x18
    )x2
LEFT OUTER JOIN(
    SELECT
        x24.`id` AS x10,
        x24.`reading_id` AS x11,
        x24.`parameter_id` AS x12,
        x24.`value` AS x13
    FROM
        `sensorValues` x24
)x9 ON x2.x6 = x9.x12
WHERE
    x2.x3 = x9.x11

Instead of WHERE I need AND. What should I use to express this condition, or is there no functionality in Slick for that?

Please release 0.10 build with nested tuple support

Stefan, this is just to capture our need for a pre-SLICK build that includes the nested tuple support, since I don't think our project schedule will allow us to wait for SLICK. :-) We're out of planning and into development now, so internally, at least, there's some urgency about this now. Thanks!

MS SQL Server 2008: Incorrect syntax near the keyword 'NOT'

Example: org.scalaquery.examples.FirstExample, using the MS SQL Server 2008 config:

Database.forURL(url = "jdbc:jtds:sqlserver://ip:1433/test1;SelectMethod=cursor",
  user = "sa", password = "pwd",
  driver = "net.sourceforge.jtds.jdbc.Driver") withSession {

It fails when it reaches (Suppliers.ddl ++ Coffees.ddl).create. Is there anything wrong?

I get the error below:
Exception in thread "main" java.sql.SQLException: Incorrect syntax near the keyword 'NOT'.
at net.sourceforge.jtds.jdbc.SQLDiagnostic.addDiagnostic(SQLDiagnostic.java:368)
at net.sourceforge.jtds.jdbc.TdsCore.tdsErrorToken(TdsCore.java:2820)
at net.sourceforge.jtds.jdbc.TdsCore.nextToken(TdsCore.java:2258)
at net.sourceforge.jtds.jdbc.TdsCore.getMoreResults(TdsCore.java:632)
at net.sourceforge.jtds.jdbc.JtdsStatement.processResults(JtdsStatement.java:584)
at net.sourceforge.jtds.jdbc.JtdsStatement.executeSQL(JtdsStatement.java:546)
at net.sourceforge.jtds.jdbc.JtdsPreparedStatement.execute(JtdsPreparedStatement.java:558)
at org.scalaquery.ql.DDL$$anonfun$create$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply(DDL.scala:23)
at org.scalaquery.ql.DDL$$anonfun$create$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply(DDL.scala:23)
at org.scalaquery.session.Session$class.withPreparedStatement(Session.scala:53)
at org.scalaquery.session.BaseSession.withPreparedStatement(Session.scala:92)
at org.scalaquery.ql.DDL$$anonfun$create$1$$anonfun$apply$mcV$sp$1.apply(DDL.scala:23)
at org.scalaquery.ql.DDL$$anonfun$create$1$$anonfun$apply$mcV$sp$1.apply(DDL.scala:22)
at scala.collection.Iterator$class.foreach(Iterator.scala:652)
at scala.collection.Iterator$$anon$20.foreach(Iterator.scala:344)
at org.scalaquery.ql.DDL$$anonfun$create$1.apply$mcV$sp(DDL.scala:22)
at org.scalaquery.ql.DDL$$anonfun$create$1.apply(DDL.scala:22)
at org.scalaquery.ql.DDL$$anonfun$create$1.apply(DDL.scala:22)
at org.scalaquery.session.BaseSession.withTransaction(Session.scala:127)
at org.scalaquery.ql.DDL$class.create(DDL.scala:21)
at org.scalaquery.ql.DDL$$anon$1.create(DDL.scala:42)
at org.scalaquery.examples.FirstExample$$anonfun$main$1.apply$mcV$sp(FirstExample.scala:65)
at org.scalaquery.examples.FirstExample$$anonfun$main$1.apply(FirstExample.scala:58)
at org.scalaquery.examples.FirstExample$$anonfun$main$1.apply(FirstExample.scala:58)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.scalaquery.session.Database$$anonfun$withSession$1.apply(Database.scala:46)
at org.scalaquery.session.Database$$anonfun$withSession$1.apply(Database.scala:46)
at org.scalaquery.session.Database.withSession(Database.scala:38)
at org.scalaquery.session.Database.withSession(Database.scala:46)
at org.scalaquery.examples.FirstExample$.main(FirstExample.scala:58)
at org.scalaquery.examples.FirstExample.main(FirstExample.scala)

reading columns with MColumn.getColumns

Working with the Oracle driver (Oracle 10g XE) results in a SQLException:

Caused by: java.sql.SQLException: Ungültiger Spaltenindex ("Invalid column index")
at oracle.jdbc.driver.OracleResultSetImpl.getString(OracleResultSetImpl.java:1231)
at org.scalaquery.session.PositionedResult.nextStringOption(PositionedResult.scala:47)
at org.scalaquery.meta.MQName$.optionalFrom(MQName.scala:20)
at org.scalaquery.meta.MColumn$$anonfun$getColumns$2.apply(MColumn.scala:27)
at org.scalaquery.meta.MColumn$$anonfun$getColumns$2.apply(MColumn.scala:22)

The problem is that the result set returned by the JDBC driver contains only 18 columns (up to the isNullable value),
while ScalaQuery tries to read 23 columns.

Support INTERVAL type

It's not supported by JDBC, so ScalaQuery should add its own Interval type and map that to the right SQL syntax and (probably DBMS-specific) JDBC types.

Oracle sequences: how-to or new feature?

I couldn't figure out how to use Oracle sequences in insert statements. Looking at the sources, it doesn't seem possible at all.

Just to be clear, Oracle sequences are how Oracle deals with auto-incrementing fields. An insert using one would look like: insert into SOME_TABLE (A,B) values (SOME_SEQUENCE.nextval, 'abc');

trying to use with connection pooling

Hi there,

I've previously written a simple wrapper (which works very well) around c3p0 connection pooling library.

I want to use it with scala-query.

My (naive) approach was to subclass Database like so:

class C3P0Database extends Database {
  import java.sql.Connection
  override def createConnection(): Connection = {
    import com.senti.db.DBCFactory
    DBCFactory.getConnection(classOf[CCDBConn])
  }
}

Sure enough it provides a connection, but (I presume) it's not setting the correct DDL dialect.

Because when I run the following code:

object Users extends Table[(Int, String, String)]("users") {
  def id = column[Int]("iUser", O.NotNull)
  def password = column[String]("iPassword", O.NotNull, O.DBType("varchar(255)"))
  def email = column[String]("sPassword", O.NotNull, O.DBType("varchar(255)"))
  def * = id ~ password ~ email
}

var myQuery = for (u <- Users) yield u.id.count
println(myQuery.selectStatement)

I get the following result (which is invalid SQL for MySQL):

SELECT count("t1"."iUser") FROM "users" "t1"

Can you recommend what I should do to fix this?

I've found stuff like MySQLDriver but I can't figure out what to do to set it active.

Thank you,

Bryan Hunt

Refactor drivers to enable subclassing

All ScalaQuery drivers should be implemented as abstract classes, with a (companion) object providing the concrete driver. That way, users can refer to the class type and plug in custom subclasses of the existing drivers.

Aggregation function not pushed down into comprehension

When replacing the old-style aggregation in MainTest with the new semantics

val q4 = for {
  u <- Users
  o <- u.orders
    if (o.orderID === (for { o2 <- Orders where(o.userID is _.userID) } yield o2.orderID).max)
} yield u.first ~ o.orderID

an incorrect query is generated:

*** (s.slick.compiler.QueryCompiler) After phase fuseComprehensions:
  Comprehension(fetch = None, offset = None)
    from s10: Table users
    from s11: Table orders
    where0: Apply Function = {StaticType.Boolean}
      0: Path s11.userID
      1: Path s10.id
    where1: Apply Function = {StaticType.Boolean}
      0: Path s11.orderID
      1: Apply Function max {scala.slick.jdbc.JdbcType$$anon$1@75c59c40}
        0: Comprehension(fetch = None, offset = None)
          from s12: Table orders
          where0: Apply Function = {StaticType.Boolean}
            0: Path s11.userID
            1: Path s12.userID
          select: Pure
            value: ProductNode
              1: Path s12.orderID
    select: Pure
      value: ProductNode
        1: Path s10.first
        2: Path s11.orderID

ScalaQuery ignores the autoCommit option

Hi,

I'm trying to disable autoCommit in the DB connections in order to use the rollback functionality in my tests. But it seems that ScalaQuery ignores the props that I'm sending. It fails for HSQLDB and SQLite databases. The following code shows the issue:

import org.scalaquery.session.Database
import org.scalaquery.session.Session
import org.scalaquery.session.Database.threadLocalSession
import org.scalaquery.ql.basic.BasicTable
import org.scalaquery.ql.basic.BasicDriver.Implicit._
import java.util.Properties

object Test {
  def main(args: Array[String]){

    val props = new Properties
    props.setProperty("autoCommit", "false")

    val database = Database.forURL("jdbc:hsqldb:mem:testdb",
//    val database = Database.forURL("jdbc:sqlite::memory:",
                               user="sa", 
                               password="", 
                               prop=props,
                               driver = "org.hsqldb.jdbcDriver")
//                               driver = "org.sqlite.JDBC")

    val Books = new BasicTable[(Int, String, String, Option[String], Option[String])]("BooksTable") {
          def sn     = column[Int]("bookSN")
          def title  = column[String]("bookTitle")
          def author = column[String]("bookAuthor")
          def translator = column[Option[String]]("bookTranslator")
          def isbn = column[Option[String]]("bookISBN")
          def * = sn ~ title ~ author ~ translator ~ isbn
    }

    database withSession {
        Books.ddl.create
        Books.insert(1, "ScalaQuery HowTo", "Brian Hsu", None, Some("123-456-789-X"))

        val q1 = for(b <- Books) yield b.title
        val first:String = q1.first
        assert(first == "ScalaQuery HowTo")
        threadLocalSession.rollback()
      }

  }
}

And the result I get is

[info] Running Test 
org.scalaquery.SQueryException: Cannot roll back session in auto-commit mode
    at org.scalaquery.session.BaseSession.rollback(Session.scala:116)
    at Test$$anonfun$main$1.apply$mcV$sp(Test.scala:39)
    at Test$$anonfun$main$1.apply(Test.scala:32)
    at Test$$anonfun$main$1.apply(Test.scala:32)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at org.scalaquery.session.Database$$anonfun$withSession$1.apply(Database.scala:44)
    at org.scalaquery.session.Database$$anonfun$withSession$1.apply(Database.scala:44)
    at org.scalaquery.session.Database.withSession(Database.scala:36)
    at org.scalaquery.session.Database.withSession(Database.scala:44)
    at Test$.main(Test.scala:32)
    at Test.main(Test.scala)

Slick does not build with Java 7

without this patch:

diff --git a/src/main/scala/scala/slick/session/Session.scala b/src/main/scala/scala/slick/session/Session.scala
index 4808bb7..3799193 100644
--- a/src/main/scala/scala/slick/session/Session.scala
+++ b/src/main/scala/scala/slick/session/Session.scala
@@ -115,6 +115,9 @@ trait Session extends java.io.Closeable with Logging { self =>
   }

   protected def loggingStatement(st: Statement): Statement = if(logger.isDebugEnabled) new Statement {
+    def closeOnCompletion(): Unit = st.closeOnCompletion
+    def isCloseOnCompletion(): Boolean = st.isCloseOnCompletion
+
     def setMaxFieldSize(max: Int) = st.setMaxFieldSize(max)
     def clearWarnings() = st.clearWarnings()
     def getMoreResults(current: Int) = st.getMoreResults(current)

Insert with autoincrement -- please support regular and forced insert

Currently SQ directly supports only forced inserts. However, it should be possible to allow inserts in both modes:

Table.forceInsert(record) <-- insert a new record and pass all fields -- this is today's insert
Table.softInsert(record) <-- insert a new record, but pass all fields except the autoinc ones

All pieces are in place: you have AutoInc flags, so it is possible to rule them out automatically. Currently it is possible to do a "soft" insert, but each time it requires quite a bit of work:

  • create and maintain a correct projection -- this is redundant
  • insert using tuples instead of the class -- using named arguments (for clarity) is not possible
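The difference between the two requested modes can be sketched in plain Scala; forceInsertSql and softInsertSql below are illustrative helpers (as is the table T with columns ID and NAME), not existing API:

```scala
// Hypothetical sketch of the requested semantics, not Slick/ScalaQuery API:
// a "forced" insert passes every field, a "soft" insert omits the AutoInc field.
final case class Record(id: Long, name: String) // id is the AutoInc field

def forceInsertSql(r: Record): String =
  s"insert into T (ID, NAME) values (${r.id}, '${r.name}')"

def softInsertSql(r: Record): String =
  s"insert into T (NAME) values ('${r.name}')" // ID is left to the database

println(forceInsertSql(Record(7L, "abc"))) // insert into T (ID, NAME) values (7, 'abc')
println(softInsertSql(Record(7L, "abc")))  // insert into T (NAME) values ('abc')
```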

getOr does not work within computed columns

Using getOr in a computed column returns 0 for a NULL value; the supplied default value and the rest of the computation are ignored.

Short example:

import org.scalaquery.ql.extended.HsqldbDriver.Implicit._
import org.scalaquery.ql.extended.{ExtendedTable => Table}
import org.scalaquery.session.Database.threadLocalSession
import org.scalaquery.session._
import org.scalaquery.ql._
import org.scalaquery.ql.TypeMapper._

object Sqtest extends App {

  val Persons = new Table[(Int, String, Option[Int])]("PERSONS") {
    def id = column[Int]("ID", O.PrimaryKey)
    def name = column[String]("NAME", O.NotNull)
    def year = column[Option[Int]]("YEAR")

    def * = id ~ name ~ year
  }

  Database.forURL("jdbc:hsqldb:mem:testdb").withSession {
    Persons.ddl.create
    Persons.insertAll(
        (0, "Alice", Some(2000)),
        (1, "Bob", None))

    val q = for {
      p <- Persons
    } yield p.* ~ p.year.getOr(2000) ~ (p.year.getOr(2000)-0) 

    println(q.selectStatement)
    q.foreach(println)
  }

}

This yields
SELECT "t1"."ID","t1"."NAME","t1"."YEAR","t1"."YEAR",("t1"."YEAR" - 0) FROM "PERSONS" "t1"
(0,Alice,Some(2000),2000,2000)
(1,Bob,None,2000,0)

The last line should be "(1,Bob,None,2000,2000)".

Version used: 0.9.5, scala 2.9.1

None.get in compiler

The AST dump (sorry, I left out ": " so it's run together with "Bind"):

chargeQBind
  from @664655927: Table charges
  select: Bind
    from @1809717042: Ref /1867609586
    select: Pure
      value: ProductNode
        1: Path @664655927.chargeid
        2: Path @664655927.price
//1867609586: Filter
  from @1293327350: Table orders
  where: Apply Function and
    0: Apply Function =
      0: Path @664655927.chargeid
      1: Path @1293327350.tintchargeid
    1: Apply Function =
      0: Path @1293327350.orderid
      1: ConstColumn[Long] 17926

The selectStatement:
chargeQ: select x2."chargeid", x2."price" from "charges" x2, "orders" x3 where (x2."chargeid" = x3."tintchargeid") and (x3."orderid" = 17926)

The code:

 def getCharge[A <: Table[_]](table: A)(column: A => Column[Option[Long]], criteria: A => Column[Boolean]) = for {
    charge <- charges
    t <- table
    if (charge.chargeid is column(t)) && criteria(t)
  } yield (charge.chargeid, charge.price)

  val chargeQ = getCharge(orders)(column, _.orderid is orderid)

The exception:

java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:313)
at scala.None$.get(Option.scala:311)
at scala.slick.compiler.FuseComprehensions.select(Relational.scala:295)
at scala.slick.compiler.FuseComprehensions$$anonfun$scala$slick$compiler$FuseComprehensions$$inline$1$2.apply(Relational.scala:207)
at scala.slick.compiler.FuseComprehensions$$anonfun$scala$slick$compiler$FuseComprehensions$$inline$1$2.apply(Relational.scala:205)
at scala.Option.map(Option.scala:145)
at scala.slick.compiler.FuseComprehensions.scala$slick$compiler$FuseComprehensions$$inline$1(Relational.scala:205)
at scala.slick.compiler.FuseComprehensions$$anonfun$scala$slick$compiler$FuseComprehensions$$inline$1$4.apply(Relational.scala:210)
at scala.slick.compiler.FuseComprehensions$$anonfun$scala$slick$compiler$FuseComprehensions$$inline$1$4.apply(Relational.scala:210)
at scala.slick.ast.Node$$anonfun$nodeMapNodes$1.apply(Node.scala:27)
at scala.slick.ast.Node$$anonfun$nodeMapNodes$1.apply(Node.scala:26)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:58)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.slick.ast.Node$class.nodeMapNodes(Node.scala:26)
at scala.slick.ast.Apply.nodeMapNodes(Node.scala:311)
at scala.slick.ast.SimpleNode$class.nodeMapChildren(Node.scala:50)
at scala.slick.ast.Apply.nodeMapChildren(Node.scala:311)
at scala.slick.compiler.FuseComprehensions.scala$slick$compiler$FuseComprehensions$$inline$1(Relational.scala:210)
at scala.slick.compiler.FuseComprehensions$$anonfun$scala$slick$compiler$FuseComprehensions$$inline$1$4.apply(Relational.scala:210)
at scala.slick.compiler.FuseComprehensions$$anonfun$scala$slick$compiler$FuseComprehensions$$inline$1$4.apply(Relational.scala:210)
at scala.slick.ast.Node$$anonfun$nodeMapNodes$1.apply(Node.scala:27)
at scala.slick.ast.Node$$anonfun$nodeMapNodes$1.apply(Node.scala:26)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:58)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)

...

Play 2.0 + Slick = nonexisting symbol compilation error

Hi!

First of all thank you for developing Slick. I think this is the right way to go. Can't wait to have direct embedding in my code :)

I have been trying to integrate Slick with Play 2.0 and I got an error.

I don't know whether to refer this to you or to Play, but I figured you wanted to know anyway.

The error I keep getting is:

[error] class file needed by WithOp is missing.
[error] reference type Cloneable of package scala refers to nonexisting symbol.
[error] one error found

More details at:

https://gist.github.com/3655893

Generated MySQL DDL is not right

I forked the slick-examples project to use MySQL (https://github.com/mslinn/slick-examples) and found problems with the generated SQL.

  • It is safer to use lower-case table names and field names with MySQL
  • The generated DDL uses double quotes instead of backticks. MySQL does not accept double quotes unless it is started with the --ansi option.
  • The generated DDL does not specify a length for VARCHAR, so the DDL is rejected. Other options for generating similar code exist for MySQL. Here is what is generated:
create table SUPPLIERS (
  SUP_ID INTEGER NOT NULL PRIMARY KEY,
  SUP_NAME VARCHAR NOT NULL,
  STREET VARCHAR NOT NULL,
  CITY VARCHAR NOT NULL,
  STATE VARCHAR NOT NULL,
  ZIP VARCHAR NOT NULL)

... and this is what works:

use slickexamples;
create table SUPPLIERS (
  SUP_ID INTEGER NOT NULL PRIMARY KEY,
  SUP_NAME VARCHAR(255) NOT NULL,
  STREET VARCHAR(255) NOT NULL,
  CITY VARCHAR(255) NOT NULL,
  STATE VARCHAR(255) NOT NULL,
  ZIP VARCHAR(255) NOT NULL)
  • I also found that the automatic lifting to a Query blows up with MySQL:

    You have an error in your SQL syntax; check the manual that corresponds to
    your MySQL server version for the right syntax to use near
    'VARCHAR))||' ')||cast(x2."PRICE" as VARCHAR))||' ')||cast(x2."SALES" as VARCHAR)' at line 1

Illegal SQL code generated: FROM clause missing if subquery is used twice

If a subquery variable is used more than once within a query result definition, the FROM clause in one of the subqueries is omitted.

import org.scalaquery.ql.extended.HsqldbDriver.Implicit._
import org.scalaquery.ql.extended.{ExtendedTable => Table}
import org.scalaquery.session.Database.threadLocalSession
import org.scalaquery.session._
import org.scalaquery.ql._
import org.scalaquery.ql.TypeMapper._

object Sqtest extends App {

  val Persons = new Table[(Int, String, Option[Int])]("PERSONS") {
    def id = column[Int]("ID", O.PrimaryKey)
    def name = column[String]("NAME", O.NotNull)
    def year = column[Option[Int]]("YEAR")

    def * = id ~ name ~ year
  }

  Database.forURL("jdbc:hsqldb:mem:testdb").withSession {
    Persons.ddl.create
    Persons.insertAll(
        (0, "Alice", Some(2000)),
        (1, "Bob", None))

    val q = for {
      p <- Persons
      val yearSum = ( for { q <- Persons } yield q.year.sum).asColumn
    } yield p.* ~ yearSum ~ yearSum

    println(q.selectStatement)
    q.foreach(println)
  }
}

This gives:

SELECT "t1"."ID","t1"."NAME","t1"."YEAR",(SELECT sum("t2"."YEAR")),(SELECT sum("t2"."YEAR") FROM "PERSONS" "t2") FROM "PERSONS" "t1"
Exception in thread "main" java.sql.SQLSyntaxErrorException: unexpected token: SELECT
[...]

In the first subquery the FROM clause is missing.

asColumn bugs

Some subqueries using asColumn are missing parentheses. I fixed a lot of them with this patch, although I'm sure there's a better way to fix it:

diff --git a/src/main/scala/scala/slick/driver/BasicStatementBuilderComponent.scala b/src/main/scala/scala/slick/driver/BasicStatementBuilderComponent.scala
index adc3b5c..946ed5a 100644
--- a/src/main/scala/scala/slick/driver/BasicStatementBuilderComponent.scala
+++ b/src/main/scala/scala/slick/driver/BasicStatementBuilderComponent.scala
@@ -322,9 +322,9 @@ trait BasicStatementBuilderComponent { driver: BasicDriver =>
         }
         b += symbolName(struct) += '.' += symbolName(field)
       case n => // try to build a sub-query
-        if(!skipParens) b += '('
+        /*if(!skipParens)*/ b += '('
         buildComprehension(toComprehension(n))
-        if(!skipParens) b += ')'
+        /*if(!skipParens)*/ b += ')'
     }

However the following case still doesn't work. One of the symbols does not exist in the from clause.

object orders_lenschoices extends Table[(Long, Long, Long)]("t1") {
  def orderid = column[Long]("a")
  def lenschoiceid = column[Long]("b")
  def chargeid = column[Long]("c")
  def * = orderid ~ lenschoiceid ~ chargeid
}

object lenschoices extends Table[(Long, String)]("t2") {
  def id = column[Long]("a")
  def label = column[String]("b")
  def * = id ~ label
}

object charges extends Table[(Long, Double)]("t3") {
  def chargeid = column[Long]("a")
  def price = column[Double]("b")
  def * = chargeid ~ price
}

val q = for {
  olc <- orders_lenschoices if olc.orderid is 10L
  lc <- lenschoices if lc.id is olc.lenschoiceid
  chargeprice = charges.where(_.chargeid is olc.chargeid).map(_.price).asColumn
} yield (lc.id, lc.label, chargeprice)

assertEquals(
  q.selectStatement,
  "select x2.x3, x2.x4, x2.x5" +
  " from \"t1\" x6," +
  " (" +
    "select x7.\"a\" as x3, x7.\"b\" as x4, " +
    "(select x8.\"b\" from \"t3\" x8 where x8.\"a\" = x9.x10) as x5" +
    " from \"t2\" x7 where x7.\"a\" = x9.x11) x2" +
  " where x6.\"a\" = 10"
)

In general, Slick outputs very confusing SQL (FROM clauses are subqueries instead of simple table names).

Thanks.

Why not provide a pom.xml?

As the project structure is based on Maven, why not provide a pom.xml? I had to find the third-party dependencies myself before running the test cases.

BTW, there is no need to check in the Eclipse project file, because we can use mvn eclipse:eclipse to generate it.

Please provide a pom.xml so that I can build the project and run the test cases.

Union of joined queries fails to generate correct select part for second query

For two simple queries

val q1 = for (coffee <- Coffees; tea <- Teas if coffee.pkCup === tea.pkCup) 
  yield coffee.pk ~ coffee.pkCup

and

val q2 = for (coffee <- Coffees; tea <- Teas if coffee.pkCup === tea.pkCup) 
  yield tea.pk ~ tea.pkCup

Slick fails to create a correct SQL statement for the union q3 = q1 union q2.

It outputs the SQL for q1 and q2 correctly, but q3 generates:

select x2.x3, x2.x4 from (
  select x5."pk" as x3, x5."pkCup" as x4 from "Coffee" x5, "Tea" x6 
  where x5."pkCup" = x6."pkCup" 
union 
  select  from "Coffee" x7, "Tea" x8 
  where x7."pkCup" = x8."pkCup"
) x2

For the complete example, see: https://gist.github.com/3951488

Add insert-or-update feature

As discussed in http://groups.google.com/group/scalaquery/browse_thread/thread/d466c499549ab480 , InsertInvoker should have a method that performs an insert and falls back to updating an existing row in case of a primary key or unique constraint violation. Precise semantics need to be defined (e.g. more like MySQL's MERGE or REPLACE ON DUPLICATE KEY?). This should be supported with all drivers, falling back to a client-side multi-statement implementation if not natively supported by a DBMS.
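The proposed client-side fallback (try the insert; on a key conflict, update the existing row instead) can be sketched with a Map standing in for the table. insertOrUpdate here is the method name proposed in the discussion, not existing API, and a real implementation would have to run both statements atomically in one transaction:

```scala
import scala.collection.mutable

// Stand-in "table" keyed by primary key; values are the row payloads.
val table = mutable.Map.empty[Int, String]

// Sketch of the proposed semantics: insert if the key is new, update otherwise.
def insertOrUpdate(pk: Int, row: String): String =
  if (table.contains(pk)) { table.update(pk, row); "updated" }
  else                    { table.update(pk, row); "inserted" }

val first  = insertOrUpdate(1, "Latte")    // key 1 is new
val second = insertOrUpdate(1, "Espresso") // key 1 already exists
println(s"$first, $second") // inserted, updated
```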

Support multi-way joins

Three-way joins are only supported with a right-associative syntax as of ScalaQuery 0.9.0 but this limitation came from H2; it is not present in SQL 2008. Theoretically, this three-way join should work:

  val q2 = for {
    Join(c, Join(b, a)) <- C leftJoin (B leftJoin A on (_.aid === _.id)) on { case (c, Join(_,a)) => c.aid === a.id }
  } yield a.id ~ b.id ~ c.id

But it generates a bad syntax tree where the Join(c, _) doesn't taint the Join(b, a), and thus bad SQL code:

q2: Query
  select: Projection3
    0: NamedColumn ID
      table: JoinPart
        table: <t1> Table A
        from: <t2> Join(Table B,Table A)
          0: <t3> Table B
          1: <t1> ...
    1: NamedColumn ID
      table: JoinPart
        table: <t3> ...
        from: <t2> ...
    2: NamedColumn ID
      table: JoinPart
        table: <t4> Table C
        from: <t5> Join(Table C,Join(Table B,Table A))
          0: <t4> ...
          1: <t6> Join(Table B,Table A)
            0: <t3> ...
            1: <t1> ...

SELECT "t2"."ID","t3"."ID","t5"."ID" FROM {oj "B" "t3" left outer join "A" "t2" on "t3"."AID"="t2"."ID")},{oj "C" "t5" left outer join "B" "t3" left outer join "A" "t2" on  "t3"."AID"="t2"."ID") on ("t5"."AID"="t2"."ID")}

Unable to compile trunk

I got a "type mismatch" error after changing entries in "project/build.properties" to point to my scala-2.8.0.r18634-b20090902020144 jars and trying to compile.

$ sbt
[info] Building project ScalaQuery 1.0-SNAPSHOT using ScalaQueryProject
[info]    with sbt 0.5.1 and Scala 2.7.5
[info] No actions specified, interactive session started. Execute 'help' for more information.
> compile
[info] 
[info] == compile ==
[info] Deleting directory /Users/walter/Personal/git/scala/scala-query/target/classes
[info] Compiling main sources... 
[error] /Users/walter/Personal/git/scala/scala-query/src/main/scala/com/novocode/squery/combinator/basic/BasicQueryBuilder.scala:140: error: type mismatch;
[error]  found   : Any
[error]  required: _0 where type _0
[error]           tm(profile).setValue(v, p)
[error]                                ^
[error] one error found
[info] == compile ==
[error] Error running compile: Compilation unsuccessful.
[info] 
[info] Total time: 18 s
> 

NPE when using MappedTypeMapper over base type with zero == null

As reported by @ijuma on the mailing list:

I am running into an NPE when using MappedTypeMapper.base like so:

implicit val dateTime: TypeMapper[DateTime] = MappedTypeMapper.base[DateTime, Timestamp](dt => new Timestamp(dt.getMillis), ts => new DateTime(ts.getTime))

The column is specified as column[Option[DateTime]]. Is this expected? Am I supposed to check for null and return null in that case?
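Pending a fix, a straightforward workaround is to guard both mapping functions against null; this is an untested sketch based on the mapper definition reported above:

```scala
// Sketch: null-guard both directions so a SQL NULL never reaches
// the Timestamp/DateTime constructors.
implicit val dateTime: TypeMapper[DateTime] =
  MappedTypeMapper.base[DateTime, Timestamp](
    dt => if (dt == null) null else new Timestamp(dt.getMillis),
    ts => if (ts == null) null else new DateTime(ts.getTime)
  )
```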

java.sql.BatchUpdateException using MSSql

Hi,

getting the following exception (the message is in German in the original report; translated, it is the standard MSSQL error):

Exception in thread "main" java.sql.BatchUpdateException: Cannot insert explicit value for identity column in table 'PERSONS' when IDENTITY_INSERT is set to OFF.
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:1160)
at scala.slick.driver.BasicInvokerComponent$InsertInvoker$$anonfun$insertAll$1$$anonfun$apply$4.apply(BasicInvokerComponent.scala:106)
at scala.slick.driver.BasicInvokerComponent$InsertInvoker$$anonfun$insertAll$1$$anonfun$apply$4.apply(BasicInvokerComponent.scala:99)
at scala.slick.session.Session$class.withPreparedStatement(Session.scala:68)
at scala.slick.session.BaseSession.withPreparedStatement(Session.scala:201)
at scala.slick.driver.BasicInvokerComponent$InsertInvoker.prepared(BasicInvokerComponent.scala:81)
at scala.slick.driver.BasicInvokerComponent$InsertInvoker$$anonfun$insertAll$1.apply(BasicInvokerComponent.scala:99)
at scala.slick.session.BaseSession.withTransaction(Session.scala:229)
at scala.slick.driver.BasicInvokerComponent$InsertInvoker.insertAll(BasicInvokerComponent.scala:95)
at org.rob.slicktests.ThirdExample$$anonfun$1.apply$mcV$sp(ThirdExample.scala:44)
at org.rob.slicktests.ThirdExample$$anonfun$1.apply(ThirdExample.scala:38)
at org.rob.slicktests.ThirdExample$$anonfun$1.apply(ThirdExample.scala:38)
at scala.slick.session.BaseSession.withTransaction(Session.scala:236)
at org.rob.slicktests.ThirdExample$delayedInit$body.apply(ThirdExample.scala:38)

I am using the following test code (works with H2):

(The code got mangled when submitting the report; see the properly formatted version at https://groups.google.com/forum/?fromgroups=#!topic/scalaquery/yMQ3ElI2xKQ)

import scala.slick.driver.SQLServerDriver.simple._

object ThirdExample extends App {
  case class Car(id: Option[Int], marke: String, kennzeichen: String)
  case class Person(id: Option[Int], name: String, age: Int, carId: Option[Int])

  object Cars extends Table[Car]("CARS") {
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc) // This is the primary key column
    def marke = column[String]("MARKE")
    def kennzeichen = column[String]("KENNZEICHEN")
    // Every table needs a * projection with the same type as the table's type parameter
    def * = id.? ~ marke ~ kennzeichen <> (Car, Car.unapply _)
  }

  // Definition of the PERSONS table
  object Persons extends Table[Person]("PERSONS") {
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc) // This is the primary key column
    def name = column[String]("NAME")
    def age = column[Int]("AGE")
    def carId = column[Option[Int]]("CAR_ID")
    // Every table needs a * projection with the same type as the table's type parameter
    def * = id.? ~ name ~ age ~ carId <> (Person, Person.unapply _)
    def car = foreignKey("PERS_FK", carId, Cars)(_.id)
  }

  // implicit val session = Database.forURL("jdbc:h2:mem:test1", driver = "org.h2.Driver").createSession()
  implicit val session = Database.forURL("jdbc:sqlserver://localhost:1433;DatabaseName=playground",
    user = "playground",
    password = "playground",
    driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver").createSession()

  val ddl = Cars.ddl ++ Persons.ddl
  session.withTransaction {
    ddl.create

    val p1 = Person(None, "Rob", 44, None)
    val p2 = Person(None, "Sepp", 33, None)

    Persons.insertAll(p1, p2)

    println((for (p <- Persons) yield p).list.mkString(", "))
  }
}

Any ideas how to set this up properly? Thanks.

Cheers, Rob.

Scala 2.9.0-1 artifact

Using Scala 2.9.0-1, but...

$ sbt run
[info] Set current project to default (in build file:/home/yang/pod/sales/web/)
[info] Updating...
[warn]  module not found: org.scalaquery#scalaquery_2.9.0-1;0.9.4
[warn] ==== local: tried
[warn]   /home/yang/.ivy2/local/org.scalaquery/scalaquery_2.9.0-1/0.9.4/ivys/ivy.xml
[warn]   -- artifact org.scalaquery#scalaquery_2.9.0-1;0.9.4!scalaquery_2.9.0-1.jar:
[warn]   /home/yang/.ivy2/local/org.scalaquery/scalaquery_2.9.0-1/0.9.4/jars/scalaquery_2.9.0-1.jar
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/scalaquery/scalaquery_2.9.0-1/0.9.4/scalaquery_2.9.0-1-0.9.4.pom
[warn]   -- artifact org.scalaquery#scalaquery_2.9.0-1;0.9.4!scalaquery_2.9.0-1.jar:
[warn]   http://repo1.maven.org/maven2/org/scalaquery/scalaquery_2.9.0-1/0.9.4/scalaquery_2.9.0-1-0.9.4.jar
...
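The resolution failure above would be produced by a cross-versioned dependency along these lines (a sketch; the actual build definition is not shown in the report):

```scala
// sbt cross-versioning: %% appends the project's Scala version (here 2.9.0-1)
// to the artifact name, yielding scalaquery_2.9.0-1, which was never published.
val scalaQuery = "org.scalaquery" %% "scalaquery" % "0.9.4"
```

Until a 2.9.0-1 build is published, a workaround is to pin the artifact name explicitly, e.g. "org.scalaquery" % "scalaquery_2.9.0" % "0.9.4", assuming the 2.9.0 build is binary compatible.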

Verify data against definition

Due to a design decision of ScalaQuery, SQL NULLs are not mapped to Scala nulls.

However, they are not validated either. For example: if a column contains NULLs in the database and you do not define it as Option + O.Nullable in ScalaQuery, your data will be cast to the default value for the type (which is not null).

The design of ScalaQuery is one matter, but this leak is another issue: data should not be cast, but checked. If the database definition and the ScalaQuery definition are incompatible with each other, an exception should be thrown.

This leak is nasty. I am writing this report after spending around 1.5 days watching and verifying otherwise perfect code with one mistake: I forgot to add Option to a column definition, and the datetime NULLs were saved as 1970-something.

Summary: since by design Scala nulls are not used for SQL NULLs, they are invalid entities. Usage of such entities should be forbidden, not silently accepted.
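For illustration, the two column definitions in question could look like this (a sketch with a hypothetical CREATED_AT column):

```scala
// Leaky: SQL NULLs in CREATED_AT are silently turned into the type's
// default value (e.g. a 1970 epoch timestamp) instead of raising an error.
def createdAt = column[Timestamp]("CREATED_AT")

// Correct alternative: NULLs map to None.
def createdAt = column[Option[Timestamp]]("CREATED_AT", O.Nullable)
```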

Aggregation missing GROUP BY clause

For background, see:

https://groups.google.com/d/msg/scalaquery/yqpHbTJb4-k/ECWwOO_MX-AJ

To reproduce, put the following code into a file and run with sbt:

import scala.slick.driver.PostgresDriver.simple._

object Roster extends Table[(Int,Int)]("roster") {
  def playerId = column[Int]("player_id")
  def gamesPlayed = column[Int]("games_played")
  def * = playerId ~ gamesPlayed
}

object Player extends Table[(Int)]("player") {
  def id = column[Int]("id")
  def * = id
}

object Scoring extends Table[(Int,Int,Int)]("scoring") {
  def playerId = column[Int]("player_id")
  def goal = column[Int]("goal")
  def assist = column[Int]("assist")
  def * = playerId ~ goal ~ assist
}

object CompositionError {

  def main (args: Array[String]) {

    val player = for {
      r <- Roster
      p <- Player if r.playerId is p.id
    } yield (r, p)

    val playerDetail = for {
      (r, p) <- player
    } yield (p.id, r.gamesPlayed)

    val scoring = for {
      (r, p) <- player
      s <- Scoring if p.id is s.playerId
      detail <- playerDetail
    } yield (r, p, s, detail)

    val scoringDetail = for {
      (r, p, s, detail) <- scoring
      (total, goals, assists) = ( s.playerId.count, s.goal.sum, s.assist.sum )
      ppg = ( (s.playerId.count / r.gamesPlayed) ).asColumnOf[Option[Float]]
    } yield (goals, assists, total, ppg)

    println ("The erroneous statement is:\n" + scoringDetail.selectStatement)
  }
}

Table definitions in companion object result in generation of invalid SQL

Stefan,

Just so this is recorded: defining tables in companion objects and then using them in instances of the companion class results in the generation of invalid SQL, apparently because the SQL generation process can't tell when two or more references to a table should be coalesced into one "foo t1" in the "from" clause. You commented on IRC that this may be due to cloning not playing nicely with companion objects, or it could actually be a Scala 2.8.0 bug, IIRC.

scala_2.9.1 build needed

Maybe scala-tools is no longer where you put the builds, but I only saw a scala_2.9.0 version there. I am guessing it is compatible, but I would feel more confident with a 2.9.1 build up there as well.
