July 3, 2013

Play! 2.1 RowParsers and LEFT JOINs

If you've used Play! Framework version 2.1, and you've tried out the RowParser API, you may have wondered, as I have, how to handle LEFT JOIN queries, since the rows from the left table will appear at least once and the rows from the right table may or may not appear. If the right table does appear, you want to parse the data using an existing RowParser, rather than having some custom parser and case class which has all Option fields.

My initial approach, inspired by this question on the Play! Framework Google Group, was a bit parentheses-heavy.
// Parser for "foo" table
val foo = get[Long]("foo.id")

// Parser for "bar" table
val bar = get[Long]("bar.id")

// Combined parser
val combo = foo ~ (bar ?)

// Results as List[~[Long, Option[Long]]]
val results = SQL("SELECT * FROM foo LEFT JOIN bar ON foo.id = bar.foo_id").as(combo *)

Since the parentheses bothered me a bit, I enriched the RowParser class with a convenience method ~? that combines the two combinators.
implicit def enrichRowParser[A](a: RowParser[A]) = new RichRowParser(a)

class RichRowParser[A](val delegate: RowParser[A]) extends AnyVal {

  def ~?[B](p: RowParser[B]) = {
    delegate ~ (p ?)
  }

}

The declaration for combo can then be rewritten without the parentheses, which reads a bit better.
val combo = foo ~? bar
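As an aside, once the rows are parsed into foo ~ Option[bar] pairs, a common next step is to collapse the flat join output back into one entry per left-table row. Here is a minimal sketch of that grouping step using plain collections; Foo, Bar, and the sample rows are hypothetical stand-ins for the parsed results, not anorm itself:

```scala
// Hypothetical domain classes standing in for the parsed columns.
case class Foo(id: Long)
case class Bar(id: Long)

// Simulated LEFT JOIN output: one pair per row, None where bar was absent.
val rows: List[(Foo, Option[Bar])] = List(
  (Foo(1), Some(Bar(10))),
  (Foo(1), Some(Bar(11))),
  (Foo(2), None)
)

// Collapse the flat join rows into one entry per foo with all of its bars;
// a foo with no matching bar ends up with an empty list.
val grouped: Map[Foo, List[Bar]] =
  rows.groupBy(_._1).map { case (f, rs) => f -> rs.flatMap(_._2) }
```

The same fold works on the real parser output after mapping each ~ pair to a tuple.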

January 1, 2013

Using PostgreSQL's native JSON support in Play Framework 2.1-RC1

I have been experimenting with PostgreSQL 9.2's native json datatype, and wanted to have Play Framework 2.1-RC1 automatically convert the raw PostgreSQL JDBC PGobject object into a Play-specific JsValue object. Neither Play nor JDBC understands PostgreSQL's native json datatype, so neither library provides an implicit conversion. Automatic conversion can be performed by defining a parameter extractor. Without a parameter extractor, a compilation error will occur:
could not find implicit value for parameter extractor: anorm.Column[play.api.libs.json.JsValue]
In my case, I could have worked around this by extracting the value as a String and parsing it. Note that because the data field is marked NOT NULL, I don't need to use an Option[String] in the extractor.
get[Pk[Long]]("sample.id") ~
get[String]("sample.data") map {
  case id~data => Sample(id, Json.parse(data))
}
For INSERTs and UPDATEs, I'm using the PostgreSQL CAST keyword, as I haven't experimented with getting the JsValue object converted transparently into a JDBC object (presumably of type PGobject), or with defining a new varchar-to-json CAST on the database side. Without the CAST() the INSERT fails because the JDBC driver doesn't set the proper type:
org.postgresql.util.PSQLException: ERROR: column "data" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
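The database-side cast mentioned above might look something like this. This is an untested sketch: CREATE CAST with WITH INOUT converts via the types' text representations, the cast applies database-wide, and creating it requires superuser privileges.

```sql
-- Untested sketch: let PostgreSQL implicitly accept varchar where json
-- is expected, converting through the text I/O of both types.
CREATE CAST (character varying AS json) WITH INOUT AS IMPLICIT;
```

With such a cast in place, the explicit CAST({data} AS json) in the INSERT below should no longer be necessary.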
Here is the sample schema:
CREATE TABLE
  sample
(
  id serial PRIMARY KEY,
  data json NOT NULL
);
Here's how a Sample model looks:
package models

import anorm._
import anorm.SqlParser._
import play.api.db._
import play.api.Play.current
import play.api.libs.json.{Json,JsValue}

case class Sample (
  id: Pk[Long] = NotAssigned,
  data: JsValue
)

object Sample {

  val extractor = {
    get[Pk[Long]]("sample.id") ~
    get[JsValue]("sample.data") map {
      case id~data => Sample(id, data)
    }
  }

  implicit def rowToJsValue: Column[JsValue] = Column.nonNull { (value, meta) =>
    val MetaDataItem(qualified, nullable, clazz) = meta
    value match {
      case pgo: org.postgresql.util.PGobject => Right(Json.parse(pgo.getValue))
      case _ => Left(TypeDoesNotMatch("Cannot convert " + value + ":" +
          value.asInstanceOf[AnyRef].getClass + " to JsValue for column " + qualified))
    }
  }

  def list() = {
    DB.withConnection { implicit connection =>
      SQL(
        """
          SELECT
            *
          FROM
            sample
        """
      ).as(Sample.extractor *)
    }
  }

  def create(data: JsValue) = {
    DB.withConnection { implicit connection =>
      SQL(
        """
          INSERT INTO
            sample
          (
            data
          )
          VALUES(
            CAST({data} AS json)
          )
          RETURNING id
        """
      ).on(
        'data -> Json.stringify(data)
      ).as(scalar[Long].single)
    }
  }
}
This approach is inspired by and adapted from a Stack Overflow question.