Get or Insert within a Transaction with Doobie in Scala

I'm reading through the Doobie documentation and trying to do a simple get-or-create within a transaction. I get an Option from the first query and attempt a getOrElse, running an insert inside the else; however, I keep getting value map is not a member of Any at the getOrElse call. What's the correct way to either get an existing row or create a new row in instances and return the result, all in one transaction?
import doobie._
import doobie.implicits._
import cats._
import cats.effect._
import cats.implicits._
import org.joda.time.DateTime
import scala.concurrent.ExecutionContext
case class Instance(id : Int, hostname : String)
case class User(id : Int, instanceId: Int, username : String, email : String, created : DateTime)
class Database(dbUrl : String, dbUser: String, dbPass: String) {
  implicit val cs = IO.contextShift(ExecutionContext.global)

  val xa = Transactor.fromDriverManager[IO](
    "org.postgresql.Driver", dbUrl, dbUser, dbPass
  )

  def getOrCreateInstance(hostname: String) = for {
    existingInstance <- sql"SELECT id, hostname FROM instances i WHERE i.hostname = $hostname".query[Instance].option
    ensuredInstance <- existingInstance.getOrElse(sql"INSERT INTO instances(hostname) VALUES(?)".update.withGeneratedKeys[Instance]("id", "hostname"))
  } yield ensuredInstance
}

I got the following answer thanks to the people in the #scala channel on Freenode. I'm posting it here for completeness, and for anyone interested in doing this without the for comprehension used in the other answer.
def getOrCreateInstance(hostname: String): ConnectionIO[Instance] =
  OptionT(sql"SELECT id, hostname FROM instances i WHERE i.hostname = $hostname".query[Instance].option)
    .getOrElseF(sql"INSERT INTO instances(hostname) VALUES($hostname)".update.withGeneratedKeys[Instance]("id", "hostname").compile.lastOrError)

I believe something like this should work for you,
def getOrCreateInstance(hostname: String): ConnectionIO[Instance] = for {
existingInstance <- sql"SELECT id, hostname FROM instances i WHERE i.hostname = $hostname".query[Instance].option
ensuredInstance <- existingInstance.fold(sql"INSERT INTO instances(hostname) VALUES($hostname)".update.withGeneratedKeys[Instance]("id", "hostname").take(1).compile.lastOrError)(_.pure[ConnectionIO])
} yield ensuredInstance
Here you compile the fs2 Stream returned by withGeneratedKeys, and lift the existing instance into ConnectionIO (via .pure[ConnectionIO]) in the case where it already exists. (The original error arose because calling getOrElse on the Option[Instance] with a Stream as the default widens the result type to Any, so the for comprehension has no map to call, hence value map is not a member of Any.)
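For completeness, here is a minimal usage sketch (not part of the original answers) showing how either version of getOrCreateInstance can be run against the Transactor defined in the question; the single .transact(xa) call is what makes the select-then-insert execute in one transaction. Note that OptionT in the first version lives in cats.data, so import cats.data.OptionT is also needed.

// Hypothetical helper inside the Database class from the question:
// handing the whole ConnectionIO to the transactor at once means the
// SELECT and the INSERT share a single database transaction.
def ensureInstance(hostname: String): IO[Instance] =
  getOrCreateInstance(hostname).transact(xa)

// e.g. ensureInstance("db01.example.com").unsafeRunSync()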

Related

How to deserialize JSON from Kafka Consumer Record

I'm looking to access some fields on a Kafka ConsumerRecord. I'm able to receive the event data, which is a Java object, i.e. ConsumerRecord(topic = test.topic, partition = 0, leaderEpoch = 0, offset = 0, CreateTime = 1660933724665, serialized key size = 32, serialized value size = 394, headers = RecordHeaders(headers = [], isReadOnly = false), key = db166cbf1e9e438ab4eae15093f89c34, value = {"eventInfo":...}).
I'm able to access the eventInfo values, which come back as a JSON string. I'm fairly new to Kotlin and Kafka, so I'm not entirely sure this is correct, but I basically want to access the fields in value. I can't get rid of the error that appears when I try to use mapper.readValue, which is:
None of the following functions can be called with the arguments supplied.
import com.afterpay.shop.favorites.model.Product
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import org.apache.avro.generic.GenericData.Record
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.springframework.kafka.annotation.KafkaListener
import org.springframework.kafka.support.Acknowledgment
import org.springframework.stereotype.Component
@Component
class KafkaConsumer {
    @KafkaListener(topics = ["test.topic"], groupId = "group-id")
    fun consume(consumerRecord: ConsumerRecord<String, Any>, ack: Acknowledgment) {
        val mapper = jacksonObjectMapper()
        val value = consumerRecord.value()
        val record = mapper.readValue(value, Product::class.java)
        println(value)
        ack.acknowledge()
    }
}
Is this the correct way to accomplish this?
First, change ConsumerRecord<String, Any> to ConsumerRecord<String, Product>, then change value.deserializer in your consumer config/factory to use JsonDeserializer.
Then your consumerRecord.value() will already be a Product instance, and you don't need an ObjectMapper
https://docs.spring.io/spring-kafka/docs/current/reference/html/#json-serde
Otherwise, if you use StringDeserializer, change Any to String so that the mapper.readValue argument types are correct.

How do I make and access regex capture groups in Django without RawSQL?

How do I annotate a Django queryset with a Regex capture group without using RawSQL so that I later can use that value for filtering and sorting?
For example, in PostgreSQL I could make the following query:
CREATE TABLE foo (id varchar(100));
INSERT INTO foo (id) VALUES ('disk1'), ('disk10'), ('disk2');
SELECT
  "foo"."id",
  CAST((regexp_matches("foo"."id", '^(.*\D)([0-9]*)$'))[2] AS integer) as grp2
FROM "foo"
ORDER BY "grp2"
dbfiddle
You can get this working with a custom Func subclass, but I would like to implement it in a better way: as a normal function that can be combined with other functions, annotations, and so on, like a regular building block in the Django ORM ecosystem.
I would like to start with a "beta version" of the class, which looks like this:
from django.db.models.expressions import Func, Value


class RegexpMatches(Func):
    function = 'REGEXP_MATCHES'

    def __init__(self, source, regexp, flags=None, group=None, output_field=None, **extra):
        template = '%(function)s(%(expressions)s)'
        if group:
            if not hasattr(regexp, 'resolve_expression'):
                regexp = Value(regexp)
            template = '({})[{}]'.format(template, str(group))
        expressions = (source, regexp)
        if flags:
            if not hasattr(flags, 'resolve_expression'):
                flags = Value(flags)
            expressions += (flags,)
        self.template = template
        super().__init__(*expressions, output_field=output_field, **extra)
and a fully working example for an admin interface:
from django.contrib.admin import ModelAdmin, register
from django.db.models import IntegerField
from django.db.models.functions import Cast
from django.db.models.expressions import Func, Value

from .models import Foo


class RegexpMatches(Func):
    function = 'REGEXP_MATCHES'

    def __init__(self, source, regexp, flags=None, group=None, output_field=None, **extra):
        template = '%(function)s(%(expressions)s)'
        if group:
            if not hasattr(regexp, 'resolve_expression'):
                regexp = Value(regexp)
            template = '({})[{}]'.format(template, str(group))
        expressions = (source, regexp)
        if flags:
            if not hasattr(flags, 'resolve_expression'):
                flags = Value(flags)
            expressions += (flags,)
        self.template = template
        super().__init__(*expressions, output_field=output_field, **extra)


@register(Foo)
class Foo(ModelAdmin):
    list_display = ['id', 'required_field', 'required_field_string']

    def get_queryset(self, request):
        qs = super().get_queryset(request)
        return qs.annotate(
            required_field=Cast(RegexpMatches('id', r'^(.*\D)([0-9]*)$', group=2), output_field=IntegerField()),
            required_field_string=RegexpMatches('id', r'^(.*\D)([0-9]*)$', group=2)
        )

    def required_field(self, obj):
        return obj.required_field

    def required_field_string(self, obj):
        return obj.required_field_string
As you can see, I've added two annotations: one is output as a number and the other as a plain string. Of course, the difference isn't visible in the admin interface, but it is in the SQL that gets executed:
SELECT "test_foo"."id" AS Col1,
       ((REGEXP_MATCHES("test_foo"."id", '^(.*\D)([0-9]*)$'))[2])::integer AS "required_field",
       (REGEXP_MATCHES("test_foo"."id", '^(.*\D)([0-9]*)$'))[2] AS "required_field_string"
FROM "test_foo"
GitHub gist with better source code formatting: https://gist.github.com/phpdude/50675114aaed953b820e5559f8d22166
From Django 1.8 onwards, you can use Func() expressions.
from django.db.models import Func


class EndNumeric(Func):
    function = 'REGEXP_MATCHES'
    template = "(%(function)s(%(expressions)s, '^(.*\D)([0-9]*)$'))[2]::integer"


qs = Foo.objects.annotate(
    grp2=EndNumeric('id'),
).values('id', 'grp2').order_by('grp2')
Reference: Get sorted queryset by specified field with regex in django

Is it possible to parameterize queries or parameters for an Acolyte ScalaCompositeHandler?

Background:
I have attempted to accomplish what is described in the linked question, and I have not been able to succeed. Acolyte requires you to define the queries and parameters you want to handle within a match expression, and the values used in match expressions must be known at compile time. (Note, however, that this StackOverflow answer appears to provide a way around this limitation.)
If this is indeed not possible, the inability to dynamically define the parameters and queries for Acolyte would be, for my use case, a severe limitation of the framework. I suspect this would be a limitation for others as well.
One SO user who has advocated for the use of Acolyte across a handful of questions stated in this comment that it is possible to dynamically define queries and their responses. So, I have opened this question as an invitation for someone to show that to be the case.
Question:
Using Acolyte, I want to be able to encapsulate the logic for matching queries and generating their responses. This is a desired feature because I want to keep my code DRY. In other words, I am looking for something like the following pseudo-code:
def generateHandler(query: String, accountId: Int, parameters: Seq[String]): ScalaCompositeHandler = AcolyteDSL.handleQuery {
  parameters.foreach(p =>
    // Tell the handler to handle this specific parameter
    case acolyte.jdbc.QueryExecution(query, ExecutedParameter(accountId) :: ExecutedParameter(p) :: Nil) =>
      someResultFunction(p)
  )
}
Is this possible in Acolyte? If so, please provide an example.
It is indeed possible to parameterize queries and/or parameters by utilizing pattern matching.
See the code below for an example:
import java.sql.DriverManager
import acolyte.jdbc._
import acolyte.jdbc.Implicits._
import org.scalatest.FunSpec
class AcolyteTest extends FunSpec {

  describe("Using pattern matching to extract a query parameter") {

    it("should extract the parameter and make it usable for dynamic result returning") {
      val query = "SELECT someresult FROM someDB WHERE id = ?"
      val rows = RowLists.rowList1(classOf[String] -> "someresult")

      val handlerName = "testOneHandler"
      val handler = AcolyteDSL.handleQuery {
        case acolyte.jdbc.QueryExecution(`query`, ExecutedParameter(id) :: _) =>
          rows.append(id.toString)
      }
      Driver.register(handlerName, handler)

      val connection = DriverManager.getConnection(s"jdbc:acolyte:anything-you-want?handler=$handlerName")
      val preparedStatement = connection.prepareStatement(query)
      preparedStatement.setString(1, "hello world")
      val resultSet = preparedStatement.executeQuery()
      resultSet.next()

      assertResult(resultSet.getString(1))("hello world")
    }

    it("should support a slightly more complex example") {
      val firstResult = "The first result"
      val secondResult = "The second result"

      val query = "SELECT someresult FROM someDB WHERE id = ?"
      val rows = RowLists.rowList1(classOf[String] -> "someresult")

      val results: Map[String, RowList1.Impl[String]] = Map(
        "one" -> rows.append(firstResult),
        "two" -> rows.append(secondResult)
      )

      def getResult(parameter: String): QueryResult = {
        results.get(parameter) match {
          case Some(row) => row.asResult()
          case _ => acolyte.jdbc.QueryResult.Nil
        }
      }

      val handlerName = "testTwoHandler"
      val handler = AcolyteDSL.handleQuery {
        case acolyte.jdbc.QueryExecution(`query`, ExecutedParameter(id) :: _) =>
          getResult(id.toString)
      }
      Driver.register(handlerName, handler)

      val connection = DriverManager.getConnection(s"jdbc:acolyte:anything-you-want?handler=$handlerName")
      val preparedStatement = connection.prepareStatement(query)

      preparedStatement.setString(1, "one")
      val resultSetOne = preparedStatement.executeQuery()
      resultSetOne.next()
      assertResult(resultSetOne.getString(1))(firstResult)

      preparedStatement.setString(1, "two")
      val resultSetTwo = preparedStatement.executeQuery()
      resultSetTwo.next()
      assertResult(resultSetTwo.getString(1))(secondResult)
    }
  }
}
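Building on the same technique, the generateHandler shape from the question can be approximated by closing over the expected query and a map from parameter values to results when constructing the handler. This is an untested sketch, not part of the original answer; the results map and the exact matching on accountId are illustrative assumptions:

// Hypothetical factory: the backquoted identifiers match the captured query string
// and account id, and the second executed parameter is used to look up a result.
def generateHandler(query: String, accountId: Int, results: Map[String, QueryResult]): ScalaCompositeHandler =
  AcolyteDSL.handleQuery {
    case acolyte.jdbc.QueryExecution(`query`, ExecutedParameter(`accountId`) :: ExecutedParameter(p) :: Nil) =>
      results.getOrElse(p.toString, acolyte.jdbc.QueryResult.Nil)
  }

// e.g. Driver.register("dynamicHandler",
//   generateHandler(query, 42, Map("one" -> rows.append(firstResult).asResult())))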

ReactiveMongo: How to write a Macros handler for an Enumeration object?

I use ReactiveMongo 0.10.0, and I have the following user case class and gender Enumeration object:
case class User(
  _id: Option[BSONObjectID] = None,
  name: String,
  gender: Option[Gender.Gender] = None)

object Gender extends Enumeration {
  type Gender = Value
  val MALE = Value("male")
  val FEMALE = Value("female")
  val BOTH = Value("both")
}
And I declare two implicit Macros handlers:
implicit val genderHandler = Macros.handler[Gender.Gender]
implicit val userHandler = Macros.handler[User]
but when I run the application, I get the following error:
Error:(123, 48) No apply function found for reactive.userservice.Gender.Gender
implicit val genderHandler = Macros.handler[Gender.Gender]
^
Error:(125, 46) Implicit reactive.userservice.Gender.Gender for 'value gender' not found
implicit val userHandler = Macros.handler[User]
^
Does anybody know how to write a Macros handler for an Enumeration object?
Thanks in advance!
I stumbled upon your question a few times searching for the same answer. I did it this way:
import myproject.utils.EnumUtils
import play.api.libs.json.{Reads, Writes}
import reactivemongo.bson._
object DBExecutionStatus extends Enumeration {
  type DBExecutionStatus = Value
  val Error = Value("Error")
  val Started = Value("Success")
  val Created = Value("Running")

  implicit val enumReads: Reads[DBExecutionStatus] = EnumUtils.enumReads(DBExecutionStatus)
  implicit def enumWrites: Writes[DBExecutionStatus] = EnumUtils.enumWrites

  implicit object BSONEnumHandler extends BSONHandler[BSONString, DBExecutionStatus] {
    def read(doc: BSONString) = DBExecutionStatus.Value(doc.value)
    def write(stats: DBExecutionStatus) = BSON.write(stats.toString)
  }
}
You have to create a read/write pair by hand and populate it with your values.
Hope you have already solved this issue by now, given the question's age :D
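For the Gender enumeration from the original question, the same hand-written approach might look roughly like this. This is an untested sketch against the 0.10.x BSON API, using withName so that reads resolve to the already-declared values:

import reactivemongo.bson._

object Gender extends Enumeration {
  type Gender = Value
  val MALE = Value("male")
  val FEMALE = Value("female")
  val BOTH = Value("both")

  // Hand-written replacement for Macros.handler[Gender.Gender]
  implicit object GenderHandler extends BSONHandler[BSONString, Gender] {
    def read(bson: BSONString): Gender = Gender.withName(bson.value)
    def write(gender: Gender): BSONString = BSONString(gender.toString)
  }
}

With this handler in scope, Macros.handler[User] should then be able to derive the User handler, since the only missing piece was an implicit handler for Gender.Gender.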

Scala SQL DSL (Internal/External)

I have been looking into Scala, primarily at how to build a DSL similar to C# LINQ/SQL. Having worked with the C# LINQ query provider, it was easy to introduce our own custom query provider that translated LINQ queries into our proprietary data store scripts. I am looking for something similar in Scala, e.g.
val query = select Min(Close), Max(Close)
from StockPrices
where open > 0
First of all, is this even possible to achieve in Scala using an internal DSL?
Any thoughts/ideas in this regard are highly appreciated.
I am still new to the Scala space, but I have started looking into Scala metaprogramming and Slick. My complaint with Slick is that I want my DSL to stay close to the SQL query, similar to the syntax above.
There is no way to have an internal DSL (with the current release) that looks exactly like the example you provided.
Using a macro I still had from this answer, the closest I could get (relatively fast) was:
select(Min(StockPrices.Open), Max(StockPrices.Open))
.from(StockPrices)
A real solution would take quite some time to create. If you are willing to do that, you could get quite far using macros (not a simple topic).
If you really want the exact same syntax, I recommend something like Xtext, which lets you create a DSL with an Eclipse-based editor for 'free'.
The code required for the above example (I did not include the mentioned macro):
trait SqlElement {
  def toString(): String
}

trait SqlMethod extends SqlElement {
  protected val methodName: String
  protected val arguments: Seq[String]

  override def toString() = {
    val argumentsString = arguments mkString ","
    s"$methodName($argumentsString)"
  }
}

case class Select(elements: Seq[SqlElement]) extends SqlElement {
  override def toString() = s"SELECT ${elements mkString ", "}"
}

case class From(table: Metadata) extends SqlElement {
  private val tableName = table.name
  override def toString() = s"FROM $tableName"
}

case class Min(element: Metadata) extends SqlMethod {
  val methodName = "Min"
  val arguments = Seq(element.name)
}

case class Max(element: Metadata) extends SqlMethod {
  val methodName = "Max"
  val arguments = Seq(element.name)
}

class QueryBuilder(elements: Seq[SqlElement]) {
  def this(element: SqlElement) = this(Seq(element))

  def from(o: Metadata) = new QueryBuilder(elements :+ From(o))
  def where(element: SqlElement) = new QueryBuilder(elements :+ element)

  override def toString() = elements mkString ("\n")
}

def select(args: SqlElement*) = new QueryBuilder(Select(args))

trait Column
object Column extends Column

object tables {
  object StockPrices$ {
    val Open: Column = Column
    val Close: Column = Column
  }
  val StockPrices = StockPrices$
}
And then to use it:
import tables._
import StockPrices._
select(Min(StockPrices.Open), Max(StockPrices.Open))
.from(StockPrices)
.toString
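Assuming the macro from the linked answer gives the table and column metadata their source names, the resulting string would look roughly like:

SELECT Min(Open), Max(Open)
FROM StockPrices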
That is an admirable project, but it is one that has already been embarked upon and is available in general release.
I'm talking about Slick, of course.
If Scala / Java interoperability is not too much of an issue for you, and if you're willing to use an internal DSL with a couple of syntax quirks compared to the syntax you have suggested, then jOOQ is growing to be a popular alternative to Slick. An example from the jOOQ manual:
for (r <- e
       select (
         T_BOOK.ID * T_BOOK.AUTHOR_ID,
         T_BOOK.ID + T_BOOK.AUTHOR_ID * 3 + 4,
         T_BOOK.TITLE || " abc" || " xy"
       )
       from T_BOOK
       leftOuterJoin (
         select (x.ID, x.YEAR_OF_BIRTH)
         from x
         limit 1
         asTable x.getName()
       )
       on T_BOOK.AUTHOR_ID === x.ID
       where (T_BOOK.ID <> 2)
       or (T_BOOK.TITLE in ("O Alquimista", "Brida"))
       fetch
     ) {
  println(r)
}