I did check the similar question here, but it didn't work as expected for me: I still need to reconnect to the created database, and I'm not sure how, or even if, I can avoid that.
Here is my code:
hikari.properties:
jdbcUrl=jdbc:mariadb://localhost:3306/
driverClassName=org.mariadb.jdbc.Driver
username=root
# this doesn't seem to do much; I'm getting the same behavior with or without it
dataSource.databaseName=DBNAME
fun initDB() {
    val config = HikariConfig("/hikari.properties")
    val ds = HikariDataSource(config)
    transaction(connect(ds)) {
        SchemaUtils.createDatabase("DBNAME")
    }

    config.jdbcUrl = "jdbc:mariadb://localhost:3306/DBNAME"
    //ds.jdbcUrl = "jdbc:mariadb://localhost:3306/DBNAME" //THIS WILL NOT WORK
    val ds2 = HikariDataSource(config)
    transaction(connect(ds2)) {
        SchemaUtils.create( Tables... )
    }
}
The reason I make a new datasource is that otherwise I get this error:
java.lang.IllegalStateException: The configuration of the pool is sealed once started. Use HikariConfigMXBean for runtime changes.
HikariConfigMXBean doesn't seem to allow jdbcUrl changes.
There must be a more elegant way to do this, right?
This is not good practice. Creating and filling a database are two separate processes: the DBA creates the database once, and then you just connect and use it. Your approach also has a significant security implication, because the user behind the datasource must have the CREATE DATABASE privilege.
But if you want to proceed with your current approach, you should first create the database without using the datasource, then create the datasource and connect to the database. Something like this:
import com.zaxxer.hikari.HikariConfig
import com.zaxxer.hikari.HikariDataSource
import org.jetbrains.exposed.sql.Database
import org.jetbrains.exposed.sql.SchemaUtils
import org.jetbrains.exposed.sql.Table
import org.jetbrains.exposed.sql.transactions.transaction
fun main(args: Array<String>) {
    // Load properties
    val config = HikariConfig("/hikari.properties")
    // hikari.properties content:
    // jdbcUrl=jdbc:mysql://localhost:3306/hikari
    // driverClassName=com.mysql.cj.jdbc.Driver
    // username=root
    // password=<pwd here>

    // Here, strip the database name from the jdbc url so we connect to the server only
    transaction(Database.connect(url = config.jdbcUrl.replace("hikari", ""),
            driver = config.driverClassName,
            user = config.username,
            password = config.password)) {
        SchemaUtils.createDatabase("hikari")
    }

    val ds = HikariDataSource(config)
    transaction(Database.connect(ds)) {
        SchemaUtils.create(Users)
    }
}

object Users : Table() {
    val id = varchar("id", 10) // Column<String>
    val name = varchar("name", length = 50) // Column<String>

    override val primaryKey = PrimaryKey(id, name = "PK_User_ID")
}
Related
I'm trying to pass a data class User from one Activity to another using an Intent.
My putExtra, called from my observe function, looks like this:
val intent = Intent(this, MainActivity::class.java)
intent.putExtra("userData",userData)
startActivity(intent)
My get routine looks like this:
userData = intent.getParcelableExtra<User>("userData") as User
or
userData = intent.getParcelableExtra("userData")
My problem is that Android Studio strikes out the function as deprecated. My User data class is annotated with @Parcelize, and getParcelableExtra ends up struck out either way.
I've added this to my Gradle build:
id 'kotlin-parcelize'
I've read several posts saying that Parcelable is more modern than Serializable, so that's the technique I'm using, but all the posts are from 2018 or earlier and many of them are in Java.
How does one send a data class from one Activity to another using an Intent?
Since getParcelableExtra(String name) is deprecated as of API level 33, you can use getParcelableExtra(String name, Class<T> clazz) from API level 33 onwards.
In your case use:
val userData =
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
        intent.getParcelableExtra("userData", User::class.java)
    } else {
        @Suppress("DEPRECATION")
        intent.getParcelableExtra<User>("userData")
    }
where TIRAMISU is the constant value 33.
For more information, read this:
https://developer.android.com/reference/android/content/Intent#getParcelableExtra(java.lang.String,%20java.lang.Class%3CT%3E)
SOLUTION:
Given the need for API level 33 for the most modern solution, I went with a more backward-compatible one. This code translates the data object into a JSON string, then back into the object.
SETUP: (PUT)
private var _userData = MutableLiveData<User>() // does the username and pw validate?
val userData: LiveData<User>
    get() = _userData
SETUP: (GET)
private lateinit var userData: User
PUT CODE:
val intent = Intent(this, MainActivity::class.java)
val jsonUserData = Gson().toJson(userData)
intent.putExtra("userData",jsonUserData)
GET CODE:
val jsonUserData = intent.getStringExtra("userData")
userData = Gson().fromJson(jsonUserData, User::class.java)
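For reference, a @Parcelize data class along the lines of what the question describes might look like the sketch below. The actual fields of User are never shown, so these are hypothetical; such a class works with both routes (Parcelable for putExtra, Gson for the string round-trip above):
import android.os.Parcelable
import kotlinx.parcelize.Parcelize

// Hypothetical fields; the real User class is not shown in the question.
@Parcelize
data class User(
    val username: String = "",
    val password: String = ""
) : Parcelable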
I have an annotation processor that should generate a class MyGeneratedClass containing a variable of another class, MyEntity.
My code inside the process function:
val elementsWithAnnotation = roundEnv.getElementsAnnotatedWith(MyClass::class.java)
if (elementsWithAnnotation.isEmpty()) {
    return true
}

val fileName = "MyGeneratedClass"
val packageName = "me.myname.sdk.generated"
val classBuilder = TypeSpec.classBuilder(fileName)

for (element in elementsWithAnnotation) {
    val ann = element.getAnnotation(MyClass::class.java)
    println("package: " + ann.javaClass.packageName)
    val variableBuilder =
        PropertySpec.varBuilder(
            name = element.simpleName.toString(),
            type = ClassName("", element.asType().asTypeName().asNullable().toString()),
        ).initializer("null")
    classBuilder
        .addProperty(variableBuilder.build())
}

val file = FileSpec.builder(packageName, fileName)
    .addType(classBuilder.build())
    .build()

val generatedDirectory = processingEnv.options[KAPT_KOTLIN_GENERATED_OPTION_NAME]
file.writeTo(File(generatedDirectory, "$fileName.kt"))
return true
But the generated code is missing the import for MyEntity:
package me.myname.sdk.generated

class MyGeneratedClass {
    var MyEntity: MyEntity? = null
}
When looking inside the generated file, IntelliJ suggests importing MyEntity, which resolves the error. But how can I make sure the import for MyEntity is added when generating the file?
Looking at the KotlinPoet documentation https://square.github.io/kotlinpoet/1.x/kotlinpoet/kotlinpoet/com.squareup.kotlinpoet/-class-name/index.html it seems like the first argument in your code, which is an empty string, is the package name you are missing in the generated code.
In my experience KotlinPoet is much happier generating code that lives in packages; it sometimes does silly things with types in the root/default package.
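A minimal sketch of that fix, slotting into the loop from the question: instead of flattening the type into a string with an empty package, keep the element's own TypeName (which carries the package), or pass the real package to ClassName. Assumes a reasonably recent KotlinPoet where varBuilder/asNullable have been replaced by builder/mutable/copy:
// Sketch only: keep the package information on the type so KotlinPoet can emit the import.
val entityType = element.asType().asTypeName().copy(nullable = true) // e.g. me.myname.MyEntity?

val variableBuilder = PropertySpec.builder(
    name = element.simpleName.toString(),
    type = entityType
)
    .mutable(true)       // generate a var, like varBuilder did
    .initializer("null")

classBuilder.addProperty(variableBuilder.build())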
Normally I use Code A to create the database Tasks.db with Room when the app runs for the first time. I'd like Room not to create Tasks.db again when I run the app again. How can I do that?
Code A
val result = Room.databaseBuilder(
    context.applicationContext,
    ToDoDatabase::class.java, "Tasks.db"
).build()
This is safe to use as it is. You'll only get a new ToDoDatabase instance that you can access your database file through, but the file on disk won't be erased and recreated if it already exists.
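If the concern is also about building more than one ToDoDatabase instance per process, a common pattern (a sketch, not something from the question's code; it assumes the question's ToDoDatabase class) is to keep the builder call behind a singleton:
import android.content.Context
import androidx.room.Room

// Sketch of a singleton holder so Room.databaseBuilder runs only once per process.
// The Tasks.db file on disk is reused either way; this just avoids duplicate instances.
object ToDoDatabaseHolder {
    @Volatile
    private var instance: ToDoDatabase? = null

    fun get(context: Context): ToDoDatabase =
        instance ?: synchronized(this) {
            instance ?: Room.databaseBuilder(
                context.applicationContext,
                ToDoDatabase::class.java,
                "Tasks.db"
            ).build().also { instance = it }
        }
}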
You can also use the onCreate() method of RoomDatabase.Callback, which is invoked only the first time the database is created:
val result = Room.databaseBuilder(context.applicationContext,
    ToDoDatabase::class.java, "Tasks.db").addCallback(dbCallback).build()
...

var dbCallback: RoomDatabase.Callback = object : RoomDatabase.Callback() {
    override fun onCreate(db: SupportSQLiteDatabase) {
        Executors.newSingleThreadScheduledExecutor().execute {
            Log.i(TAG, "create database")
            result!!.getDao().insertAll(...) // add default data
            ...
        }
    }
}
I'm using JetBrains' Exposed library to create and populate a database.
The database does not exist yet, and I am creating it. However, I could not find a simple way to connect to the SQL engine, create a database, and then connect to that database without opening multiple connections.
That sounds a little clunky. Is there a better way to do it?
Here is a small example:
var db = Database.connect("jdbc:mysql://localhost:3308", driver = "com.mysql.jdbc.Driver", user = "root", password = "aRootPassword")
transaction(db) { SchemaUtils.createDatabase("imdb") }

// avoid reconnect?
db = Database.connect("jdbc:mysql://localhost:3308/imdb", driver = "com.mysql.jdbc.Driver", user = "root", password = "aRootPassword")
transaction(db) { SchemaUtils.create(TitleRatings) }
You need a connection pool, e.g. HikariCP. It pools database connections and reuses them. This gives you a huge performance boost compared to individually opened connections.
I usually wrap it in a simple class like this:
import com.zaxxer.hikari.HikariConfig
import com.zaxxer.hikari.HikariDataSource
import javax.sql.DataSource
object DB {
    var db: DataSource = connect()

    fun connect(): DataSource {
        val config = HikariConfig()
        config.jdbcUrl = "jdbc:mysql://localhost:3308"
        config.username = "root"
        config.password = "aRootPassword"
        config.driverClassName = "com.mysql.jdbc.Driver"
        return HikariDataSource(config)
    }
}
My transactions then look like this one:
transaction(Database.connect(DB.db)) {
    SchemaUtils.createDatabase("imdb")
}
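The follow-up step, sketched under the same assumptions as the question's snippet (MySQL on port 3308, root/aRootPassword, the TitleRatings table, and the same imports as above): once imdb exists, point a second pooled datasource at it and create the tables there.
// Sketch: second datasource aimed at the database that was just created.
val imdbDs = HikariDataSource(HikariConfig().apply {
    jdbcUrl = "jdbc:mysql://localhost:3308/imdb"
    username = "root"
    password = "aRootPassword"
    driverClassName = "com.mysql.jdbc.Driver"
})

transaction(Database.connect(imdbDs)) {
    SchemaUtils.create(TitleRatings)
}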
I want to save checkpoints to a location on Amazon S3. This is the relevant part of my Scala code on DStreams, using the format below, but I'm getting this error:
Exception in thread "main" java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties (respectively).
Code:
val creatingFunc = { () =>
  // Create a StreamingContext
  val ssc = new StreamingContext(sc, Seconds(batchIntervalSeconds))

  val ggsnLines = ssc.fileStream[LongWritable, Text, TextInputFormat]("C:\\Users\\Mbazarganigilani\\Documents\\RA\\GGSN\\Files1", filterF, false)
  val ccnLines = ssc.fileStream[LongWritable, Text, TextInputFormat]("C:\\Users\\Mbazarganigilani\\Documents\\RA\\CCN\\Files1", filterF, false)
  val probeLines = ssc.fileStream[LongWritable, Text, TextInputFormat]("C:\\Users\\Mbazarganigilani\\Documents\\RA\\Probe\\Files1", filterF, false)

  val ggssnArrays = ggsnLines.map(x => (x._1, x._2.toString())).filter(!_._2.contains("ggsnIPAddress")).map(x => (x._1, x._2.split(",")))
  ggssnArrays.foreachRDD(s => {
    s.collect().take(10).foreach(u => println(u._2.mkString(",")))
  })

  ssc.remember(Minutes(1)) // To make sure data is not deleted by the time we query it interactively
  ssc.checkpoint("s3n://probecheckpoints/checkpoints")

  println("Creating function called to create new StreamingContext")
  newContextCreated = true
  ssc
}
def main(args: Array[String]): Unit = {
  // the minRememberDuration is set to read the previous files from the directory
  // Kryo class serialization needs to be enabled for the fileStream

  if (stopActiveContext) {
    StreamingContext.getActive.foreach { _.stop(stopSparkContext = false) }
  }

  // Get or create a streaming context
  val hadoopConfiguration: Configuration = new Configuration()
  hadoopConfiguration.set("fs.s3n.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
  hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "AKIAIOPSJVBDTEUHUJCQ")
  hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "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")

  //val ssc = StreamingContext.getActiveOrCreate(creatingFunc)
  val ssc = StreamingContext.getActiveOrCreate("s3n://probecheckpoints/SparkCheckPoints", creatingFunc, hadoopConfiguration, false)

  if (newContextCreated) {
    println("New context created from currently defined creating function")
  } else {
    println("Existing context running or recovered from checkpoint, may not be running currently defined creating function")
  }

  // Start the streaming context in the background.
  ssc.start()