Load resource file (JSON) in Kotlin/JS

Given this code, where should I place file.json so that it can be found at runtime?
// path: src/main/kotlin/Server.kt
fun main() {
    val serviceAccount = require("file.json")
}
I tried placing it under src/main/resources/ without luck. I also use Gradle to compile Kotlin to JS with the kotlin2js plugin.

Assume the Kotlin compiler puts the generated JS file (say server.js) into the default location build/classes/kotlin/main and the resource file (file.json) into build/resources/main, and that you are running server.js by executing node build/classes/kotlin/main/server.js.
According to the NodeJS documentation:
Local modules and JSON files can be imported using a relative path (e.g. ./, ./foo, ./bar/baz, ../foo) that will be resolved against the directory named by __dirname (if defined) or the current working directory.
(https://nodejs.org/api/modules.html#modules_require_id)
In our case, __dirname is build/classes/kotlin/main, so the correct require statement is:
val serviceAccount = js("require('../../../resources/main/file.json')")
or, if require is declared as a Kotlin function as in the question:
val serviceAccount = require("../../../resources/main/file.json")

You can simply use js("require('./file.json')") if you do not have a declaration for the require function in Kotlin. The result will be dynamic, so you can cast it to a Map.
https://kotlinlang.org/api/latest/jvm/stdlib/kotlin.js/js.html
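As a rough sketch of that approach, assuming file.json ends up next to the generated .js file and contains a project_id field (both the location and the field name are illustrative assumptions):
fun main() {
    val serviceAccount: dynamic = js("require('./file.json')")
    // The result is dynamic, so fields can be accessed directly at runtime.
    println(serviceAccount.project_id)
}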


Ways to import a JSON file from the public folder in Vue-CLI

I want to import a JSON file and use it. I need to be able to modify it in the future, so I put it in the public folder, not assets. When I refer to it like this: import JSON from '../../public/Data.json' it works, but I don't think it can be resolved after building the project, because after building there is no public folder. So I tried this:
let addr = process.env.BASE_URL;
import JSON from `${addr}Data.json`;
But it throws an error: SyntaxError
I'm confused now: which way is best, and is there another way?
The assets in the public folder are copied as-is to the root of the dist folder. In your code, you can reference it simply as /Data.json (if your app is deployed at the root of the domain).
E.g.:
async someMethod() {
  const baseUrl = process.env.BASE_URL;
  const data = await this.someHttpClient.get(`${baseUrl}/Data.json`);
}
If you want to import the JSON as you have tried, I suggest putting it somewhere in the src folder and importing it from there.
E.g.
import data from '#/data/someData.json'
console.log(data);
I came across this because I was building a stand-alone SPA that I wanted to run with no DB, keeping the config in a JSON file. The import statement above works great for a static conf file, but anything imported like that gets compiled into the build, so even though your someData.json will exist in the public folder, you won't see any changes in your dist because it's actually reading the compiled JS file.
To get around this I:
Convert the JSON file into a simple JS variable in a conf.js file:
e.g.
var srcConf={'bGreatJSON':true};
In index.html, I added:
<script src="./conf.js"></script>
Now that the JS variable has been declared, in my Vue component I can just look for window.srcConf and assign it if it exists, in mounted or created:
if (typeof window.srcConf !== 'undefined')
  this.sConf = window.srcConf;
This also avoids the GET CORS issue that other posts I've seen run into; even in the same directory, I kept getting CORS violations when trying to read the file with an axios call.

How do I create a list of data based on actual twig files within a project

I am creating an app via npm using https://www.npmjs.com/package/grunt-twig-render. It is essentially twig.js. I'm keeping it slim, so right now there isn't any other PHP or anything like that; just npm/twig.js and other npm packages, including Grunt.
Here's what I'm trying to do. Right now, I have a bunch of twig files within subfolders of a directory.
What I'd like to do is generate a list of data about the .twig files in that subdirectory. Something like this would work well:
files: [
  {
    "name": "file1.twig",
    "path": "/folder1"
  },
  {
    "name": "file2.twig",
    "path": "/folder1"
  },
  {
    "name": "file3.twig",
    "path": "/folder1"
  }
]
But I'm not super picky. Just seeing if anyone has found a way to create a list of files within a folder via npm, or twig.js, or something similar.
If it generates a .json file, that would be ideal. This would be part of a build process, so doing that via Grunt would work well too.
Thank you in advance
You can register a custom Grunt task in your Gruntfile.js which utilizes the shelljs find method to retrieve the path of each .twig file.
shelljs is a package which provides portable Unix shell commands for Node.js. Its find method is analogous to the Bash find command.
The following steps describe how to achieve your requirement:
cd to your project directory and install shelljs by running:
npm i -D shelljs
Configure your Gruntfile.js as follows:
Gruntfile.js
module.exports = function(grunt) {
  // requirements
  var path = require('path'),
      find = require('shelljs').find;

  grunt.initConfig({
    // other tasks ...
  });

  /**
   * Custom grunt task generates a list of .twig files in a directory
   * and saves the results as .json file.
   */
  grunt.registerTask('twigList', 'Creates list of twig files', function() {
    var obj = {};
    obj.files = find('path/to/directory')
      .filter(function(filePath) {
        return filePath.match(/\.twig$/);
      })
      .map(function(filePath) {
        return {
          name: path.basename(filePath),
          path: path.dirname(filePath)
        };
      });
    grunt.file.write('twig-list.json', JSON.stringify(obj, null, 2));
  });

  grunt.registerTask('default', ['twigList']);
};
Explanation
Both shelljs and Node's built-in path module are required in Gruntfile.js.
Next a custom task named twigList is registered.
In the body of the function we initialize an empty object and assign it to a variable named obj.
Next, the shelljs find method is invoked passing in a path to the subdirectory containing the .twig files. Currently the path is set to path/to/directory so you'll need to redefine this path as necessary.
Note: The find method can also accept an array of multiple directory paths as an argument. For example:
find(['path/to/directory', 'path/to/another/directory'])
The find method returns an Array of all paths found inside the given directory (many levels deep). We utilize the Array's filter() method to return only filepaths with a .twig file extension by providing a regex (\.twig$) to the String's match method.
Each item of the resultant Array of .twig filepaths is passed to the Array's map() method. It returns an Object with two properties (name and path). The value for the name property is obtained from the full filepath using Node's path.basename() method. Similarly, the value for the path property is obtained from the full filepath using Node's path.dirname() method.
Finally, the grunt.file.write() method is utilized to write a .json file to disk. Currently the first argument is set to twig-list.json. This will result in a file named twig-list.json being saved to the top level of your project directory, where Gruntfile.js resides. You'll need to redefine this output path as necessary. The second argument is produced by utilizing JSON.stringify() to convert obj to JSON. The resultant JSON is indented by two spaces.
Additional info
As previously mentioned, the shelljs find method lists all files (many levels deep) in the given directory path. If the provided directory path includes a subfolder(s) containing .twig files that you want to exclude, you can do something like the following in the filter() method:
.filter(function(filePath) {
  return filePath.match(/\.twig$/) && !filePath.match(/^foo\/bar/);
})
This will match all .twig files, and ignore those in the sub directory foo/bar of the given directory, e.g. path/to/directory.

Kotlin Script Engine throws "unresolved reference", even if the package and class are valid

When using Kotlin's Script Engine, trying to import packages or use any class throws an "unresolved reference" error:
javax.script.ScriptException: error: unresolved reference: mrpowergamerbr
fun loritta(context: com.mrpowergamerbr.loritta.commands.CommandContext) {
^
This doesn't happen when running the class within IntelliJ IDEA; however, it does happen when running the class in production.
While this YouTrack issue is related to fat JARs, this can also happen if you aren't using fat JARs (loading all the libraries via the startup classpath option or the Class-Path manifest option).
To fix this, you can list all your dependencies in the kotlin.script.classpath property in your startup script like this:
-Dkotlin.script.classpath=jar1:jar2:jar3:jar4
Example:
java -Dkotlin.script.classpath=libs/dependency1.jar:libs/dependency2.jar:yourjar.jar -jar yourjar.jar
Or, if you prefer, set the property from code, using your Class-Path manifest attribute:
import java.util.jar.Attributes
import java.util.jar.JarFile

val path = this::class.java.protectionDomain.codeSource.location.path
val jar = JarFile(path)
val mf = jar.manifest
val mattr = mf.mainAttributes
// Yes, you SHOULD USE Attributes.Name.CLASS_PATH! Don't try using "Class-Path", it won't work!
val manifestClassPath = mattr[Attributes.Name.CLASS_PATH] as String
// The format within the Class-Path attribute is different than the one expected by the property, so let's fix it!
// By the way, don't forget to append your original JAR at the end of the string!
val propClassPath = manifestClassPath.replace(" ", ":") + ":Loritta-0.0.1-SNAPSHOT.jar"
// Now we set it to our own classpath
System.setProperty("kotlin.script.classpath", propClassPath)
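With the property set before the engine is created, the engine can then be obtained through the standard JSR-223 API. A minimal sketch, assuming the Kotlin JSR-223 support is on the classpath so that the "kts" extension is registered; the script body is only illustrative:
import javax.script.ScriptEngineManager

fun evalSample() {
    // kotlin.script.classpath must already be set at this point (see above).
    val engine = ScriptEngineManager().getEngineByExtension("kts")
    // Imports that live in the JARs listed on that classpath should now resolve.
    engine.eval("println(\"script engine works\")")
}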
While I haven't tested this yet, another unrelated answer suggests you can also supply your own classpath if you initialize the KotlinJsr223JvmLocalScriptEngine object yourself (as seen here).

Always "requiring unknown module" when reading json file

Bit of an RN newb here. I'm trying to read some JSON data files:
function loadCategories() {
  const ids = ['tl1', 'tl2', 'tl3', 'tl4', 'tl5', 'tl6'];
  ids.forEach(function(id) {
    var contents = require('../Content/top-level/' + id + ".json.js");
    ...
  });
}
But here I always get an error:
Unhandled JS Exception: Requiring unknown module "../Content/top-level/tl1.json.js".If you are sure the module is there, try restarting the packager or running "npm install".
The files exist and my relative path logic should be OK given the project structure:
ProjectDir
  Components
    ThisComponent.js
  Content
    top-level
      tl1.json.js
      tl2.json.js
      ...
i.e. the above code is running from ThisComponent.js and trying to access tl1.json.js, etc., so I would think the relative path ../Content/top-level/tl1.json.js would work.
I've tried:
Restarting the packager
Referencing ./Content/top-level/tl1.json.js instead
Referencing /Content/top-level/tl1.json.js instead
I'm on RN 0.36.0. Gotta be something obvious…right?
This isn't possible in React Native because of how the packager works. You have to require files with a static string path. You can use a switch statement, something like this:
switch (id) {
  case 'tl1': return require('../Content/top-level/tl1.json');
  case 'tl2': return require('../Content/top-level/tl2.json');
  ...
}
Also, why do your JSON files have a .js extension?

Scala with spark - "javax.servlet.ServletRegistration"'s signer information does not match signer information of other classes in the same package

I have a simple Scala application with Spark dependencies. I am just trying to create a Spark context using the following code.
def main(args: Array[String]) {
  var sparkConfig: SparkConf = new SparkConf()
  sparkConfig.setAppName("ProxySQL").setMaster("local")
  var sc = new SparkContext(sparkConfig)
}
When I try to run this code, it throws a security exception at new SparkContext(sparkConfig) with the following message:
Exception in thread "main" java.lang.SecurityException: class "javax.servlet.ServletRegistration"'s signer information does not match signer information of other classes in the same package
The Problems tab of Eclipse shows one warning:
More than one scala library found in the build path (D:/workspaces/scala/scalaEclipse/eclipse/plugins/org.scala-ide.scala210.jars_4.0.0.201503031935/target/jars/scala-library.jar, C:/Users/prems.bist/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar). This is not an optimal configuration; try to limit to one Scala library in the build path. (Resource: SQLWrapper, Location: Unknown, Type: Scala Classpath Problem)
I have a Scala 2.10.4 installation on a Windows machine. The Scala compiler version set in Eclipse is 2.10.5. What is causing this security exception? Is it a version incompatibility issue or something else? How would I solve it?
The problem was more or less related to conflicting dependencies.
The following steps resolved my issue:
Go to Project Build Path -> Order and Export tab -> change the order of the javax.servlet jar to either the bottom or the top.
This resolved the problem.
Well, as I followed the suggestion (go to Project Build Path -> Order and Export tab -> change the order of the javax.servlet jar either to the bottom or the top), I found my build path libraries had changed and looked messy (too many small libs); maybe this was caused by Maven.
So I tried removing all of them, reimporting the libs, and choosing Project -> Maven -> Update Project.
Now it works well.
The name of your object with the main method should be the same as the one in setAppName("ProxySQL"). You can also extend it with App and skip the main method, but only if you want; I find it easier.
package spark.sample

import org.apache.spark.{ SparkContext, SparkConf }

/**
 * Created by anquegi on 18/05/15.
 */
object ProxySQL {
  def main(args: Array[String]) {
    var sparkConfig: SparkConf = new SparkConf()
    sparkConfig.setAppName("ProxySQL").setMaster("local")
    var sc = new SparkContext(sparkConfig)
  }
}
I normally use an object like this for Spark:
package spark.sample

import org.apache.spark.{ SparkContext, SparkConf }

/**
 * Created by anquegi on 18/05/15.
 */
object ProxySQL extends App {
  val sparkConfig: SparkConf = new SparkConf()
  sparkConfig.setAppName("ProxySQL").setMaster("local[4]")
  val sc = new SparkContext(sparkConfig)
}
I prefer to use val instead of var.
You can also call setMaster with .setMaster("local[4]") so that Spark does not work with only one thread.
It means you did not exclude the Servlet APIs from some dependency in your app, and one of them is bringing them in every time. Look at the dependency tree and exclude whatever brings in javax.servlet.
It should already be available in Spark, and the particular javax.servlet JAR from Oracle has signing info that you have to strip out, or you can simply exclude the whole thing.
Some of the libraries were mentioned here.