Why read tsconfig.json using readConfigFile instead of directly requiring the path of tsconfig.json? - create-react-app

Upon investigating create-react-app's configuration, I found something interesting.
// config/modules.js
...
if (hasTsConfig) {
  const ts = require(resolve.sync("typescript", {
    basedir: paths.appNodeModules,
  }));
  config = ts.readConfigFile(paths.appTsConfig, ts.sys.readFile).config;
  // Otherwise we'll check if there is jsconfig.json
  // for non TS projects.
} else if (hasJsConfig) {
  config = require(paths.appJsConfig);
}
...
Unlike jsconfig.json, which is read with a direct require(paths.appJsConfig), why does it use resolve.sync and ts.readConfigFile to read tsconfig.json?
...
if (hasTsConfig) {
  config = require(paths.appTsConfig);
  // Otherwise we'll check if there is jsconfig.json
  // for non TS projects.
} else if (hasJsConfig) {
  config = require(paths.appJsConfig);
}
...
If I change the code as shown above, the result is the same (at least the console output is the same).
There must be a reason why create-react-app uses such a complicated way to read the TypeScript config file.
Why is that?

The TypeScript config reader is a bit smarter than simply reading and parsing a JSON file. There are two differences I can think of right now:
In tsconfig files you can use comments. JSON.parse will throw an exception because / is not an allowed character at an arbitrary position.
tsconfig files can extend each other. Simply parsing the JSON file ignores the extends clause, so you receive a config object that doesn't represent what TypeScript actually uses.
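As an illustration (a minimal sketch, not the actual create-react-app code; the file path is hypothetical), the TypeScript API tolerates comments and can resolve extends, while JSON.parse does not:

const ts = require("typescript");
const path = require("path");

const configPath = path.resolve("tsconfig.json"); // hypothetical path

// JSON.parse would throw on comments and silently ignore "extends":
// JSON.parse(fs.readFileSync(configPath, "utf8"));

// readConfigFile tolerates comments and trailing commas...
const { config, error } = ts.readConfigFile(configPath, ts.sys.readFile);
if (error) {
  throw new Error(ts.flattenDiagnosticMessageText(error.messageText, "\n"));
}

// ...and parseJsonConfigFileContent resolves "extends", "include", "exclude", etc.
const parsed = ts.parseJsonConfigFileContent(config, ts.sys, path.dirname(configPath));
console.log(parsed.options); // the compiler options TypeScript actually uses

Note that readConfigFile by itself only handles the JSON-with-comments parsing; resolving extends is done by parseJsonConfigFileContent.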

Related

Kotlin Room DB Export schemas

class RoomSchemaArgProvider(
    @get:InputDirectory
    @get:PathSensitive(PathSensitivity.RELATIVE)
    val schemaDir: File
) : CommandLineArgumentProvider {
    override fun asArguments(): Iterable<String> {
        // Note: If you're using KSP, you should change the line below to return
        // listOf("room.schemaLocation=${schemaDir.path}")
        return listOf("-Aroom.schemaLocation=${schemaDir.path}")
    }
}
I need to export the old DB schema in JSON. I wanted to use the above code; if anyone has used this as per https://developer.android.com/training/data-storage/room/migrating-db-versions#export-schemas, please help me with the same.
I tried to follow https://developer.android.com/training/data-storage/room/migrating-db-versions#export-schemas.
I faced a problem while testing the migration, hence I need this solution.
java.io.FileNotFoundException: Cannot find the schema file in the assets folder. Make sure to include the exported json schemas in your test assert inputs. See https://developer.android.com/training/data-storage/room/migrating-db-versions#export-schema for details. Missing file: com.sboxnw.freeplay.data.database.SugarBoxDatabase/2.json
at androidx.room.testing.MigrationTestHelper.loadSchema(MigrationTestHelper.java:484)
at androidx.room.testing.MigrationTestHelper.createDatabase(MigrationTestHelper.java:238)
at com.sboxnw.freeplay.DownloadMigrationTest.testAllMigrations(DownloadMigrationTest.kt:72)
I need to export all old DB JSON schemas for migration testing.
It is failing because the schema source path is not defined correctly. You can add the schema location directly in the defaultConfig block of build.gradle:
javaCompileOptions {
    annotationProcessorOptions {
        arguments = ["room.schemaLocation": "$projectDir/schemas".toString()]
    }
}

How do I bundle and read files within my extension?

I'm learning how to create vscode extensions and I'm simply trying to emulate a template plugin to be used internally within my company. I'd like to store the template in a file within the extension and then read the contents from that file and output it to the active window. So given the following example directory structure of the extension (other files omitted for brevity):
workspace/
|- templates/
| |- template1.yaml
| |- template2.yaml
|- extension.js
For the command that I'm registering, I'd simply like to read from one of those files and send it to the window as a snippet, but I don't know how to find the path to that folder.
// extension.js
function activate(context) {
  context.subscriptions.push(
    vscode.commands.registerCommand("xxx.my-command", function () {
      let path = // ???/templates/template1.yaml
      fs.readFile(path, (err, data) => {
        let snippet = new vscode.SnippetString(data);
        vscode.window.activeTextEditor.insertSnippet(snippet);
      });
    })
  );
}
When I try to use relative paths to that templates folder, it appears to be relative to the VSCode installation. I looked at TextDocument providers etc., but all of the examples were about using the active editor as the document. How can I include files with my extension and access their contents like this?
You should use context.asAbsolutePath, whose documentation says:
/**
 * Get the absolute path of a resource contained in the extension.
 *
 * *Note* that an absolute uri can be constructed via {@linkcode Uri.joinPath} and
 * {@linkcode ExtensionContext.extensionUri extensionUri}, e.g. `vscode.Uri.joinPath(context.extensionUri, relativePath);`
 *
 * @param relativePath A relative path to a resource contained in the extension.
 * @return The absolute path of the resource.
 */
asAbsolutePath(relativePath: string): string;
For instance
const path = context.asAbsolutePath("templates/template1.yaml");
I also suggest you avoid using Node's fs. Instead, use workspace.fs whenever possible. It will enable your extension to work on Remote and Web environments when necessary. More details here: https://code.visualstudio.com/api/extension-guides/virtual-workspaces#review-that-the-extension-code-is-ready-for-virtual-resources
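For illustration, combining both suggestions into the question's command (a minimal sketch based on the command ID and template path from the question, with error handling omitted):

const vscode = require("vscode");

function activate(context) {
  context.subscriptions.push(
    vscode.commands.registerCommand("xxx.my-command", async () => {
      // Resolve the template relative to the extension's install directory
      const templateUri = vscode.Uri.joinPath(context.extensionUri, "templates", "template1.yaml");
      // workspace.fs returns a Uint8Array, so decode it before building the snippet
      const bytes = await vscode.workspace.fs.readFile(templateUri);
      const snippet = new vscode.SnippetString(new TextDecoder().decode(bytes));
      await vscode.window.activeTextEditor?.insertSnippet(snippet);
    })
  );
}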
Hope this helps
I found that I could actually do a couple different things:
const path = require('path')
let templatePath = path.resolve(__dirname, './templates/template1.yaml');
But also as Mark suggested in the comments, using the extension context also provides that path:
/**
 * @param {vscode.ExtensionContext} context
 */
function activate(context) {
  let templatePath = context.extensionPath + '/templates/template1.yaml';
}
I went with the latter since it is injected by the runtime, which seemed better to me, if not simpler. Thanks Mark!

Drupal 8 custom modules - Installing additional configuration from YML files with a hook_update_N

I have a custom module which installs a specific set of configurations - these are all stored in the config/install folder, which means they are installed when the module is installed.
The configuration includes a content type, paragraphs, view modes, form modes, field storages and fields attached to both the content type and the paragraphs, etc. The idea is to use this module to install a 'feature' (a blog) and use it across multiple sites, as well as provide updates and extensions when we add more stuff to this feature.
Since you cannot add more configuration through the config/install folder after the initial install, I've been trying to find a way to import additional configuration files through an update hook, and this is one that works:
<?php
use \Symfony\Component\Yaml\Yaml;
/**
 * Installs the file upload element
 */
function MODULE_NAME_update_8002() {
  // Is the flaw with this the fact that the order of loading configurations now
  // matters and is a little bit more difficult to deal with?
  // NOTE: YES. If, for example, you comment out the installing of the
  // field_storage for the field_cb_file, but try to add the field_cb_file to
  // the paragraph type, the update is successful and no errors are thrown.
  // This is basically me trying to re-create the drupal configuration management
  // system, without the dependency checks, etc. What is the PROPER way of
  // importing additional configuration from a module through an update?
  // FIXME:
  $configs_to_install = [
    'paragraphs.paragraphs_type.cbsf_file_download',
    'field.storage.paragraph.field_cb_file',
    'field.field.paragraph.cbsf_file_download.field_cb_file',
    'field.field.paragraph.cbsf_file_download.field_cb_heading',
    'field.field.paragraph.cbsf_file_download.field_cb_icon',
    'field.field.paragraph.cbsf_file_download.field_cb_text',
    'core.entity_form_display.paragraph.cbsf_file_download.default',
    'core.entity_view_display.paragraph.cbsf_file_download.default',
  ];
  foreach ($configs_to_install as $config_to_install) {
    $path = drupal_get_path('module', 'MODULE_NAME') . '/config/update_8002/' . $config_to_install . '.yml';
    $content = file_get_contents($path);
    $parsed_yml = Yaml::parse($content);
    $active_storage = \Drupal::service('config.storage');
    $active_storage->write($config_to_install, $parsed_yml);
  }
}
However, there are flaws with this method: you have to put the configuration files in the right order if they depend on each other, and any dependencies declared in the config files are not checked.
Is there a way to utilise configuration management to import config properly, in this same 'loop over the files' way? Or to point to a folder that contains all of the config files and install them?
EDIT: There are further issues with this method - even if you've ordered the files correctly in terms of dependencies, no database tables are created. The configuration is simply written in as is, and no other part of Drupal is made aware that new entities were created, so none of the functions that would otherwise run if you created the entities through the Drupal GUI get executed. Definitely not the recommended way of transferring more complex configuration.
I've pushed this a step further - there is a way to use the EntityTypeManager class to create / update configurations.
Two links helped me with this:
https://drupal.stackexchange.com/questions/164713/how-do-i-update-the-configuration-of-a-module
pwolanin's answer at the bottom provides a function that either updates the configuration if it exists, or creates it outright.
https://www.metaltoad.com/blog/programmatically-importing-drupal-8-field-configurations
the code on this page gives a clearer idea of what is happening - for each configuration that you'd like to install, you run the YML file through the respective storage manager, and then create the appropriate entity configurations, which creates all of the required DB tables.
What I ended up doing was:
I utilised a slightly modified version of pwolanin's code to create a generic config updater function:
function _update_or_install_config(string $prefix, string $update_id, string $module) {
  $updated = [];
  $created = [];
  /** @var \Drupal\Core\Config\ConfigManagerInterface $config_manager */
  $config_manager = \Drupal::service('config.manager');
  $files = glob(drupal_get_path('module', $module) . '/config/update_' . $update_id . '/' . $prefix . '*.yml');
  foreach ($files as $file) {
    $raw = file_get_contents($file);
    $value = \Drupal\Component\Serialization\Yaml::decode($raw);
    if (!is_array($value)) {
      throw new \RuntimeException(sprintf('Invalid YAML file %s', $file));
    }
    $type = $config_manager->getEntityTypeIdByName(basename($file));
    $entity_manager = $config_manager->getEntityManager();
    $definition = $entity_manager->getDefinition($type);
    $id_key = $definition->getKey('id');
    $id = $value[$id_key];
    /** @var \Drupal\Core\Config\Entity\ConfigEntityStorage $entity_storage */
    $entity_storage = $entity_manager->getStorage($type);
    $entity = $entity_storage->load($id);
    if ($entity) {
      $entity = $entity_storage->updateFromStorageRecord($entity, $value);
      $entity->save();
      $updated[] = $id;
    }
    else {
      $entity = $entity_storage->createFromStorageRecord($value);
      $entity->save();
      $created[] = $id;
    }
  }
  return [
    'updated' => $updated,
    'created' => $created,
  ];
}
I placed all of my YML files in the folder config/update_8002, then used this function to loop over the config files in a hook_update_N function:
function MODULE_NAME_update_8002() {
  // _update_or_install_config() already globs every YML file in
  // config/update_8002 that matches the given prefix, so one call per
  // configuration type is enough.
  _update_or_install_config('paragraphs.paragraphs_type', '8002', 'MODULE_NAME');
  _update_or_install_config('field.storage.paragraph', '8002', 'MODULE_NAME');
  _update_or_install_config('field.field.paragraph', '8002', 'MODULE_NAME');
  _update_or_install_config('core.entity_view_display.paragraph', '8002', 'MODULE_NAME');
  _update_or_install_config('core.entity_form_display.paragraph', '8002', 'MODULE_NAME');
}
Note that the _update_or_install_config function loops over all of the configs in the folder that match a specific prefix - thus you only need to pass the prefix to the function, and all of the YML files that import configuration of the same type will be included.

How can I encode a string in base64 using Meteor

I am trying to use a form to upload files to an S3 bucket using Meteor. I am following this Amazon article. At "Sign Your S3 POST Form", near the end, I need to encode a string to base64, but I've been unable to find a way to do this. Can anyone tell me how to do this? Notice that the string first needs to be encoded and then signed. This is how it's done in Python:
import base64
import hmac, hashlib
policy = base64.b64encode(policy_document)
signature = base64.b64encode(hmac.new(AWS_SECRET_ACCESS_KEY, policy, hashlib.sha1).digest())
You can do this without the NodeJS crypto module; creating a package looked a bit like breaking a fly on the wheel to me, so I figured out this:
if (Meteor.isServer) {
  Meteor.methods({
    'base64Encode': function (unencoded) {
      return new Buffer(unencoded || '').toString('base64');
    },
    'base64Decode': function (encoded) {
      return new Buffer(encoded || '', 'base64').toString('utf8');
    },
    'base64UrlEncode': function (unencoded) {
      var encoded = Meteor.call('base64Encode', unencoded);
      return encoded.replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
    },
    'base64UrlDecode': function (encoded) {
      encoded = encoded.replace(/-/g, '+').replace(/_/g, '/');
      while (encoded.length % 4)
        encoded += '=';
      return Meteor.call('base64Decode', encoded);
    }
  });

  console.log(Meteor.call('base64Encode', 'abc')); // "YWJj"
}
This is based on base64.js by John Hurliman, found at https://gist.github.com/jhurliman/1250118. Note that this will work like a charm on the server, but for porting it to the client you have to call the methods with a callback function that stores the result in a session variable.
You need the NodeJS crypto module to perform these tasks.
First create a "packages" directory at the root of your Meteor project, then create a "my-package" directory.
Inside it, you need two files: a "package.js" and a "my-package.js".
package.js should look like:
Package.describe({
  summary: "MyPackage doing amazing stuff with AWS."
});
Package.on_use(function (api) {
  // add your package file to the server app
  api.add_files("my-package.js", "server");
  // what we export outside of the package
  // (this is important: packages have their own scope!)
  api.export("MyPackage", "server");
});
my-package.js should look like:
var crypto = Npm.require("crypto");
MyPackage = {
  myFunction: function (arguments) {
    // here you can use crypto functions!
  }
};
The function you will probably need is crypto.createHmac.
Here is example code of how I encode a JSON security policy in base64 and then use it to generate a security signature in my own app:
encodePolicy: function (jsonPolicy) {
  // stringify the policy, store it in a NodeJS Buffer object
  var buffer = new Buffer(JSON.stringify(jsonPolicy));
  // convert it to base64
  var policy = buffer.toString("base64");
  // replace "/" and "+" so that it is URL-safe.
  return policy.replace(/\//g, "_").replace(/\+/g, "-");
},
encodeSignature: function (policy) {
  var hmac = crypto.createHmac("sha256", APP_SECRET);
  hmac.update(policy);
  return hmac.digest("hex");
}
This will allow you to call MyPackage.myFunction on the server side of your Meteor app.
Last but not least, don't forget to "meteor add my-package" in order to use it!
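For comparison with the Python snippet in the question, the whole encode-then-sign flow in plain Node would look roughly like this. It is only a sketch: the key and policy values are placeholders, and it uses HMAC-SHA1 with a base64 digest to mirror the Python example rather than the sha256/hex digest shown above.

var crypto = Npm.require("crypto");

// Hypothetical placeholder values, mirroring the Python snippet in the question.
var AWS_SECRET_ACCESS_KEY = "your-secret-key";
var policyDocument = JSON.stringify({ /* your POST policy */ });

// base64-encode the policy document...
var policy = new Buffer(policyDocument).toString("base64");

// ...then sign the encoded policy with HMAC-SHA1 and base64-encode the digest
var signature = crypto.createHmac("sha1", AWS_SECRET_ACCESS_KEY)
  .update(policy)
  .digest("base64");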
You can use the meteor-crypto-base64 package.
CryptoJS.enc.Base64.stringify(CryptoJS.enc.Utf8.parse('Hello, World!'));
//"SGVsbG8sIFdvcmxkIQ=="

How to configure multiple sitemaps using MVCSiteMapProvider v4 with StructureMap DI

The problem, essentially, is that I can't get my sitemap config to support multiple sitemaps. It's always looking for "default" even when I name my instances and request another. Now for the background.
I've been poring over the docs for the new implementation of MVCSiteMapProvider. They are now using Dependency Injection to configure the SiteMapProvider. We have an existing StructureMap DI implementation, so I followed the instructions and added, in our case:
ObjectFactory.Configure(x =>
{
    ...
    x.AddRegistry<MvcSiteMapProviderRegistry>();
    ...
});
Then I started tweaking the MvcSiteMapProviderRegistry.cs file to implement my multiple-sitemap scenario. I have multiple sitemap files; either will work as long as it's called "default". If I remove the "default" item then it breaks and complains that "default" is missing, which I assume is because it can't find my instance. Here's how I have them defined. I suspect the problem is somewhere in here... the loader, which it says I have to configure in the Global.asax, is looking for ISiteMapLoader, but I'm adding my multiple configuration to SiteMapBuilderSet... anyway, here's the code.
// Register the sitemap builder
string absoluteFileName = HostingEnvironment.MapPath("~/Main.sitemap");
string absoluteFileName2 = HostingEnvironment.MapPath("~/Test.sitemap");

var xmlSource = this.For<IXmlSource>().Use<FileXmlSource>()
    .Ctor<string>("fileName").Is(absoluteFileName);

var reservedAttributeNameProvider = this.For<ISiteMapXmlReservedAttributeNameProvider>()
    .Use<SiteMapXmlReservedAttributeNameProvider>()
    .Ctor<IEnumerable<string>>("attributesToIgnore").Is(new string[0]);

var builder = this.For<ISiteMapBuilder>().Use<CompositeSiteMapBuilder>()
    .EnumerableOf<ISiteMapBuilder>().Contains(y =>
    {
        y.Type<XmlSiteMapBuilder>()
            .Ctor<ISiteMapXmlReservedAttributeNameProvider>().Is(reservedAttributeNameProvider)
            .Ctor<IXmlSource>().Is(xmlSource);
        y.Type<ReflectionSiteMapBuilder>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
        y.Type<VisitingSiteMapBuilder>();
    });

var xmlSource2 = this.For<IXmlSource>().Use<FileXmlSource>()
    .Ctor<string>("fileName").Is(absoluteFileName2);

var builder2 = this.For<ISiteMapBuilder>().Use<CompositeSiteMapBuilder>()
    .EnumerableOf<ISiteMapBuilder>().Contains(y =>
    {
        y.Type<XmlSiteMapBuilder>()
            .Ctor<ISiteMapXmlReservedAttributeNameProvider>().Is(reservedAttributeNameProvider)
            .Ctor<IXmlSource>().Is(xmlSource2);
        y.Type<ReflectionSiteMapBuilder>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
        y.Type<VisitingSiteMapBuilder>();
    });
// Configure the builder sets
this.For<ISiteMapBuilderSetStrategy>().Use<SiteMapBuilderSetStrategy>()
    .EnumerableOf<ISiteMapBuilderSet>().Contains(x =>
    {
        /* x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("default")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); */
        /* x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("MainSiteMapProvider")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); */
        x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("TestSiteMapProvider")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder2)
            .Ctor<ICacheDetails>().Is(cacheDetails);
    });
In my global.asax.cs I added
MvcSiteMapProvider.SiteMaps.Loader = Resolver.Get<ISiteMapLoader>();
and to reference it in my view I have
@Html.MvcSiteMap("TestSiteMapProvider").Menu(false, true, true)
but it must not be able to find "TestSiteMapProvider" because it always displays "default" or complains if it doesn't exist.
I also thought it might have something to do with the Cache, as I see the filename referenced there, but I don't know how to add multiple instances to the cache, so I just disabled it. I'm really not doing anything fancy with my sitemaps anyway, and this whole thing is really feeling like massive overkill just to get some flippin automatic breadcrumbs!
Apparently there was another help doc that I wasn't aware of. I had completed all of the steps thus far properly, but I also needed to implement ISiteMapCacheKeyGenerator.
See this doc (which wasn't named this when I started):
https://github.com/maartenba/MvcSiteMapProvider/wiki/Multiple-Sitemaps-in-One-Application