I needed to clear cookies and found a hidden/undocumented function: https://github.com/facebook/react-native/blob/26684cf3adf4094eb6c405d345a75bf8c7c0bf88/Libraries/Network/RCTNetworking.android.js
I am able to access it like this:
import RCTNetworking from 'RCTNetworking'
console.log('RCTNetworking:', RCTNetworking.clearCookies);
It works, but is it correct? Is import RCTNetworking from 'RCTNetworking' guaranteed to work?
I thought it would be safer to import from NativeModules like this:
import { NativeModules } from 'react-native'
console.log('Networking:', NativeModules.Networking.clearCookies);
However, this imports the whole NativeModules object, which contains a bunch of other stuff. Wouldn't this be bad? Or does tree shaking in a production build remove everything I don't use from NativeModules?
Is there another way to access clearCookies? Is this documented anywhere?
I imported RCTNetworking like this: var RCTNetworking = require("RCTNetworking"); This import is guaranteed to work on all platforms.
I couldn't find any documentation for RCTNetworking nor for the clearCookies() function
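For reference, here is a minimal sketch of calling clearCookies through NativeModules, as in the question; the callback signature is an assumption based on the RCTNetworking source linked above, so verify it against your React Native version.

import { NativeModules } from 'react-native';

// Assumed from RCTNetworking.android.js: clearCookies(callback), where the
// callback receives a boolean indicating whether any cookies were cleared.
NativeModules.Networking.clearCookies((cleared) => {
  console.log('Cookies cleared:', cleared);
});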
I'm new to Ktor and Kotlin in general, so please be patient.
I am currently trying to create a little website (mostly for learning) that uses key-value files for internationalization.
I already did something similar in PHP, where I just decoded a JSON file and read the value for the key I passed. That way I could write something like <p><?php echo $langJson["presentation"][0];?></p> (with $langJson being my decoded JSON key-value file) and get the proper translation.
I'm trying to do the equivalent in Kotlin using Ktor, but I don't know how. I found the aymanizz ktor-i18n plugin on GitHub, which provides i18n for internationalization, but I don't know whether it really fits what I want to do, since it detects the language from the request header instead of letting the user choose it (via a GET parameter, for instance).
Does anyone have any clue on how I could do that?
Briefly, what I want is a single coded page whose content is dynamically picked from the appropriate language file.
Thank you all! :)
The basic idea is to get a language code from a request (a query parameter, a header, etc), generate a path to an i18n resource file, read it and then deserialize JSON into a map. The resulting map could be used as-is or passed as a model to a template.
Here is an example where I use kotlinx.serialization to deserialize a JSON string into a map and the FreeMarker template engine to render HTML. To switch languages, just use the lang GET parameter, e.g., http://localhost:8080/?lang=es.
import freemarker.cache.ClassTemplateLoader
import io.ktor.application.*
import io.ktor.freemarker.*
import io.ktor.request.*
import io.ktor.response.*
import io.ktor.routing.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import kotlinx.serialization.decodeFromString
import kotlinx.serialization.json.Json
fun main() {
    embeddedServer(Netty, port = 8080) {
        install(FreeMarker) {
            templateLoader = ClassTemplateLoader(this::class.java.classLoader, "templates")
        }
        routing {
            get("/") {
                call.respond(FreeMarkerContent("index.ftl", mapOf("i18n" to loadI18n(call.request))))
            }
        }
    }.start()
}

fun loadI18n(request: ApplicationRequest): Map<String, String> {
    val language = request.queryParameters["lang"] ?: "en"
    val filePath = "i18n/$language.json"
    val data = object {}.javaClass.classLoader.getResource(filePath)?.readText() ?: error("Cannot load i18n from $filePath")
    return Json.decodeFromString(data)
}
resources/templates/index.ftl
<html>
<body>
<h1>${i18n.greetings}</h1>
</body>
</html>
resources/i18n/en.json
{
"greetings": "Hello"
}
resources/i18n/es.json
{
"greetings": "Hola"
}
If you want full i18n support, I recommend using https://github.com/aymanizz/ktor-i18n. You will be able to use plurals and other features from the i18n standard.
I have this class to configure a HttpClient instance:
package com.company.fraud.preauth.service.feignaccertifyclient;
import com.company.fraud.preauth.config.ProviderClientConfig;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.http.client.HttpClient;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.conn.ssl.TrustSelfSignedStrategy;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.ssl.SSLContextBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import java.security.KeyManagementException;
import java.security.KeyStoreException;
import java.security.NoSuchAlgorithmException;
@Slf4j
@Configuration
@RequiredArgsConstructor
public class FeignClientConfig {

    private final ProviderClientConfig providerClientConfig;

    public HttpClient buildHttpClient() throws NoSuchAlgorithmException, KeyStoreException, KeyManagementException {
        RequestConfig.Builder requestBuilder = RequestConfig.custom();
        requestBuilder.setConnectTimeout(providerClientConfig.getConnectionTimeout());
        requestBuilder.setConnectionRequestTimeout(providerClientConfig.getConnectionRequestTimeout());
        requestBuilder.setSocketTimeout(providerClientConfig.getSocketTimeout());

        SSLContextBuilder builder = new SSLContextBuilder();
        builder.loadTrustMaterial(null, new TrustSelfSignedStrategy());

        return HttpClientBuilder.create()
                .setMaxConnPerRoute(providerClientConfig.getMaxConnectionNumber())
                .setDefaultRequestConfig(requestBuilder.build())
                .setSSLContext(builder.loadTrustMaterial(null, new TrustSelfSignedStrategy()).build())
                .build();
    }
}
How can I unit test this class to verify that these values are correctly set in the resulting HttpClient?
From the HttpClient instance I cannot get access to its RequestConfig.
I am aware of these two posts:
How do I test a private function or a class that has private methods, fields or inner classes?
(the number of upvotes on that question shows that it is a recurring and controversial topic in testing, and my situation may offer an example of why we sometimes need to look into the internal state of an instance in a test, even though it is private)
Unit test timeouts in Apache HttpClient
(it shows a way of adding an interceptor in code to check the configured values, but I don't like it because I want to keep test code separate from functional code)
Is there any way? I assume this class should be tested, right? You cannot blindly trust it to work, and just checking that the result is not null seems too weak to me.
This link pointed me in the right direction:
https://dzone.com/articles/testing-objects-internal-state
It uses PowerMock's Whitebox to check the internal state of an instance.
So I looked into the PowerMock Whitebox source code, and it turns out it uses reflection internally. Since PowerMock is said to be incompatible with JUnit 5 (as of now) and I don't want to add another dependency just for testing, I will test with reflection directly.
package com.company.fraud.preauth.service.feignaccertifyclient;
import com.company.fraud.preauth.config.PreAuthConfiguration;
import com.company.fraud.preauth.config.ProviderClientConfig;
import com.company.fraud.preauth.config.StopConfiguration;
import org.apache.http.client.HttpClient;
import org.apache.http.client.config.RequestConfig;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit.jupiter.SpringExtension;
import java.lang.reflect.Field;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.junit.Assert.assertThat;
import static org.mockito.Mockito.when;
@ExtendWith(SpringExtension.class)
@SpringBootTest(classes = {
        PreAuthConfiguration.class,
        StopConfiguration.class,
})
public class FeignClientConfigTest {

    @Mock
    private ProviderClientConfig providerClientConfig;

    @Test
    @DisplayName("should return HttpClient with defaultConfig field filled with values in providerClientConfig")
    public void shouldReturnHttpClientWithConfiguredValues() throws Exception {
        // given
        when(providerClientConfig.getConnectionRequestTimeout()).thenReturn(30000);
        when(providerClientConfig.getConnectionTimeout()).thenReturn(30);
        when(providerClientConfig.getMaxConnNumPerRoute()).thenReturn(20);
        when(providerClientConfig.getSocketTimeout()).thenReturn(10);
        FeignClientConfig feignClientConfig = new FeignClientConfig(providerClientConfig);

        // when
        HttpClient httpClient = feignClientConfig.buildHttpClient();

        // then
        // I want to test the internal state of the built HttpClient, and this should be checked.
        // I tried to use PowerMock's Whitebox, but then I found it uses reflection internally;
        // I don't want to introduce another dependency, and PowerMock is said not to be compatible with JUnit 5, so:
        Field requestConfigField = httpClient.getClass().getDeclaredField("defaultConfig");
        requestConfigField.setAccessible(true);
        RequestConfig requestConfig = (RequestConfig) requestConfigField.get(httpClient);
        assertThat(requestConfig.getConnectionRequestTimeout(), equalTo(30000));
        assertThat(requestConfig.getConnectTimeout(), equalTo(30));
        assertThat(requestConfig.getSocketTimeout(), equalTo(10));
    }
}
Also, I answered the first question linked above, about when to test private members of a class, here.
Whitebox worked for me. As this approach is not shown above, I'm adding my version.
In my case I wanted to test that the timeouts are different from 0, to avoid deadlocks:
HttpClient httpClient = factory.getHttpClient();
RequestConfig sut = Whitebox.getInternalState(httpClient, "defaultConfig");
assertNotEquals(0, sut.getConnectionRequestTimeout());
assertNotEquals(0, sut.getConnectTimeout());
assertNotEquals(0, sut.getSocketTimeout());
This looks like destructuring:
const {getElementById, seedElements} = require('./utils')
but I'm confused about it. I'm used to seeing something like:
let {first, last} = name
Are these doing the same thing, just in different files?
You can think of
const {getElementById, seedElements} = require('./utils')
as destructuring, since when you export, you would write your export like:
module.exports = { getElementById, seedElements };
or
export { getElementById, seedElements };
and when importing with require you are basically importing the entire module object, so you can destructure the individual exports from it.
const {getElementById, seedElements} = require('./utils')
would be similar to
const Utils = require('./utils');
const { getElementById, seedElements } = Utils;
With the import syntax, however, you would import the named exports like:
import { getElementById, seedElements } from './utils';
Yes, that is object destructuring.
The require() function in Node.js can be used to import modules, JSON, and local files. For instance (from the docs):
// Importing a local module:
const myLocalModule = require('./path/myLocalModule');
Calling require(moduleId) returns the module.exports object of moduleId (module.exports contains precisely the properties that the module makes available).
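To make the parallel concrete, here is a small sketch; the file contents are made up for illustration, reusing the names from the question.

// utils.js (hypothetical module)
function getElementById(id) { /* ... */ }
function seedElements() { /* ... */ }
module.exports = { getElementById, seedElements };

// app.js
// require('./utils') returns the whole module.exports object...
const utils = require('./utils');
// ...and the curly-brace form simply destructures properties off that object:
const { getElementById, seedElements } = require('./utils');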
I created a widget using the source code available on GitHub. Now I'm using that widget in SonarQube 5.3. This is where I got the source code from:
https://github.com/SonarSource/sonar-examples/tree/master/plugins/sonar-reference-plugin
When I use this widget, it shows the same data across multiple projects. I would like to know whether there is any way I can display different data for different projects. Please share your ideas. Below is the code for the Ruby widget:
import org.sonar.api.web.AbstractRubyTemplate;
import org.sonar.api.web.Description;
import org.sonar.api.web.RubyRailsWidget;
import org.sonar.api.web.UserRole;
import org.sonar.api.web.WidgetCategory;
import org.sonar.api.web.WidgetProperties;
import org.sonar.api.web.WidgetProperty;
import org.sonar.api.web.WidgetPropertyType;
import org.sonar.api.batch.CheckProject;
import org.sonar.api.resources.Project;
@UserRole(UserRole.USER)
@Description("Sample")
@WidgetCategory("Sample")
@WidgetProperties({
        @WidgetProperty(key = "Index", type = WidgetPropertyType.TEXT),
})
public class OneMoreRubyWidget extends AbstractRubyTemplate implements RubyRailsWidget {

    @Override
    public String getId() {
        return "Sample";
    }

    @Override
    public String getTitle() {
        return "Sample";
    }

    @Override
    protected String getTemplatePath() {
        return "/example/Index.html.erb";
    }
}
Thank you so much in advance
You haven't specified global scope for your widget (@WidgetScope("GLOBAL")) in the .java file, so this is a question of what's in your .erb file.
This Widget Lab property widget should give you some pointers. Specifically: you want to pick up @project in your widget, and query with @project.uuid. Here's another project-level widget for comparison.
You should be aware, though, that SonarSource is actively working to remove Ruby from the platform, so at some future date you'll probably end up rewriting your widgets (likely in pure JavaScript).
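As a rough, untested sketch of the .erb side (only the @project variable and its uuid come from the advice above; adapt the rest to your widget and SonarQube version):

<% if @project %>
  <p>Showing data for project <%= @project.uuid %></p>
  <%# query your data source with @project.uuid here %>
<% else %>
  <p>This widget needs a project context.</p>
<% end %>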
I know this is a fairly common problem. I'm writing a small Flask app and I'm trying to feed some queries back to the view.
I've connected to my local MongoDB setup and made a successful query, but I can't generate a JSON object from it.
The most common solution I've seen is to import json_util from pymongo, i.e.:
import json
from pymongo import json_util
results = connection.get_collection('papayas_papaya')
results = results.find({
'identifier': '1',
})
serialized_results = [json.dumps(result, default=json_util.default, separators=(',', ':')) for result in results]
I've installed pymongo into my Flask virtualenv using pip, i.e.:
pip install pymongo
When running the above code I keep getting the following error:
ImportError: cannot import name json_util
I can see this line in the pymongo-2.3-py2.6.egg-info/installed-files.txt
../bson/json_util.py
Anyone got any tips that can help me figure out what I'm doing wrong?
UPDATE:
Having noodled about with this a little further, I've managed to get it working thus:
import pymongo
from bson.json_util import dumps
connection = pymongo.Connection("localhost", 27017)
db = connection.mydocs
def get():
    cursor = db.foo.find({"name": "bar"})
    return dumps(cursor)
One of the problems I had was trying to pip install bson independently; pymongo brings bson with it, and installing the standalone bson package separately caused problems.
Thanks @Cagex for pointing me in the right direction.
It looks like you want to import from bson, not pymongo. I believe json_util was moved to that module recently.
https://pymongo.readthedocs.io/en/stable/api/bson/json_util.html
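In other words, keep the rest of the question's code as-is (reusing its connection object) and only change the import, e.g.:

import json
from bson import json_util  # json_util lives in the bson package that ships with pymongo

results = connection.get_collection('papayas_papaya').find({'identifier': '1'})
serialized_results = [
    json.dumps(result, default=json_util.default, separators=(',', ':'))
    for result in results
]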
I've seen quite a few posts on this issue, but they didn't resolve it for me. What worked for me was using dumps(), then loads():
import pymongo
from bson.json_util import dumps
from bson.json_util import loads
connection = pymongo.Connection("localhost", 27017)
db = connection.mydocs
def get():
    cursor = db.foo.find({"name": "bar"})
    return loads(dumps(cursor))
You can use list() to convert the PyMongo cursor to a list of documents, which can then be serialized:
import pymongo
from bson.json_util import dumps
from bson.json_util import loads
connection = pymongo.Connection("localhost", 27017)
db = connection.mydocs
def get():
    cursor = list(db.foo.find({"name": "bar"}))
    return loads(dumps(cursor))
You first need to define your own JSONEncoder:
import json
from datetime import datetime
from typing import Any
from bson import ObjectId
class MongoJSONEncoder(json.JSONEncoder):
    def default(self, o: Any) -> Any:
        if isinstance(o, ObjectId):
            return str(o)
        if isinstance(o, datetime):
            return str(o)
        return json.JSONEncoder.default(self, o)
And then use it to encode the Mongo cursor:
data_json = MongoJSONEncoder().encode(list(cursor))
that you can then load as a Python object with json.loads():
data_obj = json.loads(data_json)