I am working on a Blazor Server web application.
I have to display a table with a large number of rows and columns, and I want to optimize it.
Is there a way to enable data compression on Blazor's WebSockets?
Is compression enabled by default in the Development or Production environment?
Thanks
Unsure if this is going to be the best answer for this, but I figured I would post it anyway. As others have commented, it would be best to use virtualization or pagination along with other app optimizations, but your question made me curious.
A quick Google search turns up that ASP.NET has response compression.
The code used (placed as the very first thing in each respective method):
In Startup.ConfigureServices:
// Requires: using Microsoft.AspNetCore.ResponseCompression;
//           using System.IO.Compression;
services.AddResponseCompression(o =>
{
    o.EnableForHttps = true;
    o.Providers.Add<BrotliCompressionProvider>();
    o.Providers.Add<GzipCompressionProvider>();
    o.MimeTypes = ResponseCompressionDefaults.MimeTypes.Concat(
        new[] { "image/svg+xml" });
});
services.Configure<BrotliCompressionProviderOptions>(options =>
{
    options.Level = CompressionLevel.Optimal;
});
services.Configure<GzipCompressionProviderOptions>(options =>
{
    options.Level = CompressionLevel.Optimal;
});
In Startup.Configure:
app.UseResponseCompression();
Default Blazor Server app:
Default: 2.2 kB transferred.
With changes: 1.5 kB transferred.
Loading my own Blazor Server app and testing it:
Default: 5.4 kB transferred.
With changes: 3.7 kB transferred.
Overall these changes compress what is being transferred somewhat, and the compression could most likely be tuned further, but they should not be a substitute for optimizing your code.
I am using PouchDB and CouchDB in an Ionic application. While I can successfully sync local and remote databases on Chrome and Android, I get an unauthorized error on Safari/iOS when I run the sync command. Below is a simplified version of my database service provider.
import PouchDB from 'pouchdb';
import PouchDBAuthentication from 'pouchdb-authentication';

@Injectable()
export class CouchDbServiceProvider {
  private db: any;
  private remote: any;

  constructor() {
    PouchDB.plugin(PouchDBAuthentication);
    this.db = new PouchDB('localdb', { skip_setup: true });
  }
  ...
  login(credentials) {
    let couchDBurl = 'URL of my couchDB database';
    this.remote = new PouchDB(couchDBurl);
    // An arrow function keeps `this` bound to the service instance
    this.remote.logIn(credentials.username, credentials.password, (err, response) => {
      if (err) { console.log('login error') }
      else {
        let options = { live: true, retry: true, continuous: true };
        this.db.sync(this.remote, options).on('error', (err_) => { console.log('sync error') });
      }
    })
  }
  ...
}
In the code above, this.remote.logIn(...) is successful but this.db.sync(...) fails. I have checked the requests via the network tab of the developer tools, and I believe the issue is that the cookie returned in the response header of this.remote.logIn(...) is not used by the subsequent calls (thus the unauthorized error). The issue is fixed once third-party cookies are enabled on Safari, which is not an option on iOS.
How can I fix this problem?
One potential solution I'm considering is overriding fetch to use a native HTTP client (i.e., an instance of HTTP from @ionic-native/http). It seems modifying HTTP clients is a possibility (e.g., according to this conversation) but I'm not sure how to achieve that.
Changing the HTTP plumbing sounds like a really bad idea, mainly because of the time cost, unless you absolutely have to use sessions/cookies... If you don't, read on.
As noted here regarding PouchDB security, I tried pouchdb-authentication when it was actively maintained and went another route due to multiple issues (I don't recall the specifics; it was 6 years ago).
Do note that the last commit to pouchdb-authentication seems to be 3 years ago. Although inactivity is not a negative indicator on the surface (a project may simply have reached a solid conclusion), installing pouchdb-authentication yields this:
found 6 vulnerabilities (2 moderate, 3 high, 1 critical)
That, plus the lack of love given to the plugin over the last few years, makes for dangerous technical debt to add to a new project.
If possible, simply send credentials using the auth option when creating (or opening) a remote database, e.g.
const credentials = { username: 'foo', password: 'bar' };
this.remote = new PouchDB(couchDBurl, { auth: credentials });
I don't recall why, but I wrote code that is in essence what follows, and I have reused it ad nauseam because it just works with the fetch option:
const user = { name: 'foo', pass: 'bar' };
const options = {
  fetch: function (url, opts) {
    opts.headers.set('Authorization', 'Basic ' + window.btoa(user.name + ':' + user.pass));
    return PouchDB.fetch(url, opts);
  }
};
this.remote = new PouchDB(couchDBurl, options);
I believe I chose this approach due to the nature of my authentication workflow discussed in the first link of this answer.
I agree with @RamblinRose that you might have to include the headers manually when you define the PouchDB object.
I myself have found a solution when working with JWTs that need to be included in the header for sync purposes.
See this answer: https://stackoverflow.com/a/64503760/5012227. Note: RxDB uses PouchDB under the hood, so it's applicable to this situation. It helped me sync; hope it does you too!
One potential solution I'm considering is overriding fetch to use a native HTTP client (i.e., an instance of HTTP from @ionic-native/http). It seems modifying HTTP clients is a possibility (e.g., according to this conversation) but I'm not sure how to achieve that.
Yes, this is a possible option - especially if you want to use SSL pinning which will only work with native requests. And you don't need to worry about CORS (apart from ionic serve).
You can achieve this, e.g., by taking an existing fetch polyfill and modifying it so that it uses the HTTP plugin instead of XHR. And since you'll only deal with JSON when interacting with CouchDB, you can throw away most of the polyfill.
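To illustrate, here is a rough sketch of that idea: a fetch-shaped wrapper over a native HTTP client. It assumes the native client exposes a sendRequest(url, options) method returning a Promise (as @ionic-native/http's service does); the option shapes and names here are illustrative, not a drop-in implementation.

```javascript
// Translate fetch-style options into a native-client request description.
// Kept as a pure helper so it can be tested separately.
function toNativeOptions(opts) {
  opts = opts || {};
  return {
    method: (opts.method || 'GET').toLowerCase(),
    headers: opts.headers || {},
    // CouchDB talks JSON, so parse the string body back into an object
    data: opts.body ? JSON.parse(opts.body) : undefined,
    serializer: 'json'
  };
}

// Build a fetch-compatible function from a native HTTP client.
function makeNativeFetch(nativeHttp) {
  return function nativeFetch(url, opts) {
    return nativeHttp.sendRequest(url, toNativeOptions(opts)).then(function (res) {
      // Return a minimal Response-like object; PouchDB mostly needs json()
      return {
        ok: res.status >= 200 && res.status < 300,
        status: res.status,
        json: function () { return Promise.resolve(JSON.parse(res.data)); },
        text: function () { return Promise.resolve(res.data); }
      };
    });
  };
}

// Hypothetical wiring in the Ionic service:
// this.remote = new PouchDB(couchDBurl, { fetch: makeNativeFetch(this.http) });
```

Since the native plugin performs the request outside the WebView, the third-party-cookie restriction no longer applies, which is the whole point of this detour.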
As you know, Nuxt.js is server-side rendered, and there is no good example of how to store data in localStorage, which is client-side.
I need to build a login form where the user enters a username and password, which are sent to the server (via an API) to check. If the login data is correct, the API returns a token, which I then store to verify that the user is authenticated and use with other API calls.
I found an example built with Vue.js here: https://auth0.com/blog/build-an-app-with-vuejs/ but I have no idea how to adapt it to Nuxt.js.
I have also read about https://github.com/robinvdvleuten/vuex-persistedstate which I could plug into my project, but I would like to see other solutions.
Regards.
Nuxt provides you with process.client to tell it to execute only on the client side, so use it like this:
methods: {
  storeToken(token) {
    if (process.client) {
      localStorage.setItem("authToken", token)
    }
  }
}
Check this link for more info.
You can use process.client to check whether the code is running on the client side or not.
export default {
  created() {
    this.storeToken();
  },
  methods: {
    storeToken(token) {
      if (process.client) {
        localStorage.setItem("authToken", token);
      }
    }
  }
}
You can also call this method in mounted without checking process.client, because mounted only runs on the client:
export default {
  mounted() {
    this.storeToken();
  },
  methods: {
    storeToken(token) {
      localStorage.setItem("authToken", token);
    }
  }
}
A little late to the party, but I was having similar problems.
First, though, I would recommend you use cookies for saving a key/token/JWT.
The reason is that localStorage can be hijacked through JS APIs, while cookies can be safeguarded from that. You will, however, have to safeguard your token from CSRF.
That can be done by looking at the Referer and Origin headers server-side.
This post explains how to do that: How to protect your HTTP Cookies
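As a minimal sketch of that server-side check (the origins and the Express-style hookup below are illustrative): compare the request's Origin header, falling back to Referer, against an allowlist before acting on a cookie-authenticated request.

```javascript
// Decide whether a request's Origin/Referer matches an allowed origin.
// headers: a plain object of request headers (lowercase keys, as in Node).
function isTrustedSource(headers, allowedOrigins) {
  const source = headers.origin || headers.referer;
  if (!source) return false; // be strict: no header, no trust
  return allowedOrigins.some(
    o => source === o || source.startsWith(o + '/')
  );
}

// Express-style usage (illustrative):
// app.use((req, res, next) =>
//   isTrustedSource(req.headers, ['https://example.com'])
//     ? next()
//     : res.sendStatus(403));
```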
As for accessing localStorage from Nuxt, here we go:
If you are running Nuxt and haven't told it to run in spa mode it will run in universal mode. Nuxt defines universal mode as:
Isomorphic application (server-side rendering + client-side navigation)
The result being that localStorage is not defined serverside and thus throws an error.
The giveaway for me was that console logging from middleware files and Vuex output to the terminal, not to the console in the browser's developer tools.
If you want to read more about my solution, you can find it here: localStorage versus Nuxt's modes
If you plan on storing small amounts of data, below 4096 bytes, you can use cookies. I recommend the library cookie-universal-nuxt.
npm install cookie-universal-nuxt --save
nuxt.config.js
modules: [
'cookie-universal-nuxt',
],
Then you can use:
const data = {
  anything: 'you want',
}
this.$cookies.set('thing', data, {
  path: '/',
  maxAge: 60 * 60 * 24 * 7
});
this.$cookies.get('thing');
Read the library docs for more if you need it.
The cookie will be available server-side, so you can get around the issues with localStorage.
Just be aware that cookies can only store up to 4096 bytes per cookie.
For example, I fetch cookie data in the nuxtServerInit function in Vuex, and the data is then available everywhere in the app server-side.
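That pattern looks roughly like this; a sketch with an illustrative store shape, assuming cookie-universal-nuxt provides app.$cookies. In a real Nuxt 2 app the three pieces would be exported from store/index.js.

```javascript
// Vuex store pieces: nuxtServerInit runs on the server for the first request,
// so the cookie value lands in the store before the page is rendered.
const state = () => ({ authToken: null });

const mutations = {
  setToken(state, token) { state.authToken = token; }
};

const actions = {
  nuxtServerInit({ commit }, { app }) {
    const token = app.$cookies.get('authToken'); // readable server-side
    if (token) commit('setToken', token);
  }
};
```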
Insofar as client means a web browser, all the options are spelled out in the HTML Living Standard, Web Storage section.
12.2 The API
12.2.1 The Storage interface
12.2.2 The sessionStorage getter
12.2.3 The localStorage getter
12.2.4 The StorageEvent interface
I know it's a bit late and it might not be the answer you are looking for, but it could be helpful for someone. What I learned after going through many documentations and other answers is that you just can't use local or session storage directly. In development it runs when you change the route, but when you refresh the page or the component using sessionStorage, it throws "sessionStorage is not defined". If you are not planning to store data long-term the way you would with session or local storage, you can work with the useState(() => data) composable that Nuxt provides; it stores your data until you refresh the page. If you do want to use local or session storage, you can use the VueUse module.
I have this use case:
- I'm working on a game with a web app for user management and chat, built on MERN, and a Unity game, with socket.io as the real-time messaging layer for the multiplayer game.
- Users may register on the web app either by providing an email/password pair or by authenticating with FB/Gmail/etc. as usual, in which case the user's email is obtained and saved to MongoDB; this is handled by Passport.
- There is no session on the Express side, and socket.io state lives in Redis. There are no cookies, but JWT is used.
My problem is that I don't know what the best practice is for this. I read this article and this one, which both have content and code close to what I want to do, but the first one has:
app.use(express.cookieParser());
while I don't want to use cookies at all, and the other one also has in its code:
cookie: {
  secure: process.env.ENVIRONMENT !== 'development' && process.env.ENVIRONMENT !== 'test',
  maxAge: 2419200000
}...
Also, I found this on GitHub, which suggests for the client side (Unity):
var socket = io.connect('http://localhost:9000');
socket.on('connect', function () {
  socket
    .on('authenticated', function () {
      //do other things
    })
    .emit('authenticate', { token: jwt }); //send the jwt
});
meaning that:
1. the socket is created
2. authentication is requested
But I think the approach I found in the other article is better, where the socket is not created at all if the JWT is not provided in the very first connection request sent to io. So if I did it, I'd issue:
var socket = io.connect('http://localhost:9000', {query: {"JWT":"myjwt"}});
and in my server side where I have:
io.on("connection", function(socket){...});
I'd like to first get the JWT:
var jwt = socket.handshake.query["JWT"];
and then, if auth is unsuccessful, simply call socket.disconnect(reason) and never open the connection at all (maybe I just misunderstood here; the author of the GitHub source uses a middleware technique, which may also run before anything else).
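For what it's worth, that reject-before-connection idea is usually written as a socket.io middleware registered with io.use, which runs before the connection event fires. A sketch, with verifyJwt standing in for a real verification call such as jsonwebtoken's jwt.verify:

```javascript
// Build a socket.io middleware that rejects sockets lacking a valid JWT.
// verifyJwt(token) should return the decoded payload, or throw if invalid.
function makeAuthMiddleware(verifyJwt) {
  return function (socket, next) {
    const token = socket.handshake.query.JWT;
    if (!token) return next(new Error('missing token'));
    try {
      socket.user = verifyJwt(token); // attach the payload for later handlers
      return next();
    } catch (e) {
      return next(new Error('invalid token'));
    }
  };
}

// Server-side wiring (illustrative):
// io.use(makeAuthMiddleware(t => jwt.verify(t, process.env.JWT_SECRET)));
// io.on('connection', socket => { /* socket.user is already verified */ });
```

Calling next with an Error makes socket.io refuse the connection, so no connection event is ever emitted for an unauthenticated client.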
I still could not find out what best practice the gurus use; please help me get this clear.
I have two websites, one client website and a pricing WEBAPI website. I have a performance issue with the pricing website as it often suspends itself due to low usage and on the first call takes time to initialize. If I repeat the request immediately after that, it is very quick.
I know that it will be used on certain pages of the client website. I therefore wish to call it when those pages load so it is ready when the user's valid request comes in seconds later. Please note the pricing WEBAPI site is not reachable from the client; only the client website can access it on the server side.
I don't know the best approach to this, and I don't wish to impact the performance of the client website. I have considered a 1px x 1px iframe calling a page, but I am concerned it will block the page. Is an Ajax call more appropriate, and how do I call something on the client website that in turn calls the web service? Is there a better approach I haven't considered?
This is a known issue on shared hosting environments; a workaround is fine, but I would suggest upgrading your server. My host has a DotNetNuke option, which essentially means it reserves memory on the server and doesn't recycle the app pool due to inactivity. Compared to a VPS this is cheaper.
If it is not shared hosting, these IIS settings could help you.
Anyway, back to your workaround:
You say the client cannot access the web API, only the back end of your website can. That seems odd, because a web API exposes REST GET/POST methods. Either way, you could make an async call to your web API on the server side that does not wait for a response, or make a JavaScript call to your API.
Assuming your website is also ASP.NET:
public static async Task StartupWebapi()
{
    string requestUrl = "http://yourwebapi.com/api/startup";
    using (var client = new HttpClient())
    {
        //client.Timeout = new TimeSpan(0, 0, 20); // timeout if needed
        try
        {
            HttpResponseMessage response = await client.GetAsync(requestUrl);
            if (response.IsSuccessStatusCode)
            {
                string resultString = await response.Content.ReadAsStringAsync();
            }
        }
        catch (HttpRequestException)
        {
            // This is only a warm-up call, so failures can be ignored.
        }
    }
}
Then, somewhere in your code, call it, at minimum when your client website starts:
HostingEnvironment.QueueBackgroundWorkItem(ct => SomeClass.StartupWebapi());
Or in JavaScript, which is executed asynchronously:
$.ajax({
  url: "http://yourwebapi.com/api/startup",
  type: "GET",
  success: function (response) {
  },
  error: function (response) {
  }
});
See this question for some other workarounds.
I'm building a mobile app with AngularJS and SQLite for offline storage.
Does anybody have an idea how to structure the SQL statements?
I have my controllers and they call the factories, but is it a good idea to write the statements in the factories? HTTP requests are not useful in this case. Is there a further "abstraction layer"?
app.controller('loginController', function loginController($scope, loginFactory) {
  $scope.loginFactory = function () {
    return loginFactory.login($scope.firstnameLogin, $scope.passwordLogin);
  };
});

app.factory('loginFactory', function () {
  return {
    login: function (firstnameLogin, passwordLogin) {
      // HERE THE SQL-STATEMENT? //
    }
  };
});
Edit: Added some code.
I've never actually used JavaScript to connect to a SQLite database in the manner you are asking, but it does seem that if you're using HTML5, you can connect to a data store in the browser and store your data in a SQL-like manner.
Adobe has a walkthrough of how to connect and do your standard CRUD activities with this data store. After reading through the document, I would guess that you can use the basics to build up an AngularJS factory that returns the Create, Read, Update and Delete methods for your mobile application to use.
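To make that concrete, here is a rough sketch of such a factory built on the (now-deprecated) Web SQL openDatabase API. The query helper is kept standalone so it is easy to test; all names are illustrative:

```javascript
// Wrap Web SQL's callback-based executeSql in a promise-returning helper.
// db: a database handle from openDatabase; defer: a $q.defer-compatible factory.
function makeQuery(db, defer) {
  return function query(sql, params) {
    var d = defer();
    db.transaction(function (tx) {
      tx.executeSql(sql, params || [],
        function (_tx, result) { d.resolve(result); },
        function (_tx, err) { d.reject(err); });
    });
    return d.promise;
  };
}

// AngularJS wiring (illustrative):
// app.factory('dbFactory', ['$q', function ($q) {
//   var db = window.openDatabase('appdb', '1.0', 'app db', 2 * 1024 * 1024);
//   return { query: makeQuery(db, function () { return $q.defer(); }) };
// }]);
//
// loginFactory could then delegate to it:
// return dbFactory.query(
//   'SELECT * FROM users WHERE name = ? AND pass = ?',
//   [firstnameLogin, passwordLogin]);
```

Keeping the SQL inside a dedicated data-access factory like this gives you the extra abstraction layer the question asks about: controllers call loginFactory, and only dbFactory knows about the database.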
Good luck, hope this helps you some.
Store data in the HTML5 SQLite database