I created a simple (so far) Blazor WebAssembly application in .NET 6.
I'm currently adding additional HTTP headers to every response of the application and wanted to add an X-Frame-Options header, but when searching for how to do it, I realized I don't know how to approach it.
For starters here's my Program.cs file:
using Microsoft.AspNetCore.Components.Web;
using Microsoft.AspNetCore.Components.WebAssembly.Hosting;
using MyApplicationNamespace;
var builder = WebAssemblyHostBuilder.CreateDefault(args);
builder.RootComponents.Add<App>("#app");
builder.RootComponents.Add<HeadOutlet>("head::after");
builder.Services.AddScoped(sp => new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) });
await builder.Build().RunAsync();
While reading this webpage I learned about using middleware like this:
app.Use(async (context, next) =>
{
context.Response.Headers.Add("x-my-custom-header", "middleware response");
await next();
});
I do understand from this site that in order to use the Use function I can do this:
var app = builder.Build();
app.Use();
Or that I can just pass a delegate function
app.Run(async context =>
{
await context.Response.WriteAsync("Hello from 2nd delegate.");
});
Point is, in Blazor WASM I don't have a Run method, and RunAsync does not take parameters.
I'm not sure where to go from here to add a header. Am I missing a NuGet package?
From what I've learned from this person on Twitter:
The Blazor WASM application is the client. It lives exclusively within the browser. It is receiving responses from a web server, and not "returning" anything. X-Frame-Options headers need to be set on the server, not by the application in the Browser.
Do you mean that the web server should add these headers when it delivers the (static) files of your Blazor application to the browser, before the app starts executing there? If so, you need to configure your web server (whatever it is) to send these headers.
Since I deployed my application as an Azure App Service, I used the Advanced Tools to edit the web.config, inspired by this site.
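In case it helps anyone with the same setup, this is roughly what the relevant part of the web.config can look like (a minimal sketch; merge the httpProtocol element into your existing system.webServer section rather than replacing the whole file, and change the value to DENY if you never want framing):
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Sent with every response the App Service serves for this app -->
        <add name="X-Frame-Options" value="SAMEORIGIN" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>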
In a Blazor WebAssembly app I have a single server-side method that returns results with circular references.
I found out that I can handle this situation on the server side by adding the following:
builder.Services.AddControllersWithViews()
.AddJsonOptions(options =>
{
options.JsonSerializerOptions.ReferenceHandler = ReferenceHandler.Preserve;
});
and on the client side:
var options = new JsonSerializerOptions() { ReferenceHandler = ReferenceHandler.Preserve };
var r = await _http.GetFromJsonAsync<MyObject>($"api/mycontroller/mymethod", options);
Unfortunately, this way reference handling is enabled for every method on the server, which introduces "$id" keys in almost all of my methods' results.
This would force me to change every client call to introduce the ReferenceHandler.Preserve option.
Is there a way to specify ReferenceHandler.Preserve for some methods only (server side) or alternatively an option to force ReferenceHandler.Preserve for every GetFromJsonAsync (client side)?
You can use custom middleware in your server. In the custom middleware, check the URL of the request coming from Blazor: if the URL meets your requirements, execute the method in the middleware; if not, just skip it.
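If you prefer not to go through middleware, another option (a hedged sketch, not taken from the answer above; the controller, route and helper names are hypothetical) is to drop the global AddJsonOptions call and opt in to reference preservation only inside the one action that needs it, by returning a JsonResult with its own serializer options:
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/mycontroller")]
public class MyController : ControllerBase
{
    [HttpGet("mymethod")]
    public IActionResult MyMethod()
    {
        // Stand-in for your existing query that produces circular references.
        var data = GetDataWithCircularReferences();

        // Only this action serializes with ReferenceHandler.Preserve;
        // every other endpoint keeps the default behaviour (no "$id" keys).
        var options = new JsonSerializerOptions
        {
            ReferenceHandler = ReferenceHandler.Preserve
        };
        return new JsonResult(data, options);
    }

    private object GetDataWithCircularReferences()
    {
        // Hypothetical placeholder for the real data access code.
        return new object();
    }
}
The client would still need the ReferenceHandler.Preserve options for this one call, as in the GetFromJsonAsync snippet above, but all other methods and calls stay untouched.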
Can I use JWT authentication with GunDB? And if so, would it dramatically slow down my sync speed? I was going to try to implement a test using the tutorial here, but wanted to see if there were any 'gotchas' I should be aware of.
The API has changed to use a middleware system. The SEA (Security, Encryption, Authorization) framework will be published to handle stuff like this. However, you can roll your own by doing something like this on the server:
Gun.on('opt', function(ctx){
if(ctx.once){ return }
ctx.on('in', function(msg){
var to = this.to;
// process message.
to.next(msg); // pass to next middleware
});
});
Registering the in listener via the opt hook makes this middleware first in line (before even gun core); that way you can filter all inputs and reject them if necessary (by not calling to.next(msg)).
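For example, a hedged sketch of rejecting messages inside that in listener (isAllowed is a hypothetical predicate you would implement yourself):
ctx.on('in', function(msg){
  var to = this.to;
  if(!isAllowed(msg)){
    return; // dropping the message here rejects it; gun core never sees it
  }
  to.next(msg); // otherwise pass it on to the next middleware
});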
Likewise, to add headers on the client you would register an out listener (similar to how we did for the in), modify the outgoing message to have msg.headers = {token: data}, and then pass it forward to the next middleware layers (which will probably be the websocket/transport hooks) by calling to.next(msg) as well. More docs to come on this as it stabilizes.
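A minimal sketch of that client-side out hook, assuming a JWT variable that already holds your token:
Gun.on('opt', function(ctx){
  if(ctx.once){ return }
  ctx.on('out', function(msg){
    var to = this.to;
    msg.headers = {token: JWT}; // attach the token to every outgoing message
    to.next(msg); // hand it to the next layer (likely the websocket/transport hooks)
  });
});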
Old Answer:
A very late answer, sorry this was not addressed sooner:
The default websocket/ajax adapter allows you to update a headers property that gets passed on every networked message:
gun.opt({
headers: { token: JWT },
});
On the server you can then intercept and reject/authorize requests based on the token:
gun.wsp(server, function(req, res, next){
if('get' === req.method){
return next(req, res);
}
if('put' === req.method){
return res({body: {err: "Permission denied!"}});
}
});
The above example rejects all writes and authorizes all reads, but you would replace this logic with your own rules.
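A hedged sketch of what that could look like with an actual check (verifyJwt is a hypothetical helper, and it assumes the headers set via gun.opt arrive on req.headers):
gun.wsp(server, function(req, res, next){
  var token = req.headers && req.headers.token;
  if('get' === req.method){
    return next(req, res); // allow all reads
  }
  if('put' === req.method){
    if(verifyJwt(token)){
      return next(req, res); // authorized write
    }
    return res({body: {err: "Permission denied!"}}); // reject unauthenticated writes
  }
});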
I'm building an isomorphic React application which uses Express.js on the server. The client app makes a number of AJAX requests to other Express handlers, which currently means the server makes multiple HTTP requests to itself.
As an optimisation I'd like to intercept requests I know the server handles and call them directly (thus avoiding the cost of leaving the application bounds). I've got as far as accessing the app's router to know which routes it handles; however, I'm struggling to find the best way to initiate a new request. So my question is:
How do I get Express to handle an HTTP request that comes from a programmatic source rather than the network?
I would suggest creating a common service and requiring it in both handlers. What I do is put the business logic in a service and create controllers which handle the request and call specific services; this way you can use multiple services in the same controller, e.g.:
router.js
var clientController = require('../controllers/client-controller.js');
module.exports = function(router) {
router.get('/clients', clientController.getAll);
};
client-controller.js
var clientService = require('../services/client-service.js');
function getAll(req, res) {
clientService.getAll().then(function(data) {
res.json(data);
}, function(err) {
res.json(err);
});
}
module.exports.getAll = getAll;
client-service.js
function getAll() {
// implementation
}
module.exports.getAll = getAll;
You can also use something like http://visionmedia.github.io/superagent/ to make HTTP calls from controllers and make use of the results there.
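For example, a small hedged sketch of a controller using superagent for a call that genuinely has to go over the network (the URL and function name are made up):
var request = require('superagent');

function getAllRemote(req, res) {
  // Only reach for HTTP when the data lives outside this process.
  request
    .get('https://api.example.com/clients')
    .then(function (response) {
      res.json(response.body);
    }, function (err) {
      res.status(500).json(err);
    });
}

module.exports.getAllRemote = getAllRemote;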
I've been spending hours upon hours on this problem, but to no avail.
EDIT: solution found (see my answer)
Project background
I'm building a project in Symfony2, which requires a module for uploading large files. I've opted for Node.js and Socket.IO (I had to learn it from scratch, so I might be missing something basic).
I'm combining these with the HTML5 File and FileReader API's to send the file in slices from the client to the server.
Preliminary tests showed this approach working great as a standalone app, where everything was handled and served by Node.js, but integration with Apache and Symfony2 seems problematic.
The application has an unsecured and secured section. My goal is to use Apache on ports 80 and 443 for serving the bulk of the app built in Symfony2, and Node.js with Socket.io on port 8080 for file uploads. The client-side page connecting to the socket will be served by Apache, but the socket will run via Node.js. The upload module has to run over HTTPS, as the page resides in a secured environment with an authenticated user.
The problem is events using socket.emit or socket.send don't seem to work. Client to server, or server to client, it makes no difference. Nothing happens and there are no errors.
The code
The code shown is a simplified version of my code, without the clutter and sensitive data.
Server
var httpsModule = require('https'),
fs = require('fs'),
io = require('socket.io');
var httpsOptions =
{
key: fs.readFileSync('path/to/key'),
cert: fs.readFileSync('/path/to/cert'),
passphrase: "1234lol"
}
var httpsServer = httpsModule.createServer(httpsOptions);
var ioServer = io.listen(httpsServer);
httpsServer.listen(8080);
ioServer.sockets.on('connection', function(socket)
{
// This event gets bound, but never fires
socket.on('NewFile', function(data)
{
// To make sure something is happening
console.log(data);
// Process the new file...
});
// Oddly, this one does fire
socket.on('disconnect', function()
{
console.log("Disconnected");
});
});
Client
// This is a Twig template, so I'll give an excerpt
{% block javascripts %}
{{ parent() }}
<script type="text/javascript" src="https://my.server:8080/socket.io/socket.io.js"></script>
<script type="text/javascript">
var socket = io.connect("my.server:8080",
{
secure: true,
port: 8080
});
// Imagine this is the function initiating the file upload
// File is an object containing metadata about the file like, filename, size, etc.
function uploadNewFile(file)
{
socket.emit("NewFile", item);
}
</script>
{% endblock %}
So the problem is...
Of course there's much more to the application than this, but this is where I'm stuck. The page loads perfectly without errors, but the emit events never fire or reach the server (except for the disconnect event). I've tried with the message event on both client and server to check if it was a problem with only custom events, but that didn't work either. I'm guessing something is blocking client-server communication (it isn't the firewall, I've checked).
I'm completely at a loss here, so please help.
After some painstaking debugging, I've found what was wrong with my setup. Might as well share my findings, although they are (I think) unrelated to Node.js, Socket.IO or Apache.
As I mentioned, my question had simplified code to show you my setup without clutter. I was, however, setting up the client through an object, using the properties to configure the socket connection. Like so:
var MyProject = {};
MyProject.Uploader =
{
location: 'my.server:8080',
socket: io.connect(location,
{
secure: true,
port: 8080,
query: "token=blabla"
}),
// ...lots of extra properties and methods
}
The problem lay in the use of location as a property name. It isn't technically a reserved word, but in the browser the bare identifier location resolves to the global window.location, so io.connect(location, ...) was being handed the page's Location object instead of my string. I had also noticed I was referencing the property incorrectly; I had forgotten to use this.location when connecting to the socket. So I changed it to this, just as a test.
var MyProject = {};
MyProject.Uploader =
{
location: 'my.server:8080',
socket: io.connect(this.location,
{
secure: true,
port: 8080,
query: "token=blabla"
}),
// ...lots of extra properties and methods
}
But to no avail (inside an object literal, this does not refer to the object being built, so this.location still resolved to window.location). I was still not getting data over the socket. So the next step seemed logical in my frustration-driven debugging rage. Changing up the property name fixed everything!
var MyProject = {};
MyProject.Uploader =
{
socketLocation: 'my.server:8080',
socket: io.connect(this.socketLocation,
{
secure: true,
port: 8080,
query: "token=blabla"
}),
// ...lots of extra properties and methods
}
This approach worked perfectly, I was getting loads of debug messages. SUCCESS!!
Whether you call it expected behaviour or misuse, what was happening is that the identifier (and this.location inside the object literal) resolved to the global window.location rather than to my object's own property, because an object literal can't reference its own properties while it is being constructed. I only know I'm steering clear of names like that from now on!
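If you do want the address to live on the object, one way to avoid the pitfall entirely (a hedged sketch, using the same illustrative names as above) is to build the object first and connect afterwards, so the property reference is unambiguous:
var MyProject = {};
MyProject.Uploader =
{
    socketLocation: 'my.server:8080',
    socket: null
    // ...lots of extra properties and methods
};
// Connect after the literal exists, so its own property can be referenced safely.
MyProject.Uploader.socket = io.connect(MyProject.Uploader.socketLocation,
{
    secure: true,
    port: 8080,
    query: "token=blabla"
});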
Hope it helps anyone out there!