I have an Angular and NestJS project. I need to emulate a broken API endpoint without stopping and restarting the entire NestJS project, similar to an actuator in the Spring Framework, but in a separate maintenance app that lets me test how the Angular app handles remote broken API endpoints.
I tried a Node.js app that changes the /etc/hosts file in real time.
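One approach that avoids touching both the NestJS process and /etc/hosts is a small standalone proxy that sits between the Angular app and the real API and can be told at runtime to start failing specific paths. A minimal sketch, assuming Express 4 and Node 18+ (for the global fetch); the ports and the /__break toggle route are illustrative:

```ts
// chaos-proxy.ts: a standalone maintenance proxy. Point the Angular app
// at this proxy instead of the real NestJS API and toggle failures at
// runtime; the NestJS process itself is never restarted.
import express from "express";

const app = express();
const UPSTREAM = "http://localhost:3000"; // hypothetical NestJS address
const broken = new Set<string>();         // paths currently forced to fail

// Toggle failure for a path, e.g. POST /__break/api/users
app.post("/__break/*", (req, res) => {
  const path = "/" + req.params[0];
  if (broken.has(path)) broken.delete(path);
  else broken.add(path);
  res.json({ path, broken: broken.has(path) });
});

// Forward everything else (GET-only for brevity), unless marked broken.
app.use(async (req, res) => {
  if (broken.has(req.path)) {
    return res.status(503).json({ message: "Simulated outage" });
  }
  const upstream = await fetch(UPSTREAM + req.originalUrl, { method: req.method });
  res.status(upstream.status);
  res.set("content-type", upstream.headers.get("content-type") ?? "application/json");
  res.send(await upstream.text());
});

app.listen(4000, () => console.log("maintenance proxy on http://localhost:4000"));
```

Pointing the Angular environment's API URL at port 4000 then lets you break and unbreak individual endpoints with a single POST.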
Some background: I have a .NET Core API that serves as the backend for my Vue app. I've configured a bunch of environment variables in my Vue app that point to localhost:port to access that API. This works fine in my development environment, but now I'm trying to deploy this to IIS.
For now, I took the simple route to hosting my solution: both the API and the Vue app are on different sites in IIS.
I'm deploying the API as a web deploy package (using MSDeploy), and it seems to be working just fine. For the Vue app, all I'm doing is copying the npm run build output from /dist/ over to the site's physical directory in IIS. There's no web.config or anything (I wasn't sure if this was needed).
Here's the issue I'm running into: if I browse the Vue app on the server running IIS, everything works fine, because the app's API requests to localhost resolve to the server itself. However, if I browse the Vue app from a different machine, it doesn't work, because the app is still trying to reach localhost on the client machine, where there is no API.
I'm a novice when it comes to IIS, so I'm not sure if this is something that I need to address in my Vue app (i.e. by making the API URL/port configurable during deployment), or if it's a misconfiguration in IIS. Any ideas?
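A minimal sketch of the runtime-config approach mentioned in the parenthetical above (making the API URL configurable during deployment rather than baked in at build time); the config.json file name and the apiBaseUrl key are assumptions:

```ts
// config.ts: load the API base URL at runtime instead of from a
// compile-time env variable. Assumes a config.json deployed next to
// index.html, e.g. { "apiBaseUrl": "https://myserver/api" }, which a
// release step (or MSDeploy) can overwrite per environment.
export interface AppConfig {
  apiBaseUrl: string;
}

let config: AppConfig | undefined;

export async function loadConfig(): Promise<AppConfig> {
  if (!config) {
    const res = await fetch("/config.json"); // served as a static file by IIS
    config = (await res.json()) as AppConfig;
  }
  return config;
}

// Usage (hypothetical): await loadConfig() before mounting the Vue app,
// then build request URLs as `${config.apiBaseUrl}/...` instead of localhost.
```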
I have built a React application using The Movie DB API, and I am fairly new to deploying applications. I want to publish the app as a portfolio project. Currently the frontend calls the API directly using a key stored in an .env file, but deploying it that way will expose my API key, so I want a separate backend to call the API and send the data back to the frontend. My question: is there any possibility that hosting the backend on Heroku and accessing that backend from a Netlify frontend could somehow expose my API key? Some explanation of how environment variables work on both of these platforms would also be helpful.
I haven't uploaded anything to Netlify or Heroku yet, as the API key is sensitive information.
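For reference, the backend-proxy pattern usually looks like the sketch below, assuming Express; the route name is illustrative, and cors is assumed so the Netlify origin can call the Heroku origin:

```ts
// server.ts: a minimal TMDB proxy for Heroku. The key is read from a
// Heroku config var (set via `heroku config:set TMDB_API_KEY=...`) and
// never leaves the server; the browser only sees the proxied JSON.
import express from "express";
import cors from "cors";

const app = express();
app.use(cors()); // allow the Netlify-hosted frontend to call this origin

const TMDB = "https://api.themoviedb.org/3";
const KEY = process.env.TMDB_API_KEY;

app.get("/api/search", async (req, res) => {
  const query = encodeURIComponent(String(req.query.q ?? ""));
  const upstream = await fetch(`${TMDB}/search/movie?api_key=${KEY}&query=${query}`);
  res.status(upstream.status).json(await upstream.json());
});

app.listen(Number(process.env.PORT) || 3000); // Heroku injects PORT
```

The key difference between the platforms: a Heroku config var stays on the server and is only read by server code at runtime, whereas any env variable referenced from frontend code on Netlify (e.g. a REACT_APP_* variable) gets inlined into the shipped JavaScript bundle, which is why the key must live on the Heroku side.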
I have an Electron app that renders a visualization. I need a different local application to send an HTTP request with parameters to my Electron app, which will then return a PNG or SVG image based on the parameters. Can Electron respond to external HTTP API requests out of the box, or do I need to integrate an Express server within my app to allow this? I'm having trouble finding documentation about this.
Electron does not have any built-in modules for creating HTTP APIs.
However, both Electron processes (the main process as well as renderer processes) are Node processes, which means you have access to all Node APIs, including http, so you can build servers just as you would with Node. Express would certainly make it easier.
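For the use case above (another local app requesting a PNG over HTTP), a minimal sketch with the built-in http module; the port and the renderImage helper are hypothetical, and window creation is omitted:

```ts
// main.ts: an HTTP endpoint inside an Electron main process, using only
// Node's built-in http module (no Express required).
import { app } from "electron";
import http from "node:http";

function renderImage(params: URLSearchParams): Buffer {
  // hypothetical: produce a PNG from the query parameters
  return Buffer.from(""); // placeholder
}

app.whenReady().then(() => {
  http
    .createServer((req, res) => {
      const url = new URL(req.url ?? "/", "http://localhost");
      if (url.pathname === "/render") {
        const png = renderImage(url.searchParams);
        res.writeHead(200, { "Content-Type": "image/png" });
        res.end(png);
      } else {
        res.writeHead(404).end();
      }
    })
    .listen(5050); // another local app can now GET http://localhost:5050/render?...
});
```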
Is it possible to build an Electron desktop app with both the frontend and the API inside the Electron platform, without having to separate the API and the frontend? I am doing this to avoid asking users to install two files.
Yes, you could potentially wrap an API server and a frontend into a single Electron application.
Essentially, your server could be spun up within the main process. Your frontend/renderer process could then make calls to your API just as a website would.
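A minimal sketch of that shape, with the port, the route, and the ui folder as assumptions; serving the renderer page from the same Express server keeps the fetch calls same-origin, so no CORS setup is needed:

```ts
// main.ts: the API bundled with the app. Express is spun up in the main
// process; the renderer loads from it and calls it like any website.
import { app, BrowserWindow } from "electron";
import express from "express";
import path from "node:path";

app.whenReady().then(() => {
  const api = express();
  api.get("/api/greeting", (_req, res) => res.json({ hello: "world" }));
  api.use(express.static(path.join(__dirname, "ui"))); // the frontend build

  api.listen(8123, () => {
    const win = new BrowserWindow({ width: 800, height: 600 });
    win.loadURL("http://localhost:8123");
    // In the renderer, a call looks like any web app:
    //   const data = await fetch("/api/greeting").then(r => r.json());
  });
});
```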
First off, I need to mention that I'm not sure if what I'm trying to achieve is even supported by Piranha CMS (that's partly what I'm trying to determine here). They mention the ability to create a standalone content hub on their website, but my assumptions of what is possible with that model might be incorrect. What I've already done is created an ASP.NET MVC application that is hosting Piranha CMS and I've published it to Azure websites for testing purposes--that part works as expected. The content management interface is the only user facing piece here--it is meant only to serve as the content hub for the client application (just the one for now as this is just proof of concept work).
I am now trying to build a client ASP.NET MVC application that pulls content from the hub. This is where I'm thinking that my assumptions may have been wrong. I was thinking that I'd be able to install the Piranha CMS nuget package(s) on the client as well, and I'd be able to configure the framework to get content from the hub in the same way that it would if the content were hosted on the client site. I realize that I could get the content from the hub using Piranha's REST api, but what I want to do is to be able to use the more friendly entity model based api for this.
So my question is whether it is possible (within reason) to set up Piranha CMS in the way that I've described. If it is, how exactly do I configure the client so that it is aware of the location of the content hub?
There is currently no .NET client API consuming the REST services, as the simplest scenario has been to deploy .NET applications together with the server. In the setups I've done, native apps and HTML5 knockout/angular applications have used the REST APIs for getting JSON data. You should, however, be able to write such a module, performing the HTTP calls and deserializing the JSON, without any problems.
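As a rough idea of what such a thin client looks like from the HTML5/angular side, here is a sketch; the hub URL, the route shape, and the page model fields are all assumptions, so check the routes your Piranha version actually exposes:

```ts
// piranha-client.ts: a thin client over the content hub's REST API,
// along the lines of the knockout/angular setups mentioned above.
const HUB = "https://my-content-hub.azurewebsites.net"; // hypothetical hub URL

export interface PageModel {
  id: string;
  title: string;
  body?: string;
}

export async function getPage(id: string): Promise<PageModel> {
  const res = await fetch(`${HUB}/api/page/${id}`); // assumed route shape
  if (!res.ok) throw new Error(`Hub returned ${res.status}`);
  return (await res.json()) as PageModel;
}
```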
Regards
Håkan