If I have a working ASP.NET MVC 4 application, with MVC in IIS on the back end and Angular and other scripts on the front end, and I'm looking to port it to Icenium to get it into the app stores, is it correct to assume that any dynamic "server-side" functionality is outside the scope of what Icenium will handle? In other words, the server-side implementations of any AJAX calls, etc., will not be present in the packaged application, and I'll have to make the client-side code act as though it's communicating with an external third-party API (perhaps with new concerns about cross-domain AJAX that I didn't have to worry about before).
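To make that concrete, here is roughly the kind of change I'm picturing on the client side (jQuery syntax used only for brevity; the URL, route, and callback are placeholders for my real code):

    // Today (same origin, served by the MVC app itself):
    // $.get("/api/orders", renderOrders);

    // After packaging with Icenium, the same call would have to target the
    // backend as an external API via an absolute URL, which is where the
    // cross-domain questions come in.
    $.ajax({
        url: "https://mybackend.example.com/api/orders", // placeholder for my real endpoint
        type: "GET",
        dataType: "json"
    }).done(function (orders) {
        renderOrders(orders); // placeholder rendering function
    });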
Icenium is a cloud-based IDE for creating cross-platform mobile apps. You can use any client-side library, such as jQuery Mobile or Kendo UI Mobile, to build them. Now, coming back to the AJAX scenario: Icenium has nothing to do with AJAX calls. It is the job of whichever library you use to make those calls and bring the results back to the app.
For example, if you are using the Kendo UI Mobile library, you can use the Kendo UI DataSource to create a data source that loads its data via AJAX.
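For instance, a minimal Kendo UI DataSource that reads JSON over AJAX could look like this (the URL and the "items" field are placeholders for your own service):

    // Kendo UI DataSource that reads JSON from a remote endpoint via AJAX.
    var products = new kendo.data.DataSource({
        transport: {
            read: {
                url: "https://example.com/api/products", // placeholder endpoint
                dataType: "json"
            }
        },
        schema: {
            data: "items" // assumes the service wraps results in an "items" array
        }
    });

    // fetch() performs the AJAX request and hands the result back to the app.
    products.fetch(function () {
        console.log(products.data().length + " items loaded");
    });

You can then bind this DataSource to a Kendo UI Mobile ListView or grid as usual.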
The bottom line is that Icenium plays no part in making AJAX calls; it is an IDE that lets you build cross-platform mobile apps.
One more piece of good news: an Icenium plugin for Visual Studio is now also available.
Thanks
/DJ
Related
Is there any solution to automatically generate a web UI from a REST API?
I found Swagger codegen but it generates a client for the API, not a UI.
I need a basic UI that lets me call the different endpoints directly from the browser and display the responses in a readable way. Something like a basic Postman that would be integrated directly into my website.
I have no constraints on how the generation is done. It can happen once at build time, or at runtime on the server side or on the client side.
I've heard good things about retool.com; it seems to do what you need.
I have an ASP.NET Core Web API project, and I need to develop a front-end UI to consume it, taking advantage of the Single Page Application (SPA) and component model of Blazor. I am thinking of using a Blazor Server app, but my application is going to be an enterprise app with at least 20,000 concurrent users (or more in the future), and my concern is obviously the SignalR connections.
I know React also uses the SPA and component model approach.
I am torn as to which one to choose and whether it will be scalable in the future.
Thank you for your kind response.
I know this sounds strange, but please bear with me. We have a legacy desktop app (DataFlex), and we are developing a new web app. Both apps share the same business logic. We are in the process of moving that business logic into a centralized project that is exposed as an ASP.NET Core Web API, to be used by both the desktop app and the web app.
So far, we have been able to access the new business logic project from the legacy desktop app via COM, but because maintaining interfaces and DataFlex wrapper classes is a pain, we are hoping to find an easier way.
We could access the business logic through the Web API over HTTP. But many users are still only using the desktop app, and setting up web servers for all of them is not something that can be done anytime soon. So we want to call the ASP.NET Core project in a way that resembles how a web app does it, passing method calls as strings (JSON?).
Is this feasible? If so, how?
First off, I need to mention that I'm not sure if what I'm trying to achieve is even supported by Piranha CMS (that's partly what I'm trying to determine here). They mention the ability to create a standalone content hub on their website, but my assumptions of what is possible with that model might be incorrect. What I've already done is created an ASP.NET MVC application that is hosting Piranha CMS and I've published it to Azure websites for testing purposes--that part works as expected. The content management interface is the only user facing piece here--it is meant only to serve as the content hub for the client application (just the one for now as this is just proof of concept work).
I am now trying to build a client ASP.NET MVC application that pulls content from the hub. This is where I'm thinking that my assumptions may have been wrong. I was thinking that I'd be able to install the Piranha CMS NuGet package(s) on the client as well, and that I'd be able to configure the framework to get content from the hub in the same way it would if the content were hosted on the client site. I realize that I could get the content from the hub using Piranha's REST API, but what I want is to use the friendlier entity-model-based API for this.
So my question is whether it is possible (within reason) to set up Piranha CMS in the way I've described. If it is, how exactly do I configure the client so that it is aware of the location of the content hub?
There is currently no .NET client API consuming the REST services, as the simplest scenario has been to deploy .NET applications together with the server. In the setups I've done, native apps and HTML5 knockout/angular applications have used the REST APIs for getting JSON data. You should, however, be able to write such a module yourself, performing the HTTP calls and deserializing the JSON, without any problems.
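As a rough illustration, those HTML5 clients just issue an AJAX request against the hub and work with the returned JSON, something like this (the route below is only a placeholder, not an actual Piranha endpoint; check the REST documentation for the real ones):

    // Placeholder route -- substitute the actual REST endpoint exposed by your content hub.
    $.getJSON("https://contenthub.example.com/rest/your-endpoint", function (data) {
        // data is already deserialized into a plain JavaScript object here.
        console.log(data);
    });

A .NET module would do the same thing, only with an HTTP client and a JSON deserializer on the server side.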
Regards
Håkan
I have a web application (a typical MVC webapp) that needs to call a REST API bundled in a different webapp (WAR file).
The first webapp serves as a front for the separate REST API webapp, letting customers register, view their stats, purchase plans, etc. But part of the design of this webapp is that it must make example invocations against the other REST API webapp.
There are many REST clients out there, but what would be a reasonable approach to address the above?
I was thinking of using Spring's RestTemplate to call the REST API from my MVC controller class in the first webapp. Is this a reasonable approach?
Once you deploy a webapp using your deployment tool of choice, you can simply call the REST URL. That's one of the great things about REST: it doesn't care what sort of tool is calling it, because it deals in a neutral medium (usually HTTP). Twitter's REST API doesn't care what's calling it; in fact, the beauty of it is that anyone can build an app that calls it.
So, say you deployed the webapp locally on port 8080; you can just make a REST call to http://localhost:8080/firstapp/rest/foo.
If you're deployed to the World Wide Web, then just call the appropriate domain.
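To show how client-neutral that is, even a plain browser script can hit the same endpoint (jQuery used here only for brevity; the path matches the local example above):

    // Calls the locally deployed REST endpoint; the response arrives as parsed JSON.
    // (If the page isn't served from localhost:8080, cross-origin rules apply.)
    $.getJSON("http://localhost:8080/firstapp/rest/foo", function (result) {
        console.log(result);
    });

The same URL works just as well from a server-side client, which is what your MVC controller would use.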
Yes, RestTemplate is a very convenient way to make server-to-server REST calls, though there are some tricks if you are going to serialize generic types.