I want to scan files for malware directly in memory, and I would like to know the best way to do it.
I'm in charge of improving the website security policy for our web applications, which were developed in .NET Core and Angular.
I read the OWASP recommendations, and one of them was to scan uploaded files for malware.
I've identified two ways of scanning the files: calling a third-party cloud API, or calling the AMSI interface. Calling a third party over the internet is not an option for me, because the client wants the verification to be done on-premise.
What is AMSI:
The Windows Antimalware Scan Interface (AMSI) is a versatile interface standard that allows your applications and services to integrate with any antimalware product that's present on a machine. AMSI provides enhanced malware protection for your end-users and their data, applications, and workloads.
The Windows AMSI interface is open, which means that any application can call it, and any registered antimalware engine can process the content submitted to it.
I opted for calling the AMSI interface from .NET Core to analyze HTTP requests for malware content, but my tests are not working on some servers where Symantec Endpoint Protection is installed as the antivirus provider and AMSI subscriber; AMSI seems to be bypassed.
When I test a call to AMSI with the standard EICAR test content, AMSI returns a result as if no malware were detected, even though the post contained malware content. That is why it seems that AMSI is being bypassed.
Do you know what I could do to fix it? Why is it that the AMSI is being bypassed? What should I check or take into consideration?
Would it be better to develop a Windows service that scans files from a queue and runs a .bat file to command the antivirus program to scan them? Is there any third-party web API that can be installed on-premise?
I've also been looking into APIs for .NET for AV scanning, but it seems there isn't much out there.
AMSI is a relatively new standard (introduced in Windows 10), but it seems to be intended for "fileless" scans (i.e. strings and in-memory blobs). Here is a nice article with a .NET library:
Using Windows Antimalware Scan Interface in .NET
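To make the question's EICAR test concrete, here is a minimal sketch of calling AMSI directly via P/Invoke from .NET (Windows 10+ only). The exports `AmsiInitialize`, `AmsiScanBuffer`, and `AmsiUninitialize` are the real amsi.dll API; error handling and session reuse are omitted for brevity. If this prints "not detected" for EICAR, the registered provider is not responding to AMSI, which matches the bypass symptom described above.

```csharp
using System;
using System.Runtime.InteropServices;
using System.Text;

class AmsiProbe
{
    [DllImport("Amsi.dll", CharSet = CharSet.Unicode)]
    static extern int AmsiInitialize(string appName, out IntPtr context);

    [DllImport("Amsi.dll", CharSet = CharSet.Unicode)]
    static extern int AmsiScanBuffer(IntPtr context, byte[] buffer, uint length,
                                     string contentName, IntPtr session, out int result);

    [DllImport("Amsi.dll")]
    static extern void AmsiUninitialize(IntPtr context);

    // AMSI_RESULT values at or above 32768 mean the provider flagged the content.
    const int AMSI_RESULT_DETECTED = 32768;

    static void Main()
    {
        AmsiInitialize("MyWebApp", out IntPtr ctx);

        // EICAR test string, split so this source file is not itself flagged.
        string eicar = @"X5O!P%@AP[4\PZX54(P^)7CC)7}$" +
                       "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*";
        byte[] payload = Encoding.UTF8.GetBytes(eicar);

        AmsiScanBuffer(ctx, payload, (uint)payload.Length,
                       "upload-probe", IntPtr.Zero, out int result);

        Console.WriteLine(result >= AMSI_RESULT_DETECTED
            ? "Malware detected"
            : $"Not detected (result={result})");

        AmsiUninitialize(ctx);
    }
}
```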
I've also found out a fairly active open-source scanner:
ClamAV
and a .NET library to scan in-memory (although quite old, from 2017):
ClamAV.Managed
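As an alternative to that old wrapper, you can talk to a local clamd daemon directly over TCP using its documented INSTREAM command, which scans a byte stream without ever writing a file to disk. A minimal sketch, assuming clamd is listening on localhost:3310 (the default TCP port):

```csharp
using System;
using System.IO;
using System.Net.Sockets;
using System.Text;

static class ClamdClient
{
    // Streams a byte buffer to clamd's INSTREAM command and returns the verdict,
    // e.g. "stream: OK" or "stream: Eicar-Test-Signature FOUND".
    public static string ScanBytes(byte[] data, string host = "localhost", int port = 3310)
    {
        using var client = new TcpClient(host, port);
        using NetworkStream ns = client.GetStream();

        // Null-delimited command form ("z" prefix) per the clamd protocol.
        ns.Write(Encoding.ASCII.GetBytes("zINSTREAM\0"));

        // Each chunk is a 4-byte big-endian length followed by the data;
        // a zero-length chunk terminates the stream.
        byte[] len = BitConverter.GetBytes(data.Length);
        if (BitConverter.IsLittleEndian) Array.Reverse(len);
        ns.Write(len);
        ns.Write(data);
        ns.Write(new byte[4]);

        using var reader = new StreamReader(ns, Encoding.ASCII);
        return reader.ReadToEnd().TrimEnd('\0', '\n');
    }
}
```

Because this is just a TCP protocol, it works the same from ASP.NET Core on-premise, with no dependency on the Windows-only AMSI stack.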
Each commercial enterprise-grade AV has some kind of web API, but there is no standard, so individual development is required for each one.
I've tried the route of queueing files and manually running a CLI AV (in my case Windows Defender), but it took over 2 minutes on average to scan a file (which might be good enough for your use case). The major benefit is that this approach is generic and supports any AV that has a CLI. However, queueing and running a console process is also a significant security (and memory-leak) risk, not to mention it would be tricky or pricey to get running in a cloud-hosted instance such as an Azure App Service.
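For reference, the queued-CLI approach might be sketched like this with Defender's MpCmdRun.exe. The path and the exit-code convention (0 = clean, 2 = threat found) should be verified against your Defender version before relying on them:

```csharp
using System.Diagnostics;

static class DefenderCliScanner
{
    // Scans a single file with Windows Defender's command-line tool.
    // Returns true if the scan completed and no threat was found.
    public static bool IsClean(string filePath)
    {
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\Program Files\Windows Defender\MpCmdRun.exe",
            // -ScanType 3 is a custom scan of the given file; -DisableRemediation
            // reports the threat without quarantining, useful for a scan-only queue.
            Arguments = $"-Scan -ScanType 3 -File \"{filePath}\" -DisableRemediation",
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };
        using var proc = Process.Start(psi)!;
        proc.WaitForExit(180_000); // scans can take minutes, as noted above
        // Exit code 0 = no threat found; 2 = threat found (verify for your version).
        return proc.ExitCode == 0;
    }
}
```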
Amazing that in 2021 we still have no standard for AV scans...
I'm new to web development and just built my first website with .Net Core. It's primarily HTML, CSS, and JavaScript with a little C# for a contact form.
Without recommending any service providers (question will be taken down), how do I go about deploying the website? The more details the better as I have no idea what I'm doing haha.
Edit: I am definitely going to go with a service provider, however the business I am building the website for doesn't have a large budget so I want to find the best provider at the lowest cost.
Daniel,
As you suspect, this is a bit of a loaded question, as there are so many approaches. One approach is to use App Services within Microsoft Azure. You can create a free trial Azure account that includes a $200 credit, which is more than enough to do all of this for free. Then, using the Azure Management Portal, create an App Service (also free) on an App Service Plan in a region that makes sense for you (e.g. US West). Once you do that, you can download what is called a Publish Profile from the App Service's management portal in Azure.
If you're using Visual Studio, for example, you can then right click your project and "Publish" it (deploy to the cloud, or the App Service you just created). One option in that process is to import an Azure Publish Profile, which you can do with the one you just downloaded. This makes it really simple. The Publish Profile is really just connection information to your Azure App Service (open it in Notepad to see). It will chug for a bit and then publish and load the app for you. You can also get to the hosted version of your app by clicking the Url of the app in the App Service management portal on the main page.
This may be oversimplifying what you need to do, but this is a valid direction to take. AWS and others have similar approaches.
Again, tons of ways to do this, but this is a free approach. :-) I don't consider Azure a Service Provider in the sense that you asked us not to. Instead, I wanted to outline one turn-key approach with specific details on how to get there.
You can find specific steps in a lot of places, such as this link:
https://www.geeksforgeeks.org/deploying-your-web-app-using-azure-app-service/
DanielG's answer is useful, but you mentioned you don't want to use any services from a service provider.
Usually, there are three ways to deploy such a program.
The first is an app service from a cloud provider, as DanielG mentioned.
**Benefits of using service provider products:**
1. Very friendly to newbies, follow the documentation to deploy the application in a few minutes.
2. It offers a very stable, scalable service that monitors the health of our website.
3. We can get their technical support.
**Shortcoming**
It is a paid service, and although Azure's service has a free quota, it will run out.
**Suggestion**
It is recommended that websites that are officially launched use the services of service providers.
The second is to use a fixed IP address for access (though it seems broadband operators rarely offer a fixed IPv4 address these days).
**Benefits of using fixed IP:**
If you have a fixed IP address, or if your carrier supports IPv6, you can deploy the website yourself and make it accessible from the public network. And if you have a domain, it can also support HTTPS.
**Shortcoming**
1. There are cybersecurity risks, and you are vulnerable to attack.
2. Without proper website health monitoring, every problem must be diagnosed by yourself, and elastic scaling is very troublesome to achieve.
**Suggestion**
It is generally not recommended, because under normal circumstances you have no fixed IP; broadband operators used to offer them, but many no longer do.
If you are interested, you can try IPv6 to test it.
The third is to use a tunneling (intranet penetration) tool such as ngrok or frp.
**Benefits of using intranet penetration:**
Free tunneling services such as ngrok generate a different URL on each run and have some limitations (for example, a new URL is generated after a certain period of time), but that is enough for testing.
Of course, you can pay for the service, which provides fixed URLs and supports HTTPS.
**Shortcoming (same as the second one)**
**Suggestion**
The functionality is the same as the second option: the website runs on your own physical devices, and the tunneling tool (ngrok or frp) solves the problem of not having a fixed IP by providing a URL you can access.
If there are few users and the demand on the web service is not high, this is recommended for individual users or small businesses; it is generally suitable for small-business OA use.
I am developing an app with a Node.js API that reads data from an MSSQL server. Right now the API runs on localhost:3131, which means it is reachable locally only.
I have a Windows server that is always online, and I can use pm2 on that server with the API files to keep it live. But if I run it there, how can I make my server's IP public so I can access the API from anywhere?
How do I make my server IP reachable online, but secure?
There are many ways to achieve what you want. If you were to do it on your own network/PC, then one traditional method called port forwarding can be used for projects under development, but this still means that your IP will be exposed to the web.
Virtual private servers and dedicated servers are more commonly used these days to host production applications. You would need to run a web server first, where your web files can be hosted, and then you can link your Node.js server and SQL database.
There are several hosting providers out there which are built for nodejs applications. You can go for 'unmanaged' and 'managed' hosting providers.
Managed providers provide a simplified "Node Appliance" solution. Node and NPM will already be set up for you, and deploys are typically done via git push or similar method. You will have less control of your server, but everything will be set up for you.
There are some managed hosting providers which nodejs recommend themselves. You can view them here:
https://github.com/nodejs/node-v0.x-archive/wiki/Node-Hosting
Some popular ones include:
Heroku
Amazon Web Services
AppFog
Microsoft Azure
RedHat OpenShift
In your case, you are using SQL and Node, so Heroku actually offers free Node.js hosting and provides a free add-on for MySQL databases too. The only downside is that the number of hours for which you can run apps is limited.
It might be worth doing this on a Virtual Private Server for lower costs. Alternatively, have a look at NodeChef who specialise in Node.js and MySQL.
Hopefully, this gives you enough information to understand what steps to take next.
I am building a UWP app that targets the x86, x64, and ARM platforms. I want to replace the current implementation that uses Azure for the backend (an App Service and a SQL Server) because of the high price and because my Pay-As-You-Go subscription does not allow me to set a spending limit.
I thought about using a local database but I don't know if that could be a solution since I want the user to be able to have his data synced on both PC and phone for example. I am also ok with renouncing the idea of a structured database in favor of structured files (like xml) if I can find a way to keep them somewhere in the cloud (and then I can read/write them from the client app - no need for App Service).
Are there any free, non-trial alternatives to Azure? Or should I look more into the file storage implementation? Thanks in advance.
Instead of Azure you could use another web hosting solution to publish your API. Azure also offers small free plans that might be sufficient.
An alternative would be to request access and store/sync data to user's OneDrive. Each logged in user with Microsoft Account should have OneDrive storage available so this is a good middle-ground, which is still free for you. A nice introduction to this can be found in this article.
UWP also offers RoamingFolder where you can store small files that are synced across the devices that you use. Unfortunately this is less reliable because you are not able to control when the sync happens and cannot resolve conflicts.
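For the RoamingFolder option, a minimal sketch looks like the following. `ApplicationData.Current.RoamingFolder` and the `FileIO` helpers are the real UWP APIs; the file name is just an illustration, and the per-app roaming quota (around 100 KB) keeps this viable only for small settings-style data:

```csharp
using System.Threading.Tasks;
using Windows.Storage;

static class RoamingStore
{
    // Writes a small XML payload to the app's roaming folder; Windows syncs it
    // across the user's devices, subject to the per-app roaming quota.
    public static async Task SaveAsync(string xml)
    {
        StorageFolder roaming = ApplicationData.Current.RoamingFolder;
        StorageFile file = await roaming.CreateFileAsync(
            "appdata.xml", CreationCollisionOption.ReplaceExisting);
        await FileIO.WriteTextAsync(file, xml);
    }

    public static async Task<string> LoadAsync()
    {
        StorageFolder roaming = ApplicationData.Current.RoamingFolder;
        StorageFile file = await roaming.GetFileAsync("appdata.xml");
        return await FileIO.ReadTextAsync(file);
    }
}
```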
I have successfully migrated to another cloud platform: Heroku. In my opinion, at least for small apps, Heroku offers the best solution both technology-wise and price-wise.
I am now able to have a web service hosted for free in the cloud, without worrying about traffic and the number of requests. Of course you can scale up if you want better performance, but you can start with a free plan. I also have a PostgreSQL database hosted in the cloud, likewise for free (up to 10,000 records, and just $9/month if I want to upgrade to 10 million). You will never find an offer like this for free on Azure.
I had to learn a bit of Node.js (there are a lot of languages Heroku supports for backend services, but .Net is not one of them) but it was totally worth it!
Another option that is now starting to gain more and more popularity is Firebase. I will certainly check that out for my future apps as well.
Are you aware of the best companies that provide Apache Solr as a service, where I can simply upload (or edit via some web control panel) my index and config files for Solr and simply start using it?
I do not want to be breaking my head with any sort of server administration on Tomcat.
I just want to update my index and config files, tell Solr where to look for the data to index (via data import handlers), and start using it.
Any sort of load balancing / mirroring would be icing on the cake.
Price does not matter, as it's for mission-critical apps.
Please do not suggest that I boot up my own servers on Amazon or Rackspace or anywhere else, deploy Solr on them, and manage all the administration myself; that's what I want to avoid in the first place.
Thanks in advance.
I wanted to update this post from 2011 now that it's 2017! Today for folks looking for pure self service and simple Solr search we continue to recommend www.websolr.com. If you are looking for a managed complete Solr instances + some nice search analytics capability, then we've used SearchStax very successfully.
there is also this one:
http://www.opensolr.com/
Opensolr offers three types of Solr instances:
CMS instances (WordPress, Drupal, Joomla, eZ Publish, Typo3)
API instance (RESTful Web service) with fully customizable configuration files
Web sites Crawler instances
There is a free account (I think it can really only be used to test the platform), and the price for the other kinds of account is cheap ($4 to $8 a month for medium accounts).
(I haven't tried it yet, but it seems promising.)
If you use Drupal, there is also Midwestern Mac's Hosted Apache Solr service, which works with Drupal 6 and 7, and all the different Solr integration modules. (Disclaimer: I'm the owner of Midwestern Mac—let me know what you'd like to see and I'll try to make it happen!).
The company IndexDepot (www.indexdepot.com) offers a hosted Solr service. It's easy to use, because you log into a web interface to edit your configuration files. Special configurations fitting your requirements are negotiable, e.g. dedicated master/slave Solr servers.
You can try contacting Lucidworks with that question.
I heard they were working with Boomi on PaaS/Saas for their Lucidworks 1.4.
Although Boomi doesn't explicitly say that they support Solr, this webinar suggests they were working with Lucidworks to include their Solr-based search engine in their portfolio.
Even if Lucidworks people don't provide SaaS, they're surely the right address to ask who does.
Good luck in your search and please get back to us with the information you manage to find...
EDIT 04/2012:
If I had to make that choice today, I'd seriously consider CloudBees (which has WebSolr plugin). It's a complete ALM & CI cloud framework for JVM-based languages, with loads of partner plugins (Jenkins, NewRelic, Sonar, MongoHQ, Cloudant, ...), many of them with free base options.
The most significant difference, when comparing with other SaaS/PaaS services, is that you can set up development environment and even deploy your app (on one node, of course) without even leaving your credit card details.
Just to expand horizons.
please do not suggest me to boot up my own servers on amazon or rackspace or xyz and then deploy solr
You can go with ready to use:
http://aws.amazon.com/cloudsearch.
Cloudsearch provides simple API.
More and more hosted Elasticsearch solutions have appeared recently; I think that's because of the cloud-friendly nature of ES (it makes a search cloud easy to maintain).
http://indexisto.com
http://qbox.io
But of course it is a matter of how tied you are to Solr.
These folks also do hosted Solr:
https://www.hosted-solr.com/?locale=en
They also have a custom extension that integrates Solr into TYPO3 CMS.
http://www.typo3-solr.com/en/home/
Take a look at our fully managed Solr Cloud hosting, where we enable customers like you to stop worrying about managing and maintaining Solr infrastructure and instead focus on building your application.
We offer shared clusters that cater to price-conscious customers, as well as dedicated nodes and dedicated clusters with white-glove service for companies who want to completely offload search infrastructure and management.
We are a technology partner with AWS and depending upon the critical needs, can customize a hosted solr solution per our customers needs.
please do not suggest me to boot up my own servers on amazon or rackspace or xyz and then deploy solr
You can try the Solr service provider below; it looks cheaper as well:
http://indiasolr.com/
I am in the process of integrating our custom web app with QuickBooks Enterprise 9. My thought is that I could use QuickBooks as my "database" of sorts. When a person creates an invoice, the invoice is actually stored only in QuickBooks. When a person views a list of invoices, they are actually viewing a list of QuickBooks invoices. I want to make sure the data is stored in only one location.
I realize that I could use the QB Web Connector, but the problem with that is I wouldn't have control over when the requests to QB actually get processed (That job is up to the Web Connector).
So I have my web UI to act as the QuickBooks "face," but I don't have any good way to get to and from the QuickBooks file located on an internal server. What I was thinking was that I could create a WCF web service and install it on the QuickBooks server. The web service could then be my integration point. My custom web app could then consume the web service and, voila, I have access to my QuickBooks files.
My question is this: can a WCF app connect to and drive QuickBooks? If not, could I create a Windows service to act as my point of integration? If so, can my custom web app consume a Windows service?
I'll start by warning you that QuickBooks probably isn't your best choice for a reliable back-end database accessible from a remote website. In fact... it's probably a really, really bad choice.
You should have your own application database, and then if you need to also exchange data with QuickBooks, do that outside of the normal lifecycle of your app, as a separate sync process.
QuickBooks generally isn't reliable enough for always-online types of applications, for a number of reasons:
Flaky SDK connections
Updates and single-user mode will lock you out of accessing QuickBooks
Difficulty in establishing SDK connections from non-GUI processes (Windows services and IIS processes)
With that said...
Yes, you could create a WCF web service, host it on the QuickBooks machine, and make your WCF web service relay messages to/from QuickBooks.
Yes, you could also create a Windows Service that does the same sort of thing.
Do NOT implement it as a Windows service, and do NOT implement it within IIS - instead implement it as a GUI app that runs alongside QuickBooks.
If you try to implement things as a Windows service or within IIS, note that the QuickBooks SDK requires a GUI to be available (it uses a GUI COM message pump for event dispatching, or something like that) to process requests, so you'll probably need to use something like QBXMLRP2e.exe to straddle the process boundary between QuickBooks and your non-GUI Windows service/IIS process. My experience has been that it's a gigantic pain in the butt, and requires mucking with DCOM permissions as well.
I have an example and some documentation on my QuickBooks integration wiki.
The IDN Forums are a good place to ask questions.
My recommendation to you would be to either:
Use the Web Connector and QuickBooks, and give up hope of keeping all of your data in one place: cache the data in a real database, and update it by querying QuickBooks periodically. I'm almost done building a solution to do exactly this right now, and it works fantastically.
OR
Use a different accounting system. NetSuite is pretty nice. I'm not sure what else is out there, but if I were you I'd look for something SQL-based or with a strong SOAP/REST API.