I am only starting to learn about SQL Azure. I have spoken to some potential clients, and they say they have not chosen Azure due to the private nature of their customers' information.
Reading about Azure, I see it has firewalls to prevent unauthorised access.
I was just wondering how else I could market Azure so that clients who potentially want to use it would not be concerned about privacy issues.
Also, as I understand it, Azure supports hybrid solutions where you can store data either locally or remotely?
Thanks
SQL Azure is a public service and the data is stored somewhere in the cloud provider's facility. With all the security measures, including firewalls and sentry dogs, the data is still under zero customer control.
So the provider could take a backup and store it for a very long time; you might want the data destroyed ASAP, and you would be unable to have that done.
Also here's what technically could happen (not that I'm saying it is likely):
the provider might dispose of hard disks without destroying the data on them
a bug could cause authorization to fail and let an unauthenticated user in (because, you see, you don't control which software updates the provider applies)
a provider employee might be bribed and copy the data
So if the user really wants privacy (or the law says the data he deals with must be processed according to certain requirements), or he wants actual control over how the data is handled, then a public storage service like SQL Azure is technically inapplicable for him. If you try to market Azure as providing the same level of control and security as a local facility would, you are deceiving the customer.
Sad but true, and you can't lie to the compiler. There's no such thing as control over your data in a public storage service. The risks of negative outcomes are perceived as rather low, but they exist and they are real.
Yes, the Azure Service Bus has connecting private and public clouds as a feature. Keeping sensitive data locally may be what your clients want/need in order to push parts of their infrastructure to the cloud, although it will certainly take some effort to keep that separation clear, and I'm not just talking technically.
That said, marketing Azure to a client that's not ready for the cloud may very well lose you the entire deal, so make sure you're not pushing anything they aren't ready to cope with to start with.
A good starting point is the Windows Azure Trust Center to learn about Windows Azure privacy and security.
There's also a 7-part Windows Azure security best practice series on the ISV Developer Community Blog. Part 1 has links to the remaining entries, at the end of the post.
Microsoft's data centers are run by Global Foundation Services, which has its own set of security and compliance practices. There you'll find a data center tour video.
Related
I am running an MS Access app using a few tables hosted on our on-premises SQL Server to facilitate easy integration with Power BI and other apps. I am an intermediate MS Access user and new to SQL Server. I don't store anything critically sensitive in any tables and am happy with our normal security for data at rest. However, when my app or Power BI requests data from the server, how is that data protected? I don't want the increased complexity that comes with certificate management, or the processing time associated with encrypting data. However, I don't want to be low-hanging fruit for attacks when I request data from the server (i.e. attacks in transit).
Thanks!
First of all, security is about three things: confidentiality, integrity and availability. Confidentiality means that people are not supposed to see data that is not meant for them. Integrity means that the data is what you expect it to be (no third party is able to manipulate it). And availability means that the server is always up and running.
This is a simplification, but it makes the point that security is more than many of us believe. If your server is down, then security is equal to zero.
Having said that, you wanted to know how to protect the communication channel. Sorry to say it, but the best thing we have right now is TLS, which requires certificate management. Look at this page to understand how it can be configured: http://dba-datascience.com/ssl-or-tls-encryption-on-sql-server/
If you want to protect your SQL Server even further (beyond the communication channel), look into the CIS Benchmarks available here: https://www.cisecurity.org/cis-benchmarks/. They are very technical (which is good), but may also be a bit confusing when you see them for the first time.
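Once TLS is set up, a quick sanity check is to ask SQL Server itself whether the current connection is encrypted. This is a minimal query against a built-in DMV (no assumptions beyond a reasonably recent SQL Server):

    -- encrypt_option is TRUE when the current connection uses TLS.
    SELECT session_id, encrypt_option, auth_scheme
    FROM sys.dm_exec_connections
    WHERE session_id = @@SPID;

Run it from Access (as a pass-through query) or from Power BI and you can verify whether your traffic is actually protected in transit.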
I have a requirement to create a tablet application for use in restaurants. It will all be on a private internal network so security is not an issue. The question is which will cause the least network traffic? I can either connect directly to SQL using entity framework or I can connect to web services I create on the SQL server in IIS and the tablets communicate with that.
I guess to simplify it, does a standard SQL connection transfer more data than is necessary?
It's difficult to give a general rule, as network architecture plays into the answer quite heavily.
As a general guideline, I would suggest building web services or PHP "interfaces" on the server. That gives you an easier and more controllable data flow, and you can handle transactions more easily, since all of them go through one interface and all DB accesses come from one machine. It also makes debugging and error handling easier than having every client connect directly to the DB (log at the interface and you see everything that's happening, so you don't have to check logs on individual devices), and it gives you more control.
Just a general suggestion, but some kind of web service/interface layer is always worth the investment; sooner or later you will go in this direction anyway.
My humble opinion.
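To make the "one interface" idea concrete at the database level, here is a hypothetical sketch (procedure, table and column names are all invented): even before you add a web service, funnelling writes through a stored procedure gives you a single transactional entry point you can log and debug in one place.

    -- Hypothetical single entry point for creating an order:
    -- every client goes through the same transactional code path.
    CREATE PROCEDURE dbo.CreateOrder
        @TableNumber int,
        @ItemId      int,
        @Quantity    int
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRANSACTION;
            INSERT INTO dbo.Orders (TableNumber, CreatedAt)
            VALUES (@TableNumber, SYSUTCDATETIME());

            -- assumes OrderId is an IDENTITY column on dbo.Orders
            INSERT INTO dbo.OrderLines (OrderId, ItemId, Quantity)
            VALUES (SCOPE_IDENTITY(), @ItemId, @Quantity);
        COMMIT TRANSACTION;
    END

A web service would then call only such procedures, which also speaks to the traffic question: one round trip per operation instead of a chattier per-table exchange.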
Background:
Our team is building an in-house intranet web application. We are using a standard three-layer approach: presentation layer (MVC web app), business layer and data access layer.
A SQL database is used for persistence.
The web app / IIS handles user authentication (Windows authentication). Logging is done in the business and data access layers.
Question - service account vs. user-specific SQL accounts:
Use a service / app account:
The dev team is proposing to set up a service account (created for the application only). This service account needs read & write access to the DB.
Vs
Pass on user credentials to SQL
IT ops is saying that using a service account (specifically created for the app only) for DB access is not deemed best practice. Instead, set up Kerberos delegation from the web server to the SQL server so that you can pass on the Windows credentials of the end users, and create a database role that grants the appropriate data access levels for those users.
What is the best practice for setting up accounts in SQL when all requests to the DB come through the front-end client (i.e. via the business layer and then the data layer)?
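For concreteness, the IT ops proposal might look roughly like this on the SQL Server side (all names are hypothetical; the Kerberos delegation itself is configured in Active Directory and IIS, not in T-SQL):

    -- Sketch: grant end users access through a role whose membership
    -- is driven by a Windows/AD group (names are invented).
    CREATE LOGIN [CONTOSO\AppUsers] FROM WINDOWS;        -- server level
    CREATE USER  [CONTOSO\AppUsers] FOR LOGIN [CONTOSO\AppUsers];

    CREATE ROLE app_data_access;
    GRANT SELECT, INSERT, UPDATE ON SCHEMA::dbo TO app_data_access;
    ALTER ROLE app_data_access ADD MEMBER [CONTOSO\AppUsers];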
The best practice here is to let the person/team responsible for the database make the decision. It sounds like the dev team wants to forward (or impersonate) some credentials to the DB, which I know some small teams like doing, but yes, that can leave things a bit too open: the app can do whatever it likes to the database, which is not much of a separation, if you're into that kind of thing.
Personally, if I understand what you're saying above, I do more of what the IT team is thinking about (I use Postgres). In other words my app deploys over SSH using a given account (let's say it's the AppName account). That means I need to have my SSH keys lined up for secure deployment (using a PEM or known_keys or whatever).
In the home root for AppName I have a file called .pgpass which has pretty specific security on it (0600). This means that my AppName account will use local security to get in rather than a username/password.
I do this because otherwise I'd need to store that information in a file somewhere, and those things get treated poorly (pushed to GitHub, for instance).
Ultimately, think five years from now and what your project and team will look like. Be optimistic: maybe it will be a smashing success! What will maintenance look like? What kinds of mistakes will your team make? Flexibility now is nice, but make sure that whoever will get in trouble if your database has a security problem is the one who gets to make the decision.
The best practice is to have individual accounts. This allows you to use database facilities for identifying who is accessing the database.
This is particularly important if the data is being modified. You can log who is modifying what data -- generally a hard-core requirement in any system where users have this ability.
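A minimal sketch of what that buys you (table and trigger names are invented): with per-user authentication, the database itself can record the actual end user on every change, for example via a trigger.

    -- Hypothetical audit trail: records which login changed which row.
    CREATE TABLE dbo.CustomerAudit (
        AuditId    int IDENTITY PRIMARY KEY,
        CustomerId int,
        ChangedBy  sysname   DEFAULT SUSER_SNAME(),
        ChangedAt  datetime2 DEFAULT SYSUTCDATETIME()
    );
    GO
    CREATE TRIGGER trg_Customer_Audit
    ON dbo.Customer
    AFTER UPDATE, DELETE
    AS
        INSERT INTO dbo.CustomerAudit (CustomerId)
        SELECT CustomerId FROM deleted;

With a shared service account, SUSER_SNAME() would always return the service account; individual accounts (or delegated credentials) are exactly what makes this kind of logging meaningful.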
You may find that, for some reason, you do not want to use your database's built-in authentication mechanisms. In that case, you are probably going to build a layer on top of the database, replicating much of the built-in functionality. There are situations where this might be necessary. In general, this would be a dangerous approach (the database security mechanisms probably undergo much more testing than bespoke code).
Finally, if you are building an in-house application with just a handful of users who have read-only access to the database, it might be simpler to have only a single login account. Normally, you would still like to know who is doing what, but for simplicity, you might forego that functionality. However, knowing who is doing what is usually very useful knowledge for maintaining and enhancing the application.
I'm trying to get some advice on how to approach a security architecture on Azure.
Background:
We are looking at building a multi-tenant app on Azure that needs to be extremely secure (personally sensitive data). The app will be accessed by standard browsers and mobile devices.
Security access types:
We have three types of users / access types...
1 - plain old user/password over HTTPS is fine, accessing general, non-private SQL data plus hosted files
2 - user/pass over HTTPS, but we need authentication of users via certificates that will be installed on user machines/devices. This level of user will need access to sensitive data, which should be encrypted at rest, both in the database and in any uploaded files.
3 - same as (2), but with the addition of some two-factor authentication (we have used YubiKey for other things; we might look at a phone OTP offering as well)
Most users will only have access to their own tenant databases, however we have "account manager" type users that need access to selected tenant data, therefore we expect that they will need either a copy of one certificate per tenant they serve, or we will have to use some kind of master certificate.
Database type:
From a multi-tenant point of view, it seems Azure Federated SQL is a good way to go because (a) we simply write one app with a "TenantID" key in each table and, after login, set a global filter that handles the isolation for us, and (b) we understand that Azure Federated SQL actually maintains separate SQL database instances per tenant in the background. (Ref: http://msmvps.com/blogs/nunogodinho/archive/2012/08/11/tips-amp-tricks-to-build-multi-tenant-databases-with-sql-databases.aspx)
Can anyone point to any links or give advice on the approach needed to set up and manage file shares, encryption of SQL and file data at rest, authentication of users, etc. (preferably with automated management on new user signup)?
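For illustration only, the "global filter" idea from (a) can nowadays be pushed into the database itself with Row-Level Security (added in Azure SQL Database and SQL Server 2016, after this question was written; all names below are hypothetical):

    -- Sketch: every query on a TenantId-keyed table is automatically
    -- filtered to the caller's tenant, taken from SESSION_CONTEXT.
    CREATE FUNCTION dbo.fn_TenantFilter (@TenantId int)
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN SELECT 1 AS ok
           WHERE @TenantId = CAST(SESSION_CONTEXT(N'TenantId') AS int);
    GO
    CREATE SECURITY POLICY TenantIsolation
        ADD FILTER PREDICATE dbo.fn_TenantFilter(TenantId) ON dbo.Invoices
        WITH (STATE = ON);

The application would set the tenant once per connection after login, e.g. EXEC sp_set_session_context @key = N'TenantId', @value = 42;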
I can't really help on the certificates, but you will indeed need some "master certificate". If you are planning on using Azure Websites, you can't currently use your own certificates.
Concerning the database setup: SaaS applications are built on trust, so you NEVER (EVER) want to show or edit one user's data to other users.
Therefore I strongly suggest that you don't rely on a TenantID column in each table. That would still leave the possibility of an attack by a malicious user or an error by some developer.
The only ways to get around these risks are
extensive testing
physically different tables to store each tenant's data.
Personally, I believe that even with very extensive, automated testing you can't have 100% code coverage against malicious users. I guess I am not alone.
The only way out, IMHO, is physically different tables. Let's look at the options:
a different server: valid, but pretty expensive in Azure
a different database: valid, less management overhead, but same objection as the previous option - expensive if you have a lot of tenants
different schemas: the solution. Think about it...
you only have to manage users and their default schemas
you can back up schemas using PowerShell
you can move schemas to other databases with some work
you can still dig into SQL Federation if you need to.
the major drawback is that you will need to support database upgrades for each tenant.
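A minimal sketch of the schema-per-tenant setup (all names hypothetical): each tenant gets its own schema and its own database user whose default schema points there, so unqualified table names resolve to that tenant's copies.

    -- Sketch: one schema and one user per tenant.
    CREATE SCHEMA tenant_acme;
    GO
    CREATE TABLE tenant_acme.Invoices (InvoiceId int PRIMARY KEY, Total money);
    GO
    CREATE USER acme_user WITHOUT LOGIN          -- FOR LOGIN ... in practice
        WITH DEFAULT_SCHEMA = tenant_acme;
    GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::tenant_acme TO acme_user;

Because acme_user has no rights on any other tenant's schema, a coding error can no longer leak another tenant's rows the way a forgotten TenantID filter can.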
Have you read any of the articles about multi-tenancy on azure.com? http://msdn.microsoft.com/en-us/library/windowsazure/hh689716.aspx
I need to develop an application which stores data in a SQL Server 2005 database (the app itself will be either a WCF Service or an Asp.Net Web Service).
Now, this data is supremely confidential, and I need to have it stored in an encrypted form in the database.
So, I am wondering what the best practices are around this. I know that SQL Server has some built-in encryption capabilities. Is there a "for dummies" type of resource for this so that I can get going quickly?
Alternatively I was thinking that I could encrypt/decrypt in my C# code and not in the database - maybe have a layer which handles this just above the data access layer (is that a good idea)?
Look at this link for a good introduction with samples.
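In case that link goes stale, here is the general shape of SQL Server's built-in cell-level encryption, available since SQL Server 2005 (a sketch with invented names, following the usual master key / certificate / symmetric key chain):

    -- Sketch of cell-level encryption (key, certificate and table
    -- names are illustrative; CardNumberEnc is a varbinary column).
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
    CREATE CERTIFICATE CardCert WITH SUBJECT = 'Card data protection';
    CREATE SYMMETRIC KEY CardKey
        WITH ALGORITHM = AES_256
        ENCRYPTION BY CERTIFICATE CardCert;

    OPEN SYMMETRIC KEY CardKey DECRYPTION BY CERTIFICATE CardCert;
    -- Encrypt on the way in ...
    UPDATE dbo.Cards
    SET CardNumberEnc = ENCRYPTBYKEY(KEY_GUID('CardKey'), CardNumber);
    -- ... and decrypt on the way out.
    SELECT CONVERT(varchar(32), DECRYPTBYKEY(CardNumberEnc)) AS CardNumber
    FROM dbo.Cards;
    CLOSE SYMMETRIC KEY CardKey;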
I think doing the data encryption in the application is better, because in that case the transferred data is already encrypted. Otherwise you have to use a secure channel between your app and the database server.
It depends on your needs, I would say.
Have you considered encrypting your data at the file-system level?
It's Windows 2008/Vista only, but it should give you what you need and it's what it's designed for.
Before you decide on an encryption method, you need to assess which parts of the system are vulnerable. If the potential for unauthorized access to the database exists, does the same threat exist for your application? Someone could run your code through Reflector and determine which methods are being used to encrypt and decrypt. You can mitigate that exposure to some extent with code obfuscators. If that concern is not a risk, then you may find it easier to encrypt your data at the application level.
Encryption needs to happen in a few different places, depending on the application. For example, a consumer site handling credit card info needs to encrypt the connection over the network to prevent man-in-the-middle attacks or snooping. When the data is stored in the database, you need to encrypt it so that a low-level sales rep can't read the customers' credit card info; for that you might want to implement column-level encryption, with appropriate permissions on top. In addition, if you're worried that one day the janitor at your data centre might steal one of your backups, then you need TDE to encrypt the data at the disk level.
Encryption has a performance overhead, especially with regard to CPU usage; more importantly, the overhead depends on the algorithm being used for encryption.
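For the backup-theft scenario above, enabling TDE is a short sequence (a sketch; the database and certificate names are invented, and the certificate must be backed up or encrypted backups become unrestorable):

    -- Sketch: enabling Transparent Data Encryption on one database.
    USE master;
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
    CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate';
    GO
    USE SalesDb;                       -- hypothetical database name
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE TdeCert;
    ALTER DATABASE SalesDb SET ENCRYPTION ON;

Note that TDE encrypts data files and backups transparently; it does not protect against the sales-rep scenario, which still needs column-level encryption and permissions.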