How to generate corporate-level test data for Active Directory testing

Background: I'm developing client functionality for OpenLDAP and Active Directory in Java, using the UnboundID LDAP SDK. This setup will be used for pulling data from different clients.
I've set up a Windows server with an AD instance running on it. As a next step, I want to test my functionality against this AD server. However, setting up all of the "corporate level" data sounds like a big task, given that I'm not terribly familiar with all of the possible configuration/group-permission setups. I did find the option below, which looked ideal for my scenario, but the download won't complete. Please suggest the best way to generate "real looking" corporate data for testing against an AD server.
http://ldapwiki.willeke.com/wiki/LDIF%20Generator.

I followed this guide to set up the data:
http://blogs.technet.com/b/askpfeplat/archive/2014/02/10/how-to-use-the-active-directory-performance-testing-tool-on-windows-server-2012.aspx
I faced no issues on Windows Server 2012.
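If the LDIF Generator download never completes, another option is to generate bulk user data yourself with the UnboundID SDK the question already mentions and import the resulting LDIF into the test domain. The sketch below is only an assumption of how that could look: the OU layout, naming pattern, and attribute set are invented for the example, and it produces plain user objects rather than a full corporate topology (no groups or permissions).

```java
import com.unboundid.ldap.sdk.Entry;
import com.unboundid.ldif.LDIFWriter;

public class FakeAdDataGenerator {
    public static void main(String[] args) throws Exception {
        // Placeholder base DN; adjust to match your test forest.
        String baseDn = "OU=TestUsers,DC=corp,DC=example,DC=test";

        LDIFWriter writer = new LDIFWriter("test-users.ldif");
        for (int i = 1; i <= 5000; i++) {
            String sam = String.format("testuser%04d", i);
            Entry entry = new Entry("CN=" + sam + "," + baseDn);
            entry.addAttribute("objectClass", "top", "person", "organizationalPerson", "user");
            entry.addAttribute("cn", sam);
            entry.addAttribute("sAMAccountName", sam);
            entry.addAttribute("givenName", "Test");
            entry.addAttribute("sn", "User" + i);
            entry.addAttribute("userPrincipalName", sam + "@corp.example.test");
            writer.writeEntry(entry);
        }
        writer.close();
        // Import on the domain controller, e.g.: ldifde -i -f test-users.ldif
    }
}
```

The target OU has to exist before the import, and accounts created this way will typically come in disabled and without passwords, so this only covers the directory-data side of the test setup.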


Visual Studio Team Services Test Running

Apologies if something similar has been asked before; I couldn't find anything, so just point me in the right direction if so.
I'm brand new to test automation. I will be writing Selenium tests against a third-party website hosted on our internal network. Our source control is provided by Visual Studio Team Services, although it is possible for me to install TFS on-premises.
Eventually I need to schedule test runs. I believe all of this can be done with Team Services; I've seen some demos, and it all looks good.
I will be using a URL to access the system under test, which is on our internal network. If Team Services tries to run a Selenium test and connect to that URL, I imagine it will fail, since the build runs wherever Microsoft hosts the code and build infrastructure.
I don't think there is any chance we would allow Team Services access to our internal network, even if that were possible.
So the question is: what are my options? Can the build be moved from VS Team Services onto a local machine to run the tests against the internal URL? If it can, is this a good idea? Am I relying too much on the internet for testing on our internal network, and is this a risk?
I have spent a bit of time searching but am struggling to find much information; it's possible I am asking the wrong questions.
Any help is greatly appreciated. Links to articles are fine; I don't mind doing the legwork, I just need some pointers.
Many thanks for your help, and apologies if any of that makes no sense.
You have a few options:
Install a VSTS build agent on-premises and connect it to VSTS. The agent connects to VSTS using an outbound connection, and from there it can execute Build and Release pipelines and orchestrate the execution of tests. You can either put this agent in a specific Agent Pool or Agent Queue, or you can add a Capability to it (e.g. "onprem"). Setting the Build Definition to use the specified Pool/Queue will select that agent; alternatively, adding the Demand "onprem" to your Build Definition ensures it always requires that capability of any agent.
Use TFS 2015 Update 3 or TFS 2017 with the same agent, but that would mean you lose all the goodness VSTS brings with regard to licensing, "free upgrades", and so on.
With regards to security:
Adding an agent to your network that executes commands queued on a cloud service adds a risk. You can minimize that risk by configuring the build agent with a limited account, using Active Directory to limit the machines this user can run processes on or log on to, and limiting access to the agent through permissions on the Queue and Pool as well. You can ensure that the users who have access to this pool, and all of your VSTS administrators, have configured two-factor authentication on their AAD accounts and, if needed, add IP access control to these accounts as well. It's recommended that users who administer such agent pools/queues do not have alternate credentials configured, and that the Personal Access Token used to register the agent is scoped to only the permissions required to do that.
With these extra measures in place you'll have a pretty secure setup, and it beats the hassle of having to install, back up, and maintain a couple of TFS servers on-premises.
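To show how this fits together with the tests themselves, here is a minimal sketch of a Selenium test written so that the on-premises agent can point it at the internal URL. The environment variable name, the use of ChromeDriver, and the assertion are assumptions for the example, not part of the setup described above; the agent (or a build variable) would supply the real URL.

```java
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.Assert.assertNotNull;

public class InternalSiteSmokeTest {

    private WebDriver driver;

    @Before
    public void setUp() {
        // The on-premises build agent injects the internal URL;
        // "SUT_BASE_URL" is a placeholder variable name for this example.
        String baseUrl = System.getenv("SUT_BASE_URL");
        if (baseUrl == null) {
            baseUrl = "http://localhost"; // fallback for local runs
        }
        driver = new ChromeDriver(); // assumes chromedriver is installed on the agent machine
        driver.get(baseUrl);
    }

    @Test
    public void pageLoads() {
        // Trivial check that the internal site answered at all.
        assertNotNull(driver.getTitle());
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}
```

Because the variable is resolved on the agent, the same test code runs unchanged whether the build executes in the cloud or on the on-premises machine.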

Out Of Browser Silverlight app with local offline database and WCF-RIA

I have the following scenario:
We develop a Silverlight 4 app for our customers that will be used as an out-of-browser app. The app works offline, i.e. the app and database are on the user's local machine. The app uses WCF RIA Services to connect to the local database. The database will be SQL Server Express, SQL Server CE, or MySQL. We are using MVVM Light and MEF.
An external web server is only used for updating the app from time to time or adding new modules to it. To achieve this we do something similar to what is shown in Jeremy Likness' blog (http://www.wintellect.com/CS/blogs/jlikness/archive/2010/05/25/silverlight-out-of-browser-dynamic-modules-in-offline-mode.aspx).
The reasons why we are doing this are complex, but to keep a long story short, it is mainly for compatibility with a later online version, and we don't want to use WPF. So we need to get this working with Silverlight and WCF RIA Services.
Ok, that's the scenario and here's the question:
Do we need a local web server in this scenario? The app is installed programmatically as out-of-browser, the database is local, and it is connected via WCF RIA.
If yes, which web server would be sufficient? It should be installed and configured by an initial setup that the customer runs; the customer should not have to configure the web server at all.
Any other ideas or comments on this scenario? Any other possible solutions for this?
Thanks for your help
Dirk
Silverlight wasn't meant to be used this way, I think. It would be like when you develop an app in Visual Studio and use Cassini to see the result: everything runs locally, but you still need a web server. There may be more info here: http://www.infoq.com/news/2010/06/WPF-vs-Silverlight
I'm not able to provide a full answer to your problem, as we are currently facing the same one (WPF not being cross-platform, very specific hardware on some clients).
But I can share some of our thoughts on our type of thick Silverlight client:
To keep deployment etc. simple, we use a self-hosting process (installed as a background process).
We cannot use RIA, as the background process has to run on the Mono VM (but for an MS-only solution see "Can WCF RIA Services be self-hosted?").
Architectural thoughts on standalone "Clients":
Depending on your requirements, implementing a server for each client that communicates with the "main" server by messages (NServiceBus) may be overkill. But if you want to use a client database while offline and Silverlight for the UI, you should consider using an event-driven architecture.
There is a slideshow on combining "Event-Driven Architecture" and CQRS with Silverlight. I would not use it as a blueprint, though, more as an inspiration; a rough sketch of the core idea follows below.
http://www.slideshare.net/dennisdoomen/cqrs-and-event-sourcing-an-alternative-architecture-for-ddd
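To make the event-driven / CQRS idea a bit more concrete, here is a minimal sketch of the pattern, shown in Java purely for illustration (the class names and the in-memory event store are invented for the example): a command from the UI is validated by a handler, the resulting event is appended to a local store, and a read model is updated from that event. In an offline client, the event store is what you would later synchronize with the main server.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// --- Write side: commands express intent, events record what happened ---
class RenameCustomerCommand {
    final String customerId;
    final String newName;
    RenameCustomerCommand(String customerId, String newName) {
        this.customerId = customerId;
        this.newName = newName;
    }
}

class CustomerRenamedEvent {
    final String customerId;
    final String newName;
    CustomerRenamedEvent(String customerId, String newName) {
        this.customerId = customerId;
        this.newName = newName;
    }
}

class CommandHandler {
    private final List<Object> eventStore; // local, append-only log
    private final ReadModel readModel;

    CommandHandler(List<Object> eventStore, ReadModel readModel) {
        this.eventStore = eventStore;
        this.readModel = readModel;
    }

    void handle(RenameCustomerCommand cmd) {
        if (cmd.newName == null || cmd.newName.isEmpty()) {
            throw new IllegalArgumentException("Name must not be empty");
        }
        CustomerRenamedEvent event = new CustomerRenamedEvent(cmd.customerId, cmd.newName);
        eventStore.add(event); // persist the fact locally
        readModel.apply(event); // update the query side
    }
}

// --- Read side: a denormalized view the UI queries directly ---
class ReadModel {
    private final Map<String, String> customerNames = new HashMap<>();

    void apply(CustomerRenamedEvent event) {
        customerNames.put(event.customerId, event.newName);
    }

    String nameOf(String customerId) {
        return customerNames.get(customerId);
    }
}

public class CqrsSketch {
    public static void main(String[] args) {
        List<Object> eventStore = new ArrayList<>();
        ReadModel readModel = new ReadModel();
        CommandHandler handler = new CommandHandler(eventStore, readModel);

        handler.handle(new RenameCustomerCommand("42", "Contoso Ltd."));
        System.out.println(readModel.nameOf("42")); // prints "Contoso Ltd."
    }
}
```

The point of the split is that the UI only issues commands and queries the read model; the locally stored events are the single source of truth and can be replayed or shipped to the server once a connection is available.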

Keeping OpenLDAP and Active Directory in sync (Windows Server 2008 R2)

I've got a Windows Server box running AD and a CentOS box running OpenLDAP in a mixed Windows/Linux network, and I want to keep the two in sync, preferably using free software or just some configuration changes. Does anyone know how to make these two authentication systems play nicely? Any syncing would have to be done over SSL for security reasons.
I use a home-grown Perl script, which syncs one-way from AD to LDAP over SSL. It is very custom and very rigid. I walked the same path six months back looking for sync tools, but none fit our needs; actually, there isn't one that does the sync without breaking something.
So my answer is: get a scripting person and give them the requirements and a month's paycheck. Seriously, it is better done in-house than spending time looking for a tool and molding it to your needs.
Perl has good libraries and has worked very well for us. We migrated from OpenLDAP to 389-DS, which already has a windowsSync plugin. (Hope that tempts you to switch over.) :)
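The script above is Perl; since the first question in this collection already uses the UnboundID LDAP SDK from Java, here is a rough sketch of what the same kind of one-way AD-to-OpenLDAP sync could look like with that SDK instead. The hostnames, bind DNs, attribute mapping, and the trust-everything SSL setup are placeholders for a lab; a real sync would also need to handle deletes, paging of large result sets, and proper certificate validation.

```java
import com.unboundid.ldap.sdk.*;
import com.unboundid.util.ssl.SSLUtil;
import com.unboundid.util.ssl.TrustAllTrustManager;

import javax.net.ssl.SSLSocketFactory;

public class AdToOpenLdapSync {
    public static void main(String[] args) throws Exception {
        // Lab-only trust settings; use a real trust store in production.
        SSLSocketFactory ssl = new SSLUtil(new TrustAllTrustManager()).createSSLSocketFactory();

        // Hostnames, ports, bind DNs and passwords are placeholders.
        LDAPConnection ad = new LDAPConnection(ssl, "ad.example.test", 636);
        ad.bind("CN=sync,CN=Users,DC=example,DC=test", "adPassword");

        LDAPConnection openLdap = new LDAPConnection(ssl, "ldap.example.test", 636);
        openLdap.bind("cn=admin,dc=example,dc=test", "ldapPassword");

        // Pull the users we care about from AD.
        SearchResult result = ad.search("CN=Users,DC=example,DC=test", SearchScope.SUB,
                "(objectClass=user)", "sAMAccountName", "givenName", "sn", "mail");

        for (SearchResultEntry adEntry : result.getSearchEntries()) {
            String uid = adEntry.getAttributeValue("sAMAccountName");
            if (uid == null) {
                continue;
            }
            String givenName = adEntry.getAttributeValue("givenName");
            String sn = adEntry.getAttributeValue("sn");
            String mail = adEntry.getAttributeValue("mail");
            String surname = (sn != null) ? sn : uid;
            String dn = "uid=" + uid + ",ou=people,dc=example,dc=test";

            Entry newEntry = new Entry(dn);
            newEntry.addAttribute("objectClass", "inetOrgPerson");
            newEntry.addAttribute("uid", uid);
            newEntry.addAttribute("sn", surname);
            newEntry.addAttribute("cn", (givenName != null) ? givenName + " " + surname : surname);
            if (mail != null) {
                newEntry.addAttribute("mail", mail);
            }

            try {
                openLdap.add(newEntry); // create the entry if it does not exist yet
            } catch (LDAPException e) {
                if (e.getResultCode() != ResultCode.ENTRY_ALREADY_EXISTS) {
                    throw e;
                }
                if (mail != null) {
                    // Entry already present: push the attribute(s) we track.
                    openLdap.modify(dn, new Modification(ModificationType.REPLACE, "mail", mail));
                }
            }
        }

        ad.close();
        openLdap.close();
    }
}
```

Run on a schedule (cron or Task Scheduler), this gives the same kind of one-way, SSL-only flow from AD into OpenLDAP that the Perl approach provides.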

Testing ClickOnce Applications

What method would you use for testing a new version of a ClickOnce application (side by side with the current version) amongst multiple users? Are there any best practices, especially as the applications depend on different servers for the live and test versions of SQL, web services, etc.?
We use internal DNS to set up http://application.ourdomain.test sites to test web based applications. It's obvious from the address bar that you're logged into a test site and it's just a change to the connection string to force that deployment to connect to our test SQL server. Is there any way to approximate this?
For ClickOnce deployment testing, we set up a few virtual machines and have testers connect using remote desktop. The VM desktop backgrounds are an ugly color and say "TESTING" in big bold red letters.
Also, all of our applications display a warning message if the user is about to connect to anything but the production database.
I came across a link this afternoon which seems to offer a way of doing it: you have to use different certificates to sign the manifests. I vary the Suite Name in the Publish Options dialog to create different menus.

Know of an SSO turnkey appliance with LDAP, RADIUS, OpenID, etc.?

I'm helping a typical small company that started with a couple of outsourced systems (Google Apps, SVN/Trac), added an internal Jabber server (ejabberd, for mostly iChat clients), subscribes to a couple of web services (e.g. highrisehq), and has a VPN service provided by a pfSense FreeBSD firewall.
And the net result of all this is that they're drowning in passwords and accounts.
It seems that if they had a single unified login / single sign-on service, they could go a long way toward combining these. For example: LDAP as the master repository, RADIUS linked to it for the VPN, ejabberd and even WPA2 wireless access, plugins for Google Apps sign-on, and perhaps an OpenID server for external websites like highrisehq.
It seems that all these tools exist separately, but does anyone know of a single box that combines them with a nice GUI and auto-updates (e.g. like pfSense/m0n0wall for firewalls, or FreeNAS for storage)? It doesn't have to be FOSS; a paid box would be fine too.
I figure this must exist. Microsoft's Active Directory is likely one solution, but they'd rather avoid Windows if possible. There seem to be various "AAA" servers that ISPs use, or that are used for enterprise firewall/router management, but that doesn't seem quite right.
Any obvious solutions I'm missing? Thanks!
It's been over a year since you originally asked the question, so I'm guessing you've solved your problem by now. But if someone else is interested in a possible solution, I suggest the following:
First of all, I don't know of any "all in one" solution to your problem. However, it's quite easy to combine three products that will cover all of your needs and provide a single source for user management and password storage.
The first thing to do is install an LDAP directory to manage users and groups (and possibly other objects outside the scope of your question). This can be OpenLDAP, Apache DS, Microsoft Active Directory, etc. Basically, any LDAP server will do.
Second, I recommend installing FreeRADIUS with the LDAP directory configured as its backend service.
Third, get a license for Atlassian Crowd. It provides OpenID and Google Apps authentication. Prices start at $10 for up to 50 users and go all the way up to $8,000 for an unlimited-user license.
Installation and configuration of the three is relatively easy; you'll probably put the most work into creating your users and groups. You can install all three components on a single server and end up with a box that lets you authenticate pretty much everything, from desktop login, through Google Apps and other web apps, down to VPN and even switch, WiFi, and router login.
Just make sure you configure your roles and groups wisely! Otherwise you might end up with some salesperson being able to administer your firewalls and routers :-)
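To illustrate the "any LDAP server will do" point: once the directory is the master repository, most internal applications can authenticate users with nothing more than a simple bind. Below is a minimal Java sketch using the UnboundID LDAP SDK; the hostname, DN pattern, and the permissive trust manager are assumptions for the example (production code should validate the server certificate and usually look up the user's DN rather than build it from a template).

```java
import com.unboundid.ldap.sdk.LDAPConnection;
import com.unboundid.ldap.sdk.LDAPException;
import com.unboundid.util.ssl.SSLUtil;
import com.unboundid.util.ssl.TrustAllTrustManager;

import javax.net.ssl.SSLSocketFactory;
import java.security.GeneralSecurityException;

public class LdapAuthCheck {

    // Returns true if the central directory accepts the user's credentials.
    static boolean authenticate(String uid, String password) {
        // Placeholder DN pattern and host; TrustAllTrustManager is lab-only.
        String userDn = "uid=" + uid + ",ou=people,dc=example,dc=com";
        try {
            SSLSocketFactory ssl = new SSLUtil(new TrustAllTrustManager()).createSSLSocketFactory();
            LDAPConnection conn = new LDAPConnection(ssl, "ldap.example.com", 636);
            try {
                conn.bind(userDn, password); // a successful simple bind means valid credentials
                return true;
            } finally {
                conn.close();
            }
        } catch (LDAPException | GeneralSecurityException e) {
            // Invalid credentials, unreachable server, TLS problems, etc.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(authenticate("jdoe", "secret"));
    }
}
```

FreeRADIUS and Crowd then sit on top of the same directory, so services that cannot speak LDAP directly still end up checking the same set of users and passwords.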
I would encourage anyone searching for this type of solution to check out the Gluu Server (http://gluu.org).
Each Gluu Server includes a SAML IdP for SAML SSO, an OpenID Connect Provider (OP) for OpenID Connect SSO, a UMA Policy Decision Point (PDP) for web access management, and a RADIUS and LDAP server.
All the components of the Gluu Server are open source (i.e. Shibboleth, OX, FreeRADIUS, OpenDJ, etc.), including the oxTrust web user interface for managing each component of the server.
For commercial implementations, Gluu will build, support, and monitor this stack of software on a client's VM.
You may not want to standardise passwords across so many apps (especially external ones), though for internal ones using an auth service like LDAP makes sense.
You could solve the issue of remembering passwords with an eSSO product like Novell SecureLogin.
You might also be interested in Novell Access Manager and Novell Identity Manager.
I too could use such a device; however, the only one I could find was a (possibly outdated) data sheet from Infoblox. They seem to have since concentrated on automated network management, and I can't find the LDAP appliance on their current website. I guess building a Linux box with the FOSS stuff mentioned above is what everyone does, but it would be great not to have to deal with power supplies, disks, fans, etc. I suppose you could use something like an Eee PC and put the config on a flash card.
This is something I was looking for as well, and http://www.turnkeylinux.org/openldap looks like the solution: an "appliance" installation, and it includes encrypted online backup which can easily be restored to a new or replacement machine.