What is the best way to set up a single Google Analytics 4 (GA4) account to track hundreds of subdomains? - migration

I work with a large university where we track hundreds of subdomains, plus a few apps. I'm looking for the best solution as we migrate to GA4. (I know, it should have been done long ago.) We currently have 20+ properties, each containing up to 25 Views configured to track individual subdomains or apps, and we grant user access to these individual Views.
With GA4, Views are deprecated. I'm wondering if anyone has dealt with this type of challenge and how they split up subdomains. We need to keep one account (if possible) and a general plan of action for breaking up all these domains so we can streamline the Analytics data for our end-users.
We do use Google Tag Manager as well.
So far, we have one "main" GA4 property with a single data stream for the root website. The app side is not configured yet. We also have about 3 other properties set up for individual big players (e.g. colleges or departments which we know will need lots of customizations).
I'm wondering if we should create more properties with multiple "web" data streams for the many subdomains, but then, how might this affect how users engage with this new Analytics model? Or should we handle this in a completely different way?

You don't make properties for stakeholders. You make properties for tracked activity types. Duplicating the tracking is not a good solution.
You don't have Views anymore, but you do have the hostname dimension, which you can use to filter reports. Feel free to make hostname-based segments and teach your stakeholders to use them.
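If stakeholders or developers want programmatic, per-subdomain reports out of that single property, here is a minimal sketch using the GA4 Data API Node client; it assumes `@google-analytics/data` is installed and credentials are configured, and the property ID and hostname are placeholders, not values from the original question.

```typescript
// Minimal sketch: pull pageviews for a single subdomain from one GA4
// property by filtering on the hostName dimension.
// Assumes @google-analytics/data is installed and Application Default
// Credentials are set up; property ID and hostname are placeholders.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const analyticsData = new BetaAnalyticsDataClient();

async function pageviewsForSubdomain(propertyId: string, host: string) {
  const [response] = await analyticsData.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: '30daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'hostName' }, { name: 'pagePath' }],
    metrics: [{ name: 'screenPageViews' }],
    dimensionFilter: {
      filter: {
        fieldName: 'hostName',
        stringFilter: { matchType: 'EXACT', value: host },
      },
    },
  });
  return response.rows ?? [];
}

// Example: report for one college's subdomain (hypothetical hostname).
pageviewsForSubdomain('123456789', 'law.example.edu')
  .then((rows) => console.log(rows.length, 'rows'));
```

The same hostName filter can be saved as a comparison or segment in the GA4 UI, so non-technical stakeholders get the equivalent of their old per-subdomain View without extra properties.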
Finally, native app tracking is done through Firebase. Firebase can then be linked to GA4 as its own stream. It's better to keep it in the same property as the web data.
The general rule is: don't make additional properties unless you know what you're doing. Additional properties increase complexity on both ends, the collection and the post-processing of data. Needlessly increasing complexity will just make the people who inherit your design frown upon it.

Related

How do I create a private and public API architecture

I've been assigned a project where we already have an up-and-running website, and one of our clients wants to be able to track statistics from the website.
We want to make this available to all our clients as soon as we finish development. Note that each 'client' has their own 'subdomain', so to speak, e.g. www.website.com/client1, www.website.com/client2, etc., and we want to track usage separately for each of these clients.
We will need to create statistics based on the usage of our own platform, pull in data recorded by Google Analytics, and also pull in data from a third party, which they will offer via an API of their own (they have a third-party solution that uses the data accessible via our API).
All this data needs to be shown on a webpage with graphs and tables.
I wanted to make sure we choose the right architecture from the start, in order to avoid scalability issues later on.
I've started reading about private and public APIs lately.
For now, we do not yet have another (internal) application that would use our own statistics; it would just be the website using it. But in order to be able to scale up later if needed, when another application might want to use the statistics, I think a private API would benefit us greatly.
In order to allow third parties to use the statistical data we choose to expose, I was thinking of creating a public API.
Is a private & public API the correct way to go about this?
One of the questions I am stuck on is what the architecture for these APIs should look like. Right now we already have a public API for vacancy data. This 'API' is basically just a PHP class (controller) inside our CodeIgniter solution. It gets called via its URL and returns a JSON object with the results (e.g. www.website.com/api/vacancy/xxx).
In order to create a (proper) private & public API solution/architecture, should the API be separated from the website (CodeIgniter)? What are the common go-to solutions for this?
Or is it fine to keep it in our current platform the way it is now (and have people call the stats API via www.website.com/api/stats/xxx, for example)?
It's almost always right to go with a microservices-like architecture, so your initial thoughts sound reasonable. Doing this will give you the ability to scale and deploy your API independently, and will also help you avoid performance side effects on your site (and vice versa). Pay attention to how you access your main site's data from within the new API if you don't want to end up with a monolithic application.
Regarding the API, I would suggest implementing a protocol like OAuth2 in order to achieve the flexibility you (might) need. You can also use Swagger to document and test your API.
Everything I've said might help you a lot, but first you have to ask yourself whether you really need to go this deep, or whether you just need a simple solution.
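To make the "deploy the API independently" idea concrete, here is a minimal sketch of a stats endpoint running as its own small service. Express with TypeScript, the /api/stats route, the x-api-key header, and the in-memory key list are all assumptions for illustration, not part of the existing CodeIgniter API.

```typescript
// Sketch of a stats API deployed separately from the main website.
// Routes, headers, and the key list are illustrative placeholders only.
import express from 'express';

const app = express();

// Hypothetical private keys issued to your own applications.
const PRIVATE_KEYS = new Set(['internal-website-key']);

// Simple key check so only your own apps reach the private endpoints.
app.use('/api/stats', (req, res, next) => {
  const key = req.header('x-api-key');
  if (!key || !PRIVATE_KEYS.has(key)) {
    return res.status(401).json({ error: 'invalid API key' });
  }
  next();
});

// Per-client statistics, e.g. GET /api/stats/client1?from=2024-01-01
app.get('/api/stats/:clientId', (req, res) => {
  const { clientId } = req.params;
  // In a real service this would query your own database and/or the GA4 API.
  res.json({ clientId, pageViews: 0, visitors: 0 });
});

app.listen(3000, () => console.log('stats API listening on :3000'));
```

The main website (and later any internal application) would call this service over HTTP, which keeps the statistics workload off the CodeIgniter deployment.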
I think multitenancy is the best choice. Generally speaking, multitenancy means every customer has their own database: the data is kept separate, while the codebase is shared and already exists. As I understand it, the project is already in progress, so you would not have to redesign or rewrite anything.
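As a rough sketch of the database-per-customer idea (TypeScript here is only for illustration, since the real project is PHP/CodeIgniter; the tenant names and connection strings are made up):

```typescript
// Illustrative multitenancy lookup: each client maps to its own database.
// The tenant names and connection strings below are made up for the example.
const tenantDatabases: Record<string, string> = {
  client1: 'postgres://db-host/client1_stats',
  client2: 'postgres://db-host/client2_stats',
};

// Resolve the tenant from the first path segment, e.g. /client1/dashboard.
function connectionStringForPath(pathname: string): string {
  const tenant = pathname.split('/').filter(Boolean)[0] ?? '';
  const conn = tenantDatabases[tenant];
  if (!conn) {
    throw new Error(`unknown tenant: ${tenant}`);
  }
  return conn;
}

// Example: the stats code opens the connection for the requesting client only.
console.log(connectionStringForPath('/client1/dashboard'));
```

The point of the pattern is that statistics for one client can never leak into another client's reports, because each request only ever touches that tenant's database.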

How to proliferate access permission to Javascript MVC apps

I recently finished one of my first AgilityJS projects: a web-based file browser that lets you create and manage folders and files and navigate around the folder tree. I followed the various AgilityJS recommendations regarding the design and ended up with all my HTML and JavaScript in a single JavaScript file.
Now, I would like to provide a "read-only" version of this app which does not have the ability to add, edit, or remove files and folders. I'd like to have two user types on the website: one which can only read the files and folders, and another which can administer them.
My question is, how do I proliferate these permission differences to my AgilityJS app? I know how to secure my endpoints and operations on the server side, but I'm wondering about the best way to do this on the client side. Should I create a separate version of the app with a limited set of functionality? Should I simply hide certain buttons/features? Are there theories, frameworks, etc. which deal with this issue? Any pointer in the right direction would be helpful.
LOL - one could probably write books about that topic. Some very basic ideas:
I would start with the philosophical debate around MVC. Some people argue, with MVC as their backing, that no piece of code and no piece of the data model should ever be implemented twice: business logic and the model belong on the server. The opposite view focuses on serving users at any cost, even if that means maintaining code or the model twice for the sake of avoiding extra round trips. The middle way defines a master source for business code and the model and makes sure every other place follows that leading master (the master is changed first). Take your pick. Your answer to that question sets the boundaries for what the user interface can, and has to, look like for the user.
You need to think hard about a permissions concept. Looking at Microsoft, I would assume they invested a couple of dozen man-years across all their applications to work out their permission concepts. The ideal permission concept depends very much on your application, so it is close to impossible to work this out without knowing at least a little about yours. In any case, the permission concept has to come up with policies covering roles, groups, access rights, access levels, context-driven permissions (e.g. based on IP address), and permission black- or whitelisting (the permissions each user has at creation). An example from Microsoft: http://office.microsoft.com/en-us/windows-sharepoint-services-help/permission-levels-and-permissions-HA010100149.aspx
Data on the client is not secure! Whatever you do on the client, be it data hiding, encryption, or compression: if it is done on the client, there are ways to read the data or to undo those manipulations (even by simply disabling them). Somebody can also send data to your server where the client never even offered an update form; hackers can craft such requests themselves. So as soon as you start implementing permissions, make sure that users are actually permitted to read every piece of data you send to clients, and that you include a permission check every time you add or update data in the database.
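As a small illustration of that last point (server-side enforcement, with the client only adapting its UI), here is a minimal sketch; Express with TypeScript, the role names, the x-demo-role header, and the routes are assumptions for the example and are not part of AgilityJS or the original app.

```typescript
// Sketch: the server is the authority on permissions; the client only uses
// the role to decide which buttons to render. Roles and routes are hypothetical.
import express from 'express';

type Role = 'reader' | 'admin';

const app = express();
app.use(express.json());

// Stand-in for real session/authentication middleware.
function roleForRequest(req: express.Request): Role {
  return req.header('x-demo-role') === 'admin' ? 'admin' : 'reader';
}

// Everyone authenticated may read.
app.get('/api/files', (_req, res) => {
  res.json([{ name: 'report.pdf' }]);
});

// Mutations are checked on every request, regardless of what the UI shows.
app.post('/api/files', (req, res) => {
  if (roleForRequest(req) !== 'admin') {
    return res.status(403).json({ error: 'read-only user' });
  }
  res.status(201).json({ created: req.body?.name ?? 'untitled' });
});

// The client can ask for its role purely to hide or show admin controls.
app.get('/api/me', (req, res) => {
  res.json({ role: roleForRequest(req) });
});

app.listen(3000);
```

Hiding the add/edit/remove buttons for "reader" users is then just a convenience; even if someone re-enables them in the browser, the POST check above still rejects the request.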

Managing users with Piranha CMS

What is the best way to manage users with Piranha CMS?
I would like to prevent some users from adding content (posts, etc.) in some categories, and prevent some sites from being listed for some users. (For people who don't know it, Piranha CMS is a programmer-oriented micro CMS.)
I would like to use the sites feature because I'm working on a project in which I'll have a "network" of several sites managed by different entities of an organization. I would like each entity to be able to see only its own site, while the parent organization at the top can manage every site. Moreover, within a site I would like some users to be able to edit only certain parts of the site.
Are these features built in? Otherwise, what is the best way to implement them myself around the CMS?
I am using ASP.NET MVC 4 and EF5.
If you take a look at System > Permissions in the manager area you can see that there are permissions you can give to groups for different parts of the manager interface.
However, there is currently no built-in support for restricting access to different site trees, but you are free to file a feature request for this on GitHub, or maybe participate by implementing it and sending a pull request!

Any SharePoint geek to comment on architecture?

We are working with an existing SharePoint solution for a company (it's an intranet). Now the company is splitting into two, and so is its intranet: each of the two companies will from now on have its own intranet. So the idea is to split this intranet (which is just a web application) into two web applications. But these companies will also have some information to share between them, so the idea is to put the shared information (administrative stuff, etc.) into a separate web application. That gives us three web apps so far.
Previously, the company's intranet managed the data of about 50 facilities in the form of subsites. According to the new design, each web application will have 25 facilities; in other words, each company will have 25 facilities. The problem is that each facility holds a large amount of data, almost 5 GB or more, so it's not possible for me to put a restriction quota at the site level, although we can restrict size at the site collection level (as per my understanding). So the idea so far is to create one site collection for each facility, which means we will have 25 site collections in each web application. Navigation could be a nightmare. Can we solve the navigation problem with managed paths, etc.?
Any other suggestions/improvements will be warmly welcomed and appreciated! Even a small comment may help me improve my design ;)
In the proposed design, you will have to set up the navigation manually (managed paths will probably not help unless you want to have something other than /sites/ in the URL).
I do question why you now have to split each of the facilities into its own site collection when, before the split, subsites worked. I would only recommend making each facility a unique site collection if you can foresee being asked to move facilities between the two new companies.

Design an API for a web service without "selling the farm"?

I'm going to try to phrase this as a generic question.
A company runs a website that has a lot of valuable information on it. This information is queried from an internal private database. So technically, the information in the database is the valuable part.
If this company wished to develop an API that developers could use to access their database of valuable & useful information, what approach should the company take?
It's important to give developers what they need. But it is also important to keep competing websites from using the API to scrape everything and effectively steal all traffic from the company's website.
Is there some way the API could be used that drives traffic back to the original company's website? Something that gives users a reason to keep going there.
This is a design consideration that my company is struggling with that I can imagine other web-based services have come across before.
Institute API keys - don't make the API public. Maybe make the signup process more involved than "anyone with an e-mail address".
Rate-limit the API based on keys. If a key is running more than X requests a minute, it's likely mining the database.
Don't provide a "fetch everything" API. Make users already know something in order to get information about it. Don't reveal what you know.
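A rough sketch of the first two points, API keys plus per-key rate limiting. Express with TypeScript, the x-api-key header, the in-memory counters, and the one-minute window are assumptions for illustration; a real deployment would more likely use an API gateway or an off-the-shelf rate limiter.

```typescript
// Sketch: require an API key and rate-limit each key to N requests per minute.
// Keys, limits, and the in-memory counters are illustrative only.
import express from 'express';

const app = express();

const VALID_KEYS = new Set(['partner-key-1', 'partner-key-2']); // issued at signup
const LIMIT_PER_MINUTE = 60;
const counters = new Map<string, { windowStart: number; count: number }>();

app.use((req, res, next) => {
  const key = req.header('x-api-key');
  if (!key || !VALID_KEYS.has(key)) {
    return res.status(401).json({ error: 'missing or unknown API key' });
  }

  const now = Date.now();
  const entry = counters.get(key);
  if (!entry || now - entry.windowStart > 60_000) {
    counters.set(key, { windowStart: now, count: 1 });
  } else if (++entry.count > LIMIT_PER_MINUTE) {
    return res.status(429).json({ error: 'rate limit exceeded' });
  }
  next();
});

// No "fetch everything" endpoint: callers must already know an item ID.
app.get('/api/items/:id', (req, res) => {
  res.json({ id: req.params.id, summary: 'lookup by known ID only' });
});

app.listen(3000);
```

Because there is no listing endpoint, a scraper cannot enumerate the database; it can only look up items it already knows about, and even then only at the permitted rate.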
I've seen a lot of companies giving out API keys and stating a TOS that all developers must adhere to. For example, any page that uses data from the API must include your logo and a link back to your website. If any developer is found breaking the rules, the API key can be cancelled and your data is safe again.
Who is meant to use the API?
A good general method of solving this problem is to limit access to the data to end users (rather than letting applications or developers at it). Provide applications and users each with their own identification, and make sure that accessing a subset of the data requires a combination of both the user key and the application key.
Following this pattern, each user will have access to a very limited subset of the data (presumably, the data that they require for their own specific use), and you can put measures in place to enforce this. Any attempts at data-mining will become obvious.
This type of approach meshes well with capability-type security models on the server side.
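A minimal sketch of the combined user-key-plus-application-key check described above; the header names, the lookup tables, and the per-user data scopes are all hypothetical.

```typescript
// Sketch: a request is only served when both an application key and a user
// key are presented, and the user key determines which slice of data is
// visible. All identifiers here are made up for illustration.
import express from 'express';

const app = express();

const APP_KEYS = new Set(['partner-app-1']);
const USER_SCOPES: Record<string, string[]> = {
  'user-key-alice': ['dataset-a'],
  'user-key-bob': ['dataset-b'],
};

app.get('/api/data/:dataset', (req, res) => {
  const appKey = req.header('x-app-key') ?? '';
  const userKey = req.header('x-user-key') ?? '';

  if (!APP_KEYS.has(appKey)) {
    return res.status(401).json({ error: 'unknown application' });
  }
  const scope = USER_SCOPES[userKey];
  if (!scope) {
    return res.status(401).json({ error: 'unknown user' });
  }
  // The user only ever sees the subset they were granted.
  if (!scope.includes(req.params.dataset)) {
    return res.status(403).json({ error: 'dataset not in user scope' });
  }
  res.json({ dataset: req.params.dataset, rows: [] });
});

app.listen(3000);
```

Since every response is scoped to one user's grant, bulk extraction requires compromising many user keys at once, which is exactly the kind of anomaly the measures mentioned above would surface.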