I have a bit of a problem figuring out how to use Google Analytics properly with Bootstrap.
My site has subpages three levels deep, and the deepest subpage has its own subdomain. In GA I see I can use a maximum of 50 tracking codes within one account. What if I need more than that?
You are limited to 50 properties, not 50 pages. Each property can track many pages (up to 10 million hits a month for the free version) and events.
Typically you would use the same property code on all pages of the same site so you can see all that data together (though with the option to drill down).
You would only use a new property code for a new site (though your subdomain might qualify for that if you want to track it separately).
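For example, with the analytics.js snippet every page, including those on the subdomain, can carry the same property ID; the 'auto' option sets the cookie domain so tracking works across subdomains. A sketch (UA-XXXXX-Y is a placeholder for your own property ID):

    <script>
    window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
    ga('create', 'UA-XXXXX-Y', 'auto');  // same ID everywhere; 'auto' handles the cookie domain
    ga('send', 'pageview');
    </script>
    <script async src='https://www.google-analytics.com/analytics.js'></script>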
So the two questions you want to ask yourself are:
Do you want to be able to report on two pages together? E.g. to see that your site gets 10,000 hits, of which 20% are for this page and 5% for that page; or that people start at this page, then go to that page, then on to this page. If so, they should be on the same analytics property.
Do different people need to see these page stats, and is it a problem if they do? If so, set it up as a separate property so you can manage permissions separately.
It sounds like these are part of the same site, so I'd be veering towards tracking them together on the same property.
On a different note, you should set one page as the main version (with a rel=canonical tag) and redirect the other versions to it, so that search engines don't think you have duplicated content. Do you have a reason for serving the same content at two different addresses? It can cause SEO and other problems.
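The canonical tag is just one line in the <head> of each duplicate version, pointing at the main address (example.com is a placeholder):

    <link rel="canonical" href="http://www.example.com/catalogue/" />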
I have submitted 1,000 pages to Google in my sitemap, but I did it by mistake. I didn't want to submit them all; I wanted to submit 800 and then release one page per day for 200 days, adding fresh content every day. That way Google would see it as a frequently updated website, which is good SEO practice.
I don't want Google to know about the existence of those 200 pages right now; I want Google to treat them as fresh content when I release them day by day.
Should I resubmit the sitemap.xml with only 800 links and hide the pages on the website?
If Google has already indexed the pages, is there any chance of making Google "forget" those pages, so it doesn't recognize them when I release them again in the future?
Any suggestion about what to do?
Thank you guys.
I wouldn't do that. You're trying to cheat, and Google doesn't like that. Leave your site as it is now, create new content, and submit it to Google's index as frequently as you want. If you exclude previously submitted pages from the index, there's a high probability they won't be indexed again.
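For reference, if you do decide to resubmit a trimmed sitemap, it's just the standard sitemap protocol format, one <url> entry per page you want indexed; a minimal sketch with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/page-1.html</loc>
        <lastmod>2012-05-10</lastmod>
      </url>
      <!-- ...799 more <url> entries... -->
    </urlset>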
I see that search engines are remarkably capable of finding pages chronologically for forum websites and the like, offering the option to show results from the last 24 hours, last week, last month, last year, etc.
I understand that these sites need to be crawled continuously to provide those updates, but I am unsure what structure, tags or anything else I need in order to achieve this for my website.
I see that on the client side (which is also where search engines sit) content appears basically as static data, already processed by the server, so the question is:
If I constantly update and add content to my website's index page to make it easily visible, and even add links, times and dates as text for the new pages, why don't these updates show up at all in search engines?
Do I need to add XML/RSS feeds, or what else?
How do forums and other heavily updated sites with chronological marks make it possible for search engines to list results separated by hours, days, etc.?
What specific set of tags and overall structure do I need to add for this feature?
I also see that search engines, mainly Googlebot, usually take a minimum of 3 days to crawl those new pages, and even then they aren't organized consistently (or at all) in chronological order in search results.
I am not using any forum, blog or other kind of web publishing software, just raw HTML and PHP written by hand, plus the minimum I mentioned above: pointing to new documents from the index page of the website along with a description.
Do I need to add XML/RSS feeds, or what else?
Yes. Atom or one of the RSS formats (or several formats at the same time, so you could offer both Atom and RSS).
Search engines will know about new blog posts, microblog posts, forum threads, forum replies, etc. because they subscribe to the feed. So sometimes you'll notice that a page is indexed by a search engine only minutes after it was published. For smaller sites, though, search engines probably don't check for updates every few minutes; instead it might take days until a new page is indexed.
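A minimal Atom feed looks roughly like this (URLs, titles and dates are placeholders); each new page gets its own <entry>, and the <updated> timestamps are what give crawlers the chronology. You would also reference the feed from each page's <head> with <link rel="alternate" type="application/atom+xml" href="/feed.atom">:

    <?xml version="1.0" encoding="utf-8"?>
    <feed xmlns="http://www.w3.org/2005/Atom">
      <title>Example Site</title>
      <link href="http://www.example.com/"/>
      <link rel="self" href="http://www.example.com/feed.atom"/>
      <id>http://www.example.com/</id>
      <updated>2012-05-10T12:00:00Z</updated>
      <entry>
        <title>New page title</title>
        <link href="http://www.example.com/new-page.html"/>
        <id>http://www.example.com/new-page.html</id>
        <updated>2012-05-10T12:00:00Z</updated>
        <summary>Short description of the new page.</summary>
      </entry>
    </feed>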
A sitemap might help, too.
For example, we have 5 landing pages running under the same URL, served randomly based on their weighting.
What I want is to check which page converts better and increase its weighting automatically so that it gets served more often.
That is a simple statement of my problem. Are there any standard algorithms or techniques available to achieve this? What I don't want is to reinvent the wheel.
Thanks.
I would do something very simple, such as keeping a running count of how many times each landing page has converted. After that you have a variety of choices:
a) Sort the pages by conversion count and serve the top one(s).
b) Serve the pages in a fashion weighted by conversion count. For example, you could serve the pages based on a probability distribution derived from the conversion counts (e.g. if page A has converted twice as often as page B, you serve page A twice as often as page B). Tweaking the weighting function allows you to control the relative rates at which different pages are served; see the sketch below.
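Here is a minimal Python sketch of option (b); the page names and the initial count of 1 (so every page keeps getting some traffic) are my own assumptions, not from the question:

    import random

    # Running conversion tallies, seeded at 1 so no page's probability hits zero
    conversions = {"page_a.html": 1, "page_b.html": 1, "page_c.html": 1}

    def pick_landing_page():
        """Choose a page with probability proportional to its conversion count."""
        pages = list(conversions)
        weights = [conversions[p] for p in pages]
        r = random.uniform(0, sum(weights))
        cumulative = 0.0
        for page, weight in zip(pages, weights):
            cumulative += weight
            if r <= cumulative:
                return page
        return pages[-1]  # guard against floating-point edge cases

    def record_conversion(page):
        """Call this when a visitor who was served `page` converts."""
        conversions[page] += 1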
However, this raises a question: if a user returns to your site, will they get a different page? If so, wouldn't that be confusing?
I have a client who over the years has managed to get their product to the top of Google for many different search terms. They're adamant that the new site shouldn't have a detrimental effect on their Google ranking.
The new site will be replacing the site on their current domain, as well as going up on 5 further domains.
Will any of this lose the client their current ranking on Google?
Google regularly re-ranks the sites it has indexed. If the site changes, the ranking very well could too: if more or fewer people link to it, or if the terms on the site (the content) are different.
The effect might be good or bad, but uploading different content isn't going to make their rank go away overnight or anything like that.
PageRank is mostly about incoming links, so if the incoming links aren't broken, PageRank won't be affected that much.
Overall ranking is not just PageRank, though, so further discussion is needed.
If they retain the current link structure, they should be fine.
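If URLs do change on the new site, the usual way to keep incoming links (and the PageRank they carry) working is a permanent redirect from each old URL to its new equivalent; for example, in an Apache .htaccess (paths and domain are placeholders):

    # Permanently redirect old URLs so incoming links keep their value
    Redirect 301 /old-product.html http://www.example.com/products/new-product/
    Redirect 301 /old-category/ http://www.example.com/catalogue/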
I have 2½ years' experience of VB.Net, mostly self-taught, so please bear with me if I still seem rather noobish and don't know some of the basics. I would recommend you grab a cup of tea before starting on this, as it appears to have got quite long...
I currently have a rather large application (a VB.Net website) of over 15,000 lines of code at the last count. It does not do retail or anything particularly complex like that; it is literally just a wholesale viewing website with an admin frontend, a catalogue / catalogue management system and a pageview system.
I don't really know much about how .Net applications work in the background - whether they are all loaded on the same thread or if each has its own thread... I just know how to code them, or at least like to think I do... :-)
Basically my application is set up as follows:
There are two different areas - the customer area and the administration frontend.
The main part of the customer frontend is the Catalogue. The MasterPage will load a list of products but that's all, and this is common to all the customer frontend pages.
I tend to work on only one or several parts of the application at a time before uploading the changes. So, for example, I may alter the hierarchy of the Catalogue and change the Catalogue page to match the hierarchy change whilst leaving everything else alone.
The pageview database is getting really quite large, and because of the way it works the application is rather slow when it is first requested.
The application timeout is set to 5 minutes. I don't know how to change it; I have even tried asking that question on here, and I seem to remember the solution was quite complex and I was advised not to change it. But if a customer requests the application 5 minutes after the last page view, it reloads the application from scratch, which means a very slow page load whenever the site sees more than 5 minutes of inactivity.
I am not sure if this needs consideration to determine how best to split the application up, if at all, but each part of the catalogue system is set up as follows:
A Manager class at the top level, which is used by the admin frontend to add, edit and remove items of the specified type, and by the customer frontend to retrieve a list of items of the specified type. For example, the "RangeManager" will contain a list of product "Ranges" and will be used to interact with these from the customer frontend.
An Item class, for example Range, which contains a list of Attributes. For example Name, Description, Visible, Created, CreatedBy and so on. The form for adding / editing loops through these to display relevant controls for the administrator. For example a Checkbox for BooleanAttribute.
An Attribute class, which can be of type StringAttribute, BooleanAttribute, IntegerAttribute and so on. There are also custom Attributes (not just datatypes) such as RangeAttribute, UserAttribute and so on. These are given a data field which is used to get a piece of data specific to the item it is contained in when it is first requested. Basically the Item is given a DataRow which is stored and accessed by Attributes only when they are first requested.
When an item is requested from a specific manager, the manager will loop through all the items in the database and create a new instance of the item class for each one. For example, when a Range is requested from the RangeManager, the RangeManager will loop through all of the DataRows in the Ranges table and create a new instance of Range for each one. As stated above, it simply creates each instance with its DataRow rather than loading all the data into it there and then; the Attributes themselves fetch the relevant data from the DataRow as and when they are first requested.
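In simplified pseudocode (Python-style; this is just the shape of the pattern, not my actual VB.Net, and fetch_rows stands in for whatever loads the DataRows):

    class RangeManager:
        def get_ranges(self):
            # One instance per DataRow; no attribute data is unpacked yet
            return [Range(row) for row in fetch_rows("Ranges")]

    class Range:
        def __init__(self, row):
            self._row = row      # the DataRow is stored, not read
            self._cache = {}

        def attribute(self, name):
            # The value is pulled from the DataRow only on first request
            if name not in self._cache:
                self._cache[name] = self._row[name]
            return self._cache[name]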
It just seems a tad stupid, in my mind, to recompile and upload the entire application every time I fix a minor bug or a spelling mistake in a word which is in the code-behind (for example if I set the text of a Label dynamically). A fix / change to the Catalogue page, the way it is now, may mean that a customer trying to view the Contact page, which is in no way related to the Catalogue page apart from having the same MasterPage, cannot do so because the DLL is being uploaded.
Basically my question is: given my current situation, how would people suggest I change the architecture of the application by splitting it into multiple applications? I mean, would it be just customer / admin, or customer / admin / pageviews, or some other way? Or not at all? Are there any other alternatives which I have not mentioned here? Could web services come in handy, e.g. splitting the catalogue itself into a different application and just having the MasterPage for all the other pages use a web service to get the names of the products to list on the left-hand side? Am I just way, WAY over-complicating things? Judging by the length of this question I probably am, and it wouldn't be the first time... I have tried to keep it short, but I always fail... :-)
Many thanks in advance, and sorry if I have just totally confused you!
Regards,
Richard
15,000 LOC is not really all that big.
It sounds like you are not pre-compiling your site for publishing. You may want to read this: http://msdn.microsoft.com/en-us/library/1y1404zt(v=vs.80).aspx
Recompiling and uploading the application is the best way to do it. If all you are changing is your markup, that can be uploaded individually (e.g. changing some HTML layout in an .aspx page).
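For example, a precompiled, deployable copy of the site can be produced from the command line with aspnet_compiler (the paths below are placeholders):

    REM Precompile the site in C:\Projects\MySite into a deployable folder
    aspnet_compiler -v /MySite -p C:\Projects\MySite C:\Deploy\MySite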
I don't know what you mean here by application timeout, but if your app domain recycles every 5 minutes, then that doesn't seem right at all. You should look into this.
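If by "application timeout" you mean the IIS application pool idle timeout (the pool shuts down after N minutes without requests, and the next visitor pays the full startup cost), that is configurable in the pool's advanced settings. On IIS 7+ something like the following should raise it, though the pool name is a placeholder and you should verify the exact syntax for your setup:

    REM Raise the app pool idle timeout to 30 minutes
    %windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /processModel.idleTimeout:00:30:00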
Also, if you find yourself working on various different parts of the site (i.e. many different changes) but need to deploy only some items in isolation, then you should look into how you are using your source control tools (you are using one, aren't you?). Look into something like Git and branching / merging.
Start by reading:
Application Architecture Guide