blog.domain.com vs domain.com/blog

Other than style, is there any significant difference between the two, perhaps in something such as SEO?
And style-wise, is there a "standard"/good practice about it, or is it just up to your personal preference?

No "best practice" applies.
The decision is more often dictated by administrative considerations than anything else.
Some considerations might be:
Subdomain
you can host it on an entirely different machine to the primary site
it might make integration with the primary site more difficult (cookies, authentication, database servers, etc)
the "blog" DNS record is the first point of control
Subdirectory
the blogging software ideally uses the same technology as the primary site (eg PHP)
the blogging software necessarily runs on the same platform as the primary site (eg Linux)
the webserver is the first point of control
As far as I'm aware it makes negligible difference to SEO.
That said, the SEO impact is a long-debated moving target: as of 2015 there is a surge of evidence and opinion toward subdirectories.
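To make the "first point of control" idea concrete, here is a minimal DNS zone sketch; example.com and the addresses are placeholders, not taken from the question:

    ; hypothetical BIND-style zone fragment for example.com
    www.example.com.     IN  A  203.0.113.10   ; main site server
    blog.example.com.    IN  A  198.51.100.20  ; blog hosted on a different machine

Repointing only the blog record moves the blog to another server without touching the main site.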

Subdomains make it easier if you want to swap just the blog over to another server (since you can change the DNS for the subdomain but keep the main portion of the domain pointing to the original machine), but they can also make AJAX requests and cookies behave differently due to subdomains being seen as "different domains" in some cases.
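On the cookie point, whether a cookie is shared across subdomains depends on its Domain attribute; a hedged illustration with made-up names and values:

    Set-Cookie: session=abc123; Path=/
        (host-only: sent back only to the exact host that set it)

    Set-Cookie: session=abc123; Domain=example.com; Path=/
        (sent to example.com and all subdomains, including blog.example.com)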

blog.domain.com is interpreted as a website all on its own, whereas domain.com/blog is seen as a sub-page or sub-directory of domain.com, depending on how your blog is set up. I believe Google Analytics even has an option to verify and track sub-domains, which segments them as a separate site.
Sites like DIYNetwork.com and About.com utilize sub-domains because, even though a sub-domain still depends on the parent domain, it offers the opportunity to house an entirely different, independent website, with easy analytics tracking.

Related

Setting Up a Private MediaWiki

I have a MediaWiki (1.32.0) running locally via WAMP on my Windows PC. I want to make the Wiki privately available online, behind a username/password that lets people see it.
So basically I have two big problems:
I've never hosted a wiki before, but I have hosted other, less complicated sites (such as my old personal website on HostGator) - but these sites never required a "back end" to serve content
I've never created a password-protected website. I'm thinking we'll just have one username/password combo, because we'll only allow 5-20 editors max on the Wiki; there will never be more than 1000 simultaneous visitors (and that's a very generous upper bound)
Any advice on either of these issues would be much, much appreciated.
Miraheze free wiki hosting will take care of this easily.
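If you would rather self-host, MediaWiki itself can be locked down from LocalSettings.php. A minimal sketch; these are standard MediaWiki configuration variables, but check the documentation for your version:

    // In LocalSettings.php: require login to read or edit anything.
    $wgGroupPermissions['*']['read'] = false;          // anonymous users cannot read
    $wgGroupPermissions['*']['edit'] = false;          // anonymous users cannot edit
    $wgGroupPermissions['*']['createaccount'] = false; // visitors cannot self-register

    // Keep the login and password-reset pages reachable so people can sign in.
    $wgWhitelistRead = [ 'Special:UserLogin', 'Special:PasswordReset' ];

With self-registration off, an administrator creates the 5-20 editor accounts by hand, which also avoids everyone sharing a single login.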

Should I use parked domains or separate accounts in cPanel

My question is: is there any performance difference between:
1) 10 parked domains on one account, with .htaccess and PHP deciding what to load
2) 10 separate accounts
Users get the same thing on the front end. I am currently using one account with all the parked domains, but I wonder if that is slowing down my performance.
There is no performance difference that I know of.
With separate accounts, cPanel basically creates different vhost entries to map each domain name to its respective website.
.htaccess mapping achieves pretty much the same thing as vhosts.
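For illustration, a hedged .htaccess sketch of that mapping; the domain and directory names below are placeholders:

    # Route requests for one parked domain into its own subdirectory.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?parked-example\.com$ [NC]
    RewriteCond %{REQUEST_URI} !^/sites/parked-example/
    RewriteRule ^(.*)$ /sites/parked-example/$1 [L]

One block like this per parked domain reproduces what separate vhost entries would do.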
There is however an exception:
If your hosting company uses CloudLinux OS, or anything else that isolates each tenant in their own environment, then each account is only allowed a portion of the server's CPU. That means the more sites you have in one account, the slower they get (provided the sites are all actively handling requests and responses).
By the way, I think web hosting questions like this are perfectly valid here. Programmers have different needs too :)

General SEO tactics related to publishing small content sites for the purpose of driving traffic to a web shop

Calling all SEO experts!
The web company that I work for is going to begin systematically launching a bunch of small, 1-5 page, content-focused websites for the purpose of driving traffic to the web shop.
The websites will look, and work, identically to an affiliate website that feeds products. All product images and such are fed from XML and text files only.
We have been trying to decide how to host the smaller sites. Our resident SEO "expert" claims that we ought to a) host the websites externally and b) host in every country that we have a web shop in. So our co.uk domains ought to have hosting in the UK and our .de domains ought to have hosting in Germany.
This is creating an ENORMOUS logistics problem for administration.
The question I have is: is it really necessary to host these small content sites externally and do we need to spread them all over God's green Europe?
Thanks!
/Brian
P.S. No, I don't trust the competence of our resident SEO "expert"....
From my experience, the country a domain is hosted in, as compared to its TLD, makes no difference. The exception to this might be page load times for your users, which are a factor in Google rankings, but having the server "nearby" should suffice; it wouldn't need to be in a particular country as long as the round trip between your users and the server is acceptable. If your customers are in Germany it might not make sense to host in the USA, but the UK would likely be fine (assuming it is a good host with a good internet connection, etc).
What matters far more in terms of SEO is having a good, clean site with relevant information that loads quickly and is linked to by other good, clean sites with relevant information.

Where to get a large list of safe-for-work domain names?

Does anyone know where I could find a list of safe-for-work (i.e. no porn, piracy sites, etc) domain names that I can use to stress test software that performs asynchronous DNS lookups without raising questions if my network admin happens to be watching?
At least several thousand would be ideal. Most lists I've found have not been filtered at all. So far, using "raw" lists for DNS queries has not raised any questions, but my next step is to create TCP connections.
EDIT: I've cleared everything with local network admin people, however, this would still be nice to have for future developers on the project.
I think you probably worry too much. Having said that, how about doing a Google search for 'interesting facts about butterflies', parsing all the resulting domains, and using those?
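As a hedged sketch of that parsing step in PHP (the file names are placeholders; it assumes you have already saved the harvested result URLs one per line):

    <?php
    // Extract unique hostnames from a list of URLs and write them out.
    $urls  = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $hosts = [];
    foreach ($urls as $url) {
        $host = parse_url(trim($url), PHP_URL_HOST);
        if (!empty($host)) {
            $hosts[strtolower($host)] = true; // array keys give free de-duplication
        }
    }
    file_put_contents('domains.txt', implode("\n", array_keys($hosts)) . "\n");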
Your network admin will probably be more concerned with the fact that you're stress testing a network service on his network on the order of thousands of domains. If you have any kind of decent corporate firewall it's inspecting DNS queries and could choke on a high rate of queries. If your requirement is a legitimate business requirement the best option is to have your boss talk to the head of the network department to CYA.

Static Pages vs. Dynamic Pages, Which is Better for SEO?

Static Pages vs. Dynamic Pages, Which is Better for SEO?
I am not an SEO specialist; I just want to know which is better.
Regards
It doesn't matter. In both cases you send HTML as a response to the browser or search engine bot.
You mean static websites (HTML only) versus dynamic websites (PHP, ASP, JSP, ...)?
There is only one relevant difference between static and dynamic pages for SEO, and that is URLs. Static pages work "naturally": the organization of the URLs in folders follows the organization of your website, there is only one URL for each page, and so on.
If you use a dynamic website, it depends on how you structure it. If you have a separate server page for each page, then it's the same. If you use a front controller pattern, then you should use URL rewriting so that your URLs follow the logical structure of your site.
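For example, a minimal mod_rewrite sketch of such a setup; index.php and the path parameter are placeholder names for your front controller:

    # Send everything that is not a real file or directory to the front controller.
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)$ index.php?path=$1 [L,QSA]

The front controller then maps the path parameter to logical pages, so a URL like /products/shoes can exist without a physical folder.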
For the rest, there is no difference, as both static and dynamic pages just produce HTML, which is the content consumed by users and search engines, regardless of the technology employed.
Basically I agree with the argument that it does not matter for SEO whether a website is dynamic or static.
However, there are some caveats to consider.
URLs: you have to make sure all of the URLs are user-friendly.
Loading speed: dynamic websites are not necessarily slower than static ones, but you have to make sure your site loads as quickly as possible. FYI, Google recently stated openly that they will take loading speed into consideration.
If you get those two things right, then there is no big difference any more.
Static pages are the ancestors of web pages, so of course they are the best for SEO: Google's bots are smart, but their algorithm is better adapted to this kind of website, and the bots can check the code information very quickly. That's why static web pages are better for SEO.