My question is: is there any performance difference between
1) 10 parked domains on one account, with .htaccess and PHP handling deciding what to load, and
2) 10 separate accounts?
Users get the same thing on the front end. I am currently using one account with all the domains parked, but I wonder if that is slowing down my performance?
There is no performance difference that I know of.
With separate accounts, cPanel basically creates different vhost entries to map each domain name to its respective website.
.htaccess mapping is pretty much the same thing as vhosts.
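For context, a minimal sketch of the kind of .htaccess/PHP handling the question describes: a single front controller that dispatches on the Host header (the domain names and directory layout here are hypothetical):

    <?php
    // index.php — single entry point shared by all parked domains.
    // Every parked domain hits the same document root; route by Host header.
    $host = strtolower($_SERVER['HTTP_HOST'] ?? '');
    $host = preg_replace('/^www\./', '', $host); // treat www and bare domain alike

    // Map each parked domain to its own content directory (names are examples).
    $sites = [
        'example-one.com' => __DIR__ . '/sites/example-one',
        'example-two.com' => __DIR__ . '/sites/example-two',
    ];

    $docroot = $sites[$host] ?? __DIR__ . '/sites/default';
    require $docroot . '/index.php';

The dispatch is one array lookup per request, which is why routing in PHP versus a per-account vhost makes no measurable difference on its own.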
There is, however, an exception:
If your hosting company uses CloudLinux OS, or anything else that isolates each tenant in their own environment, then each account is only allotted a portion of the server's CPU. That means the more sites you have in one account, the slower they get (provided the sites are all actively handling requests and responses).
By the way, I think web hosting questions like this are perfectly valid here. Programmers have different needs too :)
I have a MediaWiki (1.32.0) running locally via WAMP on my Windows PC. I want to make the wiki privately available online, protected by a username/password that allows people to see it.
So basically I have two big problems:
1) I've never hosted a wiki before, but I have hosted other, less complicated sites (such as my old personal website on HostGator) - but these sites never required a "back end" to serve content
2) I've never created a password-blocked website. I'm thinking we'll just have one username/password combo, because we'll only allow 5-20 editors max on the Wiki; there will never be more than 1000 visitors simultaneously (and that's very, very generously high)
Any advice on either of these issues would be much, much appreciated.
Miraheze free wiki hosting will take care of this easily.
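If you decide to self-host instead, MediaWiki can be locked down to logged-in users from LocalSettings.php. A minimal sketch using MediaWiki's standard permission settings (the whitelist entries shown are the usual login pages):

    <?php
    // In LocalSettings.php — make the wiki private.
    $wgGroupPermissions['*']['read'] = false;          // anonymous users cannot read
    $wgGroupPermissions['*']['edit'] = false;          // anonymous users cannot edit
    $wgGroupPermissions['*']['createaccount'] = false; // no self-registration

    // Keep the login pages reachable so users can authenticate at all.
    $wgWhitelistRead = ['Special:UserLogin', 'Special:PasswordReset'];

A single shared username/password would work here, but individual accounts are no extra effort and let you revoke one person's access without resetting everyone.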
I'm planning to run a private MediaWiki server on Debian (SELinux) for all my important home documents.
I would like to be able to securely access it from the outside with a laptop, tablet, or even a live-CD like LPS. It seems to me I would have the smallest attack surface if I only exposed SSH to the internet and tunneled in, maybe even incorporating a port knock to prevent casual detection. I will be serving content to a known and essentially unchanging set of users. Bandwidth efficiency isn't really a factor, as concurrent connections would be rare.
Is there a more secure way to access a web server? It seems the government really likes to use smart cards although I'm not sure how. What about client side browser certificates? Yubikey?
The safest solution is probably using a virtual private network so that the server cannot be contacted at all except through an SSH-like protocol. A decent router should support this; you can get more help over at SuperUser.
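For reference, the tunnel the asker describes can be this simple (the user, host, and ports below are placeholders); only SSH itself is exposed to the internet:

    # Forward local port 8080 to the wiki's HTTP port over SSH.
    # wiki.example.org is a stand-in for your server.
    ssh -L 8080:localhost:80 user@wiki.example.org

    # While the tunnel is up, browse to http://localhost:8080/ on the client.

A VPN generalizes the same idea: the web server never listens on a public interface, so the attack surface is the tunnel daemon alone.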
Calling all SEO experts!
The web company that I work for is going to begin systematically launching a bunch of small 1-5 page, content-focused websites for the purpose of driving traffic to the web shop.
The websites will look, and work, identically to an affiliate website that feeds products. All product images and such are fed from XML and text files only.
We have been trying to decide how to host the smaller sites. Our resident SEO "expert" claims that we ought to a) host the websites externally, and b) host in every country that we have a web shop in. So our co.uk domains ought to have hosting in the UK and our .de domains ought to have hosting in Germany.
This is creating an ENORMOUS logistics problem for administration.
The question I have is: is it really necessary to host these small content sites externally and do we need to spread them all over God's green Europe?
Thanks!
/Brian
P.S. No, I don't trust the competence of our resident SEO "expert"....
From my experience, the country a domain is hosted in, as compared to its TLD, makes no difference. The exception might be page load times for your users, which are a factor in Google rankings, but having the server "nearby" should suffice; it doesn't need to be in a particular country as long as the round trip between your users and the server is acceptable. If your customers are in Germany it might not make sense to host in the USA, but the UK would likely be fine (assuming it is a good host with a good internet connection, etc.).
What matters far more in terms of SEO is having a good, clean site with relevant information that loads quickly and is linked to by other good, clean sites with relevant information.
Does anyone know where I could find a list of safe-for-work (i.e. no porn, piracy sites, etc) domain names that I can use to stress test software that performs asynchronous DNS lookups without raising questions if my network admin happens to be watching?
At least several thousand would be ideal. Most lists I've found have not been filtered at all. So far, using "raw" lists for DNS queries has not raised any questions, but my next step is to create TCP connections.
EDIT: I've cleared everything with local network admin people, however, this would still be nice to have for future developers on the project.
I think you probably worry too much. Having said that, how about doing a Google search for 'interesting facts about butterflies', parsing all the resulting domains, and using those?
Your network admin will probably be more concerned with the fact that you're stress testing a network service on his network on the order of thousands of domains. If you have any kind of decent corporate firewall it's inspecting DNS queries and could choke on a high rate of queries. If your requirement is a legitimate business requirement the best option is to have your boss talk to the head of the network department to CYA.
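If you would rather not scrape search results, here is a minimal PHP sketch of the lookup side, using a hypothetical hand-vetted list (substitute your own file of several thousand known-safe domains; note dns_get_record() is synchronous, so the truly asynchronous lookups the question describes would need to fan this out across workers):

    <?php
    // resolve_list.php — exercise the resolver against a vetted SFW list.
    // The entries below are placeholders; load your real list from a file.
    $domains = ['example.com', 'example.org', 'example.net'];

    foreach ($domains as $domain) {
        $records = dns_get_record($domain, DNS_A); // A-record lookup
        $ips = $records ? array_column($records, 'ip') : [];
        printf("%s => %s\n", $domain, $ips ? implode(', ', $ips) : 'no A record');
    }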
Other than style, is there any significant difference between the two, perhaps in something such as SEO?
And style-wise, is there a "standard"/good practice about it, or is it just up to personal preference?
No "best practice" applies.
The decision is more often dictated by administrative considerations than anything else.
Some considerations might be:
Subdomain:
- you can host it on an entirely different machine to the primary site
- it might make integration with the primary site more difficult (cookies, authentication, database servers, etc.)
- the "blog" DNS record is the first point of control

Subdirectory:
- the blogging software ideally uses the same technology as the primary site (e.g. PHP)
- the blogging software necessarily uses the same platform as the primary site (e.g. Linux)
- the webserver is the first point of control
As far as I'm aware it makes negligible difference to SEO.
The difference it makes to SEO is a long debated moving target. As of 2015 there is a surge of evidence and opinion toward subdirectories.
Subdomains make it easier if you want to swap just the blog over to another server (since you can change the DNS for the subdomain but keep the main portion of the domain pointing to the original machine), but they can also make AJAX requests and cookies behave differently due to subdomains being seen as "different domains" in some cases.
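As a concrete illustration of the cookie point, a minimal PHP (7.3+) sketch with a placeholder domain: by default a cookie set on blog.example.com is not sent to www.example.com unless you widen its Domain attribute.

    <?php
    // Must run before any output. Hypothetical session cookie shared
    // across blog.example.com, www.example.com, etc.
    $sessionId = bin2hex(random_bytes(16));
    setcookie('session_id', $sessionId, [
        'domain'   => '.example.com', // sent to all subdomains of example.com
        'path'     => '/',
        'secure'   => true,
        'httponly' => true,
    ]);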
blog.domain.com is interpreted as a website all on its own, whereas domain.com/blog is seen as a sub-page or sub-directory of domain.com, depending on how your blog is set up. I believe Google Analytics even has an option to verify and track sub-domains, which segments them as a separate site.
Sites like DIYNetwork.com and About.com utilize sub-domains because, even though a sub-domain is still dependent on the parent URL, it allows the opportunity to house an entirely different and independent website with ease of tracking in analytics.