Setup Content Server - shared-hosting

This is more of a strategy question than a 'help with code' question.
I have a content server and need to supply content to various shared-hosting servers, and I would love to know how you would set this up. We would like the content to be cached on those servers so that it appears static (and to reduce the load on the content server).
Any help would be appreciated. The basic question is how you would deliver the content to the shared-hosting servers and then handle it there.

The usual way is to use rsync to keep the "slave" servers in sync with the "master" server. The slaves are then configured to simply serve the local files. rsync will make sure that files are copied efficiently and correctly (including timestamps and permissions). It will also make sure that clients don't see partial files, it will return useful error codes, and it handles many more issues that only someone who has been doing this for decades would think of.
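To make that concrete, here is a minimal sketch of the idea in Python, wrapping rsync with subprocess. The host names and paths are placeholders; in practice you would run something like this from cron on the master.

```python
#!/usr/bin/env python3
"""Minimal sketch: push content from the master to each slave with rsync."""
import subprocess

SOURCE = "/var/www/content/"  # trailing slash: sync the directory contents
SLAVES = ["web1.example.com", "web2.example.com"]  # placeholder slave hosts

for host in SLAVES:
    # -a preserves timestamps and permissions, --delete removes stale files;
    # rsync writes to temporary files and renames them into place, so clients
    # never see partial files.
    result = subprocess.run(
        ["rsync", "-az", "--delete", SOURCE, f"{host}:/var/www/content/"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print(f"rsync to {host} failed ({result.returncode}): {result.stderr}")
```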

Related

What's Elasticsearch, and is it safe to delete logstash?

I have an internal Apache server for testing purposes; it is not client facing.
I wanted to upgrade the server to Apache 2.4, but there is no space left, so I was trying to delete some files on the server.
After checking file sizes, I found that the folder /var/lib/elasticsearch takes 80 GB of space. For example, /var/lib/elasticsearch/elasticsearch/nodes/0/indices/logstash-2015.12.08 already takes 60 GB. I'm not sure what Elasticsearch is. Is it safe if I delete this logstash index? Thanks!
Elasticsearch is a search engine, like a NoSQL database, and it stores its data in indices. What you are seeing is the data of one index.
Probably someone was using the index around 2015, when the index was timestamped.
I would just delete it.
I'm afraid that only you can answer that question. One use for Logstash + Elasticsearch is to help make sense of system logs. That combination isn't normally set up by default, so I presume someone set it up at some time for some reason, and it has obviously done some logging. Only you can know whether it is still being used, or whether it is safe to delete.
As other answers pointed out, Elasticsearch is a distributed search engine, and I believe an earlier user was pushing application or system logs into this Elasticsearch instance using Logstash. If you can find the source application, check whether the original log files are still there; if so, you can go ahead and delete your index. I highly doubt anyone still needs logs from 2015, but it is really your call to check what your application's archiving requirements are and then take the necessary action.
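If you do decide to remove it, it is usually cleaner to delete the index through Elasticsearch's REST API than to remove files under /var/lib/elasticsearch by hand. A small sketch, assuming Elasticsearch is listening on its default port 9200 and that the index name matches the directory seen on disk:

```python
#!/usr/bin/env python3
"""Sketch: delete an old logstash index via the Elasticsearch REST API."""
import requests

index = "logstash-2015.12.08"
resp = requests.delete(f"http://localhost:9200/{index}")
print(resp.status_code, resp.text)

# List the remaining indices (and their sizes) before deleting anything else.
print(requests.get("http://localhost:9200/_cat/indices?v").text)
```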

Send very large file (>> 2 GB) via browser

I have a task to do. I need to build a WCF service that allows a client to import a file into a database using the server backend. To do this, I need to send the server the settings, the events needed to start and configure the import and, most importantly, the file to import. The problem is that these files can be extremely large (much bigger than 2 GB), so it's not possible to send them via the browser as they are. The only thing that comes to my mind is to split these files and send the pieces one by one to the server.
I also have another requirement: I need to be 100% sure that these files are not corrupted, so I also need to implement some sort of policy for error detection and possibly recovery.
Do you know if there is an API or DLL that can help me achieve my goals, or is it better to write the code myself? And in that case, what would be the optimal size of the packets?
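The question is about WCF, but the usual approach is language-agnostic: split the file into fixed-size chunks, send each chunk with its own checksum, and let the server verify each piece and ask for re-sends. Here is a hedged client-side sketch in Python; the endpoint URL, header names and chunk size are made up for illustration, not part of any existing API.

```python
#!/usr/bin/env python3
"""Client-side sketch of a chunked upload with per-chunk checksums."""
import hashlib
import requests

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB is a common compromise; tune for your link

def upload(path: str, url: str) -> None:
    with open(path, "rb") as f:
        index = 0
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            resp = requests.post(
                url,
                data=chunk,
                headers={"X-Chunk-Index": str(index), "X-Chunk-Sha256": digest},
            )
            resp.raise_for_status()  # a real client would retry failed chunks
            index += 1

upload("huge_import.dat", "https://example.com/upload")  # hypothetical endpoint
```

The server recomputes the hash of each received chunk, stores the chunk only if it matches, and appends the pieces in order once all of them have arrived.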

Possibilities of Datazen server migration

I know that similar topics have already been raised, but maybe there is some recent news or ideas?
I want to migrate Datazen users/sources/dashboards etc. to another server (the production one) in a smooth way. I was trying to do that via backup/restore, but then I couldn't access the control panel on the target server. I received an error:
401 unauthorized access.
Maybe I should change something in logs/config files on the destination server?
Any ideas? I would be grateful for any help!
I don't think there is a way to do this out of the box. However, the files are quite simple XML, so they can be pointed at a different server if you know PowerShell (and can work out the correct values from the server).
You will have to re-point the GUID, ServerGUID and ServerURI within the sources.xml file and then re-zip it (as .datazen). Provided you have your hubs set up the same way, Datazen will then believe the file belongs to your prod environment and you will be able to publish.
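The answer mentions PowerShell, but the same unzip / edit / re-zip step can be sketched in Python. The element names (GUID, ServerGUID, ServerURI) come from the answer above; the exact XML layout inside the package and the file names are assumptions, so inspect your own export before relying on this.

```python
#!/usr/bin/env python3
"""Sketch: unpack a .datazen package, rewrite the server references in
sources.xml, and zip it back up."""
import shutil
import xml.etree.ElementTree as ET
import zipfile
from pathlib import Path

package = Path("export.datazen")   # hypothetical export file
workdir = Path("datazen_unpacked")

with zipfile.ZipFile(package) as zf:
    zf.extractall(workdir)

# Rewrite the server references; the target values are placeholders for your
# production hub.
tree = ET.parse(workdir / "sources.xml")
new_values = {
    "ServerURI": "https://prod-datazen.example.com",
    "ServerGUID": "00000000-0000-0000-0000-000000000000",
}
for tag, value in new_values.items():
    for element in tree.getroot().iter(tag):
        element.text = value
tree.write(workdir / "sources.xml")

# Re-zip; make_archive produces export_prod.zip, which we rename to .datazen.
archive = shutil.make_archive("export_prod", "zip", workdir)
Path(archive).rename("export_prod.datazen")
```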

Can I use an API such as Chef to automatically create, name and set passwords for multiple servers?

I am new to this so forgive me for not understanding the lingo.
I have been using the Rackspace cloud control panel to build multiple virtual servers; I use them for maybe a couple of hours and then I delete them. I need these servers to all have specific and unique names such as: "server1, server2, server3, etc." I also need them to have a specific password, unlike the randomly generated password that is assigned by default.
I have been creating each individual server manually (based on an image that's already set up), and then I have to go back, reset the password and reboot all of them. Doing each one manually is a bit time consuming and I'm sure there is an easier way. Please help me figure this out.
I've been doing some searching, but I haven't found anything too relevant to my problem; on top of that, I'm not too familiar with programming and such.
Basically, what I'm looking to do is automatically create these servers with their appropriate names and passwords already built in from the start. I'm not sure if some sort of "API" is the answer, or if there's some sort of script that can be written, or both.
Any assistance is much appreciated.
Thanks,
Chris
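Chef is normally used to configure servers after they exist; the create/name/password step described here is usually done through the provider's own API. As one hedged sketch of that idea, the pyrax library (Rackspace's Python SDK for its OpenStack-based cloud) can create servers in a loop. The credentials, image/flavor IDs and password below are placeholders, and you should check the pyrax documentation for the exact call signatures before using this.

```python
#!/usr/bin/env python3
"""Sketch: create several named Rackspace cloud servers from a saved image
and set a known password, instead of clicking through the control panel."""
import pyrax

pyrax.set_setting("identity_type", "rackspace")
pyrax.set_credentials("my_username", "my_api_key")  # placeholder credentials
cs = pyrax.cloudservers

IMAGE_ID = "your-saved-image-id"   # the image the servers are built from
FLAVOR_ID = "general1-1"           # placeholder flavor
PASSWORD = "S0me-Kn0wn-Passw0rd"   # the password you want on every box

for i in range(1, 4):              # server1, server2, server3
    server = cs.servers.create(f"server{i}", IMAGE_ID, FLAVOR_ID)
    # Wait for the build to finish, then set the password you actually want.
    server = pyrax.utils.wait_until(server, "status", ["ACTIVE", "ERROR"],
                                    interval=20)
    if server.status == "ACTIVE":
        server.change_password(PASSWORD)
        print(f"server{i} ready")
    else:
        print(f"server{i} failed to build")
```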

How can I access and manipulate an .mdb file available online (on the web) using VB

I have an .mdb file hosted on my site: http://www.simplyfy.co.in/db/dbfile.mdb. I am developing an application which will be running on multiple machines and will contact the .mdb file via the internet. I am not sure how to go about building the connection string for an online connection. Any help?
You do not want to expose a .MDB file directly over the internet - not at all, not even a little bit. You really, really do not want to do this.
There are two reasons, and I'll start with the second: even if it works (and since it needs to be able to create an .ldb file if it's not read-only, I'm not sure it will), it is liable to be horribly slow. Multi-user MDB access can be bad enough over a local network.
The other reason is security: assuming it works at all, you're going to really struggle to make this even vaguely safe.
Broadly speaking, what you need to do is create a web service that runs on your site and provides a secured API that your client applications can use to access your database. This gives you two benefits: 1) it's much more secure (you're not exposing webspace with write permissions), and 2) it gives you the ability to change the back-end data store if required without affecting the clients. There are various possibilities for implementing this, but it will depend on the tools you have or are comfortable with.
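To illustrate the shape of that web-service approach, here is a minimal sketch in Python using Flask and pyodbc. The original question is about VB, but the idea is the same in any stack: the .mdb file stays outside the webroot on the server, and clients call an HTTP endpoint instead of opening the file directly. The table and column names are made up, and the Access ODBC driver used here is only available on Windows hosts.

```python
#!/usr/bin/env python3
"""Minimal sketch of a web service sitting in front of the Access database."""
import pyodbc
from flask import Flask, jsonify

app = Flask(__name__)
CONN_STR = (
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\dbfile.mdb;"   # local path on the server, NOT under the webroot
)

@app.route("/customers")
def customers():
    # Hypothetical table and columns; replace with your own schema.
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.execute("SELECT ID, Name FROM Customers").fetchall()
    return jsonify([{"id": r.ID, "name": r.Name} for r in rows])

if __name__ == "__main__":
    app.run()
```

The VB client would then call the HTTP endpoint and never needs a database connection string at all.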
I think it is possible to access it the same way you access a local file, simply by using the URL as the Data Source. That is, the connection string looks something like:
Provider=Microsoft.Jet.OLEDB.4.0;User ID=...;Data Source=http://www.simplyfy.co.in/db/dbfile.mdb;Mode=..., etc
HTH