Extracting a large zip file onto a server while the PC is turned off - cPanel

I've got a zip file of 1.6 GB and it takes forever to extract it on the server. I left it running all night and when I woke up it still wasn't finished. There is no way to keep track of how much time is left or what percentage is done, so I'm not sure the whole thing is working properly. Is there a way to extract that file using the File Manager in cPanel so that it keeps running while my PC is off, and maybe notifies me by email when it's done? I basically need to copy a webshop from the live server to the developers' server and I'm just losing too much time on this. So if anyone has a better idea of how to extract it, please feel free to suggest it.
P.S. Deleting the files that did get extracted takes forever too.
P.P.S. I'm a Linux/system admin.

If it's all about copying files from one server to another - why not just use rsync and avoid archiving?
I mean, if extraction is a pain - remove it from the equation :)
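For example, a minimal rsync run over SSH might look like this (the account names, hostname and paths are just placeholders):

    # pull the webshop straight from the live server onto the dev server
    # -a preserves permissions and timestamps, -z compresses in transit,
    # --progress shows per-file progress so you can tell it is still working
    rsync -az --progress -e ssh user@live-server:/home/liveacct/public_html/ /home/devacct/public_html/

Because rsync only transfers files that differ, you can also safely re-run it if the connection drops halfway.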

It is not a good idea to use the cPanel File Manager for this task, as the server will probably kill the extraction process if it takes too long.
The best way to go about this is via SSH, logged in as root. If you need to switch off your computer, run it inside screen.
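A rough sketch of that (the archive name and target directory are only examples):

    screen -S extract                                   # start a named screen session
    unzip -q webshop.zip -d /home/devacct/public_html/  # -q suppresses the per-file output
    # press Ctrl-A then D to detach; the extraction keeps running and you can power off your PC
    screen -r extract                                   # reattach later to see whether it finished

If you want an email when it is done, you could append something like && echo done | mail -s "unzip finished" you@example.com after the unzip command, assuming the server has a working mail command.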

You can also use unzipper.php, which you can get from GitHub.
It requires you to upload your zip file and unzipper.php as well, then open www.yourdomain.com/unzipper.php in your browser.

Related

SQL (Find & Replace) Entire Database

I am using phpMyAdmin from the SiteGround cPanel.
Story: I had Cloudflare set up for a PHP platform, then realised it was causing issues so I removed it. The issue I'm left with is that half of my site is still running off (https://www.example.com).
What I have done so far: in the config files of my script I have already set it so that it runs through https alone.
What I want to achieve: I noticed in the database that there are some fields still going through the www. I want to execute a command that will automatically find anything with my old domain (https://www.example.com) and replace it with (https://example.com). The fields are not all in a single column/table, they are all over the place, so a find & replace across the whole database should fix the issue.
I would appreciate any help. Since it is the database, I don't want to try out random things from different websites providing their feedback. I was recommended to use this website for assistance (if possible).
Thank you in advance.
Probably the most straightforward and quickest way is to simply take a dump of the entire database, open the SQL dump file in a text editor, do a text replace from [old url] to [new url], and then import the dump file back into the database. This should work just fine and avoids the headache of uncertainty and risk of running a write operation over the entire database's tables via some db query.
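In shell terms that could look roughly like this (the database name and user are placeholders; note that if the platform stores PHP-serialized data, replacing URLs of different lengths inside it can break the serialization, so spot-check afterwards):

    mysqldump -u dbuser -p shop_db > dump.sql                              # export everything
    sed -i 's|https://www\.example\.com|https://example.com|g' dump.sql    # swap the old URL for the new one
    mysql -u dbuser -p shop_db < dump.sql                                  # import it back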

A persistent simple data storage for Node.JS app implementation?

I'm planning to launch a simple Node.JS utility and push it to Heroku. A fire-and-forget solution that will probably sleep about 90% of the time. Unfortunately it seems that I require persistent data storage for my purposes (Heroku apps get rebooted daily and storing everything in RAM is unrealistic), and I don't know which way to look:
Most SQL hostings are paid / limited-time free / require constant refreshing (like freemysqlhosting).
Storing stuff in plain .txt format is seemingly hard to implement; besides, git always overwrites the contents of a tracked .txt file, and leaving it untracked disposes of it on Heroku and leads to an ENOENT "no such file" error. Yeah, I tried.
So the question is - how do I implement a simple, built-in solution for storing data? Are there any typical solutions for this? It's going to be the equivalent of just one SQL table.
As you can see, you can answer this on many levels - maybe suggest a free deploy-and-forget SQL hosting (it obviously has to support external connections), maybe tell me how to keep a file tracked in git without actually replacing all of its content with every commit, maybe suggest some module to install. I hope this is not too broad.

Zipping and moving SQL backups - Powershell

I am having issues getting a script of mine to work and need some assistance.
The goal of the script is to take SQL backups, zip them using 7-Zip, then move them to a NAS. After the move, the original backup is deleted, but not before confirming the name and size.
You can find my script here.
If there's a better way to achieve what I'm trying to do or an already built script I could be pointed to, that would be great!

Is there an automated way to push all my javascript/css/images to S3 every time I do a website push?

So I am in the process of moving all the thumbnails of my major sites to S3, and now I am thinking about how I can consistently push the CSS/JS/images that power the actual sites to it as well. It's easy enough to upload everything the first time, but I am trying to think of a way to automate the process every time I push out to production.
Does anyone have any clever ways of doing this?
I used to use s3sync to compare and update the assets just before uploading the site files, using a bash script to iterate through my files.
This works well, but when the number of files to compare gets big (let's say thousands) the process starts being really slow. If you have a small architecture (in terms of assets) this will do the trick.
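As an illustration, a minimal version of that sync step using the AWS CLI's aws s3 sync (the bucket name and paths are made up) would be:

    #!/bin/bash
    # upload only the assets that changed since the last sync, before deploying the site code
    # --delete also removes objects that no longer exist locally; drop it if you want to keep old files
    aws s3 sync ./public/assets s3://my-site-assets/assets --delete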
To make this better I would recommend Capistrano or some other deployment assistant, so that you can run it all at once:
upload assets
deploy your files
On the other hand, you could take a look at CloudFront (Amazon's CDN) and set it up with your site as the origin. This way you don't need to worry about uploading the files to S3, since they will be pulled automatically on demand. The downside of this approach is caching: if you need to update a file and keep the same name, you have to expire (invalidate) the object. You can do this in CloudFront, but you will need a script to do the task.
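For that expiry task, the script can be as small as a single invalidation call (the distribution ID and paths below are placeholders):

    # tell CloudFront to drop its cached copies so the next request pulls fresh files from the origin
    aws cloudfront create-invalidation --distribution-id E1ABCDEF234567 --paths "/assets/app.css" "/assets/app.js"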
Depending on the traffic (and other factors, of course), one path or the other will fit best.

How to compare test website and live website

We have our production server running our website. Then we have a test server which has exactly the same data, but with code changes for some new functionality. This web app has over 500 pages.
Is there any program that can
Log in to the test site
Crawl through each page and then save the page as HTML
Compare it with the same page saved from the live site?
This way we can make sure that the new features we add to our test site will not break the live site when the code updates are applied to production.
I am currently trying to use the WinHTTrack website copier and then comparing the test and live folders with a code comparison tool like Beyond Compare. This works OK, but a lot of files show up as changed simply because of the domain name differences.
Looking forward to ideas / solutions for this problem.
Regards
Have you looked at using Watir for this? It's not exactly what you are looking for, but it might allow you more granularity in your tests and ensure the site is functionally identical, rather than getting caught up on changing GUIDs, timestamps and all the other things that tend to change across any significantly sized website from day to day as part of its standard functionality.
Apparently you can't make consistent, reproducible builds in your project, can you? I would recommend moving towards that in the long run; it will save you a lot of headaches. That way you would know exactly what was deployed to which server and when, so there would be no more need to bend over backwards to get the deployed sources back like this...
I know this is not a direct solution to your problem... but maybe it is worth weighing whether you would save more in the long run by investing the effort into your build process now, instead of implementing this workaround (and then improving your build process anyway - because one day you will almost surely need to).
wget has a --convert-links option, and there are also some options for preserving cookies that might let you do it while logged in: http://drupal.org/node/118759#comment-664498
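A rough sketch of that wget approach (the URLs, cookie file and login form field names are assumptions, and the login step only works for a simple form-based login):

    # log in once and save the session cookie (the form field names are site-specific)
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data 'user=me&pass=secret' https://test.example.com/login -O /dev/null

    # mirror the site with that session, rewriting links so the local copy browses offline
    wget --mirror --convert-links --page-requisites \
         --load-cookies cookies.txt -P test-copy https://test.example.com/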
Use an offline downloader to download all files to your computer from both sources, then compare the folder contents using a free tool like Total Commander.
EDIT
Load both of your copies into a version control system and compare them there.
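Whichever way the two copies are obtained, the domain-name noise mentioned in the question can be stripped out before diffing. A minimal sketch, assuming the copies live in ./live-copy and ./test-copy and the domains are as made up below:

    # rewrite the test domain to match the live one so only real content differences remain
    grep -rl 'test.example.com' ./test-copy | xargs sed -i 's/test\.example\.com/www.example.com/g'

    # recursive unified diff of the two trees
    diff -ru ./live-copy ./test-copy > site.diff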