Can Pelican be used on a shared hosting server? - pelican

We changed web hosting companies (from a VPS to a shared hosting server), and I have been told by the company that I cannot use Jekyll with them. This leaves me looking for a static blog generator option I could use.

If you use Pelican to generate your site on your local workstation, you should then be able to transfer the generated files to your shared hosting provider via SFTP, rsync, or whatever other file transfer options are available.
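For example, a minimal sketch of that workflow (the remote user, host, and public_html path are placeholders for whatever your provider gives you; it assumes the default output directory and pelicanconf.py from pelican-quickstart):

    # Generate the site locally into output/
    pelican content -o output -s pelicanconf.py

    # Push the generated files to the shared host over SSH
    rsync -avz --delete output/ user@yourhost.example:~/public_html/

If your host only offers SFTP, uploading the contents of output/ with any SFTP client works just as well.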

Justin already answered your actual question. I only wanted to add that, to my knowledge, the same is true for Jekyll (or any static site generator): You generate everything locally and just push it to the server for publication.

Related

Can I direct a cpanel folder to a domain name hosted with a different registrar?

I'm wondering if this is possible, and I'm not sure that it is. Before I explain, I found one other post, Separate Domain Registrar and Host, possible to use CDN?, which sounds similar to my problem, but I specifically don't want to point the name servers away. I only mention this because my question is going to sound very similar.
So I'm trying to help a friend who has a domain name registered on Site A (let's assume it's a place like Wix). He also has a hosting package on Site B (to have cPanel access for a site he had built with HTML & PHP).
For example:
1. www.yourdomain.com is hosted on Wix with a template website
2. "database" is a cpanel folder on Site B
Is there a way to have www.yourdomain.com/database link to the "database" folder on Site B's hosting -- without changing the Name Servers?
I don’t believe you can point a directory to a completely different server or hosting package.
However, creating a subdomain instead could be your answer.
Is there anything stopping you from using database.yourdomain.com, creating an A record for it at your domain registrar, and pointing it at the hosting package that holds the database folder?
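For illustration, the record you'd add at the registrar would look something like the zone-file sketch below (the IP address is a placeholder for Site B's server; most registrar panels present this as a form rather than a zone file):

    ; hypothetical A record for the subdomain, pointing at Site B's server
    database.yourdomain.com.    3600    IN    A    203.0.113.10

You would likely also need to add database.yourdomain.com as a subdomain (or addon domain) in Site B's cPanel so that it maps to the database folder.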

How to exclude subdomain directory from main website

I've set up a VPS with Apache2.
I am using cloudflare for DNS management.
Now, I have my website's files in the "/var/www/website" folder. Inside that, there is another folder for the forum, like "/var/www/website/forum", inside which there are all the forum-related files.
Now, suppose I have www.website.com pointing to "/var/www/website"
and I also have a subdomain forum.website.com pointing to "/var/www/website/forum".
What I want to do is make the files inside "/var/www/website/forum" accessible via the subdomain only. I don't want users to access the forum via www.website.com/forum; I want them to access it only via "forum.website.com".
What you need to do is set up what's called a virtual host. You would put your forum at /var/www/forum and your website at /var/www/website.
Inside /etc/apache2/sites-available, you'll need to add an additional configuration file for that site called forum.website.com.conf.
You'll then need to create a symbolic link to it in /etc/apache2/sites-enabled so that Apache sees it. From there, reload Apache and you're good to go.
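As a sketch, assuming the layout above, /etc/apache2/sites-available/forum.website.com.conf might look something like this (Apache 2.4 syntax on Debian/Ubuntu; the log file names are just examples):

    <VirtualHost *:80>
        ServerName forum.website.com
        DocumentRoot /var/www/forum

        <Directory /var/www/forum>
            AllowOverride All
            # Apache 2.4; on 2.2 use "Order allow,deny" and "Allow from all"
            Require all granted
        </Directory>

        ErrorLog ${APACHE_LOG_DIR}/forum_error.log
        CustomLog ${APACHE_LOG_DIR}/forum_access.log combined
    </VirtualHost>

On Debian/Ubuntu, running a2ensite forum.website.com.conf creates the sites-enabled symlink for you, and a reload (service apache2 reload) then picks up the new virtual host. Because the main site's document root no longer contains the forum directory, www.website.com/forum simply stops resolving to the forum files.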
Here's some documentation:
http://httpd.apache.org/docs/2.2/vhosts/
http://httpd.apache.org/docs/2.2/vhosts/examples.html
https://www.digitalocean.com/community/tutorials/how-to-set-up-apache-virtual-hosts-on-ubuntu-14-04-lts
This may be a bit different depending on the flavor of Linux, but it should be about the same. Control panels like cPanel, Plesk, and WebMan can make this process a bit easier by abstracting the configuration into a web control panel.
Hope this helps you.

Storing and retrieving files stored separately from codebase coldfusion

We currently have a site running ColdFusion 11. In an effort to improve some aspects of security, we would like to store all files uploaded by our users on a server separate from our codebase and DB servers.
I'm pretty much starting from scratch here, as I wasn't able to find much in my searches so far. What's the best practice for doing this, and what ColdFusion functions would work for storing and retrieving files from an external source?
I could use some more information to be more helpful, but let's say you have a separate server that stores all your user files on a Windows network. I would use CFContent to serve those files, with each file being retrieved over a UNC path.
I'd recommend reading this blog entry of mine on Securely Serving Files via CFContent. Wil, also from CF Webtools, posted one here: Serving File Downloads with ColdFusion.
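A rough sketch of that CFContent approach, with a hypothetical UNC path and hypothetical variables (storedFileName and originalFileName would come from your own database lookup, after you've checked the user's permissions):

    <!--- Hypothetical: resolve the stored name and original name from your database first --->
    <cfset filePath = "\\fileserver\uploads\" & storedFileName>

    <!--- Suggest a download filename to the browser --->
    <cfheader name="Content-Disposition" value="attachment; filename=#originalFileName#">

    <!--- Stream the file from the UNC path; it never has to live under the web root --->
    <cfcontent type="application/octet-stream" file="#filePath#" deletefile="no">

Note that the ColdFusion service account needs read access to the share for the UNC path to work.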
We had a similar issue when we migrated to a Unix platform. Our solution was to mount a file server on the web server. It's accessed programmatically by ColdFusion as if it were on the same server, but it's inaccessible from the web root (browser). It has worked very smoothly for us.
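For reference, a sketch of that kind of mount on a Linux web server (the file server hostname, export path, and mount point are placeholders; a CIFS/SMB share would work the same way with mount -t cifs):

    # Mount the file server's export somewhere outside the web root
    sudo mkdir -p /mnt/user_uploads
    sudo mount -t nfs fileserver.internal:/exports/uploads /mnt/user_uploads

    # Example /etc/fstab entry so the mount survives reboots:
    # fileserver.internal:/exports/uploads  /mnt/user_uploads  nfs  defaults  0  0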

How to access system properties from a Tomcat app deployed on CloudBees?

I want to run a Tomcat app on CloudBees. This app accesses some private and confidential properties from the file system. How can I access a file system on CloudBees? Please note that it should be highly protected, e.g. permissions 700 or similar.
Regards,
Marco
The RUN#Cloud platform doesn't provide a persistent (or distributed) filesystem, so you can't use it as the canonical store for those files. You need to use an external file store that matches your security requirements and copy the files into the java.io.tmpdir directory as the application starts (or lazy-load them). Since the files are stored on RUN#Cloud, there is no security issue: your server instance is fully isolated, and the files will be deleted after the application is undeployed/passivated.
So you can use Amazon S3 or a comparable service to store the files.
Another option is for you to attach properties to the RUN#Cloud instance as configuration parameters, and access them as System properties. See http://wiki.cloudbees.com/bin/view/RUN/Configuration+Parameters
If the data is modest in size, you could consider using properties. Using the CLI, you can set them with:
bees config:set propertyName=value
You can then access that as a System property (for example) in your application. The properties themselves are stored encrypted by CloudBees.
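For example, after bees config:set propertyName=value, reading it back in the application is just a standard System property lookup; this tiny standalone class illustrates the call (propertyName is whatever name you chose):

    public class ConfigExample {
        public static void main(String[] args) {
            // The parameter set via "bees config:set propertyName=value" is injected
            // into the JVM as a system property on RUN#Cloud.
            String value = System.getProperty("propertyName", "default-value");
            System.out.println("propertyName = " + value);
        }
    }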
I've actually moved to OpenShift since then, and I solved the problem. Thank you for your answers.

Building a service for a Drupal site to duplicate a node to another Drupal site in a multi-site setup

I'm trying to set up one of my Drupal sites to push a node to another Drupal site in a multi-site configuration. It looks like I need to do this with Services somehow, but I can't find any tutorials out there, and I at least need to be pointed in the right direction.
What I believe I need is to set up Services on the receiving site to accept a call from the sending site, which will send the node object via JSON or serialized PHP, using a key that was set up on the receiving site. Can anyone show me an example of this working or give me some insight?
thanks
Have you checked out the Deploy module on d.o (drupal.org)? It's a great tool for pushing changes (including nodes) from one installation to another. It uses the Services module for the communication.
I have not tested it with a multisite installation, but I guess it should work as long as the domain names are different for each site.
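If you go the Deploy route, getting the pieces in place with Drush might look roughly like this (a sketch for the Drupal 7 era modules; exact dependencies vary with the Deploy version, so check the project pages):

    # Download and enable the Deploy and Services modules
    drush dl deploy services
    drush en -y deploy services rest_server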
Regards
Mike