How can website administrators turn off Cloudflare for me? - cloudflare

We’re an organization trying to archive Russian independent media, but some websites block our access via Cloudflare. Can we get permission from the media outlets to access their sites so that Cloudflare doesn’t block us?
Right now we’re archiving these websites with Selenium, but that approach is inconvenient for us.
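For context, our Selenium approach looks roughly like the sketch below (the target URL is a placeholder; it assumes the `selenium` package and a local chromedriver on PATH):

```python
import re
from pathlib import Path


def archive_path(url: str, out_dir: str = "archive") -> Path:
    """Map a URL to a filesystem-safe HTML file name under out_dir."""
    safe = re.sub(r"[^A-Za-z0-9._-]+", "_", url.split("://", 1)[-1]).strip("_")
    return Path(out_dir) / f"{safe}.html"


def archive_page(url: str, out_dir: str = "archive") -> Path:
    """Render the page in a real browser and save the resulting DOM.

    Requires `pip install selenium` plus a matching chromedriver.
    """
    from selenium import webdriver

    driver = webdriver.Chrome()
    try:
        driver.get(url)
        dest = archive_path(url, out_dir)
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_text(driver.page_source, encoding="utf-8")
        return dest
    finally:
        driver.quit()
```

Driving a real browser like this gets past some of Cloudflare's checks, but it is slow and brittle, which is why we'd prefer an allow-listing arrangement with the site owners.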

Cloudflare is configured on that media outlet's side; you can't do anything about it without getting permission from them (you could try contacting them by e-mail, for example).
Alternatively, you can always use a VPN to get access.

Related

How do I password protect a website with static and dynamic parts?

I have a website on my personal domain that I would like to password protect securely. This website has two parts. One is a static website I have created with the SSG Eleventy and copied into the web directory. The other is an installation of MediaWiki. My site is hosted on Dreamhost shared hosting and is currently password protected with Apache .htaccess and .htpasswd files for basic authentication. This is mostly fine, except for the fact that it isn’t easy to “log out”. I would like a way to password protect this site that gives me more control over logging in/out.
I want to be able to create my own login page that looks more like the rest of the site (instead of the default Apache login prompt). I would also like there to be an easy way to create new users and passwords, though I want the ability to create new users to be restricted to the admin account (my login). I want users to be able to log out whenever they want, and have the site properly protected from any logged-out users. I would like this password system to also protect the part of my site that is built with MediaWiki. This means that, if I navigate to a page within the wiki, I want it to redirect to the site-wide login page before allowing access to MediaWiki. Once the user logs in, they should have access to the whole site, including the wiki section, without being prompted to enter the password again. (I am aware MediaWiki has its own login page, but I want to be able to use a single password for accessing the entire site. Also, I am NOT looking to integrate MediaWiki’s internal login with this site-wide login.)
I do not intend this site to ever have that many users. Really just me and a few others I want to invite (via their own username and password). So I really just need a simple way to manage these user accounts and prevent access to others.
What is the best way to do all this? Is it even possible to do any of this for a static website? I know I probably have to use PHP to make this work, which I don’t know too well but am willing to learn. It also probably shouldn’t be too hard to create and manage a database to store the usernames and passwords if necessary. I would just like to know if all this is possible and what the easiest way to do it would be.
I am pretty new at web development and this is the first real website I have built and deployed. I would really appreciate any suggestions and help.
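The account-management core described above boils down to storing salted password hashes and checking a session flag on every request. A minimal sketch of the hashing part, in Python for illustration only (in PHP, `password_hash()`/`password_verify()` plus `$_SESSION` play the same roles; the in-memory `_users` dict stands in for a real file or database, and the iteration count is an illustrative choice):

```python
import hashlib
import hmac
import secrets

# One record per user: {username: (salt, pbkdf2_hash)}. An admin-only
# signup page would append to this store.
_users: dict[str, tuple[bytes, bytes]] = {}


def add_user(username: str, password: str) -> None:
    """Store a fresh random salt and the derived key, never the password."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    _users[username] = (salt, digest)


def check_login(username: str, password: str) -> bool:
    """Re-derive the key with the stored salt and compare in constant time."""
    record = _users.get(username)
    if record is None:
        return False
    salt, expected = record
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(digest, expected)
```

On success, the login handler would set a server-side session cookie; both the static pages (served through a small gatekeeper script) and MediaWiki (via a check included before it loads) can then test the same session.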

Can a Google Search Appliance that doesn't have a digital certificate scan sites that are SSL enabled?

HTTPS has been enabled on a site, and the Google Search Appliance now isn't able to crawl or access it. My tech team can't figure out what the issue is, and I haven't been able to find a definitive solution. Any help you can provide would be appreciated. Thx.
You should receive an error when using Real-time diagnostics, but the best way I've seen is to use the "Forms Authentication" setup page.
Configure the sample URL to point to your site's home page and set the pattern to be the root URL for your site.
When you click "create" an SSL link will try to be created and you'll likely receive an error back at this point.
My guess is that the certificate on the web site may not be 'perfect' (the GSA is less forgiving than browsers are).

Allowing read and write access to Google Drive files to unauthenticated clients

We have been working on a web service (http://www.genomespace.org/) that allows computational biology tools that implement our REST API to, among other things, read and write files stored on "The Cloud".
Our default file storage is a common Amazon S3 bucket, but we now allow users to mount their own private S3 bucket as well as files on Dropbox.
We are now trying to enable similar functionality for Google Drive and have run into some problems unique to Google Drive that we have not encountered with S3 or Dropbox.
The only way we have found to allow clients that are not Google-authenticated to read files unobtrusively is to make the files "Public". Our preference would be that once the user has authorized access to our application via OAuth2, the user's files could remain "Private" in Google Drive.
However, even though the user has already authorized our web service for offline access to their "Private" files, we have not found a way to generate a URL that a client authorized by our system can use to GET the file directly without the client also being logged into Google.
The closest we have come to this functionality has been to change the file permissions to "Anyone with Link", except that for files greater than 20MB Google insists on returning an intermediate web page warning that the file has not been scanned for viruses. In addition to having to mess with file permissions, this would break our existing clients. Only when the file is "Public" and we utilize URLs of the form https://googledrive.com/host/PARENT_FOLDER_ID/FILENAME can non-Google clients read the files without interference.
We have not found any way for clients that are not Google-authenticated to upload a file to Google Drive. Our API allows our authorized clients to PUT files directly to the backing file storage using URLs provided by our server. However, even if a folder is marked "Public", the client requires Google authentication credentials to save to Google Drive. We could deal with both of these issues with intermediate hops through our system (e.g., our web server would first download the file from Google Drive and then allow the client to GET it), but this would be woefully inefficient and, hopefully, unnecessary. These problems have been discussed multiple times before on Stack Overflow (e.g. here and here), and we have read the responses very carefully, but have not seen any recent discussion.
The Google folks direct their API users to post on stackoverflow for support, so I am hoping for a fresh look from insiders.
The general answer is: don't make the Drive requests through the user's browser. Instead, do everything from your servers. You are the one holding the (refresh) tokens for your users, so you should make all requests yourself, acting as a proxy between the user and Drive. The same goes for downloading: you download the file and return it to the user. As long as you use each user's own Drive token, there shouldn't be rate limit/quota issues.
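That proxy pattern can be sketched as follows. Here `fetch` is a hypothetical wrapper around an authenticated, ranged GET against the Drive API using the user's stored refresh token; it is injected so the relay logic stays independent of any particular Google client library:

```python
from typing import Callable, Iterator

# (file_id, offset, length) -> bytes, performed with the user's token.
Fetch = Callable[[str, int, int], bytes]


def proxy_stream(fetch: Fetch, file_id: str, size: int,
                 chunk: int = 1 << 20) -> Iterator[bytes]:
    """Relay a Drive file to our own client in chunks, server-side.

    The client only ever talks to our API; Google credentials never
    leave our servers. Streaming chunk by chunk avoids buffering the
    whole file in memory for the "intermediate hop".
    """
    offset = 0
    while offset < size:
        n = min(chunk, size - offset)
        yield fetch(file_id, offset, n)
        offset += n
```

Uploads work the same way in reverse: the client PUTs to our server, and our server forwards the bytes to Drive with the user's token.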

How to integrate apache and google site via proxy

My question is "How to integrate apache and google site via proxy"?
I found this tutorial, but it didn't work as I expected. It redirects to the Google site instead of keeping my domain in the address bar and changing only the content.
In my case, whenever people access http://mydomain.com, I want them to see the content from https://sites.google.com/site/mydomain/
Thanks!
I think it's not possible, because you can't access the database behind your Google Sites. The URL of your Google site can't change.
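For reference, the reverse-proxy part of such tutorials usually boils down to an Apache vhost like the sketch below (mydomain.com and the Sites path are placeholders; mod_proxy and mod_ssl must be enabled). Even with this in place, Google Sites emits absolute links and redirects to its own hostname, which matches the behavior described in the question:

```apache
<VirtualHost *:80>
    ServerName mydomain.com

    # Needed because the backend is HTTPS.
    SSLProxyEngine On
    ProxyPass        / https://sites.google.com/site/mydomain/
    ProxyPassReverse / https://sites.google.com/site/mydomain/
</VirtualHost>
```

ProxyPassReverse only rewrites Location response headers, not links inside the HTML, so the masking stays incomplete without additional content rewriting.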

How to test file.watch on localhost?

I am displaying the contents of a folder in my AngularJS front-end (with Rails back-end). I want to watch the folder for any changes, such as new file, deleted file.
I obviously want to test the app on my localhost before deploying to a server, but I am not able to add localhost as an allowed domain in the APIs Console.
How can I set up file.watch for testing?
Thanks
It's impossible to test push notifications without a verified domain; that's why we can't push confidential information to untrusted endpoints. I'd recommend that you buy/use a test domain or sub-domain for testing.
There's portly.co.
It works really nicely; you can verify the domain with Google Site Verification and add it as a push domain.
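Once a verified domain (or a forwarding service like the one above) is in place, the watch call itself is small. A sketch of building the channel body that would be passed to `files.watch` (e.g. `service.files().watch(fileId=..., body=...).execute()` with google-api-python-client); the callback URL below is a placeholder, and field names follow the Drive push-notification channel resource:

```python
import uuid


def watch_body(callback_url: str, ttl_seconds: int = 3600) -> dict:
    """Build the channel resource for a Drive files.watch request.

    callback_url must be an HTTPS endpoint on a domain verified in the
    APIs Console -- that verification requirement is exactly why a plain
    localhost address cannot receive the notifications.
    """
    return {
        "id": str(uuid.uuid4()),   # unique channel id chosen by us
        "type": "web_hook",        # Drive delivers changes as webhook POSTs
        "address": callback_url,   # where Google POSTs the notifications
        "params": {"ttl": str(ttl_seconds)},  # requested channel lifetime
    }
```

For local testing, the verified domain then forwards (tunnels) each notification POST down to the development machine.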