Executable in URL - CGI

While I was planning a trip on a railroad website, I noticed the website was rendered by a .exe ( http://reiseauskunft.bahn.de/bin/query.exe ).
My question: is this a CGI program, and if so, isn't that terribly inefficient?
I mean, every query launches a new process. Imagine a train gets cancelled: the server would be DDoS'd in no time, no?

You can put anything into a URL structure using URL rewriting, so this could really mean anything. Maybe the planners of the portal preferred .exe to signify an action; maybe indeed an executable is called via CGI. Chances are it's a custom-built web server anyway.
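If it is rewriting, a minimal .htaccess sketch (assuming Apache with mod_rewrite; the target script name is made up, not bahn.de's actual setup) shows how easily a .exe-looking URL can be mapped onto anything:

    # Hypothetical: serve an ordinary PHP script for a URL that
    # looks like a compiled CGI executable.
    RewriteEngine On
    RewriteRule ^bin/query\.exe$ /query.php [L]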
You'd have to ask the people who built bahn.de's traffic information for a definitive reply - as far as I know, it's these guys.

Related

Blocking web spider

I want to block all web spiders from vacuuming up my web site.
Is there a way?
I only found some Apache rules from 2008, like this one:
http://perishablepress.com/ultimate-htaccess-blacklist/
Unfortunately, there isn't a way to block ALL spiders from accessing your site: if a human can view it, nothing prevents someone from writing a spider that behaves like a person and can therefore view every page.
You can look up techniques to prevent certain robots from accessing your site (this works with the majority of search engines), but if the site stays up long enough and gets some visits, it is likely to end up in some crawler's database eventually.
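For the cooperative crawlers, the standard mechanism is a robots.txt file at the root of the site; here is a minimal example that asks every robot to stay out entirely (well-behaved spiders honour it, rogue ones simply ignore it):

    User-agent: *
    Disallow: /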
Have a look here.

Dynamic URL shortening script for text input

We are looking for a script which automatically detects a URL as you type it into a text input, and shortens it before you press "submit". The shortening service used is http://yourls.org/
Have you tried implementing one yourself? Deploy the shortener to your own web site (it's written in PHP, as far as I can see from a cursory glance at the web site) and provide a simple Ajax endpoint which will dynamically perform a shortening conversion, then implement calls to that from the main page using JavaScript.
You might want to impose a reasonable delay to allow the user to finish typing, to avoid performing lots of unnecessary conversions of bogus URLs (which may require, e.g. writes to a file or database - I haven't looked at how the library referenced does things).
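As a rough sketch of that delay idea, assuming a hypothetical /shorten endpoint on your own server that wraps YOURLS and returns the short URL as plain text (the endpoint name, response format and input id below are made up):

    // Hypothetical: debounce keystrokes, detect a URL in the input,
    // and ask our own server-side endpoint to shorten it.
    var input = document.getElementById('message');
    var timer = null;
    input.onkeyup = function () {
      clearTimeout(timer);                 // restart the delay on every keystroke
      timer = setTimeout(function () {
        var match = input.value.match(/https?:\/\/\S+/);  // naive URL detection
        if (!match) return;
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/shorten?url=' + encodeURIComponent(match[0]));
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            input.value = input.value.replace(match[0], xhr.responseText);
          }
        };
        xhr.send();
      }, 800);                             // wait ~0.8s after typing stops
    };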
I'm not sure what you're trying to achieve; if you create a new shortened URL for each substring before the user has finished typing the full URL, you will just bloat your database.
I don't see how shortening a URL before it's finished makes sense.
If you want to relieve the user of the arduous task of clicking the submit button, then initiate the submit using JavaScript (jQuery, or something). I'm not sure if that's what you want to do.
http://monkeytooth.net/2010/12/htaccess-php-how-to-wordpress-slugs/
This is a simple means of implementing the concept; it's a lot easier than one would think. Querying a DB, or using some other means of matching the slug/id against what is found in the URL, wouldn't be too hard either. The linked article doesn't really go in depth on what to do next, but catching the URL and breaking it apart is the essential process that makes it work. I have personally used the method on several sites and it works like a charm for me and the sites it was used on.
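In case it helps, the catch-and-dispatch step usually boils down to a couple of rewrite rules. A hedged sketch, assuming Apache with mod_rewrite and a front controller named index.php (adjust the names to your setup):

    # Hypothetical: anything that isn't a real file or directory is
    # treated as a slug and handed to index.php for a DB lookup.
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^([a-zA-Z0-9-]+)/?$ index.php?slug=$1 [L,QSA]

index.php would then read $_GET['slug'] and match it against the database.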

Refresh browser via cron (or not) to a different page on remote request?

I need to display pages in a tutorial fashion. I looked into NetSupport, BeamYourScreen and other possibilities, but I do not want the viewers to download anything. I cannot use GD / send screenshots due to audio / video instructions embedded in some of the pages.
Basically, I need the ability to "refresh" a user's browser window to a different page via an interface on my end, whether via a form submission, JavaScript or any other type of "controller" that allows me to change the page in the viewer's browser. Perl preferred, but PHP / JavaScript / whatever works and is cross-browser. I set up a simple JavaScript page-forward timer that "works", but page load times and conversation interruptions are a huge factor.
The entire tutorial website will be developed around this ability.
I was looking into curl / cron / wget methods but found little information.
I have seen forum and chat scripts that basically perform a similar task, but there must be a simple(ish) solution in lieu of hacking up another script to suit my needs.
I do not want others to control the pages either. The site really only needs to be accessible during the tutorial; however, it could remain web accessible as long as user interaction stayed normal (unless being controlled).
The initial site concept is based on instructing people how to properly introduce new pets into a home. It will be operated by a veterinarian who saved my pet's life. I wanted to give something back.
Possible? I really appreciate simple examples etc...
You have no way other than to keep polling the server for "instructions" using JavaScript. No, you can't push anything to the end user's browser; neither curl nor wget will do it.
Mainly, you'll have to set up a simple request/response protocol between the browser and the server.
If you want to go deeper, you can use something like cometd/meteord/etc. If not, a hidden iframe that reloads itself and receives pages with JavaScript code for the needed actions can do the trick.
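A minimal client-side polling sketch, assuming a hypothetical command.txt on the server that the controller overwrites with the path the viewers should be on (the file name and format are made up):

    // Hypothetical: every few seconds, ask the server which page the
    // instructor wants us on, and navigate there if it has changed.
    var current = window.location.pathname;
    setInterval(function () {
      var xhr = new XMLHttpRequest();
      // cache-buster so the browser never serves a stale copy
      xhr.open('GET', '/command.txt?t=' + new Date().getTime());
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          var target = xhr.responseText.replace(/\s+$/, '');
          if (target && target !== current) {
            window.location = target;  // follow the controller
          }
        }
      };
      xhr.send();
    }, 3000);  // poll every 3 seconds

Since navigating reloads the page, each tutorial page would include this script, and the controller just writes a new path into command.txt.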
Another alternative.
With JavaScript, do polling against a single-character flatfile. Have a simple one-character flatfile holding a single variable, and write the server side in Perl (it is faster and uses fewer resources than PHP). The parent page reads that variable from the flatfile: it hits the flatfile and goes wherever the variable sends it. The flatfile is written to by the controller. Done.
I guess you could also rename an empty flatfile and use that as the controller. I am unsure which is faster: opening and reading a specific file, or hitting the directory and returning the file name. On the controller side, it's opening and writing to a file vs. renaming a file. Maybe they cancel each other out in resources and time?
This way the site can act as a normal site. When you want remote users to see a "presentation" (automatically being shown the site pages at the controller's pace), the controller activates polling and tells the viewers to push a start button. This allows a remote instructor to load pages for the viewers at his leisure.
It is a simple solution that works with nothing really sophisticated going on. No frames are needed either; you just need JavaScript enabled.
Any better suggestions are welcome!
It occurred to me that what you might want is HTML push technology. Check out the wiki; they have several links. I have never used it myself.

Web site migration and differences in firebug time profiles

I have a PHP web site under Apache (at enginehosting.com). I rewrote it in ASP.NET MVC and installed it at discountasp.net. I am comparing response times with Firebug.
Here is the old time profile:
Here is the new one:
Basically, I get longer response times with the new site (not obvious in the pictures I posted here, but on average yes, sometimes with a big difference, like 2s for the old site and 9s for the new one) and images appear more progressively (as opposed to almost instantly with the old site). Moreover, the time profile is completely different. As you can see in the second picture, a long time is spent in DNS lookup, and this happens for images only (the raw HTML is even faster on the new site). I thought that once a host name had been resolved, the result would be reused for all subsequent requests...
Also note that since I still want to keep my domain pointed at the old location while I'm testing, my new site is under a weird URL like myname.web436.discountasp.net. Could that be the reason? If not, what else?
If this is more a serverfault question, feel free to move it.
Thanks
Unfortunately you're comparing apples and oranges here. The test results shown are of little use because you're trying to compare the performance of applications written with different technologies AND hosted on different companies' shared platforms.
We could speculate any number of reasons why there may be a difference:
ASP.NET MVC first hit and lag due to warmup and compilation
The server that you're hosting on at DiscountASP may be under heavy load
The server at EngineHosting may be under-utilised
The bandwidth available at DiscountASP may be under contention
You perhaps need to profile and optimise your code
...and so on.
But until you benchmark both applications on the same machine, you're not making a proper scientific comparison and are just clutching at straws.
Finally, ignore the myname.web436.discountasp.net URL; that's just a host name/header that DiscountASP and many other hosts add so you can test your site while you're waiting for a domain to be transferred/registered, or for DNS propagation of the real domain name to complete. You usually can't use your site's IP address because most shared hosts share a single IP address across multiple sites on the same server and use HTTP Host headers.

Automate adding entries to a wiki

Once I have my renamed files I need to add them to my project's wiki page. This is a fairly repetitive manual task, so I guess I could script it but I don't know where to start.
The process is:
Go to the appropriate page on the wiki
for each team member (DeveloperA, DeveloperB, DeveloperC)
{
for each of two files ('*_current.jpg', '*_lastweek.jpg')
{
Select 'Attach' link on page
Select the 'manage' link next to the file to be updated
Click 'Browse' button
Browse to the relevant file (which has the same name as the previous version)
Click 'Upload file' button
}
}
Not necessarily looking for the full solution as I'd like to give it a go myself.
Where to begin? What language could I use to do this and how difficult would it be?
Check if the wiki you mean to talk to supports XMLRPC, because if it does it should be a snap. I wrote a tool called WikiUp to solve a similar problem (updating a delineated section on a wiki page).
If you're writing in C#, the WebClient classes might be a good place to start. I bet people could give more specific advice if you mentioned which wiki platform you are using, and whether it requires authentication, though.
I'd probably start by downloading Fiddler and watching the HTTP requests as you perform the task manually. Then you could use some simple scripts and regexes to build your HTTP requests for automating the process.
Of course, if you're wildly lucky, your wiki will have a backend simple enough that you could just plug entries into its db directly. :)
You might find CoScripter useful -- it's a Firefox extension that allows you to automate tasks you perform on websites. I'm not certain how you'd integrate this with the list of files you're changing on your local system, but it can certainly handle the file uploading through a web form.
A better bet is probably using cURL or a similar HTTP library with your programming language of choice. If you're on *nix, you can use the cURL command-line program inside your shell script to get this done fairly easily. (As #jsight said, you will need to analyze the actual forms you're using on the webpage, using Fiddler or just looking at the form elements, and re-create the POST through cURL.)
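For instance, once Fiddler has shown you the form's action URL and field names, the whole loop from the question might reduce to something like this (a bash sketch; the URL, field name and credentials are placeholders, not your wiki's real upload API):

    # Hypothetical: re-create the wiki's multipart upload POST for each file.
    for f in Developer{A,B,C}_{current,lastweek}.jpg; do
      curl -u user:password -F "file=@$f" \
           "http://wiki.example.com/attach?page=TeamStatus"
    done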