My app loads users' Facebook profile images, and for performance I store each picture's URL in the database. But when a user changes their profile picture, the stored URL must be updated too.
How can I check the response from an external server (e.g. detect a 404) so I know that I must refresh the stored picture URL for that user?
You could use rest-client: execute a HEAD request to the URL and check the response status.
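A minimal sketch of that check, using Ruby's stdlib Net::HTTP (rest-client's `RestClient.head` behaves the same way); the method name and the set of rescued errors are assumptions for illustration:

```ruby
require "net/http"
require "uri"

# HEAD-check a stored image URL. Returns false when the URL no longer
# resolves to a live image (404, dead host, refused connection, ...),
# which is the signal to refresh the cached URL from Facebook.
def image_url_valid?(url)
  uri = URI.parse(url)
  response = Net::HTTP.start(uri.host, uri.port,
                             use_ssl: uri.scheme == "https") do |http|
    http.head(uri.request_uri)
  end
  response.is_a?(Net::HTTPSuccess)
rescue SocketError, Net::OpenTimeout, Errno::ECONNREFUSED, Errno::ETIMEDOUT
  false
end
```

When this returns false, re-fetch the picture URL from the Graph API and update the user's row in the DB.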
I have a relatively large app with a lot of user profile pages. I want to make it so that if you share one of the user's profile pages, it will preview their name and picture on social media sites like FB and Twitter (think sharing a Twitch streamer's page on Twitter). I used create-react-app to start the project, so I don't have server-side rendering or any middleware for pre-rendering tools. Is there another way I can accomplish this?
There are two ways you can get this to work:
First, serve your files via an Express server and check who made the request by inspecting the user-agent header. If it's a bot, then instead of sending the usual response, fetch the required user profile data, use it to populate the Open Graph meta tags, and return the HTML with those tags.
The second way would be to use a network interceptor at the CDN you're using to identify who is requesting the page (a bot or a person); if it's a bot, make a request to your backend to fetch the related data and send back HTML with populated meta tags.
Explained approach
Every time a request comes into our server, it carries a user-agent header which tells the server who is requesting the resource (a human, or a bot from Facebook trying to build a link preview). We can simply compare it against a list of known bot user-agents (so it won't catch everything, but it will cover all the well-known platforms and most of the rest).
Let's say we have something.com where we want the link preview, and a request comes in for something.com/john. We check the incoming request's user-agent property: if it's a human, they are redirected to our normal site; but if it's a bot (which only wants HTML for the link preview), then, since it's our server, we can grab john's data, set the proper meta tags inside our HTML, and send that back as the response.
So whenever a human goes to something.com/john he is redirected to our landing page, since he cares about what he sees in his browser; but when a bot comes in, we send it an HTML response with proper meta tags, since the link preview is all the bot cares about.
This can be done on our Express server, but it can also be handled at the infrastructure level.
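The dispatch logic above can be sketched as follows (plain Ruby for illustration; an Express middleware would follow the same shape). The bot pattern list, the hostname, and the avatar URL scheme are all assumptions for the demo:

```ruby
# Known link-preview crawlers (an illustrative, non-exhaustive list).
BOT_UA = /facebookexternalhit|Twitterbot|LinkedInBot|Slackbot/i

# Build minimal HTML whose Open Graph tags describe the requested profile.
# The avatar URL pattern is a placeholder, not a real endpoint.
def preview_html(username)
  <<~HTML
    <html><head>
      <meta property="og:title" content="#{username}'s profile" />
      <meta property="og:image" content="https://something.com/avatars/#{username}.jpg" />
    </head><body></body></html>
  HTML
end

# Per-request decision: bots get pre-rendered meta tags,
# humans get the normal React bundle.
def handle_request(user_agent, path)
  username = path.split("/").last
  user_agent =~ BOT_UA ? preview_html(username) : :serve_react_bundle
end
```

In Express the same check would live in a middleware that inspects `req.headers['user-agent']` before the static-file handler runs.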
I'm trying to redirect unauthorized access to a log-in page; if the user then gets permission and continues to the originally requested page, I need to keep the original request URL. I tried writing the original URL into an HTTP header, but I cannot retrieve this data on the client.
Does Apache (or some other module) ignore custom HTTP headers, or am I just missing something?
(BTW: I don't want to use a query string; consider that the next page may itself be another redirect.)
code example:
ap_set_content_type(r, "text/html");
apr_table_add(r->headers_out, "Location", conf->authurl);
/* This header is silently dropped: when a handler returns a redirect or
 * error status, Apache discards custom entries in headers_out. */
apr_table_add(r->headers_out, "RequestUrl", url);
return HTTP_MOVED_TEMPORARILY;
/* The following works fine, because err_headers_out is preserved even on
 * error/redirect responses: */
apr_table_add(r->err_headers_out, "RequestUrl", url);
See:
https://source.jasig.org/cas-clients/mod_auth_cas/tags/mod_auth_cas-1.0.9.1/src/mod_auth_cas.c
I have the following requirements for a (Rails) web application that uses S3/CloudFront for image storage:
A user may only see an image if they are logged in. If the user sends an image URL to a friend, it will not work.
If a user has seen an image, it should be cached by their browser, so they don't have to download it again.
…
Requirement 1 can be solved with S3's Query String Authentication (QSA) (e.g. with a 30-second expiry).
Requirement 2 can be solved using HTTP caching.
Is it possible to use them both together?
The challenge I'm facing is that QSA effectively changes the URL of the image after expiry, even though a perfectly good copy may reside in the browser cache.
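To see why the URL keeps changing, here is a simplified sketch of S3's legacy (V2) query-string signing; the exact string-to-sign is abbreviated, and the bucket name and secret are placeholders. Because the expiry timestamp is covered by the HMAC, every new expiry yields a different Signature, and the browser treats each signed URL as a distinct cache entry. One possible mitigation (my suggestion, not part of the question) is to round the expiry to a fixed window so that repeated page loads within the window sign identical URLs:

```ruby
require "openssl"
require "base64"

# Simplified sketch of S3 V2 query-string signing: the signature covers
# the Expires value, so changing the expiry changes the whole URL.
def qsa_url(key, expires, bucket: "my-bucket", secret: "SECRET")
  string_to_sign = "GET\n\n\n#{expires}\n/#{bucket}/#{key}"
  signature = Base64.strict_encode64(
    OpenSSL::HMAC.digest("sha1", secret, string_to_sign)
  )
  "https://#{bucket}.s3.amazonaws.com/#{key}" \
    "?Expires=#{expires}&Signature=#{signature}"
end

# Possible mitigation: quantize the expiry so every request inside the
# same window produces the same signed URL (and thus a browser cache hit).
def windowed_expiry(now, window: 3600)
  ((now / window) + 2) * window
end
```

With `windowed_expiry`, URLs stay cacheable for up to one window at the cost of a coarser effective expiry, which trades off against the 30-second lifetime in requirement 1.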
I am trying to load test a webapp which has the following functionality:
1. Log in to the app (setting some cookie variables)
2. Search for a customer with some parameters
3. Get the details of a particular customer
4. Log out from the webapp
When I run JMeter I get status code 404.
Any reference or help will be appreciated.
After googling I found that 4xx status codes mean the client sent a bad request (404 specifically means the requested resource was not found).
To check what request was sent, I am using Fiddler to capture both the original request (which works from the browser) and the request sent by JMeter, and comparing the data in the Headers view under Fiddler's Inspectors tab. Is this the right way to debug a 4xx, and what else can I do to fix this issue?
Screenshots attached.
I think the HTTP Header Manager and HTTP Cookie Manager must be moved up, just before the Recording Controller. Otherwise the requests are made without this header or cookie information.
If you are testing against localhost, you should set the domain in the Cookie Manager config to localhost:8080.
You can see the request/response in the View Results Tree listener: just click the Request or Response data tab respectively. If you're getting a 404, chances are the Response data tab will contain the 404 with (hopefully) information about which values are invalid or missing.
I am making an app that takes a screenshot of a URL requested by the user. I want to make it as transparent as possible when sites that require a username and password are involved.
For instance, if a user wants to screenshot his iGoogle page, he will send the server the URL, but the screenshot will not match what he sees on his screen.
Is there any way to do this? I guess that in such cases I will have to request the screenshot from the user. Perhaps the user can even give me his cookie for that domain.
Any thoughts?
Thanks.
Yes, in most cases you'll need the user's cookies.
If the site uses regular cookies, you can create a bookmarklet that reads document.cookie. This will not work with httpOnly cookies, which are increasingly used for sessions.
Some sites tie sessions to a certain IP, in which case you can't take the screenshot without proxying the request through the user's computer.
If you can get the user to run a bookmarklet, an interesting trick would be to read the DOM and send it to your server:
// encodeURIComponent handles non-ASCII content correctly (escape does not)
image.src = 'http://example.com?source=' +
    encodeURIComponent(document.documentElement.innerHTML);
For HTTP authentication, the easiest solution would be to ask the user for a login/password.
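With those credentials in hand, the screenshot backend can attach them to its own request before rendering the page. A sketch using Ruby's stdlib Net::HTTP (the function name and the split between building and sending the request are my own choices for the example):

```ruby
require "net/http"
require "uri"

# Build a GET request carrying the user-supplied HTTP Basic credentials.
# basic_auth sets the Authorization: Basic ... header on the request.
def basic_auth_request(url, user, password)
  uri = URI.parse(url)
  request = Net::HTTP::Get.new(uri.request_uri)
  request.basic_auth(user, password)
  [uri, request]
end

# To actually fetch the page (requires network access):
#   uri, req = basic_auth_request("https://example.com/page", "alice", "s3cret")
#   Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
#     http.request(req)
#   end
```

Note that Basic auth sends the password with every request, so this should only run over HTTPS, and the credentials should be discarded once the screenshot is taken.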