Dynamic image for Facebook Open Graph og:image meta tag

I am trying to share an image from my website on Facebook. The image can be dynamic, but the other meta tags will remain the same.
Is there a way I can have dynamic data in the og:image tag, or will I have to go with the other option of the FB Post APIs?

Yes and no. Facebook scrapes your site once and caches the metadata that it finds, unless you specifically use the Sharing Debugger (or the Graph API scrape endpoint) to force the scraper to crawl your site again. The cache does generally expire (perhaps after 1-2 days?), so when the page is requested again outside of this cache period, Facebook will crawl the site again.
You can have a dynamically generated og:image meta tag, but it will only be read the one time (per cache period), and only that instance of the image will be saved.
For example, if user A shares your page, and your page returns imageA.png in the og:image tag, then that is the image that will be associated with your page's metadata.
If user B then shares the same page within the same cache period, Facebook will forgo the metadata scraping and assume that imageA.png is still a valid og:image.

Take a look at this Gist (reproduced below).
You should ping Facebook to re-scrape your page every time you update your og:image tag:
import requests

def share_facebook_fanpage(link, msg, PAGE_ID, OAUTH_ACCESS_TOKEN, apiversion='v2.8'):
    BASE_URL = "https://graph.facebook.com/%s" % apiversion
    POST_URL = "%s/%s/feed" % (BASE_URL, PAGE_ID)
    # Force Facebook to scrape the link first, so the thumbnail is up to date.
    f = requests.post(BASE_URL, data={
        'id': link,
        'scrape': True,
        'access_token': OAUTH_ACCESS_TOKEN
    })
    # Share the link on the page feed.
    r = requests.post(POST_URL, data={'access_token': OAUTH_ACCESS_TOKEN, 'link': link, 'message': msg})

Related

How do I make an internal link that is in the FAQs on a site using Vue 3 and Inertia.js?

I have an array of content coming from a database that will be displayed on a page as a group of FAQs. Some of the content will have links to other internal pages on the site. How do I link to the pages using Inertia's link component so that a full page refresh doesn't happen?
It depends on what is returned after following the link. If the server returns a full HTML view, the page is reloaded. If it returns a small JSON payload (which is what an Inertia visit made through its Link component or router requests), the client can process it and swap the page component without a full reload. See the sketch below for one way to handle links that live inside database-driven content.
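A minimal sketch of that approach (not the asker's code), assuming Inertia v1's @inertiajs/vue3 package; since the FAQ answers arrive as HTML strings from the database, the internal anchors inside them are intercepted and handed to Inertia's router instead of rendering a Link component for each one:
// faq-links.js - intercept clicks on internal anchors inside the FAQ markup
// and let Inertia's router make the visit, so no full page refresh happens.
import { router } from '@inertiajs/vue3'

export function handleFaqClick(event) {
  const anchor = event.target.closest('a')
  if (!anchor) return

  const url = new URL(anchor.href, window.location.origin)
  // Only hijack same-origin links; external links keep their default behaviour.
  if (url.origin !== window.location.origin) return

  event.preventDefault()
  router.visit(url.pathname + url.search) // Inertia visit: swaps the page component
}

// Illustrative usage in the FAQ component template:
// <div @click="handleFaqClick" v-html="faq.answer"></div>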

Force SPA route navigation to return 200

I published my website (an SPA made with Vue) to GitHub Pages. This website uses "history mode", so the # does not appear when navigating to a different "page".
When direct URL navigation happens (the user types website.com/downloads, for example) or a refresh happens while not on the root page, the website tries to display 404.html.
When the 404.html loads, it redirects to the homepage, passing the route name taken from the URL:
<script>
const segment = 1;
//Gets the relative path and the hash of the URL.
sessionStorage.redirect = location.pathname.split('/').slice(0, 1 + segment).join('/');
sessionStorage.hash = location.hash;
//Forces the navigation back to the main page, since it's the only entry point that works.
location.replace('/' + location.pathname.slice(1).split('/').slice(segment).join('/'));
</script>
For the user, it is a bit noticeable, but it will display the correct page.
However, while loading, a 404 is reported in the network tab, which could cause issues in integrations with other websites.
Is there any way to fake a 200 response when loading these pages?
This is a typical issue with Single Page Applications that use history mode (history.pushState) to simulate full URLs so that the page isn't reloaded when the URL changes.
Since Vue.js is an SPA framework, there is only one HTML file, with a single tag containing the "app" id. Because of this, Google's bots cannot read the content of a particular landing page, and your website might not get higher rankings. To make the content readable to crawlers, you can use two methods: pre-rendering and server-side rendering.
Also, try using router links (e.g. <router-link>) inside your <li> and <a> tags and buttons instead of plain href="/path" attributes. Router links make page navigation very fast and benefit the SEO of your website as well.
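If you keep the 404.html redirect shown in the question, the entry page usually needs a counterpart script that restores the saved route before the router takes over; a minimal sketch (based on the common spa-github-pages approach), assuming the sessionStorage keys set in the question's script:
<script>
  // In index.html, before the SPA bundle loads: restore the path saved by 404.html.
  (function () {
    var redirect = sessionStorage.redirect;
    var hash = sessionStorage.hash || '';
    delete sessionStorage.redirect;
    delete sessionStorage.hash;
    if (redirect && redirect !== location.pathname) {
      // Rewrite the address bar without another request; the Vue router then
      // resolves the original route client-side.
      history.replaceState(null, '', redirect + hash);
    }
  })();
</script>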

Vue.js SPA: change meta tag content from a third-party API

I've tried to change the meta tag content like this:
document.title = response.data.seo_page_title[0].text;
document.head.querySelector('meta[name=description]').content = response.data.seo_description;
This changes the content when I inspect the page, but not when I use View Source. So Google, FB and Twitter can't recognize the updated content and don't load the proper text when I try to share the page on social media.
View Source shows the original source that has been fetched from the network - it is not the current snapshot of the page. As you have noticed - the current snapshot of the page is available in the DevTools. If you need Google / FB / etc. to see the proper text - you will have to produce this text on the server side (when composing the HTML template) instead of in the browser.
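A minimal sketch of that server-side approach (not the asker's setup): a hypothetical Node/Express server that fetches the SEO fields and writes them into the HTML template before sending it, so crawlers see the final text without executing JavaScript. The placeholder names and the fetchSeoData helper are illustrative:
// server.js - hypothetical; assumes dist/index.html contains the literal
// placeholders __PAGE_TITLE__ and __PAGE_DESCRIPTION__ (Node 18+ for global fetch).
const express = require('express')
const fs = require('fs')

const app = express()
const template = fs.readFileSync('dist/index.html', 'utf8')

// Illustrative wrapper around the third-party API from the question.
async function fetchSeoData(path) {
  const res = await fetch('https://api.example.com/seo?path=' + encodeURIComponent(path))
  return res.json()
}

app.get('*', async (req, res) => {
  const seo = await fetchSeoData(req.path)
  const html = template
    .replace('__PAGE_TITLE__', seo.seo_page_title[0].text)
    .replace('__PAGE_DESCRIPTION__', seo.seo_description)
  res.send(html)
})

app.listen(3000)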

Google Search Result with indexed file and map

When we google some business, the result is displayed with all the pages indexed (sitelinks) and a map. What changes do we need to make to our website to be displayed in such a way?
For example:
https://www.google.com.sg/search?q=emhealth
For the sitelinks displayed below the home page, Google detects them automatically. Just focus on your SEO and Google will add them by itself.
For a map, you need to register on Google Business: https://www.google.com/business/
Google will index it once you have completely finished. When you do local listings or local directory submissions, your site will show up in the SERPs with the map.

Is there a way in Tumblr to get media image URLs on the same domain as the blog?

Let's say my blog is http://foo.tumblr.com.
All the post images are stored on xx.media.tumblr... (for example: https://24.media.tumblr.com/tumblr_kzjlfiTnfe1qz4rgho1_250.jpg) (the first 2 numbers can be skipped).
But I want the URL of the image to be on the same domain as my blog, looking something like this:
http://foo.tumblr.com/tumblr_kzjlfiTnfe1qz4rgho1_250.jpg
(that doesn't exist)
Why do I need that? I am creating a script that generates a canvas and detects whether an image has transparency using getImageData (all the .jpg files are skipped), but since the subdomain is different, I get a cross-domain security error: the canvas is tainted, preventing the use of getImageData.
So... how can I do that?
I think the Tumblr API could be useful, but how?
Scrape your sitemap for all posts and get their images. You could use the API, or do it with JavaScript in the browser console:
// Paste the sitemap XML when prompted, e.g. from view-source:biochemistri.es/sitemap1.xml
const xmlin = prompt();
const parser = new DOMParser();
const xmlDoc = parser.parseFromString(xmlin, "text/xml");
// Drop the first <loc> entry (the blog root), keep the post URLs.
xmlDoc.querySelectorAll('loc')[0].remove();
const posts = xmlDoc.querySelectorAll('loc');
const postlist = [];
for (let i = 0; i < posts.length; i++) { postlist.push(posts[i].innerHTML); }
...to generate an array containing all post URLs, which can then be navigated through to find photo posts (div.post.photo) and copy their image URLs.
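As a rough sketch (not part of the original answer), the post list can then be walked in the browser console to collect the image URLs from photo posts, assuming the theme marks them with div.post.photo:
// Run in the console on the blog itself (same origin), after building postlist above.
// Top-level await works in modern browser consoles.
const myPhotoList = [];
for (const postUrl of postlist) {
  const html = await (await fetch(postUrl)).text();
  const doc = new DOMParser().parseFromString(html, 'text/html');
  doc.querySelectorAll('div.post.photo img').forEach(function (img) {
    myPhotoList.push(img.src);
  });
}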
Then simply generate a new list of images with a for loop and newImg = document.createElement('img'), setting an origin attribute with newImg.setAttribute('origin', myPhotoList[n]), which can then be used to select an image programmatically:
document.querySelector("img[origin='" + {PhotoURL-HighRes} + "']")
(or {PhotoURL-1280}, {PhotoURL-500}, {PhotoURL-250}, etc.). Once retrieved over an XMLHttpRequest, you could simply switch the post in the DOM. The {PhotoURL-HighRes} in my example above wouldn't work as written; it's a theme variable from the page, and I'm just indicating which part you'd want to get from the theme HTML.
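A rough sketch of that loop (again illustrative, not from the original answer), using the myPhotoList array gathered above or any other array of image URLs:
// Create copies of the images on the custom page, tagged with their original
// URL so they can be selected later via the attribute selector above.
for (let n = 0; n < myPhotoList.length; n++) {
  const newImg = document.createElement('img');
  newImg.src = myPhotoList[n];
  newImg.setAttribute('origin', myPhotoList[n]);
  document.body.appendChild(newImg);
}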
As explained in this post, there is a variable which could be used as a more concise attribute than the full origin URL if you want to be a bit more specific with regular expressions.
This would effectively put all of your images onto your local URL, with URLs like foo.tumblr.com/images/tumblr_kzjlfiTnfe1qz4rgho1_250.jpg, and avoid the cross-domain restrictions. I'm guessing it'd only work if you don't have a ton of posts, as custom pages (such as the one you'd be using to store the images) do have a restriction on their size (though I suppose you could make a second one).
It might also be sensible to include CSS setting display: none in case anyone stumbles upon the page by accident, and a redirect function to the homepage with window.onload or similar.