How to exclude certain portions of my website from bots/web crawlers? [closed] - indexing

Closed. This question is not about programming or software development. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 5 months ago.
I have built custom segments/blocks around my website that I use for advertising/marketing. Google's search bots treat those blocks as part of my content and get confused about what my site is really about versus what is advertisement.
This negatively impacts my SEO. Is there a way to use certain directives or elements to tell Google and other bots to avoid crawling that portion, or, even if it is crawled, not to count it as part of the page?
I am aware of the robots.txt file, but that works at the page/URL level. I would like to block certain blocks within each page, such as a sidebar or a floating bar.

There's no way to ensure all bots skip parts of a page; indexing is largely an all-or-nothing affair at the page level.
You could, however, use a robots.txt file with
Disallow: /iframes/
and then load the content you don't want indexed into iframes served from that path.
There's also the `data-nosnippet` attribute, which tells Google not to use that section of the page in search snippets.
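A minimal sketch combining both techniques — the `/iframes/ad.html` path is an assumed example matching the `Disallow: /iframes/` rule above, not a real file:

```html
<!-- Ad content lives at a path that robots.txt disallows
     ("Disallow: /iframes/"), so compliant crawlers won't fetch it. -->
<iframe src="/iframes/ad.html" title="Advertisement"></iframe>

<!-- data-nosnippet keeps this block out of Google's search snippets,
     though the page itself is still crawled and indexed. -->
<aside data-nosnippet>
  Special offer! 20% off all widgets this week.
</aside>
```

Note that `data-nosnippet` only affects how the page is displayed in results; it does not stop the content from being crawled.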

Related

What sort of URL structure should be used to display AMP HTML vs vanilla HTML [closed]

Closed 7 years ago.
There's an existing wordpress plugin that creates AMP formats automatically by adding /amp onto the end of any posts URL. I'm worried about duplicating my content at multiple URLs and wondering if adding some parameter like ?v=amp would be better? Also, if a parameter is used to render the page via AMP, how do we let Google know about these pages? Can we submit a separate AMP sitemap?
The AMP standard uses a canonical link on the AMP page to define the original source, and a `link rel="amphtml"` on the canonical page to point to the AMP version. Google understands both, so no action is needed on your part as long as your AMP pages pass validation.
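Sketched with placeholder example.com URLs, the pairing looks like this:

```html
<!-- On the canonical page, e.g. https://example.com/post/ -->
<link rel="amphtml" href="https://example.com/post/amp/">

<!-- On the AMP page, e.g. https://example.com/post/amp/ -->
<link rel="canonical" href="https://example.com/post/">
```

Because the AMP page declares its canonical source, Google treats the pair as one document rather than duplicate content, so a separate AMP sitemap is unnecessary.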

How to ensure search engines index specific version of site [closed]

Closed 9 years ago.
I am just starting work on my first responsive site and am realizing that some SEO elements that fit on the front page of the full-size site may not fit 'naturally' on the front page of the mobile version.
Is there any way to ensure search engines see the full-size site?
One complicating matter is that I am designing the site 'mobile first', so the site does not default to full size; it defaults to mobile sizes.
Assuming you deliver the same content to the end user regardless of device, and just show/hide or reformat based upon a media query, it really doesn’t matter. Google will still get the full content of the page so will index all of your content. What is visible in the viewport isn’t really significant to Google.
Google will, however, understand the use of media queries and give you some additional SEO benefits as a result. Google favours responsive design over separate sites/mobile specific pages. Responsive design also helps improve the indexing efficiency of the Googlebot.
One thing they do advise is not to block access to any of your 'external' resources (CSS, JS, images, etc.).
Plenty of good information here
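A minimal mobile-first sketch of the point above: all content is present in the HTML regardless of viewport, and the media query only changes presentation, so crawlers index everything either way. The class names are illustrative.

```html
<style>
  /* Mobile-first: the sidebar is hidden by default... */
  .sidebar { display: none; }

  /* ...and shown on wider screens. The markup is always delivered,
     so crawlers see the full content on every device. */
  @media (min-width: 768px) {
    .sidebar { display: block; }
  }
</style>

<main>Primary content, delivered to every device.</main>
<aside class="sidebar">Secondary content that only fits on desktop.</aside>
```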

Search engine page creating [closed]

Closed 9 years ago.
I noted that Google uses web page content when indexing pages for SEO purposes. So what I did was create web pages with a lot of keywords on them, then set those keywords to the background color so they are not shown to users.
My question is: will search engines block pages like this?
Search Engine Optimization (SEO) is something you really need an expert for these days. The days when some keywords and meta-data were enough are long gone, so you need to keep up to date with current SEO techniques to move your site up the Google rankings. You can also check the Alexa rankings for your website.
Take a look at the SEO guidelines from Google here.
Take a look at some pointers here and here, but you really need to invest some time and research into best practices.
You should also make your site as accessible as possible; this will make it easier to spider. There are some tools here to look at, and there's a site here you can use.

Is serving a bot-friendly page to google-bot likely to adversely affect SEO? [closed]

Closed 9 years ago.
I have a site whose homepage contains a great deal of JavaScript. I am conscious that this isn't great for mobile clients, JavaScript-less browsers, and crawlers/bots. The page uses proper <noscript /> alternatives, alt attributes, etc.
The user-agent could easily be sniffed to serve up the page content without JavaScript (a non-JavaScript version of the content already exists on the site), but I don't want to be seen to be cheating by crawlers (google-bot).
Humans using mobile clients and JavaScript-less browsers would surely appreciate a tailored version (given an option to switch back to the full version if they want). Bots, however, might think they're being cheated.
Finally, the site has been indexed very well so far, so I am tempted not to tailor it for google-bot, just for humans using mobile clients and JavaScript-less browsers. That seems like the safer option.
If you serve different content to search engines than you do to your users, you are cloaking, and that is definitely a violation of Google's terms of service.
The proper way to handle content generated with JavaScript is progressive enhancement. This means that all of your content is available without JavaScript being required to fetch or display it; you then enhance that content using JavaScript. This way everyone has access to the same content, but users with JavaScript get a better experience. That is good usability and good for SEO.
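A rough sketch of progressive enhancement — the gallery markup and paths are hypothetical, but the pattern is the point: the content works as plain links, and JavaScript only upgrades the experience when it runs.

```html
<!-- The full content is in the HTML, so crawlers and
     JavaScript-less browsers get everything. -->
<section id="gallery">
  <a href="/photos/1.jpg">Photo 1</a>
  <a href="/photos/2.jpg">Photo 2</a>
</section>

<script>
  // Enhancement layer: if JavaScript runs, clicking a link swaps it
  // for an inline image. Without JavaScript, the links still work.
  document.querySelectorAll('#gallery a').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      var img = new Image();
      img.src = link.href;
      link.replaceWith(img);
    });
  });
</script>
```

Because both bots and users receive the same underlying markup, there is no user-agent sniffing and therefore no cloaking risk.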

SEO : things to consider\implement for your website's content [closed]

Closed 11 years ago.
Let's say I have a website that I am developing. The site may have wallpapers, questions & answers, and info (e.g. IMDb, Wikipedia, etc.).
What do I need to do so that when a search engine analyzes a particular page of my website for some term, say 'XYZ', it finds the 'XYZ' content if it is present on that page?
Please pardon my non-techy jargon; I am new to this.
The most important tips in SEO revolve around what not to do:
Keep Java and Flash to a minimum, as web crawlers can't parse them. JavaScript can accomplish the vast majority of Flash-like animations, but it's generally best to avoid such animations altogether.
Avoid using images to replace text or headings, and remember that any text in images won't be parsed. If necessary, there are SEO-friendly ways of replacing text with images, but any time you have text that is not visible to the user, you risk the crawler thinking you're trying to cheat the system.
Don't try to be too clever. The best way to optimize your search results is to have quality content that engages your audience. Be wary of anyone who claims they can improve your results artificially; Google is usually smarter than they are.
Search engines (like Google) usually use the content of <h1> tags to determine what your page is about, and gauge how relevant your page is to that content partly by the number of sites that link to your page.
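A small sketch of crawler-friendly markup along these lines — the wallpaper page content is an invented example, not from the question:

```html
<!-- Real text in headings and body, descriptive alt text on images,
     and no text hidden inside graphics: everything here is parseable. -->
<article>
  <h1>XYZ Wallpapers</h1>
  <p>Free XYZ wallpapers in several resolutions, updated weekly.</p>
  <img src="/img/xyz-preview.jpg" alt="Preview of the XYZ wallpaper">
</article>
```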