My company has given me a PDF that's already been mastered, packaged, etc., but of course the one thing they didn't do is add a linking hotspot over the 2" square ad space they've set aside for themselves. I have a hard time imagining people click links in PDFs with any regularity, but I can't find any data to back that up. Are there any benefits for search? As of now, the only place this linked version will exist is on the site that the ad itself links to.
Thanks.
I haven't seen anything definitive that says whether search engines parse PDFs for links or whether those links help SEO. But Google does read and index PDFs (and some Flash), so those links are probably being seen. If I were to speculate, I would say they do have some SEO value, and the usual rules would apply: anchor text, link popularity (probably attributed to the PDF document itself), and so on.
Most people seem to regard PDF as "Printable Document File" rather than "Portable Document Format". If the document has the website address on it, that is sufficient.
I have added custom segments/blocks around my website that I use for advertising/marketing. Google's search bots treat those blocks as part of my site's content and get confused about what my site is really about versus what is just advertising.
This negatively impacts my SEO. Is there a way to use certain directives or elements to tell Google and other bots not to crawl those portions, or, even if they are crawled, not to treat them as part of the page?
I am aware of the robots.txt file, but that works on whole URLs. I would like to block certain blocks within each page, such as a sidebar or a floating bar.
There's no way to ensure that all bots skip parts of a page; indexing is pretty much an all-or-nothing thing per URL.
You could use a robots.txt file with
Disallow: /iframes/
and then load the content you don't want indexed into iframes served from that path.
There's also the data-nosnippet HTML attribute, which keeps the marked content out of Google's search snippets.
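A minimal sketch of both approaches is below; the path /iframes/ad-sidebar.html and the markup are hypothetical, and data-nosnippet only affects what Google shows in snippets rather than what gets crawled or indexed.
<!-- Main content stays crawlable and indexable as usual -->
<main>
  <h1>What this site is really about</h1>
  <p>Primary content that should be crawled and indexed normally.</p>
</main>
<!-- Option 1: load the ad block from a path that robots.txt disallows -->
<iframe src="/iframes/ad-sidebar.html" title="Advertising"></iframe>
<!-- Option 2: keep the marketing copy out of Google's search snippets -->
<div data-nosnippet>
  Sponsored links and marketing copy go here.
</div>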
I am a founder of a tech summer camp program. My website has a page full of web-development resources meant for camp participants, and it has been getting lots of traffic from people searching for html colors, css cheat sheet, and other similar terms.
My question is: will traffic from these terms hurt my SEO for queries like summer camps, tech camps halifax, or other more closely related queries? Or is any traffic good for my SEO?
Note: We have no problem with people accessing these resources, so I haven't bothered to password protect it or add robots.txt or anything. The site is compcamp.ca and the resource page I mentioned is compcamp.ca/web-development-design-resources/
Google ranks compcamp.ca/web-development-design-resources/ well for search queries like "css cheat sheet" because the content of that page contains those keywords.
That page contains no keywords for "tech camps halifax" and similar queries, so Google won't rank it for them.
If you want to rank for "tech camps halifax", you need a page (I would expect the start page) whose content contains those keywords.
The other way around: successful search queries landing on your cheat-sheet sub-page won't hurt the rankings of your other sub-pages, which deliver different information and therefore target different keywords.
I hope this answers your question; don't hesitate to ask if not.
I have developed a website for a firm that deals in pumps, valves and diesel engines. They require that when an interested user searches with keywords like "Pump Dealers" or "Valve Dealers", their site should appear in the results. I am currently not sure how to go about this, so my question is: what should I do to improve the page's ranking? I am using meaningful page titles and have enough text on every page.
Any suggestions are welcome.
Firstly, PageRank is largely irrelevant these days, so don't worry about that.
You should use Google's Webmaster Tools (now Search Console) to check that Google knows about your site; it will also tell you which queries the site is currently appearing for.
Make sure the page contains the text you want to rank for; as you mention, titles, headings and so on will help, but don't overdo it.
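As an illustration only (the wording, company name and page structure here are hypothetical, not taken from the actual site), on-page text targeting a query like "Pump Dealers" might look something like this:
<head>
  <title>Pump and Valve Dealers | Example Engineering Ltd</title>
  <meta name="description"
        content="Dealers in industrial pumps, valves and diesel engines.">
</head>
<body>
  <h1>Industrial Pump and Valve Dealers</h1>
  <p>We supply and service centrifugal pumps, gate valves and diesel engines ...</p>
</body>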
The main thing to do is to get links to your site: write interesting blog posts, contact customers and so on, so that they link to you.
It really depends on who your competition is for those terms; if there are already ten huge companies ranking for them, you will struggle.
The other option is to buy AdWords, though this will likely cost upwards of $5-10 a day to get any meaningful traffic.
When we search in a search engine, the text we typed appears in the result list along with some links. The odd thing is that when we go to those sites, the same text appears there too, but with a message like 'not found'. (For example, say we type 'best software to doSomeThing' into Google; the results include pages containing exactly what we typed. When we visit some of those links, the sites show the same text, 'best software to doSomeThing - not found', or something similar. Even stranger, some of those sites have nothing to do with what we searched for: if we search for a piece of software, sites about tourism or drugs also appear to mention it.)
I want to know how those sites capture what the search engine found, i.e. what we typed into the search engine.
Is it done with JavaScript, or some other method?
You can check the HTTP_REFERER header and parse its query string looking for q=.
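As a client-side sketch of that idea (the server-side equivalent would read the Referer header), and assuming the search engine actually includes the query in the referring URL, which most engines no longer do, it could look like this; the element id and wording are made up for illustration:
<!-- Hypothetical sketch: echo the visitor's search term back into the page -->
<p id="echo"></p>
<script>
  var term = null;
  try {
    // e.g. document.referrer = "https://www.google.com/search?q=best+software"
    term = new URL(document.referrer).searchParams.get("q");
  } catch (e) {
    // empty or malformed referrer; leave term as null
  }
  if (term) {
    document.getElementById("echo").textContent = term + " - not found";
  }
</script>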
This can be done in several ways.
You can use a query string parameter appended to the URL of the page you want to visit,
or you can use hidden fields in the web page, like view state or control state.
Hope this helps...
Edit:
Here is a link that shows the basic functionality of query strings:
http://dotnet.dzone.com/news/aspnet-query-strings-client-si
Edit 2: check
http://docs.oracle.com/javase/1.4.2/docs/api/java/net/URL.html#getQuery%28%29
and this one too:
http://docs.oracle.com/javaee/1.3/api/javax/servlet/ServletRequest.html
Let's say I have a website that I am developing.
The site may have wallpapers, questions & answers, and general info (e.g. IMDb, Wikipedia, etc.).
What do I need to do so that when a search engine analyzes a particular page of my website for a particular term, let's say 'XYZ', it finds the 'XYZ' content if it is present on that page?
Please pardon my non-techy jargon; I am new to this.
The most important tips in SEO revolve around what not to do:
Keep Java applets and Flash to a minimum; web crawlers can't parse them. JavaScript can accomplish the vast majority of Flash-like animations, but it's generally best to avoid such animations altogether.
Avoid using images to replace text or headings; remember that any text inside images won't be parsed. If necessary, there are SEO-friendly ways of replacing text with images, but any time you have text that is not visible to the user, you risk the crawler thinking you're trying to cheat the system.
Don't try to be too clever. The best way to optimize your search results is to have quality content which engages your audience. Be wary of anyone who claims they can improve your results artificially; Google is usually smarter than they are.
Search engines (like Google) give extra weight to the content of <h1> tags when working out what your page is about, and they judge how important your page is for that content partly by the number of sites that link to it.