I rewrote my site completely, and now Google shows 17,000 link errors from the old site. What's the best solution for this, since I can't manually remove the URLs one by one or disallow them one by one in robots.txt?
Can a new sitemap help resolve this?
Have you looked at this: http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
That is the only thing that I can really tell you. A new sitemap should resolve this.
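If the old URLs follow a recognizable pattern, you also don't have to deal with them one at a time: a pattern-based 301 redirect in .htaccess can map whole groups of old URLs to their new locations, which is the usual way to clear large numbers of 404 crawl errors after a redesign. A minimal sketch, assuming Apache with mod_rewrite and assuming the old URLs lived under a common prefix - /old-blog/ and /blog/ below are placeholders, not your actual paths:

```
# Hypothetical example: everything that used to live under /old-blog/
# now lives under /blog/ on the new site.
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

Old URLs that have no equivalent on the new site can simply be left returning 404; they drop out of the report over time once Google has recrawled them.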
I'm having a helluva time trying to figure out how to link to and download a PDF on my Hugo site.
I use [Download Resume](/static/Resume.pdf) and it creates the link, but I get a 404 error even though it's definitely the correct path.
I bet I'm missing something extremely basic, but neither my Hugo guidebook nor the official documentation makes it clear how to do this.
Any help would be greatly appreciated.
Well, all I had to do was remove the /static/ part of the path and it worked. Hope this helps someone in the future!
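In case it helps others: Hugo copies everything inside the project's static/ directory to the root of the published site, so the /static/ prefix must not appear in the link itself. A small sketch (Resume.pdf is just the example filename from above):

```
static/Resume.pdf                  <- where the file sits in the Hugo project
[Download Resume](/Resume.pdf)     <- link in your content; no /static/ prefix
```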
I've been using the old Search Console even though the new one has been around for a while, but since Google has stopped supporting the old version, I have started transitioning to the new one.
I started a technical SEO audit for one project and was really irritated that I couldn't find the robots.txt tester in the new Webmaster Tools, and all of Google's tutorials point to the old version (the tester still works there). So my question is: is there even a robots.txt tester in the new version, or am I just missing something?
It does not look like it has been added. When I searched for the robots.txt tester using the new Search Console's help box, it brought the tester up, but the navigation points to the old Search Console. It still opens the old Search Console for the robots.txt test.
I hope this helps resolve your issue.
Refer to the highlighted URL, which still directs me to the old Search Console.
You can still access the old Search Console.
The head of GSC recently said that he suggests using both the old and the new Search Console, as they have not done a good job of transferring everything over.
I've had this site (donake.net) up for about a year. Someone notified me that the link to a PDF on one of the pages wasn't working, and neither was a link to another page.
When I tried logging in, it wouldn't accept my username and password. I realized that I needed to update the version of Joomla on my GoDaddy hosting account. Once I updated Joomla to 3.5.1, I was able to log in and access the admin side. I think the site was attacked, because there were a lot of "registered users" that weren't real.
One of the pages was set to unpublished, so I published it and that link started working fine.
The link "VIEW A SAMPLE of the book here" on this page - http://donake.net/just-make-me-a-sammich-book still won't work. I've re-linked the PDF. Deleted the PDF and re-uploaded it with the file name changed and nothing has worked.
One other thing is the icons in the Admin side of Joomla aren't displaying either. Not sure if all of this is tied together or not. My MAIN CONCERN is getting the link to work.
Any help would be greatly appreciated!
Thanks
I had the same issue. I ended up calling our hosting company, and they helped me edit the .htaccess file to include pdf in the RewriteRule (for me it was on line 3).
Hope this helps because I know it drove me mad trying to fix it.
Good luck!
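For anyone who lands here later: the exact .htaccess from that call isn't shown, but Joomla's SEF rules rewrite requests to index.php, and an overly broad rule can swallow direct file requests such as PDFs. A rough sketch of the kind of exception a host might add - illustrative only, not the poster's actual file:

```
# Illustrative sketch: skip the Joomla rewrite for PDF requests and for
# anything that already exists as a real file or directory.
RewriteEngine On
RewriteCond %{REQUEST_URI} !\.pdf$ [NC]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule .* index.php [L]
```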
I want to redirect visitors to a maintenance page when my Apache server is down. I tried doing it through .htaccess and nothing worked. Please point me to a working blog post.
Thanks...
The .htaccess file contains no executable program code of its own; it is only read by a running webserver. If you want to show a webpage, you need a running webserver - if you don't have one, bad luck.
Take a look at this post about the technical environment needed to reach your goal.
Edit: I found another post about this topic; the conclusion is the same - it requires the use of an additional webserver.
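To make that concrete: since .htaccess is only read by a running Apache, the fallback has to live on a second machine or a front-end proxy that stays up when the main server goes down. A rough sketch using a front-end Apache with mod_proxy - the hostname and page are placeholders, and any other webserver in front would do the same job:

```
# On the front-end server (not the one that goes down).
# Assumes mod_proxy and mod_proxy_http are enabled and that
# maintenance.html exists in this server's document root.
ProxyPass        "/maintenance.html" "!"
ProxyPass        "/" "http://backend.example.com/"
ProxyPassReverse "/" "http://backend.example.com/"
# When the backend is unreachable, the proxy itself answers with 502/503;
# serve the local static page for those errors instead.
ErrorDocument 502 /maintenance.html
ErrorDocument 503 /maintenance.html
```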
I received a message with the notice "Notice of Suspected Hacking" on my website. I'm not sure what happened here; however, I noticed there were some pages listed that are not part of my website.
Has anyone received the same message before? If so, what did you do to solve the issue?
And what should I do next?
Maybe some malware introduced those pages into your site. Remove the pages, rescan them, and upload them again; that may solve your problem. Also check whether any suspicious JavaScript has been included in your pages.
It means that Google detected malicious links in your site... (generally spam).
Generally a quick scan at http://sitecheck.sucuri.net finds the issue.