How to Remove Submitted Content from Search Engines (Google)?

I submitted 1,000 pages to Google in my sitemap, but I did it by mistake. I only wanted to submit 800 and then release one page per day over the next 200 days, so that fresh content would appear every day and Google would see the site as frequently updated, which is good SEO practice.
I don't want Google to know about the existence of those 200 pages right now; I want Google to treat them as fresh content when I release them, one per day.
Should I resend the sitemap.xml with only 800 links and hide those pages on the website?
If Google has already indexed the pages, is there any chance of making Google "forget" them so it doesn't recognize them in the future when I release them again?
Any suggestion about what to do?
Thank you guys.

I wouldn't do that. You're trying to cheat, and Google doesn't like this. Leave your site as it is now, create new content, and submit it to Google's index as frequently as you want. If you exclude previously submitted pages from the index, there's a high probability they won't be indexed again.
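For reference, resubmitting a smaller sitemap just means regenerating the same XML without the withheld URLs. A minimal sketch of the sitemap format, with placeholder example.com URLs and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want listed; note that leaving a page
           out of the sitemap does not remove it from the index, it just stops
           advertising it to crawlers. -->
      <url>
        <loc>https://www.example.com/page-001.html</loc>
        <lastmod>2012-05-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/page-002.html</loc>
        <lastmod>2012-05-01</lastmod>
      </url>
      <!-- ...continue for the 800 pages you want crawled now -->
    </urlset>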

Related

Google suddenly stopped indexing my posts after I changed the categories

Why did Google suddenly stop indexing my posts? I started posting regularly in October, and the site got AdSense approval. I just changed the categories and the front design. Some posts were in the top results but they lost their ranks too. The domain is 2 years old.
All of my posts got indexed after I submitted a request. But after changing the category structure, new posts are not being indexed. When I run a live test, Search Console says "the post is ready to index" and no error shows up.
If it's because of the category change, how can I fix that?
I repeatedly submitted indexing requests after running the live URL test. I also submitted a new sitemap in Search Console. But no luck; Google has indexed no new posts.

Multipage Bootstrap and Google Analytics

I have sort of a problem figuring out how to use Google Analytics properly with Bootstrap.
My page has subpages 3 levels deep, and the last subpage has its own subdomain. In GA I see I can use a maximum of 50 tracking codes within one account. What if I need more than that?
You are limited to 50 properties, not 50 pages. Each property can track many pages (up to 10 million hits a month for the free version) and events.
Typically you would use the same property code on all pages of the same site so you can see all that data together (though with the option to drill down).
You would only use a new property code for a new site (though your subdomain might qualify for that if you want to track it separately).
So the two questions you want to ask yourself are:
Do you want to be able to report on two pages together? E.g. to see that your site gets 10,000 hits, with 20% for this page and 5% for that page, or that people start on this page, then go to that page, then on to another. If so, it should be the same analytics property.
Do different people need to see the stats for these pages, and is it a problem if they can see each other's? If so, put them in separate properties so you can set permissions separately.
It sounds like these are part of the same site, so I'd be veering towards tracking them together on the same property.
On a different note, you should set one page as the main version (with a rel canonical tag) and redirect the other versions to that page, to avoid search engines thinking you have duplicated content. Do you have a reason for having the same content on two different addresses? It can cause SEO and other problems.
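If it helps, here is a rough sketch of both points (the same property code on every page, plus a canonical tag on duplicate URLs), using the gtag.js loader; older analytics.js snippets follow the same idea. The UA-XXXXXXX-1 property ID and the example.com URL are placeholders:

    <!-- Same property ID pasted into the <head> of every page on the site -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){ dataLayer.push(arguments); }
      gtag('js', new Date());
      gtag('config', 'UA-XXXXXXX-1');
    </script>

    <!-- On a duplicate URL, tell search engines which version is the main one -->
    <link rel="canonical" href="https://www.example.com/main-page/">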

Make search engines distinguish website chronological updates over time (like in forums)

I see that search engines are able to find pages chronologically on forum websites and the like, offering the option to show results from the last 24 hours, last week, last month, last year, etc.
I understand that these sites need to be crawled continuously to provide those updates, but I have technical doubts about what structure, tags or anything else I need to add to achieve this for my website.
I see that on the client side (which is also the side search engines are on), content appears basically as static data, already processed by the server, so the question is:
If I have a website for which I update and add content constantly to the index page to make it easily visible, and for which I even add links, times and dates as text for the new pages, why don't these updates show at all in search engines?
Do I need to add XML/RSS feeds, or what else?
How do forums and sites with heavy updates with a chronological mark achieve the capability to allow search engines to list results separated by hours, days, etc.?
What specific set of tags and overall structure do I need to add for this feature?
I also see that search engines, mainly Googlebot, usually take at least 3 days to crawl those new pages, and even then they aren't organized consistently (or at all) in a chronological way in search results.
I am not using any forum, blog or other web publishing software, just raw HTML and PHP written by hand, plus the minimum I mentioned above: pointing to new documents from the index page of the website along with a description.
Do I need to add XML/RSS feeds, or what else?
Yes. Atom or one of the RSS formats (or several formats at the same time, so you could offer Atom and RSS).
Search engines will know about new blog posts, microblog posts, forum threads, forum thread answers etc., because they subscribe to the feed. So sometimes you'll notice that a page is indexed by a search engine only minutes after it was published. But for smaller sites, search engines probably don't check for updates every few minutes, so it might take days until a new page is indexed.
A sitemap might help, too.
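For a hand-coded HTML/PHP site, a feed can simply be a static XML file you regenerate (or append to) whenever you publish a page. A minimal Atom sketch, with placeholder URLs, dates and author name:

    <?xml version="1.0" encoding="utf-8"?>
    <feed xmlns="http://www.w3.org/2005/Atom">
      <title>Example Site Updates</title>
      <link rel="self" href="https://www.example.com/atom.xml"/>
      <link href="https://www.example.com/"/>
      <id>https://www.example.com/</id>
      <updated>2012-05-01T12:00:00Z</updated>
      <author><name>Site Author</name></author>

      <!-- One entry per new or updated page; the <updated> timestamp is what
           gives crawlers the chronological information. -->
      <entry>
        <title>New page added today</title>
        <link href="https://www.example.com/new-page.html"/>
        <id>https://www.example.com/new-page.html</id>
        <updated>2012-05-01T12:00:00Z</updated>
        <summary>Short description of the new page.</summary>
      </entry>
    </feed>

You would then advertise it from every page's <head> with <link rel="alternate" type="application/atom+xml" href="/atom.xml">.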

Getting data from Google Spreadsheets

I quickly made a little form in Google Docs that lets people submit the current attraction wait times at Disneyland to a Google Spreadsheet. I want to make a web page that displays the bottom, most recent row from that spreadsheet, so the current wait time for each attraction is always shown when someone visits the page. Is there a way to share and embed just the bottom row of data from the spreadsheet?
Hooray for Google's API documentation, although it's sometimes hard to find the right section... I've never done this before, but it looks pretty straightforward.
For list-based feeds, see: http://code.google.com/apis/spreadsheets/data/3.0/developers_guide.html#ListFeeds
For cell-based feeds, see: http://code.google.com/apis/spreadsheets/data/3.0/developers_guide.html#CellFeeds
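I haven't tested it either, but going by that developer guide, a list-feed request for a published spreadsheet looks roughly like the line below. SPREADSHEET_KEY and WORKSHEET_ID are placeholders, and the exact projection and query parameters (here reverse=true, so the newest row comes first) should be checked against the guide:

    GET https://spreadsheets.google.com/feeds/list/SPREADSHEET_KEY/WORKSHEET_ID/public/full?reverse=true

The response is an Atom feed with one entry per row, so after reversing you would just read the first entry to get the most recent submission.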

New site going on to an old domain

I have a client who over the years has managed to get their product to the top of Google for many different search terms. They're adamant that the new site shouldn't have a detrimental effect on their Google ranking.
The new site will replace the one on their current domain, as well as going up on 5 further domains.
Will any of this lose the client their current ranking on Google?
Google regularly re-ranks the sites it has indexed. If the site changes, the ranking very well could change too, for example if more or fewer people link to it or if the terms on the site (the content) are different.
The effect might be good or bad, but uploading different content isn't going to make their rank go away overnight or anything like that.
PageRank is mostly about incoming links, so as long as the incoming links aren't broken, PageRank won't be affected that much.
That said, overall ranking is not just PageRank, so... further discussion is needed.
If they retain the current link structure they should be fine.