I have a Google Classic Site and I do not have access to the robots.txt file, but I do have access to the HTML code. How can I prevent text or a portion of a page from being indexed by search engines? Thanks.
Thank you for the link. Since these tags need to be placed in the head section of the HTML code, would I be able to use them to prevent search engines from indexing specific text or portions of a page, or do they only prevent search engines from indexing the entire page?
The meta robots directive blocks indexing of the entire document.
There is no standard way to block specific elements of a document from being indexed. However, you can achieve the same effect by moving the content into a separate external document, referencing that document's URL in the src attribute of an <iframe> element on the main page, and adding a meta robots noindex directive to the external document.
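A minimal sketch of that approach, assuming a hypothetical file name hidden-content.html hosted alongside the main page:

```html
<!-- hidden-content.html: holds the text you don't want indexed -->
<!DOCTYPE html>
<html>
<head>
  <!-- tells compliant crawlers not to index this document -->
  <meta name="robots" content="noindex">
</head>
<body>
  <p>Text that should stay out of search results.</p>
</body>
</html>
```

Then, in the main page, embed that document so visitors still see the content inline:

```html
<!-- main page: the iframe displays the external document -->
<iframe src="hidden-content.html" width="100%" height="200" style="border:none;"></iframe>
```

Note that noindex only keeps the external document out of search results on crawlers that honor the directive; the visible main page itself is still indexed normally.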