Duplicate Content
Yahoo suggests that webmasters try to have only one unique URL for each page of content. According to them, their spider can handle dynamic URLs easily enough, but it is easier and faster for a search engine to spider a website with static pages, as those get crawled before the others.
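As a hypothetical illustration (example.com and the parameters below are made up), the same article can often be reached through several dynamic URLs, while Yahoo's advice points toward a single URL:

```
# Several dynamic URLs all serving the same article:
http://www.example.com/index.php?id=42
http://www.example.com/index.php?id=42&print=1

# The single, unique URL for that content:
http://www.example.com/articles/duplicate-content.html
```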
Duplicate content and Google
One more thing: the faster your website gets crawled, the better you rank. That's my mantra. Even better than CNN or other big websites (CNN is named just as an example).
Duplicate content is an issue that is troubling many webmasters nowadays. In most cases a webmaster has no influence over third parties that simply scrape the material and redistribute the content without the webmaster's consent.
If the content is not protected, then of course no webmaster can stop someone from doing this. But even when the content is protected by copyright, many people still duplicate it, and the webmaster can do little about it. This raised a big question for the search engines: which content provider is the original, and which one is the copier?
Well, Google makes no sure-shot promise about this; they say in their forum that the website with the original content will not suffer any negative effects, but it might still happen, since no search engine is perfect.
We all know that webmasters generally face two problems: duplicate content within their own website, and duplicate content created by third parties.
Both can hurt. The first situation generally occurs with dynamic websites, since many people are not familiar with CMS systems like Joomla. I have been watching many Joomla and Mambo forums, and users there have problems with SEF URLs and with content duplication caused by the icons that produce a PDF copy (and other alternate versions) of every article, which results in duplicated content.
So let's see what Matt Cutts has to say about this. According to him:
1) Avoid over-syndicating the articles that you write.
2) If you do syndicate content, make sure that you include a link to the original content.
That will help ensure that the original content has more PageRank, which will aid in picking the best documents in our index.
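In practice, the link back to the original that Matt Cutts mentions can be a plain anchor at the end of the syndicated copy; the URL here is just a placeholder:

```
<!-- Appended to the syndicated copy of the article -->
<p>This article was originally published at
  <a href="http://www.example.com/articles/duplicate-content.html">example.com</a>.</p>
```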
Prevention of duplicate content: according to a post on http://googlewebmastercentral.blogspot.com, you should check out Adam Lasnik's post "Deftly dealing with duplicate content" and Vanessa Fox's "Duplicate content summit at SMX Advanced".
So what does Google consider when it sees external proxy websites and other external websites that copy your material?
According to Google, when they encounter such duplicate content on different external websites, they first look at various signals to determine which site is the original one, which usually works very well.
They say it is not a worrying matter if somebody is scraping your website's content, and that there are no negative effects on your website's presence on Google if you notice someone scraping your content.
As far as I am concerned, I think Google's signals for finding the original work like this: as soon as a website comes in front of the search engine spider, the spider evaluates whether that website already existed before. If it did, there is a strong chance the older website has the genuine content. It is certain that a new website will have some content matching older websites, but if that proportion increases too much, Google will push it down in the results.
It works like this: an older website is treated as having the more genuine and original content compared to newer ones, and it will be ranked better than the newer ones if there is nothing new in that content. Links, again, play an important role here.
What Google suggests to prevent your content from being ranked lower, if it is genuine:
Keep an eye on whether your content is still accessible to Google's crawlers.
Don't block your content from being crawled unless it is necessary. You might have unintentionally blocked access to parts of your content in your robots.txt file.
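As a quick sanity check, you can test your robots.txt rules before they go live. This is just a sketch using Python's standard robotparser module; the rules and URLs are hypothetical:

```python
# Sketch: verify which URLs a robots.txt file would block.
# The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /pdf/
Disallow: /print/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The article itself stays crawlable...
print(parser.can_fetch("Googlebot", "http://www.example.com/articles/dup.html"))
# ...while the auto-generated PDF copy is blocked, so it cannot
# compete with the article as duplicate content.
print(parser.can_fetch("Googlebot", "http://www.example.com/pdf/dup.pdf"))
```

This makes it easy to spot a rule that accidentally blocks real content instead of only the duplicate versions.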
Update and submit your sitemap regularly.
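For reference, a minimal sitemap file follows the sitemaps.org protocol; the URL and date below are placeholders:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/articles/duplicate-content.html</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```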
Check whether your site is in line with Google's webmaster guidelines.
Duplicate content due to scrapers