authenticity of web content?
Are there any rules or regulations in place to ensure the quality of web content? Can we trust Wikipedia and other websites? Do Google or other search engines have any mechanisms in place?
- Anonymous · 7 years ago · Favorite Answer
The 'delivery' of content (in an unadulterated form) from a website can be protected by using SSL/TLS encryption, a security protocol that runs on top of TCP/IP.
This assures that the content is not replaced or modified in transit across the Internet.
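To illustrate the transit-integrity idea (this is a simplified sketch, not the actual TLS handshake, and the page content below is made up), here is how a cryptographic fingerprint detects tampering: the sender and receiver each compute a digest of the content, and any modification in transit changes the digest. TLS uses authenticated encryption to achieve the same guarantee automatically.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 digest of the content; any change to the
    bytes in transit produces a completely different fingerprint."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical page content as sent by the server
original = b"<html><body>Hello, world</body></html>"
# The same content after a man-in-the-middle altered one byte
tampered = b"<html><body>Hello, w0rld</body></html>"

sent_digest = fingerprint(original)

# The receiver recomputes the digest and compares it to the sender's
print(fingerprint(original) == sent_digest)  # → True  (content intact)
print(fingerprint(tampered) == sent_digest)  # → False (tampering detected)
```

Note that this only proves the bytes arrived unchanged; it says nothing about whether the original content was accurate, which is exactly the distinction the answer goes on to make.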
The actual content of any website is ungoverned, and indeed the Internet is, for all intents and purposes, 'the Wild West'... anything goes.
Mix into that hodge-podge the possibility that a site (even a well-known and widely used one) can be hacked and its genuine content revised by persons unknown, and you have yet another element of doubt to deal with.
Trust in a website is just that (as with so much on the Internet): you trust someone else to provide reliable and competent data or facts.
Largely it's a matter of reputation and a 'track record' of proven, reliable information.
Wikipedia has a policy of open contribution, so anyone can edit it.
But don't take my word for it (unless you trust me) ;)
- 6 years ago
SSL has nothing to do with the authenticity of web content itself.
Content quality is like 80% of a successful website. Recently http://thehostbay.com published an article on "The new rules of content quality". Google and Bing have started to look more closely at website content quality. You can read the full article here: http://thehostbay.com/new-rules-content-quality/ . There are also more articles there that you could read.
Source(s): http://thehostbay.com