I think these downloads are intended for people who want to reuse the content on other websites or in other applications. For example, Answers.com reuses the content on its site - http://www.answers.com . Google Earth also uses the content, taking the geocoordinates and placing the articles on its maps, making a "Wikipedia map layer".
If you are into web programming, PHP can parse the XML files and let you extract the content:
PHP 5 - http://uk2.php.net/manual/en/ref.xmlreader.php
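As a minimal sketch, here is how you might stream through a dump with XMLReader and print each page title. The filename is just a placeholder for whichever decompressed dump file you have:

<?php
// Stream through a decompressed Wikipedia XML dump and print page titles.
// "enwiki-pages-articles.xml" is a placeholder filename.
$reader = new XMLReader();
$reader->open('enwiki-pages-articles.xml');

while ($reader->read()) {
    // Each article's title lives in a <title> element inside <page>.
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'title') {
        $reader->read(); // advance to the text node inside <title>
        echo $reader->value . "\n";
    }
}

$reader->close();
?>

Because XMLReader is a pull parser, it reads the file a node at a time rather than loading it all into memory, which matters since the dumps are huge.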
The SQL database dumps can be loaded straight back into MySQL for use in web applications (web sites) or elsewhere.
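As a sketch of that, once a dump has been imported you can query it from PHP. The database name and credentials below are placeholders; the "page" table and "page_title" column come from the standard MediaWiki schema:

<?php
// Assumes the dump was imported first, e.g.:
//   mysql -u user -p wikidb < enwiki-page.sql
// "wikidb" and the credentials are placeholders.
$db = new mysqli('localhost', 'user', 'password', 'wikidb');

// List the first ten page titles from the imported dump.
$result = $db->query('SELECT page_title FROM page LIMIT 10');
while ($row = $result->fetch_assoc()) {
    echo $row['page_title'] . "\n";
}

$db->close();
?>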
Otherwise, as an alternative, Webaroo lets you download Wikipedia for offline browsing, in what looks to be an easy-to-use package: http://www.webaroo.com/