The fact that someone is tracking and reporting Internet Page Views worries me a lot. Page views should never be treated in isolation as any sort of reliable metric for website performance. With the current site built on frames, this stat is probably skewed even further, since every frame in a frameset generates its own page request, so loading a single page can count as several page views. Page views are a great stat to hype to people who have no idea what the number means, especially when it's presented in isolation.
This probably explains why the CVB is happy to keep the embarrassing website that we have today.
If it works, why fix it, right? (Although even the stats as presented show that it isn't working.)
From the Web Analytics article on Wikipedia:
Two units of measure were introduced in the mid 1990s to gauge more accurately the amount of human activity on web servers. These were page views and visits (or sessions). A page view was defined as a request made to the web server for a page, as opposed to a graphic, while a visit was defined as a sequence of requests from a uniquely identified client that expired after a certain amount of inactivity, usually 30 minutes. The page views and visits are still commonly displayed metrics, but are now considered rather unsophisticated measurements.
The emergence of search engine spiders and robots in the late 1990s, along with web proxies and dynamically assigned IP addresses for large companies and ISPs, made it more difficult to identify unique human visitors to a website. Log analyzers responded by tracking visits by cookies, and by ignoring requests from known spiders.
The extensive use of web caches also presented a problem for logfile analysis. If a person revisits a page, the second request will often be retrieved from the browser's cache, and so no request will be received by the web server. This means that the person's path through the site is lost. Caching can be defeated by configuring the web server, but this can result in degraded performance for the visitor to the website.
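To make the page-view/visit distinction concrete, here is a minimal sketch of how a log analyzer might compute both metrics from the same stream of requests, using the 30-minute inactivity rule and spider filtering described above. The request tuple format and the spider names are illustrative assumptions, not any real log schema:

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)
KNOWN_SPIDERS = {"Googlebot", "Slurp"}  # hypothetical examples

def count_views_and_visits(requests):
    """requests: iterable of (client_id, user_agent, timestamp) tuples,
    assumed sorted by timestamp. Returns (page_views, visits)."""
    last_seen = {}   # client_id -> timestamp of that client's last request
    page_views = 0
    visits = 0
    for client_id, user_agent, ts in requests:
        if user_agent in KNOWN_SPIDERS:
            continue  # ignore requests from known robots/spiders
        page_views += 1
        prev = last_seen.get(client_id)
        # A new visit starts if this client has not been seen before,
        # or if the gap since their last request exceeds the timeout.
        if prev is None or ts - prev > SESSION_TIMEOUT:
            visits += 1
        last_seen[client_id] = ts
    return page_views, visits

t0 = datetime(2007, 1, 1, 12, 0)
reqs = [
    ("alice", "Mozilla", t0),
    ("alice", "Mozilla", t0 + timedelta(minutes=5)),   # same visit
    ("alice", "Mozilla", t0 + timedelta(minutes=45)),  # 40-min gap: new visit
    ("bot1", "Googlebot", t0),                          # filtered out
]
print(count_views_and_visits(reqs))  # -> (3, 2)
```

Note how three page views collapse into two visits here, and the spider's request counts as neither; a raw page-view total alone would hide both adjustments, which is exactly why the number is so easy to hype.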