On this question there seem to be two approaches: an eValid script and a simple web-server "last modified date" check.
For the eValid solution, all you need to do is record what you think should stay constant from week to week; if you run the script a week later and something has changed, eValid will FAIL that test, which makes you aware of the change. You could even record all of the visible text of key pages. That results in very long script files, but no matter: eValid will detect even single-bit changes this way.
The other solution is outside what eValid can do, but if you can access the web-server machines you can run a simple find command that lists the "last modified date" for all of the files in the website...and this will tell you what's changed. A cautionary note, though: the content of the site may have changed without the files having been modified (for example, if pages are generated dynamically). So this route may not be 100% reliable, particularly if some of the subject website is delivered by third parties.
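As a rough sketch of that server-side check, something like the following would list every file under the web root that has been modified within the past week (the `/var/www/html` path is just an assumed example, and `-newermt` requires GNU find):

```shell
# Assumed web root -- adjust to your server's layout.
WEBROOT=/var/www/html

# List all files modified within the last week (GNU find syntax).
find "$WEBROOT" -type f -newermt '1 week ago' -print
```

Run weekly, an empty listing suggests nothing on disk has changed; but as noted above, dynamically generated or third-party content will not show up here.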
The eValid solution will work reliably, because it examines what is actually delivered to the browser face, regardless of where it came from or how. That solution will be accurate, indeed, but may involve a lot of work if the number of pages is large and/or the expected regions of change are many.