I've always said newspaper and TV websites are the worst. They're just disseminating information, basically read-only; they don't have to be complicated, but they are! On almost all of them I turn JS off, browse, jump to the h1, and there's the story.

Well, a few don't work like that, like nasa.gov, and I've already talked about that one at length. Yesterday I wanted to read a story that somebody linked to:

https://www.goodmorningamerica.com/culture/story/real-life-iron-man-robert-downey-jr-launches-63499797

With JS off, nothing! With JS on, just one line. Time for debugging. I fixed one thing after another after another after another; see the various commits in the last 24 hours. If you pull all those and bring up the site, you get 402 lines, including the story and links to various other stories. So it works! The story is fluff, because it's GMA and not a real science outlet, but oh well.

The point is, this kind of debugging was flat out impossible without the snapshot() feature, which lets me make a (mostly) local copy of the website. Then I put in alerts and breakpoints and so on. The local copy isn't true in every way: when I bring it up I only get 169 of the 402 lines, not sure why; some xhr stuff pulls in the story and has to run off the real website. But still, all the debugging I could do, and it's kinda fun.

Browsing this site is really slow, as are some others, and I should probably stop debugging and look at performance, because you don't really want to wait 2 minutes for this news story.

Karl Dahlke