DCC website - denying access to archival spiders

Karen Anderson made this Official Information request to Dunedin City Council

Response to this request is long overdue. By law Dunedin City Council should have responded by now (details and exceptions). The requester can complain to the Ombudsman.

From: Karen Anderson

Dear Dunedin City Council,

My preference is to receive the requested information by email.

It is commonplace to use links from “web archive” type sites when referring to web pages in scholarly discussions. This avoids “link rot” and ensures the correct version of the page is being considered.

An example of these is the “Wayback Machine” found at https://archive.org/. It operates by “spidering” the web, collecting and archiving websites. It also allows a user to identify a specific page for archiving and then use the archived “link” when referring to the material.

The robots.txt for the DCC website denies access to the “spiders” used by these kinds of websites. This prevents the routine archiving of web pages as well as specific page requests for scholarly and other legitimate uses.
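For illustration, a robots.txt file that denies access to an archival crawler while permitting others might look like the following. This is a hypothetical sketch, not the DCC’s actual file; “ia_archiver” is the user-agent string historically associated with the Internet Archive’s crawler.

```
# Hypothetical example only — not the DCC's actual robots.txt
User-agent: ia_archiver   # the Internet Archive's crawler
Disallow: /               # denies access to the entire site

User-agent: *
Disallow:                 # all other crawlers may access everything
```

A directive like the first block is what prevents Wayback Machine-style services from archiving a site’s pages.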

Accordingly, I have been asked by members of the Dunedin Dog Bylaw Group to enquire why the DCC has set its robots.txt to specifically deny access to legitimate and credible web archival spiders.

Yours faithfully,
Karen Anderson

