Wikipedia:Reference desk/Archives/Computing/Early/double redirect study

Double redirects are an annoying and recurring problem that will only get worse over time if left unaddressed. This study aims to identify possible solutions to the problem and compile a list of their pros and cons. Community involvement is highly encouraged.

The solutions identified so far are:

Scan and fix

The current method of controlling double redirects involves scanning all of Wikipedia and either generating reports for human cleanup or doing automated bot work. Timwi currently has a bot that can do this, wpfsck can detect double redirects, and various SQL queries can find them. The reports are easy to generate and the fixes are easy to apply. However, the solution is incomplete. A double redirect is created in two steps: first one page is redirected to another, and then the target page is itself changed into a redirect. This happens casually, since the user is given no warning.
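As a rough illustration, the detection step can be written as a single SQL query. The Python sketch below assumes the page and redirect tables of the modern MediaWiki schema and a read-only database connection; the find_double_redirects name is illustrative, and older database dumps without a redirect table would need to parse #REDIRECT lines out of the page text instead.

    # Minimal sketch of the "scan" step, assuming the modern MediaWiki
    # schema (`page` and `redirect` tables) and a DB-API connection.
    DOUBLE_REDIRECT_SQL = """
    SELECT pa.page_title,  -- the redirect page itself
           ra.rd_title,    -- its target, which is also a redirect
           rb.rd_title     -- where the second redirect points
    FROM redirect AS ra
    JOIN page AS pa ON pa.page_id = ra.rd_from
    JOIN page AS pb ON pb.page_namespace = ra.rd_namespace
                   AND pb.page_title = ra.rd_title
    JOIN redirect AS rb ON rb.rd_from = pb.page_id
    """

    def find_double_redirects(conn):
        """Yield (redirect, intermediate, target) title triples."""
        cur = conn.cursor()
        cur.execute(DOUBLE_REDIRECT_SQL)
        yield from cur.fetchall()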

The existing method will eventually fix the double redirects (at the frequency of the database dumps) but leaves the situation ready to happen again. The next step is to proactively rewrite links that point at redirects so that they point directly at the target pages. This is also easy to accomplish with a bot but will likely amass a significant number of edits (a proof-of-concept report is in the plans). With this proactive step, double redirects become much less probable. Overall, the database will be in a more consistent (perhaps better?) state and users will see far fewer double redirects during their browsing.
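Here is a sketch of what such a link-rewriting bot might do to a page's wikitext, assuming a redirect_map (redirect title to target title) built from the scan above; the function name is illustrative, and piped links, section anchors, and namespace prefixes are deliberately ignored for brevity.

    import re

    def bypass_redirect_links(wikitext, redirect_map):
        """Rewrite plain [[Link]] wikilinks that point at known redirects
        so they point directly at the target, keeping the visible text
        unchanged. Piped links and links with anchors are left alone."""
        def fix(match):
            title = match.group(1)
            target = redirect_map.get(title)
            if target is None or target == title:
                return match.group(0)            # not a redirect; keep as-is
            return "[[%s|%s]]" % (target, title) # pipe to preserve display
        return re.sub(r"\[\[([^\[\]|#]+)\]\]", fix, wikitext)

    # e.g. bypass_redirect_links("See [[Old name]].", {"Old name": "New name"})
    # returns "See [[New name|Old name]]."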

Enforce during edit

The MediaWiki software could enforce the very real (but currently unenforced) double redirect rule. From a database-consistency standpoint, MediaWiki currently does the worst possible thing: it fails silently. It is a minimal failure, but it produces an inconsistent user experience with little opportunity for feedback to the user doing the editing, so the problem is unlikely ever to be solved this way. It can be argued, however, that failing silently keeps the barrier to entry lower for newcomers.

If the software showed the user an error (making it impossible to create a double redirect) with a clear explanation of why the action was denied, it might be possible to educate editors and keep the data consistent at the same time. It is also possible that MediaWiki could silently repair double redirects as it encounters them (pragmatically, it is a simple problem). BD2412 has noted that this should also apply to page moves; I have a feeling page moves generate the majority of double redirects.
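The silent-repair variant boils down to following the redirect chain to its end before saving. The sketch below shows that logic with illustrative names and a small hop limit to guard against redirect loops; MediaWiki itself would do this at save or move time rather than from a precomputed map.

    def resolve_redirect_target(title, redirect_map, max_hops=5):
        """Follow a chain of redirects to its final target. redirect_map
        maps redirect title -> target title; the hop limit guards against
        self-referential redirect loops and absurdly long chains."""
        seen = set()
        while title in redirect_map:
            if title in seen or len(seen) >= max_hops:
                raise ValueError("redirect loop or long chain at %r" % title)
            seen.add(title)
            title = redirect_map[title]
        return title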

The possible user interface changes could go as follows: when a user attempts to create a double redirect, they receive a message saying (in some kind of friendly words) that they cannot link there. The software could then suggest where the link should point; the user clicks OK, and MediaWiki updates the link for them.

Warn and scan

This compromise requires continuing the "scan and fix" cleanup, but warns editors when they create a double redirect and encourages them to clean up their own messes. This reduces the lifespan of the average double redirect and the amount of work that dedicated cleanup volunteers need to do, while still allowing new editors to continue with their work even if they make a mistake.