User:RobinBot

RobinBot

RobinBot is a proposed bot that removes affiliate marketing links which nefarious editors place in pages to secretly make money from user clicks. As proposed, it would simply remove the affiliate portions of such URLs; however, if the Wikimedia Foundation wished, it could instead replace them with Wikimedia's own affiliate tag, redirecting the credited referral to Wikimedia itself as a pseudo-donation. Hence the name "RobinBot": it keeps people from "robbing" the public as a whole, and can optionally "Robin Hood" otherwise exploitative links into donations for Wikimedia.
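
As an illustration only (not the bot's actual code), the URL rewrite could look roughly like the sketch below. The tag= query parameter commonly used by Amazon affiliate links and the wikimedia-20 replacement tag are assumptions for the example.

  <?php
  // Sketch: strip an affiliate tag from a URL, or swap in a Wikimedia tag.
  // The "tag" parameter name and the wikimedia-20 value are assumptions.
  function rewrite_affiliate_url($url, $wikimediaTag = null) {
      $parts = parse_url($url);
      $query = array();
      if (isset($parts['query'])) {
          parse_str($parts['query'], $query);
      }
      if (!isset($query['tag'])) {
          return $url;                          // no affiliate tag present
      }
      if ($wikimediaTag !== null) {
          $query['tag'] = $wikimediaTag;        // "Robin Hood" mode: credit Wikimedia
      } else {
          unset($query['tag']);                 // default mode: strip the tag
      }
      $rebuilt = $parts['scheme'] . '://' . $parts['host']
               . (isset($parts['path']) ? $parts['path'] : '');
      if (!empty($query)) {
          $rebuilt .= '?' . http_build_query($query);
      }
      return $rebuilt;
  }

  // Example usage with a hypothetical spammer tag:
  echo rewrite_affiliate_url('http://www.amazon.com/dp/B000ABC?tag=spammer-20&ref=x'), "\n";
  echo rewrite_affiliate_url('http://www.amazon.com/dp/B000ABC?tag=spammer-20', 'wikimedia-20'), "\n";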

Maintainer

If you have any questions, comments, suggestions, compliments, or complaints, please contact koder, this bot's developer.

What RobinBot does

  • RobinBot is a recent-changes patrolling bot that uses the RSS feed of diffs to watch for edits by registered and anonymous users, matches them against regular expressions for affiliate marketing links, reverts offending edits, and warns the user in the process (see the feed-scanning sketch after this list).
  • RobinBot automatically adjusts how often it polls the recent-changes XML feed according to site activity, so as to conserve both bandwidth and CPU cycles (see the polling sketch after this list).
  • The bot runs continuously (except, obviously, for maintenance).
  • The bot maintains a list of offenders in an SQL table and will publish frequent/repeat offenders to a subsection of its user page (see the bookkeeping sketch after this list).
  • RobinBot is for the most part automatic, but can be manually instructed to check a page for problems.
  • RobinBot will initially handle only the removal of Amazon affiliate links. More link types will be added if the bot is approved and the additional checks are approved as well.
  • RobinBot is written in PHP (tested only on PHP 5) and runs as a background process using the PHP CLI. It uses PHP's native cURL and Expat bindings for easier HTTP and XML processing, respectively.
  • RobinBot will initially crawl old database dumps to find existing candidates for editing; these will be queued until after the trial period, then gradually crawled and cleaned up. The dump-crawling portion would be written in C rather than PHP, simply because it would be far faster on such a large data set.
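
The feed-scanning step described above could be sketched roughly as follows. The feed URL, the single Amazon regex, and the use of SimpleXML (in place of the Expat handlers the bot actually uses) are simplifications for illustration, not RobinBot's real code.

  <?php
  // Sketch: fetch the recent-changes feed with cURL and flag entries whose
  // diff text matches an assumed affiliate-link pattern.
  $feedUrl = 'https://en.wikipedia.org/w/index.php?title=Special:RecentChanges&feed=rss';
  $pattern = '/amazon\.[a-z.]+\S*[?&]tag=/i';   // assumed affiliate marker

  $ch = curl_init($feedUrl);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_setopt($ch, CURLOPT_USERAGENT, 'RobinBot sketch');
  $xml = curl_exec($ch);
  curl_close($ch);
  if ($xml === false) {
      exit("feed fetch failed\n");
  }

  // The real bot uses PHP's Expat bindings; SimpleXML keeps this sketch short.
  $feed = simplexml_load_string($xml);
  foreach ($feed->channel->item as $item) {
      // Each item's description carries the rendered diff for that change.
      if (preg_match($pattern, (string) $item->description)) {
          // A real run would revert the edit and warn the user; here we only note the hit.
          echo 'Possible affiliate link in: ' . $item->title . "\n";
      }
  }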
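
The adaptive polling behaviour could be sketched like this; the thresholds, bounds, and the fetch_and_scan_feed() helper are purely illustrative.

  <?php
  // Hypothetical helper: fetch and scan the feed, returning how many new edits it held.
  function fetch_and_scan_feed() {
      // ... feed fetching and regex checks as in the previous sketch ...
      return rand(0, 60);   // stand-in value so this sketch runs on its own
  }

  $interval    = 60;        // seconds between feed fetches
  $minInterval = 15;
  $maxInterval = 300;

  while (true) {
      $newEdits = fetch_and_scan_feed();

      if ($newEdits > 40) {
          $interval = max($minInterval, $interval / 2);   // busy: poll faster
      } elseif ($newEdits < 5) {
          $interval = min($maxInterval, $interval * 2);   // quiet: back off
      }
      sleep((int) $interval);
  }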
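
The offender bookkeeping could be sketched as below; the table name, columns, and database credentials are placeholder assumptions, not the bot's real schema.

  <?php
  // Sketch: keep a per-user hit count so repeat offenders can be published later.
  $db = new mysqli('localhost', 'robinbot', 'password', 'robinbot');

  $db->query("CREATE TABLE IF NOT EXISTS offenders (
      username  VARCHAR(255) PRIMARY KEY,
      hits      INT NOT NULL DEFAULT 1,
      last_seen DATETIME NOT NULL
  )");

  // Record one detected affiliate edit; repeat offenders just get their count bumped.
  function record_offender(mysqli $db, $username) {
      $stmt = $db->prepare(
          'INSERT INTO offenders (username, hits, last_seen) VALUES (?, 1, NOW())
           ON DUPLICATE KEY UPDATE hits = hits + 1, last_seen = NOW()'
      );
      $stmt->bind_param('s', $username);
      $stmt->execute();
  }

  record_offender($db, 'ExampleSpammer');   // hypothetical username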

Status

  • Inactive.