Wikipedia talk:Bots/Archive 15
This is an archive of past discussions about Wikipedia:Bots. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
One time run request - Kakashi Bot; Text replacement
(copied from User talk:AllyUnion.)
Am I correct in assuming that one can request jobs to be done by your bot here? I'm currently trying to orphan Image:Flag of Czech Republic.svg and replace it with Image:Flag of the Czech Republic.svg, but the number of pages it is used on (especially the English, French, and Spanish Wikipedias) is enormous... ナイトスタリオン ✉ 20:01, 24 November 2005 (UTC)
- I don't understand the difference between the two. --AllyUnion (talk) 23:34, 24 November 2005 (UTC)
- For countries/regions with the terms "Republic" or "Islands" in their short name, the correct grammar is "Flag of the Czech Republic", not "Flag of Czech Republic". Image names are expected to follow the same naming rules as articles, AFAIK, so this grammar mistake should be corrected, and it'll be much easier with a bot. Furthermore, I've managed to upload basically all the national flags in SVG format, and those should replace the earlier PNG versions, so the bot could be put to further use... Sorry if I'm getting on your nerves, but your bot seemed to be available for tasks. ナイトスタリオン ✉ 00:04, 25 November 2005 (UTC)
- I don't mind doing it, that is, if we have agreement that it should be done. Thanks, Martin 15:21, 27 November 2005 (UTC)
User:Chlewbot (interlanguage bot)
I am asking for permission to run User:Chlewbot under a bot flag. Its primary goal is to check, add and fix interwikis originating from the Spanish-language Wikipedia.
Thank you.
— Carlos Th (talk) 21:16, 29 November 2005 (UTC)
- Are you using the pywikipedia framework? Please specify whether you are, or whether you are using something else. --AllyUnion (talk) 09:04, 9 December 2005 (UTC)
- Yes, I am using the pywikipedia framework.
- — Carlos Th (talk) 12:34, 10 December 2005 (UTC)
- Approved for a trial test run of 1 week. Please keep your bot updated with the latest CVS code. May apply for bot flag after trial test run if no complaints. --AllyUnion (talk) 09:55, 11 December 2005 (UTC)
- Well. It seems there have been no complaints. I am keeping Chlewbot updated via CVS. Thank you. — Carlos Th (talk) 14:18, 16 January 2006 (UTC)
- Request at meta fulfilled. --Ascánder 02:44, 22 January 2006 (UTC)
KaiserbBot
I would like to request a bot flag for a manually assisted bot running on the English Wikipedia to assist in fixing a variety of firearm, woodworking, and other pages with numerous redirects and double redirects. Kaiserb 02:16, 1 December 2005 (UTC)
- You'll have to be more specific about what you plan to fix. --AllyUnion (talk) 09:31, 9 December 2005 (UTC)
- When testing the bot it was used to disambiguate a number of pages from Wikipedia:Disambiguation pages with links. The bot performed well for this task and would continue to disambiguate pages. Additionally I would like to use it to solve disambiguation and redirects on specific firearm pages. There are numerous links on each firearm page to the particular ammunition used by said firearm. One that is a particular issue is 9 mm, 9 mm Luger and 9 mm Parabellum, that all link through a redirect pointing to 9 mm Luger Parabellum. In the first case 9 mm could refer to (9 x 18 mm), (9 x 20 mm SR), 9 mm Glisenti, (9 x 19 mm), (9 x 21 mm), (9 x 23 mm Steyr), or (9 x 23 mm Largo) while 9 mm Luger and 9 mm Parabellum are both 9X19. It would be nice to clean up this and other ambiguous ammunition links to tidy up these pages. --Kaiserb 05:14, 10 December 2005 (UTC)
- Approved for a one-week trial run. If no objections are made, you may apply for a bot flag. Please remember to leave a note on this page if you plan to change the scope of what your bot does. Also, please be more descriptive on your bot's user page... like listing the same information that you described in detail here. --AllyUnion (talk) 10:41, 10 December 2005 (UTC)
OrphanBot
I would like to run User:OrphanBot to orphan images in Category:Images with unknown source and Category:Images with unknown copyright status in preparation for deleting them. The bot would go through the categories, and for each image, it would replace the image tag with a placeholder, to keep from breaking table layouts. It would then record on the image description page which articles it has removed the image from.
For images in articles in the main namespace, the Category namespace, and the Portal namespace, it would remove them. Images in articles in the User:, Talk:, User talk:, Template talk:, Image:, Image talk:, Category talk:, Wikipedia talk:, and Portal talk: namespaces would be ignored. Images in articles in the Wikipedia:, Template:, Help:, and Help talk: namespaces would be logged to the bot's talk page for human review, since they shouldn't have been there in the first place. --Carnildo 09:11, 1 December 2005 (UTC)
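The removal step itself amounts to a plain wikitext substitution. As a rough sketch of the kind of edit involved (illustrative only, not OrphanBot's actual code; the function name and comment wording are invented for the example):

import re

def comment_out_image(wikitext, image_title):
    # Replace inline [[Image:...]] uses with an HTML comment, so table
    # layouts are preserved and editors can see that an image was removed.
    # Simplification: captions containing nested [[links]] are not handled.
    pattern = re.compile(
        r"\[\[\s*Image\s*:\s*" + re.escape(image_title) + r"\s*(?:\|[^\[\]]*)?\]\]",
        re.IGNORECASE)
    note = "<!-- Unsourced image removed: [[Image:%s]] -->" % image_title
    return pattern.subn(note, wikitext)  # (new_text, number_of_removals)

The removal count from subn is the sort of information that would feed the log written back to the image description page.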
- There is a conflict with NotificationBot regarding the removal of the link to the image in User talk space, unless the only thing you plan to do is to change it from an inline image to a linked image. Furthermore, removal of links may cause problems in the Wikipedia namespace when it comes to WP:IFD. --AllyUnion (talk) 09:29, 9 December 2005 (UTC)
- I specified that it doesn't touch anything in the User: namespace, or in most of the talk namespaces. Also, it doesn't do anything about images that have been linked, rather than inlined, so IfD is safe.
- Also, I've made a slight change in how it removes images. It now replaces them with HTML comments, to make seeing what it's done easier. --Carnildo 09:42, 9 December 2005 (UTC)
- I feel safer with it commenting it out, that way people understand there WAS a picture there, and something should replace it... rather than leaving a missing image notice. --AllyUnion (talk) 10:37, 10 December 2005 (UTC)
- How's this? [4] [5] --Carnildo 22:20, 11 December 2005 (UTC)
Hmm. Can we tag these with a category? I go through runs asking people to delete images, and I generally start with orphaned images, advertising to prospective deleters that the probability of someone complaining about the deletion is fairly low... But that is only true for naturally orphaned images. Also be aware that if you do this to fair use images it is going to cause them, ultimately, to become targets under the fair use CSD. (Of course, they are already CSD, so...) --Gmaxwell 03:44, 12 December 2005 (UTC)
- What sort of categorization are you thinking of? The bot already puts a list of what pages the image was removed from on the image description page, see [6] for an example.
- Any fair-use image that this bot removes is already a CSD, because it was unsourced. Making it an orphaned fair-use image won't change things one bit. --Carnildo 06:54, 12 December 2005 (UTC)
SuggestBot, matching people with pages they'd edit
Just an update, SB has made suggestions to a couple hundred people unsolicited without complaints, and to a few dozen who asked for recommendations. Overall feedback has been positive on its talk page. So, thanks to everyone who had thoughts on how to make it go in a way that would be acceptable. Per AllyUnion, I posted a request over on m:Toolserver to start the process of making this a freely-available tool, that still seems like a good idea, but haven't heard any response. Do I need to actively bug the people who manage it? -- ForteTuba 18:39, 8 March 2006 (UTC)
Hi, I'm doing some research to try to help people find pages to edit -- in particular, stubs that they might be willing to contribute to. We're doing some information retrieval stuff on a dump of the database to come up with algorithms for predicting edits based on other edits, and eventually we'd like to see if we can help real humans find articles they'd like to work on.
I am considering writing SuggestBot, a bot that selects a set of users, looks at their contribution history, picks a set (10-100, not sure yet) of stub pages they might be willing to edit, and posts that set of pages to the user's talk page. All processing except posting would happen on our lab's servers, using dumps of the Wikipedia database.
1. Whether the bot is manually assisted (run by a human) or automatically scheduled to run
- This bot would be manually assisted
2. The period, if any, we should expect it to run
- Sometime in January for a few weeks, more if it produces favorable results.
3. What language or program it is running
- Not yet developed, but it will likely use a standard framework for editing Wikipedia pages.
4. The purpose of your bot
a. Why do you need it?
- To test our algorithms for recommending work to do in Wikipedia with real people, to help reduce the number of stubs in Wikipedia, and to help increase the amount of content and the value of Wikipedia.
b. Is it important enough for the Wikipedia to allow your bot?
- I think so. A recent paper we did showed that 4 times as many people were willing to contribute to a movie database if we picked movies they had seen compared to other plausible strategies like picking recommended movies, random movies, or movies that appeared to be missing information. Showing these techniques work on Wikipedia could lay the foundation for increasing contributions to Wikipedia (and other online communities where people build community resources) and help to reduce the number of stub articles in Wikipedia.
The slightly scary part to us is that we modify user talk pages. I'm not sure how people will react, and I'm looking for community guidance. NotificationBot alters talk pages, but at a user's request. I wonder whether/how much this would be perceived as spam.
-- ForteTuba 23:35, 1 December 2005 (UTC)
- Interesting. I have some questions:
- How does it work? Does it look at the totality of what one user has edited, and the totality of what other users have edited, and then identify similar people (giving extra weighting to those who are extremely similar, and less to those who are semi-similar), and come to some list of recommendations based on what the people most like you have edited, but you have not? If so, I have wondered (purely conceptually) about doing the same thing for music. Basically we each like to think we're unique - but we're not, and often specific people (who have never met, and who live in completely different locations) will have very similar tastes. For example, I have a good friend who lives in Sydney and who has musical tastes totally unlike anyone else I know, who one day, purely by chance, found someone in the US who had published their extensive list of music they owned on the Internet, and it was almost a verbatim match for his music collection. At first it kind of freaked my friend out, but then he promptly went out and bought those items this other person had and liked, and he really enjoyed that new music. It saved him hours of research, of trial-and-error purchases - basically he could just go straight to the good stuff. He still checks back on the other guy's music collection, and will buy anything this other guy likes, because he knows that the chances he'll like it too are extremely high, because the totality of their musical tastes is extremely similar. Now, normally it's hard to do this for music because we just don't have the data (e.g. How do I know what music you've tried? How do you know what music I have tried? How do I know what you liked? How do you know what I liked?). However, with the Wikipedia, we do have some hard data, based on past edits. So, my question is: Is this how your approach will work (look at the totality of what I've edited, the totality of what you have edited, and if we're a very good match, then recommend to you some of the articles I have edited and that you have not yet edited)? Or is it something else?
- Spamming: Some users probably will see this as spamming. Better maybe to allow people to request this (at least at first), and give them some pages where they can provide feedback, both good and bad (e.g. positive feedback and negative feedback). You'll know after you've got 50 or 60 bits of feedback whether you're onto a winner or not. I'd certainly be willing to consider taking part in an early stage trial. -- All the best, Nickj (t) 00:56, 2 December 2005 (UTC)
- We're testing a bunch of algorithms and variations against edit histories in the dump. Some are collaborative filtering style, like what you describe above (there are music recommenders, like AudioScrobbler/last.fm that are said to use CF. Amazon does this for many/all of its things, and MovieLens is our CF research site). There are a bunch of CF-based algorithms and we're going to try a number of them. We're also doing some search engine-style matching, using your edit history as a query against the wikipedia database. We won't necessarily use all edits, we're trying to figure out whether we can tell you're particularly interested in a topic by looking at characteristics of edits (length change, is it a reversion, marked minor, etc.) and maybe only use some of your edits to build the query/your profile. We'll actually deploy the algorithms that seem to do best in our offline tests.
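To make the co-editing idea concrete, here is a minimal item-based sketch of that signal (purely illustrative; this is not SuggestBot's code, and the scoring is just raw co-edit counting):

from collections import defaultdict
from itertools import combinations

def co_edit_counts(edit_history):
    # edit_history: dict mapping editor name -> set of article titles edited.
    # Returns how many editors edited each pair of articles together.
    counts = defaultdict(int)
    for articles in edit_history.values():
        for a, b in combinations(sorted(articles), 2):
            counts[(a, b)] += 1
    return counts

def suggest(my_articles, counts, top_n=10):
    # Score articles the user has not edited by how often they are
    # co-edited with articles the user has edited; return the top_n.
    scores = defaultdict(int)
    for (a, b), n in counts.items():
        if a in my_articles and b not in my_articles:
            scores[b] += n
        elif b in my_articles and a not in my_articles:
            scores[a] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

A real collaborative filtering algorithm would also normalize for prolific editors and very popular articles, but the shape of the computation is the same.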
- Very good! I like that edits marked as minor (or where the amount of changed text is quite small) will be excluded. For example, as part of the Wiki Syntax Project, I frequently make small edits to topics that I personally don't care about, solely to clean up their wiki syntax. Also, I'm checking out Last.fm now - I had never heard of it before, so thank you for that! I'll also read over the collaborative filtering page (up until now I had only thought about the idea in a "wouldn't this idea be cool?" kind of way, and never knew before what the correct term was). One thing that may help you is if you can get users to show you their watchlists (if something is on my watchlist, chances are pretty good that I care about it), but since I don't think you can access this, it might be best to start off just using a user's edit history.
- I was excited about watchlists until I posted on Wikipedia:Village pump asking people to mail them to me and someone said "you know, I put a lot of pages I hope get deleted on my watchlist". So, edit history it will be. We might come up with an interface that allows people to type in some arbitrary text as well, not sure yet.
- On second thoughts, I agree with not using watchlists. I've reviewed the things on my watchlist, and I'm not so sure that they are representative of my interests (at least some things are listed because they were things I thought that should be deleted, some I thought should be merged, and some were things that I can no longer recall why they were originally watched, etc). Significant edits are probably the best indicator, as you say. -- All the best, Nickj (t) 01:26, 8 December 2005 (UTC)
- When you say request, how do you mean? Would the bot just put a link that says "hey, we can find you more things we think you'll edit, click here" on user talk pages so it's not as intrusive as just dropping a bunch of links on someone's page? Or are you talking about trying to put a link in a public place? Someone suggested changing a welcome template up above... wasn't sure other Wikipedians would go for that. Thanks for the thoughts. -- ForteTuba 12:37, 2 December 2005 (UTC)
- Well, I think you've got two issues. The first is, does it work and do people like it? For this, to start with, you might want to put up a notice at the Wikipedia:Village pump and the Wikipedia:Announcements page saying "we're running a trial and would like some participants" to get some people to test it and see what they think. This would be an opt-in type of thing (e.g. get people to add their names to a User:SuggestBot/sign me up for the trial page), so none of the participants will complain about spamming, and you're bound to learn something from the feedback that people give (both what they like about it, and what they don't like about it, and you should fine-tune it based on that feedback), and once you've done a trial with public feedback then people will be far more receptive on the second issue. The second issue is once you've got a working system, how do you tell people about articles they might like, without it being considered spamming? The danger for you is that if even a handful of people consider it to be spamming and complain, then your bot is likely to get blocked (just telling you the way it is - basically the onus is on you to structure it so that people will not get annoyed by this). For telling users about articles they may be interested in, I think you have three categories of users:
- Posting asking for watchlists on the pump didn't work very well; I've gotten 5 in two-ish weeks. But I agree, getting people to opt in would be nice. Maybe the bot could create subpages of user pages (or of user talk pages) that contain suggestions and then put a polite little note on the talk page. Not quite as intrusive as an in-your-face list of suggestions. Opting in via community pages mostly only works for experienced users... which dovetails nicely with your next point.
- Existing established users (these you should probably require to opt in, and should not add to their talk pages unless they opt in, as they are the most likely to complain loudly and extensively, and are the most likely to have already found the topics they are the most interested in).
- Heh, I agree about both points.
- Anonymous/not logged in users who use an IP address (you should probably not do anything with them, as they may be existing established users who have not logged in, or they may be a user who makes only one edit and is never seen again).
- Also agreed.
- New users who have a named account that's quite new and who have only a few edits (and for these users you could consider adding something to their talk page). For example: "Hello, I am the SuggestBot, and I notify new users about articles that they may be interested in, based on the edits that they make. I have included a list of the top 10 articles that you have not edited yet, and which people who edited the articles you have edited also liked. My suggestions are: (insert ranked bullet list of top 10 suggestions). I hope that you like these suggestions, but if you don't then please don't be alarmed because I will not leave you any more messages (unless you explicitly ask me to at my opt-in page). If you have any comments on this list, both good and bad, we would love to hear them at (insert link to positive feedback page and negative feedback page)."
- Hope the above helps. Personally I think the approach outlined above will probably work, but others may have different opinions. -- All the best, Nickj (t) 02:20, 5 December 2005 (UTC)
- This is close to what we were looking at. Opt-in for experienced via community portal and a polite page creation bot that samples infrequent/new editors might be a winner. Thanks for your thoughts, let me know if you have more. -- ForteTuba 21:55, 6 December 2005 (UTC)
- All sounds good, but just a quick thought: if you're going to create talk subpages, you might want to not put them under the user's talk page, but rather under the SuggestBot's talk page (e.g. User:SuggestBot/suggestions/Nickj), and then put a quick note on the user's talk page linking to that. From a technical perspective, it makes almost no difference whether it's stored under the user's page or SuggestBot's; but from a human psychology perspective, they're very different things: one is a bot walking into "my" space and messing with it, whereas the other is the bot making a list in its space and then inviting me into that space to review the list. It's quite possible to get annoyed by the former, whereas it is much harder to get annoyed by the latter. -- All the best, Nickj (t) 01:26, 8 December 2005 (UTC)
- That is a super-good idea. -- ForteTuba 03:45, 10 December 2005 (UTC)
- I think this is a great idea, but I wanted to note that merely looking at who has edited an article, and perhaps the number of edits they made, isn't terribly indicative. What really indicates interest is substantial edits, where they have contributed significant new content or offered up a detailed explanation in the edit history or talk page. For example, many users performing disambiguation will edit hundreds or even thousands of pages they really don't give a damn about.
- Also, keep in mind that unlike Amazon, our articles are strongly interlinked (some say you can't find two articles more than 6 links apart if you tried). These links, along with properties such as belonging to the same category/list, could be a valuable way of finding related articles once you've established the sort of article a person likes.
- One last comment: don't recommend articles people already know about if you can. This would be really annoying. :-) Deco 06:16, 8 December 2005 (UTC)
- Definitely agree suggesting things that the user has already edited in any way, shape or form is bad. Categories could be useful; Also articles that link to an article that the person has edited, but which are not backlinked, could be good (e.g. you've extensively edited A, and B links to A, but A does not link to B, and you have not edited B in any way; In this situation, you may like B, but not know about it). Could be more a "future directions" thing though. P.s. Off-topic: For a proof-by-existence of two articles being more than 6 degrees apart, see Gameboy to Maine Coon, which are at least 10 steps apart (but in one direction only). -- All the best, Nickj (t) 07:02, 8 December 2005 (UTC)
- We were thinking of using local link structure to edited articles as a reasonable baseline recommender. We've thought about building one that uses categories to help cluster edits and make sense of interests; that is probably a Next Step compared to fielding a few plausible and easy-to-construct ones and seeing how people react first.
- As for the substantive edits, we're trying both using all edits and trying to filter out non-substantive ones. A naive, ad-hoc approach to filtering non-substantive did pretty poorly compared to using all edits on the offline data, but it's hard to say how people would react without trying it both ways. (And, there are other strategies for filtering edits that we haven't tried yet.) It's a good suggestion. -- ForteTuba 03:45, 10 December 2005 (UTC)
Okay, I sincerely recommend that this not be a bot, but rather a tool like Kate's contribution tool. We should have the ability to simply load, based on a user's name, what pages we should attempt next. This provides it to anyone who wants to use it. You can even sign up for a m:Toolserver account to place your new script tool there, since the Toolserver has access to database dumps anyway. --AllyUnion (talk) 09:27, 9 December 2005 (UTC)
- Interesting idea, why do you recommend so? My initial reaction: I don't know enough about the toolserver to know if this is a plus or minus at this stage. Long run it's probably a win, but currently some algorithms are slow and would be hoggy and not suitable for interactive use (think many-word queries against the searchindex). Also, we would probably be happier in our research hearts in the short term to control who gets recommendations when, for the purpose of getting more generally useful and more likely valid results. I definitely think in the long run it would be nice for people to be able to get results interactively and of their own volition. FWIW, the bot wouldn't be running continously: we'd probably pick a set of users, run the algos offline for them, and put up the suggestions as a batch, then get a dump a week or two later to see what happened. -- ForteTuba 03:45, 10 December 2005 (UTC)
- There is one problem that I have not yet worked out with NotificationBot, and that is how to deal with user talk pages that have been redirected cross-project (i.e. a person may have a meta redirect) - the closest answer I have so far is to allow the site to do the work for you, but that would involve a lot of correction and assumptions in the pywikipedia code.
- Huh... yeah, that could be annoying. Is it painful to chase the redirects yourself?
- That aside, it's my feeling that you could easily create some kind of tree of articles for each article on the Wikipedia. So if we have an article on A, it follows that if B relates to A, then it should be in the set S1. (Then we continue to build different sets of S for each article in the Wikipedia. I suppose somewhat on the idea of Nick's Link suggestion bot...) I figure that you could basically create a network for what relates to what article using trees and such. Then when a user queries against his own name, you could then pull up a list of articles he or she has edited, pull up the trees, and generate a list based on tree relationships. Does that make any sense to you? --AllyUnion (talk) 10:34, 10 December 2005 (UTC)
- Yes, it makes sense. There are a lot of ways to define "relates": text similarity, pairs of pages that are edited by the same person, link structure, categories, and more I'm sure. For now we're going after the first three:
- text similarity because it's so obvious to try (and is okay in our offline experiments, though slow)
- co-editing because collaborative filtering is effective in lots of domains (and appears to do fairly well here in limited dump-based experiments, and is fairly fast)
- a simple link-based relatedness algorithm because it's both reasonable and something that you could imagine people doing manually (still to be built)
- As for building a relationship structure (trees are one way to do it), the relationships get represented differently for each. In the text similarity case, we use MySQL's built-in text indexing because it's dead simple. For co-editing, we use revision history info from a full dump, lightly processed and with text removed, to establish relationships between editors (and then to articles). Eventually I want to implement it using the revision table directly. For link-based relatedness I'm looking to use the pagelinks table. I'm hoping that with just a little optimization the last two will be quick enough to run online so we won't have to build the whole relatedness model: time-consuming and space-expensive. -- ForteTuba 16:04, 11 December 2005 (UTC)
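For the pagelinks-based variant, a query against a local copy of the dump could be sketched as follows (assumptions: the MySQL-python driver, the standard page/pagelinks schema, and titles stored with underscores; this is not the deployed code):

import MySQLdb  # MySQL-python driver, assumed available alongside a local dump

def linked_from_edited(edited_titles, db_params, limit=50):
    # Rank main-namespace pages by how many of the user's edited articles
    # link to them. edited_titles must use underscores, e.g. "Czech_Republic".
    conn = MySQLdb.connect(**db_params)
    cur = conn.cursor()
    placeholders = ", ".join(["%s"] * len(edited_titles))
    cur.execute(
        "SELECT pl.pl_title, COUNT(*) AS weight"
        " FROM page AS p JOIN pagelinks AS pl ON pl.pl_from = p.page_id"
        " WHERE p.page_namespace = 0 AND pl.pl_namespace = 0"
        " AND p.page_title IN (" + placeholders + ")"
        " GROUP BY pl.pl_title ORDER BY weight DESC LIMIT %s",
        list(edited_titles) + [limit])
    rows = cur.fetchall()
    conn.close()
    return rows  # list of (linked_title, weight) tuples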
Mathbot
This bot has been adding category tags incorrectly, so I've blocked it for 3 hours and notified the author. I'm going offline now, so I hope an admin can keep an eye out to see if the problem is A) fixed before the three hours is up, so that the bot can be restarted, or B) the block extended if the bot carries on after 3 hours with the same errors. Cheers! — Matt Crypto 01:59, 4 December 2005 (UTC)
- Yes, my bot went nuts. I was running it in semiautomatic mode, checking from time to time how it was doing. Due to a bug (which I meant to be a feature) in my code, it started spitting garbage. Now the bot is stopped, and I don't plan to use it until I figure out the problem (it screwed up the last 32 of the 354 articles it changed, and I reverted all of those and checked a bunch more). Sorry for that, I will pay much more attention to this kind of thing from now on. Oleg Alexandrov (talk) 02:26, 4 December 2005 (UTC)
- Given the above comments, I unblocked User:Mathbot. -- Jitse Niesen (talk) 03:42, 4 December 2005 (UTC)
Mathbot is pasting messages on users' talk pages asking them to write more edit summaries. I find this very obnoxious. Wikipedia is supposed to improve by cooperation and peer review in the act of editing itself. The kind of eye-in-the-sky monitoring this bot function performs is IMHO quite contrary to that spirit. Also, I think there should be a very high burden of proof on bots that leave messages on users' talk pages. Bacchiad 17:13, 3 March 2006 (UTC)
CricketBot bot flag
I originally proposed CricketBot further up this page. It has now been running for over two weeks with no complaints, and positive feedback from WikiProject Cricket, so I've applied for a bot flag at m:Requests for bot status#en:User:CricketBot. Thank you. Stephen Turner (Talk) 10:25, 4 December 2005 (UTC)
Pfft Bot, again
I haven't run Pfft Bot much in a while, but have recently started using it again. It still has no bot flag. Anyway, I got a request from Natalinasmpf to upload ~300 (actually ) small Xiangqi-related images. They are in PNG format. I'm told they're a total of around 3 megabytes, but I have not received the images yet. I have source and licensing information on them (gfdl-her). Additionally, I'm assuming upload.py in pywikipedia will do what I want? --Phroziac . o º O (mmmmm chocolate!) 03:24, 5 December 2005 (UTC)
- Ok, now she wants me to upload more, for {{Weiqi-image}}, to upgrade the images. Am I going to need to post here for every run I want to do? It's not like I'm uploading copyvios or anything :) --Phroziac . o º O (mmmmm chocolate!) 03:46, 5 December 2005 (UTC)
- Dude, upload them to commons...there's no reason not to--Orgullomoore 08:48, 5 December 2005 (UTC)
- I thought about that last night, and well you're right. I put a similar request on the commons village pump a few minutes ago. :) --Phroziac . o º O (mmmmm chocolate!) 16:39, 5 December 2005 (UTC)
By the way, I just noticed pywikipedia has a script to copy images to commons and put a NowCommons template on wikipedia. Any objections to me using that? --Phroziac . o º O (mmmmm chocolate!) 00:36, 6 December 2005 (UTC)
- None from me. --AllyUnion (talk) 09:22, 9 December 2005 (UTC)
New bot!
I have made a new bot. Well, it's not necessarily a bot, as it is primarily designed for semi-automatic editing. There are many features to come, but if anyone wants to see what it is like then I would really like some feedback; let me know if you want a copy. See User:Bluemoose/AutoWikiBrowser for more details. Martin 21:33, 8 December 2005 (UTC)
- This would be classified as a tool more than anything else... so long as this tool doesn't perform high-speed edits and makes users log in under their own accounts, then we're okay. --AllyUnion (talk) 09:21, 9 December 2005 (UTC)
- It can function as a bot if you want. At the moment it is lacking some functionality that pywikibots have, but ultimately it should be more powerful, and much easier to use. Martin 12:41, 9 December 2005 (UTC)
- I'd rather not have it function as a bot. Unless you can build in safety functions to prevent the tool from being abused and make certain that that part of the functionality is approved on a per-user basis, I'd rather you not make it easier to create an automatic editor which may have the potential to damage the site rather than help it. Furthermore, please ensure that you add throttles to your code, as it is an apparent dev requirement. --AllyUnion (talk) 10:12, 10 December 2005 (UTC)
- It does have a throttle, and also requires users to be logged in and their version of the software enabled here. I will disable the bot function on the freely available version anyway; it will only be available on request. Remember that anyone could use a pywikibot, which is potentially just as dangerous. Martin 10:48, 10 December 2005 (UTC)
- The pywikipedia bot framework has some built-in restrictions to at least prevent serious abuse. Additionally, while anyone can use the pre-existing bots available in the framework, some programming knowledge is required to turn the tool to be abusive. --AllyUnion (talk) 23:08, 10 December 2005 (UTC)
- I am working on making it so it only works for users who have their name on a certain page (and are logged in with that name), so that way no one will be able to abuse it at all. Martin 23:11, 10 December 2005 (UTC)
- Can we make it so that they have to be a logged in user with a bot flag? --AllyUnion (talk) 09:53, 11 December 2005 (UTC)
DFBot
I have created DFBot. Right now it has only one task. Once an hour it reads WP:RFA and creates a summary in my userspace of open noms. Maybe it will do more in the future, but that's it for now. Dragons flight 09:53, 18 December 2005 (UTC)
- Approved for one week testing, may apply for a bot flag after one week. Please add your bot to the Category:Wikipedia bots, and list it on the Project page. If you should wish to expand the scope of your bot, please make sure you get it approved here first. --AllyUnion (talk) 12:29, 25 December 2005 (UTC)
AutoWikiBrowser regex to reduce date links
Martin has been kind enough to embed the date delinking regex as an option in AutoWikiBrowser. I have published its objectives and the actual regex in the hope that openness can lead to improvements. Please look at User_talk:Bluemoose/AutoWikiBrowser and suggest improvements. Bobblewik 20:06, 19 December 2005 (UTC)
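The actual regex and its objectives are on the page linked above; purely as an illustration of the general idea (not Bobblewik's regex, and deliberately simplified), a date-delinking substitution looks something like:

import re

# Illustrative only: unlink bare date links such as [[25 December]],
# [[December 25]] or [[2005]]. Real date delinking has to cope with piped
# links, date-preference formatting and ISO dates, which this ignores.
MONTHS = ("January|February|March|April|May|June|"
          "July|August|September|October|November|December")
DAY_MONTH = re.compile(r"\[\[(\d{1,2} (?:%s)|(?:%s) \d{1,2})\]\]" % (MONTHS, MONTHS))
YEAR = re.compile(r"\[\[(\d{4})\]\]")

def delink_dates(wikitext):
    return YEAR.sub(r"\1", DAY_MONTH.sub(r"\1", wikitext))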
AutoWikiBrowser -> flag needed?
I was fixing up some categories after a CFD when I found AutoWikiBrowser. I used it to quickly fix all the links to the categories. I found myself making ~8 edits per minute. Is that too many for a non-bot account to be making? The "Bots running without a flag" section says I should be waiting 30-60 seconds between edits (granted it's not really a bot, but the effect is similar). Should I be using a separate account (eg. User:BrokenBot) with a flag to do such cleanups? Broken S 03:50, 20 December 2005 (UTC)
- Yes, as I mentioned above before to Martin (Bluemoose), that his tool at that speed would in effect qualify as a bot for anyone who uses it. Users who wish to use his tool beyond the recommended limit must apply for a separate account, and run any high speed edits under a bot flagged account. --AllyUnion (talk) 12:32, 25 December 2005 (UTC)
No template?
Is this a guideline? A policy? A rule? What? How did it make it into the Wikipedia:Policies and guidelines without a heading template? Stevage 14:09, 21 December 2005 (UTC)
- My interpretation of this question is: Do you think we should have a {policy} or {guideline} template at the top of the project page; Wikipedia:Bots?--Commander Keane 14:51, 21 December 2005 (UTC)
- My humble thanks for interpreting my poorly phrased query :) Stevage 14:55, 21 December 2005 (UTC)
- This page has been around since long before the {policy} and {guideline} templates existed. One would think this page would be grandfathered in, but I think no one has bothered to add that template to the top of the page. --AllyUnion (talk) 12:35, 25 December 2005 (UTC)
I have now added a {{policy}} template, I think you'll all agree that is correct. Martin 00:10, 27 December 2005 (UTC)
- Thanks. I have just edited the comments above to stop them linking to the policy template. Not for any really good reason, in retrospect. Stevage 18:52, 6 January 2006 (UTC)
Request: Wybot
Wybot is a Python bot by user:WonYong. It uses the pywikipedia framework. It is manually used. Period: 1 year or less. I cannot log in frequently.
- Unless you can describe the purpose of your bot in further detail beyond what you have given above, your bot will not be approved to run on the English Wikipedia. --AllyUnion (talk) 12:36, 25 December 2005 (UTC)
- Purpose: test. I will study the bot usage without a flag. -- WonYong 22:53, 30 December 2005 (UTC)
RefBot
RefBot uses m:standardize_notes.py to process references and citations. It was developed in SEWilcoBot but has become specialized enough and there is enough demand for it that it has been separated to have its own identity. (SEWilco 19:57, 23 December 2005 (UTC))
- I do not know to what extent the RfAr ruling bears on this, but should you wish to apply for or run this bot, I believe it requires approval of the Arbitration Committee first. --AllyUnion (talk) 00:01, 27 December 2005 (UTC)
Object - Too many citation-based conflicts surrounding this user. -- Netoholic @ 12:41, 1 January 2006 (UTC)
- The Arbitration Committee has ruled: "Please assume the broadest possible interpretation. We will back up any administrator that blocks you under a broad interpretation. Meanwhile help work out policy. Fred Bauder 18:59, 28 December 2005 (UTC)". Therefore, since this account is already blocked, the request is denied. --AllyUnion (talk) 08:06, 4 January 2006 (UTC)
- RefBot will not be running the code which presently is public at m:standardize_notes.py and will behave differently. The ArbCom, in an improper RFAr, placed WP:V restrictions upon me under any account so the specific account name is not relevant. The RefBot account is blocked by someone who claimed it was created to bypass the ArbCom ruling, yet he also confirmed that the ruling applies to me under any account as I had stated, then he went on vacation (David Gerard's status is on WP:AC). The block does not yet matter, as RefBot is still being rewritten. (SEWilco 09:52, 11 January 2006 (UTC))
Adrian Buehlmann Bot
See Adrian Buehlmann Bot. Manually assisted bot without flag (edits visible in recent changes). Replace template calls and update list of used templates of articles. Uses Python Wikipediabot Framework. Adrian Buehlmann 19:27, 25 December 2005 (UTC)
- Approved, with the intention of running without a bot flag. Please make sure you list your bot in the correct section on the project page. May not apply for a bot flag, due to the nature of the bot. --AllyUnion (talk) 00:03, 27 December 2005 (UTC)
- Object - I am uncomfortable with this user running a bot. Has shown some bad judgment regarding templates in the past. -- Netoholic @ 12:40, 1 January 2006 (UTC)
- An emotional personal attack by Netoholic based on a bilateral technical disagreement about how to migrate a template. See [7] (lower part). Not based on any bad edit by me or my bot. I even reverted the template in question (template:Infobox President) to Netoholic's version, which he said was a good revert to help fix a problem. I did not make a single revert of Netoholic or his bot. I've suspended my bot for the moment until further instructions are given. Adrian Buehlmann 13:12, 1 January 2006 (UTC)
I would like permission to use the above bot to aid in renames for WP:SFD in the same way as Mairibot, mentioned earlier.
It will use the following components of the pywikipediabot to accomplish this:
- m:template.py - for template renames
- m:touch.py - for category renames
--TheParanoidOne 14:00, 26 December 2005 (UTC)
- Approved for trial run for one week, may apply bot flag after trial run. Please make sure your bot is listed on the project page. --AllyUnion (talk) 00:09, 27 December 2005 (UTC)
- Request made for bot flag. --TheParanoidOne 22:20, 3 January 2006 (UTC)
Ugur Basak Bot
This bot uses the pywikipedia framework. I've been using this bot on :tr and :az. It is mostly used for mass category creation etc., but it also works as an interwiki bot. My main reason for requesting to work on en is that on the tr Wikipedia there are interwikis to :en, but on :en the same articles mostly don't have a :tr interwiki. I'll run the bot on :tr using multi-login, and when it finds that :en doesn't link to :tr it will update the article on :en too. For my user name: Ugur Basak --Ugur Basak Bot 21:23, 29 December 2005 (UTC)
- Approved for a trial run of one week; may apply for a bot flag after that if there are no complaints. --AllyUnion (talk) 12:18, 1 January 2006 (UTC)
- I guess I must add my bot's name to the list on the project page. I'll start a test run and check how it works here. --Ugur Basak 11:23, 2 January 2006 (UTC)
- changed group membership for User:Ugur Basak Bot@enwiki from (none) to bot --Walter 08:53, 7 February 2006 (UTC)
Request: User:CPBot
I have discovered WP:AWB (as have many others looking up this page) and would like to change all of the occurrences of "Pokemon" to "Pokémon" - the correct spelling. "Pokemon" is never right. --Celestianpower háblame 12:50, 30 December 2005 (UTC)
- I don't see potential for a bot getting this wrong, as long as it's properly implemented. Support trial run. --IByte 22:14, 30 December 2005 (UTC)
Sadly, Martin has removed functionality under Windows 98 so I won't be able to use this. However, if Martin re-enables it - it would be nice to have this ready to start. --Celestianpower háblame 12:37, 1 January 2006 (UTC)
statbot - non-editing
I'm not sure where to request this, but I would like to have permission to run a robot which downloads random pages in small numbers (say 200 to a maximum of 1000, at a rate of maybe four per minute maximum) from Wikipedia for some statistical analysis. There are several reasons why I don't want to do this from a DB dump.
- Size - I only need a small amount of data, not all of the data.
- Proper randomness - I don't want to select from a limited dump.
- Proper representation - I want to see it _exactly_ as it would be seen by an end user.
As far as I can see, the impact will be minimal since it's just normal page reads, and quite slow is fine by me. In fact I've already been doing this manually to some extent, so it wouldn't make any real difference to end usage. Comments? Mozzerati 21:35, 1 January 2006 (UTC)
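For what it's worth, the fetch loop being described needs nothing more than Special:Random and a throttle. A sketch of it (the URL, rate, sample size and user-agent string are assumptions, not a specification of the actual tool):

import time
import urllib2  # Python 2 standard library, same era as the pywikipedia tooling

RANDOM_URL = "http://en.wikipedia.org/wiki/Special:Random"
SAMPLE_SIZE = 200        # low end of the 200-1000 range mentioned above
SECONDS_BETWEEN = 15     # at most four requests per minute

def sample_random_pages():
    # Fetch a small random sample of rendered pages, logged out, at a slow
    # fixed rate, keeping the final URL (the article the redirect landed on).
    opener = urllib2.build_opener()
    opener.addheaders = [("User-Agent", "statbot-sample (research; see talk page)")]
    pages = []
    for _ in range(SAMPLE_SIZE):
        response = opener.open(RANDOM_URL)
        pages.append((response.geturl(), response.read()))
        time.sleep(SECONDS_BETWEEN)
    return pages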
- As long as it downloads the pages while logged out, any impact should be minimal. --Carnildo 07:19, 3 January 2006 (UTC)
Vina-iwbot would like to run on English Wiki
I would like to request permission to run my interwiki bot User:Vina-iwbot on English wiki. I will be running the latest version of pywikipedia bot (have been for over a year in Chinese wiki) under the new multi-login mode. English wiki is updated to create the backlink to Chinese pages.
Waiting for permission before starting trial run.
--Vina 23:43, 1 January 2006 (UTC)
- Approved for trial run for one week. Please add your bot to Category:Wikipedia bots, and on the project page. May apply bot flag status after one week free of complaints. --AllyUnion (talk) 10:44, 3 January 2006 (UTC)
Minor complaint about bot
The bot is occasionally forgetting to sign in, and is making edits from User:71.241.248.89. See the history of changes to Pope Pius XII. It appears to be working otherwise. Robert McClenon 17:00, 5 January 2006 (UTC)
- Asking Vina to stop. --AllyUnion (talk) 19:08, 5 January 2006 (UTC)
- No longer running on English wiki. I have sent a message to the mailing list, asking the developers to check what is going on. --Vina 22:16, 5 January 2006 (UTC)
new template for bot users (in case of emergency)
I made {{emergency-bot-shutoff}} for bot owners to put on their bots' user pages. This template leans more towards the fun side! :) --Ixfd64 02:12, 2 January 2006 (UTC)
- This has the potential to actually block the bot and user... just FYI. --AllyUnion (talk) 10:41, 3 January 2006 (UTC)
User:Sashato
I'm requesting bot status (a flag) for the bot User:Sashato. This bot is active on the sr: and hr: wikis. Its only job here will be adding interwiki links on the en: wiki. I use pywikipediabot for this interwiki job. I can speak English, Serbian, Bosnian and Croatian. Thank you — Stefanovic • 03:02 2-01-2006
- Approved for trial run for one week. Please add your bot to Category:Wikipedia bots, and on the project page. May apply bot flag status after one week free of complaints. --AllyUnion (talk) 10:44, 3 January 2006 (UTC)
- Just noticed this edit made by User:Sashato which included interwikis for ja, he, no etc., which are outside the scope of the description above. I thought the policy was to only add interwiki links for languages you speak, but I can't find that written down anywhere. --Commander Keane 09:34, 11 January 2006 (UTC)
- Hmm, yes, I see that edit, but what is wrong with that? Every interwiki is OK and goes to the same article on all wikis. — Stefanovic • 04:21 13-01-2006
- I'm just wondering how you can be sure that each of the interwiki links are correct if you can't speak the language. Do you guess?--Commander Keane 08:27, 14 January 2006 (UTC)
- I got the bot flag. Commander, you don't know how pyinterwikibot works, so please don't ask that stupid question. Thank you all. (If someone wants to leave me a message, leave it on my user talk page.) — Stefanovic • 03:29 16-01-2006
Request for permission for UBXBot
I am requesting permission for a trial run of User:UBXBot, a bot that will assist with the WikiProject Userboxes task. Its primary tasks are touching pages and mass-editing userbox template includes with the permission of WP Userboxes. --Grand Edgemaster Talk 00:07, 4 January 2006 (UTC)
- There seem to be several issues regarding user templates, and I would suggest holding off until the matter is settled. --AllyUnion (talk) 07:55, 4 January 2006 (UTC)
- That probably would be the best target. I don't want any more conflict with the terrible mess that there is/was. --Grand Edgemaster Talk 21:27, 4 January 2006 (UTC)
79 androids request
I'd like to request permission for a trial run for a Python Wikipediabot, 79 androids. It will be human-assisted only and will be used to help automate standardization of the usage of {{Mlbplayer}} and other templates related to Wikipedia:WikiProject Baseball players. android79 03:31, 4 January 2006 (UTC)
- Approved for trial run for one week, may apply for bot status after. --AllyUnion (talk) 07:56, 4 January 2006 (UTC)
User:Alperenbot
Used in Turkish Wikipedia (tr.wikipedia.org)
This bot is used to start stub articles with coordinates + location + external Google Maps links. It was running successfully, however the bot account has been suspended indefinitely by a bureaucrat. I believe that this action is illegal, and I want my right to use the bot back, editing only 1 article per minute. I accept my fault in running an unapproved bot at a speed of 5 articles per minute, but an indefinite suspension is a heavy punishment, because the bot was creating useful articles, not garbage. I request that my bot be activated to create articles again, and I promise not to run it too fast.
My account: http://tr.wikipedia.org/wiki/User:Alperen My Bot account: http://tr.wikipedia.org/wiki/User:Alperenbot
Thanks
I'm sorry, but this is the English Wikipedia. We have nothing to do with the approval or disapproval of what happens on the Turkish Wikipedia. Please go to m:Requests for bot status if you want anything done. --AllyUnion (talk) 13:26, 5 January 2006 (UTC)
Requesting permission (trial run). This assisted-bot will fix typos using Bluemoose's AutoWikiBrowser and possibly disambiguate pages. This bot will be manually run and each edit will be checked. Gflores Talk 04:22, 6 January 2006 (UTC)
- Please sign your username --AllyUnion (talk) 04:03, 9 January 2006 (UTC)
- I will also be running solve disambiguation.py assisted bot from the pywikipedia framework on the account. Gflores Talk 06:19, 9 January 2006 (UTC)
Approved for one week trial. --AllyUnion (talk) 10:47, 12 January 2006 (UTC)
User:U-571
The page User:U-571 says it's a bot, but I didn't find it listed. "What links here" doesn't give a clue either. Does anybody know something about it? Who operates it? Adrian Buehlmann 16:56, 6 January 2006 (UTC)
- In theory it should be blocked, but it isn't doing any harm; if the owner doesn't present himself soon then I would block it. Martin 17:04, 6 January 2006 (UTC)
- Yes, what it did looks good (as far as I've seen). Maybe that user just does not know what the term "bot" means :-). Adrian Buehlmann 17:12, 6 January 2006 (UTC)
DFBot - AFD Summaries
I am experimenting with using a script to generate summaries of ongoing AFDs. The work in progress can be seen at User:Dragons flight/AFD summary. Right now the only updates come when I am working on it and trying to perfect the analysis code, and I would appreciate feedback on what people think of the idea and what can be done to make it useful. Eventually, I'd intend to have it update automatically from the DFBot account. Dragons flight 07:59, 8 January 2006 (UTC)
- Approved for one week trial run. --AllyUnion (talk) 10:46, 12 January 2006 (UTC)
It's been a month or so since I got approved for the trial.... well I've used it only a few times over the last month but it seems to work out as intended. Can I do 30 seconds in between edits instead of 60? --Rschen7754 (talk - contribs) 08:40, 8 January 2006 (UTC)
- From Wikipedia:Bots#Current policy on running bots:
- 3. Until new bots are accepted as ok they should wait 30-60 seconds between edits.
- I've been running mine at 30 second intervals after the first week, based on this statement. --TheParanoidOne 11:58, 8 January 2006 (UTC)
If you wish to exceed the 30 second limit and run at 10 second intervals, please apply for a bot flag. --AllyUnion (talk) 10:46, 12 January 2006 (UTC)
Anonymous bots with no information?
At the moment User:70.193.215.124 is making robot-like interlanguage link edits, and getting a few wrong, but there is no information in the edit summary on what bot is running. The user page and talk page are empty, so I assume it's a user not logged in. However, is it good policy that robot edits can occur with insufficient information in the edit summary? -Wikibob 14:22, 8 January 2006 (UTC)
- Post on the IP's talk page; if you can't get in touch with him that way, consider requesting a block. Note that I've done some pretty major interwiki updates across many wikipedias by hand on a mostly anonymous basis with no edit summaries, so I wouldn't necessarily assume this is a bot.
This is User:Zscout370 here, but under my bot account. What I want to do is use replace.py under this account to replace flag images, such as from Image:foo flag large.png to Image:Flag of Foo.svg. I have been doing this by hand for months now, but given the scope of the whole process of changing flag images, I believe that a bot should run the task. And since I wish to use python, I have created my own bot account so that the python task can run on this one, not on my main account. If there are any suggestions or comments, you can leave them at User talk:Zscout370. Thank you. Zbot370 20:49, 9 January 2006 (UTC)
- Approved for trial run & testing for one week. May apply for a bot flag after no complaints after trial run. --AllyUnion (talk) 10:43, 12 January 2006 (UTC)
- Thank you, but I was wondering if I can get assistance with running replace.py. I am logged in as the bot now on WP and on Python, but I am just having trouble running the program itself. I am trying to replace Image:Flag of Mexico.png with Image:Flag of Mexico.svg as my trial run. Zbot370 22:27, 16 January 2006 (UTC)
- Is there a specific problem? Martin 22:43, 16 January 2006 (UTC)
- I am just having problems trying to run it. This is my first time running python (I should have mentioned this). Zbot370 22:46, 16 January 2006 (UTC)
If all the pages that need changing are in a specific category you would enter the command:
python replace.py -cat:category_name -putthrottle:20
hit return then enter what to replace then what to replace it with. Or if you have the names of the articles that need changing in a txt file called abc.txt in the directory of the bot you could enter
python replace.py -file:abc.txt -putthrottle:20
Does this help? Martin 22:55, 16 January 2006 (UTC)
- Kind of. I am trying to follow the steps on Meta about how to run the program on Windows, but I am still not able to run anything, since when I use the "cd" command to locate the files, it does nothing. Would it be best to continue the discussion on my bot's talk page? Zbot370 01:07, 17 January 2006 (UTC)
- NVM, I got help on IRC and things are running smoothly now. Thanks again. Zbot370 04:57, 18 January 2006 (UTC)
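Following up on the replace.py commands above, a hedged guess at what the full non-interactive command might look like for the Mexico flag test; argument handling varied between pywikipedia versions, so it is worth checking python replace.py -help before relying on it:
python replace.py -file:abc.txt -putthrottle:20 "Image:Flag of Mexico.png" "Image:Flag of Mexico.svg"
With no old/new text given on the command line, the script instead prompts for them interactively, which is the workflow Martin describes above.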
Permission
I was, unfortunately, unable to get WP:AWB to run properly (see [8], [9]). Thus, I've downloaded the pywikipedia kit and would like permission to run a manually assisted bot for such operations as fixing ambiguous links, double redirects, wiki-syntax errors, etc. I hope to expand this to other tasks as I become more familiar with the programming language, but all edits will be manually reviewed. — FREAK OF NURxTURE (TALK) 11:22, Jan. 11, 2006
- You will need a separate account for such a purpose. --AllyUnion (talk) 10:42, 12 January 2006 (UTC)
- I'm aware of this; I haven't created it yet, as I am seeking approval first. — FREAK OF NURxTURE (TALK) 10:48, Jan. 12, 2006
- OK, the bot has been created and considers itself to be in trial mode. — FREAK OF NURxTURE (TALK) 10:34, Jan. 18, 2006
Just to keep everyone here up to date, User:Crypticbot is now maintaining WP:TFD's daily subpages in much the same manner as AllyUnion's bots maintain AFD/CFD. The major difference is that, in order to keep from breaking the watchlist for TFD like it was for AFD and CFD (see archived discussion), I'll be creating the daily subpages by a page move from Wikipedia:Templates for deletion/Log/Seed. Sorry about the lack of notification, but we didn't get any advance warning of the split on Wikipedia talk:Templates for deletion, either. —Cryptic (talk) 01:11, 14 January 2006 (UTC)
- What, your crystal ball broke? Seriously, good work on getting this working so quickly. Impressive. KillerChihuahua?!? 22:07, 14 January 2006 (UTC)
html => wiki bot
Using a new database analysis tool I just made, I have identified ~7000 articles that contain <i> or <b> HTML markup. I would like to run my bot to convert this to standard wiki markup, using the pywikibot framework. I am aware that the HTML works over multiple lines but the wiki markup doesn't (as demonstrated here). Thanks. Martin 22:03, 14 January 2006 (UTC)
- Careful with this one: if there was a movie or book called 'Bots, it would probably be coded as <i>'Bots</i>, because '''Bots'' would create Bots in bold with the leading apostrophe swallowed, which would be improperly formatted and could also affect whatever text follows. Thus, it would be ideal to have the bot either ignore cases like this, or change any affected literal apostrophes to some &#nnn; or ‘ ’ instead. — FREAK OF NURxTURE (TALK) 00:17, Jan. 15, 2006
- I see what you mean; it should be pretty easy to avoid those situations if I am careful with some regular expressions. Thanks. Martin 00:24, 15 January 2006 (UTC)
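A minimal sketch (not Martin's actual code) of the kind of regular-expression handling discussed above: it converts simple single-line <i>/<b> spans and deliberately leaves alone anything that crosses a line break or starts or ends with a literal apostrophe, such as the 'Bots case:
import re

def html_to_wiki(text):
    # Convert <i>...</i> and <b>...</b> to ''/''' wiki markup, but skip matches
    # that span multiple lines (wiki quote markup does not) or whose contents
    # begin or end with an apostrophe (e.g. <i>'Bots</i>), which would otherwise
    # produce runs of quotes that the parser misreads.
    def make_sub(quotes):
        def sub(m):
            inner = m.group(1)
            if "\n" in inner or inner.startswith("'") or inner.endswith("'"):
                return m.group(0)   # too risky: keep the HTML untouched
            return quotes + inner + quotes
        return sub
    text = re.sub(r"(?is)<i>(.*?)</i>", make_sub("''"), text)
    text = re.sub(r"(?is)<b>(.*?)</b>", make_sub("'''"), text)
    return text

# <b>War</b> is converted; <i>'Bots</i> is left as HTML on purpose.
print(html_to_wiki("He read <b>War</b> and watched <i>'Bots</i>."))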
Pokemon
On request I am going to change instances of Pokemon to Pokémon. Martin 23:29, 16 January 2006 (UTC)
NetBot concurrent runs
As raised on WP:VPT, I've noticed the same bot running multiple instances. In this case, because of problems with the change from {{See}} to {{Further}}. For one thing, I never saw a discussion on using Further instead of See.
For Israel, they appear to be from Special:Contributions/205.196.208.21:
- 2006-01-15 09:21:44 205.196.208.21 (Robot: migrating See calls to Further)
- 2006-01-15 09:48:03 205.196.208.21 (Robot: Migrate See to Further (match displayed text))
The latter edit broke the display at Zionism and Aliyah, through bad interactions with other common templates ({{Main}} and {{See also}} and {{Israelis}}). Note the differences using "Older edit".
However, others show up in Special:Contributions/NetBot, see the range:
- 2006-01-15 09:18:44 ... Canadian federal election, 2004 (Robot: migrating See calls to Further)
- 2006-01-15 10:07:07 ... Tank (Robot: Migrate See to Further (match displayed text))
Is somebody running the same Bot simultaneously from an IP address? Is that permitted?
- --William Allen Simpson 04:30, 18 January 2006 (UTC)
- In that case we need to figure out how to set the templates so they don't break each other. — FREAK OF NURxTURE (TALK) 06:51, Jan. 18, 2006
It is a bug with pywiki bots; I have found that roughly 1 in 1000 edits is performed logged out. Martin 13:37, 18 January 2006 (UTC)
- Here, by actual count, 15:142, which is a bit worse than that.... However, that seems like a rational explanation.
- If you use a static IP you might put a message on the User talk page for those who notice and look for info. (SEWilco 15:25, 18 January 2006 (UTC))
I have various spelling/grammar corrections I would like to make, as well as redirect and disambig repair (such as fixing all the redirects from Cambridge University to University of Cambridge), and stub sorting. Zscout370 helped set me up with this pywikipedia bot, and it appears to be working (made my first edit). — 0918BRIAN • 2006-01-18 06:37
- Please add the parameter:
-namespace:0
- As you should only make edits like this to main namespace pages. Martin 10:08, 18 January 2006 (UTC)
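To make the restriction concrete, a hypothetical example of the kind of command this implies for the Cambridge University redirect fixing mentioned above; cambridge_links.txt is an assumed list of affected article titles, and the flags should be verified against the script's -help output:
python replace.py -file:cambridge_links.txt -namespace:0 -putthrottle:30 "[[Cambridge University]]" "[[University of Cambridge|Cambridge University]]"
Keeping the displayed text in a piped link leaves each sentence reading exactly as before while pointing straight at the target article.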
bots on the commons?
Where do I go on the commons to request permission to use a bot? I have just under 500 images I wish to upload. Dread Lord CyberSkull ✎☠ 23:42, 18 January 2006 (UTC)
- Ask at the Commons... We really couldn't tell you the answer. --AllyUnion (talk) 03:36, 24 January 2006 (UTC)
Vandal Count Bot
I'd like to run VandalCountBot as an automated bot that would run once an hour. I have already created an IRC script that logs the amount of vandalism in #wikipedia-en-vandalism, and the purpose of VandalCountBot would be to grab the vandal count information from my server (based on the hour) and place it in a table, as can be seen here: [10] (NOTE: The information in that table was manually entered.)
The bot wouldn't put any stress on the wikipedia server, as it would only be sending one POST request to the server every hour. The purpose of the bot would be to show trends of vandalism based on time of the day, day of the week, and even month of the year. I believe it will be a beneficial tool to the CVU to be aware of when the high vandalism times are. --Lightdarkness 02:58, 20 January 2006 (UTC)
- This is based on the potential vandalism from the IRC channel? Would it also be possible to list the Recent Changes for the past hour, and count how many have some form of "rvv/Reverted edits by/revert vandalism" in their edit summary? This could be another useful curve. — 0918BRIAN • 2006-01-20 03:50
- My script already does what you've described at the IRC level. It looks for the text used by scripts such as Godmode-lite and Popups, and disregards edits whose summaries match. If I'm not understanding what you're suggesting/commenting on, please let me know :-) --Lightdarkness 03:53, 20 January 2006 (UTC)
- No, I understand what your current bot does. But the IRC vandalism bot gets false positives. It would be nice to have another curve to look at and compare to, specifically to get all the occurrences of "rvv/Reverted edits by/revert vandalism" in edit summaries in the last hour, or something like that. — 0918BRIAN • 2006-01-20 04:39
- I see what you're saying :-) I'll start logging the RC changes per hour, then I'll write something to sort through them tomorrow, and we'll see how they match up. EDIT: After looking through just a few minutes of the RC feed, it'd be very hard for me to sort out all the non-vandalism. The IRC CVU channel does a good job with extensive blacklists, whitelists, and numerous rules to see what is vandalism and what is not. Sure, some things might be mixed, but if I were to analyze the raw RC feed, my vandal count would be so much higher that it wouldn't even be accurate. Maybe down the road, when I can write all the rules the CVU bots have, but in the beginning stages of VandalCountBot I'd just stick with the CVU IRC feed. --Lightdarkness 04:52, 20 January 2006 (UTC)
- I have done one test edit, the difference can be seen here: http://wiki.riteme.site/w/index.php?title=User%3ALightdarkness%2FVandalism&diff=36052090&oldid=35998063 As you can see, it has no ill effects on any other part of wikipedia, and it worked exactly as intended. --Lightdarkness 05:04, 21 January 2006 (UTC)
- I am running a test on the automation of Vandal Count Bot. Up until this point, the python script that does the updating has been manually run. Therefore, tonight at 8pm Eastern (0:00 GMT), VandalCountBot should update two pages: User:Lightdarkness/Vandalism and User:Lightdarkness/Vandalism/Sandbox. I will be around to supervise it, but there should be no ill effects, as this is just a timer that runs the python file. --Lightdarkness 21:28, 23 January 2006 (UTC)
- Test run completed successfully. 0 problems (except it was 1:00 GMT, not 0:00. Damn my math!) --Lightdarkness 01:54, 24 January 2006 (UTC)
- Please add your bot to Category:Wikipedia bots and the main page. --AllyUnion (talk) 03:31, 24 January 2006 (UTC)
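For context in the archive, a rough sketch of the shape of such an hourly run; the count URL and table layout are placeholders, and the actual wiki save (the single hourly POST mentioned above) is only stubbed out rather than wired to any particular framework:
import time
import urllib2   # Python 2 style, matching the tools of the period

COUNT_URL = "http://example.org/vandalcount/latest.txt"   # placeholder: server logging #wikipedia-en-vandalism
TARGET_PAGE = "User:Lightdarkness/Vandalism"              # table page being updated

def fetch_hourly_count():
    # Assumes the server returns a bare integer: reverts logged in the last hour.
    return int(urllib2.urlopen(COUNT_URL).read().strip())

def append_row(table_text, hour, count):
    # Add a "| hour || count" row just above the closing |} of the wikitable.
    row = "|-\n| %02d:00 || %d\n" % (hour, count)
    return table_text.replace("|}", row + "|}", 1)

def run_once():
    count = fetch_hourly_count()
    hour = time.gmtime().tm_hour
    # A real run would fetch TARGET_PAGE, apply append_row, and save it back
    # with one POST; printing stands in for that step in this sketch.
    sample = '{| class="wikitable"\n! Hour !! Reverts\n|}'
    print(append_row(sample, hour, count))

if __name__ == "__main__":
    run_once()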
SundarBot
Hi, User:SundarBot is a bot based on pywikipedia that I'm using primarily for creating interwiki links for articles in the Tamil Wikipedia. I've been running this bot for some time now. You can check its contributions. -- Sundar \talk \contribs 12:44, 20 January 2006 (UTC)
- Please add your bot to Category:Wikipedia bots and the main page. You may apply for a bot flag afterwards; however, as I have never seen a request from you and do not know whether any complaint has been lodged, please give your bot a one-week continuous run, then apply for a bot flag after that. --AllyUnion (talk) 03:30, 24 January 2006 (UTC)
- Forgot to tell you that I've done that. I've let the bot run twice over the complete set of articles on the Tamil wiki. You can look at its contributions. -- Sundar \talk \contribs 03:53, 2 February 2006 (UTC)
I would like to use my interwiki.py robot on the English Wikipedia. I have a multi-login robot, editing more than one wiki at the same time. See its Japanese edits. I will do a test run of about 50 edits on en.wiki and then wait for permission. Dutchy-Dick 14:35, 20 January 2006 (UTC)
I have ended my test run with 33 edits. I hope I can have a bot flag. Dutchy-Dick 16:11, 20 January 2006 (UTC)
- I checked some of the test-edits and they are ok. Jcbos 16:45, 20 January 2006 (UTC)
- Please run your bot for a complete week. If there are no complaints after a trial run of one week, you may apply for a bot flag. Also, please do not forget to add your bot to Category:Wikipedia bots and place your bot on the main page. --AllyUnion (talk) 03:27, 24 January 2006 (UTC)
- I applied for a botflag at meta. Dutchy-Dick 19:27, 1 February 2006 (UTC)
Flag activated. Bye. meta:User:Paginazero. 21:56, 1 February 2006 (CET)
I am requesting the bot flag for tsca.bot on the English Wikipedia. The bot has been active on several projects since 2004 (at this point it has the flag on pl.wiki, csb.wiki, pl.wikt, pl.books, en.books, commons, nl.wikt, da.wiki, sv.wiki, cs.wiki, pl.source, it.wikt, de.wiki). My intention now is to add interwiki links to Polish articles to en: (I can read the languages the bot adds interwiki to). Any other activity will be undertaken only after a proper discussion and making sure the community is interested. tsca ✉ 12:45, 28 January 2006 (UTC)
- The bot and its owner are doing an extremely good job at the Polish Wikipedia. The bot is also the Polish Wikipedia user with the biggest number of edits :). Ausir 12:47, 28 January 2006 (UTC)
- Which framework does it use? pywikipedia? That information would be helpful on the bot's userpage.--Commander Keane 13:10, 28 January 2006 (UTC)
- For the interwiki links the bot obviously uses interwiki.py. For any other work it uses my own software (written in perl + sh). tsca ✉ 13:21, 28 January 2006 (UTC)
Just a question
I'm confused about the different sections on this page, especially WP:B#Bots running without a flag. I just want to do a one-shot replace.py run on a certain category that isn't that big, but has too many pages to be done manually. So do I need to request permission here for something that simple? I can even use something like a 5-minute throttle, and run at whatever off-peak time! It seems like this bot flag thing is for permanent bots. I just want to run this replace.py one time and probably never use it again. --Chris 06:57, 21 January 2006 (UTC)
- You probably don't want to bother with a flag if you are only going to be making a few changes, but you still have to request permission here. This is because no matter how straightforward you think a change is, since a bot does a lot of work it's a good idea for others (i.e. at this page) to look over the idea and see if it is OK (take a look above here and here; both are cases where a bot operator thought they were doing something simple and yet in fact bot permission was refused). So, what is the category change and where is the consensus for it? Also, you may want to consider using category.py if it's a simple category name change. If it's just a one-off thing then you could consider a Bot request instead (and someone else will do it for you).--Commander Keane 07:20, 21 January 2006 (UTC)
- No, it's not a category change. I'm introducing a new parameter into a template, but all of the articles that use it (which are all in this one category) will break if I just change it, so I want to fill all of the articles with a default value for the parameter. Eventually, many of them will probably be manually changed to something else, but I don't want them all breaking beforehand. --Chris 07:32, 21 January 2006 (UTC)
- Give the parameter a default value like this:
{{{param|default}}}
, then go assign a value to the parameter at each article afterward, and you won't be throwing ugly curly braces into articles for even a few minutes. Note that the default can be whitespace/blank, like this:
{{{param|}}}
. — FREAK OF NURxTURE (TALK) 07:44, Jan. 21, 2006
- Holy crap, I never knew that. I have no clue how I missed that for this long; that must be a fairly new feature. OK, well that's good: I don't need a bot at all. Thank you very much for the tip. --Chris 21:24, 22 January 2006 (UTC)
Syrcatbot request
As an admin who has started doing a lot of category moves / etc at CFD, I'm thinking of making a separate account for the recat edits. As a first step, I've created User:Syrcatbot. I guess my first question is: is this a reasonable idea, or does nobody care that I will have a ton of recategorizations for User:Syrthiss?
Syrcatbot will still be me sitting there with AutoWikiBrowser, and will obviously not be tagged as Admin.
I'm still getting the hang of tagging cat moves for the other bots, but if that is what I should do for all my cat moves then I'm willing to stop doing recats by hand and leave it to the bots.
Thanks! --Syrthiss 15:18, 22 January 2006 (UTC)
- The reason why you'd apply for a bot account is to make high-speed edits that would seem humanly impossible. Also, this helps you avoid being blocked accidentally by Curps' autoblocker. So it is a good idea to make a new bot account, and apply for a bot flag afterwards. --AllyUnion (talk) 03:25, 24 January 2006 (UTC)
- Ok, then this is my official request for a bot flag for User:Syrcatbot per Wikipedia:Bots. I've populated the userpage for Syrcatbot, and added it to Category:Wikipedia bots. --Syrthiss 22:24, 24 January 2006 (UTC)
NetBot
Netoholic is running User:NetBot and modifying all instances of the {{main article}} template, which was previously (TFD'd and?) redirected to {{main}}. Has he asked for permission to do this, or is this just more disruption regarding conditional templates? — Omegatron 19:42, 23 January 2006 (UTC)
Xiaogiabot: Need help with web crawler permission
I am doing a school assignment where I need to configure the Heritrix web crawler to retrieve pages from Wikipedia. I read Wikipedia:Bots, and it says that I need approval on the Wikipedia:Bots talk page. I am confused about whether I need to get approval for the web crawler engine.
This is the information of the Bot:
- The bot is automatic. I configured the URL to point to a page in Wikipedia.
- It should run from Jan 25 - Mar 31 2005.
- Heritrix from Internet Archive's Heritrix homepage. It is a Java program.
- Purpose:
- I need to crawl a topic to retrieve the pages. The purpose is to preserve the topic for future use.
- I notice that every page has a history showing its past revisions. But the purpose of what I am doing is web archiving: to archive one topic and show a prototype of how this can be done. Wikipedia must allow me to crawl because I need this to accomplish my assignment. Please.
The user page for my bot User:Xiaogiabot.
Please let me know if my web crawler can crawl a topic in Wikipedia.
Xiaogiabot 08:43, 25 January 2006 (UTC)User:Xiaogia
- Crawlers are not allowed; see Wikipedia:Database_download#Please_do_not_use_a_web_crawler. However, if you are only crawling a small number of pages it may be OK. Martin 09:39, 25 January 2006 (UTC)
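Purely as an illustration of that "small number of pages" caveat, a hypothetical sketch of fetching a short, fixed list of titles through Special:Export with a long delay, rather than letting a crawler follow links; the title list and output filenames are placeholders:
import time
import urllib   # Python 2 style

# Placeholder list: the handful of pages the assignment needs to preserve.
TITLES = ["Web archiving", "Heritrix"]

for title in TITLES:
    # Special:Export serves a page's current wikitext wrapped in XML, so there
    # is no need to follow links or scrape rendered HTML.
    safe = urllib.quote(title.replace(" ", "_"))
    data = urllib.urlopen("http://en.wikipedia.org/wiki/Special:Export/" + safe).read()
    out = open(title.replace(" ", "_") + ".xml", "w")
    out.write(data)
    out.close()
    time.sleep(30)   # far gentler than a crawler: one request every 30 seconds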