User talk:Yurik/Archive 2007
This is an archive of past discussions with User:Yurik. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
YurikBot not running on Latin Wiki (la:)
Same for us. ;-) What happened to the bot? --Roland2 07:28, 7 January 2007 (UTC)
pl.wiki - Caldera
Your bot isn't working properly ..... [1] Lothar
- Please see FAQ above. --Yurik 16:23, 24 January 2007 (UTC)
- OK. I'll try to use his "iwfixer" tool. I hope it helps. Lothar
api watchlist feed
hi Yurik. I came across the API page and started wondering about the watchlist feature. Since it requires being logged in, how could it be read by an RSS reader without having to open Wikipedia and log in? I mention this because, if you are already logged in, there is no need to have the watchlist in RSS format, as you could access it from the special page. I know there are privacy issues, but what about merging 'action=login' with 'action=feedwatchlist' into something like http://wiki.riteme.site/w/api.php?action=feedwatchlist&lgname=user&lgpassword=password? --Waldir 02:07, 26 January 2007 (UTC)
- Hi again, Yurik. I am afraid I have bad news: apparently, RSS doesn't allow login sessions. There is, however, a tag in the RSS 2.0 spec that may be of use, but honestly I don't think so. Check http://www.rssboard.org/rss-specification#lttextinputgtSubelementOfLtchannelgt -- Waldir 02:40, 7 February 2007 (UTC)
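(For reference, a script - unlike a plain feed reader - can log in first and then fetch the feed with the resulting session cookies. A minimal Python sketch, with placeholder credentials and the standard api.php endpoint assumed:)
import requests

API = "http://wiki.riteme.site/w/api.php"
session = requests.Session()                     # keeps the login cookies
session.post(API, data={"action": "login", "lgname": "user",
                        "lgpassword": "password", "format": "json"})
feed = session.get(API, params={"action": "feedwatchlist"})
print(feed.text)                                 # the RSS/Atom watchlist feed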
Revisions from the MediaWiki API
Hello Yurik, I have made a script which requests the recent revisions of a page from the API and places them in the sidebar (nl:User:JePe/recentrevisionsbox.js). Some other users are also using this script. So, when you see a lot of requests for page revisions, particularly on nl.wikipedia.org, in the logs of the API, they probably come from this script. (image from the output) JePe 15:25, 12 February 2007 (UTC)
- No problem, will keep an eye on it. One suggestion though - you might want to use json instead of xml - it will be substantially easier to work with from javascript. --Yurik 17:37, 12 February 2007 (UTC)
- I shall try to implement json in the script. Is it also easier for the server to generate json? JePe 00:40, 13 February 2007 (UTC)
- Yes - json is produced by a native library, whereas xml is more verbose, requires more custom processing, and gets generated by php code. The client also benefits because all it has to do is a
result=eval('json string');
command. --Yurik 02:07, 13 February 2007 (UTC)
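(For reference, the same round trip outside the browser: one decode call instead of hand-parsing XML. A minimal Python sketch, assuming the standard api.php endpoint on nl.wikipedia.org and an arbitrary page title:)
import json
import requests

params = {"action": "query", "prop": "revisions", "rvlimit": "5",
          "titles": "Hoofdpagina", "format": "json"}
resp = requests.get("http://nl.wikipedia.org/w/api.php", params=params)
data = json.loads(resp.text)     # single decode step, no custom XML handling
print(data["query"]["pages"])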
I hope the output of the json object from the api is good. I have seen several examples of json objects, but they all contain arrays of objects, so you can easily access an iterated object. For example, you can access a username within a revision with obj.query.pages[0].revisions[0].user. However, the json output from the api doesn't contain arrays [2]. So my question is, how do I access a field within a revision if I don't know the revision id? JePe 13:47, 14 February 2007 (UTC)
- See query API example here. --Yurik 19:19, 14 February 2007 (UTC)
- Thanks, I didn't know there was some kind of manual for it. JePe 21:43, 14 February 2007 (UTC)
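(Since the output keys pages, and possibly revisions, by their ids rather than returning arrays, a client can simply iterate over the object's values without knowing any ids in advance. A small Python sketch; the page title is only an example, and the isinstance check is a precaution because the exact response shape may vary:)
import requests

data = requests.get("http://nl.wikipedia.org/w/api.php",
                    params={"action": "query", "prop": "revisions",
                            "rvlimit": "3", "titles": "Hoofdpagina",
                            "format": "json"}).json()

# Iterate over values(): no page id or revision id needs to be known.
for page in data["query"]["pages"].values():
    revs = page.get("revisions", [])
    rev_items = revs.values() if isinstance(revs, dict) else revs
    for rev in rev_items:
        print(rev.get("user"), rev.get("timestamp"))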
API contribution
Hi! I was looking at the wiki API as I am currently writing a C# interface for it. I was wondering if there is any way I could contribute and help with the API in PHP (e.g. I would like to see categorymembers implemented, so I could do it myself). --mfx Q&A 16:03, 20 February 2007 (UTC)
- I would love some help (swamped at work otherwise). You will need to get an environment (Apache+MySQL+PHP), set up MediaWiki on your own machine (get the latest directly from the anonymous SVN), and import the Simple DB (it's much smaller than EN, yet you will have no issues reading non-Latin characters in a MySQL DB browser). Once this is done, start hacking! Let me know once you have something, and I will assist with getting it into the source tree. Thanks! --Yurik 23:03, 22 February 2007 (UTC)
API
I am writing in Java and actively using the PHP API. Please help me with two questions:
- How can I fetch the rest of a very long list (for example, the articles in a category)? The problem is that if I load the next chunk of the list "from" the last article, the sort order comes out wrong and it often resumes from the wrong place (this is not noticeable in English, but unfortunately it is in German) - that is, I want to continue not from the letter Ü but from position 501.
- How can I quickly check whether an article or image exists? There must be a mechanism for this, because naively reading each page and comparing it against an empty result (which is what the server does for every "red" link) would wear the database out.
Thanks in advance TheNeon 12:26, 1 March 2007 (UTC)
- 1) Use the ??continue parameter returned by the previous query.
- 2) http://wiki.riteme.site/w/api.php?action=query&titles=MissingPageTitle -- the "missing" attribute indicates that the page does not exist. The request can contain many titles separated by "|". --Yurik 08:10, 5 March 2007 (UTC)
Api.php
Heya Yurik. Wondering if you might be willing to comment on my suggestions on meta. Nobody's responding to me there :'(. Btw, is it maybe time to start archiving your talk page? :) AmiDaniel (talk) 01:26, 3 March 2007 (UTC)
- Hi. Yes, good suggestion :). If only I could find a good dev to assist me with development though :) --Yurik 08:11, 5 March 2007 (UTC)
- I know what you're saying! Good devs are hard to find--I now have a few helping me out with VP who are wonderful and who have made a tremendous difference in the project (which I had all but abandoned!). If you need assistance with api.php, I would be more than willing to lend a hand, although I am quite horrible at PHP and haven't really done any in quite some time. I'd definitely be willing and capable of doing maintenance tasks, etc., for you--give me a ring, you know where I'm at :). AmiDaniel (talk) 09:35, 5 March 2007 (UTC)
rvcontent in query.php
Hi - the use of "rvcontent" in the query.php URLs seems to crash it, though the query works (under a different name) on api.php. As background, I'm trying to bring AVB up to date, but it needs this function to be working in query.php - do you have any ideas for when it might be fixed/resolved? Martinp23 19:10, 7 March 2007 (UTC)
- Correction: the system throws a database error when rvuniqueusr and rvcontent are passed, as here. Thanks, Martinp23 19:18, 7 March 2007 (UTC)
Login
Hello, Yuri! I have a question for you. I am sending the server a request with login information. Why does the request always complete successfully, but the response sometimes contains no cookies?
I have a question for you. I send a request to the server with login information and receive the message "Login successful". Why do I sometimes not receive cookies (even though the login is successful)? - VasilievVV 09:15, 8 March 2007 (UTC)
- Hmm, you should always get either an error or a cookie. I can't think of a reason -- will have a look at the code [3]. --Yurik 00:00, 9 March 2007 (UTC)
- Yes, the API always returns cookies, but I want to use them to access MediaWiki directly (because the API currently can't edit or perform some extension-specific queries). How can I use the cookies from the API on MediaWiki directly? - VasilievVV 06:40, 11 March 2007 (UTC)
OK. I'll check. And 2 more questions about MediaWiki:
- I want to integrate wiki1 and wiki2 and manage the rights of user1 (from wiki2) using the user1@wiki2 syntax on [[Special:Userrights]]. How can I do that?
- How can I set up interwiki links like in Wikipedia? - VasilievVV 08:43, 10 March 2007 (UTC)
- Not sure it's possible yet - you are talking about a common login across wikis - Brion was working on that a while ago, and I don't know the status. For interwiki links to work, you have to configure one of the tables (I think it's called interwiki) to add an entry for wiki2 in wiki1, and for wiki1 in wiki2. I have never done that - I suggest you look on m: --Yurik 21:17, 12 March 2007 (UTC)
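(On the cookie question: the cookies set by action=login are the same session cookies the regular interface uses, so a client that stores them can send them with plain index.php requests as well. A minimal Python sketch with placeholder credentials; the Session object carries the cookies automatically:)
import requests

session = requests.Session()     # stores and resends the login cookies
session.post("http://wiki.riteme.site/w/api.php",
             data={"action": "login", "lgname": "MyBot",
                   "lgpassword": "secret", "format": "json"})

# The same cookie jar now works against the regular interface.
resp = session.get("http://wiki.riteme.site/w/index.php",
                   params={"title": "Special:Watchlist"})
print(resp.status_code)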
Can your bot fix...
Hi! I was looking at interwikis from en:dump truck to see if I could catch any useful info. I found out that following dump truck via German leads back to en:dumper. Also, it seems some other languages are badly linked, and the es: link leads to a disambig page. Is this something you and your bot can help fix? G®iffen 09:56, 11 March 2007 (UTC)
- Once I get it up and running again, it should be able to resolve such issues. It might take some time though :( --Yurik 21:17, 12 March 2007 (UTC)
- Ok, the article mis-linking won't ruin my sleep, so don't stress :-) G®iffen 19:10, 13 March 2007 (UTC)
iishared problem with query.php
There's a problem reported at bugzilla:9225 and on the query talk page which stops image previews working correctly in popups at the moment. I think this should be an easy configuration fix but don't know enough to do it myself. Please could you look into it when you get time? Thanks, Lupin|talk|popups 09:41, 14 March 2007 (UTC)
I sent you one. Yonidebest 11:12, 18 March 2007 (UTC)
api.php and the watchlist
I notice that action=feedwatchlist and list=watchlist seem to have disappeared from api.php. Is this for server load reasons (sorry, in that case, it was probably my fault), or for some other reason? --ais523 11:33, 26 March 2007 (UTC)
- I haven't touched anything for a while, and no new code check-ins have been done lately. Check with Brion - he might have changed something manually on the servers. --Yurik 01:22, 27 March 2007 (UTC)
Ruby Framework
I'm working on a Ruby implementation of the API, and I'm hitting some roadblocks that I was hoping you could help me with. I was playing with MediaWiki 1.9.2 for a while to get the framework going, but something must have changed between there and 1.9.3/4, since my POST requests aren't returning YAML-formatted results anymore. I've posted on the API talk page, with no response. I'm pretty fluent in PHP, but haven't been able to dig through the source for the API enough yet. Can you take a look at my comments on the talk page and maybe give me a pointer or two? Thanks.
- YAML works fine on the server ([4] and [5]). If it does not work on your machine, you might not have all the files. IIRC, YAML output tries to use a built-in native yaml formatting function, and if the function does not exist, uses php code. If your installation does not produce yaml, the yaml formatting php code must be missing. Please sign your posts using --~~~~. Thanks! --Yurik 20:07, 11 April 2007 (UTC)
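(A quick client-side check of the YAML output; a Python sketch assuming PyYAML is installed, with a placeholder URL standing in for your own installation's api.php:)
import requests
import yaml

resp = requests.get("http://example.org/w/api.php",   # your wiki's api.php
                    params={"action": "query", "prop": "info",
                            "titles": "Main Page", "format": "yaml"})
data = yaml.safe_load(resp.text)   # if the yaml formatter is broken or missing,
print(data)                        # this will fail or not return a dict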
Category:Systems
Thank you for your contribution to Category:Systems in the past. There is currently a Call for Deletion for this category. If you would like to contribute to the discussion, you would be very welcome. In particular, if you would like to save this category, please add a Keep entry with your "signature" using "~~~~". Please do this soon if possible since the discussion period is very short. Thank you for your interest if you can contribute. Regards, Jonathan Bowen 18:30, 13 April 2007 (UTC)
Question about interwiki bot
How can I allow the bot to remove interwiki links only if the page doesn't exist on the other wiki, and forbid it from doing so in other cases? - VasilievVV 08:06, 14 April 2007 (UTC)
- I am not sure there is a way to do that. The bot at present is not using a very reliable way to check for page existence, so sometimes it thinks the page does not exist when it does - I wouldn't trust it until the bot starts using the new api. --Yurik 04:34, 15 April 2007 (UTC)
api.php / query.php contributions count
You disabled this a while ago for server load reasons. If I remember correctly, the information is now available cheaply in the database; if it is, could api.php or query.php be used as an easy way to access it (thus creating less server load than going via Special:Contributions and screenscraping or paging through the api.php output)? --ais523 12:05, 8 May 2007 (UTC)
- Done in query.php. --Yurik 05:23, 11 May 2007 (UTC)
- Yurik, I'm afflicted with terminal editcountitis, so I've included your counter on my user page. I've noticed that it gives a consistently higher count than Interiot's program. In my case, the number is 36 higher. That seems to be about the number of edits I've had deleted. (Nothing scandalous - just a few articles that got AfDed, plus a few hoaxes I prodded myself.) I was wondering if your edit counter picks up those deleted edits. Thanks for the quick counter! Casey Abell 18:27, 15 May 2007 (UTC)
- Apparently it's a known problem of the page update code. I can only show what is in the DB. Once that code is fixed, the counter should become correct again. --Yurik 23:20, 15 May 2007 (UTC)
- Just noticed that I forgot to thank you for this. Thanks, Yurik! --ais523 13:07, 16 May 2007 (UTC)
Search via API
Will access to search be available via API? - VasilievVV 16:30, 17 May 2007 (UTC)
- Eventually ALL features that you can use on the site should be available through the API. The biggest constraint is really time -- I am the only one working on the whole API at this point, and I am definitely looking for help :). Make sure you add this to the requests at mw:API. --Yurik 22:14, 17 May 2007 (UTC)
API returning non-RFC-compliant headers?
Hi, and thanks for all the work. I have noticed that the query API returns an extra semicolon in the content-type header. It is no big deal, and we have already patched the client that had trouble with this, but I thought I would let someone know. Here's an example:
With this URL: http://tr.wiktionary.org/w/api.php?action=query&prop=info&format=xml&titles=foo the following header is returned: Content-Type: text/xml; charset=utf-8;
My cursory reading of section 14.17 of RFC 2616 seems to confirm that the last semicolon is disallowed.
- never mind, I'll enter a bug into mediazilla. --tr:User:Bm
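(For reference, a tolerant client can simply ignore the empty parameter produced by the trailing semicolon when parsing the header; a small Python sketch:)
def parse_content_type(header):
    # Split on ";", dropping empty pieces such as the one after the final ";".
    parts = [p.strip() for p in header.split(";") if p.strip()]
    media_type = parts[0]
    params = dict(p.split("=", 1) for p in parts[1:] if "=" in p)
    return media_type, params

print(parse_content_type("text/xml; charset=utf-8;"))
# ('text/xml', {'charset': 'utf-8'})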
Sabians
en:Sabians & de:Sabier refer to an (extinct) religious community; fr:Sabian is a manufacturer of cymbals --de:Benutzer:Sirdon 14:37, 25. Jun 2006 (GMT)
GPCR
You have repeatedly added the interwiki link es:Proteína G (English: G-protein) to the German article de:G-Protein-gekoppelter Rezeptor (English: G protein-coupled receptor). However, these are two different terms for two different things. Therefore, interwiki links between G-protein and G protein-coupled receptor articles should be avoided. Thank you. --Sven Jähnichen (06. Jul. 2006)
The en: and it: are incorrect. Dom.
Yiddish wikipedia request from your bot
First of all, we thank you for putting the interwikis with English first and Hebrew second. We would like German (Deutsch) to be third, since our Yiddish language originates from German; this will help a lot in naming articles and choosing wording according to the parent language, after comparing the English and Hebrew options, which are intertwined in modern Yiddish. Also, if you could arrange for other bots to take note of this as well, thanks. --71.247.133.186 5 mrt 2007 18:34 (CET)yi:באנוצער:יודל
Bot status in Gujarati Wikipedia
Bot status in Gujarati Wikipedia granted. --gu:User:Spundun
Writing data to MediaWiki directly (not through the web interface) via API/Query
hi
I'm trying to write a Firefox extension that allows posting data to a MediaWiki page (either by appending text to an existing page or by creating a new page) without the need to open and use the regular MediaWiki web interface (like in Wikipedia). I understand from reading your page that most of the API is still not implemented. Is Query the thing to use to be able to write content to the MediaWiki DB directly? I would be very grateful if you could please point me in the right direction!
Thanks for your time!
--Gargamel573 21:30, 31 May 2007 (UTC)
- Hi, currently there is only one way to do edits -- the main site itself. It is planned for the API, but due to complexity it has not been implemented yet. Query.php is the older read-only interface, which is being ported to the new api.php. --Yurik 16:29, 3 June 2007 (UTC)
- Do you know of a way I can use the main site to write data to MediaWiki but automate the process?
- Does writing a Firefox extension or a script (via JavaScript) to automate edits through the main site's web interface sound feasible in your opinion?
- Any ideas/input greatly appreciated!
- Thanks again for the help.
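(Until the API supports writes, automation has to go through the regular edit form. A rough Python sketch of that screen-scraping approach, offered only as an illustration: example.org and the page title are placeholders, and the regex that pulls out the hidden wp* form fields is deliberately simplified.)
import re
import requests

BASE = "http://example.org/w/index.php"        # your MediaWiki installation
session = requests.Session()                   # keep the login/session cookies

# 1) Load the edit form and collect its hidden fields (wpEditToken, wpEdittime, ...).
html = session.get(BASE, params={"title": "Sandbox", "action": "edit"}).text
fields = dict(re.findall(r'name="(wp\w+)"\s+value="([^"]*)"', html))

# 2) Post the new text back through action=submit, echoing those fields.
session.post(BASE, params={"title": "Sandbox", "action": "submit"},
             data={"wpTextbox1": "New page text",
                   "wpSummary": "automated edit",
                   "wpEditToken": fields.get("wpEditToken", ""),
                   "wpEdittime": fields.get("wpEdittime", ""),
                   "wpStarttime": fields.get("wpStarttime", ""),
                   "wpSave": "Save page"})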
deleted rev/rc/log entries
Hello Yurik, I just saw your security updates in the API - that made me think:
A long-standing feature request in MediaZilla is reviewing deleted entries - would it be possible to include special functionality for sysops to see (and filter) these DB entries? -- srb 23:56, 2 June 2007 (UTC)
- Should be possible, once I have more bandwidth (time) :) --Yurik 12:59, 4 June 2007 (UTC)
API limits and Bot programing
When I'm here on the site ;-)
I'm currently checking the possibilities of the API for bot programming - and I see a major usability drawback:
There are at least 4 limits to think of - two limits for different queries, combined with different limits depending on user status (normal vs. bot/sysop). The user status in particular creates serious portability problems.
Would it be possible to introduce a limit=maxlimit feature (maybe limit=0, because 0 is currently not allowed) to reproducibly retrieve the maximum possible number of results?
BTW: Encouraging the use of a maxlimit feature would (at least in my opinion) add flexibility for limiting the impact of API requests depending on server load - e.g. based on maxlag, the limits could be changed dynamically without breaking existing applications.
What do you think of it? -- srb 00:12, 3 June 2007 (UTC)
- Hmm, sounds interesting, although when multiple queries are combined, you cannot ask for maxlimit allpages and maxlimit content - allpages would create a list of 5000, whereas content will not be able to fill them all due to a lower limit.
- Would it be better to be able to query the limits themselves? query=limits&lmmodules=<list of modules>. It would return the limits appropriate for the logged-in user. --Yurik 13:05, 4 June 2007 (UTC)
- You are right: determining the correct limit for multiple queries is a bit tricky, but the user has no need to know the exact limits - with a "maxlimit feature" it would be enough if the API knows the limits. E.g. when using the query
- generator=allpages&gaplimit=0&prop=revisions&rvprop=content
- just determine the limit in the API and deliver what you (=the API) think is appropriate - the user need not know how large the list will be; he has query-continue if he needs more. If there is more than one place to check the limit (e.g. for multiple queries), just hand the limit=0 over to the other routines or set an internal flag usemaxlimit=True to allow the modules an appropriate modification of the limits when checking the parameters - allpages would allow 5000, but content could then reduce the limit to 200.
- In the meantime, query=limits would help a lot to make API queries reusable by different users. Another possibility would be to reduce limits that are too high instead of raising an error (maybe combined with a warning). -- srb 09:39, 6 June 2007 (UTC)
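(To make the proposal concrete, this is roughly how a client might use a query=limits call if it existed; both query=limits and lmmodules are only the suggestion from this thread, and the response shape in the comment is invented purely for illustration.)
import requests

API = "http://wiki.riteme.site/w/api.php"

# Hypothetical: ask the API which limits apply to the current user.
resp = requests.get(API, params={"action": "query", "query": "limits",
                                 "lmmodules": "allpages",
                                 "format": "json"}).json()
# Imagined response: {"limits": {"allpages": 5000}}
ap_limit = resp.get("limits", {}).get("allpages", 500)

# The client then sizes its real request accordingly instead of hard-coding
# different values for normal users, bots and sysops.
requests.get(API, params={"action": "query", "list": "allpages",
                          "aplimit": str(ap_limit), "format": "json"})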
Helping out with the API
I'd like to help you out with implementing the API, as I'm tired of just sitting around and waiting for it to be fully implemented :) I know PHP and MySQL, and I have experience with wikis, although I don't know anything about MediaWiki's gory internals. But I guess I could learn that. I'd prefer to start by working on some bugs from Bugzilla (starting, of course, with my very own InterWiki table request), and by that time I hope to have learned enough to be able to help with implementing action=submit (something I've been wanting to have functional for quite some time). If you could send me instructions on how to access the source code, I'll get working on it.
Hope I'll be able to help you --Catrope 09:23, 11 June 2007 (UTC)
- Hi, this sounds great! :) The mw: site has all the information on how to get started for developers. You start by sending patches (you can attach them to the bugs), and after a while you get direct commit access. Let me know if you have any questions. --Yurik 15:47, 11 June 2007 (UTC)
- Alright, I'll dive into it when I have time (i.e. next weekend). --Catrope 19:14, 11 June 2007 (UTC)
internal_api_error
Hello Yuri, you annotate this error with "Something is seriously wrong", so you may want to know when it happens:
I just caught the exception on de: (around 10:15 UTC)
- code: internal_api_error
- info: Exception Caught: DB connection error: All servers busy
- query-path: action=query&generator=allpages&gapnamespace=0&gaplimit=200&gapfilterredir=redirects&prop=revisions&rvprop=ids|timestamp|flags|comment|user|content&gapfrom=Acquirieren&format=json
Ganglia status looks good. -- srb 10:32, 13 June 2007 (UTC)
- Thanks, but it seems everything is fine now - I think the DBs got too many connections at once or something, especially if they tried to get content for 200 pages. --Yurik 00:59, 14 June 2007 (UTC)
Implementing tokens
What is the status on implementing tokens through prop=info? Is someone working on it? If not, is it OK for me to start working on it? I think action={move,delete,protect,rollback} can be implemented relatively easily compared to action=submit (although I might be wrong; haven't looked at that code yet), and I need the tokens up and running before I can even start doing those. --Catrope 17:14, 14 June 2007 (UTC)
- No one is working on it at the moment, and you are very welcome to help out! --Yurik 16:34, 16 June 2007 (UTC)
Just a thought about the security issues in your API documentation: would it be possible to disable these actions/tokens based on the user agent delivered with the API query? If this functionality could be disabled for common browser user-agent strings, it would prevent your link scenario but would keep it still useful for bots. -- srb 23:10, 17 June 2007 (UTC)
- Basically, tokens were designed to fix these security issues: before being allowed to move, delete, protect, rollback or edit a page, you have to get a token. The User.php code generates a random string, which is stored in the $_SESSION array. The MD5 hash of that string is then returned as a token. When A is clicking on a link that contains B's token, it won't work, because A will have a different random string in his $_SESSION (although there is the microscopic chance of a collision). I'm not sure if all this will work when the client doesn't return the enwiki_session cookie, but I'll see. --Catrope 17:52, 19 June 2007 (UTC)
- When I looked at the implementation more closely, I realized that edit tokens are useless without a session cookie. Because I don't like the idea of bots having to use cookies (adds a lot of overhead to the code), I propose we allow edit/delete/move/rollback/protect actions without the respective tokens. To fix the ensuing security hole, we should ignore a user's cookie and force everyone to authenticate through action=login and lg{username,userid,token}. This is quite easy to implement. I have an inkling not everyone will like this, but it is way more secure. Besides, when you're a bot, those three extra parameters are a piece of cake anyway. --Catrope 18:50, 19 June 2007 (UTC)
- Some clients may not be bots -- like Navigation popups. --Yurik 21:08, 19 June 2007 (UTC)
- Even those are auto-generated, which means they can very well pass an lgtoken. Anyway, since navpopups don't make any changes, they don't need to be logged in. --Catrope 13:40, 20 June 2007 (UTC)
- NavPopups allow rollbacks (IIRC). --Yurik 14:46, 21 June 2007 (UTC)
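(Putting the pieces of this thread together, a client-side sketch of the token flow: log in so that the session cookie exists, fetch a token through prop=info with intoken=edit as discussed here, and send it back with a later write action. The write call at the end is written against the planned interface and its parameter names are illustrative; credentials and the page title are placeholders.)
import requests

API = "http://wiki.riteme.site/w/api.php"
session = requests.Session()   # the token is only valid with this session's cookie

session.post(API, data={"action": "login", "lgname": "MyBot",
                        "lgpassword": "secret", "format": "json"})

info = session.get(API, params={"action": "query", "prop": "info",
                                "intoken": "edit", "titles": "Sandbox",
                                "format": "json"}).json()
page = next(iter(info["query"]["pages"].values()))
token = page.get("edittoken")

# Someone else's token is useless here, because it was derived from a
# different $_SESSION value - the protection described above.
session.post(API, data={"action": "edit", "title": "Sandbox",
                        "text": "test edit", "token": token, "format": "json"})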
embeddedin/eifrom/eicontinue
I am just starting to work on a bot in Java using api.php, and am running into some problems understanding how embeddedin works. My original query is:
http://wiki.riteme.site/wiki/api.php?titles=Template:Numismaticnotice&action=query&list=embeddedin&eilimit=5&format=jsonfm
If there are more results than my limit, I get:
"query-continue": { "embeddedin": { "eicontinue": "10|Numismaticnotice|46346" }
at the end of my output. The docs don't list exactly what to use to continue, but under all pages, it says to use apfrom with the info given in the output. I tried using eicontinue with the text above ("10|Numis..."), but it doesn't work (my original list of pages came up). I also tried using eifrom (which I saw in one of the docs) with the last page link that was found, and it didn't work either (the api error page came up). What am I doing wrong? And thank you for the work you've done on this api! Ingrid 18:12, 21 June 2007 (UTC)
- http://wiki.riteme.site/w/api.php?action=query&list=embeddedin&eilimit=5&format=jsonfm&eicontinue=10%7CNumismaticnotice%7C46346 will work. You have to remove titles=... --Yurik 18:37, 21 June 2007 (UTC)
- I could've sworn I'd tried that, but it works now, so I guess not. Thank you, Ingrid 00:42, 22 June 2007 (UTC)
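(The working loop, for anyone else hitting the same thing: start with titles=..., and once query-continue appears, drop titles and pass eicontinue back unchanged. A Python sketch; the response shape is assumed to match the jsonfm output quoted above.)
import requests

API = "http://wiki.riteme.site/w/api.php"
params = {"action": "query", "list": "embeddedin",
          "titles": "Template:Numismaticnotice",
          "eilimit": "5", "format": "json"}
while True:
    data = requests.get(API, params=params).json()
    for page in data["query"].get("embeddedin", []):
        print(page.get("title"))
    cont = data.get("query-continue", {}).get("embeddedin", {}).get("eicontinue")
    if not cont:
        break
    params.pop("titles", None)    # per the reply above: remove titles=...
    params["eicontinue"] = cont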
Your bot on Be:
Please note that the page you created there was moved to a page with correct grammar and wording: be:Вікіпедыя:Праект:Праца для робата/Сумесь алфавітаў. Thanks. Yury Tarasievich 08:04, 8 July 2007 (UTC)
New IOC member countries
On 7 July 2007, at the 119th IOC Session, Tuvalu and Montenegro were approved by vote as new member countries; however, they do not yet have codes, and we would appreciate everyone's input on how to edit this. —202.86.135.173 21:30, 13 July 2007 (UTC)
Possible API Module
I was wondering if you'd be interested in reviewing an API query module I've cooked up, to see if it'd be worthy of a commit. It's basically a rip of ApiQueryAllpages, except it implements Special:Protectedpages instead of Special:Allpages, removing the need for bot developers to either screen-scrape Special:Protectedpages for a list of protected pages, loop through QueryLogevents, or run QueryAllpages through QueryInfo for protection information. Perhaps it'd be better implemented by extending an existing module? Let me know what you think. :) — madman bum and angel 00:53, 7 August 2007 (UTC)
User:Madman/ApiQueryProtectedpages.php
The interwiki link to zh: (whatever that is) on this page, added by YurikBot, is doubtful (and the Spanish link was wrong). Please check the content and, if appropriate, delete it. --172.176.193.102 18:50, 6 September 2007 (UTC) (i. e. de:Benutzer:Marinebanker)
- The bot has not been running in a very long time, the links might have been outdated. Please correct any errors you see. --Yurik 18:54, 6 September 2007 (UTC)
- I'll remove them because of the doubt. Most, and probably all, interwiki links on this page were incorrect. To be honest, I do not consider it adequate when bots create interwiki links between languages their owners do not understand and others have to clean up. Best regards --172.176.113.46 20:33, 7 September 2007 (UTC) (i. e. de:Benutzer:Marinebanker)
Take a look - it might come in handy: mw:Project:Translation/ru ;) One more thing: I fixed the navigation there, but we need to decide whether to do it as in the help pages (a separate language box for the template, and a separate one for each page; that is how I have done it for now), or to build a "universal" one into the template. --Kaganer 18:39, 12 September 2007 (UTC)
edittokens from api.php
Hello, why doesn't api.php display an edittoken for queries like this? http://wiki.riteme.site/w/api.php?action=query&prop=info&titles=Einstein&intoken=edit
If it was deactivated, is there any chance of seeing it working again? For performance reasons, it is far more convenient than getting the edit tokens from action=edit.
Regards Meithal 23:21, 19 October 2007 (UTC) (could you reply on my French talk page if it doesn't bother you?)
EditPage.php merge
Yurik! The changes to EditPage are beautiful! Very glad to see you're working on the problem. -- David McCabe 04:00, 1 November 2007 (UTC)
- Thanks, Vodafone people helped with that one, I'm slowly migrating to the main branch... Too slow though, no time :( --Yurik 07:06, 3 November 2007 (UTC)
Bot
I see that the bot is fixing keyboard-layout errors in the Ukrainian Wikipedia. Can it also rename articles that have a layout error in their title?--Ahonc (Talk) 19:03, 30 November 2007 (UTC)
- It can. Moreover, it already does exactly that. :) --Yurik 20:33, 30 November 2007 (UTC)
- Ahonc, please help out with uk:Вікіпедія:Завдання_для_роботів/Суміш_розкладок. Thanks. --Yurik 22:38, 30 November 2007 (UTC)