Wikipedia:Bot requests

From Free net encyclopedia


This is a page for requesting work to be done by a bot, and an appropriate place to simply put ideas for bots. If you need a piece of software written for a specific article, you may get a faster response at the computer help desk. You might also check Wikipedia:Bots to see if the bot you are looking for already exists. Please add your bot requests to the bottom of this page.

If you are a bot operator and you complete a request, note what you did, and archive it. Requests that are no longer relevant should also be archived in a timely fashion.

Archives:


Bot for replacing images

Currently, PNG flag images are being replaced by superior SVG versions. Replacing all of the instances personally will be rather tedious, and I think a bot would be much faster and more effective; alas, I don't know anything about bot programming, so I'd like to request assistance with this endeavour. Image:Austria flag large.png ナイトスタリオン 07:55, 24 November 2005 (UTC)

The Python wikipediabot has recently been developed to do this exact task. If you are still interested, I'll investigate further.--Commander Keane 16:35, 23 December 2005 (UTC)
I am also interested in something like this. While I will try out the new pywikipediabot function, I hope that something could be done to solve this issue. Zach (Smack Back) 22:57, 29 December 2005 (UTC)
Has someone run a bot to replace all of the flags? I can do it if it still needs doing. Tawker 09:19, 19 February 2006 (UTC)

Duplicate link removal

Constantly edited articles, especially current events, are severely susceptible to linking to the same subjects more than once. Subjects linked in an article should be linked only once, usually at the first instance of the subject keyword. Is there a bot that performs duplicate link removal? I imagine the bot would function by first indexing the links on a page, identifying links that occur more than once, and removing the additional links from the bottom of the article upward. Adraeus 19:16, 25 November 2005 (UTC)

Anyone? Adraeus 03:46, 4 December 2005 (UTC)
My citations bot is being rewritten, and logic to recognize several types of duplicate information is part of the tasks to be done. A side effect of the rewrite is that it will be easy to examine all the wikilinks. If nobody else tackles this, remind me in a few weeks or grab the code for parsing WikiSyntax from pywikipedia's standardize_notes.py. (SEWilco 08:16, 23 December 2005 (UTC))
WP:AWB software has logic to do this, although it deliberately does not automatically remove the links. Martin 12:45, 11 January 2006 (UTC)
I think this is a bad idea. If you use the table of contents at the top of an article to jump to a particular section, you would get many linkable topics without any links. Not very useful for navigation. First mention in a section might be a better policy. First mention of a topic only is too strict. Enforcing this policy via bot is too strict. --ChrisRuvolo (t) 14:47, 11 January 2006 (UTC)
Well you should take it up with the manual of style, but it is for the reasons you give that software does not do it automatically, and it certainly shouldn't be done with a fully automatic bot. Martin 14:56, 11 January 2006 (UTC)
I wasn't aware of this in the MOS. I just updated Hairspray_(movie) because Divine was listed in the intro as a link but not elsewhere. Further down the page, there is a list of the cast, where he isn't linked, which looks really strange (because everyone else is linked). Linking only once makes sense but not in an instance like this. --geekyßroad 01:44, 1 February 2006 (UTC)
The manual of style states "a link is repeated in the same article (although there may be case for duplicating an important link that is distant from the previous occurrence)", so I think it's quite clear that human judgement is needed before removing additional links. Plugwash 01:48, 1 February 2006 (UTC)
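
A minimal Python sketch of the indexing approach described in the request, assuming plain regex handling of [[target]] and [[target|label]] links; per the discussion above, any real run would still need human review of each removal rather than fully automatic operation.

import re

LINK = re.compile(r'\[\[([^\]|]+)(?:\|([^\]]+))?\]\]')

def unlink_duplicates(text):
    # Keep the first occurrence of each target; unlink later repeats,
    # which is equivalent to removing links from the bottom upward.
    seen = set()
    out, last = [], 0
    for m in LINK.finditer(text):
        out.append(text[last:m.start()])
        target = m.group(1).strip()
        key = target[:1].upper() + target[1:]  # first letter is case-insensitive
        if key in seen:
            out.append(m.group(2) or m.group(1))  # keep the label, drop the link
        else:
            seen.add(key)
            out.append(m.group(0))
        last = m.end()
    out.append(text[last:])
    return ''.join(out)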

Backlog bot

Looking over Category:Wikipedia backlog, I notice several perennial backlogs for which a bot may be of assistance. I would like to hear the opinion of people more versed in botting than I am. In particular, I think the Move to Wiktionary/Wikibooks/Meta backlogs may be bottable (it probably requires a human double-check to see if things are properly tagged, but after that, transwiking is still a lot of work, and a bot would automatically fill in the transwiki log).

Wikipedia:Duplicated sections is about the old bug that doubled parts of pages. This sounds like something a bot could plausibly fix.

Would it be feasible to help out Special:Uncategorizedpages and/or Wikipedia:List of lists/uncategorized by using bots to categorize them using some keywords found on the page?

I don't suppose Category:Articles that need to be wikified could be meaningfully aided by a bot?

My first thought about Wikipedia:Templates with red links would be to employ a bot to strip the redlinks from the templates, but that's probably not what people want.

And finally, Category:NowCommons sounds mechanical enough that a bot might work but I'm not sure about that.

Comments please? Radiant_>|< 23:53, 28 November 2005 (UTC)

All those things critically need some human interaction. At the moment I am working on a browser-cum-bot that will be able to function in this manner, in a similar fashion to User:Humanbot, but much more advanced and versatile ;) well, that's if I ever get it finished anyway. Martin 00:13, 29 November 2005 (UTC)


UTC/DST bot?


This is probably a stupid point (and a bot may be doing it already), but I noticed on a visit to a random US city article that an anon had corrected the infobox to account for an additional hour of difference from UTC after Daylight Saving Time ended. Could a bot, or other automation, do this sort of thing en masse on the appropriate dates instead? Xoloz 19:22, 29 November 2005 (UTC)

IMO we really shouldn't be making periodic (twice-annual) changes to articles in this way. We should be documenting what the periodic change is (in this case, mentioning both the base timezone and the DST rules)!
Would a template that does this add too much load?

Disambiguation bot for Austin

A lot of tedious disambiguation work could be done by a bot to replace
[[University of Texas]] at [[Austin]]
by
[[University of Texas]] at [[Austin, Texas|Austin]].

Consider it done. Rich Farmbrough 20:24 7 March 2006 (UTC).
There aren't any. Rich Farmbrough 23:07 7 March 2006 (UTC).
Thanks for looking. I think since I posted this request (quite a while ago but I forgot to sign it), someone else has changed all occurrences of [[University of Texas]] at [[Austin]] to [[University of Texas at Austin]]. Colonies Chris 23:23, 7 March 2006 (UTC)
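
For the record, the requested edit amounted to a single literal substitution; a minimal sketch (page fetching and saving are left to whatever bot framework is in use):

OLD = '[[University of Texas]] at [[Austin]]'
NEW = '[[University of Texas]] at [[Austin, Texas|Austin]]'

def fix_austin(text):
    # one-off literal replacement; no regex needed
    return text.replace(OLD, NEW)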

Convert infoboxes

There are many album articles that use an old infobox; see here. I'm sure a well-programmed bot could do these conversions quickly and easily. It would save a lot of work for us (members of the project). Thanks for listening; I hope someone's up to the task. Gflores Talk 06:36, 3 December 2005 (UTC)

Could you provide a diff of an infobox being properly converted? —Cryptic (talk) 06:54, 3 December 2005 (UTC)
Here you go. [1]. Gflores Talk 07:00, 3 December 2005 (UTC)
Hrm. So it's not just a conversion of an old template to a new one, but changing hardcoded tables. Glancing through a dozen or so of the articles listed on Wikipedia:Wikiproject Albums/Needs infobox conversion, I'm finding a lot of variance in the formatting used. Are there a lot more to do than what's currently listed there? If that page is exhaustive, it would be almost as easy to do them all by hand. —Cryptic (talk) 07:47, 3 December 2005 (UTC)
Not as bad as I thought; all the ones I checked seem to be direct substs of various revisions of Template:Tl. This looks feasible. I'll see what I can put together. —Cryptic (talk) 08:09, 3 December 2005 (UTC)
That would be fantastic. On the talk pages, users are saying there may be twice or maybe three times as many articles with old-style infoboxes than what is listed. Believe me, it's not fun doing it manually... ;) I did notice that there is some variance in the formatting used... some don't even have the Professional Reviews header at all. Hopefully you can use your 1337 Perl hacking skills to find a way. If not, oh well. Thanks for considering this. :) Gflores Talk 08:59, 3 December 2005 (UTC)
Mh. Do you think the same bot, or a slight modification of it, could be used to make all country articles use the standard infobox? Image:European-Austrian flag hybrid.svg ナイトスタリオン 12:09, 3 December 2005 (UTC)
Do you have a list of country articles that need fixing? —Cryptic (talk) 01:33, 4 December 2005 (UTC)
I'll start discussion at Template talk:Infobox Country and get back to you soon, okay? Thanks in advance! Image:European-Austrian flag hybrid.svg ナイトスタリオン 10:01, 4 December 2005 (UTC)
As mentioned in Tt:Infobox Country, I'm working on a template extraction bot for the country infoboxes. As a side effect of a new design, that bot will probably be easily extended to understand the structure of the templates so translations between variations of those infoboxes can be done. At present I'm coding the action logic for this specific task as I haven't identified a way to generalize the tasks for more general manipulations. (SEWilco 08:06, 23 December 2005 (UTC))

NowCommons

Hi, A bot for clearing the never-ending backlog at NowCommons is needed to do the following tasks:

  1. The {{NowCommons}} on images should be replaced with {{NowCommonsThis}}, if the image is uploaded to commons with the same name.
  2. It should check whether the file exists on commons.
  3. And whether the entry has proper copyright information on the Commons page, proper attribution is given to the uploader/creator, and the file history is copied, in the case of GFDL'ed images.

If the bot can do task 1, that alone would considerably speed up the whole process. Thanks in advance. --PamriTalk 07:00, 5 December 2005 (UTC)

Another crucial task for the bot would be to point to the images at commons in articles, in the case of images that are uploaded to commons with a different name. --PamriTalk 07:25, 5 December 2005 (UTC)
Funny that you mention it...on es: just last week we did something similar. I wrote a bot that checked all of es:'s images and checked whether commons had an image with the same title. If it did, the image description was marked with a template. This not only helped us to find images that we had on es: as well as on commons, it also helped us find images with name conflicts. Ex: "Mango.jpg" on es: and "Mango.jpg" on commons were not the same picture, and consequently "mango.jpg" was inaccessible from es:. Humans then came by, confirmed whether they were identical images or a problem name, and acted accordingly. I would be more than willing to run this bot on en:, but I don't have a flag...perhaps I could pass the code to a bot operator here? Cheers--Orgullomoore 08:44, 5 December 2005 (UTC) Oh yeah, it's in Python and based heavily on pywikipedia
Thanks for the info. We may possibly need to modify it here, since it need not run on all images, just the ones tagged at NowCommons. I think we can contact someone listed at Wikipedia:Bot who has a flag to run it. Or someone could apply for one. If you don't want to run it, I could do it too, but only after January. --PamriTalk 17:52, 6 December 2005 (UTC)
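
A minimal sketch of task 1 from the request. exists_on_commons is a hypothetical predicate (a real bot would ask its framework whether commons:Image:<name> exists); only the tag swap itself is shown:

def retag_nowcommons(image_name, text, exists_on_commons):
    # swap the tag only when Commons already has a file of the same name
    if '{{NowCommons}}' in text and exists_on_commons(image_name):
        return text.replace('{{NowCommons}}', '{{NowCommonsThis}}')
    return text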

Spell Check

Could a bot be made to check for commonly misspelled words? It sounds obvious, but a lot of articles are hard to find because of spelling errors in the titles. Veritas Liberum 22:29 6 December 2005 (GMT)

No; according to WP:BOT#Spell-checking_bots, these bots should never be used. Unless, I suppose, you did not wish to fix them but just to find them; that would be OK.--Max Talk (add)Contribs 06:16, 5 February 2006 (UTC)

close parens SPACE character

There is a common error of entering a parenthetical remark and then not putting a space between the close paren and the rest of the text. A bot could look for these fairly easily. Sample: George Bush (the current President)doesn't like broccoli. SchmuckyTheCat 22:39, 6 December 2005 (UTC)

  • (Yes, that would more properly be an appositive. It's an example, get over it.) SchmuckyTheCat 22:40, 6 December 2005 (UTC)
  • Doing this is a bit more complex than you'd first think. There are lots of exceptions, such as 501(c)3; blocks of code like printf("Hello world\n"); mathematics like <math>f(x)=y</math>; markup (particularly CSS styling); plurals like 'enter your modification(s) in the text box'; brackets used as delimiters for groups of letters in a word in linguistics; and probably more... Cmdrjameson 15:15, 18 January 2006 (UTC)
  • Close parens followed by 2+ non-space characters: there are about 6500 of these, and most (90%+) are errors. I'll see what I can do. Rich Farmbrough 00:22 8 March 2006 (UTC).
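
A minimal sketch of a detector for this pattern, with crude filters for a few of the exception classes listed above (the filters are illustrative, not exhaustive); it only finds candidates, leaving the actual fix to a human:

import re

# close paren followed by a letter, e.g. "(the current President)doesn't"
CANDIDATE = re.compile(r'\)[A-Za-z]')
# crude exception filters: 501(c)3-style citations, plurals like word(s)
EXCEPTIONS = re.compile(r'\(\w{1,2}\)\w|\w\(s\)\w')

def find_candidates(text):
    hits = []
    for m in CANDIDATE.finditer(text):
        context = text[max(0, m.start() - 10):m.end() + 10]
        if not EXCEPTIONS.search(context):
            hits.append(context)
    return hits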

Infomovie box bot

Recently, the infomovie box was changed. As a result, hundreds of pages were affected. Images in the infobox used to have brackets around them, with Image: in front of them. Now it's built like this: batmanmovieposter.jpg, without anything around it. Could a bot be made to fix all pages that have it the old way? See Enduring Love for an example of how it's messed up...

Aww...I feel really bad for Adraeus, who did it all (or a big part) by hand, poor kid.--Orgullomoore 10:19, 7 December 2005 (UTC)
Don't feel bad. I only fixed those 200-or-so articles for polito-editorial reasons — to ensure the changes I made to the template remained and were improved upon. Adraeus 08:51, 8 December 2005 (UTC)

"Attention" tags on talk page

I see on Wikipedia:Template messages/Maintenance that Template:Tl tags are supposed to go on each article's Talk page, but that "older uses have not yet been moved."

Would this be a useful bot to write? I am a capable programmer and am willing to take this on (though I have not yet had experience writing bots for Wikipedia). Tim Pierce 18:44, 10 December 2005 (UTC)

This doesn't make sense, as the template says "More information may be available on the article's talk page." Rich Farmbrough 00:26 8 March 2006 (UTC).

Small-image batch uploader

For Wikibooks:Castle of the Winds, I need to upload dozens of 32x32x4bpp icons to use in the monster and item tables. Could a bot be written to do this as a batch operation and convert them to PNG? The most convenient way, for this use, would be to extract all the icons from the Windows executable (which I could upload). Seahen 16:19, 11 December 2005 (UTC)

Would that be ok in terms of copyright? I don't know if that would really count as fair use - anyone want to clarify? BadgerBadger 21:34, 11 December 2005 (UTC)
  • the entire game, data and binaries, was released into the public domain according to the wikibook. Sparr 22:16, 18 December 2005 (UTC)
  • Check out Reshacker: http://www.angusj.com/resourcehacker/, that can extract the images. Insomniacity 11:09, 18 December 2005 (UTC)
    Reshacker won't handle CotW because it's a 16-bit program. Even if it would, that wouldn't solve the problem of uploading. Seahen 19:38, 31 January 2006 (UTC)

Bot request for fixing movie infobox

Currently template:Infobox Film has parameters like this:

{{Infobox Film | name = 
 | image = 
 | caption =
 | director = 
 | producer = 
 | writer = 
 | starring =
 | music =
 | cinematography =
 | editing =
 | distributor = 
 | release_date = 
 | runtime = 
 | language = 
 | budget = 
 | imdb_id = 
}}

However, many old movie pages used different titles for the same thing. For example, some use "movie_language" instead of the (now) correct "language". Could someone write a bot to change the incorrectly titled fields in all the movie pages to the correct titles for these fields? Here's a list of what we would need (and this would be for all pages that use the Infobox Film template only):

  • movie_language should be changed to language
  • image caption should be changed to caption
  • movie_name should be changed to movie
  • original_score should be changed to music
  • "awards" field and what ever follows "=" in the template should be removed, same for
  • "proceeded by" and "followed by" (these were in some old movie templates but are no longer used)

This bot was first mentioned by Bobet 17:22, 10 December 2005 (UTC). This would be very helpful; then we could finally get all the movie page templates to be the same.

I know this is a very tough task but it would help greatly. Going to see King Kong Steve-O 13:47, 15 December 2005 (UTC)

Whoops, for some reason you can't see the template here. I suck. You can see it at the talk page for the template.... preceding unsigned comment by Steve Eifert (talk • contribs) 13:48, 15 December 2005 (UTC)

Should instead be fixed with default template parameters (i.e., instead of using {{{language}}}, use {{{language|{{{movie_language}}}}}}). Otherwise, this is a very large amount of work (every article that transcludes Template:Tl - there are about 2250 - would need to be checked) for no end-user gain whatsoever (the only difference is what you see when you edit the article). —Cryptic (talk) 16:48, 18 December 2005 (UTC)

I am handling this request with User:NetBot. -- Netoholic @ 18:22, 18 December 2005 (UTC)
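
For reference, a minimal sketch of the renaming half of the request (Cryptic's default-parameter approach above avoids the mass edit entirely). The field names come from the list in the request; restricting the substitutions to the actual template block, rather than the whole page text, is left out for brevity:

import re

RENAMES = {
    'movie_language': 'language',
    'image caption': 'caption',
    'movie_name': 'movie',
    'original_score': 'music',
}
DROP = ['awards', 'proceeded by', 'followed by']

def fix_film_infobox(text):
    if '{{Infobox Film' not in text:
        return text
    for old, new in RENAMES.items():
        text = re.sub(r'\|(\s*)%s(\s*)=' % re.escape(old),
                      r'|\g<1>%s\g<2>=' % new, text)
    for field in DROP:
        # drop the field and its value, up to the next | or }
        text = re.sub(r'\|\s*%s\s*=[^|}]*' % re.escape(field), '', text)
    return text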

Supreme Court Cases bot

There should be a bot that generates nice information for Supreme Court cases, possibly also including links to Oyez, etc. wdaher 07:51, 17 December 2005 (UTC)

Hi Wdaher, this sounds interesting, can you post something on my talk page explaining exactly what you would like? Bear in mind I am a complete lay person with regards to legal matters, so use small words and talk slowly ;) Smitz 20:18, 21 December 2005 (UTC)

Bot for uncategorized pages

Could a bot go over the articles and tag uncategorized ones with Template:Tl? Maintenance categories (like stubs and cleanup) should be ignored, as should categories like those from Template:Tl or Category:xxxx births/deaths. This would create a real-time, editable store of uncategorized pages. Currently, Special:Uncategorized is not editable, is limited to 1000 pages, is often not refreshed, etc. It would really foster categorization efforts. Renata3 20:52, 19 December 2005 (UTC)

Reference desk bot

Could a bot be used to notify users when they get a reply to their question on the Reference desk? Bart133 21:13, 23 December 2005 (UTC)

Typo fixer

I'm part of Wikipedia:Typo and I've been fixing typos in this way: using this google search, I extract the names of the pages into a text file using a simple program I wrote. From there I load it up on AutoWikiBot, fill in the appropriate search and replace fields, and make the necessary changes. I hit save for each one, and very rarely is there a false positive. However, I know that this could easily be done with a bot, which would save the time of doing it manually. Any takers? Gflores Talk 03:04, 25 December 2005 (UTC)

I don't think auto spell-checking bots are allowed; see Wikipedia:Bots#Spell-checking_bots. Martin 11:58, 25 December 2005 (UTC)
Yes, I realize that, but I think that rule applies to searching an article for any spelling mistakes. What I'm proposing is changing one specific type of spelling error (decieved --> deceived). The issues on that page don't apply (US/British spelling), and there is virtually no chance of error for most words. Again, only the pages listed under the google search would be corrected. I don't see the difference between a bot doing it and someone doing it using AWB. I corrected about 1000 or so and did not find one correction that was a mistake. Gflores Talk 01:48, 26 December 2005 (UTC)
The hard part is parsing google's output, which you already seem to have done. From here, it's a simple matter of fetching each page, doing a search and replace, and posting it. If you have access to a system that can run perl, I can write the bot for you; I'm not going to run it under my own bot's account, though. —Cryptic (talk) 16:13, 27 December 2005 (UTC)
Well, I'm using the AutoWikiBrowser by User:Bluemoose (who recently quit). It gets the results from google and does the find and replace. I'm using that, but it looks like I'm going to have to get a bot account, b/c I'm making too many edits in a short amount of time. It's only semi-automatic, I still have to hit 'Save' again and again. I'm also uncertain how other users feel about this method of fixing typos. Gflores Talk 22:38, 2 January 2006 (UTC)
It's conceivable that there might be some places where a word is intentionally misspelled for some purpose, e.g. Spelling Pimlottc 13:07, 23 March 2006 (UTC)
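
A minimal sketch of the whitelist approach under discussion: only specific, unambiguous misspellings are corrected, and pages that flag deliberate misspellings are skipped. The fix list and skip markers shown here are illustrative:

import re

SAFE_FIXES = {
    'decieved': 'deceived',
    'recieve': 'receive',
    'occured': 'occurred',
}
SKIP_MARKERS = ('[sic]', '{{sic}}')

def fix_typos(text):
    if any(marker in text for marker in SKIP_MARKERS):
        return text  # leave pages with deliberate misspellings alone
    for wrong, right in SAFE_FIXES.items():
        text = re.sub(r'\b%s\b' % wrong, right, text)
    return text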

Bot to replace "notion" with "concept"

Does anyone else feel the word "notion" is a bit POV? "Concept" is just the dry definition, but "notion" seems to imply skepticism. Thoughts? -MPD 14:42, 28 December 2005 (UTC)

I think a human needs to decide in each case, thus a bot is not appropriate.--Commander Keane 14:50, 28 December 2005 (UTC)
Sounds good. Now that we're approaching the 80%+ threshold (personal impression) of project completion, it will be interesting to see what final loose-ends projects will come into play. The last mile is always the hardest. - MPD 23:55, 28 December 2005 (UTC)

I recently decided that I wanted to consolidate all of my extracted email files and put them in one basic folder. Then I realized that not only the folder names but the entire folder hierarchy contained valuable information. Ever had your mother clean up your room - you'll live to regret it!

Change External link to External links...

From what I understand, "External links" at the bottom of an article should always be plural regardless of whether there is only one external link. There are many articles that have "External link" or "External Link" (with the L capitalized). Could a bot be made to correct them all to read "External links"? Steve-O 17:40, 28 December 2005 (UTC)

I did ~30,000 of these corrections with User:Bluebot a few weeks ago; it didn't add an "s", though, because there is no agreement on whether it should always be pluralised. My new semi-automatic editor also corrects this type of mistake (and many more!) so hopefully they will be eradicated soon. Martin 17:51, 28 December 2005 (UTC)
Thanks for your response. I guess the plural is not always the case... Steve-O 17:55, 28 December 2005 (UTC)
If you are interested the External links guideline indicates the lack of consensus regarding the use of the plural.--Commander Keane 17:56, 28 December 2005 (UTC)

subst ll template

Can someone run a one-time bot to subst all uses of the ll template, i.e. replace all {{ll|Foo}} with {{subst:ll|Foo}}?

You can use this page to feed the bot. 67.165.96.26 23:30, 28 December 2005 (UTC)
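
The requested one-time rewrite is a single regex; a minimal sketch:

import re

def subst_ll(text):
    # rewrite {{ll|...}} to {{subst:ll|...}} so the next save substitutes it
    return re.sub(r'\{\{\s*ll\s*\|', '{{subst:ll|', text)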

Featured Article Review

See WP:FAF, which documents how featured articles have changed so they can be continually evaluated with WP:FAR. FAF lists articles with a link to the version that was originally approved and a diff showing the changes since then. It is quite tedious to crawl through the history of the talk page, find the date the featured template was added, compare it to the history of the article and add the revision id to FAF. Could this be automated? All or virtually all of the talk pages had the featured template added in an edit with the summary "{{featured}}" or "{{msg:featured}}". Using the date of that revision, a bot could presumably get the id of the version that was current at that time, and add it to WP:FAF. Tuf-Kat 06:05, 30 December 2005 (UTC)
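
A minimal sketch of the lookup described above. The two history arguments are hypothetical lists of (timestamp, summary, revid) tuples, newest first; a real bot would pull them from the API or a dump:

def featured_revision(talk_history, article_history):
    # find the (oldest) talk-page edit whose summary added the template
    featured_at = None
    for ts, summary, revid in talk_history:
        if summary in ('{{featured}}', '{{msg:featured}}'):
            featured_at = ts
    if featured_at is None:
        return None
    # the article revision that was current at that moment
    for ts, summary, revid in article_history:
        if ts <= featured_at:
            return revid
    return None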

Q) bot usage

I want to know about bot commands:

1. to delete all articles or images in a specific category, daily.
2. to get a list of articles in a specific category, by date written, where the edit summary is "delete request".
3. to get a list of articles in a specific category, by user, where the edit summary is "delete request".
4. to post "{{sample}}" to the talk pages of an article's first and second authors, with summary="{{sample}}".
5. to find articles from the last 7 days with summary="aa". -- WonYong 23:11, 30 December 2005 (UTC)

User:Sashato

I'm requesting bot status (a flag) for bot User:Sashato. This bot is active on the sr: and hr: wikis. The only job will be adding interwiki links. I use pywikipediabot for this interwiki job. Thank you — SasaStefanovic • 18:07 1-01-2006

This page is to request a bot to do something for you. You need to paste this message at the bottom of Wikipedia talk:Bots. Also, when you do post there, please say what languages you speak and which language Wikipedias you will be working on.--Commander Keane 21:15, 1 January 2006 (UTC)

Redirect bot

It would be really awesome if someone could write a bot to automatically fix single redirects (that is, when A links to B, and B is a redirect to C, the bot would insert C into A, piped as [[C|B]]). --maru (talk) Contribs 02:03, 2 January 2006 (UTC)

Personally, it pisses me off when people "fix" single redirects. It puts trash in my watchlist display, and loads of piped links make wikitext harder to read (which is never a good thing). Plugwash 02:07, 2 January 2006 (UTC)
Agreed. Many such redirects should not be fixed, e.g. Category:Redirects with possibilities. A bot to do this for some of the other redirect categories (particularly Category:Redirects from misspellings) might be in order, but certainly not for all redirects. —Cryptic (talk) 02:16, 2 January 2006 (UTC)
What would be great is changing links that are already piped, and that point to redirects that are not in Category:Redirects with possibilities, to point to the real target article. I have seen and manually fixed a bunch of these, so they seem common enough to deserve a bot. I don't know of any situation in which that would be an undesirable edit (if such a thing were run as a bot, I believe its edits would not show up on recent-changes pages, which include watchlists?).
If this proposal or a variant of it is okay, I'd like to write the bot to do it (because, e.g., programming is fun). (Is looking at the pywikipedia bots' code and wikipedia.py a right way to go about figuring out how to write a bot? Is there more bot-coding-related documentation somewhere?) —Isaac Dupree(talk) 21:54, 8 January 2006 (UTC)
I still disagree with the idea of a bot changing non-piped links into piped ones, both because it makes the wikitext harder to read and because Category:Redirects with possibilities is unlikely to be comprehensive. Changing known misspellings would be fine (don't pipe in this case, just replace). Really stupid stuff like linking to one redirect to a page whilst using the title of another, or linking to a redirect and titling it with the article's real name, would be good to clean up too. Other existing piped links should probably require manual approval of the change after a human has checked for a pluralisation piping for a redirect with a possibility. Plugwash 23:28, 8 January 2006 (UTC)
We need to make things clearer; it looks like, e.g., Plugwash thought someone had disagreed with his first comment, when it seems neither Cryptic nor I did. Therefore I present a list of cases that may be considered wiki-editable by anyone, including adding to the list; the analyses presented there should reflect the discussion, though, which may continue after the list. —Isaac Dupree(talk) 11:33, 11 January 2006 (UTC)
  1. [[Redirect]] -> [[Direct|Redirect]]: A bot should never do this without more information (such as in disambiguation); there is disagreement over whether it should be done at all, aside from disambiguation.
  2. [[Redirect|text]] -> [[Direct|text]] where Redirect is in Category:Redirects with possibilities: Should not be done.
  3. [[Redirect|text]] -> [[Direct|text]] where Redirect is not in Category:Redirects with possibilities: Usually fine. A bot wouldn't notice if Redirect should be in Category:Redirects with possibilities, despite not being there. I may note that many humans wouldn't notice either, not knowing that redirects may have possibilities.
  4. [[text|texting]] -> [[text]]ing: Good.
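
A minimal sketch of case 4 (plus the trivial [[foo|foo]] simplification), the one item above with no objections; cases 1-3 are deliberately left to humans:

import re

PIPED = re.compile(r'\[\[([^\]|]+)\|([^\]]+)\]\]')

def simplify_piped_links(text):
    def repl(m):
        target, label = m.group(1), m.group(2)
        if label.startswith(target) and label != target:
            suffix = label[len(target):]
            if suffix.isalpha():  # e.g. "ing", "s"
                return '[[%s]]%s' % (target, suffix)
        if label == target:
            return '[[%s]]' % target  # [[foo|foo]] -> [[foo]]
        return m.group(0)
    return PIPED.sub(repl, text)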

Touch Template:User zodiac

Could someone touch the "What links here" for Template:User zodiac. I just need it done once. Thanks.--Commander Keane 22:52, 2 January 2006 (UTC)

In progress. —Cryptic (talk) 23:10, 2 January 2006 (UTC)
And complete. —Cryptic (talk) 00:00, 3 January 2006 (UTC)
Absolutely awesome, cheers!--Commander Keane 00:11, 3 January 2006 (UTC)

Article list for "related changes"

I wrote a Java program to generate a list of all articles which link to Template:Tl and are thus part of Wikipedia:WikiProject Numismatics. Actually, I manually use "What links here" and cut and paste into a file. The program then takes that file and formats it so I can cut and paste back to Wikipedia:WikiProject Numismatics/Articles. I try to do this about once a week. I also ran it for the folks at Wikipedia:WikiProject Hawaii, and someone asked if I could make it a bot. I am relatively new here and know nothing about bots. Is this the sort of job that could be done by a bot? I'm sure other projects would appreciate it too. Ingrid 18:24, 3 January 2006 (UTC)

How much Python do you know? There is a Python Wikipedia Robot Framework in which you could work out of, or I could assist you with. --AllyUnion (talk) 00:54, 4 January 2006 (UTC)
I've never heard of Python. In my past life (5+ years ago) I was a programmer, so I was going to say that I should be able to pick it up quickly. However, in my current life, I'm a mom who is constantly distracted and doesn't get enough sleep, so who knows. Ingrid 04:51, 4 January 2006 (UTC)

To make matters more complicated, the list that I get from "what links here" suddenly got a lot smaller. It appears to be because some of the template references include "template". So, instead of having {{Numismaticnotice}}, lots of pages have {{template:Numismaticnotice }} or sometimes {{Template:Numismaticnotic}}, and unfortunately there may be other slight variations, but I don't know. I haven't updated the list, so the one at articles (link above) is still the full list. Is there a bot that can go through those articles and fix them? Or is it a bug? The template looks right on the page. An example is at Talk:History of the English penny (1066-1154). Ingrid 05:15, 8 January 2006 (UTC)

I don't know if I should post here or start a new entry at the bottom of the page. I guess I'll start here, and see if I get any replies. I have ported my java code to python, and have downloaded the pywikipediabot. Would someone mind helping me get started? Pywikipediabot intimidates me, and I just don't know where to begin. Ingrid 02:11, 24 February 2006 (UTC)

Bot for collecting similar type of information from the net

Hi, I need a bot which I can use to gather some information of the same kind from the net. I will also edit part of it by hand. More specifically, I would like a bot so that I can organize the world's universities: basic information, their history, famous people who taught or were educated at these institutions, etc. Is there a bot I can use for this purpose? If not, can anybody create one for me? I would appreciate it if somebody could help. I am pretty new to the bot business, so the information you provide should be rather gentle. Thanks,

Resid Gulerdem 03:48, 5 January 2006 (UTC)

If you know a resource which has a lot of the information you need available to the public without copyright, and in a predictable format, it might be possible.
For example, you can find just about any movie at IMDB and it will tell you the date released, who is in it, and all sorts of other information. The problem with a resource like IMDB is that the information belongs to IMDB (see IMDB#Copyright_issues). However, since it sounds like you are after statistics and factoids, it might be legal to use a bot to collect this information, which you then interpret and enter into Wikipedia.
Unfortunately, without a resource which has already organized this information once, it would require a bot of extraordinary abilities to make any sense of the internet and find what you need. I hope I understood your question. --Phil Harnish (Info | Talk) 08:59, 6 January 2006 (UTC)

Nowiki templates in log archives

Wikipedia:Deletion log and Wikipedia:Upload_log are both repositories for the old logs from before 2005 that are not recorded in the current logs. However, since they are now in the Wikipedia namespace, all the wiki syntax in the deletion reason fields and upload summaries becomes real. As a result, dozens of templates named in those fields have become real transclusions, like {{stub}}s in deletion reasons and {{GFDL}}s in upload logs, and the pages are now inappropriately in categories. Could someone create a one-time script that will go through all those log archives and <nowiki> any templates contained in them (there should be none)? Dmcdevit·t 07:28, 7 January 2006 (UTC)

If it is true that templates should be disabled in log reports, perhaps a bug report would be appropriate? Really, I'm asking. --Phil Harnish (Info | Talk) 20:50, 7 January 2006 (UTC)
It's not the current log reports, like at special:log or special:log/protect, etc., but the old ones, which were stored in the Wikipedia namespace. Similar to how you can put {{stub}} in an edit summary and it doesn't show up, but when you write it in the edit field it does. Dmcdevit·t 03:25, 8 January 2006 (UTC)
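
A minimal sketch of the one-time script: wrap every (non-nested) transclusion found in the archive text in nowiki tags. Per the request, no legitimate templates should exist on those pages:

import re

def nowiki_templates(text):
    # neutralize {{...}} so old deletion reasons and upload summaries
    # no longer transclude anything or populate categories
    return re.sub(r'(\{\{[^{}]+\}\})', r'<nowiki>\1</nowiki>', text)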

Bot that can do null edits

I added a category to a template, Wikiproject Indian cinema, after it was created and added to more than 100 articles. Is there any way a bot can run through all articles using that template (via "What links here") and do a null edit on them? This is the only way the category will refresh. - Ganeshk 09:18, 7 January 2006 (UTC)

A bot is already working on it - you may want to see this. --Gurubrahma 09:37, 7 January 2006 (UTC)
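
A minimal sketch of the null-edit run, using method names from the pywikipedia framework of this period (they may differ in other versions): re-saving a page with unchanged text is enough to make MediaWiki refresh its category memberships.

import wikipedia, pagegenerators

site = wikipedia.getSite()
template = wikipedia.Page(site, 'Template:Wikiproject Indian cinema')
for page in pagegenerators.ReferringPageGenerator(template):
    text = page.get()
    # saving identical text is treated as a null edit
    page.put(text, comment='null edit to refresh categories')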

Bot to add a template table to the end of list of articles

A bot would be really helpful to add Template:Time Person of the Year to all of the different pages that the template links to. Thanks for your help. P-unit 00:04, 10 January 2006 (UTC)

Has the usefulness and validity of this template been discussed anywhere?--Commander Keane 00:31, 10 January 2006 (UTC)

Mass move bot

Looking for a bot that can perform, or can be modified (by me) to perform, a large series of simple moves on articles with a certain name format. Something that can take regular expressions would be ideal, performing a substitution on the name a la sed to form the new name. It should use the normal WP move operation to retain redirects on the old names. Prefer something that runs under Unix, and preferably not Python unless it can work out of the box with only configuration or arguments.

Purpose is for a move of area codes to be non-US-centric as per discussion in Talk:North American Numbering Plan. - Keith D. Tyler 22:37, 12 January 2006 (UTC)

  • Suggest the name User:WillyBot (In Memory) ;) —Ilyanep (Talk) 22:39, 12 January 2006 (UTC)
  • User:Uncle G's major work 'bot has moved the occasional mountain before. ☺ If this is a once-off thing, I might be able to help. Uncle G 16:13, 19 January 2006 (UTC)
    • I'm still awaiting a consensus on the relevant talk pages. It appears that whilst there's agreement that the current article titles are ambiguous, there's no agreement on what the new article titles should be. Uncle G 11:34, 31 January 2006 (UTC)
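
A minimal sketch of the sed-style rename loop. The pattern, target and input list are purely illustrative (per the note above, the real target titles are still under discussion), and pywikipedia's move call may have a different name or signature in other versions:

import re
import wikipedia

PATTERN = re.compile(r'^Area code (\d{3})$')   # illustrative pattern
REPLACEMENT = r'North American area code \1'   # illustrative target form

site = wikipedia.getSite()
for title in ['Area code 212']:                # illustrative input list
    if PATTERN.match(title):
        new_title = PATTERN.sub(REPLACEMENT, title)
        page = wikipedia.Page(site, title)
        # the normal move operation leaves a redirect behind at the old name
        page.move(new_title, 'rename per Talk:North American Numbering Plan')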

Osprey Publishing

It is a company with a major specialization in military history and is frequently cited as a reference on Wikipedia military history pages. I recently created a stub article on it, and would like a bot to go through the several hundred pages and wikify the current plaintext to read [[Osprey Publishing]] instead of plain Osprey Publishing. Palm_Dogg 19:47, 13 January 2006 (UTC)

That shouldn't be too hard :) Xxpor 14:36, 12 February 2006 (UTC)

I started working on it with User:Xbot

Disambiguation bot for music genres

I was wondering if it is possible to create a bot to handle music genre links that point to disambiguation pages, such as pop, rock, LP, heavy metal, folk, indie, punk, rap, emo, hip hop, album, EP, hip-hop (which should point to hip hop music), country, bluegrass, and probably a couple of others, and replace them with their proper links. The bot would only search in articles with Template:Album infobox and Template:Infobox Band, so as to avoid incorrect edits. Any takers? Comments? Gflores Talk 03:59, 14 January 2006 (UTC)

I'm on it. — FREAK OF NURxTURE (TALK) 02:21, Jan. 19, 2006
That's great! I added a few more things for dabs. Let me know if you have any questions. You can also post on the WP:ALBUMS talk page. Gflores Talk 02:59, 19 January 2006 (UTC)
Note that album (music) got moved to album at some point, so all infoboxes linking to album are now correct. Perhaps there are a few false positives mixed in with them, though. I've pretty much finished with most of them, but "country" will be hard, because it's not a disambiguation page being linked to. I need some sleep, I've been at this almost 30 hours. — FREAK OF NURxTURE (TALK) 09:00, Jan. 21, 2006
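
A minimal sketch of the restricted replacement: the target map shows two illustrative entries (the full set is in the request), and pages are skipped unless they carry one of the two infoboxes named above:

GENRE_MAP = {
    '[[pop]]': '[[pop music|pop]]',           # illustrative target
    '[[hip-hop]]': '[[hip hop music|hip-hop]]',
}
INFOBOXES = ('{{Album infobox', '{{Infobox Band')

def fix_genres(text):
    # only touch pages carrying one of the two infoboxes
    if not any(box in text for box in INFOBOXES):
        return text
    for old, new in GENRE_MAP.items():
        text = text.replace(old, new)
    return text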

Maintenance template sort key

The majority of the maintenance templates do not use a sort key for the category. Sorting on the word Template in the category page is not useful. The template pages need to change from:

[[Category:Wikipedia maintenance templates]]

to

[[Category:Wikipedia maintenance templates|{{PAGENAME}}]]

I suspect that the same issue affects other template categories. -- Doug Bell (talk/contrib) 03:46, 15 January 2006 (UTC)

Done for all the templates that weren't protected. —Guanaco 04:15, 23 January 2006 (UTC)
Thanks...much better! :-) – Doug Bell talkcontrib 04:44, 23 January 2006 (UTC)
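
For reference, the fix Guanaco describes is one literal substitution per template page; a minimal sketch:

OLD = '[[Category:Wikipedia maintenance templates]]'
NEW = '[[Category:Wikipedia maintenance templates|{{PAGENAME}}]]'

def add_sort_key(text):
    # pages that already carry a sort key don't contain OLD and are untouched
    return text.replace(OLD, NEW)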

Capitalisation

I'm requesting a bot to change links that point to wrongly capitalised page titles so that they use the correct titles:

Will this be possible? --Thorpe | talk 18:41, 15 January 2006 (UTC)

Some similar ones:
  • [[Gameboy]] → [[Game Boy line|Game Boy]]
  • [[Gameboy Advance]] → [[Game Boy Advance]]
  • [[GameBoy Advance]] → [[Game Boy Advance]]
  • [[Gameboy Color]] → [[Game Boy Color]]
Nikai 13:05, 10 February 2006 (UTC)

Malformed external links

There are a large number of external links which are not functioning because the editor who added them mistakenly used the pipe symbol (|) as it is used in internal wikilinks rather than just a space, or sometimes a pipe and then a space.

E.g. Wrong [http://www.bbc.co.uk|BBC Website]

Wrong [http://www.bbc.co.uk| BBC Website]
Right [http://www.bbc.co.uk BBC Website]

Can a bot be created to automatically put these external links in the correct format? --Spondoolicks 16:29, 16 January 2006 (UTC)
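
A minimal regex sketch covering both malformed variants shown above (pipe only, and pipe followed by a space):

import re

BAD_EXTERNAL = re.compile(r'\[(https?://[^\s\]|]+)\|\s*([^\]]+)\]')

def fix_external_links(text):
    # replace the pipe (and any trailing spaces) with a single space
    return BAD_EXTERNAL.sub(r'[\1 \2]', text)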

Dead external links

Is anyone working on bots for Wikipedia:Dead external links? Please reply on my talk page as well as below. Thanks. --James S. 05:45, 18 January 2006 (UTC)

Request regarding blocked User:Gibraltarian

I am trying to find ways to stop User:Gibraltarian. We've tried everything save one idea. He uses the IPs 212.120.224.0 to 212.120.231.255. I wrote a template at Template:Gibraltarian. It makes it clear under which circumstances the IP could reasonably be considered to be G. If someone wants to make it clearer, go ahead. I would like a bot to put this template on the IP range that he uses. It would aid users. Right now, he's being warned several times before we block him, because non-admins especially are unaware of this guy's MO and his status (the arbcom is about to ratify the indefinite block). This is what I mean. He basically roamed free for half an hour until someone realized who it was. Here is the list of IPs he has used, just to show that they are all in that range. Thanks. --Woohookitty(cat scratches) 21:15, 17 January 2006 (UTC)

  • Uncle G's 'bot is ready to prepend this to all talk pages for the IP address range 212.120.224.0/21, and I am awaiting word from the arbitration committee. Uncle G 18:10, 21 January 2006 (UTC)
  • Uncle G's 'bot has now completed this task. A handful of edits did not succeed, because of server problems. I'll go through the 'bot error log and fix these up by hand at some point. Uncle G 09:12, 23 January 2006 (UTC)

Cycle hitters

I would like a bot that could add a Category tag (Category:Hit for the cycle) to all the baseball players who have hit for the cycle (there is a list at that page). zellin t / c 17:47, 21 January 2006 (UTC)

Subst a few templates

The Template:Tl and Template:Tl templates are widely used on Wikipedia; however, it would probably be beneficial to subst: most of their uses, if only because they are solely used as a quick shortcut. This would be best done on a semi-regular basis, so that some pages where the templates are actually useful, such as WP:WSS/ST or pages listing templates, can be reverted and removed from later substing. This could cover several of the templates in Template:Cl. Circeus 20:06, 21 January 2006 (UTC)

Yes, they are widely used, AND listed at Wikipedia:Template substitution#Templates that should NOT be subst'd, as confirmed on the talk page.... This is a bad idea!
--William Allen Simpson 23:30, 1 February 2006 (UTC)

Duplicate image finder

  1. Not sure if it's a good idea for bandwidth, etc. but a bot that went around all the languages and commons and checked for duplicate images would be nice.
  2. If not actually downloading them and checking that the files are the same, it could at least add links from identically-titled images to each other (on various languages and commons). — Omegatron 00:38, 26 January 2006 (UTC)

I'll give it some thought... But how would one go about this? Downloading each file individually and checking md5sums? Honestly, this seems like something that would be best performed by someone with access to the boxes, so as to preserve bandwidth and time. Jmax- 23:19, 28 January 2006 (UTC)

You can compare the files bit by bit. Computing md5s would be counterproductive. Yes, it would be better for someone with access to the servers to do it. Or maybe have it download the database dumps and run on those? — Omegatron 00:03, 31 January 2006 (UTC)
Actually, you could run an image comparison algorithm and find copies of the same images that have been resized, too.  :-) — Omegatron 00:04, 31 January 2006 (UTC)
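
A minimal sketch of the hashing side of the discussion, run against a local directory of files (wiring it to a dump is left out). It groups by MD5 digest, which is the disputed shortcut; exact bit-by-bit comparison of each candidate group can be layered on top, and it will not catch the resized copies mentioned above:

import hashlib
import os
from collections import defaultdict

def find_duplicates(directory):
    by_hash = defaultdict(list)
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        with open(path, 'rb') as f:
            digest = hashlib.md5(f.read()).hexdigest()
        by_hash[digest].append(name)
    # report only digests shared by more than one file
    return {d: names for d, names in by_hash.items() if len(names) > 1}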

"Future" bot

It seems there are many pages that change yearly (logo of Super Bowl winner, picture of Miss America winner, names/cabinet/etc. of presidents, etc.) that could use one of two things: a) a {{future}} tag to highlight the fact that yearly changes need to be made, and/or b) a bot that automatically submits edit requests. Waarmstr 04:27, 30 January 2006 (UTC)

Great idea. I can see three cases to handle here:
Pimlottc 13:35, 23 March 2006 (UTC)

banned-user image finder

I have been asking around about whether it would be possible to write a bot that would produce a list of all images uploaded by indefinitely blocked users. Based on my own experience, it seems like at least 75% or so of such images are unencyclopedic, orphaned, copyvios, or similarly inappropriate. Thoughts? Thanks, Chick Bowen 18:19, 29 January 2006 (UTC)

Note that these users are listed at Special:Ipblocklist with an expiration of "infinite." Chick Bowen 18:22, 29 January 2006 (UTC)
Yes, it's possible. Would it be easier to do it manually, though?--Orgullomoore 03:11, 10 February 2006 (UTC)
I'm working on a script to detect these. — FREAK OF NURxTURE (TALK) 03:36, Feb. 15, 2006
Thanks, F of N! I tried doing it manually as Orgullomoore suggested; there are thousands of blocked users, and many of them have no edits at all (all edits deleted, or blocked for username violation before they could edit), but since you can't tell that until you look at the contribs, it was unbelievably inefficient. You can't even sort the list by block length, which would be helpful. Chick Bowen 05:43, 15 February 2006 (UTC)

Dead Hyperlink Locator Bot?

Is there one? Pattersonc(Talk) 8:00 PM, Sunday; January 29 2006 (EST)

There is Weblinkchecker.py--Commander Keane 00:14, 31 January 2006 (UTC)

Database dump analysis - welcome messages

A new service for newbies has been created: they can now use Template:Tl to get help (run by Boot Camp). The standard welcome messages have this new service listed (e.g. Template:Tl etc.), but many editors have personal welcome messages (e.g. User:Sam_Spade/Welcome).

I'd like an analysis of the database dump to be done which finds as many of these alternative welcome messages as possible. A script would check for "Welcome" or "welcome" in the title of pages in the User namespace (or any other keyword you can think of). I would then go through the report and contact the owners of the alternative welcome templates. Can anyone do it?--Commander Keane 00:25, 31 January 2006 (UTC)

Probably. I'm going to sleep now; I'll see what I can do in the morning. Martin 00:27, 31 January 2006 (UTC)
done. Martin 13:36, 31 January 2006 (UTC)
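
A minimal sketch of the dump scan: stream the XML dump and report User-namespace titles containing "welcome", as proposed in the request. The dump path is whatever local file you have downloaded:

import xml.etree.ElementTree as ET

def find_welcome_pages(dump_path):
    for event, elem in ET.iterparse(dump_path):
        if elem.tag.endswith('title') and elem.text:
            title = elem.text
            if title.startswith('User:') and 'welcome' in title.lower():
                print(title)
        elem.clear()  # keep memory bounded on a large dump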

Album (music ) to Album bot

There are thousands of music album articles on Wikipedia that link to Album (music). However, the Album (music) page just redirects to the actual Album page. Can a bot be written to re-point these to the Album page instead of continuously going through a redirect? I'd write one myself and get approval, but I don't know how to write one to do this. I've been changing the links when I've come across them, but there are thousands of them (and thousands that link directly to Album, so just moving and redirecting the other way won't work). Ben W Bell 10:54, 31 January 2006 (UTC)

See Wikipedia:Redirect#Don't fix redirects that aren't broken. If we tell human editors that they shouldn't be fixing these, we shouldn't have bots do it (unless they're already editing the article for a different reason anyway). —Cryptic (talk) 16:02, 31 January 2006 (UTC)
Agree... only double-redirects and links directly to disambiguation pages are a problem. -- Netoholic @ 03:07, 1 February 2006 (UTC)

Perverted-Justice conviction/bust update bot

The Perverted-Justice.com article lists an ongoing total for the number of convictions and the number of busts that are listed on the website the article is about. Since this number changes so frequently, and I think it wouldn't be difficult to retrieve automatically, I thought it might be a good idea if someone could write a bot to update it automatically. I was thinking that the bot could run once weekly, pull the recent conviction/bust count, and update the appropriate sections in the article. Possible? Reasonable? Fieari 16:21, 31 January 2006 (UTC)

Possible? yes. Reasonable? Personally, I don't think so. Jmax- 04:26, 1 February 2006 (UTC)
If the number changes frequently, rewrite the article so it doesn't need the exact number. Instead of "43 convictions", for instance, use "more than 40 convictions" (and include an as of link). --cesarb 00:34, 2 February 2006 (UTC)

Rambot demographics to past tense

The majority of the US location articles that were added by Rambot are written in the present tense. "As of 2000, the population is... the average income is... the majority of families have..." etc. These should be changed to past tense.

It's pretty straightforward to do - within the demographics section, replace "is" with "was" (10 instances); replace "are" with "were" (12 instances); replace "have" with "had" (4 instances). I've been doing this manually when I come across them (see, for example, Jasper, New York), but that'll take a while across 30,000 articles and it seems to me like a task ideally suited to a bot. Any offers? --OpenToppedBus - Talk to the driver 10:07, 3 February 2006 (UTC)

No, I'm pretty sure these should remain present tense. They are still describing the current world, just somewhat inaccurately. Superm401 - Talk 09:38, 20 February 2006 (UTC)
They are absolutely not describing the current world. They are explicitly describing what the situation was six years ago, at the time of the last census. They should be changed because a) except for the very smallest of communities, these figures are unlikely to still be accurate; b) even where the figures haven't changed, we don't know that - all that we know, verifiably, is what the situation was six years ago; c) it's simply bad grammar to say, "As of the census of 2000, there are 1,270 people... residing in the town"; d) these articles are currently inconsistent, as the intro is already in the past tense. Note that I am not suggesting that the "geography" section (also based on the census bureau figures) should be changed to past tense, as the area of the towns and the portion covered by water is unlikely to have changed. --OpenToppedBus - Talk to the driver 11:07, 20 February 2006 (UTC)
I had been changing these when I met them; recently I have changed a whole bunch, and I have now started serious automated testing. Rich Farmbrough 14:44 11 March 2006 (UTC).
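
A minimal sketch of the section-scoped swap, assuming the demographics text sits under a level-2 "Demographics" heading. The whole-word substitutions are the ones listed in the request; since \b matching is crude, diffs should still be reviewed before saving:

import re

SECTION = re.compile(r'(==\s*Demographics\s*==.*?)(?=\n==[^=]|\Z)', re.S)

def fix_tense(text):
    def repl(m):
        body = m.group(1)
        # past-tense the Demographics section only; Geography is untouched
        for old, new in (('is', 'was'), ('are', 'were'), ('have', 'had')):
            body = re.sub(r'\b%s\b' % old, new, body)
        return body
    return SECTION.sub(repl, text)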

Template:Tl and Template:Tl

I request that a bot go over the Whatlinkshere for both these templates and remove them from any and all talk pages and user pages (where they shouldn't be to begin with, and likely are as the result of a move), so that the relevant categories Category:Articles actively undergoing a major edit and Category:Articles actively undergoing construction are useful once more. The former in particular consists of 90% talk and user pages. >Radiant< 13:11, 3 February 2006 (UTC)

It would seem sensible at the same time to remove the first one from articles where there have been no edits for a set period of time, and to give a warning when the second one is on an article that hasn't been touched for some time (e.g. an article reconstruction left half finished). Plugwash 21:06, 12 February 2006 (UTC)
I have done the Template:Tl --Ligulem 17:14, 3 April 2006 (UTC)
Template:Tl is done --Ligulem 18:16, 3 April 2006 (UTC)

Vandalism Bot

Could someone program a bot that scans through articles and tags articles which have had a significant reduction in content since the last edit as possibly vandalised? The tag would also add them to a category, allowing humans to go through and see which ones have actually been vandalised, reverting as necessary. This would be run daily, and would help to keep out vandals. --GW_Simulations 14:32, 4 February 2006 (UTC)

I was considering this today; maybe this is something for CryptoDerk's_Vandal_Fighter or a new tool. My ideas, in addition to the above: a tool that can help guesstimate the likelihood that an edit is vandalism. The tool would use the above, i.e. removal of large parts of the article, plus the addition of certain key words, i.e. work the same way as a spam filter, with a list of words that are normally used in vandalism like "Rulez", "shit", "fuck", "!!!!!" and so on; I'm sure we can make a better list and enhance it as time goes by. The tool would also add to the vandalism likelihood for not-logged-in users, maybe even check for new users, have a white list and black list, and so on. It could also check for same/similar edits within a time period that have been reverted. I do not think this bot should add a category to the articles; it is probably better to just add them to a page, with the vandalism likelihood score next to each, but I'm not sure. The vandal fighters can remove entries from the page when verified. This would not be an editing bot, so I do not think it needs permission; it would only write to its 'home' page, but the discussion should probably be done here. Stefan 05:08, 10 February 2006 (UTC)
Looked a bit in pywiki but cannot find any function to get a historic page. Maybe it is as simple as using pagename&oldid=num, but I have not tried. One more idea: this bot should check for excessive use of UPPERCASE characters in the edit. Stefan 06:34, 10 February 2006 (UTC)
Pywikipedia is capable of looking at the history of an article, and from there it's not a big step to get the historic version. --Orgullomoore 18:06, 11 February 2006 (UTC)
That is my guess also, BUT Pywikipedia is not very well documented. I cannot find how to get a historic version; I will continue to look, but I am too busy with work now to have any time, so if you do know how to do it, please give me a hint. I can find how to look at the history, but that is basically scanning the history web page, which is not what I want; I just want to get the text of a historic version, and I cannot find a function that does that, unless it is the standard get-page function with an id in the name, as I suggested above. I have still not tested that.
Also, after thinking about the design a bit more, I think the only way to make this work is to have an active web site that does this. It would be updated in the background, and users could tick off articles they have confirmed as vandalism or not, but I do not have a site I can do that on, or the skill to make a web site like that. Thinking further, this really should be built into Wikipedia; then it would be much more efficient. Hum, I wonder how easy it would be to be allowed to add some code there.... I guess that is a new project: get the code, try to figure out how to add this functionality, and figure out how you get code added :-)
Could it also be programmed to look for "Text Language"? --GW_Simulations 20:31, 14 February 2006 (UTC)

I'm currently writing this bot. It uses a highly modified pymediawiki to both get the history and do reverts. It uses a pretty complex mechanism to determine if something is vandalism or not. It will hopefully be launched under the name User:tawkerbot2 joshbuddytalk 18:33, 4 March 2006 (UTC)
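
A minimal sketch of the scoring heuristic Stefan outlines above: size reduction, a bad-word list, excessive uppercase, and anonymity each contribute to a single number. The weights and word list are placeholders, not tuned values:

BAD_WORDS = ('rulez', 'shit', 'fuck', '!!!!!')

def vandalism_score(old_text, new_text, is_anon):
    score = 0.0
    if len(new_text) < 0.5 * len(old_text):
        score += 3.0  # large removal of content
    lowered = new_text.lower()
    score += sum(1.0 for w in BAD_WORDS if w in lowered)
    letters = [c for c in new_text if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / float(len(letters)) > 0.5:
        score += 2.0  # excessive UPPERCASE
    if is_anon:
        score += 1.0  # not-logged-in users score higher
    return score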

Convert star ratings to text

Was wondering if a bot could convert these star ratings (and their derivatives) in albums to plain text. Here is the discussion. Wikipedia_talk:WikiProject_Albums#Stars_to_text. Gflores Talk 06:14, 5 February 2006 (UTC)

Here are the conversions that need to happen:
Hope that helps. -- WB 06:20, 5 February 2006 (UTC)
I will set User:Tawkerbot to do this once I receive a bot flag. Tawker 02:26, 14 February 2006 (UTC)
Please see here for the reasons this didn't go ahead. Rich Farmbrough 01:58 10 March 2006 (UTC).

Referral ID spam remove bot

I've been removing referral IDs from outgoing links I've been able to find every now and then. I frankly don't like the idea that someone could make money from Wikipedia by sneaking these links into places where they even could be considered legit.

What I'd like this bot to do is to find links that contain a referral ID, strip it off and post a normal one that works just as well.

example:

http://www.amazon.com/gp/product/B000000W5L/sr=1-1/qid=1138522986/ref=pd_bbs_1/103-0299503-7272610?%5Fencoding=UTF8

becomes:

http://www.amazon.com/gp/product/B000000W5L/sr=1-1/qid=1138522986/

Obli (Talk) 23:11, 5 February 2006 (UTC)

That's a good idea. I did a quick search over the enwiki dump from 20060125 and there are about 1500 amazon links with /ref in them. Note that you can reduce the URLs even further. Your example can be reduced to:
http://www.amazon.com/gp/product/B000000W5L
Cmdrjameson 14:52, 6 February 2006 (UTC)
Someone should get on the job of finding the top sites with referral programs and making an algorithm to remove those as well; I guess Amazon would be the major culprit, though.
Obli (Talk) 16:54, 6 February 2006 (UTC)
I've updated my scripts and I'm working my way through the Amazon links I've found. Hopefully it'll only take a few days. If you do find any other prominent sites with referrals, I'd be interested to hear about them. Cheers, Cmdrjameson 23:59, 6 February 2006 (UTC)
Cool, I thought it would require a lot more bureaucracy than that to get a thing like this done. Thanks! Obli (Talk) 07:58, 7 February 2006 (UTC)
OK I've finished with all the Amazon URLs I could find, and have moved on to allmusic.com. These are rather impressive; they can have a 10 (or so) character uid= component, along with a 128-character token=. There's about 3600 of them in enwiki. Cmdrjameson 13:25, 9 February 2006 (UTC)
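
A minimal sketch of the Amazon case, reducing any /gp/product/ URL to the bare ASIN form Cmdrjameson shows above; other referral-carrying sites (such as allmusic.com) would each need their own pattern:

import re

AMAZON = re.compile(r'http://www\.amazon\.com/gp/product/([A-Z0-9]{10})\S*')

def strip_amazon_referrals(text):
    # keep only the ASIN; session, referral and token components are dropped
    return AMAZON.sub(r'http://www.amazon.com/gp/product/\1', text)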

Ladnav bot, to reverse a vandal

Since fy: has been getting a higher load of real vandalism lately (as opposed to the occasional graffiti-editor), I'm looking for a way to revert the contributions of a specified anonymous user in a somewhat automated way. As the bot bit does have a function in this process, as done by administrators, I expected to find an actual bot to help them out, but I can't find it. If I'm blind, could someone point me in the right direction? If not, would someone be willing and able to write such a bot? The task seems pretty straightforward; it's just an awful lot of clicking when done by hand. 217.123.4.108 21:03, 7 February 2006 (UTC)

Janitor bot: classify Category:Cleanup by month by type of edit needed

The backlog on Category:Cleanup by month is getting out of control, with 1.3% of Wikipedia currently tagged for cleanup. In order to speed the cleanup process, I propose a janitor bot to move "cleanup" pages that belong in other maintenance departments elsewhere.

The bot would have the following proposed behaviors:

  • If {{cleanup}} is found on a list, replace it with {{cleanup-list}}.
  • If {{cleanup}} is found on a disambig, replace it with {{disambig-cleanup}}.
  • If less than MINWIKILINK of the text consists of hyperlinks within Wikipedia, replace {{cleanup}} with {{wikify}}. A proposed value for MINWIKILINK is 0.1%. (A sketch of this test follows below.)

The first two tasks appear to be within the capabilities of current bots, and should be easy to accomplish. A survey of current Wikipedia:Bots does not show any that can determine the percentage of wikilinks, but this is probably not a difficult task, as word counting and a repeated regexp search for something like /\[\[[^\]]*\]\]/ should be all that's required.

These are the tasks that seem obviously automatable. Much of WP:CU requires human interaction; but at least we can figure out some of the human interaction that's necessary. Alba 00:39, 8 February 2006 (UTC)
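For what it's worth, a minimal Python sketch of the wikilink-density test for the third task, using the 0.1% threshold proposed above (the function and constant names are illustrative):

```python
import re

WIKILINK = re.compile(r'\[\[[^\]]+\]\]')
MINWIKILINK = 0.001  # the proposed 0.1% threshold

def wikilink_density(text):
    """Rough fraction of the article's words that sit inside [[...]]."""
    words = len(text.split())
    linked = sum(len(link.split()) for link in WIKILINK.findall(text))
    return linked / float(words) if words else 0.0

def needs_wikify(text):
    return wikilink_density(text) < MINWIKILINK
```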

The 3rd task may be within the capacity of User:Gnome (Bot); more info will be posted on the bot page. The bot can do that now, but is not completely tested and some of the code is lacking. :-) P.S. the Gnome bot is a C++/CLI bot. If interested, contact me on my talk page :-) Eagle (talk) (desk) 22:08, 27 February 2006 (UTC)
User:Gnome (Bot) has been deemed capable of doing the task... It is currently undergoing modifications to be able to perform the task. (The bot was in existence before I mentioned it.) Now Alba and I are working on it. Eagle (talk) (desk) 21:44, 21 March 2006 (UTC)

Convert interwiki redirects to softredirect

I created {{softredirect}} some time ago, and it seems to have been well received. It is supposed to be used instead of interwiki redirects, which do not work. However, it's hard to find the interwiki redirects without a bot. I'd like to ask for a bot to convert all interwiki redirects into uses of the {{softredirect}} template. --cesarb 01:25, 11 February 2006 (UTC)

That can be done, but are you looking for interwiki redirects in all namespaces, or just the User namespace, Wikipedia, Help, or the main namespace?--Orgullomoore 15:49, 11 February 2006 (UTC)
All namespaces. I created the template to be used on all namespaces except article, but since interwiki redirects shouldn't be done on articles, the category associated with the template will allow us to find and fix them. Also, the pseudo-namespace WP: (which is also in the article namespace) has interwiki redirects which should be converted. --cesarb 18:01, 11 February 2006 (UTC)
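A rough sketch of the detection-and-conversion step, assuming the template is named {{softredirect}} and using only a handful of language prefixes for illustration:

```python
import re

# A few language prefixes for illustration; a real run would load the
# full interlanguage prefix table.
LANGS = 'de|es|fr|ja|nl|pl|ru|th'
IW_REDIRECT = re.compile(
    r'#REDIRECT\s*\[\[\s*(%s)\s*:\s*([^\]]+?)\s*\]\]' % LANGS,
    re.IGNORECASE)

def to_soft_redirect(text):
    """Rewrite an interwiki redirect as a soft-redirect template call."""
    m = IW_REDIRECT.match(text)
    if m is None:
        return text  # not an interwiki redirect; leave the page alone
    return '{{softredirect|%s:%s}}' % (m.group(1), m.group(2))
```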

Category:Stock exchanges

Category:Stock exchanges appears to be populated by all the stock exchanges, most of which fall into the geographical categories of ...in Europe, ...in North America, ...in Asia, etc. According to WP:CAT, articles shouldn't be in a broad category that is covered by a sharper category. What bots, if any, do mass recategorization like this? --Christopherlin 18:03, 11 February 2006 (UTC)

Common typo

Here's an easy one: phd, Phd, PhD, and Ph.D should all be changed to Ph.D. Kjaergaard 05:36, 17 February 2006 (UTC)

I think that Mathbot by Oleg Alexandrov can correct spelling errors already. -- King of Hearts | (talk) 00:06, 25 February 2006 (UTC)

Robot wanted "id=toc" into "class=toccolours"

This was originally posted on WP:VPA, but User:Angela referred me here.

Does anyone have a robot that they could run which could change all occurrences of "id=toc" into "class=toccolours"? They both look the same to most folk, but id=toc hides the division from folk who have preferences set to "contents turned off". And 99% of these are not tables of contents but related-items link boxes. Editing the 1% by hand would be easier than the 99%.

For an example, see Elisa Oyj and Template:Finnishmobileoperators which has had this change done. -- SGBailey 21:17, 18 February 2006 (UTC)

Google video URL conversion

I just created Template:Google video for Google video URLs, I'd like a bot to change any instances of these links:

http://video.google.com/videoplay?docid=7521044027821122670

to

{{Google video|ID|description}}

The ID is the number appearing after ?docid=; note that some links also contain a search term: &q=search+term. Also note that some Google video links point to a search-result page; these should probably be ignored. Alternatively, I could settle for a list of articles with these links if anyone has access to a database dump. -Obli (Talk) 22:42, 19 February 2006 (UTC)

Yes...search terms ought to be excluded from all links to such places. — Ilyanep (Talk) 23:15, 19 February 2006 (UTC)
  • I've realized that this task is a little bit too complex for bots, so I searched for the URLs instead and compacted them, they weren't that many... Obli (Talk) 23:32, 20 February 2006 (UTC)
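For anyone revisiting this, a rough regex sketch of the conversion described above, handling both bare and bracketed links; search-result pages use a different URL path and simply never match:

```python
import re

# First alternative: a bracketed external link, optionally with a label;
# second: a bare URL in running text. docids can be negative, and any
# &q=search+term tail is eaten by [^\s\]]*.
GVIDEO = re.compile(
    r'\[http://video\.google\.com/videoplay\?docid=(-?\d+)[^\s\]]*'
    r'(?: +([^\]]+))?\]'
    r'|http://video\.google\.com/videoplay\?docid=(-?\d+)[^\s\]]*')

def to_template(text):
    def repl(m):
        vid = m.group(1) or m.group(3)
        label = m.group(2) or ''
        return '{{Google video|%s|%s}}' % (vid, label)
    return GVIDEO.sub(repl, text)
```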

American Cities/Towns

It might be nice if American cities/towns were displayed as nicely as British towns are. E.g. on the right side are all the vital/geo stats, and the article explains the town in question. American towns currently get a red dot on a map of the state they are in, but non-American readers probably have no idea where the dot is in relation to the USA as a whole. In general, the system for American towns/cities/etc. seems a bit America-centric. Maybe I'm off base. I wouldn't mind a discussion, as I feel I can learn quite a bit more about British towns in general than their American counterparts from the general info displayed.

Recategorising

The category Category:Eurovision Song Contest needs better categorising, but it's kind of hard to do alone manually. I'm wondering if a bot could do it better. Here's what essentially needs to be done:

Thanks if anyone can help. Esteffect 00:18, 23 February 2006 (UTC)

I've added it to my bot's queue; give it a few days and I should be able to run the job. I think I can run it fairly quickly as it's only 100 pages (12 min on the bot's clock). I might just use AWB because it's faster for the small jobs. Tawker 08:01, 2 March 2006 (UTC)

Interwiki and getlinks requests

I'm wondering if I can find any of these bots somewhere. Right now I'm using pyWikipedia with an XML dump file.

  1. Bots that can get page titles from a dump file when there is no interlanguage link in the article (see the sketch below).
  2. Bots that can find articles containing in-line interlanguage links ([[:fr:___]] [[:de:___]]). There are too many of these links in the Thai WP, such as the page th:LAOTSE (containing en, fr, ja).
    • Then it might be good to get all such links in the articles, so that someone else can create the articles (or even stubs), similar to Wanted Pages.
  3. Similar to the above request, can I get the wanted pages for specific categories (including sub-categories)?

--Manop - TH 20:56, 24 February 2006 (UTC)
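A minimal sketch of request 1, scanning an XML dump for pages with no interlanguage links; the export schema version (0.3) is an assumption here and should be adjusted to whatever the dump actually declares:

```python
import re
from xml.etree import ElementTree as ET

# {...} is the dump's export schema namespace -- adjust to match yours.
MW = '{http://www.mediawiki.org/xml/export-0.3/}'
INTERLANG = re.compile(r'\[\[\s*[a-z]{2,3}(?:-[a-z]+)?\s*:')

def pages_without_langlinks(dump_path):
    """Yield titles of main-namespace-looking pages with no [[xx:...]]."""
    for _, elem in ET.iterparse(dump_path):
        if elem.tag == MW + 'page':
            title = elem.findtext(MW + 'title') or ''
            text = elem.findtext(MW + 'revision/' + MW + 'text') or ''
            if ':' not in title and not INTERLANG.search(text):
                yield title
            elem.clear()  # keep memory flat on a multi-gigabyte dump
```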

Bot for COTW, AID, etc.

There should be a bot for automating the processes of WP:COTW, WP:AID, and other collaborations. It should count the number of votes, remove failed nominations, and do everything listed under Wikipedia:Collaboration of the week/Maintenance if the clock reaches 18:00 Sunday, so people can have more time to work on expanding the collaborations. -- King of Hearts | (talk) 00:05, 25 February 2006 (UTC)

I don't think it's a good idea for a bot to count votes based on human edits: people use different formats, accidentally (or purposely) sign twice, etc. What might work is to have the winning article name put in a protected page, and have the bot read that.--Orgullomoore 15:31, 26 February 2006 (UTC)

Welcome substitution

I just now came across someone (Karmafist) who has placed a lot of unsubstituted welcome templates. Would it be possible for a bot to go through his contribs and fix all of these? Or, better yet, could someone tell me how I could make one myself? -Mulder416 19:09, 26 February 2006 (UTC)

I just ran a massive welcome subst job, it *should* be done -- Tawker 07:52, 2 March 2006 (UTC)

Moving slogans from infobox company template

In discussion at Template talk:Infobox Company there is an emerging consensus that the slogan field should be removed from the infobox, but we want to hang on to the data in the field. There are two things which you might be able to help us with: firstly, could a bot create a list of the pages which contain infoboxes with slogans; and secondly, would it be possible for a bot to remove the data and insert it perhaps as a section just before 'see also' with a heading such as 'corporate branding'? (The first would be very useful; the second is still subject to the discussion outcome - just trying to get a feel for what can be done.) Many thanks Ian3055 22:09, 27 February 2006 (UTC)

Working on it (just the list, not actually editing). —Cryptic (talk) 14:25, 2 March 2006 (UTC)
Template talk:Infobox Company/Slogans. —Cryptic (talk) 18:02, 2 March 2006 (UTC)
Thanks! It's much easier to consider the problem now that we know how big it is... thank you Ian3055 22:38, 3 March 2006 (UTC)

Userbox and Userboxes bot

We need a bot to replace Userbox and Userboxes with Wikipedia:Userboxes. Thank you. --Fang Aili 22:06, 3 March 2006 (UTC)

Sounds like fun, I'll get right on it. — Mar. 3, '06 [22:13] <freakofnurxture|[[[Template:Fullurl:user talk:freakofnurture]] talk]>

bot for adding box to an entire category of articles

Is there a bot for adding an info box to all articles in a category? I checked the Wikipedia:bots page, but I didn't see a bot that would do what I'm thinking.

Currently the dinosaur pages on WP are in bad shape. There are several hundred categorized dinosaur stubs that could use an infobox, but manually adding them might take some time. Can't a bot do all that work instead?--Firsfron 02:47, 6 March 2006 (UTC)

Putting the infobox in would be a cinch, populating it less so. Rich Farmbrough 19:44 8 March 2006 (UTC).

Bot needed to fix incorrect links

Saint Louis, Missouri has become St. Louis, Missouri per consensus since it is the official and by far most widely used name for the city. However there are still hundreds of links to the old title on Wikipedia and it is very tedious to go through and fix them manually. This is a simple task for a bot to do -- can someone who is able take care of this? GT 04:02, 8 March 2006 (UTC)

Consider it done. Rich Farmbrough 19:51 8 March 2006 (UTC).
Thanks! That's a lot better. What about changing Category:Saint Louis, Missouri and all its derivatives to the proper spellings? Is that doable? GT 11:58, 10 March 2006 (UTC)
I'll have a look... Rich Farmbrough 15:18 11 March 2006 (UTC).
I've got nothing else for my bot to do right now, I'll run a whatlinkshere change on all of the pages. -- Tawker 06:53, 12 March 2006 (UTC)
Think it's all done now. Rich Farmbrough 12:17 15 March 2006 (UTC).

List of images used in an article

This suggestion is about a tool and not a bot, but I didn't know where else to put it. I'm suggesting a tool that searches through a page's history and lists all the images that have ever been used, even if they were removed. This way, we could retrieve images that were replaced by better ones in the article but are still good for use. The only problem is that Wikipedia doesn't categorise images; that's why I think this tool would be useful. CG 21:22, 10 March 2006 (UTC)

A blacklist of the vandal's favourite pix might be useful... Rich Farmbrough 12:19 15 March 2006 (UTC).
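A sketch of the core of such a tool: given the wikitext of every old revision of a page (fetched however your framework exposes page history), collect every image ever referenced:

```python
import re

IMAGE = re.compile(r'\[\[(?:Image|File)\s*:\s*([^\]|]+)', re.IGNORECASE)

def images_ever_used(revision_texts):
    """Collect every image referenced in any revision of a page.

    revision_texts is an iterable of the wikitext of each old revision,
    fetched however your framework exposes history access."""
    seen = set()
    for text in revision_texts:
        for name in IMAGE.findall(text):
            seen.add(name.strip())
    return sorted(seen)
```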

the Crimea -> Crimea, the Ukraine -> Ukraine

Formerly, the names "Crimea" and "Ukraine" were used with the definite article. Today only Crimea and Ukraine (without the article) are considered correct forms, but "the Crimea" and "the Ukraine" can still be found in many wiki articles. IMHO it would be a good task for a bot - to remove the "the". Don Alessandro 09:07, 12 March 2006 (UTC)

subst

Could someone subst: Template:Warning used in User talk pages. Thank you. CG 17:19, 12 March 2006 (UTC)

Consider it done. However, I've only subst'ed templates referring to misconduct (empty ones don't count either). Fetofs Hello! 23:44, 12 March 2006 (UTC)

wikibooks: THINKSTARSHIP

Is there a bot that can perform search functions to link Wikipedia articles that are relevant to a Wikibook? Also, more generally, Wikicities, other areas of Wikibooks, Wiktionary (and, I know, it gets less likely that the heuristic would be able to discern what was relevant, but I'd rather delete bad links than put in all of my own), Google? Yahoo? Prometheuspan 18:18, 16 March 2006 (UTC)

I will look that up. As a side note, I had envisioned linking to as many other wikibooks as were relevant, and to as many wikipedia articles as possible. Prometheuspan 22:26, 28 February 2006 (UTC)

Unfortunately, I haven't heard of anything like this. --Derbeth talk 23:40, 16 March 2006 (UTC)

What about a bot that links references to religious texts to the appropriate section of that wikibook? This may need to be written for each book separately, but of particular interest to me would be the Jewish/Christian Bible and the Islamic Qur'an. If this would interest anyone, please contact me on my talk page! Andrewjuren 20:50, 24 March 2006 (UTC)

Huh. You'd think they would have like an RSF or some such thing set up to link a new wiki to its parent networks like that. I have asked at the Wikipedia bot request zone. Is there somebody else or someplace else to go look? Prometheuspan 00:33, 17 March 2006 (UTC)


Is this possible?

There has been discussion at Wikipedia talk:Categorization about repopulating some categories that had previously been depopulated after being divided into subcategories. One example is Category:American actors. There is a good deal of support for doing this. Before there was Template:Tl it was necessary to break large categories into smaller subcategories, and there is a value in having these smaller categories. However, categories also serve as the master index of subjects and it is often frustrating to have to look in several subcategories to browse through the articles in a subject. A good example of this is Category:Film directors. The proposal is to keep the subcategories, but also have articles duplicated in parent (or grand-parent) categories up to the level of topic articles.

I am wondering if a bot could be created to run frequently (once a day?) which would go through a list of categories that should be duplicated in other categories and check to see if the duplications exist. If they do not, they would be added. I suspect that there will need to be a page created to discuss this duplication process (Wikipedia:Duplicated Categories?) and editing the list of duplications would probably have to be limited to admins. By having this bot, a person could add the lowest level category that applies and the category would also end up in the higher level categories. The bot would have to look at each article in the category and see if the higher level categorization exists, if it does not, the categorization would be added. For categories of people, the piping should be copied so that the article is alphabetized correctly.

Another bot might scan through the higher level lists and collate a list of articles that have not been put in any of the lower level subcategories.

I am just wondering if this is possible. There would have to be quite a bit of discussion about whether this should happen and how it will happen. First I want to know what is possible. Thanks. -- Samuel Wantman 10:21, 19 March 2006 (UTC)

Well, if you just want the bot to add a category (the higher-up one) to every page in a list, it's trivial; any pywikipedia bot (including Tawkerbot) can do it. If you want the AI, that's getting into fuzzy logic and might be a little trickier -- Tawker 18:35, 19 March 2006 (UTC)
What do you mean by the AI? Are you talking about the piping? The bot can look at the categorization for the subcategory and use the same piping when adding the categorization in the parent category. As a test, would it be possible to duplicate the categorization of the articles in Category:American film actors so that they are also in Category:American actors? In this case the alphabetizing of each article in Category:American actors would be the same when copying the piping from Category:American film actors. I've been doing this with AWB and it takes quite a bit of work, and there are hundreds of articles. Thanks. -- Samuel Wantman 09:17, 20 March 2006 (UTC)
By "AI" he means that the bot will run on its own with artificial intelligence. A pywikipediabot can't run on it's own, as the human must tell it what do, where to do it, etc. Fetofs Hello! 12:05, 20 March 2006 (UTC)

It seems to me that a better solution would be to get the MediaWiki software to display all subcategory articles of a particular category. If this feature is introduced in the future, carrying out the category population with a bot will have been a waste.--Commander Keane 12:26, 20 March 2006 (UTC)

This is a common response that I've heard for about a year and a half. I am not convinced that this will ever happen, and I'm not sure it needs to happen. Usually, the higher level categories only have subcategories. Having them populated adds the flexibility of seeing the larger sets and the smaller ones. If having a large number of articles makes it difficult to see the set of subcategories, it is possible to split the category such as Category:Operas and Category:Opera. -- Samuel Wantman 21:43, 20 March 2006 (UTC)
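For the record, a sketch of the single-article operation Samuel describes: copying a child category's sort key into a new parent-category tag. Whitespace and capitalization variants of the category names are deliberately ignored here:

```python
import re

def copy_to_parent(text, child, parent):
    """Add [[Category:parent|sortkey]] next to an existing
    [[Category:child|sortkey]] tag, reusing the child's sort key."""
    if re.search(r'\[\[Category:%s\s*[\]|]' % re.escape(parent), text):
        return text  # already in the parent category
    m = re.search(r'\[\[Category:%s(\|[^\]]*)?\]\]' % re.escape(child), text)
    if m is None:
        return text  # article is not in the child category after all
    new_tag = '[[Category:%s%s]]' % (parent, m.group(1) or '')
    return text.replace(m.group(0), m.group(0) + '\n' + new_tag)

# copy_to_parent(text, 'American film actors', 'American actors')
```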

Spoken Wikipedia Project

Hi there! Those on the Spoken Wikipedia Project would like to explore using a bot to help with our work. Here are a couple of things that have come up in discussions with other project members:

RSS Feed Updater

Right now, we have a manually-updated RSS feed that lists new articles that have been recorded. That way, project members and casual listeners can find our new content easily. It would be great if we had a way to automate this, to save SCEhardt the work. Let me know if you're interested in that project.
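If it helps, a minimal sketch of the feed-writing half, assuming the bot has already collected (title, url, date) tuples for new recordings, e.g. by watching Category:Spoken articles for additions:

```python
from xml.sax.saxutils import escape

def write_feed(items, out_path):
    """Write a minimal RSS 2.0 feed; items is an iterable of
    (title, url, rfc822_date) tuples for newly recorded articles."""
    entries = '\n'.join(
        '    <item><title>%s</title><link>%s</link>'
        '<pubDate>%s</pubDate></item>' % (escape(t), escape(u), d)
        for t, u, d in items)
    with open(out_path, 'w') as f:
        f.write('<?xml version="1.0"?>\n<rss version="2.0">\n  <channel>\n'
                '    <title>Spoken Wikipedia - new recordings</title>\n'
                '    <link>http://en.wikipedia.org/wiki/'
                'Wikipedia:Spoken_articles</link>\n'
                '    <description>Newly recorded articles</description>\n'
                '%s\n  </channel>\n</rss>\n' % entries)
```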

Tagging

Currently, we use several tags for our project:

1. We have a tag that people use when they request to have an article read aloud and recorded.

2. We are discussing two tags that project members can add to the article's talk page:

  • One that replaces the request tag when a project member has volunteered to record the article
  • a related tag to use when a project member decides to record an article, but nobody has requested it for recording.

Once an article is recorded and uploaded:

3. We have a couple of tags that go on the article's page itself:

  • one for recordings that are a single file.
  • and one for long recordings that have been split into several parts for faster downloading
  • and one for recordings of a page summarised in another article

4. In addition to that, we add a tag to the article's talk page

  • A version for recordings where the article is unchanged (stable versions, for example)
  • And we have a version that is used when the article has changed--it provides a link to the old version that the recording is based on.

5. Additionally, we have a tag that goes on the Wikipedia:Featured articles page that alerts people that the article is available in an audio format, too.

6. Finally,

  • We add a tag for the article on Category:Spoken articles
  • But of course, there's a slight variation to the tag if the article was featured at the time of the recording.

So as you can see, we use between 4 and 6 tags for each recording. They all serve a good purpose: they promote the project, help organize our work, and make sure that people can find our recordings.

However, it's a lot of work to do this. Not all of these tags can be automated, but it seems to me that at least a couple could be. For instance, #4 might be. And it would be very useful if we could automate #6.

Again, if this idea is something you'd like to pursue, I can re-explain all of this and/or provide more details. Ckamaeleon ((T)) 02:41, 20 March 2006 (UTC)

AllyUnion's Bots

Several of AllyUnion's bots appear to have gone offline several days ago. It's only when the automated tasks you are used to seeing done stop getting done that you realize how much you depend on a bot. And this is currently the case. From AllyUnion's user page, it appears that he is mostly on wikibreak. I tried emailing him, but his email does not work. So I've left a message on his talk page. But if he's on break, who knows when he will see it.

So the next question becomes, how long do we wait until the bots are declared out of service, and how then can we get some other bots to pick up the duties. Specific bots that appear to be down include:

NekoDaemon being out of service is what brought this all to my attention, as CFD is one of my normal home playgrounds. But AFD bot appears to have an even more critical role. - TexasAndroid 15:00, 21 March 2006 (UTC)

I'll try to examine the exact behavior and write up a clone. Bear with me on this. — Mar. 21, '06 [23:14] <freakofnurxture|[[[Template:Fullurl:user talk:freakofnurture]] talk]>

Anyone have a Transwiki bot?

I have been trying to clean up the cocktails articles, I have tagged about 90 articles for "move to wikibooks". Anyone have a bot that could transwiki them? They all have cocktail recipes in them, the majority are nothing but recipe. They'd need to end up in the wikibook Bartending. http://en.wikibooks.org/wiki/Category:Bartending_pages_needing_work Once transwikied, I could then clean up the wikipedia articles. --Xyzzyplugh 09:36, 26 March 2006 (UTC)

minor template fixes

A bot to make the following fixes to templates could do much, much good for Wikipedia (it'd avoid manual fixing of these things, at the very least); a regex sketch of the replacements follows below:

  • replace id="toc" with class="toccolours"
  • remove trailing/empty rows: many templates end with |-|}, which makes no sense, or include |-|-, which is equally nonsensical
  • replace <center> with align="center" + margin:0 auto; style declaration
  • Remove trailing </center> tags
  • Replace <br clear="all" /> with a clear:both; style declaration

Circeus 20:07, 31 March 2006 (UTC)
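A regex sketch of the fixes listed above. Note the caveats: the <center> rule only drops the tags (the align="center"/margin:0 auto styling still needs a human eye), and <br clear="all"> is rendered here as an explicitly cleared div, one plausible equivalent:

```python
import re

FIXES = [
    (re.compile(r'id="toc"'), 'class="toccolours"'),
    (re.compile(r'\|-\s*\|\}'), '|}'),   # trailing empty row before |}
    (re.compile(r'\|-\s*\|-'), '|-'),    # doubled row separators
    (re.compile(r'<br\s+clear="all"\s*/?>'),
     '<div style="clear:both;"></div>'),
    (re.compile(r'</?center>'), ''),     # styling must be re-added by hand
]

def clean_template(text):
    for pattern, repl in FIXES:
        text = pattern.sub(repl, text)
    return text
```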

Bot needed for search and replace mission

See Talk:Voivodes of the Polish-Lithuanian Commonwealth#Bot help for details. Thanks!--Piotr Konieczny aka Prokonsul Piotrus Talk 03:27, 1 April 2006 (UTC)

Switching order of days on monthly news archives

I've been on a little crusade for a few weeks now to get the monthly news archives to start listing the days in forward chronological order (i.e. beginning March 1 and ending March 31). Why aren't they already in that order, you might ask? It's because these monthly archives are produced by moving the page Current events at the end of the month, and on Current events the newest events are placed at the top (which is appropriate there).

I've taken care of March and February of 2006 (see before and after for March if you're unclear on what I'm talking about). However, every other one between January 2002 and January 2006 (inclusive) is in reverse chronological order and needs to be fixed.

It appears the formatting on these pages has evolved over the years, complicating matters quite a bit. But if anyone feels like they can help out with this please do (or reply here if you want me to help out somehow). GT 07:05, 1 April 2006 (UTC)

Welcome messages

Not a technical kind of person myself, but is it possible to create a bot to welcome new users? One that would put the welcome template on new users' talk pages? It would be a real benefit - I know a lot of users appreciate it. Obviously, a bot takes away the personal touch, but it would possibly save a lot of time, as newbies would have links to help pages and guidelines as soon as they registered. Robdurbar 21:06, 1 April 2006 (UTC)

I think it's been suggested and rejected a couple of times due to the fact that it would have to be huge scale and because it's extremely impersonal. Pegasus1138Talk | Contribs | Email ---- 21:22, 1 April 2006 (UTC)
It's not just the personal touch, but having an experienced Wikipedian to talk to and ask questions of is far more important than links to policies (which you get every time you log in). I think manually is the only way to go. We could add some more welcome text to MediaWiki:Welcomecreation... Dmcdevit·t 21:25, 1 April 2006 (UTC)
Ah well, I suppose... though the new 'help me' feature does kinda substitute for the experienced-Wikipedian touch. Alternatively, the bot could provide a link to a list of Wikipedians - perhaps those who are currently in the welcoming committee - who are happy to deal with newbies' requests. Robdurbar 21:31, 1 April 2006 (UTC)

US -> U.S.

Is it possible to get a bot to change all the links that point to US, meaning the country, to [[United States|U.S.]]? This would do two things: first, it would make the link point directly to the page for the United States; secondly, it would change US to U.S., falling in line with the standard set forth in the Manual of Style. And as long as the bot is in the article, could it change all other instances of US (exact match and case-sensitive, of course) to U.S.? It is usually the case that if an article references the United States in this way, the U.S. will be mentioned later in the article as well. Also, a similar function for a bot would be to change all instances of U.K. to UK, which is that country's preferred abbreviation per the MoS. Dismas|(talk) 01:47, 2 April 2006 (UTC)

That is possible, however fixes like this are normally taken to be better done in a semi-automated fashion. If you have someone willing to do the job... Fetofs Hello! 01:49, 2 April 2006 (UTC)
I will have a look at that. I might do this, using WP:AWB. I hope this is not controversial... --Ligulem 09:32, 3 April 2006 (UTC)
I've done all links to US and U.K. (in namespace main). --Ligulem 12:48, 3 April 2006 (UTC)
Hey! Cool! Thanks! Dismas|(talk) 02:24, 4 April 2006 (UTC)

Years -> Years'

We have lots of wars and we name them XX Years' War... which is close to XX Years War. I think the apostrophe is the more common way to do it... and the proper one... but both forms are used in some settings. Should this be bot-ted?

gren グレン 02:35, 2 April 2006 (UTC)

Yes, yes, yes. Please have a bot do this. It hurts me physically to witness the lack of apostrophes. —Nightstallion (?) Seen this already? 13:17, 5 April 2006 (UTC)
I'm looking into doing that with WP:AWB. I'll report here. Consider it done for now. --Ligulem 15:12, 5 April 2006 (UTC)
Done for the redirects Seven Years War, Thirty Years War and Hundred Years War. --Ligulem 12:22, 7 April 2006 (UTC)

POV detector bot

I can envision the development of a bot that searches for phrases such as "is a great", "is a fantastic", "is a terrific", "is an awful", etc., that could indicate strong POV within the article text. If the phrase appears within quoted text, i.e. as dialog, then it would be excluded.--Hooperbloob 21:04, 2 April 2006 (UTC)

Barring exceptionally brilliant AI, the bot would not be able to conclude from the context whether the statement is truly POV or not. It would still have to be reviewed by a human, and you can already achieve this functionality by doing a google search for site:en.wikipedia.org "is a great". Good idea though! GT 05:06, 5 April 2006 (UTC)
Agreed, it would be too hard to perform automatically. I guess I wasn't considering its implementation as a stand-alone bot, but perhaps as an add-on to an existing spelling-type bot that editors could use when browsing articles. I did that exact Google search and others before putting this note here... lots of POV hits.--Hooperbloob 05:51, 5 April 2006 (UTC)
That is going to be a nightmare to implement reliably; there is no way it would auto-revert like Tawkerbot2, it would have to compile lists and post them to a page somewhere. It's food for thought, I'll throw it out there and we shall see -- Tawker 06:02, 6 April 2006 (UTC)
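A crude sketch of the phrase scan, with double-quoted passages stripped first as suggested above; the phrase list is just the example set from the request:

```python
import re

PEACOCK = ('is a great', 'is a fantastic', 'is a terrific', 'is an awful')
QUOTED = re.compile(r'"[^"]*"')  # crude: drop double-quoted passages

def pov_hits(text):
    """List peacock phrases appearing outside quoted text."""
    unquoted = QUOTED.sub('', text).lower()
    return [p for p in PEACOCK if p in unquoted]
```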

Western Reserve bot

Someone has put the following paragraph in a bunch of articles on Ohio townships:

"(Note: The U.S. Census Bureau counts township populations in the Connecticut Western Reserve as distinct from any municipalities located within the township. For populations of any municipalities within the township, please read the corresponding articles for those municipalities.)"

This is inaccurate. Some municipalities in Western Reserve townships, such as Cortland, Ohio, are independent from surrounding townships. Others, such as Newton Falls, Ohio, are part of the surrounding townships. This is no different than anywhere else in Ohio.

I'd like to see a bot that would delete the paragraph from all pages on which it is found. -- Mwalcoff 02:24, 4 April 2006 (UTC)

Copyvio bot

(There was User:Cobo of course, but that doesn't seem to have ever taken off.)

I find a lot of copyvios that have been dormant for months... and it seems like there are probably tons out there, if I can just check a few random articles and find one pretty fast. It seems like a bot with an organized approach would uncover thousands, and with an easy methodology... just select a few random 5-10 word phrases from the article, no punctuation, and search on Google, Altavista, etc for the exact phrase. The bot would make a list of any positive results. Of course it would have to ignore wikipedia mirrors. The odds of it listing a copyvio of something that is actually PD/GPL are low, in my experience, people are more interested in copying and pasting press releases, corporate bios, etc. than Project Gutenberg kind of stuff. But even still... that's where the human factor comes in.

The bot would just create a simple list of possible copyvios (with URLs), so it would be 100% non-invasive... humans (me, for example) would go through the list and handle as appropriate. The list could be stored in the bots userspace or wherever... I imagine it wouldn't be hard to drum up some people to go through it.

There are over a million articles now, so it would be a lot of work and time... but afterwards it could perhaps monitor new articles (though that might be more difficult to implement). Also, since it's not live, I'm not even sure it would need to be flagged as a bot... all it would do is upload a list eventually, or in installments perhaps.

Anyway, I'm not a programmer... so I have no idea how hard this would be to implement. But given that it's not live, it could presumably be written in any language, up to Visual Basic. I've been thinking about this for a while though, and I think it would make a very positive impact on Wikipedia, and our goal of creating a truly free encyclopedia. Thoughts? --W.marsh 22:37, 5 April 2006 (UTC)

We were talking about this idea for Tawkerbot2 (as another feature) but we've run into one big snag. There are thousands of WP mirrors out there and every one of them would screw up automated detection -- Tawker 06:03, 6 April 2006 (UTC)
I've found that a simple "-wikipedia" (or equivalent depending on search engine) in the search cuts a lot of them out... to the point where you tend to just be left with copyvios, if there are any. Another option is whitelisting the domains listed at Wikipedia:Mirrors and forks. --W.marsh 06:10, 6 April 2006 (UTC)
If it's a legit mirror/fork it would have a GFDL compliance notice and say the content is from Wikipedia. That might be our saviour, though this bot would just list on a page; no way would I want it auto-blanking -- Tawker 06:22, 6 April 2006 (UTC)
I agree. We wouldn't want automated blanking for now. Fetofs Hello! 13:33, 6 April 2006 (UTC)
Yeah, the whole point is doing a comprehensive job of pointing human copyvio hunters to all the probable needles in the haystack, so to speak. A bot shouldn't actually directly do anything with the articles. --W.marsh 14:10, 6 April 2006 (UTC)
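A sketch of the phrase-sampling step W.marsh describes; the actual search query (with -wikipedia appended to drop mirrors) would go through whatever search-engine interface is available:

```python
import random
import re

def sample_phrases(text, n=3, length=8):
    """Pick n random runs of `length` consecutive words, punctuation
    stripped, for use as exact-phrase search queries."""
    words = re.sub(r'[^\w\s]', '', text).split()
    if len(words) < length:
        return []
    phrases = []
    for _ in range(n):
        start = random.randrange(len(words) - length + 1)
        phrases.append('"%s"' % ' '.join(words[start:start + length]))
    return phrases
```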

Dictionary bot

Hello,

I am trying to find out if there is a bot created already that establishes a dictionary of terms found in a wiki site. Currently, there is no listing of definitions, and the terms end up being fairly convoluted at times.

I am able to generate a list of all the terms, and also to create definitions (manually), but I need to go back through the wiki site and create links from those terms to the dictionary. Further, some of the terms are too common to automatically replace using a bot, so they need to be removed from the "link creation" process, yet still remain in the dictionary. Is there something like this I can use as a bot base? Or is it simple to write?

Please note that this is for a mediawiki site. Is there a way that mediawiki can be set up to do this? (I have only glanced at the software docs, as I am not the admin).

Admins, you may email me with any responses. Thanks in advance!

Delfeld 04:37, 6 April 2006 (UTC)

You mean a list of every word on a page? I don't know how you would distinguish terms from other words -- Tawker 06:04, 6 April 2006 (UTC)
-----
Tawker,
I don't mean every word on the page. All terms would be defined in a separate page - a dictionary page. Each term on this page would have an associated code - a simple "1" or "0", for example. This code could be listed anywhere in the definition, hidden or not. What this code would do is tell the bot, "Ok, this term is something to go through the wiki site and make into a link back here." or else would say, "Ok, don't make links of this term on the wiki site."
Does this make sense? I am not trying to glean all terms from the pages, but rather apply the dictionary terms to the pages.
Delfeld 21:52, 6 April 2006 (UTC)

Language-links Checker

I've noticed by browsing some Chinese Wikipedia articles that they will often link to an English page which has no corresponding link back to the Chinese version, and there are even one or two Chinese pages with no link to the relevant English article. I actually came across this several times in a short period of time, and would guess it to be not all that uncommon.

It would be helpful if someone could create a bot to scan pages and follow the links to different language versions, and make sure that all of the different translations are linked up. (i.e. that if a page exists in 15 different languages on any given topic, that each of those 15 versions has links to all 14 others).

Aside from just making it easier to find content in multiple languages, this may also encourage users to contribute in more than one language if they know the article exists in a second language they are familiar with.

Any thoughts?

--Hughitt1 19:35, 6 April 2006 (UTC)

There are many bots that do this task, such as User:YurikBot, though they do not run on all Wikipedias; I suspect that not many run on the Chinese wiki. Details of the bot they use can be found at m:Using the python wikipediabot. Martin 19:42, 6 April 2006 (UTC)

Renaming bot needed...

Per a (non) vote in Wikipedia_talk:WikiProject_Illinois_State_Routes, we have a list of routes in List of Illinois State Routes that all have articles of the form Illinois State Route X (active and red links). All of them need to become of the form Illinois Route X. It amounts to about 150 pages, and about 300 links to these pages, but I don't know if a bot can touch the "what links here" pages to change the names of those pages. There's also a template that I will update to reflect the new name when the pages are moved. Thanks for your help! —Rob (talk) 12:00, 8 April 2006 (UTC)

This was done by User:SPUI and User:Freakofnurture, please disregard. —Rob (talk) 03:52, 12 April 2006 (UTC)

Category Sorting in Category:WikiProjects

This category has several projects sorted under W when they should be changed to be under the name of the project, i.e. Wikipedia:WikiProject Hong Kong should be under H. Could a bot go through the projects sorted under W and re-sort them?--Max Talk (add) 20:40, 12 April 2006 (UTC)

Archiving Wikipedia:Articles for Creation

I am hoping to recruit a bot to help with the daily archiving at WP:AfC. The task used to be done by User:Uncle G's 'bot, and there were some plans for User:ShinmaBot to take its place (along with some extra functions), but neither of their operators has been around recently. What's needed is three edits a day, shortly after 0000 UTC (sketched below):

  1. move Wikipedia:Articles for creation/Today to [[Wikipedia:Articles for creation/YYYY-MM-DD]] (just the article, not the talk page); the date should be that of the day that's beginning, not the one ending.
  2. edit Wikipedia:Articles for creation/Today, remove the redirect, and replace it with a generic header like this one.
  3. edit Wikipedia:Articles for creation/List and add the day's archive to the top of the page. Monthly archiving can be left to humans.

Can anyone help? ×Meegs 18:13, 15 April 2006 (UTC)
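A sketch of the three edits listed above using the pywikibot framework; the header template name is a placeholder, and parameter names such as movetalk should be double-checked against the framework's current API:

```python
import datetime
import pywikibot

def archive_afc():
    site = pywikibot.Site('en', 'wikipedia')
    date = datetime.datetime.utcnow().strftime('%Y-%m-%d')
    today = pywikibot.Page(site, 'Wikipedia:Articles for creation/Today')

    # 1. Move the page itself (not its talk page) to the dated archive.
    today.move('Wikipedia:Articles for creation/%s' % date,
               reason='Daily AfC archiving', movetalk=False)

    # 2. Replace the redirect left behind with the generic header.
    fresh = pywikibot.Page(site, 'Wikipedia:Articles for creation/Today')
    fresh.text = '{{AfC header}}'  # placeholder for the real header text
    fresh.save('Resetting the page for the new day')

    # 3. Prepend the new archive to the list page.
    index = pywikibot.Page(site, 'Wikipedia:Articles for creation/List')
    index.text = ('* [[Wikipedia:Articles for creation/%s]]\n' % date
                  + index.text)
    index.save('Adding link to the %s archive' % date)
```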

Replace an image on several pages

User:Germen was using Image:Nl small.gif in his signature, which was deleted as a redundant image of Image:Flag of the Netherlands.svg. Could someone run a bot to replace all instances of the text [[image:nl_small.gif|25px]] (articles) with [[Image:Flag of the Netherlands.svg|25px]]? Thanks! ~MDD4696 21:38, 20 April 2006 (UTC)

Song -> Album redirects

I'm not sure if this is worth the time and effort, but what about a bot that would parse the Track Listing sections of pages at List of albums and create redirects to the album article? For example, the bot would create redirects like this one for the tracks at Operation: Mindcrime#Track listing. It would also create redirects for lowercase variations of song titles. Thanks, TheJabberwock 22:26, 20 April 2006 (UTC)
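A sketch of the parsing half, assuming tracks appear as numbered list items with quoted titles (a common convention in album articles); actually creating the redirect pages is left to whatever framework you use:

```python
import re

# Tracks commonly appear as numbered list items with quoted titles,
# e.g.  # "Revolution Calling" - 4:42
TRACK = re.compile(r'^#\s*"([^"]+)"', re.MULTILINE)

def track_redirects(album, track_listing):
    """Yield (title, wikitext) pairs for each track plus a variant with
    everything after the first letter lowercased; the set deduplicates
    titles that are already lowercase."""
    for name in TRACK.findall(track_listing):
        for variant in {name, name[:1] + name[1:].lower()}:
            yield variant, '#REDIRECT [[%s]]' % album
```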

Daybar inclusion

Recently a daybar template was created to allow easy navigation through articles for individual days given in the format below: June 10, 2004, June 11, 2004 etc. (see the template page for specifics). On those pages it can be seen in use. This template could be used as the standard format for navigating through such articles; however, it would be tedious to add it by hand. Hence I suggest a semi-automatic bot to add this template appropriately to the dates from January 1, 2003 till now. LukeSurl 16:29, 21 April 2006 (UTC)