
Wikipedia:Bots/Requests for approval

From Wikipedia, the free encyclopedia

New to bots on Wikipedia? Read these primers!

To run a bot on the English Wikipedia, you must first get it approved. Follow the instructions below to add a request. If you are not familiar with programming, consider asking someone else to run a bot for you.

 Instructions for bot operators

Current requests for approval

Operator: Bunnypranav (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 04:34, Sunday, November 16, 2025 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): pywikibot

Source code available: Borrowing most of it from another bot

Function overview: Add "Women=yes" to {{WikiProject Latin music}} for related pages

Links to relevant discussions (where appropriate): Wikipedia talk:WikiProject Latin music#Women in Latin music taskforce proposal, User talk:Bunnypranav#BunnysBot: Tagging Women in Latin music category

Edit period(s): One time

Estimated number of pages affected: 667 (petscan:40576223)

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: Tags pages for Wikipedia:WikiProject Latin music/Women in Latin music taskforce by adding |women=yes to {{WikiProject Latin music}}. I plan to borrow code from User:Tenshi Hinanawi's bot with a similar task from https://github.com/TenshiSWR/TenshiBot/blob/main/tasks/task7.py with sincere thanks, I hope they don't mind. :)
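The tagging step described above can be sketched as a small wikitext transformation. This is an illustrative sketch only, not the borrowed bot code: the function name and regex are assumptions, the template and parameter names come from the task description, and nested templates inside the banner are not handled by this simplified regex.

```python
import re

def add_women_param(wikitext):
    """Insert |women=yes into an existing {{WikiProject Latin music}}
    banner, if the parameter is not already present."""
    pattern = re.compile(
        r"(\{\{\s*WikiProject Latin music\b[^{}]*)(\}\})",
        re.IGNORECASE,
    )

    def repl(m):
        body, closing = m.group(1), m.group(2)
        # Skip if the parameter is already set in any form.
        if re.search(r"\|\s*women\s*=", body, re.IGNORECASE):
            return m.group(0)
        return body + "|women=yes" + closing

    return pattern.sub(repl, wikitext)
```

A real run would load each talk page from the PetScan list with pywikibot and save the transformed text; a wikitext parser would be safer than a regex for banners containing nested templates.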

I would appreciate it if I could get an extended approval for similar requests involving only simple parameter additions, i.e. taskforce tagging for existing banners. Thank you.

~/Bunnypranav:<ping> 04:34, 16 November 2025 (UTC)[reply]

Discussion

Operator: Anomie (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:27, Friday, November 14, 2025 (UTC)

Function overview: Tag pages unambiguously eligible for WP:CSD#U6.

Automatic, Supervised, or Manual: Automatic

Programming language(s): Perl

Source code available: User:AnomieBOT/source/tasks/CsdU6Tagger.pm

Links to relevant discussions (where appropriate): Wikipedia:Village pump (policy)#Simplified solution, and Wikipedia:Village pump (policy)#CSD U6 implementation details in general.

Edit period(s): Daily-ish

Estimated number of pages affected: There are around 313,000 pages in the backlog, and recently 50–80 pages have been newly becoming eligible per day.

Namespace(s): User

Exclusion compliant (Yes/No): No

Function details: The bot will tag pages that are unambiguously eligible for WP:CSD#U6 with {{db-u6|bot=CSD U6 Bot|bot_timestamp=YYYY-MM-DD}}. The bot will tag all newly eligible pages (as of the approval of this BRFA), and will additionally tag 150 older pages each day.

A page is "unambiguously eligible" if all of the following conditions apply:

The bot will additionally create the daily Category:Candidates for U6 speedy deletion as of DATE category, using {{Db-u6/daily bot category}}.

The definitions of "unambiguously eligible" and the 150-per-day rate limit may be loosened by consensus at WT:CSD, WP:VPR, or WP:VPP.
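As an illustration of the tag format described in the function details, here is a minimal sketch of building the {{db-u6}} wikitext with the bot_timestamp in the stated YYYY-MM-DD form. The function name and the UTC date handling are assumptions; only the template and parameter values come from the description above.

```python
from datetime import date, datetime, timezone

def build_db_u6_tag(today=None):
    """Build the {{db-u6}} tag with today's date (UTC) as YYYY-MM-DD."""
    if today is None:
        today = datetime.now(timezone.utc).date()
    return "{{db-u6|bot=CSD U6 Bot|bot_timestamp=%s}}" % today.isoformat()
```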

Discussion

Some additional notes:

  • Currently the bot will process the old pages in order by last non-bot edit. If someone thinks it should do a different order, I'm open to suggestions.
  • {{db-u6|bot_timestamp=YYYY-MM-DD}} currently works similarly to WP:PROD, having a 7-day waiting period before the deletion. Whether people decide to change that at some point is outside the scope of the bot approval.
  • I decided to create a task-specific account for this task instead of running it on User:AnomieBOT because it seems reasonably likely that newbies may show up at the bot's talk page questioning why their page was tagged for deletion. It seems like it would be clearer for them if the page they reach is specific to that task instead of being mixed in with all the other stuff AnomieBOT does. I've asked Tamzin and Chaotic Enby to help in setting up the bot's userpage and talk page to be more newbie-friendly, and to watch the talk page to help any newbies that may show up there. I hope others will watch it too.
  • When doing a trial, I'll set the "newly eligible pages" date to the date the trial is approved for the duration of the trial.

Anomie 23:27, 14 November 2025 (UTC)[reply]

One more thought for exclusion criteria: You could avoid a lot of alts by checking for "create2"-type account creations [1], and then running the 0-mainspace-edit logic against the creating account itself. Procedural creations by ACC or event coördinators should generally be "byemail" rather than "create2", so shouldn't confound it much. Could also check userpage membership in Category:Wikipedia alternative accounts or its subcats. Technically that'd mean anyone could make their account exempt from this bot, but I don't think we're too worried about gaming here. -- Tamzin[cetacean needed] (they|xe|🤷) 01:21, 15 November 2025 (UTC)[reply]
Looks like there are about 1200 accounts created with "create2" that are in the backlog. Skimming through the reasons given, seems like a good number of the ones with a comment are ACC, education program, and editathon creations, and a bunch more have comments that seem like they lost the password or got blocked or UAA-ed. As for Category:Wikipedia alternative accounts or Category:Wikipedia doppelganger accounts, only 11 such accounts are in the backlog. Overall, it doesn't seem very worth the added complexity. Anomie 02:15, 15 November 2025 (UTC)[reply]

Operator: Phuzion (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 22:14, Tuesday, November 11, 2025 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): AWB

Source code available: AWB

Function overview: Will replace {{infobox mapframe}} instances in {{Infobox station}} using a substitution template

Links to relevant discussions (where appropriate): Bot request

Edit period(s): One time run

Estimated number of pages affected: ~3,900

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: This task will, in two cases, insert a substitution template into instances of {{Infobox station}}: The first case is when |mapframe= is equal to yes and when |mapframe-custom= includes the text {{Infobox mapframe. The second case is when |embedded= begins with {{Infobox mapframe.

This substitution template will replace the embedded mapframe template with native mapframe parameters of {{Infobox station}}, simplifying the layout of the infobox and taking advantage of the native parameters.

I have performed a couple of test edits on my main account to demonstrate what will be done: embedded, mapframe

From a technical perspective, this is a simple AWB find/replace with a fairly simple regex. The list of articles was pulled from Petscan.
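The two detection cases above can be sketched in regex form. This is a hedged illustration, not the actual AWB rule: the function name is an assumption, and a real implementation would parse the infobox parameters rather than scan raw text.

```python
import re

def needs_mapframe_substitution(text):
    """Detect the two cases described in the function details."""
    # Case 1: |mapframe= is "yes" and |mapframe-custom= embeds
    # {{Infobox mapframe.
    case1 = (re.search(r"\|\s*mapframe\s*=\s*yes\b", text) is not None
             and re.search(r"\|\s*mapframe-custom\s*=\s*\{\{\s*Infobox mapframe",
                           text) is not None)
    # Case 2: |embedded= begins with {{Infobox mapframe.
    case2 = re.search(r"\|\s*embedded\s*=\s*\{\{\s*Infobox mapframe",
                      text) is not None
    return case1 or case2
```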

Happy to answer any questions!

Discussion

Just wanted to voice my support for this run! Happy to help check the diffs for any issues. -Zackmann (Talk to me/What I been doing) 22:18, 11 November 2025 (UTC)[reply]

@Phuzion: one issue to watch out for... Nested infoboxes also calling the mapframe. See Allenhurst station for example... These should probably be skipped by your bot and manually fixed... -Zackmann (Talk to me/What I been doing) 05:57, 15 November 2025 (UTC)[reply]

Operator: Trappist the monk (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 01:39, Monday, November 10, 2025 (UTC)

Function overview: replace wikitable-based ship infoboxen with Module:Infobox-based {{Infobox ship}}

Automatic, Supervised, or Manual: Automatic

Programming language(s): c#

Source code available: see User:Monkbot/task 22: Replace table-based ship infoboxen

Links to relevant discussions (where appropriate): Wikipedia:Templates for discussion/Log/2022 April 30 § Template:Infobox ship begin; subsequent sporadic discussions at:

Edit period(s): one time

Estimated number of pages affected: ~41,000

Namespace(s): mainspace

Exclusion compliant (Yes/No): yes

Function details: see User:Monkbot/task 22: Replace table-based ship infoboxen

Discussion

Operator: DreamRimmer (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 10:06, Tuesday, October 28, 2025 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: TBD

Function overview: Mark as reviewed all the redirects in the NPP queue that are bolded in the lead of the target article.

Links to relevant discussions (where appropriate): Wikipedia talk:New pages patrol/Reviewers#Redirect backlog

Edit period(s): Daily

Estimated number of pages affected: The bot will process all redirects in the NPP redirect queue, which currently contains over 37,000 redirects, and mark those that meet the criteria. The first run will cover around 10% of the total redirects.

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: Mark as reviewed all the redirects in the NPP queue that are bolded in the lead of the target article.
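The "bolded in the lead" criterion above can be sketched as a wikitext check. This is an assumption about how the check might work (the source code is listed as TBD): the lead is taken as the wikitext before the first section heading, and matching is case-insensitive as a simplification. Marking the redirect as reviewed would then go through the PageTriage interface, which is not shown here.

```python
import re

def is_bolded_in_lead(redirect_title, target_wikitext):
    """Return True if the redirect's title appears as '''bold text'''
    in the lead (text before the first == heading) of the target."""
    lead = re.split(r"\n==", target_wikitext, maxsplit=1)[0]
    for m in re.finditer(r"'''(.+?)'''", lead):
        if m.group(1).strip().lower() == redirect_title.lower():
            return True
    return False
```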

Discussion

Operator: Sisyph (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 22:00, Thursday, October 23, 2025 (UTC)

Function overview: Update tennis rankings and career prize money for women tennis players

Automatic, Supervised, or Manual: Automatic

Programming language(s): pywikibot

Source code available: fr:Utilisateur:DSisyphBot/Script/màj tennis.py

Links to relevant discussions (where appropriate):

Edit period(s): weekly (following WTA updated ranking)

Estimated number of pages affected: ~100 pages per week

Namespace(s): main

Exclusion compliant (Yes/No): Yes

Function details: 1) Get the WTA profile from the Wikidata page. 2) Get data (best ranks + career prize money) from the WTA profile. 3) Update best ranks if needed, and update prize money if the change is over US$10,000, to avoid spamming edits for "small" earnings.

The bot has already edited pages in one loop. For inactive players, it is a one-shot update. For active players, there will be a weekly check. The next step will be to do the same for men's tennis players, using the atptour.com site to get ranking data. --Sisyph (talk) 22:00, 23 October 2025 (UTC)[reply]
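The prize-money threshold from step 3 can be sketched as a small decision function. Names are illustrative; the US$10,000 absolute threshold comes from the function details, and the optional relative threshold (e.g. 5% of the current figure) reflects a refinement the operator floats later in the discussion, not current behaviour.

```python
def should_update_prize_money(current, new, min_delta=10_000, min_ratio=0.0):
    """Decide whether the career prize money is worth an edit.

    min_delta: absolute change required (US$), per the function details.
    min_ratio: optional relative change required (e.g. 0.05 for 5%);
               defaults to 0 so only the absolute threshold applies.
    """
    delta = new - current
    return delta > min_delta and delta >= min_ratio * current
```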

Discussion

I've reviewed a lot of the edits and they seem fine to me. This saves hours of manual labour on updating rankings and prize money. Do you feel it would be within scope for a bot to be able to update the win and loss totals as well? The only downside is that we'll have to follow the bot and manually update the 'last updated' timestamp at the bottom of the infobox, unless the bot is smart enough to do that too. Spiderone(Talk to Spider) 13:53, 26 October 2025 (UTC)[reply]

I note when something similar came up recently, a suggestion was that this sort of thing should be done in Wikidata, or failing that a centralized data page (i.e. a template, a module, or a .json page that's read by a module), instead of making repeated bot edits to individual articles. Anomie 14:10, 26 October 2025 (UTC)[reply]

Hello, for win/loss totals, it is possible [2]. For the current rank, my issue is getting the date of that rank to mention in the infobox; it must be: <!--ONLY UPDATE WITH LAST DATE THIS RANKING WAS HELD, NEVER UPDATE UNTIL THE WTA WEBSITE IS UPDATED (date should be a Monday), THE REFERENCE DOES NOT NEED TO BE UPDATED -->, so it is not possible so far, as there does not seem to be a source to find it. The update field can be updated if it already exists [3]. As for Wikidata centralization, that would certainly be the best option, like Elo ranks for chess players, but I am not skilled enough to initiate it for tennis players. I will be able to update Wikidata if it is ever implemented. Sorry to have edited 2 more pages; it was for the 2 examples --Sisyph (talk) 22:25, 26 October 2025 (UTC)[reply]
@Sisyph: Do not allow the bot to edit the English Wikipedia again until it is approved for trial by a member of the Bot Approvals Group. This will include use of the {{Bot trial}} template. If the bot does edit again, the bot account may be blocked until a trial is approved. Anomie 23:17, 26 October 2025 (UTC)[reply]
Just to point out that it is pointless for this bot to update career prize money in tennis player infoboxes while the rest of the statistics remain unchanged. Either have it update everything (win/loss records, rankings, prize money) or stop doing it. The current practice is misleading and inaccurate. I have posted this message on the bot's talk page too. Shrug02 (talk) 09:02, 29 October 2025 (UTC)[reply]
Hello Shrug02, I can understand your view. I don't agree that it is inaccurate, but I concede that updating only the prize money could be pointless for some players. My fear with not updating this field independently (because, yes, the bot could update it only when the rankings or win/loss records change) is that the bot would then never update players who have already reached their highest rank. Currently the bot updates the prize money when it changes by over $10,000, to avoid edits for insignificant earnings. But I can change that to a minimum of $10,000 AND 5% of the current prize money in the Wikipedia data; for a player who has already reached $1 million, that represents a $50,000 difference before an edit is made. --Sisyph (talk) 23:35, 29 October 2025 (UTC)[reply]

Operator: Aydoh8 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 04:28, Sunday, October 19, 2025 (UTC)

Function overview: Correcting DMY-formatted dates to MDY on articles tagged with {{use MDY dates}}, and vice versa.

Automatic, Supervised, or Manual: Supervised

Programming language(s): Python

Source code available: [4]

Links to relevant discussions (where appropriate):

Edit period(s): Daily

Estimated number of pages affected: Will check one page approximately every 3-5 seconds. If that page does include dates needing to be changed, it will take approximately 7 seconds (in testing) to complete before checking the next page.

Namespace(s): Mainspace

Exclusion compliant (Yes/No): Yes

Function details: This bot will run through pages in mainspace. It will check each page for {{use dmy dates}} or {{use mdy dates}} templates, and if exactly one of those is found, it will check any dates (both in plaintext and in certain date templates) and correct those that are incorrectly formatted. I have conducted testing in the bot's userspace (see the bot's contribs) and have fixed the bugs discovered in testing. As a side note, I have added an exception for references to the January 6 United States Capitol attack by blocking the change of "January 6" to "6 January" on any article (this may lead to false negatives, but I would rather have false negatives than false positives).
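The MDY-to-DMY direction, including the "January 6" exception, can be sketched as below. This is a deliberately narrow illustration, not the linked source: it handles only plaintext dates, and ignores date templates, quotes, links, and filenames, which (as the discussion notes) a real bot must exclude to avoid WP:CONTEXTBOT problems.

```python
import re

MONTHS = ("January|February|March|April|May|June|July|August|"
          "September|October|November|December")

def mdy_to_dmy(text):
    """Convert plaintext MDY dates to DMY, never touching 'January 6'."""
    def repl(m):
        month, day, year = m.group(1), m.group(2), m.group(3)
        if month == "January" and day == "6":
            # Exception described above: leave January 6 references alone.
            return m.group(0)
        return "%s %s %s" % (day, month, year)

    return re.sub(r"\b(%s) (\d{1,2}), (\d{4})\b" % MONTHS, repl, text)
```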

Discussion

This seems very liable to run into WP:CONTEXTBOT issues. How will your bot avoid editing direct quotes, things besides "January 6", and so on? Glancing at your linked code, it looks like it would even break links and filenames if they happen to contain something that resembles a month and year. Anomie 00:57, 21 October 2025 (UTC)[reply]

Agreed. If I were this editor, I would plan to make at least 1,000 supervised edits at a reasonable pace using the intended script, checking each of the script's proposed changes before and after publishing. I think I would find that the script has some shortcomings. If you can address them, this bot process may be worth pursuing. Note that editing in this manner does not violate the bot policy, although you may find it tedious. – Jonesey95 (talk) 14:51, 21 October 2025 (UTC)[reply]
Note that editing in this manner does not violate the bot policy Agreed. It would fall under Wikipedia:Bot policy#Assisted editing guidelines, which has some useful information. Anomie 15:05, 21 October 2025 (UTC)[reply]

Bots in a trial period

Operator: Scaledish (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 12:58, Tuesday, September 16, 2025 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: GitHub

Function overview: Update US settlement census data

Links to relevant discussions (where appropriate): Request 1 · Request 2

Edit period(s): Yearly; new estimates released yearly

Estimated number of pages affected: Unknown; likely in the low tens of thousands

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details:

  • Doesn't add to a template when multiple instances of it appear on the same page
  • Doesn't overwrite existing info that is the same age or newer
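The first safeguard above can be sketched as a simple instance count. The function and template name are illustrative (the real target templates are not named here), and a wikitext parser would be more robust than this regex for templates embedded inside other templates.

```python
import re

def has_single_instance(wikitext, template_name):
    """True when exactly one instance of the template appears on the page,
    i.e. the only case in which the bot would edit it."""
    pattern = r"\{\{\s*%s\s*[|}]" % re.escape(template_name)
    return len(re.findall(pattern, wikitext, re.IGNORECASE)) == 1
```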

Discussion

Supervised Test 1 & Supervised Test 2 Scaledish! Talkish? Statish. 13:06, 16 September 2025 (UTC)[reply]

Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Since this is your first bot task, I am treating this as a one-off task. For future years, a new BRFA will be needed, and then we can see if it can be approved to run annually. – DreamRimmer 13:58, 24 September 2025 (UTC)[reply]

{{Operator assistance needed}} Anything on the trial? Tenshi! (Talk page) 11:52, 7 October 2025 (UTC)[reply]

Hi, the trial is not yet concluded.
As part of the trial, the bot was run twice, both times being stopped after eventually forming a false association between the database and an article. This led to the conclusion that the matching script needs to be improved significantly, which I will do but haven't yet had the time for. I still believe a reasonable fix is possible. Likely, as part of this, a semi-supervised confidence approach will be adopted: if confidence isn't overwhelmingly high, the association is sent for manual review.
Also as part of the trial, an additional issue was identified. If the infobox population figure is from before 2010, is cited using a named reference, and that reference is reused elsewhere in the body, a cite error is caused because those references become dangling. This may be a simple fix, but it needs to be implemented.
When both of these fixes are implemented, I plan to resume the bot for the remaining ~25 trial edits. Afterwards, I will request an additional 50 trial edits. Scaledish! Talkish? Statish. 17:16, 7 October 2025 (UTC)[reply]

{{Operator assistance needed}} Any progress on the fixes? Tenshi! (Talk page) 12:32, 7 November 2025 (UTC)[reply]

I apologize for the delay, my real life workload is roughly cyclical—you can see that reflected in my xtools stats. I expect to be able to work on it again within a week or two. Scaledish! Talkish? Statish. 19:55, 7 November 2025 (UTC)[reply]

Operator: GalStar (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 21:00, Wednesday, July 2, 2025 (UTC)

Function overview:

Automatic, Supervised, or Manual: Automatic

Programming language(s): Rust/Python

Source code available: Uses mwbot

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Redirects_related_to_those_nominated_at_RfD and Wikipedia talk:Redirects for discussion#Avoided double redirects of nominated redirects

Edit period(s): Continuous

Page: Wikipedia:Redirects_for_discussion

Exclusion compliant (Yes/No): Yes (but N/A)

Adminbot (Yes/No): No

Function details:

  • Look at each RFD on each RFD Page
  • Determines whether there are any other redirects, in any namespace, that meet one or more of the following criteria:
    • Are marked as an avoided-double redirect of a nominated redirect
    • Are redirects to the nominated redirect
    • Redirect to the same target as the nominated redirect and differ only in the presence or absence of non-alphanumeric characters, and/or differ only in case
  • If the bot finds any redirects that match and are not currently nominated at RfD, it will post a message in the discussion (final details of the message are TBD, but the bot request outlines the general point). The bot limits the length of its message, ensuring that the RfD is not over-cluttered.
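The third matching criterion above can be sketched as a normalization key: two titles "match" when they differ only in case and/or non-alphanumeric characters. The names are illustrative, and note the caveat raised in the discussion below about diacritics: with the ASCII-only character class used here, "Café" and "Cafe" do not normalize to the same key, so how diacritics are treated depends on the chosen definition of "alphanumeric".

```python
import re

def normalized_key(title):
    """Strip non-alphanumeric (ASCII) characters and lowercase."""
    return re.sub(r"[^0-9A-Za-z]+", "", title).lower()

def trivially_related(title_a, title_b):
    """Titles differing only in case and/or non-alphanumeric characters."""
    return normalized_key(title_a) == normalized_key(title_b)
```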

Discussion

Thanks for working on this GalStar, but it's not clear whether it is checking for redirects that differ only in the presence/absence of diacritics? Thryduulf (talk) 23:41, 2 July 2025 (UTC)[reply]

Diacritics fall under non-alphanumeric characters. GalStar (talk) (contribs) 16:48, 3 July 2025 (UTC)[reply]

Approved for trial (30 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. – DreamRimmer 06:35, 8 July 2025 (UTC)[reply]

{{Operator assistance needed}} Anything on the trial? Tenshi! (Talk page) 18:54, 4 August 2025 (UTC)[reply]

Am on vacation, expect updates in a few days. GalStar (talk) (contribs) 15:48, 11 August 2025 (UTC)[reply]
 On hold until RfD accepts my proposal to use a new templating system, one that is more friendly to bots. — Preceding unsigned comment added by GalStar (talkcontribs) 05:57, 25 August 2025 (UTC)[reply]
 On hold. For AnomieBot. Tenshi! (Talk page) 15:12, 25 August 2025 (UTC)[reply]
@GalStar: Is there a reason why you can't just use regex to find each nomination and use the information from that? For example, TenshiBot's unlisted copyright problems report looks for copyright problems in the subpages which use substed {{article-cv}} (regex: [5], although the script knows the names of the pages already, I imagine it wouldn't be too hard to get that from RfD subpages). Tenshi! (Talk page) 20:57, 6 September 2025 (UTC)[reply]
Thanks for pointing this out. I was trying to do this the "right" way with wikicode parsing, but I'll take a look at regex. GalStar (talk) (contribs) 04:13, 8 September 2025 (UTC)[reply]
{{Operator assistance needed}} Any update? – DreamRimmer 09:17, 29 September 2025 (UTC)[reply]
I'll take a look this week and see if I can finish implementation. GalStar (talk) (contribs) 05:20, 5 October 2025 (UTC)[reply]
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) Anything on the proposal or implementation? Tenshi! (Talk page) 15:13, 5 November 2025 (UTC)[reply]

Bots that have completed the trial period

Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.


Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.