User talk:DErenrich-WMF/Add A Fact Experiment

Please feel free to leave feedback here!

Thanks DErenrich-WMF (talk) 15:50, 9 August 2024 (UTC)[reply]

'Spam' concerns?

My initial gut reaction to this is that it seems great, but I also see how it could, especially on breaking stories, pollute talk pages with impetuously created topics from users acting in good faith but without much consideration. Mittzy (talk) 20:54, 26 September 2024 (UTC)[reply]

Oh, and I do hope that there are not plans to create a similar app that modifies articles directly. While automated edits are useful for housekeeping, I think that major (non-minor per WP:MINOR) changes to a page should be authored with care. Mittzy (talk) 20:57, 26 September 2024 (UTC)[reply]
Agreed on the above, @Mittzy! Tod Robbins (talk) 21:07, 26 September 2024 (UTC)[reply]
AFAIU, currently this requires an autoconfirmed Wikipedia user, so the spam potential is reduced.
It could perhaps also be nice if the extension could check whether the talk page already mentions this, and then offer to add it as a reply (if the source is not the same). Aveaoz (talk) 21:47, 26 September 2024 (UTC)[reply]
I think we've seen from newcomer tasks that any structure that encourages specific editing behavior will produce a significant volume of non-constructive edits. Volume alone doesn't tell the whole story here, as some edits are less constructive/more destructive than others. With that in mind, I think both that and this have the potential to be a net positive for the site, while not unduly burdensome for those trying to maintain the existing quality of articles. Remsense ‥  05:02, 27 September 2024 (UTC)[reply]

Thanks everyone for the feedback! This is all really helpful. Yes, polluting/spamming talk pages is definitely something we're thinking about, and the talk page might not be the right long-term place for this kind of information. If we decide to move forward with this concept we'd definitely want to check that duplicative facts are not being added (either by not letting you add the source or by adding it as a reply). Currently the edit rate is low enough that we skipped that for the purposes of getting this into people's hands quickly. DErenrich-WMF (talk) 05:29, 27 September 2024 (UTC)[reply]

One other concern I hold is that Wikipedia can be a highly technical / in-depth resource... The potential for largely automated inclusion of short 'fact' or 'fun-fact' type segments in the middle of an article does not, I feel, fit too well with Wikipedia's goal. As I said last night, I think that such automated / 'AI powered' tools pose quite a risk to the integrity of Wikipedia if they are ever allowed to be used to directly edit articles. Mittzy (talk) 15:12, 27 September 2024 (UTC)[reply]
Thanks for sharing your thoughts, @Mittzy! Did you get a chance to try out the extension and play around with fact/claim selection? It's interesting that you mention "short 'fact' or 'fun-fact' type segments" because our intention was to facilitate adding important new information on topics that may not have yet been updated, not fun facts or trivia. There are many articles (especially on less-popular topics) that haven't been updated to reflect new information/sources because no Wikipedian is closely watching/working on them, but people might be encountering new information on these topics outside of Wikipedia.
But I can see how adding random fun facts could be something a user of this could do. Tangentially related to this experimental browser extension (which we're just demoing/getting feedback on, not planning to make available to non-editors), but I am curious if you think there is any place anywhere on Wikipedia (maybe not in the article directly) for non-Wikipedians being able to submit well-sourced "fun facts/trivia"? I.e. something like a reader version of DYK? Maryana Pinchuk (WMF) (talk) 20:49, 27 September 2024 (UTC)[reply]
Hey! I have downloaded and looked at the extension, but have not used it to add anything to the site as of yet. I agree with what you say and understand the intended use of the extension; my concern was more about what would happen if the extension became popular among 'reader' users, and people were to submit lots of insubstantial tidbits (this is what I meant by "fun-fact", though I admit it was a poor choice of words) and pieces of information that are technically true but not substantial enough to warrant inclusion.
Though, as you say, a dedicated place for non-wikipedians to submit well-sourced fun facts (now using the term fun fact to mean trivia) would be a wonderful thing. I'm not aware of such a place, but the idea sounds like it has a lot of potential. :D Mittzy (talk) 22:03, 27 September 2024 (UTC)[reply]

Does not function on Opera GX

I immediately downloaded this extension onto my Opera GX browser after being intrigued by the demo video and tested it out. I searched for an external website I'd been looking at for a while, highlighted a sentence and clicked the extension icon, but nothing happened. I tried toggling all sorts of settings to no avail. Am I doing something wrong or is the extension just not compatible with Opera GX? SleepDeprivedGinger (talk) 21:17, 26 September 2024 (UTC)[reply]

Hey, thanks for the feedback. We did not test this on Opera GX and I do not have a lot of experience with that browser so it's not surprising that it doesn't work. That said I don't think we did anything that would make it explicitly not work. DErenrich-WMF (talk) 05:32, 27 September 2024 (UTC)[reply]
That's alright, do you at least plan on adding support for this browser in future? SleepDeprivedGinger (talk) 11:59, 27 September 2024 (UTC)[reply]
There are no concrete plans at the moment. If we decide to turn this experiment into a permanent project, then these are the kinds of things that we'd work on. DErenrich-WMF (talk) 21:38, 27 September 2024 (UTC)[reply]

Safari support

I use Safari and would try this extension out if it had a Safari version. I understand that this would require quite a lot of extra work, so it might make sense to hold off on Safari support until and unless Add A Fact exits the experiment stage. Once that happens, reaching as many editors as possible should be a priority, and I hope that the existing Wikipedia app and Apple developer account can streamline the process of getting the extension onto Safari. Ilovemesomenachos (talk) 02:44, 27 September 2024 (UTC)[reply]

Yeah we targeted Firefox and Chrome first because of their wide use in the community. Wider support would be on the table if this exits the experimental stage. DErenrich-WMF (talk) 05:34, 27 September 2024 (UTC)[reply]

Oppose on principle

Just logging that I oppose any use of "AI" on Wikipedia on principle, for reasons that should be obvious. MRSC (talk) 04:03, 27 September 2024 (UTC)[reply]

I remain similarly opposed to AI LLMs directly adding content to Wikipedia, especially because of model collapse as AI bots could begin training themselves on potentially incorrect portions of Wikipedia that other AI bots wrote. However, I think you should give this proposal a second look, since this AI implementation only checks whether a statement is already represented on the articles that seem most closely related to the claim. If the statement does not seem to be represented, then human users receive the option to use a non-AI script for proposing the fact to talk pages. BluePenguin18 🐧 ( 💬 ) 07:16, 27 September 2024 (UTC)[reply]
I'm also inclined in principle against AI in Wikipedia. But I'll try to keep an open mind, and will at least give the extension a try. Mike Marchmont (talk) 12:08, 27 September 2024 (UTC)[reply]
Hey, thanks for the feedback even when it's negative. I would actually appreciate more detail on your reasons for opposing applications of AI in Wikipedia. AI is actually already in use in various ways, e.g. vandalism detection, and is planned for more applications (see [1] or [2]). It'd be useful to know why the community opposes it (e.g. is it energy use, copyright, reliability, etc.). DErenrich-WMF (talk) 15:46, 27 September 2024 (UTC)[reply]
The worst problem is a positive feedback loop on "info that's popular" as opposed to "info that's notable and verified" - LLMs introduce a severe information pollution problem. Conformism at its worst. "Four legs good, two legs bad" - it's true because it's true, because that's what everyone says.
Could you please point to the RfC on en.Wikipedia where this proposal was approved prior to being inserted in the rendered form of en.Wikipedia pages to logged-in users? Or was this a WMF idea imposed on the community without prior approval? Boud (talk) 23:02, 27 September 2024 (UTC) (minor edit to clarify Boud (talk) 13:46, 29 September 2024 (UTC))[reply]
Regarding copyright violation: this extension uses the LLM exclusively to search Wikipedia for articles related to the selected text and to check if the claim is already present in the article. It doesn't output any text. It's also Llama 3, not that it changes anything. Bertaz (talk) 21:47, 28 September 2024 (UTC)[reply]
There is a WP: space page somewhere that says that we're not supposed to link to any websites known to violate copyright. But here you are proposing that directly benefiting from massive copyright violation should be encouraged. If the LLM exclusively searches Wikipedia, then why not use an LLM trained just on Wikipedia instead of LLaMa3, which is severely polluted by general web content (Common Crawl)? That would look a lot less hypocritical in terms of WP:COPYVIO. Llama (language model) says that LLaMa3 is based on text generated from LLaMa2, but doesn't state the dependence of LLaMa2 on LLaMa1, which is why I mentioned the latter as a caveat. Boud (talk) 13:46, 29 September 2024 (UTC)[reply]

What LLM is being used?

Curious about potential bias. (Haven't been able to try it yet.) RememberOrwell (talk) 06:30, 27 September 2024 (UTC)[reply]

The extension description states that it uses Llama3-70b. Bertaz (talk) 07:44, 27 September 2024 (UTC)[reply]

Keep suggestions on the talk page

Articles like Alcaligenes faecalis are not written as a loose collection of every time their subject is mentioned, instead requiring humans to weave reliable sources to comprehensively describe their topic. Using LLMs to assess whether certain statements are represented on the associated articles is an innovative approach, but I caution against posting these suggestions anywhere besides the talk page. Whereas the talk page can be easily archived to resolve spam during breaking news (though I like Aveaoz's idea to have the LLM check against existing talk page proposals), it would be difficult to clean up if we allow these proposed facts to be posted into the article itself, such as through invisible comments. I think cautioning against unreliable and deprecated sources is sufficient because there are productive ways to use every source. BluePenguin18 🐧 ( 💬 ) 07:08, 27 September 2024 (UTC)[reply]

Thanks, @BluePenguin18! One piece of feedback we've heard is that talk pages aren't the ideal place for article suggestions, because many of them are dormant/unwatched. I agree that letting people add these suggestions directly to the article is risky, but I'm wondering if there could be a more actionable "holding space" than a talk page that doesn't become yet another moderation queue/backlog. What do you think about sending these to the relevant WikiProject's talk page? (Though the same concern as with article talk applies, in the sense that many WikiProjects are also pretty dormant, I wonder if this might make it more likely for someone to look at/do something with a suggestion.) Do you have other ideas like that, to put these in front of Wikipedians who'd be inclined and excited to do something with good suggestions? Maryana Pinchuk (WMF) (talk) 20:55, 27 September 2024 (UTC)[reply]
First, many articles do not have talk pages sorting them to their WikiProject. Second, many articles have multiple WikiProjects, and if the script posted to all of them, we would have to mark all the posts as completed after finishing the single task of accepting/declining the proposal. Third, as you note, many WikiProject talk pages are inactive, even in cases where the articles under their purview are regularly edited. Sticking with the A. faecalis example, WikiProject Microbiology's talk page has gone untouched for three months, yet hundreds of articles on microbes are regularly edited each month. I recognize that many editors ignore article talk pages, but personally, I always check them for suggestions before beginning extensive editing. BluePenguin18 🐧 ( 💬 ) 21:30, 27 September 2024 (UTC)[reply]

Please enable this for draft space

I tested this against three articles in draft space - Draft:Marshall L. Stephenson, Draft:Murder in New Jersey law, and Draft:William J. Harbison, each using sources and text selections that were clearly on point, and in each case, the tool 1) failed to detect that a relevant draft exists (in the first and third instance, it found nothing; in the second, it found various articles secondarily related to murder in New Jersey); and 2) when prompted with the title of a draft article, found the draft but then spun its wheels endlessly without getting to the step of offering to add a note to the draft talk page. This is frankly a perfect tool for the way I write drafts, it just needs to be able to find them. BD2412 T 12:17, 27 September 2024 (UTC)[reply]

This is really useful feedback, thank you. Currently, the way it is coded it isn't easy to let it auto-find drafts, but adding support for manually tagging drafts is something we could consider doing in the short term. But improving the draft situation overall is a good idea. DErenrich-WMF (talk) 15:42, 27 September 2024 (UTC)[reply]
Actually, it would be nice to be able to start a new draftspace draft from such a tool. BD2412 T 12:25, 30 September 2024 (UTC)[reply]

AI, really?

This seems like a terrible idea; AI spits out nothing but garbled false statements. Kirby54 — Preceding undated comment added 13:35, 27 September 2024 (UTC)[reply]

Hey Kirby54, see my reply to MRSC above noting that this AI implementation is just checking whether the claim is documented on the relevant Wikipedia pages. It is not attempting to propose a specific insertion of text into those articles. Thus, an error-prone LLM would not be capable of adding falsehoods to Wikipedia pages in this experiment. BluePenguin18 🐧 ( 💬 ) 17:23, 27 September 2024 (UTC)[reply]
Thanks for chiming in here & elsewhere @BluePenguin18 on the question of how AI is being used here. You are correct that we are using AI in this tool only to search Wikipedia and retrieve back some contextual information to the user of the extension (i.e., whether or not there is a relevant article on this topic on Wikipedia – though the user can also just search manually; and whether or not the claim that the user has found/selected is already contained in the article). AI is not involved in writing any text that is posted to the talk page. I'll elaborate on this in the FAQ so hopefully it's less confusing to others! Maryana Pinchuk (WMF) (talk) 21:03, 27 September 2024 (UTC)[reply]

Well... where to begin?

  1. I agree completely with BluePenguin18 above about the genesis and development of articles like Alcaligenes faecalis: it's a really good and concise summary of how WP should and can work. But thousands of less experienced editors really don't get it. As a professor of Mediaeval Literature pointed out to me last year, Wikipedia can be seen as the interface between academia (often dealing with highly technical subjects) and the general public: and such snippets of 'breaking news' detract from the long-term validity of similar articles.
    1. BTW, Alcaligenes f. has already been edited on this very subject (apparently deliberately ignoring the example in the video) by someone whose edit history and talk page do not inspire confidence, possibly just to get there first. I'm not qualified to judge whether it's a useful edit.
  2. This "Add a fact" tool works in exactly the opposite manner to the way I (and I suspect many others) actually edit on WP. If I do happen to come across something useful relating to an article I am familiar with, I will open it in the editor, add the relevant info in the article text and make a full cite there and then. Many of the articles I edit depend on scholarly articles, usually via pdfs, which the tool is not capable of addressing.
  3. The very idea of "Facts" is barely consistent with the fundamental concept of Wikipedia's approach to articles. It could easily be said that there are no facts, only opinions, and WP attempts to balance them using the best of reliable sources.
  4. Who is actually going to be doing the work once this factoid (cue breathless "Did you know...") has been posted on the talk page? The result, as shown in the video, is incapable of being transferred into the article without further effort from an experienced editor: if it is not simply ignored, someone will have to use their time and effort to give reasons why it's not suitable for inclusion. Only a few of these posts could easily lead to exasperation.
  5. Use of this tool could easily encourage lazy spamming of talk pages by people eager to increase their edit count.
  6. I find that the use of |quote= makes it relatively hard to distinguish the quote from the cite.
  7. I suggest you re-record the video, speaking at approximately half the speed. I know you are very familiar with the tool, having developed it yourself, but as an intelligible introduction to a noob I'm afraid it fails. Your speech comes across as almost garbled. Please take your time.
  8. The foisting of AI on Wikipedia and Wikimedia has not yet found wide approval. Although there may be positive reasons for using the hallucinatory, imitative mumblings of a village idiot, I for one remain utterly unconvinced. MinorProphet (talk) 14:01, 27 September 2024 (UTC)[reply]
    Agree with all statements regarding Wikipedia. Mittzy (talk) 15:08, 27 September 2024 (UTC)[reply]
    The "Add a Fact" workflow of proposing statements is obviously very different from how experienced editors approach articles, but I think DErenrich-WMF is reasonably claiming that the act of proposing these facts would serve as a gateway to mainspace editing for new editors. I do not think that this tool would be abused for edit count purposes, simply because there are much easier and more fulfilling ways to boost one's count as a gnome. Regarding AI hallucinations, note that this implementation only uses the LLM to check whether the claim is represented on associated articles. It does not propose a particular text insertion to the talk page. BluePenguin18 🐧 ( 💬 ) 18:22, 27 September 2024 (UTC)[reply]
Thanks for clearing these points up. MinorProphet (talk) 22:49, 27 September 2024 (UTC)[reply]

Does it, or does it not, work with Firefox?

The description of the tool states, "Add A Fact is available for the Google Chrome and Mozilla Firefox browsers." And there is a comment (above) which says, "Yeah we targeted Firefox and Chrome first because of their wide use in the community".

So I installed it in Firefox. It seemed to install OK, but I couldn't get it to work. So I delved into the docs, and I found this statement in the FAQ: "For technical reasons the extension will not just work on Firefox due to some manifest 3 API's that Firefox doesn't support".

If it is not supported in Firefox, I would understand that. I am aware of the issues of browser compatibility. And I congratulate you on what you have achieved so far. But can you perhaps amend the information to clarify this point? (And if it is not supported by Firefox, perhaps it should be removed from the FF add-ons page?) Mike Marchmont (talk) 16:35, 27 September 2024 (UTC)[reply]

I'm running Firefox 130.0.0 and it's working for me. Which version do you have installed? Tod Robbins (talk) 16:45, 27 September 2024 (UTC)[reply]
I just tried it in Firefox and it worked for me (see screenshot). Could you tell us more about your setup so we can investigate why it's broken for you? DErenrich-WMF (talk) 21:33, 27 September 2024 (UTC)[reply]
OK, I'm running FF 127.0. Not the latest version, but close. This is what I did:
* Installed the extension in FF.
* Opened it in the sidebar.
* Highlighted a passage in a web page.
This is how it looks:
I'm now at the point where it says "run the extension". But how to do that? I can't see a button or a menu option or anything else that will allow me to run it.
(I hope the screen shot is clear.)
(By the way, I have enabled all permissions for the extension.)
Mike Marchmont (talk) 15:09, 28 September 2024 (UTC)[reply]
It should autorun when you select text with the sidebar open. If that doesn't work, try closing the sidebar, selecting some text and clicking on the extension icon. If the sidebar reopens with the instruction (like in your screenshot) select some other text.
For me the extension only works after I select some text while the sidebar is open. If I select the text before opening it, the extension is unable to grab the selected text. Bertaz (talk) 21:34, 28 September 2024 (UTC)[reply]
@Bertaz:, thanks for your suggestions, which I have now tried, but still without success. But never mind. I won't take this any further for now. I was really just experimenting with the extension, more out of curiosity. I might come back to it another time.
That said, I do feel that the reference to Firefox in the FAQ should be updated, where it says, "For technical reasons the extension will not just work on Firefox due to some manifest 3 API's that Firefox doesn't support". Mike Marchmont (talk) 14:07, 30 September 2024 (UTC)[reply]

Please explain the required permissions

So I said to myself, why not give it a try, and followed the link to the Firefox Add-Ons store. Lo and behold, the extension wants to be able to read all the data on all websites. Which, of course, includes anything entered on a form on any website, including usernames, passwords, bank card numbers, CVV2 numbers, and so on. Of course, I politely declined the installation of the extension. An explanation of the required permissions and why they are needed would be most welcome, as would be an official statement from the Wikimedia Foundation regarding the collection and use of data. Imerologul Valah (talk) 21:17, 27 September 2024 (UTC)[reply]

@Imerologul: This is a really good question and I knew this would be problematic for some. We requested those permissions because we need to be able to tell what text you have selected. To do that we need to be able to run JavaScript in the context of the page (which can be any page). If I'm mistaken and there's a way to get around this we'd definitely want to do it that way. If it's any consolation the code is open source and it should be easy to check it isn't malicious and build it yourself. DErenrich-WMF (talk) 21:30, 27 September 2024 (UTC)[reply]
@DErenrich-WMF: What I was trying to say is that the need for powerful permissions should be highlighted on the extension's home page in the Add-Ons store, with an explanation of why they are needed. And an official statement from Wikimedia Foundation saying that no personal data will ever be collected and the extension will never be sold to shady actors would be most welcome, because developers change, they move to other projects, and generally stuff happens during the lifetime of a piece of software. Imerologul Valah (talk) 20:04, 28 September 2024 (UTC)[reply]
I agree that an explanation of the permissions would be good. But a statement won't prevent personal data collection. If for whatever reason the maintainers of the extension decide to start collecting data, they can do so even leaving the statement up. The best way to avoid any of these scenarios is by keeping the extension open source. So that anyone can check what the extension actually does.
Also, the permission "Access your data for all websites" is needed to use the selected text from any website the user browses. Even if a user preferred manually approving each website in advance, this is not possible on Firefox. See [3][4]
@DErenrich-WMF: Wouldn't dropping "https://*/*" from "host_permissions" lose no functionality, given that the extension already declares the "activeTab" permission? Maybe there's some technical limitation I'm not considering. Because just "activeTab" won't trigger the "[…]all websites" permission. Bertaz (talk) 23:07, 28 September 2024 (UTC)[reply]
Thanks for the suggestion. I'll take a look at this this week. I do know we had internal conversations when we needed to increase the permissions and thought there was no workaround. This was a while ago so I may have explained it incorrectly and we may have been wrong.
I'm not a lawyer but it is my understanding that the privacy policy covering data collected by this extension follows the same rules as that used on Wikipedia. DErenrich-WMF (talk) 05:08, 29 September 2024 (UTC)[reply]
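A minimal sketch of what the suggestion above could look like in a Manifest V3 background script, assuming the extension only needs the user's current selection at the moment they invoke it; the code below is illustrative, not the Add A Fact source:

```typescript
// background.ts – illustrative sketch only.
// With "activeTab" and "scripting" declared in the manifest (and no
// "https://*/*" host permission), the extension gets temporary access to a
// page only when the user explicitly invokes it, e.g. by clicking the
// toolbar icon, instead of being able to read every page as it is browsed.
chrome.action.onClicked.addListener(async (tab) => {
  if (tab.id === undefined) return;
  const [injection] = await chrome.scripting.executeScript({
    target: { tabId: tab.id },
    // Runs inside the page and returns only the user's highlighted text.
    func: () => window.getSelection()?.toString() ?? "",
  });
  const selectedText = (injection.result as string) ?? "";
  // Only the selection (and the tab URL) is passed on, e.g. to search
  // Wikipedia for the claim; nothing else on the page is read.
  console.log("Selected text:", selectedText, "from", tab.url);
});
```

Whether this approach fits the extension's sidebar-based flow is the open question discussed above.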

Why are we trying to (1) Use AI; (2) Encourage editing off-site; (3) Risk flooding of mainspace with drive-by gunk?

Inquiring minds want to know. Looks on the face of it like another terrible idea from the bureaucracy. Carrite (talk) 22:48, 27 September 2024 (UTC)[reply]

First, AI is only being used to analyze whether the highlighted statement is represented on associated Wikipedia pages. Second, I think that using this off-site extension could ultimately be a gateway to on-site editing by having novices see how editors act on their talk page proposals. Third, regarding drive-by posting, I am in agreement that the tool should scan whether the talk page already has a pitch to add the proposed fact. BluePenguin18 🐧 ( 💬 ) 00:43, 28 September 2024 (UTC)[reply]
As I pointed out earlier, the term "fact", proposed or otherwise, has no place on WP, which is not Speakers' Corner. ;) MinorProphet (talk) 15:34, 28 September 2024 (UTC)[reply]

Doubtful about the fundamental premise of a tool like this

Thanks for experimenting with this tool, I appreciate the time and effort invested. However, I believe tools like this are fundamentally flawed from the outset. My reasons are as follows:

  1. Everyone agrees that an LLM writing any articles is a terrible idea, so the best it can do is leave little notes in the suggestions box, which is what this extension does essentially.
  2. However, the difficult part of writing Wikipedia is not finding random titbits of information; it is integrating them into the text of all the relevant articles (of which there might be many), checking the sources, establishing their credibility, ensuring articles aren't obviously contradicting each other, and that they don't have repeated and/or contradictory claims internally resulting from many independent edits. Those are all tasks for a subject matter expert, or at least a dedicated layman who can invest the time in gaining enough insight in a given field to be able to contribute. This tool helps with none of that.
  3. Thus, we now have a tool that does not help with any of the hard parts, and automates the easy part. The best this can result in is a huge backlog of suggestions that will be ignored by human editors as essentially equivalent to spam; at worst this will drown out actual meaningful discussions on talk pages and potentially create a dangerous precedent amongst less experienced users who might believe that those are things that should be added to articles and aren't merely suggestions to be carefully vetted.
  4. Lastly, I strongly object to the use of "facts" in the description of this extension. Those are, at best, claims, and random places on the Internet do not necessarily contain any facts, so calling them that is actively harmful to the critical thinking we're trying to cultivate, IMHO.

In other words, as with many other "AI" applications, this is a tool that doesn't do what would actually be useful, but rather does what LLMs can achieve easily. I don't think shoehorning this into the editorial process, or even suggesting to users that it is potentially a good idea, is wise.

If you want to make a tool where an LLM could genuinely make the process easier, how about a tool that tries to detect repetitions and/or contradictions in articles? A little summary of "these paragraphs seem redundant", where I can click and jump to the relevant place in the article and see it in context easily would be very useful. Bonus points if it could figure out which articles are related and either repeat the same information instead of referring to a main article, or contradict it. This is an extremely widespread problem affecting many articles and one that is very tedious to deal with manually. mathrick (talk) 02:06, 28 September 2024 (UTC)[reply]

I just want to drop in and say that this tool is perfect for an editor like me, because I do enjoy the integrative part, and am able to do that better and faster with a tool that drops in the quote and the cite. I may be an outlier in that regard, but there are some of us who do this. In practice, however, I am not going to use it to find the right article for the information so much as to put the information where I already know it should go. BD2412 T 02:30, 28 September 2024 (UTC)[reply]
Maybe there should be a setting to activate or disable it for those who need it. It is a great tool, but as it is now it is going to be overused beyond its true purpose; the talk page won't stay relevant as it fills with information other people suggest. Some just want a high number of edits, and this is an opportunity to quickly focus on this type of generated facts. Does Wikipedia have to include every fact that is available from any part of the internet? With this in place, I expect to see articles or talk pages expand in just a few weeks if not days. Dwaynemoony (talk) 20:03, 4 October 2024 (UTC)[reply]

Comment by Imerologul

Please note that, at least on Firefox, this extension requires the permission to read all the data you enter on any web page, including usernames and passwords. Imerologul Valah (talk) 21:07, 27 September 2024 (UTC)[reply]

This comment was originally posted on User:DErenrich-WMF/Add_A_Fact_Experiment and I moved it to here. --SCP-2000 04:34, 28 September 2024 (UTC)[reply]

How about a tool to check sources for problematic statements in Wikipedia?

"Add A Fact" is a great tool, but how about making a tool the other way around? I mean, highlight a "citation needed" statement in Wikipedia and help me find sources for them. I would prioritize fixing existing problematic statements rather than adding more factual statements. The article will be more reliable and verifiable after fixing "citation needed" statements. --Jojit (talk) 06:24, 28 September 2024 (UTC)[reply]

How about entire problematical _articles_ in Wikipedia? I give you Talk:Ali James, which I came across yesterday and which I feel represents the underlying cracks in the very foundations of WP, much as there are cracks in the foundations of mathematics: but no-one wants to talk about them, even less fix them. Am I becoming a Deletionist? Plenty of my earlier articles do not bear scrutiny. How about a moratorium on all new articles, and edits to all existing ones, except to fix these egregious errors? MinorProphet (talk) 19:46, 28 September 2024 (UTC)[reply]
We've considered experimenting with tools that do exactly that. It may be something we investigate in the future. DErenrich-WMF (talk) 04:57, 29 September 2024 (UTC)[reply]

What the tool appears to be trying to achieve

As far as I can ascertain from the main page ("we want more excited editors!" - a slight paraphrase), this tool is hoping to encourage non-Wikipedians and other beginners to contribute to the encyclopedia. I just don't think it's a good way. 'Excited editors' is exactly what we don't want: we would like level-headed, intellectually curious, emotionally robust people capable of distinguishing dross from worthwhile information, circumscribed by a vast nebulous bureaucracy of rules, advice, manuals, essays, admins and the dreaded Arbcom. Should we not be spending our time wondering how to attract more editors capable of coping with this? Once you get drawn into contributing to WP, the learning curve turns out to be exceptionally steep, and then there's the inter-personal element, which is possibly the hardest barrier to overcome. Just a thought. MinorProphet (talk) 15:56, 28 September 2024 (UTC)[reply]

Thanks for your comments. This isn't my area of expertise, but it's my understanding that a lot of the efforts around trying to attract new users are about finding small, well-defined, low-risk tasks that they can do (e.g. suggested edits). Flagging information that's maybe missing from Wikipedia could be such a task. DErenrich-WMF (talk) 05:03, 29 September 2024 (UTC)[reply]

Why not have an option to add the information directly as a new edit rather than posting it on the talk page?

Yes, I'm asking the question in the title. By the way, I can see this helping editors of under-updated articles such as those about Kuwait, where I live. FSlolhehe (talk) 19:21, 28 September 2024 (UTC)[reply]

Yup that's something we have considered and maybe future work would do something like that. The team had vigorous conversations about how or whether we should drop the user into an editor to make the edit immediately. The main reason we didn't go that route for now is that making the edit can be complicated and we want this tool to be easy to use for relatively new users. DErenrich-WMF (talk) 04:56, 29 September 2024 (UTC)[reply]
You are then offloading the work to people who have that page on their watch list. If you are already autogenerating the citation, then let users copy that and manually edit the article. Please don't enable this tool in this current configuration and don't advertise it! Matthias M. (talk) 17:32, 29 September 2024 (UTC)[reply]
I would like to have the option to make a real edit. I don't like to offload work on others through a talk suggestion. Talk is a good default, though. Editing should be a secondary option. But please allow users of the extension to choose. Peter Buijs (talk) 22:30, 30 September 2024 (UTC)[reply]
Once again, WMF, seeking to make itself useful to justify the tens of millions of dollars it burns annually, is clearly disguising a "plan" as an "experiment"... Carrite (talk) 01:34, 4 October 2024 (UTC)[reply]
If they did this, there'd be a whole lot more criticism of using AI to actively generate WP content, with a real risk of introducing hallucinations. Aaron Liu (talk) 02:21, 4 October 2024 (UTC)[reply]

Probably does not work for foreign languages

Hi, I tried to use a fact from https://www.novinky.cz/clanek/domaci-pirati-opusti-vladu-v-utery-40490908 about the current Czech political crisis - the Czech Pirate Party will probably leave the government. The tool found the Wikipedia article correctly (Czech Pirate Party) but did not translate the text into English and gave me a bad citation source (a page I visited previously, not the current page). Firefox on Linux Mint.

This is the output of the tool:

Add the following to the talk page of Czech Pirate Party
I found a fact that might belong in this article. See the quote below.
Současná roztržka ve vládní koalici vyústí již v příštím týdnu odchodem Pirátů z vlády. V diskuzním pořadu Otázky Václava Moravce v České televizi to v neděli oznámil odcházející předseda Pirátů Ivan Bartoš. Následovat bude demise obou pirátských ministrů, demisi podají ministři zahraničí Jan Lipavský a pro legislativu Michal Šalomoun.
The fact comes from the following source: https://www.novinky.cz/clanek/zahranicni-blizky-a-stredni-vychod-krtek-zradil-sefa-teroristu-hizballahu-40490931?consent=CP91N4AP91N4AD3ACLCSBHFsAP_gAEPgAATIJVwQgAAwAKAAsACAAFQALgAZAA6ACAAFAAKgAWgAyABoADmAIgAigBHACSAEwAJwAVQAtgBfgDCAMUAgACEgEQARQAjoBOAE6AL4AaQA4gB3ADxAH6AQgAkwBOACegFIAKyAWYAuoBgQDTgG0APkAjUBHQCaQE2gJ0AVIAtQBbgC8wGMgMkAZcA0oBqYDugHfgQHAhcBGYCTQEqwQugRAALAAqABcAEAAMgAaABEACOAEwAKoAYgA_ACEgEQARIAjgBOADLAGaAO4AfoBCACLAF1ANoAm0BUgC1AFuALzAYIAyQBlwDUwIXAAAAA.YAAAAAAAAWAA&sznclid=qsPOl5udk5KZk5KSk5KYn5mYn5uSmZiT1t6Xm5yfmZiZnpmbn4SSk5jW3s-Xm52YnZ-fnJ-bnoSTkpnWyZfvnZ3vm-yb65vu6OmemZPvn5ifmZ3s75uY6JiTmZ2amw#dop_ab_variant=1368001&dop_source_zone_name=hpfeed.sznhp.box&dop_vert_ab=1368001&dop_vert_id=int1&dop_req_id=WAl0b0F67so-202409291914&dop_id=33170787&utm_source=www.seznam.cz&utm_medium=sekce-z-internetu
Jan Spousta (talk) 19:28, 29 September 2024 (UTC)[reply]

No.

I am absolutely not interested in using generative AI or LLMs in any way. I understand that this isn't used for writing new text on the wiki, just "searching" for the presence of existing information. As others have said, I'm opposed to this on principle, do not wish to ever interface with these tools, and suggest the significant climate impacts of this be considered: https://www.scientificamerican.com/article/ais-climate-impact-goes-beyond-its-emissions/ Fpmfpm (talk) 10:37, 30 September 2024 (UTC)[reply]

Idea: Try to suggest locations within the articles to edit that fact into, but do not generate text

For example when adding a newly released album to an existing discography article, an LLM might find an existing list of releases and suggest adding it there. Maybe place the cursor in the editor there. But do not generate any text. Let the user do the editing.

Another example might be an article with exclusively prose. Then the suggestion should probably be writing prose too, based on the context. Maybe even point to a likely paragraph to extend if the article is large. But again, do not generate any prose.

More advanced suggestions could be to not add the fact at all if it does not seem to add value. Or to make slightly more extensive edits like extracting a list from prose or creating a new paragraph. Peter Buijs (talk) 23:03, 30 September 2024 (UTC)[reply]

Maybe I'm misunderstanding your comment, but based on the description, the demo video and my experience, this extension does not, and is not intended to, generate any article text. The only thing that is supposed to be generated (but wasn't for me, see below) is a citation, and there is no need to use AI to do that, it is well within the capabilities of good old-fashioned programming (which I assume is what is used). The only thing I observed AI being used for is the selection of suggested articles. -- chris_j_wood (talk) 11:11, 1 October 2024 (UTC)[reply]

Bit disappointed in first use

I tried using this tool on a fairly arbitrarily chosen web page in an area I'm currently editing in. Not sure if the text I selected is really encyclopedic, tbh, but I thought it would make a good test. You can see the resulting added comment in Talk:Benton Metro station#Add A Fact: "Original station shelter at Benton Metro". Despite telling me in the extension that it would create a reference, as indeed it did in the demo video, it has simply left me with a bare link to the source. The pre-written reference was the thing that made me think this extension might be useful; without that I'm at a loss as to how this is easier than simply adding the fact to the article directly. Admittedly this is partly because I already knew the article I wanted to add the fact to (and the AI did correctly work this out); I suppose if I didn't know that, it might still have been useful.

I'd be interested to know why it didn't give me a reference. Has the feature been removed/suspended, or is it something to do with the source web page? -- chris_j_wood (talk) 11:02, 1 October 2024 (UTC)[reply]

Thanks for trying it out and sharing your feedback, @Chris j wood!
Hmm, I'm not sure why your citation showed up as a bare link rather than in Wikipedia reference format as intended – FYI, we're using the Citoid extension to generate citation formatting, and we do know there are some sources/websites that don't play well with Citoid (e.g., sites that block web crawlers, which are required for Citoid to grab the relevant metadata for the citation, will show up as bare links). Have you tried it with other links/sources? (P.S., I'm happy to see that your test edit generated a discussion that looks like it's going to lead to an update to the article! That's very cool.) Maryana Pinchuk (WMF) (talk) 15:21, 1 October 2024 (UTC)[reply]
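(For context on the citation step: Citoid is exposed through Wikipedia's public REST API, and a rough sketch of the kind of lookup involved is below. Whether the extension calls this exact endpoint is an assumption; the point is that when a site blocks the metadata fetch, there is little to build a citation from and the output degrades toward a bare link.)

```typescript
// Sketch: fetch citation metadata for a source URL via the public REST
// citation endpoint (backed by Citoid). Illustrative only, not the
// extension's actual code.
async function fetchCitation(sourceUrl: string): Promise<unknown> {
  const endpoint =
    "https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/" +
    encodeURIComponent(sourceUrl);
  const response = await fetch(endpoint, {
    headers: { accept: "application/json" },
  });
  if (!response.ok) {
    // Sites that block crawlers often fail here or return sparse metadata.
    throw new Error(`Citation lookup failed: ${response.status}`);
  }
  return response.json(); // Zotero-style fields: title, author, date, website, etc.
}
```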

Oppose

I oppose the use of AI programs on Wikipedia for a myriad of moral and logistical reasons that many users have listed above, including the climate crisis. Further, the concerns about spam and users adding elements to articles without much forethought are also well founded; if it is *too* easy to contribute to Wikipedia, many unverified assertions may enter articles and then later have to be edited out. I also have security concerns regarding the current permissions structure. My most serious concern is and will continue to be my vehement moral opposition to the use of AI programs of any kind. Gbrann100 (talk) 23:25, 1 October 2024 (UTC)[reply]

Summary of feedback, clarifications, and data so far (as of October 1)

Thank you to everyone who has taken the time to test out this extension and shared their feedback! I wanted to provide some clarification on a few points, summarize some of the feedback so far, and invite more feedback for those who have tried out the extension and not yet voiced their thoughts here.

Clarifications on what this extension is not:

  • This extension is not planned to be deployed more widely or maintained in its current state. It is a small experiment by the WMF Future Audiences team. The goal is to understand if or how a tool like this could help current or new Wikipedia editors improve the encyclopedia, with an eye towards attracting more editors to the projects.
  • This extension does not use AI to generate any content that is published to Wikipedia, and it does not publish anything automatically to Wikipedia without a logged-in, autoconfirmed Wikipedia user’s approval. Once a user has installed the extension, logged in, and selected a claim on a website that they think might contain new information that should be added to Wikipedia, AI is used to: a) search Wikipedia articles that may be associated with that claim, b) identify whether the claim is already contained in those articles, and c) notify the user whether the claim is or is not already present in those articles (a rough sketch of this flow follows this list). The user can then choose what to do from there – whether that’s to open the article up in a new tab and make an edit directly to the article, publish a suggested claim to the talk page via the form in the extension, or reject the suggestion/do nothing.
  • This extension does not read or store any text on webpages that the user hasn’t explicitly chosen for it to read and search for in Wikipedia. Once you have installed the extension, you as the user must highlight a statement and choose to search for it in Wikipedia. Information on websites you browse that you do not explicitly select to search for in Wikipedia via the extension is not read or stored anywhere by the extension. 
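To make the flow in the second clarification above concrete, here is a minimal sketch. The Wikipedia search API call is real; the model call is a stand-in (the extension description names Llama 3, but its actual prompts and code are not reproduced here), and AI may also be involved in the search step itself, so treat this as a simplification rather than the actual implementation:

```typescript
// Step (a): find candidate articles for the user's selected claim via the
// public MediaWiki full-text search API.
async function findCandidateArticles(claim: string): Promise<string[]> {
  const params = new URLSearchParams({
    action: "query",
    list: "search",
    srsearch: claim,
    srlimit: "3",
    format: "json",
    origin: "*", // allow anonymous cross-origin requests
  });
  const res = await fetch(`https://en.wikipedia.org/w/api.php?${params}`);
  const data = await res.json();
  return data.query.search.map((hit: { title: string }) => hit.title);
}

// Step (b): ask the model only whether the claim is already covered by the
// article text – it is never asked to draft any wikitext. `askLlm` is a
// stand-in for whatever model endpoint is actually used.
async function isClaimAlreadyCovered(
  claim: string,
  articleText: string,
  askLlm: (prompt: string) => Promise<string>,
): Promise<boolean> {
  const answer = await askLlm(
    "Does the article text below already state this claim? Answer yes or no.\n" +
      `Claim: ${claim}\nArticle: ${articleText}`,
  );
  return /^yes/i.test(answer.trim());
}

// Step (c): the yes/no result is only shown to the user, who then decides
// whether to open the article, post a suggestion to the talk page, or do nothing.
```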

Summary of feedback and data so far:

What I’m hearing from those of you who have tested this out (thanks again for taking the time to do this!) is that, for most of you, an extension like this is not going to help you in your Wikipedia editing in its current state, but there have been some ideas proposed for how something like this could be improved, for example:

  • Make it more useful for existing editors by enabling on the Draft namespace and being able to start a new article draft from the extension.
  • Ensure that it benefits the encyclopedia and doesn’t create more burden for existing editors, e.g. by not sending the verbatim quote of the proposed fact through to the talk page, to avoid it being copied into the article and to discourage non-paraphrased additions from external sources.

I’m also hearing that some Wikipedians are opposed to any use of AI on Wikipedia and are not interested in testing/using any tools that use AI. It’s important to hear this feedback and I want to thank those of you who shared it. As stated above, there are no plans to scale or maintain this extension in its current state, but I do want to note that “AI” means a lot of different things, and there are some uses of AI on Wikipedia that have long been widely accepted by the Wikipedia community (e.g., Cluebot NG’s vandalism detection algorithms – the way the bot identifies and acts on harmful edits – make use of an artificial neural network, which is a type of machine learning/AI model). So I think it’s important for all of us (myself and all WMFers included!) to be clear about what we mean when we’re discussing AI, its uses, benefits, and risks in the context of Wikipedia.

Next steps:

According to the hashtags tool on Toolforge, as of this moment 61 Wikipedians have posted 89 new fact suggestions to talk pages using this extension so far, which exceeded our expectations for how much usage we'd get (thank you!). We'll be turning off the notice to invite testers in the next couple of days. For those of you who have tried it but haven’t yet left your feedback (or those who have used it a few times since leaving your initial thoughts), I’m curious to hear what your experience has been like! As stated in the clarifications above, this isn’t something we plan to scale more widely or keep going forever, so please make your voice heard, especially if you have thoughts/ideas that haven’t been raised yet. And if by any chance any of you will be at WikiConference North America in Indianapolis this weekend, come talk to me about your experience in-person! :) Maryana Pinchuk (WMF) (talk) 00:02, 2 October 2024 (UTC)[reply]

This summary does not address the concerns raised by @Boud, which are the same concerns I have: why was this implemented across Wikipedia for editors without an RFC or some other sort of consent process? The FAQ for Future Audiences even states that "as yet there is no global consensus on whether and how AI can or should be used on our projects". Wikipedia operates on the principles of consensus, and it is baffling to me how a group of about 100 Wikimedians with conflicting views were able to implement this on Wikipedia without care or concern for the Wikipedia community's consensus.
Furthermore, I don't believe it is fair to compare an LLM browser add-on requiring access to data outside of Wikipedia to the bots and extensions that run entirely within Wikipedia and may happen to also use AI. The privacy, environmental, and factual concerns raised by other users about this extension do not have equivalents in the vast majority of the bots that run on Wikipedia.
If the extension is not planned to be deployed more widely or maintained in its current state, what's the point of all of this? To myself and some other users on this page, it appears this is a test run on a tool with the goal to eventually implement this to a wider audience or refine it for future usage. Mintopop(talk) 18:04, 2 October 2024 (UTC)[reply]
No alteration to Wikipedia was required. The LLM is only accessed if the extension's user chooses to run it. Aaron Liu (talk) 21:31, 2 October 2024 (UTC)[reply]
I'm talking about the decision to launch this on Wikipedia editors prior to any discussion with the Wikipedia community. Why was this team allowed to show me the little "want to try out our new tool?" pop-up box when I logged into Wikipedia one day, when it should be plainly obvious that this would be a controversial tool and should require a level of community consensus? As a newer editor but a long-time Wiki user, I have never had something like that pop up while looking at Wikipedia. Editors don't expect to go to Wikipedia and have LLM browser extension "experiments" advertised to them. It is concerning that the decision to move ahead with this was made, when even in a small team of 100 people there was not a clear consensus.
This has extremely lowered my opinion of the Wikimedia Future Audience program, and I also have little faith in the program's capabilities to address the many other complicated and valid issues that have been raised by other users here. Mintopop(talk) 23:40, 3 October 2024 (UTC)[reply]
Hi @Mintopop,
RE: your (and @Boud's) question of: Where was the discussion to launch this new tool? There are thousands of experimental tools developed by community developers and Wikimedia Foundation developers alike over the years (and not just on Toolhub, but also hosted externally). Many of these tools integrate into Wikipedia or other Wikimedia projects in some way (whether that's adding/overlaying information or metadata to existing content, making assisted edits, remixing content in new ways, etc.). There's never been a practice of creating RfCs for them – tool development would be very slow and expensive, and the communities would be stuck in constant RfCs on tools with very minor impact. The idea for this experiment originated at WikiConference North America this year, in a discussion between WMF staff and community members, and we discussed its development monthly with interested community members during our regular open calls (notes and recordings available here). We also announced it on the English Wikipedia Village Pump in August, and I discussed it at Wikimania.
Until now, we hadn't heard any objection from any Wikipedians to moving forward with this as a limited experiment, especially because it is restricted to logged-in, autoconfirmed Wikipedians; is not adding any information to the article itself; does not post anything automatically; and is limited to 10 posts a day. I understand your and others' concern about how a different implementation of an idea like this (e.g., a tool to add third-party content directly to Wikipedia, automatically and unrestricted in any way) could be harmful, but that is why we have been very careful to implement it in the way we have.
RE: what is the point of this experiment? In this discussion (and while talking about this with Wikipedians at this year's WikiConference North America, happening right now!) we're learning a lot of things about:
  • How current editors typically approach adding new information to Wikipedia – i.e., it seems like encountering a new source/claim is typically not the starting point for many Wikipedians to improve an article. Rather, they like to start from an article they've already identified as needing an update and go out to look for new sources/information. So maybe something like Add A Fact that lives within Wikipedia and can trigger searches for sources/claims from outside Wikipedia when someone has identified that information in an article is missing or outdated could be an interesting future experiment (though a version of that exists now on some citation-needed cleanup templates, I believe, and I'm not sure how effective that is).
  • Pain-points that some editors are experiencing in their workflows – i.e., when they do encounter new information online, it may be too much work to go and find the part of Wikipedia that needs updating, so they'll just file it away for later. And when they finally have time to sit and work on Wikipedia, they have a long list of additions across multiple sources that they'd like to make. So maybe a different way to think about this in the future could be a tool that creates a running list of sources/claims as an internal-to-the-user worklist to save and revisit later. (But that wouldn't be helpful for newer/more casual/non-editors).
  • How this could be useful if incorporated in a different way into the more stubby/drafty parts of Wikipedia (e.g., drafts space, or stub creation). So maybe if there is a worklist or something that this tool generates, it could hook into existing newcomer structured tasks and provide newer editors with a more guided contribution.
All of these ideas would need a lot more fleshing out and discussion before we would recommend any of them to become a new feature/hand it over to a fully-staffed Product team at WMF to build, scale and maintain it – and they would need more evidence that it's a good enough idea to actually invest those kinds of resources in. That is the goal of Future Audiences: finding the tools/strategies that are compelling enough to warrant larger investment (and if not, learning why not, so we can quickly move on from things that aren't working and spend more time on more promising areas/experiments). Maryana Pinchuk (WMF) (talk) 21:18, 4 October 2024 (UTC)[reply]
For the record, I think this is a very nice tool with great potential (and, as with all tools, some potential for mischief). Frankly, I think that if this had not been described as a tool that uses AI, and had been pitched on its functionality alone, there would not be dissent about its use. As a longtime intellectual property attorney, I think the copyright concerns are vastly overstated (though these could also be ameliorated by having the tool search for other instances of its own use to add a talk page quote, and avoid multiple uses for that purpose). BD2412 T 21:47, 4 October 2024 (UTC)[reply]
Thanks for your feedback and for helping to clarify things for others here, @BD2412! I really appreciate it. And this has indeed been a valuable meta-learning experience on the range of AI sentiment within English Wikipedia (which does seem to vary quite a bit depending on who you ask, and where and how you ask them). Maryana Pinchuk (WMF) (talk) 21:07, 5 October 2024 (UTC)[reply]

Bit confused on Firefox

Do I need to click the "Log in" button if I'm already logged in? Is there a way to hide the annoying sidebar without manually bringing up and hiding a default Firefox sidebar? Why a sidebar instead of an extension pop-up window? Aaron Liu (talk) 01:15, 2 October 2024 (UTC)[reply]

@DErenrich-WMF I highlighted some text and pressed "Wikipedia Add a Fact" in the right-click menu. Nothing happened, except the sidebar popped out. Aaron Liu (talk) 01:21, 2 October 2024 (UTC)[reply]
Thanks for the feedback. It seems like there is an issue with the Firefox version that is affecting some users. You shouldn't have to "log in" if you are already logged in. Early versions of the extension did use a different way to show the information; feedback we got suggested that it was annoying, so we moved to the sidebar. DErenrich-WMF (talk) 16:20, 3 October 2024 (UTC)[reply]
There should be an easy way to close the sidebar. Aaron Liu (talk) 17:09, 3 October 2024 (UTC)[reply]
There should be an x icon that you can click to close it. Maybe I don't understand what you're saying. DErenrich-WMF (talk) 17:21, 3 October 2024 (UTC)[reply]
Oh yeah, forget about it, my browser has a modification that hides the sidebar headers. It would be better for UX to make clicking on the extension's icon again close the sidebar, though. Aaron Liu (talk) 17:24, 3 October 2024 (UTC)[reply]

Meta: Mention how AI is used more prominently on the page

My first reaction when seeing the article and clicking the video was "wtf!? AI? that sounds like a terrible idea!" and only after watching through the entire video did I see that it wasn't actually used to generate any text for Wikipedia. And judging from this talk page, I wasn't alone in my reaction.

It would probably be worth mentioning in bold or in a highlighted box or something at the top of the page that the LLM is not used to generate any text that could end up on Wikipedia (and is only used to decide if information is already present or not in an article).

Also, this page should mention which LLM is being used. You shouldn't need to go to the extension page itself to learn that. Anka.213 (talk) 18:05, 2 October 2024 (UTC)[reply]

another poorly thought out project from the wmf!

we don't have enough of those! ltbdl☃ (talk) 04:17, 3 October 2024 (UTC)[reply]

Privacy Concerns

The extension in Firefox requires access to data on all webpages as I browse. I have no fundamental issues with the idea of such a tool, but it seems like a privacy concern to me. Is the source code available? ~tayanaru (talk) 19:29, 3 October 2024 (UTC)[reply]

A thread about this tool on Wikipediocracy emphasizes it would be an ideal tool for tag team vandalism, in which one player "suggests" and the other player "inserts"... See the thread there for more fulsome discussion. There are also copyright concerns expressed: "a tool for bulk copyright violation" is one memorable phrase. Carrite (talk) 01:31, 4 October 2024 (UTC)[reply]

Thanks, @Carrite – RE: "ideal tool for tag team vandalism, in which one player 'suggests' and the other player 'inserts'", this extension in its current state wouldn't make that any easier than adding a suggestion to the talk page with one account and then inserting it into the article with another. A user must be logged into their account to use the extension, and the talk page suggestion edit is saved under their username, so they would need to log into the browser extension, find/add the suggestion, log out, log back into Wikipedia on another account... I'm afraid plain old SPA vandalism is so much simpler. Same with "bulk copyright violation" – only existing editors can use this currently, and they're limited to 10 posts a day. But these are great points to consider when thinking about how a tool like this would need to be modified in order to be usable for logged-out users, and getting the community thinking/talking about potential uses and abuses like this is precisely why we've tried to put it in the hands of existing editors. (As I've clarified above, though, we don't have any plans to scale this more widely.) Maryana Pinchuk (WMF) (talk) 21:46, 4 October 2024 (UTC)[reply]

Make local notes in browser

I could imagine using this if it simply made a note for my personal use rather than adding to a talk page. I am at best an occasional editor, but I do read mathematical and scientific papers from time to time. It would disturb my flow to jump to WP whenever I come across information that I feel should be represented there, but I can certainly imagine making more substantive contributions in the future given a way to essentially make an (automatically high-quality) note that I can then review and act on afterwards. I feel that such notes could be held entirely within the browser, which should remove concerns about spam on talk pages. Hv (talk) 00:44, 6 October 2024 (UTC)[reply]

Bug Reported on VPT

Wikipedia:Village pump (technical)#"Add A Fact" LLM browser extension

In this discussion at VPT a couple of diffs have been brought up where the wikitext snippet citation has been filled out with "User:DErenrich-WMF/Add A Fact Experiment", see [5] [6] 86.23.109.101 (talk) 09:58, 6 October 2024 (UTC)[reply]

Further Reading

No AI Content should be a red line policy of English WP. It's super offensive that WMF is making an ill-considered end run around the community on this. Carrite (talk) 06:52, 11 October 2024 (UTC)[reply]

FYI, you can see the previous response from the WMF above: "This extension does not use AI to generate any content that is published to Wikipedia". SCP-2000 07:17, 11 October 2024 (UTC)[reply]