Wikipedia:Request a query


This is a page for requesting one-off database queries that match certain criteria. Users who are interested in and able to run SQL queries against the projects' databases can provide results from the Quarry website.

You may also be interested in the following:

  • If you are interested in writing SQL queries or helping out here, visit our tips page.
  • If you need to obtain a list of article titles that meet certain criteria, consider using PetScan (user manual) or the default search. PetScan can generate lists of articles in subcategories, articles that transclude a given template, etc.
  • If you need to make changes to a number of articles based on a particular query, you can post to the bot requests page, depending on how many changes are needed.
  • For long-term review and checking, database reports are available.

Quarry does not have access to page content, so queries which require checking wikitext cannot be answered with Quarry. However, someone may be able to assist by using Quarry in another way (e.g. checking the table of category links rather than the "Category:" text) or suggest an alternative tool.
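For example, category membership can be checked from the categorylinks table rather than from wikitext. A minimal sketch (the category name and LIMIT are illustrative only):

```sql
-- Minimal sketch: mainspace pages in Category:Living people, found through
-- the categorylinks table rather than by searching wikitext for
-- "[[Category:Living people]]", which Quarry cannot see.
SELECT p.page_title
FROM page p
JOIN categorylinks cl ON cl.cl_from = p.page_id
WHERE cl.cl_to = 'Living_people'
  AND p.page_namespace = 0
LIMIT 10;
```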

Syntax error due to using a reserved word as a table or column name in MySQL

https://quarry.wmcloud.org/query/87911

https://stackoverflow.com/questions/23446377/syntax-error-due-to-using-a-reserved-word-as-a-table-or-column-name-in-mysql

It isn't handling the `user` table right as "user" is an SQL reserved word, I think.

The syntax highlighter was showing "user" in red, so I surrounded it with backticks `user`, then it was showing in light blue.

I think it needs to be highlighted in white to work correctly. But how? wbm1058 (talk) 18:47, 14 November 2024 (UTC)[reply]

Unrelated to the reserved word. `WHERE IS NULL(u.user_name)` should be `WHERE u.user_name IS NULL`. But see prior noise at User talk:Primefac/Archive 32#U2 deletions if you want to continue this. * Pppery * it has begun... 20:12, 14 November 2024 (UTC)[reply]
https://www.w3schools.com/sql/sql_isnull.asp indicates that my syntax should be valid. Two alternative ways to do the same thing? Regarding the "prior noise", I'm a more competent administrator who's checking page histories, and leaving redirects within user space alone. My current focus is on cross-namespace redirects from user pages of nonexistent users to outside of userspace. My recent deletion log will give you an idea; I'm trying to make a more specific query to reduce the noise level in the query results I've been working from. – wbm1058 (talk) 20:53, 14 November 2024 (UTC)[reply]
Wikimedia uses MySQL (actually MariaDB, which uses MySQL-ish syntax), not SQL Server, where your link says ISNULL (not IS NULL, which the query uses) is valid. * Pppery * it has begun... 21:06, 14 November 2024 (UTC)[reply]
MariaDB supports ISNULL(), and it works the way Wbm1058 was trying to use it (modulo the misplaced space). SQL Server's ISNULL() is a synonym of COALESCE() instead. x IS NULL is generally safer precisely because of that incompatibility. —Cryptic 21:29, 14 November 2024 (UTC)[reply]
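For illustration, a minimal sketch of the two accepted null tests (the join condition is an assumed example, not the original Quarry query; the `user` table is backquoted as in the original, though as noted above the error was unrelated to the table name):

```sql
-- Userpages whose base title matches no account. Both null tests below are
-- accepted by MariaDB; the original "WHERE IS NULL(u.user_name)" is neither
-- form and fails to parse.
SELECT p.page_title
FROM page p
LEFT JOIN `user` u
  ON u.user_name = REPLACE(p.page_title, '_', ' ')
WHERE p.page_namespace = 2
  AND u.user_name IS NULL;       -- standard SQL form
  -- AND ISNULL(u.user_name)     -- equivalent MariaDB/MySQL function form
```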
I tried just changing the syntax of the "IS NULL" statement as suggested. It was cooking on that for a while, and then:
"Error
This web service cannot be reached. Please contact a maintainer of this project.
Maintainers can find troubleshooting instructions from our documentation on Wikitech."
Hopefully my query didn't just crash the server. – wbm1058 (talk) 21:55, 14 November 2024 (UTC)[reply]
It just ran to completion, so simply changing the "IS NULL" statement fixed the syntax error. Now on to figure out the results, and tweak the query to do what I really want it to do. Thanks for your help. wbm1058 (talk) 22:09, 14 November 2024 (UTC)[reply]

FYI, I'm now feeling the joy. User:Wbm1058/Userpages of nonexistent users is my report of 400 pages which I think may all be safely speedy-deleted under U2: Userpage or subpage of a nonexistent user. This report was culled from a report of 1,960 pages by INTERSECTing it with a SELECT from the user table. This is indicative of the poor page-move interface design, which leads editors who think they're publishing user drafts to leave pages in userspace when they really meant to move them to mainspace, because they neglected the namespace dropdown in the move-page interface. – wbm1058 (talk) 14:11, 15 November 2024 (UTC)[reply]
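For illustration, a hedged sketch of the cross-namespace variant described above (join conditions and namespace numbers are assumptions for the example, not wbm1058's actual query):

```sql
-- Userspace redirects whose base username has no account and whose target
-- lies outside User: (namespace 2) and User talk: (namespace 3).
SELECT p.page_title, rd.rd_namespace, rd.rd_title
FROM page p
JOIN redirect rd ON rd.rd_from = p.page_id
LEFT JOIN `user` u
  ON u.user_name = REPLACE(SUBSTRING_INDEX(p.page_title, '/', 1), '_', ' ')
WHERE p.page_namespace = 2
  AND p.page_is_redirect = 1
  AND u.user_id IS NULL
  AND rd.rd_namespace NOT IN (2, 3);
```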

Dusty articles within all subcategories of a parent category

Is this possible? I'd like to get a list like Special:AncientPages but for anything within any subcategory of Category:Food and drink. It would make quite a nice little To Do list for times I feel like doing some research and writing but don't have a particular bee in my bonnet that very minute. Thanks for any help! Valereee (talk) 15:16, 17 November 2024 (UTC)[reply]

What is "dusty"? Neither Special:AncientPages nor Wikipedia talk:Special:AncientPages say what it does. Is it a sort by timestamp of last edit?
In direct subcategories only, include the handful of pages directly in the category, or the whole tree? If the last, to what depth? Examples: Category:Food and drink → Category:Dairy → Category:Dairy industry → Mark Ezell is depth 2, and Category:Food and drink → Category:Dairy → Category:Dairy industry → Category:Dairy farming → Goatherd is depth 3; neither the page itself nor the root category counts. —Cryptic 16:51, 17 November 2024 (UTC)
Yes, it's a list of articles by date of most recent edit.
Hm, on the second question. Ideally I'd end up with a list of, say, food items that hadn't been edited in ten years. Or chefs, or restaurants, or food animals or whatever. Maybe I need to choose a more specific subcategory? Valereee (talk) 17:24, 17 November 2024 (UTC)
Well, ok,
The reason I need a maximum depth is because - like almost all reasonably broad categories - Category:Food and drink eventually includes a significant portion of all categories. Depth 10, for example, has 122639 different categories in the tree, out of 2.4 million total categories (including maintenance categories, category redirects, and so on), and you really quickly start getting unrelated pages like Category:Food and drink → Category:Food and drink by country → Category:Agriculture by country → Category:Agriculture in Europe by country → Category:Agriculture in Romania → Category:Forestry in Romania → Category:Romanian woodworkers → Constantin Brâncuși.
Or, if you like, you can give me a list of categories to pull from. Even if it's a large list, or something like "Anything in any direct subcategory of Category:Food and drink, Category:Cuisine, Category:Chefs, Category:Poultry, [20 or 30 other cats]". —Cryptic 18:33, 17 November 2024 (UTC)[reply]
Oh, and do you want non-mainspace pages in the list or not? What about redirects? —Cryptic 18:38, 17 November 2024 (UTC)[reply]
lol...clearly in over my head here. :D Thank you for your patience.
So, no to "feed a cold, starve a fever". Yes to "recipe", "dulce de leche" and "ice milk".
I think maybe start with something that's likely to contain fewer extraneous things. Category:Chefs in a way that will allow me to see, for instance, the articles that are in Category:Chefs by nationality > Category:Women chefs by nationality > Category:British women chefs > Category:Women chefs from Northern Ireland that haven't been edited in the last ten years. Valereee (talk) 18:50, 17 November 2024 (UTC)[reply]
Oh, no non-mainspace pages, no redirects. Valereee (talk) 19:13, 17 November 2024 (UTC)[reply]
None quite that old in either tree. quarry:query/87975 for Category:Food and drink depth 3 (oldest is Land reform in the Austrian Empire, 2015-11-16 18:36:35 - see what I meant about unrelated pages?), quarry:query/87976 for Category:Chefs depth 4 (oldest is Richard Ekkebus, 2019-12-16T04:47:03). —Cryptic 19:19, 17 November 2024 (UTC)[reply]
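For reference, a hedged sketch of the general shape of such a query: a recursive CTE walks the subcategories of a root category to a fixed depth, then the mainspace, non-redirect member articles are sorted by the timestamp of their latest revision. The category name and depth limit are illustrative; the linked Quarry queries may differ in detail.

```sql
-- Walk Category:Chefs to depth 4, then list member articles with the
-- oldest last edit first (the "dusty" ordering).
WITH RECURSIVE tree (cat, depth) AS (
    SELECT CAST('Chefs' AS CHAR(255)), 0
    UNION ALL
    SELECT p.page_title, t.depth + 1
    FROM tree t
    JOIN categorylinks cl ON cl.cl_to = t.cat AND cl.cl_type = 'subcat'
    JOIN page p ON p.page_id = cl.cl_from
    WHERE t.depth < 4
)
SELECT DISTINCT art.page_title, r.rev_timestamp AS last_edit
FROM tree t
JOIN categorylinks cl ON cl.cl_to = t.cat AND cl.cl_type = 'page'
JOIN page art ON art.page_id = cl.cl_from
JOIN revision r ON r.rev_id = art.page_latest
WHERE art.page_namespace = 0
  AND art.page_is_redirect = 0
ORDER BY r.rev_timestamp
LIMIT 100;
```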
Well, thank you for your work, and sorry to waste your time! Valereee (talk) 19:35, 17 November 2024 (UTC)[reply]
Not wasted at all. Not your fault the category system is terrible for datamining.
There might be some value in finding the latest revision that wasn't marked minor, and maybe excluding ones made by bots too, but that's going to be harder and a lot slower. Would definitely need to cut the set of articles to look at to something on the order of a couple thousand before looking at the edits, rather than the tens of thousands in that first tree. —Cryptic 20:14, 17 November 2024 (UTC)[reply]
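A hedged sketch of that slower variant, restricted to one small category so it stays tractable (the category and the 'bot' group check are illustrative assumptions):

```sql
-- For each article in one candidate category, the newest revision that is
-- neither marked minor nor made by a member of the "bot" group.
SELECT p.page_title, MAX(r.rev_timestamp) AS last_substantive_edit
FROM page p
JOIN revision r ON r.rev_page = p.page_id
JOIN actor a ON a.actor_id = r.rev_actor
LEFT JOIN user_groups bg
  ON bg.ug_user = a.actor_user AND bg.ug_group = 'bot'
WHERE p.page_namespace = 0
  AND p.page_is_redirect = 0
  AND p.page_id IN (
      SELECT cl_from FROM categorylinks
      WHERE cl_to = 'Women_chefs_from_Northern_Ireland' AND cl_type = 'page'
  )
  AND r.rev_minor_edit = 0
  AND bg.ug_user IS NULL
GROUP BY p.page_id, p.page_title
ORDER BY last_substantive_edit
LIMIT 100;
```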
Thanks. And I've actually already found an article that needs attention from your 87976 query, so win!
The point for me here is looking for categories that have many articles that haven't been updated since before sourcing started modernizing. It's a bit tricky because the articles that were created first -- probably in any category -- are also likely the articles that get updated often, have multiple watchers, etc. So it's possible there just aren't huge numbers of food articles that need this kind of attention. Valereee (talk) 21:18, 17 November 2024 (UTC)[reply]

Number of articles that are actually articles

There are 6,917,694, but AIUI that includes disambig pages, stand-alone lists, and outlines, and maybe even portals (i.e., all content namespaces, not just the mainspace) but excludes redirects. Is there a way to get a count of the number of just plain old ordinary articles, excluding the other types? (A percentage from a sample set is good enough; I'd like to be able to write a sentence like "Of the 6.9 million articles, 6.2 million are regular articles, 0.45 million are lists, and 0.2 million are disambig pages.") WhatamIdoing (talk) 22:46, 17 November 2024 (UTC)[reply]

@WhatamIdoing: according to Category:All disambiguation pages, there are 362,957 of those. BD2412 T 23:29, 17 November 2024 (UTC)[reply]
Category:WikiProject Outlines articles suggests that there are about a thousand of those, which will not have a material effect on the numbers.
Wikipedia:WikiProject Lists/Assessment#Statistics says they've tagged 131K pages. There are about 123,700 pages with "List of" or "Lists of" at the start of the title. WhatamIdoing (talk) 00:37, 18 November 2024 (UTC)[reply]
There is no clear definition of what a "regular article" is. Also, many pages are not correctly marked and categorized. Don't forget WP:INDEXES, which look like ordinary articles, or might be, depending. -- GreenC 00:43, 18 November 2024 (UTC)[reply]
{{NUMBEROFARTICLES}} seems to be mainspace non-redirect pages. I'd thought it used other heuristics, too; I remember it needing at least one link, and less confidently requiring a period? But it plainly doesn't anymore; I'm getting 6912240 for ns0 !page_is_redirect on the replicas now.
There's only 362201 non-redirects in Category:All disambiguation pages and mainspace. Most of the difference is in other namespaces, probably legitimately, though I'm surprised to see 208 in user:, 44 total in various talk namespaces, 9 mainspace redirects, and a single redirect in draftspace.
114253 mainspace non-redirects in Category:All set index articles, though 64 of those are in the disambig cat as well.
Lists are less certain; there's no Category:All list pages. I could try to count pages that are in any category starting with "Lists " or ending with " lists", but - not having done that before - don't have any idea how many it would miss, and how many it would miscount. Ditto with pages starting with "List of " or "Lists of " (which is easy - 120653, not counting any redirs or pages in the dabs or set index cats). —Cryptic 01:00, 18 November 2024 (UTC)[reply]
Oh, and 11193167 redirects (so 18105407 total mainspace pages), if you care. —Cryptic 01:03, 18 November 2024 (UTC)[reply]
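Putting those figures together, a hedged sketch of a single pass over mainspace that tallies the buckets discussed above (the disambiguation category and the escaped "List of"/"Lists of" title prefixes are the heuristics from this thread, not a canonical definition of an article):

```sql
-- One scan of mainspace non-redirects, tallying dab pages (via
-- Category:All disambiguation pages) and "List of ..."/"Lists of ..."
-- titles. Underscores in the LIKE patterns are escaped because "_" is a
-- single-character wildcard.
SELECT
  COUNT(*)                                  AS mainspace_nonredirects,
  SUM(dab.cl_from IS NOT NULL)              AS dab_pages,
  SUM(p.page_title LIKE 'List\_of\_%'
      OR p.page_title LIKE 'Lists\_of\_%')  AS list_titles
FROM page p
LEFT JOIN categorylinks dab
  ON dab.cl_from = p.page_id
 AND dab.cl_to = 'All_disambiguation_pages'
WHERE p.page_namespace = 0
  AND p.page_is_redirect = 0;
```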
So 6,912,230 non-redirect pages, of which 362,201 are dab pages and 120,653 are Lists (per title), and the rest (e.g., Outlines, Index pages) is immaterial. A good SIA looks like an article and an underdeveloped one looks like a dab page, which takes us back to GreenC's point about it not being a clear-cut question.
All of this suggests that if you count SIAs as 'articles', then there are 6.429 million articles (93%) and if you count them as lists/dabs, then there are 6.315 million articles (91%).
Thanks, all. WhatamIdoing (talk) 01:15, 18 November 2024 (UTC)[reply]

Median account age for EXTCON

Hello again, generous satisfiers of curiosity:

Today's question is how old the typical currently active WP:EXTCONFIRMED account is. The requirements are:

  • is currently active (perhaps made at least one edit during the last 30 days? Any plausible definition of active works for me, so pick whatever is easiest and cheapest to run)
  • meets EXTCON (all of which will have the EXTCON permission)

I don't care whether it's date of first edit vs registration date. I also don't care whether it's all ~73K of them or if it's a sample of 5K–10K. I am looking for an answer to the nearest year ("Most active EXTCON editors started editing before 2014" or "If you see someone editing an article under EXTCON, they probably started editing more than 10 years ago").

Thank you, WhatamIdoing (talk) 17:14, 19 November 2024 (UTC)[reply]

Hmm. user_touched has been scrubbed because it is private data. So I guess LEFT JOIN recentchanges to see who is active? This should only get us users who have made an edit in the last 30 days. Then run MEDIAN() on it. Let's see if quarry:query/88037 finishes. If the count is 72,000ish, then I also need to add a WHERE to filter out the editors who aren't in recentchanges. –Novem Linguae (talk) 18:33, 19 November 2024 (UTC)[reply]
That's going to get you not just every user with the user right - the whole point of a left join is that you get a result whether there's a row in the joined table or not - but a row in your resultset for every row they have in recentchanges. And you're leaving out admins, who have extended confirmed implicitly. Plus, even if it worked, it would be a dump of ~25k values.
Mariadb has a MEDIAN() window function, but I can't get it to work on user_registration no matter how I try to preprocess it first - it gives me "Numeric datatype is required for percentile_cont function" when I call it directly on the column, which is reasonable, but always 100000000 if I cast it to any kind of numeric value, which isn't. (Anyone know what I'm doing wrong? Sample query. And I've never really grokked window funcs or how to get them to behave like normal, sane, grouped-by aggregate funcs anyway.) But doing it longhand works just fine. quarry:query/88039: 28 May 2013. —Cryptic 19:37, 19 November 2024 (UTC)[reply]
user_registration is ASCII-encoded binary rather than a binary number which is why you're getting nonsense when casting it and trying to do operations on it. Uhai (talk) 21:28, 19 November 2024 (UTC)[reply]
Casting it seems to get me a numeric, and doing normal arithmetic on it (user_registration + 1, division, and so on) coerces it to a numeric; it doesn't get me nonsense until I try to pass it through MEDIAN(). And UNIX_TIMESTAMP() in particular is documented to return an unsigned int. Current theory is that MEDIAN() can't deal with large numbers (see resultset 4; dividing by numbers smaller than ten gets me 100 million again), which is boggling. No, a cast or operation on the result of MEDIAN() is what fixes it. Still boggling. Cleaner query. Thanks for the prod. —Cryptic 21:55, 19 November 2024 (UTC)[reply]
Also, comparing the results reminded me that user_registration is NULL for some users who registered before mid-2005ish, which I hadn't corrected for. 2013-06-15 19:42:14, though I doubt the two weeks' inaccuracy is going to matter much to WAID. —Cryptic 22:30, 19 November 2024 (UTC)[reply]
TIL CAST seems to convert to the proper numeric representation if the binary string contains only numeric ASCII characters. Glad you were able to get it working though. Uhai (talk) 22:34, 19 November 2024 (UTC)[reply]
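For the archive, a hedged sketch consolidating the thread: extendedconfirmed (or sysop) accounts with an entry in recentchanges (roughly the last 30 days), NULL registrations skipped per the note above, and MEDIAN() applied to a numeric cast with the result cast again so it displays sanely. Cryptic's actual queries (quarry:query/88039 and the cleaner MEDIAN version) may differ in detail.

```sql
-- Median registration timestamp of currently active extendedconfirmed
-- users; MEDIAN() is a window function, so DISTINCT collapses the
-- identical per-row results to a single value.
SELECT DISTINCT
  CAST(MEDIAN(CAST(u.user_registration AS UNSIGNED)) OVER () AS CHAR)
    AS median_registration
FROM `user` u
WHERE u.user_registration IS NOT NULL
  AND EXISTS (
    SELECT 1 FROM user_groups ug
    WHERE ug.ug_user = u.user_id
      AND ug.ug_group IN ('extendedconfirmed', 'sysop')
  )
  AND EXISTS (
    SELECT 1
    FROM recentchanges rc
    JOIN actor a ON a.actor_id = rc.rc_actor
    WHERE a.actor_user = u.user_id
  );
```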