
Do you have questions, comments or suggestions about Wikiversity? That is what this page is for! Before asking a question, you can find some general information at:



"Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution." — Albert Einstein

Draft Space Cleanup

[Figure: 12-month moving average pageviews before and after introduction of draftspace (dotted line)]

I have tagged all Draft: namespace resources that were unedited in the last 150 days with a 30-day proposed deletion. See Category:Proposed deletions to review. -- Dave Braunschweig (discusscontribs) 18:42, 1 December 2019 (UTC)

I use draftspace a lot and I agree with this policy. People with items up for deletion have a number of options:
  1. Improve the article in draft space or argue that it is complete and does not need improving.
  2. Move the draft into mainspace and see what happens
  3. Move the draft into your own userspace
  4. Save a permalink in your userspace. I like the latter because I can easily index them in case I ever want to complete the project. I keep a dated index of them at the bottom of my sandbox. I think it's OK to list them as external links by selecting "Permanent link" under the "Tools" sidebar and saving the external link. The links look like this:
Nobody can be certain about this, but there is good reason to hope that removing the clutter from Wikiversity will make it a more popular resource.--Guy vandegrift (discusscontribs) 22:24, 1 December 2019 (UTC) (I added a new option and placed it first. Guy vandegrift (discusscontribs) 13:37, 2 December 2019 (UTC))
Just FYI but the increase in popularity of Wikiversity coincides with the beginning and continuation of the WikiJournals. It also doesn't reflect that while the popularity of Wikiversity is increasing the number of new editors is decreasing. Wikiversity is the only WMF project with increasing popularity! --Marshallsumter (discusscontribs) 12:19, 2 December 2019 (UTC)
Also, as mentioned before, Google searches our Draft space. Resources that had high popularity before being put into Draft space had their popularity begin to recover once readers could find them in Draft space. Draft space does not improve resources, editors do! We need to address why we keep losing new editors! --Marshallsumter (discusscontribs) 12:29, 2 December 2019 (UTC)
I just did the math: according to Wikiversity:Statistics/2019/10, WikiJournals and subpages accounted for 0.65% of the total requests during that month. Rainwater harvesting is more than twice as popular (1.77%). According to pageviews, the rainwater top page is an order of magnitude more popular than any Journal landing page.[1] Correlation does not imply causation. The data suggests that your observation is merely a coincidence. --mikeu talk 17:56, 3 December 2019 (UTC)
Actually, Rainwater harvesting has had no substantial content added since June 2016, when it was abandoned by its principal editor. Since then, the edits have primarily been reversals of vandalism. Its popularity may have more to do with the topic preferences of our readers than any "cleanup" using Draft space. Coincidence results from causation per coincidence counting that increases with time. --Marshallsumter (discusscontribs) 02:25, 9 December 2019 (UTC)
Note: MediaWiki:Robots.txt discourages search engine crawlers from retrieving urls in Draft space. If search engines are indexing drafts despite this we should look into other methods of blocking such as including a template with __NOINDEX__ in it. If a draft isn't ready to become a mainspace resource, it's not ready for indexing. --mikeu talk 16:34, 2 December 2019 (UTC)
I've created {{NOINDEX/DRAFT}} to address this. It adds __NOINDEX__ to pages only while they are in the Draft namespace. I will work on adding this to each Draft: page. I will also check What links here to verify that main pages aren't linking to draft resources. -- Dave Braunschweig (discusscontribs) 23:55, 2 December 2019 (UTC)
All main space links to draft resources have been removed, except for a few that are about or referencing Draft: pages rather than directing users to draft pages. -- Dave Braunschweig (discusscontribs) 03:05, 4 December 2019 (UTC)

This brings up a question: what is the default for indexing draft space? I wasn't sure, so I modified robots.txt but don't know if that is actually doing anything. IMO, NOINDEX should be the default for all pages in draft and user space. If that is not the case we may want to open a phab request. --mikeu talk 02:01, 3 December 2019 (UTC)

@Mu301: I usually put my ideas on my user page, but those aren't indexed by search engines. So, which place in Wikiversity can I put my ideas to have them indexed by search engines? --Ans (discusscontribs) 13:13, 17 January 2020 (UTC)

Forming a User Group

It's only just sunk in that Wikiversity has no dedicated user group. There's the WikiJournal User group, but that has a rather specific remit. There's the Wikipedia & Education User Group, but that's broader and more focused on Wikipedia. There's the WikiToLearn User Group, but that's focused on the website.

Compare that to e.g.:

A thematic, multilingual User Group for Wikiversity could be valuable to represent and promote the interests of Wikiversity users as a whole at Strategy Summits and similar venues (if there's a logical place to also post this, then please repost/redirect from there). They are relatively simple to set up (Eligibility and Process). T.Shafee(Evo﹠Evo)talk 05:09, 3 December 2019 (UTC)

Can you give me an idea of how you think the above user groups have helped those respective sister projects? —Justin (koavf)TCM 07:40, 4 December 2019 (UTC)
I think the main aspect is that they have been pretty key voices in reducing Wikipedia-centrism in the strategy process (each UG nominates a strategy liaison to participate in ongoing discussions and aid the working groups), and at the Berlin Strategy summit. They've also been pretty useful as multi-lingual points of contact. T.Shafee(Evo﹠Evo)talk 00:27, 10 December 2019 (UTC)

Thoughts about the latest community wishlist

Hi everyone, maybe you would be interested in participating in this discussion about the community wishlist and how to improve the process. Do not hesitate to give your opinion; the more we know about the small communities, the more representative we can make the result. Pamputt (discusscontribs) 17:36, 6 December 2019 (UTC)

Export to PDF, LaTeX, EPUB, ODT


I propose to enable file export for Wikiversity to the formats PDF, LaTeX, EPUB, ODT. Technically, I propose to use my open source project hosted on wmflabs. You can just copy and paste a URL from Wikiversity and click start.

You can also add a link to it in the left sidebar by copying the common.js from my user namespace to your user namespace.

Of course my ultimate goal is to modify MediaWiki:Common.js to make that link in the sidebar appear for all users. I am happy to hear your opinion about that. Furthermore I will also have to see if the servers can stand the load if that actually happens.

--Dirk Hünniger (discusscontribs) 07:35, 7 December 2019 (UTC)

Wow! I'll try it. Boris Tsirelson (discusscontribs) 08:42, 7 December 2019 (UTC)

@DannyS712: Can you review this and let us know if it is something we should add for everyone? -- Dave Braunschweig (discusscontribs) 14:16, 7 December 2019 (UTC)

I tried the existing "Download as PDF" or "Create a book" and the results were quite disappointing. I haven't tried this yet, but if it works better than the existing tools I support it. --mikeu talk 16:52, 7 December 2019 (UTC)
I installed mediawiki2latex on my Debian 9 (without any problems) and did
mediawiki2latex -c M2L-spaces -p A4 -o M2L-spaces/mytry2.pdf -u
mediawiki2latex (1575744739.193710114s): processing started
mediawiki2latex (1575744742.208098378s): downloading article and contributor information
warning: ""<!DOCTYPE html>\n<html lang=\"en\" dir=\"ltr\">\n<m..."" (line 1, column 1) HTML DOCTYPE declaration ignored
warning: ""<mediawiki xmlns=\""" (line 5, column 70) no opening tag found for </base>
warning: ""<!DOCTYPE html>\n<html class=\"client-nojs\" lan..."" (line 1, column 1) HTML DOCTYPE declaration ignored
warning: ""<mediawiki xmlns=\""" (line 5, column 61) no opening tag found for </base>
mediawiki2latex (1575744747.931374163s): parsing article text
mediawiki2latex (1575744747.931420509s): number of bytes to be parsed: 186906
mediawiki2latex (1575744747.950343111s): forking threads to download of images and contributor information on them
mediawiki2latex (1575744747.950386003s): number of images going to be downloaded: 11
mediawiki2latex (1575744750.763529471s): precompiling table columns
mediawiki2latex (1575744750.81762087s): number of columns to be compiled: 1
mediawiki2latex (1575744750.817703065s): precompiling column number 1
mediawiki2latex (1575744753.41844276s): generating LaTeX document
mediawiki2latex (1575744753.419271494s): number of bytes to be parsed: 186906
mediawiki2latex (1575744753.420796959s): joining threads to download the images and contributor information on them
mediawiki2latex (1575744753.420833862s): number of images to be processed: 11
mediawiki2latex (1575744775.865091424s): preparing for PDF generation
mediawiki2latex (1575744790.821428251s): preparing images for LaTeX document
mediawiki2latex (1575744794.441326673s): generating PDF file. LaTeX run 1 of 4
mediawiki2latex (1575744798.621484458s): generating PDF file. LaTeX run 2 of 4
mediawiki2latex (1575744803.345461606s): generating PDF file. LaTeX run 3 of 4
mediawiki2latex (1575744807.908469763s): generating PDF file. LaTeX run 4 of 4
mediawiki2latex (1575744812.486176514s): finished
Now I see a PDF file, containing 14 pages, while it should contain much more.
Images are included (within this 14-pages part), but the table is utterly distorted.
And I see some new tex files, but none of them contains the text of the article. Boris Tsirelson (discusscontribs) 19:29, 7 December 2019 (UTC)
Hi, I just tried the same file with the server and got 45 pages. I did some work since the last release of Debian. The resulting file is here. You may reproduce the result locally by using the package from Debian testing, or, if you don't want to change your current system, follow the installation instructions; it should be done in 5 minutes. Dirk Hünniger (discusscontribs) 20:07, 7 December 2019 (UTC)
I see, thank you. Anyway, I admire your project. The task is extremely difficult, and I do not expect to get a really fine result without tweaking LaTeX files, the more so as my article contains a lot of rather atypical cases. Boris Tsirelson (discusscontribs) 20:21, 7 December 2019 (UTC)
I have multiple initial thoughts, but want to take a look before I elaborate. First though, what exactly would be included in the project js? What specific javascript gadget is proposed? --DannyS712 (discusscontribs) 19:38, 7 December 2019 (UTC)
Hi, the js is just a link to the web form on the web server with the url field filled in. It's just one line of JS. Dirk Hünniger (discusscontribs) 19:49, 7 December 2019 (UTC)
I'll investigate (I have some potential concerns) the actual software, but can you post what the code would be? --DannyS712 (discusscontribs) 23:43, 7 December 2019 (UTC)
here is the code mw.util.addPortletLink ('p-tb', ''+'fill/'+encodeURIComponent(''+wgArticlePath.replace('$1', encodeURIComponent(wgPageName))), 'Multi Format Export'); --Dirk Hünniger (discusscontribs) 08:00, 8 December 2019 (UTC)
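For readability, the one-liner above can be sketched as a small gadget that builds the export URL from the page's config values. This is an illustrative restatement, not the deployed code: the `SERVER_BASE` constant stands in for the (elided) mediawiki2latex server URL, and `mw.config.get` replaces the deprecated bare globals `wgArticlePath`/`wgPageName`.

```javascript
// Illustrative sketch of the sidebar gadget above. SERVER_BASE is a
// placeholder for the mediawiki2latex web server URL (elided in the
// original post). Building the "fill" URL is a pure function:
function buildExportUrl(serverBase, articlePath, pageName) {
  // e.g. articlePath = "/wiki/$1", pageName = "Wikiversity:Colloquium"
  var pagePath = articlePath.replace('$1', encodeURIComponent(pageName));
  return serverBase + 'fill/' + encodeURIComponent(pagePath);
}

// In MediaWiki:Common.js or a gadget, this would be wired up roughly as:
// mw.util.addPortletLink('p-tb',
//   buildExportUrl(SERVER_BASE,
//     mw.config.get('wgArticlePath'),
//     mw.config.get('wgPageName')),
//   'Multi Format Export');
```

The double `encodeURIComponent` matters: the page path is itself a query value passed to the export form, so the already-encoded title is encoded a second time.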
C.f. mw:Wikimedia Labs/Agreement to disclosure of personally identifiable information - I'm not sure just how much info tool maintainers get, but it expressly is not bound by the normal WMF privacy policy. Strongly oppose common.js / gadget on by default, not sure yet otherwise. --DannyS712 (discusscontribs) 23:51, 7 December 2019 (UTC)
The web page does not know which user sends the requests, and it does not use any cookies; furthermore, all data is deleted after 6 hours by a cron job, so I think I did my best to protect personal data. If you are concerned about the JavaScript, we could still do a pull request on the MediaWiki software itself to reach the same effect; it will just require some more time --Dirk Hünniger (discusscontribs) 08:00, 8 December 2019 (UTC)

I just tried this out on SkyCam. Most of the generated doc looks ok but the images are sideways with captions that are not oriented horizontally to the main text and there is excessive whitespace. I also noted problems with ordered (numbered) lists at Scientific computing. I really like the idea of this project and I also appreciate the difficulties in implementing it, but it doesn't look ready as a replacement for the (rather poor quality) existing generate features. Please ping me if you'd like me to beta test an update. I highly encourage development of a viable alternative to what we have now. --mikeu talk 04:19, 8 December 2019 (UTC)

I will not be able to solve this special case before Christmas. The problem is galleries, which are not done with the standard MediaWiki "gallery" tag but with custom gallery templates. Those templates put two tables in one cell of an outer table. This requires scaling the inner tables to fit into the single cell of the outer table, which is not yet implemented. With the current implementation, things will work if each inner table is in a separate cell of the outer table. --Dirk Hünniger (discusscontribs) 08:36, 8 December 2019 (UTC)
@Mu301: I solved the nested lists issue with Scientific computing. I also improved the situation with SkyCam, although there is still work to be done on this issue. I uploaded to git and deployed on the servers, so you can try again now. Thanks a lot for pointing out the issues so far; that greatly helps to improve the program. Looking forward to your next bug report. --Dirk Hünniger (discusscontribs) 10:56, 8 December 2019 (UTC)
@Mu301: I also deployed a fix for the remaining issue in SkyCam on the server. The simple thing I did was to scale tables which reside in the same cell of an outer table by a factor of 0.5. --Dirk Hünniger (discusscontribs) 13:53, 8 December 2019 (UTC)
@Dirk Hünniger: Nice work! This is obviously quite relevant to a couple of the items that were on the Community wishlist (1, Export_published_WikiJournal_articles_to_DOCX_or_PDF, 3). Would it be possible to adapt it to create more complex custom formatting, and add/omit certain page headings and footers? Currently, published WikiJournal articles are formatted manually using a Word document template (example output), but an automated tool would be far preferred! T.Shafee(Evo﹠Evo)talk 21:01, 10 December 2019 (UTC)
@Evolution and evolvability: yes, the system can be customized. If you got Debian and know LaTeX you can play with that yourself from the command line. I can add your template to the webserver after that. --Dirk Hünniger (discusscontribs) 07:07, 11 December 2019 (UTC)
@Dirk Hünniger: Thank you, Dirk, for proposing this. Exporting Wikiversity pages to other formats like LaTeX, Markdown, ODT, ... that can be used in tailored learning environments could be quite useful. E.g. export the learning task in a Wikiversity page to a LibreOffice document (ODT) that can be adapted to the local requirements and constraints of the students, e.g. to include regional data in the learning task, or tailored difficulties for different students. All that cannot be addressed in the publicly available learning resources in Wikiversity. Adapting examples, e.g. by adding local and regional data for the students, will not be performed on the Wikiversity resources themselves, but e.g. in the exported ODT file. This is necessary especially when the data cannot be published. Replacing images with regional ones that are more comprehensible to the learner is an equivalent example to replacing sample data with real data in a learning resource. Beside privacy concerns about data, the general approach of remixing Open Educational Resources can also be addressed by providing export formats. The WikiBook-Creator allowed ODT export in the past. This approach also allowed the creation of tailored learning resources according to the prerequisites of the learners. Wiki2Reveal is just another example of an export format that could be helpful for teachers, but it depends on the community whether it is supported or not. Anyway, thank you for your contribution. --Bert Niehaus (discusscontribs) 16:28, 17 December 2019 (UTC)

Just for your information. The deployment I proposed here has now been installed on the German Wikibooks. See German Wikibooks link "Multi Format Export" in the sidebar on the left. Yours --Dirk Hünniger (discusscontribs) 06:08, 21 December 2019 (UTC)

I just tried it, and it looks like it's just a plain link - I thought that it would have the URL filled in? --DannyS712 (discusscontribs) 06:30, 21 December 2019 (UTC)
Yeah, currently it's still plain; the automated URL fill-in will hopefully happen by the beginning of next year. --Dirk Hünniger (discusscontribs) 08:10, 21 December 2019 (UTC)
@User:DannyS712 the automated URL fill in works now on German Wikibooks. See here for implementation details b:de:MediaWiki:Gadget-addMediawiki2LatexLink.js --Dirk Hünniger (discusscontribs) 13:37, 30 December 2019 (UTC)
Why not use mw.util.addPortletLink as previously? --DannyS712 (discusscontribs) 18:55, 30 December 2019 (UTC)
I don't know, but possibly the author knows b:de:Benutzer:Stephan Kulla --Dirk Hünniger (discusscontribs) 10:18, 31 December 2019 (UTC)

Note: This has also been deployed on the German Wikiversity. --Dirk Hünniger (discusscontribs) 17:29, 2 January 2020 (UTC)

Export Format RevealJS

As recommended by @Dave Braunschweig: I want to add to the topic of export formats that is already addressed by the section above. The export format is RevealJS, which is a web-based presentation format that allows audio comments and stylus annotations on the slides. The prototype tool, as a proof of concept, is available on GitHub (see ). It fetches the wiki source and converts it to HTML in the RevealJS syntax (see Wiki2Reveal-Demo). So it performs a slightly different task than Parsoid, with RevealJS-flavored HTML as the output/export format.
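The core of such a wiki-to-slides conversion is splitting the wiki source on headings and wrapping each part in a RevealJS section. The following is only an illustrative sketch of that idea, not Wiki2Reveal's actual code; the function name and the level-2-headings-become-slides rule are assumptions for the example.

```javascript
// Illustrative sketch (not Wiki2Reveal's actual implementation):
// split wiki source on level-2 headings and wrap each part in a
// RevealJS <section> element, so each "== Heading ==" becomes a slide.
function wikiToRevealSections(wikitext) {
  var slides = [];
  var current = { title: '', body: [] };
  wikitext.split('\n').forEach(function (line) {
    var m = line.match(/^==\s*(.+?)\s*==$/);
    if (m) {
      // start a new slide at each level-2 heading
      if (current.title || current.body.length) slides.push(current);
      current = { title: m[1], body: [] };
    } else {
      current.body.push(line);
    }
  });
  slides.push(current);
  return slides.map(function (s) {
    return '<section>' +
      (s.title ? '<h2>' + s.title + '</h2>' : '') +
      '<p>' + s.body.join('\n').trim() + '</p>' +
      '</section>';
  }).join('\n');
}
```

This also illustrates the point made in the Wiki2Reveal footer discussion below: sections must be short enough to fit on one slide, since one section maps to exactly one `<section>` element.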


Wiki2Reveal Footer

The comments of @Dave Braunschweig: made me think about the transparency of any output-format conversion of the Wikiversity source, of which the Wiki2Reveal approach is just one example (PDF export of a Wikiversity page is another). I think it is very important that a Wikiversity learning resource that is used by Wiki2Reveal, e.g. as lecture slides or as an online presentation for a WikiJournal article, follows a kind of Wiki2Reveal policy in the following way:

  • (References to Wiki2Reveal) Wikiversity articles that are used with Wiki2Reveal must include a reference at the end of the article indicating that the article was designed for Wiki2Reveal,
  • (Compare Wikiversity Source and Wiki2Reveal Presentation) The footer link allows the learner/teacher to start the presentation from the Wikiversity source. With the footer link the community is able to check whether the generated Wiki2Reveal presentation shows the same content as the source article in Wikiversity it was generated from. This is important so that users of the Wiki2Reveal presentation can validate the generated content (i.e. that the content was not manipulated by Wiki2Reveal).
  • (Wikiversity back-reference from Wiki2Reveal presentation) A generated output format like Wiki2Reveal must contain a reference back to the Wikiversity source, so that someone who just displays the Wiki2Reveal presentation is also able to validate that the source content in Wikiversity and the Wiki2Reveal presentation are equal; e.g. for Wiki2Reveal, the title page contains a link back to the Wikiversity source by default.
  • (Wiki2Reveal Footer) A Wiki2Reveal footer is currently not added to the Wikiversity source by default. I started to add the footer to all lecture slides in Mathematics (German Wikiversity).
    • (Small sections that fit on a slide) With the Wiki2Reveal footer on the page, users can understand that the sections of the Wikiversity page are designed so that they fit on a slide. Without this information, users might add more information to a section (a good-faith edit), but the content would not be visible on the RevealJS slides.
    • (Open source code for Wiki2Reveal) A converter like Wiki2Reveal for Wikiversity learning resources must be open source, and the community must be able to check the source code to validate that it does what it pretends to do. Furthermore, the community should be able to use and modify the tool according to the requirements and constraints under which the learning resource in Wikiversity is used (e.g. Wiki2Reveal for mathematics lectures, or an added Wiki2Reveal presentation for a WikiJournal article).
  • (Teachers & Lecturers)
    • Wiki2Reveal presentations should be available to teachers/lecturers without administration rights.
    • Lecturers and teachers should be aware of the fact that annotations on the slides do not have an impact on the Wikiversity source and are not recorded by Wikiversity servers. The annotations are only temporarily available during the browser session on the client. Other calls of the Wiki2Reveal presentation will not show the annotations on the slides, and with the next call of the Wiki2Reveal presentation all annotations are gone.

--Bert Niehaus (discusscontribs) 09:31, 15 December 2019 (UTC)

Page Information

The Wiki2Reveal slides were created for the Kurs:Funktionalanalysis, and the link for the Wiki2Reveal slides was created with the link generator.

Community View about Wiki2Reveal

As far as I understood, cross-compilation of Wikiversity content in the sense of the Pandoc concept should be discussed by the community. So I am looking forward to your comments on Wiki2Reveal as a tool for using a wiki learning resource with a slide conversion. Do you regard that as useful, and what must be altered to be in line with the community strategy regarding Wikiversity as an OER sink from which cross-compilation into other formats can be performed? --Bert Niehaus (discusscontribs) 16:13, 10 December 2019 (UTC)


@Dave Braunschweig: raised the point that the prototype tool Wikipedia2Wikiversity needs community discussion about how and whether it should be used. I used Special:Import to create a learning resource from a Wikipedia entry by adding learning tasks and using some of the images for illustration that are used in Wikipedia as well. I identified the challenge that most of the links within Wikipedia do not work after import, because the articles in Wikipedia do not have a corresponding learning resource in Wikiversity.

Purpose of Wikipedia2Wikiversity

First of all, Wikipedia2Wikiversity corrects the Wikipedia links, because encyclopedic links (explanations of used terminology, e.g. Solar Water Still) do not have a corresponding learning resource (solar still) in Wikiversity. Links that work within Wikipedia do not work after Special:Import, so the main job of Wikipedia2Wikiversity is to convert a link, e.g. solar still, into an inter-wiki link w:en:solar still with the prefix w:en:, so that references to encyclopedic explanations of terms still work. In a Wikiversity learning resource, the learner can then use a learning resource for the terminology, or look up the specific meaning of terminology in Wikipedia via the inter-wiki link. But even if the learning resource exists in Wikiversity (e.g. for the term Water), an inter-wiki link might still be used for an encyclopedic reference to Water in Wikipedia. This depends on the requirements of the learning resource and cannot be fully automated by a bot. A rule of thumb: if most links in the learning resource should become inter-wiki links, then Wikipedia2Wikiversity might be helpful for reducing manual work on that resource. A purely automated conversion with a bot does not make sense, because it is a semantic question whether a reference to Wikipedia as an inter-wiki link or a link to a learning resource within Wikiversity is required. If there is currently no learning resource in Wikiversity for specific terminology, then it makes sense to temporarily use an inter-wiki link, which is replaced later by a link within Wikiversity when a learning resource is available.
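The link rewriting described above is essentially a regular-expression pass over the wiki source. The following is a deliberately simplified sketch of that idea, not the real tool's code: the function name is invented, and real link syntax has more cases (sections, templates, other namespaces) than this regex handles.

```javascript
// Simplified sketch (not the actual Wikipedia2Wikiversity code):
// rewrite plain [[Article]] and [[Article|label]] links into
// inter-wiki links with the w:en: prefix. Links that already carry a
// prefix or namespace (e.g. [[w:en:Foo]], [[File:Bar.png]]) contain a
// colon and are left untouched.
function toInterwikiLinks(wikitext) {
  return wikitext.replace(/\[\[([^\]|:]+)(\|[^\]]*)?\]\]/g,
    function (all, target, label) {
      // keep the displayed text: [[Foo]] becomes [[w:en:Foo|Foo]]
      return '[[w:en:' + target + (label || '|' + target) + ']]';
    });
}
```

Keeping the original title as the displayed label is what lets the imported text read the same before and after the conversion.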

With the support of Dave I would recommend the following workflow if regarded as useful:

  • Use Special:Import to import the raw version of the page history from Wikipedia,
  • If most of the links do not work and it would be sufficient to point most of them at Wikipedia articles, convert the local links into inter-wiki links, with manual replacement of the few links that will be created as learning resources in Wikiversity,
  • Add a new version with inter-wiki links in Wikiversity.

If it is not allowed to use this tool, authors have to manually replace all links that should be used as inter-wiki links to Wikipedia in the Wikiversity learning resource.

Underlying Technology

Both Wiki2Reveal and Wikipedia2Wikiversity use the same underlying technology based on the cross-fetch library, which is also used by wtf_wikipedia; these are mentioned as alternative parsers for wiki markup (see ). Similar to Wiki2Reveal, one component of wtf_wikipedia is wtf_fetch, which is used to perform the client-side conversion of Wikiversity content into slides. Wikipedia2Wikiversity is, just like Wiki2Reveal, an AppLSAC that runs in the browser and does not create server load for the conversion. The tool is open source, and what the tool does can be checked by anyone in the GitHub repository for the link converter Wikipedia2Wikiversity. I can wait for the results of the community discussion and put the content back on the German Wikiversity if and only if I get the community's permission to use Wiki2Reveal for lectures. For a comparison of Wiki2Reveal and Wikipedia2Wikiversity:

  • both tools fetch the wiki source from the MediaWiki API,
  • Wikipedia2Wikiversity performs a link conversion when a button is pressed, while
  • Wiki2Reveal generates an HTML presentation of the Wikiversity article as HTML5 slides.
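The shared fetch step can be sketched like this. The request parameters follow the standard MediaWiki action API; the wrapper function and its name are illustrative, not taken from either tool.

```javascript
// Sketch of the fetch step both tools share: retrieve raw wiki source
// via the standard MediaWiki action API. Building the request URL is
// a pure function; the network call itself (commented out) would use
// fetch/cross-fetch in the browser.
function wikiSourceUrl(domain, title) {
  return 'https://' + domain + '/w/api.php' +
    '?action=query&prop=revisions&rvprop=content&rvslots=main' +
    '&format=json&origin=*' +
    '&titles=' + encodeURIComponent(title);
}

// Usage in the browser (illustrative):
// fetch(wikiSourceUrl('en.wikiversity.org', 'Wikiversity:Colloquium'))
//   .then(function (r) { return r.json(); })
//   .then(function (data) { /* extract wikitext from the revisions */ });
```

The `origin=*` parameter enables anonymous cross-origin requests to the API, which is what allows such an AppLSAC to run entirely in the browser without its own server.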

I am looking forward to your comments on whether you think the two tools are useful or not, or should be modified. --Bert Niehaus (discusscontribs) 17:52, 10 December 2019 (UTC)

@Bert Niehaus: Very interesting tool. We've commonly had this come up for WikiJournal articles (especially those submitted from Wikipedia via WP:JAN). We've so far wrapped this template around the text {{subst:convert_links|...}} (relevant process bulletpoints). However, it doesn't deal well with tables. What are the current limitations of Wikipedia2Wikiversity? T.Shafee(Evo﹠Evo)talk 20:56, 10 December 2019 (UTC)
@Evolution and evolvability: The limitation is that you import the Wikipedia article first with the Special:Import tool, then you fetch the imported wiki markdown in Wikiversity with Wikipedia2Wikiversity again and replace the local links with inter-wiki links. I would recommend using the Wikipedia2Wikiversity tool in addition to Special:Import and comparing both results. Do not worry about using the tool; it just fetches the wiki markdown source and does not alter anything on Wikipedia or Wikiversity. Then you copy the generated markdown from Wikipedia2Wikiversity as a new version into the imported article that already exists in Wikiversity due to your use of Special:Import. Then you explore the page history with a diff between the last version of the Wikiversity document and the updated version from Wikipedia2Wikiversity. The diff will help you and me to identify which requirements of WikiJournal are covered and which replacements are missing. If needed, I could add an additional regular expression or modify the existing ones for Wikipedia2Wikiversity, so that the improved tool covers your needs. You can download the tool and start it locally in your browser as index.html. Wikipedia2Wikiversity does not need the GitHub server for processing; it performs the link replacement in the wiki markdown just in your browser.--Bert Niehaus (discusscontribs) 17:30, 11 December 2019 (UTC)
@Evolution and evolvability: Do you want me to perform a Special:Import and an application of Wikipedia2Wikiversity? Please let me know! Just select a Wikipedia article that should be imported and processed as a proof of concept for you! --Bert Niehaus (discusscontribs) 11:14, 12 December 2019 (UTC)

Wikidata integration and en.wp linking

I've been thinking about posting this for a few weeks and went back and forth until I saw today that YouTube is considered a high-quality learning resource. I went to check and sure enough, the link at the bottom of w:en:YouTube#External_links was added by me, and the relevant Wikidata item did not have a link until I added it just now. It's valuable that we are deliberate about adding links to our sister projects for visibility, and I think it's especially important to do so with our higher-quality resources. Am I on the right track here? Does anyone else have an interest in working with me on this? —Justin (koavf)TCM 01:50, 13 January 2020 (UTC)

I'd recommend focusing on a combination of high-quality, high-interest resources. Perhaps start at the top of Wikiversity:Statistics/2019 and review for quality. Where appropriate, add Wikidata and Wikipedia links. I have interest, but not much time right now. -- Dave Braunschweig (discusscontribs) 03:21, 13 January 2020 (UTC)
I've added Wikidata links to all of my resources for years. For resources in general I use {{Sisterprojectsearch}} or {{Sisterlinks}}. I agree that it is also important to add these to additional resources here, and noticed that User:Koavf added the Wikidata link to YouTube, thanks! Generally, I've noticed that few of our resources have sister links, especially to Wikidata. Perhaps these can be added by a bot. --Marshallsumter (discusscontribs) 06:09, 13 January 2020 (UTC)
It's fairly trivial to do a search on Wikidata for articles that have the same title as all of the titles here on this project. Since there are so few resources compared to the 6 million articles on en.wp, that wouldn't be terribly time-consuming. Of course, many resources are not going to have some one-to-one correspondence to en.wp but anything with the most basic name (e.g. Spanish) will. —Justin (koavf)TCM 06:18, 13 January 2020 (UTC)
I strongly oppose a bot making these decisions and in any case the bot would need to be authorized on WP. I doubt you will find consensus there to such a task. (I would !vote to oppose it) As I've explained below we should focus our community efforts on quality over quantity. The latter has been severely neglected in past efforts. --mikeu talk 15:44, 13 January 2020 (UTC)
Did anyone propose a bot? I appreciate you removing extraneous and essentially misleading links, but we should retain proper links even if our resource here on en.wv is pretty poor, for data integrity purposes. —Justin (koavf)TCM 22:45, 13 January 2020 (UTC)
Yes, a bot was proposed. And no, I agree with Mu301 that we should not have links to poor resources. Perhaps we should even Draft: the poor resources, but we certainly don't want to encourage Wikiversity being known for poor quality. If you believe that linking is necessary, that would accelerate my interest in pushing poor quality resources into Draft space. We do have a standing policy not to link to drafts. -- Dave Braunschweig (discusscontribs) 23:32, 13 January 2020 (UTC)
We should not choose to include or exclude Wikidata links because of quality. If we feel that a certain threshold is not met to be in the main namespace, then we should draftify or userfy. —Justin (koavf)TCM 23:44, 13 January 2020 (UTC)
Please have a look at my notice at Wikiversity:Notices_for_custodians/Archive/5#cross_wiki_disruption and the contribs.
Nobody60 (talk • email • contribs • stats • logs • global account) This participant's egregious self promotion of personal essays is the consequence of our negligence in curating the quality of wp links to resources here. Yes, we should be reviewing quality and culling those attempts. If draft or user moves are required I would accept that as an alternative. These are not AGF attempts at providing useful information to learners. This is spam and vandalism. --mikeu talk 00:07, 15 January 2020 (UTC)
I've made some effort to curate the existing links there; see my blog post for a summary. I discovered quite a few articles that had broken redlinks or pages that had been moved to draft space.[2] Ensuring quality works both ways: choose to link to our best featured projects, and also be vigilant about removing overzealous linking back here.
Currently, I see links to Arabic and Ada, which are poorly developed local stubs. Historically there was an effort by participants here to "drive traffic" to new resources. IMO, this is misguided and counterproductive. Despite a lengthy period of time with those links existing, it resulted in no development of the local resources. I just removed a link from w:Drum to Category:Drums, for example. I see little sense in linking w:Idempotence to Portal:Computer Science, which doesn't even mention the word. Even more problematic is w:World peace linking to a personal essay, Happiness/A World of Peace, Love and Happiness. (<- there were multiple examples of spamming to related essays) Then there's w:Refrigeration linking to an abandoned survivalist Underground refrigerated storage room resource. I just removed a half dozen such links. --mikeu talk 15:40, 13 January 2020 (UTC)

Wiki Loves Folklore

Hello Folks,

Wiki Loves Love is back again in its 2020 iteration as Wiki Loves Folklore, from 1 February 2020 to 29 February 2020. Join us to celebrate the local cultural heritage of your region with the theme of folklore in the international photography contest at Wikimedia Commons. Images, videos, and audio recordings representing different forms of folk cultures and new forms of heritage that haven't otherwise been documented so far are welcome submissions in Wiki Loves Folklore. Learn more about the contest at Meta-Wiki and Commons.

Kind regards,
Wiki Loves Folklore International Team
— Tulsi Bhagat (contribs | talk)
sent using MediaWiki message delivery (discusscontribs) 06:14, 18 January 2020 (UTC)


This refers to the discussion mentioned here:

I think that a Wikiversity page should be added on LinkedIn. Please vote for support or lack thereof.


From [3]:

Wikimedia has its LinkedIn page; Wikipedia, too. But not Wikiversity. I tried to show my Swedish studies but could not choose Wikiversity as the Institution. Why not? Even when it is not a degree-granting institution, it is still an Institution, right? When I contacted LinkedIn about this, they sent me the link so that I can create the Wikiversity page myself. But there is a box I must tick: "I confirm I am an approved authority of this Institution to create this page", which is not the case. But I think there are many Wikiversity experts on here that would qualify as Wikiversity LinkedIn page creators. I can create the page if someone here approves, but I would need some info: # of employees, etc.

For the number of employees (volunteers is not an option, though we are unpaid), I guess we could use the number of active users: 201-500. The current logo is File:Wikiversity logo 2017.svg. The website can be

Wikiversity is a community. None of us gets to insist that anything happen on behalf of the community unless there is consensus to do so. Because this request involves an outside organization, it may also require support from the WMF.

--Leonardo T. Cardillo (discusscontribs) 16:23, 19 January 2020 (UTC)

  •   Oppose Looking at your request at Wikiversity:Help desk it seems that what you are trying to do is list Wikiversity as an educational institution that you have attended which is not quite the same thing as a "company" profile page on that site. Because Wikiversity is not a degree granting institution and has no formal enrollment requirements it wouldn't qualify as a school as defined by LinkedIn. Creating a company page is problematic as we would need to have volunteers to maintain the page and keep it up-to-date. We tried creating @Wikiversity on Twitter and it was not much used so it was abandoned many years ago. I see that there's a Wikipedia Users Group that barely gets one or two posts per year despite Wikipedia having a much larger base of participants. There is also a Wikipedia company page that appears to be managed by staff who are paid to promote the site. I can't see a compelling reason to create a Wikiversity presence on LinkedIn. I would also note that you can add a "project" to your profile that links to Wikiversity. That would seem to be the best solution to what you are looking for. I don't support creation of a LinkedIn page at this time. --mikeu talk 17:06, 19 January 2020 (UTC)

Movement Learning and Leadership Development Project


The Wikimedia Foundation’s Community Development team is seeking to learn more about the way volunteers learn and develop into the many different roles that exist in the movement. Our goal is to build a movement informed framework that provides shared clarity and outlines accessible pathways on how to grow and develop skills within the movement. To this end, we are looking to speak with you, our community to learn about your journey as a Wikimedia volunteer. Whether you joined yesterday or have been here from the very start, we want to hear about the many ways volunteers join and contribute to our movement.

To learn more about the project, please visit the Meta page. If you are interested in participating in the project, please complete this simple Google form. Although we may not be able to speak to everyone who expresses interest, we encourage you to complete this short form if you are interested in participating!

-- LMiranda (WMF) (talk) 19:01, 22 January 2020 (UTC)

I just filled out the survey. I think that it would be great if Wikiversity had a presence in this project. --mikeu talk 23:46, 22 January 2020 (UTC)

Open call for Project Grants

Greetings! The Project Grants program is accepting proposals until February 20 to fund both experimental and proven projects such as research, offline outreach (including editathon series, workshops, etc.), online organizing (including contests), or providing other support for community building for Wikimedia projects.

We offer the following resources to help you plan your project and complete a grant proposal:

With thanks, I JethroBT (WMF) (talk) 18:38, 24 January 2020 (UTC)