Wikiversity:Colloquium/archives/December 2019

Draft Space Cleanup

 
[Figure: 12-month moving average pageviews before and after introduction of draftspace (dotted line)]

I have tagged all Draft: namespace resources that were unedited in the last 150 days with a 30-day proposed deletion. See Category:Proposed deletions to review. -- Dave Braunschweig (discusscontribs) 18:42, 1 December 2019 (UTC)

I use draftspace a lot and I agree with this policy. People with items up for deletion have a number of options:
  1. Improve the article in draft space or argue that it is complete and does not need improving.
  2. Move the draft into mainspace and see what happens.
  3. Move the draft into your own userspace.
  4. Save a permalink in your userspace. I like this last option because I can easily index the permalinks in case I ever want to complete the project. I keep a dated index of them at the bottom of my sandbox. I think it's OK to list them as external links by selecting "Permanent link" under the "Tools" sidebar and saving the external link. The links look like this:
- https://en.wikiversity.org/w/index.php?title=Draft:How_to_create_a_Wikiversity_article&oldid=2087561
- https://en.wikiversity.org/w/index.php?title=Draft:How_to_create_a_Wikiversity_article/Sample_subpage&oldid=2099847
Nobody can be certain about this, but there is good reason to hope that removing the clutter from Wikiversity will make it a more popular resource. --Guy vandegrift (discusscontribs) 22:24, 1 December 2019 (UTC) - I added a new option and placed it first. Guy vandegrift (discusscontribs) 13:37, 2 December 2019 (UTC)
Just FYI, the increase in popularity of Wikiversity coincides with the beginning and continuation of the WikiJournals. It also doesn't reflect that, while the popularity of Wikiversity is increasing, the number of new editors is decreasing. Wikiversity is the only WMF project with increasing popularity! --Marshallsumter (discusscontribs) 12:19, 2 December 2019 (UTC)
Also, as mentioned before, Google searches our Draft space. Resources that had high popularity before being put into Draft space had their popularity begin to recover once readers could find them in Draft space. Draft space does not improve resources, editors do! We need to address why we keep losing new editors! --Marshallsumter (discusscontribs) 12:29, 2 December 2019 (UTC)
I just did the math: according to Wikiversity:Statistics/2019/10, WikiJournals and their subpages account for 0.65% of the total requests during that month. Rainwater harvesting is more than twice (1.77%) as popular. According to pageviews, the rainwater top page is an order of magnitude more popular than any Journal landing page.[1] Correlation does not imply causation. The data support the conclusion that your observation is merely a coincidence. --mikeu talk 17:56, 3 December 2019 (UTC)
Actually, Rainwater harvesting has had no substantial content added since June 2016, when it was abandoned by its principal editor; since then the edits have primarily been reversals of vandalism. Its popularity may have more to do with the topic preferences of our readers than any "cleanup" using Draft space. Coincidence results from causation per coincidence counting that increases with time. --Marshallsumter (discusscontribs) 02:25, 9 December 2019 (UTC)
Note: MediaWiki:Robots.txt discourages search engine crawlers from retrieving urls in Draft space. If search engines are indexing drafts despite this, we should look into other methods of blocking, such as including a template with __NOINDEX__ in it. If a draft isn't ready to become a mainspace resource, it's not ready for indexing. --mikeu talk 16:34, 2 December 2019 (UTC)
I've created {{NOINDEX/DRAFT}} to address this. It adds __NOINDEX__ to pages only while they are in the Draft namespace. I will work on adding this to each Draft: page. I will also check What links here to verify that main pages aren't linking to draft resources. -- Dave Braunschweig (discusscontribs) 23:55, 2 December 2019 (UTC)
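Such a namespace-conditional template can be built from the NAMESPACE magic word and a ParserFunctions conditional; a minimal sketch (the actual {{NOINDEX/DRAFT}} code may differ):

<!-- Emit the __NOINDEX__ magic word only while the page is in the Draft namespace -->
{{#ifeq: {{NAMESPACE}} | Draft | __NOINDEX__ }}

Once the page is moved to mainspace, {{NAMESPACE}} no longer returns "Draft", so the template stops emitting __NOINDEX__ without any cleanup edit.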
All main space links to draft resources have been removed, except for a few that are about or referencing Draft: pages rather than directing users to draft pages. -- Dave Braunschweig (discusscontribs) 03:05, 4 December 2019 (UTC)

This brings up a question: what is the default for indexing draft space? I wasn't sure, so I modified robots.txt, but I don't know if that is actually doing anything. IMO, NOINDEX should be the default for all pages in draft and user space. If that is not the case, we may want to open a phab request. --mikeu talk 02:01, 3 December 2019 (UTC)

@Mu301: I usually put my ideas in my user page, but those aren't indexed by search engines. So, where in Wikiversity can I put my ideas so that they will be indexed by search engines? --Ans (discusscontribs) 13:13, 17 January 2020 (UTC)

Forming a User Group

It's only just sunk in that Wikiversity has no dedicated user group. There's the WikiJournal User group, but that has a rather specific remit. There's the Wikipedia & Education User Group, but that's broader and more focused on Wikipedia. There's the WikiToLearn User Group, but that's focused on the wikitolearn.org website.

Compare that to e.g.:

A thematic, multilingual User Group for Wikiversity could be valuable to represent and promote the interests of Wikiversity users as a whole at Strategy Summits and similar events (if there's a logical place to also post on beta.wikiversity.org then please repost/redirect from there). User groups are relatively simple to set up (Eligibility and Process). T.Shafee(Evo﹠Evo)talk 05:09, 3 December 2019 (UTC)

Can you give me an idea of how you think the above user groups have helped those respective sister projects? —Justin (koavf)TCM 07:40, 4 December 2019 (UTC)
I think the main aspect is that they have been pretty key voices in reducing Wikipedia-centrism in the strategy process (each user group nominates a strategy liaison to participate in ongoing discussions and aid the working groups) and at the Berlin Strategy Summit. They've also been pretty useful as multilingual points of contact. T.Shafee(Evo﹠Evo)talk 00:27, 10 December 2019 (UTC)

Thoughts about the latest community wishlist

Hi everyone, you may be interested in participating in this discussion about the community wishlist and how to improve the process. Do not hesitate to give your opinion; the more we know about the small communities, the better we can build something representative. Pamputt (discusscontribs) 17:36, 6 December 2019 (UTC)

Export to PDF, LaTeX, EPUB, ODT

Hi,

I propose to enable file export from Wikiversity to the formats PDF, LaTeX, EPUB, and ODT. Technically, I propose to use my open source project hosted on wmflabs, https://mediawiki2latex.wmflabs.org/. You can just copy and paste a URL from Wikiversity and click start.

You can also add a link to it in the left sidebar, by copying the common.js from my user namespace to your user namespace.

Of course, my ultimate goal is to modify MediaWiki:Common.js to make that link in the sidebar appear for all users. I am happy to hear your opinion about that. Furthermore, I will have to see whether the servers can stand the load if that actually happens.

--Dirk Hünniger (discusscontribs) 07:35, 7 December 2019 (UTC)

Wow! I'll try it. Boris Tsirelson (discusscontribs) 08:42, 7 December 2019 (UTC)

@DannyS712: Can you review this and let us know if it is something we should add for everyone? -- Dave Braunschweig (discusscontribs) 14:16, 7 December 2019 (UTC)

I tried the existing "Download as PDF" and "Create a book" features and the results were quite disappointing. I haven't tried this yet, but if it works better than the existing tools, I support it. --mikeu talk 16:52, 7 December 2019 (UTC)
I installed mediawiki2latex on my Debian 9 (without any problems) and ran
mediawiki2latex -c M2L-spaces -p A4 -o M2L-spaces/mytry2.pdf -u https://en.wikiversity.org/wiki/WikiJournal_of_Science/Spaces_in_mathematics
getting
mediawiki2latex (1575744739.193710114s): processing started
mediawiki2latex (1575744742.208098378s): downloading article and contributor information
warning: ""<!DOCTYPE html>\n<html lang=\"en\" dir=\"ltr\">\n<m..."" (line 1, column 1) HTML DOCTYPE declaration ignored
warning: ""<mediawiki xmlns=\"http://www.mediawiki.org/xm..."" (line 5, column 70) no opening tag found for </base>
warning: ""<!DOCTYPE html>\n<html class=\"client-nojs\" lan..."" (line 1, column 1) HTML DOCTYPE declaration ignored
warning: ""<mediawiki xmlns=\"http://www.mediawiki.org/xm..."" (line 5, column 61) no opening tag found for </base>
mediawiki2latex (1575744747.931374163s): parsing article text
mediawiki2latex (1575744747.931420509s): number of bytes to be parsed: 186906
mediawiki2latex (1575744747.950343111s): forking threads to download of images and contributor information on them
mediawiki2latex (1575744747.950386003s): number of images going to be downloaded: 11
mediawiki2latex (1575744750.763529471s): precompiling table columns
mediawiki2latex (1575744750.81762087s): number of columns to be compiled: 1
mediawiki2latex (1575744750.817703065s): precompiling column number 1
mediawiki2latex (1575744753.41844276s): generating LaTeX document
mediawiki2latex (1575744753.419271494s): number of bytes to be parsed: 186906
mediawiki2latex (1575744753.420796959s): joining threads to download the images and contributor information on them
mediawiki2latex (1575744753.420833862s): number of images to be processed: 11
mediawiki2latex (1575744775.865091424s): preparing for PDF generation
mediawiki2latex (1575744790.821428251s): preparing images for LaTeX document
mediawiki2latex (1575744794.441326673s): generating PDF file. LaTeX run 1 of 4
mediawiki2latex (1575744798.621484458s): generating PDF file. LaTeX run 2 of 4
mediawiki2latex (1575744803.345461606s): generating PDF file. LaTeX run 3 of 4
mediawiki2latex (1575744807.908469763s): generating PDF file. LaTeX run 4 of 4
mediawiki2latex (1575744812.486176514s): finished
Now I see a PDF file containing 14 pages, while it should contain much more.
Images are included (within this 14-page part), but the table is utterly distorted.
And I see some new tex files, but none of them contains the text of the article. Boris Tsirelson (discusscontribs) 19:29, 7 December 2019 (UTC)
Hi, I just tried the same file with the server and got 45 pages. I have done some work since the last release of Debian. The resulting file is here. You may reproduce the result locally by using the package from Debian testing, or, if you don't want to change your current system, follow the installation instructions at https://de.wikibooks.org/wiki/Benutzer:Dirk_H%C3%BCnniger/wb2pdf/install; it should be done in 5 minutes. Dirk Hünniger (discusscontribs) 20:07, 7 December 2019 (UTC)
I see, thank you. Anyway, I admire your project. The task is extremely difficult, and I do not expect to get a really fine result without tweaking the LaTeX files, the more so as my article contains a lot of rather atypical cases. Boris Tsirelson (discusscontribs) 20:21, 7 December 2019 (UTC)
I have multiple initial thoughts, but want to take a look before I elaborate. First, though, what exactly would be included in the project js? What specific JavaScript gadget is proposed? --DannyS712 (discusscontribs) 19:38, 7 December 2019 (UTC)
Hi, the js is just a link to the web form on the web server with the url field filled in. It's just one line of js. Dirk Hünniger (discusscontribs) 19:49, 7 December 2019 (UTC)
I'll investigate the actual software (I have some potential concerns), but can you post what the code would be? --DannyS712 (discusscontribs) 23:43, 7 December 2019 (UTC)
Here is the code: mw.util.addPortletLink ('p-tb', 'https://mediawiki2latex.wmflabs.org/'+'fill/'+encodeURIComponent('https://en.wikiversity.org'+wgArticlePath.replace('$1', encodeURIComponent(wgPageName))), 'Multi Format Export'); --Dirk Hünniger (discusscontribs) 08:00, 8 December 2019 (UTC)
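Unpacked for readability, the one-liner amounts to the following (a sketch using the mw.config.get accessors in place of the legacy wgArticlePath/wgPageName globals; it assumes the mediawiki.util module is loaded):

// Build the mediawiki2latex "fill" URL so its form opens with the
// current page's address already entered, then add a toolbox link.
var pageUrl = 'https://en.wikiversity.org' +
    mw.config.get( 'wgArticlePath' )
        .replace( '$1', encodeURIComponent( mw.config.get( 'wgPageName' ) ) );
mw.util.addPortletLink(
    'p-tb', // the toolbox portlet in the sidebar
    'https://mediawiki2latex.wmflabs.org/fill/' + encodeURIComponent( pageUrl ),
    'Multi Format Export'
);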
Cf. mw:Wikimedia Labs/Agreement to disclosure of personally identifiable information - I'm not sure just how much info tool maintainers get, but it expressly is not bound by the normal WMF privacy policy. I strongly oppose a common.js / gadget that is on by default; not sure yet otherwise. --DannyS712 (discusscontribs) 23:51, 7 December 2019 (UTC)
The web page does not know which user sends the requests, it does not use any cookies, and all data is deleted after 6 hours by a cron job, so I think I did my best to protect personal data. If you are concerned about JavaScript, we could still do a pull request on the MediaWiki software itself to reach the same effect; it will just require some more time. --Dirk Hünniger (discusscontribs) 08:00, 8 December 2019 (UTC)

I just tried this out on SkyCam. Most of the generated doc looks OK, but the images are sideways, with captions that are not oriented horizontally to the main text, and there is excessive whitespace. I also noted problems with ordered (numbered) lists at Scientific computing. I really like the idea of this project and I also appreciate the difficulties in implementing it, but it doesn't look ready as a replacement for the (rather poor quality) existing export features. Please ping me if you'd like me to beta test an update. I highly encourage development of a viable alternative to what we have now. --mikeu talk 04:19, 8 December 2019 (UTC)

I will not be able to solve this special case before Christmas. The problem is galleries that are made not with the standard MediaWiki "gallery" tag but with custom gallery templates. Those templates put two tables in one cell of an outer table. This requires scaling the inner tables to fit into the single cell of the outer table, which is not yet implemented. With the current implementation, things will work if each inner table is in a separate cell of the outer table. --Dirk Hünniger (discusscontribs) 08:36, 8 December 2019 (UTC)
@Mu301: I solved the nested lists issue with Scientific computing. I also improved the situation with SkyCam, although there is still work to be done on this issue. I uploaded to git and deployed on the servers, so you can try again now. Thanks a lot for pointing out the issues so far; that greatly helps to improve the program. Looking forward to your next bug report. --Dirk Hünniger (discusscontribs) 10:56, 8 December 2019 (UTC)
@Mu301: I also deployed a fix for the remaining issue in SkyCam on the server. The simple thing I did was to scale tables that reside in the same cell of an outer table by a factor of 0.5. --Dirk Hünniger (discusscontribs) 13:53, 8 December 2019 (UTC)
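In LaTeX terms, that fix presumably amounts to wrapping each inner table in a scaling command; a minimal sketch using graphicx's \scalebox (the code mediawiki2latex actually generates may differ):

\documentclass{article}
\usepackage{graphicx} % provides \scalebox
\begin{document}
\begin{tabular}{|c|}
\hline
% Two inner tables sharing one cell of the outer table,
% each scaled by the factor 0.5 mentioned above.
\scalebox{0.5}{\begin{tabular}{cc} a & b \\ c & d \end{tabular}}
\scalebox{0.5}{\begin{tabular}{cc} e & f \\ g & h \end{tabular}} \\
\hline
\end{tabular}
\end{document}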
@Dirk Hünniger: Nice work! This is obviously quite relevant to a couple of the items that were on the Community wishlist (1, Export_published_WikiJournal_articles_to_DOCX_or_PDF, 3). Would it be possible to adapt it to create more complex custom formatting, and to add or omit certain page headings and footers? Currently, published WikiJournal articles are formatted manually using a Word document template (example output), but an automated tool would be far preferred! T.Shafee(Evo﹠Evo)talk 21:01, 10 December 2019 (UTC)
@Evolution and evolvability: Yes, the system can be customized. If you have Debian and know LaTeX, you can play with that yourself from the command line. I can add your template to the webserver after that. --Dirk Hünniger (discusscontribs) 07:07, 11 December 2019 (UTC)
@Dirk Hünniger: Thank you, Dirk, for proposing this. Exporting Wikiversity pages to other formats like LaTeX, Markdown, ODT, ... that can be used in a tailored learning environment could be quite useful. For example, export the learning task in a Wikiversity page to a LibreOffice document (ODT) that can be adapted to local requirements and constraints of the students, e.g. to include regional data in the learning task or tailored difficulties for different students. All of that cannot be addressed in the publicly available learning resources in Wikiversity. Adapting examples, e.g. by adding local and regional data for the students, will not be performed on the Wikiversity resources themselves, but e.g. in the exported ODT file. This is necessary especially when data cannot be published. Replacing images with regional ones that are more comprehensible to the learner is an equivalent example to replacing sample data with real data in a learning resource. Besides privacy concerns about data, the general approach of remixing Open Educational Resources can also be addressed by providing export formats. The wiki book creator allowed ODT export in the past; this approach also allowed creating tailored learning resources according to the prerequisites of the learners. Wiki2Reveal is just another example of an export format that could be helpful for teachers, but it depends on the community whether it is supported or not. Anyway, thank you for your contribution. --Bert Niehaus (discusscontribs) 16:28, 17 December 2019 (UTC)

Just for your information: the deployment I proposed here has now been installed on the German Wikibooks. See the "Multi Format Export" link in the left sidebar on German Wikibooks. Yours --Dirk Hünniger (discusscontribs) 06:08, 21 December 2019 (UTC)

I just tried it, and it looks like it's just a plain link to https://mediawiki2latex.wmflabs.org/ - I thought that it would have the URL filled in? --DannyS712 (discusscontribs) 06:30, 21 December 2019 (UTC)
Yeah, currently it's still plain; the automated URL fill-in will hopefully happen by the beginning of next year. --Dirk Hünniger (discusscontribs) 08:10, 21 December 2019 (UTC)
@User:DannyS712: The automated URL fill-in now works on German Wikibooks. See b:de:MediaWiki:Gadget-addMediawiki2LatexLink.js for implementation details. --Dirk Hünniger (discusscontribs) 13:37, 30 December 2019 (UTC)
Why not use mw.util.addPortletLink as previously? --DannyS712 (discusscontribs) 18:55, 30 December 2019 (UTC)
I don't know, but possibly the author, b:de:Benutzer:Stephan Kulla, knows. --Dirk Hünniger (discusscontribs) 10:18, 31 December 2019 (UTC)

Note: This has also been deployed on the German Wikiversity. --Dirk Hünniger (discusscontribs) 17:29, 2 January 2020 (UTC)

Export Format RevealJS

As recommended by @Dave Braunschweig: I want to add to the topic of export formats that is already addressed by the section above. The export format here is RevealJS, a web-based presentation format that allows audio comments and stylus annotations on the slides. The prototype tool, as a proof of concept, is available on GitHub (see https://www.github.com/niebert/Wiki2Reveal ). It fetches the wiki source and converts it to HTML in the RevealJS syntax (see Wiki2Reveal-Demo). So it performs a slightly different task than Parsoid, with RevealJS-flavored HTML as the output/export format.
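To illustrate the fetch-and-convert idea, here is a hypothetical sketch (not the actual Wiki2Reveal code; the page title is only an example) that retrieves a page's wikitext through the MediaWiki API and splits it into slide-sized chunks at level-2 headings:

// Fetch the wikitext of a Wikiversity page via the MediaWiki API
// (origin=* enables anonymous cross-origin requests from a browser).
var api = 'https://en.wikiversity.org/w/api.php' +
    '?action=parse&prop=wikitext&format=json&origin=*&page=';
fetch( api + encodeURIComponent( 'Scientific computing' ) )
    .then( function ( response ) { return response.json(); } )
    .then( function ( data ) {
        var wikitext = data.parse.wikitext['*'];
        // One RevealJS slide per "== Heading ==" section.
        var slides = wikitext.split( /\n(?=== )/ );
        console.log( slides.length + ' slide sections fetched' );
    } );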

Application

Wiki2Reveal Footer

The comments of @Dave Braunschweig: made me think about the transparency of any output-format conversion of the Wikiversity source, of which the Wiki2Reveal approach is just one example (PDF export of a Wikiversity page is another). I think it is very important that a Wikiversity learning resource used by Wiki2Reveal, e.g. as lecture slides or as an online presentation for a WikiJournal article, is adapted to a kind of Wiki2Reveal policy in the following way:

  • (References to Wiki2Reveal) Wikiversity articles that are used with Wiki2Reveal must include a reference at the end of the article indicating that the article was designed for Wiki2Reveal.
  • (Compare Wikiversity source and Wiki2Reveal presentation) The footer link allows the learner/teacher to start the presentation from the Wikiversity source. With the footer link, the community is able to check whether the generated Wiki2Reveal presentation shows the same content as the Wikiversity source article it was generated from. This is important so that users of the Wiki2Reveal presentation can validate the generated content (i.e. that the content was not manipulated by Wiki2Reveal).
  • (Wikiversity back-reference from the Wiki2Reveal presentation) A generated output format like Wiki2Reveal must contain a reference back to the Wikiversity source, so that someone who just displays the Wiki2Reveal presentation is also able to validate that the source content in Wikiversity and the Wiki2Reveal presentation are equal; e.g. for Wiki2Reveal the title page contains a link back to the Wikiversity source by default.
  • (Wiki2Reveal footer) A Wiki2Reveal footer is currently not added to the Wikiversity source by default. I started to add the footer to all lecture slides in Mathematics (German Wikiversity).
    • (Small sections that fit on a slide) With the Wiki2Reveal footer on the page, users can understand that the sections of the Wikiversity page are designed so that they fit on a slide. Without this information, users might add more information to a section (a good-faith edit), but the content would not be visible on the RevealJS slides.
    • (Open source code for Wiki2Reveal) A converter like Wiki2Reveal for Wikiversity learning resources must be open source, and the community must be able to check the source code to validate that it does what it claims to do. Furthermore, the community should be able to use and modify the tool according to the requirements and constraints under which the learning resource in Wikiversity is used (e.g. Wiki2Reveal for mathematics lectures, or an added Wiki2Reveal presentation for a WikiJournal article).
  • (Teachers & lecturers)
    • Wiki2Reveal presentations should be available to teachers/lecturers without administration rights.
    • Lecturers and teachers should be aware of the fact that annotations on the slides have no impact on the Wikiversity source and are not recorded by Wikiversity servers. The annotations are only temporarily available during the browser session on the client. Other calls of the Wiki2Reveal presentation will not see the annotations on the slides, and with the next call of the Wiki2Reveal presentation all annotations are gone.

--Bert Niehaus (discusscontribs) 09:31, 15 December 2019 (UTC)

Page Information

The Wiki2Reveal slides were created for the Kurs:Funktionalanalysis, and the link for the Wiki2Reveal slides was created with the link generator.

Community View about Wiki2Reveal

As far as I understand, cross-compilation of Wikiversity content in the sense of the Pandoc concept should be discussed by the community. So I am looking forward to your comments on Wiki2Reveal as a tool for using a wiki learning resource with a slide conversion. Do you regard it as useful, and what must be altered to be in line with the community strategy regarding Wikiversity as an OER sink from which cross-compilation into other formats can be performed? --Bert Niehaus (discusscontribs) 16:13, 10 December 2019 (UTC)


Wikipedia2Wikiversity

@Dave Braunschweig: raised the point that the prototype tool Wikipedia2Wikiversity needs community discussion about how and whether it should be used. I used Special:Import to create a learning resource from a Wikipedia entry, adding learning tasks and using some of the images for illustration that are used in Wikipedia as well. I identified the challenge that most of the links do not work after import, because the linked Wikipedia articles do not have corresponding learning resources in Wikiversity.

Purpose of Wikipedia2Wikiversity

First of all, Wikipedia2Wikiversity corrects the Wikipedia links. Encyclopedic links (explanations of used terminology, e.g. Solar Water Still) often have no corresponding learning resource (e.g. solar still) in Wikiversity, so links that work within Wikipedia do not work after Special:Import. The main job of Wikipedia2Wikiversity is therefore to convert a link such as solar still into an interwiki link w:en:solar still with the prefix w:en:, so that references to encyclopedic explanations of terms keep working. In a Wikiversity learning resource the learner can then use a learning resource for the terminology, or look up the specific meaning of the terminology in Wikipedia via the interwiki link. Even if a learning resource exists in Wikiversity (e.g. for the term Water), an interwiki link might still be wanted for an encyclopedic reference to Water in Wikipedia. This depends on the requirements of the learning resource and cannot be fully automated by a bot. A rule of thumb: if most links in the learning resource should become interwiki links, then Wikipedia2Wikiversity can reduce the manual work for that resource. A purely automated conversion with a bot does not make sense, because it is a semantic question whether a reference to Wikipedia as an interwiki link or a link to a learning resource within Wikiversity is required. If there is currently no learning resource in Wikiversity for a specific term, it makes sense to use an interwiki link temporarily and replace it later by a link within Wikiversity when a learning resource is available.
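As an illustration of the kind of rewrite involved, a hypothetical sketch (the tool's actual regular expressions may differ):

// Rewrite plain wikilinks into interwiki links to English Wikipedia:
//   [[solar still]]         -> [[w:en:solar still|solar still]]
//   [[solar still|a still]] -> [[w:en:solar still|a still]]
// Links that already carry a prefix (w:, File:, Category:, ...) are skipped.
function toInterwikiLinks( wikitext ) {
    return wikitext.replace(
        /\[\[(?![\w-]+:)([^\]|]+)(?:\|([^\]]+))?\]\]/g,
        function ( match, target, label ) {
            return '[[w:en:' + target + '|' + ( label || target ) + ']]';
        }
    );
}

Whether a given link should instead point at a Wikiversity learning resource remains a semantic decision, which is why the replacement is triggered manually rather than run as a bot.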

With the support of Dave, I would recommend the following workflow, if regarded as useful:

  • Use Special:Import to import the raw version of the page and its history from Wikipedia,
  • if most of the links do not work and it would be sufficient to point most of them at Wikipedia articles, convert the local links into interwiki links, with manual replacement of the few links that will be created as learning resources in Wikiversity,
  • add a new version with the interwiki links to the imported page in Wikiversity.

If it is not allowed to use this tool, authors have to manually replace every link that should become an interwiki link to Wikipedia in the Wikiversity learning resource.

Underlying Technology

Both Wiki2Reveal and Wikipedia2Wikiversity use the same underlying technology, based on the cross-fetch library that is also used by wtf_wikipedia, which is mentioned among the alternative parsers for wiki markup (see https://www.mediawiki.org/wiki/Alternative_parsers ). One component of wtf_wikipedia is wtf_fetch, which is used to perform the client-side conversion of Wikiversity content into slides. Wikipedia2Wikiversity, just like Wiki2Reveal, is an AppLSAC that runs in the browser and does not create conversion load on a server. The tool is open source, and what it does can be checked by anyone in the GitHub repository for the link converter Wikipedia2Wikiversity. I can wait for the results of the community discussion and put the content back on the German Wikiversity if and only if I get the community's permission to use Wiki2Reveal for lectures. Comparing Wiki2Reveal and Wikipedia2Wikiversity (a short usage sketch of the fetch step follows the list below):

  • both tools fetch the wiki source from the MediaWiki API,
  • and then either perform a link conversion when a button is pressed (Wikipedia2Wikiversity),
  • or render the Wikiversity article as HTML5 slides (Wiki2Reveal).
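For the fetch step, a hedged usage sketch of the wtf_wikipedia API mentioned above (API as of late 2019; details may have changed since):

// Fetch and parse a page with wtf_wikipedia in Node.js.
var wtf = require( 'wtf_wikipedia' );
wtf.fetch( 'Solar still', 'en' ).then( function ( doc ) {
    // doc is the parsed document; text() gives a plain-text rendering.
    console.log( doc.text().slice( 0, 200 ) );
} );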

I am looking forward to your comments on whether you think the two tools are useful and whether they should be modified. --Bert Niehaus (discusscontribs) 17:52, 10 December 2019 (UTC)

@Bert Niehaus: Very interesting tool. We've commonly had this come up for WikiJournal articles (especially those submitted from Wikipedia via WP:JAN). We've so far wrapped this template around the text: {{subst:convert_links|...}} (relevant process bulletpoints). However, it doesn't deal well with tables. What are the current limitations of Wikipedia2Wikiversity? T.Shafee(Evo﹠Evo)talk 20:56, 10 December 2019 (UTC)
@Evolution and evolvability: The limitation is that you have to import the Wikipedia article first with the Special:Import tool; then you fetch the imported wiki markup in Wikiversity with Wikipedia2Wikiversity and replace the local links by interwiki links. I would recommend using the Wikipedia2Wikiversity tool in addition to Special:Import and comparing both results. Do not worry about using the tool; it just fetches the wiki markup source and does not alter anything on Wikipedia or Wikiversity. Then you copy the generated markup from Wikipedia2Wikiversity as a new version into the imported article that already exists in Wikiversity due to your use of Special:Import. Then you explore the page history with a diff between the last version of the Wikiversity document and the updated version from Wikipedia2Wikiversity. The diff will help you and me to identify which requirements of WikiJournal are covered and which replacements are missing. If needed, I could add an additional regular expression or modify the existing ones for Wikipedia2Wikiversity, so that the improved tool covers your needs. You can download the tool and start it locally in your browser as index.html; Wikipedia2Wikiversity does not need the GitHub server for processing. Wikipedia2Wikiversity performs the link replacement in the wiki markup entirely in your browser. --Bert Niehaus (discusscontribs) 17:30, 11 December 2019 (UTC)
@Evolution and evolvability: Do you want me to perform a Special:Import and an application of Wikipedia2Wikiversity? Please let me know! Just select a Wikipedia article that should be imported and processed as a proof of concept for you. --Bert Niehaus (discusscontribs) 11:14, 12 December 2019 (UTC)