WikiFur
Give a brief description of your project, specifically mentioning why you chose to use Wikibase.
WikiFur is the international encyclopedia of furry fandom – run by fans of anthropomorphic animals in art, stories and popular culture. Like Wikipedia, we use MediaWiki, but act more as a primary or secondary source, trusting editors with knowledge of the events, works, terms and people covered. Wikibase matched our increasing need for centralized multilingual data storage with tight MediaWiki integration, plus a strong querying solution (WDQS) to generate lists, maps, graphs and other visualizations. Support for lexemes – as our community has many unique terms – and schemas for automated validation were also bonus factors.
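To illustrate the querying side, here is a minimal sketch of pulling convention data from a WDQS-style endpoint to drive a list or map. It assumes a Wikibase Cloud SPARQL endpoint URL and placeholder property and item IDs (P1, P2, P3, Q1) – none of these are WikiFur's real identifiers:

```python
import requests

# Placeholder endpoint in the Wikibase Cloud URL pattern; substitute the
# real instance. Property/item IDs below are likewise examples only.
ENDPOINT = "https://example.wikibase.cloud/query/sparql"

# List conventions (instances of a hypothetical "convention" item Q1)
# with start dates and coordinates, e.g. for a timeline or map view.
QUERY = """
SELECT ?event ?eventLabel ?start ?coords WHERE {
  ?event wdt:P1 wd:Q1 .                 # P1 = "instance of" (example ID)
  OPTIONAL { ?event wdt:P2 ?start . }   # P2 = "start time" (example ID)
  OPTIONAL { ?event wdt:P3 ?coords . }  # P3 = "coordinates" (example ID)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
ORDER BY ?start
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "wikibase-query-example/0.1"},
    timeout=30,
)
response.raise_for_status()

# Print one line per event; the same result set can feed the query
# service's built-in map and timeline display modes directly.
for row in response.json()["results"]["bindings"]:
    label = row.get("eventLabel", {}).get("value", "?")
    start = row.get("start", {}).get("value", "")
    print(f"{label}\t{start}")
```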
There are a number of ways to import data into Wikibase. Describe the criteria that influenced your decision to use your chosen method.
WikiFur’s strength is its community, which it relies on for data collection and input from disparate sources – tweets, convention or personal websites, or even chat platforms such as Telegram or Discord. As we largely lack other databases from which to import with automation, we focused on assisting users with manual data entry, using tools such as a Cradle entry form for convention instances based on an entity schema. Such schemas can also validate both new data and existing event records, which must be loaded from our current lists, timelines and maps before those pages are replaced with Wikibase-driven equivalents (a sketch of this load-then-validate step appears below).
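A hedged sketch of that load-then-validate step follows. The wikitext line format, field names and checks are illustrative assumptions, standing in for whatever the real legacy pages and entity schema contain:

```python
import re
from datetime import date

# Assumed line format for a legacy wikitext timeline entry, e.g.:
#   * [[ConFuzzled]] - 2008-05-23, Manchester
# Real WikiFur lists vary; this only demonstrates the approach.
LINE_RE = re.compile(
    r"^\*\s*\[\[(?P<name>[^\]]+)\]\]\s*[-–]\s*"
    r"(?P<start>\d{4}(?:-\d{2}-\d{2})?)\s*,\s*(?P<place>.+)$"
)

def parse_timeline(wikitext: str) -> list[dict]:
    """Pull event records out of a legacy wikitext timeline page."""
    records = []
    for line in wikitext.splitlines():
        m = LINE_RE.match(line.strip())
        if m:
            records.append(m.groupdict())
    return records

def validate(record: dict) -> list[str]:
    """Apply simplified checks of the kind an entity schema encodes."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("missing label")
    start = record.get("start", "")
    if len(start) == 10:  # full ISO date: confirm it is a real calendar date
        try:
            date.fromisoformat(start)
        except ValueError:
            problems.append(f"invalid date: {start}")
    return problems

# Example: check records scraped from an old list before import.
sample = "* [[ConFuzzled]] - 2008-05-23, Manchester"
for rec in parse_timeline(sample):
    print(rec, validate(rec) or "ok")
```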
Did Wikibase meet your needs? Describe the challenges and successes you faced when implementing Wikibase.
We created a prototype to test entity types for data representation and queries. A standalone Wikibase implementation was delayed by the limited availability of LTS releases. Although we would want to cite the source – rather than rely on an editor’s authority – Cradle cannot add references to statements; it also lacks guidance for EDTF dates, with no way to provide such guidance other than on a linked property page. The “instance of” search icon likewise searches Wikidata rather than our wiki, despite the wdt: prefix set in the schema – and we could not use wfdt: as a prefix. Tools for schema-based validation usable by average editors are hard to find, and Wikidata’s quality-constraints system does not use entity schemas, which limits their utility. Creating backups of our Wikibase.cloud prototype was solved with the Archive.org WikiTeam Python script (dumpgenerator) – first by scavenging the necessary Python 2 packages for FreeBSD, then by helping to test the preliminary Python 3 version on Linux – a process now documented in MediaWiki’s manual; a sample invocation is sketched below.
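For reference, a minimal sketch of driving that script over the MediaWiki API – the api.php URL is a placeholder, and it assumes the Python 3 dumpgenerator.py sits in the working directory:

```python
import subprocess

# Run WikiTeam's dumpgenerator against the wiki's API to produce a full
# page-history XML dump plus uploaded files. The endpoint URL below is a
# placeholder; point it at your own instance's api.php.
subprocess.run(
    [
        "python3",
        "dumpgenerator.py",
        "--api=https://example.wikibase.cloud/w/api.php",  # placeholder URL
        "--xml",     # export complete page histories as XML
        "--images",  # also fetch uploaded files and their descriptions
    ],
    check=True,  # raise if the dump run fails
)
```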