Context
A StrategyCore forum post (magic9mushroom, April 2025) identifies that a UFOpaedia editor called EsTeR "unilaterally revamped everything over community objections" in the Apocalypse section — doing some good work but "ruining a large number of articles." The goal is to systematically review all of EsTeR's edits across the entire wiki, decide what to revert, and execute the reverts using the Deldonut1 account.
"One notable goal is to unfuck the Apocalypse section. An enthusiastic editor called EsTeR decided to unilaterally revamp everything over community objections, and while he did some good work he ruined a large number of articles. It's a mammoth task to figure out what to do with the mess."
— magic9mushroom, StrategyCore
Key Reference Links
Parent ticket: #3 — Organize and index UFOpaedia wiki dump data
Data Sources Available (from #3)
We already have a local wiki dump that partially covers this audit — but it may not have all of EsTeR's edits or full revision history. Both local data and browser-based gathering are needed to complement each other.
| Source | File | Relevance |
| --- | --- | --- |
| 2020 XML dump | `ufopaediaorg-20200218-history.xml` (33 MB) | Pre-EsTeR baseline: 14,400 pages with wikitext as of Feb 2020. Gives us the "before" state for comparison. |
| EsTeR edits | `ester_edits.json` | 50 EsTeR edits tracked via the Wayback Machine; may not be complete. |
| Wayback snapshots | `wayback/*.json` (14 files) | 14 key pages with 1–40 snapshots each (2020–2026). Partial coverage only. |
| Page titles | `ufopaediaorg-20200218-titles.txt` | Full list of 14,400 page titles for cross-referencing. |
| Indexed data | `indexed/` (once #3 is complete) | Structured per-page JSON with categories, links, and contributor info. |
All files are in scratch/wiki-dump/ufopaedia-dump/ (gitignored).
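As a first sanity check on the local data, a minimal sketch that cross-references EsTeR's tracked pages against the full title list. The `"page"` key is an assumption about the `ester_edits.json` schema; adjust it to the file's real structure.

```python
import json

def pages_missing_from_dump(edits_json: str, titles_text: str) -> set[str]:
    """Return EsTeR-edited page titles absent from the 2020 dump's title list.

    Assumes edits_json is a JSON array of edit records with a "page" key --
    the actual schema of ester_edits.json may differ.
    """
    titles = set(titles_text.splitlines())
    touched = {e["page"] for e in json.loads(edits_json)}
    return touched - titles
```

Run against `ester_edits.json` and `ufopaediaorg-20200218-titles.txt` from `scratch/wiki-dump/ufopaedia-dump/`; any pages it reports were likely created after the Feb 2020 dump and can only be recovered via the browser steps.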
Phase 1: Gather Data

Use both local data and browser-based gathering to build a complete picture. The local dump gives us a head start; the browser steps fill the gaps.

1A: Local Data (from wiki dump, #3)

- Extract the 50 already-tracked EsTeR edits from `ester_edits.json` (page titles, timestamps, content snapshots from Wayback).
- Use `apocalypse-pages.json` to identify all Apocalypse-related pages and check which ones EsTeR touched.

1B: Browser-Based Gathering (to fill gaps)

The local data may be incomplete: `ester_edits.json` has only 50 edits, the XML dump has only one revision per page, and Wayback covers only 14 pages. Browser-based gathering ensures we capture everything.

1. Save EsTeR's full contributions list from `https://www.ufopaedia.org/index.php?title=Special:Contributions/EsTeR&limit=500` into `scratch/ester-audit/contributions/`, then cross-reference against `ester_edits.json` to find any edits not already tracked.
2. Export the affected pages with full history via `https://www.ufopaedia.org/index.php/Special:Export`, saving to `scratch/ester-audit/pages-export.xml`.
3. (Fallback) If Special:Export is too slow, visit `https://www.ufopaedia.org/index.php?title=PAGE_NAME&action=history` for each affected page and save the HTML into `scratch/ester-audit/histories/`.
Phase 2: Analyze with Python Script

Build an analysis script (`scratch/ester-audit/analyze_ester_edits.py`) that:

- Loads `ester_edits.json` (50 tracked edits) as the starting point for the EsTeR edit list.
- Parses the saved `Special:Contributions` HTML pages to find any additional edits not in `ester_edits.json`.
- Parses the `Special:Export` XML (from the browser) to get full revision history with all contributors.
- Generates `review-tracker.json` with fields per page: `page`, `ester_edit_count`, `first_edit`, `last_edit`, `total_size_change`, `edit_summaries`, `pre_ester_revision_id`, `post_ester_revision_id`, `diff_url`, `subsequent_edits_by_others`, `review_status`, `decision`, `notes`, and `data_sources` (which sources contributed data: dump, wayback, export, contributions).
- Generates `report.md` with statistics and the review queue.

Libraries needed:

- `xml.etree.ElementTree` (stdlib) for XML parsing; reuse patterns from `index_wiki_dump.py` (#3).
- `beautifulsoup4` (`pip install beautifulsoup4`) for HTML parsing (contributions pages).
- `difflib` (stdlib) for computing text diffs between revisions.
- `json`, `pathlib`, `re` (stdlib).
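The Special:Export parsing step might look like the sketch below, using stdlib ElementTree. The export namespace version (`export-0.10` here) is an assumption; check the root element of the actual `pages-export.xml` and adjust.

```python
import xml.etree.ElementTree as ET

def ester_revisions(export_xml: str, user: str = "EsTeR"):
    """Yield (page_title, revision_id, timestamp) for revisions by `user`
    from a MediaWiki Special:Export dump. Namespace version is a guess --
    confirm it against the file's root element."""
    ns = {"mw": "http://www.mediawiki.org/xml/export-0.10/"}
    root = ET.fromstring(export_xml)
    for page in root.findall("mw:page", ns):
        title = page.findtext("mw:title", namespaces=ns)
        for rev in page.findall("mw:revision", ns):
            username = rev.findtext("mw:contributor/mw:username", namespaces=ns)
            if username == user:
                yield (
                    title,
                    rev.findtext("mw:id", namespaces=ns),
                    rev.findtext("mw:timestamp", namespaces=ns),
                )
```

From these tuples the script can derive `ester_edit_count`, `first_edit`, `last_edit`, and (by looking at the revisions around them) the pre/post revision IDs.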
Phase 3: Review & Act

- Work through the review queue page by page (local copies under `wiki/original-game/` can aid comparison).
- Record each outcome in `review-tracker.json` as `"decision": "revert"` / `"keep"` / `"partial"` / `"already-fixed"`.
- Execute the agreed reverts and fixes from the Deldonut1 account.
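One way the review queue could be ordered, following the decision framework's emphasis on pages where EsTeR's edit is still the latest. Field names follow the `review-tracker.json` schema from Phase 2, but their types (`subsequent_edits_by_others` as a count, `review_status` value `"pending"`) are assumptions:

```python
import json

def review_queue(tracker_json: str) -> list[dict]:
    """Order unreviewed pages: pages with no subsequent edits by others
    come first (highest revert priority), then by EsTeR edit count."""
    pages = json.loads(tracker_json)
    pending = [p for p in pages if p.get("review_status") == "pending"]
    return sorted(
        pending,
        key=lambda p: (
            p.get("subsequent_edits_by_others", 0),
            -p.get("ester_edit_count", 0),
        ),
    )
```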
Decision Framework

| Situation | Action |
| --- | --- |
| EsTeR removed accurate, useful content | Revert to pre-EsTeR version |
| EsTeR added incorrect/misleading info | Revert, or edit to fix |
| EsTeR improved formatting but changed factual content | Partial: keep formatting, restore facts |
| EsTeR's version is fine or improves on the original | Keep |
| Another editor already reverted/fixed | Already fixed, no action needed |
| EsTeR's edit was the last edit and the page is now worse | High-priority revert |

Reverts are executed via MediaWiki's undo link: `https://www.ufopaedia.org/index.php?title=PAGE&action=edit&undoafter=PRE_ESTER_REVID&undo=LAST_ESTER_REVID`.
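The undo link can be built mechanically from the revision IDs stored in the tracker. A small helper using only the URL shape given above:

```python
from urllib.parse import quote

def undo_url(page: str, pre_ester_rev: int, last_ester_rev: int) -> str:
    """MediaWiki undo link: undoes every revision after pre_ester_rev
    up to and including last_ester_rev in a single edit."""
    return (
        "https://www.ufopaedia.org/index.php"
        f"?title={quote(page)}&action=edit"
        f"&undoafter={pre_ester_rev}&undo={last_ester_rev}"
    )
```

These URLs can be emitted into `report.md` next to each page so the Deldonut1 reverts are one click each.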
File Structure

    scratch/ester-audit/
        analyze_ester_edits.py             # Analysis script
        review-tracker.json                # Generated tracking file
        report.md                          # Human-readable summary report
        contributions/                     # Saved Special:Contributions HTML pages
        pages-export.xml                   # Special:Export XML (from browser)
        histories/                         # Individual page history HTMLs (fallback)

    scratch/wiki-dump/ufopaedia-dump/      # (from #3, already exists)
        ufopaediaorg-20200218-history.xml  # Pre-EsTeR baseline (Feb 2020)
        ester_edits.json                   # 50 tracked EsTeR edits (partial)
        wayback/                           # Wayback snapshots for 14 key pages
        indexed/                           # Structured index (once #3 is complete)
Verification

- Re-check EsTeR's `Special:Contributions` against `review-tracker.json` to confirm no edits were missed.
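A sketch of that missed-edit check: diff the edits recovered from the contributions pages against the tracker. The tracker dicts are assumed to carry the `page` field from the Phase 2 schema:

```python
def untracked_edits(contrib_edits, tracker_pages):
    """Return (page, timestamp) pairs seen in Special:Contributions but
    absent from review-tracker.json. contrib_edits: iterable of
    (page, timestamp); tracker_pages: iterable of dicts with a "page" key."""
    tracked = {p["page"] for p in tracker_pages}
    return [(page, ts) for page, ts in contrib_edits if page not in tracked]
```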
Dependencies

- #3 (Organize and index UFOpaedia wiki dump data) supplies the indexed data (`apocalypse-pages.json`, per-page JSONs).
- Not hard-blocked: Phase 1 can begin using `ester_edits.json` directly.