This user is a bot | (talk · contribs)
---|---
Operator | Johand199
Approved? | No
Flagged? | No
Task(s) | Semi-automated vandalism reverter
Edit rate | 1–5 edits per minute
Edit period(s) | Continuous (always checking the recent changes feed)
Automatic or manual? | Semi-automatic
Programming language(s) | Node.js, MongoDB and Redis
Exclusion compliant? | Yes
Source code published? | No
Emergency shutoff-compliant? | Yes
This user account is a bot that uses Node.js, operated by Johand199 (talk). It is used to make repetitive automated or semi-automated edits that would be extremely tedious to do manually, in accordance with the bot policy. This bot does not yet have the approval of the community, or approval has been withdrawn or expired, and therefore shouldn't be making edits that appear to be unassisted except in the operator's or its own user and user talk space. The bot is undergoing testing. Administrators: if this bot is making edits that appear to be unassisted to pages not in the operator's or its own userspace, please block it.
Emergency bot shutoff button
Administrators: Use this button if the bot is malfunctioning. (direct link)
Non-administrators can report a malfunctioning bot to Wikipedia:Administrators' noticeboard/Incidents.
Summary
JohandBot is used to complete semi-automated tasks to revert vandalism on the English Wikipedia, using machine learning and a probability algorithm trained on current vandalism trends. The bot is currently not sentient, but it is training a more advanced version of itself in the background to catch more vandalism.
NOTE: None of the machine learning models are making live decisions yet; they are currently being trained on production traffic.
Detection
The base of JohandBot is built for speed, but there are multiple layers of “models” making independent decisions before an action is completed. The models base their decisions only on the following:
- Edit summary
- User information (edit count, warning count, user roles, IP editor, etc.)
- Article metadata (category, article owner, created date, edit frequency, etc.)
- Content added/removed
- Huggle whitelist
No third-party data is used. After the initial data collection, a feature set is created and passed to the machine learning models.
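As a rough illustration, assembling that feature set might look like the sketch below; the field names and feature choices are assumptions, not the bot's actual schema.

```javascript
// Illustrative only — builds a flat feature object from the signals listed above.
// All field names here are assumptions, not JohandBot's real schema.
function buildFeatureSet(edit, user, article, huggleWhitelist) {
  return {
    summaryLength: (edit.summary || '').length,
    summaryIsBlank: !edit.summary,
    bytesAdded: Math.max(edit.newLength - edit.oldLength, 0),
    bytesRemoved: Math.max(edit.oldLength - edit.newLength, 0),
    userEditCount: user.editCount,
    userWarningCount: user.warningCount,
    userIsIp: user.isIp,
    userIsWhitelisted: huggleWhitelist.has(user.name),
    articleAgeDays: (Date.now() - article.createdAt) / 86400000,
    articleEditsPerDay: article.editFrequency
  };
}
```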
Machine Learning
An initial feature set is created and run against a training set to measure feature importance; unneeded or unnecessary features are discarded. A decision tree learning model is then created and trained on a carefully selected, random and accurate training set built by human reviewers. If a model shows potential, it is put into a “dry run” mode: a verbose-only setting in which its decisions are only logged for the bot operator to review. If a model is successful, it is deployed with a gradual increase in traffic.
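The dry-run and gradual-rollout gating could look roughly like the following sketch; the state names, traffic fraction and revertEdit helper are assumptions, and the actual model training happens offline in Python (see the source code section below).

```javascript
// Minimal sketch of dry-run vs. live gating for a model verdict (names are assumptions).
function handleModelVerdict(model, edit, verdict) {
  if (model.state === 'dry-run') {
    // Verbose-only mode: log what the model would have done, take no action.
    console.log(`[dry-run] ${model.name} would ${verdict.revert ? 'revert' : 'ignore'} rev ${edit.revid}`);
    return;
  }
  // Gradual rollout: only a configurable fraction of edits is acted upon.
  if (verdict.revert && Math.random() < model.trafficShare) {
    revertEdit(edit);
  }
}

// Hypothetical stand-in so the sketch is self-contained; the real bot performs a rollback.
function revertEdit(edit) {
  console.log(`Reverting rev ${edit.revid}`);
}
```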
Multiple models may be active at once, for example one looking for test edits and another for vandalism. However, this has yet to be implemented and analysed to see whether there is a worthwhile return on investment.
Bayesian Classifiers
The Bayesian classifier is a simple probabilistic classifier that looks at word combinations common in vandalism. It does not just look for “bad” words, but also at how non-vandals use words. The classifier is used to make a final adjustment to the score produced by the first models.
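A toy word-scoring sketch in the same spirit is below; the counts object and the equal class priors are assumptions, and the real classifier's training data and smoothing are not public.

```javascript
// Toy naive Bayes scorer over words in the added content.
// counts = { vandal: {word: n}, clean: {word: n}, vandalTotal, cleanTotal } — assumed shape.
function bayesVandalScore(text, counts) {
  let logVandal = 0;
  let logClean = 0;
  for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    // Add-one smoothing (simplified) so unseen words do not zero out the product.
    logVandal += Math.log(((counts.vandal[word] || 0) + 1) / (counts.vandalTotal + 1));
    logClean += Math.log(((counts.clean[word] || 0) + 1) / (counts.cleanTotal + 1));
  }
  // Convert the two log-likelihoods into a probability of vandalism, assuming equal priors.
  return 1 / (1 + Math.exp(logClean - logVandal));
}
```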
Configuration, Threshold and Final decision
The model produces a final score from 0 to 1000, where 1000 is “safe” and 0 is “vandalism”. An adjustment layer and a config determine when to revert or ignore; the adjustment layer outputs a value from 0–100%, where 100% is vandalism. Whenever the adjustment layer or config is changed, the bot is set to dry-run mode and run against a training set to measure the false positive and false negative rates.
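Put together, the final decision step might reduce to something like this sketch; the threshold, the blend weighting and the config keys are assumptions, and only the score ranges come from the description above.

```javascript
// Illustrative final-decision step — threshold and weighting are assumptions.
const config = {
  revertAtOrAbovePercent: 90 // assumed: revert when the vandalism estimate reaches 90%
};

function finalDecision(modelScore, bayesAdjustment) {
  // Map the 0–1000 "safe" score to a 0–100% vandalism estimate...
  const basePercent = (1000 - modelScore) / 10;
  // ...then blend in the Bayesian classifier's 0–1 output (the 70/30 split is an assumption).
  const adjustedPercent = Math.min(100, Math.max(0, 0.7 * basePercent + 0.3 * bayesAdjustment * 100));
  return adjustedPercent >= config.revertAtOrAbovePercent ? 'revert' : 'ignore';
}
```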
Source code and development
The source code is not public for now, due to the current complexity of the code; this is being worked on.
JohandBot is mainly written in Node.js, with Redis as a cache layer and message broker. For long-term storage, there is a MongoDB server running on Ubuntu. Model training is done in Python, using the TensorFlow library.
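For illustration, Redis in its message-broker role might be wired up roughly as below, assuming the redis npm package (v4 API); the channel name is made up.

```javascript
// Sketch of Redis pub/sub between the feed consumer and a classifier worker.
const { createClient } = require('redis');

async function main() {
  const publisher = createClient(); // assumed default localhost connection
  const subscriber = publisher.duplicate();
  await publisher.connect();
  await subscriber.connect();

  // Classifier worker side: receive raw changes and score them.
  await subscriber.subscribe('recent-changes', (message) => {
    const change = JSON.parse(message);
    console.log(`scoring edit to ${change.title}`);
  });

  // Feed consumer side: publish each incoming change for the workers to pick up.
  await publisher.publish('recent-changes', JSON.stringify({ title: 'Example', revid: 1 }));
}

main().catch(console.error);
```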
Outgoing requests are as follows:
- Every 5 minutes, a request is sent to the Huggle servers for an updated list of whitelisted users
- An ongoing stream to stream.wikimedia.org provides the recent changes feed (see the sketch after this list)
- If stream.wikimedia.org goes down, the bot falls back to the recent changes IRC feed
- Miscellaneous requests for revision, article and user metadata from the Wikipedia API
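A minimal sketch of the stream consumer, assuming the eventsource npm package (v2-style default export); the endpoint is Wikimedia's public EventStreams URL for recent changes.

```javascript
// Connects to the Wikimedia recent changes stream and keeps only English Wikipedia edits.
const EventSource = require('eventsource');

const stream = new EventSource('https://stream.wikimedia.org/v2/stream/recentchange');

stream.onmessage = (event) => {
  const change = JSON.parse(event.data);
  if (change.wiki === 'enwiki' && change.type === 'edit') {
    console.log(`${change.user} edited ${change.title} (rev ${change.revision.new})`);
  }
};

stream.onerror = (err) => {
  // In the real bot, this is where the IRC fallback mentioned above would kick in.
  console.error('EventStreams connection error', err);
};
```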
The database stores only a unique decision ID, a date, and the model outcome. Reverted edits are stored for 30 days; non-reverted edits are stored for 24 hours.
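One way to implement those retention windows is a MongoDB TTL index on a per-document expiry date, roughly as below; the collection name, field names and connection string are assumptions.

```javascript
// Sketch of the retention rules using a MongoDB TTL index.
const { MongoClient } = require('mongodb');

async function main() {
  const client = new MongoClient('mongodb://localhost:27017'); // assumed connection string
  await client.connect();
  const decisions = client.db('johandbot').collection('decisions');

  // expireAfterSeconds: 0 makes MongoDB delete each document at its own expireAt time.
  await decisions.createIndex({ expireAt: 1 }, { expireAfterSeconds: 0 });

  // Reverted decisions live for 30 days, non-reverted ones for 24 hours.
  const reverted = true;
  const ttlMs = reverted ? 30 * 24 * 3600 * 1000 : 24 * 3600 * 1000;
  await decisions.insertOne({
    decisionId: 'example-id', // unique decision ID
    date: new Date(),
    outcome: 'revert',        // model outcome
    expireAt: new Date(Date.now() + ttlMs)
  });

  await client.close();
}

main().catch(console.error);
```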
Questions & Answers
WIP
Report false positives
WIP