"ORES functions like a pair of X-ray specs, the toy hyped in novelty shops and the back of comic books," explains a post on the Wikimedia blog. "But these specs actually work to highlight potentially damaging edits for editors. This allows editors to triage them from the torrent of new edits and review them with increased scrutiny
About half a million edits are made on Wikipedia every day - far more than human editors can comfortably keep up with. The new ORES service is designed to take on some of this workload and make Wikipedia a more welcoming place. The aim is to make life easier for human editors, not to replace them.
Spammers and trolls beware
ORES works by analysing quality assessments made by real people: by looking at which kinds of edits get approved and which get reverted, the service can apply some machine learning magic to spot patterns and make assessments of its own.
It's an open web service that everyone can play around with - anyone who's handy with a programming language can query ORES and retrieve its scores themselves. The source code and performance statistics are being made public too, to keep the service as transparent as possible.
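As a rough illustration of what querying that web service might look like, here is a short Python sketch. The URL shape (`/v3/scores/{wiki}/?models=...&revids=...`) and the layout of the JSON response are assumptions based on ORES's public documentation at the time, and the response below is a made-up sample trimmed to the fields the code reads - treat this as a sketch, not the definitive API.

```python
# Sketch of querying the ORES scoring service over HTTP.
# The endpoint shape and response layout are ASSUMPTIONS based on the
# public ORES v3 docs; the sample response below is hypothetical.
from urllib.parse import urlencode

ORES_BASE = "https://ores.wikimedia.org/v3/scores"  # assumed base URL

def build_ores_url(wiki, models, revids):
    """Build a request URL asking ORES to score the given revision IDs."""
    query = urlencode({"models": "|".join(models),
                       "revids": "|".join(str(r) for r in revids)})
    return f"{ORES_BASE}/{wiki}/?{query}"

def damaging_probability(response, wiki, revid):
    """Pull the 'damaging' probability out of a v3-style JSON response."""
    scores = response[wiki]["scores"][str(revid)]
    return scores["damaging"]["score"]["probability"]["true"]

# A hypothetical response, trimmed to just the fields used above:
sample = {
    "enwiki": {"scores": {"123456": {"damaging": {
        "score": {"prediction": False,
                  "probability": {"false": 0.92, "true": 0.08}}}}}}
}

print(build_ores_url("enwiki", ["damaging"], [123456]))
print(damaging_probability(sample, "enwiki", 123456))  # low score: edit looks fine
```

In this sketch an edit with a low "damaging" probability would sail through, while a high score would flag the revision for a human editor's closer look - which is exactly the triage role the Wikimedia post describes.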
So if you suddenly find your creative interpretations of the events of the Second World War, or of the life and times of Wayne Rooney, getting cleaned up faster than ever, you'll know why - ORES is watching.