In 2006, bots were introduced to handle various cleanup tasks automatically. There are currently over 800 registered bots on Wikipedia, and they are generally built to enforce the rules. But if you need 800 different bots, controlled by 800 different people, to enforce the rules, that suggests you have a lot of obscure rules.
For instance, there is a list of domain names that Wikipedia frequently sees used for spam. If an innocent user edits a page and adds a new piece of information referencing a domain on that list, a bot will come along, mark the edit as spam, and revert it, even though it's a good-faith edit.
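To make the mechanics concrete, here is a minimal sketch of that kind of blacklist matching. The domain names, function names, and the exact matching logic are made up for illustration; the real Wikipedia spam blacklist is a community-maintained list of regular expressions, and real bots work through the MediaWiki API. The point the sketch shows is that the check looks only at which domains an edit adds, not at the editor's intent.

```python
import re

# Hypothetical blacklist; the real one is a community-maintained regex list.
SPAM_DOMAINS = {"example-spam.com", "cheap-pills.example"}

def extract_domains(wikitext):
    """Pull the domains out of external links in a revision's wikitext."""
    return set(re.findall(r"https?://([\w.-]+)", wikitext))

def review_edit(old_text, new_text):
    """Return 'revert' if the edit adds a link to a blacklisted domain,
    otherwise 'keep'. The check ignores intent entirely: a good-faith
    citation to a listed domain is reverted just like actual spam."""
    added = extract_domains(new_text) - extract_domains(old_text)
    return "revert" if added & SPAM_DOMAINS else "keep"

if __name__ == "__main__":
    before = "Some article text."
    after = "Some article text. See http://example-spam.com/source for details."
    print(review_edit(before, after))  # -> revert, even for a good-faith edit
```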
This can be frustrating and confusing for new users. The bot's message is often terse and rude, which leaves little incentive to come back to Wikipedia and keep editing. Nine out of ten new edits are rejected, so it's no wonder Wikipedia can't retain new editors.