Today marks the deprecation of the Application Verification Program (AVP), as announced during the big Platform Roadmap announcement back in October. Why is Facebook doing this, and what does it mean for developers and testers? Let’s have a look:
The AVP was announced a year ago as a way for developers to differentiate their apps. Verification was supposed to show that an app met a set of criteria, reassuring users that it provided a good user experience. The theory was that users would flock to these verified apps, Facebook would feel comfortable that these apps provided a consistent user experience, and developers would be rewarded for playing by the rules with things like prominent placement in the app directory and an increase in notification allocations. So why was this program axed less than a year after its announcement? The truth is, it didn’t work as well as planned. When the first wave of verified apps became available in May 2009, bugs prevented some of the promised developer benefits from actually working (e.g. the special green checkmark denoting a verified app did not always show up). Developers also weren’t thrilled with the $375 fee Facebook charged to verify an app, something that really hurt small developers. And finally, did any users really care whether the apps they were using were verified, and moreover, did they even recognize when they were?
And so today, this program is being put to rest. But that doesn’t mean Facebook is giving up on a consistent user experience. They’ve simply taken the criteria from the AVP, strengthened them, and reinvented them as the Developer Principles and Policies. Unlike the AVP, this is not a voluntary program: everyone is expected to live by these rules, and Facebook reserves the right to censure or cut off developers who don’t. We don’t know the circumstances under which Facebook will audit an application, but one thing is for sure – it won’t be easy for them.
Understanding why auditing for a consistent user experience isn’t easy gets to the central issue Facebook is trying to solve. Consistent user experience matters. Those of us old enough to remember that awkward time between command-line DOS and the advent of Windows will recall the many applications that provided quasi-GUI interfaces on top of a DOS base. None of these applications looked or felt like any other, which made learning each one long and tedious. Say what you will about Microsoft, but they did users a great service by providing a framework for consistency with Windows (okay, borrowed from Apple, borrowed from Xerox). Contrast this with my 8-year-old son’s Super Mario games, which he plays expertly on three different platforms with different sets of controls. Not only does he not read a manual, he mocks the idea that a manual even exists. This is the epitome of the consistent user experience that Facebook would like all Platform apps to have.
The trouble is, consistency for Facebook apps is elusive. After all, they’re really just web apps, and developers can do pretty much anything they like (a la the pre-Windows GUI days). To compound problems, an application audited today may change how it works tomorrow. Even if the application code doesn’t explicitly change, any changes to its software stack (e.g. the libraries it uses, the application server, the operating system) may alter the way it works.
Developer Principles and Policies: How to stay on Facebook’s good side
Facebook breaks down proper application behavior into two Principles: Be trustworthy, and Create a great user experience. These principles are expanded into ten policies that Facebook says it will enforce. Upon reading the policies, you’ll find that most of them fall into the Be trustworthy category – or, as I like to think of it, the “Don’t be a jerk” category. Most of these rules are simply an extension of civilized behavior for any application: Don’t misuse people’s information. Don’t SPAM the network. Don’t engage in illegal activity. Respect copyright. … and lots more. Facebook unfortunately has to enumerate all of these things: when it shuts down a developer who engages in one of them, it needs to point to the specific policy that was violated. Be trustworthy is a necessary evil that Facebook must delineate, and then spend lots of time, effort, and money to police. After all, hate speech is highly subjective and cannot be determined through automated processes; it requires real people to make that call.
On the other hand, Create a great user experience is a much more positive set of policies, mainly focused on Application Integration Points and Application Responses to User Actions. Whereas Windows codified consistent behavior like where the “File” menu was (and what the typical submenus should be), these policies outline the types of things a user should expect (or better yet, not expect) when using your application. Reading the list, one can remember various apps that used some of these techniques before they were outlawed, and the user backlash they caused. Some of the policies are:
- You must not prompt users to send invitations, requests, generate notifications, or use other Facebook communication channels immediately after a user allows access or returns to your application.
Invitations or notifications should really only be sent once a user has actually used and liked an application. This is a good policy since it spares your friends the application SPAM that was once common. It also signals to users who receive such invitations that the friends who sent them probably really do like the application, so there may be some merit in trying it out themselves.
- You must only use one Facebook communication channel in response to a user’s single action.
When a user clicks on a choice that generates one action, that user should not be surprised when additional actions are generated as well.
- You must not prompt users to bookmark your application (e.g., by using a modal window or pop-up dialog). Instead, users must explicitly invoke any bookmark option you provide.
Nothing is worse than a modal dialog that forces you to take some action that you don’t necessarily agree with, but you must do because hey, it’s modal. Sure, you can go back and delete it, but this policy puts to rest this kind of nonsense that some applications used to pull.
- You must use discretion when publishing Stream stories and must not misuse the Stream by publishing an excessive amount of stories on a user’s behalf.
Forcing your friends to confront too much information about you and some application isn’t nice. Having an application use some discretion is good.
- Stream stories must be consistent with our design and be user-focused based on the user’s action that triggered the story. In no case should a Stream story serve primarily as a means to promote or advertise your application.
Time was when any sort of notification was simply an advertisement to your friends to come and check out this cool application. Well, maybe things stayed the same, but don’t be so blatant about it.
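The throttling idea behind the “excessive stories” policy above can be sketched in code. The following is a minimal, hypothetical example – the class name, limits, and API are my own invention, not anything Facebook provides – showing a rolling-window counter an app could use to cap the Stream stories it publishes on a user’s behalf.

```python
import time
from collections import defaultdict, deque

# Hypothetical per-user throttle: cap how many Stream stories an app
# publishes on a user's behalf within a rolling time window.
class StoryThrottle:
    def __init__(self, max_stories=3, window_seconds=3600):
        self.max_stories = max_stories
        self.window = window_seconds
        self.history = defaultdict(deque)  # user_id -> publish timestamps

    def allow(self, user_id, now=None):
        """Return True if publishing another story for this user is OK."""
        now = time.time() if now is None else now
        q = self.history[user_id]
        # Drop timestamps that have aged out of the window
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_stories:
            return False
        q.append(now)
        return True
```

An app would call `allow(user_id)` before each publish and silently skip the story when it returns `False` – the point being that “discretion” can be enforced mechanically rather than left to good intentions.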
Going through the entire list, we begin to notice a common theme. Application consistency in the Facebook world is not about where the “File” menu is; it’s about how an application behaves when generating privately and publicly available information. But what it’s REALLY about is ensuring that you don’t look stupid in front of your friends (probably one of the greatest fears out there). It’s about allaying your fears about trying out an application, lest that application begin to SPAM your friends, who will get annoyed at you. Once we can all try out any application without worrying that bad things will happen, we all (especially Facebook) win.
One other interesting aspect of application consistency is that many of the rules under this principle can be tested in an automated way. Does Facebook have a way to do this automatically? It would sure save them a lot of legwork. How about your development organization? Would an automated tool to test compliance with the Developer Principles be of use to you? Leave a comment and let us know!
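As a sketch of what such automated checking might look like, here is a hypothetical audit helper for the “one communication channel per user action” policy. Everything here (the class, method names, and channel labels) is my own illustration, not a real Facebook tool: the app records each outgoing channel call along with the user action that triggered it, and the audit flags any action that fanned out to more than one channel.

```python
from collections import defaultdict

# Hypothetical compliance check for the "one communication channel per
# user action" policy: record which action triggered each outgoing
# channel call, then flag actions that used more than one channel.
class ChannelAudit:
    def __init__(self):
        self.calls = defaultdict(list)  # action_id -> channels used

    def record(self, action_id, channel):
        """Log one outgoing communication, tagged with its triggering action."""
        self.calls[action_id].append(channel)

    def violations(self):
        """Return the action ids that used more than one distinct channel."""
        return [a for a, chans in self.calls.items() if len(set(chans)) > 1]
```

Run during a test pass over your app’s user flows, a report like this would surface exactly the kind of surprise fan-out the policy forbids, without a human auditor clicking through every screen.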