How do governing entities judge the will of the governed? How do they gain their consent?
A culture of participation involves a community that is, to a large extent, self-governing. In principle, everyone participates in the decision-making process, if only by voicing an opinion. In principle, decisions have buy-in from everyone involved, even if some people merely tolerate them.
In such environments, there's a continuous process of consensus-building, involving a large degree of mutual trust. It involves the "consent of the governed."
Consensus-building, when done well, involves nearly everyone and relies on a lot of buy-in. However, consensus-building skills are rare, and the process is frequently a lengthy one, involving many one-on-one or one-to-few conversations. As a result, decisions can take intolerably long, or might not be made at all. Group decisions can be more accurate than individual decisions, à la the "wisdom of crowds," but crowds are also subject to panic, degenerating into mobs.
In contrast, central authority involving a "command and control" system can provide for fast decision-making. This works if the authority is trustworthy and competent, and has good intelligence from the community. Unfortunately, the skills required to ascend to a position of centralized power often aren't the skills needed to make good decisions, and those who climb that ladder frequently have few scruples regarding trustworthy behavior, particularly in hierarchical organizations. Intelligence generated on the front lines rarely reaches the authority intact: people tend to tell the boss what the boss wants to hear, and that boss tells his or her boss the same. The filtered intelligence that reaches the center leads to bad decision-making.
In real life, the answer is normally some hybrid. In a small company, the managers are engaged more or less continuously with their workers and their customers, and they listen to the concerns and suggestions of both groups. Hopefully, there's some balance between managerial authority and participant voice. If the managers do an adequate job, the company survives; otherwise, the customers leave and it fails. As checks and balances go, that's crude, but effective in the long term.
Large companies strive to communicate well and to listen to their worker and customer communities. However, their hierarchies are bigger, and communication is proportionately more difficult. As a result, large companies tend to die, though the process can take decades; observe what's happened to the steel and auto industries.
However, the Net potentially changes conceptions of governance for both private and governmental institutions. Net culture tends to include everyone who wants a voice; it's about inclusion, the biggest tent possible. That means including people with great information and ideas. It also means including people who want attention and are willing to disrupt conversation, normally referred to as "trolls."
That is, existing Net behavior reflects a culture of participation, with realistic governance added as needed; it's all a work in progress. For example, early discussion boards included Usenet newsgroups. The unmoderated newsgroups tended to suffer from the "tragedy of the commons": spammers and trolls so overused them that they were rendered largely useless.
Some newsgroups were moderated, and the concept of moderation plays a major role in online governance today. In more advanced systems, sites are largely self-moderated, but there are always situations where administrators need to act.
Wikipedia is a great case study in the evolution of moderation as governance. It's subject to pranks and outright disinformation attacks, and responds with periodic improvements in moderation. Jimmy Wales rarely makes a decision as "benevolent dictator"; normally the users of Wikipedia control the material.
At craigslist, guidelines and policy are driven by the community. Removal of ads is accomplished via a kind of voting process: "flagging for removal." Administrators will work with people to remove postings from spammers and trolls, but the site is largely self-policing.
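A flagging system like this can be sketched as a simple threshold rule: a posting disappears once enough distinct users flag it. The class names, threshold, and logic below are illustrative assumptions, not craigslist's actual implementation.

```python
# Hypothetical sketch of community flagging: a posting is removed
# once enough independent users flag it. The threshold and data
# structures are assumptions for illustration.

FLAG_THRESHOLD = 5  # assumed number of distinct flags that triggers removal

class Posting:
    def __init__(self, posting_id, text):
        self.posting_id = posting_id
        self.text = text
        self.flaggers = set()  # user ids who flagged; a set prevents double-counting
        self.removed = False

    def flag(self, user_id):
        """Record a flag; remove the posting when the threshold is reached."""
        if self.removed:
            return
        self.flaggers.add(user_id)
        if len(self.flaggers) >= FLAG_THRESHOLD:
            self.removed = True

ad = Posting(1, "suspiciously good offer")
for user in ["u1", "u2", "u2", "u3", "u4"]:  # u2 flags twice; counts once
    ad.flag(user)
print(ad.removed)  # False: only four distinct flaggers so far
ad.flag("u5")
print(ad.removed)  # True: fifth distinct flagger crosses the threshold
```

The point of the set of flaggers is the self-policing property the text describes: no single user, however persistent, can remove a posting alone.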
On any site, participants need to know what the rules are. On the other hand, if the written rules are too complex, or just too long, people will disregard them. Successful sites rely on a set of written guidelines, but also on common sense regarding civil discourse.
It might be observed that a "culture of participation" is, more or less, the same as "democracy." The preceding discussion applies to governance, with one large difference: if the culture of a specific site is objectionable, people abandon it, and the site fails. A government, however, has a monopoly on force within its boundaries. That requires a more specific written constitution with better-defined checks and balances.
The Roman republic thrived for hundreds of years, but fell into ruin when its unwritten rules and checks and balances failed. I feel we dodged that bullet, having survived an Administration that made explicit statements contrary to a culture of participation.
In the last few years, people have been developing tools that might address the challenges of governance at international scale. Millions of people might participate in online discussion, providing not only valuable feedback but also a serious Too Much Information problem. The challenge is to develop software that allows citizens to collectively moderate their own discussion, surfacing the good feedback and ideas and voting down the trolling and spam: a version of the craigslist and Wikipedia models, on a global scale.
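One way to picture that kind of collective moderation is vote-driven ranking: contributions accumulate up- and down-votes, and readers see only what the community has scored above a visibility floor. This is a minimal sketch under assumed scoring rules, not any particular site's algorithm.

```python
# Minimal sketch of vote-based collective moderation: each comment
# carries up/down vote counts, and only comments whose net score
# clears a threshold remain visible. The scoring rule and threshold
# are assumptions for illustration.

def visible_comments(comments, min_score=0):
    """Rank comments by net score, hiding those voted below the threshold."""
    ranked = sorted(comments, key=lambda c: c["ups"] - c["downs"], reverse=True)
    return [c for c in ranked if c["ups"] - c["downs"] >= min_score]

thread = [
    {"text": "useful policy suggestion", "ups": 42, "downs": 3},
    {"text": "obvious spam", "ups": 0, "downs": 57},
    {"text": "trolling", "ups": 2, "downs": 30},
]
for comment in visible_comments(thread):
    print(comment["text"])  # prints only "useful policy suggestion"
```

In a real system the hard problems are the ones the text hints at: preventing vote manipulation, and keeping minority views from being buried along with the spam. A single net-score threshold handles neither; it only illustrates the filtering idea.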