Someone once said you gotta serve somebody. But who do social media apps serve? Users? Advertisers? Investors? Founders & Directors? Employees?

The reality is a complex interplay of confluent and conflicting interests, balanced but perpetually unstable, requiring extraordinary amounts of money from advertising and investment capital to keep the sometimes orthogonal, sometimes opposing forces from pulling these companies apart.

What keeps social media companies growing is ingenuity and money, and what keeps that money coming in is growth. It’s a virtuous (or vicious) cycle.

But when growth slows, either because the app has reached saturation or been overtaken by another, often more addictive app, it becomes more difficult to bring in income from advertising or investment capital. If the social media company is not established enough at this point, this is a death sentence, and the owners will usually attempt a fire sale to a larger competitor who may still be able to harvest the userbase or incorporate the app into their existing suite. If the company is big enough, with a dominant share of the ad market, there are more options on the table: inflating the value of its ads to bring in more revenue, or buying up a smaller but fast-growing competitor, either to run the service alongside its own or to strip the assets and shut it down.

If these and other tricks don’t smooth out the dip in growth until the next wave, and growth is anaemic for too long, the company will fail to achieve the revenue it needs to pay for the ingenuity that bootstraps every growth cycle. Without paying exceptionally large salaries, the company will lose the human capital that its success is really built upon. A large company can coast for years or even decades with mediocre employees, but without growth, and without the prospect of earning huge salaries relative to peers in other industries, even the largest players in social media will grind down and eventually get sold off for scrap. In social media there is no stable long-term play. You grow or you die.

There is a long-running debate over whether social media should be administered and regulated as a public utility, as is effectively the case in China. This is another game entirely. If this were to happen in the US we would see regulatory capture, as we do in many other industries, where incumbents cooperate with regulators in order to suppress competition and innovation. In this scenario growth is controlled by restricting the supply of products to a narrow set of options.

In this latter scenario the social media company has another entity to serve: government. The various bodies and institutions of the state, such as law enforcement, national security and public health, with all of the political baggage this implies, then have more leverage with which to use the social platform to achieve their own interests, whether those be the public interest or whatever interests those institutions happen to be serving. The real complexity here is that those institutions of the state are even more likely to have been captured by corporate interests across many other industries. The platform is then not simply negotiating with an impartial and benign state, but with a range of large incumbents acting via the apparatus of the state, using (and sometimes abusing) its legal authority to ensure that the architectural decisions made by the social technology company serve the interests of other large corporations, whether they be in FMCG, healthcare or the media.

And that is not to mention the ideological forces constantly flowing through government and the wider society.

The web of interests that social applications are required to service in some manner is so complex and diffuse that it brings to mind an ominous grey cloud of acrimony and conspiracy. Is it any wonder that these companies become duplicitous? Founders and directors routinely contradict their own lawyers, and their marketing pitches flatly contradict their business models and architectures.

What is missing from this picture? First of all, you might say, a meaningful central theme or purpose: a set of foundational principles that are public, coherent and morally justifiable, or, to use the Silicon Valley cliché, core values. But more glaring than that is the absence of civil society and the needs of the individual. Is it any wonder that the individual cannot be heard by these platforms? When was the last time you were satisfied by anything like “customer service”? No, your role as an individual in this maelstrom is to have your attention strip-mined in the interests of advertisers on behalf of… something so far outside your awareness that it begins to resemble the cosmic. Cthulhu?

Is Mark Zuckerberg a servant of Cthulhu?

Maybe this picture is too pessimistic. If you have read this far I am going to assume that this story resembles reality, at least for you. Maybe we’re wrong. Maybe we’re overthinking it. But that is getting harder to believe.

Let’s Start Again

At Tuvens we’ve taken a green-field approach to social media. We’re not playing the growth game for growth’s sake. We’re not dancing to the tune of venture capitalists and angel investors. We’re thinking about what social media and social networks should be from first principles, to create the impossible startup.

Everything is on the table, and anything is possible. We haven’t even decided upon our formal structure, whether we will form as a standard for-profit company or a not-for-profit foundation or something in between. Maybe we’ll be completely radical and create a DAO, governed by an algocracy.

This is our starting position: Social media serves relationships on behalf of communities. That big dark grey cloud that resembles the coming of Cthulhu is the outcome of a failure to recognise that communities are the entities that social technologies serve, and by serving them badly these platforms have made them angry. The essential definition of social technology, as opposed to a tool like a hammer or a calculator app, is that its primary function is to serve the relationships between individuals and organisations on behalf of the communities that they share.

And what is the best way to serve a community? By creating a game. A good game serves the relationships between players, building bonds and creating stable communities out of independent individuals and groups. Games channel the purposes of groups and individuals into a set of acceptable constraints (the rules or the environment) that help us to resolve conflicts and come together, without taking away the autonomy of individuals.

Social media apps are social games: the algorithms are the rules, and reach is, in the abstract, the goal. Users, businesses and other organisations play in order to come into confluence or alignment with each other by resolving conflicts and contradictions.

In a separate post we have created a glossary of terms that we hope will clarify how we understand the process of social game design and our role as stewards of Tuvens. We are not claiming our definitions are objective, just that this is what we mean when we use words like ‘governance’ and ‘game’ and ‘purpose’.

Our role as stewards of Tuvens is to foster community through good governance. We do this by creating and maintaining the interface, architecture and algorithms within the logical constraints of the meta rules – difficulty, verifiability, universality and independence. You can consider these to be our core values. This means that our purpose is to facilitate the creation of digital social spaces that amplify the people who contribute the most and exemplify the purposes of the space, in other words its values, without introducing our own biases and interests into the game.

Communities are nested inside each other, and often come into conflict. We hope to maximise the chances of resolving those conflicts through the generation of shared norms and values, using various techniques: incentivising real-world meetings between members, and amplifying bridge builders – individuals who have reputation in orthogonal spaces (spaces that have very few members in common) within the broader spaces that those communities share, such as geographic spaces or broader “realms” (e.g. dance or music).
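
To make “bridge builder” a little more concrete, here is a minimal sketch of one way such people could be identified, assuming communities are represented simply as sets of member IDs and reputation as a per-community score. The Jaccard overlap measure, the 0.2 threshold, the function names and the example communities are illustrative assumptions on our part, not a description of our production algorithms.

```python
# Illustrative sketch only: find "bridge builders", members who hold
# reputation in two communities that are nearly orthogonal (share very
# few members). Names, thresholds and data structures are hypothetical.

def overlap(a: set, b: set) -> float:
    """Jaccard overlap between two communities' member sets."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def bridge_builders(communities: dict, reputation: dict, max_overlap: float = 0.2):
    """Yield (member, space_x, space_y) where the member has reputation in
    two communities whose memberships barely overlap."""
    names = list(communities)
    for i, x in enumerate(names):
        for y in names[i + 1:]:
            if overlap(communities[x], communities[y]) > max_overlap:
                continue  # the two spaces are not orthogonal enough
            for member in communities[x] & communities[y]:
                if reputation.get((member, x), 0) > 0 and reputation.get((member, y), 0) > 0:
                    yield member, x, y

# Toy example: 'dana' has reputation in two spaces that share almost no members.
communities = {
    "salsa_dublin": {"dana", "ana", "bob", "carl"},
    "techno_berlin": {"dana", "eve", "finn", "gita"},
}
reputation = {("dana", "salsa_dublin"): 12, ("dana", "techno_berlin"): 7}
print(list(bridge_builders(communities, reputation)))
# [('dana', 'salsa_dublin', 'techno_berlin')]
```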

If this works…

Without exaggeration, we see this as a paradigm shift in social media. If it succeeds, we hope to see competition from other social tech startups, and even other kinds of applications, like marketplaces, dating apps and sharing economy startups, moving away from the lip service currently paid to community and the over-emphasis on behavioural psychology and opaque machine learning algorithms, towards robust social game design.

A goal is an intent to generate an imagined reality.

To be an agent is to have purposes.

A pursuit is a recurring goal or set of goals.

A purpose is a series of pursuits and goals.

A space is an environment dedicated to a particular pursuit, such as a football pitch or a web forum.

An environment is a set of constraints and enablers that a pursuit interacts with.

A good pursuit is one whose goal, or goals, correlate with the agent’s purpose.

  • When goals and pursuits correlate with their purpose, subsequent goals become easier to achieve, or are achieved directly.
  • Goals can have multiple purposes.

The meaning of a pursuit is the description of the relationship between its goals and its purpose, in other words why they correlate. So the meaning of a pursuit is the reason an agent engages in it.

A pursuit that becomes trivial is called a fixed action pattern, and no longer requires an agent.

  • Fixed action pattern is a term from ethology describing instinctive, stereotyped responses to stimuli in animals.
  • Optimisation for a pursuit is the process of converting focused intent into fixed action patterns, achieving the same goals more often, and more efficiently. For example, a tennis player may hone a particular type of swing until it is almost automatic, converting conscious intent into an automatic response, both through practice and visualisation.
  • Optimisation increases agency for the purpose by reducing agency for the parts.

A competition is a pursuit in which the success of agents is determined relative to each other.

A goal of one agent may contradict the goal of another. Within a pursuit, if the goal of one agent contradicts that of another, then the total energy input by the players can equal or exceed the total value output. We call this zero-sum or negative-sum. If one agent concedes before that point, the pursuit remains positive-sum, although this supports the other agent at the conceding agent’s expense.
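
As a shorthand for this usage (our own framing; note that it is looser than the standard game-theoretic definition of zero-sum, which concerns payoffs rather than energy and value):

```latex
% E_i: energy put in by player i;  V: total value output by the pursuit.
\[
  \Delta \;=\; V \;-\; \sum_{i=1}^{n} E_i,
  \qquad
  \begin{cases}
    \Delta > 0 & \text{positive-sum}\\
    \Delta = 0 & \text{zero-sum}\\
    \Delta < 0 & \text{negative-sum}
  \end{cases}
\]
```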

A goal of one agent may serve the purposes of another. Within a pursuit, if the goal of one agent serves the purposes of another, this will always be positive-sum.

The goals of multiple agents may be independent, in conflict, in alignment, or in confluence. The purposes of multiple agents can be, at different times, independent, in conflict, in alignment or in confluence.

  • Independence is where there is no intersection of goals or purposes, and therefore the outputs cannot be summed.
  • A conflict is always negative-sum: agents inhibit each other’s goals in order to advance their own purposes.
  • Alignment is the state of goals intersecting without either conflict or shared interest. A pursuit may produce an output of value that exceeds the agent’s own needs, a surplus, which can be used in the pursuits of another agent (a gift), or repurposed for another goal (usually traded).
  • Confluence is the state of multiple agents fulfilling their purposes while also serving each other, and is therefore always positive-sum. Confluence does not preclude competition, but it suppresses conflict.

A game is a pursuit engaged in by agents without regard to the correlation between the goal and its purpose, that is, a goal pursued for its own sake. In other words, to ‘game the system’ is to pursue the goal at the expense of the sustainability or robustness of the system.

An agent in a game is a player.

A good game is one which remains robust regardless of the strategies deployed by players (cf. Nash equilibria; see the sketch below).

  • Since an agent can have multiple purposes, but can only attend to one goal at a time, a player may go in and out of a game repeatedly, by redirecting their attention.
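
As an aside, not part of the glossary: one formal reading of “robust regardless of the strategies deployed” is the Nash equilibrium, a strategy profile where no player can improve their payoff by changing strategy alone. Below is a minimal sketch with a made-up two-player coordination game; the helper name and the example payoffs are ours, purely illustrative.

```python
# Illustrative only: brute-force check for pure-strategy Nash equilibria
# in a two-player game. payoffs[(row, col)] = (row player's payoff,
# column player's payoff).
from itertools import product

def pure_nash_equilibria(payoffs):
    rows = {r for r, _ in payoffs}
    cols = {c for _, c in payoffs}
    equilibria = []
    for r, c in product(rows, cols):
        # (r, c) is an equilibrium if neither player gains by deviating alone.
        row_ok = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in rows)
        col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in cols)
        if row_ok and col_ok:
            equilibria.append((r, c))
    return equilibria

# A made-up coordination game: both players do best when they match.
game = {
    ("A", "A"): (2, 2), ("A", "B"): (0, 0),
    ("B", "A"): (0, 0), ("B", "B"): (1, 1),
}
print(pure_nash_equilibria(game))  # e.g. [('A', 'A'), ('B', 'B')] (order may vary)
```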

If a goal serves the purposes of an agent outside the game we say that the agent is interested in the game.

  • Games can be used by agents to select between players when success within the game correlates with success in that player’s purposes.

Selection by interested agents may also serve the purposes of players.

A player that aims to be selected by an external agent is in service of that agent. Thus we say that a player’s purposes can be in service of a set of selecting agents.

There may be multiple selecting agents, in which case a game emerges between selectors, to win the service of players. This two-way selection process changes both categories of agents over time as they optimise for the objective function of the other, and is called dimorphic selection.

  • Dimorphic selection borrows its name from sexual dimorphism in biology: the morphological differences between males and females that result from different selection pressures acting on each sex. Humans are less dimorphic than other species, like gorillas or black widow spiders. But dimorphic selection applies to any two-way selection process that changes each corresponding category, such as consumers and products, employers and employees, or predators and prey. An arms race is a category of dimorphic selection, where the relationship between agents is characterised by conflict rather than confluence.

The games used on both sides to be selected are called proxy games, because they correlate with and represent the agents’ capacity to engage in further games that serve their purposes.

For a proxy game to be interesting to selecting agents it must be difficult and verifiable.

  • If the goal of a game is not difficult it is trivial, which makes it more costly for selecting agents to distinguish between the success of players, and so decreases the game’s efficiency.
    • A trivial game may be a necessary proxy game for a less trivial and more meaningful game, and thus becomes a fixed action pattern. Agency is not required to implement a fixed action pattern, and thus the game loses its agent.
  • If the goal of a game is not verifiable the meaning is unknown, and thus the game loses its purpose.

Furthermore, for a game to be a useful selection proxy it must be universal and independent.

  • If the game is not universal it excludes potential players for reasons unrelated to the goals of the game and the purposes of the selectors, and so players and selectors will choose a different, more universal game if one is available.
  • If the game is not independent, success is partially or wholly determined or influenced by a third-party agent, who can have their own implicit purposes which may or may not be in confluence or alignment with the purposes of players and selectors.
    • Another way of describing independence is the orthogonality of the verification mechanism. In Bitcoin, the nodes that verify the result of the proof-of-work algorithm are a set of independent third parties which may have their own interests, but it is their orthogonality that makes the system work, because they cannot coordinate.

The extent to which games fulfill these constraints – difficulty, verifiability, universality and independence – is the extent to which they are stable and retain their meaning, in other words the extent to which they are robust against entropy.

A crisis of meaning happens when the games used by agents to fulfill their purposes lose their meaning.

We call these constraints meta rules because they generalise the explicit constraints of the game itself: both players and game designers contribute to preventing a crisis of meaning, and to reducing disorder and chaos, to the extent that they act within these constraints.

When a player uses a game for the purpose of being selected, we call this a selection signal.

A player may imitate success in order to be selected, which we call cheating. These false signals we call noise.

The robustness of a game system is measured by the signal-to-noise ratio output by its composite proxy games.

Signal is a form of information.

Entropy is a measure of the amount of information required to describe the state of a system at any time ‘t’.

Since entropy is a measure of the information in a system, the amount of noise directly correlates with the rate of entropy, or the tendency towards disorder. In other words, as noise increases, so does the cost of describing the state of the game.
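
If a more precise anchor helps, the standard information-theoretic reading is Shannon entropy; mapping our informal terms onto it is our own gloss, not a claim that the glossary reduces to information theory:

```latex
% Shannon entropy: the expected amount of information (in bits) needed to
% describe the state X of the game, with outcomes x occurring with probability p(x).
\[
  H(X) = -\sum_{x} p(x)\,\log_2 p(x)
\]
% The robustness measure above, read as a signal-to-noise ratio over the
% output of the composite proxy games (our shorthand):
\[
  \mathrm{SNR} = \frac{\text{signal}}{\text{noise}}
\]
```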

Therefore an agent does not contribute to the entropy of a game system when they work hard and sacrifice (difficulty), don’t lie or cheat (verifiability), do not exclude anyone for reasons unrelated to the goals of the game (universality), and take responsibility (independence).

A game designer generates negentropy when they set the rules and constraints of the game system that produce more signal and less noise; in other words, rules that make the goals difficult, verifiable, universal and independent.

Rules are descriptions of constraints that prevent entropic strategies from being employed in a game, and which authorise some agents to prevent the use of those strategies by force, called a sanction.

Good rules are a set of explicit or understood regulations or principles constraining the conduct or procedure within a game which are in accordance with the meta rules.

Bad rules are a set of explicit or understood regulations or principles constraining the conduct or procedure within a game which are in discordance with the meta rules.

Governance is the process of setting the rules by which a set of games are played.

Good governance is the process of setting the rules by which a set of games are played such that the game retains its order, meaning it is meta stable.

Bad governance is the process of setting the rules by which a set of games are played such that the game tends towards disorder, or meta chaos.

Games mediate the relationships between agents, allowing them to achieve goals more efficiently, without having to attend to the goals and broader purposes simultaneously.

Where goals are in conflict, but purposes are in alignment, a mechanism emerges to resolve the conflict such that both parties continue in their purpose as efficiently as possible. This is called a rivalrous game, and it determines whose goal takes priority. For example, a queue in a supermarket, a sport, exams, or democracy.

Like any other game, a rivalrous game can be good or bad. Rivalry itself is not the cause of self-termination. Rivalry is a mechanism for continuing a game even when players are in conflict, and the value of that mechanism is determined by its correspondence to the meta rules.

Society is the aggregate of all of the selection games used to bring players into confluence.

A good society is meta stable, that is, it has a low rate of entropy.

A bad society is meta chaotic, that is, it has a high rate of entropy.

Community is the state of players being in confluence.

Good governance is a prerequisite of a good society, and a good society is a prerequisite for growing communities, therefore good governance is a prerequisite for growing communities.