A simple hypothesis about society, and why there are so many problems nowadays

I’ll just come right out with my hypothesis: As the number of people involved in a system increases, the odds of that system failing increase.

Why, though?

Here’s what I’ve seen that leads me to that hypothesis.

The first thing to look at is bureaucracy. Bureaucracy is what you get when small, competing systems form within a larger system and get in each other’s way, when they should be working together towards a common goal. Bureaucratic systems evolve out of specialization, which is usually a good thing: one person can’t control everything in a large system, so other people have to specialize in pieces of it. The problem is that once you have enough people, you’re split into separate teams that don’t work closely together. At that point, they’re actively competing against each other, and they have to cover their asses against one another. The spirit of working to further a cause goes away, and self-preservation becomes the name of the game.
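To make that coordination-overhead point concrete, here’s a minimal sketch (my own toy illustration, not a formal model from anywhere): the number of pairwise communication channels between n people grows as n(n−1)/2, so even modest growth in head count explodes the amount of coordination required, and that’s exactly the gap that separate, self-protecting teams grow into. The team sizes below are arbitrary examples.

```python
# Toy illustration: pairwise communication channels grow quadratically with head count.
# The n*(n-1)/2 formula is standard; the group sizes below are arbitrary examples.

def channels(n: int) -> int:
    """Number of distinct person-to-person communication channels in a group of n."""
    return n * (n - 1) // 2

for n in (5, 15, 50, 150, 1000):
    print(f"{n:>5} people -> {channels(n):>7} channels")

# Output:
#     5 people ->      10 channels
#    15 people ->     105 channels
#    50 people ->    1225 channels
#   150 people ->   11175 channels
#  1000 people ->  499500 channels
```

Past some size nobody can hold all of those channels in their head, so the system splits into teams, and the inter-team channels are the first ones to atrophy.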

This happens in companies, non-profit organizations, and governments.

Now, let’s use government as an example.

Governments work best when their citizens take an active role in how they’re run. However, with a large government, several things happen. First, bureaucracy insulates those in power from the people they serve: don’t want to do something? Hide behind the bureaucracy. Your citizens want something? You might never even hear about it because of the bureaucracy. Second, in a large government, those in power are often of a drastically different status than their constituents (this holds at the state level, and sometimes at the county or even local level). Finally, the value of any individual citizen’s input falls as the population grows, so people feel less personally invested in their government.
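As a back-of-the-envelope illustration of that last point (the population figures are round, hypothetical placeholders, not census data): a single vote’s share of the electorate shrinks roughly as 1/N, so the same person who is 1 in 5,000 at the town level is 1 in 200,000,000 nationally.

```python
# Rough illustration of how an individual's voting weight dilutes with population.
# Population figures are round, hypothetical placeholders, not real data.

electorates = {
    "town":   5_000,
    "county": 100_000,
    "state":  5_000_000,
    "nation": 200_000_000,
}

for level, n in electorates.items():
    print(f"{level:>7}: one vote is {1 / n:.2e} of the electorate")
```

The exact numbers don’t matter; what matters is the scaling, and it tracks the drop in personal investment described above.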

The governments that truly serve their citizens tend to be the smallest ones. (Keep in mind that this isn’t to say that all small governments are good; many are quite corrupt. And the same goes for companies, for that matter. My hypothesis is only that a larger system is more likely to fail, where failure means failing to meet the system’s goals, either outright or relative to a smaller system.) Likewise, the companies that focus on quality products and customer service tend to be smaller.

Now, here’s a huge problem. Let’s say you break governments up into a bunch of self-sufficient micronations. That solves the citizen participation issue and much of the bureaucracy… but you get two big problems.

First, specialization isn’t a bad thing at all, and it scales with head count; shrink the system and you lose much of it. Technology goes backwards if everyone has to farm to keep themselves fed.

That leads, indirectly, into another problem. So far, this post has been ignoring a HUGE system that extends beyond governments and businesses altogether: humanity itself. And the internet links all of humanity even more tightly into one system, which naturally breeds bureaucracy among its participants.

And, of course, you’ve got enough microgovernments at that point that their interactions themselves will cause bureaucracies and corruption to form… meaning that microgovernments don’t even solve the problem.

What’s the solution?

There is no “good” solution, but I suspect the biggest cure is to reduce the number of people in the system. There are plenty of other reasons to do that: the environmental impact of humans is substantial, our resource consumption is enormous, and attempts to curb that consumption have failed. Reducing the population gets around that. Of course, this runs straight into the whole “no ‘good’ solution” thing: people have a natural desire to breed, and taking that away raises some nasty human rights issues. Directly taking people’s lives raises equally nasty issues, needless to say.

Ultimately, I think the system will self-regulate… but self-regulating systems like this don’t correct themselves until there’s no alternative left, rather than fixing problems before they become crises; see the oil industry for a perfect example. Self-regulation, in this case, will be an extremely messy process that I suspect will take the form of World War III. It will be a sign that the system has failed, but also that it is correcting itself back to more appropriate levels.

I guess I’m not even sure where I’m going with this, but I do think we should study how a system’s size affects its odds of failure, find an optimum size that preserves our technological advancement while still letting each participant be personally involved, and then move our systems to that size, in some way, voluntarily, before it comes back to bite us in the ass. And then, hopefully, these optimally sized systems keep functioning properly, without corruption; or, if they do get corrupted, they get fixed quickly, with minimal pain.
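If you did want to observe size versus odds of failure, a starting point might be a toy calculation like the one below. It assumes, purely for illustration, that each participant independently has some small chance per period of causing a failure the system can’t absorb, which gives the familiar 1 − (1 − p)^n curve. Real systems are nothing like this simple, and the value of p is made up, so treat this as a sketch of the kind of measurement I mean, not a model of society.

```python
# Toy model: probability that a system of n people fails within one period,
# assuming each person independently causes an unabsorbable failure with
# probability p. Both p and the sizes are arbitrary, illustrative assumptions.

def failure_odds(n: int, p: float = 1e-4) -> float:
    """P(at least one critical failure) = 1 - (1 - p)^n under independence."""
    return 1.0 - (1.0 - p) ** n

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>6}: odds of failure ~ {failure_odds(n):.1%}")

# Under these made-up assumptions the odds climb from well under 1% for a
# small group to near-certainty for a very large one, which is the shape of
# the relationship the hypothesis predicts.
```

Finding the “optimum size” would then mean plotting a curve like this against whatever you gain from scale (specialization, technology) and looking for the crossover.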

