The dangers of big data have by now become apparent to all. Objectification of humans can now take place on a scale unseen in human history. The only organisations that can afford the costs of collecting and processing big data about our lives are governments and large corporations. This means your existence has a kind of shadow life within the databanks of aggressive states and those of ad-sellers such as google and facebook. Those reading this in the UK will find themselves, in particular, in the databanks of the UK and US governments, two of the most historically violent military powers in the world. As for the ad-men at google and facebook, it hardly need be said that their motivations are not to improve the lot of humanity. They sell ads. In the process of doing so they have to provide us with a ‘service’ that does something we want, or at least something like what we want.

These are organisations incapable of love or care. Nobody loves your data self. Yet the world will increasingly be shaped for the borderless nation of billions existing within databanks. All the money in the world is being thrown not at how to love you better, but at how to deliver you a product more quickly, by self-driving car, by drone, by satellite broadband. Life will get increasingly easy for those who can afford it, but it will not get more loving.

What, for those of us fighting for power over our lives and for a more caring world, is the right reaction to this grim fact? Should we all log off the internet and throw away our phones? A few might do it, just as a few opt out of television. It will make no difference.

Should we nationalise facebook and google, turn them into instruments for social good? But who should nationalise, who should be in control? Who should decide what social goods are?

Perhaps instead we could make them into co-ops owned by their users, run democratically. Yes, perhaps, I think that would be an improvement. And yet your data self can still only be processed en masse. The problem isn’t just who is in charge of your data; it’s that the nature of big data means it can only treat you as the data you input. Over time personalisation will get better, impressive even. And yet it still won’t be dealing with you, a body that needs eye contact and touch and the sight of trees to feel whole.

I see no solution. I am just trying to state the problem in one particular way. We probably shouldn’t create worlds run by algorithms. They will never comfort us when we wake up crying in the middle of the night.