Human-centered AI, ostensibly the form of artificial intelligence every major player in the field is busily building, seems increasingly oxymoronic to me. It’s not that I doubt the sincerity of any of the individuals most closely involved with these efforts; rather, my perspective comes from an acknowledgement of the system they’re all operating within, a system we’re all operating within.
I care a lot about the climate crisis, and I also understand that the way I live my life emits a huge amount of carbon relative to the global per capita average. Here in America we drive larger vehicles for longer distances because that’s how things are set up. Because of who I am, what I do, and the people in my life, I tend to fly on airplanes a few times each year. The systems we are part of have enormous influence over our options.
The most powerful and wide-reaching AI systems in the world are all being developed within systems that are not human-centric. At root, the paramount organizing principle in these environments is to win market share, to make money. Even if the individuals working on this or that piece of the endeavor aren’t in it for the money, when they collaborate, the rules of the road and the roads themselves all lead toward profit.
We all saw what happened late last year when some of OpenAI’s leadership team tried to use the nonprofit oversight mechanism the company’s founders put in place to try to ensure human centricity (see my earlier write-up). The most powerful avatars of capital immediately rose up to ensure no course corrections would be made.
None of this is to scold or even criticize this behavior. Everyone is playing a small role within a powerful system that absorbs and diffuses individual agency and is our inheritance from generations of human economic evolution. I am just saying, it is what it is.
So let’s assume, for the sake of argument, that we will continue to see increasingly influential and capable forms of wealth-centered AI emerge onto the scene. What should we do about that? I would argue that we have relatively more power and purchase if we focus on the work of creating a more human-centered society, one with the size and durability to contain and harness the super technologies we’re building to create economic value.
Policy is powerful, and we have, in principle and often in practice, ways for every citizen to participate in shaping it. In a presidential election year I am constantly reminded of how much power is contained in a single vote, especially in one of the key counties in a decisive swing state. Of the eight billion people on earth, the task of steering the planet’s most influential nation comes down to just a few tens of thousands of people in Michigan or Arizona, each of whom can be thought of as making a choice for 80,000 other humans on earth who will be affected by the outcomes in November.
Let’s say it takes an hour of your time to cast a vote in one of these places. Minute for minute, I struggle to think of a way in which most of us can have more influence on the course of history and world events. It never feels that way in the moment, but this is how, ever so slowly, we shape the systems that determine so much in our lives.
My point is that we should try to look upstream as much as possible. As a kid growing up in Minnesota, I remember going to the source of the great Mississippi River at Lake Itasca. Our best chance of harnessing AI is to do so well before it ever becomes AI, way up at the headwaters of policy. Quite literally, controlling water rights is a means of influencing the progression of AI because of how much water is needed to cool massive data centers.
It’s tempting to gawk at the mouth of the river, to be mesmerized by the dramatic culmination of water’s journey. Likewise, we track the stratospheric rise of tech titans’ stock prices and the Forbes billionaire rankings. I have a fuzzy memory of secretly peeing in Lake Itasca and delighting in the idea that my contribution to the great flow of the river would be carried thousands of miles to the Gulf of Mexico. If you have the opportunity to vote this year and you’re concerned about the effect of powerful technology on our society, channel your inner Calvin, cast your ballot, and know that you’re having an impact somewhere downstream.
Thanks for reading this edition of The Process, a free weekly newsletter and companion podcast discussing the future of technology and philanthropic work. Please subscribe for free, share, and comment.
Philip Deng is the CEO of Grantable, a company building AI-powered grant writing software to empower mission-driven organizations to access grant funding they deserve.