“I would create a set of safety standards…We can give your office a long other list on the things that we think are important there” - Sam Altman, May 2023, US Senate hearing on AI
“The details really matter. We will try to comply, but if we can’t comply we will cease operating.” - Sam Altman, May 2023, in response to the EU’s AI Act
We talk a lot about technical communication at Codesmith - it’s the ability to get your mental model of a complex thing into someone else's head (adjusted to their understanding)
When you look back at ‘the Internet is a series of tubes’, it seems reasonable to bring in experts to explain such technically complex stuff (the internet is not a big truck)
To be fair, it’s actually a great analogy (as everyone seems to be realizing), but Sen. Ted Stevens is definitely not the web protocol expert you’re looking for
So experts like sama are only too happy to help explain the next tech revolution - GPTs, LLMs (AI) - but this is simply not going to work.
As always, new technology creates outsized opportunities for those who invested time and money early on. But whether through malice or a natural desire for a return on their hard work, they use the complexity of that tech to lock in their power - by giving expert guidance that benefits them and obscures its consequences
Finance leaders in the run-up to the 2008 financial crisis leaned into the obscurity of the financial tools they developed (derivatives, securities) to convince policymakers that they should regulate themselves - as the domain experts on such complex technology
This only heightened the systemic risk. The potential self-correcting tools of democracy (new ideas, new people, new competitors) could not prevent the 2008 systemic collapse. Worse, it ultimately left the financial firms protected while millions lost their homes.
The 2008 crisis was a direct result of this disconnect between those explaining the new technology (the few designing and regulating mortgage-backed securities and credit default swaps) and those affected by it (the many 5% down-payment mortgage holders).
This was the focus of my research under Walter Mattli at Oxford University when I traveled to Switzerland to interview private banking execs on their role in international financial regulation (and was met by the execs + their lawyers)
What followed - the rise of Occupy Wall Street, a societal resentment of elites and ‘experts’ - sowed the seeds of 2020s populism.
Powerful explanations and advocacy also emerged to help build popular support for the Dodd-Frank Act - not least The Big Short, my brilliant sister’s (a legal advocate) favorite film. These became mainstream tools for understanding the financial complexity (even the Brookings Institution hosted a review - link)
There are lots of parallels between the financial crisis and the tech industry today. Both feature new, abstract technology at their core - ‘the algorithm,’ ‘data mining,’ GPTs. And both have obfuscating insiders, à la Sam Altman et al., claiming ‘we’re the only ones who can guide on regulation.’
The potential damage that may come from this is at least as great as that of the financial crisis. These technologies are tightly tied to people’s identities - their social lives, their jobs, their health - with only more to come as everything becomes software and AI.
—
Concentrated power is often, to use Nassim Taleb’s term, fragile. A small group of people holding outsize influence for an extended period of time is a recipe for all sorts of issues: lack of new ideas, institutional capture, and societal resentment. It’s ultimately a moral failing to serve the few over the many if you believe everyone has a right to reasonably influence their own destiny
As technology becomes ever more abstract, that influence is harder to trace; where the value and power lie is harder to understand.
Preventing that concentration requires this understanding. And who understands it? Those who’ve invested the time and resources - and they want a return on that investment.
New tech needs new explainers. In early 20th century Russia (and across Europe) reading groups formed in every workplace to explain industrial capitalism. It was easier to recognize where the power lay - you could see the train wheels being forged on the factory floor and the owner's office elevated above.
I’m not advocating for the rise of tech reading groups (although maybe?) but a wider and deeper understanding of tech by the many vs the few.
Some of this is achieved by the Selena Gomezes and Anthony Bourdains explaining the AI equivalents of collateralized debt obligations
But it’s also the rise into these positions of leadership of people who mirror society, are not life-long insiders, and remember ‘not knowing’ tech. Contrast that with this early-2010s Quora conversation between Stripe’s CEO Patrick Collison and OpenAI’s CEO Sam Altman
What I hope is that Codesmith does its small part to put people into tech sectors old and new who have lived experience of being on the other side, remember not being inside the system, and therefore have an intrinsic empathy for users and a desire to explain how it’s working to everyone. In this way, Codesmith is one incremental effort at self-correction.