Yesterday I was on a support call with a client. We offer various web services that clients use to access their data, which we keep in a database. It’s common for clients to integrate views of the live data into their own web sites, and we support that with code, examples, widgets, documentation, APIs and consultation. Part of my job is assisting clients with these web integrations.
The call in question began routinely. I checked the data in the database and looked at the web page the client said had errors. The widget was indeed displaying a JSON syntax error message instead of the intended live data snippet. I looked at the page with the browser’s inspection tool and the markup was unfamiliar, so I mentioned that to the client and asked where he had got the widget code from. He told me that he asked someone-or-mumble-murmur to write it for him. I blamed my cell phone’s crummy speakerphone for the garble and asked him to repeat that. He said he had asked an AI to produce the code for the widget.
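For the curious, here is what that kind of failure usually looks like. This is not the client’s actual code; it’s a minimal TypeScript sketch, with a made-up endpoint URL and element id, of one common way a pasted widget ends up showing a JSON syntax error: the request comes back with an HTML error page instead of JSON, and the page tries to parse it anyway.

```typescript
// Minimal sketch, not the client's code. The endpoint URL and element id
// are hypothetical; the point is the failure mode, not the specifics.
async function loadSnippet(containerId: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  // If this request 404s or gets redirected to a login or error page,
  // the response body is HTML rather than JSON.
  const response = await fetch("https://api.example.com/v1/live-snippet");
  const body = await response.text();

  try {
    // JSON.parse throws a SyntaxError on HTML input ("Unexpected token '<'"),
    // and that message is what ends up on the page instead of the live data.
    const data = JSON.parse(body);
    container.textContent = String(data.title ?? "");
  } catch (err) {
    container.textContent = `Error loading data: ${err}`;
  }
}
```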
I hope I managed to conceal my feelings at that point, but I felt like Prof. Farnsworth when he said, “I don’t want to live on this planet anymore.”
Below is a nine-second YouTube clip from the 2010 Futurama episode “A Clockwork Origin”, in which Farnsworth is honored at the Museum of Natural History for discovering Homo farnsworth, the missing link that Dr. Banjo says disproves evolution.
It was one of those moments when one little thing, a short sentence in this case, lets me see the future. Just those words over the phone, and something I already knew in all its constituent parts came together into a fully materialized vision of what is to come. The enshittification of everything (Cory Doctorow, Feb 2024) by our rent-seeking tech overlords will include a new form of humiliation: having to fix crummy code someone else made with an AI.
A clear enough pattern has emerged in the use of AI to “automate” certain kinds of specialized text generation. The AI vendor demonstrates to executives that documents with all the appearances of their specialism can be produced by anyone who knows how to specify what they need. Out comes the text, the executives’ eyes widen, and big smiles appear as they imagine how they are going to make their staff’s lives even worse with these tools.
In practice, skilled people experienced in producing specialized texts such as legal documents, engineering specifications, financial reports, clinical notes and so on can do the boilerplate work very quickly; the bulk of their time goes into gathering and organizing the source information and ensuring correctness. The consequences of errors vary, but they can be serious.
It’s clear enough to anyone who stops to think about it that AI in such authoring isn’t going to replace the skilled and experienced writers. Company execs are all about risk and blame. An AI vendor won’t indemnify a law firm against losses resulting from errors in documents the AI produces, so the partners will still need the services of their skilled, experienced document wranglers.
Assuming that company execs are not too dim to understand this, why are they adopting AI at all? One theory I like: it is yet another way to discipline salaried employees (Yves Smith, Jan 2025). Another is that it’s a kind of FOMO: the insecurity you feel when everyone else is doing a fashionable new thing that you kinda know is BS but aren’t 100% sure about. (Is that a version of Pascal’s wager?)
Computer code is a specialized kind of text, and there are analogies between its production and that of the natural-language texts already mentioned. Much of it is boilerplate, but it needs to be the right boilerplate and it must be suitably tailored. It must include everything that’s required and nothing that isn’t. And it needs to be correct, with no possibility of ambiguous interpretation: the computer will interpret it exactly one way, and that one way needs to correspond perfectly with the intent of the system in all circumstances.
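As a contrived illustration (mine, not anything from the call or from any client’s code): here are two near-identical pieces of TypeScript boilerplate that a human reader might skim as equivalent, but which the machine interprets very differently.

```typescript
// A contrived sketch: both functions compile and look like plausible boilerplate,
// but only the second does what the surrounding system presumably intends.
async function saveDraftFireAndForget(url: string, draft: object): Promise<string> {
  // Missing `await`: the request is started but never checked, so HTTP errors
  // and network failures don't stop "saved" from being returned.
  fetch(url, { method: "POST", body: JSON.stringify(draft) });
  return "saved";
}

async function saveDraft(url: string, draft: object): Promise<string> {
  const res = await fetch(url, { method: "POST", body: JSON.stringify(draft) });
  if (!res.ok) {
    throw new Error(`Save failed with HTTP ${res.status}`);
  }
  return "saved";
}
```

The difference is one keyword, but the computer will faithfully execute whichever version it is given.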
Plenty of blog entries have already been written by experienced software engineers saying that they don’t find AI makes their jobs easier. Boilerplate is what AI does best, but boilerplate is easy anyway for an experienced coder, and fixing someone else’s code isn’t easier than writing your own, so producing the code the usual way is still preferable. So, yes, it does seem like bosses insisting that developers use AI is just another instance of neoliberal sadism (David Graeber, 2018): exercising power for its own sake.
That support call yesterday was perfectly typical except for one novel aspect: I was asked to deal with an error on a web page where someone had copy-pasted code from an AI into their web site. Our clients are not generally computer specialists. They are in media and entertainment; they need web sites to show off their work, but web tech is not their main thing. So in this instance it almost makes sense that some of them might buy into the AI hype and use one to produce something for their web site. And now I have to support that.
Later in the day I read the last chapter of Peace On Earth by Stanisław Lem, in which the old spy Adelaide Kramer explains to our hero Ijon Tichy the global calamity Tichy brought to planet Earth: a virulent microscopic bug that spread through the world and then attacked and destroyed all software and computer data. The consequences are discussed, although nobody mentions the Amish saying, “Who’s laughing now, suckers!?”
Now that our insane psychopath tech overlords have been given the federal government as a toy, along with whatever support they need to build monopoly AI power over as much of everyday life as they like, and after that support call, I couldn’t help feeling that I could quite go for global datacide right now. Imagine if all the banks had to go back to pen and paper, if no weapon using any kind of computer would work, and if AI just evaporated. As Kramer put it:
“The poor will be on top now, the Fourth World, because they still have the old Remingtons, maybe even muskets from 1870, and of course spears and boomerangs. Those are now the weapons of mass destruction. We could not withstand an invasion of Australian Aborigines.”