Large Language Models (LLMs), like most things that get hyped, are hyped for a reason. We have definitely reached the point, though, where hyperbolic claims are making people very cautious, possibly to the point of being dismissive.
However, this is not that kind of post. I am not dismissive of the value of generative AI.
I am very concerned about the AI-fication of every SaaS product. Many applications now offering “AI” have left me wondering: why do you think so little of what your users are doing that you assume an LLM can do it?
Productivity Theater SaaS
Yes, I'm looking at you, Notion, Coda, etc.
I realize you are all productivity apps, and giving your users ways to create text more quickly is, like, your thing. But having the machine write and organize content for them in many ways works against your core feature set: pushing your users to engage with the act of writing and organizing content.
Yes, I know you’re also about finding and reading content. But the primary use cases I see being pushed are not those. Even then, LLMs are famous for being strangely overconfident in their answers to a search. And yes, I know that’s not the only problem with LLMs.¹
My beef is with the making of ideas. Part of the point of creation, in both words and images, is that it’s hard. Shoving it all onto an AI with generic, bland prompts leads to generic, bland output and to users never actually engaging with what they're doing, which is the entire point of your application.
The arrangement being set up is like selling a pedometer along with a stand to shake it for you. I think you missed the point.
Brief praise for LLMs
I love the recent boom in AI and the interest in it. You all probably know about ChatGPT at this point, but if you haven’t tried it yet, it’s definitely worth checking out. It will initially make you question a lot of what you thought computers could or could not do.
I will also put in a plug for Hugging Face for anyone actually interested in owning the models being made instead of just making OpenAI’s better.
I regularly use various generative tools, including LLMs, as a brainstorming aid, to summarize documents, or simply to get over the empty-page problem.
These are not my problem use cases, though.
Task Replacement
Go away or I will replace you with a simple shell script - unknown sysadmin
I sincerely doubt that most people’s jobs are fully replaceable by an LLM. If I can completely replace your job with generative AI, then I question the value of what you were doing in the first place.
Likewise, if I can replace a large share of the tasks you’re doing with automation that encourages you to disengage, then I devalue the outcome.
To be clear, I am a strong believer in automating tasks. My first big aha moment was when I replaced a 30+ hour task for my boss with a script that compiled the same data in 15 minutes.
The automation is not the problem. It’s the prominence of these tools and the use cases they’re encouraging people to apply them to.
You still need to think about what you’re doing. You can’t outsource that to an LLM. Products that move you away from engaging in thinking to just producing large quantities of text are of very questionable value to me.
And further, if you replace the core value of interacting with your product, I question whether you believe your product had any core value in the first place.
¹ Sometimes it’ll get a bit unhinged and call you a liar, too, or make potentially libelous statements.