Read
I replaced Netflix with Claude Code. I lie in bed thinking about what I can spin up before I fall asleep, what can run while I’m unconscious. Reading a novel feels indulgent now. Watching a movie without a laptop open feels wasteful. This voice in my head that says “something could be running right now” just doesn’t shut off. I’m not even building a company. I’m just addicted to building my random ideas.
Well, this just sounds like a miserable way to live life.
Read
It would seem that we are addicted to a new drug, and we don’t understand all of its effects yet. But one of them is massive fatigue, every day.
Read this piece. I don’t agree with everything in it, but it captures something very real happening right now in companies whose employees are fully embracing AI, particularly in the discipline of software engineering. I found myself resonating with much of it.
It’s not even remotely sustainable for companies to capture 100% of the value from AI. And when employees capture 100% of the value, it will be temporary at best: that company gets beat by someone who’s got the dial turned higher.
So, you ask, “Why isn’t it sustainable to fully capture the value of AI?”
I’ve argued that AI has turned us all into Jeff Bezos, by automating the easy work, and leaving us with all the difficult decisions, summaries, and problem-solving. I find that I am only really comfortable working at that pace for short bursts of a few hours once or occasionally twice a day, even with lots of practice.
So I guess what I’m trying to say is, the new workday should be three to four hours. For everyone. It may involve eight hours of hanging out with people, but not doing this crazy vampire thing the whole time. That will kill people.
At the root of this issue is the question: what, exactly, are we automating? Writing code is hard. But writing code is a very small piece of the job of software engineering. The messiness of humans is the rest of it, and I’m pretty confident the mess can’t be easily automated. Producing a significantly larger volume of code will not result in better software, in most instances. In a lot of scenarios, it will not even result in software that ships notably faster.
I regret the unrealistic standards that I’m contributing to setting. I don’t believe most people can work like I’ve been working. I’m not sure how long I can work how I’ve been working.
Given the opportunity, I’m confident we will find new norms and a human way of integrating AI into our workflows and our day-to-day. But my concern is that the promise of AI—the temptation of capturing 100% of its value—will not give us this opportunity.
Read
The former, elder worker may find some interest or curiosity in applying her knowledge to this new technology, especially as the modes and methods for doing so are still being developed. But what of the worker who begins their work a decade from now, who has been specialized to do nothing more than ask for something? What will she know beyond that menial, dispiriting little task? What kind of people are we designing now?
“What kind of people are we designing now?” is a haunting line.
We’re watching the first generation who can’t remember life before social media enter adulthood right now. Are they the kind of humans we intended to design at the onset of social media? What of the first generation to always have AI at their fingertips? If they end up in unfulfilling jobs that simply glue different AI agents together, filling the gaps until the tech is more capable, will that be a generation humanity feels proud of designing?