AI Hot Takes for Posterity


Who am I?

I’m a software engineer < 1 year into my career. I use ChatGPT a lot, sometimes maybe too much, or beyond what it’s really meant for (working at the edge of your own abilities > working at the edge of ChatGPT’s abilities).

Use-Cases I’m Not Ashamed Of

Acronym hunting

Say I’m reading a blog post I found on Hacker News and a paragraph or sentence mentions two or three acronyms I haven’t heard of. Ctrl + C / Ctrl + V -> ChatGPT! Since the context is already embedded in whatever I’m pasting in, my anecdotal experience is that it does pretty decently at defining terms (and I avoid the annoying thing of googling some obscure acronym and getting back some Realtor’s Association in Nevada that happens to share the acronym I’m hunting for).

‘fix the syntax’

Early in projects, when I hit a wall, I give it a few lines of code that aren’t working and say ‘fix the syntax’. When I’m really brave, I even go back to my broken code and add pseudocode that I know won’t run but that describes exactly what I want to happen. The more specific, the better.

It’s a kind of ‘rubber ducking’ with the LLM, and that slightly ridiculous exercise of having to explain myself to ‘it’ has value on its own, I think. But also, like, ‘ask and you shall receive.’ Sometimes it hits and sometimes it misses, and it’s always hard to phrase exactly what you want. But since this is a hot take, for the record: I am pro-giving-ChatGPT-trashy-pseudocode.
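To make that concrete, here’s a made-up sketch of the kind of thing I mean (the function and the data are hypothetical, not from any real project): the commented pseudocode at the top is roughly what I’d paste in next to a loop that isn’t working, and the Python below it is the kind of cleaned-up version I’d hope to get back.

```python
# Hypothetical example: this is the sort of trashy pseudocode I'd paste in,
# verbatim, right next to the code that isn't working:
#
#   for each order in orders:
#       if it shipped more than 30 days ago, drop it
#       otherwise keep it, newest first
#
# ...and this is the kind of working Python I'd hope to get back.

from datetime import datetime, timedelta


def recent_orders(orders, days=30):
    """Keep orders shipped within the last `days` days, newest first."""
    cutoff = datetime.now() - timedelta(days=days)
    kept = [o for o in orders if o["shipped_at"] >= cutoff]
    return sorted(kept, key=lambda o: o["shipped_at"], reverse=True)
```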

Hot Takes

Her vs …Her?

My mental model of ChatGPT is that I’m talking to the most boring person I’ve ever met in my life…but it just so happens that they’ve read every book ever written and kinda know everything?

And so when I hear people predict a coming wave of AI boyfriends and girlfriends and platonic friends and all that…I’m not buying it, at the moment or in any future I can imagine iterating straightforwardly from the technology we have today. ChatGPT isn’t ‘Her’ as in Scarlett Johansson from the 2013 movie; it’s ‘…Her?’ as in that (pretty mean) joke from Arrested Development. The audio quality of ChatGPT specifically is quite good. The latency when you have a conversation with it is equally impressive. But there’s still just zero personality there. You’re talking to math.

And yet I’ve had many conversations with normal, smart people who talk as if there already are, or will someday soon be, non-human entities worth socializing with (I’m talking about the ‘kids will learn history by asking Abraham Lincoln questions in the metaverse!’ crowd). Whenever this happens, I quietly think to myself that either this person isn’t up on the technology, or they have no personality themselves.

Your ChatGPT conversations are not interesting to others

I, too, was riveted when Kevin Roose had that bizarre chat with Sydney/Bing and it ran in the NYTimes. But pretty much ever since that day, whenever anyone, anywhere, has told me (or worse, read out loud to me, verbatim) about this hilarious or whatever thing ChatGPT told them…I stop listening. Reading something ChatGPT wrote out loud is never funny, interesting, or anything but a waste of time. Full stop.

Other hot takes for posterity