I requested the data source, and it pointed to the Apple Developer Documentation.
I call BS on the data source (on the LLM, not you).
Here’s Apple’s official documentation covering NSAlert. There is absolutely nothing on this page mentioning makeFirstResponder. Note that I never found this in any documentation; instead I was assisted at a WWDC Lab in 2019 by an Apple engineer (thank you Troy!).
This page does link to more extensive documentation about dialogs and special panels.
Note the copyright date on this more detailed documentation - 2009. That’s 17 years ago - before the first line of code in Panorama X was written!
On the other hand, the advice to call makeFirstResponder: is correct. Here’s the code in Panorama that does this.
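For readers who haven’t seen this trick, here’s a minimal sketch of the technique - not Panorama’s actual code, just an illustrative example assuming a plain app-modal NSAlert with a text field as its accessory view:

```objectivec
#import <Cocoa/Cocoa.h>

// Sketch: getting keyboard focus into an NSAlert's accessory view.
// All names here are illustrative, not Panorama's actual code.
NSAlert *alert = [[NSAlert alloc] init];
alert.messageText = @"Enter a value:";
[alert addButtonWithTitle:@"OK"];

NSTextField *field =
    [[NSTextField alloc] initWithFrame:NSMakeRect(0, 0, 220, 24)];
alert.accessoryView = field;

// The key step: accessing alert.window forces the panel to load, and
// explicitly making the accessory view the first responder gives it
// keyboard focus. NSAlert does not do this on its own.
[alert.window makeFirstResponder:field];
[alert runModal];
```

Without the makeFirstResponder: call, the alert opens with focus on the default button rather than the text field.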
I’m kind of curious where ChatGPT got this information, since it didn’t get it from official Apple documentation. Just now I found ONE reference to this on the web, on a Stack Overflow post from 2019, a few months after my WWDC lab. However, I don’t think this is where this came from, because this post uses Swift, not Objective-C, and it says that makeFirstResponder doesn’t work!
So I have no idea where ChatGPT got this answer. Maybe from scanning a private code repo? (If it was a public repo, I think it should have shown up in Google.) In any case, this is an answer I already got from an Apple engineer, and also an answer that is around a decade old (the problem had already been occurring for at least a couple of years in 2019). Definitely not a Tahoe specific answer.
Some people are apparently having good results using LLMs for JavaScript and Python, where there is a huge corpus of public material that can be used for training. In my opinion, trying to use an LLM for Objective-C and/or AppKit is not going to get productive results. Basically, I think it is a waste of time. There’s unlikely to be more training material becoming available on these topics, so I don’t see this ever changing unless somehow these LLMs get access to private source code repos.
By the way, I’m told that even for Swift and SwiftUI, LLMs are often very problematic, possibly because Swift and SwiftUI are changing so rapidly that the training material is badly out of date. So it’s very common for the LLM to spit out obsolete information - and of course it presents this obsolete information in a cheerful and confident manner!