• Generating questions for interviews. ChatGPT is surprisingly great at that.
  • Generating images.
  • Occasionally writing draft introductions to articles, as well as conclusions, descriptions and summaries. I’ve always had trouble writing that kind of thing. I don’t use the version ChatGPT generates—I tear that up and write my own—but ChatGPT gets me started. I don’t do this often, but I’m grateful when I do.
  • Casual low-stakes queries, when I remember to use ChatGPT for that. “What was the name of the movie that was set in a boarding house for actresses that starred Katharine Hepburn?” “Stage Door.” “Was Lucille Ball in that one too?” “Yes.” “Was that Katharine Hepburn’s first movie?” “No.” And ChatGPT provided some additional information. I probably could have gotten that information from Google, but ChatGPT was faster.
  • I find otter.ai extremely useful for transcriptions, likewise Grammarly for proofreading. Those applications use AI, but do they use GenAI? I don’t know.

My big problem, and the reason I don’t use ChatGPT more, is that ChatGPT lies. Not only that, but it lies convincingly. A convincing liar is even worse than an obvious one. I don’t have much use for an information source I can’t trust, and I don’t see an obvious way to solve this problem.