Q&A about ChatGPT

I used Copilot to edit my article in Ukrainian that I wrote for an online portal. It did a great job. But I wouldn’t do it for my research. It won’t work. Nor would I do it for writing in English because I do it much better myself.

In my field, which is literary criticism, the quality of writing is half the value of the piece. And even Claude (which is the better AI for writing) doesn’t get beyond a gushy Filipina type of writing.

The best use I have found for AI so far is to have it write a piece of code to clean out my Gmail inbox. The code only needed a small tweak to work reasonably well.
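
Roughly speaking, a script like that might look something like the sketch below. This is a hypothetical illustration using the Gmail API through the Node googleapis client, not the code the chatbot actually produced; the cleanInbox name, the search query, and the trash-instead-of-delete choice are all placeholders.

```typescript
// Hypothetical sketch, not the script from the post: move old promotional
// mail to Trash via the Gmail API, using the Node googleapis client.
import { google } from 'googleapis';
import { OAuth2Client } from 'google-auth-library';

async function cleanInbox(auth: OAuth2Client): Promise<void> {
  const gmail = google.gmail({ version: 'v1', auth });

  // The query uses the same search syntax as the Gmail search box.
  const res = await gmail.users.messages.list({
    userId: 'me',
    q: 'category:promotions older_than:1y',
    maxResults: 500,
  });

  const ids = (res.data.messages ?? []).map((m) => m.id as string);
  console.log(`Moving ${ids.length} messages to Trash.`);

  // Trash rather than permanently delete, so mistakes stay recoverable.
  for (const id of ids) {
    await gmail.users.messages.trash({ userId: 'me', id });
  }
}
```

Anything along these lines still needs OAuth credentials created in the Google Cloud console and an authorized client passed in before it can touch a real mailbox.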

But please rest assured, every word of Neoliberal Love was written by me.

11 thoughts on “Q&A about ChatGPT”

  1. Do you foresee at some point the tool being good enough that you would use it? What would have to happen in order for you to find it useful for writing?

    I am a proponent of not using it even for science writing. First, it is unreliable and it hallucinates. Cross-checking everything it writes takes time, and I may as well do the writing myself. More importantly, though, writing is thinking. Writing proposals and papers often clarifies my thinking about the problem and leads to new insights. Outsourcing that to a tool seems like a bad idea. At the same time, I know a good number of people in the field use it, so I am just wondering if I am going about this the wrong way.

  2. I am emphatically against generative AI. I hate it with a passion, and given that I’m in STEM, it really makes my gut churn to see how gleefully my colleagues use it.

    I am also steeped in arts-and-writing social media, where creatives are being robbed blind of their work to train these environment-destroying plagiarism machines. My first novel was among the texts stolen in that big Meta AI training sweep.

    AI should be used for the stuff humans fundamentally can’t do, like testing billions of compounds for potential antibiotic properties (apparently researchers did that recently and found 11 entirely new compounds that work as broad-spectrum antibiotics, better than those we currently have as a first line of defense).

      1. I know for a fact that people are using it to write book proposals in the sciences. I was surprised the first time I saw it (publishing companies run everything through AI-recognition and plagiarism tools; I was too naive to even consider it), but now I almost expect it. And, with some of the crazy reviews my manuscripts have been getting recently, I do wonder if some of the reviewers are outsourcing their duties to the AI as well.

      2. People use it for everything these days. There are already studies on how AI has influenced the language people themselves use. The simplest example: when it uses certain words much more frequently than their distribution in available English-language text would suggest (e.g., “delve”), people start using them more as well.

        AI requires a shit ton of human work to label things and guide it (this is referred to as “feedback”). The people hired to do that work are from poorer countries, of course. So the language AI uses is influenced, for example, by Nigerian English more than you’d expect. Maybe they tend to use “delve” a lot.

    1. Totally.

      Like my psychoanalyst used to say, rubbing his hands in eager anticipation, “well, what other form of enjoyment can we take away from you today?”

      These were unhealthy forms of enjoyment but still.

  3. I have received a lot of good coding help from ChatGPT. I’m not doing any very serious coding, but trying to convert the Google Maps API to Mapbox on WordPress was beyond my coding pay grade.
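
    For anyone attempting the same conversion, the Mapbox side of it might look roughly like the sketch below, assuming Mapbox GL JS is loaded on the page; the container id, coordinates, style URL, and access token are placeholders, not details from the commenter’s site.

    ```typescript
    // Hypothetical sketch of the Mapbox GL JS half of such a conversion;
    // every concrete value here is a placeholder.
    import mapboxgl from 'mapbox-gl';
    import 'mapbox-gl/dist/mapbox-gl.css';

    mapboxgl.accessToken = 'YOUR_MAPBOX_ACCESS_TOKEN';

    // Replaces the old `new google.maps.Map(...)` call. Note that Mapbox
    // expects [longitude, latitude], the reverse of Google's (lat, lng) order.
    const map = new mapboxgl.Map({
      container: 'map',                            // id of the map <div> on the page
      style: 'mapbox://styles/mapbox/streets-v12', // a standard Mapbox base style
      center: [-74.006, 40.7128],                  // [lng, lat]
      zoom: 12,
    });

    // Equivalent of a google.maps.Marker at the same point.
    new mapboxgl.Marker().setLngLat([-74.006, 40.7128]).addTo(map);
    ```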
