I have now uploaded to the arXiv my paper on the function \(M(x)\) mentioned in a previous blog post (the length of the longest subsequence of the numbers up to \(x\) on which the Euler totient function is nondecreasing). I was able to get an asymptotic for this quantity largely through elementary arguments. More discussion at https://terrytao.wordpress.com/2023/09/06/monotone-non-decreasing-sequences-of-the-euler-totient-function/
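To make the definition concrete, here is a minimal Python sketch (not the code used in the paper; the function names are just illustrative) that computes \(M(x)\) by brute force, as the length of the longest nondecreasing subsequence of \(\phi(1),\dots,\phi(x)\), using a totient sieve together with the standard patience-sorting trick. This is only feasible for modestly sized \(x\); the point of the paper is to obtain the asymptotics instead.

```python
from bisect import bisect_right

def totient_sieve(x):
    """Return a list phi with phi[n] = Euler totient of n for 0 <= n <= x (standard sieve)."""
    phi = list(range(x + 1))
    for p in range(2, x + 1):
        if phi[p] == p:  # p has not been touched by any smaller prime, so p is prime
            for m in range(p, x + 1, p):
                phi[m] -= phi[m] // p  # multiply phi[m] by (1 - 1/p)
    return phi

def M(x):
    """Length of the longest subsequence of 1..x on which the totient is nondecreasing."""
    phi = totient_sieve(x)
    # tails[k] = smallest possible last value of a nondecreasing subsequence of length k+1
    tails = []
    for n in range(1, x + 1):
        i = bisect_right(tails, phi[n])  # bisect_right permits ties, i.e. nondecreasing rather than strictly increasing
        if i == len(tails):
            tails.append(phi[n])
        else:
            tails[i] = phi[n]
    return len(tails)

if __name__ == "__main__":
    for x in (10, 100, 1000, 10000):
        print(x, M(x))
```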
I'm finding that AI tools are not so useful for the core mathematical portions of my research (or perhaps it is simply that I am not tempted to try them on tasks that I already think I can accomplish expertly), but in this case I found AI useful both for generating code (as discussed in previous comments) and for creating a first draft of the flowchart appearing in the paper (which used a LaTeX package, tikz, that I have rarely used in the past). Broadly speaking, I am finding that #GPT is allowing me to abstract away the particular language for a computational task (whether it be Python, SAGE, regex, LaTeX, Excel, the arXiv API, or whatever); it is almost at the point where I can just formulate my request in natural language and GPT will provide suitable code in a suitable language (though, in the absence of full integration, I still have to cut and paste the GPT output into a suitable file to be compiled). This is beginning to change my workflow: in the past I avoided code-intensive solutions to tasks due to a psychological perception of difficulty and effort, but that perception is now melting away, and I find myself much more amenable to doing some coding as part of my daily work.