Coding with ChatGPT
I started using ChatGPT when it came out a few months ago. It was mind-blowing to chat with a computer and have it feel almost like a real person.
Some people are talking about how it's going to replace all sorts of jobs, including software developers. I'm not sure about that. But I have found some ways that it can definitely make our jobs easier, if we understand how it works and use it cautiously.
Understanding the limitations
Like all Large Language Models (LLMs), ChatGPT has been trained on a massive quantity of text from the Internet. It's basically a function that takes a context as input, including your prompt and the rest of your chat log, up to a limit of roughly 2,000 words. Based on that context, it makes an educated guess at what should come next. Specifically, it's trying to predict what the people who trained it would have voted as the best response.
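As a rough illustration of that fixed context window, here's a minimal sketch. It approximates the limit by counting words (real models count tokens, and the exact limit varies), and shows how the oldest messages simply fall out of view as a conversation grows:

```python
# Sketch: an LLM only "sees" a fixed-size window of recent context.
# Word counting here is a stand-in for the real token-based limit.

def build_context(chat_log, new_prompt, max_words=2000):
    """Keep the newest messages that fit in the window, dropping the oldest first."""
    messages = chat_log + [new_prompt]
    kept = []
    budget = max_words
    for message in reversed(messages):
        words = len(message.split())
        if words > budget:
            break  # this message (and everything older) no longer fits
        kept.append(message)
        budget -= words
    return list(reversed(kept))

# Early messages are forgotten once the conversation outgrows the window.
log = ["word " * 1900, "word " * 700, "word " * 50]
context = build_context(log, "my new question")
print(len(context))  # 3 -- the 1900-word message was dropped
```

This is why suggestions you made early in a long chat eventually stop influencing the answers: they're no longer part of the input at all.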
So when you're using it for coding, or anything else, always keep in mind that it is a guessing machine. Yes, it has been trained with a large amount of information, but not all that information is correct or up to date, and, most importantly, it's very good at completely making things up while exuding confidence.
It's amazing that ChatGPT can run some statistical analysis on words written by humans, and suddenly people are wondering if this thing is going to take over the world (no), take our jobs (maybe), or become self-aware (definitely not). Statistical analysis of text turns out to be an excellent way to imitate humans and produce text that looks extremely plausible. GPT probably stands for "Guessing Plausible Text" (j/k).
Unfortunately, in programming, plausible doesn't cut it. We need things to be precisely accurate. It can quickly become frustrating to use an LLM as a tool to help you with programming, because it's wrong so often.
There's still hope. I've found it can still be a very powerful and helpful tool, even for developers.
Researching with ChatGPT
I think a very natural but dangerous way to use ChatGPT is as a search engine, to ask it factual questions. The problem here is that it's like playing "two truths and a lie". A lot of what it says is certainly true, but there's absolutely no way to know which parts are completely made up.
Even knowing this, I find myself using it this way anyway, but with a caveat. You need to treat ChatGPT as if it's your know-it-all friend who will go on and on confidently about any topic, even ones they're actually clueless about. I've learned about lots of new tools and features with ChatGPT, and some of them really did exist!
One trick is to ask for references. This is as simple as adding "Give references." to your prompts, or asking for them afterward. For coding topics, ChatGPT will usually be able to give you URLs pointing to specific pages of official documentation, and that is very useful.
Following up on those links is absolutely critical here, because very often ChatGPT has told me how to do something using some specific API or function, and it has turned out to be making it up. Those situations didn't save me any time; they wasted it.
All that said, I love how ChatGPT can introduce me to all sorts of things I've never heard of before. Searching on Google would have required clicking through dozens of semi-related pages and skimming them. ChatGPT is excellent at summarizing content, so you can take advantage of that.
Here's where ChatGPT can really shine: Let's say you have some specific software architectural challenge in front of you and you're not sure how to approach it. Open up ChatGPT and write it out in as much detail as you can.
Seconds later, you'll have a list of options, some of which you may not have heard of, and links to read more about each one. If there's one you like, or if you have any follow-up questions, you can just say "Tell me more about #2". Or you can provide more detail about your specific requirements to refine its suggestions.
You always need to be careful, because I find that the more specific you get, the more likely you're going to encourage it to make up something that doesn't exist. Always ask for references, and don't make a decision until you've followed up on other websites to verify what ChatGPT says.
Transforming code and text
There are some low-risk and highly effective uses of ChatGPT, and transforming content is one of them. You can paste in some code or text, and ask it to rewrite it in some specific way. In these cases, it seems much less likely to make an error, and if it does make a mistake, you should be able to recognize it and refine your request quickly.
I've pasted in an email from a client with a list of described menu options, plus a snippet of Svelte code with a few placeholders in the menu, and asked ChatGPT to add all the menu options into the code. It handled this very well.
If you ever have these sorts of straightforward boring text transformation jobs in front of you, and your IDE isn't up to the job, try asking ChatGPT to do it for you, and save the headache.
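To make the Svelte example above concrete, here's the kind of mechanical mapping involved, sketched in Python with invented menu labels. This is the sort of tedious, pattern-following transformation that ChatGPT handles reliably, precisely because no cleverness is required:

```python
# Hypothetical example of a mechanical text transformation:
# a client's plain-text list of menu options becomes structured code entries.

client_email_list = """
Home
About Us
Services
Contact
"""

# Each label turns into a label/href pair, with a lowercased, hyphenated URL.
menu_items = [
    {"label": line.strip(), "href": "/" + line.strip().lower().replace(" ", "-")}
    for line in client_email_list.strip().splitlines()
]

print(menu_items[1])  # {'label': 'About Us', 'href': '/about-us'}
```

The point isn't that you'd write this script yourself; it's that work with this shape, restated in plain English, is safe to hand off to ChatGPT and easy to spot-check afterward.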
Understanding & improving code
ChatGPT is excellent at summarizing any type of content, and that includes code.
Just paste in a chunk of code and it'll be able to tell you what the code does. You can also ask it to add inline comments to the code for you, though Copilot is quite good at that as well.
If you get a weird error message, ChatGPT might be able to explain why the error happened, and suggest some possible fixes. Unlike a webpage, you can ask follow-up questions in real time and get feedback to help you find a solution.
I've also had success pasting in a freshly written function or module, and asking ChatGPT to suggest improvements. It's told me ways to improve error handling, or some cases I hadn't thought of where things might break. It's even found a few bugs in my code, and showed me how to fix them. If you work alone, it's nice to use ChatGPT for feedback and review, and maybe you'll learn something new too.
Coding with ChatGPT
ChatGPT is very capable of writing code. However, like everything else it does, it often makes mistakes.
In my experience, the code written by ChatGPT is rarely perfect on the first try. Very often, the code will try to do something that isn't possible, or misunderstand what was being asked of it. I guess that's true of code written by humans too.
When you're asking ChatGPT to write code for you, it's up to you to run the code and paste any error messages or problems back into the chat, asking for fixes. In a way, the roles are reversed: you're no longer the programmer, but stuck between the AI and the compiler. I have to say, this is not a very fun place to be. I would much rather make changes to the code myself than try different prompts until ChatGPT generates the right code. Often it's faster to type the code you specifically want and need than to write prompts and wait to see whether ChatGPT got it right.
It's almost like working with a junior developer, except that a junior developer is capable of learning and improving and eventually becoming a senior developer. ChatGPT, on the other hand, isn't learning anything from you over the long term. It might learn from you in the short term, but remember, the context of an LLM is limited, and that means that it will soon forget the suggestions you made for improvement.
If, on the other hand, you're new to programming, then ChatGPT is going to be extremely helpful and time saving. I've seen lots of new developers have great success using ChatGPT in this way, to do things they don't know how to do. I believe ChatGPT and similar tools will enable a lot more people to get into coding, and that's really exciting.
Even experienced developers are always learning new things. Having ChatGPT lead the way and provide feedback in a new programming language or library can be extremely helpful. Just be wary that it's very likely to make mistakes, so you still need to understand what the code is doing. Never trust code written by an AI, just as you wouldn't trust any code you find on the Internet. Ultimately, code generated by an LLM comes from code on the Internet, security issues and all.
Fortunately, ChatGPT makes some of this easier for you. As mentioned above, you can ask ChatGPT to explain the code it's written, or look for bugs. Sometimes it's worth doing this with the code it just generated. It's kind of funny that this works: since it generates a word at a time, it can't go back and fix its own mistakes during generation. So if you ask it whether it made any mistakes, sometimes it'll spot the mistake right away and write a better version.
Ask for small, simple code snippets
To be honest, I haven't enjoyed having ChatGPT generate large amounts of code for me. It hasn't seemed to save me much time; it just changed how I spent my time. I've had more success asking it to do smaller, more limited things.
It's really good at writing SQL queries for you. Paste in the table schema and tell it what you're looking to query. You can also be specific about which programming language and library you're using to connect to the database. I think this will be very helpful to a lot of people.
It can also generate things like regular expressions, or other complex code, based on your description. More detail is always better here, including specific examples of edge cases.
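As an invented example, a prompt like "give me a regex that matches a version string such as 1.2.3, but not 1.2 or v1.2.3" might come back as something like this, and listing those edge cases in the prompt is exactly what makes the answer usable:

```python
import re

# Hypothetical result for: "match versions like 1.2.3, but reject 1.2
# and a leading 'v'". The edge cases came straight from the prompt.
version_re = re.compile(r"^\d+\.\d+\.\d+$")

print(bool(version_re.match("1.2.3")))   # True
print(bool(version_re.match("1.2")))     # False
print(bool(version_re.match("v1.2.3")))  # False
```

Checking a handful of examples like this takes seconds, which is why regexes are a good fit: they're easy to verify even when they're hard to write.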
Ask it to generate some boilerplate code for you, to give you a head start. Or, paste in the specifications from your manager and have it attempt a first draft for you to use as a starting point. Depending on your skill level, you might prefer to move into your editor and do the rest of the coding from here.
It's important that you're able to quickly test what it generated and verify that it works as expected. You can even paste in some code and ask ChatGPT to generate some unit tests for you. You can also use it for Test-Driven Development, pasting in some unit tests and asking it to write the code. You can even ask ChatGPT to generate test code alongside any other code it generates, by including in your prompt something like "Write tests for the code too."
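As a sketch of that TDD flow (the function and tests here are invented), you'd paste in failing tests like these and ask for an implementation that makes them pass:

```python
import unittest

# Tests you might write first and paste into the chat...
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_punctuation(self):
        self.assertEqual(slugify("Hi, there!"), "hi-there")

# ...and the kind of implementation ChatGPT might return to make them pass.
def slugify(text):
    # Replace punctuation with spaces, then lowercase and join with hyphens.
    cleaned = "".join(ch if ch.isalnum() or ch.isspace() else " " for ch in text)
    return "-".join(word.lower() for word in cleaned.split())
```

Either direction works: tests first and code generated, or code first and tests generated. What matters is that one half is something you've verified yourself.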
Comparison to Copilot
As I've written about before, I really enjoy using GitHub Copilot, and it helps me to be more productive. Copilot also uses GPT, but in a more focused way that automatically takes your code into its context. It's very good at suggesting code while you're writing it, suggesting comments for your code, or generating code based on your comments. ChatGPT hasn't at all replaced my use of Copilot. If anything, it has made me appreciate Copilot more, and encouraged me to use Copilot in more creative ways. I've found myself bringing up the Copilot suggestions panel more often, to see the variety of suggestions available, and very often there are better and more useful snippets there.
For some reason, Copilot feels less misleading. When it makes a wrong suggestion, it doesn't bother me. Perhaps it's because there's no confidence on display; everything is just a "suggestion".
ChatGPT is of course better at discussing and explaining things in plain language. Microsoft is already planning to integrate a chat interface into Copilot, so-called "GitHub Copilot X". You can sign up for the beta if you want to get early access to Copilot chat. I'm really looking forward to this, as it'll likely be a lot more useful for coding than ChatGPT currently is.
It's not a human
It's very important to keep in mind that ChatGPT is not a person. It's a statistically-driven guessing machine.
Like a human, it makes mistakes, but it won't tell you how sure or unsure it is about being right.
Like a human, it's trying to generate responses it thinks you'll like, but it has no feelings and will never be your friend.
Like a human, it has biases that it's not aware of and can't articulate, but it's incapable of growing and learning from you over time.
It can be hard to talk to a machine like this without all the baggage we've picked up from talking to real humans. I find myself saying "please" and "thank you" when I really don't need to.
I think we need to create a new place in our brains for interacting with things like this.
It's ok to be a bit blunt and succinct. Often it's necessary to be extra explicit, and state things that might otherwise seem obvious. You don't need to spare the feelings of these guessing machines. You need to tell it whenever it's wrong and ask it to fix its mistakes. You can tell it to "be succinct", to "skip unnecessary phrases", or to "just output the code", and other commands that speed it up and tailor the output to your preferences. You may need to repeat these phrases regularly, and you'll likely find new patterns that work well for you.
Try it for yourself, have fun
I've outlined some of the approaches that have worked for me, but I suggest you try it out yourself and see what works for you. I think it's worth experimenting and finding a way for ChatGPT and other AI tools to help you out in your work.
These tools should make your life better, and make work more fun. The goal isn't just to save time, but to enjoy the process.
When you're feeling stuck, you can use ChatGPT as a mentor to help you get unstuck. When you want to bounce some ideas off someone, ChatGPT can give you helpful suggestions.
Save the fun coding stuff for yourself, and leave the boring parts for ChatGPT.