Claude 3 Generates a Better Snake Game Than ChatGPT-4
With the release of Claude 3 (free), I put it up against my current ChatGPT-4 Pro subscription.
1. Writing a satirical job description. Both did fine, but it’s difficult to be objective about which is “better”. What I can say is that, across all the tasks, Claude 3 generated its output much quicker than ChatGPT (for now, at least; we’ll see how speed holds up as its popularity grows):
2. Calculating pi to 1,000 decimal places, and writing a sort function in PHP. Here ChatGPT won on the sort function and Claude won on calculating pi; in each case the winning answer used an existing library rather than writing a custom function:
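(Neither model’s answer is reproduced here. The winning answers leaned on existing arbitrary-precision libraries; purely for illustration, a rough sketch of the dependency-free alternative in plain JavaScript, using native BigInt fixed-point arithmetic and Machin’s formula, might look like this:)

```js
// Fixed-point arctan(1/x), scaled by `scale`:
// arctan(1/x) = sum_{k>=0} (-1)^k / ((2k+1) * x^(2k+1))
function arctanInv(x, scale) {
  const xi = BigInt(x);
  const x2 = xi * xi;
  let term = scale / xi;      // scale / x^(2k+1), starting at k = 0
  let sum = term;
  for (let k = 1n; term !== 0n; k++) {
    term /= x2;
    const delta = term / (2n * k + 1n);
    sum += (k % 2n === 0n) ? delta : -delta;   // alternating signs
  }
  return sum;
}

function piDigits(digits) {
  const guard = 10n;                                // spare digits to absorb truncation error
  const scale = 10n ** (BigInt(digits) + guard);
  // Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)
  const pi = 16n * arctanInv(5, scale) - 4n * arctanInv(239, scale);
  const s = (pi / 10n ** guard).toString();         // drop the guard digits
  return s[0] + "." + s.slice(1);                   // "3." followed by the decimals
}

console.log(piDigits(1000));  // 3.14159265358979323846...
```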
3. OK, the above two tests weren’t very relevant to the “real world”, so let’s get down to business: a game of Snake. The prompt to each was “create html file of the snake game, using javascript”, and both needed a follow-up reply to fix bugs. Both games work, but I prefer playing the Claude version. Press any arrow key to start playing:
The ChatGPT version had a few UI issues: it let the snake run slightly outside the box, it wasn’t as pretty, and it didn’t embed well in other webpages, so there’s just a screenshot of it below:
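(For readers who can’t load the embedded games: neither model’s actual code is reproduced in this post, but as an illustrative sketch of what the prompt asks for, a minimal single-file snake game in HTML and JavaScript can look something like this:)

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Snake</title>
  <style>
    body { display: flex; justify-content: center; background: #222; }
    canvas { background: #000; margin-top: 20px; }
  </style>
</head>
<body>
<canvas id="board" width="400" height="400"></canvas>
<script>
  const canvas = document.getElementById("board");
  const ctx = canvas.getContext("2d");
  const CELL = 20, COLS = canvas.width / CELL, ROWS = canvas.height / CELL;

  let snake, dir, food, alive;

  function reset() {
    snake = [{ x: 10, y: 10 }];   // snake starts as a single segment
    dir = null;                   // waits for the first arrow key
    food = spawnFood();
    alive = true;
  }

  function spawnFood() {
    // pick a random cell not occupied by the snake
    let p;
    do {
      p = { x: Math.floor(Math.random() * COLS), y: Math.floor(Math.random() * ROWS) };
    } while (snake.some(s => s.x === p.x && s.y === p.y));
    return p;
  }

  document.addEventListener("keydown", e => {
    const moves = {
      ArrowUp:    { x: 0, y: -1 },
      ArrowDown:  { x: 0, y: 1 },
      ArrowLeft:  { x: -1, y: 0 },
      ArrowRight: { x: 1, y: 0 }
    };
    const m = moves[e.key];
    if (!m) return;
    e.preventDefault();           // stop the page scrolling on arrow keys
    if (!alive) reset();          // any arrow key restarts after game over
    // ignore 180-degree reversals
    if (!(dir && m.x === -dir.x && m.y === -dir.y)) dir = m;
  });

  function tick() {
    if (alive && dir) {
      const head = { x: snake[0].x + dir.x, y: snake[0].y + dir.y };
      // game over on hitting a wall or the snake's own body
      if (head.x < 0 || head.y < 0 || head.x >= COLS || head.y >= ROWS ||
          snake.some(s => s.x === head.x && s.y === head.y)) {
        alive = false;
      } else {
        snake.unshift(head);
        if (head.x === food.x && head.y === food.y) food = spawnFood();  // grow
        else snake.pop();                                                // move
      }
    }
    draw();
  }

  function draw() {
    ctx.fillStyle = "#000";
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = "red";
    ctx.fillRect(food.x * CELL, food.y * CELL, CELL - 1, CELL - 1);
    ctx.fillStyle = alive ? "lime" : "grey";
    snake.forEach(s => ctx.fillRect(s.x * CELL, s.y * CELL, CELL - 1, CELL - 1));
  }

  reset();
  setInterval(tick, 120);
</script>
</body>
</html>
```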
Summary
Both ChatGPT and Claude did the job, but overall I’d say Claude is faster, better, and cheaper (I was using the free Claude tier against my paid GPT subscription). There’s a paid version of Claude that’s apparently even better; I’ll have to investigate that next time. And if Claude gains traction, we’ll see how its speed and costs change over time.
About the Author
Matthew Whyte was born in rural Waikato and grew up in Tonga, before returning for tertiary studies at the University of Waikato. He completed a degree in computer science, including artificial intelligence, in the mid-2000s. At the time AI wasn’t very smart, so he focussed on web development and hosting. He has since been involved in the telecommunications and software development sectors, working in Aotearoa and the EU.