Many software engineers are in denial about the threat that ChatGPT, a natural language processing tool, poses to their jobs. In this article, we will explore some of the counterarguments and deflections engineers offer, and why they find it so hard to admit that their jobs may be at risk.
Critics often argue that, just as airline pilots are still needed despite highly automated planes, software engineers are still necessary despite the presence of ChatGPT. This analogy falls short, however, once you consider the differences between the two fields.
In the event of a malfunction, an airline pilot has only minutes to save hundreds of lives, whereas a software engineer's ability to avert disaster is constrained by the fragility of the software itself. Furthermore, software engineers carry responsibilities such as collaboration, consultation, and mentoring that are difficult for an AI to take over. As a thought experiment, imagine a company in which every engineer goes on strike, leaving behind a black box (ChatGPT) to help non-engineer employees solve technical issues.
ChatGPT's ability to quickly understand and respond to non-specialist questions lets those employees reach solutions faster than they could by waiting on a software engineer, who typically has only a limited understanding of the business's requirements. It is likely that, in the future, those who can ask ChatGPT the best questions will become the most effective software engineers.
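To make the thought experiment concrete, here is a minimal sketch of how a non-engineer might put a plain-English question to a ChatGPT-style model programmatically. It assumes the openai Python client (v1+) with an API key in the environment; the model name and the example question are illustrative assumptions, not part of the original scenario.

```python
# Minimal sketch: a non-engineer asks the "black box" a plain-English question.
# Assumes the openai Python package (v1+) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

question = (
    "Our nightly sales report stopped arriving. The script runs from a cron job "
    "on a Linux server. What are the first three things I should check?"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You are a patient assistant for non-technical staff."},
        {"role": "user", "content": question},
    ],
)

print(response.choices[0].message.content)
```

The point of the sketch is that the question is phrased entirely in business terms; the quality of the answer depends largely on how well the question is asked, which is exactly the skill the paragraph above predicts will matter.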

Google sister company DeepMind has released research showing that its AlphaCode artificial intelligence (AI) system can compete with humans in solving simple computer-science problems. AlphaCode is a large language model, a system based on neural networks that learns to perform tasks by digesting large amounts of existing human-generated text. The system, which was released in February, was trained to answer questions from software-writing contests. It beat around half of human participants in code competitions. Microsoft and Amazon have also developed code-writing tools.
AlphaCode and ChatGPT, two large language models trained on human-generated text, have been causing a stir among AI researchers and social-media users, but neither is yet advanced enough to replace human programmers. AlphaCode is the more specialised of the two, having been trained on human answers to software-writing competitions, while ChatGPT is a general-purpose conversation engine. The two systems are based on the same underlying architecture; the main difference is the data sets on which they are trained. Whether machines can generate large-scale software systems from scratch remains unclear, but tools such as these could become "second nature" for programmers.
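As a rough illustration of how such a tool might become second nature in a programmer's routine, the sketch below asks a code-generation model to draft a small function and then spot-checks the draft with ordinary assertions before accepting it. The client library, model name, and the task are assumptions for illustration; neither AlphaCode nor ChatGPT is exposed in exactly this way.

```python
# Sketch: using a code-generation model as a routine helper.
# The programmer still reads and tests the draft before trusting it.
# Assumes the openai Python package (v1+); the model name is illustrative.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write a Python function `median(values)` that returns the median of a "
    "non-empty list of numbers. Return only the code, no explanation."
)

draft = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
).choices[0].message.content

print(draft)  # the human reviews the generated code first

# Drop any markdown fence lines the model may have wrapped around the code.
code = "\n".join(ln for ln in draft.splitlines() if not ln.strip().startswith("```"))

# Quick sanity checks the programmer writes by hand before adopting the draft.
namespace = {}
exec(code, namespace)  # run the generated code in an isolated namespace
assert namespace["median"]([3, 1, 2]) == 2
assert namespace["median"]([4, 1, 2, 3]) == 2.5
print("draft passed the spot checks")
```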
If AlphaCode, ChatGPT, or other AI tools can write better code, they may well replace average programmers. Eventually humans may take the back seat, with AI doing most of the heavy lifting on the programming front. But humans will remain pivotal to the AI revolution, and if AI ethics are followed properly, AI can very well be a tool that helps us, becoming second nature.