
A professional's perspective on the use of AI tools within the workplace

Oliver Back
10+ min read

AI has already seen a surge in usage within the academic community. Whether it’s a knowledge discovery tool, a proof-reading app, or a writing service – many students will already be using AI as part of their daily workflow.

The question of integrating AI technology in the workplace is another live debate. Fears that AI systems will use any information you feed into them as training data have scared many big businesses, and banks in particular, away from allowing AI tools of any kind to be installed on their systems.

Daniel Cousins, a cloud computing developer, shared their perspective on using AI within the workplace, in a fast-paced tech environment.

In the last year or so since OpenAI launched ChatGPT, have you been aware of more of your colleagues talking about or using this technology or other AI tools for writing, paraphrasing or other purposes?


Definitely. Everyone's talking about it. All my friends are talking about it. I went to a tech conference recently where we were talking about lots of new stuff, what's coming in the next year, and everyone's talking about AI, whether it's chatbots for customers or internal tools. A lot of coding-related technologies have emerged, which is my main interest, but I also have friends on business courses, and one of the biggest things they talk about is tools for Teams or online meeting transcripts. The ability to generate a summary of the meeting, with actions and takeaways, means you can be present in the meeting and get involved in discussion without having to worry about typing away at the same time, which can be distracting and unproductive. I was a big proponent of using tools to help me study when I was at university. I am very familiar with WolframAlpha; I would heavily use it if I was stuck on a problem. I think GPT would just be another arrow in the quiver. At CES recently, everything was about AI – I think I even saw an AI toaster.

A lot of programmers will reuse code via templates, or by copying and pasting from previous projects. It seems like a lot of people are becoming more reliant on ChatGPT instead, where they can simply ask the AI to generate the code for them. Have you had any experience using AI tools for this activity?


Yeah, I've done exactly that. I had to generate some example schemas covering certain data types, each with a different structure. Instead of writing them out and looking up the data types myself, I asked ChatGPT: “What are the main complex data types? Generate them all in one schema, please. Thank you.” Two seconds later, I had it.
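To give a flavour of the kind of one-off artefact being described, here is a hedged sketch of what such a generated example schema might look like. The field names and structure are purely illustrative assumptions, not the interviewee's actual output:

```python
import json

# Hypothetical example schema covering common complex data types:
# arrays, nested objects, and open-ended key/value maps.
# All field names here are illustrative.
example_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "tags": {"type": "array", "items": {"type": "string"}},
        "address": {  # nested object
            "type": "object",
            "properties": {
                "street": {"type": "string"},
                "city": {"type": "string"},
            },
        },
        "metadata": {  # open-ended string-to-string map
            "type": "object",
            "additionalProperties": {"type": "string"},
        },
    },
}

# Pretty-print the schema, as you might before pasting it into a project.
print(json.dumps(example_schema, indent=2))
```

Producing a skeleton like this by hand means looking up each type's syntax; asking a chatbot for it and then reviewing the result is the time-saving workflow the interviewee describes.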

In addition to being a useful chatbot, ChatGPT can perhaps also be marketed as a productivity solution. Programmers no longer have to keep track of where they have stored old code they want to reuse – they can go directly to ChatGPT and have it regenerated.

What is your perception of AI tools being used to help programmers to research and implement new technologies?


I was picking up a new language recently called ‘Go’ – a cloud-native, sustainable, very fast, lightweight language. I was struggling to get it set up, and instead of trudging through a guide I just went to ChatGPT and said, “I'm getting this error. What's the problem?” It told me what the problem was, and I fixed it and moved on. Tasks like spinning up prototypes, or finding direct answers to your questions without having to trudge through pages of Stack Overflow, are made much easier with AI. I think it's making programming way more accessible. I see AI as the new Google.
My dad sort of gives me a hard time because he gave us the nickname ‘the information’. Where he had to learn things the hard way, reading books and figuring the answer out by himself, we now have the internet to help us. If I want to figure something out, like how to fix my car, I'll just watch a YouTube video, or Google it and ten people will tell me exactly how to do it with step-by-step instructions. AI could parse all that information for you and give you an answer. And the thing that I think is most cool is it'll get it wrong and you'll say, ‘hey, look, sorry this didn't work, I've got this error instead’. And it'll then give you another prompt to try and get you to the right answer. I think for picking up new tech, it just makes it much easier.

What do you think of AI tools being used to help programmers to read technical literature?


I think the summaries it gives are awesome. Technical documentation is so heavy and dense, there's so much in there it can take you all day to read when you really just need a little snippet. For example, when I was trying to find a complete list of complex data types for this language that we're using, I didn't even bother looking at the docs. I just went to GPT, punched them in, and that was it.

What's your attitude towards using AI tools for communication and documentation in your team?


I think this is probably one of the biggest use cases for most business professionals, whether it's documenting meetings or generating transcripts and summaries. For long meetings, AI makes them easier and more productive. Meeting transcripts can be quite lengthy, and the entire takeaway could be reduced to a short summary of five or six lines and a couple of bullet points. After meetings, I'll listen to the transcript on 2x speed, but if I had one of those tools that could just give me a complete summary that I knew was accurate, it would save a ton of time.
I think it's possible to use AI to clear up a lot of miscommunication as well, because I have team members that maybe don't get every takeaway, or don't realise an action was theirs, and not everyone takes super detailed notes. I actually think it's quite rare to see someone in a meeting converting detailed notes into the main takeaways and actions, because let's be honest, nobody really has the time to do it. Whereas if everyone has access to the summarised chat transcript, there's a lot less room for miscommunication.

In the next five years, do you think it's inevitable that tools like ChatGPT will be used by programmers of all levels? How do you feel about the impact this may have on the industry? Do you think the ramifications of this tech will be positive or negative?


Yes, I think it's inevitable. I remember seeing a joke about someone who was getting into programming and texted another programmer saying it felt like cheating to use Google. That's what we all do – just imagine trying to program today without Google as a resource. Everything would take twice as long, maybe even longer, and I think AI tools are the next evolution of that. Being able to generate code examples instantly, or being given precise feedback on what your error is and how it can be solved, is revolutionary. And if it gets it wrong, you can give it feedback saying ‘this didn't work, have you got anything else?’ At the moment, I only really go to ChatGPT when I have tough problems. I'm still sort of stuck in my ways of Stack Overflow and all of those other tools, and it's kind of a fallback for me when I'm really struggling.
But maybe it shouldn't be. Maybe it should be the first port of call, because it always does seem to get there, or at least get me closer. I think its widespread use is inevitable; whether that'll be a good thing or a bad thing, I don't know. A lot of the discussion right now at work is sceptical of AI, but people don't seem to think it's going to take programmers' jobs. They seem to think it's going to be a tool in the toolbox, just something to utilise to make people more productive. I certainly hope that's the case.

Are you already using or do you currently expect to adopt AI powered tools, such as ChatGPT, as part of your work as a programmer?


I only tend to fall back on it as a last line of problem solving, when Stack Overflow or Google has failed me or not exactly solved my issue. It does seem to save a lot of time, and I expect to use it more for different types of problems. For example, with the generation of those test example schemas, why would I write them myself when I don't have to? I'm always googling ‘example schema, primitive data types’ or similar, and then trying to find someone who's done it before and posted it, or checking whether they already exist. But now I can just go to ChatGPT and it generates them for me in seconds, which I can then copy and paste. And I'm off to the races. Sometimes you do need to use a little bit of due diligence and check it over, but I do find myself using it more, and I'm sure I'll continue to use it as it gets better and becomes more integrated.

Many organisations, such as financial institutions and banks, do not allow the use of AI tools, as they want to control their data. This has led to scepticism and pushback due to the security concerns of inputting code examples, data, or industry reports into such systems. What do you think about this?


I don't know where my data is going, and I can't be the one posting production code onto the internet for anyone to see, so that is still a bit of a barrier for me. I do still rely on Stack Overflow whenever I run into an issue. I have to break my problem down and come up with an example code snippet that I can stick into it. Sometimes I don't even bother with the code snippet; I'll just write in words what I'm doing, and it'll get there. But I don't want to copy and paste my code into this online portal that could potentially share sensitive company information with the entire internet.
If there was a corporation approved model, that was integrated into our development environments, so that we could just say, ‘hey, look, how do I solve this problem?’, that would be really useful. I’ve got a friend who is a researcher, and for them data privacy is a massive concern. Everything they're working on is self-hosted. You don’t want anything getting out, it’s the same with data and LLMs. You want your server to run this model and it's not connected to anything. It’s you and only you seeing this data. Because I agree with you. I think those privacy concerns are valid, and as these tools become more widespread and these big corporations get their own models, we’ll be using our own specific models for a specific task, which will be more useful for us as well.

Do you expect the way students are taught how to code to adjust to the use of AI in the same way that teaching mathematics adjusted to the invention of the calculator?


I don't know if the teaching will change, but I think the learning process will. I can talk about my own learning experience here. I recently moved into a new position as a Java engineer with very little previous Java experience. I have lots of experience in Python and other languages, but wasn't familiar with Java, so I went through this massive learning curve where I was consuming so much information from educational videos and such. But the best way for me to learn is to just get stuck in and, whenever I have a problem, ask ChatGPT how to solve it. It's like having your own personal teacher that can take your specific problem and guide you to a solution. I definitely think that the learning experience of students will change. They'll have a lot more tools to fall back on.

Stack Overflow is a common starting point for a lot of programmers facing challenges with their code. This popular internet forum hosts programming discussions at all levels of expertise, and many programmers will use Stack Overflow either to ask a new question or to see how someone has already answered a similar one.

Asking questions and sharing knowledge through this type of forum is how a lot of beginners get into learning a new skill or hobby, especially programming.

ChatGPT takes this to another level. Where before, a question would be asked and edits made to include more information, ChatGPT now provides a personalised response. This makes it easier to come to a solution, but it doesn't necessarily require users to understand what they are doing in order to fix their broken code. So, ChatGPT may become a ‘go to’ tool for a lot of programmers, but it may not help programmers improve in the same way that Stack Overflow does.


I also think there's going to be concern around things becoming a little bit too easy. Because if you get guidance on Stack Overflow that doesn't exactly answer your problem, you probably have the basis for figuring out the answer yourself. This is where the real learning takes place. If an AI spits out an answer to every single thing I ever ask it, and I don't have to think that much, am I going to be learning?
I guess we'll see. And the thing is, people always think that whenever you get a new job, you have to know everything. I Google stuff all the time. With ChatGPT, as long as I can prompt it to give me the right result, to get the right output for whatever task I've been assigned, it’s going to help me be more productive in the workplace.