Jim Clyde Monge | 12 Dec, 2023
Robert Johns | Co-author

What Is Prompt Engineering? Our Guide To Prompt Tuning


What is AI prompt engineering? And why is AI prompt engineering important? These are great questions that I'll aim to answer in this article.

Before we dive in, here's another question for you! What do you get when you cross an AI model with a human? An AI prompt engineer, of course!

Believe it or not, if you've spent any time working with popular AI tools like ChatGPT, you've already been practicing the art of AI prompt engineering. That's right, you're a budding AI prompt engineer, and you didn't even know it! You didn't even need to learn how to code.

That said, like anything related to AI, there's more to prompt engineering than meets the eye.

Don't worry, I'm going to cover all of that, including best practices for AI prompt engineering, how to use prompt tuning, and more.

So, if you're ready, let's dive in to learn about AI prompt engineering.

What Is AI Prompt Engineering?

Maybe you have a broad idea of what artificial intelligence is, or perhaps you've been using generative AI tools, but you're still not sure what I mean when I say prompt engineering.

No problem, that's why I've written this article!

Prompt engineering is the process of designing prompts that help large language models (LLMs) generate text, translate languages, write different kinds of creative content, and answer your questions in an informative way.

Prompts are essentially instructions that tell the LLM what to do. They can be as simple as a single word or as complex as a paragraph of text.

For example, if you want to know the weather, you could try an AI chatbot like Google Bard and then ask it a question with a suitable prompt. 

Example Prompt: Hi, what is the weather today in Yokohama, Japan?

We asked Google Bard "what is the weather today in Yokohama, Japan?"

How To Construct an Effective AI Prompt

One of the best ways to understand how to create an effective AI prompt is to compare a well-constructed prompt with a poorly constructed one, using some practical prompt engineering examples.

Let's do that now with a simple prompt engineering guide.

Well-Constructed Prompt

A well-constructed AI prompt is one that is clear, concise, and specific. It should provide the language model with all the information it needs to generate a high-quality response.

Here are some examples of well-constructed AI prompts:

  • Write a poem about a lost love.
  • Write a short story about a robot who falls in love with a human.
  • Translate this sentence from English to Spanish.
  • Write code that prints the numbers from 1 to 10.
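
To see how little room for interpretation these leave, take the last prompt in the list: a typical response is only a couple of lines of Python (the exact code varies by model, but it will look something like this):

    # Print the numbers from 1 to 10, one per line
    for number in range(1, 11):
        print(number)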

Poorly Constructed Prompt

A poorly constructed AI prompt, on the other hand, is vague, ambiguous, or incomplete. It may not provide the language model with enough information to generate a response, or it may lead the language model to generate a response that is not relevant to the prompt.

Here are some examples of poorly constructed AI prompts:

  • Write something.
  • Tell me a story.
  • Translate this.
  • Write a program.

Each of these leaves the model guessing: there's no topic, no format, no source text, and no target language, so the response it produces is unlikely to match what you had in mind.

A great example highlighting the difference between a well-constructed prompt and a poorly constructed prompt can be seen when using any of the best AI image generators, like Midjourney, DALL-E 2, and Stable Diffusion.

Look at the illustrations produced by DALL-E 2 when using the following prompt.

Example Prompt: An image of a dog

We used the AI prompt "an image of a dog" in DALL-E 2

They look pretty basic, right? Now, let’s upgrade the prompt.

Example Prompt: A photorealistic image of a smiling cute corgi, wearing a blue jumper, full body angle, situated by an open window, through which gentle daylight is streaming, natural light on the scene

We used prompt engineering to get a better result in DALL-E 2

Now, isn’t he a lot cuter and happier? These are the important details you need to remember whether you're crafting Stable Diffusion prompts or working with any other popular AI image tool.

Best Practices for AI Prompt Engineering

Creating effective prompts for AI chatbots can be quite nuanced, but that said, here are some key tips and examples.

1. Make Your AI Prompts Specific

Ensure your instructions are detailed and specific about the task you want the model to perform. The more descriptive the prompt, the better the results. 

For instance, if you want the AI to extract only the animals from a text, your prompt could be:

Example Prompt: Extract the names of animals from the following text.

Desired format:
Animal: <comma_separated_list_of_animals>
Input: XXX

Screenshot of how to use prompt engineering for AI

Be mindful of the prompt's length and avoid unnecessary details. Experimentation and iteration are key to optimizing prompts.
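
If you'd rather experiment with a structured prompt like this from code instead of a chat window, here's a minimal sketch using the OpenAI Python SDK; note that the model name and the input sentence are illustrative assumptions rather than part of the example above.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from your environment

    # The same "be specific" prompt, with an explicit output format and a sample input
    prompt = (
        "Extract the names of animals from the following text.\n\n"
        "Desired format:\n"
        "Animal: <comma_separated_list_of_animals>\n"
        "Input: The quick brown fox jumps over the lazy dog while a cat watches."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name; use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )

    print(response.choices[0].message.content)
    # Expected shape of the answer: Animal: fox, dog, cat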

2. Make Your AI Prompts Precise

Being too clever or vague can lead to imprecise responses. Be direct and clear in your communication with AI. For instance, if you want a concise explanation of artificial general intelligence (AGI), do not feed it with a prompt like this:

Example Prompt: Explain to me what AGI is. Make it short and simple.

You need to be clearer about how many sentences you need and in what style you want the response. For example, take a look at this improved prompt.

Example Prompt: In 2 to 3 sentences, explain the concept of AGI to a high school student.

3. Avoid Prompts That Ask What Not to Do

Rather than instructing the model what not to do, specify what it should do. This encourages specificity and leads to more accurate responses.

For example, avoid this kind of prompt.

Example Prompt: This is an AI agent that recommends TV series to a customer. Do not ask about customer interests. Do not ask for personal information.

Customer: Please recommend a movie based on my interests.

Agent:

Instead, rephrase the prompt to something like this.

Example Prompt: The following is an AI agent that recommends TV series to a customer. The AI agent should recommend a TV series from IMDB. It should refrain from asking users for their interests and should not ask for personal information. If the agent doesn't have a TV series to recommend, it should respond with "Sorry...".

Customer: Please recommend a movie based on my interests.

Agent: Sorry...
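
If you were building an agent like this programmatically, those positive instructions would typically go into the system message, with the customer's request as the user message. Here's a minimal sketch of that structure, again using the OpenAI Python SDK with an illustrative model name:

    from openai import OpenAI

    client = OpenAI()

    # Positive, specific instructions live in the system message
    system_instructions = (
        "You are an AI agent that recommends TV series to a customer. "
        "Recommend a TV series from IMDB, keep the conversation focused on the "
        "recommendation itself, and if you have no TV series to recommend, "
        "respond with 'Sorry...'."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system", "content": system_instructions},
            {"role": "user", "content": "Please recommend a movie based on my interests."},
        ],
    )

    print(response.choices[0].message.content)

Keeping the "what it should do" guidance in the system message means it stays in force for the whole conversation, not just the first turn.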

Prompt Tuning

Now, I've spent some time talking about how to create effective prompts, but let's take a moment to consider the idea of prompt tuning.

But what is this? And what do I mean by prompt tuning?

Great questions. Well, let me ask you a question. How many times have you given a prompt to an AI model only to get an answer that doesn't match up with your desired outputs?

If you're like me, this must have happened a lot, and even after you've spent time carefully designing your prompt!

But let's be fair; this is not the AI model's fault. After all, it's interpreting your prompt as well as it can.

Of course, if we use excellent prompt engineering skills, we might be able to avoid these situations, but even then, there's a good chance you'll want the AI model to tweak its response.

This is where we need prompt tuning, as this is the process of fine-tuning our instructions and prompts for the AI model to produce a response more in line with our expectations.

Think of it like this: prompt tuning is much like giving precise instructions to an incredibly talented but somewhat literal-minded chef. If we don't take the time to iterate and re-instruct the chef on how to prepare the dish, we may not get what we truly want.

Let's take a look at some prompt tuning examples for three different topics to see how this all works.

Text Generation: Using Prompt Tuning To Add Clarity

Let's say you asked for a brief history of the internet but ended up with an overly technical response. This mismatch signals the need for query refinement.

The goal here is to transform your original prompt from "Describe the history of the internet" to "Provide a concise, layman-friendly overview of the internet's development, highlighting key milestones."

This adjustment makes your intent for a non-technical summary crystal clear.

Coding: Using Prompt Tuning To Get Precise Solutions

Suppose you're one of the many developers using AI models to help with coding, and you want to search a sorted list in Python via binary search. You'd probably use a prompt like, "Implement a binary search in Python".

But what you didn't mention is that your goal is to explore recursive solutions.

So, you would need to refine your prompt to be something like, "Write a Python function to perform a binary search using recursion. The function should recursively divide the array until the target value is found."

Here, we've used prompt tuning to explicitly request a recursive method, which is ideal for guiding the AI towards our preferred approach.
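
For reference, a solution in line with that refined prompt could look something like the sketch below; the function and variable names are my own choices, not something any particular model is guaranteed to produce.

    def binary_search(items, target, low=0, high=None):
        """Recursively search a sorted list and return the target's index, or -1 if absent."""
        if high is None:
            high = len(items) - 1
        if low > high:
            return -1  # empty search range, so the target isn't in the list
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            return binary_search(items, target, mid + 1, high)  # search the upper half
        return binary_search(items, target, low, mid - 1)  # search the lower half

    print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
    print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1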

Side note: I also think this highlights just how far away we are from AI replacing programmers, as these types of miscommunication are very common.

Image Creation: Using Prompt Tuning To Remove Ambiguities

Imagine you requested your preferred AI image generator to create an image of "a bear in a market". The net result is an AI-generated picture of a literal bear in a marketplace.

This is reasonable, but what you actually meant to request was a stock market 'bear' in the marketplace.

It's easy to see how the ambiguity in the original prompt could lead to this unexpected result.

So, to clarify, you would use prompt tuning to revise your prompt to "Create an image of a stock market bear symbol, depicting a downward trending graph with a bear icon, representing a declining market."

This refinement directly addresses the ambiguity, guiding the AI to depict the financial concept of a 'bear market' rather than the animal.
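
And if you generate images through an API rather than a web app, the tuned prompt simply becomes the prompt argument of the request. Here's a minimal sketch using the OpenAI Python SDK, assuming a DALL-E 3 model and an API key set in your environment:

    from openai import OpenAI

    client = OpenAI()

    response = client.images.generate(
        model="dall-e-3",  # illustrative model choice
        prompt=(
            "Create an image of a stock market bear symbol, depicting a downward "
            "trending graph with a bear icon, representing a declining market."
        ),
        size="1024x1024",
        n=1,
    )

    print(response.data[0].url)  # URL of the generated image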

Prompt Tuning In a Nutshell

Hopefully, you can see how I was able to use prompt tuning to bolster my prompt engineering skills and guide the AI to the result I actually wanted for both simple and complex tasks.

Remember that no matter how careful you are with your initial prompt engineering, chances are that you will need to use some prompt tuning to get the right answer for your own needs.

From my experience, the more time I spend creating prompts, the less I need to use prompt tuning, as I've become more familiar with what works and how AI models interpret my prompts.

Equally, some AI models can also learn from their interactions with you and intuit your needs, which creates a nice feedback loop where both you and the AI model become better at understanding one another.

Again, this only happens over time after providing many prompts with some prompt tuning, but it's a nice feature.

Overall, just remember that prompt tuning is a dynamic process of give-and-take, where each iteration brings you closer to your desired outcome.

As a fun side project, you can also use the skills you've learned here for prompt engineering and prompt tuning to create your own custom GPT with OpenAI, as they offer a fully interactive, no-code builder interface that relies on prompts.

That's quite a full circle moment I'd say!

Applications and Use Cases for Prompt Engineering

As you've probably noticed, there seems to be a new AI model being released every week, whether it's Amazon Q from AWS, Gemini from Google, or the latest offering from OpenAI.

Prompt engineering is useful for each of these models, and you can use it for a wide range of applications.

Let’s take a look at some interesting examples:

  • Generating creative content, such as poems, stories, and scripts
  • Translating languages
  • Answering questions
  • Writing different kinds of content, such as blog posts, articles, and reports
  • Generating code
  • Solving math problems
  • Creating art or images
  • Generating new ideas

Prompt Engineering for Developers

If you’re a developer who wants to dive deeper into AI and prompt engineering, it's always a great idea to consider an AI course to bolster your skills in NLP, Deep Learning, Machine Learning, and more.

I’d also recommend this free prompt engineering course from DeepLearning.AI — sidebar, highly respected Stanford professor Andrew Ng also happens to be the founder of DeepLearning.AI and a co-founder of Coursera.

OpenAI Course photo introducing the instructors

I'm really impressed by this free course, and I think it's great for anyone who wants to quickly learn how to use a large language model (LLM), whether that's to create powerful applications or to learn best practices for creating effective prompts when interacting with AI chatbots.

Wrapping Up

Prompt engineering is a powerful tool that can be used to unleash the full potential of LLMs. By following the best practices outlined in this article, you can create AI prompts that will help you achieve your desired results.

The key to being an effective prompt engineer isn’t so much about knowing the perfect prompt, it’s about having a good process to develop prompts that are effective for your application. — Andrew Ng

I hope you found this article helpful. Feel free to share any other ideas and best practices for better prompt engineering that you know about.

Are you fascinated by the world of AI and want to learn how to unleash its potential? Check out:

Stanford's Artificial Intelligence Professional Program

Frequently Asked Questions

1. What Does A Prompt Engineer Do?

A Prompt Engineer designs and optimizes prompts to effectively communicate with AI models, ensuring accurate and relevant responses.

2. Can Anyone Learn Prompt Engineering?

Yes, anyone can learn Prompt Engineering; it involves understanding AI capabilities and crafting clear, concise prompts to achieve desired outcomes.

By Jim Clyde Monge

A software engineer with 10 years of experience. Jim is a technical writer focused on Generative AI with over 4 million views on Medium. He is also a solopreneur and founder of https://talesfactory.app/. Jim loves to read and draw in his spare time.



