How to prompt on GPT-o1
OpenAI's newest model family, GPT-o1, promises more powerful reasoning than previous models.
Using GPT-o1 is slightly different from prompting GPT-4 or even GPT-4o. Because this model has stronger reasoning capabilities, some standard prompt engineering methods won't work as well. Previous models required more guidance, and people used longer context windows to give the models more instructions.
According to OpenAI's API documentation, the o1 models "perform best with clear instructions." Techniques such as few-shot prompting or step-by-step guidance "may not improve performance and may sometimes hinder it."
OpenAI advised o1 users to think about four things when asking for the new models:
- Keep prompts simple and direct, and don't over-direct the model, as it understands instructions well
- Avoid chain-of-thought prompts, as o1 models already reason internally
- Use delimiters such as triple quotation marks, XML tags, and section titles to make clear which sections the model should interpret (see the sketch after this list)
- Limit additional context in retrieval-augmented generation (RAG), as OpenAI said adding more context or documents when using the models for RAG tasks could overcomplicate their responses
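To illustrate the first and third points, here is a minimal sketch of what such a prompt might look like when calling the model through OpenAI's Python SDK: a short, direct instruction with the reference text fenced off by XML-style tags, and no few-shot examples or step-by-step guidance. The model name `o1-preview` and the sample report text are assumptions for this example, not part of OpenAI's guidance.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A short, direct instruction -- no "think step by step" language, no worked examples.
# The material to analyze is wrapped in XML-style tags so the model can tell the
# instruction apart from the text it should reason over.
prompt = """Summarize the key risks mentioned in the report below in three bullet points.

<report>
Quarterly revenue grew 4%, but supplier delays pushed two product launches
into next year and warranty claims rose 12% quarter over quarter.
</report>"""

response = client.chat.completions.create(
    model="o1-preview",  # assumed model name for this sketch
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

Consistent with the advice above, the delimiters alone mark off the reference text, and the model is left to plan its own reasoning.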
OpenAI's advice for o1 differs considerably from the suggestions it gave users of its previous models. Where the company previously recommended being extremely specific, including details, and providing step-by-step instructions, GPT-o1 is better able to 'think' for itself about how to solve questions.
Ethan Mollick, a professor at the University of Pennsylvania’s Wharton School of Business, said in his One Useful Thing blog that his experience as an early adopter of o1 showed that it works better on tasks that require planning, where the model infers how to solve problems independently.
Prompt engineering and making prompts easier
Prompt engineering naturally became a method by which people could dig into details and get the answers they wanted from an AI model. It has become not only an important skill, but also an expanding job category.
Other AI developers have released tools to make it easier to create prompts when designing AI applications. Google launched Prompt Poet, developed with Character.ai, which integrates external data sources to make responses more relevant.
GPT-o1 is still new, and people are still figuring out exactly how to use it (myself included; I have yet to crack my first prompt). However, some social media users predict that people will have to change the way they approach ChatGPT.