This is a comprehensive white paper written by Lee Boonstra of Google for developers and product designers who want to get better results from Large Language Models (LLMs) through prompt engineering. The document is written around the Gemini model and Vertex AI, but its techniques also apply to other models such as GPT, Claude, and LLaMA.
Overview of main content:
1. Prompt engineering basics
- The prompt is the primary interface to an LLM and the key to guiding the model's output.
- A good prompt combines elements such as task goals, model selection, parameter configuration, and structural design.
2. LLM output control parameters
- Temperature: Controls the “randomness” or “certainty” of the output.
- Top-K / Top-P (Nucleus Sampling): Control the range of candidate tokens sampled from.
- Token length: Limits the number of output tokens, affecting cost and latency.
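These sampling parameters can be illustrated in pure Python. The sketch below (the function name `sample_filter` and the toy logits are illustrative, not from the white paper) applies temperature scaling, then top-K and top-P filtering to a small token distribution:

```python
import math

def sample_filter(logits, temperature=1.0, top_k=0, top_p=1.0):
    """Apply temperature, top-K, and top-P (nucleus) filtering to raw logits.

    Returns the filtered, renormalized distribution as {token: probability}.
    """
    # Temperature divides the logits before softmax:
    # low temperature sharpens the distribution (more deterministic output).
    scaled = {t: l / temperature for t, l in logits.items()}
    m = max(scaled.values())
    exps = {t: math.exp(l - m) for t, l in scaled.items()}
    z = sum(exps.values())
    probs = {t: e / z for t, e in exps.items()}

    # Top-K: keep only the K most probable tokens (0 = disabled).
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    if top_k > 0:
        ranked = ranked[:top_k]

    # Top-P: keep the smallest prefix whose cumulative probability >= top_p.
    kept, cum = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break

    # Renormalize the surviving candidates.
    z = sum(p for _, p in kept)
    return {tok: p / z for tok, p in kept}
```

For example, `top_k=1` collapses the distribution to a single token (greedy decoding), while a low temperature pushes almost all probability mass onto the most likely token.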
3. Prompt design methods
- Zero-shot / One-shot / Few-shot: no examples, one example, or several examples in the prompt.
- System Prompting: Set the overall task background and output requirements.
- Role Prompting: Let the model play a certain role (such as a travel consultant).
- Contextual Prompting: Add additional context to help the model understand tasks.
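These design methods can be layered in a single prompt. A minimal sketch (the `build_prompt` helper and its sample inputs are illustrative assumptions, not the paper's code):

```python
def build_prompt(system, role, examples, context, task):
    """Assemble a prompt that layers system, role, few-shot, and context parts."""
    parts = []
    if system:
        parts.append(f"System: {system}")        # system prompting: task background
    if role:
        parts.append(f"You are {role}.")         # role prompting
    for inp, out in examples:                    # few-shot examples
        parts.append(f"Input: {inp}\nOutput: {out}")
    if context:
        parts.append(f"Context: {context}")      # contextual prompting
    parts.append(f"Input: {task}\nOutput:")      # the actual task
    return "\n\n".join(parts)

prompt = build_prompt(
    system="Classify sentiment as POSITIVE or NEGATIVE.",
    role="a movie critic",
    examples=[("Great film!", "POSITIVE")],
    context="Reviews are short and informal.",
    task="The plot was painfully dull.",
)
```

Passing an empty `examples` list gives a zero-shot prompt; one pair gives one-shot; several give few-shot.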
4. Advanced techniques
- Step-back Prompting: Ask a more general question first, then use its answer to tackle the specific task.
- Chain of Thought (CoT): Guide the model to think step by step, suitable for complex reasoning tasks.
- Self-consistency: Generate multiple reasoning paths and take the majority answer to improve reliability.
- Tree of Thoughts (ToT): Generalizes CoT by exploring multiple reasoning paths as a tree.
- ReAct (Reason + Act): Combine thinking and tool invocation to achieve agent-like interaction.
- Automatic Prompt Engineering (APE): Use the model to generate candidate prompts, then evaluate and refine them, forming a closed iteration loop.
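Of these, self-consistency is simple enough to sketch directly. The helper below (the name `self_consistency` is illustrative) samples a model several times and returns the majority answer; `generate` stands in for any prompt-to-answer callable, sampled with a non-zero temperature so reasoning paths diverge:

```python
from collections import Counter

def self_consistency(generate, prompt, n=5):
    """Run the model n times on the same CoT prompt and majority-vote the answers.

    generate: callable taking a prompt string and returning an answer string.
    Returns (majority_answer, agreement_rate).
    """
    answers = [generate(prompt) for _ in range(n)]
    winner, count = Counter(answers).most_common(1)[0]
    return winner, count / n
```

A low agreement rate is itself a useful signal that the task is ambiguous or the prompt needs work.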
5. Code-related prompts
- Prompts for generating, explaining, translating, and debugging code.
6. Multimodal prompting
- Supports images, audio, code, and other media as input, combined with text prompts.
7. Best practices
- Clear, concise, specific instructions work better than negative constraints.
- Use variables in prompts, control token counts, and experiment with input formats.
- Continuously test and document each prompt's performance to guide optimization.
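The last point, documenting prompt attempts, can be as simple as a CSV log. A minimal sketch (the `log_trials` helper and its field names are illustrative assumptions, not the paper's tooling):

```python
import csv
import io

def log_trials(trials):
    """Serialize prompt attempts to CSV so runs can be compared across iterations.

    Each trial is a dict with the prompt's name, model, temperature,
    the prompt text, the model output, and whether it passed review.
    """
    fields = ["name", "model", "temperature", "prompt", "output", "ok"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    for trial in trials:
        writer.writerow(trial)
    return buf.getvalue()
```

Writing each attempt down, including the failures, makes it obvious which wording and parameter changes actually moved the needle.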
Original Google Drive link: https://drive.google.com/file/d/1AbaBYbEa_EbPelsT40-vj64L-2IwUJHy/view?pli=1
YouTube: