Optimizers
Module for prompt optimizers.
get_optimizer(config=None, optimizer=None, meta_prompt=None, task_description=None, *args, **kwargs)
Factory function to create and return an optimizer instance based on the provided configuration.
This function selects and instantiates the appropriate optimizer class based on the 'optimizer' field of the config object; alternatively, the relevant parameters can be passed directly. It supports four optimizer types: 'dummy', 'evopromptde', 'evopromptga', and 'opro'.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| config | Config | Configuration object containing the optimizer type. | None |
| optimizer | str | Identifier for the optimizer to use. Special cases: "dummy" for DummyOptimizer; any other string for the specified optimizer class. | None |
| include_task_desc | bool | Flag to include the task description in the prompt. | required |
| meta_prompt | str | Meta prompt for the optimizer. | None |
| task_description | str | Task description for the optimizer. | None |
| *args | | Variable length argument list passed to the optimizer constructor. | () |
| **kwargs | | Arbitrary keyword arguments passed to the optimizer constructor. | {} |
Returns:

| Type | Description |
|---|---|
| | An instance of the specified optimizer class. |
Raises:

| Type | Description |
|---|---|
| ValueError | If an unknown optimizer type is specified in the config. |
Source code in promptolution/optimizers/__init__.py
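A minimal usage sketch follows. The import path and the pre-built task and meta_llm objects are assumptions, not part of the documented signature; keyword arguments beyond those named above are forwarded to the optimizer constructor via **kwargs.

```python
# Illustrative sketch only: the import path and the `task` / `meta_llm`
# placeholders are assumptions, not guaranteed by this reference.
from promptolution.optimizers import get_optimizer

optimizer = get_optimizer(
    optimizer="evopromptga",          # or "dummy", "evopromptde", "opro"
    meta_prompt="<meta prompt used by the optimizer>",
    task_description="Classify the sentiment of a movie review.",
    # Everything below is forwarded to the optimizer constructor via **kwargs:
    initial_prompts=["Decide whether the review is positive or negative."],
    task=task,                        # a concrete BaseTask instance
    meta_llm=meta_llm,                # a concrete BaseLLM instance
)
optimized_prompts = optimizer.optimize(n_steps=10)
```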
base_optimizer
Base class for prompt optimizers.
BaseOptimizer
Bases: ABC
Abstract base class for prompt optimizers.
This class defines the basic structure and interface for prompt optimization algorithms. Concrete optimizer implementations should inherit from this class and implement the optimize method.
Attributes:

| Name | Type | Description |
|---|---|---|
| prompts | List[str] | List of current prompts being optimized. |
| task | BaseTask | The task object used for evaluating prompts. |
| callbacks | List[Callable] | List of callback functions to be called during optimization. |
| predictor | | The predictor used for prompt evaluation (if applicable). |
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| initial_prompts | List[str] | Initial set of prompts to start optimization with. | required |
| task | BaseTask | Task object for prompt evaluation. | required |
| callbacks | List[Callable] | List of callback functions. Defaults to an empty list. | [] |
| predictor | optional | Predictor for prompt evaluation. Defaults to None. | None |
Source code in promptolution/optimizers/base_optimizer.py
__init__(initial_prompts, task, callbacks=[], predictor=None, verbosity=0)
Initialize the BaseOptimizer.
Source code in promptolution/optimizers/base_optimizer.py
optimize(n_steps)
abstractmethod
Abstract method to perform the optimization process.
This method should be implemented by concrete optimizer classes to define the specific optimization algorithm.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| n_steps | int | Number of optimization steps to perform. | required |
Returns:

| Type | Description |
|---|---|
| List[str] | The optimized list of prompts after all steps. |
Raises:

| Type | Description |
|---|---|
| NotImplementedError | If not implemented by a concrete class. |
Source code in promptolution/optimizers/base_optimizer.py
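As a concrete illustration of the interface, the sketch below subclasses BaseOptimizer with a trivial optimize implementation. The import path follows the source note above and is an assumption; a real optimizer would evaluate and rewrite self.prompts against self.task.

```python
# Minimal sketch of a concrete optimizer, assuming this import path.
from typing import List

from promptolution.optimizers.base_optimizer import BaseOptimizer


class NoChangeOptimizer(BaseOptimizer):
    """Toy optimizer: leaves the prompts unchanged but shows the required interface."""

    def optimize(self, n_steps: int) -> List[str]:
        for _ in range(n_steps):
            # A real implementation would evaluate self.prompts on self.task
            # and generate improved candidates here.
            pass
        return self.prompts
```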
DummyOptimizer
Bases: BaseOptimizer
A dummy optimizer that doesn't perform any actual optimization.
This optimizer simply returns the initial prompts without modification. It's useful for testing or as a baseline comparison.
Attributes:

| Name | Type | Description |
|---|---|---|
| prompts | List[str] | List of prompts (unchanged from initialization). |
| callbacks | List[Callable] | Empty list of callbacks. |
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| initial_prompts | List[str] | Initial set of prompts. | required |
| *args | | Variable length argument list (unused). | () |
| **kwargs | | Arbitrary keyword arguments (unused). | {} |
Source code in promptolution/optimizers/base_optimizer.py
__init__(initial_prompts, *args, **kwargs)
optimize(n_steps)
Simulate an optimization process without actually modifying the prompts.
This method calls the callback methods to simulate a complete optimization cycle, but returns the initial prompts unchanged.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| n_steps | int | Number of optimization steps (unused in this implementation). | required |
Returns:

| Type | Description |
|---|---|
| List[str] | The original list of prompts, unchanged. |
Source code in promptolution/optimizers/base_optimizer.py
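A short baseline sketch (the import path is assumed from the source note above): since DummyOptimizer performs no optimization, the prompts it returns are exactly the ones it was given.

```python
# Baseline sketch, assuming this import path.
from promptolution.optimizers.base_optimizer import DummyOptimizer

baseline = DummyOptimizer(initial_prompts=["Summarize the article in one sentence."])
prompts = baseline.optimize(n_steps=5)
# As documented, the prompts come back unchanged:
# ["Summarize the article in one sentence."]
```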
evoprompt_de
Module for EvoPromptDE optimizer.
EvoPromptDE
Bases: BaseOptimizer
EvoPromptDE: Differential Evolution-based Prompt Optimizer.
This class implements a differential evolution algorithm for optimizing prompts in large language models. It is adapted from the paper "Connecting Large Language Models with Evolutionary Algorithms Yields Powerful Prompt Optimizers" by Guo et al., 2023.
The optimizer uses a differential evolution strategy to generate new prompts from existing ones, with an option to use the current best prompt as a donor.
Attributes:

| Name | Type | Description |
|---|---|---|
| prompt_template | str | Template for generating meta-prompts during evolution. |
| donor_random | bool | If False, uses the current best prompt as a donor; if True, uses a random prompt. |
| meta_llm | | Language model used for generating child prompts from meta-prompts. |
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| prompt_template | str | Template for meta-prompts. | None |
| meta_llm | BaseLLM | Language model for child prompt generation. | None |
| donor_random | bool | Whether to use a random donor. Defaults to False. | False |
| **args | | Additional arguments passed to the BaseOptimizer. | {} |
Source code in promptolution/optimizers/evoprompt_de.py
__init__(prompt_template=None, meta_llm=None, donor_random=False, n_eval_samples=20, **args)
Initialize the EvoPromptDE optimizer.
Source code in promptolution/optimizers/evoprompt_de.py
optimize(n_steps)
Perform the optimization process for a specified number of steps.
This method iteratively improves the prompts using a differential evolution strategy. It evaluates prompts, generates new prompts using the DE algorithm, and replaces prompts if the new ones perform better.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| n_steps | int | Number of optimization steps to perform. | required |
Returns:

| Type | Description |
|---|---|
| List[str] | The optimized list of prompts after all steps. |
Source code in promptolution/optimizers/evoprompt_de.py
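A hedged construction sketch: task, meta_llm, de_template, and initial_prompts stand in for a configured BaseTask, BaseLLM, meta-prompt template, and starting prompts; the import path follows the source note above.

```python
# Sketch only: `task`, `meta_llm`, `de_template`, and `initial_prompts` are placeholders.
from promptolution.optimizers.evoprompt_de import EvoPromptDE

de_optimizer = EvoPromptDE(
    prompt_template=de_template,   # meta-prompt template for the DE step
    meta_llm=meta_llm,
    donor_random=False,            # use the current best prompt as the donor
    n_eval_samples=20,
    initial_prompts=initial_prompts,
    task=task,                     # forwarded to BaseOptimizer via **args
)
best_prompts = de_optimizer.optimize(n_steps=10)
```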
evoprompt_ga
Module for EvoPromptGA optimizer.
EvoPromptGA
Bases: BaseOptimizer
EvoPromptGA: Genetic Algorithm-based Prompt Optimizer.
This class implements a genetic algorithm for optimizing prompts in large language models. It is adapted from the paper "Connecting Large Language Models with Evolutionary Algorithms Yields Powerful Prompt Optimizers" by Guo et al., 2023.
The optimizer uses crossover operations to generate new prompts from existing ones, with different selection methods available for choosing parent prompts.
Attributes:

| Name | Type | Description |
|---|---|---|
| prompt_template | str | Template for generating meta-prompts during crossover. |
| meta_llm | | Language model used for generating child prompts from meta-prompts. |
| selection_mode | str | Method for selecting parent prompts ('random', 'wheel', or 'tour'). |
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| prompt_template | str | Template for meta-prompts. | None |
| meta_llm | BaseLLM | Language model for child prompt generation. | None |
| selection_mode | str | Parent selection method. Defaults to "wheel". | 'wheel' |
| **args | | Additional arguments passed to the BaseOptimizer. | {} |
Raises:

| Type | Description |
|---|---|
| AssertionError | If an invalid selection mode is provided. |
Source code in promptolution/optimizers/evoprompt_ga.py
__init__(prompt_template=None, meta_llm=None, selection_mode='wheel', n_eval_samples=20, **args)
Initialize the EvoPromptGA optimizer.
Source code in promptolution/optimizers/evoprompt_ga.py
optimize(n_steps)
Perform the optimization process for a specified number of steps.
This method iteratively improves the prompts using genetic algorithm techniques. It evaluates prompts, performs crossover to generate new prompts, and selects the best prompts for the next generation.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| n_steps | int | Number of optimization steps to perform. | required |
Returns:

| Type | Description |
|---|---|
| List[str] | The optimized list of prompts after all steps. |
Source code in promptolution/optimizers/evoprompt_ga.py
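A hedged sketch analogous to the DE example; selection_mode must be one of 'random', 'wheel', or 'tour', and the placeholder objects are assumptions.

```python
# Sketch only: `task`, `meta_llm`, `ga_template`, and `initial_prompts` are placeholders.
from promptolution.optimizers.evoprompt_ga import EvoPromptGA

ga_optimizer = EvoPromptGA(
    prompt_template=ga_template,   # meta-prompt template for crossover
    meta_llm=meta_llm,
    selection_mode="tour",         # tournament selection; default is "wheel"
    n_eval_samples=20,
    initial_prompts=initial_prompts,
    task=task,                     # forwarded to BaseOptimizer via **args
)
best_prompts = ga_optimizer.optimize(n_steps=10)
```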
opro
Module for OPRO.
Opro
Bases: BaseOptimizer
Opro: Optimization by PROmpting.
Proposed in the paper "Large Language Models as Optimizers" by Yang et al.: https://arxiv.org/abs/2309.03409. This optimizer works by providing the meta-LLM with a task description as well as previous prompts and their respective scores.
Attributes:

| Name | Type | Description |
|---|---|---|
| llm | BaseLLM | The meta-LLM used for optimization. |
| n_samples | int | The number of samples from the task dataset to show the meta-LLM. |
Methods:

| Name | Description |
|---|---|
| _sample_examples | Sample examples from the task dataset. |
| _format_old_instructions | Format the previous prompts and their scores. |
| optimize | Optimize prompts by prompting the meta-LLM with previous prompts and their scores. |
Source code in promptolution/optimizers/opro.py
__init__(meta_llm, n_samples=2, prompt_template=None, **args)
Initialize the Opro optimizer.
Source code in promptolution/optimizers/opro.py
optimize(n_steps)
Optimize prompts by iteratively providing the meta-LLM with previous prompts and their scores and querying it for a new prompt.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| n_steps | int | The number of optimization steps to perform. | required |
Returns:

| Name | Type | Description |
|---|---|---|
| str | List[str] | The best prompt found by the optimizer. |
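A hedged sketch for Opro; as in the earlier examples, meta_llm, task, and initial_prompts are placeholders, the import path follows the source note, and the assumption that extra keyword arguments are forwarded to BaseOptimizer is inferred from the other optimizers.

```python
# Sketch only: `meta_llm`, `task`, and `initial_prompts` are placeholders.
from promptolution.optimizers.opro import Opro

opro = Opro(
    meta_llm=meta_llm,
    n_samples=2,                   # task examples shown to the meta-LLM
    initial_prompts=initial_prompts,
    task=task,                     # assumed to be forwarded to BaseOptimizer via **args
)
best_prompt = opro.optimize(n_steps=10)
```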