optimization_variable
Graph.optimization_variable(count, lower_bound, upper_bound, is_lower_unbounded=False, is_upper_unbounded=False, initial_values=None, *, name=None)
Create a 1D Tensor of optimization variables, which can be bounded, semi-bounded, or unbounded.
Use this function to create a sequence of variables that can be tuned by the optimizer (within specified bounds) in order to minimize the cost function.
Parameters
- count (int) – The number of optimization variables to create.
- lower_bound (float) – The lower bound on the variables. All variables share the same bound. This bound is ignored if is_lower_unbounded is True.
- upper_bound (float) – The upper bound on the variables. All variables share the same bound. This bound is ignored if is_upper_unbounded is True.
- is_lower_unbounded (bool, optional) – Defaults to False. Set this flag to True to define a semi-bounded variable with lower bound -∞ (in which case lower_bound is ignored).
- is_upper_unbounded (bool, optional) – Defaults to False. Set this flag to True to define a semi-bounded variable with upper bound +∞ (in which case upper_bound is ignored).
- initial_values (np.ndarray or list[np.ndarray] or None, optional) – Initial values for the optimization variables. You can either provide a single initial value, or a list of them. Note that all optimization variables in a graph with non-default initial values must be set in the same way: either each with a single array, or each with a list of arrays, where the lists have the same length across variables. Defaults to None, meaning the optimizer initializes the variables with random values.
- name (str or None, optional) – The name of the node.
Returns
The sequence of optimization variables. If both is_lower_unbounded and is_upper_unbounded are False, these variables are bounded such that lower_bound ≤ v ≤ upper_bound. If one of the flags is True (for example is_lower_unbounded=True), these variables are semi-bounded (for example v ≤ upper_bound). If both of them are True, then these variables are unbounded and satisfy -∞ < v < +∞.
Return type
Tensor
SEE ALSO
Graph.anchored_difference_bounded_variables
: Create anchored optimization variables with a difference bound.
Graph.complex_optimizable_pwc_signal
: Create a complex optimizable Pwc signal.
Graph.optimizable_scalar
: Create an optimization scalar.
Graph.real_optimizable_pwc_signal
: Create a real optimizable Pwc signal.
boulderopal.run_optimization
: Function to find the minimum of a generic function.
Examples
Perform a simple optimization task.
>>> import boulderopal as bo
>>> graph = bo.Graph()
>>> variables = graph.optimization_variable(
... 2, lower_bound=0, upper_bound=1, name="variables"
... )
>>> x = variables[0]
>>> y = variables[1]
>>> cost = (x - 0.1) ** 2 + graph.sin(y) ** 2
>>> cost.name = "cost"
>>> result = bo.run_optimization(
... graph=graph, cost_node_name="cost", output_node_names="variables"
... )
>>> result["cost"]
0.0
>>> result["output"]["variables"]["value"]
array([0.1, 0.])
See examples about optimal control of quantum systems in the How to optimize controls in arbitrary quantum systems using graphs user guide.
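Bounded and semi-bounded optimization variables are commonly implemented by mapping unconstrained internal parameters through a squashing function, so the optimizer can work in an unconstrained space while the exposed values always respect the bounds. The sketch below illustrates this general technique in NumPy with a sigmoid mapping; it is a generic illustration of the idea, not Boulder Opal's actual internal implementation, and the function name `bounded_variables` is hypothetical.

```python
import numpy as np


def bounded_variables(
    raw, lower_bound, upper_bound, is_lower_unbounded=False, is_upper_unbounded=False
):
    """Map unconstrained internal values `raw` into the requested range.

    Generic sketch of bound handling, mirroring the flag semantics of
    Graph.optimization_variable (not its actual implementation).
    """
    if is_lower_unbounded and is_upper_unbounded:
        # Unbounded: identity mapping, values can take any real value.
        return raw
    if is_lower_unbounded:
        # Semi-bounded above: exp(raw) > 0, so v < upper_bound always holds.
        return upper_bound - np.exp(raw)
    if is_upper_unbounded:
        # Semi-bounded below: v > lower_bound always holds.
        return lower_bound + np.exp(raw)
    # Fully bounded: sigmoid squashes raw into (lower_bound, upper_bound).
    return lower_bound + (upper_bound - lower_bound) / (1 + np.exp(-raw))


raw = np.array([-2.0, 0.0, 3.0])
values = bounded_variables(raw, lower_bound=0.0, upper_bound=1.0)
assert np.all((values >= 0.0) & (values <= 1.0))
```

With this parameterization the optimizer never has to project iterates back into the feasible region; any internal value corresponds to a valid variable, which is one common design choice for bound-constrained optimization.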