graph – Interface for the PyTensor graph#
Core graph classes.
- class pytensor.graph.basic.Apply(op, inputs, outputs)[source]#
A `Node` representing the application of an operation to inputs.
Basically, an `Apply` instance is an object that represents the Python statement `outputs = op(*inputs)`.
This class is typically instantiated by an `Op.make_node` method, which is called by `Op.__call__`.
The function `pytensor.compile.function.function` uses `Apply.inputs` together with `Variable.owner` to search the expression graph and determine which inputs are necessary to compute the function’s outputs.
A `Linker` uses the `Apply` instance’s `op` field to compute numeric values for the output variables.
Notes
The `Variable.owner` field of each `Apply.outputs` element is set to `self` in `Apply.make_node`.
If an output element has an owner that is neither `None` nor `self`, a `ValueError` is raised.
- clone(clone_inner_graph=False)[source]#
Clone this `Apply` instance.
- Parameters:
clone_inner_graph – If `True`, clone `HasInnerGraph` `Op`s and their inner-graphs.
- Return type:
A new `Apply` instance with new outputs.
Notes
Tags are copied from `self` to the returned instance.
- clone_with_new_inputs(inputs, strict=True, clone_inner_graph=False)[source]#
Duplicate this `Apply` instance in a new graph.
- Parameters:
inputs (list of Variables) – List of `Variable` instances to use as inputs.
strict (bool) – If `True`, the type fields of all the inputs must be equal to the current ones (or compatible; for instance, a `TensorType` of the same dtype and broadcastable pattern, in which case they will be converted into the current `Type`), and the returned outputs are guaranteed to have the same types as `self.outputs`. If `False`, there is no guarantee that the clone’s outputs will have the same types as `self.outputs`, and cloning may not even be possible (it depends on the `Op`).
clone_inner_graph (bool) – If `True`, clone `HasInnerGraph` `Op`s and their inner-graphs.
- Returns:
An `Apply` instance with the same `Op` but different outputs.
- Return type:
object
- default_output()[source]#
Returns the default output for this node.
- Returns:
An element of self.outputs, typically self.outputs[0].
- Return type:
Variable instance
Notes
May raise `AttributeError` if `self.op.default_output` is out of range, or if there are multiple outputs and `self.op.default_output` does not exist.
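For a single-output `Op`, the default output is simply that output; a quick sketch:

```python
import pytensor.tensor as pt

x = pt.dscalar("x")
z = x + 1.0

# elementwise addition has exactly one output, so it is the default
assert z.owner.default_output() is z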
- class pytensor.graph.basic.AtomicVariable(type, name=None, **kwargs)[source]#
A node type that has no ancestors and should never be considered an input to a graph.
- clone(**kwargs)[source]#
Return a new, un-owned `Variable` like `self`.
- Parameters:
**kwargs (dict) – Optional “name” keyword argument for the copied instance. Same as `self.name` if no value is provided.
- Returns:
A new `Variable` instance with no owner or index.
- Return type:
Variable instance
Notes
Tags and names are copied to the returned instance.
- class pytensor.graph.basic.Constant(type, data, name=None)[source]#
A `Variable` with a fixed `data` field.
`Constant` nodes make numerous optimizations possible (e.g. constant in-lining in C code, constant folding, etc.).
Notes
The `data` field is filtered by what is provided in the constructor for the `Constant`’s `type` field.
- clone(**kwargs)[source]#
Return a new, un-owned `Variable` like `self`.
- Parameters:
**kwargs (dict) – Optional “name” keyword argument for the copied instance. Same as `self.name` if no value is provided.
- Returns:
A new `Variable` instance with no owner or index.
- Return type:
Variable instance
Notes
Tags and names are copied to the returned instance.
- class pytensor.graph.basic.Node[source]#
A `Node` in a PyTensor graph.
Currently, graphs contain two kinds of `Node`s: `Variable`s and `Apply`s. Edges in the graph are not explicitly represented. Instead, each `Node` keeps track of its parents via `Variable.owner` / `Apply.inputs`.
- class pytensor.graph.basic.NominalVariable(id, typ, **kwargs)[source]#
A variable that enables alpha-equivalent comparisons.
- clone(**kwargs)[source]#
Return a new, un-owned `Variable` like `self`.
- Parameters:
**kwargs (dict) – Optional “name” keyword argument for the copied instance. Same as `self.name` if no value is provided.
- Returns:
A new `Variable` instance with no owner or index.
- Return type:
Variable instance
Notes
Tags and names are copied to the returned instance.
- class pytensor.graph.basic.Variable(type, owner, index=None, name=None)[source]#
A Variable is a node in an expression graph that represents a variable.
The inputs and outputs of every `Apply` are `Variable` instances. The input and output arguments used to create a `function` are also `Variable` instances. A `Variable` is like a strongly-typed variable in some other languages; each `Variable` contains a reference to a `Type` instance that defines the kind of value the `Variable` can take in a computation.
A `Variable` is a container for four important attributes:
- `type`: a `Type` instance defining the kind of value this `Variable` can have.
- `owner`: either `None` (for graph roots) or the `Apply` instance of which `self` is an output.
- `index`: the integer such that `owner.outputs[index] is this_variable` (ignored if `owner` is `None`).
- `name`: a string to use in pretty-printing and debugging.
There are a few kinds of `Variable`s to be aware of:
- A `Variable` which is the output of a symbolic computation has a reference to the `Apply` instance to which it belongs (property: `owner`) and the position of itself in the owner’s output list (property: `index`).
- `Variable` (this base type) is typically the output of a symbolic computation.
- `Constant`: a subclass which adds a default and un-replaceable `value`, and requires that `owner` is `None`.
- `TensorVariable`: a subclass of `Variable` that represents a `numpy.ndarray` object.
- `TensorSharedVariable`: a shared version of `TensorVariable`.
- `SparseVariable`: a subclass of `Variable` that represents a `scipy.sparse.{csc,csr}_matrix` object.
- `RandomVariable`.
A `Variable` which is the output of a symbolic computation will have an owner not equal to `None`.
Using a `Variable`’s `owner` field and an `Apply` node’s `inputs` field, one can navigate a graph from an output all the way to the inputs. The opposite direction is possible with a `FunctionGraph` and its `FunctionGraph.clients` dict, which maps `Variable`s to a list of their clients.
- Parameters:
type (a Type instance) – The type governs the kind of data that can be associated with this variable.
owner (None or Apply instance) – The `Apply` instance which computes the value for this variable.
index (None or int) – The position of this `Variable` in `owner.outputs`.
name (None or str) – A string for pretty-printing and debugging.
Examples
import pytensor
import pytensor.tensor as pt

a = pt.constant(1.5)  # declare a symbolic constant
b = pt.fscalar()  # declare a symbolic floating-point scalar
c = a + b  # create a simple expression

# this works because a has a value associated with it already
f = pytensor.function([b], [c])

# bind 2.5 to an internal copy of b and evaluate an internal c
assert 4.0 == f(2.5)

# compilation error because b (required by c) is undefined
pytensor.function([a], [c])

# compilation error because a is constant, it can't be an input
pytensor.function([a, b], [c])
The Python variables `a`, `b`, and `c` all refer to instances of type `Variable`. The `Variable` referred to by `a` is also an instance of `Constant`.
- clone(**kwargs)[source]#
Return a new, un-owned `Variable` like `self`.
- Parameters:
**kwargs (dict) – Optional “name” keyword argument for the copied instance. Same as `self.name` if no value is provided.
- Returns:
A new `Variable` instance with no owner or index.
- Return type:
Variable instance
Notes
Tags and names are copied to the returned instance.
- eval(inputs_to_values=None, **kwargs)[source]#
Evaluate the `Variable` given a set of values for its inputs.
- Parameters:
inputs_to_values – A dictionary mapping PyTensor `Variable`s or names to values. Not needed if the variable has no required inputs.
kwargs – Optional keyword arguments to pass to the underlying `pytensor.function`.
Examples
>>> import numpy as np
>>> import pytensor.tensor as pt
>>> x = pt.dscalar("x")
>>> y = pt.dscalar("y")
>>> z = x + y
>>> np.allclose(z.eval({x: 16.3, y: 12.1}), 28.4)
True
We passed `eval()` a dictionary mapping symbolic PyTensor `Variable`s to the values to substitute for them, and it returned the numerical value of the expression.
Notes
`eval()` will be slow the first time you call it on a variable: it needs to call `function()` to compile the expression behind the scenes. Subsequent calls to `eval()` on that same variable will be fast, because the variable caches the compiled function.
This way of computing has more overhead than a normal PyTensor function, so don’t use it too much in real scripts.
- get_parents()[source]#
Return a list of the parents of this node. The result should be a copy; modifying the returned value should not modify the graph structure.
- pytensor.graph.basic.as_string(inputs, outputs, leaf_formatter=<class 'str'>, node_formatter=<function default_node_formatter>)[source]#
Returns a string representation of the subgraph between `inputs` and `outputs`.
- Parameters:
- Returns:
A string representation of the subgraph between `inputs` and `outputs`. If the same node is used by several other nodes, the first occurrence will be marked as `*n -> description` and all subsequent occurrences will be marked as `*n`, where `n` is an id number (ids are assigned in an unspecified order and exist only for viewing convenience).
- Return type:
list of str
- pytensor.graph.basic.clone(inputs, outputs, copy_inputs=True, copy_orphans=None, clone_inner_graphs=False)[source]#
Copies the sub-graph contained between inputs and outputs.
- Parameters:
inputs – Input `Variable`s.
outputs – Output `Variable`s.
copy_inputs – If `True`, the inputs will be copied (defaults to `True`).
copy_orphans – When `None`, use the `copy_inputs` value. When `True`, new orphan nodes are created. When `False`, the original orphan nodes are reused in the new graph.
clone_inner_graphs (bool) – If `True`, clone `HasInnerGraph` `Op`s and their inner-graphs.
- Return type:
The inputs and outputs of that copy.
Notes
A constant in the `inputs` list is not an orphan, so it will be copied conditional on the `copy_inputs` parameter; otherwise, it will be copied conditional on the `copy_orphans` parameter.
- pytensor.graph.basic.clone_get_equiv(inputs, outputs, copy_inputs=True, copy_orphans=True, memo=None, clone_inner_graphs=False, **kwargs)[source]#
Clone the graph between `inputs` and `outputs` and return a map of the cloned objects.
This function works by recursively cloning inputs and rebuilding a directed graph from the inputs up.
If `memo` already contains entries for some of the objects in the graph, those objects are replaced with their values in `memo` and not unnecessarily cloned.
- Parameters:
inputs – Inputs of the graph to be cloned.
outputs – Outputs of the graph to be cloned.
copy_inputs – If `True`, create the cloned graph from cloned input nodes. If `False`, clone a graph that is rooted at the original input nodes. `Constant`s are not cloned.
copy_orphans – When `True`, inputs with no owners are cloned. When `False`, the original inputs are reused in the new graph. Cloning is not performed for `Constant`s.
memo – Optionally start with a partly-filled dictionary for the return value. If a dictionary is passed, this function will work in-place on that dictionary and return it.
clone_inner_graphs – If `True`, clone `HasInnerGraph` `Op`s and their inner-graphs.
kwargs – Keywords passed to `Apply.clone_with_new_inputs`.
- pytensor.graph.basic.clone_node_and_cache(node, clone_d, clone_inner_graphs=False, **kwargs)[source]#
Clone an `Apply` node and cache the results in `clone_d`.
This function handles `Op` clones that are generated by inner-graph cloning.
- Returns:
`None` if all of `node`’s outputs are already in `clone_d`; otherwise, the clone of `node`.
- pytensor.graph.basic.equal_computations(xs, ys, in_xs=None, in_ys=None, strict_dtype=True)[source]#
Checks if PyTensor graphs represent the same computations.
The two lists `xs` and `ys` should have the same number of entries. The function checks whether, for each corresponding pair `(x, y)` from `zip(xs, ys)`, `x` and `y` represent the same computations on the same variables (unless equivalences are provided using `in_xs`, `in_ys`).
If `in_xs` and `in_ys` are provided, then when comparing a node `x` with a node `y`, they are automatically considered equal if there is some index `i` such that `x == in_xs[i]` and `y == in_ys[i]` (and they both have the same type). Note that `x` and `y` can be in the lists `xs` and `ys`, but they can also represent subgraphs of a computational graph in `xs` or `ys`.