Advanced Usage

Advanced techniques and complex use cases that demonstrate the full range of iTensor's symbolic and numerical capabilities.

Advanced Workflows

This section explores more complex iTensor workflows, combining multiple operations and techniques to solve sophisticated problems. These examples demonstrate advanced features and real-world applications of tensor mathematics.

These advanced examples assume you're already familiar with the basic operations covered in the Basic Examples section.

Example 1: Tensor Decomposition

This example demonstrates how to perform tensor decomposition, specifically Singular Value Decomposition (SVD) of a matrix, which is useful for dimensionality reduction, data compression, and solving systems of equations.

Request

// POST /api/tensors/numerical/
{
  "operation": "svd",
  "tensor": {
    "name": "A",
    "shape": [3, 2],
    "values": [
      [1.0, 2.0],
      [3.0, 4.0],
      [5.0, 6.0]
    ]
  }
}

Response

{ "operation": "svd", "result": { "U": { "shape": [3, 3], "values": [ [-0.2298477, -0.8834610, -0.4082483], [-0.5247448, -0.2408120, 0.8164966], [-0.8196419, 0.4018369, -0.4082483] ] }, "S": { "shape": [2], "values": [9.5255, 0.5143] }, "V": { "shape": [2, 2], "values": [ [-0.6196290, -0.7848944], [-0.7848944, 0.6196290] ] } } }

The SVD decomposes a matrix A into three matrices U, S, and V such that A = U × diag(S) × V^T. This decomposition has numerous applications in signal processing, statistics, and machine learning.
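
The factorization can be checked locally with NumPy; the following is an illustrative sketch (independent of the iTensor API) that reconstructs A from U, S, and V^T:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Full SVD: U is 3x3, s holds the two singular values, Vt is V^T (2x2)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# diag(S) must be padded to 3x2 before reconstructing A = U @ diag(S) @ V^T
S = np.zeros((3, 2))
S[:2, :2] = np.diag(s)

print(np.allclose(U @ S @ Vt, A))  # True
print(np.round(s, 4))              # [9.5255 0.5143]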

Example 2: Tensor Network Contraction

This example shows how to perform complex tensor contractions using Einstein notation, which is a powerful way to express operations on multi-dimensional arrays.

Request

// POST /api/tensors/numerical/
{
  "operation": "einsum",
  "subscripts": "abc,cd,def->abef",
  "operands": [
    {
      "name": "A",
      "shape": [2, 3, 4],
      "values": [/* 3D array of values */]
    },
    {
      "name": "B",
      "shape": [4, 5],
      "values": [/* 2D array of values */]
    },
    {
      "name": "C",
      "shape": [5, 2, 3],
      "values": [/* 3D array of values */]
    }
  ]
}

Response

{ "operation": "einsum", "result": { "shape": [2, 3, 2, 3], "values": [/* 4D array of result values */] } }

Einstein summation (einsum) is a concise way to express operations on tensors, including matrix multiplication, dot products, and complex multi-dimensional contractions. The notation "abc,cd,def->abef" specifies which indices are contracted and which remain in the output.
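
NumPy's einsum uses the same subscript notation; a minimal sketch with random operands of the shapes above (illustrative only, not an iTensor call):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3, 4))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((5, 2, 3))

# c (shared by A and B) and d (shared by B and C) are summed over;
# the free indices a, b, e, f remain in the output.
result = np.einsum('abc,cd,def->abef', A, B, C)
print(result.shape)  # (2, 3, 2, 3)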

Example 3: Custom Tensor Operations

This example demonstrates how to define and apply custom operations to tensors using a combination of symbolic and numerical processing.

Step 1: Define a Custom Function Symbolically

// POST /api/tensors/symbolic/
{
  "operation": "define_function",
  "function": {
    "name": "softplus",
    "expression": "log(1 + exp(x))",
    "variables": ["x"]
  }
}

Step 2: Apply the Function to a Numerical Tensor

// POST /api/tensors/numerical/
{
  "operation": "apply_function",
  "function": "softplus",
  "tensor": {
    "name": "X",
    "shape": [2, 3],
    "values": [
      [-2.0, -1.0, 0.0],
      [1.0, 2.0, 3.0]
    ]
  }
}

Response

{ "operation": "apply_function", "result": { "shape": [2, 3], "values": [ [0.1269, 0.3133, 0.6931], [1.3133, 2.1269, 3.0486] ] } }

This example shows how to define a custom activation function (softplus, which is a smooth approximation of ReLU used in neural networks) symbolically and then apply it to a numerical tensor.
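
The response values are easy to reproduce locally; here is a small NumPy sketch (not an iTensor call) that evaluates the same expression element-wise:

import numpy as np

def softplus(x):
    # Same expression as the symbolic definition: log(1 + exp(x)),
    # written with log1p for better accuracy when exp(x) is small
    return np.log1p(np.exp(x))

X = np.array([[-2.0, -1.0, 0.0],
              [1.0, 2.0, 3.0]])

print(np.round(softplus(X), 4))
# [[0.1269 0.3133 0.6931]
#  [1.3133 2.1269 3.0486]]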

Example 4: Optimization with Tensor Gradients

This example demonstrates how to use symbolic differentiation with numerical optimization to find minima of complex functions.

Step 1: Define the Function Symbolically

// POST /api/tensors/symbolic/
{
  "operation": "define_function",
  "function": {
    "name": "rosenbrock",
    "expression": "(1 - x)^2 + 100*(y - x^2)^2",
    "variables": ["x", "y"]
  }
}

Step 2: Compute the Gradient

// POST /api/tensors/symbolic/
{
  "operation": "gradient",
  "function": "rosenbrock",
  "result_name": "grad_rosenbrock"
}
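
For reference, the gradient this step produces can be derived locally with SymPy; the following is a sketch of the symbolic differentiation, not an iTensor call:

import sympy as sp

x, y = sp.symbols('x y')
rosenbrock = (1 - x)**2 + 100 * (y - x**2)**2

# Partial derivatives:
#   d/dx = -2*(1 - x) - 400*x*(y - x**2)
#   d/dy =  200*(y - x**2)
grad = [sp.diff(rosenbrock, v) for v in (x, y)]
print(grad)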

Step 3: Run Gradient Descent Optimization

// POST /api/tensors/numerical/
{
  "operation": "optimize",
  "function": "rosenbrock",
  "gradient": "grad_rosenbrock",
  "method": "gradient_descent",
  "initial_point": [0.0, 0.0],
  "learning_rate": 0.001,
  "max_iterations": 5000
}

Response

{ "operation": "optimize", "result": { "minimum_point": [0.9996, 0.9992], "minimum_value": 0.0000023, "iterations": 4218, "convergence": true } }

This example uses the Rosenbrock function, a standard test function for optimization algorithms. It has a global minimum at (1, 1) but is notoriously difficult to optimize because the minimum sits at the bottom of a long, narrow, curved valley. By combining symbolic differentiation with numerical optimization, we can efficiently find the minimum.
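
As a rough picture of what the optimize call does, here is a plain gradient-descent sketch in NumPy using the analytic gradient above; the step size and iteration count are illustrative and not iTensor's exact implementation:

import numpy as np

def rosenbrock(p):
    x, y = p
    return (1 - x)**2 + 100 * (y - x**2)**2

def rosenbrock_grad(p):
    x, y = p
    return np.array([-2 * (1 - x) - 400 * x * (y - x**2),
                     200 * (y - x**2)])

p = np.array([0.0, 0.0])   # initial point
learning_rate = 0.001
for _ in range(5000):
    p = p - learning_rate * rosenbrock_grad(p)

# Plain gradient descent slowly approaches the known minimum at (1, 1);
# the narrow curved valley is what makes this function a hard test case.
print(p, rosenbrock(p))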

Example 5: Quantum Circuit Simulation

This example demonstrates how to use tensors to simulate a basic quantum circuit, showing the power of tensor operations in quantum computing applications.

Step 1: Define Quantum Gates as Tensors

// POST /api/tensors/numerical/
{
  "operation": "create_batch",
  "tensors": [
    {
      "name": "H",       // Hadamard gate
      "shape": [2, 2],
      "values": [
        [0.7071067811865475, 0.7071067811865475],
        [0.7071067811865475, -0.7071067811865475]
      ]
    },
    {
      "name": "X",       // Pauli-X gate
      "shape": [2, 2],
      "values": [
        [0, 1],
        [1, 0]
      ]
    },
    {
      "name": "CNOT",    // Controlled-NOT gate
      "shape": [4, 4],
      "values": [
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]
      ]
    }
  ]
}

Step 2: Initialize Qubit State

// POST /api/tensors/numerical/
{
  "operation": "create",
  "tensor": {
    "name": "state",
    "shape": [2, 2],     // Two qubits in |00⟩ state
    "values": [
      [1, 0],
      [0, 0]
    ]
  }
}

Step 3: Apply Quantum Gates

// Apply Hadamard to first qubit
// POST /api/tensors/numerical/
{
  "operation": "matmul",
  "operands": [
    { "name": "H" },
    { "name": "state" }
  ],
  "result_name": "state_after_H"
}

// Apply CNOT gate
// POST /api/tensors/numerical/
{
  "operation": "reshape",
  "tensor": { "name": "state_after_H" },
  "new_shape": [4, 1],
  "result_name": "state_reshaped"
}

// POST /api/tensors/numerical/
{
  "operation": "matmul",
  "operands": [
    { "name": "CNOT" },
    { "name": "state_reshaped" }
  ],
  "result_name": "final_state_reshaped"
}

// POST /api/tensors/numerical/
{
  "operation": "reshape",
  "tensor": { "name": "final_state_reshaped" },
  "new_shape": [2, 2],
  "result_name": "final_state"
}

Result: Entangled Bell State

{ "operation": "reshape", "result": { "shape": [2, 2], "values": [ [0.7071067811865475, 0], [0, 0.7071067811865475] ] } }

This example simulates a simple quantum circuit that creates a Bell state (an entangled state of two qubits). The circuit applies a Hadamard gate to the first qubit followed by a CNOT gate between the two qubits. This demonstrates how tensor operations can be used to simulate quantum systems.
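
The same circuit is easy to check locally; the NumPy sketch below (independent of the iTensor API) mirrors the matmul/reshape steps used above:

import numpy as np

H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])             # Controlled-NOT gate

# Two-qubit |00⟩ state stored as a 2x2 matrix
# (row index = first qubit, column index = second qubit)
state = np.array([[1.0, 0.0],
                  [0.0, 0.0]])

# Hadamard on the first qubit acts on the row index
state = H @ state

# CNOT acts on the full 4-dimensional state, so flatten, multiply, reshape back
state = (CNOT @ state.reshape(4, 1)).reshape(2, 2)

print(state)
# [[0.70710678 0.        ]
#  [0.         0.70710678]]

The nonzero amplitudes on |00⟩ and |11⟩, each 1/√2, correspond to the Bell state (|00⟩ + |11⟩)/√2 reported in the response above.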