Advanced Usage
Advanced techniques and complex use cases that demonstrate the full power of iTensor.
Advanced Workflows
This section explores more complex iTensor workflows that combine multiple operations and techniques to solve harder problems. These examples demonstrate advanced features and real-world applications of tensor mathematics.
These advanced examples assume you're already familiar with the basic operations covered in the Basic Examples section.
Example 1: Tensor Decomposition
This example demonstrates how to perform tensor decomposition, specifically Singular Value Decomposition (SVD) of a matrix, which is useful for dimensionality reduction, data compression, and solving systems of equations.
Request
Response
The SVD decomposes a matrix A into three matrices U, S, and V such that A = U × diag(S) × V^T. This decomposition has numerous applications in signal processing, statistics, and machine learning.
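Since the request and response payloads are not reproduced here, the decomposition can be sketched with a minimal NumPy equivalent (this illustrates the math, not the iTensor request format):

```python
import numpy as np

# Example matrix to decompose (values chosen for illustration).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Thin SVD: A = U @ diag(S) @ Vt, singular values sorted in descending order.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruct A from its factors to verify A = U x diag(S) x V^T.
A_reconstructed = U @ np.diag(S) @ Vt
print(np.allclose(A, A_reconstructed))  # True
```

Truncating S to its largest values yields the best low-rank approximation of A, which is the basis of the compression and dimensionality-reduction uses mentioned above.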
Example 2: Tensor Network Contraction
This example shows how to perform complex tensor contractions using Einstein notation, which is a powerful way to express operations on multi-dimensional arrays.
Request
Response
Einstein summation (einsum) is a concise way to express operations on tensors, including matrix multiplication, dot products, and complex multi-dimensional contractions. The subscript string "abc,cd,def->abef" specifies that the repeated indices c and d are summed over (contracted), while a, b, e, and f remain in the output.
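The contraction described above can be sketched with np.einsum using the exact subscript string from the text (the shapes are illustrative assumptions, not values from the original example):

```python
import numpy as np

rng = np.random.default_rng(0)
T1 = rng.standard_normal((2, 3, 4))   # indices a, b, c
T2 = rng.standard_normal((4, 5))      # indices c, d
T3 = rng.standard_normal((5, 6, 7))   # indices d, e, f

# Contract c (between T1 and T2) and d (between T2 and T3);
# a, b, e, f survive in the output.
out = np.einsum("abc,cd,def->abef", T1, T2, T3)
print(out.shape)  # (2, 3, 6, 7)
```

The same result can be built from pairwise tensordot calls, but einsum states the whole contraction pattern in one place, which is why it is the standard notation for tensor networks.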
Example 3: Custom Tensor Operations
This example demonstrates how to define and apply custom operations to tensors using a combination of symbolic and numerical processing.
Step 1: Define a Custom Function Symbolically
Step 2: Apply the Function to a Numerical Tensor
Response
This example shows how to define a custom activation function (softplus, which is a smooth approximation of ReLU used in neural networks) symbolically and then apply it to a numerical tensor.
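A minimal SymPy/NumPy sketch of the same symbolic-then-numerical workflow (the variable names are illustrative, not the iTensor API):

```python
import numpy as np
import sympy as sp

# Step 1: define softplus symbolically.
x = sp.Symbol("x")
softplus = sp.log(1 + sp.exp(x))

# Step 2: compile the symbolic expression into a fast numerical function
# and apply it elementwise to a tensor.
f = sp.lambdify(x, softplus, "numpy")
tensor = np.array([[-2.0, 0.0],
                   [1.0, 3.0]])
print(f(tensor))
```

A quick sanity check: softplus(0) = log 2, and softplus is everywhere positive and strictly above the identity, which is what makes it a smooth stand-in for ReLU.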
Example 4: Optimization with Tensor Gradients
This example demonstrates how to use symbolic differentiation with numerical optimization to find minima of complex functions.
Step 1: Define the Function Symbolically
Step 2: Compute the Gradient
Step 3: Run Gradient Descent Optimization
Response
This example uses the Rosenbrock function, a standard test function for optimization algorithms. It has a global minimum at (1, 1) but is notoriously difficult to optimize because the minimum sits at the bottom of a long, narrow, curved valley. By combining symbolic differentiation with numerical optimization, we can efficiently find the minimum.
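The three steps above can be sketched with SymPy and a plain gradient-descent loop (the step size and iteration count are illustrative choices, not iTensor defaults):

```python
import numpy as np
import sympy as sp

# Step 1: define the Rosenbrock function symbolically.
x, y = sp.symbols("x y")
f_sym = (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# Step 2: compute the gradient symbolically, then compile both
# the function and its gradient to numerical callables.
grad_sym = [sp.diff(f_sym, v) for v in (x, y)]
f = sp.lambdify((x, y), f_sym, "numpy")
grad = sp.lambdify((x, y), grad_sym, "numpy")

# Step 3: run gradient descent. The learning rate must be small because
# the valley walls are very steep (curvature ~1000 near the minimum).
p = np.array([0.0, 0.0])
lr = 1e-3
for _ in range(50_000):
    p = p - lr * np.array(grad(*p))

print(p, f(*p))  # approaches the global minimum at (1, 1)
```

Plain gradient descent needs tens of thousands of steps here because progress along the curved valley floor is slow; swapping in a quasi-Newton method (e.g. scipy.optimize.minimize with the same symbolic gradient as `jac`) converges in far fewer iterations.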
Example 5: Quantum Circuit Simulation
This example demonstrates how to use tensors to simulate a basic quantum circuit, showing the power of tensor operations in quantum computing applications.
Step 1: Define Quantum Gates as Tensors
Step 2: Initialize Qubit State
Step 3: Apply Quantum Gates
Result: Entangled Bell State
This example simulates a simple quantum circuit that creates a Bell state (an entangled state of two qubits). The circuit applies a Hadamard gate to the first qubit followed by a CNOT gate between the two qubits. This demonstrates how tensor operations can be used to simulate quantum systems.
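The circuit above can be sketched as a state-vector simulation in NumPy (a minimal illustration of the tensor math, not the iTensor request format):

```python
import numpy as np

# Step 1: quantum gates as matrices.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)          # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])               # control = qubit 0, target = qubit 1

# Step 2: initialize the two-qubit state |00> in the computational basis
# (ordering |00>, |01>, |10>, |11>).
state = np.array([1.0, 0.0, 0.0, 0.0])

# Step 3: apply H to qubit 0 (via a Kronecker product with the identity
# on qubit 1), then apply CNOT.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Result: the Bell state (|00> + |11>) / sqrt(2).
print(state)  # → [0.7071, 0, 0, 0.7071]
```

The output amplitudes show the two qubits are entangled: measuring either qubit collapses both to the same value, each outcome with probability 1/2.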