Riemannian Geometry: Part 2: Formal Definitions
2. Formal Definitions
Formal Definitions develops the part of Riemannian geometry specified by the approved Chapter 25 table of contents. The treatment is geometry-first and AI-facing.
2.1 Riemannian metric
The Riemannian metric belongs to the canonical scope of Riemannian Geometry. The goal is to make curved-space reasoning concrete enough for ML practice without turning the section into a pure topology course.
Working scope for this subsection: Riemannian metrics, curve length, induced distance, Riemannian gradients, metric tensors, connections, curvature previews, and information geometry. The recurring pattern is localize, linearize, measure, move, and return to the manifold.
Operational definition.
A Riemannian metric $g$ assigns an inner product $g_p$ on the tangent space $T_pM$ to every point $p$, and the assignment varies smoothly with $p$.
Worked reading.
If a coordinate metric is $g_{ij}(x)$, then the length of a velocity $v$ is $\|v\|_g = \sqrt{g_{ij}(x)\, v^i v^j}$.
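A minimal NumPy sketch of this computation, assuming a polar-coordinate metric at a single base point; `g_norm` is an illustrative helper name, not a library routine.

```python
import numpy as np

def g_norm(G, v):
    """Length of a velocity v under the metric matrix G at one point."""
    return float(np.sqrt(v @ G @ v))

# Polar coordinates on the plane: ds^2 = dr^2 + r^2 dtheta^2, here at r = 2.
G = np.diag([1.0, 2.0 ** 2])
v = np.array([0.0, 1.0])      # unit coordinate velocity in the theta direction
print(g_norm(G, v))           # 2.0, not the Euclidean value 1.0
```

The same coordinate velocity has different lengths at different base points, which is the whole content of "the metric varies smoothly with $p$".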
| Geometric object | Meaning | AI interpretation |
|---|---|---|
| Manifold $M$ | Curved space with local coordinates | Data manifold, latent space, constraint set, parameter space |
| Chart | Local coordinate map | Local representation or embedding coordinates |
| Tangent space $T_pM$ | Linearized directions at $p$ | Local perturbations, gradients, velocities |
| Metric $g$ | Inner product on $T_pM$ | Geometry-aware length, angle, steepest descent |
| Geodesic | Straightest curved-space path | Latent interpolation, shortest motion, curved optimization path |
| Retraction | Practical map from a tangent step back to $M$ | Efficient constrained update in training loops |
Three examples of a Riemannian metric:
- Euclidean metric on a sphere inherited from ambient space.
- Fisher metric on statistical models.
- Affine-invariant metric on SPD matrices.
Two non-examples clarify the boundary:
- A distance formula with no tangent-space inner product.
- A fixed Euclidean metric used after nonlinear reparameterization without checking geometry.
Proof or verification habit for a Riemannian metric:
Check symmetry, bilinearity, positive definiteness, and smooth variation with the base point.
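These axioms can be spot-checked numerically. The sketch below assumes a metric given as a matrix-valued function of the base point; the helper names are hypothetical.

```python
import numpy as np

def check_metric_at_point(G, tol=1e-10):
    """Pointwise axioms for a metric matrix G: symmetry and positive
    definiteness. Bilinearity is automatic for a matrix-valued form."""
    symmetric = np.allclose(G, G.T, atol=tol)
    positive_definite = bool(np.all(np.linalg.eigvalsh((G + G.T) / 2) > tol))
    return symmetric and positive_definite

def spot_check_smoothness(G_of_x, x, eps=1e-6):
    """Crude continuity check: the metric entries barely move for a tiny
    change of base point. Smoothness proper needs a symbolic argument."""
    return bool(np.linalg.norm(G_of_x(x + eps) - G_of_x(x)) < 1e-3)

G_polar = lambda r: np.diag([1.0, r ** 2])   # polar-coordinate metric, r > 0
print(check_metric_at_point(G_polar(2.0)))   # True
print(spot_check_smoothness(G_polar, 2.0))   # True
```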
- global object -> curved manifold or constraint set
- local object -> chart, tangent space, or coordinate patch
- linear operation -> derivative, gradient, velocity, Hessian approximation
- geometric measure -> metric, length, distance, curvature
- algorithmic move -> tangent step followed by geodesic or retraction
In AI systems, the Riemannian metric matters because learned representations and constrained parameter spaces are rarely globally flat. A local linear approximation may be useful, but it must be attached to the point where it is valid.
The metric determines what steepest descent, distance, and regularization mean for a representation or parameter space.
Mini derivation lens.
- Choose a point $p$ on the manifold $M$ and name the local representation used near $p$.
- Move the question into a chart, tangent space, or embedded constraint where first-order calculus is available.
- Compute the local object: derivative, tangent projection, metric-weighted gradient, path velocity, or retraction step.
- Translate the result back into coordinate-free language so the answer is not tied to one chart by accident.
- Check the invariant: the point remains on $M$, the direction remains in $T_pM$, or the distance/gradient uses the stated metric.
Implementation lens.
A practical ML implementation should store both the ambient array representation and the geometric contract attached to it. For example, a normalized embedding is not just a vector; it is a point on a sphere. An orthogonal weight matrix is not just a matrix; it is a point on a Stiefel-type constraint. A covariance matrix is not just a symmetric array; it must stay positive definite.
The clean computational pattern is: encode the state, compute an ambient derivative if needed, convert it into a tangent or metric-aware object, take a small local step, and then return to the manifold with a geodesic formula or retraction. This is the same pattern used in the companion notebooks, just scaled down to visible two- and three-dimensional examples.
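As a concrete instance of this encode/differentiate/project/step/retract loop, here is a minimal sketch of projected gradient descent on the unit sphere with a normalization retraction; the objective and step size are illustrative choices, not prescriptions.

```python
import numpy as np

def project_to_tangent(x, g):
    """Project an ambient gradient g onto the tangent space of the sphere at x."""
    return g - (x @ g) * x

def retract(x, v):
    """Normalization retraction: take the tangent step, then return to the sphere."""
    y = x + v
    return y / np.linalg.norm(y)

# Toy objective f(x) = a . x on the sphere, minimized at x = -a / |a|.
a = np.array([1.0, 2.0, 2.0])
x = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    v = project_to_tangent(x, a)   # ambient gradient of f is the constant a
    x = retract(x, -0.1 * v)       # small tangent step, then back to the manifold
print(x, np.linalg.norm(x))        # approaches -a/|a|; the norm stays exactly 1
```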
The important warning is that coordinate code can pass shape checks while still violating geometry. Differential geometry adds checks that are semantic: tangentness, smooth compatibility, metric choice, path validity, and constraint preservation.
Practical checklist:
- State the manifold and whether it is abstract, embedded, or quotient-like.
- State the local coordinates or tangent representation being used.
- Separate ambient vectors from tangent vectors.
- Name the metric before computing distances, angles, or gradients.
- Use geodesics or retractions when moving on the manifold.
- For ML claims, identify whether geometry is data geometry, parameter geometry, or statistical geometry.
Local diagnostic: State the metric before computing lengths or gradients.
The companion notebook uses low-dimensional synthetic examples: circles, spheres, tangent projections, spherical interpolation, SPD matrices, and orthogonality constraints. These examples keep geometry visible while preserving the same update logic used in higher-dimensional ML systems.
| Compact ML phrase | Differential-geometric reading |
|---|---|
| local linearization | tangent-space approximation at a point |
| normalized embedding | point on a sphere with tangent constraints |
| natural gradient | Riemannian gradient under Fisher metric |
| orthogonal weights | point on a Stiefel-type manifold |
| latent interpolation | path that may need geodesic structure |
| covariance geometry | SPD manifold rather than arbitrary matrices |
A useful learning move is to compute everything first on a sphere. The sphere has visible curvature, simple tangent spaces, closed-form geodesics, and practical retractions. Once those are clear, Stiefel, Grassmann, SPD, and information-geometric examples become less mysterious.
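One such sphere computation, as a hedged sketch: closed-form geodesic interpolation (slerp) between unit vectors, contrasted with the straight-line midpoint that leaves the sphere.

```python
import numpy as np

def slerp(x, y, t):
    """Great-circle (geodesic) interpolation between unit vectors x and y."""
    theta = np.arccos(np.clip(x @ y, -1.0, 1.0))
    if theta < 1e-12:              # nearly identical endpoints: nothing to do
        return x
    return (np.sin((1 - t) * theta) * x + np.sin(t * theta) * y) / np.sin(theta)

x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])
print(np.linalg.norm(slerp(x, y, 0.5)))   # 1.0: the geodesic stays on the sphere
print(np.linalg.norm((x + y) / 2))        # ~0.707: the chord midpoint does not
```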
For implementation, the main discipline is to avoid leaving the manifold silently. If a gradient step violates a constraint, either project the gradient into the tangent space before stepping or use a method whose update is intrinsic by design.
The final question for this subsection is whether a Euclidean formula is being used as an approximation, a coordinate expression, or a mistaken replacement for geometry. Differential geometry is the habit of telling those cases apart.
2.2 Riemannian manifold
Riemannian manifolds belong to the canonical scope of Riemannian Geometry. The goal is to make curved-space reasoning concrete enough for ML practice without turning the section into a pure topology course.
Working scope for this subsection: Riemannian metrics, curve length, induced distance, Riemannian gradients, metric tensors, connections, curvature previews, and information geometry. The recurring pattern is localize, linearize, measure, move, and return to the manifold.
Operational definition.
A Riemannian manifold is a pair $(M, g)$: a smooth manifold $M$ equipped with a Riemannian metric $g$ on its tangent spaces.
Worked reading.
Start from a concrete embedded example, compute the local tangent or metric object, then translate back to intrinsic notation.
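A tiny embedded example of that loop, assuming the unit circle with its angle chart; the numerical derivative stands in for the chart's pushforward.

```python
import numpy as np

# Chart for the unit circle S^1: theta -> (cos theta, sin theta).
chart = lambda theta: np.array([np.cos(theta), np.sin(theta)])

theta0 = 0.7
p = chart(theta0)                             # embedded point on the circle
eps = 1e-6
tangent = (chart(theta0 + eps) - p) / eps     # numerical chart velocity at p

# Intrinsic translation: the velocity lies in T_p S^1, i.e. it is orthogonal
# to p, and it has unit speed, so theta is an arc-length coordinate.
print(abs(p @ tangent) < 1e-5)                     # True: tangent to the circle
print(abs(np.linalg.norm(tangent) - 1.0) < 1e-5)   # True: unit-speed chart
```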
| Geometric object | Meaning | AI interpretation |
|---|---|---|
| Manifold $M$ | Curved space with local coordinates | Data manifold, latent space, constraint set, parameter space |
| Chart | Local coordinate map | Local representation or embedding coordinates |
| Tangent space $T_pM$ | Linearized directions at $p$ | Local perturbations, gradients, velocities |
| Metric $g$ | Inner product on $T_pM$ | Geometry-aware length, angle, steepest descent |
| Geodesic | Straightest curved-space path | Latent interpolation, shortest motion, curved optimization path |
| Retraction | Practical map from a tangent step back to $M$ | Efficient constrained update in training loops |
Three examples of a Riemannian manifold:
- Sphere geometry.
- Embedding-space local coordinates.
- Matrix-manifold parameter constraints.
Two non-examples clarify the boundary:
- A flat Euclidean approximation used globally.
- A geometric claim made without metric or tangent space.
Proof or verification habit for a Riemannian manifold:
The proof habit is to compute locally and verify coordinate-independent meaning.
- global object -> curved manifold or constraint set
- local object -> chart, tangent space, or coordinate patch
- linear operation -> derivative, gradient, velocity, Hessian approximation
- geometric measure -> metric, length, distance, curvature
- algorithmic move -> tangent step followed by geodesic or retraction
In AI systems, the Riemannian manifold viewpoint matters because learned representations and constrained parameter spaces are rarely globally flat. A local linear approximation may be useful, but it must be attached to the point where it is valid.
The AI relevance is that model spaces are often curved even when implemented as arrays.
Mini derivation lens.
- Choose a point $p$ on the manifold $M$ and name the local representation used near $p$.
- Move the question into a chart, tangent space, or embedded constraint where first-order calculus is available.
- Compute the local object: derivative, tangent projection, metric-weighted gradient, path velocity, or retraction step.
- Translate the result back into coordinate-free language so the answer is not tied to one chart by accident.
- Check the invariant: the point remains on $M$, the direction remains in $T_pM$, or the distance/gradient uses the stated metric.
Implementation lens.
A practical ML implementation should store both the ambient array representation and the geometric contract attached to it. For example, a normalized embedding is not just a vector; it is a point on a sphere. An orthogonal weight matrix is not just a matrix; it is a point on a Stiefel-type constraint. A covariance matrix is not just a symmetric array; it must stay positive definite.
The clean computational pattern is: encode the state, compute an ambient derivative if needed, convert it into a tangent or metric-aware object, take a small local step, and then return to the manifold with a geodesic formula or retraction. This is the same pattern used in the companion notebooks, just scaled down to visible two- and three-dimensional examples.
The important warning is that coordinate code can pass shape checks while still violating geometry. Differential geometry adds checks that are semantic: tangentness, smooth compatibility, metric choice, path validity, and constraint preservation.
Practical checklist:
- State the manifold and whether it is abstract, embedded, or quotient-like.
- State the local coordinates or tangent representation being used.
- Separate ambient vectors from tangent vectors.
- Name the metric before computing distances, angles, or gradients.
- Use geodesics or retractions when moving on the manifold.
- For ML claims, identify whether geometry is data geometry, parameter geometry, or statistical geometry.
Local diagnostic: Name the manifold, tangent space, metric, and map being used.
The companion notebook uses low-dimensional synthetic examples: circles, spheres, tangent projections, spherical interpolation, SPD matrices, and orthogonality constraints. These examples keep geometry visible while preserving the same update logic used in higher-dimensional ML systems.
| Compact ML phrase | Differential-geometric reading |
|---|---|
| local linearization | tangent-space approximation at a point |
| normalized embedding | point on a sphere with tangent constraints |
| natural gradient | Riemannian gradient under Fisher metric |
| orthogonal weights | point on a Stiefel-type manifold |
| latent interpolation | path that may need geodesic structure |
| covariance geometry | SPD manifold rather than arbitrary matrices |
A useful learning move is to compute everything first on a sphere. The sphere has visible curvature, simple tangent spaces, closed-form geodesics, and practical retractions. Once those are clear, Stiefel, Grassmann, SPD, and information-geometric examples become less mysterious.
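For the Stiefel-type case, a standard practical move is the QR retraction sketched below: step in the ambient direction, then re-orthonormalize. This is one common choice, not the only valid retraction.

```python
import numpy as np

def qr_retract(X, V):
    """QR retraction for a matrix with orthonormal columns: step to X + V in
    the ambient space, then re-orthonormalize with a sign fix for continuity."""
    Q, R = np.linalg.qr(X + V)
    signs = np.sign(np.sign(np.diag(R)) + 0.5)   # map diag signs {<=0, >0} to {-1, +1}
    return Q * signs

rng = np.random.default_rng(0)
X = np.linalg.qr(rng.normal(size=(5, 2)))[0]     # random point with orthonormal columns
V = 0.01 * rng.normal(size=(5, 2))               # small ambient step
Y = qr_retract(X, V)
print(np.allclose(Y.T @ Y, np.eye(2)))           # True: the constraint is preserved
```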
For implementation, the main discipline is to avoid leaving the manifold silently. If a gradient step violates a constraint, either project the gradient into the tangent space before stepping or use a method whose update is intrinsic by design.
The final question for this subsection is whether a Euclidean formula is being used as an approximation, a coordinate expression, or a mistaken replacement for geometry. Differential geometry is the habit of telling those cases apart.
2.3 Length of curves and induced distance
Curve length and induced distance belong to the canonical scope of Riemannian Geometry. The goal is to make curved-space reasoning concrete enough for ML practice without turning the section into a pure topology course.
Working scope for this subsection: Riemannian metrics, curve length, induced distance, Riemannian gradients, metric tensors, connections, curvature previews, and information geometry. The recurring pattern is localize, linearize, measure, move, and return to the manifold.
Operational definition.
The length of a curve $\gamma : [a,b] \to M$ is $L(\gamma) = \int_a^b \|\gamma'(t)\|_g \, dt$, and the induced distance $d(p,q)$ is the infimum of $L(\gamma)$ over curves joining $p$ to $q$.
Worked reading.
On the unit sphere with its inherited metric, the induced distance between unit vectors $x$ and $y$ is the great-circle arc length $d(x,y) = \arccos(x^\top y)$.
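A numerical check of that claim, approximating curve length by chord sums; `curve_length` is an illustrative helper, and the ambient norm supplies the metric because the sphere is embedded.

```python
import numpy as np

def curve_length(gamma, n=10_000):
    """Approximate the length of a curve gamma : [0, 1] -> R^3 by chord sums."""
    ts = np.linspace(0.0, 1.0, n)
    pts = np.array([gamma(t) for t in ts])
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])
theta = np.arccos(x @ y)                               # induced distance, pi/2
great_circle = lambda t: np.cos(t * theta) * x + np.sin(t * theta) * y
print(curve_length(great_circle), theta)               # both approx 1.5708
```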
| Geometric object | Meaning | AI interpretation |
|---|---|---|
| Manifold $M$ | Curved space with local coordinates | Data manifold, latent space, constraint set, parameter space |
| Chart | Local coordinate map | Local representation or embedding coordinates |
| Tangent space $T_pM$ | Linearized directions at $p$ | Local perturbations, gradients, velocities |
| Metric $g$ | Inner product on $T_pM$ | Geometry-aware length, angle, steepest descent |
| Geodesic | Straightest curved-space path | Latent interpolation, shortest motion, curved optimization path |
| Retraction | Practical map from a tangent step back to $M$ | Efficient constrained update in training loops |
Three examples of curve length and induced distance:
- Great-circle distance on the sphere, induced by the ambient metric.
- Fisher-Rao distance on statistical models, induced by the Fisher metric.
- Affine-invariant distance on SPD matrices.
Two non-examples clarify the boundary:
- A distance formula with no tangent-space inner product.
- A fixed Euclidean metric used after nonlinear reparameterization without checking geometry.
Proof or verification habit for curve length and induced distance:
Check that length is invariant under reparameterization and that the induced distance satisfies the metric-space axioms: symmetry, the triangle inequality, and positivity.
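A quick numerical spot check of those axioms for the great-circle distance, with random unit vectors; `sphere_dist` is an illustrative name.

```python
import numpy as np

def sphere_dist(x, y):
    """Induced (great-circle) distance between unit vectors on the sphere."""
    return float(np.arccos(np.clip(x @ y, -1.0, 1.0)))

rng = np.random.default_rng(0)
x, y, z = (p / np.linalg.norm(p) for p in rng.normal(size=(3, 4)))
print(np.isclose(sphere_dist(x, y), sphere_dist(y, x)))             # symmetry
print(sphere_dist(x, z) <= sphere_dist(x, y) + sphere_dist(y, z))   # triangle inequality
```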
- global object -> curved manifold or constraint set
- local object -> chart, tangent space, or coordinate patch
- linear operation -> derivative, gradient, velocity, Hessian approximation
- geometric measure -> metric, length, distance, curvature
- algorithmic move -> tangent step followed by geodesic or retraction
In AI systems, curve length and induced distance matter because learned representations and constrained parameter spaces are rarely globally flat. A local linear approximation may be useful, but it must be attached to the point where it is valid.
The metric determines what steepest descent, distance, and regularization mean for a representation or parameter space.
Mini derivation lens.
- Choose a point $p$ on the manifold $M$ and name the local representation used near $p$.
- Move the question into a chart, tangent space, or embedded constraint where first-order calculus is available.
- Compute the local object: derivative, tangent projection, metric-weighted gradient, path velocity, or retraction step.
- Translate the result back into coordinate-free language so the answer is not tied to one chart by accident.
- Check the invariant: the point remains on $M$, the direction remains in $T_pM$, or the distance/gradient uses the stated metric.
Implementation lens.
A practical ML implementation should store both the ambient array representation and the geometric contract attached to it. For example, a normalized embedding is not just a vector; it is a point on a sphere. An orthogonal weight matrix is not just a matrix; it is a point on a Stiefel-type constraint. A covariance matrix is not just a symmetric array; it must stay positive definite.
The clean computational pattern is: encode the state, compute an ambient derivative if needed, convert it into a tangent or metric-aware object, take a small local step, and then return to the manifold with a geodesic formula or retraction. This is the same pattern used in the companion notebooks, just scaled down to visible two- and three-dimensional examples.
The important warning is that coordinate code can pass shape checks while still violating geometry. Differential geometry adds checks that are semantic: tangentness, smooth compatibility, metric choice, path validity, and constraint preservation.
Practical checklist:
- State the manifold and whether it is abstract, embedded, or quotient-like.
- State the local coordinates or tangent representation being used.
- Separate ambient vectors from tangent vectors.
- Name the metric before computing distances, angles, or gradients.
- Use geodesics or retractions when moving on the manifold.
- For ML claims, identify whether geometry is data geometry, parameter geometry, or statistical geometry.
Local diagnostic: State the metric before computing lengths or gradients.
The companion notebook uses low-dimensional synthetic examples: circles, spheres, tangent projections, spherical interpolation, SPD matrices, and orthogonality constraints. These examples keep geometry visible while preserving the same update logic used in higher-dimensional ML systems.
| Compact ML phrase | Differential-geometric reading |
|---|---|
| local linearization | tangent-space approximation at a point |
| normalized embedding | point on a sphere with tangent constraints |
| natural gradient | Riemannian gradient under Fisher metric |
| orthogonal weights | point on a Stiefel-type manifold |
| latent interpolation | path that may need geodesic structure |
| covariance geometry | SPD manifold rather than arbitrary matrices |
A useful learning move is to compute everything first on a sphere. The sphere has visible curvature, simple tangent spaces, closed-form geodesics, and practical retractions. Once those are clear, Stiefel, Grassmann, SPD, and information-geometric examples become less mysterious.
For implementation, the main discipline is to avoid leaving the manifold silently. If a gradient step violates a constraint, either project the gradient into the tangent space before stepping or use a method whose update is intrinsic by design.
The final question for this subsection is whether a Euclidean formula is being used as an approximation, a coordinate expression, or a mistaken replacement for geometry. Differential geometry is the habit of telling those cases apart.
2.4 Riemannian gradient
The Riemannian gradient belongs to the canonical scope of Riemannian Geometry. The goal is to make curved-space reasoning concrete enough for ML practice without turning the section into a pure topology course.
Working scope for this subsection: Riemannian metrics, curve length, induced distance, Riemannian gradients, metric tensors, connections, curvature previews, and information geometry. The recurring pattern is localize, linearize, measure, move, and return to the manifold.
Operational definition.
The Riemannian gradient $\operatorname{grad} f(p)$ is the tangent vector whose inner product with any direction $v \in T_pM$ equals the directional derivative: $\langle \operatorname{grad} f(p), v \rangle_g = df_p(v)$.
Worked reading.
In coordinates with metric matrix $G(x)$, the Riemannian gradient is $\operatorname{grad} f = G(x)^{-1} \nabla f(x)$, which is not usually the raw Euclidean gradient $\nabla f(x)$.
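In code, this is one linear solve per step. A minimal sketch with the polar-coordinate metric, assuming the Euclidean gradient is already available:

```python
import numpy as np

def riemannian_grad(G, euclidean_grad):
    """Solve G(x) grad = nabla f(x): convert the covector df into an update vector."""
    return np.linalg.solve(G, euclidean_grad)

# Polar coordinates (r, theta) with G = diag(1, r^2), evaluated at r = 2.
G = np.diag([1.0, 2.0 ** 2])
df = np.array([0.0, 1.0])         # df/dr = 0, df/dtheta = 1
print(riemannian_grad(G, df))     # [0.0, 0.25]: the theta component shrinks by 1/r^2
```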
| Geometric object | Meaning | AI interpretation |
|---|---|---|
| Manifold $M$ | Curved space with local coordinates | Data manifold, latent space, constraint set, parameter space |
| Chart | Local coordinate map | Local representation or embedding coordinates |
| Tangent space $T_pM$ | Linearized directions at $p$ | Local perturbations, gradients, velocities |
| Metric $g$ | Inner product on $T_pM$ | Geometry-aware length, angle, steepest descent |
| Geodesic | Straightest curved-space path | Latent interpolation, shortest motion, curved optimization path |
| Retraction | Practical map from a tangent step back to $M$ | Efficient constrained update in training loops |
Three examples of a Riemannian gradient:
- Natural gradient using Fisher information.
- Projected gradient on the sphere.
- Geometry-aware update for SPD covariance matrices.
Two non-examples clarify the boundary:
- Raw parameter gradient treated as invariant under reparameterization.
- A direction off the tangent space called a manifold gradient.
Proof or verification habit for a Riemannian gradient:
Use the defining identity $\langle \operatorname{grad} f, v \rangle_g = df(v)$ for all tangent directions $v$.
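That identity is directly checkable in coordinates: for the candidate $\operatorname{grad} f = G^{-1}\nabla f$, the equality must hold for every tangent direction. A sketch with a hypothetical metric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
G = np.diag([1.0, 4.0])          # hypothetical metric matrix at one point
df = rng.normal(size=2)          # Euclidean gradient, i.e. the differential df
grad = np.linalg.solve(G, df)    # candidate Riemannian gradient G^{-1} df
for _ in range(3):
    v = rng.normal(size=2)       # arbitrary tangent direction
    print(np.isclose(grad @ G @ v, df @ v))   # True: <grad f, v>_g = df(v)
```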
- global object -> curved manifold or constraint set
- local object -> chart, tangent space, or coordinate patch
- linear operation -> derivative, gradient, velocity, Hessian approximation
- geometric measure -> metric, length, distance, curvature
- algorithmic move -> tangent step followed by geodesic or retraction
In AI systems, the Riemannian gradient matters because learned representations and constrained parameter spaces are rarely globally flat. A local linear approximation may be useful, but it must be attached to the point where it is valid.
Natural gradient and second-order preconditioning are geometry choices, not only optimizer tricks.
Mini derivation lens.
- Choose a point $p$ on the manifold $M$ and name the local representation used near $p$.
- Move the question into a chart, tangent space, or embedded constraint where first-order calculus is available.
- Compute the local object: derivative, tangent projection, metric-weighted gradient, path velocity, or retraction step.
- Translate the result back into coordinate-free language so the answer is not tied to one chart by accident.
- Check the invariant: the point remains on $M$, the direction remains in $T_pM$, or the distance/gradient uses the stated metric.
Implementation lens.
A practical ML implementation should store both the ambient array representation and the geometric contract attached to it. For example, a normalized embedding is not just a vector; it is a point on a sphere. An orthogonal weight matrix is not just a matrix; it is a point on a Stiefel-type constraint. A covariance matrix is not just a symmetric array; it must stay positive definite.
The clean computational pattern is: encode the state, compute an ambient derivative if needed, convert it into a tangent or metric-aware object, take a small local step, and then return to the manifold with a geodesic formula or retraction. This is the same pattern used in the companion notebooks, just scaled down to visible two- and three-dimensional examples.
The important warning is that coordinate code can pass shape checks while still violating geometry. Differential geometry adds checks that are semantic: tangentness, smooth compatibility, metric choice, path validity, and constraint preservation.
Practical checklist:
- State the manifold and whether it is abstract, embedded, or quotient-like.
- State the local coordinates or tangent representation being used.
- Separate ambient vectors from tangent vectors.
- Name the metric before computing distances, angles, or gradients.
- Use geodesics or retractions when moving on the manifold.
- For ML claims, identify whether geometry is data geometry, parameter geometry, or statistical geometry.
Local diagnostic: Ask which metric converts covectors into update vectors.
The companion notebook uses low-dimensional synthetic examples: circles, spheres, tangent projections, spherical interpolation, SPD matrices, and orthogonality constraints. These examples keep geometry visible while preserving the same update logic used in higher-dimensional ML systems.
| Compact ML phrase | Differential-geometric reading |
|---|---|
| local linearization | tangent-space approximation at a point |
| normalized embedding | point on a sphere with tangent constraints |
| natural gradient | Riemannian gradient under Fisher metric |
| orthogonal weights | point on a Stiefel-type manifold |
| latent interpolation | path that may need geodesic structure |
| covariance geometry | SPD manifold rather than arbitrary matrices |
A useful learning move is to compute everything first on a sphere. The sphere has visible curvature, simple tangent spaces, closed-form geodesics, and practical retractions. Once those are clear, Stiefel, Grassmann, SPD, and information-geometric examples become less mysterious.
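As a first SPD computation, here is a sketch of the affine-invariant distance using SciPy matrix functions; the invariance check under a congruence $X \mapsto MXM^\top$ is exactly what "affine-invariant" means.

```python
import numpy as np
from scipy.linalg import logm, sqrtm

def spd_dist(A, B):
    """Affine-invariant distance d(A, B) = ||logm(A^{-1/2} B A^{-1/2})||_F."""
    A_inv_sqrt = np.linalg.inv(sqrtm(A))
    return float(np.linalg.norm(logm(A_inv_sqrt @ B @ A_inv_sqrt)))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.5, -0.2], [-0.2, 2.0]])
M = np.array([[1.0, 0.5], [0.0, 1.0]])   # any invertible matrix
print(np.isclose(spd_dist(A, B), spd_dist(M @ A @ M.T, M @ B @ M.T)))  # True
```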
For implementation, the main discipline is to avoid leaving the manifold silently. If a gradient step violates a constraint, either project the gradient into the tangent space before stepping or use a method whose update is intrinsic by design.
The final question for this subsection is whether a Euclidean formula is being used as an approximation, a coordinate expression, or a mistaken replacement for geometry. Differential geometry is the habit of telling those cases apart.
2.5 Volume forms and integration preview
Volume forms and the integration preview belong to the canonical scope of Riemannian Geometry. The goal is to make curved-space reasoning concrete enough for ML practice without turning the section into a pure topology course.
Working scope for this subsection: Riemannian metrics, curve length, induced distance, Riemannian gradients, metric tensors, connections, curvature previews, and information geometry. The recurring pattern is localize, linearize, measure, move, and return to the manifold.
Operational definition.
In a chart, the Riemannian volume form is $dV_g = \sqrt{\det g(x)}\, dx^1 \cdots dx^n$; the $\sqrt{\det g}$ factor is what makes integrals over $M$ independent of the chosen chart.
Worked reading.
In polar coordinates on the plane, $g = \operatorname{diag}(1, r^2)$, so $\sqrt{\det g} = r$ and the area element is the familiar $r \, dr \, d\theta$.
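A minimal sketch of that example: integrating the constant function 1 over the unit disc with the $\sqrt{\det g} = r$ factor recovers the Euclidean area $\pi$.

```python
import numpy as np

# Riemann sum over the unit disc of sqrt(det g) dr dtheta with the
# polar-coordinate metric g = diag(1, r^2), so sqrt(det g) = r.
rs = np.linspace(0.0, 1.0, 1000)
thetas = np.linspace(0.0, 2 * np.pi, 1000)
dr, dtheta = rs[1] - rs[0], thetas[1] - thetas[0]
sqrt_det_g = rs[:, None] * np.ones((1, thetas.size))   # the factor r on the grid
area = float(np.sum(sqrt_det_g) * dr * dtheta)
print(area)    # approx pi; dropping sqrt(det g) would give approx 2*pi instead
```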
| Geometric object | Meaning | AI interpretation |
|---|---|---|
| Manifold $M$ | Curved space with local coordinates | Data manifold, latent space, constraint set, parameter space |
| Chart | Local coordinate map | Local representation or embedding coordinates |
| Tangent space $T_pM$ | Linearized directions at $p$ | Local perturbations, gradients, velocities |
| Metric $g$ | Inner product on $T_pM$ | Geometry-aware length, angle, steepest descent |
| Geodesic | Straightest curved-space path | Latent interpolation, shortest motion, curved optimization path |
| Retraction | Practical map from a tangent step back to $M$ | Efficient constrained update in training loops |
Three examples of volume forms:
- The surface-area form on the sphere induced by the ambient metric.
- The density $\sqrt{\det F(\theta)}$ attached to the Fisher metric on statistical models.
- The volume form induced by the affine-invariant metric on SPD matrices.
Two non-examples clarify the boundary:
- A coordinate integral that omits the $\sqrt{\det g}$ factor.
- A fixed Euclidean volume element reused after nonlinear reparameterization without checking geometry.
Proof or verification habit for volume forms and integration preview:
Check that $\sqrt{\det g}$ rescales by the Jacobian determinant under a change of chart, so that the integral of a function is chart-independent.
- global object -> curved manifold or constraint set
- local object -> chart, tangent space, or coordinate patch
- linear operation -> derivative, gradient, velocity, Hessian approximation
- geometric measure -> metric, length, distance, curvature
- algorithmic move -> tangent step followed by geodesic or retraction
In AI systems, volume forms and integration matter because learned representations and constrained parameter spaces are rarely globally flat. A local linear approximation may be useful, but it must be attached to the point where it is valid.
The metric determines what steepest descent, distance, and regularization mean for a representation or parameter space.
Mini derivation lens.
- Choose a point $p$ on the manifold $M$ and name the local representation used near $p$.
- Move the question into a chart, tangent space, or embedded constraint where first-order calculus is available.
- Compute the local object: derivative, tangent projection, metric-weighted gradient, path velocity, or retraction step.
- Translate the result back into coordinate-free language so the answer is not tied to one chart by accident.
- Check the invariant: the point remains on $M$, the direction remains in $T_pM$, or the distance/gradient uses the stated metric.
Implementation lens.
A practical ML implementation should store both the ambient array representation and the geometric contract attached to it. For example, a normalized embedding is not just a vector; it is a point on a sphere. An orthogonal weight matrix is not just a matrix; it is a point on a Stiefel-type constraint. A covariance matrix is not just a symmetric array; it must stay positive definite.
The clean computational pattern is: encode the state, compute an ambient derivative if needed, convert it into a tangent or metric-aware object, take a small local step, and then return to the manifold with a geodesic formula or retraction. This is the same pattern used in the companion notebooks, just scaled down to visible two- and three-dimensional examples.
The important warning is that coordinate code can pass shape checks while still violating geometry. Differential geometry adds checks that are semantic: tangentness, smooth compatibility, metric choice, path validity, and constraint preservation.
Practical checklist:
- State the manifold and whether it is abstract, embedded, or quotient-like.
- State the local coordinates or tangent representation being used.
- Separate ambient vectors from tangent vectors.
- Name the metric before computing distances, angles, or gradients.
- Use geodesics or retractions when moving on the manifold.
- For ML claims, identify whether geometry is data geometry, parameter geometry, or statistical geometry.
Local diagnostic: State the metric before computing lengths or gradients.
The companion notebook uses low-dimensional synthetic examples: circles, spheres, tangent projections, spherical interpolation, SPD matrices, and orthogonality constraints. These examples keep geometry visible while preserving the same update logic used in higher-dimensional ML systems.
| Compact ML phrase | Differential-geometric reading |
|---|---|
| local linearization | tangent-space approximation at a point |
| normalized embedding | point on a sphere with tangent constraints |
| natural gradient | Riemannian gradient under Fisher metric |
| orthogonal weights | point on a Stiefel-type manifold |
| latent interpolation | path that may need geodesic structure |
| covariance geometry | SPD manifold rather than arbitrary matrices |
A useful learning move is to compute everything first on a sphere. The sphere has visible curvature, simple tangent spaces, closed-form geodesics, and practical retractions. Once those are clear, Stiefel, Grassmann, SPD, and information-geometric examples become less mysterious.
For implementation, the main discipline is to avoid leaving the manifold silently. If a gradient step violates a constraint, either project the gradient into the tangent space before stepping or use a method whose update is intrinsic by design.
The final question for this subsection is whether a Euclidean formula is being used as an approximation, a coordinate expression, or a mistaken replacement for geometry. Differential geometry is the habit of telling those cases apart.