Tools & applications

A formula alone changes nothing. Its value appears only when embedded in practice, when numbers emerge from real systems rather than notebooks.

So far we have established:

  • V (volume) represents internal cohesion — how many elements work together without escaping the boundary.
  • A (surface) represents the external contract — public methods, imports, outgoing calls, dependencies.
  • S (sphericity) expresses balance:

S = V / A^α [1]

High S indicates dense purpose with minimal exposure. Low S signals a boundary that leaks more logic than it contains. The remaining task is mechanical: extract V and A from code, compute S, and surface the results.
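
As a first mechanical step, the formula itself fits in a few lines. This is a minimal sketch: the function name and the default exponent of 1.5 are illustrative assumptions, since α is left here as a tuning parameter.

```ts
// Sketch: S = V / A^α as a plain function.
// The default alpha of 1.5 is an illustrative assumption, not a value
// prescribed by the text; treat it as a tuning parameter.
export function sphericity(v: number, a: number, alpha = 1.5): number {
  if (a <= 0) return Infinity; // no surface at all: maximally closed
  return v / Math.pow(a, alpha);
}
```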

What the Tool Must Recognize

To measure sphericity in a TypeScript codebase, the tool observes structure rather than semantics:

| What we measure | Represents | Counts toward |
| --- | --- | --- |
| Private methods, internal calls, internal class references | Cohesion inside the boundary | V |
| Public methods, module exports, API handlers | System surface | A |
| External imports or cross-service calls | Surface tension | A |
| Internal dependency graph density | Strength of internal attraction | V |

No runtime behavior is required. Only structure and relationships matter.
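
Read as code, the table is just a lookup from the kind of observation to the total it feeds. The kind names below are invented labels for the rows above, not an API from any library:

```ts
// Sketch: the classification table as a lookup.
// Kind names are illustrative labels, not library identifiers.
type Counter = "V" | "A";

type RefKind =
  | "private-method"
  | "internal-call"
  | "internal-class-reference"
  | "internal-dependency-edge"
  | "public-method"
  | "module-export"
  | "api-handler"
  | "external-import"
  | "cross-service-call";

const countsToward: Record<RefKind, Counter> = {
  "private-method": "V",           // cohesion inside the boundary
  "internal-call": "V",
  "internal-class-reference": "V",
  "internal-dependency-edge": "V", // internal graph density
  "public-method": "A",            // system surface
  "module-export": "A",
  "api-handler": "A",
  "external-import": "A",          // surface tension
  "cross-service-call": "A",
};
```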

How It Works in Principle

  1. Parse source code into an AST (Abstract Syntax Tree)[2]
  2. Build a dependency graph of internal vs external links[3]
  3. Classify edges:
    • internal → increases V
    • public or external → increases A
  4. Compute S per module, per class, and per service (see the sketch after this list)
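
A minimal sketch of those four steps with ts-morph, under two assumptions the text does not fix: a "module" is a single source file, and α defaults to 1.5 as before. Call edges between methods are omitted for brevity.

```ts
import { Project, Scope } from "ts-morph";

interface ModuleScore {
  file: string;
  v: number; // internal cohesion
  a: number; // public surface
  s: number; // sphericity
}

export function analyze(tsConfigFilePath: string, alpha = 1.5): ModuleScore[] {
  const project = new Project({ tsConfigFilePath });
  const scores: ModuleScore[] = [];

  for (const sourceFile of project.getSourceFiles()) {
    let v = 0;
    let a = 0;

    // Steps 2-3: classify import edges as internal (V) or external (A).
    for (const imp of sourceFile.getImportDeclarations()) {
      if (imp.getModuleSpecifierValue().startsWith(".")) v += 1;
      else a += 1;
    }

    // Exported declarations are part of the public surface.
    a += sourceFile.getExportedDeclarations().size;

    // Private class members count toward cohesion, the rest toward surface.
    for (const cls of sourceFile.getClasses()) {
      for (const method of cls.getMethods()) {
        if (method.getScope() === Scope.Private) v += 1;
        else a += 1;
      }
    }

    // Step 4: S = V / A^α, guarding against an empty surface.
    const s = a > 0 ? v / Math.pow(a, alpha) : Infinity;
    scores.push({ file: sourceFile.getBaseName(), v, a, s });
  }

  return scores;
}
```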

Expected Patterns

| Situation | Expected S value | Interpretation |
| --- | --- | --- |
| Many internals, few public endpoints | High | Dense, stable, cohesive |
| Many exports, thin implementation | Low | All surface, little purpose |
| Heavy internal logic + many external calls | Medium but unstable | Needs boundary refinement |
| Balanced core, narrow interface | High and healthy | Sustainable shape |

Unlike generic complexity metrics, sphericity measures proportion, not size.

A small system can be unhealthy, a large system can be stable.
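
Two invented data points, fed through the sphericity sketch above (same assumed α = 1.5), make the contrast concrete:

```ts
sphericity(40, 25);   // ≈ 0.32: small module, wide surface, unhealthy
sphericity(1200, 18); // ≈ 15.7: large module, narrow surface, stable
```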

The First Tool: Concept

The initial analyzer does not attempt to understand behavior, intention, or domain meaning. Its role is simpler: expose the geometry underneath the design.

Minimal goals:

  • Read a TypeScript project
  • Count internal vs external references
  • Calculate S = V / A^α
  • Produce a ranked list of modules by stability
  • Highlight modules whose surface has grown out of proportion to what it contains

This first version behaves like a thermometer. It measures shape, not correctness.
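
A thermometer needs a readout. One possible sketch, reusing the ModuleScore shape from the analyze() example earlier; the cutoff of 1.0 for flagging exposure is an arbitrary illustrative threshold, not a calibrated value:

```ts
// Sketch: rank modules by S and flag the most exposed ones.
function report(scores: ModuleScore[]): void {
  const ranked = [...scores].sort((x, y) => y.s - x.s);
  for (const m of ranked) {
    const flag = m.s < 1.0 ? "⚠️ High exposure" : "✅ Stable core";
    console.log(`${m.file}  V=${m.v}  A=${m.a}  S=${m.s.toFixed(2)}  ${flag}`);
  }
}
```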

What the Output Might Look Like

Module: UserService
  Internal cohesion (V): 212
  Public surface (A): 14
  Sphericity (S): 4.28  ✅ Stable core

Module: ApiRouter
  Internal cohesion (V): 48
  Public surface (A): 27
  Sphericity (S): 0.62  ⚠️ High exposure

Module: ReportGenerator
  Internal cohesion (V): 310
  Public surface (A): 6
  Sphericity (S): 12.9  ✅ Highly cohesive

A low S is not an error. It is a spotlight.

Tooling Choices

The natural environment for this first iteration is the TypeScript ecosystem:

  • ts-morph for structural parsing[4]
  • graphlib for dependency graph construction[5] (see the sketch at the end of this section)
  • Node CLI to produce local reports

This keeps the tool:

  • language-native
  • static (no execution needed)
  • fast enough for CI pipelines
  • simple enough to extend to other languages later
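
To make the graph step concrete, here is a minimal sketch combining the two libraries. The node keys, the internal flag, and the treatment of relative specifiers are assumptions; file-extension resolution is omitted for brevity.

```ts
import { Graph } from "graphlib";
import { Project } from "ts-morph";
import * as path from "path";

// Sketch: build a directed dependency graph where project files are
// "internal" nodes and bare package specifiers are "external" nodes.
export function buildDependencyGraph(tsConfigFilePath: string): Graph {
  const project = new Project({ tsConfigFilePath });
  const graph = new Graph({ directed: true });

  for (const sourceFile of project.getSourceFiles()) {
    const from = sourceFile.getFilePath();
    graph.setNode(from, { internal: true });

    for (const imp of sourceFile.getImportDeclarations()) {
      const spec = imp.getModuleSpecifierValue();
      // Relative specifiers stay inside the project boundary; bare
      // specifiers cross it. Extension resolution is skipped here.
      const to = spec.startsWith(".")
        ? path.resolve(path.dirname(from), spec)
        : spec;
      graph.setNode(to, { internal: spec.startsWith(".") });
      graph.setEdge(from, to);
    }
  }

  return graph;
}
```

From here, per-node edge counts (for example via graph.outEdges) feed the V and A totals described earlier.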

What Comes After

Once modules have measurable geometry, new questions can be asked:

  • Which boundaries leak the most complexity?
  • Which modules are strong but too isolated?
  • Where should a service split or merge?
  • Which parts of the system feel smooth, and which feel spiky?

The numbers are coordinates. They turn design into terrain — something that can be navigated instead of debated.

A codebase improves when its boundaries settle into that terrain.

References

  1. Isoperimetric inequality
  2. Abstract syntax tree
  3. Dependency graph
  4. ts-morph
  5. graphlib