Testing & Debugging Aspects
Testing and debugging compile-time code call for different approaches than traditional run-time code. This chapter covers the main strategies available.
Testing Strategies
Metalama supports three complementary testing approaches:
| Strategy | What It Tests | Executes Code? | Best For |
|---|---|---|---|
| Snapshot Testing | Code transformation correctness | No | Verifying generated code shape |
| Run-Time Testing | Actual behavior | Yes | Verifying side effects and outcomes |
| Compile-Time Unit Testing | Compile-time helper methods | Partially | Complex compile-time logic |
Snapshot Testing
Snapshot testing compares the transformed output against a baseline file. If the aspect changes its code generation, the test fails.
Setup
- Add the test framework package:

```shell
dotnet add package Metalama.Testing.AspectTesting
```

- Create a test project with the following structure:

```text
MyAspect.Tests/
├── LogTests/
│   ├── BasicLog.cs       ← Input code
│   └── BasicLog.t.cs     ← Expected output (baseline)
├── RetryTests/
│   ├── RetrySync.cs
│   ├── RetrySync.t.cs
│   ├── RetryAsync.cs
│   └── RetryAsync.t.cs
└── MyAspect.Tests.csproj
```
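A minimal test project file might look like the following sketch. The package versions are placeholders, and the `MyAspects` project reference is an assumed name for the aspect library under test:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Version numbers are illustrative, not prescriptive -->
    <PackageReference Include="Metalama.Testing.AspectTesting" Version="2024.*" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.*" />
    <PackageReference Include="xunit" Version="2.*" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.*" />
    <!-- Reference to the project containing the aspects under test -->
    <ProjectReference Include="..\MyAspects\MyAspects.csproj" />
  </ItemGroup>
</Project>
```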
Writing a Snapshot Test
Input file (`BasicLog.cs`):

```csharp
using MyAspects;

public class TestTarget
{
    [Log]
    public int Add(int a, int b)
    {
        return a + b;
    }
}
```
Expected output (`BasicLog.t.cs`):

```csharp
using MyAspects;

public class TestTarget
{
    [Log]
    public int Add(int a, int b)
    {
        Console.WriteLine(">> Entering Add");
        Console.WriteLine($"   a = {a}");
        Console.WriteLine($"   b = {b}");
        try
        {
            int result;
            result = a + b;
            Console.WriteLine($"<< Exiting Add with result: {result}");
            return result;
        }
        catch (Exception ex)
        {
            Console.WriteLine($"!! Exception in Add: {ex.Message}");
            throw;
        }
    }
}
```
Running Snapshot Tests
Run tests with `dotnet test`. The framework:
- Compiles the input file with Metalama
- Compares the transformed output to the `.t.cs` baseline
- Reports differences as test failures
Updating Baselines
When you intentionally change an aspect's behavior:
```shell
# Regenerate all baselines
dotnet test -p:UpdateExpectedOutput=true
```
Testing Diagnostics
To test that an aspect produces expected warnings or errors:
Input file (`ErrorTest.cs`):

```csharp
public class TestTarget
{
    [Cache] // Should produce an error: void methods can't be cached
    public void DoSomething() { }
}
```
Expected output (`ErrorTest.t.cs`) includes diagnostic comments:

```csharp
public class TestTarget
{
    [Cache]
    public void DoSomething() // Error MY001: Cannot cache void methods
    { }
}
```
Run-Time Testing
Run-time testing verifies the actual behavior of aspect-transformed code using standard testing frameworks.
Setup
Use any standard testing framework (xUnit, NUnit, MSTest):
```csharp
// Using xUnit + FluentAssertions + NSubstitute (GST convention)
public class LogAttributeTests
{
    [Fact]
    public void Log_ShouldLogMethodEntry()
    {
        // Arrange
        var logger = Substitute.For<ILoggerService>();
        AspectServiceLocator.Initialize(
            new ServiceCollection()
                .AddSingleton(logger)
                .BuildServiceProvider());
        var service = new TestService();

        // Act
        service.DoWork();

        // Assert
        logger.Received(1).Debug(
            Arg.Any<string>(),
            Arg.Is<string>(s => s.Contains("Entering DoWork")));
    }
}

public class TestService
{
    [Log]
    public void DoWork()
    {
        // Business logic
    }
}
```
Testing the GST Aspects
The GST framework has comprehensive run-time tests in GST.Core.Aspects.Tests:
```csharp
public class NotNullAttributeTests
{
    [Fact]
    public void NotNull_WhenNull_ThrowsArgumentNullException()
    {
        // Arrange
        var service = new TestService();

        // Act & Assert
        var act = () => service.Process(null!);
        act.Should().Throw<ArgumentNullException>()
            .WithParameterName("input");
    }

    [Fact]
    public void NotNull_WhenNotNull_Succeeds()
    {
        var service = new TestService();
        var act = () => service.Process("valid");
        act.Should().NotThrow();
    }
}

public class TestService
{
    public void Process([NotNull] string input)
    {
        // Only reached if input is not null
    }
}
```
Testing Retry Behavior
```csharp
public class RetryAttributeTests
{
    [Fact]
    public void Retry_ShouldRetryOnFailure()
    {
        var callCount = 0;
        var service = new RetryTestService(() =>
        {
            callCount++;
            if (callCount < 3)
                throw new InvalidOperationException("Transient error");
        });

        service.UnstableMethod();

        callCount.Should().Be(3); // Called 3 times (2 failures + 1 success)
    }
}
```
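The test above presumes a `RetryTestService` along these lines — a hypothetical helper whose `[Retry]`-decorated method simply invokes the supplied delegate (the `MaxAttempts` parameter name is an assumption about the GST `Retry` aspect, not a documented API):

```csharp
using System;

// Hypothetical helper for the retry test: the injected action simulates
// transient failures, and the aspect re-invokes the method on exception.
public class RetryTestService
{
    private readonly Action _action;

    public RetryTestService(Action action) => _action = action;

    [Retry(MaxAttempts = 3)] // Parameter name assumed for illustration
    public void UnstableMethod() => _action();
}
```

Keeping the failure logic in a delegate lets each test decide how many times the method should fail before succeeding.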
Testing Caching
```csharp
public class CacheAttributeTests
{
    [Fact]
    public void Cache_ShouldReturnCachedValue()
    {
        // Arrange
        var cacheService = new MemoryCacheService();
        AspectServiceLocator.Initialize(
            new ServiceCollection()
                .AddSingleton<ICacheService>(cacheService)
                .AddSingleton<ICacheKeyGenerator, DefaultCacheKeyGenerator>()
                .BuildServiceProvider());
        var repository = new TestRepository();

        // Act
        var result1 = repository.GetById(1);
        var result2 = repository.GetById(1); // Should hit cache

        // Assert
        repository.CallCount.Should().Be(1); // Only called once
        result2.Should().Be(result1);
    }
}
```
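The `TestRepository` referenced above could be sketched as follows. This is a hypothetical shape: `CallCount` is incremented only when the method body actually executes, which a cache hit should short-circuit:

```csharp
// Hypothetical repository used by the caching test.
public class TestRepository
{
    // Incremented only on a cache miss, when the method body runs.
    public int CallCount { get; private set; }

    [Cache]
    public string GetById(int id)
    {
        CallCount++;
        return $"Entity-{id}";
    }
}
```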
Debugging Aspects
The Challenge
Aspect code exists in two forms:
- Source form: What you write in the aspect class (exists at compile time)
- Transformed form: What actually runs (exists in `obj/.../metalama/`)

You cannot set breakpoints in the source form and expect them to be hit at run time. The debugger sees the transformed form.
Strategy 1: Debug Compile-Time Code
For debugging BuildAspect() and fabric code:
```csharp
public override void BuildAspect(IAspectBuilder<IMethod> builder)
{
    // This breakpoint will pause the COMPILER
    Debugger.Break();

    // Your compile-time logic
    var method = builder.Target;
    // ...
}
```
Then build with:

```shell
dotnet build -p:MetalamaDebugCompiler=True -p:MetalamaConcurrentBuildEnabled=False
```
The compiler will pause and ask you to attach a debugger.
Strategy 2: Debug Templates
For debugging template expansion:
```csharp
public override dynamic? OverrideMethod()
{
    // This inserts a Debugger.Break() into the GENERATED code
    meta.DebugBreak();

    Console.WriteLine("Before");
    var result = meta.Proceed();
    Console.WriteLine("After");
    return result;
}
```
Critical: Use `meta.DebugBreak()` in templates, NOT `Debugger.Break()`. The latter would be emitted as run-time code that always breaks.
Strategy 3: Debug Transformed Code
- Build the project
- Navigate to `obj/<Configuration>/<TFM>/metalama/`
- Open the transformed `.cs` file
- Set breakpoints in the transformed code
- Run with the debugger attached
Strategy 4: LamaDebug Configuration
Create a LamaDebug build configuration in Visual Studio for easy debugging:
- Open project properties → Build configurations
- Create a new configuration named `LamaDebug`
- In the project file:

```xml
<PropertyGroup Condition="'$(Configuration)' == 'LamaDebug'">
  <DefineConstants>DEBUG;TRACE;LAMADEBUG</DefineConstants>
  <MetalamaDebugTransformedCode>True</MetalamaDebugTransformedCode>
</PropertyGroup>
```

- Switch to the `LamaDebug` configuration when debugging aspects
- F11 (Step Into) will step into the transformed code
Strategy 5: Inspect Generated Code
Even without debugging, you can read the generated code:
```shell
# After building, check:
ls obj/Debug/net8.0/metalama/

# You'll see transformed versions of your source files.
# Open them to understand what the aspect generated.
```
Debugging Tips
Common Debugging Scenarios
| Scenario | Approach |
|---|---|
| Aspect doesn't apply | Check eligibility rules, check attribute placement |
| Wrong code generated | Read transformed code in obj/.../metalama/ |
| Template logic error | Use meta.DebugBreak(), inspect generated code |
| BuildAspect logic error | Use Debugger.Break(), build with MetalamaDebugCompiler=True |
| Runtime behavior wrong | Debug transformed code directly |
| Aspect order wrong | Check [AspectOrder] attribute, inspect generated code |
Logging from Compile-Time Code
You can write diagnostic messages during compilation:
```csharp
// Diagnostics must be defined as static fields of a compile-time class.
private static readonly DiagnosticDefinition<string> _processingMethod =
    new("DBG001", Severity.Warning, "Processing method: {0}");

public override void BuildAspect(IAspectBuilder<IMethod> builder)
{
    // This appears in the build output.
    builder.Diagnostics.Report(
        _processingMethod.WithArguments(builder.Target.Name));
}
```
Checking Aspect Application
From a fabric, you can enumerate which declarations carry a given aspect attribute and report them as build diagnostics. (Note: `Console.WriteLine` in a fabric would run at compile time, but its output is not reliably surfaced; diagnostics are. Exact receiver API names vary slightly between Metalama versions.)

```csharp
// Diagnostic used to report each [Log]-decorated method during the build.
private static readonly DiagnosticDefinition<IMethod> _logApplied =
    new("DBG002", Severity.Info, "[Log] applied to: {0}");

public override void AmendProject(IProjectAmender amender)
{
    amender.Outbound
        .SelectMany(c => c.AllTypes)
        .SelectMany(t => t.Methods)
        .Where(m => m.Attributes.Any(a => a.Type.Is(typeof(LogAttribute))))
        .ReportDiagnostic(m => _logApplied.WithArguments(m));
}
```
GST Test Conventions
The GST framework follows these testing conventions:
| Convention | Details |
|---|---|
| Framework | xUnit |
| Assertions | FluentAssertions |
| Mocking | NSubstitute |
| Test location | tests/GST.Core.Aspects.Tests/ |
| Naming | {AspectName}Tests.cs |
| Pattern | Arrange-Act-Assert |
Test File Structure
tests/GST.Core.Aspects.Tests/
├── Validation/
│ ├── NotNullAttributeTests.cs
│ ├── NotEmptyAttributeTests.cs
│ └── RangeAttributeTests.cs
├── Caching/
│ └── CacheAttributeTests.cs
├── Authorization/
│ └── AuthorizeAttributeTests.cs
├── Audit/
│ └── AuditAttributeTests.cs
└── Helpers/
└── TestServiceProvider.cs
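The shared `Helpers/TestServiceProvider.cs` file could centralize the service-locator setup repeated across the tests above. This is a sketch; the exact registrations depend on the aspects under test:

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;

// Hypothetical shared helper: builds a service provider and installs it
// into the aspect service locator for the duration of a test.
public static class TestServiceProvider
{
    public static IServiceProvider Initialize(
        Action<IServiceCollection>? configure = null)
    {
        var services = new ServiceCollection();
        configure?.Invoke(services); // Let each test register its own fakes

        var provider = services.BuildServiceProvider();
        AspectServiceLocator.Initialize(provider);
        return provider;
    }
}
```

A test would then call `TestServiceProvider.Initialize(s => s.AddSingleton(logger))` in its Arrange step instead of repeating the `ServiceCollection` boilerplate.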
Summary
| Testing Type | What | How | When |
|---|---|---|---|
| Snapshot | Code transformation | .t.cs baseline files | Every aspect change |
| Run-Time | Actual behavior | xUnit + mocks | Critical business logic |
| Compile-Time | Helper methods | Standard unit tests | Complex compile-time logic |
| Debugging Target | Method | Key API |
|---|---|---|
| BuildAspect() | Debugger.Break() + MetalamaDebugCompiler=True | Debugger.Break() |
| Template code | meta.DebugBreak() | meta.DebugBreak() |
| Transformed code | Breakpoints in obj/.../metalama/ | Standard debugger |
| Build output | Diagnostic reports | builder.Diagnostics.Report() |
Next: Advantages & Disadvantages — When to use (and not use) Metalama.