Commit 57756c8

fix a race in clearing spans in this test case
There is a race with SimpleSpanProcessor#onEnd(): spans are not synchronously exported if span.resource.asyncAttributesPending is set. The updates in this PR appear to make that race more likely to occur.
1 parent (262ebba) · commit 57756c8

1 file changed (+3, -2 lines)
detectors/node/opentelemetry-resource-detector-aws/test/detectors/AwsSuppressTracing.test.ts

Lines changed: 3 additions & 2 deletions
@@ -32,9 +32,10 @@ describe('[Integration] Internal tracing', () => {
     'http://169.254.169.254/metadata';
 
   const memoryExporter = new InMemorySpanExporter();
+  const spanProcessor = new SimpleSpanProcessor(memoryExporter);
   const sdk = new NodeSDK({
     instrumentations: [new FsInstrumentation(), new HttpInstrumentation()],
-    spanProcessors: [new SimpleSpanProcessor(memoryExporter)],
+    spanProcessors: [spanProcessor],
   });
   sdk.start();
 
@@ -58,7 +59,7 @@ describe('[Integration] Internal tracing', () => {
 
   // NOTE: the require process makes use of the fs API so spans are being exported.
   // We reset the exporter to have a clean state for assertions
-  await new Promise(r => setTimeout(r, 0));
+  await spanProcessor.forceFlush();
   memoryExporter.reset();
 
   const detectors = [
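
For reference, here is a minimal sketch of the pattern this commit adopts (not the repository's exact test code; the import paths and the resetExportedSpans helper are illustrative assumptions). Keeping a reference to the SimpleSpanProcessor lets the test await forceFlush(), which resolves once in-flight exports complete, including spans that were held back while their resource still had async attributes pending, rather than relying on a setTimeout(0) tick.

// Sketch only: illustrates the flush-before-reset pattern from this commit.
import { NodeSDK } from '@opentelemetry/sdk-node';
import {
  InMemorySpanExporter,
  SimpleSpanProcessor,
} from '@opentelemetry/sdk-trace-base';

const memoryExporter = new InMemorySpanExporter();
// Keep a handle on the processor so the test can force-flush it later.
const spanProcessor = new SimpleSpanProcessor(memoryExporter);

const sdk = new NodeSDK({
  spanProcessors: [spanProcessor],
});
sdk.start();

// Hypothetical helper (not part of the repo): clears exported spans
// only after all pending exports have settled.
async function resetExportedSpans(): Promise<void> {
  // forceFlush() resolves after in-flight exports finish, including spans
  // whose resource still had async attributes pending when they ended.
  await spanProcessor.forceFlush();
  memoryExporter.reset();
}

Awaiting forceFlush() is deterministic, whereas setTimeout(0) only yields one macrotask turn, which is not guaranteed to outlast the async resource-attribute resolution.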
