Commit 50f55eb

Author: Unity Technologies

com.unity.test-framework.performance@0.1.33-preview

## [0.1.33] - 2018-8-3

### Small fixes

Obsolete warnings, doc update with modules and internals, ValueSource fix

## [0.1.32] - 2018-7-9

### Add custom measurement/warmup counts

Method and Frames measurements can now specify custom warmup, measurement and iteration counts

## [0.1.31] - 2018-7-04

### mark metadata tests with performance category

1 parent d6bafbd commit 50f55eb

File tree

7 files changed: +165 −52 lines changed


CHANGELOG.md

Lines changed: 12 additions & 0 deletions

```diff
@@ -1,5 +1,17 @@
 # Changelog
 
+## [0.1.33] - 2018-8-3
+
+### Small fixes
+
+Obsolete warnings, doc update with modules and internals, ValueSource fix
+
+## [0.1.32] - 2018-7-9
+
+### Add custom measurement/warmup counts
+
+Method and Frames measurements can now specify custom warmup, measurement and iteration counts
+
 ## [0.1.31] - 2018-7-04
 
 ### mark metadata tests with performance category
```

README.md

Lines changed: 63 additions & 26 deletions

````diff
@@ -1,24 +1,30 @@
-# Performance testing extension for Unity Test Runner
+# Performance testing extension for Unity Test Runner
 
 Extension provides a set of calls to make it easier to take measurements and record profiler markers. It also collects data about build and player settings which is useful when comparing data for separating different hardware and configurations.
 
 ## Installing
-To install this package, follow the instructions in the [Package Manager documentation](https://docs.unity3d.com/Packages/com.unity.package-manager-ui@latest/index.html).
+To install this package, follow the instructions in the [Package Manager documentation](https://docs.unity3d.com/Packages/com.unity.package-manager-ui@latest/index.html).
 
 And add `com.unity.test-framework.performance` your packages manifest.
 YourProject/Packages/manifest.json
 
 ``` json
 {
-    "dependencies": {
-        "com.unity.test-framework.performance": "0.1.31-preview"
-    },
-    "testables": [
-        "com.unity.test-framework.performance"
-    ]
+    "dependencies": {
+        "com.unity.test-framework.performance": "0.1.33-preview",
+        "com.unity.modules.jsonserialize": "1.0.0",
+        "com.unity.modules.unitywebrequestwww": "1.0.0",
+        "com.unity.modules.vr": "1.0.0"
+    },
+    "testables": [
+        "com.unity.test-framework.performance"
+    ],
+    "registry": "https://staging-packages.unity.com"
 }
 ```
 
+If you are using 2018.1 or 2018.2 the module dependencies are unnecessary.
+
 Assembly definitions should reference `Unity.PerformanceTesting` in order to use it. Create a new folder for storing tests in and then create a new asset from context menu called `right click/Create/Assembly definition`. In inspector for the assembly file check "Test Assemblies and apply. Then open the file in text editor and add `Unity.PerformanceTesting`.
 
 ``` json
@@ -38,11 +44,10 @@ Assembly definitions should reference `Unity.PerformanceTesting` in order to use
 }
 ```
 
-More information on how to create and run tests please refer to [Unity Test Runner docs](https://docs.unity3d.com/Manual/testing-editortestsrunner.html).
-
-## Saving results
+How to test internals can be found in the following link:
+https://q.unity3d.com/questions/992/how-to-test-internal-variables-in-the-editor-tests.html
 
-If you are on 2018.3+ version of unity you can launch the editor with command line argument `-performanceTestResults "path"` it will save test results as json to path.
+More information on how to create and run tests please refer to [Unity Test Runner docs](https://docs.unity3d.com/Manual/testing-editortestsrunner.html).
 
 
 ## Test Attributes
@@ -64,7 +69,7 @@ Required parameters
 Optional parameters
 - **sampleUnit** : Unit of the measurement.
   - Nanosecond, Microsecond, Millisecond, Second, Byte, Kilobyte, Megabyte, Gigabyte
-- **aggregationType** : Preferred aggregation (default is median)
+- **aggregationType** : Preferred aggregation (default is median)
 - **percentile** : If aggregationType is Percentile, the percentile value used for the aggregation. i.e 0.95.
 - **threshold** : Threshold used for regression detection. If current sample value is over the threshold different from the baseline results, the result is concidered as a regression or a progression. Default value is 0.15f.
 - **increaseIsBetter** : Defines if an increase in the measurement value is concidered as a progression (better) or a regression. Default is false.
@@ -73,11 +78,11 @@ If unspecified a default SampleGroupDefinition will be used with the name of "Me
 
 ## Taking measurements
 
-Preferred way is to use `Measure.Method` or `Measure.Frames`. They both do a couple of warmup iterations which are then used to decide how many iterations should be measured. It is advised to keep measurements above 100 microseconds.
+Preferred way is to use `Measure.Method` or `Measure.Frames`. They both do a couple of warmup iterations which are then used to decide how many iterations per measurement should be used.
 
-**void Method(int executions, Action action, SampleGroupDefinition sampleGroupDefinition)**
+**MethodMeasurement Method()**
 
-It will execute provided method multiple times.
+It will execute provided method at least 3 times for warmup and 7 for measurements.
 
 ``` csharp
 [PerformanceTest]
@@ -87,9 +92,27 @@ public void Test()
 }
 ```
 
-**IEnumerator Measure.Frames(int frameCount, SampleGroupdDefinition sampleGroupDefinition)**
+In cases where you feel the default values are not ideal you can specify custom iterations.
+
+WarmupCount - how many iterations to run without measuring for warmup
+MeasurementCount - how many measurements to take
+IterationsPerMeasurement - how many iterations per measurement to take
+
+```
+[PerformanceTest]
+public void Test()
+{
+    Measure.Method(() => { ... })
+        .WarmupCount(10)
+        .MeasurementCount(10)
+        .IterationsPerMeasurement(5)
+        .Run();
+}
+```
+
+**FramesMeasurement Measure.Frames()**
 
-Used to yield for a specified amount of frames. Records frame times.
+Used to yield for frames. It will automatically select the number of warmup and runtime frames.
 
 ``` csharp
 [PerformanceUnityTest]
@@ -101,20 +124,34 @@ public IEnumerator Test()
 }
 ```
 
-In cases where you are measuring a system over frametime it is advised to disable frametime measurements and instead measure profiler systems markers.
+In cases where you are measuring a system over frametime it is advised to disable frametime measurements and instead measure profiler markers for your system.
 ``` csharp
 [PerformanceUnityTest]
 public IEnumerator Test()
 {
     ...
 
-    // passing in false will run the frames but will not take frametime mesaurements
     yield return Measure.Frames()
         .ProfilerMarkers(...)
-        .Run(false);
+        .DontRecordFrametime()
+        .Run();
 }
 ```
 
+If you want more control, you can specify how many frames you want to measure.
+
+``` csharp
+[PerformanceUnityTest]
+public IEnumerator Test()
+{
+    ...
+
+    yield return Measure.Frames()
+        .WarmupCount(5)
+        .MeasurementCount(10)
+        .Run();
+}
+```
 
 When method or frame measurements are not enough you can use the following to measure. It will measure Scope, Frames, Markers or Cusom.
 
@@ -157,9 +194,9 @@ Records profiler samples for a scope. The name of sample group definition has to
 [PerformanceTest]
 public void Test()
 {
-    SampleGroupDefinition[] m_definitions =
+    SampleGroupDefinition[] m_definitions =
     {
-        new SampleGroupDefinition("Instantiate"),
+        new SampleGroupDefinition("Instantiate"),
         new SampleGroupDefinition("Instantiate.Copy"),
         new SampleGroupDefinition("Instantiate.Produce"),
         new SampleGroupDefinition("Instantiate.Awake")
@@ -240,7 +277,7 @@ Each performance test will have a performance test summary. Every sample group w
 
 ``` csharp
 // Records total and frame times for loading a scene async
-
+
 [PerformanceUnityTest]
 public IEnumerator LoadAsync_SampleScene()
 {
@@ -255,9 +292,9 @@ Each performance test will have a performance test summary. Every sample group w
 ```
 
 ``` csharp
-SampleGroupDefinition[] m_definitions =
+SampleGroupDefinition[] m_definitions =
 {
-    new SampleGroupDefinition("Instantiate"),
+    new SampleGroupDefinition("Instantiate"),
     new SampleGroupDefinition("Instantiate.Copy"),
     new SampleGroupDefinition("Instantiate.Produce"),
     new SampleGroupDefinition("Instantiate.Awake")
````
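The custom-count API shown in this README diff implies a simple total-work formula: a run executes `WarmupCount` unmeasured iterations, then `MeasurementCount` measurements of `IterationsPerMeasurement` iterations each. A minimal sketch of that arithmetic (function name and defaults are illustrative, not part of the package's API):

```python
def total_invocations(warmup_count: int, measurement_count: int,
                      iterations_per_measurement: int = 1) -> int:
    """Total times the measured action runs when explicit counts are set.

    Mirrors the README example: WarmupCount(10).MeasurementCount(10)
    .IterationsPerMeasurement(5) -> 10 warmup + 10 * 5 measured calls.
    """
    return warmup_count + measurement_count * iterations_per_measurement

# The README example's configuration: 10 + 10 * 5
print(total_invocations(10, 10, 5))  # 60
```

This also matches the documented default of "at least 3 times for warmup and 7 for measurements": with one iteration per measurement that is 3 + 7 = 10 invocations minimum.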

Runtime/Attributes/PerformanceTestAttribute.cs

Lines changed: 1 addition & 5 deletions

```diff
@@ -9,12 +9,8 @@
 namespace Unity.PerformanceTesting
 {
     [AttributeUsage(AttributeTargets.Method)]
-    public class PerformanceTestAttribute : CombiningStrategyAttribute, IImplyFixture, IWrapTestMethod
+    public class PerformanceTestAttribute : TestAttribute, IWrapTestMethod
     {
-        public PerformanceTestAttribute() : base(new UnityCombinatorialStrategy(), new ParameterDataSourceProvider())
-        {
-        }
-
         public TestCommand Wrap(TestCommand command)
         {
 #if UNITY_2018_2_OR_NEWER
```

Runtime/Measurements/FramesMeasurement.cs

Lines changed: 1 addition & 1 deletion

```diff
@@ -48,7 +48,7 @@ public FramesMeasurement Definition(string name, SampleUnit sampleUnit, Aggregat
             increaseIsBetter, failOnBaseline));
     }
 
-    public FramesMeasurement ExecutionCount(int count)
+    public FramesMeasurement MeasurementCount(int count)
     {
         m_Executions = count;
         return this;
```

Runtime/Measurements/MethodMeasurement.cs

Lines changed: 46 additions & 14 deletions

```diff
@@ -19,6 +19,9 @@ public class MethodMeasurement
         private readonly List<SampleGroup> m_SampleGroups = new List<SampleGroup>();
 
         private SampleGroupDefinition m_Definition;
+        private int m_WarmupCount;
+        private int m_MeasurementCount;
+        private int m_IterationCount = 1;
 
         public MethodMeasurement(Action action)
         {
@@ -63,20 +66,44 @@ public MethodMeasurement Definition(string name, SampleUnit sampleUnit, Aggregat
             return Definition(new SampleGroupDefinition(name, sampleUnit, aggregationType, percentile, threshold,
                 increaseIsBetter, failOnBaseline));
         }
+
+        public MethodMeasurement WarmupCount(int count)
+        {
+            m_WarmupCount = count;
+            return this;
+        }
+
+        public MethodMeasurement IterationsPerMeasurement(int count)
+        {
+            m_IterationCount = count;
+            return this;
+        }
+
+        public MethodMeasurement MeasurementCount(int count)
+        {
+            m_MeasurementCount = count;
+            return this;
+        }
 
         public void Run()
         {
+            if (m_MeasurementCount > 0)
+            {
+                Warmup(m_WarmupCount);
+                RunForIterations(m_IterationCount, m_MeasurementCount);
+                return;
+            }
+
             var iterations = Probing();
-
-            RunForIterations(iterations);
+            RunForIterations(iterations, k_MeasurementCount);
         }
 
-        private void RunForIterations(int iterations)
+        private void RunForIterations(int iterations, int measurements)
        {
             UpdateSampleGroupDefinition();
 
-            EnableProfilerMarkers();
-            for (int j = 0; j < k_MeasurementCount; j++)
+            EnableMarkers();
+            for (int j = 0; j < measurements; j++)
             {
                 var executionTime = Time.realtimeSinceStartup;
                 for (var i = 0; i < iterations; i++)
@@ -87,18 +114,18 @@ private void RunForIterations(int iterations)
                 Measure.Custom(m_Definition, Utils.ConvertSample(SampleUnit.Millisecond, m_Definition.SampleUnit, executionTime));
             }
 
-            MeasureProfilerMarkers();
+            DisableAndMeasureMarkers();
         }
 
-        private void EnableProfilerMarkers()
+        private void EnableMarkers()
         {
             foreach (var sampleGroup in m_SampleGroups)
             {
                 sampleGroup.Recorder.enabled = true;
             }
         }
 
-        private void MeasureProfilerMarkers()
+        private void DisableAndMeasureMarkers()
         {
             foreach (var sampleGroup in m_SampleGroups)
             {
@@ -119,10 +146,7 @@ private int Probing()
             while (executionTime < k_MinWarmupTimeMs)
             {
                 executionTime = Time.realtimeSinceStartup;
-                for (var i = 0; i < iterations; i++)
-                {
-                    m_Action.Invoke();
-                }
+                Warmup(iterations);
                 executionTime = (Time.realtimeSinceStartup - executionTime) * 1000f;
 
                 if (executionTime < k_MinWarmupTimeMs)
@@ -138,13 +162,21 @@ private int Probing()
 
                 return 1;
             }
-
+
             var deisredIterationsCount =
                 Mathf.Clamp((int) (k_MinMeasurementTimeMs * iterations / executionTime), 1, k_MaxIterations);
-
+
             return deisredIterationsCount;
         }
 
+        private void Warmup(int iterations)
+        {
+            for (var i = 0; i < iterations; i++)
+            {
+                m_Action.Invoke();
+            }
+        }
+
        private void UpdateSampleGroupDefinition()
        {
            if (m_Definition.Name == null)
```
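When no explicit `MeasurementCount` is set, the `Probing()` path above times warmup batches until one exceeds a minimum warmup time, then scales the iteration count so a single measurement lasts roughly `k_MinMeasurementTimeMs`, clamped to `[1, k_MaxIterations]`. A hedged sketch of that final scaling step (the constant values here are assumptions for illustration; the package defines its own `k_` constants, which are not visible in this diff):

```python
def desired_iterations(iterations: int, execution_time_ms: float,
                       min_measurement_time_ms: float = 100.0,
                       max_iterations: int = 10000) -> int:
    """Scale a probed iteration count so one measurement takes about
    min_measurement_time_ms, like the Mathf.Clamp line in Probing().
    Constants are illustrative, not the package's actual k_ values."""
    scaled = int(min_measurement_time_ms * iterations / execution_time_ms)
    # Equivalent of Mathf.Clamp(scaled, 1, max_iterations)
    return max(1, min(scaled, max_iterations))

# If 8 probe iterations took 20 ms (~2.5 ms each), about 40 iterations
# fill a 100 ms measurement window:
print(desired_iterations(8, 20.0))  # 40
```

This explains why the `RunForIterations` signature gained a `measurements` parameter: the probed path passes `k_MeasurementCount` while the explicit path passes the user-supplied `m_MeasurementCount`.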

0 commit comments
