
Commit 3a334f2

Unity Technologies committed com.unity.test-framework.performance@0.1.39-preview

## [0.1.39] - 2018-9-14
### remove duplicate module from docs

## [0.1.38] - 2018-9-14
### doc updates
1 parent d2e3ec2 commit 3a334f2

File tree

4 files changed (+102 / -354 lines)


CHANGELOG.md

Lines changed: 8 additions & 0 deletions
@@ -1,5 +1,13 @@
 # Changelog
 
+## [0.1.39] - 2018-9-14
+
+### remove duplicate module from docs
+
+## [0.1.38] - 2018-9-14
+
+### doc updates
+
 ## [0.1.36] - 2018-8-27
 
 ### ProfilerMarkers now take params as arguments

Documentation~/index.md

Lines changed: 72 additions & 36 deletions
@@ -1,32 +1,46 @@
-# Performance testing extension for Unity Test Runner
+# Performance Testing Extension for Unity Test Runner
 
-Extension provides a set of calls to make it easier to take measurements and record profiler markers. It also collects data about build and player settings which is useful when comparing data for separating different hardware and configurations.
+The Unity Performance Testing Extension is a Unity Editor package that, when installed, provides an API and test case decorators to make it easier to take measurements/samples of Unity profiler markers, and other custom metrics outside of the profiler, within the Unity Editor and built players. It also collects configuration metadata, such as build and player settings, which is useful when comparing data against different hardware and configurations.
+
+The Performance Testing Extension is intended to be used with, and complement, the Unity Test Runner framework.
+
+**Important Note:** When tests are run with the Unity Test Runner, a development player is always built to support communication between the editor and player, effectively overriding the development build setting from the build settings UI or scripting API.
 
 ## Installing
-To install this package, follow the instructions in the [Package Manager documentation](https://docs.unity3d.com/Packages/com.unity.package-manager-ui@latest/index.html).
 
-And add `com.unity.test-framework.performance` your packages manifest.
-YourProject/Packages/manifest.json
+To install the Performance Testing Extension package:
+1. Open the manifest.json file for your Unity project (located in the YourProject/Packages directory) in a text editor.
+2. Add com.unity.test-framework.performance to the dependencies section, as seen below.
+3. Add com.unity.test-framework.performance to the testables section. If there is no testables section in your manifest.json file, add it.
+4. Save the manifest.json file.
+5. Verify the Performance Testing Extension is now installed by opening the Unity Package Manager window.
+6. Ensure you have created an Assembly Definition file in the same folder as the tests or scripts that will reference the Performance Testing Extension. This Assembly Definition file needs to reference Unity.PerformanceTesting in order to use the Performance Testing Extension. For example:
+   * Create a new folder for storing tests in ("Tests", for example).
+   * Create a new assembly definition file in the new folder using the context menu (right click/Create/Assembly definition) and name it "Tests" (or whatever you named the folder above).
+   * In the Inspector for the assembly definition file, check "Test Assemblies" and then Apply.
+   * Open the assembly definition file in a text editor, add Unity.PerformanceTesting to the references section, and save the file.
+
+> Example: manifest.json file
 
 ``` json
 {
   "dependencies": {
-    "com.unity.test-framework.performance": "0.1.37-preview",
+    "com.unity.test-framework.performance": "0.1.39-preview",
     "com.unity.modules.jsonserialize": "1.0.0",
     "com.unity.modules.unitywebrequest": "1.0.0",
     "com.unity.modules.unityanalytics": "1.0.0",
-    "com.unity.modules.vr": "1.0.0"
+    "com.unity.modules.vr": "1.0.0",
+    "com.unity.modules.physics": "1.0.0",
+    "com.unity.modules.xr": "1.0.0"
   },
   "testables": [
     "com.unity.test-framework.performance"
-  ],
-  "registry": "https://staging-packages.unity.com"
+  ]
 }
 ```
 
-If you are using 2018.1 or 2018.2 the module dependencies are unnecessary.
 
-Assembly definitions should reference `Unity.PerformanceTesting` in order to use it. Create a new folder for storing tests in and then create a new asset from context menu called `right click/Create/Assembly definition`. In inspector for the assembly file check "Test Assemblies and apply. Then open the file in text editor and add `Unity.PerformanceTesting`.
+> Example: assembly definition file
 
 ``` json
 {
@@ -45,9 +59,6 @@ Assembly definitions should reference `Unity.PerformanceTesting` in order to use
 }
 ```
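A minimal assembly definition file that satisfies step 6 might look like this (a sketch, assuming the folder and assembly are both named "Tests"; the optionalUnityReferences/TestAssemblies entry is what the "Test Assemblies" checkbox writes in Unity 2018.x):

``` json
{
    "name": "Tests",
    "references": [
        "Unity.PerformanceTesting"
    ],
    "optionalUnityReferences": [
        "TestAssemblies"
    ]
}
```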
 
-How to test internals can be found in the following link:
-https://q.unity3d.com/questions/992/how-to-test-internal-variables-in-the-editor-tests.html
-
 For more information on how to create and run tests, please refer to the [Unity Test Runner docs](https://docs.unity3d.com/Manual/testing-editortestsrunner.html).
 
 
@@ -61,29 +72,46 @@ More information on how to create and run tests please refer to [Unity Test Runn
 
 ## SampleGroupDefinition
 
-**struct SampleGroupDefinition**
-SampleGroupDefinition is used to define how a measurement is used in reporting and in regression detection.
+**struct SampleGroupDefinition** - used to define how a measurement is used in reporting and in regression detection.
 
 Required parameters
 - **name** : Name of the measurement. Should be kept short and simple.
 
 Optional parameters
-- **sampleUnit** : Unit of the measurement.
+- **sampleUnit** : Unit of the measurement to report samples in. Possible values are:
   - Nanosecond, Microsecond, Millisecond, Second, Byte, Kilobyte, Megabyte, Gigabyte
-- **aggregationType** : Preferred aggregation (default is median)
-- **percentile** : If aggregationType is Percentile, the percentile value used for the aggregation. i.e 0.95.
-- **threshold** : Threshold used for regression detection. If current sample value is over the threshold different from the baseline results, the result is concidered as a regression or a progression. Default value is 0.15f.
-- **increaseIsBetter** : Defines if an increase in the measurement value is concidered as a progression (better) or a regression. Default is false.
+- **aggregationType** : Preferred aggregation (default is median). Possible values are:
+  - Median, Average, Min, Max, Percentile
+- **percentile** : If aggregationType is Percentile, the percentile value used for the aggregation, e.g. 0.95.
+- **increaseIsBetter** : Determines whether an increase in the measurement value should be considered a progression (performance improved) or a performance regression. Default is false. **NOTE:** This value is not used directly in the Performance Testing Extension, but recorded for later use in a reporting tool (such as the [Unity Performance Benchmark Reporter](https://github.com/Unity-Technologies/PerformanceBenchmarkReporter/wiki)) to determine whether a performance regression has occurred when compared against a baseline result set.
+- **threshold** : The threshold, as a percentage of the aggregated sample group value, to use for regression detection. Default value is 0.15f. **NOTE:** This value is not used directly in the Performance Testing Extension, but recorded for later use in a reporting tool (such as the [Unity Performance Benchmark Reporter](https://github.com/Unity-Technologies/PerformanceBenchmarkReporter/wiki)) to determine whether a performance regression has occurred when compared against a baseline result set.
 
-If unspecified a default SampleGroupDefinition will be used with the name of "Measure.Scope", it is recommended to specify a name that is descriptive of what it is measuring.
+If unspecified, a default SampleGroupDefinition with the name "Time" will be used; it is recommended to specify a name that is descriptive of what is being measured.

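As an illustration of how these parameters fit together, a sample group for a custom frame-time measurement might be defined like this (a sketch; the constructor overload shown is an assumption based on the parameter list above and may differ between preview versions):

``` csharp
// Sketch: a sample group reporting samples in milliseconds,
// aggregated by the default (median).
var definition = new SampleGroupDefinition("MySystem.Update", SampleUnit.Millisecond);
```

Such a definition can then be passed to the measurement methods described below (Measure.Scope, Measure.FrameTimes, Measure.Custom and so on).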
 ## Taking measurements
 
-Preferred way is to use `Measure.Method` or `Measure.Frames`. They both do a couple of warmup iterations which are then used to decide how many iterations per measurement should be used.
+The Performance Testing Extension provides several API methods you can use to take measurements in your performance test, depending on what you need to measure and how you want to do it. They are:
+* Measure.Method
+* Measure.Frames
+* Measure.Scope(SampleGroupDefinition sampleGroupDefinition)
+* Measure.FrameTimes(SampleGroupDefinition sampleGroupDefinition)
+* Measure.ProfilerMarkers(SampleGroupDefinition[] sampleGroupDefinitions)
+* Measure.Custom(SampleGroupDefinition sampleGroupDefinition, double value)
+
+The sections below detail the specifics of each measurement method with examples.
+
+The preferred approach is to use Measure.Method or Measure.Frames. They both do a couple of warmup iterations, which are then used to decide how many iterations per measurement should be used.
 
 **MethodMeasurement Method()**
 
-It will execute provided method at least 3 times for warmup and 7 for measurements.
+This will execute the provided method, sampling performance using the following additional properties/methods to control how the measurements are taken:
+* **WarmupCount(int n)** - number of times to execute before measurements are collected. Default is 3 if not specified.
+* **MeasurementCount(int n)** - number of measurements to capture. Default is 7 if not specified.
+* **IterationsPerMeasurement(int n)** - number of iterations per measurement to use.
+* **GC()** - if specified, will measure the Garbage Collection allocation value.
+
+> Example 1: Simple method measurement using default values
 
 ``` csharp
 [PerformanceTest]
@@ -93,12 +121,7 @@ public void Test()
 }
 ```

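A minimal Measure.Method test using the default values might look like the following sketch (`MyExpensiveCall` is a hypothetical method under test; the trailing Run() call executes the measurement):

``` csharp
[PerformanceTest]
public void Test()
{
    // Warm up 3 times, then take 7 measurements (the defaults),
    // executing the provided delegate each iteration.
    Measure.Method(() => { MyExpensiveCall(); }).Run();
}
```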
-In cases where you feel the default values are not ideal you can specify custom iterations.
-
-WarmupCount - how many iterations to run without measuring for warmup
-MeasurementCount - how many measurements to take
-IterationsPerMeasurement - how many iterations per measurement to take
-GC - measures the amount of GC allocations
+> Example 2: Customize Measure.Method properties
 
 ```
 [PerformanceTest]
@@ -115,7 +138,15 @@ public void Test()
 
 **FramesMeasurement Measure.Frames()**
 
-Used to yield for frames. It will automatically select the number of warmup and runtime frames.
+This will sample performance per frame. It records time per frame by default and provides the following additional properties/methods to control how the measurements are taken:
+* **WarmupCount(int n)** - number of times to execute before measurements are collected. Default is 3 if not specified.
+* **MeasurementCount(int n)** - number of measurements to capture. Default is 7 if not specified.
+* **DontRecordFrametime()** - disables frametime measurement.
+* **ProfilerMarkers(...)** - sample profiler markers per frame.
+
+It will automatically select the number of warmup and runtime frames.
+
+> Example 1: Simple frame time measurement
 
 ``` csharp
 [PerformanceUnityTest]
@@ -128,6 +159,8 @@ public IEnumerator Test()
 ```

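A minimal frame-time test might look like the following sketch (yield-returning Run() lets the measurement span multiple frames; the warmup and measurement frame counts are selected automatically, as noted above):

``` csharp
[PerformanceUnityTest]
public IEnumerator Test()
{
    // Records time per frame using automatically selected
    // warmup and measurement frame counts.
    yield return Measure.Frames().Run();
}
```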
 In cases where you are measuring a system over frametime it is advised to disable frametime measurements and instead measure profiler markers for your system.
+> Example 2: Sample profile markers per frame, disable frametime measurement
+
 ``` csharp
 [PerformanceUnityTest]
 public IEnumerator Test()
@@ -142,6 +175,7 @@ public IEnumerator Test()
 ```
 
 If you want more control, you can specify how many frames you want to measure.
+> Example 3: Specify custom WarmupCount and MeasurementCount per frame
 
 ``` csharp
 [PerformanceUnityTest]
@@ -156,11 +190,11 @@ public IEnumerator Test()
 }
 ```
 
-When method or frame measurements are not enough you can use the following to measure. It will measure Scope, Frames, Markers or Cusom.
-
 **IDisposable Measure.Scope(SampleGroupDefinition sampleGroupDefinition)**
 
-Used to measure a scope.
+When method or frame measurements are not enough, you can use the following to measure. It will measure Scope, Frames, Markers or Custom.
+
+> Example 1: Measuring a scope
 
 ``` csharp
 [PerformanceTest]
@@ -175,7 +209,7 @@ public void Test()
 
 **IDisposable Measure.FrameTimes(SampleGroupDefinition sampleGroupDefinition)**
 
-Records frame times for a scope.
+> Example 1: Sample frame times for a scope
 
 ``` csharp
 [PerformanceUnityTest]
@@ -191,7 +225,9 @@ public IEnumerator Test()
 
 **IDisposable Measure.ProfilerMarkers(SampleGroupDefinition[] sampleGroupDefinitions)**
 
-Records profiler samples for a scope. The name of sample group definition has to match profiler sample names.
+When you want to record samples outside of frame time, method time, or profiler markers, use a custom measurement. It can be any double value. A sample group definition is required.
+
+> Example 1: Use a custom measurement to capture total allocated memory
 
 ``` csharp
 [PerformanceTest]