
Commit 44eae0c

Author: Unity Technologies

com.unity.test-framework.performance@0.1.40-preview

## [0.1.40] - 2018-9-17
### Update documentation

## [0.1.39] - 2018-9-14
### Remove duplicate module from docs
1 parent 3a334f2 commit 44eae0c

File tree

4 files changed (+81, -75 lines):

- CHANGELOG.md
- Documentation~/index.md
- README.md
- package.json

CHANGELOG.md

Lines changed: 6 additions & 2 deletions
````diff
@@ -1,12 +1,16 @@
 # Changelog
 
+## [0.1.40] - 2018-9-17
+
+### Update documentation
+
 ## [0.1.39] - 2018-9-14
 
-### remove duplicate module from docs
+### Remove duplicate module from docs
 
 ## [0.1.38] - 2018-9-14
 
-### doc updates
+### Documentation updates
 
 ## [0.1.36] - 2018-8-27
 
````

Documentation~/index.md

Lines changed: 72 additions & 70 deletions
````diff
@@ -2,30 +2,31 @@
 
 The Unity Performance Testing Extension is a Unity Editor package that, when installed, provides an API and test case decorators to make it easier to take measurements/samples of Unity profiler markers, and other custom metrics outside of the profiler, within the Unity Editor and built players. It also collects configuration metadata, such as build and player settings, which is useful when comparing data against different hardware and configurations.
 
-The Performance Testing Extension is intended to be used with, and complement, the Unity Test Runner framework.
+The Performance Testing Extension is intended to be used with, and complement, the Unity Test Runner framework. For more information on how to create and run tests please refer to [Unity Test Runner documentation](https://docs.unity3d.com/Manual/testing-editortestsrunner.html).
+
 
 **Important Note:** When tests are run with the Unity Test Runner, a development player is always built to support communication between the editor and player, effectively overriding the development build setting from the build settings UI or scripting API.
 
 ## Installing
 
 To install the Performance Testing Extension package
 1. Open the manifest.json file for your Unity project (located in the YourProject/Packages directory) in a text editor
-2. Add com.unity.test-framework.performance to the dependencies as seen below
-3. Add com.unity.test-framework.performance to the testables section. If there is not a testables section in your manifest.json file, go ahead and add it.
+2. Add `com.unity.test-framework.performance` to the dependencies as seen below
+3. Add `com.unity.test-framework.performance` to the testables section. If there is not a testables section in your manifest.json file, go ahead and add it.
 4. Save the manifest.json file
 5. Verify the Performance Testing Extension is now installed opening the Unity Package Manager window
-6. Ensure you have created an Assembly Definition file in the same folder where your tests or scripts are that you’ll reference the Performance Testing Extension with. This Assembly Definition file needs to reference Unity.PerformanceTesting in order to use the Performance Testing Extension. Example of how to do this:
-    * Create a new folder for storing tests in ("Tests", for example)
-    * Create a new assembly definition file in the new folder using the context menu (right click/Create/Assembly definition) and name it "Tests" (or whatever you named the folder from step a. above)
-    * In inspector for the assembly definition file check "Test Assemblies", and then Apply.
-    * Open the assembly definition file in a text editor and add Unity.PerformanceTesting. To the references section. Save the file when you’re done doing this.
+6. Ensure you have created an Assembly Definition file in the same folder where your tests or scripts are that you’ll reference the Performance Testing Extension with. This Assembly Definition file needs to reference `Unity.PerformanceTesting` in order to use the Performance Testing Extension. Example of how to do this:
+    1. Create a new folder for storing tests in ("Tests", for example)
+    2. Create a new assembly definition file in the new folder using the context menu (right click/Create/Assembly definition) and name it "Tests" (or whatever you named the folder from step a. above)
+    3. In inspector for the assembly definition file check "Test Assemblies", and then Apply.
+    4. Open the assembly definition file in a text editor and add Unity.PerformanceTesting. To the references section. Save the file when you’re done doing this.
 
-> Example: manifest.json file
+#### Example: manifest.json file
 
 ``` json
 {
   "dependencies": {
-    "com.unity.test-framework.performance": "0.1.39-preview",
+    "com.unity.test-framework.performance": "0.1.40-preview",
     "com.unity.modules.jsonserialize": "1.0.0",
     "com.unity.modules.unitywebrequest": "1.0.0",
     "com.unity.modules.unityanalytics": "1.0.0",
@@ -40,7 +41,7 @@
 ```
 
 
-> Example: assembly definition file
+#### Example: assembly definition file
 
 ``` json
 {
@@ -59,38 +60,34 @@
 }
 ```
 
-More information on how to create and run tests please refer to [Unity Test Runner docs](https://docs.unity3d.com/Manual/testing-editortestsrunner.html).
-
-
 ## Test Attributes
-**[PerformanceTest]** - Non yeilding performance test.
+**[PerformanceTest]** - Non-yielding performance test. This type of performance test starts and ends within the same frame.
 
-**[PerformanceUnityTest]** - Yeilding performance test.
+**[PerformanceUnityTest]** - Yielding performance test. This is a good choice if you want to sample measurements across multiple frames.
 
-**[Version(string version)]** - Performance tests should be versioned with every change. If not specified it will be assumed to be 1
+**[Version(string version)]** - Performance tests should be versioned with every change. If not specified it will be assumed to be 1.
 
 
 ## SampleGroupDefinition
 
 **struct SampleGroupDefinition** - used to define how a measurement is used in reporting and in regression detection.
 
-Required parameters
-- **name** : Name of the measurement. Should be kept short and simple.
-
 Optional parameters
+- **name** : Name of the measurement. If unspecified a default name of "Time" will be used.
 - **sampleUnit** : Unit of the measurement to report samples in. Possible values are:
-- Nanosecond, Microsecond, Millisecond, Second, Byte, Kilobyte, Megabyte, Gigabyte
+Nanosecond, Microsecond, Millisecond, Second, Byte, Kilobyte, Megabyte, Gigabyte
 - **aggregationType** : Preferred aggregation (default is median). Possible values are:
-- Median, Average, Min, Max, Percentile
+Median, Average, Min, Max, Percentile
 - **percentile** : If aggregationType is Percentile, the percentile value used for the aggregation. e.g. 0.95.
 - **increaseIsBetter** : Determines whether or not an increase in the measurement value should be considered a progression (performance improved) or a performance regression. Default is false. **NOTE:** This value is not used directly in the Performance Testing Extension, but recorded for later use in a reporting tool (such as the [Unity Performance Benchmark Reporter](https://github.com/Unity-Technologies/PerformanceBenchmarkReporter/wiki)) to determine whether or not a performance regression has occurred when used with a baseline result set.
 - **threshold** : The threshold, as a percentage of the aggregated sample group value, to use for regression detection. Default value is 0.15f. **NOTE:** This value is not used directly in the Performance Testing Extension, but recorded for later use in a reporting tool (such as the [Unity Performance Benchmark Reporter](https://github.com/Unity-Technologies/PerformanceBenchmarkReporter/wiki)) to determine whether or not a performance regression has occurred when used with a baseline result set.
 
-If unspecified a default SampleGroupDefinition will be used with the name of "Time", it is recommended to specify a name that is descriptive of what it is measuring.
 
 ## Taking measurements
 
-The Performance Testing Extension provides several API methods you can use to take measurements in your performance test, depending on what you need to measure and how you want to do it. They are:
+The Performance Testing Extension provides several API methods you can use to take measurements in your performance test, depending on what you need to measure and how you want to do it.
+
+They are:
 * Measure.Method
 * Measure.Frames
 * Measure.Scope(SampleGroupdDefinition sampleGroupDefinition)
@@ -100,18 +97,16 @@
 
 The sections below detail the specifics of each measurement method with examples.
 
-Preferred way is to use Measure.Method or Measure.Frames. They both do a couple of warmup iterations which are then used to decide how many iterations per measurement should be used.
 
-
-**MethodMeasurement Method()**
+### Measure.Method()
 
 This will execute the provided method, sampling performance using the following additional properties/methods to control how the measurements are taken:
-* **WarmupCount(int n)** - number of times to to execute before measurements are collected. Default is 3 if not specified.
+* **WarmupCount(int n)** - number of times to to execute before measurements are collected. If unspecified, a default warmup is executed. This default warmup will wait for 7 ms. However, if less than 3 method executions have finished in that time, the warmup will wait until 3 method executions have completed.
 * **MeasurementCount(int n)** - number of measurements to capture. Default is 7 if not specified.
-* **IterationsPerMeasurement(int n)** - number of iterations per measurement to use
-* **GC()** - if specified, will measure the Gargage Collection allocation value.
+* **IterationsPerMeasurement(int n)** - number of method executions per measurement to use. If this value is not specified, the method will be executed as many times as possible until approximately 1 ms has elapsed.
+* **GC()** - if specified, will measure the total number of Garbage Collection allocation calls.
 
-> Example 1: Simple method measurement using default values
+#### Example 1: Simple method measurement using default values
 
 ``` csharp
 [PerformanceTest]
@@ -121,7 +116,7 @@
 }
 ```
 
-> Example 2: Customize Measure.Method properties
+#### Example 2: Customize Measure.Method properties
 
 ```
 [PerformanceTest]
@@ -136,17 +131,17 @@
 }
 ```
 
-**FramesMeasurement Measure.Frames()**
+### Measure.Frames()
 
-This will sample perf frame, records time per frame by default and provides additional properties/methods to control how the measurements are taken:
-* **WarmupCount(int n)** - number of times to to execute before measurements are collected. Default is 3 if not specified.
-* **MeasurementCount(int n)** - number of measurements to capture. Default is 7 if not specified.
-* **DontRecordFrametime()** - disables frametime measurement
-* **ProfilerMarkers(...)** - sample profile markers per frame
+Records time per frame by default and provides additional properties/methods to control how the measurements are taken:
+* **WarmupCount(int n)** - number of times to to execute before measurements are collected. If unspecified, a default warmup is executed. This default warmup will wait for 80 ms. However, if less than 3 full frames have rendered in that time, the warmup will wait until 3 full frames have been rendered.
+* **MeasurementCount(int n)** - number of frames to capture measurements. If this value is not specified, frames will be captured as many times as possible until approximately 500 ms has elapsed.
+* **DontRecordFrametime()** - disables frametime measurement.
+* **ProfilerMarkers(...)** - sample profile markers per frame.
+* **Scope()** - measures frame times in a given asynchronous scope.
 
-It will automatically select the number of warmup and runtime frames.
 
-> Example 1: Simple frame time measurement
+#### Example 1: Simple frame time measurement using default values of at least 7 frames and default WarmupCount (see description above).
 
 ``` csharp
 [PerformanceUnityTest]
@@ -158,8 +153,9 @@
 }
 ```
 
-In cases where you are measuring a system over frametime it is advised to disable frametime measurements and instead measure profiler markers for your system.
-> Example 2: Sample profile markers per frame, disable frametime measurement
+#### Example 2: Sample profile markers per frame, disable frametime measurement
+
+If you’d like to sample profiler markers across multiple frames, and don’t have a need to record frametime, it is possible to disable the frame time measurement.
 
 ``` csharp
 [PerformanceUnityTest]
@@ -174,8 +170,22 @@
 }
 ```
 
+#### Example 3: Sample frame times in a scope
+
+``` csharp
+[PerformanceUnityTest]
+public IEnumerator Test()
+{
+    using (Measure.Frames().Scope())
+    {
+        yield return ...;
+    }
+}
+```
+
+#### Example 3: Specify custom WarmupCount and MeasurementCount per frame
+
 If you want more control, you can specify how many frames you want to measure.
-> Example 3: Specify custom WarmupCount and MeasurementCount per frame
 
 ``` csharp
 [PerformanceUnityTest]
@@ -190,11 +200,11 @@
 }
 ```
 
-**IDisposable Measure.Scope(SampleGroupdDefinition sampleGroupDefinition)**
+### Measure.Scope()
 
-When method or frame measurements are not enough you can use the following to measure. It will measure Scope, Frames, Markers or Custom.
+Measures execution time for the scope as a single time, for both synchronous and asynchronous methods.
 
-> Example 1: Measuring a scope
+#### Example 1: Measuring a scope; execution time is measured for everything in the using statement
 
 ``` csharp
 [PerformanceTest]
@@ -207,27 +217,12 @@
 }
 ```
 
-**IDisposable Measure.FrameTimes(SampleGroupdDefinition sampleGroupDefinition)**
-
-> Example 1: Sample frame times for a scope
-
-``` csharp
-[PerformanceUnityTest]
-public IEnumerator Test()
-{
-    using (Measure.Frames().Scope())
-    {
-        yield return ...;
-    }
-}
-```
-
 
-**IDisposable Measure.ProfilerMarkers(SampleGroupDefinition[] sampleGroupDefinitions)**
+### Measure.ProfilerMarkers()
 
-When you want to record samples outside of frame time, method time, or profiler markers, use a custom measurement. It can be any double value. A sample group definition is required.
+Used to record profiler markers. Profiler marker timings will be sampled within the scope of the `using` statement. Note that deep and editor profiling markers are not available.
 
-> Example 1: Use a custom measurement to capture total allocated memory
+#### Example 1: Use a custom measurement to capture total allocated memory
 
 ``` csharp
 [PerformanceTest]
@@ -249,9 +244,11 @@
 ```
 
 
-**void Custom(SampleGroupDefinition sampleGroupDefinition, double value)**
+### Measure.Custom()
 
-Records a custom sample. It can be any double value. A sample group definition is required.
+When you want to record samples outside of frame time, method time, or profiler markers, use a custom measurement. It can be any double value. A sample group definition is required.
+
+#### Example 1: Use a custom measurement to capture total allocated memory
 
 ``` csharp
 [PerformanceTest]
@@ -264,11 +261,15 @@
 
 ## Output
 
-Each performance test will have a performance test summary. Every sample group will have multiple aggregated samples such as median, min, max, average, standard deviation, sample count, count of zero samples and sum of all samples.
+When a test is selected in the Unity Test Runner window within the Unity Editor, each performance test will have a performance test summary. This summary includes every sample group’s aggregated samples such as median, min, max, average, standard deviation, sample count, count of zero samples and sum of all samples.
+
+#### Example 1: Performance Test Summary from Unity Test Runner window
 
 `Time Millisecond Median:53.59 Min:53.36 Max:62.10 Avg:54.07 Std:1.90 Zeroes:0 SampleCount: 19 Sum: 1027.34`
 
-## Examples
+## More Examples
+
+#### Example 1: Measure execution time to serialize simple object to JSON
 
 ``` csharp
 [PerformanceTest, Version("2")]
@@ -313,6 +314,7 @@
 ```
 
 
+#### Example 2: Measure execution time to create 5000 simple cubes
 
 ``` csharp
 SampleGroupDefinition[] m_definitions =
@@ -340,6 +342,7 @@
 }
 ```
 
+#### Example 3: Scene measurements
 
 ``` csharp
 [PerformanceUnityTest]
@@ -355,10 +358,9 @@
 }
 ```
 
+#### Example 4: Custom measurement to capture total allocated and reserved memory
 
 ``` csharp
-// Records allocated and reserved memory, specifies that the sample unit is in Megabytes.
-
 [PerformanceTest, Version("1")]
 public void Measure_Empty()
 {
````
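The index.md changes above document the fluent configuration methods on `Measure.Method` (`WarmupCount`, `MeasurementCount`, `IterationsPerMeasurement`). As a quick illustration of how those pieces fit together, here is a minimal sketch of a test using the documented API. The class name, method name, and counts are hypothetical, and the terminating `Run()` call is an assumption about this package version rather than something stated in the diff:

``` csharp
using Unity.PerformanceTesting;
using UnityEngine;

public class PerformanceSketch
{
    // Hypothetical test; the workload and counts are illustrative only.
    [PerformanceTest, Version("1")]
    public void Vector2_Addition()
    {
        Measure.Method(() =>
            {
                Vector2 sum = Vector2.one + Vector2.one; // workload under test
            })
            .WarmupCount(5)                // executions before sampling starts
            .MeasurementCount(10)          // measurements to capture (default is 7)
            .IterationsPerMeasurement(100) // method executions per measurement
            .Run();                        // assumed terminating call
    }
}
```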

README.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -21,7 +21,7 @@
 ``` json
 {
   "dependencies": {
-    "com.unity.test-framework.performance": "0.1.39-preview",
+    "com.unity.test-framework.performance": "0.1.40-preview",
     "com.unity.modules.jsonserialize": "1.0.0",
     "com.unity.modules.unitywebrequest": "1.0.0",
     "com.unity.modules.unityanalytics": "1.0.0",
````

package.json

Lines changed: 2 additions & 2 deletions
````diff
@@ -1,8 +1,8 @@
 {
   "name": "com.unity.test-framework.performance",
   "displayName":"Performance testing API",
-  "version": "0.1.39-preview",
-  "unity": "2018.2",
+  "version": "0.1.40-preview",
+  "unity": "2018.1",
   "description": "Performance testing API.",
   "keywords": ["performance", "test"],
   "dependencies": {
````
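The installation steps in the index.md diff above say to add the package to both the `dependencies` and `testables` sections of the project manifest. A minimal manifest.json following that guidance might look like the sketch below; the built-in module dependencies shown in the full documentation example are omitted here for brevity:

``` json
{
  "dependencies": {
    "com.unity.test-framework.performance": "0.1.40-preview"
  },
  "testables": [
    "com.unity.test-framework.performance"
  ]
}
```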
