Calculate the time it takes for a media-related metric value to reach a given threshold after an event, and store that time in a custom metric.
Arguments
Name | Type | Description |
---|---|---|
key | string | The name of the custom metric to set. |
event name | string | The name of the event to start the timer from. Events are created using rtcEvent() (see the sketch after this table). |
criteria | string | The threshold criteria at which the timer should stop. Similar in format to rtcSetTestExpectation(). |
aggregate | string | The type of aggregation to use for this custom metric across agents in the same test run: "sum" sums the metric's value across all agents; "avg" averages the metric's value across all agents. |
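For context, here is a minimal sketch of how the timer event and the metric call pair up in a script. The event name "CallStart" is illustrative, and the scope argument passed to rtcEvent() is an assumption to verify against the rtcEvent() reference:

```javascript
// Sketch: emit the start event, then time how long until the threshold is met.
// "CallStart" is an illustrative name; rtcEvent()'s scope argument is assumed.
client
  .rtcEvent("CallStart", "global")   // mark the moment the timer starts from
  .rtcSetMetricFromThresholdTime(
    "timeTo1Mbps",                   // key: custom metric to create
    "CallStart",                     // event name: must match the rtcEvent() above
    "video.out.bitrate > 1000",      // criteria: stop once outgoing video exceeds 1000 kbps
    "avg"                            // aggregate: average the time across agents
  );
```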
Supported criteria
The following criteria are supported:
- [video|audio].[in|out].bitrate – expressed in kilobits per second (kbps)
- video.[in|out].fps – expressed as an integer frame rate (a sketch follows this list)
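The fps criteria can be used the same way. Here is a hedged variant, not taken from the original examples, that times how long incoming video takes to exceed a target frame rate; the metric name and threshold are illustrative:

```javascript
// Sketch: time from the "CallStart" event until incoming video exceeds 30 fps,
// averaged across all probes in the test run. Names and threshold are illustrative.
client
  .rtcSetMetricFromThresholdTime("timeTo30Fps", "CallStart", "video.in.fps > 30", "avg");
```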
Code examples
```javascript
client
  .rtcSetMetricFromThresholdTime("timeTo1Mbps", "CallStart", "video.out.bitrate > 1000", "avg");
```
Note: The example measures the time it took the probe to reach 1 Mbps of outgoing video bitrate, placing the result in a custom metric called timeTo1Mbps and aggregating it as an average across the probes in the test.