Commit 61463aa1 by Torkel Ödegaard Committed by GitHub

Transform: Fixed issue in labels to fields and update docs (#27501)

parent 6932b459
@@ -107,25 +107,24 @@ In the example below, we have two queries returning table data. It is visualized
Query A:
| Time | Job | Uptime |
| ------------------- | ------- | --------- |
| 2020-07-07 11:34:20 | node | 25260122 |
| 2020-07-07 11:24:20 | postgre | 123001233 |
Query B:
| Time | Job | Errors |
| ------------------- | ------- | ------ |
| 2020-07-07 11:34:20 | node | 15 |
| 2020-07-07 11:24:20 | postgre | 5 |
Here is the result after applying the `Merge` transformation.
| Time | Job | Errors | Uptime |
| ------------------- | ------- | ------ | --------- |
| 2020-07-07 11:34:20 | node | 15 | 25260122 |
| 2020-07-07 11:24:20 | postgre | 5 | 123001233 |
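As a rough illustration of what `Merge` does here, the sketch below joins the two result sets on the columns they share (`Time` and `Job`) and combines the remaining columns. The `Row` type and the `mergeOnSharedKeys` helper are illustrative stand-ins, not Grafana's internal implementation.

```ts
type Row = Record<string, string | number>;

// Join rows from both queries on the key columns they have in common,
// merging the remaining columns into a single row per key.
function mergeOnSharedKeys(a: Row[], b: Row[], keys: string[]): Row[] {
  const index = new Map<string, Row>();
  for (const row of a) {
    index.set(keys.map((k) => String(row[k])).join('|'), { ...row });
  }
  for (const row of b) {
    const key = keys.map((k) => String(row[k])).join('|');
    const existing = index.get(key);
    if (existing) {
      Object.assign(existing, row); // adds Errors next to Uptime
    } else {
      index.set(key, { ...row });
    }
  }
  return [...index.values()];
}

const queryA: Row[] = [
  { Time: '2020-07-07 11:34:20', Job: 'node', Uptime: 25260122 },
  { Time: '2020-07-07 11:24:20', Job: 'postgre', Uptime: 123001233 },
];
const queryB: Row[] = [
  { Time: '2020-07-07 11:34:20', Job: 'node', Errors: 15 },
  { Time: '2020-07-07 11:24:20', Job: 'postgre', Errors: 5 },
];

// Each output row now carries Time, Job, Errors, and Uptime.
console.log(mergeOnSharedKeys(queryA, queryB, ['Time', 'Job']));
```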
### Filter by name
Use this transformation to remove portions of the query results.
@@ -209,12 +208,30 @@ In the example below, I added two fields together and named them Sum.
### Labels to fields
Use this transformation to group series by time and return labels or tags as fields.
> **Note:** In order to apply this transformation, your query needs to return labeled fields.
When you select this transformation, Grafana automatically transforms all labeled data into fields.
Example: Given a query result of two time series
1: labels Server=Server A, Datacenter=EU
2: labels Server=Server B, Datacenter=EU
This would result in a table like this:
| Time | Server | Datacenter | Value |
| ------------------- | -------- | ---------- | ----- |
| 2020-07-07 11:34:20 | Server A | EU | 1 |
| 2020-07-07 11:34:20 | Server B | EU | 2 |
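As a minimal sketch of that reshaping, the snippet below turns labeled points into table rows; `LabeledPoint` and `labelsToFields` are simplified stand-ins rather than Grafana's real data-frame types.

```ts
interface LabeledPoint {
  time: string;
  value: number;
  labels: Record<string, string>;
}

// Each label key becomes its own column alongside Time and Value.
function labelsToFields(points: LabeledPoint[]): Array<Record<string, string | number>> {
  return points.map((p) => ({ Time: p.time, ...p.labels, Value: p.value }));
}

const series: LabeledPoint[] = [
  { time: '2020-07-07 11:34:20', value: 1, labels: { Server: 'Server A', Datacenter: 'EU' } },
  { time: '2020-07-07 11:34:20', value: 2, labels: { Server: 'Server B', Datacenter: 'EU' } },
];

// Produces rows with Time, Server, Datacenter, and Value columns.
console.log(labelsToFields(series));
```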
**Value field name**
If you were to select `Server` as the **Value field name**, you would get one field for every value of the `Server`
label.
| Time | Datacenter | Server A | Server B |
| ------------------- | ---------- | -------- | -------- |
| 2020-07-07 11:34:20 | EU | 1 | 2 |
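The pivot behind this option can be sketched as follows; `Point` and `pivotOnLabel` are hypothetical names used only for illustration, not Grafana's actual API.

```ts
interface Point {
  time: string;
  value: number;
  labels: Record<string, string>;
}

// Pivot on one label so each of its values (Server A, Server B, ...)
// becomes a separate value column, keyed by time and the remaining labels.
function pivotOnLabel(points: Point[], valueFieldName: string): Array<Record<string, string | number>> {
  const rows = new Map<string, Record<string, string | number>>();
  for (const p of points) {
    const { [valueFieldName]: pivot, ...rest } = p.labels;
    const key = [p.time, ...Object.values(rest)].join('|');
    const row = rows.get(key) ?? { Time: p.time, ...rest };
    row[pivot] = p.value; // e.g. a "Server A" or "Server B" column
    rows.set(key, row);
  }
  return [...rows.values()];
}

const points: Point[] = [
  { time: '2020-07-07 11:34:20', value: 1, labels: { Server: 'Server A', Datacenter: 'EU' } },
  { time: '2020-07-07 11:34:20', value: 2, labels: { Server: 'Server B', Datacenter: 'EU' } },
];

// Yields one row: Time, Datacenter, "Server A" = 1, "Server B" = 2.
console.log(pivotOnLabel(points, 'Server'));
```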
For this example, I manually defined labels in the Random Walk visualization of TestData DB.
{{< docs-imagebox img="/img/docs/transformations/labels-to-fields-before-7-0.png" class="docs-image--no-shadow" max-width= "1100px" >}}
@@ -223,7 +240,6 @@ After I apply the transformation, my labels appear in the table as fields.
{{< docs-imagebox img="/img/docs/transformations/labels-to-fields-after-7-0.png" class="docs-image--no-shadow" max-width="1100px" >}}
### Group By
> **Note:** This transformation is only available in Grafana 7.2+.
@@ -232,57 +248,58 @@ This transformation groups the data by a specified field (column) value and proc
Here's an example of original data.
| Time | Server ID | CPU Temperature | Server Status |
| ------------------- | --------- | --------------- | ------------- |
| 2020-07-07 11:34:20 | server 1 | 80 | Shutdown |
| 2020-07-07 11:34:20 | server 3 | 62 | OK |
| 2020-07-07 10:32:20 | server 2 | 90 | Overload |
| 2020-07-07 10:31:22 | server 3 | 55 | OK |
| 2020-07-07 09:30:57 | server 3 | 62 | Rebooting |
| 2020-07-07 09:30:05 | server 2 | 88 | OK |
| 2020-07-07 09:28:06 | server 1 | 80 | OK |
| 2020-07-07 09:25:05 | server 2 | 88 | OK |
| 2020-07-07 09:23:07 | server 1 | 86 | OK |
This transformation works in two steps. First, you specify one or more fields to group the data by. This groups all rows with the same values in those fields together, as if you had sorted them. For instance, if we `Group By` the `Server ID` field, it would group the data this way:
| Time | Server ID | CPU Temperature | Server Status |
| ------------------- | ------------ | --------------- | ------------- |
| 2020-07-07 11:34:20 | **server 1** | 80 | Shutdown |
| 2020-07-07 09:28:06 | **server 1** | 80 | OK |
| 2020-07-07 09:23:07 | **server 1** | 86 | OK |
|
| 2020-07-07 10:32:20 | server 2 | 90 | Overload
| 2020-07-07 09:30:05 | server 2 | 88 | OK
| 2020-07-07 09:25:05 | server 2 | 88 | OK
|
| 2020-07-07 11:34:20 | **_server 3_** | 62 | OK
| 2020-07-07 10:31:22 | **_server 3_** | 55 | OK
| 2020-07-07 09:30:57 | **_server 3_** | 62 | Rebooting
All rows with the same value of `Server ID` are grouped together.
After choosing which field you want to group your data by, you can add various calculations on the other fields; each calculation is then applied per group of rows. For instance, we might want to calculate the average `CPU Temperature` for each of those servers. So we can add the _mean_ calculation applied to the `CPU Temperature` field to get the following:
| Server ID | CPU Temperature (mean) |
| --------- | ---------------------- |
| server 1 | 82 |
| server 2 | 88.6 |
| server 3 | 59.6 |
And we can add more than one calculation. For instance:
- For the `Time` field, we can calculate the _Last_ value, to know when the last data point was received for each server
- For the `Server Status` field, we can calculate the _Last_ value, to know the last state of each server
- For the `CPU Temperature` field, we can also calculate the _Last_ value, to know the latest monitored temperature for each server
We would then get:
| Server ID | CPU Temperature (mean) | CPU Temperature (last) | Time (last) | Server Status (last) |
| --------- | ---------------------- | ---------------------- | ------------------- | -------------------- |
| server 1 | 82 | 80 | 2020-07-07 11:34:20 | Shutdown |
| server 2 | 88.6 | 90 | 2020-07-07 10:32:20 | Overload |
| server 3 | 59.6 | 62 | 2020-07-07 11:34:20 | OK |
This transformation lets you extract key information from your time series and display it in a convenient way.
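To make the two steps concrete, here is a rough TypeScript sketch of grouping by `Server ID` and then reducing each group with the _mean_ and _Last_ calculations. It uses plain arrays and hypothetical names (`ServerRow`, `summary`) rather than Grafana's data-frame structures.

```ts
interface ServerRow {
  time: string;
  serverId: string;
  cpuTemperature: number;
  serverStatus: string;
}

// The original data, ordered newest first as in the tables above.
const rows: ServerRow[] = [
  { time: '2020-07-07 11:34:20', serverId: 'server 1', cpuTemperature: 80, serverStatus: 'Shutdown' },
  { time: '2020-07-07 11:34:20', serverId: 'server 3', cpuTemperature: 62, serverStatus: 'OK' },
  { time: '2020-07-07 10:32:20', serverId: 'server 2', cpuTemperature: 90, serverStatus: 'Overload' },
  { time: '2020-07-07 10:31:22', serverId: 'server 3', cpuTemperature: 55, serverStatus: 'OK' },
  { time: '2020-07-07 09:30:57', serverId: 'server 3', cpuTemperature: 62, serverStatus: 'Rebooting' },
  { time: '2020-07-07 09:30:05', serverId: 'server 2', cpuTemperature: 88, serverStatus: 'OK' },
  { time: '2020-07-07 09:28:06', serverId: 'server 1', cpuTemperature: 80, serverStatus: 'OK' },
  { time: '2020-07-07 09:25:05', serverId: 'server 2', cpuTemperature: 88, serverStatus: 'OK' },
  { time: '2020-07-07 09:23:07', serverId: 'server 1', cpuTemperature: 86, serverStatus: 'OK' },
];

// Step 1: group rows by the Server ID field.
const groups = new Map<string, ServerRow[]>();
for (const row of rows) {
  const group = groups.get(row.serverId) ?? [];
  group.push(row);
  groups.set(row.serverId, group);
}

// Step 2: apply a calculation per field on each group
// (mean temperature, plus the last temperature, time, and status).
const summary = [...groups.entries()].map(([serverId, group]) => ({
  serverId,
  cpuTemperatureMean: group.reduce((sum, r) => sum + r.cpuTemperature, 0) / group.length,
  cpuTemperatureLast: group[0].cpuTemperature, // group[0] is the newest row
  timeLast: group[0].time,
  serverStatusLast: group[0].serverStatus,
}));

console.log(summary);
```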
@@ -299,7 +316,7 @@ In the example below, we have two queries returning time series data. It is visu
Query A:
| Time | Temperature |
| ------------------- | ----------- |
| 2020-07-07 11:34:20 | 25 |
| 2020-07-07 10:31:22 | 22 |
| 2020-07-07 09:30:05 | 19 |
@@ -307,7 +324,7 @@ Query A:
Query B:
| Time | Humidity |
| ------------------- | -------- |
| 2020-07-07 11:34:20 | 24 |
| 2020-07-07 10:32:20 | 29 |
| 2020-07-07 09:30:57 | 33 |
@@ -315,7 +332,7 @@ Query B:
Here is the result after applying the `Series to rows` transformation.
| Time | Metric | Value |
| ------------------- | ----------- | ----- |
| 2020-07-07 11:34:20 | Temperature | 25 |
| 2020-07-07 11:34:20 | Humidity    | 24    |
| 2020-07-07 10:32:20 | Humidity | 29 |
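As a hypothetical sketch of this reshaping, each input series contributes one row per point, with the series name placed in the `Metric` column; the `SimpleSeries` type and `seriesToRows` helper are illustrative only, not Grafana's internal code.

```ts
interface SimpleSeries {
  name: string;
  points: Array<{ time: string; value: number }>;
}

// Flatten every series into Time / Metric / Value rows, newest first.
function seriesToRows(series: SimpleSeries[]): Array<{ Time: string; Metric: string; Value: number }> {
  return series
    .flatMap((s) => s.points.map((p) => ({ Time: p.time, Metric: s.name, Value: p.value })))
    .sort((a, b) => b.Time.localeCompare(a.Time));
}

const temperature: SimpleSeries = {
  name: 'Temperature',
  points: [
    { time: '2020-07-07 11:34:20', value: 25 },
    { time: '2020-07-07 10:31:22', value: 22 },
  ],
};
const humidity: SimpleSeries = {
  name: 'Humidity',
  points: [
    { time: '2020-07-07 11:34:20', value: 24 },
    { time: '2020-07-07 10:32:20', value: 29 },
  ],
};

console.log(seriesToRows([temperature, humidity]));
```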
@@ -50,7 +50,16 @@ describe('Labels as Columns', () => {
name: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000] },
{
name: 'Value',
type: FieldType.number,
values: [1, 2],
labels: { location: 'inside', name: 'Request' },
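// display name overrides are set on purpose here; the labels to fields transform is expected to clear them because they can contain label names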
config: {
displayName: 'Custom1',
displayNameFromDS: 'Custom2',
},
},
],
});
@@ -51,7 +51,9 @@ export const labelsToFieldsTransformer: DataTransformerInfo<LabelsToFieldsOption
name,
config: {
...field.config,
// we need to clear these for this transform as they can contain label names that we no longer want
displayName: undefined,
displayNameFromDS: undefined,
},
labels: undefined,
});