Commit 67ed5796 by Daniel Lee, committed by GitHub

azuremonitor: Deep linking from Log Analytics queries to the Azure Portal (#24417)

* azuremonitor: add gzipped and base64 encoded query to metadata

for Azure Log Analytics query responses

* azure monitor: add fields to metadata for log analytics

* azuremonitor: correction to text in query editor

* azuremonitor: adds subscription id to result metadata

* azuremonitor: build deep link url for Log Analytics

Most of the information needed for building the url
comes from the backend. The workspace friendly name
and the resource group that the workspace belongs
to are fetched in a separate API call. This call is
cached otherwise there would be a workspaces call
per query on the dashboard.
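
The caching approach described above can be sketched as follows. This is illustrative only: the actual cache in this change is a `Map` keyed by request URL in the TypeScript datasource, and `workspaceCache` here is a hypothetical name.

```go
package main

import "fmt"

// workspaceCache memoizes responses keyed by URL so repeated panel
// queries on a dashboard trigger at most one workspaces API call.
type workspaceCache struct {
	entries map[string]string
	fetches int
}

func (c *workspaceCache) get(url string, fetch func(string) string) string {
	if v, ok := c.entries[url]; ok {
		return v // served from cache, no network call
	}
	c.fetches++
	v := fetch(url)
	c.entries[url] = v
	return v
}

func main() {
	c := &workspaceCache{entries: map[string]string{}}
	fetch := func(u string) string { return "response for " + u }
	c.get("/subscriptions/s1/workspaces", fetch)
	c.get("/subscriptions/s1/workspaces", fetch) // cache hit
	fmt.Println(c.fetches) // 1
}
```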

* docs: azure log analytics deep linking

* Apply suggestions from code review

Co-authored-by: Diana Payton <52059945+oddlittlebird@users.noreply.github.com>

* docs: fixing review comments for azure monitor

Co-authored-by: Diana Payton <52059945+oddlittlebird@users.noreply.github.com>
parent ae7f0aeb
......@@ -79,7 +79,7 @@ In the query editor for a panel, after choosing your Azure Monitor data source,
The query editor will change depending on which one you pick. Azure Monitor is the default.
## Querying the Azure Monitor Service
## Querying the Azure Monitor service
The Azure Monitor service provides metrics for all the Azure services that you have running. It helps you understand how your applications on Azure are performing and proactively find issues affecting them.
......@@ -93,7 +93,7 @@ Examples of metrics that you can get from the service are:
{{< docs-imagebox img="/img/docs/v60/azuremonitor-service-query-editor.png" class="docs-image--no-shadow" caption="Azure Monitor Query Editor" >}}
### Formatting Legend Keys with Aliases for the Azure Monitor Service
### Formatting legend keys with aliases for Azure Monitor
The default legend formatting for the Azure Monitor API is:
......@@ -106,7 +106,7 @@ Azure Monitor Examples:
- `dimension: {{dimensionvalue}}`
- `{{resourcegroup}} - {{resourcename}}`
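
A minimal sketch of how such alias patterns can be expanded against a series' labels. The `formatLegend` helper is hypothetical; the real substitution is implemented inside the datasource plugin.

```go
package main

import (
	"fmt"
	"regexp"
)

// aliasRe matches {{pattern}} placeholders such as {{resourcegroup}}.
var aliasRe = regexp.MustCompile(`\{\{\s*(\w+)\s*\}\}`)

// formatLegend replaces each {{...}} placeholder in the alias with the
// corresponding label value, leaving unknown placeholders untouched.
func formatLegend(alias string, labels map[string]string) string {
	return aliasRe.ReplaceAllStringFunc(alias, func(m string) string {
		key := aliasRe.FindStringSubmatch(m)[1]
		if v, ok := labels[key]; ok {
			return v
		}
		return m
	})
}

func main() {
	labels := map[string]string{"resourcegroup": "prod-rg", "resourcename": "vm01"}
	fmt.Println(formatLegend("{{resourcegroup}} - {{resourcename}}", labels))
	// prod-rg - vm01
}
```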
### Alias Patterns for Azure Monitor
### Alias patterns for Azure Monitor
- `{{resourcegroup}}` = replaced with the value of the Resource Group
- `{{namespace}}` = replaced with the value of the Namespace (e.g. Microsoft.Compute/virtualMachines)
......@@ -115,7 +115,7 @@ Azure Monitor Examples:
- `{{dimensionname}}` = replaced with dimension key/label (e.g. blobtype)
- `{{dimensionvalue}}` = replaced with dimension value (e.g. BlockBlob)
### Templating with Variables for the Azure Monitor Service
### Templating with variables for Azure Monitor
Instead of hard-coding values like server, application, and sensor name in your metric queries, you can use variables in their place. Variables are shown as dropdown select boxes at the top of the dashboard. These dropdowns make it easy to change the data being displayed in your dashboard.
......@@ -149,11 +149,11 @@ Examples:
Check out the [Templating]({{< relref "../../variables/templates-and-variables.md" >}}) documentation for an introduction to the templating feature and the different
types of template variables.
### Azure Monitor Metrics Whitelist
### Azure Monitor metrics whitelist
Not all metrics returned by the Azure Monitor API have values. The Grafana data source maintains a whitelist and only returns metric names that can possibly have values. This whitelist is updated regularly as new services and metrics are added to the Azure cloud. You can find the current whitelist [here](https://github.com/grafana/grafana/blob/master/public/app/plugins/datasource/grafana-azure-monitor-datasource/azure_monitor/supported_namespaces.ts).
### Azure Monitor Alerting
### Azure Monitor alerting
Grafana alerting is supported for the Azure Monitor service. This is not support for Azure Alerts. Read more about how alerting in Grafana works [here]({{< relref "../../alerting/rules.md" >}}).
......@@ -163,7 +163,7 @@ Grafana alerting is supported for the Azure Monitor service. This is not Azure A
{{< docs-imagebox img="/img/docs/v60/appinsights-service-query-editor.png" class="docs-image--no-shadow" caption="Application Insights Query Editor" >}}
### Formatting Legend Keys with Aliases for the Application Insights Service
### Formatting legend keys with aliases for Application Insights
The default legend formatting is:
......@@ -177,13 +177,13 @@ Application Insights Examples:
- `city: {{groupbyvalue}}`
- `{{groupbyname}}: {{groupbyvalue}}`
### Alias Patterns for Application Insights
### Alias patterns for Application Insights
- `{{groupbyvalue}}` = replaced with the value of the group by
- `{{groupbyname}}` = replaced with the name/label of the group by
- `{{metric}}` = replaced with metric name (e.g. requests/count)
### Filter Expressions for Application Insights
### Filter expressions for Application Insights
The filter field takes an OData filter expression.
......@@ -194,7 +194,7 @@ Examples:
- `client/city ne 'Boydton' and client/city ne 'Dublin'`
- `client/city eq 'Boydton' or client/city eq 'Dublin'`
### Templating with Variables for Application Insights
### Templating with variables for Application Insights
Use one of the following queries in the `Query` field in the Variable edit view.
......@@ -214,13 +214,13 @@ Examples:
{{< docs-imagebox img="/img/docs/v60/appinsights-service-variables.png" class="docs-image--no-shadow" caption="Nested Application Insights Template Variables" >}}
### Application Insights Alerting
### Application Insights alerting
Grafana alerting is supported for Application Insights. This is not support for Azure Alerts. Read more about how alerting in Grafana works [here]({{< relref "../../alerting/rules.md" >}}).
{{< docs-imagebox img="/img/docs/v60/azuremonitor-alerting.png" class="docs-image--no-shadow" caption="Azure Monitor Alerting" >}}
## Querying the Azure Log Analytics Service
## Querying the Azure Log Analytics service
Queries are written in the new [Azure Log Analytics (or KustoDB) Query Language](https://docs.loganalytics.io/index). A Log Analytics query can be formatted as time series data or as table data.
......@@ -246,7 +246,7 @@ If your credentials give you access to multiple subscriptions then choose the ap
{{< docs-imagebox img="/img/docs/v60/azureloganalytics-service-query-editor.png" class="docs-image--no-shadow" caption="Azure Log Analytics Query Editor" >}}
### Azure Log Analytics Macros
### Azure Log Analytics macros
To make writing queries easier, there are several Grafana macros that can be used in the `where` clause of a query:
......@@ -268,13 +268,13 @@ To make writing queries easier there are several Grafana macros that can be used
If you use the `All` option, check the `Include All Option` checkbox and type the value `all` in the `Custom all value` field. If `$myVar` has the value `all`, then the macro instead expands to `1 == 1`. For template variables with many options, this improves query performance by not building a large where..in clause.
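
The `all` short-circuit described above can be sketched as follows. `expandContains` is a hypothetical helper for illustration, not the plugin's actual macro code.

```go
package main

import (
	"fmt"
	"strings"
)

// expandContains mimics the $__contains macro: it builds a KQL
// where-clause fragment from a column name and a template-variable
// value list, short-circuiting to "1 == 1" for the custom 'all' value.
func expandContains(col string, values []string) string {
	if len(values) == 1 && values[0] == "all" {
		return "1 == 1" // avoids a large where..in clause
	}
	quoted := make([]string, len(values))
	for i, v := range values {
		quoted[i] = "'" + v + "'"
	}
	return fmt.Sprintf("%s in (%s)", col, strings.Join(quoted, ","))
}

func main() {
	fmt.Println(expandContains("Computer", []string{"comp1", "comp2"}))
	// Computer in ('comp1','comp2')
	fmt.Println(expandContains("Computer", []string{"all"}))
	// 1 == 1
}
```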
### Azure Log Analytics Builtin Variables
### Azure Log Analytics built-in variables
There are also some Grafana variables that can be used in Azure Log Analytics queries:
- `$__interval` - Grafana calculates the minimum time grain that can be used to group by time in queries. More details on how it works [here]({{< relref "../../variables/templates-and-variables.md#interval-variables" >}}). It returns a time grain like `5m` or `1h` that can be used in the bin function. E.g. `summarize count() by bin(TimeGenerated, $__interval)`
### Templating with Variables for Azure Log Analytics
### Templating with variables for Azure Log Analytics
Any Log Analytics query that returns a list of values can be used in the `Query` field in the Variable edit view. There is also one Grafana function for Log Analytics that returns a list of workspaces.
......@@ -313,11 +313,25 @@ Perf
| order by TimeGenerated asc
```
### Azure Log Analytics Alerting
### Deep linking from Grafana panels to the Log Analytics query editor in Azure Portal
Not implemented yet.
> Only available in Grafana v7.0+.
### Writing Analytics Queries For the Application Insights Service
{{< docs-imagebox img="/img/docs/v70/azure-log-analytics-deep-linking.png" max-width="500px" class="docs-image--right" caption="Azure Log Analytics deep linking" >}}
Click on a time series in the panel to see a context menu with a link to `View in Azure Portal`. Clicking that link opens the Azure Log Analytics query editor in the Azure Portal and runs the query from the Grafana panel there.
If you're not currently logged in to the Azure Portal, then the link opens the login page. The provided link is valid for any account, but it only displays the query if your account has access to the Azure Log Analytics workspace specified in the query.
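
The query embedded in the link is gzipped and base64 encoded, which is what the `isQueryBase64Compressed/true` segment of the deep link URL refers to. The backend does this with a small helper along these lines (matching the `encodeQuery` function added in this change):

```go
package main

import (
	"bytes"
	"compress/gzip"
	"encoding/base64"
	"fmt"
)

// encodeQuery gzips the raw KQL query and base64-encodes the result so
// it can be embedded in the Azure Portal deep link URL.
func encodeQuery(rawQuery string) (string, error) {
	var b bytes.Buffer
	gz := gzip.NewWriter(&b)
	if _, err := gz.Write([]byte(rawQuery)); err != nil {
		return "", err
	}
	if err := gz.Close(); err != nil {
		return "", err
	}
	return base64.StdEncoding.EncodeToString(b.Bytes()), nil
}

func main() {
	enc, err := encodeQuery("Perf | take 10")
	if err != nil {
		panic(err)
	}
	fmt.Println(len(enc) > 0) // true
}
```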
<div class="clearfix"></div>
### Azure Log Analytics alerting
> Only available in Grafana v7.0+.
Grafana alerting is supported for Azure Log Analytics. This is not support for Azure Alerts. Read more about how alerting in Grafana works in [Alerting rules]({{< relref "../../alerting/rules.md" >}}).
### Writing analytics queries for the Application Insights service
If you change the service type to "Application Insights", the menu icon to the right adds another option, "Toggle Edit Mode". Once clicked, the query edit mode changes to give you a full text area in which to write log analytics queries. (This is identical to how the InfluxDB data source lets you write raw queries.)
......@@ -362,4 +376,4 @@ datasources:
appInsightsApiKey: <app-insights-api-key>
logAnalyticsClientSecret: <log-analytics-client-secret>
version: 1
```
\ No newline at end of file
```
package azuremonitor
import (
"bytes"
"compress/gzip"
"context"
"encoding/base64"
"encoding/json"
"errors"
"fmt"
......@@ -35,6 +38,7 @@ type AzureLogAnalyticsQuery struct {
RefID string
ResultFormat string
URL string
Model *simplejson.Json
Params url.Values
Target string
}
......@@ -77,7 +81,6 @@ func (e *AzureLogAnalyticsDatasource) buildQueries(queries []*tsdb.Query, timeRa
}
urlComponents := map[string]string{}
urlComponents["subscription"] = fmt.Sprintf("%v", query.Model.Get("subscription").MustString())
urlComponents["workspace"] = fmt.Sprintf("%v", azureLogAnalyticsTarget["workspace"])
apiURL := fmt.Sprintf("%s/query", urlComponents["workspace"])
......@@ -92,6 +95,7 @@ func (e *AzureLogAnalyticsDatasource) buildQueries(queries []*tsdb.Query, timeRa
RefID: query.RefId,
ResultFormat: resultFormat,
URL: apiURL,
Model: query.Model,
Params: params,
Target: params.Encode(),
})
......@@ -145,12 +149,12 @@ func (e *AzureLogAnalyticsDatasource) executeQuery(ctx context.Context, query *A
azlog.Debug("AzureLogsAnalytics", "Response", queryResult)
if query.ResultFormat == "table" {
queryResult.Tables, queryResult.Meta, err = e.parseToTables(data, query.Params.Get("query"))
queryResult.Tables, queryResult.Meta, err = e.parseToTables(data, query.Model, query.Params)
if err != nil {
return nil, err
}
} else {
queryResult.Series, queryResult.Meta, err = e.parseToTimeSeries(data, query.Params.Get("query"))
queryResult.Series, queryResult.Meta, err = e.parseToTimeSeries(data, query.Model, query.Params)
if err != nil {
return nil, err
}
......@@ -233,9 +237,10 @@ func (e *AzureLogAnalyticsDatasource) unmarshalResponse(res *http.Response) (Azu
return data, nil
}
func (e *AzureLogAnalyticsDatasource) parseToTables(data AzureLogAnalyticsResponse, query string) ([]*tsdb.Table, *simplejson.Json, error) {
meta := metadata{
Query: query,
func (e *AzureLogAnalyticsDatasource) parseToTables(data AzureLogAnalyticsResponse, model *simplejson.Json, params url.Values) ([]*tsdb.Table, *simplejson.Json, error) {
meta, err := createMetadata(model, params)
if err != nil {
return nil, simplejson.NewFromAny(meta), err
}
tables := make([]*tsdb.Table, 0)
......@@ -267,9 +272,10 @@ func (e *AzureLogAnalyticsDatasource) parseToTables(data AzureLogAnalyticsRespon
return nil, nil, errors.New("no data as no PrimaryResult table was returned in the response")
}
func (e *AzureLogAnalyticsDatasource) parseToTimeSeries(data AzureLogAnalyticsResponse, query string) (tsdb.TimeSeriesSlice, *simplejson.Json, error) {
meta := metadata{
Query: query,
func (e *AzureLogAnalyticsDatasource) parseToTimeSeries(data AzureLogAnalyticsResponse, model *simplejson.Json, params url.Values) (tsdb.TimeSeriesSlice, *simplejson.Json, error) {
meta, err := createMetadata(model, params)
if err != nil {
return nil, simplejson.NewFromAny(meta), err
}
for _, t := range data.Tables {
......@@ -352,3 +358,32 @@ func (e *AzureLogAnalyticsDatasource) parseToTimeSeries(data AzureLogAnalyticsRe
return nil, nil, errors.New("no data as no PrimaryResult table was returned in the response")
}
func createMetadata(model *simplejson.Json, params url.Values) (metadata, error) {
meta := metadata{
Query: params.Get("query"),
Subscription: model.Get("subscriptionId").MustString(),
Workspace: model.Get("azureLogAnalytics").Get("workspace").MustString(),
}
encQuery, err := encodeQuery(meta.Query)
if err != nil {
return meta, err
}
meta.EncodedQuery = encQuery
return meta, nil
}
func encodeQuery(rawQuery string) (string, error) {
var b bytes.Buffer
gz := gzip.NewWriter(&b)
if _, err := gz.Write([]byte(rawQuery)); err != nil {
return "", err
}
if err := gz.Close(); err != nil {
return "", err
}
return base64.StdEncoding.EncodeToString(b.Bytes()), nil
}
......@@ -57,8 +57,15 @@ func TestBuildingAzureLogAnalyticsQueries(t *testing.T) {
RefID: "A",
ResultFormat: "time_series",
URL: "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/query",
Params: url.Values{"query": {"query=Perf | where ['TimeGenerated'] >= datetime('2018-03-15T13:00:00Z') and ['TimeGenerated'] <= datetime('2018-03-15T13:34:00Z') | where ['Computer'] in ('comp1','comp2') | summarize avg(CounterValue) by bin(TimeGenerated, 34000ms), Computer"}},
Target: "query=query%3DPerf+%7C+where+%5B%27TimeGenerated%27%5D+%3E%3D+datetime%28%272018-03-15T13%3A00%3A00Z%27%29+and+%5B%27TimeGenerated%27%5D+%3C%3D+datetime%28%272018-03-15T13%3A34%3A00Z%27%29+%7C+where+%5B%27Computer%27%5D+in+%28%27comp1%27%2C%27comp2%27%29+%7C+summarize+avg%28CounterValue%29+by+bin%28TimeGenerated%2C+34000ms%29%2C+Computer",
Model: simplejson.NewFromAny(map[string]interface{}{
"azureLogAnalytics": map[string]interface{}{
"query": "query=Perf | where $__timeFilter() | where $__contains(Computer, 'comp1','comp2') | summarize avg(CounterValue) by bin(TimeGenerated, $__interval), Computer",
"resultFormat": "time_series",
"workspace": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
},
}),
Params: url.Values{"query": {"query=Perf | where ['TimeGenerated'] >= datetime('2018-03-15T13:00:00Z') and ['TimeGenerated'] <= datetime('2018-03-15T13:34:00Z') | where ['Computer'] in ('comp1','comp2') | summarize avg(CounterValue) by bin(TimeGenerated, 34000ms), Computer"}},
Target: "query=query%3DPerf+%7C+where+%5B%27TimeGenerated%27%5D+%3E%3D+datetime%28%272018-03-15T13%3A00%3A00Z%27%29+and+%5B%27TimeGenerated%27%5D+%3C%3D+datetime%28%272018-03-15T13%3A34%3A00Z%27%29+%7C+where+%5B%27Computer%27%5D+in+%28%27comp1%27%2C%27comp2%27%29+%7C+summarize+avg%28CounterValue%29+by+bin%28TimeGenerated%2C+34000ms%29%2C+Computer",
},
},
Err: require.NoError,
......@@ -69,7 +76,7 @@ func TestBuildingAzureLogAnalyticsQueries(t *testing.T) {
t.Run(tt.name, func(t *testing.T) {
queries, err := datasource.buildQueries(tt.queryModel, tt.timeRange)
tt.Err(t, err)
if diff := cmp.Diff(tt.azureLogAnalyticsQueries, queries, cmpopts.EquateNaNs()); diff != "" {
if diff := cmp.Diff(tt.azureLogAnalyticsQueries, queries, cmpopts.IgnoreUnexported(simplejson.Json{})); diff != "" {
t.Errorf("Result mismatch (-want +got):\n%s", diff)
}
})
......@@ -100,7 +107,7 @@ func TestParsingAzureLogAnalyticsResponses(t *testing.T) {
},
},
},
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"Computer","type":"string"},{"name":"avg_CounterValue","type":"real"}],"query":"test query"}`,
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"Computer","type":"string"},{"name":"avg_CounterValue","type":"real"}],"subscription":"1234","workspace":"aworkspace","query":"test query","encodedQuery":"H4sIAAAAAAAA/ypJLS5RKCxNLaoEBAAA///0rBfVCgAAAA=="}`,
Err: require.NoError,
},
{
......@@ -133,7 +140,7 @@ func TestParsingAzureLogAnalyticsResponses(t *testing.T) {
},
},
},
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"ObjectName","type":"string"},{"name":"avg_CounterValue","type":"real"}],"query":"test query"}`,
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"ObjectName","type":"string"},{"name":"avg_CounterValue","type":"real"}],"subscription":"1234","workspace":"aworkspace","query":"test query","encodedQuery":"H4sIAAAAAAAA/ypJLS5RKCxNLaoEBAAA///0rBfVCgAAAA=="}`,
Err: require.NoError,
},
{
......@@ -150,7 +157,7 @@ func TestParsingAzureLogAnalyticsResponses(t *testing.T) {
},
},
},
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"avg_CounterValue","type":"int"}],"query":"test query"}`,
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"avg_CounterValue","type":"int"}],"subscription":"1234","workspace":"aworkspace","query":"test query","encodedQuery":"H4sIAAAAAAAA/ypJLS5RKCxNLaoEBAAA///0rBfVCgAAAA=="}`,
Err: require.NoError,
},
{
......@@ -158,7 +165,7 @@ func TestParsingAzureLogAnalyticsResponses(t *testing.T) {
testFile: "loganalytics/4-log-analytics-response-metrics-no-time-column.json",
query: "test query",
series: nil,
meta: `{"columns":[{"name":"Computer","type":"string"},{"name":"avg_CounterValue","type":"real"}],"query":"test query"}`,
meta: `{"columns":[{"name":"Computer","type":"string"},{"name":"avg_CounterValue","type":"real"}],"subscription":"1234","workspace":"aworkspace","query":"test query","encodedQuery":"H4sIAAAAAAAA/ypJLS5RKCxNLaoEBAAA///0rBfVCgAAAA=="}`,
Err: require.NoError,
},
{
......@@ -166,7 +173,7 @@ func TestParsingAzureLogAnalyticsResponses(t *testing.T) {
testFile: "loganalytics/5-log-analytics-response-metrics-no-value-column.json",
query: "test query",
series: nil,
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"Computer","type":"string"}],"query":"test query"}`,
meta: `{"columns":[{"name":"TimeGenerated","type":"datetime"},{"name":"Computer","type":"string"}],"subscription":"1234","workspace":"aworkspace","query":"test query","encodedQuery":"H4sIAAAAAAAA/ypJLS5RKCxNLaoEBAAA///0rBfVCgAAAA=="}`,
Err: require.NoError,
},
}
......@@ -175,7 +182,15 @@ func TestParsingAzureLogAnalyticsResponses(t *testing.T) {
t.Run(tt.name, func(t *testing.T) {
data, _ := loadLogAnalyticsTestFile(tt.testFile)
series, meta, err := datasource.parseToTimeSeries(data, tt.query)
model := simplejson.NewFromAny(map[string]interface{}{
"subscriptionId": "1234",
"azureLogAnalytics": map[string]interface{}{
"workspace": "aworkspace",
},
})
params := url.Values{}
params.Add("query", tt.query)
series, meta, err := datasource.parseToTimeSeries(data, model, params)
tt.Err(t, err)
if diff := cmp.Diff(tt.series, series, cmpopts.EquateNaNs()); diff != "" {
......@@ -262,7 +277,7 @@ func TestParsingAzureLogAnalyticsTableResponses(t *testing.T) {
},
meta: `{"columns":[{"name":"TenantId","type":"string"},{"name":"Computer","type":"string"},{"name":"ObjectName","type":"string"},{"name":"CounterName","type":"string"},` +
`{"name":"InstanceName","type":"string"},{"name":"Min","type":"real"},{"name":"Max","type":"real"},{"name":"SampleCount","type":"int"},{"name":"CounterValue","type":"real"},` +
`{"name":"TimeGenerated","type":"datetime"}],"query":"test query"}`,
`{"name":"TimeGenerated","type":"datetime"}],"subscription":"1234","workspace":"aworkspace","query":"test query","encodedQuery":"H4sIAAAAAAAA/ypJLS5RKCxNLaoEBAAA///0rBfVCgAAAA=="}`,
Err: require.NoError,
},
}
......@@ -271,7 +286,15 @@ func TestParsingAzureLogAnalyticsTableResponses(t *testing.T) {
t.Run(tt.name, func(t *testing.T) {
data, _ := loadLogAnalyticsTestFile(tt.testFile)
tables, meta, err := datasource.parseToTables(data, tt.query)
model := simplejson.NewFromAny(map[string]interface{}{
"subscriptionId": "1234",
"azureLogAnalytics": map[string]interface{}{
"workspace": "aworkspace",
},
})
params := url.Values{}
params.Add("query", tt.query)
tables, meta, err := datasource.parseToTables(data, model, params)
tt.Err(t, err)
if diff := cmp.Diff(tt.tables, tables, cmpopts.EquateNaNs()); diff != "" {
......
......@@ -79,8 +79,11 @@ type AzureLogAnalyticsTable struct {
}
type metadata struct {
Columns []column `json:"columns"`
Query string `json:"query"`
Columns []column `json:"columns"`
Subscription string `json:"subscription"`
Workspace string `json:"workspace"`
Query string `json:"query"`
EncodedQuery string `json:"encodedQuery"`
}
type column struct {
......
......@@ -218,7 +218,7 @@ export default class AppInsightsDatasource {
return {
status: 'error',
message: 'Returned http status code ' + response.status,
message: 'Application Insights: Returned http status code ' + response.status,
};
})
.catch((error: any) => {
......
......@@ -151,8 +151,11 @@ describe('AzureLogAnalyticsDatasource', () => {
refId: 'A',
meta: {
columns: ['TimeGenerated', 'Computer', 'avg_CounterValue'],
subscription: 'xxx',
workspace: 'aaaa-1111-bbbb-2222',
query:
'Perf\r\n| where ObjectName == "Memory" and CounterName == "Available MBytes Memory"\n| where TimeGenerated >= datetime(\'2020-04-23T09:15:20Z\') and TimeGenerated <= datetime(\'2020-04-23T09:20:20Z\')\n| where 1 == 1\n| summarize avg(CounterValue) by bin(TimeGenerated, 1m), Computer \n| order by TimeGenerated asc',
encodedQuery: 'gzipped_base64_encoded_query',
},
series: [
{
......@@ -171,12 +174,30 @@ describe('AzureLogAnalyticsDatasource', () => {
},
};
const workspacesResponse = {
value: [
{
properties: {
customerId: 'aaaa-1111-bbbb-2222',
},
id:
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourcegroups/defaultresourcegroup/providers/microsoft.operationalinsights/workspaces/aworkspace',
name: 'aworkspace',
type: 'Microsoft.OperationalInsights/workspaces',
},
],
};
describe('in time series format', () => {
describe('and the data is valid (has time, metric and value columns)', () => {
beforeEach(() => {
datasourceRequestMock.mockImplementation((options: { url: string }) => {
expect(options.url).toContain('/api/tsdb/query');
return Promise.resolve({ data: response, status: 200 });
if (options.url.indexOf('Microsoft.OperationalInsights/workspaces') > 0) {
return Promise.resolve({ data: workspacesResponse, status: 200 });
} else {
expect(options.url).toContain('/api/tsdb/query');
return Promise.resolve({ data: response, status: 200 });
}
});
});
......@@ -193,6 +214,11 @@ describe('AzureLogAnalyticsDatasource', () => {
expect(results.data[0].fields[1].values.get(0)).toEqual(2017.25);
expect(results.data[0].fields[0].values.get(1)).toEqual(1587633360000);
expect(results.data[0].fields[1].values.get(1)).toEqual(2048);
expect(results.data[0].fields[0].config.links[0].title).toEqual('View in Azure Portal');
expect(results.data[0].fields[0].config.links[0].targetBlank).toBe(true);
expect(results.data[0].fields[0].config.links[0].url).toEqual(
'https://portal.azure.com/#blade/Microsoft_OperationsManagementSuite_Workspace/AnalyticsBlade/initiator/AnalyticsShareLinkToQuery/isQueryEditorVisible/true/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2Fxxx%2Fresourcegroups%2Fdefaultresourcegroup%2Fproviders%2Fmicrosoft.operationalinsights%2Fworkspaces%2Faworkspace%22%7D%5D%7D/query/gzipped_base64_encoded_query/isQueryBase64Compressed/true/timespanInIsoFormat/P1D'
);
});
});
});
......
......@@ -15,6 +15,7 @@ export default class AzureLogAnalyticsDatasource {
azureMonitorUrl: string;
defaultOrFirstWorkspace: string;
subscriptionId: string;
cache: Map<string, any>;
/** @ngInject */
constructor(
......@@ -22,6 +23,7 @@ export default class AzureLogAnalyticsDatasource {
private templateSrv: TemplateSrv
) {
this.id = instanceSettings.id;
this.cache = new Map();
switch (this.instanceSettings.jsonData.cloudName) {
case 'govazuremonitor': // Azure US Government
......@@ -57,7 +59,7 @@ export default class AzureLogAnalyticsDatasource {
const azureCloud = this.instanceSettings.jsonData.cloudName || 'azuremonitor';
this.azureMonitorUrl = `/${azureCloud}/subscriptions`;
} else {
this.subscriptionId = this.instanceSettings.jsonData.logAnalyticsSubscriptionId;
this.subscriptionId = this.instanceSettings.jsonData.logAnalyticsSubscriptionId || '';
switch (this.instanceSettings.jsonData.cloudName) {
case 'govazuremonitor': // Azure US Government
......@@ -75,19 +77,23 @@ export default class AzureLogAnalyticsDatasource {
}
}
getWorkspaces(subscription: string): Promise<AzureLogsVariable[]> {
async getWorkspaces(subscription: string): Promise<AzureLogsVariable[]> {
const response = await this.getWorkspaceList(subscription);
return (
_.map(response.data.value, (val: any) => {
return { text: val.name, value: val.properties.customerId };
}) || []
);
}
getWorkspaceList(subscription: string): Promise<any> {
const subscriptionId = this.templateSrv.replace(subscription || this.subscriptionId);
const workspaceListUrl =
this.azureMonitorUrl +
`/${subscriptionId}/providers/Microsoft.OperationalInsights/workspaces?api-version=2017-04-26-preview`;
return this.doRequest(workspaceListUrl).then((response: any) => {
return (
_.map(response.data.value, val => {
return { text: val.name, value: val.properties.customerId };
}) || []
);
});
return this.doRequest(workspaceListUrl, true);
}
getSchema(workspace: string) {
......@@ -148,26 +154,87 @@ export default class AzureLogAnalyticsDatasource {
const result: DataQueryResponseData[] = [];
if (data.results) {
Object.values(data.results).forEach((queryRes: any) => {
queryRes.series?.forEach((series: any) => {
const results: any[] = Object.values(data.results);
for (let queryRes of results) {
for (let series of queryRes.series || []) {
const timeSeries: TimeSeries = {
target: series.name,
datapoints: series.points,
refId: queryRes.refId,
meta: queryRes.meta,
};
result.push(toDataFrame(timeSeries));
});
const df = toDataFrame(timeSeries);
if (queryRes.meta.encodedQuery && queryRes.meta.encodedQuery.length > 0) {
const url = await this.buildDeepLink(queryRes);
if (url.length > 0) {
for (const field of df.fields) {
field.config.links = [
{
url: url,
title: 'View in Azure Portal',
targetBlank: true,
},
];
}
}
}
result.push(df);
}
queryRes.tables?.forEach((table: any) => {
for (let table of queryRes.tables || []) {
result.push(toDataFrame(table));
});
});
}
}
}
return result;
}
private async buildDeepLink(queryRes: any) {
const base64Enc = encodeURIComponent(queryRes.meta.encodedQuery);
const workspaceId = queryRes.meta.workspace;
const subscription = queryRes.meta.subscription;
const details = await this.getWorkspaceDetails(workspaceId);
if (!details.workspace || !details.resourceGroup) {
return '';
}
const url =
`https://portal.azure.com/#blade/Microsoft_OperationsManagementSuite_Workspace/` +
`AnalyticsBlade/initiator/AnalyticsShareLinkToQuery/isQueryEditorVisible/true/scope/` +
`%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F${subscription}` +
`%2Fresourcegroups%2F${details.resourceGroup}%2Fproviders%2Fmicrosoft.operationalinsights%2Fworkspaces%2F${details.workspace}` +
`%22%7D%5D%7D/query/${base64Enc}/isQueryBase64Compressed/true/timespanInIsoFormat/P1D`;
return url;
}
async getWorkspaceDetails(workspaceId: string) {
const response = await this.getWorkspaceList(this.subscriptionId);
const details = response.data.value.find((o: any) => {
return o.properties.customerId === workspaceId;
});
if (!details) {
return {};
}
const regex = /.*resourcegroups\/(.*)\/providers.*/;
const results = regex.exec(details.id);
if (!results || results.length < 2) {
return {};
}
return {
workspace: details.name,
resourceGroup: results[1],
};
}
metricFindQuery(query: string) {
const workspacesQuery = query.match(/^workspaces\(\)/i);
if (workspacesQuery) {
......@@ -290,19 +357,29 @@ export default class AzureLogAnalyticsDatasource {
});
}
doRequest(url: string, maxRetries = 1): Promise<any> {
return getBackendSrv()
.datasourceRequest({
async doRequest(url: string, useCache = false, maxRetries = 1): Promise<any> {
try {
if (useCache && this.cache.has(url)) {
return this.cache.get(url);
}
const res = await getBackendSrv().datasourceRequest({
url: this.url + url,
method: 'GET',
})
.catch((error: any) => {
if (maxRetries > 0) {
return this.doRequest(url, maxRetries - 1);
}
throw error;
});
if (useCache) {
this.cache.set(url, res);
}
return res;
} catch (error) {
if (maxRetries > 0) {
return this.doRequest(url, useCache, maxRetries - 1);
}
throw error;
}
}
testDatasource() {
......
......@@ -39,7 +39,7 @@
<button class="btn btn-primary width-10" ng-click="ctrl.panelCtrl.refresh()">Run</button>
</div>
<div class="gf-form">
<label class="gf-form-label">(Run Query: Shift+Enter, Trigger Suggestion: Ctrl+Space)</label>
<label class="gf-form-label">(New Line: Shift+Enter, Run Query: Enter, Trigger Suggestion: Ctrl+Space)</label>
</div>
</div>
<kusto-editor
......
......@@ -210,7 +210,7 @@
<button class="btn btn-primary width-10" ng-click="ctrl.refresh()">Run</button>
</div>
<div class="gf-form">
<label class="gf-form-label">(Run Query: Shift+Enter, Trigger Suggestion: Ctrl+Space)</label>
<label class="gf-form-label">(New Line: Shift+Enter, Run Query: Enter, Trigger Suggestion: Ctrl+Space)</label>
</div>
<div class="gf-form gf-form--grow">
<div class="gf-form-label gf-form-label--grow"></div>
......