Alerting: Add first Grafana reserved label grafana_folder (#50262)

* Alerting: Add first Grafana reserved label g_label

g_label holds the title of the folder containing the alert. The intention of this label
is to use it as part of the new default notification policy groupBy (see the grouping sketch after this change list).

* Add nil check on updateRule labels map

* Disable gocyclo lint on schedule.ruleRoutine

Will be removed later in a separate refactoring PR that reduces the routine's complexity.

* Address doc suggestions

* Update g_folder for rules in folder when folder title changes

* Remove global bus in FolderService

* Modify tests to fit new common g_folder label

* Add changelog entry

* Fix merge conflicts

* Switch GrafanaReservedLabelPrefix from `g_` to `grafana_`
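For illustration only (not code from this commit): a minimal Go sketch of the grouping idea behind the reserved label, assuming the default notification policy groups by grafana_folder, so alerts that share a folder title end up in the same notification group.

```go
package main

import "fmt"

func main() {
	// Label sets as they might look after the scheduler attaches the
	// reserved grafana_folder label to each alert instance.
	alerts := []map[string]string{
		{"alertname": "HighCPU", "grafana_folder": "ops"},
		{"alertname": "HighCPU", "grafana_folder": "dev"},
		{"alertname": "DiskFull", "grafana_folder": "ops"},
	}

	// Group by the reserved label, mimicking a groupBy on grafana_folder.
	groups := map[string][]map[string]string{}
	for _, labels := range alerts {
		groups[labels["grafana_folder"]] = append(groups[labels["grafana_folder"]], labels)
	}

	for folder, grouped := range groups {
		fmt.Printf("folder %q: %d alert(s)\n", folder, len(grouped))
	}
}
```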
Matthew Jacobson 3 years ago committed by GitHub
parent 40b152e813
commit 5dee2ed24c
1. docs/sources/alerting/fundamentals/annotation-label/how-to-use-labels.md (12 changed lines)
2. pkg/bus/mock/mock.go (11 changed lines)
3. pkg/events/events.go (8 changed lines)
4. pkg/services/dashboards/service/folder_service.go (20 changed lines)
5. pkg/services/dashboards/service/folder_service_test.go (4 changed lines)
6. pkg/services/libraryelements/libraryelements_test.go (5 changed lines)
7. pkg/services/librarypanels/librarypanels_test.go (6 changed lines)
8. pkg/services/ngalert/CHANGELOG.md (1 changed line)
9. pkg/services/ngalert/models/alert_rule.go (8 changed lines)
10. pkg/services/ngalert/ngalert.go (9 changed lines)
11. pkg/services/ngalert/schedule/schedule.go (122 changed lines)
12. pkg/services/ngalert/schedule/schedule_mock.go (54 changed lines)
13. pkg/services/ngalert/schedule/schedule_test.go (3 changed lines)
14. pkg/services/ngalert/schedule/schedule_unit_test.go (20 changed lines)
15. pkg/services/ngalert/store/alert_rule.go (11 changed lines)
16. pkg/services/ngalert/store/testing.go (15 changed lines)
17. pkg/services/ngalert/tests/util.go (7 changed lines)
18. pkg/tests/api/alerting/api_notification_channel_test.go (91 changed lines)

@@ -19,6 +19,16 @@ This topic explains why labels are a fundamental component of alerting.
- The Alertmanager uses labels to match alerts for [silences]({{< relref "../../silences/" >}}) and [alert groups]({{< relref "../../alert-groups/" >}}) in [notification policies]({{< relref "../../notifications/" >}}).
- The alerting UI shows labels for every alert instance generated during evaluation of that rule.
- Contact points can access labels to dynamically generate notifications that contain information specific to the alert that is resulting in a notification.
- - Labels can be added to an [alerting rule]({{< relref "../../alerting-rules/" >}}). These manually configured labels are able to use template functions and reference other labels. Labels added to an alerting rule take precedence in the event of a collision between labels.
+ - You can add labels to an [alerting rule]({{< relref "../../alerting-rules/" >}}). Labels are manually configurable, use template functions, and can reference other labels. Labels added to an alerting rule take precedence in the event of a collision between labels (except in the case of [Grafana reserved labels](#grafana-reserved-labels)).
{{< figure src="/static/img/docs/alerting/unified/rule-edit-details-8-0.png" max-width="550px" caption="Alert details" >}}
# Grafana reserved labels
> **Note:** Labels prefixed with `grafana_` are reserved by Grafana for special use. If a manually configured label is added beginning with `grafana_` it may be overwritten in case of collision.
Grafana reserved labels can be used in the same way as manually configured labels. The current list of available reserved labels are:
| Label | Description |
| -------------- | ----------------------------------------- |
| grafana_folder | Title of the folder containing the alert. |
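Reserved labels can be referenced in notification templates just like manually configured ones. Below is a minimal Go sketch using the standard text/template package and a hypothetical payload struct (not Grafana's real notification data model) to show what such a reference looks like.

```go
package main

import (
	"os"
	"text/template"
)

// alertData is a stand-in for the payload a contact point template receives.
type alertData struct {
	CommonLabels map[string]string
}

func main() {
	// Referencing the reserved label is identical to referencing any other label.
	tmpl := template.Must(template.New("msg").Parse("Folder: {{ .CommonLabels.grafana_folder }}\n"))
	_ = tmpl.Execute(os.Stdout, alertData{
		CommonLabels: map[string]string{"alertname": "EmailAlert", "grafana_folder": "default"},
	})
}
```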

@@ -0,0 +1,11 @@
package mock
import (
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/infra/tracing"
)
func New() bus.Bus {
tracer := tracing.InitializeTracerForTest()
return bus.ProvideBus(tracer)
}

@@ -62,3 +62,11 @@ type DataSourceCreated struct {
UID string `json:"uid"`
OrgID int64 `json:"org_id"`
}
type FolderUpdated struct {
Timestamp time.Time `json:"timestamp"`
Title string `json:"name"`
ID int64 `json:"id"`
UID string `json:"uid"`
OrgID int64 `json:"org_id"`
}
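A compile-level sketch of the event flow this struct enables: a publisher emits events.FolderUpdated and a listener reacts to it, using the test bus added in this change. The scheduler's real handler refreshes the folder's rules; here the listener only prints.

```go
package main

import (
	"context"
	"fmt"
	"time"

	busmock "github.com/grafana/grafana/pkg/bus/mock"
	"github.com/grafana/grafana/pkg/events"
)

func main() {
	b := busmock.New()

	// Subscribe the same way the scheduler does (bus.AddEventListener).
	b.AddEventListener(func(ctx context.Context, evt *events.FolderUpdated) error {
		fmt.Printf("folder %q (uid=%s, org=%d) changed\n", evt.Title, evt.UID, evt.OrgID)
		return nil
	})

	// Publish the same way FolderServiceImpl.UpdateFolder does.
	_ = b.Publish(context.Background(), &events.FolderUpdated{
		Timestamp: time.Now(),
		Title:     "Ops alerts",
		ID:        1,
		UID:       "folder-uid",
		OrgID:     1,
	})
}
```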

@@ -4,7 +4,10 @@ import (
"context"
"errors"
"strings"
"time"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/events"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/accesscontrol"
@@ -23,12 +26,15 @@ type FolderServiceImpl struct {
searchService *search.SearchService
features featuremgmt.FeatureToggles
permissions accesscontrol.FolderPermissionsService
// bus is currently used to publish events that cause scheduler to update rules.
bus bus.Bus
}
func ProvideFolderService(
cfg *setting.Cfg, dashboardService dashboards.DashboardService, dashboardStore dashboards.Store,
searchService *search.SearchService, features featuremgmt.FeatureToggles, folderPermissionsService accesscontrol.FolderPermissionsService,
- ac accesscontrol.AccessControl,
+ ac accesscontrol.AccessControl, bus bus.Bus,
) *FolderServiceImpl {
ac.RegisterScopeAttributeResolver(dashboards.NewFolderNameScopeResolver(dashboardStore))
ac.RegisterScopeAttributeResolver(dashboards.NewFolderIDScopeResolver(dashboardStore))
@@ -41,6 +47,7 @@ func ProvideFolderService(
searchService: searchService,
features: features,
permissions: folderPermissionsService,
bus: bus,
}
}
@@ -222,6 +229,17 @@ func (f *FolderServiceImpl) UpdateFolder(ctx context.Context, user *models.Signe
return err
}
cmd.Result = folder
if err := f.bus.Publish(ctx, &events.FolderUpdated{
Timestamp: time.Now(),
Title: folder.Title,
ID: dash.Id,
UID: dash.Uid,
OrgID: orgID,
}); err != nil {
f.log.Error("failed to publish FolderUpdated event", "folder", folder.Title, "user", user.UserId, "error", err)
}
return nil
}

@@ -9,6 +9,7 @@ import (
"github.com/stretchr/testify/mock"
"github.com/stretchr/testify/require"
busmock "github.com/grafana/grafana/pkg/bus/mock"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/models"
acmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
@@ -30,7 +31,7 @@ func TestIntegrationProvideFolderService(t *testing.T) {
cfg := setting.NewCfg()
ac := acmock.New()
- ProvideFolderService(cfg, nil, nil, nil, nil, nil, ac)
+ ProvideFolderService(cfg, nil, nil, nil, nil, nil, ac, busmock.New())
require.Len(t, ac.Calls.RegisterAttributeScopeResolver, 2)
})
@@ -57,6 +58,7 @@ func TestIntegrationFolderService(t *testing.T) {
searchService: nil,
features: features,
permissions: folderPermissions,
bus: busmock.New(),
}
t.Run("Given user has no permissions", func(t *testing.T) {

@@ -13,6 +13,7 @@ import (
"github.com/stretchr/testify/require"
"github.com/grafana/grafana/pkg/api/response"
busmock "github.com/grafana/grafana/pkg/bus/mock"
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/models"
acmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
@@ -236,7 +237,7 @@ func createFolderWithACL(t *testing.T, sqlStore *sqlstore.SQLStore, title string
)
s := dashboardservice.ProvideFolderService(
cfg, d, dashboardStore, nil,
- features, folderPermissions, ac,
+ features, folderPermissions, ac, busmock.New(),
)
t.Logf("Creating folder with title and UID %q", title)
folder, err := s.CreateFolder(context.Background(), &user, user.OrgId, title, title)
@@ -340,7 +341,7 @@ func testScenario(t *testing.T, desc string, fn func(t *testing.T, sc scenarioCo
SQLStore: sqlStore,
folderService: dashboardservice.ProvideFolderService(
cfg, dashboardService, dashboardStore, nil,
- features, folderPermissions, ac,
+ features, folderPermissions, ac, busmock.New(),
),
}

@@ -11,6 +11,7 @@ import (
"github.com/stretchr/testify/require"
"github.com/grafana/grafana/pkg/api/routing"
busmock "github.com/grafana/grafana/pkg/bus/mock"
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/models"
acmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
@@ -1393,7 +1394,7 @@ func createFolderWithACL(t *testing.T, sqlStore *sqlstore.SQLStore, title string
dashboardPermissions := acmock.NewMockedPermissionsService()
dashboardStore := database.ProvideDashboardStore(sqlStore)
d := dashboardservice.ProvideDashboardService(cfg, dashboardStore, nil, features, folderPermissions, dashboardPermissions, ac)
- s := dashboardservice.ProvideFolderService(cfg, d, dashboardStore, nil, features, folderPermissions, ac)
+ s := dashboardservice.ProvideFolderService(cfg, d, dashboardStore, nil, features, folderPermissions, ac, busmock.New())
t.Logf("Creating folder with title and UID %q", title)
folder, err := s.CreateFolder(context.Background(), user, user.OrgId, title, title)
@@ -1494,10 +1495,9 @@ func testScenario(t *testing.T, desc string, fn func(t *testing.T, sc scenarioCo
cfg, dashboardStore, &alerting.DashAlertExtractorService{},
features, folderPermissions, dashboardPermissions, ac,
)
folderService := dashboardservice.ProvideFolderService(
cfg, dashboardService, dashboardStore, nil,
- features, folderPermissions, ac,
+ features, folderPermissions, ac, busmock.New(),
)
elementService := libraryelements.ProvideService(cfg, sqlStore, routing.NewRouteRegister(), folderService)

@@ -46,6 +46,7 @@ Scopes must have an order to ensure consistency and ease of search, this helps u
## Grafana Alerting - main / unreleased
- [CHANGE] Rule API to reject request to update rules that affects provisioned rules #50835
- [FEATURE] Add first Grafana reserved label, grafana_folder is created during runtime and stores an alert's folder/namespace title #50262
- [FEATURE] use optimistic lock by version field when updating alert rules #50274
- [ENHANCEMENT] Scheduler: Drop ticks if rule evaluation is too slow and adds a metric grafana_alerting_schedule_rule_evaluations_missed_total to track missed evaluations per rule #48885
- [ENHANCEMENT] Ticker to tick at predictable time #50197

@@ -87,6 +87,14 @@ const (
// This isn't a hard-coded secret token, hence the nolint.
//nolint:gosec
ScreenshotTokenAnnotation = "__alertScreenshotToken__"
// GrafanaReservedLabelPrefix contains the prefix for Grafana reserved labels. These differ from "__<label>__" labels
// in that they are not meant for internal-use only and will be passed-through to AMs and available to users in the same
// way as manually configured labels.
GrafanaReservedLabelPrefix = "grafana_"
// FolderTitleLabel is the label that will contain the title of an alert's folder/namespace.
FolderTitleLabel = GrafanaReservedLabelPrefix + "folder"
)
var (
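A small sketch (not part of this commit) of how callers could use the new constants, for example to warn about user-defined labels that collide with the reserved prefix before they get overwritten.

```go
package main

import (
	"fmt"
	"strings"

	ngmodels "github.com/grafana/grafana/pkg/services/ngalert/models"
)

func main() {
	userLabels := map[string]string{
		"team":           "platform",
		"grafana_folder": "my own value", // collides with the reserved label and will be overwritten
	}

	for k := range userLabels {
		if strings.HasPrefix(k, ngmodels.GrafanaReservedLabelPrefix) {
			fmt.Printf("label %q uses the reserved prefix %q and may be overwritten\n", k, ngmodels.GrafanaReservedLabelPrefix)
		}
	}

	fmt.Println("the folder title is stored under:", ngmodels.FolderTitleLabel)
}
```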

@@ -8,6 +8,7 @@ import (
"golang.org/x/sync/errgroup"
"github.com/grafana/grafana/pkg/api/routing"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/expr"
"github.com/grafana/grafana/pkg/infra/kvstore"
"github.com/grafana/grafana/pkg/infra/log"
@@ -35,7 +36,8 @@ import (
func ProvideService(cfg *setting.Cfg, dataSourceCache datasources.CacheService, routeRegister routing.RouteRegister,
sqlStore *sqlstore.SQLStore, kvStore kvstore.KVStore, expressionService *expr.Service, dataProxy *datasourceproxy.DataSourceProxyService,
quotaService *quota.QuotaService, secretsService secrets.Service, notificationService notifications.Service, m *metrics.NGAlert,
- folderService dashboards.FolderService, ac accesscontrol.AccessControl, dashboardService dashboards.DashboardService, renderService rendering.Service) (*AlertNG, error) {
+ folderService dashboards.FolderService, ac accesscontrol.AccessControl, dashboardService dashboards.DashboardService, renderService rendering.Service,
+ bus bus.Bus) (*AlertNG, error) {
ng := &AlertNG{
Cfg: cfg,
DataSourceCache: dataSourceCache,
@@ -53,6 +55,7 @@ func ProvideService(cfg *setting.Cfg, dataSourceCache datasources.CacheService,
accesscontrol: ac,
dashboardService: dashboardService,
renderService: renderService,
bus: bus,
}
if ng.IsDisabled() {
@@ -90,6 +93,8 @@ type AlertNG struct {
// Alerting notification services
MultiOrgAlertmanager *notifier.MultiOrgAlertmanager
accesscontrol accesscontrol.AccessControl
bus bus.Bus
}
func (ng *AlertNG) init() error {
@@ -147,7 +152,7 @@ func (ng *AlertNG) init() error {
}
stateManager := state.NewManager(ng.Log, ng.Metrics.GetStateMetrics(), appUrl, store, store, ng.SQLStore, ng.dashboardService, ng.imageService)
- scheduler := schedule.NewScheduler(schedCfg, ng.ExpressionService, appUrl, stateManager)
+ scheduler := schedule.NewScheduler(schedCfg, ng.ExpressionService, appUrl, stateManager, ng.bus)
ng.stateManager = stateManager
ng.schedule = scheduler

@@ -8,13 +8,16 @@ import (
"sync"
"time"
"github.com/grafana/grafana/pkg/bus"
"github.com/grafana/grafana/pkg/events"
"github.com/grafana/grafana/pkg/expr"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/alerting"
"github.com/grafana/grafana/pkg/services/ngalert/api/tooling/definitions"
"github.com/grafana/grafana/pkg/services/ngalert/eval"
"github.com/grafana/grafana/pkg/services/ngalert/metrics"
- "github.com/grafana/grafana/pkg/services/ngalert/models"
+ ngmodels "github.com/grafana/grafana/pkg/services/ngalert/models"
"github.com/grafana/grafana/pkg/services/ngalert/notifier"
"github.com/grafana/grafana/pkg/services/ngalert/sender"
"github.com/grafana/grafana/pkg/services/ngalert/state"
@@ -40,13 +43,17 @@ type ScheduleService interface {
// organization.
DroppedAlertmanagersFor(orgID int64) []*url.URL
// UpdateAlertRule notifies scheduler that a rule has been changed
- UpdateAlertRule(key models.AlertRuleKey)
+ UpdateAlertRule(key ngmodels.AlertRuleKey)
// UpdateAlertRulesByNamespaceUID notifies scheduler that all rules in a namespace should be updated.
UpdateAlertRulesByNamespaceUID(ctx context.Context, orgID int64, uid string) error
// DeleteAlertRule notifies scheduler that a rule has been changed
- DeleteAlertRule(key models.AlertRuleKey)
+ DeleteAlertRule(key ngmodels.AlertRuleKey)
// the following are used by tests only
- evalApplied(models.AlertRuleKey, time.Time)
+ evalApplied(ngmodels.AlertRuleKey, time.Time)
- stopApplied(models.AlertRuleKey)
+ stopApplied(ngmodels.AlertRuleKey)
overrideCfg(cfg SchedulerCfg)
folderUpdateHandler(ctx context.Context, evt *events.FolderUpdated) error
}
type schedule struct {
@@ -65,12 +72,12 @@ type schedule struct {
// evalApplied is only used for tests: test code can set it to non-nil
// function, and then it'll be called from the event loop whenever the
// message from evalApplied is handled.
- evalAppliedFunc func(models.AlertRuleKey, time.Time)
+ evalAppliedFunc func(ngmodels.AlertRuleKey, time.Time)
// stopApplied is only used for tests: test code can set it to non-nil
// function, and then it'll be called from the event loop whenever the
// message from stopApplied is handled.
- stopAppliedFunc func(models.AlertRuleKey)
+ stopAppliedFunc func(ngmodels.AlertRuleKey)
log log.Logger
@@ -91,7 +98,7 @@ type schedule struct {
// Senders help us send alerts to external Alertmanagers.
adminConfigMtx sync.RWMutex
- sendAlertsTo map[int64]models.AlertmanagersChoice
+ sendAlertsTo map[int64]ngmodels.AlertmanagersChoice
sendersCfgHash map[int64]string
senders map[int64]*sender.Sender
adminConfigPollInterval time.Duration
@@ -103,6 +110,9 @@ type schedule struct {
// current tick depends on its evaluation interval and when it was
// last evaluated.
schedulableAlertRules schedulableAlertRulesRegistry
// bus is used to hook into events that should cause rule updates.
bus bus.Bus
}
// SchedulerCfg is the scheduler configuration.
@@ -110,9 +120,9 @@ type SchedulerCfg struct {
C clock.Clock
BaseInterval time.Duration
Logger log.Logger
- EvalAppliedFunc func(models.AlertRuleKey, time.Time)
+ EvalAppliedFunc func(ngmodels.AlertRuleKey, time.Time)
MaxAttempts int64
- StopAppliedFunc func(models.AlertRuleKey)
+ StopAppliedFunc func(ngmodels.AlertRuleKey)
Evaluator eval.Evaluator
RuleStore store.RuleStore
OrgStore store.OrgStore
@@ -126,11 +136,11 @@ type SchedulerCfg struct {
}
// NewScheduler returns a new schedule.
- func NewScheduler(cfg SchedulerCfg, expressionService *expr.Service, appURL *url.URL, stateManager *state.Manager) *schedule {
+ func NewScheduler(cfg SchedulerCfg, expressionService *expr.Service, appURL *url.URL, stateManager *state.Manager, bus bus.Bus) *schedule {
ticker := alerting.NewTicker(cfg.C, cfg.BaseInterval, cfg.Metrics.Ticker)
sch := schedule{
- registry: alertRuleInfoRegistry{alertRuleInfo: make(map[models.AlertRuleKey]*alertRuleInfo)},
+ registry: alertRuleInfoRegistry{alertRuleInfo: make(map[ngmodels.AlertRuleKey]*alertRuleInfo)},
maxAttempts: cfg.MaxAttempts,
clock: cfg.C,
baseInterval: cfg.BaseInterval,
@@ -148,14 +158,18 @@ func NewScheduler(cfg SchedulerCfg, expressionService *expr.Service, appURL *url
metrics: cfg.Metrics,
appURL: appURL,
stateManager: stateManager,
- sendAlertsTo: map[int64]models.AlertmanagersChoice{},
+ sendAlertsTo: map[int64]ngmodels.AlertmanagersChoice{},
senders: map[int64]*sender.Sender{},
sendersCfgHash: map[int64]string{},
adminConfigPollInterval: cfg.AdminConfigPollInterval,
disabledOrgs: cfg.DisabledOrgs,
minRuleInterval: cfg.MinRuleInterval,
- schedulableAlertRules: schedulableAlertRulesRegistry{rules: make(map[models.AlertRuleKey]*models.SchedulableAlertRule)},
+ schedulableAlertRules: schedulableAlertRulesRegistry{rules: make(map[ngmodels.AlertRuleKey]*ngmodels.SchedulableAlertRule)},
bus: bus,
}
bus.AddEventListener(sch.folderUpdateHandler)
return &sch
}
@@ -216,7 +230,7 @@ func (sch *schedule) SyncAndApplyConfigFromDatabase() error {
continue
}
// We have no running sender and alerts are handled internally, no-op.
- if !ok && cfg.SendAlertsTo == models.InternalAlertmanager {
+ if !ok && cfg.SendAlertsTo == ngmodels.InternalAlertmanager {
sch.log.Debug("alerts are handled internally", "org", cfg.OrgID)
continue
}
@@ -313,7 +327,7 @@ func (sch *schedule) DroppedAlertmanagersFor(orgID int64) []*url.URL {
}
// UpdateAlertRule looks for the active rule evaluation and commands it to update the rule
- func (sch *schedule) UpdateAlertRule(key models.AlertRuleKey) {
+ func (sch *schedule) UpdateAlertRule(key ngmodels.AlertRuleKey) {
ruleInfo, err := sch.registry.get(key)
if err != nil {
return
@ -321,8 +335,28 @@ func (sch *schedule) UpdateAlertRule(key models.AlertRuleKey) {
ruleInfo.update()
}
// UpdateAlertRulesByNamespaceUID looks for the active rule evaluation for every rule in the given namespace and commands it to update the rule.
func (sch *schedule) UpdateAlertRulesByNamespaceUID(ctx context.Context, orgID int64, uid string) error {
q := ngmodels.ListAlertRulesQuery{
OrgID: orgID,
NamespaceUIDs: []string{uid},
}
if err := sch.ruleStore.ListAlertRules(ctx, &q); err != nil {
return err
}
for _, r := range q.Result {
sch.UpdateAlertRule(ngmodels.AlertRuleKey{
OrgID: orgID,
UID: r.UID,
})
}
return nil
}
// DeleteAlertRule stops evaluation of the rule, deletes it from active rules, and cleans up state cache.
- func (sch *schedule) DeleteAlertRule(key models.AlertRuleKey) {
+ func (sch *schedule) DeleteAlertRule(key ngmodels.AlertRuleKey) {
// It can happen that the scheduler has deleted the alert rule before the
// Ruler API has called DeleteAlertRule. This can happen as requests to
// the Ruler API do not hold an exclusive lock over all scheduler operations.
@@ -403,7 +437,7 @@ func (sch *schedule) schedulePeriodic(ctx context.Context) error {
sch.metrics.SchedulableAlertRulesHash.Set(float64(hashUIDs(alertRules)))
type readyToRunItem struct {
- key models.AlertRuleKey
+ key ngmodels.AlertRuleKey
ruleName string
ruleInfo *alertRuleInfo
version int64
@@ -491,7 +525,8 @@ func (sch *schedule) schedulePeriodic(ctx context.Context) error {
}
}
- func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRuleKey, evalCh <-chan *evaluation, updateCh <-chan struct{}) error {
+ //nolint: gocyclo
+ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key ngmodels.AlertRuleKey, evalCh <-chan *evaluation, updateCh <-chan struct{}) error {
logger := sch.log.New("uid", key.UID, "org", key.OrgID)
logger.Debug("alert rule routine started")
@ -509,7 +544,7 @@ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRul
// Send alerts to local notifier if they need to be handled internally // Send alerts to local notifier if they need to be handled internally
// or if no external AMs have been discovered yet. // or if no external AMs have been discovered yet.
var localNotifierExist, externalNotifierExist bool var localNotifierExist, externalNotifierExist bool
if sch.sendAlertsTo[key.OrgID] == models.ExternalAlertmanagers && len(sch.AlertmanagersFor(key.OrgID)) > 0 { if sch.sendAlertsTo[key.OrgID] == ngmodels.ExternalAlertmanagers && len(sch.AlertmanagersFor(key.OrgID)) > 0 {
logger.Debug("no alerts to put in the notifier") logger.Debug("no alerts to put in the notifier")
} else { } else {
logger.Debug("sending alerts to local notifier", "count", len(alerts.PostableAlerts), "alerts", alerts.PostableAlerts) logger.Debug("sending alerts to local notifier", "count", len(alerts.PostableAlerts), "alerts", alerts.PostableAlerts)
@ -533,7 +568,7 @@ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRul
sch.adminConfigMtx.RLock() sch.adminConfigMtx.RLock()
defer sch.adminConfigMtx.RUnlock() defer sch.adminConfigMtx.RUnlock()
s, ok := sch.senders[key.OrgID] s, ok := sch.senders[key.OrgID]
if ok && sch.sendAlertsTo[key.OrgID] != models.InternalAlertmanager { if ok && sch.sendAlertsTo[key.OrgID] != ngmodels.InternalAlertmanager {
logger.Debug("sending alerts to external notifier", "count", len(alerts.PostableAlerts), "alerts", alerts.PostableAlerts) logger.Debug("sending alerts to external notifier", "count", len(alerts.PostableAlerts), "alerts", alerts.PostableAlerts)
s.SendAlerts(alerts) s.SendAlerts(alerts)
externalNotifierExist = true externalNotifierExist = true
@@ -551,8 +586,8 @@ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRul
notify(expiredAlerts, logger)
}
- updateRule := func(ctx context.Context, oldRule *models.AlertRule) (*models.AlertRule, error) {
+ updateRule := func(ctx context.Context, oldRule *ngmodels.AlertRule) (*ngmodels.AlertRule, error) {
- q := models.GetAlertRuleByUIDQuery{OrgID: key.OrgID, UID: key.UID}
+ q := ngmodels.GetAlertRuleByUIDQuery{OrgID: key.OrgID, UID: key.UID}
err := sch.ruleStore.GetAlertRuleByUID(ctx, &q)
if err != nil {
logger.Error("failed to fetch alert rule", "err", err)
@@ -561,14 +596,34 @@ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRul
if oldRule != nil && oldRule.Version < q.Result.Version {
clearState()
}
user := &models.SignedInUser{
UserId: 0,
OrgRole: models.ROLE_ADMIN,
OrgId: key.OrgID,
}
folder, err := sch.ruleStore.GetNamespaceByUID(ctx, q.Result.NamespaceUID, q.Result.OrgID, user)
if err != nil {
logger.Error("failed to fetch alert rule namespace", "err", err)
return nil, err
}
if q.Result.Labels == nil {
q.Result.Labels = make(map[string]string)
} else if val, ok := q.Result.Labels[ngmodels.FolderTitleLabel]; ok {
logger.Warn("alert rule contains protected label, value will be overwritten", "label", ngmodels.FolderTitleLabel, "value", val)
}
q.Result.Labels[ngmodels.FolderTitleLabel] = folder.Title
return q.Result, nil
}
evaluate := func(ctx context.Context, r *models.AlertRule, attempt int64, e *evaluation) error { evaluate := func(ctx context.Context, r *ngmodels.AlertRule, attempt int64, e *evaluation) error {
logger := logger.New("version", r.Version, "attempt", attempt, "now", e.scheduledAt) logger := logger.New("version", r.Version, "attempt", attempt, "now", e.scheduledAt)
start := sch.clock.Now() start := sch.clock.Now()
condition := models.Condition{ condition := ngmodels.Condition{
Condition: r.Condition, Condition: r.Condition,
OrgID: r.OrgID, OrgID: r.OrgID,
Data: r.Data, Data: r.Data,
@ -606,7 +661,7 @@ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRul
} }
evalRunning := false evalRunning := false
var currentRule *models.AlertRule var currentRule *ngmodels.AlertRule
defer sch.stopApplied(key) defer sch.stopApplied(key)
for { for {
select { select {
@ -669,11 +724,11 @@ func (sch *schedule) ruleRoutine(grafanaCtx context.Context, key models.AlertRul
func (sch *schedule) saveAlertStates(ctx context.Context, states []*state.State) { func (sch *schedule) saveAlertStates(ctx context.Context, states []*state.State) {
sch.log.Debug("saving alert states", "count", len(states)) sch.log.Debug("saving alert states", "count", len(states))
for _, s := range states { for _, s := range states {
cmd := models.SaveAlertInstanceCommand{ cmd := ngmodels.SaveAlertInstanceCommand{
RuleOrgID: s.OrgID, RuleOrgID: s.OrgID,
RuleUID: s.AlertRuleUID, RuleUID: s.AlertRuleUID,
Labels: models.InstanceLabels(s.Labels), Labels: ngmodels.InstanceLabels(s.Labels),
State: models.InstanceStateType(s.State.String()), State: ngmodels.InstanceStateType(s.State.String()),
StateReason: s.StateReason, StateReason: s.StateReason,
LastEvalTime: s.LastEvaluationTime, LastEvalTime: s.LastEvaluationTime,
CurrentStateSince: s.StartsAt, CurrentStateSince: s.StartsAt,
@@ -686,6 +741,11 @@ func (sch *schedule) saveAlertStates(ctx context.Context, states []*state.State)
}
}
// folderUpdateHandler listens for folder update events and updates all rules in the given folder.
func (sch *schedule) folderUpdateHandler(ctx context.Context, evt *events.FolderUpdated) error {
return sch.UpdateAlertRulesByNamespaceUID(ctx, evt.OrgID, evt.UID)
}
// overrideCfg is only used on tests.
func (sch *schedule) overrideCfg(cfg SchedulerCfg) {
sch.clock = cfg.C
@ -697,7 +757,7 @@ func (sch *schedule) overrideCfg(cfg SchedulerCfg) {
} }
// evalApplied is only used on tests. // evalApplied is only used on tests.
func (sch *schedule) evalApplied(alertDefKey models.AlertRuleKey, now time.Time) { func (sch *schedule) evalApplied(alertDefKey ngmodels.AlertRuleKey, now time.Time) {
if sch.evalAppliedFunc == nil { if sch.evalAppliedFunc == nil {
return return
} }
@ -706,7 +766,7 @@ func (sch *schedule) evalApplied(alertDefKey models.AlertRuleKey, now time.Time)
} }
// stopApplied is only used on tests. // stopApplied is only used on tests.
func (sch *schedule) stopApplied(alertDefKey models.AlertRuleKey) { func (sch *schedule) stopApplied(alertDefKey ngmodels.AlertRuleKey) {
if sch.stopAppliedFunc == nil { if sch.stopAppliedFunc == nil {
return return
} }

@@ -1,13 +1,15 @@
- // Code generated by mockery v2.10.0. DO NOT EDIT.
+ // Code generated by mockery v2.10.2. DO NOT EDIT.
package schedule
import (
context "context"
- models "github.com/grafana/grafana/pkg/services/ngalert/models"
+ events "github.com/grafana/grafana/pkg/events"
mock "github.com/stretchr/testify/mock"
+ models "github.com/grafana/grafana/pkg/services/ngalert/models"
time "time"
url "net/url"
@@ -55,20 +57,6 @@ func (_m *FakeScheduleService) DroppedAlertmanagersFor(orgID int64) []*url.URL {
return r0
}
- // Pause provides a mock function with given fields:
- func (_m *FakeScheduleService) Pause() error {
- ret := _m.Called()
- var r0 error
- if rf, ok := ret.Get(0).(func() error); ok {
- r0 = rf()
- } else {
- r0 = ret.Error(0)
- }
- return r0
- }
// Run provides a mock function with given fields: _a0
func (_m *FakeScheduleService) Run(_a0 context.Context) error {
ret := _m.Called(_a0)
@@ -83,13 +71,18 @@ func (_m *FakeScheduleService) Run(_a0 context.Context) error {
return r0
}
- // Unpause provides a mock function with given fields:
- func (_m *FakeScheduleService) Unpause() error {
- ret := _m.Called()
+ // UpdateAlertRule provides a mock function with given fields: key
+ func (_m *FakeScheduleService) UpdateAlertRule(key models.AlertRuleKey) {
+ _m.Called(key)
+ }
+ // UpdateAlertRulesByNamespaceUID provides a mock function with given fields: ctx, orgID, uid
+ func (_m *FakeScheduleService) UpdateAlertRulesByNamespaceUID(ctx context.Context, orgID int64, uid string) error {
+ ret := _m.Called(ctx, orgID, uid)
var r0 error
- if rf, ok := ret.Get(0).(func() error); ok {
- r0 = rf()
+ if rf, ok := ret.Get(0).(func(context.Context, int64, string) error); ok {
+ r0 = rf(ctx, orgID, uid)
} else {
r0 = ret.Error(0)
}
@@ -97,16 +90,25 @@ func (_m *FakeScheduleService) Unpause() error {
return r0
}
- // UpdateAlertRule provides a mock function with given fields: key
- func (_m *FakeScheduleService) UpdateAlertRule(key models.AlertRuleKey) {
- _m.Called(key)
- }
// evalApplied provides a mock function with given fields: _a0, _a1
func (_m *FakeScheduleService) evalApplied(_a0 models.AlertRuleKey, _a1 time.Time) {
_m.Called(_a0, _a1)
}
// folderUpdateHandler provides a mock function with given fields: ctx, evt
func (_m *FakeScheduleService) folderUpdateHandler(ctx context.Context, evt *events.FolderUpdated) error {
ret := _m.Called(ctx, evt)
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, *events.FolderUpdated) error); ok {
r0 = rf(ctx, evt)
} else {
r0 = ret.Error(0)
}
return r0
}
// overrideCfg provides a mock function with given fields: cfg
func (_m *FakeScheduleService) overrideCfg(cfg SchedulerCfg) {
_m.Called(cfg)
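A hedged sketch of how the regenerated mock's new method could be exercised in a test; the mock type and method names come from the file above, while the test itself is illustrative and not part of this commit.

```go
package schedule

import (
	"context"
	"testing"

	"github.com/stretchr/testify/mock"
	"github.com/stretchr/testify/require"
)

func TestFakeScheduleService_UpdateAlertRulesByNamespaceUID(t *testing.T) {
	sched := &FakeScheduleService{}
	// Expect one call for org 1 and the given namespace UID, returning no error.
	sched.On("UpdateAlertRulesByNamespaceUID", mock.Anything, int64(1), "folder-uid").Return(nil)

	err := sched.UpdateAlertRulesByNamespaceUID(context.Background(), int64(1), "folder-uid")

	require.NoError(t, err)
	sched.AssertExpectations(t)
}
```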

@@ -17,6 +17,7 @@ import (
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
busmock "github.com/grafana/grafana/pkg/bus/mock"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/services/dashboards"
"github.com/grafana/grafana/pkg/services/ngalert/eval"
@@ -164,7 +165,7 @@ func TestAlertingTicker(t *testing.T) {
Scheme: "http",
Host: "localhost",
}
- sched := schedule.NewScheduler(schedCfg, nil, appUrl, st)
+ sched := schedule.NewScheduler(schedCfg, nil, appUrl, st, busmock.New())
go func() {
err := sched.Run(ctx)

@@ -18,6 +18,7 @@ import (
"github.com/prometheus/client_golang/prometheus/testutil"
"github.com/stretchr/testify/require"
busmock "github.com/grafana/grafana/pkg/bus/mock"
"github.com/grafana/grafana/pkg/expr"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/services/annotations"
@@ -397,6 +398,21 @@ func TestSchedule_ruleRoutine(t *testing.T) {
require.Equal(t, rule.UID, queries[0].UID)
require.Equal(t, rule.OrgID, queries[0].OrgID)
})
t.Run("it should get rule folder title from database and attach as label", func(t *testing.T) {
queries := make([]store.GenericRecordedQuery, 0)
for _, op := range ruleStore.RecordedOps {
switch q := op.(type) {
case store.GenericRecordedQuery:
queries = append(queries, q)
}
}
require.NotEmptyf(t, queries, "Expected a %T request to rule store but nothing was recorded", store.GenericRecordedQuery{})
require.Len(t, queries, 1, "Expected exactly one request of %T but got %d", store.GenericRecordedQuery{}, len(queries))
require.Equal(t, rule.NamespaceUID, queries[0].Params[1])
require.Equal(t, rule.OrgID, queries[0].Params[0])
require.NotEmptyf(t, rule.Labels[models.FolderTitleLabel], "Expected a non-empty title in label %s", models.FolderTitleLabel)
require.Equal(t, rule.Labels[models.FolderTitleLabel], ruleStore.Folders[rule.OrgID][0].Title)
})
t.Run("it should process evaluation results via state manager", func(t *testing.T) {
// TODO rewrite when we are able to mock/fake state manager
states := sch.stateManager.GetStatesForRuleUID(rule.OrgID, rule.UID)
@@ -948,7 +964,7 @@ func setupScheduler(t *testing.T, rs store.RuleStore, is store.InstanceStore, ac
Scheme: "http",
Host: "localhost",
}
- return NewScheduler(schedCfg, expr.ProvideService(&setting.Cfg{ExpressionsEnabled: true}, nil, nil), appUrl, st), mockedClock
+ return NewScheduler(schedCfg, expr.ProvideService(&setting.Cfg{ExpressionsEnabled: true}, nil, nil), appUrl, st, busmock.New()), mockedClock
}
// createTestAlertRule creates a dummy alert definition to be used by the tests.
@@ -1020,7 +1036,7 @@ func CreateTestAlertRule(t *testing.T, dbstore *store.FakeRuleStore, intervalSec
ExecErrState: models.AlertingErrState,
For: forDuration,
Annotations: map[string]string{"testAnnoKey": "testAnnoValue"},
- Labels: nil,
+ Labels: make(map[string]string),
}
dbstore.PutRule(ctx, rule)

@@ -50,6 +50,7 @@ type RuleStore interface {
GetRuleGroupInterval(ctx context.Context, orgID int64, namespaceUID string, ruleGroup string) (int64, error)
GetUserVisibleNamespaces(context.Context, int64, *models.SignedInUser) (map[string]*models.Folder, error)
GetNamespaceByTitle(context.Context, string, int64, *models.SignedInUser, bool) (*models.Folder, error)
GetNamespaceByUID(context.Context, string, int64, *models.SignedInUser) (*models.Folder, error)
// InsertAlertRules will insert all alert rules passed into the function
// and return the map of uuid to id.
InsertAlertRules(ctx context.Context, rule []ngmodels.AlertRule) (map[string]int64, error)
@@ -386,6 +387,16 @@ func (st DBstore) GetNamespaceByTitle(ctx context.Context, namespace string, org
return folder, nil
}
// GetNamespaceByUID is a handler for retrieving a namespace by its UID. Alerting rules follow a Grafana folder-like structure which we call namespaces.
func (st DBstore) GetNamespaceByUID(ctx context.Context, uid string, orgID int64, user *models.SignedInUser) (*models.Folder, error) {
folder, err := st.FolderService.GetFolderByUID(ctx, user, orgID, uid)
if err != nil {
return nil, err
}
return folder, nil
}
// GetAlertRulesForScheduling returns a short version of all alert rules except those that belong to an excluded list of organizations
func (st DBstore) GetAlertRulesForScheduling(ctx context.Context, query *ngmodels.GetAlertRulesForSchedulingQuery) error {
return st.SQLStore.WithDbSession(ctx, func(sess *sqlstore.DBSession) error {

@@ -305,6 +305,21 @@ func (f *FakeRuleStore) GetNamespaceByTitle(_ context.Context, title string, org
return nil, fmt.Errorf("not found")
}
func (f *FakeRuleStore) GetNamespaceByUID(_ context.Context, uid string, orgID int64, _ *models2.SignedInUser) (*models2.Folder, error) {
f.RecordedOps = append(f.RecordedOps, GenericRecordedQuery{
Name: "GetNamespaceByUID",
Params: []interface{}{orgID, uid},
})
folders := f.Folders[orgID]
for _, folder := range folders {
if folder.Uid == uid {
return folder, nil
}
}
return nil, fmt.Errorf("not found")
}
func (f *FakeRuleStore) UpdateAlertRules(_ context.Context, q []UpdateRule) error {
f.mtx.Lock()
defer f.mtx.Unlock()

@@ -8,6 +8,7 @@ import (
"time"
"github.com/grafana/grafana/pkg/api/routing"
busmock "github.com/grafana/grafana/pkg/bus/mock"
"github.com/grafana/grafana/pkg/infra/log"
acmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
"github.com/grafana/grafana/pkg/services/dashboards"
@@ -54,14 +55,16 @@ func SetupTestEnv(t *testing.T, baseInterval time.Duration) (*ngalert.AlertNG, *
cfg, dashboardStore, nil,
features, folderPermissions, dashboardPermissions, ac,
)
bus := busmock.New()
folderService := dashboardservice.ProvideFolderService(
cfg, dashboardService, dashboardStore, nil,
- features, folderPermissions, ac,
+ features, folderPermissions, ac, bus,
)
ng, err := ngalert.ProvideService(
cfg, nil, routing.NewRouteRegister(), sqlStore, nil, nil, nil, nil,
- secretsService, nil, m, folderService, ac, &dashboards.FakeDashboardService{}, nil,
+ secretsService, nil, m, folderService, ac, &dashboards.FakeDashboardService{}, nil, bus,
)
require.NoError(t, err)
return ng, &store.DBstore{

@@ -2060,28 +2060,28 @@ var expEmailNotifications = []*models.SendEmailCommandSync{
To: []string{"test@email.com"},
SingleEmail: true,
Template: "ng_alert_notification",
- Subject: "[FIRING:1] EmailAlert ",
+ Subject: "[FIRING:1] EmailAlert (default)",
Data: map[string]interface{}{
- "Title": "[FIRING:1] EmailAlert ",
+ "Title": "[FIRING:1] EmailAlert (default)",
"Message": "",
"Status": "firing",
"Alerts": channels.ExtendedAlerts{
channels.ExtendedAlert{
Status: "firing",
- Labels: template.KV{"alertname": "EmailAlert"},
+ Labels: template.KV{"alertname": "EmailAlert", "grafana_folder": "default"},
Annotations: template.KV{},
StartsAt: time.Time{},
EndsAt: time.Time{},
GeneratorURL: "http://localhost:3000/alerting/grafana/UID_EmailAlert/view",
- Fingerprint: "08c220aa26cd0cf5",
+ Fingerprint: "1e8f5e886dc14813",
- SilenceURL: "http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DEmailAlert",
+ SilenceURL: "http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DEmailAlert&matcher=grafana_folder%3Ddefault",
DashboardURL: "",
PanelURL: "",
ValueString: "[ var='A' labels={} value=1 ]",
},
},
"GroupLabels": template.KV{"alertname": "EmailAlert"},
- "CommonLabels": template.KV{"alertname": "EmailAlert"},
+ "CommonLabels": template.KV{"alertname": "EmailAlert", "grafana_folder": "default"},
"CommonAnnotations": template.KV{},
"ExternalURL": "http://localhost:3000/",
"RuleUrl": "http://localhost:3000/alerting/list",
@ -2103,10 +2103,10 @@ var expNonEmailNotifications = map[string][]string{
"icon_url": "https://awesomeemoji.com/rocket", "icon_url": "https://awesomeemoji.com/rocket",
"attachments": [ "attachments": [
{ {
"title": "Integration Test [FIRING:1] SlackAlert1 ", "title": "Integration Test [FIRING:1] SlackAlert1 (default)",
"title_link": "http://localhost:3000/alerting/list", "title_link": "http://localhost:3000/alerting/list",
"text": "Integration Test ", "text": "Integration Test ",
"fallback": "Integration Test [FIRING:1] SlackAlert1 ", "fallback": "Integration Test [FIRING:1] SlackAlert1 (default)",
"footer": "Grafana v", "footer": "Grafana v",
"footer_icon": "https://grafana.com/assets/img/fav32.png", "footer_icon": "https://grafana.com/assets/img/fav32.png",
"color": "#D63232", "color": "#D63232",
@ -2130,10 +2130,10 @@ var expNonEmailNotifications = map[string][]string{
"username": "Integration Test", "username": "Integration Test",
"attachments": [ "attachments": [
{ {
"title": "[FIRING:1] SlackAlert2 ", "title": "[FIRING:1] SlackAlert2 (default)",
"title_link": "http://localhost:3000/alerting/list", "title_link": "http://localhost:3000/alerting/list",
"text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = SlackAlert2\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_SlackAlert2/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DSlackAlert2\n", "text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = SlackAlert2\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_SlackAlert2/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DSlackAlert2&matcher=grafana_folder%%3Ddefault\n",
"fallback": "[FIRING:1] SlackAlert2 ", "fallback": "[FIRING:1] SlackAlert2 (default)",
"footer": "Grafana v", "footer": "Grafana v",
"footer_icon": "https://grafana.com/assets/img/fav32.png", "footer_icon": "https://grafana.com/assets/img/fav32.png",
"color": "#D63232", "color": "#D63232",
@ -2155,17 +2155,17 @@ var expNonEmailNotifications = map[string][]string{
`{ `{
"routing_key": "pagerduty_recv/pagerduty_test", "routing_key": "pagerduty_recv/pagerduty_test",
"dedup_key": "234edb34441f942f713f3c2ccf58b1d719d921b4cbe34e57a1630f1dee847e3b", "dedup_key": "234edb34441f942f713f3c2ccf58b1d719d921b4cbe34e57a1630f1dee847e3b",
"description": "[FIRING:1] PagerdutyAlert ", "description": "[FIRING:1] PagerdutyAlert (default)",
"event_action": "trigger", "event_action": "trigger",
"payload": { "payload": {
"summary": "Integration Test [FIRING:1] PagerdutyAlert ", "summary": "Integration Test [FIRING:1] PagerdutyAlert (default)",
"source": "%s", "source": "%s",
"severity": "warning", "severity": "warning",
"class": "testclass", "class": "testclass",
"component": "Integration Test", "component": "Integration Test",
"group": "testgroup", "group": "testgroup",
"custom_details": { "custom_details": {
"firing": "\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = PagerdutyAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_PagerdutyAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DPagerdutyAlert\n", "firing": "\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = PagerdutyAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_PagerdutyAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DPagerdutyAlert&matcher=grafana_folder%%3Ddefault\n",
"num_firing": "1", "num_firing": "1",
"num_resolved": "0", "num_resolved": "0",
"resolved": "" "resolved": ""
@ -2185,8 +2185,8 @@ var expNonEmailNotifications = map[string][]string{
`{ `{
"link": { "link": {
"messageUrl": "dingtalk://dingtalkclient/page/link?pc_slide=false&url=http%3A%2F%2Flocalhost%3A3000%2Falerting%2Flist", "messageUrl": "dingtalk://dingtalkclient/page/link?pc_slide=false&url=http%3A%2F%2Flocalhost%3A3000%2Falerting%2Flist",
"text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = DingDingAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_DingDingAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDingDingAlert\n", "text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = DingDingAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_DingDingAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDingDingAlert&matcher=grafana_folder%3Ddefault\n",
"title": "[FIRING:1] DingDingAlert " "title": "[FIRING:1] DingDingAlert (default)"
}, },
"msgtype": "link" "msgtype": "link"
}`, }`,
@ -2210,13 +2210,13 @@ var expNonEmailNotifications = map[string][]string{
], ],
"sections": [ "sections": [
{ {
"text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = TeamsAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_TeamsAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTeamsAlert\n", "text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = TeamsAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_TeamsAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTeamsAlert&matcher=grafana_folder%3Ddefault\n",
"title": "" "title": ""
} }
], ],
"summary": "[FIRING:1] TeamsAlert ", "summary": "[FIRING:1] TeamsAlert (default)",
"themeColor": "#D63232", "themeColor": "#D63232",
"title": "[FIRING:1] TeamsAlert " "title": "[FIRING:1] TeamsAlert (default)"
}`, }`,
}, },
"webhook_recv/webhook_test": { "webhook_recv/webhook_test": {
@ -2228,15 +2228,16 @@ var expNonEmailNotifications = map[string][]string{
{ {
"status": "firing", "status": "firing",
"labels": { "labels": {
"alertname": "WebhookAlert" "alertname": "WebhookAlert",
"grafana_folder": "default"
}, },
"annotations": {}, "annotations": {},
"startsAt": "%s", "startsAt": "%s",
"valueString": "[ var='A' labels={} value=1 ]", "valueString": "[ var='A' labels={} value=1 ]",
"endsAt": "0001-01-01T00:00:00Z", "endsAt": "0001-01-01T00:00:00Z",
"generatorURL": "http://localhost:3000/alerting/grafana/UID_WebhookAlert/view", "generatorURL": "http://localhost:3000/alerting/grafana/UID_WebhookAlert/view",
"fingerprint": "929467973978d053", "fingerprint": "15c59b0a380bd9f1",
"silenceURL": "http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DWebhookAlert", "silenceURL": "http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DWebhookAlert&matcher=grafana_folder%%3Ddefault",
"dashboardURL": "", "dashboardURL": "",
"panelURL": "" "panelURL": ""
} }
@@ -2245,21 +2246,22 @@ var expNonEmailNotifications = map[string][]string{
 "alertname": "WebhookAlert"
 },
 "commonLabels": {
-"alertname": "WebhookAlert"
+"alertname": "WebhookAlert",
+"grafana_folder": "default"
 },
 "commonAnnotations": {},
 "externalURL": "http://localhost:3000/",
 "version": "1",
 "groupKey": "{}/{alertname=\"WebhookAlert\"}:{alertname=\"WebhookAlert\"}",
 "truncatedAlerts": 0,
-"title": "[FIRING:1] WebhookAlert ",
+"title": "[FIRING:1] WebhookAlert (default)",
 "state": "alerting",
-"message": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = WebhookAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_WebhookAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DWebhookAlert\n"
+"message": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = WebhookAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_WebhookAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DWebhookAlert&matcher=grafana_folder%%3Ddefault\n"
 }`,
 },
 "discord_recv/discord_test": {
 `{
-"content": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = DiscordAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_DiscordAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDiscordAlert\n",
+"content": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = DiscordAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_DiscordAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DDiscordAlert&matcher=grafana_folder%3Ddefault\n",
 "embeds": [
 {
 "color": 14037554,
@@ -2267,7 +2269,7 @@ var expNonEmailNotifications = map[string][]string{
 "icon_url": "https://grafana.com/assets/img/fav32.png",
 "text": "Grafana v"
 },
-"title": "[FIRING:1] DiscordAlert ",
+"title": "[FIRING:1] DiscordAlert (default)",
 "type": "rich",
 "url": "http://localhost:3000/alerting/list"
 }
@@ -2287,7 +2289,7 @@ var expNonEmailNotifications = map[string][]string{
 },
 "name": "default"
 },
-"output": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = SensuGoAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_SensuGoAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DSensuGoAlert\n",
+"output": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = SensuGoAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_SensuGoAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DSensuGoAlert&matcher=grafana_folder%%3Ddefault\n",
 "status": 2
 },
 "entity": {
@@ -2300,26 +2302,26 @@ var expNonEmailNotifications = map[string][]string{
 }`,
 },
 "pushover_recv/pushover_test": {
-"--abcd\r\nContent-Disposition: form-data; name=\"user\"\r\n\r\nmysecretkey\r\n--abcd\r\nContent-Disposition: form-data; name=\"token\"\r\n\r\nmysecrettoken\r\n--abcd\r\nContent-Disposition: form-data; name=\"priority\"\r\n\r\n0\r\n--abcd\r\nContent-Disposition: form-data; name=\"sound\"\r\n\r\n\r\n--abcd\r\nContent-Disposition: form-data; name=\"title\"\r\n\r\n[FIRING:1] PushoverAlert \r\n--abcd\r\nContent-Disposition: form-data; name=\"url\"\r\n\r\nhttp://localhost:3000/alerting/list\r\n--abcd\r\nContent-Disposition: form-data; name=\"url_title\"\r\n\r\nShow alert rule\r\n--abcd\r\nContent-Disposition: form-data; name=\"message\"\r\n\r\n**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = PushoverAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_PushoverAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPushoverAlert\n\r\n--abcd\r\nContent-Disposition: form-data; name=\"html\"\r\n\r\n1\r\n--abcd--\r\n",
+"--abcd\r\nContent-Disposition: form-data; name=\"user\"\r\n\r\nmysecretkey\r\n--abcd\r\nContent-Disposition: form-data; name=\"token\"\r\n\r\nmysecrettoken\r\n--abcd\r\nContent-Disposition: form-data; name=\"priority\"\r\n\r\n0\r\n--abcd\r\nContent-Disposition: form-data; name=\"sound\"\r\n\r\n\r\n--abcd\r\nContent-Disposition: form-data; name=\"title\"\r\n\r\n[FIRING:1] PushoverAlert (default)\r\n--abcd\r\nContent-Disposition: form-data; name=\"url\"\r\n\r\nhttp://localhost:3000/alerting/list\r\n--abcd\r\nContent-Disposition: form-data; name=\"url_title\"\r\n\r\nShow alert rule\r\n--abcd\r\nContent-Disposition: form-data; name=\"message\"\r\n\r\n**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = PushoverAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_PushoverAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DPushoverAlert&matcher=grafana_folder%3Ddefault\n\r\n--abcd\r\nContent-Disposition: form-data; name=\"html\"\r\n\r\n1\r\n--abcd--\r\n",
 },
 "telegram_recv/bot6sh027hs034h": {
-"--abcd\r\nContent-Disposition: form-data; name=\"chat_id\"\r\n\r\ntelegram_chat_id\r\n--abcd\r\nContent-Disposition: form-data; name=\"parse_mode\"\r\n\r\nhtml\r\n--abcd\r\nContent-Disposition: form-data; name=\"text\"\r\n\r\n**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = TelegramAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_TelegramAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTelegramAlert\n\r\n--abcd--\r\n",
+"--abcd\r\nContent-Disposition: form-data; name=\"chat_id\"\r\n\r\ntelegram_chat_id\r\n--abcd\r\nContent-Disposition: form-data; name=\"parse_mode\"\r\n\r\nhtml\r\n--abcd\r\nContent-Disposition: form-data; name=\"text\"\r\n\r\n**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = TelegramAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_TelegramAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DTelegramAlert&matcher=grafana_folder%3Ddefault\n\r\n--abcd--\r\n",
 },
 "googlechat_recv/googlechat_test": {
 `{
-"previewText": "[FIRING:1] GoogleChatAlert ",
-"fallbackText": "[FIRING:1] GoogleChatAlert ",
+"previewText": "[FIRING:1] GoogleChatAlert (default)",
+"fallbackText": "[FIRING:1] GoogleChatAlert (default)",
 "cards": [
 {
 "header": {
-"title": "[FIRING:1] GoogleChatAlert "
+"title": "[FIRING:1] GoogleChatAlert (default)"
 },
 "sections": [
 {
 "widgets": [
 {
 "textParagraph": {
-"text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = GoogleChatAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_GoogleChatAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DGoogleChatAlert\n"
+"text": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = GoogleChatAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_GoogleChatAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DGoogleChatAlert&matcher=grafana_folder%%3Ddefault\n"
 }
 },
 {
@@ -2356,8 +2358,8 @@ var expNonEmailNotifications = map[string][]string{
 "alert_state": "alerting",
 "client": "Grafana",
 "client_url": "http://localhost:3000/alerting/list",
-"description": "[FIRING:1] KafkaAlert ",
-"details": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = KafkaAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_KafkaAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DKafkaAlert\n",
+"description": "[FIRING:1] KafkaAlert (default)",
+"details": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = KafkaAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_KafkaAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DKafkaAlert&matcher=grafana_folder%3Ddefault\n",
 "incident_key": "35c0bdb1715f9162a20d7b2a01cb2e3a4c5b1dc663571701e3f67212b696332f"
 }
 }
@@ -2365,32 +2367,32 @@ var expNonEmailNotifications = map[string][]string{
 }`,
 },
 "line_recv/line_test": {
-`message=%5BFIRING%3A1%5D+LineAlert+%0Ahttp%3A%2Flocalhost%3A3000%2Falerting%2Flist%0A%0A%2A%2AFiring%2A%2A%0A%0AValue%3A+%5B+var%3D%27A%27+labels%3D%7B%7D+value%3D1+%5D%0ALabels%3A%0A+-+alertname+%3D+LineAlert%0AAnnotations%3A%0ASource%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fgrafana%2FUID_LineAlert%2Fview%0ASilence%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fsilence%2Fnew%3Falertmanager%3Dgrafana%26matcher%3Dalertname%253DLineAlert%0A`,
+`message=%5BFIRING%3A1%5D+LineAlert+%28default%29%0Ahttp%3A%2Flocalhost%3A3000%2Falerting%2Flist%0A%0A%2A%2AFiring%2A%2A%0A%0AValue%3A+%5B+var%3D%27A%27+labels%3D%7B%7D+value%3D1+%5D%0ALabels%3A%0A+-+alertname+%3D+LineAlert%0A+-+grafana_folder+%3D+default%0AAnnotations%3A%0ASource%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fgrafana%2FUID_LineAlert%2Fview%0ASilence%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fsilence%2Fnew%3Falertmanager%3Dgrafana%26matcher%3Dalertname%253DLineAlert%26matcher%3Dgrafana_folder%253Ddefault%0A`,
 },
 "threema_recv/threema_test": {
-`from=%2A1234567&secret=myapisecret&text=%E2%9A%A0%EF%B8%8F+%5BFIRING%3A1%5D+ThreemaAlert+%0A%0A%2AMessage%3A%2A%0A%2A%2AFiring%2A%2A%0A%0AValue%3A+%5B+var%3D%27A%27+labels%3D%7B%7D+value%3D1+%5D%0ALabels%3A%0A+-+alertname+%3D+ThreemaAlert%0AAnnotations%3A%0ASource%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fgrafana%2FUID_ThreemaAlert%2Fview%0ASilence%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fsilence%2Fnew%3Falertmanager%3Dgrafana%26matcher%3Dalertname%253DThreemaAlert%0A%0A%2AURL%3A%2A+http%3A%2Flocalhost%3A3000%2Falerting%2Flist%0A&to=abcdefgh`,
+`from=%2A1234567&secret=myapisecret&text=%E2%9A%A0%EF%B8%8F+%5BFIRING%3A1%5D+ThreemaAlert+%28default%29%0A%0A%2AMessage%3A%2A%0A%2A%2AFiring%2A%2A%0A%0AValue%3A+%5B+var%3D%27A%27+labels%3D%7B%7D+value%3D1+%5D%0ALabels%3A%0A+-+alertname+%3D+ThreemaAlert%0A+-+grafana_folder+%3D+default%0AAnnotations%3A%0ASource%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fgrafana%2FUID_ThreemaAlert%2Fview%0ASilence%3A+http%3A%2F%2Flocalhost%3A3000%2Falerting%2Fsilence%2Fnew%3Falertmanager%3Dgrafana%26matcher%3Dalertname%253DThreemaAlert%26matcher%3Dgrafana_folder%253Ddefault%0A%0A%2AURL%3A%2A+http%3A%2Flocalhost%3A3000%2Falerting%2Flist%0A&to=abcdefgh`,
 },
 "victorops_recv/victorops_test": {
 `{
 "alert_url": "http://localhost:3000/alerting/list",
-"entity_display_name": "[FIRING:1] VictorOpsAlert ",
+"entity_display_name": "[FIRING:1] VictorOpsAlert (default)",
 "entity_id": "633ae988fa7074bcb51f3d1c5fef2ba1c5c4ccb45b3ecbf681f7d507b078b1ae",
 "message_type": "CRITICAL",
 "monitoring_tool": "Grafana v",
-"state_message": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = VictorOpsAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_VictorOpsAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DVictorOpsAlert\n",
+"state_message": "**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = VictorOpsAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_VictorOpsAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%%3DVictorOpsAlert&matcher=grafana_folder%%3Ddefault\n",
 "timestamp": %s
 }`,
 },
 "opsgenie_recv/opsgenie_test": {
 `{
 "alias": "47e92f0f6ef9fe99f3954e0d6155f8d09c4b9a038d8c3105e82c0cee4c62956e",
-"description": "[FIRING:1] OpsGenieAlert \nhttp://localhost:3000/alerting/list\n\n**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = OpsGenieAlert\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_OpsGenieAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOpsGenieAlert\n",
+"description": "[FIRING:1] OpsGenieAlert (default)\nhttp://localhost:3000/alerting/list\n\n**Firing**\n\nValue: [ var='A' labels={} value=1 ]\nLabels:\n - alertname = OpsGenieAlert\n - grafana_folder = default\nAnnotations:\nSource: http://localhost:3000/alerting/grafana/UID_OpsGenieAlert/view\nSilence: http://localhost:3000/alerting/silence/new?alertmanager=grafana&matcher=alertname%3DOpsGenieAlert&matcher=grafana_folder%3Ddefault\n",
 "details": {
 "url": "http://localhost:3000/alerting/list"
 },
-"message": "[FIRING:1] OpsGenieAlert ",
+"message": "[FIRING:1] OpsGenieAlert (default)",
 "source": "Grafana",
-"tags": ["alertname:OpsGenieAlert"]
+"tags": ["alertname:OpsGenieAlert","grafana_folder:default"]
 }`,
 },
 // Prometheus Alertmanager.
@@ -2399,7 +2401,8 @@ var expNonEmailNotifications = map[string][]string{
 {
 "labels": {
 "__alert_rule_uid__": "UID_AlertmanagerAlert",
-"alertname": "AlertmanagerAlert"
+"alertname": "AlertmanagerAlert",
+"grafana_folder": "default"
 },
 "annotations": {
 "__value_string__": "[ var='A' labels={} value=1 ]"
