Experiment Settings

Configure each experiment's targeting (URL pattern, query parameters, visitor status), traffic allocation, and consent behavior. All targeting rules are evaluated client-side before variant assignment.

Each experiment carries a small set of settings that control who enters it and how traffic is allocated. All of these are configured in the dashboard when you create or edit an experiment.

This page covers:

  • Targeting — URL patterns, query parameters, visitor status
  • Traffic allocation — what percentage of eligible visitors enter
  • Consent — how the runtime respects visitor consent

Targeting

Targeting rules decide which visitors are eligible for an experiment. Only visitors who pass all rules are assigned a variant. Everyone else sees the original page, unmodified. All rules are evaluated client-side by the experiment runtime before any variant is applied.

URL Patterns

URL patterns control which pages the experiment runs on. Visitors on matching pages are eligible; visitors on other pages are not.

Patterns match against the URL path (slug). Use * to match within a single segment and ** to match across segments.

Pattern       Matches                              Does Not Match
/pricing      /pricing                             /pricing/enterprise
/pricing/*    /pricing/monthly, /pricing/annual    /pricing/a/b
/blog/**      /blog/post, /blog/2024/recap         /docs/blog

If no URL pattern is set, the experiment will not run on any page.
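The wildcard semantics above can be sketched as a small matcher. This is an illustration of the documented behavior, not the runtime's actual implementation, and the function name is hypothetical:

```typescript
// Illustrative sketch: translate a URL pattern into a regular expression.
//   **  matches across segments (any characters, including "/")
//   *   matches within a single segment (no "/")
function matchesUrlPattern(pattern: string, path: string): boolean {
  const regex = pattern
    .split(/(\*\*|\*)/) // keep the wildcards as separate parts
    .map((part) => {
      if (part === "**") return ".*";
      if (part === "*") return "[^/]+";
      // Escape regex metacharacters in literal parts.
      return part.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
    })
    .join("");
  return new RegExp(`^${regex}$`).test(path);
}

// The examples from the table:
matchesUrlPattern("/pricing", "/pricing");            // true
matchesUrlPattern("/pricing/*", "/pricing/monthly");  // true
matchesUrlPattern("/pricing/*", "/pricing/a/b");      // false
matchesUrlPattern("/blog/**", "/blog/2024/recap");    // true
matchesUrlPattern("/blog/**", "/docs/blog");          // false
```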

Query Parameters

Narrow an experiment to visitors whose URL contains specific query string values. This is useful for targeting paid traffic by UTM parameter, affiliate codes, or any key-value pair in the URL.

Each condition specifies a key, an operator, and (for most operators) a value. All conditions use AND logic — every condition must match for the visitor to be eligible.

Operator      Meaning                                            Example
equals        Key exists and value matches (case-insensitive)    utm_source equals meta
contains      Key exists and value includes the substring        utm_campaign contains spring
exists        Key is present, regardless of value                gclid exists
not_equals    Key is missing, or value does not match            utm_source not_equals internal
not_exists    Key is not present in the URL                      debug not_exists

Matching is case-insensitive for all value comparisons.

Example

To target visitors arriving from a Meta ad campaign:

  • utm_source equals meta
  • utm_medium equals cpc

A visitor landing on ?utm_source=meta&utm_medium=cpc enters the experiment. A visitor with only ?utm_source=meta does not (both conditions must match).
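Condition evaluation as described — AND logic across conditions, case-insensitive value comparisons — can be sketched like this. The `Condition` shape and function name are illustrative assumptions, not the runtime's API:

```typescript
type Operator = "equals" | "contains" | "exists" | "not_equals" | "not_exists";

interface Condition {
  key: string;
  op: Operator;
  value?: string; // unused for exists / not_exists
}

// Every condition must match (AND logic); value comparisons are case-insensitive.
function matchesConditions(search: string, conditions: Condition[]): boolean {
  const params = new URLSearchParams(search);
  return conditions.every(({ key, op, value }) => {
    const actual = params.get(key); // null when the key is absent
    switch (op) {
      case "exists":
        return actual !== null;
      case "not_exists":
        return actual === null;
      case "equals":
        return actual !== null && actual.toLowerCase() === value!.toLowerCase();
      case "contains":
        return actual !== null && actual.toLowerCase().includes(value!.toLowerCase());
      case "not_equals":
        return actual === null || actual.toLowerCase() !== value!.toLowerCase();
      default:
        return false; // unreachable: all operators handled above
    }
  });
}

// The Meta campaign example — both conditions must match:
matchesConditions("?utm_source=meta&utm_medium=cpc", [
  { key: "utm_source", op: "equals", value: "meta" },
  { key: "utm_medium", op: "equals", value: "cpc" },
]); // true
matchesConditions("?utm_source=meta", [
  { key: "utm_source", op: "equals", value: "meta" },
  { key: "utm_medium", op: "equals", value: "cpc" },
]); // false
```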

Visitor Status

Target experiments based on whether the visitor has been to your site before.

Option                      Who It Targets
All visitors                Everyone (default)
First-time visitors only    Visitors who have never loaded a page with the experiment snippet on this domain
Returning visitors only     Visitors who have loaded at least one previous page with the snippet

Visitor status is determined when window.ours_experiments.init({ visitorId }) runs. The runtime stores the last initialized visitor ID in its experiment cookie metadata. If the same visitor ID returns, they are considered returning. If the runtime has not seen that visitor ID before, they are considered first-time.

If a visitor clears their experiment cookies, they will be treated as first-time again.
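The check reduces to comparing the incoming visitor ID against the one last stored in the cookie metadata. A minimal sketch, assuming that comparison is all the runtime does (the function name and `null`-when-absent convention are illustrative):

```typescript
// First-time vs. returning, per the description above:
// "returning" only if this visitor ID was initialized before.
function classifyVisitor(
  visitorId: string,
  storedVisitorId: string | null // last ID seen in the experiment cookie; null if absent
): "first-time" | "returning" {
  return storedVisitorId === visitorId ? "returning" : "first-time";
}

classifyVisitor("v_123", null);    // "first-time" — no cookie, e.g. cookies were cleared
classifyVisitor("v_123", "v_123"); // "returning"  — same ID initialized before
```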

How Targeting Rules Combine

All targeting rules are evaluated in order. A visitor must pass every rule to be eligible:

  1. URL pattern — is the visitor on a matching page?
  2. Query parameters — does the URL contain the required key-value pairs?
  3. Visitor status — is the visitor new or returning, as required?
  4. Traffic allocation — is the visitor within the allocation percentage (see below)?

If any rule rejects the visitor, the experiment is skipped entirely for that pageview.
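The four checks combine as a short-circuit AND. This sketch only illustrates the ordering and all-must-pass semantics; the input shape and names are assumptions, not the runtime's API:

```typescript
interface EligibilityInput {
  urlMatches: boolean;       // 1. URL pattern check
  conditionsMatch: boolean;  // 2. query parameter conditions (AND logic)
  statusMatches: boolean;    // 3. visitor status check
  allocationBucket: number;  // 4. visitor's allocation bucket, 0–99
  trafficAllocation: number; //    experiment setting, 0–100
}

// All checks must pass; a single failure skips the experiment for this pageview.
function isEligible(input: EligibilityInput): boolean {
  return (
    input.urlMatches &&
    input.conditionsMatch &&
    input.statusMatches &&
    input.allocationBucket < input.trafficAllocation
  );
}
```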


Traffic Allocation

Traffic allocation controls what percentage of eligible visitors enter the experiment. Visitors who are eligible (they pass all targeting rules) but fall outside the allocation percentage see the original page and are not tracked as experiment participants.

Set this to 100 to include all eligible visitors. Lower it if you want to limit exposure while you validate results.
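The docs above don't specify how the runtime decides which eligible visitors fall inside the allocation percentage. A common approach is deterministic bucketing — hashing the visitor ID (and experiment ID) into a bucket from 0–99 — so a visitor's in/out decision stays stable across pageviews. This is a sketch under that assumption only; the function, hash choice, and ID formats are hypothetical:

```typescript
// Hypothetical deterministic allocation: hash visitorId + experimentId
// (FNV-1a here) into a stable bucket in 0–99, then compare to the setting.
function inAllocation(visitorId: string, experimentId: string, allocationPercent: number): boolean {
  let hash = 0x811c9dc5;
  for (const ch of visitorId + ":" + experimentId) {
    hash ^= ch.charCodeAt(0);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  const bucket = hash % 100;
  return bucket < allocationPercent;
}

inAllocation("v_123", "exp_pricing", 100); // true  — 100% includes every eligible visitor
inAllocation("v_123", "exp_pricing", 0);   // false — 0% excludes everyone
```

Because the bucket is derived from stable IDs rather than a fresh random draw, raising the allocation later only adds new visitors; it never flips visitors who were already in.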


Consent

Experiments respect the same consent settings as the rest of the Ours Privacy platform. Visitors who have not consented to analytics are not assigned to a variant and do not see modified content. This is enforced at the pipeline level, not as a client-side check that can fail silently.

You don't need to add consent checks to individual experiments — the runtime handles it automatically.

