Every metric on the Analytics page reflects your active dashboard filters. When you adjust any of the filters below, all metrics update simultaneously — there is no need to apply them per section.
| Filter | What it controls |
| --- | --- |
| Product | Limits metrics to promotions linked to the selected product |
| Ad Type | Limits metrics to promotions of the selected ad format |
| Country | Limits metrics to promotions targeting the selected country |
| Uploader | Limits metrics to promotions submitted by the selected person |
| Reviewer | Limits metrics to promotions reviewed by the selected person |
| Campaign | Limits metrics to a specific campaign |
| Evergreen | Includes or excludes evergreen promotions |
Deleted, archived, staff-only, and affiliate records are excluded from every metric on this page, regardless of filters.
Total Promotions Submitted
The total number of promotions that entered the review pipeline during the selected period. Use this to understand overall demand coming into your team. Promotions still in draft state are not counted — only those that have been actively submitted for review.
How this is calculated
Total Promotions Submitted = count of all submitted, non-draft promotions within the selected date range and filters
What counts: Promotions created within the selected date range that have been submitted (any status except draft).
What doesn’t count: Drafts, deleted, archived, staff-only, and affiliate promotions.
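As a minimal sketch of the counting rule above — the field names (`status`, `deleted`, `archived`, `staff_only`, `affiliate`, `created`) are illustrative placeholders, not the real schema:

```python
from datetime import date

# Hypothetical promotion records; field names are illustrative only.
promotions = [
    {"status": "approved", "deleted": False, "archived": False,
     "staff_only": False, "affiliate": False, "created": date(2024, 5, 3)},
    {"status": "draft", "deleted": False, "archived": False,
     "staff_only": False, "affiliate": False, "created": date(2024, 5, 4)},
    {"status": "pending", "deleted": True, "archived": False,
     "staff_only": False, "affiliate": False, "created": date(2024, 5, 5)},
]

start, end = date(2024, 5, 1), date(2024, 5, 31)

def qualifies(p):
    # Any status except draft counts; excluded record types never count.
    return (start <= p["created"] <= end
            and p["status"] != "draft"
            and not (p["deleted"] or p["archived"]
                     or p["staff_only"] or p["affiliate"]))

total_promotions_submitted = sum(1 for p in promotions if qualifies(p))
```

Here only the first record qualifies: the second is still a draft, and the third is deleted.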
See this in action
In the example above, Adclear had 415 promotions submitted. This tells you that 415 distinct pieces of content entered the review pipeline — each one representing a campaign that needed sign-off.
Total Versions Submitted
The total number of individual ad versions submitted during the selected period. Each version is a distinct creative upload within a promotion — this metric reflects the actual volume of content that reviewers handle.
How this is calculated
Total Versions Submitted = count of all submitted, non-draft versions linked to valid promotions in the selected date range and filters
What counts: Version records created in the selected date range that have been submitted (any status except draft).
What doesn’t count: Draft versions, deleted versions, and versions belonging to deleted, archived, staff-only, or affiliate promotions.
Unlike most metrics, archived versions are not automatically excluded here. An archived version can still be counted if it was otherwise submitted and meets all other conditions.
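A short sketch of the version-level rule, including the archived-version exception noted above. The fields (`status`, `archived`, `deleted`, `promotion_ok`) are illustrative stand-ins, not the actual schema:

```python
# Illustrative version records; unlike most metrics, "archived" alone
# does not disqualify a version here.
versions = [
    {"status": "approved", "archived": True,  "deleted": False, "promotion_ok": True},
    {"status": "draft",    "archived": False, "deleted": False, "promotion_ok": True},
    {"status": "pending",  "archived": False, "deleted": False, "promotion_ok": False},
]

total_versions_submitted = sum(
    1 for v in versions
    # Submitted (non-draft), not deleted, and parent promotion qualifies.
    if v["status"] != "draft" and not v["deleted"] and v["promotion_ok"]
)
```

The first version counts despite being archived; the second is a draft and the third belongs to an excluded promotion.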
See this in action
In the example above, Adclear had 461 versions submitted across those 415 promotions. That’s 46 more versions than promotions — meaning some promotions contained more than one creative upload.
Average Versions
The average number of submitted versions per promotion. A low number suggests promotions are typically straightforward; a higher number suggests more complex submissions with multiple creative variations.
How this is calculated
Average Versions = total submitted versions ÷ total submitted promotions
Displayed rounded to one decimal place. The dashboard always shows a minimum of 1.0 — this floor prevents the metric displaying a misleading value in edge cases.
What counts: Submitted, non-draft versions linked to qualifying promotions in the selected period.
What doesn’t count: Draft, deleted, or archived versions; draft, deleted, archived, staff-only, or affiliate promotions.
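The division, the 1.0 floor, and the one-decimal rounding can be sketched as follows, using the counts from the worked example:

```python
total_versions = 461     # submitted versions, from the example
total_promotions = 415   # submitted promotions

# Floor at 1.0, then round to one decimal place for display.
average_versions = max(1.0, total_versions / total_promotions) if total_promotions else 1.0
display_value = round(average_versions, 1)
```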
See this in action
In the example above, Adclear’s average was 1.1 versions per promotion. With 461 versions across 415 promotions, almost every promotion came in with a single creative — but just enough had two versions to nudge the average slightly above 1.
Early Requests
The percentage of submitted versions that were flagged as needing accelerated handling. This helps teams understand how much of their incoming workload is time-sensitive or has been prioritised by the submitter.
How this is calculated
Early Requests % = (versions flagged as early requests ÷ total submitted versions) × 100
What counts: Submitted versions that have been explicitly marked as an early request. Versions where this flag is absent or switched off are not included.
What doesn’t count: Draft, deleted, or archived versions; versions belonging to deleted, archived, staff-only, or affiliate promotions.
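A minimal sketch of the flag rule: an absent or switched-off flag is treated the same way, only an explicit `True` counts. The `early_request` field name is an assumption for illustration:

```python
# Only versions explicitly flagged as an early request count; an absent
# or switched-off flag is excluded either way.
versions = [
    {"early_request": True},
    {"early_request": False},
    {},  # flag absent
]

flagged = sum(1 for v in versions if v.get("early_request") is True)
total = len(versions)
early_request_pct = round(flagged / total * 100, 1) if total else 0.0
```

One flagged version out of three gives 33.3%.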
See this in action
In the example above, 30.8% of versions were flagged as early requests — that’s 142 out of 461 submissions. Nearly a third of the workload arriving in this period was time-sensitive, which gives reviewers useful context for prioritisation.
SLA Compliance
The percentage of reviewed versions where the review was completed within the agreed service-level timeframe. This is the primary indicator of whether your review team is keeping pace with its commitments.
How this is calculated
SLA Compliance % = (versions reviewed on time ÷ total reviewed versions) × 100
What counts in the numerator: Versions that received a review outcome and were confirmed as meeting the SLA deadline, based on working-day rules.
What counts in the denominator: Any version that reached a reviewed outcome (moved past pending or draft stage).
What doesn’t count: Draft, deleted, or archived versions; versions tied to deleted, archived, staff-only, or affiliate promotions.
SLA compliance is assessed at the date level, not the exact time of day. If no versions have been reviewed in the selected period, this metric displays as 0%.
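The date-level comparison can be sketched like this — a review completed at 23:45 on the deadline day is still on time, because the time of day is ignored. The data shapes here are illustrative:

```python
from datetime import date, datetime

# (moment the review was completed, SLA deadline date) per version.
reviews = [
    (datetime(2024, 5, 10, 23, 45), date(2024, 5, 10)),  # on time: same date
    (datetime(2024, 5, 11, 8, 0),   date(2024, 5, 10)),  # one day late
]

# Compare at date level only, never the exact time of day.
on_time = sum(1 for reviewed_at, deadline in reviews if reviewed_at.date() <= deadline)
sla_pct = round(on_time / len(reviews) * 100, 1) if reviews else 0.0
```

With an empty `reviews` list the guard returns 0.0, matching the dashboard's behaviour when nothing was reviewed in the period.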
See this in action
In the example above, Adclear achieved 98.0% SLA compliance — 434 out of 443 versions reviewed were turned around within the agreed timeframe. Only 9 versions missed the deadline, suggesting a highly consistent review operation.
1st Time Approval
The percentage of promotions that were approved the very first time they were reviewed, without needing any further revisions or re-submissions. A higher number indicates stronger content quality and fewer back-and-forth review cycles.
How this is calculated
1st Time Approval % = (promotions approved on first attempt ÷ total reviewed promotions) × 100
What counts in the numerator: Promotions where the first review outcome was an approval, with no prior revision cycles.
What counts in the denominator: All promotions that received any review outcome.
What doesn’t count: Draft, deleted, archived, staff-only, and affiliate promotions.
The approval doesn’t need to happen within the selected date range. As long as the promotion was created in range, approvals that occur later are still factored in.
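A sketch of the "first outcome must be an approval" rule, using hypothetical review histories (oldest outcome first):

```python
# Review outcomes per promotion, oldest first (illustrative data).
review_histories = {
    "promo_a": ["approved"],                       # first-time approval
    "promo_b": ["changes_requested", "approved"],  # approved, but not first time
    "promo_c": ["rejected"],
}

# Denominator: every promotion with at least one review outcome.
reviewed = [h for h in review_histories.values() if h]
# Numerator: the very first outcome was an approval.
first_time = sum(1 for h in reviewed if h[0] == "approved")
first_time_pct = round(first_time / len(reviewed) * 100, 1)
```

Only `promo_a` qualifies, so the rate here is 33.3%; `promo_b` was eventually approved but needed a revision cycle first.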
See this in action
In the example above, Adclear’s 1st Time Approval rate was 92.5% — 384 out of 415 promotions were approved without needing a second look. That means only around 31 promotions required revisions before being signed off.
Approval Time
How long it typically takes from the moment a version is first submitted to the moment it receives its first approval. This gives a clear picture of end-to-end review speed.
The dashboard shows both the average and the median, displayed in the most readable unit (minutes, hours, days, or weeks) based on actual values.
How this is calculated
Approval Time = time of first approval − time of first submission
Reported as both mean and median across all qualifying versions.
Clock starts: When the version is first submitted for review.
Clock stops: When the version receives its first approval decision.
What doesn’t count: Versions where either event is missing; draft, deleted, or archived versions; versions tied to deleted, archived, staff-only, or affiliate promotions.
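The clock-start/clock-stop rule, with mean and median over qualifying durations, can be sketched as below. Versions missing either timestamp are dropped, as the exclusion rule requires; the timestamps are invented for illustration:

```python
from datetime import datetime
from statistics import mean, median

# (first submission, first approval) per version; None = event missing.
events = [
    (datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 11, 0)),  # 2 h
    (datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 15, 0)),  # 6 h
    (datetime(2024, 5, 2, 9, 0), None),  # never approved: excluded
]

durations_h = [(approved - submitted).total_seconds() / 3600
               for submitted, approved in events
               if submitted and approved]
avg_h, med_h = mean(durations_h), median(durations_h)
```

The same shape applies to First-Time Response below, with "first approval" swapped for "first reviewer action".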
See this in action
In the example above, Adclear’s average Approval Time was 5.5 hours across 427 approved versions. That means from the moment a version landed in the review queue, it was typically approved within the same working day.
First-Time Response
How long it takes for a reviewer to give any initial response after a version is submitted — even if that response isn’t a final decision. This highlights reviewer responsiveness early in the process.
The dashboard shows both the average and the median.
How this is calculated
First-Time Response = time of first reviewer action − time of first submission
Reported as both mean and median across all qualifying versions.
Clock starts: When the version is first submitted.
Clock stops: When any status change occurs after that submission — not just a final approval.
What doesn’t count: Versions missing either event; draft, deleted, or archived versions; versions tied to deleted, archived, staff-only, or affiliate promotions.
See this in action
In the example above, Adclear’s First-Time Response averaged 6.4 hours across 443 versions. Compared to the 5.5-hour approval time, this suggests that most initial responses were also final decisions — reviewers weren’t bouncing work back for minor clarifications before eventually approving.
Number of Iterations
The average number of times a promotion had to be re-submitted after its initial upload before receiving approval. Zero means the first submission was approved straight away. Higher numbers indicate more back-and-forth and rework.
How this is calculated
Iterations per promotion = total submissions up to first approval − 1 (minimum 0)
Number of Iterations = average across all qualifying promotions
The median is also calculated alongside the average to give a better sense of the spread.
What counts: All submitted, non-draft versions up to and including the first approved version. If a promotion was never approved, all of its submitted versions are used.
What doesn’t count: Draft, deleted, or archived versions; deleted, archived, staff-only, or affiliate promotions.
A promotion with only one submission — approved first try — contributes 0 iterations, because the first submission is the baseline, not a re-submit.
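The subtract-one-with-a-floor rule above can be sketched directly, using invented per-promotion submission counts:

```python
# Submitted versions up to and including the first approved one,
# per promotion (illustrative counts).
submissions_to_first_approval = {"promo_a": 1, "promo_b": 3, "promo_c": 2}

# The first submission is the baseline, so iterations = submissions − 1,
# floored at 0.
iterations = [max(0, n - 1) for n in submissions_to_first_approval.values()]
avg_iterations = round(sum(iterations) / len(iterations), 1)
```

Here `promo_a` contributes 0 iterations (approved first try), `promo_b` contributes 2, and `promo_c` contributes 1, for an average of 1.0.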
See this in action
In the example above, the average Number of Iterations was 0.1. With a 92.5% first-time approval rate, this makes sense — the vast majority of promotions needed no rework at all, keeping the average extremely close to zero.
Most Active Uploaders
Shows which uploaders submitted the most work during the selected period. Useful for understanding where submission volume is coming from and spotting any concentration across a small number of submitters.
Displays the top 5 uploaders by submission count, in descending order.
How this is calculated
Most Active Uploaders = top 5 uploaders ranked by submission count (descending)
The count switches depending on your current view mode:
Promotions mode — counts campaigns per uploader
Versions mode — counts individual ad versions per uploader
What counts: Records created within the selected date range, matched to their uploader.
What doesn’t count: Deleted or archived records, staff-only campaigns, affiliate campaigns, and any record where no uploader can be identified.
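The group-and-rank step is a straightforward count per uploader followed by a descending top-5 cut. A sketch with invented names:

```python
from collections import Counter

# Uploader attributed to each qualifying record (illustrative names).
uploaders = ["ana", "ana", "ben", "cara", "ana", "ben", "dee", "eli", "finn"]

# Top 5 by count, descending; any uploader beyond the fifth slot is dropped.
top_5 = Counter(uploaders).most_common(5)
```

The same ranking applies to Most Active Reviewers below, counting completed review decisions instead of submissions.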
See this in action
In the example above, João Paulo Siqueira was Adclear’s most active uploader with 90 versions submitted — more than double the next person on the list. The top 5 uploaders together accounted for 254 out of 461 total versions, showing that submission volume is concentrated in a relatively small group.
Most Active Reviewers
Shows which reviewers completed the most review work during the selected period. Useful for understanding how workload is distributed across the review team and whether any reviewers are handling a disproportionate share.
Displays the top 5 reviewers by completed review count, in descending order.
How this is calculated
Most Active Reviewers = top 5 reviewers ranked by completed review count (descending)
The count switches depending on your current view mode:
Promotions mode — counts campaigns reviewed per reviewer
Versions mode — counts individual versions reviewed per reviewer
What counts: Review decisions made within the selected date range — any status change that represents a completed review outcome.
What doesn’t count: Pending or draft status changes, deleted or archived records, staff-only and affiliate campaigns, and any record where no reviewer can be identified.
This metric is based on when the review decision was made, not when the promotion was originally created. A review event in the selected period counts even if the promotion was submitted earlier.
See this in action
In the example above, Malinka Marinova was Adclear’s most active reviewer, handling 217 versions — nearly twice as many as the second-ranked reviewer (Hristo Tomov at 111). The top 5 reviewers collectively reviewed 458 out of 483 total versions reviewed, indicating that a small team is carrying most of the review load.
Promotions by Status
Breaks down all promotions into their current status, showing what share of your submission volume sits at each stage of the workflow. Useful for spotting backlogs or understanding the overall mix of outcomes.
How this is calculated
Status share % = (promotions in a given status ÷ total promotions across all statuses shown) × 100
What counts: All qualifying promotions grouped by their current status value.
What doesn’t count: Deleted, archived, staff-only, and affiliate promotions; anything outside the selected filters and date range.
Promotions with no status recorded appear as “Unknown.” The dashboard displays the draft status as “Draft” rather than its internal label.
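The share calculation is a simple group-by over current status. A sketch using the counts from the worked example:

```python
from collections import Counter

# Current status per qualifying promotion, mirroring the worked example.
statuses = ["approved"] * 412 + ["changes_requested"] * 2 + ["rejected"]

counts = Counter(statuses)
total = sum(counts.values())
# Share of each status, rounded to one decimal place.
shares = {status: round(n / total * 100, 1) for status, n in counts.items()}
```

412 approved out of 415 rounds to 99.3%, matching the example below.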
See this in action
In the example above, 99.3% of Adclear’s promotions were approved (412 out of 415), with only 2 needing changes and 1 rejected. This breakdown instantly confirms that the pipeline is healthy and very little work is sitting unresolved.
Promotions by Product
Shows how promotion volume is distributed across different products. Helps identify which products are generating the most review traffic and whether any product is driving an outsized share of submissions.
How this is calculated
Product share % = (distinct promotions linked to a product ÷ total promotions across all products shown) × 100
What counts: Distinct promotions grouped by the product they are linked to.
What doesn’t count: Deleted, archived, staff-only, and affiliate promotions; products not linked to the current organisation; anything outside the selected filters and date range.
Promotions not linked to any product appear under “Unknown.” Counts are based on distinct promotions, not raw database rows.
See this in action
In the example above, ActivTrader accounted for 79.3% of Adclear’s promotions (329 out of 415). The remaining volume was spread across Trading View, MetaTrader 4, and MetaTrader 5 — giving a clear picture of which product drives the bulk of the review workload.
Promotions by Ad Type
Shows how promotion volume is distributed across different ad formats. Helps teams understand which ad types create the most pipeline load and where review capacity may need to be focused.
How this is calculated
Ad Type share % = (distinct promotions linked to an ad type ÷ total promotions across all ad types shown) × 100
What counts: Distinct promotions grouped by the ad type they are linked to.
What doesn’t count: Deleted, archived, staff-only, and affiliate promotions; ad types not linked to the current organisation; anything outside the selected filters and date range.
Promotions not linked to any ad type appear under “Unknown.” Counts are based on distinct promotions, not raw database rows.
See this in action
In the example above, Articles made up 37.1% of Adclear’s promotions (154 out of 415), followed by Content at 22.7% and Long-form video at 10.8%. The remaining volume was spread across a long tail of formats — useful context for understanding where reviewer expertise needs to be concentrated.
Rejection Reasons
Shows which reasons are selected most often when reviewers reject a promotion. Helps identify recurring compliance or quality failure patterns so teams can address root causes upstream.
How this is calculated
Rejection Reason % = (count for a given reason ÷ total rejection reason selections across all reasons shown) × 100
What counts: Every rejection reason recorded against a rejected version or promotion within the selected date range.
What doesn’t count: Records outside the selected date range, deleted or archived versions and campaigns, staff-only and affiliate campaigns, anything outside selected filters.
A single review can record multiple rejection reasons — each one is counted individually. If the same promotion is reviewed and rejected more than once over time, each rejection adds further rows. This means the total can exceed the number of rejected promotions.
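Because each reason selection is counted individually, the total can exceed the number of reviews. A sketch with invented reason labels:

```python
from collections import Counter

# Reasons recorded per rejection review; one review can carry several.
rejection_reviews = [
    ["risk_warnings", "disclaimers"],
    ["risk_warnings"],
    ["other"],
]

# Flatten and count every individual selection.
reason_counts = Counter(r for review in rejection_reviews for r in review)
total_selections = sum(reason_counts.values())  # 4 selections from 3 reviews
```

The Changes Requested Reasons metric below counts its reason selections the same way.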
See this in action
In the example above, 75% of Adclear’s rejections were due to missing or inadequate risk warnings and disclaimers (8 out of the total recorded reasons). The remaining 25% fell into an “Other” category. This signals a clear and consistent compliance gap — one that could be addressed through uploader guidance or submission checklists.
Changes Requested Reasons
Shows which reasons are selected most often when reviewers request changes rather than outright rejecting a promotion. Helps teams identify the most common fix requests and reduce revision cycles proactively.
How this is calculated
Changes Requested Reason % = (count for a given reason ÷ total changes-requested reason selections across all reasons shown) × 100
What counts: Every “changes requested” reason recorded against a version or promotion within the selected date range.
What doesn’t count: Records outside the selected date range, deleted or archived versions and campaigns, staff-only and affiliate campaigns, anything outside selected filters.
Reasons are grouped by name, not by unique review event. If a reviewer selects multiple reasons in a single review, each reason is counted as a separate row. This means the total can exceed the number of promotions that received a changes-requested outcome.
See this in action
In the example above, 71.4% of Adclear’s change requests cited missing or inadequate risk warnings and disclaimers (15 out of 21 recorded reasons). This mirrors the top rejection reason — suggesting that risk disclosure is the single biggest quality gap in the current submission pipeline, appearing both as a reason for outright rejection and for revision requests.