Analytics & Reporting
Use this page when you need to turn product and quality signals into a report that someone else will read. Reporting differs from day-to-day dashboard review because it needs a stable time range, clear metric definitions, and an explicit audience.
Before you build a report
- Define the audience first: workspace admins, leadership, support, security reviewers, or customer stakeholders.
- Choose a fixed reporting window and note any known rollout changes that affect the numbers.
- Confirm which metrics are authoritative in the current product and which still require feature-level validation.
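The checklist above can be captured as a small, explicit report specification so that audience, window, and metric definitions are pinned down before any numbers are pulled. This is a minimal sketch; the `ReportSpec` class, its field names, and the example values are illustrative assumptions, not part of any product API.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical structure for pinning down a report before building it.
@dataclass(frozen=True)
class ReportSpec:
    audience: str              # e.g. "leadership" or "security reviewers"
    window_start: date         # fixed reporting window, not a rolling range
    window_end: date
    metric_definitions: dict   # metric name -> plain-language definition
    rollout_notes: list = field(default_factory=list)  # known changes affecting the numbers

    def validate(self) -> bool:
        if self.window_end < self.window_start:
            raise ValueError("reporting window end precedes start")
        if not self.metric_definitions:
            raise ValueError("a report needs at least one defined metric")
        return True

spec = ReportSpec(
    audience="workspace admins",
    window_start=date(2024, 1, 1),
    window_end=date(2024, 3, 31),
    metric_definitions={
        "weekly_active_hosts": "distinct hosts who started a meeting in the week",
    },
    rollout_notes=["summaries rolled out to pilot teams mid-window"],
)
assert spec.validate()
```

Making the window and definitions explicit objects, rather than implicit dashboard settings, is what keeps a report reproducible when someone asks about it a quarter later.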
Common report sets
- Usage and participation trends.
- Quality and reliability patterns by environment or team.
- Governance signals such as retention, export, or artifact access.
- Adoption of summaries, search, and other follow-up workflows.
Reporting guardrails
- Validate export shape and rollup scope before you depend on them for executive reporting.
- Keep meeting-level anecdotes separate from workspace-wide trends unless they represent a repeated failure mode.
- Call out when preview features or limited rollouts affect the report so readers do not mistake rollout gaps for adoption failure.
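The first guardrail above, validating export shape before trusting it, can be sketched as a fail-fast column check on an exported file. The column names and CSV layout here are illustrative assumptions, not the product's actual export format.

```python
import csv
import io

# Hypothetical expected shape for a usage export; column names are illustrative.
EXPECTED_COLUMNS = {"team", "week", "active_members", "meetings"}

def validate_export(csv_text: str) -> list[dict]:
    """Fail fast if the export's columns drift before the numbers reach a report."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = EXPECTED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"export missing expected columns: {sorted(missing)}")
    return list(reader)

sample = "team,week,active_members,meetings\nsupport,2024-W10,14,52\n"
rows = validate_export(sample)
```

A check like this catches silent schema changes (renamed or dropped columns) at ingest time instead of as a wrong number in an executive slide.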
Troubleshooting
The report is technically correct but still misleading
- Add rollout context, audience scope, and metric definitions directly into the report.
- Separate pilot groups, guests, and general members if their workflows differ materially.
- Re-check whether network or support issues distorted adoption or quality numbers during the period you selected.
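The second point above, separating pilot groups, guests, and general members, amounts to reporting adoption per cohort instead of one blended number. A minimal sketch, assuming hypothetical per-member rows with a `cohort` label:

```python
from collections import defaultdict

# Hypothetical per-member usage rows; "cohort" distinguishes pilot, guest,
# and general members whose workflows differ.
rows = [
    {"cohort": "pilot", "used_summaries": True},
    {"cohort": "pilot", "used_summaries": True},
    {"cohort": "general", "used_summaries": False},
    {"cohort": "general", "used_summaries": True},
    {"cohort": "guest", "used_summaries": False},
]

def adoption_by_cohort(rows):
    """Compute adoption rate per cohort instead of one blended figure."""
    totals, adopters = defaultdict(int), defaultdict(int)
    for r in rows:
        totals[r["cohort"]] += 1
        adopters[r["cohort"]] += r["used_summaries"]  # True counts as 1
    return {c: adopters[c] / totals[c] for c in totals}

print(adoption_by_cohort(rows))
# → {'pilot': 1.0, 'general': 0.5, 'guest': 0.0}
```

A blended rate here would read as 60% adoption and hide that general members sit at 50% while guests have not adopted at all, exactly the kind of technically correct but misleading figure this section warns about.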