Phase 1

Data Preparation & Modeling

This phase focuses on creating a single, optimized "gold-level" view in Azure Databricks. This performant data layer will be the sole source for our Power BI model, ensuring speed and simplicity.

Step 1.1: Core SQL Transformation

The first step is to write a comprehensive SQL script in a Databricks notebook. This script joins multiple source tables (from Epic) and creates a clean, flattened fact view ready for analysis.

-- Conceptual SQL for v_ChargeRecon_Fact View
CREATE OR REPLACE VIEW gold.v_ChargeRecon_Fact AS

WITH Appointments AS (
    SELECT EncounterID, AppointmentDate, PatientID, ProviderID, ClinicID
    FROM silver.appointments WHERE AppointmentStatus = 'Completed'
),

Charges AS (
    SELECT EncounterID, ChargePostDate, ChargeAmount, IsDenied, DenialReasonID
    FROM silver.transactions
),

FactData AS (
    SELECT
        a.EncounterID, a.PatientID, a.ProviderID, a.ClinicID,
        a.AppointmentDate, c.ChargePostDate, c.ChargeAmount,
        c.IsDenied, c.DenialReasonID,
        -- Databricks DATEDIFF(endDate, startDate): days between appointment and charge post
        DATEDIFF(c.ChargePostDate, a.AppointmentDate) AS ChargeLagDays
    FROM Appointments a
    -- LEFT JOIN keeps completed appointments even when no charge has posted yet
    LEFT JOIN Charges c ON a.EncounterID = c.EncounterID
)

SELECT *,
    CASE WHEN ChargeLagDays > 7 THEN TRUE ELSE FALSE END AS IsLateCharge
FROM FactData;
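
After the view is created, a quick sanity check in the same notebook can confirm the join and lag logic before Power BI connects to it. The query below is a minimal sketch against the conceptual view above; it references only columns defined in that SQL.

-- Sanity check on gold.v_ChargeRecon_Fact: lag distribution and unbilled encounters
SELECT
    IsLateCharge,
    COUNT(*)                                                AS RowCount,
    ROUND(AVG(ChargeLagDays), 1)                            AS AvgLagDays,
    -- appointments with no posted charge have a NULL ChargePostDate and NULL lag
    SUM(CASE WHEN ChargePostDate IS NULL THEN 1 ELSE 0 END) AS EncountersMissingCharges
FROM gold.v_ChargeRecon_Fact
GROUP BY IsLateCharge;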
Phase 2

Dashboard Development

With our data prepared, we move to Power BI Desktop. Here we will build a robust star schema data model, write explicit DAX measures for our KPIs, and design an intuitive dashboard for our clinic leaders.

Step 2.1: The Star Schema Model

In Power BI's Model View, we'll connect our tables into a star schema: single-direction, one-to-many relationships flowing from each dimension into the central fact table. This is critical for performance and ensures filters propagate correctly from dimensions to the fact. The tables are listed below, followed by a conceptual sketch of the Date dimension view.

Fact: ChargeRecon
Dim: Provider
Dim: Clinic
Dim: Date
Dim: Denial Reason
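
The YTD Revenue measure in Step 2.2 references a 'v_dim_Date' table, so the Date dimension is assumed here to be published as another gold view alongside the fact view. The sketch below is illustrative only: the view name comes from that measure, while the columns and date range are assumptions to be adjusted to the actual reporting calendar.

-- Conceptual SQL for v_dim_Date View (illustrative columns and range)
CREATE OR REPLACE VIEW gold.v_dim_Date AS
SELECT
    d.Date,
    year(d.Date)                AS Year,
    quarter(d.Date)             AS Quarter,
    month(d.Date)               AS Month,
    date_format(d.Date, 'MMMM') AS MonthName,
    date_format(d.Date, 'EEEE') AS DayName
FROM (
    -- one row per calendar day across the reporting horizon
    SELECT explode(sequence(DATE'2020-01-01', DATE'2030-12-31', INTERVAL 1 DAY)) AS Date
) d;

In the model, mark this table as the date table and relate 'v_dim_Date'[Date] to the fact view on whichever date drives reporting (AppointmentDate or ChargePostDate).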

Step 2.2: DAX Measures for KPIs

We will create explicit measures for all business logic. This centralizes calculations and ensures consistency across every visual and report page. The DAX for each KPI is shown below.

Avg Charge Lag

Avg Charge Lag = AVERAGE('v_ChargeRecon_Fact'[ChargeLagDays])

Denial Rate

Denial Rate = 
VAR DeniedCharges = CALCULATE(COUNTROWS('v_ChargeRecon_Fact'), 'v_ChargeRecon_Fact'[IsDenied] = TRUE())
VAR TotalCharges = COUNTROWS('v_ChargeRecon_Fact')
RETURN DIVIDE(DeniedCharges, TotalCharges, 0)

YTD Revenue

YTD Revenue = TOTALYTD(SUM('v_ChargeRecon_Fact'[ChargeAmount]), 'v_dim_Date'[Date])

Step 2.3: Dashboard Visual Mockups

Below are three visual options for the dashboard layout. Each mockup caters to a different user persona or analytical need, providing flexibility in the final design.

Option 1: The Leader's Digest

A balanced, high-level overview for quick insights.

Avg Lag: 2.8 days
Denial %: 4.1%
Late Charges: $12k
Recon. Rate: 99.2%

Option 2: The Analyst's Deep Dive

A more granular, data-dense layout for root cause analysis. A conceptual SQL cross-check for the provider-lag ranking follows the mockup.

Total Denials: 1,422
Lag > 7 Days: 310

Top Providers by Lag

Dr. Smith: 8.1d
Dr. Jones: 7.5d
Dr. Allen: 6.9d
Dr. Patel: 5.2d
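
In Power BI this ranking would come from the Avg Charge Lag measure sliced by provider; the same aggregation can be cross-checked against the gold view. A minimal sketch, assuming provider display names are resolved through the Provider dimension in the report itself:

-- Cross-check: average charge lag by provider, worst first
SELECT
    ProviderID,
    ROUND(AVG(ChargeLagDays), 1) AS AvgChargeLagDays
FROM gold.v_ChargeRecon_Fact
WHERE ChargeLagDays IS NOT NULL
GROUP BY ProviderID
ORDER BY AvgChargeLagDays DESC
LIMIT 5;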

Option 3: The Operational Monitor

A focused view on actionable items and current status. The "Encounters Missing Charges" worklist maps directly onto the fact view, as sketched after the table.

Open Encounters: 83
Recon. Rate (24h): 97%
New Denials: 12

Actionable: Encounters Missing Charges

Patient    Provider    Date
J. Doe     Dr. Smith   07/14/25
A. Bell    Dr. Smith   07/14/25
M. Chen    Dr. Allen   07/13/25
S. Ray     Dr. Jones   07/13/25
L. Kim     Dr. Smith   07/12/25
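
Because the fact view LEFT JOINs charges onto completed appointments, encounters with no posted charge surface with a NULL ChargePostDate. A minimal sketch of the query behind this worklist; patient and provider display names are assumed to come from their dimension views.

-- Completed encounters with no charge posted yet, newest first
SELECT
    EncounterID,
    PatientID,
    ProviderID,
    AppointmentDate
FROM gold.v_ChargeRecon_Fact
WHERE ChargePostDate IS NULL
ORDER BY AppointmentDate DESC;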
Phase 3

Deployment & Governance

The final phase involves publishing the dashboard to the Power BI Service, securing the data with Row-Level Security, and empowering our leaders through training.

Step 3.1: Publish & Schedule Refresh

Publish the `.pbix` file to a dedicated Power BI workspace. Configure a nightly scheduled refresh to run after the Databricks ETL process completes, ensuring data is always fresh.

Step 3.2: Implement Row-Level Security

Create a "Clinic Leader" role in Power BI and apply a DAX filter (`[ClinicLeaderEmail] = USERPRINCIPALNAME()`). Map users to this role in the Power BI Service to ensure leaders only see data for their own clinics.

Step 3.3: User Training & Rollout

Conduct focused training sessions on interpreting the visuals and using interactive features like slicers and drill-throughs. Establish a feedback channel for continuous improvement.