Several of the IT leadership roles I've been targeting recently list HIPAA compliance ownership as a core requirement. Digital health companies, SaaS platforms serving hospital clients, healthtech companies handling patient-adjacent data — the compliance posture is different from what I managed at Life360, and I wanted to understand exactly how different before walking into those interviews.
So I built the assessment tool I'd want to run on day one of an IT leadership role at a health-adjacent company.
I've spent time in compliance-heavy environments. Life360 is a public company; I owned SOX audit responsibilities for IT systems and access controls for three years. I understand what maintaining evidence looks like, what external auditors actually check, and what the difference is between a control that exists on paper and one that holds up under scrutiny. HIPAA is a different framework, but the operational pattern is the same: a defined set of controls, an external reviewer, and a gap between where you are today and where you need to be.
The gap assessment problem is the same across frameworks. The tooling for it is worse in HIPAA than almost anywhere else — most teams do it in spreadsheets that go stale immediately and don't connect to the systems they're assessing. This tool does.
The Required vs. Addressable Distinction
The most important thing the tool surfaces is the distinction most IT managers get wrong: required versus addressable implementation specifications.
Required means the control must be implemented. No flexibility. If it's required and it's missing, that's a gap.
Addressable does not mean optional. It means the implementation is flexible based on your environment. If an addressable control is reasonable and appropriate for your organization, you implement it. If it isn't, you document why and implement an equivalent alternative. Either way, the control is satisfied. "Addressable" is not a pass.
This distinction matters because the scoring logic is completely different. A missing required control scores 0 and flags as Critical. A missing addressable control where you have documented justification for an alternative still scores as compliant. Most HIPAA gap assessments treat every control the same way, which produces a misleadingly bad picture for organizations that have made legitimate implementation choices.
The tool applies the correct scoring logic for each of the 42 controls across all 3 safeguard categories: administrative, physical, and technical. Required versus addressable is tracked per control, not as a single setting.
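The scoring rules above reduce to a small decision function. This is a minimal sketch, not the tool's actual code; the field names (`spec_type`, `implemented`, `alternative_documented`) and the severity labels are my assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Control:
    control_id: str
    spec_type: str                         # "required" or "addressable"
    implemented: bool
    alternative_documented: bool = False   # documented equivalent alternative

def score(control: Control) -> tuple[int, str]:
    """Apply the required/addressable scoring rules to one control."""
    if control.implemented:
        return (1, "Compliant")
    if control.spec_type == "required":
        # A missing required control is always a gap. No flexibility.
        return (0, "Critical")
    if control.alternative_documented:
        # Addressable + documented alternative still satisfies the control.
        return (1, "Compliant")
    # Addressable with no implementation and no justification is a gap too.
    return (0, "High")
```

The key branch is the third one: an unimplemented addressable control with a documented alternative scores as compliant, which is exactly what a flat pass/fail spreadsheet gets wrong.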
The BAA Tracker Problem
Business Associate Agreements are the part of HIPAA that most IT teams undercount.
The rule is simple: any vendor that creates, receives, maintains, or transmits protected health information on your behalf is a Business Associate, and you need a signed BAA before they touch that data. In practice, most companies have a short list of obvious BAs — the EHR vendor, the billing system — and a much longer list of tools that technically qualify but nobody has ever flagged.
The demo dataset in the tool runs as Meridian Health Tech, a 150-person SaaS Business Associate serving 3 hospital EHR clients. When I modeled their realistic vendor stack and ran the BAA assessment, 5 critical BAA gaps surfaced: Slack, Google Workspace, DataDog, Snowflake, and Retool.
None of those feel like PHI tools. But a chat platform holding support tickets (which may contain patient information), cloud infrastructure hosting anything that touches patient data, observability tooling ingesting application logs, a data warehouse running analytics, a Retool dashboard built on clinical data — all of these can qualify depending on your actual data flows. The BAA gap isn't theoretical. It's a breach notification obligation and an HHS audit finding if something goes wrong.
The tracker maintains a vendor inventory with BAA status, last review date, and data types shared. The risk classification (Critical, High, Medium, Low) is based on the sensitivity of data shared and whether a BAA is in place.
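The classification logic can be sketched as a function of the most sensitive data type shared and BAA status. The sensitivity tiers and thresholds here are illustrative assumptions, not the tracker's actual rules.

```python
# Hypothetical sensitivity tiers; the real tracker's taxonomy may differ.
SENSITIVITY = {"phi": 3, "phi_adjacent": 2, "internal": 1, "public": 0}

def classify_vendor(data_types: list[str], baa_signed: bool) -> str:
    """Classify a vendor by the most sensitive data shared and BAA status."""
    worst = max((SENSITIVITY.get(d, 0) for d in data_types), default=0)
    if worst >= 3:                       # PHI flows to this vendor
        return "Low" if baa_signed else "Critical"
    if worst == 2:                       # patient-adjacent data
        return "Medium" if baa_signed else "High"
    return "Low"                         # no PHI exposure, BAA not required
```

Under this sketch, Slack with patient-bearing support tickets and no BAA lands in Critical, which matches the Meridian demo findings.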
The SOC2 Crosswalk
If you've already done SOC2 work, you're not starting HIPAA from scratch.
The tool includes a crosswalk between the HIPAA Security Rule controls and the SOC2 Trust Services Criteria. For a company with a SOC2 Type I in place, roughly 46% of HIPAA controls are at least partially satisfied by existing SOC2 evidence. Logical access controls, encryption requirements, incident response procedures, vendor risk management — these map reasonably well across frameworks.
The crosswalk shows you two things: which controls you can close quickly by pointing at existing SOC2 evidence, and which controls are HIPAA-specific with no SOC2 analog. The HIPAA-unique controls tend to cluster in the administrative safeguards and the BAA requirements. Those are the ones that require new work regardless of your SOC2 state.
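The crosswalk computation itself is a partition over a mapping table. The entries below are a hypothetical fragment for illustration (the real crosswalk covers all 42 controls), though the TSC criteria named are real SOC2 criteria.

```python
# HIPAA Security Rule control -> SOC2 Trust Services Criteria it partially
# maps to. Illustrative fragment; an empty list means no SOC2 analog.
CROSSWALK = {
    "164.312(a)(1)":     ["CC6.1", "CC6.2"],  # access control -> logical access
    "164.312(e)(2)(ii)": ["CC6.7"],           # transmission encryption
    "164.308(a)(6)":     ["CC7.4", "CC7.5"],  # incident response
    "164.308(b)(1)":     [],                  # BAA requirement: HIPAA-unique
}

def partition(crosswalk: dict[str, list[str]]) -> tuple[list[str], list[str]]:
    """Split controls into SOC2-covered vs HIPAA-unique."""
    covered = [c for c, tsc in crosswalk.items() if tsc]
    unique = [c for c, tsc in crosswalk.items() if not tsc]
    return covered, unique
```

The `unique` list is the real output of interest: it's the work that remains no matter how clean your SOC2 evidence is.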
What the Remediation Roadmap Looks Like
After the gap assessment runs, the tool generates a phased remediation roadmap via Claude. The roadmap isn't a generic list of HIPAA best practices — it's generated from your specific gap state, prioritizing required controls over addressable ones and critical findings over high and medium.
The output is a 3-phase action plan exportable as a Jira-importable CSV. Phase 1 is immediate: the Critical findings that represent active risk or definite audit failures. Phase 2 is the 30-60 day work: High findings that require process changes or policy documentation. Phase 3 is longer-lead: the Medium findings and the addressable controls where you need to evaluate and document your implementation choices.
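The Jira-importable export is straightforward with the standard library. This is a hedged sketch of what that export could look like; the column names follow Jira's standard CSV import fields, and the finding structure and priority mapping are my assumptions.

```python
import csv
import io

# Hypothetical mapping from remediation phase to Jira priority.
PHASE_PRIORITY = {1: "Highest", 2: "High", 3: "Medium"}

def roadmap_to_csv(findings: list[dict]) -> str:
    """Render phased findings as a CSV that Jira's importer can consume."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["Summary", "Description", "Priority", "Labels"]
    )
    writer.writeheader()
    for f in sorted(findings, key=lambda f: f["phase"]):
        writer.writerow({
            "Summary": f"[HIPAA P{f['phase']}] {f['control']}: {f['title']}",
            "Description": f["detail"],
            "Priority": PHASE_PRIORITY[f["phase"]],
            "Labels": "hipaa-remediation",
        })
    return buf.getvalue()
```

One ticket per finding, sorted by phase, keeps the import clean and lets the priority field drive triage on the Jira side.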
The roadmap generates in under 5 seconds. It's a starting point for the remediation conversation, not a substitute for legal review of your actual policies.
The tool is built for IT managers at health-adjacent companies who own the HIPAA responsibility without a dedicated compliance team. Demo mode runs with no credentials required and uses the Meridian Health Tech dataset to show what a realistic assessment looks like.
The live demo is at hipaa-readiness.streamlit.app. The code is on GitHub.