GLBA Safeguards Rule for Higher Ed: A Field Guide for CISOs

Field guide · June 9, 2025

By Harry Hoffman, Synaptic Cybersecurity Alliance

The GLBA Safeguards Rule used to be the compliance regime most Title IV institutions could leave on a shelf. It was nominally in force — every college and university that handles federal financial aid is technically subject to it — but the FTC didn't audit, the Department of Education didn't enforce, and most higher-ed CISOs prioritized the regulations with active examiners over the one whose regulator didn't seem to be watching.

That changed in June 2023.

The FTC's amendments to the Safeguards Rule went into effect, and what had been a vague obligation to "have a security program" became a concrete checklist with named requirements, qualified individuals, written documentation, board-reporting cadence, and explicit consequences for non-compliance. The Department of Education started referencing the rule in its single-audit guidance. The first round of institutional audit findings has started landing.

If you're a CISO at a college or university that participates in Title IV — and almost every accredited institution does — you are subject to this rule, and you're being measured against it. This is a practitioner's guide to what's actually in the rule, what's distinct about applying it in higher-education environments, and what an audit-ready Information Security Program looks like.

What changed in 2023

The original Safeguards Rule had been in place since 2003 with relatively loose requirements. The 2023 amendments tightened nine specific elements that a qualifying Information Security Program must include. Three of those changes are particularly load-bearing for higher-ed.

The first is the requirement for a qualified individual — a named person, not a committee or a vendor, with the authority and competence to lead the program. For most institutions that role is the CISO. For institutions without one, I read the rule as implicitly telling you to create the role. The FTC has not been quiet about this in enforcement; "a committee oversees security" does not satisfy.

The second is the requirement for a written risk assessment that is comprehensive, current, and tied to specific safeguards. The "written" part is doing real work in audits. Verbal understanding does not count. Neither does a slide deck that was presented once and never formally adopted. The rule wants documentation that can be handed to an auditor and stand on its own.

The third is board-level reporting at least annually. The qualified individual has to present the state of the program to the institution's board of trustees or equivalent governing body. The Safeguards Rule does not dictate format, but the FTC has been clear in enforcement actions that "a brief mention in the CIO's report" does not satisfy.

The remaining elements — encryption requirements, MFA, change management, monitoring, employee training, secure development, incident response, and service-provider oversight — were already best practice for any serious institution. What changed is that they are now enumerated obligations with audit consequences attached.

Why higher education is uniquely exposed

The Safeguards Rule was written with financial institutions in mind. Banks. Insurance companies. Mortgage brokers. When the FTC extended it to Title IV institutions, the rule didn't add a higher-education-aware clause to recognize how universities actually work.

That creates three specific exposures:

The first exposure is structural. Most colleges and universities operate with significant decentralization across schools, departments, and research units. Each may have its own IT staff, its own purchased systems, and its own data handling. The Safeguards Rule expects a single qualified individual to have visibility into all of it, and that visibility is harder to achieve in higher-ed than in a centralized corporate environment. Decentralized IT is the design, not the defect — but the rule does not particularly care about that distinction.

The second is data sprawl. The Safeguards Rule applies to "customer information," which for Title IV institutions includes anyone who has applied for or received federal financial aid. That is broader than just current students. It includes prospective students who applied and never enrolled, former students whose data still sits in archived systems, and graduate workers whose research stipends touched financial-aid mechanisms. The data flows are not always obvious, and the data lives in more systems than the financial aid office knows about.

The third is cultural. A Safeguards Rule-aligned program requires consistent access controls, monitoring, and incident response. Higher-education research environments rarely operate under consistent anything. The audit does not accept "researchers do their own thing" as a control posture — but institutional culture often does, and that gap between what the rule expects and what universities actually look like is where most of the audit findings will land.

The nine elements, applied to higher-ed

Here's what each of the nine required elements actually looks like when you build it for a college or university.

1. Designate a qualified individual

The CISO, or equivalent. If your institution doesn't have one, this element is implicitly the FTC telling you to create the role. Title and reporting structure don't matter as much as authority — the qualified individual needs to be able to require remediation across the institution, not just recommend it.

2. Conduct a comprehensive risk assessment

The rule wants this in writing, updated at defined intervals (annually is the practical standard), identifying foreseeable internal and external risks to customer-information confidentiality, integrity, and availability, and tying each risk to a specific safeguard. The point is not that the assessment is exhaustive. The point is that it is current, written down, formally adopted, and traceable to the controls in place.

For higher-ed, the assessment has to span the decentralized landscape — including research environments where customer information might appear (financial-aid-touched grad student stipends, for example) and the third-party SaaS that processes the data on the institution's behalf.

3. Design and implement safeguards

The actual controls. Access controls, encryption, secure development, intrusion detection, response planning. The 2023 amendments specify some explicitly:

  • MFA for all individuals accessing any information system. Note: any information system. Not just student records — every system, including research enclaves and administrative tools.
  • Encryption at rest and in transit for customer information, including in backups.
  • Secure disposal of customer information no later than two years after the last legitimate business use, with narrowly defined exceptions.
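The two-year disposal clock is easy to state and hard to operationalize. As a rough sketch of what tracking it could look like — assuming you can export an inventory of records with a last-legitimate-use date; the system names, record IDs, and helper function here are illustrative, not any specific product's API:

```python
from datetime import date, timedelta

# Disposal window from the rule's "no later than two years after the
# last legitimate business use" language.
RETENTION_WINDOW = timedelta(days=365 * 2)

# Hypothetical record inventory: (system, record_id, last_legitimate_use).
records = [
    ("banner_archive", "rec-001", date(2022, 3, 15)),
    ("aid_office_share", "rec-002", date(2024, 11, 2)),
]

def disposal_due(records, today=None):
    """Return records past the two-year disposal window."""
    today = today or date.today()
    return [
        (system, rid, last_use)
        for system, rid, last_use in records
        if today - last_use > RETENTION_WINDOW
    ]

for system, rid, last_use in disposal_due(records, today=date(2025, 6, 9)):
    print(f"{system}/{rid}: last use {last_use}, past disposal window")
```

The hard part isn't the arithmetic; it's producing the inventory and defining "last legitimate business use" per system. But a report like this is exactly the kind of written evidence an auditor will ask for.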

4. Regularly test or monitor the effectiveness of safeguards

Continuous monitoring is the default expectation. Where continuous monitoring isn't operationally feasible, the rule allows for "periodic penetration testing and vulnerability assessments" — at least annually for pen testing and every six months for vulnerability assessments.

Higher-ed-specific note: this is where institutions without a real security operations function get exposed. "We run vulnerability scans" doesn't satisfy if no one is actually responding to findings.

5. Implement security awareness training

For employees. Tailored to specific roles. With more advanced training for staff who handle customer information directly — financial aid, registrar, bursar, advancement, IT operations, and any administrators with system-level access.

6. Oversee service providers

The Safeguards Rule expects the institution to periodically assess the security posture of service providers handling customer information. Periodically, not once at procurement. For higher-ed institutions running 100+ SaaS vendors, this is where most programs are weakest. Tools like Azimuth exist specifically to operationalize HECVAT-based vendor risk assessment at the scale higher-ed actually has.
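"Periodic" only becomes real when something flags the vendors whose last review has aged out. A minimal sketch of that check — the vendor register, its field names, and the 12-month interval are assumptions for illustration, not a prescribed format:

```python
from datetime import date, timedelta

# "Within the past 12 months" as a review interval; adjust to your program.
REVIEW_INTERVAL = timedelta(days=365)

# Hypothetical vendor register; names and fields are illustrative.
vendors = [
    {"name": "LMS Provider", "last_review": date(2024, 1, 10)},
    {"name": "Aid Disbursement SaaS", "last_review": date(2025, 2, 1)},
]

def overdue_reviews(vendors, today):
    """Return vendors whose last security review is older than the interval."""
    return [v["name"] for v in vendors
            if today - v["last_review"] > REVIEW_INTERVAL]

print(overdue_reviews(vendors, today=date(2025, 6, 9)))
```

Whether this lives in a GRC platform, a vendor-risk tool, or a scheduled script against a spreadsheet export matters less than the fact that someone owns the output.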

7. Keep the program current

The written program is updated at least annually to reflect changes in operations, technology, or threat landscape. Updates are documented. "We've had the same Information Security Policy since 2017" is not a defense.

8. Establish a written incident response plan

Documented. Tested. Includes specific roles, escalation paths, regulatory notification timelines, evidence-preservation procedures. The FTC has been very clear that an untested plan is the same as no plan — which is why we treat tabletop exercises as program-required, not optional.

9. Report to the board

At least annually. Covers: overall status of the program, material risk assessment results, results of testing the program's safeguards, security events and management responses, recommendations for changes.

Board-level reporting is where many higher-ed programs are formally weakest. A CISO who reports to a CIO who occasionally mentions security in the CIO's report to the board does not satisfy element 9. The qualified individual needs direct access to the board, or at minimum to the board committee with oversight of risk.

Common gaps I see in higher-ed

The pattern at most of the institutions I've talked with isn't that they are missing the substance. They are doing risk assessment work. They have deployed MFA broadly. They have an incident response plan. They review vendors at procurement. They report to the board. The audit failure happens because almost none of it is fully documented in the way the rule requires:

  • Risk assessment lives in someone's notebook, or in a PowerPoint that was presented once and never formally adopted.
  • MFA coverage looks good on paper, but the exceptions — legacy administrative tools, contractor accounts, service accounts, research lab systems — are silently accepted and never re-examined.
  • Vendor reviews happen at procurement but are not periodic, and the workflow to make them periodic at SaaS-portfolio scale does not exist.
  • Incident response plans are written but have never been exercised.
  • Board reporting happens, but as a sentence inside the CIO's broader report rather than as a substantive presentation from the qualified individual to the board itself.

The other failure pattern shows up around disposal. The Safeguards Rule's expectation that customer information be securely disposed of no later than two years after the last legitimate business use is not widely known and is almost never operationalized. No documented schedule, no verification of secure deletion in backup tapes or archived systems. It is the kind of finding that does not catastrophically break a program, but it does signal to an auditor that the program is not fully thought through.

A 90-day audit prep checklist

If you have an audit coming and you're not sure where to start:

Weeks 1–2: Inventory and assess.

  • Identify your qualified individual (and confirm with the cabinet or president that this designation is formal).
  • Inventory all systems that touch customer information. Don't assume — verify with data-flow walks across financial aid, registrar, bursar, advancement, and research administration.
  • Document the institutional risk assessment in writing. Even if it's compressed, written counts.

Weeks 3–6: Close the highest-impact gaps.

  • Audit MFA coverage. Find the exceptions. Either remediate or document a formal risk acceptance with a remediation timeline.
  • Audit your service-provider list. Confirm each has had a security review (HECVAT or equivalent) within the past 12 months. Schedule re-reviews for anything past that mark.
  • Confirm the incident response plan is current and accurately reflects today's organizational structure. Schedule a tabletop exercise within the next 60 days.
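"Either remediate or document a formal risk acceptance with a remediation timeline" implies a record with specific fields. One possible shape for that record — the field names and the example exception are hypothetical, not a template the rule prescribes:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskAcceptance:
    """Formal risk acceptance for an MFA exception (illustrative fields)."""
    system: str
    reason: str
    accepted_by: str
    accepted_on: date
    remediation_due: date

    def is_overdue(self, today):
        """True once the remediation deadline has passed without closure."""
        return today > self.remediation_due

# Hypothetical exception: a legacy tool whose client can't do MFA yet.
exception = RiskAcceptance(
    system="legacy-bursar-batch",
    reason="Vendor client lacks MFA support",
    accepted_by="Qualified Individual",
    accepted_on=date(2025, 6, 1),
    remediation_due=date(2025, 9, 1),
)
print(exception.is_overdue(date(2025, 6, 9)))  # still within the remediation timeline
```

The format can be a ticket, a register row, or a signed memo; what an auditor looks for is that each exception is named, owned, dated, and has a deadline someone is tracking.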

Weeks 7–10: Document and present.

  • Update the written Information Security Program to reflect current state and current risks. Get it adopted by the qualified individual and reviewed by the appropriate authority.
  • Prepare the board-level annual report. Cover the nine elements. Include risk assessment results and testing summary.
  • Schedule the board presentation if one isn't already on the calendar.

Weeks 11–13: Test the evidence package.

  • Walk through the program documentation as if you were the auditor. Can each required element be supported with written evidence within five minutes? If not, find the gap.
  • Run the tabletop. Capture the after-action notes as part of the program evidence.
  • Schedule the next quarterly review of vendor risk and the next annual risk assessment refresh, then make sure both end up in your calendar with named owners.

When to start

If you're a Title IV institution, the right time to have started building a qualifying Safeguards Rule program was 2022. The right time to have it audit-ready is now. The amendments are in force, audits are happening, and the institutions getting hit hardest in early findings are the ones that assumed the rule would not apply or would not be enforced.

The harder question isn't whether the rule applies to you. It does. The question is whether you can demonstrate, before the auditor asks, that the program behind your answer is real. The nine elements are a structural checklist. Passing them on paper is the easy part. The harder work — and the more important one — is whether the program you have built actually deserves to pass. If Compliance Readiness is somewhere we can help, we are here for it. The model is the same as everything else we do: we do not take over the program; we help you build it up so it can stand on its own when the FTC, the Department of Education, or your state AG comes asking.


This field guide is a practitioner's interpretation of the Safeguards Rule as amended in June 2023. It is not legal advice. Consult your institution's general counsel before relying on any of the above for compliance decisions specific to your environment.

Tags

  • GLBA
  • Safeguards Rule
  • FTC
  • Title IV
  • Compliance
  • Higher Education