
Building a security program for edtech when your customers are the auditors


How a small software company navigates higher ed security reviews, HECVAT questionnaires, and the fact that every customer has their own definition of “good enough.”


If you sell software to higher education, you already know that the procurement process involves a security review. What you might not appreciate until you’re deep into it is just how different that review looks depending on who’s asking.

At Redrock Software, we build TracCloud, a tutoring and student success management platform used by colleges and universities. Our team fits in a conference room, and in any given month we might field a HECVAT questionnaire from a state university system, a custom 200-question security assessment from a community college’s IT department, a TX-RAMP authorization request from a Texas institution, and an email from a small private college asking if we have a privacy policy. All of these are legitimate customer security evaluations. All of them require different levels of detail in response, but all of them need to be backed by the same underlying program.

The thing that makes EdTech unusual as a market, and the thing that shapes nearly every decision we make about our security program’s structure, is that our customers aren’t only buying software. They’re taking on risk. Student data is protected by FERPA at the federal level, by student-specific privacy statutes across dozens of states, and increasingly by state-level cybersecurity authorization programs that treat cloud vendors the way federal agencies treat FedRAMP candidates. The institution’s IT security team is responsible for evaluating whether we’re a safe bet. That makes them, functionally, our auditors.


The Spectrum of Sophistication

Not every institution approaches vendor security the same way, and pretending otherwise will get you in trouble fast.

On one end, you have large state university systems with a dedicated CISO, a full information security team, and a vendor risk management program that mirrors federal practices. These folks know what a System Security Plan looks like. They’ll ask for your POA&M. They want to see your vulnerability scan remediation timelines, your incident response testing methodology, your data flow diagrams. They might require authorization through a state program like TX-RAMP before they can even sign a contract. The conversation feels like a federal audit because, for them, it essentially is one.

On the other end, you have smaller institutions where the “IT security team” is one person who also manages the LMS, runs the help desk, and troubleshoots the projectors in the lecture halls. They have a lot on their plate and are likely underpaid. Their security review might be a handful of questions about encryption, data hosting location, and whether you have a SOC 2 report they can file away. They don’t have less concern about security. They have less bandwidth to evaluate it in depth, and they’re relying more heavily on trust and on the artifacts you can hand them.

Between those two poles is everything else. Community colleges that have adopted HECVAT as their standard questionnaire. Regional universities with an emerging security program that’s still figuring out what to ask for. Multi-campus systems where the central IT office runs procurement but individual campuses have their own security requirements. Canadian institutions subject to PIPEDA. UK institutions operating under UK GDPR and the Age Appropriate Design Code.

You can’t build a separate security program for each of these customers, but you can build one program that’s structured well enough to serve all of them credibly. That’s what we’ve spent the last few years doing.


One Program, Many Audiences

Our security program is built on NIST SP 800-53 Rev 5 as the baseline. We chose 800-53 for a practical reason: it’s comprehensive enough that nearly every other framework maps into it. When a customer asks about our alignment with CIS Controls v8.1, SOC 2 Trust Services Criteria, CMMC 2.0, CSA CAIQ 4.1, HECVAT 4.1, or NIST CSF 2.0, we can answer from the same set of policies and evidence. We’re maintaining one program with multiple crosswalks instead of multiple programs.

Every NIST control family gets at least two documents: a policy (the what and why) and a procedure (the how, with specific system names, steps, and responsible parties). AC01 covers access control policy; AC02 covers access control procedures. This pattern repeats across all twenty families. Every document follows an identical structure: introduction, purpose, scope, related documents, roles and responsibilities, management commitment, authority, compliance, exceptions, policy review and maintenance, family-specific requirements, and a compliance crosswalk table that maps our controls to each framework we support.
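
To make the idea concrete, here is a minimal sketch of how that document structure could be represented and checked in code. The section list comes straight from our template; the document IDs are real naming conventions, but the crosswalk mappings shown are illustrative placeholders, not our actual control mappings.

```python
from dataclasses import dataclass, field

# Required sections every policy document follows (from our template).
REQUIRED_SECTIONS = [
    "introduction", "purpose", "scope", "related documents",
    "roles and responsibilities", "management commitment", "authority",
    "compliance", "exceptions", "policy review and maintenance",
    "family-specific requirements", "compliance crosswalk",
]

@dataclass
class PolicyDoc:
    doc_id: str                      # e.g. "AC01" (policy) or "AC02" (procedure)
    family: str                      # NIST 800-53 control family, e.g. "AC"
    sections: list = field(default_factory=list)
    # crosswalk: framework name -> mapped control IDs (illustrative here)
    crosswalk: dict = field(default_factory=dict)

    def missing_sections(self) -> list:
        """Sections the template requires that this document lacks."""
        have = {s.lower() for s in self.sections}
        return [s for s in REQUIRED_SECTIONS if s not in have]

ac_policy = PolicyDoc(
    doc_id="AC01",
    family="AC",
    sections=REQUIRED_SECTIONS,          # a complete document
    crosswalk={"CIS Controls v8.1": ["5.1", "6.1", "6.8"]},
)
assert ac_policy.missing_sections() == []
```

A check like this is what keeps twenty families of documents structurally identical as they evolve: a document that drops a required section fails review before it ever reaches a customer.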

When a university security team asks “how do you handle access control?”, we can point them to a policy that covers account management, access enforcement, information flow enforcement, separation of duties, least privilege, unsuccessful login attempts, session management, remote access, wireless access, and mobile device controls. When they ask “how does that map to CIS Controls?”, the crosswalk is built into the document. When they ask “who’s responsible?”, the roles and responsibilities section names specific positions.

Because the documentation is structurally consistent, questionnaire responses come fast. A HECVAT question about access control maps to the same policy that answers a TX-RAMP requirement, which in turn maps to the same evidence that satisfies an independent auditor. We write the documentation once, and it serves every audience.


Infrastructure and Multi-Tenancy

TracCloud runs on AWS as a multi-tenant SaaS application. We chose AWS for a practical reason that ties directly back to the customer trust story: the depth of its security tooling lets us instrument our infrastructure for continuous compliance visibility, which means we can show institutional reviewers evidence of ongoing security posture rather than just point-in-time snapshots.

That continuous visibility is the backbone of our vulnerability management approach. AWS-native services scan our environment for host-level vulnerabilities and compliance drift on an ongoing basis. Software bills of materials are exported daily. Static code analysis runs on every code change before it reaches production. Weekly targeted scans cover the web application layer and network-level attack surface. Application-level error monitoring runs continuously. The practical result is that we’re never waiting for a monthly scan to discover problems — and when an institutional security team asks about our vulnerability management cadence, the answer isn’t a schedule, it’s a pipeline. An independent third-party assessor conducts a full penetration test annually on top of all of it, because continuous internal scanning and external validation serve different purposes.

Authentication in TracCloud is delegated to the institution’s own identity provider. We support SAML SSO, CAS SSO, and LDAP/Active Directory as customer-configurable authentication methods. This means Redrock doesn’t manage student credentials directly: the institution’s IdP remains the authoritative source for user identity, and we simply consume the authentication assertion. Where an institution hasn’t configured SSO, local password authentication enforces complexity requirements, and all locally stored passwords are hashed using industry-standard cryptographic algorithms. Plaintext passwords are never stored or logged.
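
The article doesn’t specify which hashing algorithm backs the local-password fallback, so treat the following as a generic illustration of what “industry-standard, salted, never stored in plaintext” looks like, here using PBKDF2 from Python’s standard library; the function names and iteration count are illustrative choices, not a description of TracCloud’s actual implementation.

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 600_000) -> dict:
    """Derive a salted hash for storage; the plaintext is never persisted."""
    salt = os.urandom(16)  # unique random salt per account
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return {"salt": salt, "iterations": iterations, "digest": digest}

def verify_password(password: str, record: dict) -> bool:
    """Re-derive the hash from the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), record["salt"], record["iterations"]
    )
    return hmac.compare_digest(candidate, record["digest"])

record = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", record)
assert not verify_password("wrong password", record)
```

The design point is the asymmetry: a stolen record reveals only a salt and a slow-to-brute-force digest, which is why delegating most authentication to the institution’s IdP and hashing the rest keeps the vendor from becoming a credential-theft target.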

This authentication model is significant from a risk perspective. The institution retains control of its identity infrastructure, and we don’t become a high-value target for credential theft. It also means that when an institution has strong authentication policies (MFA, conditional access, device compliance), those protections extend to TracCloud automatically.


FERPA as the Constant Backdrop

Every conversation about EdTech security happens with FERPA in the background. The Family Educational Rights and Privacy Act governs how student education records can be used and disclosed, and as a vendor processing student data on behalf of an institution, we operate under the “school official” exception. The institution designates us as a school official in their FERPA notification, and we agree to use student data only for the purposes specified in our contract.

Our privacy policy states it plainly: we don’t collect any data for the purposes of advertising or selling. No hidden clauses, no convoluted terms of service. We don’t use student data for marketing, AI training, or any secondary purposes. Student records are subject to FERPA (20 U.S.C. Section 1232g), and we protect them according to industry-standard administrative, physical, and technical safeguards. All sensitive data stored within our systems uses AES-256 encryption, and all data transfers are safeguarded in transit using TLS.

FERPA is just the floor, though. When you count federal statutes, state student data privacy laws, state consumer privacy laws, and international data protection regulations applicable to our customer base, we track compliance obligations across more than 80 distinct legal requirements. California has SOPIPA and the CCPA/CPRA. New York has Education Law 2-d with its own implementing regulations. Texas has both the TDPSA for general consumer privacy and separate student data protections. Colorado, Connecticut, Virginia, Delaware, Indiana, Iowa, Kentucky, Maryland, Minnesota, Montana, Nebraska, New Hampshire, New Jersey, Oregon, Tennessee, Utah, and more all have their own statutes. Then there’s COPPA for minors’ data, HIPAA where educational records intersect with health data, and international requirements like Canada’s PIPEDA, Quebec’s Law 25, and the UK’s GDPR and Data Protection Act 2018.

We track all of this in a single reference document. It catalogs every applicable law, regulation, and compliance obligation, organized by jurisdiction, with citations, effective dates, scope, enforcement mechanisms, and specific compliance requirements. Every policy in our library points back to this one reference instead of trying to list applicable laws inline. When a new state passes a student data privacy law (which happens regularly), we update one document and confirm coverage rather than overhauling twenty policies.
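
A sketch of what that single-reference approach looks like as data: one catalog keyed by jurisdiction, with each entry carrying its citation, scope, and effective date. The entries below are illustrative (the real document is far larger), and the catalog shape and function names are hypothetical, but the workflow it enables is the one described above: a new statute is one new entry, not twenty policy edits.

```python
# Illustrative sketch of the applicable-laws catalog; entries are examples,
# not the full inventory.
APPLICABLE_LAWS = {
    "US-Federal/FERPA": {
        "citation": "20 U.S.C. Section 1232g",
        "scope": "student education records",
        "effective": 1974,
    },
    "US-NY/EdLaw-2d": {
        "citation": "N.Y. Educ. Law Section 2-d",
        "scope": "student data held by educational agencies and vendors",
        "effective": 2014,
    },
}

def laws_for(jurisdiction_prefix: str) -> list:
    """All catalog keys under a jurisdiction prefix, e.g. 'US-NY'."""
    return [k for k in APPLICABLE_LAWS if k.startswith(jurisdiction_prefix)]

# When a new state statute passes, coverage confirmation starts with one entry.
APPLICABLE_LAWS["US-TX/TDPSA"] = {
    "citation": "Texas Data Privacy and Security Act",
    "scope": "consumer personal data",
    "effective": 2024,
}
assert laws_for("US-TX") == ["US-TX/TDPSA"]
```

Every policy then references catalog keys rather than restating statutes inline, which is what keeps a change in one jurisdiction from rippling through the whole library.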

Institutions care about this because they’re on the hook for it. If we mishandle student data, the institution faces the regulatory consequences. That’s why their security teams take the evaluation seriously, and why we need to take our compliance program seriously in return.


The Data Processing Agreement as a Trust Document

For institutions that need a formal data processing agreement, we have one. Our DPA is incorporated into the Master Services Agreement and lays out the obligations of both parties with respect to personal data processing. It defines the institution as the Controller and Redrock as the Processor, which is the correct framing under both GDPR-style regulations and most state privacy laws.

The DPA explicitly names the applicable data protection laws: FERPA, GDPR, UK GDPR, CCPA/CPRA, New York Education Law 2-d, COPPA, PIPEDA, Quebec Law 25, and any other applicable federal, state, provincial, or international data protection law. It defines personal data broadly to include Education Records, Student Data, PII, and any other information defined as personal data under applicable law. It defines what constitutes a security incident or data breach: any accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data.

For a lot of institutions, the DPA is the document that matters most. It’s the contractual guarantee that data will be handled the way we say it will be. Having one that’s comprehensive, that names the right laws, and that aligns with the same frameworks our internal policies are built on makes a real difference in how quickly procurement moves.


Responding to Questionnaires

If you sell to higher ed, you will fill out questionnaires. Lots of them. More of them than you thought was possible. The HECVAT (Higher Education Community Vendor Assessment Toolkit) is the most common standardized one (finally), but many institutions have their own custom assessments that sometimes run to hundreds of questions.

The key to doing this efficiently is having your answers pre-written and backed by real documentation. When a HECVAT question asks about your vulnerability management program, you need to be able to say: we run continuous automated scanning of our infrastructure and application layer, with weekly targeted vulnerability scans and an independent third-party penetration test conducted annually. Critical findings are remediated within 24 hours, high-severity within 7 days, medium within 30 days, and low within 60 days. And you need that to be 100% true, documented in your risk assessment policy, and verifiable against your scan history.
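
Those remediation timelines only mean something if they’re enforced against real scan history. As a minimal sketch of how a vendor might check its own findings against the stated SLAs (the function and finding names here are hypothetical; the severity-to-deadline table is the one quoted above):

```python
from datetime import datetime, timedelta

# Remediation SLAs as stated in the risk assessment policy.
REMEDIATION_SLA = {
    "critical": timedelta(hours=24),
    "high": timedelta(days=7),
    "medium": timedelta(days=30),
    "low": timedelta(days=60),
}

def remediation_deadline(severity: str, found_at: datetime) -> datetime:
    """Deadline by which a finding of this severity must be remediated."""
    return found_at + REMEDIATION_SLA[severity.lower()]

def overdue(findings: list, now: datetime) -> list:
    """IDs of open findings past their SLA; each finding is (id, severity, found_at)."""
    return [
        fid for fid, severity, found_at in findings
        if remediation_deadline(severity, found_at) < now
    ]

now = datetime(2025, 6, 1)
open_findings = [
    ("FIND-001", "critical", datetime(2025, 5, 30)),  # past the 24-hour SLA
    ("FIND-002", "low", datetime(2025, 5, 1)),        # still inside the 60-day SLA
]
assert overdue(open_findings, now) == ["FIND-001"]
```

Being able to run a check like this against your scan history is exactly what makes the questionnaire answer “100% true and verifiable” rather than aspirational.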

When a custom questionnaire asks about incident response, you need to be able to point to an actual incident response plan that covers preparation, detection and analysis, containment, eradication, and recovery. Ours specifies that we test IR capability at least annually using NIST SP 800-61, that personnel with IR responsibilities receive role-based training within six weeks of taking on those duties, and that incident reports go to the Authorizing Official in accordance with US-CERT reporting timelines. We report suspected incidents and breaches through defined channels, and our information spillage response procedures include isolating affected systems, notifying the institution, and conducting annual spillage response training.

The details matter because the people reading these answers often know what good looks like. A university CISO who has been through a FedRAMP evaluation can tell the difference between an answer copied from a template and one that reflects an actual operational program. Specific system names, specific remediation timelines, specific role assignments: these are the signals that tell a reviewer your program is real.


Working with Institutional Security Teams

The relationship side of this deserves more attention than it typically gets. Responding to a questionnaire is one thing. Working productively with an institutional security team over the life of a contract is something else entirely.

Some institutions want a virtual walkthrough of your security program before they sign. Some want to see specific artifacts: your SSP summary, your most recent vulnerability scan report, your incident response test results. Some want an annual re-assessment. Some states, like Texas through TX-RAMP, require ongoing authorization that involves periodic evidence submission and recertification.

Being a small company works in your favor here rather than against you. When a university’s security team wants to talk to the person who manages your infrastructure, or the person who writes your policies, or the person who handles incident response, they’re talking to someone on our team directly, and it’s usually me. There’s no ticket routed through a security operations center to a GRC analyst who then schedules a call with a subject matter expert two weeks out. The person who picks up the phone is the person who built the system and can answer the question.

Our Information Security Roles and Responsibilities document makes this transparent. It lists every information system we operate, from our internal Active Directory domain to our cloud IAM environment to TracCloud’s database instances and file transfer systems, and names the specific individuals who hold each security role for each system. When an institutional reviewer asks “who is your Information Security Officer for TracCloud?”, we have a documented answer. When they ask who the Authorizing Official is, or who serves as the System Administrator for the backup systems, those answers are documented too.

On a small team, the same person often holds multiple roles. One person serves as Information Owner, Authorizing Official Designated Representative, Information Security Manager, IT Manager, and ISSO. The CEO is simultaneously the President and Authorizing Official. We document all of this explicitly because NIST requires that roles be defined and assignments be documented, not that each role be a separate person. Institutional reviewers understand this. They’ve seen it at their own smaller departments. What they care about is that you’ve thought through who’s responsible for what and put it on paper.


When an Incident Affects Students

The stakes in EdTech are different from enterprise B2B SaaS. A security incident doesn’t just affect the company. It affects students, real people, many of them minors or young adults, whose educational records, personal identifiers, and sometimes health information are in the system.

This shapes how we think about incident response in a way that goes beyond regulatory compliance. Our incident response policy requires reporting suspected incidents to the organizational IR capability within US-CERT timelines, coordinating handling activities with contingency planning, and incorporating lessons learned from every incident into updated procedures, training, and testing. The institutional notification piece is equally important. When student data is involved, the institution needs to know fast, because they’re the ones who have to notify students and potentially state regulators under the applicable breach notification statute.

Our DPA defines a security incident as any accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data transmitted, stored, or otherwise processed. That’s a broad definition, intentionally so. We’d rather over-report and let the institution make the determination about whether notification is required under their state’s law than under-report and leave them exposed.

Every state has its own breach notification timeline and requirements. Some require notification within 30 days, some within 60, some within 72 hours. The institution is the entity with the notification obligation, but they can only meet it if we inform them promptly. That’s a trust relationship, and it only works if the vendor takes it seriously.


The Regulatory Patchwork

Managing compliance across this many jurisdictions is worth dwelling on for a moment, because it’s one of the things that makes EdTech compliance genuinely complex.

These aren’t copies of the same law. Each jurisdiction has made its own choices about scope, definitions, consent requirements, data deletion timelines, and enforcement mechanisms. Some apply only to K-12, some extend to higher education. Some require signed data processing agreements, some don’t. Some have private rights of action, some rely on attorney general enforcement. Some require data protection assessments, some don’t.

We deal with this by writing our controls to the strictest applicable standard. If one jurisdiction requires data deletion within 30 days of contract termination and another allows 60, we build our process around 30. If one requires explicit consent for processing and another allows implied consent, we default to explicit. This means that adding coverage for a new law is usually a matter of confirming that we already meet or exceed the requirements rather than redesigning controls.
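
The consolidation rule in that paragraph reduces to a minimum over jurisdictions. A tiny sketch, with jurisdiction names and day counts that are purely illustrative rather than a legal inventory:

```python
# "Write controls to the strictest applicable standard": for a deadline-style
# obligation, the operational target is the tightest value across all
# jurisdictions that apply. Names and numbers below are illustrative.
deletion_deadlines_days = {
    "Jurisdiction A": 30,
    "Jurisdiction B": 60,
    "Jurisdiction C": 45,
}

operational_target = min(deletion_deadlines_days.values())
assert operational_target == 30  # the 30-day requirement drives the process
```

Adding a new jurisdiction then means adding one entry and confirming the target didn’t tighten, which is the “confirm coverage rather than redesign controls” workflow described above.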

The international dimension adds another layer. Canadian institutions operating under PIPEDA have requirements around meaningful consent, breach notification with a “real risk of significant harm” standard, and data retention limitations. UK institutions subject to UK GDPR need us to have a lawful basis for processing, to respect data subject rights including access, rectification, erasure, and portability, and to handle international data transfers properly. Quebec’s Law 25 has its own requirements around privacy impact assessments and consent.

Our Applicable Laws document catalogs all of this in one place, organized by jurisdiction, so that when a question comes in about compliance with a specific regulation, we can point to it directly. It’s the most-referenced document in our entire library.


Small Team as an Advantage

There’s a perception in the market that small vendors are riskier from a security standpoint. Fewer people means fewer resources, means less mature security controls. Some institutional security teams come into the evaluation with this assumption.

In practice, we’ve found that size can work the other way. On our team, the person who writes the access control policy is the same person who enforces it. The person who conducts the risk assessment is the same person who remediates the findings. The person who builds the vulnerability scanning pipeline is the same person who triages the results. When our policy says we review audit logs weekly for unusual activity, that review actually happens, because the reviewer knows the system intimately and can spot anomalies that would take a larger organization’s analyst hours to triage.

We’ve also built custom security testing tooling tailored specifically to TracCloud’s architecture and attack surface. Rather than relying solely on off-the-shelf scanning, we maintain an internally developed continuous testing toolkit that targets the specific threat model relevant to a multi-tenant EdTech platform. That’s not something most vendors our size do, and it’s the kind of thing that’s only practical because the people building the security tooling are the same people who built the application.

Our assessment schedule reflects this. TracCloud and its associated systems undergo both internal and third-party assessment annually, using both automated and manual methods. We bring in an independent external penetration testing firm because you can’t grade your own work, and on a team this small, the conflict of interest is obvious. Between external assessments, the operational security work benefits from short feedback loops and direct accountability.

When an institutional security team asks to speak with our security leadership, they’re not getting a marketing person or a sales engineer with a cheat sheet. They’re getting the person who built the policies, manages the infrastructure, and can explain exactly how a specific control works in production. That directness counts for a lot. Several institutions have told us that the quality of the conversation during the security review was a factor in their procurement decision.

The honest flip side of the small-team advantage is the bus factor. When the same person who writes the policy also enforces it, also manages the infrastructure, and also handles incident response, you have short feedback loops and deep context, but you also have a dangerous concentration of institutional knowledge. If that person is unavailable, you have a problem that no amount of documentation can fully solve in the moment.

We think about this a lot, and we put deliberate effort into reducing the exposure. Every critical system has its access credentials documented in a secured vault that isn’t dependent on any single person’s availability. Procedures are written with enough specificity that someone who understands the domain but hasn’t operated that particular system can follow them. Runbooks for common operational tasks — backup restoration, incident triage, vulnerability remediation — exist precisely because the person who normally handles them might not be the person who handles them next time.

Cross-training is part of this too, though on a team our size it’s more accurate to call it “making sure more than one person has actually touched each system.” We do tabletop exercises where the person who usually handles a scenario sits out and someone else walks through it. We document architectural decisions and the reasoning behind them, not just the outcomes, because the why is what gets lost first when knowledge lives in one person’s head.

None of this fully eliminates the risk. A small team will always carry more key-person dependency than a large one. But there’s a difference between a small team that hasn’t thought about it and one that has actively worked to reduce it, and institutional reviewers can usually tell the difference when they look at your documentation and ask the right questions.


Building Crosswalks into Everything

If I could give one piece of advice to another small EdTech company starting to build a security program for the higher ed market, it would be this: build your framework crosswalks into the document structure from day one.

We map every policy to NIST SP 800-53 Rev 5, NIST CSF 2.0, CIS Controls v8.1, SOC 2 Trust Services Criteria, CMMC 2.0, CSA CAIQ 4.1, HECVAT 4.1, and TX-RAMP. This mapping lives in each policy document as a compliance crosswalk table, not in a separate spreadsheet that gets out of sync.
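
In practice the crosswalk gets queried in the reverse direction: a customer arrives with a framework control ID, and you need the internal policy that answers it. A hypothetical sketch of that inverted lookup; the policy IDs follow the naming convention described earlier, but the specific mappings are invented for illustration:

```python
# Per-policy crosswalk: internal policy ID -> {framework: mapped control IDs}.
# Mappings below are illustrative, not our actual crosswalk.
CROSSWALK = {
    "AC01": {"HECVAT 4.1": ["APPL-05"], "CIS Controls v8.1": ["5.1", "6.8"]},
    "IR01": {"HECVAT 4.1": ["INCD-01"], "CIS Controls v8.1": ["17.1"]},
}

def policies_answering(framework: str, control_id: str) -> list:
    """Internal policies whose crosswalk covers a given framework control."""
    return [
        policy_id for policy_id, mappings in CROSSWALK.items()
        if control_id in mappings.get(framework, [])
    ]

assert policies_answering("CIS Controls v8.1", "17.1") == ["IR01"]
```

Because the mapping lives inside each policy document, regenerating a structure like this from the documents themselves is mechanical, and a control ID with no answering policy surfaces as a coverage gap before a customer finds it.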

This matters because customers will ask about different frameworks depending on their institutional context. A Texas state institution cares about TX-RAMP. A research university handling DoD-funded projects cares about CMMC. A Canadian institution wants to know about PIPEDA alignment. A security team that just finished a CIS Controls assessment wants to map your controls to their maturity model. If you can answer all of these questions from the same documentation, you save yourself enormous rework and you demonstrate that your program is coherent rather than a collection of ad hoc responses.

Going back and retrofitting framework mappings onto existing documentation is miserable work. We know because we’ve done it. If you’re starting fresh, save yourself the trouble and structure for it from the beginning.


What Institutions Actually Want to See

After years of responding to security reviews, some patterns have emerged in what institutional security teams consistently care about. Not every institution asks about all of these, but every institution asks about at least some of them:

Data ownership and portability. Who owns the data? Can the institution get it back? In what format? How quickly? Our position is straightforward: the institution is the data controller, the data is theirs, and they can export it or request deletion at any time.

Encryption. At rest and in transit. AES-256 for storage, TLS for transmission. No exceptions, no caveats.

Subprocessors. Which third parties touch the data? Our DPA defines subprocessors and requires that any third party engaged to process personal data on behalf of the institution meets the same obligations.

Incident notification. How fast will you tell us if something happens? Our answer is grounded in our IR policy’s reporting requirements, not in vague assurances.

Data deletion. What happens to student data when the contract ends? Clear retention and deletion policies, documented and enforceable.

Vulnerability management. How do you find and fix security issues? Continuous automated scanning, weekly targeted scans, annual independent penetration testing, and defined remediation timelines — critical within 24 hours, high within 7 days, medium within 30 days, low within 60 days.

Authentication and access control. Who at your company can access our data? How do students authenticate? Least privilege enforcement, role-based access, quarterly access reviews, documented separation of duties, and customer-configurable SSO that delegates authentication to the institution’s own identity provider.

Business continuity. If something goes wrong, can you keep running? Backup policies mapped to specific systems, tested restoration procedures, defined recovery objectives.

None of these are exotic requirements. They’re the basics, asked consistently by security teams that have learned through experience what matters and what’s window dressing. Having clear, documented, honest answers to each of them is what separates a productive security review from a painful one.


Honesty Over Polish

The biggest mistake a vendor can make in a security review is overstating their program’s maturity. Institutional security teams do this for a living. They can tell when an SSP was generated from a template and when a questionnaire response doesn’t match the evidence behind it.

We document our program honestly, including the parts that are still maturing. If a control is partially implemented, we say so and describe the plan for closing the gap. If we made a risk-based decision to deprioritize something, we document the rationale. Our risk assessment policy requires that risk assessment results be shared with the Information Owner, Information Security Manager, and the customer’s Authorizing Official. That sharing includes both the findings and the remediation plan, not just the clean summary.

An institutional reviewer who sees an honest program with a clear improvement trajectory will work with you. One who discovers that your documentation doesn’t match reality will walk away, and they’ll tell their colleagues at other institutions. Higher ed is a small world. Reputation travels. Honesty sells.


Keeping It Going

A security program that serves this market isn’t a project with an end date. Policies are reviewed on a biennial cycle. Procedures are reviewed annually, or whenever significant changes are made. Vulnerability scanning runs continuously, with weekly targeted scans. Audit log reviews happen weekly. Major-incident response capability is tested annually per NIST SP 800-61; minor-incident response capability is tested quarterly, just to be safe. Independent penetration testing happens annually. Risk assessments happen at least every three years or when significant changes occur. Access privileges are reviewed quarterly.

These cadences aren’t aspirational targets. They’re the operational commitments that institutions are counting on when they approve us as a vendor. Missing them erodes the trust that the security review built.

For a small team, the most important thing is knowing exactly what’s due and when, and having the discipline to keep doing it even when there’s no audit on the calendar. The institutions are trusting you to maintain the program between reviews, not just perform when someone’s watching.
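
“Knowing exactly what’s due and when” can be as simple as a cadence table and a due-date check. A hypothetical sketch (the day counts approximate the cadences listed above; the tracker itself and its task names are illustrative):

```python
from datetime import date, timedelta

# Approximate cadences, in days, for the program's recurring commitments.
CADENCES_DAYS = {
    "policy review": 730,             # biennial
    "procedure review": 365,          # annual
    "targeted vulnerability scan": 7, # weekly
    "audit log review": 7,            # weekly
    "major IR test": 365,             # annual
    "minor IR test": 91,              # quarterly
    "penetration test": 365,          # annual
    "access review": 91,              # quarterly
}

def due_items(last_done: dict, today: date) -> list:
    """Tasks whose next due date is on or before today, sorted by name."""
    return sorted(
        task for task, last in last_done.items()
        if last + timedelta(days=CADENCES_DAYS[task]) <= today
    )

last_done = {
    "audit log review": date(2025, 5, 20),
    "access review": date(2025, 1, 2),
    "penetration test": date(2024, 9, 1),
}
assert due_items(last_done, date(2025, 6, 1)) == ["access review", "audit log review"]
```

The point isn’t the tooling; it’s that the cadences are explicit enough to check mechanically, so nothing silently lapses between audits.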

We’ve been doing this long enough now that the rhythm is familiar. A new questionnaire comes in, and most of the answers are already written. A state passes a new student privacy law, and we confirm coverage rather than scrambling. An institution’s security team wants to talk through our controls, and the people who can answer are the same people who implement them every day.

That’s roughly the goal for any small EdTech company trying to navigate this landscape. Build one honest program, document it thoroughly, map it to the frameworks your customers care about, and then actually run it. The customers doing the evaluating will notice.