Graduate student enrollment teams are facing rising complexity, shifting markets, and increasing pressure to make faster, data-informed decisions. 

Art Munin, Ph.D., Liaison’s Senior Associate Vice President for Enrollment Management Solutions, recently participated in a SEAGAP-sponsored workshop exploring how AI and predictive analytics move from abstract ideas to practical, mission-aligned tools within graduate enrollment. 

Drawing on real experience from the University of Florida’s Warrington College of Business, AI and Predictive Analytics in Graduate Enrollment examined common challenges in graduate education admissions—such as data readiness, system alignment, staff capacity, and stakeholder buy-in—to show how collaborative planning can create meaningful change. 

Attendees learned how to assess AI tools with clarity, distinguish between what predictive models truly support and what still requires human judgment, and outline a roadmap for bringing AI-driven insights into daily enrollment decisions. The session emphasized transparency, trust, and human judgment as essential partners to AI, offering a clear path from ideation to adoption for institutions. 

AI in Graduate Enrollment: The Future is Here

As Dr. Munin noted, AI today occupies roughly the same place CRMs did two decades ago: a technology some institutions hesitate to adopt, while early movers are already benefiting. Yet unlike CRM adoption, the timeline for AI adoption will be much shorter. Students expect personalization. Institutions want earlier insight. Leadership demands greater efficiency and accountability. AI helps make sense of the vast amounts of data institutions already collect—data that historically has been siloed or underleveraged.

He also emphasized that AI is a tool, not a replacement for human judgment. Emotional intelligence, contextual awareness, and ethical decision‑making remain squarely in human hands. AI simply provides the clarity teams need to spend their time more wisely.

The University of Florida’s Challenge: Volume, Complexity, and Competing Priorities

Naz Erenguc, Ph.D., Director of Admissions at the University of Florida’s Warrington College of Business, grounded the session in the realities of a large, complex graduate business school. The Warrington College of Business enrolls roughly 900 MBA students across nine program formats and 13 annual intakes—a staggering operational rhythm that leaves little room for inefficiency.

As Dr. Erenguc put it, recruitment often feels like “the wild west.” The team knew they needed not just more data, but better, more unified data to guide decisions.

The Turning Point: Moving Beyond “Descriptive” Data

Before adopting an AI‑driven model, Warrington had plenty of descriptive and diagnostic data that could explain what happened and sometimes why. What the team needed was predictive and prescriptive insight that could help answer questions such as:

  • Who is most likely to enroll?
  • Who needs personalized outreach?
  • What actions meaningfully increase yield?
  • What timeline is too slow and risks melt?
  • Where are we losing students—and why?

This is where the partnership with Liaison’s Othot AI-powered analytics platform began.

Building the Foundation: Clean Data, Connected Systems, and Collaboration

The implementation was not a matter of simply plugging in a new tool. It required aligning and integrating data across application management, CRM, and prospect management systems. It also required deep collaboration among admissions, marketing, data architects, and the Othot team.

The experience reinforced a critical truth for institutions exploring AI:

AI is only as good as the data it receives. “Garbage in, garbage out” absolutely applies.

That’s why the university conducts ongoing data‑quality checks and meets bi‑weekly with its Othot data scientist to review outputs, identify anomalies, and continuously refine the model.

What Predictive Analytics Made Possible

Once implemented, Othot began generating real‑time insights that changed the way the institution managed its pipeline. Every prospective student is now associated with:

  • A likelihood‑to‑enroll score.
  • A propensity score showing how different actions raise or lower that likelihood.
  • Recommended “next best actions” to increase yield.
  • An explanation of which factors most influence each score.

As a result, the university can now:

Prioritize outreach strategically | Rather than treating every applicant as equally likely to enroll, the team now focuses staff time where it matters most while still automating communication for lower‑probability groups.

Spot problems earlier | Data can now reveal the impact of delayed admissions decisions on enrollment likelihood. That insight directly informs process improvements to reduce turnaround time.

Understand what works and what doesn’t | When marketing launches a new email or promotes an event, the team can see in real time how those actions influence student behavior and pipeline predictions.

Fine-tune melt-reduction initiatives | Predictive analytics helps reinforce and further personalize post‑admit engagement.

As Dr. Erenguc emphasized, the tool doesn’t make decisions; it guides them. Humans still admit students, deny students, build relationships, and balance institutional priorities.

Evaluating AI Tools: Questions Every Institution Should Ask

Dr. Munin closed the session with a set of practical criteria for assessing any AI solution:

  • How long has the tool been in the market?
  • Is it a “black box” or an “open‑book” model? (Institutions should see exactly how recommendations are generated.)
  • Is the model built on generalized higher‑ed data or on your institution’s own history?
  • Is ongoing partnership included?
  • Will the tool reduce workload—or create new burdens?

These questions help teams distinguish between flashy features and meaningful, mission‑aligned capability.