Lumara in Action: Smarter Threat Detection with User Behaviour Analytics

Security monitoring typically works like a rulebook: when an event happens, it triggers a rule, and an analyst is notified. That model is the backbone of modern SIEM platforms, and it works well for the threats it is designed to catch.
But this model depends heavily on knowing in advance what "suspicious" looks like. In practice, that is not always possible: sometimes the activity alone is the problem, but often it is the context that matters.
To bridge that gap, our SOC is currently deploying User Behaviour Analytics (UBA) into production as an additional detection layer for our customers. Rather than matching events against fixed rules, UBA weights individual actions and learns what is normal for each user, so our analysts can catch what is not.
Why event-based detection isn't enough
Event-based detection is effective and will remain central to how we monitor customer environments. When a threshold is crossed or a known pattern is matched, our SIEM surfaces it to an analyst immediately.
The gap appears with activity that is slow, subtle, or simply unknown. A sophisticated attacker, or an insider with legitimate access, is unlikely to trigger a brute-force alert. They move quietly: accessing assets gradually over days or weeks, staying well below the thresholds that trip rules. The SIEM logs every event, but without a matching rule, nothing surfaces to an analyst.
Think of it this way: if we know attackers throw rocks at windows, we write a rule to detect broken glass. The SIEM sees the glass, the rule fires, and the analyst is notified. That works well when attackers behave in ways we have already anticipated.
But what if someone decides to come in through the floorboards instead? The SIEM will still log the activity, but if there is no rule written for that entry point, nothing surfaces. The threat is visible in the data, but invisible to the analyst.
Looking beyond the rules
Because environments are not always straightforward, our SOC uses UBA to build a behavioural baseline for each individual user in an organisation. Rather than asking, "Did a rule get broken?", it asks a different question: Is this user behaving the way they normally do?
Context is what makes this powerful. In an EDU environment, a staff member accessing timetable systems dozens of times a week is unremarkable, but a student doing the same thing is worth a closer look.
Writing rules that account for every variation without generating constant noise is difficult. UBA handles it by learning what normal looks like for each person and flagging when that pattern shifts, not when an arbitrary threshold is crossed.
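As a rough illustration of that idea, here is a minimal sketch of a per-user baseline: it learns each user's typical weekly access count for a resource and flags counts that shift far from that user's own history. This is not our production model; the class name, the z-score test, and every threshold below are illustrative assumptions, and real UBA weighs many more signals.

```python
from collections import defaultdict
from statistics import mean, stdev

class UserBaseline:
    """Toy per-user baseline (illustrative only): learns a typical
    weekly access count per resource and flags counts that sit far
    outside that user's own history."""

    def __init__(self, min_history=4, z_threshold=3.0):
        self.history = defaultdict(list)   # (user, resource) -> weekly counts
        self.min_history = min_history     # weeks needed before we judge
        self.z_threshold = z_threshold     # how many std devs counts as a shift

    def observe(self, user, resource, weekly_count):
        """Record one week of activity; return True if it looks anomalous."""
        past = self.history[(user, resource)]
        anomalous = False
        if len(past) >= self.min_history:
            mu, sigma = mean(past), stdev(past)
            # The shift is measured against *this user's* history,
            # not a global threshold shared by everyone.
            if sigma > 0 and abs(weekly_count - mu) / sigma > self.z_threshold:
                anomalous = True
        past.append(weekly_count)
        return anomalous
```

With this toy model, a staff member who accesses timetables roughly thirty times every week stays normal, while a student whose history is near zero suddenly hitting thirty accesses is flagged, even though both saw the same raw count.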

How UBA works in practice
Once deployed, UBA observes user activity over time. When behaviour shifts outside the established baseline, the system raises a risk score. When that score crosses a threshold, an analyst is alerted.
The key difference from traditional alerting is what the analyst sees: not "this event happened," but "this user is behaving unusually." From there, they can pull the full activity history and play back everything that user has done, seeing the sequence from the moment of login forward.
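The scoring-and-playback flow described above can be sketched in a few lines. This is a hedged illustration, not our alerting pipeline: the threshold value, the session dictionary shape, and the function name are all assumptions made for the example.

```python
import time

ALERT_THRESHOLD = 50  # hypothetical score at which an analyst is paged

def update_risk(session, anomaly_weight):
    """Accumulate a risk score across a user's session.
    Each anomalous action raises the score; the first time the
    threshold is crossed, the analyst receives one alert carrying
    the full, ordered activity timeline for playback."""
    session["score"] += anomaly_weight
    session["timeline"].append((time.time(), anomaly_weight))
    if session["score"] >= ALERT_THRESHOLD and not session["alerted"]:
        session["alerted"] = True
        # Hand the analyst the whole sequence, not a single event.
        return {"user": session["user"], "score": session["score"],
                "timeline": list(session["timeline"])}
    return None
```

The design point is that no single action has to be alarming on its own: the alert fires on the accumulated picture, and what the analyst sees is the sequence of behaviour, not one isolated event.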
This allows our team to detect:
"Low and Slow" Threats: Sophisticated actors (like nation-states or APTs) target assets slowly over weeks to stay below detection thresholds. UBA catches these subtle anomalies because it understands the historical baseline.
Insider Threats: If a disgruntled employee suddenly begins downloading large amounts of internal data to a personal machine, UBA flags the abnormal data movement even when their actions don't trigger any specific access rules.
Novel Attack Vectors: As new attack methods emerge, UBA provides visibility beyond what pre-defined rules can detect. It flags the attacker coming up through the floorboards, picking up the anomaly earlier in the chain.
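The "low and slow" case can be made concrete with a classic cumulative-sum (CUSUM) style check, which is one standard way to catch small, persistent deviations that never trip a per-event threshold. This is a generic textbook technique shown for illustration, not our production detection logic, and the parameters are invented.

```python
def cusum_drift(counts, baseline_mean, slack=1.0, alarm=8.0):
    """One-sided CUSUM sketch: detects a slow upward drift in
    activity where no single observation exceeds a per-event
    threshold. (Hypothetical parameters, for illustration.)"""
    s = 0.0
    for t, x in enumerate(counts):
        # Accumulate only the excess over baseline (minus slack),
        # so small but persistent deviations add up over time,
        # while one-off blips decay back toward zero.
        s = max(0.0, s + (x - baseline_mean - slack))
        if s >= alarm:
            return t  # index where the drift becomes significant
    return None
```

For example, a user whose baseline is two accesses per interval and who quietly moves to four per interval would never trigger a rule set at, say, ten per interval, but the accumulated excess crosses the alarm level within a handful of intervals.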
What this means for defence in depth
Deploying UBA into production is a deliberate decision by our SOC to strengthen detection by adding behavioural context alongside traditional rule-based monitoring.
Known threats can still be detected quickly through the SIEM. Less obvious activity can now be identified through changes in behaviour. Together, they give our analysts a more complete picture of what is happening in a customer environment.
That is defence in depth in practice: not one system doing everything, but layers working together so that activity does not need to match a predefined pattern to be seen.
Want to know more about what your SOC is doing behind the scenes? Get in touch to learn more about how we approach detection for your environment.

