by Dotnaught on 8/27/24, 7:43 PM with 29 comments
by Nerada on 8/27/24, 11:50 PM
Your network admin has had access to the proxy, and by extension all of your browsing history, since forever. Now, your UEBA sees that same data, but it mainly just sits there and flags things like a user who normally hits a single host suddenly hitting 300 hosts on the network, or a user whose average upload is 500MB/week pushing 200GB in a single session.
Very few people care if you're using the corporate network to listen to YouTube Music (or even to look for other jobs); most just want to be notified of data exfiltration, compromised accounts, or malicious network activity.
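A UEBA rule of the kind described above is, at heart, a baseline-versus-current comparison per user. A minimal sketch of what that boils down to (the class, field names, and thresholds here are invented for illustration, not any vendor's API):

```python
from dataclasses import dataclass

@dataclass
class Baseline:
    avg_hosts_per_session: float       # e.g. ~1 host per session
    avg_upload_bytes_per_week: float   # e.g. ~500 MB/week

def flag_session(baseline: Baseline,
                 hosts_contacted: int,
                 upload_bytes: float,
                 host_ratio_limit: float = 50.0,
                 upload_ratio_limit: float = 20.0) -> list[str]:
    """Return human-readable alerts when a session deviates far from the user's own baseline."""
    alerts = []
    if hosts_contacted > baseline.avg_hosts_per_session * host_ratio_limit:
        alerts.append(f"host fan-out spike: {hosts_contacted} hosts vs baseline "
                      f"{baseline.avg_hosts_per_session:.1f}")
    if upload_bytes > baseline.avg_upload_bytes_per_week * upload_ratio_limit:
        alerts.append(f"possible exfiltration: {upload_bytes / 1e9:.0f} GB uploaded vs baseline "
                      f"{baseline.avg_upload_bytes_per_week / 1e6:.0f} MB/week")
    return alerts

# The two examples from the comment: 1 host -> 300 hosts, 500 MB/week -> 200 GB in one session.
print(flag_session(Baseline(1.0, 500e6), hosts_contacted=300, upload_bytes=200e9))
```

Real products layer statistical baselining and peer-group comparison on top, but the alert logic is the same shape: a large multiple of the user's own history gets flagged.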
by Animats on 8/28/24, 1:41 AM
Counterintelligence people definitely view employees as risks. But they're not your boss. They work for a different organization entirely. They're watching your boss, and your boss's boss, too. They only care about threats to national security. If they find other things, they log them, but don't tell your management. They have nothing to do with performance evaluation. The three-letter agencies worked out the rules on this stuff decades ago.
by dugite-code on 8/27/24, 10:45 PM
"Insider threats" are typically the one group that any security firm can actually do anything about in an active manner. Every other threat group comes at you, not the other way around.
by crvdgc on 8/28/24, 12:15 AM
> Forcepoint offers to assess whether employees are in financial distress, show "decreased productivity" or plan to leave the job, how they communicate with colleagues and whether they access "obscene" content or exhibit "negative sentiment" in their conversations.
This goes far beyond normal surveillance, which is more technical in nature. It's trying to combine mind reading with Minority Report-style precrime to enforce a Stalinist level of thought control. How much of that can actually be delivered remains to be seen, though.
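In practice, offerings like the one quoted tend to reduce to weighting a handful of crude behavioral signals into a single risk score. A hypothetical sketch of what that amounts to (the signal names and weights are invented for illustration and are not Forcepoint's actual model):

```python
# Toy "insider risk" scorer: a weighted sum of coarse behavioral signals.
# Every signal name and weight here is invented for illustration.
SIGNAL_WEIGHTS = {
    "financial_distress": 0.30,
    "decreased_productivity": 0.15,
    "job_search_activity": 0.20,
    "negative_sentiment": 0.20,
    "obscene_content_access": 0.15,
}

def risk_score(signals: dict[str, float]) -> float:
    """Combine per-signal scores in [0, 1] into a single number in [0, 1]."""
    return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0)
               for name in SIGNAL_WEIGHTS)

# An employee who vents in chat and browses job boards already accrues a
# nonzero score, which is the precrime problem the comment is pointing at.
print(risk_score({"negative_sentiment": 0.8, "job_search_activity": 0.9}))
```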
by SoftTalker on 8/28/24, 1:32 AM
by michaelmrose on 8/27/24, 10:29 PM
If you have 50,000 employees and are screening for a risk that is 1 in 1M with a 5% false positive rate, you are going to be very disappointed when, over the next decade, it identifies 25,000 would-be shooters while you have zero actual active shooters. Even better, you will probably start disregarding the test entirely and miss it if it ever actually happens.
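The base-rate arithmetic is easy to make concrete. A quick calculation under the assumption of one screening pass per employee per year (the comment doesn't specify a cadence), using the commenter's prevalence and false-positive numbers:

```python
employees = 50_000
prevalence = 1 / 1_000_000        # 1-in-a-million real risk
false_positive_rate = 0.05        # 5%
years = 10

true_positives = employees * prevalence * years            # ~0.5 real cases in a decade
false_positives = employees * false_positive_rate * years  # 25,000 false alarms

print(f"expected real cases over {years} years:   {true_positives:.1f}")
print(f"expected false alarms over {years} years: {false_positives:,.0f}")
```

Roughly 2,500 false alarms per year against an expected half of one real case in ten years: alert fatigue is guaranteed long before the test ever pays off.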
As awesome as that is, the fact that skynet is always watching will probably cause people to manage their workplace personas to a psychotic degree, which will surely ratchet workplace stress up to new highs. Deprived of actual data on what triggers the eye of Sauron, a hundred wrong theories about how to avoid it will proliferate, and the studied population will both diverge from the norm the system was designed to operate on and get progressively worse.
A few years later a study will prove that the AI inadvertently learned to discriminate against minorities, women, or people in other time zones through things the training population did without thinking, and the people pushing it will look like bigots. Instead of ejecting it, we will try to fix it. Either that doesn't work, or it does and people accuse skynet of being woke.
by chris_wot on 8/28/24, 2:00 AM