from Hacker News

VC money is fueling a global boom in worker surveillance tech

by Brajeshwar on 6/4/25, 4:35 PM with 162 comments

  • by chairmansteve on 6/4/25, 6:01 PM

  • by BrenBarn on 6/4/25, 6:34 PM

    > The term “Little Tech” was popularized by the VC firm Andreessen Horowitz, which argued that excessive regulation was stifling innovation.

    When someone says regulation is stifling innovation, the odds are 90%+ that what they mean is "we want to find loopholes to make money by screwing people over". When a VC firm says that, the odds are 100%.

  • by marc_abonce on 6/4/25, 7:31 PM

    Tangential comment about these "background check" tools that the article briefly mentions:

    > Latin America, where labor laws are less strictly enforced [...]

    > in Mexico, provides identity verification tools that do various checks including biometrics, and also cross-check against government databases and blacklists.

    These two issues are closely related. Here in Mexico, a lot of companies share "blacklists" of ex-employees that have gone to labour court, regardless of the reason or the result. Because of this, very few people ever go to labour court against a company here, even if they know that they're legally on the winning side.

    Fortunately not all companies are like that, but the chilling effect still remains.

    Source in Spanish: https://www.elfinanciero.com.mx/el-preguntario/2023/09/02/qu...

  • by guestbest on 6/4/25, 5:10 PM

    There are already several books about surveillance capitalism. It’s not just VC money but public money, too. People pay taxes to get monitored by the government, then go to work at a job where further monitoring takes place. Maybe one day houses will come with a built-in, no-tech safe room free from surveillance, and it won’t just be the bathroom, as Bill Gates joked in his 2007 interview alongside Steve Jobs.
  • by baxtr on 6/4/25, 5:33 PM

    I want to live to see these tools being deployed in VC offices. What are these guys doing anyway besides going to the gym, having coffee dates, and putting out BS posts on X?!
  • by dmbche on 6/4/25, 6:54 PM

    If your bottom line is actually affected by a change in employee performance that needs automated surveillance to be noticed, are you just not running a profitable business?
  • by TrackerFF on 6/4/25, 5:15 PM

    If you live in a country with healthy labor rights, then luckily these things aren't much to worry about.
  • by leereeves on 6/4/25, 5:08 PM

    Makes me think of the short novel Manna, about a dystopian future where working class humans are micro-managed by AI. (And an alternative future where everyone shares in the benefits of technological progress.)

    https://en.wikipedia.org/wiki/Manna_(novel)

  • by stego-tech on 6/4/25, 5:40 PM

    The reality is that the current AI boom is just supercharging a longstanding trend of massive surveillance. Governments and the monied classes distrust everyone else by default, and continue instituting surveillance to protect their interests at the expense of everyone else.

    It’s also everywhere:

    * Academia has seen a glut of sensors and tech to surveil labs, classrooms, students, and faculty

    * Retail and service workers are tracked via camera or phone and yelled at remotely by the boss if they’re not appropriately productive in any given moment

    * Small businesses often leave telemetry and default data collection policies in place, letting private companies monitor their staff and business

    The only tools available to us at this point are sabotage, awareness, and resistance. We need to build a society where people trust each other by default again, instead of assuming harm until proven otherwise. We also need governance and regulations at every level stating that surveillance of any sort must be as narrowly scoped as possible, that data retention is limited to as little as practicable, and that sharing of surveillance data with any party other than those compelled by law and warrant is illegal.

    We’re in an era of peak productivity and flat wages, with the largest wealth pumps in human history funneling more money into fewer hands. This kind of surveillance isn’t just offensive or unacceptable, it’s grotesque in its treatment of our fellow humans.

  • by Huxley1 on 6/5/25, 8:27 AM

    As an employee, while I understand that companies want to improve productivity, being constantly monitored makes me feel like I’ve lost my autonomy.

    I believe trust and respect should go both ways, and there should be no need to invade an employee’s personal space with constant surveillance.

  • by wcski on 6/4/25, 5:59 PM

    This article lost me when it counted identity verification software as "surveillance tech".
  • by feverzsj on 6/4/25, 6:28 PM

  • by cynicalsecurity on 6/4/25, 5:16 PM

    I suspect some jobs are still protected from the AI surveillance dystopia: academics, research, healthcare, small businesses.

    I was unpleasantly surprised to see Accenture on the list of bad employers who spy on their employees with AI.

  • by hcarvalhoalves on 6/4/25, 6:21 PM

    Last I heard, workers were getting replaced by AI, so this will sort itself out.
  • by anshubansal2000 on 6/4/25, 4:58 PM

    This has been ongoing for a while now; these tools are just getting more sophisticated with AI.
  • by Karrot_Kream on 6/4/25, 6:34 PM

    Unfortunately this is preying on developing economies with spotty labor regulations and enforcement. I'm curious who's using these tools: domestic firms that don't trust their workers, or mostly foreign firms trying to squeeze work out of overseas shops? Everything I've encountered about South Asian work culture in low-cost shops pointed to a widespread low-trust culture that pitted managers and workers against each other, but those were specifically firms oriented around low-trust, low-cost hiring.
  • by hooverd on 6/4/25, 5:06 PM

    A lot of AI discussion suffers from an is/ought problem where people think you're attacking their vision of Iain Banks's The Culture while they refuse to see what's in front of their face, IMO.
  • by PicassoCTs on 6/4/25, 6:56 PM

    Slamming into the limits of growth, all that remains is to either automate the work away, or to underpay and then enforce minimal compliance through surveillance, for what is essentially slavery.
  • by focusgroup0 on 6/4/25, 11:56 PM

  • by postalrat on 6/4/25, 6:43 PM

    If someone appears to not be working, fire them. ASAP. In the long run you will save a lot of time and money.

    Don't waste your money and everyone's time by spying on them.

  • by ajsnigrutin on 6/4/25, 8:26 PM

    One surveillance software startup monitors your mouse usage, and another startup sells you a mouse jiggler that makes you 'active' 24/7.
  • by b0a04gl on 6/4/25, 7:26 PM

    Funny how the same folks preaching innovation are funding tools that stifle autonomy. Guess it's only "innovation" when it benefits them. We're building a panopticon, one startup at a time. But who's watching the watchers?
  • by CommanderData on 6/5/25, 4:21 PM

    Government legislation.
  • by davidmurphy on 6/4/25, 7:12 PM

    Speaking personally, I find this utterly deplorable.
  • by neilv on 6/4/25, 7:18 PM

    Something to keep in mind when building out any kind of surveillance: you probably don't get to choose who uses it, and how it's used.

    Fortunately, recent events have made some of the risks much easier to appreciate.

    For example, let's say that your city council and universities have been opposing very recent rogue moves in upper government. Suddenly, the well-regarded local police are getting body cams, which they never needed before, but which seem like a good thing that some community activists have asked for.

    But... data fusion means that not only will said rogue elements in upper government soon have access to that general surveillance feed, but (since they're disregarding the Constitution anyway) they'll also be able to force local police to comply with rogue directives.

    A first application might target the local police's policy of not checking reporting parties for immigration status, a policy meant to encourage mutual cooperation between police and community. The rogue could automate that policy away, with fairly simple "AI" monitoring for compliance, which is already feasible.

    Those earlier community activists change their minds about introducing surveillance, but by then it's too late.

    Then the rogue use cases can get worse from there. First in enforcing general practice compliance like the above, and then (if the situation and pretense decay further) in unconstitutionally tasking those local police more specifically as additional foot resources for the rogue's goals. Thanks to the body cams and other surveillance, a rogue will be able to centrally monitor all these loose resources in various cities for compliance. Maybe using other surveillance to determine who does and doesn't get tasked for what.

    (If this sounds unlikely, think back to how quickly masked officers were tasked to grab off the street someone who did nothing wrong, Soviet secret police style. And other officers were openly circumventing legitimate court orders against extraordinary rendition of grabbed people. Maybe complementary surveillance helps the rogue mass-distinguish "loyalists" from the ones who'd question illegal orders. Plot the arc.)

    Body cams are just one example of something we might've assumed was a positive, progressive thing for society, showing how quickly it can be turned against society by bad elements.

    I think the general answer is not to implement surveillance power when you don't have sufficient checks and balances to keep it operating within the interests of the people, both now and in the future.

    And don't fool yourself about immediate intent of something you're building. Obviously that employee surveillance software you're developing isn't only for regulatory compliance, or detecting a spy stealing IP. But more frequently will be abused by dim and petty managers, to create dystopian work environments that slowly kill everyone with stress and misery. And occasionally will be used by corporate to try to suppress someone who complains of sexual harassment or accounting irregularities.

    One of the first things you can do, as a thoughtful tech worker with integrity, is to simply not apply for the questionable-looking jobs. Many of them are obvious, and you'll get an unpleasant gut feel, just by reading their one-sentence startup blurb, or maybe when you look at the job description. Then close the tab without applying, and go soothe your nausea with some /r/aww or the gym. Let the icky company be flooded with the robo-appliers, the lower-skilled, and those who have good tech skills but are less-thoughtful or less-principled. Hey, maybe a company that's a mix of the low-skilled and the shitty will sabotage its own efforts, without you having to be involved.

  • by tropicalfruit on 6/4/25, 6:10 PM

    This kind of BS can only come about due to a surplus of labour.

    I'm afraid that's a harsh truth that's not going to change any time soon.

  • by pmarreck on 6/4/25, 6:12 PM

    Good. The more hostile corporate work environments become, the more people will want to strike out on their own or join a smaller company.