by Brajeshwar on 6/4/25, 4:35 PM with 162 comments
by chairmansteve on 6/4/25, 6:01 PM
https://m.economictimes.com/news/international/us/palantir-t...
https://www.washingtonpost.com/business/2025/05/07/doge-gove...
by BrenBarn on 6/4/25, 6:34 PM
When someone says regulation is stifling innovation, the odds are 90%+ that what they mean is "we want to find loopholes to make money by screwing people over". When a VC firm says that, the odds are 100%.
by marc_abonce on 6/4/25, 7:31 PM
> Latin America, where labor laws are less strictly enforced [...]
> in Mexico, provides identity verification tools that do various checks including biometrics, and also cross-check against government databases and blacklists.
These two issues are closely related. Here in Mexico, a lot of companies share "blacklists" of ex-employees who have gone to labour court, regardless of the reason or the result. Because of this, very few people ever go to labour court against a company here, even if they know they're legally on the winning side.
Fortunately not all companies are like that, but the chilling effect still remains.
Source in Spanish: https://www.elfinanciero.com.mx/el-preguntario/2023/09/02/qu...
by guestbest on 6/4/25, 5:10 PM
by baxtr on 6/4/25, 5:33 PM
by dmbche on 6/4/25, 6:54 PM
by TrackerFF on 6/4/25, 5:15 PM
by leereeves on 6/4/25, 5:08 PM
by stego-tech on 6/4/25, 5:40 PM
It’s also everywhere:
* Academia has seen a glut of sensors and tech to surveil labs, classrooms, students, and faculty
* Retail and service workers are tracked via camera or phone and yelled at remotely by the boss if they’re not appropriately productive in any given moment
* Small businesses often leave telemetry and default data collection policies in place, letting private companies monitor their staff and business
The only tools available to us at this point are sabotage, awareness, and resistance. We need to build a society where people trust each other by default again, instead of assuming harm until proven otherwise. We also need governance and regulation at every level requiring that surveillance of any sort be as narrowly scoped as possible, that data retention be limited to as little as practicable, and that sharing of surveillance data with any party other than those compelled by law and warrant be illegal.
We’re in an era of peak productivity and flat wages, with the largest wealth pumps in human history funneling more money into fewer hands. This kind of surveillance isn’t just offensive or unacceptable, it’s grotesque in its treatment of our fellow humans.
by Huxley1 on 6/5/25, 8:27 AM
I believe trust and respect should go both ways, and there should be no need to invade an employee’s personal space with constant surveillance.
by wcski on 6/4/25, 5:59 PM
by feverzsj on 6/4/25, 6:28 PM
[0] https://www.odditycentral.com/news/company-installs-cameras-...
[1] https://www.reddit.com/r/China/comments/vx6pp0/a_chinese_com...
by cynicalsecurity on 6/4/25, 5:16 PM
I was unpleasantly surprised to see Accenture on the list of bad employers who spy on their employees with AI.
by hcarvalhoalves on 6/4/25, 6:21 PM
by anshubansal2000 on 6/4/25, 4:58 PM
by Karrot_Kream on 6/4/25, 6:34 PM
by hooverd on 6/4/25, 5:06 PM
by PicassoCTs on 6/4/25, 6:56 PM
by focusgroup0 on 6/4/25, 11:56 PM
by postalrat on 6/4/25, 6:43 PM
Don't waste your money and everyone's time by spying on them.
by ajsnigrutin on 6/4/25, 8:26 PM
by b0a04gl on 6/4/25, 7:26 PM
by CommanderData on 6/5/25, 4:21 PM
by davidmurphy on 6/4/25, 7:12 PM
by neilv on 6/4/25, 7:18 PM
Fortunately, recent events have made some of the risks much easier to appreciate.
For example, let's say that your city council and universities have been opposing very recent rogue moves in upper government. Suddenly, the well-regarded local police are getting body cams, which they never needed before, but it seems like a good thing, and some community activists ask for it.
But... data fusion means that those rogue elements in upper government will not only soon have access to that general surveillance feed, but (since they're disregarding the Constitution anyway) will be able to compel local police to comply with rogue directives.
A first application of that might target the local police's policy of not checking reporting parties for immigration status, a policy meant to encourage mutual cooperation between police and community. The rogue could automate that policy away, with fairly simple "AI" monitoring for compliance, which is already feasible.
Those earlier community activists change their minds about introducing surveillance, but too late.
Then the rogue use cases can get worse from there. First in enforcing general-practice compliance like the above, and then (if the situation and pretense decay further) in unconstitutionally tasking those local police more specifically, as additional foot resources for the rogue's goals. Thanks to the body cams and other surveillance, a rogue will be able to centrally monitor all these loose resources in various cities for compliance. Maybe using other surveillance to determine who does and doesn't get tasked for what.
(If this sounds unlikely, think back to how quickly masked officers were tasked to grab off the street someone who did nothing wrong, Soviet secret police style. And other officers were openly circumventing legitimate court orders against extraordinary rendition of grabbed people. Maybe complementary surveillance helps the rogue mass-distinguish "loyalists" from the ones who'd question illegal orders. Plot the arc.)
Body cams are just one example of something we might have assumed was a positive, progressive thing for society, and of how quickly it can be turned against society by bad elements.
I think the general answer is not to implement surveillance power when you don't have sufficient checks and balances to keep it operating within the interests of the people, both now and in the future.
And don't fool yourself about the immediate intent of something you're building. Obviously that employee surveillance software you're developing isn't only for regulatory compliance, or for detecting a spy stealing IP. More frequently it will be abused by dim and petty managers to create dystopian work environments that slowly kill everyone with stress and misery. And occasionally it will be used by corporate to try to suppress someone who complains of sexual harassment or accounting irregularities.
One of the first things you can do, as a thoughtful tech worker with integrity, is to simply not apply for the questionable-looking jobs. Many of them are obvious, and you'll get an unpleasant gut feel, just by reading their one-sentence startup blurb, or maybe when you look at the job description. Then close the tab without applying, and go soothe your nausea with some /r/aww or the gym. Let the icky company be flooded with the robo-appliers, the lower-skilled, and those who have good tech skills but are less-thoughtful or less-principled. Hey, maybe a company that's a mix of the low-skilled and the shitty will sabotage its own efforts, without you having to be involved.
by tropicalfruit on 6/4/25, 6:10 PM
I'm afraid that's a harsh truth that's not going to change any time soon.
by pmarreck on 6/4/25, 6:12 PM