by chbint on 7/19/24, 9:53 PM with 92 comments
by perihelions on 7/19/24, 11:50 PM
Scientific research output should be free, universally, without hindrance.
It's myopic to try to extract wealth from this public good by siloing it, by toll-gating access to it. Like barricading a public highway with toll-booths every 500 meters: it's a myopia that's blind to the public-good value of infrastructure, a myopia of greed that's a universal drain on public wealth, for some petty local optimization.
If you obstruct ML models on some financial-profit theory, you're obstructing not only the ML entities; you're obstructing the thousand researchers downstream who stand to benefit from them. You're standing in the road blocking traffic, collecting tolls; you've not only stopped the vehicle in front of you, you've stopped a thousand more stranded behind it. It is a public nuisance.
by carbocation on 7/19/24, 10:40 PM
by BenFranklin100 on 7/19/24, 10:48 PM
I have not kept up with the latest on LLMs and licensing, but I’m curious: are scientific papers accessible to LLMs? Honestly, a bigger societal loss in my view is publishers like Elsevier restricting LLM access to research articles, rather than being too permissive. I could not care less if Elsevier makes a little bit of money in the process.
by winddude on 7/19/24, 10:48 PM
by asdasdsddd on 7/19/24, 11:45 PM
by BeetleB on 7/19/24, 11:53 PM
by SuperNinKenDo on 7/20/24, 5:21 AM
by andrewstuart on 7/19/24, 11:18 PM
by johnnyanmac on 7/21/24, 5:41 PM
Those are both topics that could each fill a post of their own, so I'll just keep it simple and emphasize once again that we should implement the 3C's when asking for anything from another person's IP. I doubt many of the older papers/articles had contracts that allowed for such usage. This is reinforced by the article:
>The agreement with Microsoft was included in a trading update by the publisher’s parent company in May this year. However, academics published by the group claim they have not been told about the AI deal, were not given the opportunity to opt out and are receiving no extra payment for the use of their research by the tech company.
Regardless of your position, this publishing group at worst lied and at best is being irresponsible; this isn't even an issue of AI or copyright. We can debate "well, this is how it should be", but let's leave ShouldLand for a bit and actually look at the current situation. Trust is being broken in real time.
by kalfHTA on 7/19/24, 10:46 PM
We need new publishing models with strict copyright protections against theft. Academics should run their own publishing houses as cooperatives.
by JSDevOps on 7/19/24, 10:54 PM
by j_crick on 7/19/24, 10:37 PM
by blackeyeblitzar on 7/19/24, 10:31 PM
by Der_Einzige on 7/19/24, 11:13 PM
by Terretta on 7/20/24, 2:10 AM
Are they not a fact discoverer or truth revealer?
It's unclear to me why researchers should “own” truths that prior research and public patronage enabled them to unearth.
// note: research != invention, i.e., SpaceX experimenting until systems and machinery can land a rocket on a barge is not “research”, but testing and documenting the characteristics of fuels in a vacuum as the environment swings from -100C to 120C is