It looks like this was a larger decision from the Lemmy development community in an attempt to eliminate karma farming. They say karma scores are psychologically damaging, and as someone who looks at them a lot, I think they may be right.
Here’s a GitHub thread discussing it where our Voyager dev weighs in:
https://github.com/LemmyNet/lemmy/issues/3393#issuecomment-1779400639
I would too, but it would undoubtedly be much more resource intensive and so probably not feasible for a free app.
And even if they charged users for it, doing it at the app level would be challenging and inefficient. It’s the kind of thing that really has to be cached, so Voyager and any other app that wants the feature would have to independently calculate, store, and periodically recalculate it for each Lemmy user. Having multiple apps doing this would put more strain on the public API than storing it centrally, and that unwanted strain would most likely cause conflict with the devs who wanted the karma score removed to begin with.
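To make the "calculate, store, and recalculate periodically" part concrete, here's a minimal sketch of what an app-side cache could look like, assuming a hypothetical backend keyed by actor ID with a daily TTL; none of this is actual Voyager code.

```ts
// Hypothetical app-side karma cache: the score is recomputed at most once
// per TTL per user, no matter how many clients ask for it.
interface KarmaEntry {
  score: number;
  computedAt: number;
}

const TTL_MS = 24 * 60 * 60 * 1000; // recalculate at most once a day
const cache = new Map<string, KarmaEntry>();

async function getCachedKarma(
  actorId: string,
  recompute: (actorId: string) => Promise<number>, // the part that hits the Lemmy API
): Promise<number> {
  const hit = cache.get(actorId);
  if (hit && Date.now() - hit.computedAt < TTL_MS) {
    return hit.score; // served from cache, no API traffic
  }
  const score = await recompute(actorId);
  cache.set(actorId, { score, computedAt: Date.now() });
  return score;
}
```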
Well, each user is running an app; they can each compute it for themselves at the very least.
As far as other users go, when you open a user and see their comment history, the app can certainly sum up the votes on the comments displayed without doing any extra queries. That tally may be useful to identify trolls or good actors.
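For what it's worth, that sum is cheap to compute from data the profile screen has already loaded. A minimal sketch, assuming the comment views expose a `counts.score` field the way lemmy-js-client's `CommentView` does:

```ts
import type { CommentView } from "lemmy-js-client";

// Rough "karma" from whatever comments the profile screen already fetched --
// no extra API calls, just a sum over data we have in hand.
function visibleCommentScore(comments: CommentView[]): number {
  return comments.reduce((total, view) => total + view.counts.score, 0);
}
```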
If each user is caching it, that would be even more resource-intensive on the API than if the app was storing it somewhere and serving it to users. It would mean that each time a user clicks on another user’s profile for the first time, the app would need to pull all of their comments in order to calculate their score. If it were cached on servers at the app level, then you and I requesting the same user profile in a short period would only require API requests to pull that user’s comments once.
Storing it centrally means one repository making calls for scores. Storing it at the app level means there are N repositories for N apps making calls. Storing it at the install level for each app instance means you’d have an astronomical number of calls to the API to calculate this number. It would be incredibly expensive to do it that way and could slow all API responses to a crawl.
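To put rough numbers on that comparison, here's a back-of-envelope calculation; every figure below is made up purely for illustration.

```ts
// Hypothetical daily request counts under each caching strategy.
const profilesViewedPerDay = 100_000;  // distinct profiles viewed across all users
const apps = 5;                        // apps each keeping their own central cache
const installsViewingEachProfile = 20; // avg installs opening the same profile per day
const pagesPerProfile = 10;            // API requests to pull enough history for a score

const centralCache = profilesViewedPerDay * pagesPerProfile;     // 1,000,000 req/day
const perAppCache  = centralCache * apps;                        // 5,000,000 req/day
const perInstall   = centralCache * installsViewingEachProfile;  // 20,000,000 req/day
```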
You’re making this more complicated than it needs to be.
The simple, naive approach would be: every time the app sees a post, it memorizes the current score for that post locally (a rough sketch is below). For the user of the app, that should cover all of their posts, right? This doesn’t need to be accurate; broad strokes are fine. We don’t have to catch votes that happen after we last saw the post.
For other users we look at, we could just tally the posts we see when we open their profile, which would be the few that get dynamically loaded, I think five or six. That should give us trend data.
We don’t have to engineer anything more complex.
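Here's roughly what the memorize-as-you-go idea could look like, assuming post views carry `post.id`, `creator.actor_id`, and `counts.score` as in lemmy-js-client; persistence and eviction are left out of this sketch.

```ts
import type { PostView } from "lemmy-js-client";

// Last score we saw for each post, keyed by author actor id -> post id.
const lastSeenScores = new Map<string, Map<number, number>>();

// Call this from wherever posts get rendered; it just remembers the score
// that was already in the feed response, no extra requests.
function rememberScore(view: PostView): void {
  const author = view.creator.actor_id;
  const byPost = lastSeenScores.get(author) ?? new Map<number, number>();
  byPost.set(view.post.id, view.counts.score);
  lastSeenScores.set(author, byPost);
}

// Rough running total for a user, based only on posts this install has seen.
function roughKarma(actorId: string): number {
  let total = 0;
  for (const score of lastSeenScores.get(actorId)?.values() ?? []) {
    total += score;
  }
  return total;
}
```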
I really think you are grossly oversimplifying the problem.
When I click on any user’s profile, I’m seeing content from a lot of posts that my app instance hasn’t seen before. Each user follows different communities and accesses the app at different times, so each user’s instance will only have data on the posts that they have clicked through when they are browsing. The score calculation that you are suggesting would vary wildly depending on how much overlap you’ve had with that user previously. 5 or 6 posts out of hundreds or thousands would not be enough to consistently see a valid trend.
Voyager could probably add a hidden feature, like long-pressing the post count label to switch it to karma and compute karma on demand, maybe for the last 500 or so posts and comments (10 total requests, sketched below).
Because it would be a hidden feature requiring manual activation, I doubt it would add server load. And being hidden means it also wouldn’t be an issue with karma farming.
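A sketch of what that on-demand computation might look like, assuming lemmy-js-client's `getPersonDetails` with a page size of 50; exact method and field names vary a bit between API versions.

```ts
import { LemmyHttp } from "lemmy-js-client";

// On-demand karma: page through a user's recent history and sum the scores.
// 10 pages x limit 50 ≈ the "last 500 or so posts & comments" idea above.
async function computeKarma(client: LemmyHttp, username: string) {
  let postKarma = 0;
  let commentKarma = 0;

  for (let page = 1; page <= 10; page++) {
    const res = await client.getPersonDetails({
      username,
      sort: "New",
      page,
      limit: 50,
    });

    for (const p of res.posts) postKarma += p.counts.score;
    for (const c of res.comments) commentKarma += c.counts.score;

    // Stop early if we've run out of history.
    if (res.posts.length === 0 && res.comments.length === 0) break;
  }

  return { postKarma, commentKarma };
}
```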
deleted by creator
Any terms of use are set by the instance owner, not the Lemmy development team. That’s part of why Lemmy (and federated software in general) is awesome!
deleted by creator
GNU AGPL is a standard open source license, one that even Voyager uses, and it has zero impact on how you consume a Lemmy instance’s API. The AGPL’s main sticking point is just making sure that if you modify the source code and distribute it or run it as a network service, you have to make that modified source code open source as well.