
  • It can do both, lossiness is toggleable.

    If you’ve seen a picture on Lemmy, you’ve almost certainly seen a WebP. A fair bit of software – most egregiously from Microsoft – still refuses to decode them, but every major browser has supported WebP for years, and its superior data efficiency compared to JPG/PNG means it’s already very widely used on the web. Bandwidth is not that cheap.
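
    For what it’s worth, toggling lossiness is a single flag in most encoders. A minimal sketch with Python’s Pillow (assuming a Pillow build with WebP support; the file names are made up):

    ```python
    from PIL import Image

    img = Image.open("photo.png")
    img.save("photo_lossy.webp", quality=80)        # lossy: much smaller at similar visual quality
    img.save("photo_lossless.webp", lossless=True)  # lossless: pixel-exact, usually still smaller than PNG
    ```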


  • Yeah it’s Ctrl+D. I do use bookmarks on occasion (especially for stupid websites with non-intuitive URLs and page titles I can’t easily find by typing in the omnibar), but not as a way to organize my work.

    The reason I mention ADHD for this in particular is I saw a home organization tip for ADHD that I related strongly to: ADHD brains really benefit from having everything spread out on a table, visible and immediately available. Trying to force an ADHD person to constantly put things away is super counter-productive even if it’s apparently good advice for neurotypical folk. Though of course ADHD is not an excuse not to clear the messy table once the project is finished.

    My computer desktop follows the same principle. I’ll have as many workspaces as I do ongoing projects, and every workspace has all the tools I need open. And the good news is it’s much harder to run out of virtual space than it is to run out of space on a real table.


  • I feel like it’s often bad game design from developers who think they need to put in consumables without understanding their gameplay value.

    In too many games, using up all your consumables is just A Thing That Happens, so the game is balanced to let you survive anyway. But the corollary is that if you can get by without them for a bit, you can finish the whole game without touching them. The consumables are just a way to make already achievable portions of the game easier, which is just sloppy game design IMO. Bethesda games for instance are very guilty of that.


  • People not understanding that we understand bookmarks exist is weird to me.

    For me it’s a suspected ADHD thing. If I make a bookmark:

    • I have to context-switch into “cleaning up” mode. Leaving a tab open is not distracting, having to name it and categorize it is.
    • Bookmarks are virtual drawers. Anything I put in a drawer might as well be in a cave in Alaska guarded by a troll as far as my brain is concerned. If I intend to look at this in the next 2-3 weeks, I keep the tab open because it’s a virtual reminder I’ve not yet done the thing.
    • Yes, I’ve got tabs open from over a year ago. Those ones don’t serve a purpose, I’ll get around to cleaning up… eventually.

    Honestly, if I was forced to close my browser sessions at the end of the work day, no joke, not an exaggeration, I’d switch jobs. I’m working on too many different complex things to have to rebuild my mental model of where everything was at from scratch every morning. I would not get anything done.



  • I love Dune but that game is so powerfully unappealing to me… I didn’t play it so maybe I got the wrong impression from a few minutes of gameplay but it read to me like every generic crafting-survival-base-building live service game from the last 15 years since MC and DayZ. Does it do something subversive or is it really just Rust on Arrakis?


  • I will argue there is something very different between Rowling and Bezos.

    Both are rich beyond measure and can fund lobbying efforts indefinitely. But Rowling has something that Bezos doesn’t: cultural capital. When she says something, people listen, from journalists to citizens to lawmakers, and not just because she’s rich.

    Truth is social capital matters. A lot. Which is good because it’s the only thing we, the people, can hope to have that most billionaires don’t. But a corollary of this statement is that giving social capital to Rowling is, in fact, worse than giving actual capital to Bezos, all else being equal.

    Now I can’t tell you how to live your life and we all have our vices. Just giving food for thought.


  • How much of it is due to Agile (which is a very broad concept even though some people mistakenly equate it with scrum), and how much is it due to corporate pressures and inadequate processes though?

    I find Agile conceptually meshes a lot better with “standard” product and solutions development thanks to the tighter feedback loops and increased reliance on local expertise over centralized planning. This only gets truer as project complexity grows.

    However some companies try to make Agile work with top-down decision making and/or hard deadlines, which are deadly antipatterns. As for lack of time/resources and/or timesheet micro-management, this isn’t a problem unique to Agile nor something that waterfall is exempt from.

    Good agile teams are mostly independent and can define their own testing/release cycle as required for a given project; though of course when that happens there are at least a couple layers of management who feel a burning itch to stuff their dirty noses where they don’t belong, because if the team succeeds despite their lack of direct involvement then everyone might realize the emperor has no pants.


  • That may be true in some truly well-organized (usually “legacy big corpo”) companies.

    Where I’ve worked it’s more like:

    • Requirements only cover user-facing features, if that. (Not so) senior engineers are left to bridge the gap between UI mockups and literally everything else.
    • An implementation issue is accidentally introduced.
    • The bug’s priority is lower than new features’, so no one can justify working on it.
    • One day a dev might be personally annoyed enough by the issue to fix it as part of some tangentially related work. Otherwise it stays like that forever.

    That is a basic side-effect of Agile development. If you have implementation details figured out to that extent before writing the code, you are not doing Agile, you are doing waterfall. Which has a time and a place, but that time and place is typically banking or medical, or wherever you’re okay with spending several times the time and money to get maximum reliability (which is a different metric from quality!).

    I bet NVIDIA has driver crashes to figure out, and I know which of those issues I’d want them to focus on first if I used their Windows driver.


  • Downmixing is a pretty straightforward affair. You have 6 channels and need to get down to 2, so each output channel is just a weighted average of the relevant input channels (front, center, surround, and optionally the LFE).
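
    Concretely it’s little more than two weighted sums. A minimal sketch in Python/NumPy; the channel order and the default weights here are illustrative assumptions, not the exact values from any standard:

    ```python
    import numpy as np

    # Hypothetical 5.1 -> 2.0 downmix. Assumed channel order: L, R, C, LFE, Ls, Rs.
    def downmix_51_to_stereo(x, c=0.707, s=0.707, lfe=0.0):
        """x: float samples, shape (n_samples, 6); returns shape (n_samples, 2)."""
        L, R, C, LFE, Ls, Rs = (x[:, i] for i in range(6))
        left = L + c * C + s * Ls + lfe * LFE
        right = R + c * C + s * Rs + lfe * LFE
        out = np.stack([left, right], axis=1)
        peak = np.max(np.abs(out))
        return out / peak if peak > 1.0 else out  # keep the weighted sums from clipping
    ```

    Raising the center weight c is exactly the “boost dialogue” knob, since the center channel carries the voice track.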

    Good media players (Kodi) allow you to change those weights, especially for the center channel, and to reduce dynamic range (with a compressor). Problem solved, the movie will be understandable even on shitty built-in TV speakers if you want to do that for some insane reason.
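
    The dynamic range part is just as simple in principle: a compressor attenuates whatever rises above a threshold. A toy per-sample sketch (real compressors smooth the gain with attack/release envelopes, which I’m skipping here):

    ```python
    import numpy as np

    # Toy compressor: samples louder than `threshold` are reduced by `ratio`,
    # quieter samples pass through untouched.
    def compress(x, threshold=0.25, ratio=4.0):
        mag = np.abs(x)
        over = np.maximum(mag - threshold, 0.0)
        gain = (threshold + over / ratio) / np.maximum(mag, 1e-9)
        return x * np.minimum(gain, 1.0)
    ```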

    The problem is that there are “default” weights for 2.0 downmixing that were made in the 90s for professional audio monitoring headphones, and these are the weights used by shitty software from shitty movie distributors or TV sets that don’t care to find out why default downmixing is done the way it is. Netflix could detect that you’re using shitty speakers and automatically reduce dynamic range and boost dialogue for you, they just DGAF. But none of that is the movie’s problem.


  • Right?! A track like Spanish Sahara by Foals that uses the full dynamic range is such a pleasure to listen to. Then there’s In the Air Tonight which IIRC has a digital release with super compressed dynamic range. The whole point of that song is that it slowly builds up to a genre-defining drop, so it had better stand out!

    But people want to listen to movies on their built-in TV speakers with children crying in the background, and they don’t want to understand how or why things are the way they are, they just want to complain that the world doesn’t revolve around them.


  • I had a 5.0 setup before I even bought my first TV. I was just using my PC monitor until then.

    It’s counter-intuitive but decent sound comes first. I’d much rather watch Interstellar in 360p with 5.1 audio than in 4K OLED HDR with built-in speakers.

    But when you say that, people get mad because they spent a grand on a TV that sounds like shit and they feel they have to defend their choices.


  • I see that, but that is not what I am saying.

    This is just not how things work on a technical level. The default is how cinemas work because that’s the experience movies are made for; literally every other way to consume movie audio is “general usage audio programs fine tuning” and that’s what needs fixing. That’s my entire thesis. By calling me elitist you’re just inventing things I’m not saying to get mad over.

    Yes 500 € is a lot of money. But I will say I bought a good audio setup years before I even had a TV (some parts second hand so it did not actually cost me that much, and a 3.0 setup gets you 80 % of the way there). It’s a markedly better experience to watch a movie on a shitty PC monitor with good audio than on a 55" OLED with built-in speakers, and I will die on that hill. And anecdotally I’ve heard actual filmmakers say as much.


  • > people watch stuff on TV and cannot hear any dialogue

    did you read anything I said or do you just want to complain?

    > have a doctorate on audio / put in thousands of dollars into a hobby

    Good news then, a more-than-decent 5.1 setup can be had for ~500 €. A decent soundbar for a few hundred.

    > and let people like you mess around with the settings for your home cinema

    I can’t if the audio source is fucked up because directors have been forced by studios to release with low dynamic range.

    My whole point is that your audio goes Master -> 5.1 channels -> downmixer -> your shitty 2.0 speakers, and my audio goes Master -> 5.1 channels -> receiver -> my 5.1 setup.

    You’re asking the master to change to fit your needs. I’m asking the media players to fix their fucking downmixers because that’s where the problem lies. Leave the studio mastering alone god damn it.


  • Where do you draw the line? If you use a soundbar, someone else is complaining because they use their built-in speakers. But if you optimize for that, someone else is using their laptop speaker on the train.

    What really pisses me off with this “argument” is that the audio information is all right there, which you would know if you bothered to read the second half of my comment before getting all pissy.

    5.1 audio and the standards that superseded it in cinemas all have multiple audio channels, with one dedicated to voice. If you have a shit sound system, the sound system should be downmixing in a way that preserves dialogue better. Again, the information is all right there: there is no stereo track in most movies, your player is building it on-the-fly from the 5.1 track. It’s not the director’s fault that Netflix or Hulu is doing an awful job of accounting for the fact that most of their users are listening on a sound setup that can barely reproduce intelligible speech.


  • Nah, I have a good sound setup and I don’t want to be watching movies with less dynamic range because some people are using their shrill built-in TV speakers with their children screaming in the background, or $5 earbuds.

    If you don’t want to have a proper 5.1 audio setup, it’s not the director’s problem, it’s the media player. Audio compression, center channel boosting, and subtitling are things that media centers have been able to do for decades (e.g. Kodi), it’s just that streaming platforms and TVs don’t always support it because they DGAF. Do look for a “night mode” in your TV settings though, that’s an audio compressor and I have one on my receiver. If you are using headphones, use a media player like Kodi that allows you to boost the center channel (which is dedicated to dialogue).