• 0 Posts
  • 1.49K Comments
Joined 1 year ago
Cake day: March 8th, 2024

  • I mean, those work fine and are fast. You mean we’ll get those for cheap.

    In any case, the image is about physical dimensions, and SD cards are tiny! Considering we’re comparing to a 40 MB mechanical drive, I’m gonna say the comparison is valid and they aren’t even near the bottom of the specs table.

    Of course people like it when ALL the specs get better in these things, but that’s because people like simple things more than true things.



  • Just so we’re clear, the first pass of localization of every game you’ve played in the past decade has been machine-generated.

    Which is not to say the final product was machine-generated; people would then go over the whole text database and change it as needed. But it’s been frequent practice for a while for things like subtitles and translations to start from a machine-generated first draft, not just in videogames but in media in general. People are turning around 24h localization for TV in some places; it’s pretty nuts.

    Machine generated voices are also very standard as placeholders. I’m… kinda surprised nobody has slipped up on that post-AI panic, although I guess historically nobody noticed when you didn’t clean up a machine-translated subtitle, but people got good at ensuring all your VO lines got VOd because you definitely notice those.

    As with a lot of the rest of the AI panic, I’m confused about the boundaries here. I mean, Google Translate has used machine learning for a long time, as have most machine translation engines. The robot voices that were used as placeholders up until a few years ago would probably be fine if one slipped up, but newer games often use very natural-sounding placeholders, so if one of those slips I imagine it’d be a bit of drama.

    I guess I don’t know what “AI generated” means anymore.

    I haven’t bumped into the offending text in the game (yet), but I’m playing it in English, so I guess I wouldn’t have anyway? Neither the article nor the disclosure are very clear.

    That said, the game is pretty good, if anybody cares.




  • Yeeeeah, I have less of a problem with that, because… well yeah, people host stuff for you all the time, right? Any time you’re a client, the host is someone else. Self-hosting makes some sense for services where you’re both the host and the client.

    Technically you’re not self-hosting anything for your family in that case, you’re just… hosting it, but I can live with it.

    I do think this would all go down easier if we had a nice marketable name for it. I don’t know, power-internetting, or “the information superdriveway”. This was all easier in the 90s, I guess is what I’m accidentally saying.


  • This is a me thing and not related to this video specifically, but I absolutely hate that we’ve settled on “homelab” as a term for “I have software on some computer that I expose to my home network”.

    It makes sense if you are also a system administrator of an online service and you’re testing stuff before you deploy it, but a home server isn’t a “lab” for anything, it’s the final server you’re using and don’t plan to do anything else with. Your kitchen isn’t a “test kitchen” just because you’re serving food to your family.

    Sorry, pet peeve over. The video is actually ok.


  • And nothing has replaced it.

    That’s what I was saying, it’s all shaky right now. Wilds runs about as well on both, but it’s noticeably less stuttery for me under Linux. Other stuff, particularly when leaning hard into Nvidia features, is either performing poorly or has features disabled on Linux. Plus the compatibility issues.

    There is just no one-size-fits-all solution on PCs these days, even before you start considering the weirdness of running the same games in ridiculous 1000W powerhouses and 15W handhelds at the same time.

    PC gaming has become a LOT less plug-and-play this last decade, and I don’t know that it’ll go back to where it was any time soon.



  • That depends. In this case, where the Lenovo drivers are clearly outdated and kinda broken, they’re definitely the bottleneck for at least some games. That much they’ve shown by installing newer drivers and demonstrating a massive performance upgrade.

    Although I’d caveat that by saying that their flashier results, with big updates across OSs and driver variants, come from outright unplayable settings. They are benchmarking at settings resulting in framerates in the teens. When they say they saw 12% performance increases on the newer drivers, they mean going from 14 to 16 fps in some cases.
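    The point about percentages at teens-level framerates can be sanity-checked with trivial arithmetic. A quick sketch (the fps and percentage figures are the ones quoted in the comment; the helper name is mine):

    ```python
    def fps_after_gain(base_fps: float, pct: float) -> float:
        """Framerate after a quoted relative performance uplift."""
        return base_fps * (1 + pct / 100)

    # A headline 12% driver uplift on a 14 fps baseline is under 2 fps
    # of absolute difference -- still nowhere near playable.
    print(round(fps_after_gain(14, 12), 2))  # 15.68
    ```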

    Benchmarking properly is hard, I guess is my point.



  • As of right now, both models of the Go S listed on Lenovo’s website have 32 GB of RAM (screenshotted below, if the weird screenshot functionality here works). So no, you’re wrong here. The version with 16 GB is the Go 1. If there is a 16 gig SKU of the Go S, which there may be, they currently don’t have it listed.

    Memory size requirements depend on what you’re trying to run. Easier to run stuff will run on everything, but from hands-on experience I assure you a bunch of newer games struggle with the default allocation of 4 gigs of VRAM and can use the extra RAM. You can still give 8 gigs to the GPU with 16 but then you’re a lot more likely to start struggling with system RAM. If these AMD APUs worked like an Apple chip and could dynamically allocate RAM that wouldn’t be such a pain, but at the moment you need a reboot to change this even on current-gen hardware, so it’s easier to have a larger pool and give the GPU a little too much.
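    To make that trade-off concrete, here’s a trivial sketch of the static carve-out math described above (the gigabyte figures are the ones from the comment; the function is a hypothetical illustration, since on these APUs the split is actually set in firmware and needs a reboot to change):

    ```python
    def ram_left_for_system(total_gb: int, vram_carveout_gb: int) -> int:
        """System RAM remaining after a fixed carve-out for the GPU."""
        return total_gb - vram_carveout_gb

    # An 8 GB carve-out is comfortable on a 32 GB machine, but leaves a
    # 16 GB machine with only 8 GB for the OS plus the game itself.
    print(ram_left_for_system(32, 8))  # 24
    print(ram_left_for_system(16, 8))  # 8
    ```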

    The number of CUs and the VRAM aren’t necessarily related. Even with larger RAM allocations and weaker GPUs you can find yourself in the wrong setup, which is annoying. And it’s not just the amount of RAM: these shared architectures can struggle with bandwidth as well, so speed can matter (although that’s more about getting higher or smoother FPS, and less about the fall-off-a-cliff unplayable mess you get when the game is entirely out of RAM budget). That’s also why I suspect being lighter on memory, and perhaps having a better default setup, may be part of why SteamOS performance is disproportionately better in heavier scenarios compared to what you see on desktop PCs. I can’t be sure, though.

    This comes from me messing around with a literal handful of PC handhelds on Windows, SteamOS and Bazzite. I’m not guessing, I’m telling you what happened during hands-on testing.






  • Antivirus programs? When was the last time you tried Windows, the mid-00s?

    Anyway, it’s not random print services causing CPU overhead; that’s old-timey stuff. In this case it’s Windows being RAM-heavy in a RAM-limited scenario and, from their testing, Lenovo being really terrible at keeping their AMD Windows drivers updated. As part of the test they manually switched to an ASUS version of newer AMD drivers and saw significant boosts in some games.

    Modern graphics drivers are a mess of per-game features and optimizations. Different manufacturers keeping things at different levels of currency is a nontrivial issue and why some of this benchmarking is hard and throwing five random games at the problem doesn’t fully answer the question.




  • It is in some ways. I can tell you I tried to run Prototype 2 on a handheld today, and it didn’t run natively on Windows 11 because it’s old, but putting it into a Proton session and keeping it contained did wonders for it, and the Deck ran it maxed out at 90fps (you forget it can do that if you insist on playing modern games on it, but man, does it look nice on the OLED).

    So hey, it certainly Windows 8s better than Windows 11. There is that.

    But it’s not magic, so I’d still like to figure out what we’re seeing in these examples.