• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: July 5th, 2024





  • They said they would wait until there was a meaningful increase in the power or efficiency they could get out of the form factor.

    The OG Deck launched 3.5 years ago, and since then not much has changed. The Steam Deck GPU has 1.6 TFLOPS of FP32 compute at 15W. AMD’s newest low-power APUs have 2.3 TFLOPS of FP32 at 28W - nearly double the power for a <50% theoretical performance gain (rough arithmetic below).

    A semi-custom APU (one that drops the useless AI engine) would compare more favorably, but we are still talking about maybe 20% more performance - not exactly game-changing for the cost.
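    That arithmetic, with Python used purely as a calculator and the figures quoted above:

    ```python
    # Perf-per-watt comparison using the figures quoted above.
    deck_tflops, deck_watts = 1.6, 15   # original Steam Deck APU (FP32)
    new_tflops, new_watts = 2.3, 28     # newest low-power AMD APU (FP32)

    perf_gain = new_tflops / deck_tflops - 1    # ~0.44 -> ~44% more raw compute
    power_growth = new_watts / deck_watts - 1   # ~0.87 -> ~87% more power draw

    print(f"{perf_gain:.0%} more performance for {power_growth:.0%} more power")
    print(f"TFLOPS/W: {deck_tflops / deck_watts:.3f} (Deck) vs {new_tflops / new_watts:.3f} (new APU)")
    ```

    At those operating points the newer chip is actually less efficient per watt, which is the whole problem for a handheld.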



  • Many games use multiple threads, but they don’t do so very effectively.

    The vast majority of games use Unreal or Unity, and those engines (as products) are optimized to make the developer experience easy - notably NOT to make the end product performant.

    It is pretty common that there is one big thread that handles rendering and another for most game logic - this is how Unreal does it ‘out of the box’ - with the physics calculations split off into multiple threads semi-automatically.

    Having a lot of moving characters around is taxing because all the animation states have to go through the main thread, which is also doing pathfinding for all the characters and running any AI scripts… often you can’t completely separate these things, since where a character wants to move may determine whether they walk/run/jump/fly/swim, and those need different animations.

    This often leads to the scenario where someone with an older 8+ core chip wonders why the game is stuttering when ‘it is only using 10% of my CPU’ - because the render thread or game-logic thread is saturated and pinning one core/thread at 100%.

    Effective concurrency requires designing for it very early, and most games are built in iterative refinements with the scope and feature list constantly changing - not conducive to solving the hard CS problem of splitting each frame’s calculations into independent chunks (a toy sketch of the two-thread split is below).
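    All names and timings in this sketch are invented, and Python’s GIL means these two threads don’t truly run in parallel - but the structure is the point:

    ```python
    # Toy render-thread / game-logic-thread split.
    import threading
    import queue
    import time

    frame_queue = queue.Queue(maxsize=1)  # logic hands each frame's state to the renderer

    def game_logic():
        tick = 0
        while tick < 5:
            time.sleep(0.016)                # stand-in for pathfinding, AI, animation states
            tick += 1
            frame_queue.put({"tick": tick})  # blocks if the renderer falls behind

    def render():
        while True:
            state = frame_queue.get()        # blocks if game logic falls behind
            time.sleep(0.008)                # stand-in for building and submitting draw calls
            print(f"rendered frame {state['tick']}")
            if state["tick"] == 5:
                break

    logic_thread = threading.Thread(target=game_logic)
    render_thread = threading.Thread(target=render)
    logic_thread.start(); render_thread.start()
    logic_thread.join(); render_thread.join()
    # Whichever loop is slower pins 'its' core while the rest of the CPU idles -
    # hence a stuttering game at '10% total CPU usage'.
    ```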


  • The issue isn’t that they didn’t work; as I said, I wasn’t expecting them to when I bought the mouse.

    The issue is that their behavior has started changing with updates. I don’t mind - I’m a tinkerer - but my wife, my MiL, and most of my friends absolutely do not want to deal with an inconsistent computer experience.

    Different definitions of ‘ready’, I guess. I’ve been using primarily Linux for years, so it was ‘ready’ for me back then - but nothing has changed in the meantime that would change my recommendation for people who just want a boring, stable computer.


  • I love Linux, but it isn’t ready.

    Two weeks ago, my side mouse buttons started working (they require Logitech software on Windows, so I wasn’t expecting them to work). Last week they stopped. This week they work again.

    Is this major? Not at all. Would it drive my mother-in-law into a rage rivaling that of Cocaine Bear? Absolutely. Spare me from the bear, keep Linux for the tinkerers.


  • Most games (pre-AI, at least) would use a brush for this and manually tweak the result if it ended up weird.

    E.g. if you were building a desert landscape, you might use a rock brush to randomly sprinkle boulder assets around the area, then a bush brush to sprinkle some dry bushes.

    It is very rare for someone to spend the time individually placing something like a rock or a tree, unless it is designed to be used in gameplay or a cutscene (e.g. a climbable tree to get into a building through a window). A sketch of what such a brush does is below.
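    Roughly what a scatter brush does under the hood - random position, rotation, and scale inside a circular stroke. The names and density model here are made up for illustration; this is not any particular engine’s API:

    ```python
    # Minimal "scatter brush": random placements inside a circular stroke.
    import math
    import random

    def scatter_brush(cx, cy, radius, density, seed=None):
        """Return (x, y, rotation_deg, scale) tuples for one brush stroke."""
        rng = random.Random(seed)
        count = max(1, int(density * math.pi * radius ** 2))  # scale count with stroke area
        placements = []
        for _ in range(count):
            # sqrt on the radial draw gives a uniform spread (no clumping at the center)
            r = radius * math.sqrt(rng.random())
            theta = rng.uniform(0, 2 * math.pi)
            placements.append((cx + r * math.cos(theta),
                               cy + r * math.sin(theta),
                               rng.uniform(0, 360),     # random rotation
                               rng.uniform(0.8, 1.3)))  # random scale
        return placements

    # One stroke of a "rock brush" over a desert patch; an artist would then
    # hand-tweak anything that landed somewhere weird.
    for x, y, rot, scale in scatter_brush(0.0, 0.0, radius=10.0, density=0.05, seed=42):
        print(f"boulder at ({x:.1f}, {y:.1f}), rot {rot:.0f} deg, scale {scale:.2f}")
    ```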




  • skibidi@lemmy.world to linuxmemes@lemmy.world · Linux is not ready · 8 months ago

    Linux isn’t ready.

    While many things will work ‘out of the box’, many won’t. Hell, for like 3 months HDR was causing system-wide crashes on Plasma for Nvidia cards, so the devs just disabled the HDR options until there was an upstream fix.

    There are still a host of resume-from-sleep issues, Wayland support is still spotty, and most importantly - not every piece of software will run.

    Linux is my daily driver; I have learned to live with and love the jank. My wife uses Windows and does not want to be confronted with a debugging challenge 5% of the time when she turns on her computer, and I think that is fair.

    These kinds of posts paper over lots of real issues and can be counterproductive: if someone jumps into the ecosystem without understanding them, they are only set up for frustration and disappointment.


  • Yes, of course, there is financing and everything else. I was getting a bit deeper:

    If you have to spend 100 joules building a power plant, it had better give back more than 100 joules over its lifetime - otherwise it was never worth building. That isn’t strictly true - there are special purposes - but certainly for a grid-scale energy deployment you would need, at a bare minimum, each plant to pay for itself in terms of energy investment.

    The dollars follow from that physical reality.

    The first hurdle for fusion to clear is that the reaction outputs more energy than it needs to sustain itself. This would be a great academic success, and not much more.

    The second hurdle is that the output exceeds the sustainment energy even after accounting for capture losses (e.g. from neutrons, turbine efficiency, etc.) and production inefficiencies (lasers need more energy input than they impart to the reaction chamber, magnets need cooling, etc.).

    The third hurdle is that over the lifetime of a plant, it produces enough excess energy to build itself and pay the embodied costs of all maintenance and operations work. If the reaction is technically energy positive, but you need to replace the containment vessel every 48 hours due to neutron embrittlement, then the plant better be productive enough to pay for refining all that extra steel.

    The fourth hurdle is then that it produces more excess energy per unit of invested energy than any other form of power generation - at which point we’d never build solar panels again.

    These final hurdles are in no way guaranteed to be cleared. Artificial fusion needs to be orders of magnitude denser than natural fusion (stars) to make any sense… averaged over its whole volume, the Sun only produces about 0.3 W per cubic meter - less than a compost heap - so a fusion plant with the Sun’s power density would need a volume of several cubic kilometers just to power around 1 million US homes.
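    In symbols (my own rough notation, nothing standard from the fusion literature): with P_fus the fusion power released, P_in the power needed to sustain the reaction, η the net capture/conversion efficiency, L the plant lifetime, and E_emb the embodied energy of building and maintaining the plant, the hurdles stack roughly as:

    ```latex
    % Rough symbolic form of the four hurdles (illustrative notation only)
    \begin{align*}
      \text{(1)}\quad & Q \equiv P_{\mathrm{fus}} / P_{\mathrm{in}} > 1
          && \text{reaction is net-positive at all} \\
      \text{(2)}\quad & \eta\, P_{\mathrm{fus}} - P_{\mathrm{in}} > 0
          && \text{still net-positive after capture and production losses} \\
      \text{(3)}\quad & (\eta\, P_{\mathrm{fus}} - P_{\mathrm{in}})\, L > E_{\mathrm{emb}}
          && \text{plant repays its own embodied energy} \\
      \text{(4)}\quad & \mathrm{EROI}_{\mathrm{fusion}} > \mathrm{EROI}_{\mathrm{alt}}
          && \text{beats solar, wind, fission, etc.}
    \end{align*}
    ```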



  • Economical energy production, sure, not any energy production. There is a reason we no longer burn wood to heat public baths.

    I realize the science marketing of fusion over the past 60 years has been ‘unlimited free energy’, but that isn’t quite accurate.

    Fusion (well, at least deuterium fusion) would be ‘unlimited’ in the sense that the fuel needed is essentially inexhaustible. Tens of thousands of years of worldwide energy demand in the top few inches of the ocean.

    However, that ‘free’ part is the killer; fusion is very expensive per unit of energy output. For one, deuterium fusion is incredibly ‘inefficient’: most of the energy is released as high-energy neutrons, which generate radioactive waste, damage the containment vessel, and convert to electricity at low efficiency. More exotic forms of fusion ameliorate this downside to a degree, but they require rarer fuels (hurting the ‘unlimited’ value proposition) and more extreme conditions to sustain, further increasing the per-unit cost of energy.

    Think of it this way: a fusion plant has an embodied cost - the energy required to make all the stuff that comprises the plant - call that C. It has an operating cost per unit time, in both human effort and energy input - call that O. It has a lifetime - call that L. And it has an average power output - call that E.

    For fusion to make economical sense, the following statement must be true:

    (E-O)*L - C > 0.

    In other words, it isn’t sufficient that the reaction returns more energy than it requires to sustain itself; it must also return enough excess energy to ‘pay’ for the people who run the plant, the maintenance it needs, and its initial construction (at a minimum). If the statement above exactly equals zero, the plant doesn’t actually give any usable energy - it only pays for itself (toy numbers below).
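    A toy numeric check of that inequality - all numbers invented for illustration, no real plant data here:

    ```python
    # Toy check of the break-even inequality (E - O) * L - C > 0.
    E = 1000.0              # average power output, MW
    O = 300.0               # operating cost as an equivalent power drain, MW
    L = 40 * 365 * 24       # 40-year lifetime, in hours
    C = 2.0e8               # embodied energy of building the plant, MWh

    surplus_mwh = (E - O) * L - C   # lifetime usable surplus, MWh
    print(f"Lifetime surplus: {surplus_mwh:.3e} MWh")
    print("worth building" if surplus_mwh > 0 else "never pays for itself")
    ```

    With these made-up numbers the plant clears the bar; shrink the lifetime or inflate the embodied cost and it flips to a net energy sink.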

    This is hardly the most sophisticated analysis - I encourage you to look into the economics of fusion if you are interested - but it gets to the heart of the matter. Fusion can be free, unlimited, and economically worthless all at the same time.


  • If they hadn’t been a fascist ethnostate led by a madman, they probably wouldn’t have launched the war in the first place. The utterly misguided belief in their own superiority is what blinded them to the (rather obvious) conclusion that they didn’t have the resources to conquer Europe (mostly) single-handedly - let alone while dragging Italy along with them.

    Hell, the only reason it was even - somewhat - close at points was Hitler’s insistence on a blitz through the Ardennes to attack France. The generals thought it was a terrible plan (and it was - which is a big reason the French were unprepared for it and got essentially knocked out of the war in weeks).

    WW2 is interesting precisely because the big numbers only point one way - a complete defeat of Germany and Japan by much larger and better-supplied powers. But there were multiple points where tactical developments could have become strategic victories - which are rather rare occurrences in the study of war.

    E.g. the Nazis didn’t have the resources to conquer the Soviet Union, but if the battles of Stalingrad and Moscow had gone their way, it is difficult to see how the USSR could have maintained a functioning government. Likewise, Japan was woefully underprepared to defeat the US in the Pacific, but if the US carriers had been sunk at Pearl Harbor, maybe the Japanese hedgehog strategy of fortifying the Pacific islands works out.

    Of course, once the bomb was ready, nothing else mattered.

    Ultimately, it was all a massive tragedy the likes of which I hope we never see again. The counterfactuals are fun to play out, if you can abstract away the millions of deaths on all sides.


  • skibidi@lemmy.world to Science Memes@mander.xyz · PhD ain't no MD · 10 months ago

    Completely correct. There is also an ScD degree - Doctor of Science - which is much rarer in the US.

    In the US, it is often identical to a PhD. If your institution offers it, you just check a box at the end of your program on whether you want a PhD or ScD. In Europe, an ScD is a higher degree than a PhD and requires some extra work to obtain.