I think I’m the type of person who gets into things after everyone else. In that regard AI is no different, and for a long time I considered LLMs a toy - this was truer of older models, such as the original ChatGPT models that came out in 2022-2023.

The discourse has understandably evolved over time and it’s clear that AI is not going anywhere. It’s like quadcopters in warfare, or so many other new technologies before it. As much as we’d like them not to be used or to exist, they will be anyway. To refuse to adopt new advancements is to be left behind and to give oneself a disadvantage on purpose.

Ultimately the problems around AI stem from capitalism. Yes, there are excesses. But this is true of humans too.

AI - especially LLMs, which I have more experience with - is great at some tasks and absolutely abysmal at others. Just like some people are good at their job and others don’t know the first thing about it. I used to get an ad on Twitter about some guy’s weird messianic book, in which he showed two pages. It was the most meaningless AI bullshit, just faffing on and on while saying nothing, written in the most eye-rolling way.

That’s because LLMs currently aren’t great at writing prose for you. Maybe if you prompt them just right they can be, but that’s also a skill in itself. So we see that there is bottom-of-the-barrel quality and better quality, and that exists with or without AI. I think the over-reliance on AI to do everything regardless of output will eventually be pushed out, and people who rely on it that way will stop finding success (if they even found it in the first place - don’t readily believe people when they boast about their own success).

I use AI to code, for example. It’s mostly simpler stuff, but:

1- I would have to learn entire programming languages to do it myself, which takes years. AI can do it in 30 minutes, and better than I could in years, because it knows things I don’t. Take security, for example: would a hobbyist programmer know how to write secure web code? I don’t think so.

2- You don’t always have a coder friend available. In fact, the reason I started using AI to code my solutions is that, try as we might, we could never find coders to help. So it was either not implement cool features that people would like, or do it with AI.

And it works great! I’m not saying it’s the top-tier quality I mentioned, but it’s a task AI is very good at. Recently I even gave DeepSeek all the JS code it had previously written for me (plus some handwritten code) and asked it to refactor the entire file, and it did. We went from a 40 kB file to 20 kB after refactoring, and 10 kB after minifying. It’s not a huge file of course, but it’s something AI can do for you.

There is of course the environmental cost. To that I want to say that everything has an environmental cost. I don’t necessarily deny that AI is a water hog, just that under capitalism, everything contributes to climate change and droughts. Moreover, to be honest, I’ve never seen actual numbers and studies; everyone just says “generating this image emptied a whole bottle of water”. It’s one of those things people repeat idly, like so many others; and without facts, we cannot find truth.

Therefore the problem is not so much with AI but with the mode of production, as expected.

Nowadays it’s possible to run models on consumer hardware that doesn’t need to cost $10,000 (though you might have seen that post about the $2,000 rig that can run the full DeepSeek model). DeepSeek itself is very efficient, and even more efficient models are being made, to the point that soon it will be more costly (and resource-intensive) to meter API usage than to give it out for free.

I think the place you have as a user is finding where AI can help you individually. People also like to say AI fries your brain - that it incentivizes you to shut your brain off and just accept the output. I think that’s a mistake, and it’s up to you not to do that. I’ve learned a lot about how Linux works, how to manage a VPS, and how to work on MediaWiki with AI help. Just like you should eat your vegetables and not too many sweets, you should be able to say “this is wrong for me” and stop yourself from doing it.

If you’re a professional coder and work better with handwritten code, then continue with that! When it comes to students relying on AI for everything, schools need to find other methods. Right now they’re going backwards to pen-and-paper tests. Maybe we should rethink the entire testing method? When I was in school, years before AI, my schoolmates and I could already tell that rote memorization was torture and a 19th-century way of teaching. I think AI is just the nail in the coffin for a very, very outdated method of teaching. Why do kids use AI to do their homework for them? That is a much more important question than how they are using it.

As a designer I’ve used AI to help get me started on some projects, because that’s my weakness. Once I get the ball rolling it becomes very easy for me, but getting it moving in the first place is the hard part. If you’re able to prompt it right (which is definitely something I lament - it feels like you have to say the right magic words, and they don’t always work), it can help with that, and then I can do my thing.

Personally, part of my unwillingness to get into AI initially came from the evangelists who like to say literally every new tech thing is the future. Segways were the future, crypto was the future, VR was the future, NFTs were the future, Google Glass was the future… They make money saying these things, so of course they have an incentive to say them. It still bothers me that they exist, if you were wondering (in case they bother you too lol), but ultimately you have to ignore them and focus on your own thing.

Another part of it, I think, is how much mysticism there is around it, with companies and, let’s say, AI power users being unwilling to share their methods or explain how LLMs actually work. They keep information to themselves, or lead people to think it’s magic and does everything.

Is AI coming for your job? Yes, probably. But burying our heads in the sand won’t help. I see a lot of translators talking about the soul of their art - everything has a soul and is art now (I even saw a programmer say that to explain why they don’t use AI in their work); we’ve come full circle back to base idealism to “explain” how human work is different from AI work. AI already handles some translation work very well, and professionals are already losing work to it. Saying “refuse to use AI” is not materially sound; it is not going to save their client base. Under socialism getting your job automated is desirable, though not under capitalism of course. But this is not new either: machines have replaced human workers for centuries, as far back as the printing press, to name just one. Yet nobody today is saying “return to scribing monks”.

I think it would be very useful to have an AI guide written for communists by communists. Something that everyone can understand, written from a proletarian perspective - not the philosophy of it but more how the tech works, how to use it, etc. I can put it up on the ProleWiki essays space if someone wants to write it; we’ve put up guides before - there’s a nutrition and fitness guide written from a communist perspective, for example.

  • SunsetFruitbat@lemmygrad.ml
    I think it is helpful to put some things in perspective. For electricity usage, data centers only take up 1-1.5% of global electricity usage, as stated here: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks

    What is the role of data centres and data transmission networks in clean energy transitions?

    Rapid improvements in energy efficiency have helped limit energy demand growth from data centres and data transmission networks, which each account for about 1-1.5% of global electricity use. Nevertheless, strong government and industry efforts on energy efficiency, renewables procurement and RD&D will be essential to curb energy demand and emissions growth over the next decade.

    To also cite from that article, there is also this mention:

    Data centres and data transmission networks are responsible for 1% of energy-related GHG emissions

    So even for overall GHG emissions, data centers in general account for very little. Of course, with this technology being used more, electricity usage will rise a bit, but it will still likely be small in the grand scheme of things. Another question is how much of that is specifically AI, as opposed to data centers in general? One cited figure is that 10-20% of data center energy is designated to AI usage. Like here https://time.com/6987773/ai-data-centers-energy-usage-climate-change/

    Porter says that while 10-20% of data center energy in the U.S. is currently consumed by AI, that percentage will likely “increase significantly” going forward.

    So, a lot of data center capacity is just being used for lots of other things, like cloud services for example, though the share used by AI is growing.
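    Putting the two cited figures together gives a rough back-of-the-envelope bound. Note this mixes a global number (IEA) with a US-only number (the 10-20% AI share), so treat it as an illustration rather than a measurement:

    ```python
    # Rough bound on AI's share of global electricity, combining the
    # two figures quoted above. Assumption: the 10-20% US figure is
    # used as a proxy for the global AI share of data center energy.
    dc_share = (0.010, 0.015)      # data centers: 1-1.5% of global electricity (IEA)
    ai_share_of_dc = (0.10, 0.20)  # AI: 10-20% of data center energy (US figure)

    low = dc_share[0] * ai_share_of_dc[0]
    high = dc_share[1] * ai_share_of_dc[1]
    print(f"AI would be roughly {low:.1%} to {high:.1%} of global electricity")
    # → roughly 0.1% to 0.3%
    ```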

    Besides that, on water usage: that is a problem, especially when data centers in general are built in areas that can’t really sustain them. But this applies to data centers in general, and it was happening before the AI boom of the last two years. I think it is also worth mentioning that Google and the rest are able to buy water rights, which also completely fucks over First Nations, who don’t get a say in these things.

    To quote Kaffe, who I think is also on here too?

    Instead of weaponizing climate anxiety to attack AI merely to defend property law and labor aristocracy, let’s cut to specific issues like Meta’s and Google’s ability to purchase water in violation of treaties.

    https://xcancel.com/probablykaffe/status/1905480887594361070#m

    • into_highest_invite@lemmygrad.ml
      the IEA report was made in mid-2023, and i would imagine ai electricity usage has skyrocketed since then. as mentioned in the mit source, dating to may 2025, the electricity used by ai is 48% more carbon-intensive than the us average. my problem with ai isn’t that it violates intellectual property rights, it’s that llms are a net negative to society because of their climate effects. if ai datacenters were built using clean energy and cooled using dirty water, it would likely be little more than a mild annoyance for me. as it stands, we are putting the global south underwater so that people who are surrounded by yes-men can have yes-robots too.

      • SunsetFruitbat@lemmygrad.ml
        Skyrocketed by how much? To also use that mit source,

        In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

        They link to a report from last year, but based on that, that still seems rather small, no? And that is just in the U.S. alone, not globally. I do agree that the share used by data centers, and by extension LLMs, will grow in the future, but in the grand scheme of things it is still relatively small.

        To add, I mainly quote Kaffe because I see lots of people elsewhere saying that data centers take so much water, while ignoring things like what Kaffe mentioned in that quote. Another thing I wanted to add, which I forgot in my earlier reply: why is it that LLMs get the brunt of this, while wasteful water practices in farming don’t get a mention - like the growing of alfalfa in the southwestern United States, which wastes a lot of water as well?

        as it stands, we are putting the global south underwater so that people who are surrounded by yes-men can have yes-robots too.

        I’m not exactly sure how data centers, or rather LLMs, are doing that, or being the sole contributor to it? My guess is that you mean the mines (and the new mines being opened), the transportation needed to build and maintain said data centers, the greenhouse gas emissions that result from those activities, and the electricity generated to power them, usually from dirty sources as in the United States. Yet again, data centers’ share of greenhouse gas emissions is small compared to more direct things like the U.S. military being the largest polluter, or transportation contributing a large share of emissions.

        I just feel like it goes back again to the issues of capitalism, the context AI exists in, like CriticalResist said - and AI is not really unique in its contribution to climate change?

        • Commiejones@lemmygrad.ml
          The other thing that people are forgetting is that AI is also helping make data centers more efficient. How much energy is wasted on inefficient code? I’d bet it is more than 1%.

          • into_highest_invite@lemmygrad.ml
            it seems dubious to claim that large language models write more efficient code. the popularity of node.js alone makes me doubt there’s all that much efficient code out there for them to train on, at least percentage-wise. i mean, the most popular app for hardcore gamers to run in the background packages and runs its own copy of google chrome. add to that hallucinations and code quality and whatever else, and i doubt llm code achieves the efficiency of a high school coding class, at least in the general case

        • into_highest_invite@lemmygrad.ml
          LLMs get the brunt of it because alfalfa has more uses than chatgpt. maybe it’s the result of my own bias, but i would consider golf courses more useful than chatgpt. LLMs aren’t even close to the sole contributor to climate change, but they are more emblematic of venture capitalists than anything else, i think. it’s hard for me to justify the creation and use of these things when they have very narrow use cases, often create as much work as they save, and suck down clean drinking water like i suck down whiskey sours