RedWizard [he/him, comrade/them]

  • 196 Posts
  • 865 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • Its energy consumption is absolutely unacceptable; it puts the crypto market to utter shame in terms of ecological impact. I mean, Three Mile Island Unit 1 is being recommissioned to service Microsoft datacenters instead of the 800,000 homes it could service with its 835 megawatt output. This is being made possible by taxpayer-backed loans provided by the federal government. So Americans’ tax dollars are being funneled into a private energy company, to provide a private tech company with 835 megawatts of power output, for a service it is attempting to make a profit from, instead of those households being provided clean, reliable energy.

    Power consumption is only half of the ecological impact AI brings to the table, too. Cooling the systems behind AI text generation has been found to consume just over one bottle of water (519 milliliters) per 100 words generated, roughly the length of a brief email (see the back-of-envelope sketch at the end of this comment). In areas where electricity costs are high, datacenters consume an insane amount of water from the local supply. In one case, The Dalles, Oregon, Google’s datacenters were using nearly a quarter of all the water available in the town. Some of these datacenters use cooling towers in which outside air travels across a wet medium so that the water evaporates, which means the cooling water is not recycled; it is consumed and removed from whatever water supply they are drawing from.

    These datacenters consume resources, but often bring no economic advantages to the people living in the areas where they are constructed. Instead, those people are subjected to the noise of the cooling systems (if electrically cooled), a hit to their property values, strain on the local electric grid, and, if the facility is liquid cooled, a massive draw on the local water supply.

    Models need to be trained, and that training happens in datacenters and can take months to complete. Training is an expense the company pays just to get these systems off the ground, so before any productive benefit can be gained from these AI systems, a massive amount of resources has to be consumed just to train the models. Microsoft’s datacenter used 700,000 liters of water while training GPT-3, according to the Washington Post. Meta used 22 million liters of water training its LLaMA-3 open source AI model.

    And for what, exactly? As others have pointed out in this thread, and more broadly outside this community, these models only succeed wildly when placed into a bounded test scenario. As commenters on this NYT article point out:

    Major problem with this article: competition math problems use a standardized collection of solution techniques, it is known in advance that a solution exists, and that the solution can be obtained by a prepared competitor within a few hours.

    “Applying known solutions to problems of bounded complexity” is exactly what machines always do and doesn’t compete with the frontier in any discipline.

    Note in the caption of the figure that the problem had to be translated into a formalized statement in AlphaGeometry’s own language (presumably by people). This is often the hardest part of solving one of these problems.

    These systems are only capable of performing within the bounds of existing content. They are incapable of producing anything new or unexplored. When one data scientist looked at the o1 model, he had this to say about how quickly it produced code that had taken him months to complete:

    Kyle Kabasares, a data scientist at the Bay Area Environmental Research Institute in Moffett Field, California, used o1 to replicate some coding from his PhD project that calculated the mass of black holes. “I was just in awe,” he says, noting that it took o1 about an hour to accomplish what took him many months.

    He makes these remarks with almost no self-awareness. The likelihood that this model was trained on his very own research is very high, so naturally the system was able to provide him with a solution. The data scientist labored for months creating a solution that, presumably, did not exist beforehand, and the o1 model simply internalized it. When asked to provide that solution, it did so. This isn’t an astonishing accomplishment; it’s a complicated, expensive, and damaging search engine that will hallucinate an answer when you ask it to produce something that sits outside the bounds of its training.

    The vast majority of use cases for these systems by the public are not cutting-edge research. It’s writing the next 100-word email you don’t want to write, and sacrificing a bottle of water every time it does. It’s taking jobs held by working people and handing them to a system that is often exploitable, costly, and inefficient at performing the job. These systems are a parlor trick at best, and a demon whose hunger for electricity and water is insatiable at worst.
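
    A rough back-of-envelope sketch using only the figures cited above (835 MW versus 800,000 homes, and 519 mL of water per 100-word email); the million-email scale-up is purely illustrative, not a figure from any source:

        # Back-of-envelope check using only the numbers cited in this comment.
        TMI_UNIT1_OUTPUT_MW = 835        # Three Mile Island Unit 1 output
        HOMES_SERVICED = 800_000         # homes that output could otherwise serve
        WATER_ML_PER_100_WORDS = 519     # ~1 bottle of water per 100-word email

        # Implied average draw per household if 835 MW covers 800,000 homes
        kw_per_home = TMI_UNIT1_OUTPUT_MW * 1000 / HOMES_SERVICED
        print(f"Implied average draw per home: {kw_per_home:.2f} kW")   # ~1.04 kW

        def water_liters(emails, words_per_email=100):
            """Rough water footprint, in liters, of generating short emails."""
            return emails * (words_per_email / 100) * WATER_ML_PER_100_WORDS / 1000

        # Illustrative scale-up: one million 100-word emails
        print(f"{water_liters(1_000_000):,.0f} liters")                 # ~519,000 liters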

  • In total, between 2019 and 2020, “IRI issued 11 advocacy grants to artists, musicians, performers or organizations that created 225 art products addressing political and social issues,” which it claimed were “viewed nearly 400,000 times.” Additionally, the Institute bragged that it “supported three civil society organizations (CSOs) from LGBTI, Bihari and ethnic communities to train 77 activists and engage 326 citizens to develop 43 specific policy demands,” which were apparently “proposed before 65 government officials.”

    Between October and December of 2020, the IRI hosted three separate “transgender dance performances” across the country. Per the report, “the goal of the performance was to build self-esteem in the transgender community and raise awareness on transgender issues among the local community and government officials.” At the final performance, in Dhaka City, the US Embassy sent its “deputy consul general and deputy director of the Office for Democracy, Rights and Governance” to participate.

    Finally, the IRI also carried out “community-specific quantitative and qualitative research,” which included “three focus group reports” and what it called “the largest published survey of LGBTI people in Bangladesh.”

    In sum: “IRI’s program raised public awareness on social and political issues in Bangladesh and supported the public to challenge the status quo, which ultimately aims for power shift [sic] inside Bangladesh.”

    In the US, Republican Party politicians have traditionally scorned government support for visual artists, transgender dancers, and rappers. But when an opportunity to install a more US-friendly government arose, the GOP’s in-house regime change organ eagerly transformed its domestic cultural enemies into political foot soldiers.

    This is incredible. This is such a prime, blatant example of Rainbow Imperialism; it’s like this example was cooked up in a lab. Here you have the most vile supporters of the erasure of queer life in America doing queer activism in Bangladesh, funding queer artists and organizations to raise awareness of queer struggles within Bangladesh, with the explicit goal of creating political unrest in the country so that its leadership can be removed and replaced with a Comprador Dictator.

    If this were an organization run by the Democratic Party, you would be hard-pressed to get any liberal to look at it critically. All they would see is their chosen saviors doing savior shit in Bangladesh. They would conclude that the change in leadership is justified because they’re “saving” queer people.

    So how do they square this circle now? How do they reconcile the GOP doing Trans Rights in Bangladesh to topple a government while also trying to genocide Trans people through stochastic terror campaigns here in the homeland? What could the conclusion possibly be for these liberals?