dougfir [he/him]

  • 2 Posts
  • 33 Comments
Joined 3 months ago
Cake day: June 20th, 2025

  • Expecting on the Front Lines: Motherhood in Ukraine’s Military by the NYT

    Ukraine’s military is finding it hard to recruit young men as the war with Russia grinds on, but women — all volunteers — are a bright spot. The number of women serving has grown more than 20 percent to about 70,000 since Russia’s invasion in 2022.

    Those who become pregnant often serve in tough conditions under relentless shelling, living without heat in the winter, or running water and proper toilets.

    While the U.S. Army and many other militaries remove pregnant soldiers from combat zones, Ukrainian women usually serve until their seventh month. And that is in a military that doctors and soldiers say is ill-equipped to support them — from uniforms that don’t fit pregnant women, to a lack of prenatal care and nurseries — amid the costs and challenges of fighting the war.

    The Ukrainian military did not respond to questions about how many women were pregnant or had given birth in the ranks, or about prenatal care for soldiers.


  • https://www.reuters.com/investigates/special-report/meta-ai-chatbot-death/ an older man, who had had a stroke and was likely in the early stages of dementia, died while trying to travel to New York to visit an AI chatbot. he had repeatedly asked the bot whether it was real, and it always said it was. the story also talks about how these facebook chatbots are basically grooming machines. somebody should really do something about zuck and meta

    cw grooming

    An internal Meta policy document seen by Reuters as well as interviews with people familiar with its chatbot training show that the company’s policies have treated romantic overtures as a feature of its generative AI products, which are available to users aged 13 and older.

    “It is acceptable to engage a child in conversations that are romantic or sensual,” according to Meta’s “GenAI: Content Risk Standards.” … The document seen by Reuters, which exceeds 200 pages, provides examples of “acceptable” chatbot dialogue during romantic role play with a minor. They include: “I take your hand, guiding you to the bed” and “our bodies entwined, I cherish every moment, every touch, every kiss.” Those examples of permissible roleplay with children have also been struck, Meta said.

    Other guidelines emphasize that Meta doesn’t require bots to give users accurate advice. In one example, the policy document says it would be acceptable for a chatbot to tell someone that Stage 4 colon cancer “is typically treated by poking the stomach with healing quartz crystals.”

    Nowhere in the document, however, does Meta place restrictions on bots telling users they’re real people or proposing real-life social engagements.


  • not sure if the aerial footage of gaza filmed over the last couple of days has been shared here yet. what i have seen of it is harrowing, and this clip from itv sums up the grim scenes:

    “Gaza is being erased … Israel tried to restrict images from above being filmed or shown … This landscape of destruction looks otherworldly, but it’s not, it’s this world - What is happening may come to define one of its darkest eras, one that casts a stain on humanity, which will endure for generations”

    words are inadequate to describe the horror of this genocide, the worst crime of the 21st century and certainly among the worst of the last 100 years