• 1 Post
  • 396 Comments
Joined 1 year ago
Cake day: July 14th, 2023

  • Giphy has a documented API that you could use. There have been bulk downloaders, but I didn’t see any that had recent activity. However you still might be able to use one to model your own script after, like https://github.com/jcpsimmons/giphy-stacks
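As a sketch of what a script modeled on one of those might look like, here's a minimal bulk downloader against Giphy's documented search endpoint. The API key, search query, and output directory are placeholders you'd supply yourself; check Giphy's API docs for rate limits and the exact response shape.

```python
# Minimal sketch of a bulk GIF downloader using Giphy's documented search API.
# The API key, query, and destination directory are placeholders.
import json
import os
import urllib.parse
import urllib.request

GIPHY_SEARCH = "https://api.giphy.com/v1/gifs/search"

def search_url(api_key: str, query: str, limit: int = 25, offset: int = 0) -> str:
    """Build the Giphy search request URL from the documented parameters."""
    params = urllib.parse.urlencode(
        {"api_key": api_key, "q": query, "limit": limit, "offset": offset}
    )
    return f"{GIPHY_SEARCH}?{params}"

def download_gifs(api_key: str, query: str, dest: str = "gifs") -> None:
    """Fetch one page of search results and save each GIF's original rendition."""
    os.makedirs(dest, exist_ok=True)
    with urllib.request.urlopen(search_url(api_key, query)) as resp:
        results = json.load(resp)["data"]
    for gif in results:
        url = gif["images"]["original"]["url"]
        urllib.request.urlretrieve(url, os.path.join(dest, f'{gif["id"]}.gif'))
```

To grab more than one page you'd loop, bumping `offset` by `limit` each time until `data` comes back empty.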

    There were downloaders for Gfycat - gallery-dl supported it at one point - but it’s down now. However you might be able to find collections that other people downloaded and are now hosting. You could also use the Internet Archive - they have tools and APIs documented

    There’s a Tenor mass downloader that uses the Tenor API and an API key that you provide.

    Imgur hosts GIFs and is supported by gallery-dl, so that’s an option.

    Also, read over https://github.com/simon987/awesome-datahoarding - there may be something useful for you there.

    In terms of hosting, it would depend on my user base and if I want users to be able to upload GIFs, too. If it was just my close friends, then Immich would probably be fine, but if we had people I didn’t know directly using it, I’d want a more refined solution.

    There’s Gifable, which is quite focused but looks like it has a pretty small following. I haven’t used it myself to see how suitable it is. If you self-host it (or something else that uses S3), note that you can use MinIO or LocalStack for the S3 container rather than using AWS directly. I’m using MinIO as part of my stack now, though for a completely different app.
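To illustrate the MinIO point: since MinIO speaks the S3 API, the only change versus AWS is where you point the client. This sketch uses boto3; the endpoint and credentials below are MinIO's development defaults, and the bucket and file names are made up for the example.

```python
# Sketch: point an S3 client at a local MinIO container instead of AWS.
# Endpoint and credentials are MinIO's dev defaults - swap in your own.
def minio_s3_client(endpoint: str = "http://localhost:9000",
                    access_key: str = "minioadmin",
                    secret_key: str = "minioadmin"):
    """Return a boto3 S3 client aimed at a local MinIO instance."""
    import boto3  # pip install boto3

    return boto3.client(
        "s3",
        endpoint_url=endpoint,            # MinIO's default API port
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )

if __name__ == "__main__":
    s3 = minio_s3_client()
    s3.upload_file("party.gif", "gifs", "party.gif")  # assumes bucket "gifs" exists
```

Everything else - presigned URLs, multipart uploads, bucket policies - works the same way, which is what makes MinIO a drop-in for apps that expect S3.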

    MediaCMS is another option. Less focused on GIFs but more actively developed, and intended to be used for this sort of purpose.



  • Wouldn’t be a huge change at this point. Israel has been using AI to determine targets for drone-delivered airstrikes for over a year now.

    https://en.m.wikipedia.org/wiki/AI-assisted_targeting_in_the_Gaza_Strip gives a high level overview of Gospel and Lavender, and there are news articles in the references if you want to learn more.

    This is at least being positioned better than the ways Lavender and Gospel were used, but I have no doubt that it will be used to commit atrocities as well.

    For now, OpenAI’s models may help operators make sense of large amounts of incoming data to support faster human decision-making in high-pressure situations.

    Yep, that was how they justified Gospel and Lavender, too - “a human presses the button” (even though they’re not doing anywhere near enough due diligence).

    But it’s worth pointing out that the type of AI OpenAI is best known for comes from large language models (LLMs)—sometimes called large multimodal models—that are trained on massive datasets of text, images, and audio pulled from many different sources.

    Yes, OpenAI is well known for this, but they’ve also created other types of AI models (e.g., Whisper). I suspect an LLM might be part of a solution they would build but that it would not be the full solution.


  • Both devices have integrated memory, so that 16 GB will look more like an 11/5, 12/4, or maybe even 14/2 split. The Steam Deck is also $400 for an LCD model or $550 for the OLED, not $800. It’s reasonable to expect more performance when you pay more.

    Because the Steam Deck has a lower native resolution, less of its RAM will be used by the integrated GPU. Downscaling from 1080p to 720p doesn’t look good, either - and you could downscale to 540p if supported, but if you need to do that (vs. choosing to for an emulated game), it probably won’t be pretty, either.

    This device is also running Windows, rather than a streamlined Linux-based launcher, meaning that more of that RAM will be taken up by OS processes by default.

    The article talks about how the 8840U benefits from more, faster RAM. You won’t get near the 8840U’s full potential gaming with 16 GB. 24 GB, on the other hand, would have been enough that games expecting 16 GB of system RAM would have been able to get it, even while devoting 6-7 GB to the GPU and 1-2 GB to the OS.




  • Depends on your perspective. Would it be fine for Meta Threads to replace it? Threads supports ActivityPub, so in some ways it likely interacts better with the fediverse.

    If we agree that Threads isn’t a suitable replacement, then clearly there’s some criteria a replacement should meet. A lot of the things that make Threads unpalatable are also true of Bluesky, particularly if your concern relates to the platform being under the control of a corporation.

    On the other hand, from the perspective of “Twitter 2.0 is now a toxic, alt-right cesspool where productive conversations can’t be had,” then both Threads and Bluesky are huge improvements.







    • Young white men are included under “Young People and Students.”
    • Old white men are included under “Seniors and Retirees.”
    • Many white men have disabilities and are covered under “Americans with Disabilities.”
    • Many white men are covered under “LGBTQ+” - trans men, gay and queer men. Heck, some even include allies under the umbrella.
    • Many white men who are neither young nor old (or members of their family) are members of unions, or would like to be, and thus covered under “Union Members and Families.”
    • Likewise, many white men are covered under:
      • Faith Community
      • Rural Americans
      • Small Business Community
      • Veterans and Military Families

    Economically, Democratic policies favor poor and middle-class people, who statistically make up the majority of all white men. And there aren’t any policies that oppress white people or men the way that Republican policies oppress women or reduce support for all of the groups that Democratic policies help support.

    In other words, unless you get off on the oppression of those groups, almost all white men are served by the Democratic party, even if they can’t find themselves on the list you shared.

    “Black Lives Matter” was a response to black men and women being murdered by police at higher rates, of the news stories of those deaths being under-reported by comparison, and of the victims being blamed more than people of other races, particularly white people.

    “All Lives Matter” as a response to “Black Lives Matter” missed the point. It’s “Black Lives Matter, too.” If all lives mattered, people wouldn’t have needed to protest the killings of black people in the first place.

    Imagine if you were at a restaurant and everyone around you got their order but you, so you said, “Hey, I need my order.” If the server responded with “Yes, everyone needs their order” and walked off, that would be roughly equivalent to saying “All Lives Matter.”

    So, is there a parallel between thinking that white men should be pandered to and saying “White Lives Matter?” Absolutely.



  • 500 grams of what, though? Folgers?

    As of just a couple of months ago, the average price per pound (454 grams) of ground coffee beans in the US was already double that, so spending $3.00 per pound would necessitate getting cheaper-than-average - and therefore likely lower quality than average, or at least lower perceived quality than average - beans.

    The sorts of beans that companies tend to stock (IME) that are perceived as higher quality aren’t the same brands that I tend to buy (generally from local roasters), but they’re comparably priced. For a 5-pound (2,268 gram) bag of one of their blends (which are roughly half the price of their higher-end beans), it’s similar to what you’d pay for 5 pounds of Starbucks beans - about $50-$60.

    Often when a company says “free coffee,” they don’t mean “free batch-brewed drip coffee,” but rather, free espresso beverages, potentially in a machine (located in the break room) that automates the whole process. I assume that’s what Intel is doing.

    At $10 per pound (16 ounces) and roughly 1 ounce (28 grams) of beans per two-ounce pour of espresso, if each person drinks two per day on average, that works out to $1.25 for coffee per person per day.
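That estimate is easy to sanity-check with a few lines of arithmetic, using the same figures as above:

```python
# Back-of-the-envelope espresso cost per person per day:
# $10/lb beans, 1 oz of beans per two-ounce pour, two drinks a day.
OUNCES_PER_POUND = 16
PRICE_PER_POUND = 10.00        # dollars
OUNCES_PER_DRINK = 1           # beans per double shot
DRINKS_PER_DAY = 2

price_per_ounce = PRICE_PER_POUND / OUNCES_PER_POUND     # $0.625/oz
daily_cost = price_per_ounce * OUNCES_PER_DRINK * DRINKS_PER_DAY
print(f"${daily_cost:.2f} per person per day")           # $1.25
```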

    However, logistics costs (delivering coffee to all the company’s break rooms) and operational costs (at minimum, the automatic machines and their repairs; or the cost of baristas; or, if just batch brewing, adding the responsibility to someone’s existing job and thus needing more people or more hours) have to be added on top of that. Then add in the cost of milk, milk alternatives, sweeteners, cups, lids, stir sticks, etc.

    Obviously if they just had free coffee grounds and let people handle the actual brewing of coffee in the break room, it would be much cheaper. But if the goal is to improve morale, having higher quality coffee that people don’t have to make themselves is going to do that better.




  • You can use yt-dlp to download Tiktok videos, and you can use it on both iOS (e.g., via aShell or Pythonista) and Android (e.g., via Termux).

    Once yt-dlp is installed, you can run this command in the terminal app. It’ll be downloaded into your current directory:

    yt-dlp https://www.tiktok.com/@r_o_b__b_a_r_b_e_r/video/7392630187063627040
    

    Just replace the URL with the one for your desired video. The video URL should look like the one above, though you don’t need to remove the query parameters. If your URL doesn’t look like that, you may need to use Share, then Copy Link, and paste the copied link instead of the one from the URL bar. This is especially true if you’re navigating among tabs in the web version.

    You may need to wrap the URL in double quotes; IME it varies by device.
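If you'd rather script batch downloads than run the command by hand, yt-dlp also exposes a Python API (same package as the CLI). A minimal sketch - the output filename template here is my own choice, not anything yt-dlp requires:

```python
# Sketch: download a video via yt-dlp's Python API instead of the CLI.
def download(url: str, outdir: str = ".") -> None:
    """Download a single video, naming the file by its video id."""
    import yt_dlp  # pip install yt-dlp

    opts = {
        "outtmpl": f"{outdir}/%(id)s.%(ext)s",  # e.g. 7392630187063627040.mp4
        "quiet": True,
    }
    with yt_dlp.YoutubeDL(opts) as ydl:
        ydl.download([url])

if __name__ == "__main__":
    download("https://www.tiktok.com/@r_o_b__b_a_r_b_e_r/video/7392630187063627040")
```

The same quoting caveat doesn't apply here, since the URL is just a Python string.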

    On iOS there are Shortcuts that integrate with yt-dlp, and on Android you can do the same with Tasker and the Tasker - Termux plugin. Make sure to install the F-Droid versions.

    You can also save many Tiktok videos through the app’s Share dialog, though creators can disallow that for all of their content.