• 15 Posts
  • 270 Comments
Joined 1 year ago
Cake day: September 13th, 2024


  • HiddenLayer555@lemmy.ml to Asklemmy@lemmy.ml: What christmas tradition do you uphold?
    5 days ago

    Christmas tree and lights. Though the LED ones have ruined it for me: I can see the half-wave rectified 60 Hz flicker and it gives me a headache. That pisses me off so much, because I was once super excited about LED Christmas lights consuming a fraction of the energy, but every company seems to think they can get away with literally just swapping incandescent bulbs for LEDs without the proper circuitry to drive them, which would have cost, what, a dollar more? Hell, a full-bridge rectifier probably costs literally a penny when bought in bulk, and though it’s still not a “proper” LED driver, it would have doubled the flicker frequency and most people wouldn’t see it anymore.

    Does anyone know any Christmas lights with a proper power supply that drives them at low-voltage DC? I imagine it would be a lot safer, too, wrapped around a flammable plastic tree.
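Back-of-the-envelope arithmetic for the frequency-doubling point above (a quick sketch; the 60 Hz mains figure is North American):

```python
# Flicker rate of an AC-driven LED string, depending on rectification.
MAINS_HZ = 60  # North American mains frequency

# Half-wave rectified: the LED conducts on only one half-cycle per period,
# so it blinks at the mains frequency itself.
half_wave_flicker_hz = MAINS_HZ

# Full-bridge rectified: both half-cycles light the LED,
# so the flicker frequency doubles.
full_wave_flicker_hz = 2 * MAINS_HZ

print(half_wave_flicker_hz, full_wave_flicker_hz)  # 60 120
```

120 Hz is above what most people perceive as flicker, which is why the full bridge would be good enough even without a proper constant-current driver.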



  • HiddenLayer555@lemmy.ml to Asklemmy@lemmy.ml: Whats your take on stand up comedy?
    6 days ago

    As I mature, I find myself thinking very little of standup in general, despite watching a lot of it as an edgy 18/19-year-old. When I try to get back into it now, I just see the various ethical problems with the jokes, and that makes them not funny for me.

    White standup comics get free rein to drop racist dog whistles, and if you criticize them for it they get all snippy with you. The most popular “genre” of white standup comedy still seems to be “I went to a [insert culture here] restaurant and here are my disrespectful, stereotype-enforcing hijinks,” or even “I went to [insert COUNTRY here] and will now proceed to joke about how their culture is different from ours.” I can only hope the whole thing is made up and they’re not that atrocious in real life, though the vast majority of service staff seem to have stories about famous comics treating them like shit, so I wouldn’t be surprised.

    Even a lot of ethnic standup comics portray themselves as the victims of racism in one joke but then have no problem using stereotypes of another ethnicity in the very next joke.

    Also, standup definitely seems to have an air of being attended by older generations who find insulting the younger generations funny. Things like participation trophies (which were a boomer idea, by the way; the kids aren’t planning school competitions or buying the prizes) or terms like “snowflake” seem to have gotten into boomer rhetoric partly because of standup. Reactionary takes on progressive social movements like veganism or car-free living are also the norm, because I assume the comics know most of the people who watch them are the type to get mad at how other people choose to live their lives. Standup in general seems to have a “let’s make fun of anything people are doing differently from before, because we don’t want to do it that way and need validation that we’re not assholes on the wrong side of history” attitude. Or they’ll just make fun of random people living their lives. I remember watching a comic on YouTube do a whole segment making fun of people who swim laps in hotel pools because it annoys him. Like, bro, mind your own damn business.

    Occasionally a comic will try to earn brownie points by saying the most superficial shit about a major societal problem and then act like they singlehandedly solved it. Bonus points if they’re talking about another country’s problems which the West fucking caused.

    I’m not saying all standup is like this or all standup comics are racist or reactionary, but I am saying there are very few long running standup shows/podcasts with none of these problems.







  • An extra hard drive for offline backup of my home server. Just knowing I have a cold, unplugged copy of my data in my drawer has made me less paranoid about accidentally “rm -rf”-ing my computer and taking all the mount points with it, or about my dog getting her paw caught on a wire (she likes to run around haphazardly and is pretty clumsy) and dragging the entire hard drive enclosure down with her.

    Ideally I wouldn’t keep that drive in my house but I don’t have anywhere else to put it. Maybe someday I’ll get a safe deposit box or something but then my lazy ass probably wouldn’t bother to retrieve and sync my data nearly as often.
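A minimal sketch of the sync step, with hypothetical paths (a real setup would likely use rsync; plain shutil keeps the example self-contained):

```python
import shutil
from pathlib import Path

def cold_sync(src: Path, dst: Path) -> None:
    """Mirror src into dst, replacing any previous snapshot wholesale."""
    if dst.exists():
        shutil.rmtree(dst)       # drop the old snapshot
    shutil.copytree(src, dst)    # full recopy; fine for an occasional cold backup

# Hypothetical paths for illustration; adjust to your own mounts:
# cold_sync(Path("/srv/homeserver/data"), Path("/mnt/colddrive/backup"))
```

The full recopy is wasteful for big datasets but has a nice property for a drawer drive: the snapshot is always a complete, standalone copy, with nothing depending on earlier backups.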



  • An AGI wouldn’t need to read every book because it can build on the knowledge it already has to draw new conclusions it wasn’t “taught.”

    Also, an AGI would be able to keep a consistent narrative regardless of the amount of data or context it has, because it would be able to build an internal model of what is happening and selectively remember the most important things over the inconsequential ones (not to mention assess what’s important and what can be forgotten to shed processing overhead), all things a human does instinctively when given more information than the brain can immediately handle.

    Meanwhile, an LLM is totally dependent on how much context it actually has buffered, and giving it too much information will literally push the old information out of its context, never to be recalled again. It has no ability to determine what’s worth keeping and what’s not, only what’s more or less recent. I’ve personally noticed this especially with smaller locally run LLMs with very limited context windows. If I begin troubleshooting some Linux issue with one, I have to be careful with how much of a log I paste into the prompt, because if I paste too much, it will literally forget why I pasted the log in the first place. This is most obvious with Deepseek and other reasoning models, because they will actually start trying to figure out why they were given that input when “thinking,” but it’s a problem with any context-based model, because the context is its only active memory.

    I think the reason this happens so obviously when you paste too much in a single prompt, and less so when having a conversation with smaller prompts, is that the model also has its previous outputs in its context. So while it might have forgotten the very first prompt and response, it repeats the information enough times in subsequent responses to keep it in its more recent context (ever notice how verbose AI tends to be? That could potentially be a mitigation strategy). Meanwhile, when you give it a single prompt as big as or bigger than its context window, it completely overwrites the previous responses, leaving no hint of what was there before.
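The recency-only eviction described above can be sketched as a toy fixed-size buffer (window size and token names are invented for illustration; real models count tokens, not words):

```python
from collections import deque

CONTEXT_TOKENS = 8                      # arbitrary toy window size
context = deque(maxlen=CONTEXT_TOKENS)  # strictly FIFO: no notion of importance

def feed(tokens):
    for t in tokens:
        context.append(t)  # each new token silently evicts the oldest one

feed(["why", "does", "my", "service", "crash", "?"])  # the actual question
feed([f"log{i}" for i in range(1, 9)])                # one oversized log paste

print(list(context))  # the log filled the window; the question is gone
```

Fed the same 14 tokens across several small turns, the question would at least survive until the log arrived; pasted in one oversized lump, the window ends up holding nothing but log lines.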