• 27 Posts
  • 531 Comments
Joined 2 years ago
Cake day: July 1, 2023



  • The lesson is that humans should always be held responsible for important decision making and should not rely solely on ML models as primary sources. Eating potentially dangerous mushrooms is a decision you should only make if you’re absolutely sure it won’t hurt you. So if your research is uploading a picture to ChatGPT and asking if it’s edible, instead of taking the time to learn mycology, attend mushroom foraging group events, and read identification books, well, that’s on you.


  • Honestly, my favorite people are the ones who love to talk and are horribly desperate to babble to potential listeners. I’m not much of a talker but I absolutely don’t mind looking you in the eyes and nodding my head as you talk about your hobby or current goings-on.

    In bigger social groups I’ve noticed this weird thing fellow humans tend to do where they all want a slice of being the talker/center of attention, constantly cutting each other off or tuning out the current speaker while waiting for them to shut up so they can start their own monkey-babble turn.

    This behavior absolutely infuriates me and I refuse to take part in it. I would rather just be silent and let you say your piece than interrupt the flow.

    As a knock-on effect, people subconsciously notice I’m not competing with them for talk time and am sending constant listening signals: eye contact, head nods, the “mhm, got you” stuff. This seems to go a long way toward making friendly with talkative types with minimal effort.


  • The question isn’t whether quantum computers have an advantage over regular computers for code cracking (they do: algorithms like Shor’s and Grover’s exploit superposition in ways that change cryptography forever). The question is whether AES-256 can resist our current quantum compute, and for how long.

    It’s a simple equation: as long as breaking it takes longer than the lifespan of the universe on our most powerful supercomputers, it’s considered good encryption. However, as computers get more powerful the projected time decreases, potentially to the point of human-lifespan time frames. That’s when it becomes a problem and the standard fails.

    Currently AES is quantum resistant but it almost certainly won’t be forever. New standards are gonna need to be adopted at some point.
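
    If you want to put numbers on that “lifespan of the universe” yardstick, here’s a rough back-of-the-envelope sketch. The keys-per-second rate is a made-up round figure, not a real benchmark, and the Grover line just reflects the standard result that a quantum search needs about 2^(n/2) steps instead of 2^n:

```python
# Back-of-the-envelope: how long does brute-forcing an AES key space take,
# classically vs. with Grover's quadratic speedup? The keys/sec rate is a
# made-up round number to show the scale, not a real hardware benchmark.

SECONDS_PER_YEAR = 3.156e7
AGE_OF_UNIVERSE_YEARS = 1.38e10

def brute_force_years(key_bits: int, keys_per_second: float) -> float:
    """Time to exhaust a key space of 2**key_bits at a given search rate."""
    return (2 ** key_bits) / keys_per_second / SECONDS_PER_YEAR

# Classical search over AES-256: 2^256 candidate keys.
classical = brute_force_years(256, keys_per_second=1e12)

# Grover's algorithm needs roughly 2^(n/2) oracle calls, so AES-256 behaves
# like a 128-bit key against a quantum adversary (same generous rate assumed).
grover = brute_force_years(128, keys_per_second=1e12)

print(f"classical 2^256 search: {classical:.2e} years "
      f"({classical / AGE_OF_UNIVERSE_YEARS:.1e} x age of universe)")
print(f"Grover 2^128 search:    {grover:.2e} years "
      f"({grover / AGE_OF_UNIVERSE_YEARS:.1e} x age of universe)")
```

    Even with the square-root speedup, a 2^128 search is still absurdly far beyond anything physical, which is why AES-256 counts as quantum resistant for now.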






  • A fun weekend project was setting up a local model to tool call OpenWeather and Wolfram Alpha through their APIs for factual data retrieval and local weather info.

    Someone in our community showed off tool calling articles from a local instance of Wikipedia through a Kiwix server and a ZIM file, and that seems like a really cool project too.

    I would like to scrape preprints from arXiv and do basic RAG with them, and also try to find a way to host a local version of OEIS or see if there’s an API to scrape.

    So I guess my solution is to use automation tools to pull data from wikis and databases directly: RSS, direct APIs, scrapers, and tool calling.
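
    For anyone curious what that plumbing can look like, here’s a minimal sketch of two such tools in Python. The OpenWeather endpoint/params and the OEIS JSON search reflect my reading of their public APIs, so double-check them against the docs, and the API key is obviously a placeholder:

```python
# Minimal sketch of the kind of "tools" a local model can call: plain functions
# that hit public APIs and return text the model can fold into its answer.
# Endpoint paths/params reflect my reading of the OpenWeather and OEIS docs;
# verify them yourself and supply your own OpenWeather API key.
import requests

OPENWEATHER_KEY = "YOUR_API_KEY_HERE"  # placeholder

def get_current_weather(city: str) -> str:
    """Fetch current conditions for a city from OpenWeather."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": OPENWEATHER_KEY, "units": "metric"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    return f"{city}: {data['weather'][0]['description']}, {data['main']['temp']} C"

def search_oeis(sequence: str) -> str:
    """Look up an integer sequence on OEIS via its JSON search endpoint."""
    resp = requests.get(
        "https://oeis.org/search",
        params={"q": sequence, "fmt": "json"},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("results") or []
    if not results:
        return "no OEIS match"
    top = results[0]
    return f"A{top['number']:06d}: {top['name']}"

# A tool-calling runtime would expose these by name and let the model pick one:
TOOLS = {"get_current_weather": get_current_weather, "search_oeis": search_oeis}

if __name__ == "__main__":
    print(TOOLS["search_oeis"]("1,1,2,3,5,8,13"))
```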



  • SmokeyDope@lemmy.world to memes@lemmy.world • Well, they are. • 12 days ago
    Eh. Meme consumers are riding a pointless high horse on this one. On one hand, yes, there is arguably a human aspect to manual creation that’s lost, and there are still plenty of screwups in generated details if you look closely. On the other hand, memes and edited silly cat pictures are not high art; they’re the low-effort, fast-food, quick-consumption equivalent of images. We call them shitposts, I mean come on.

    The whole point of an image is to activate a series of neurons in your brain that excite certain patterns to convey information to you. The whole point of a meme is to do this to convey humor or a poorly veiled political message with little creation effort and high reshareability.

    In the before times you had to actually paint and color by hand on a piece of paper, make models and set pieces out of materials like clay, or cut up printed pictures and stitch them together.

    Then we could digitize everything, create models and animations with a few clicks of a button, edit images easily through software, and share our creations effortlessly. This led to the rise of the internet’s first generation of memes, with stuff like the dancing baby, Flash/Newgrounds references, and caption memes.

    Now it’s completely effortless, with machine learning models trained to create images/sounds/videos from simple sentence instructions. It probably takes someone a minute to get an image model to boilerplate an acceptable picture by typing a few keywords, or to import an existing image and instruct edits, all without needing to learn the ins and outs of image manipulation software, 3D modeling software, etc.

    Why spend an hour making an edit in GIMP for maybe a few dozen people to look at it once, go “heh”, and immediately forget about it as they doomscroll past?

    Unlike most jerkasses with an armchair opinion, I’ve actually cooked some comics and memes in my day for the internet’s consumption. Sometimes I’ve spent way longer than I should have in GIMP just to make a funny edit. I’m happy these tools exist to let people engage with visual creation on their own terms. If you don’t have the time or hardware for learning GIMP to crop a meme, I won’t be on my high horse telling you that you’re bad and your images are fundamentally wrong just because you had a computer/model boilerplate it for you from some keyword prompting. Fuck ’em, they get the slop they deserve to consume.


  • I’ve pumped at least a few hours into at least 90% of the games I own, given each entry a fair shake, and either moved on or come back to it now and again. It’s the rare exception that a game is so up my alley I can pour endless hours into it without the experience getting boring.

    I don’t care about achievements or even completing a game. I play mainly to try out a new experience and get the brain working in different ways. Some of my favorite games I can pump many hours into and have completed; roguelikes especially are infinitely replayable. Others are once-and-done experiences I got my fill of over the course of a couple of hours and have no desire to come back to. There’s no shame in being the latter.



  • What is ‘the internet’ to you? I think this term means different things to different people. I imagine to people born in the latest generations the internet is social media and corpo productivity sites. To them the internet is YouTube, TikTok, Twitter, Reddit, their bank, and whatever slop services they subscribe to, magically beamed into a pocket computer through technomagical nerd shit like “5G” and processed through “microprocessors” and other stuff they don’t care to really understand because it’s all abstracted away.

    I was born early enough for the internet to be nothing more than two computers barely powerful enough to run a GUI calling each other up through telephone wires to share goofy Web 1.0 blogspam. I remember when low-res images were the norm and pre-Google YouTube was just coming into being, when it was AOL and Myspace and Newgrounds Flash games. I remember being a kid and loving computers because I never knew what cool new website was on the horizon to discover and play with. I remember people using things like newsgroups and pre-Craigslist sites to meet up for transactions.

    This is the internet, to me. At least what it once was and what it can be again. People using the digital landscape to freely express themselves with their own hardware. To come together to share in hobbies and interests and passions.

    We could have that again if we all bought into a standardized radio-based mesh network, with each device hosting personal sites while acting as a routing node.

    But I don’t know if the general public will ever be pushed to partake in this network. They would have to be squeezed very hard to try alternatives to the common way of things.






  • SmokeyDope@lemmy.world to Science Memes@mander.xyz • Fictional • edited, 23 days ago
    It’s more related to limits on the knowability of events beyond a certain scale. It’s easy and intuitive to think of spacetime as quantized like pixels on a grid, with a minimum action requirement of time and energy to move between them. But it’s not that simple, or at least that kind of granular discreteness is not proven (though there are digital physics frameworks that treat spacetime as discrete like this).

    The Planck length does not define the minimum distance something can move, but rather the minimum scale of meaningful measurement that can make a bit of distinction between two microstates of information. In essence it says that if two continuous computational paths differ by less than a sub-Planck’s worth of distinction, there is no measurable difference between them and the paths get blurred together.

    It’s a precision limit that defines how exactly we can measure interactions that happen within the distance between two points.

    It’s possible that spacetime is continuous at a fundamental level, but the Planck length represents the scale at which quantum fluctuations of spacetime itself become so violent that the concepts of a ‘path’ or a ‘distance’ can no longer be defined in the classical sense, effectively creating discrete quantized limits for measurement precision.

    Ultimately this precision limit is related to the energy cost of actualizing a measurement from a superposition and the rapid increase in energy needed to overcome the uncertainty principle at smaller and smaller scales. The energy required to actualize a meaningful state at a sub-Planck length would be enough to create a kugelblitz, a black hole made from pure condensed energy.

    This same logic applies to time, giving us the Planck time, the shortest meaningful interval. So, in a way, the Planck scale does define a fundamental limit on the ‘speed’ at which distinguishable events can occur.
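
    If you want to see where that kugelblitz claim comes from, the back-of-the-envelope version is: resolving a length Δx costs roughly E ~ ħc/Δx, and once the Schwarzschild radius of that much energy exceeds Δx, the measurement collapses into a horizon instead. A rough order-of-magnitude sketch (ignoring numerical factors beyond the basic formulas):

```python
# Order-of-magnitude check on "measuring below the Planck length makes a black
# hole": the energy to resolve a length dx is roughly E ~ hbar*c/dx, and that
# much energy has Schwarzschild radius r_s = 2*G*E/c^4 (with M = E/c^2).
# Once r_s >= dx, the measurement collapses into a horizon (a kugelblitz).
import math

hbar = 1.055e-34   # reduced Planck constant, J*s
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

planck_length = math.sqrt(hbar * G / c**3)

def probe_energy(dx: float) -> float:
    """Energy scale (J) needed to resolve a distance dx, from the uncertainty principle."""
    return hbar * c / dx

def schwarzschild_radius(energy: float) -> float:
    """Horizon radius (m) of a mass-energy E, using M = E/c^2."""
    return 2 * G * energy / c**4

for dx in (1e-15, planck_length, 0.1 * planck_length):
    E = probe_energy(dx)
    r_s = schwarzschild_radius(E)
    verdict = "collapses to a horizon" if r_s >= dx else "still measurable"
    print(f"dx = {dx:.2e} m  ->  E ~ {E:.2e} J, r_s ~ {r_s:.2e} m, {verdict}")
```

    At proton-ish scales the horizon radius is laughably tiny compared to the distance you’re probing, but around the Planck length the two cross over, which is the whole point.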


  • Is the speed of causation propagation linked to the Planck length?

    Yes. More specifically, the Planck length is derived from an equation involving the speed of light/causality:

    ℓ_P = √(ħG / c³)

    where c is the speed of light, ħ is the reduced Planck constant, and G is the gravitational constant. Together they tell us the fundamental unit length of meaningful distinction, a very important yardstick for measuring the smallest distances.
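
    As a sanity check you can plug the constants in yourself; dividing the Planck length by c then gives the Planck time, which is how the speed of causality is baked into it:

```python
# The Planck length comes straight out of c, hbar and G; dividing by c then
# gives the Planck time, tying the "speed of causality" to the length scale.
import math

c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
planck_time = planck_length / c              # ~5.4e-44 s

print(f"Planck length: {planck_length:.3e} m")
print(f"Planck time:   {planck_time:.3e} s")
```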


  • I have no religious beliefs. The thing that trips me up is: how is there matter in the first place if none can ever be created? Why was there stuff at a single point at some time?

    The “matter/information can’t be created or destroyed” thing only applies to closed systems within their own operational bounds. It’s about logical consistency within a closed set, but that tells us nothing about where the closed set itself came from. All the energy from the big bang/first universal iteration was loaned from somewhere else. The how and why of this is probably going to remain a mystery forever because our simulations of the laws of physics can’t go back to before the big bang.

    So the nature of the big bang and why anything exists is one of the big open-ended philosophy-of-science questions that there isn’t an easy falsifiable answer to. It’s up to interpretation. I have my own theories on the topic but any guess is as good as another.

    From the good old classic “Because God Did It™” to “bubble universes that foam out from a hyperdimensional substrate with random laws of physics/math that sometimes allow for observation and life” and everything in between. It’s all the same to me because we can’t prove anything one way or the other.


  • What you’re asking directly stems from two related open-ended philosophy-of-science questions: “Are universal constants actually constant?” and “Does the speed of light differ at any point in time during its journey between two points of space in a continuous substrate?”

    The answer to both, like all philosophy questions, is a long hit on the pot pipe and a “sure man, it’s possible, but it remains unlikely/over-engineering the problem until we have justification through observation.” Still, I’ll give my two cents.

    “Are universal constants actually constant?” It probably depends on the constant. Fundamental math stuff that ties directly into computational logic and precision limits, like pi, is eternal and unchanging. More physics-type constants derived from statistical distributions, like the cosmological constant, might shift around a little, especially at quantum precision-error scales.

    The speed of light is probably closer to the first kind, as it’s ultimately about mathematically derived logical boundaries on how fast any two points in the universe can interact to quantize a microstate. It’s a computational limit, and I don’t see that changing unless the actual vacuum substrate of spacetime takes a sudden phase shift.

    “Does the speed of light differ at any point in time during its journey between two points of space in a continuous substrate?”

    Veritasium did a good video about this one. The answer is it’s possible but currently unmeasurable, so if all hypotheses generate the same effective results, then the simplest among them (light maintaining a constant speed on both legs of the trip) is the one to go with.