• 3 Posts
  • 41 Comments
Joined 1 year ago
Cake day: June 13th, 2023



  • Well, that was something… I have used ligatures in my code editor for quite a few years now, and I have NEVER been confused by the ambiguity this person is so upset about. Why? I have never once seen the Unicode not-equals character in a code block, simply because it is not a valid character in any language I know of (see the sketch at the end of this comment). In fact, I have never even seen it in a string, where it actually would be legal, probably because nobody knows how to type it on a standard keyboard. This whole article feels like someone with a severe diagnosis has locked onto some hypothetical correctness issue that simply isn’t a problem in the real world.

    But if you for some reason find ligatures confusing, then you shouldn’t use them. Just to be clear, though, there is no right or wrong here like this blog post tries to argue; it is a matter of personal taste.
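
    A minimal sketch of the validity point, in Rust (picked only as an example language; the exact compiler error wording is from memory and may differ between versions):

    ```rust
    fn main() {
        let a = 1;
        let b = 2;

        // The source really contains the two-character operator `!=`;
        // a ligature font merely renders it as a single ≠ glyph.
        println!("{}", a != b); // prints "true"

        // The actual Unicode character U+2260 is not a valid operator.
        // Uncommenting the next line fails to compile ("unknown start of token").
        // println!("{}", a ≠ b);

        // Inside a string literal the character is perfectly legal,
        // it is just rarely typed in by hand.
        let s = "a ≠ b";
        println!("{}", s);
    }
    ```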




  • The problem is that C is a prehistoric language and doesn’t have any of the complex types, for example strings. In a modern language you create a String, and that string has a length and some well-defined properties (like its encoding). With C you have a char * , which is just a pointer to memory that contains bytes and is hopefully null terminated. The null termination is defined, but not enforced, and the encoding is whatever the developer had in mind, so the compiler simply doesn’t have the information to make any decisions. In Rust you know exactly how long something lives; if something tries to use it after that, the compiler can tell you. With C, all lifetimes live in the developer’s head, and the compiler has no way of knowing. So all this typing and these properties of modern languages are basically the implementation of your suggestion (a small sketch below shows what I mean).
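
    A small sketch of what I mean, in Rust (the rejected line is commented out so the snippet compiles as-is, with only an unused-variable warning; the exact error code is from memory):

    ```rust
    fn main() {
        // A Rust String knows its own length and is guaranteed to be valid UTF-8,
        // unlike a raw char*, which is just bytes that are hopefully null terminated.
        let s = String::from("héllo");
        println!("{} bytes, {} chars", s.len(), s.chars().count()); // 6 bytes, 5 chars

        // Lifetimes are known to the compiler, not just to the developer:
        let r;
        {
            let inner = String::from("temporary");
            r = &inner; // borrow tied to inner's lifetime
        } // inner is dropped here

        // Uncommenting the next line is rejected at compile time
        // (error[E0597]: `inner` does not live long enough):
        // println!("{}", r);
    }
    ```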






  • I know I have used it since Fedora made it the default in 2016. I think I actually used it a while before that, but I don’t have anything to help me pin down the exact time.

    Since I only use an Intel built-in GPU, everything has worked pretty well. The few times I needed to share my screen, I had to log out and log back in to an X session, but that was solved a couple of years ago. Now I am just waiting for Java to get proper Wayland support, so I can fully ditch X for my daily use and take advantage of Wayland’s multi-DPI capabilities.



  • But is the desktop really the most relevant measurement? Wouldn’t it be more relevant to talk about “primary” devices? When I grew up, the desktop was what people used to connect to the Internet and everything that comes with it. Hence, Linux on the desktop seemed relevant. That is still relevant in relation to work and gaming, but for general use people use other devices. So instead of “on the desktop” I think we should talk about “for work”, “for gaming” and “for programming”.






  • From their documentation:

    Unlike classic terminals, Warp requires you to sign up and log in to get started with the app.

    So, yeah, it might be that people are not very impressed by a terminal that requires a cloud account.

    But if you don’t type anything sensitive into your terminal, like passwords and such, then you should be fine…




  • snaggen@programming.dev to Linux@lemmy.ml · My move to wayland: it's finally ready (8 months ago)

    If you avoid Nvidia, it has been ready for many years. And to be honest, I’m not sure X11 was really stable with Nvidia either. My main issue with Wayland is that X doesn’t have multi-DPI support… and for that I really cannot blame Wayland. Also, Skype doesn’t have screen sharing; well, they actually had it for a while, but then removed it… still, hard to blame that on Wayland.

    But as a general rule, if you have Nvidia, then you are not allowed to complain about anything… that was your choice, and with Nvidia under Linux, all bets are off. I thought that was clear a long time ago, especially after Linus’ not-so-subtle outburst.