• 0 Posts
  • 48 Comments
Joined 1 year ago
Cake day: June 16th, 2023

  • I think a lot of it comes from schools, and in particular physical education and competitive sports. There is nothing wrong with competitive sports but the attitudes around them in schools can be so toxic, and in particular they can be used to create hierarchies. The idea that being good at sports made you masculine was something I certainly experienced a lot at school. Also, people who weren’t as academic but thrived in sports were lauded.

    My school had various sports teams and clubs, and fuck all academic activities. Sports aren’t toxic but the attitudes around them can be, and particularly adults who feed in toxic attitudes and values around it.





  • Yeah not sure I agree with all of this.

    When it comes to KDE this feels out of date. The GTK issues are not what they once were; KDE Plasma has good GTK themes that match the KDE ones. Nowadays I find the main issues are with Flatpak software not matching DE themes because it’s in a sandbox. I’ve had that issue on both KDE and Gnome 2-derived environments (I’ve never really gotten into Gnome 3). KDE also used to have a reputation for being slow and a resource hog; that’s inverted now - KDE has a good reputation, including for scaling down to lower-powered machines, while Gnome 3 now seems to have the resource-hog reputation.

    I have a KDE desktop environment and it’s very attractive, and I haven’t had any glitches beyond issues with Flatpak (VLC being a recent one that I managed to fix). I would say the mainstream desktop environment themes work in the same way as a Windows theme does. The problems come when you go for super-niche attempts to pretty up the desktop - but you’d get similar issues if you tried that in Windows.
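
    For anyone hitting the same Flatpak theming mismatch, a common general workaround (not necessarily the exact fix I used for VLC) is to install the matching GTK theme as a Flatpak and/or let the app read the host GTK config - Breeze and the VLC app ID are just examples here:

        flatpak install flathub org.gtk.Gtk3theme.Breeze
        flatpak override --user --filesystem=xdg-config/gtk-3.0:ro org.videolan.VLC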

    I agree regarding the professional apps. If you are tied into specific proprietary Windows software then Linux is difficult. The exception is Office 365, which is now both Windows and web app based, and the web apps are close to feature parity with the desktop clients. The open source alternatives to proprietary Windows software can be very good, but there are often compromises (particularly around collaboration, as that generally happens within specific software’s walled gardens). Like LibreOffice; it’s very good and handles Office documents near seamlessly, but if your work uses Office then you lose the integration with OneDrive and Teams.

    In terms of Linux not supporting old software, I would caveat that that means old Linux software. It is very good at supporting other systems’ software through the various open source emulators etc. Also Flatpak has changed things somewhat; software can come with its own set of libraries, although it does mean bloat in terms of space taken (and security issues & bugs, albeit limited to the app’s sandbox). And while Wine can be painful for some desktop apps it is also very robust with a lot of software; it can either be a doddle or a nightmare. Meanwhile Proton has rapidly become very powerful when it comes to gaming.

    I disagree that it takes a lot of time to make basic things work. Generally Linux supports modern hardware well and I’ve had no issues myself with fresh installs across multiple different pieces of hardware (my custom desktop, Raspberry Pi, and a living room PC). Printing/scanning remains probably the biggest issue but I’ve not had to deal with that in a long time. But problem solving bigger issues can be hard.


  • To answer your questions:

    When it comes to other distros: I currently use Linux Mint with the KDE Plasma desktop. The Debian/Ubuntu ecosystem is pretty easy to use and there are lots of guides out there for fixing/tinkering with Linux Mint (or Ubuntu, whose guides largely also work) because of their popularity. Lots of software is available as “.deb” packages, which can be installed easily on Linux Mint and other Debian-based systems including Ubuntu.
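
    For example, a downloaded .deb (the filename below is just a placeholder) can be installed from the terminal with apt, which also pulls in any dependencies:

        sudo apt install ./some-package.deb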

    I’ve also been trying Nobara on a living room PC; that is Fedora-based. I like that too, although it has a very different package manager setup.
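
    To give a feel for the difference, installing the same package (htop here, purely as an example) uses apt on Mint/Ubuntu but dnf on Fedora-based distros like Nobara:

        sudo apt install htop    # Linux Mint / Ubuntu / Debian
        sudo dnf install htop    # Fedora / Nobara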

    Whatever distro you choose, Flatpak is an increasingly popular way of installing software outside the traditional package managers. A Flatpak should just work on any distro. I would not personally recommend Snap, which is a similar method from Canonical (the people behind Ubuntu) but not as good in my opinion.
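
    If your distro doesn’t have Flatpak and Flathub set up out of the box, it looks roughly like this (assuming the flatpak package itself is already installed; Firefox is just an example app):

        flatpak remote-add --if-not-exists flathub https://dl.flathub.org/repo/flathub.flatpakrepo
        flatpak install flathub org.mozilla.firefox
        flatpak run org.mozilla.firefox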

    In terms of desktop environments, I like Linux Mint’s Cinnamon desktop, but I have moved over to KDE having decided I prefer it after getting used to it on the Steam Deck. KDE has a Windows feel to it (although it’s very customisable and can be made to look like almost any interface). I’ve also used some of the lightweight environments like LXDE, XFCE etc - they’re nice and also customisable, but not as slick. With a decent graphics card you can get a very nice-looking desktop with KDE. The only desktop environment I personally don’t like is Gnome 3 (and the Unity shell from Ubuntu); that may just be personal preference, but if you’re coming from Windows I wouldn’t start with that desktop environment - it’s too much of a paradigm shift in my opinion. However it is a popular desktop environment.


  • BananaTrifleViolin@kbin.social to Linux@lemmy.ml · Looking to make the switch · edited 8 months ago

    I’ve been dual booting between Linux and Windows for maybe 10 years or so (and tinkered with Linux growing up before that). I think, maybe similar to you, I’m technically adept when it comes to computers but not a programmer; I’m good at problem solving issues with my computer and am not afraid to “break” it.

    A few key things:

    • Make sure your important personal data, files etc are kept secure and always backed up. This is probably obvious, but it does lower the threshold for tinkering and messing with the computer. I’ve reinstalled Windows and Linux multiple times, whether that’s getting round broken Windows updates, or Linux issues, or just switching up which Linux distro I use. If you are confident you have your data backed up, then reinstalling an OS is not a big deal.
    • Use multiple drives; don’t just partition one drive. Ideally each OS gets its own SSD; this will make dual booting much easier and also allows complete separation of issues. I have 4 drives in my PC currently - a 1TB C drive SSD for Windows, a 500GB Linux SSD, and two 4TB data drives (one is an SSD, one is a standard mechanical HDD). SSD is faster but you can of course use a mechanical drive if you want.
    • When it comes to dual booting, if you have a separate Linux hard drive, then Linux will only mess around with its own boot sector. It will just point at the Windows boot sector on the Windows hard drive and not touch it, but add it as an option to its boot menu. Then all you have to do is go into your BIOS and tell it to boot the Linux drive first, which will get you a boot menu to choose between Linux and Windows. Tinker with that boot menu (GRUB 2 usually) - I set mine to always boot the last OS selected, so I only have to think about the boot menu when I want to switch (see the example config after this list). Separate drives save you having to mess around with Windows recovery disks if things go wrong with the boot sector. One drive with a shared or multiple boot sectors can be messy.
    • Try a few distros using their live images. With most Linux distros you flash the image onto a USB stick and boot from that (OR use VirtualBox in Windows to try Linux in a virtual machine), and it takes you into the full desktop environment running from the stick. You can then install from that, but you can also just use Linux that way. You can even run Linux entirely from those USB sticks (or an external drive) and get a feel for it, including installing more apps, upgrading etc, all using the USB stick as storage.
    • Also try a few different desktop environments and get a feel for which one you like. Most distros default to a particular desktop environment (Gnome, KDE, Cinnamon, etc). You only really need to test the desktop environments with one distro, as they’ll feel mostly the same in each distro.
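
    On the GRUB point above, the “boot the last OS selected” behaviour is set in /etc/default/grub - this is just a sketch of the two relevant lines, and you need to regenerate the config afterwards (update-grub on Mint/Ubuntu; other distros use grub-mkconfig):

        # in /etc/default/grub
        GRUB_DEFAULT=saved
        GRUB_SAVEDEFAULT=true
        # then apply the change
        sudo update-grub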

    If you know you want to use Pop_OS, then follow their guide on how to install. It’s generally very similar for all Linux OSs (there are other methods but this is the simplest and most common):

    1. Download a disk image (ISO)
    2. Flash the disk image onto a spare USB stick. Balena Etcher is a very commonly used tool for this (see the command-line alternative after this list).
    3. Restart your computer, go into your BIOS (usually the Del key just after reboot, sometimes Escape or F2) and change the boot order so that USB is first, above your hard drives.
    4. Insert the USB stick and restart the computer
    5. You should load into the Linux live environment set up by that distro. Pop_OS loads you directly into the installer; you can go to the desktop by clicking “Try Demo Mode” after setting up language and keyboard, or just continue installing.
    6. Select the hard drive you want to install onto. BE CAREFUL at this step; most installers are good at making clear which drives are which. The last thing you want to do is wipe a data drive or your main OS. Know your computer’s drives well, and if in doubt the safest thing is to unplug all the hard drives except the one you’re going to install Linux onto.
    7. Follow the installer set up (to create the main user account, etc) and install.
    8. After installing, reboot the system and go back into the BIOS. This time put your Linux drive at the top of the boot order (or below USB if you still want to boot other live images - remember to take out the stick! But it’s generally more secure to boot to a hard drive and password protect your BIOS so people can only boot to USB when you decide). That’s it! Reboot, and select Linux from the new boot menu.
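
    On step 2, if you would rather use the command line than Balena Etcher, dd can write the ISO directly - the filename is a placeholder, and be absolutely sure /dev/sdX is really your USB stick (check with lsblk), because dd will wipe whatever device you point it at:

        lsblk                       # identify the USB stick, e.g. /dev/sdX
        sudo dd if=pop-os.iso of=/dev/sdX bs=4M status=progress conv=fsync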

    Linux has come a very long way when it comes to installing and setting up; installers are generally easy to use, work well, and hardware is generally recognised and set up for you. The exception will be an Nvidia graphics card - you will need to set up the Nvidia drivers. Pop_OS’s install guide shows how to do it.
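
    Just as a rough illustration (Pop_OS also offers an ISO with the Nvidia driver pre-installed, and their guide is the thing to follow), on Ubuntu-based distros the proprietary driver can usually be installed with the ubuntu-drivers tool:

        ubuntu-drivers devices          # list detected hardware and recommended drivers
        sudo ubuntu-drivers autoinstall # install the recommended proprietary driver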

    Hope that helps! Run out of characters!






  • So big mistake here: NAC is not harmless. It does have side effects and it also has toxicity at high doses.

    It has not been studied for long-term oral or IV use; its main use is short-term treatment of paracetamol overdose. Inhalation is better studied, but the drug is not absorbed into the body in the same way.

    We think it is safe but we haven’t actually done human trials to be sure. What we have found in mice is that high doses can cause lung and heart damage, and when it comes to alcohol it is protective if taken before alcohol consumption BUT it amplifies liver toxicity if taken about 4 hours after alcohol. All of this is summarised on the Wikipedia page, which looks to be good quality.

    Overall it may be a useful drug, but don’t take it off label or to self-medicate. Medicine is littered with unexpected effects of drugs that only came out once it was too late. Thalidomide is a good example - a “wonder drug” for nausea in pregnancy that was not properly tested and caused horrific birth defects which only became evident when it was too late.

    Your body is not a lab, be careful experimenting with supposedly “safe” drugs.


  • I think your second half is bang on the nail for the missing part of this story. It is not just to drive search directly, it is also to control the browser market long term.

    That’s what Microsoft did very successfully with Internet Explorer too. They gave it away for free and bundled it with Windows, killing all competition, and then used that to leverage MSN. They also didn’t follow standards and, through market dominance, shaped the internet.

    Google sort of follows standards but they have also forced through proprietary standards or have broken code which is why some websites don’t work well in Firefox or Safari even now.

    Chromium may be open source but it is a tool used by Google to control and dominate the internet.

    Apple is exactly the same with WebKit - they talk about privacy and security but the real motivation is suppressing alternate routes to the internet from their devices, which then keeps iron control over payment methods, particularly in iOS. Yet people in the Apple ecosystem buy into the narrative that the one piece of software you’re not allowed in iOS is a non-Apple web browser, as if that is an acceptable approach. It’s just another manifestation of anti-competitive behaviour and the power and money you can get from “free” software.


  • So the way evolution works, the design we have works well enough that it doesn’t cause problems. It might be the best possible design or it might not; all that mattered is that whenever it arose in evolutionary history it was either an advantage over what came before in terms of survival, so it propagated, or it was not detrimental and was paired with something else genetically that propagated.

    We can’t definitively answer your question but we can speculate on why it’s a good idea to separate urine and faecal matter. Urine is a reasonable medium for growing bacteria. That wouldn’t matter in the colon, but it would matter if bacteria from the colon could ascend into the kidneys and disrupt their function. Valves could help, or a bladder that drains into the colon, but complete separation may just be better.

    It may also be that the acidic nature of urine would disrupt the helpful bacteria we rely on to colonise our guts to help digest foods.

    Another possibility is that the constant flow of urine would mean our faecal matter would never dry out. It’d be like having diarrhoea all the time and we’d need to poop constantly. The colon retrieves enough water - but not all of it - which is why poop isn’t hard as rock. If it were constantly flooded with fluid it may not be able to keep up.

    The fluid might even be stuck in a cycle between the colon and the kidneys and make it harder for the body to keep homeostasis - as the kidneys excrete more fluid to try and regulate fluid volume, the colon could just resorb it. Basically the colon could end up working against the kidneys and cause even more work for the body. It may just be less efficient than discarding water as needed.

    Drier faecal matter in the colon and a reservoir of fluid in the bladder does also give us freedom to release when it is safe to do so, which may protect us from predators (having to stop to poop even a few times a day is dangerous compared to only going when you know it’s safe to as there are more opportunities to be attacked by a predator). It would also be very easy to track an animal that leaves a constant trail of poop and urine uncontrollably behind it.

    All or none of these may be reasons why we have separate urinary and alimentary tracts; it’s impossible to know and would always be speculation. But regardless these do seem like reasonable reasons why we may have separate tracts.





  • This is the problem with believing too much in models. A model can show you anything you want - its output is only as good as the parameters and algorithms you feed it.

    Modelling the climate over the next 50-100 years is already extremely difficult and fraught with inaccuracies, but we have lots of models and data to extrapolate from, so we do have a crude idea where we’re going. But we can’t model next year’s weather with accuracy, just the base trend. That’s a crucially important warning for climate change, but of limited use otherwise.

    Modelling out to 250 million years is basically a crock of shit. The tectonic movements are predictable, and gross predictions that a Pangaea-like arrangement might be warmer may have some validity, but modelling the climate and the evolution and status of mammals is pure conjecture.

    The good thing about modelling that far out is that you will never see your model’s accuracy being tested. Publish a paper, play into current fears around climate change with an irrelevant prediction about 250 million years away, get an article published in the New York Times, and egos are massaged all round.