Alt text: A screenshot of a file manager preview window for my ~/.cache folder, which takes up 164.3 GiB and contains 246,049 files and 15,126 folders. The folder was first created about 1.75 years ago, along with my system.

    • Zangoose@lemmy.world (OP) · 11 months ago

      Looks like yay is storing every previous binary for AUR bin packages. (Also, excuse the unreadable terminal theme; it doesn’t play very well with a lot of TUI apps unless they support custom theming.)

      • Bronco1676@lemmy.ml · 11 months ago

        You should run yay -Sc from time to time. This cleans a) your pacman cache (which is normally done by running pacman -Sc) and b) your AUR build cache, which is what’s taking up the 160 GB here. That said, this much seems rather unusual; I use paru (which also has a paru -Sc command), so I can’t really tell whether this is normal for yay.

        The command also asks you, for every directory, whether you want to delete it, so it’s completely safe to run.
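
        For example (a rough sketch assuming the default cache locations for yay and paru; adjust the paths if you’ve configured them differently), you can check how big the build cache is before cleaning it:

        # Show the size of the AUR helper build caches (default paths assumed)
        du -sh ~/.cache/yay ~/.cache/paru 2>/dev/null

        # Clean the pacman cache and the AUR build cache; yay prompts before each step
        yay -Sc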

        • Zangoose@lemmy.world (OP) · 11 months ago

          Something I noticed was that it was mostly the binary packages that were taking up so much space. It may be because of how yay stores the programs (does it use git?). The ones compiled from source code usually took up the least space, while the binary packages were the ones taking up tens of gigabytes.
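
          If anyone wants to check the same thing, listing the per-package cache directories by size makes the big ones obvious (this assumes yay’s default build directory, ~/.cache/yay):

          # List yay's per-package build directories, largest last (default cache path assumed)
          du -sh ~/.cache/yay/* | sort -h | tail -n 15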

          • Bronco1676@lemmy.ml · 11 months ago

            Indeed, yay uses the AUR, where each package is essentially a Git repository. These repositories typically contain a PKGBUILD file and a .SRCINFO file, along with possible additional files like patches, .desktop files, or service files.

            For example, take a look at IntelliJ Ultimate: [https://aur.archlinux.org/cgit/aur.git/tree/?h=intellij-idea-ultimate-edition]. It contains the .SRCINFO and PKGBUILD, as well as a .desktop file. These files themselves do not occupy much space.
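
            Just to illustrate (the clone URL format for any AUR package is https://aur.archlinux.org/<pkgname>.git), you can clone that repo yourself and see it’s only a handful of small text files:

            # Clone the package's AUR Git repository and look inside
            git clone https://aur.archlinux.org/intellij-idea-ultimate-edition.git
            du -sh intellij-idea-ultimate-edition    # the repo itself is tiny
            ls -A intellij-idea-ultimate-edition     # PKGBUILD, .SRCINFO, the .desktop file, ...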

            The PKGBUILD specifies the upstream sources to download. For instance:

            source=("https://download.jetbrains.com/idea/ideaIU-$pkgver.tar.gz"
                    "jetbrains-idea.desktop")
            

            The PKGBUILD is essentially a Bash script with predefined functions and variables. You can learn more about it here: [https://wiki.archlinux.org/title/PKGBUILD].

            When makepkg runs this script, it downloads and extracts the tar file from the source array. In this specific case, the package() function then only relocates the files to their intended installation locations, like moving the .desktop file to /usr/share/applications.
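
            As a rough sketch (not the actual IntelliJ PKGBUILD; the package name, URL, and paths below are made up for illustration), a binary-style PKGBUILD looks roughly like this:

            # Illustrative sketch of a binary-style PKGBUILD, not a real package
            pkgname=example-app-bin
            pkgver=1.0.0
            pkgrel=1
            pkgdesc="Sketch of a prebuilt binary package"
            arch=('x86_64')
            source=("https://example.com/example-app-$pkgver-linux-x64.tar.gz"
                    "example-app.desktop")
            sha256sums=('SKIP'
                        'SKIP')

            package() {
                # makepkg has already downloaded and extracted the tarball into $srcdir;
                # package() only copies the files to their install locations under $pkgdir
                install -d "$pkgdir/opt/example-app"
                cp -r "$srcdir/example-app-$pkgver/." "$pkgdir/opt/example-app/"
                install -Dm644 "$srcdir/example-app.desktop" \
                    "$pkgdir/usr/share/applications/example-app.desktop"
            }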

            With such packages, there’s a possibility of wasting significant space since the tar file is downloaded and possibly retained in the cache.
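
            If you want to check whether that’s what is eating the space here, something like this should list the large retained source archives (again assuming yay’s default build directory):

            # Find large source archives kept in yay's build cache (default path assumed)
            find ~/.cache/yay -name '*.tar.*' -size +1G -exec du -h {} + | sort -h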

            However, other packages, especially those compiled from source, usually involve Git clones. These clones bring the Git repository into a subdirectory of the already cloned AUR package Git repo. Some might also have source tarballs. These types of packages generally do not consume much space in the cache, as they are often just text files, like C source code or Python scripts. These packages frequently rely on external libraries and packages, which are not included in this package’s cache.

            Binary packages, on the other hand, often bundle all necessary libraries and other components in their source tarballs.
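
            In PKGBUILD terms, the difference usually shows up in the source array, roughly like this (placeholder URLs, just to show the shape):

            # Source-build package: a small text checkout; build dependencies come from the repos
            source=("git+https://github.com/upstream/project.git")

            # Binary package: one large tarball with everything bundled inside
            source=("https://example.com/bigapp-$pkgver-linux-x64.tar.gz")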

            The AUR cache is mostly beneficial if you’re rebuilding the same version or can reuse components from a previous version. For example, a package might depend on a large, static file that doesn’t change often.

            In Paru, I’ve enabled the “CleanAfter” option to prevent my cache from overflowing. Given my relatively fast internet speed, redownloading large files isn’t a major concern for me.
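
            For reference, that’s just a single option in paru.conf (the system-wide file is /etc/paru.conf; a per-user copy can live at ~/.config/paru/paru.conf):

            # paru.conf: clean up the build files after a successful install
            [options]
            CleanAfter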