schmurnan@lemmy.world

  • 5 Posts
  • 37 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • Yeah I had SearXNG running via a Docker container and it was pretty good. I didn’t like having to use a domain name and expose it over the internet though, because Docker is running on my NAS. I guess I could give it another try using Cloudflare tunnels so I don’t have to open anything up.

    Or else go back to Startpage.
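
    If I do revisit it, the tunnel side would presumably be a small container along these lines (a sketch only - the token and network name are placeholders, not a config I’ve tested):

    version: "3.7"
    
    services:
      cloudflared:
        container_name: cloudflared
        image: cloudflare/cloudflared:latest
        restart: unless-stopped
        command: tunnel --no-autoupdate run
        environment:
          # token generated in the Cloudflare Zero Trust dashboard when the tunnel is created
          - TUNNEL_TOKEN=xxxxxxxxxx
        networks:
          - searxng_network
    
    networks:
      searxng_network:
        external: true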


  • schmurnan@lemmy.world to Technology@lemmy.world · Why I Lost Faith in Kagi · edited 7 months ago

    My 100-search trial expired this week and I was literally planning on subscribing later tonight. This has made me think twice.

    But it takes me back to why I tried Kagi in the first place: What else can I use that respects privacy?

    I don’t think any of them do completely. DuckDuckGo uses Bing, so that’s Microsoft; Google is… well, Google; Brave is apparently really shady; and I’ve never thought much of the results from Bing directly. Startpage seemed OK but apparently uses Google.

    What else?

    I also like something to be integrated into the browser. As a Mac user, I can’t add new search engines to Safari (and have actually switched to Orion, but may now switch to Firefox or back to Safari).


  • Sorry, I wasn’t classing Chrome and Chromium as the same thing - I’ve been a software developer for 20 years, so I know they’re not. I guess I just took the opportunity to state that I don’t use Google services/products if I can help it.

    At work we’re a Windows house, but I’ve managed to get my hands on an M2 MacBook Pro. For now I’m still using Edge, but I’d like to get my iCloud exemption so I can use some of the apps from my personal MBP for work. I’m wondering whether I should keep Edge for work and A. N. Other browser for personal (and mirror this on my iPhone), or whether to use profiles on Safari, for example, and split it that way. I might be limited in what I can download on the work machine, but I’d like to unify everything as much as possible rather than have two completely different Mac experiences with my iPhone sort of thrown in the middle of both.

    Which browser do you prefer? I assume a Chromium-based derivative?


  • I have/had a ProtonMail account, and whilst it was great, I believe it was only end-to-end encrypted when sending emails to other people using ProtonMail…? Or at least that was my understanding at the time.

    The apps back then weren’t particularly polished, so I ended up migrating everything back to iCloud.

    To be honest, I don’t seem to have any issues with iCloud and everything just works. But that’s the problem with Apple, and how they “get” you.


  • I could be misinformed, but as I understand it this isn’t limited to Spark; I believe a lot of (maybe all?) third-party clients do the same thing. They act as an intermediary between you and the server so they can deliver push notifications.

    However, as I understand it, Spark’s privacy policy outlines that they don’t read/scan the contents of your emails, and the use of app-specific passwords rather than your email password ensures they only have access to emails and nothing else.

    Pretty sure others such as Canary, Airmail, Edison, etc. all do/did the same thing, but it was the lack of clarity in Spark’s privacy policy that made them the main target for scrutiny. I think they’ve since cleared that up.

    I could be mistaken, though.


  • I replied to another comment on here saying that I’d tried this once before, via a Docker container, but just wasn’t getting any results back (kept getting timeouts from all the search engines).

    I’ve just revisited it, and still get the timeouts. Reckon you’re able to help me troubleshoot it?

    Below are the logs from Portainer:

     File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 09:58:13,651 ERROR:searx.engines.soundcloud: Fail to initialize
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/network/__init__.py", line 96, in request
        return future.result(timeout)
               ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result
        raise TimeoutError()
    TimeoutError
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize
        self.engine.init(get_engine_from_settings(self.engine_name))
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init
        guest_client_id = get_client_id()
                          ^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id
        response = http_get("https://soundcloud.com")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 09:58:13,654 ERROR:searx.engines.soundcloud: Fail to initialize
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/network/__init__.py", line 96, in request
        return future.result(timeout)
               ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result
        raise TimeoutError()
    TimeoutError
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize
        self.engine.init(get_engine_from_settings(self.engine_name))
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init
        guest_client_id = get_client_id()
                          ^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id
        response = http_get("https://soundcloud.com")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikidata: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.google: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.qwant: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.startpage: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikibooks: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikiquote: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikisource: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikipecies: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikiversity: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikivoyage: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.brave: engine timeout
    2023-08-06 10:02:05,481 WARNING:searx.engines.wikidata: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,481 ERROR:searx.engines.wikidata: HTTP requests timeout (search duration : 6.457878380082548 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,482 WARNING:searx.engines.wikisource: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,484 ERROR:searx.engines.wikisource: HTTP requests timeout (search duration : 6.460748491808772 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,485 WARNING:searx.engines.brave: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,485 ERROR:searx.engines.brave: HTTP requests timeout (search duration : 6.461546086706221 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,487 WARNING:searx.engines.google: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,487 ERROR:searx.engines.google: HTTP requests timeout (search duration : 6.463769535068423 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,489 WARNING:searx.engines.wikiversity: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,489 ERROR:searx.engines.wikiversity: HTTP requests timeout (search duration : 6.466003180015832 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,490 WARNING:searx.engines.wikivoyage: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,490 ERROR:searx.engines.wikivoyage: HTTP requests timeout (search duration : 6.466597221791744 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,490 WARNING:searx.engines.qwant: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,490 ERROR:searx.engines.qwant: HTTP requests timeout (search duration : 6.4669976509176195 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,491 WARNING:searx.engines.wikibooks: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,491 ERROR:searx.engines.wikibooks: HTTP requests timeout (search duration : 6.4674198678694665 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,491 WARNING:searx.engines.wikiquote: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,492 WARNING:searx.engines.wikipecies: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,492 ERROR:searx.engines.wikiquote: HTTP requests timeout (search duration : 6.468321242835373 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,492 ERROR:searx.engines.wikipecies: HTTP requests timeout (search duration : 6.468797960784286 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,496 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,497 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 6.47349306801334 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,511 WARNING:searx.engines.startpage: ErrorContext('searx/engines/startpage.py', 214, 'resp = get(get_sc_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,511 ERROR:searx.engines.startpage: HTTP requests timeout (search duration : 6.487425099126995 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:04:27,475 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:04:27,770 WARNING:searx.engines.duckduckgo: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:04:27,771 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.2968566291965544 s, timeout: 3.0 s) : TimeoutException
    2023-08-06 10:04:50,094 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:04:50,187 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.ConnectTimeout', None, (None, None, 'duckduckgo.com')) False
    2023-08-06 10:04:50,187 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.0933595369569957 s, timeout: 3.0 s) : ConnectTimeout
    

    The above is a simple search for “best privacy focused search engines 2023”, followed by the same search again but with the !ddg bang in front of it.
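
    One thing I haven’t tried yet is raising the outgoing timeouts in settings.yml, in case the engines are just slow to respond from my network. Something like this (values are guesses, not a known fix):

    # excerpt from SearXNG's settings.yml
    outgoing:
      request_timeout: 10.0      # the logs above show 3.0 s and 6.0 s cut-offs
      max_request_timeout: 15.0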

    I can post my docker-compose if it helps?


  • Before putting Pi-hole behind Traefik, it worked perfectly via IP:port/admin. The logs for Pi-hole now in Traefik show that it’s up and working, and I get the login page - I just can’t get beyond it.

    The guides I’ve seen show how to structure the Traefik labels with and without the addprefix middleware, and both apparently work. So I’m wondering if by following several guides and taking bits from each, I’ve ended up overlooking something.
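
    From memory, the two variants boil down to something like the below (label and middleware names lifted from the guides rather than verified by me):

    # variant 1: rewrite every incoming request by prefixing the path with /admin
    - traefik.http.middlewares.pihole-addprefix.addprefix.prefix=/admin
    # variant 2: redirect only the bare root to /admin ($$ escapes $ inside docker-compose)
    - traefik.http.middlewares.pihole-redirect.redirectregex.regex=^(https?://[^/]+)/?$$
    - traefik.http.middlewares.pihole-redirect.redirectregex.replacement=$${1}/admin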

    I’ll try and expose 80 and see if it makes a difference, but like I say everything is up and running in the backend, I just can’t get past the login screen on the frontend.


  • Just a quick update on where I’m up to…

    I’ve managed to get all my containers working behind the Traefik reverse proxy with SSL. I’ve also deployed a Cloudflare DDNS container in Docker and have linked the external IP address of my Synology NAS to Cloudflare. I haven’t port forwarded 80 and 443, though, so it’s not accessible over the internet. So I’ve added local DNS into Pi-hole so I can access all the containers using subdomains.
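
    For reference, the DDNS piece is just a small container along these lines (image and variable names from memory, so treat it as a sketch):

    services:
      cloudflare-ddns:
        image: oznu/cloudflare-ddns:latest
        restart: unless-stopped
        environment:
          - API_KEY=xxxxxxxxxx      # scoped Cloudflare API token
          - ZONE=mydomain.com
          - SUBDOMAIN=nas
          - PROXIED=false           # 80/443 aren't forwarded, so proxying wouldn't help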

    I’ve also deployed an Authelia container and have started running through my containers adding 2FA in front of them all.
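
    The pattern I’m using for each container is the standard Authelia forwardAuth middleware in Traefik, roughly like this (hostnames are mine, swap in your own):

    - traefik.http.middlewares.authelia.forwardauth.address=http://authelia:9091/api/verify?rd=https://auth.mydomain.com
    - traefik.http.middlewares.authelia.forwardauth.trustForwardHeader=true
    - traefik.http.middlewares.authelia.forwardauth.authResponseHeaders=Remote-User,Remote-Groups,Remote-Name,Remote-Email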

    I should probably point out at this juncture that if I encounter any errors, the HTTP 404 page I get is a Cloudflare one - I assume that’s expected behaviour?

    So, the final three bits I’m struggling with now are:

    • Pi-hole behind the reverse proxy
    • Portainer behind the reverse proxy
    • Accessing Vaultwarden over the internet (because as soon as I leave my house, if the vault hasn’t synced then I don’t have access to all my passwords) - unless anybody has a better suggestion?

    Portainer - I have no idea how to do it, because I use Portainer to manage my containers, so I don’t have the config for Portainer in Portainer (obviously). So if I screw up the config, how am I getting back into Portainer to fix it?
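
    My current thinking is to keep Portainer itself in a compose file that I run from the NAS shell, so a bad Traefik label can always be fixed outside Portainer. Something like (paths are placeholders):

    # run with `docker compose up -d` from the NAS shell, outside Portainer itself
    services:
      portainer:
        image: portainer/portainer-ce:latest
        container_name: portainer
        restart: unless-stopped
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock
          - /path/to/portainer:/data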

    And the far more troubling one is Pi-hole. I just cannot get that thing working behind the reverse proxy.

    I’ve followed a few different guides (though none of them are recent), and below is the latest docker-compose I have. It will bring up the login page, but when I log in it keeps returning me to the login page - it won’t go to the main admin page.

    version: "3.7"
    
    services:
      pihole:
        container_name: pihole
        image: pihole/pihole:latest
        restart: unless-stopped
        networks:
          - medianet
          - npm_network
        ports:
          - 8008:80
          - 53:53/tcp
          - 53:53/udp
        environment:
          - TZ=Europe/London
          - WEBPASSWORD=xxxxxxxxxx
          - FTLCONF_LOCAL_IPV4=192.168.1.116
          - WEBTHEME=default-auto
          - DNSMASQ_LISTENING=ALL
          - VIRTUAL_HOST=pihole.mydomain.com
        volumes:
          - /path/to/pihole:/etc/pihole
          - /path/to/pihole/dnsmasq.d:/etc/dnsmasq.d
        cap_add:
          - NET_ADMIN
        labels:
          - traefik.enable=true
          - traefik.http.routers.pihole.entrypoints=http
          - traefik.http.routers.pihole.rule=Host(`pihole.mydomain.com`)
          - traefik.http.middlewares.pihole-https-redirect.redirectscheme.scheme=https
          - traefik.http.middlewares.pihole-addprefix.addprefix.prefix=/admin
          # chain both middlewares on one label; repeating the middlewares= label overrides the earlier value
          - traefik.http.routers.pihole.middlewares=pihole-https-redirect,pihole-addprefix
          - traefik.http.routers.pihole-secure.entrypoints=https
          - traefik.http.routers.pihole-secure.rule=Host(`pihole.mydomain.com`)
          - traefik.http.routers.pihole-secure.tls=true
          - traefik.http.routers.pihole-secure.service=pihole
          - traefik.http.services.pihole.loadbalancer.server.port=80
    
    networks:
      medianet:
        external: true
      npm_network:
        external: true