@fenndev@leminal.space
TL;DR: Is it possible (and if so, desirable) to configure my OPNsense router to handle non-standard traffic instead of needing to configure each client device manually? Examples of what I mean by 'non-standard traffic' include Handshake, I2P, ZeroNet, and Tor.
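What I'm imagining, for Tor at least, is the transparent-proxy pattern: the router runs the daemon, and a firewall rule redirects selected clients' traffic into it, so individual devices need no configuration. A minimal torrc sketch of that idea (the LAN address, ports, and virtual address range are placeholders, and I2P/ZeroNet/Handshake would each need their own daemon and rules):

# Run a transparent proxy and DNS listener on the router's LAN interface
TransPort 192.168.1.1:9040
DNSPort 192.168.1.1:5353
# Map .onion names onto a virtual address range so redirected TCP can follow them
VirtualAddrNetworkIPv4 10.192.0.0/10
AutomapHostsOnResolve 1

The remaining piece would be OPNsense NAT rules redirecting client DNS to port 5353 and TCP to port 9040.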
I'm new to electronics and looking to assemble an array of components and tools for working on and designing electronics & circuits. Something immediately apparent is that all of the widely available kits orient you towards working with microcontrollers and SBCs; these kits are cool, but I want to have a halfway decent understanding of the underlying analog components and circuit design before I go digital.
With that in mind, what should I get? If anyone could suggest specific components or tools to look into, I'd really appreciate it! Thanks for the help.
Edit: Thanks for the help, the issue was solved! I had Traefik's loadbalancer set to route to port 8081 instead of the container's internal port of 80. Whoops.
Hi everyone. I've been busy configuring my homelab and have run into issues with Traefik and Vaultwarden running within Podman. I've already successfully set up Home Assistant and Homepage, but for the life of me I cannot get Vaultwarden working. I'm hoping a fresh pair of eyes will be able to spot something I missed or provide some advice. I've tried to provide all the information and logs relevant to the situation.
Expected Behavior:
- Requests to *.fenndev.network are sent to my Traefik server.
- Requests to vault.fenndev.network are forwarded to Vaultwarden.
- Vaultwarden is served at https://vault.fenndev.network and utilizes the wildcard certificates generated by Traefik.
Current Behavior:
- I receive a 502 Bad Gateway error with Vaultwarden.
Other Information:
- Both Traefik and Vaultwarden are attached to the fenndev_default network.
- Port 8081 is open on my firewall and the service is reachable at {SERVER_IP}:8081.
- 10.89.0.132 is the internal Podman IP address of the Vaultwarden container.
Server: AlmaLinux 9.4
Podman: 4.9.4-rhel
Traefik: v3
Vaultwarden: alpine-latest (1.30.5-alpine I believe)
Traefik Log:
2024-05-11T22:09:53Z DBG github.com/traefik/traefik/v3/pkg/server/service/proxy.go:100 > 502 Bad Gateway error="dial tcp 10.89.0.132:8081: connect: connection refused"
cURL to URL:
[fenndev@bastion ~]$ curl -v https://vault.fenndev.network
* Trying 192.168.1.169:443...
* Connected to vault.fenndev.network (192.168.1.169) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
* TLSv1.0 (OUT), TLS header, Certificate Status (22):
vaultwarden.container file:
[Unit]
Description=Password
After=network-online.target
[Service]
Restart=always
RestartSec=3
[Install]
# Start by default on boot
WantedBy=multi-user.target default.target
[Container]
Image=ghcr.io/dani-garcia/vaultwarden:latest-alpine
Exec=/start.sh
EnvironmentFile=%h/.config/vault/vault.env
ContainerName=vault
Network=fenndev_default
# Security Options
SecurityLabelType=container_runtime_t
NoNewPrivileges=true
# Volumes
Volume=%h/.config/vault/data:/data:Z
# Ports
PublishPort=8081:80
# Labels
Label=traefik.enable=true
Label=traefik.http.routers.vault.entrypoints=web
Label=traefik.http.routers.vault-websecure.entrypoints=websecure
Label=traefik.http.routers.vault.rule=Host(`vault.fenndev.network`)
Label=traefik.http.routers.vault-websecure.rule=Host(`vault.fenndev.network`)
Label=traefik.http.routers.vault-websecure.tls=true
Label=traefik.http.routers.vault.service=vault
Label=traefik.http.routers.vault-websecure.service=vault
Label=traefik.http.services.vault.loadbalancer.server.port=8081
Label=homepage.group="Services"
Label=homepage.name="Vaultwarden"
Label=homepage.icon=vaultwarden.svg
Label=homepage.description="Password Manager"
Label=homepage.href=https://vault.fenndev.network
vault.env file:
LOG_LEVEL=debug
DOMAIN=https://vault.fenndev.network
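For reference, the one-line fix mentioned in the edit at the top: because Traefik reaches Vaultwarden over the shared fenndev_default network rather than through the published host port, the loadbalancer label has to name the container's internal port:

Label=traefik.http.services.vault.loadbalancer.server.port=80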
cross-posted from: https://leminal.space/post/6179210
I have a collection of ~110 4K Blu-Ray movies that I've ripped, and I want to take the time to compress and store them for use on a future Jellyfin server.
I know the very basics of ffmpeg and general codec information, but I have a very specific set of goals in mind, and I'm hoping someone could point me in the right direction:
- Smaller file size (obviously)
- Image quality good enough that I cannot spot the difference, even on a high-end TV or projector
- Preserved audio
- Preserved HDR metadata
In a perfect world, I would love to convert the proprietary HDR and the Dolby Atmos audio into open standards, but the goals above are a good compromise.
Assuming that I have the hardware necessary to do the initial encoding, and my server will be powerful enough for transcoding in that format, any tips or pointers?
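The rough shape of command I have in mind is an x265 sketch like the following (filenames are placeholders, and the CRF/preset values are guesses to be tuned on short test clips first):

# 10-bit HEVC encode, keeping every stream and signaling HDR10
ffmpeg -i input.mkv \
  -map 0 \
  -c:v libx265 -preset slow -crf 16 \
  -pix_fmt yuv420p10le \
  -x265-params "hdr10=1:repeat-headers=1" \
  -c:a copy -c:s copy \
  output.mkv

Copying the audio keeps the Atmos/TrueHD track bit-for-bit, and I believe recent ffmpeg builds pass HDR10 color metadata through to x265; Dolby Vision enhancement layers are a separate problem, as far as I can tell.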
cross-posted from: https://leminal.space/post/4761745
Shortly before the recent removal of Yuzu and Citra from Github, attempts were made to back up and archive both Github repos; it's my understanding that these backups, forks, etc. are fairly incomplete, either lacking full Git history or lacking Pull Requests, issues, discussions, etc.
I'm wondering if folks here have information on how to perform thorough backups of public, hosted git repos (e.g. Github, Gitlab, Codeberg, etc.). I'd also like to automate this process if I can.
git clone --mirror is something I've looked into for a baseline, with backup-github-repo looking like a decent place to start for what isn't covered by git clone; a rough sketch of what I'm picturing is below the list.
The issues I can foresee:
- Each platform builds its own tooling atop Git, like Issues and Pull Requests from Github
- Automating this process might be tricky
- Not having direct access/contributor permissions for the Git repos might complicate things, but I'm not sure
I'd appreciate any help you could provide.
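To make the baseline concrete, here's the sketch I mentioned, with OWNER/REPO as placeholders: a mirror clone for the Git data itself, plus the platform's REST API for everything built on top, e.g. on Github:

# Full Git data: every branch, tag, and ref
git clone --mirror https://github.com/OWNER/REPO.git
# Refresh the mirror later
git -C REPO.git remote update

# Issues (Github's issues endpoint includes pull requests as well);
# real use would need to follow the Link headers for pagination
curl -s "https://api.github.com/repos/OWNER/REPO/issues?state=all&per_page=100" > issues.json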