First off, I want to apologize for the silence over the past couple of weeks. Work and life got in the way of administering this instance.
Onto the good stuff.
As some of you may have noticed, we skipped 0.18.0 because of some unforeseen issues, but we're now on 0.18.1. In my (admittedly minimal) testing, the upgrade seems to have gone through largely smoothly! Please do let me know if you see any weirdness. (Some old themes might be borked; please update your own interface accordingly.)
I am aware that Jerboa was completely broken; hopefully it works now (I can't test it myself since I don't have access to an Android device).
Time for some stats:
$ df -h
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           392M  1.7M  390M   1% /run
/dev/vda1        94G   21G   69G  23% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           392M  4.0K  392M   1% /run/user/1002
$ free -m
               total        used        free      shared  buff/cache   available
Mem:            3911         475         142         140        3294        3020
Swap:           2399          88        2311
Another important change: I upgraded the instance to a mid-tier Vultr plan (we did run out of disk space on the old one). Here's the new plan:
AMD High Performance 2 vCPU, 4096 MB RAM, 100 GB NVMe, 5.00 TB Transfer
And last month's Vultr stats:

This brings our costs (there are occasional bumps because of some Vultr snapshot nonsense) to roughly:
domain: $12/year
vultr: $24 (instance) + $4.80 (backups) = $28.80/month = $345.60/year
email: still free tier on zoho, woo!
total: $357.60/year
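Spelled out in code, the cost arithmetic above (done in cents to avoid floating-point issues; the figures are straight from the list):

```shell
# Yearly cost arithmetic, in cents, using the figures from the list above.
instance_m=2400   # $24.00/month (instance)
backups_m=480     # $4.80/month (backups)
domain_y=1200     # $12.00/year (domain)
monthly=$((instance_m + backups_m))    # instance + backups, per month
yearly=$((monthly * 12 + domain_y))    # twelve months plus the domain
printf 'monthly: $%d.%02d\n' $((monthly / 100)) $((monthly % 100))
printf 'yearly:  $%d.%02d\n' $((yearly / 100)) $((yearly % 100))
```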
Let me know about any questions, concerns, or bugs you've noticed after the upgrade.
Cheers!
In case anyone was wondering, yes we were down for ~2 hours or so. I apologize for the inconvenience.
We had a botched upgrade path from 0.17.4 -> 0.18.0. I spent some time debugging but eventually gave up and restored a snapshot (taken on Saturday, Jun 24, 2023 @ 11:00 UTC).
We'll likely stick to 0.17.4 until I can figure out a safe path to upgrade to a bigger (and up-to-date) instance and carry over all the user data. Any help/advice is welcome. Hopefully this doesn't happen again!
Hey all,
It's been slightly over two weeks since lemmyrs started, and it's been pretty fun watching the community grow!
Some instance stats for you:
$ df -h
Filesystem      Size  Used Avail Use% Mounted on
tmpfs            97M  1.7M   96M   2% /run
/dev/vda1        24G   15G  7.1G  68% /
tmpfs           485M     0  485M   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs            97M  4.0K   97M   1% /run/user/1002
$ free -mh
               total        used        free      shared  buff/cache   available
Mem:           969Mi       445Mi        77Mi       134Mi       445Mi       240Mi
Swap:          2.3Gi       571Mi       1.8Gi
lemmy=# select count(id) from local_user;
 count
-------
   294
(1 row)
We're cutting it pretty close on RAM and disk usage, but the user growth rate has mostly flat-lined since r/rust came back online, so I'm not too concerned. When (if) it's time, I'll likely bump up the Vultr instance plan to something that will continue to serve us for the foreseeable future.
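Since disk usage is the thing to keep an eye on, here's a small sketch of the kind of check you could cron against `df` output. The 80% threshold and the function name are my own illustrative choices, not anything actually running on the instance:

```shell
# Hypothetical disk-usage check; threshold and naming are illustrative.
# In practice the percentage would come from: df --output=pcent /
check_disk() {
  local pct="$1" threshold="${2:-80}"   # default threshold: 80%
  if [ "$pct" -ge "$threshold" ]; then
    echo "WARN: disk at ${pct}%"
  else
    echo "OK: disk at ${pct}%"
  fi
}
check_disk 68   # 68% is the current Use% for / from the df output above
```

A cron job could pipe the `WARN` line to mail or a webhook; the point is just that alerting on `df` output is a one-liner.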
Previous relevant posts:
Hey all,
Just thought I'd share an update. I have added a few new communities and renamed the existing communities to have slightly more consistent naming throughout this instance. Icons are primarily from Wikimedia Commons (replacements welcome as long as there are no copyright issues).
Added:
- Rust: Web Development
- Rust: Game Development
- Rust: Embedded Systems
Renamed:
- Memes to Rust: Memes
- News to Rust: News
- Support to Rust: Support
- Meta to Rust: Meta
PS: The identifiers for the renamed communities remain the same. I'm open to any suggestions/thoughts on this change or anything else.
Cheers!
Hey everyone, thought I'd post some stats since we're one week old now!
From Vultr (instance is hosted through them):
Total applications: 116
Denied applications: 4 (one person asked to change their username; three others gave one-word answers to the application question)
Accepted applications: 112
docker stats (snapshot):
CONTAINER ID   NAME       CPU %   MEM USAGE / LIMIT     MEM %    NET I/O           BLOCK I/O         PIDS
7f365c848236   caddy      0.19%   43.22MiB / 969.4MiB   4.46%    7.23GB / 7.65GB   631MB / 146MB     8
d9421a5d930a   lemmy-ui   0.00%   49.62MiB / 969.4MiB   5.12%    1.51GB / 3.32GB   869MB / 1.26GB    11
e8850c310380   lemmy      0.08%   52.53MiB / 969.4MiB   5.42%    5.67GB / 5.86GB   942MB / 582MB     8
7ebb13fde277   postgres   0.02%   304.2MiB / 969.4MiB   31.38%   908MB / 2.97GB    3.82GB / 14.4GB   12
9b471baacf84   pictrs     0.05%   10.32MiB / 969.4MiB   1.06%    53.5MB / 1.18GB   653MB / 360MB     14
df -h:
Filesystem      Size  Used Avail Use% Mounted on
tmpfs            97M  1.7M   96M   2% /run
/dev/vda1        24G   12G   11G  53% /
tmpfs           485M     0  485M   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs            97M  4.0K   97M   1% /run/user/1002
https://strawpoll.com/polls/Q0ZpRL6ExnM
What's your opinion? Vote now: "Yes! I would feel more confident if a non-profit foundation were running this" or "No! I want this instance to be completely independent."
I've had a few questions about trouble accessing other communities from here.
First and foremost, I ask for your patience; Lemmy is...alpha software at best, imho. There are 200+ open issues on GitHub right now and very few maintainers. No one expected things to take the turn they did within a matter of days, but here we are :)
Biggest known issues:
This is out of my hands, and I can confirm that it's busted. I tested locally, and the current main lemmy backend branch is incompatible with the lemmy-ui branch; you can't even log in if you set everything up locally.
The good news is that there's a shoddy workaround. Say you want to access c/gaming from beehaw.org. Enter the full URL https://beehaw.org/c/gaming in your search; it won't show up. Click search a couple of times, wait a second, then enter just gaming and it pops up magically.
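What that dance does under the hood is trigger the backend to fetch the remote community over federation. Assuming Lemmy's v3 HTTP API search endpoint (`/api/v3/search` with a `type_` parameter), you can trigger the same fetch from the command line. The helper below only builds the request URL, with the instance and community from the example above as placeholders:

```shell
# Hypothetical helper: build the Lemmy search URL that triggers a
# federation fetch of a remote community. The /api/v3/search endpoint
# and the "type_" parameter are from Lemmy's v3 HTTP API; the instance
# and community below are just the example from the post.
search_url() {
  local instance="$1" remote="$2"
  printf 'https://%s/api/v3/search?q=%s&type_=Communities\n' "$instance" "$remote"
}
search_url "lemmyrs.org" "https://beehaw.org/c/gaming"
```

In a real request you'd want the query value URL-encoded, e.g. `curl -G "https://lemmyrs.org/api/v3/search" --data-urlencode "q=https://beehaw.org/c/gaming" --data-urlencode "type_=Communities"`, which handles the encoding for you.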
There's jerboa for Android and mlem for iOS. Both are under heavy development. Thankfully the website works fine on mobile...mostly.
PS: I'm not a lemmy maintainer, just a hobbyist self-hoster and professional Rust developer trying the fediverse as much as y'all are :)
https://geekingfrog.com/blog/post/getting-things-done-with-async
I've noticed some questions about whether lemmyrs.org will continue to be up and running for the long term. I'm hopeful that it will.
For full transparency, here's what I'm currently personally paying for:
AMD High Performance 1 vCPU, 1024 MB RAM, 25 GB NVMe, 2.00 TB Transfer
Total cost (yearly): $96 + (some tax).
As things stand, ~$100/year is easily affordable, but as the number of users grows, costs largely boil down to egress and storage. I can personally bear most of it, but if things start booming I'll have to rethink our options.
Rest assured, we will be here for the long run!
I'm just one person here, if this gains traction I'm gonna need some help with moderation and administration. Keeping this open to discuss the future possibilities!
@admin@lemmyrs.org