I made a simple self-hosted subscription cost tracker in less than 30 minutes!
submitted by /u/Available-Advice-294
Happy Friday, r/selfhosted! Linked below is the latest edition of This Week in Self-Hosted, a weekly newsletter recap of the latest activity in self-hosted software.
This week covers beta releases for Thunderbird mobile and Borg Backup rclone support, new software launches (many generated by this community!), directory additions, and a spotlight on Streamyfin - a simple and user-friendly mobile Jellyfin client with a ton of features.
The ability to remain installed and undetected makes Perfctl hard to fight. ~ Dan Goodin – 4 Oct 2024 00:42
I'm registering a domain name to use for an email address, and I want to plan for everything so I don't get locked out of my account. I'm taking into consideration the registrar deleting the domain or terminating my account (I'm not doing anything suspicious, but I want to account for that possibility).
Is it possible to register a domain directly with the registry to cut out the registrar? The TLD I want is controlled by CentralNIC in this case.
Am I being overly paranoid? How do others handle this? If something happens to the domain name, it can be a huge pain in the ass to change the email address on some services away from one that no longer works. I'm trying to avoid that.
The average bandwidth/egress fee for most cloud providers (Vercel, Google Cloud, AWS, etc.) is somewhere around $0.10-$0.20/GB (significantly cheaper for VPS/dedicated servers, but still expensive imo).
I'm an individual living in a French province in the middle of nowhere, and I pay $30/month for an 8 Gb/s internet plan with unlimited data transfer. (Some will argue there's probably a "fair use" policy, but in two decades I've never heard of a single person getting banned for using too much bandwidth, including people who use their personal internet plan as a heavy torrent seedbox.)
So for $30 I can download/upload thousands and thousands of terabytes every month, but companies specializing in cloud/hosting services charge me $0.15 per gigabyte?!
Either my internet provider is losing a lot of money, or these cloud companies are scamming me. Or is there something else I'm missing? Who really pays for bandwidth in the end? When you need cloud computation, I understand that AWS and the others buy CPUs/GPUs and electricity, build datacenters and servers, and rent them to us. But bandwidth? They're probably not the ones building the worldwide internet infrastructure with transatlantic cables etc., so who pays for that? Where does the money go? How much does it cost? Who is getting scammed?
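A quick back-of-the-envelope calculation (my own rough numbers, not the poster's) shows the scale of the gap between a residential plan and cloud egress pricing:

```shell
# Rough math: what an 8 Gb/s link could move in a month, priced at a
# typical $0.15/GB cloud egress rate (all figures are rough assumptions).
gbps=8
seconds_per_month=$((30 * 24 * 3600))            # 2,592,000 seconds
gb_per_month=$((gbps * seconds_per_month / 8))   # bits -> bytes: 2,592,000 GB (~2.6 PB)
egress_cost=$((gb_per_month * 15 / 100))         # at $0.15/GB
echo "Theoretical max transfer: ${gb_per_month} GB/month"
echo "Cloud egress price for that volume: \$${egress_cost}"
```

In other words, saturating the line for a month would cost roughly $388,800 at cloud rates, against a $30 flat fee. (In practice no one saturates a residential line 24/7, which is exactly what "fair use" oversubscription relies on.)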
Repo: https://github.com/hazzuk/compose-backupdate
I currently run my Docker containers inside a virtual machine, and while I do create daily backups, I'm only backing up the entire VM, which I've never been fully satisfied with for two reasons.
During my search for a solution I found this Reddit post, which shared some similar grievances with the tools I had found so far.
So I just ended up making my own tool.
backupdate is a Bash script for creating scheduled backups and performing (backed-up) guided updates on Docker Compose stacks.
Simply do backupdate -s "nginx" -d "/path/to/docker" -b "/path/to/backup" and it will safely create .tar.gz backups of the stack's working directory (compose.yaml files) and any associated named volumes. Or apply -u to perform a backup alongside a guided update. Plus, if you're already inside the compose.yaml directory, you can do something even shorter like backupdate -u -b "/path/to/backup".
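For a sense of what such a backup involves, here is a rough sketch of the equivalent manual steps. This is my own approximation, not the script's actual logic; the docker and tar invocations are illustrative:

```shell
# Sketch of a compose-stack backup, roughly what a tool like backupdate automates.
# Stops the stack, archives the working directory and named volumes, restarts it.
backup_stack() {
  local stack="$1" docker_dir="$2" backup_dir="$3"
  cd "$docker_dir/$stack" || return 1
  docker compose stop                              # quiesce the stack for consistency
  tar -czf "$backup_dir/$stack-workdir.tar.gz" .   # archive the compose.yaml directory
  # archive each named volume via a throwaway container
  for vol in $(docker volume ls -q --filter "name=${stack}"); do
    docker run --rm -v "$vol":/data -v "$backup_dir":/backup alpine \
      tar -czf "/backup/$vol.tar.gz" -C /data .
  done
  docker compose start                             # bring the stack back up
}
```

Stopping the stack before archiving is the important part: tarring a live database volume can produce a corrupt backup.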
Cloud backups:
To keep things simple, I didn't want to reinvent the wheel when it came to uploading to the cloud; I'd much rather rely on tried and tested tools like rclone. I quickly came to understand why it has 46k stars on GitHub: it was surprisingly easy to learn (but with lots of features) and rather amazingly works with almost any cloud storage provider.
And so backupdate's focus is simply and safely providing a way to get your data out of Docker. After that you can use any preferred cloud/local backup solution, and automate it with something like a cron job or a tool like Cronicle.
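For example, a crontab entry like the following would run a nightly backup and push it to the cloud (the rclone remote name "myremote" is a placeholder for whatever your own rclone config defines):

```
# m h dom mon dow  command
0 3 * * * backupdate -s "nginx" -d "/path/to/docker" -b "/path/to/backup" && rclone copy "/path/to/backup" myremote:docker-backups
```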
You can find backupdate on GitHub. And any feedback, bugs or feature requests are greatly appreciated. 💚
Hi all,
I need to make Paperless-ngx and Nextcloud available to some not technically inclined people (parents ;) ) so they can access tax documents etc.
As these are sensitive documents (e.g. contracts), access should be as secure as possible. However, I cannot use WireGuard (they wouldn't want any additional software installed on their PC).
Also, I would like to use the same credentials for Nextcloud and Paperless-ngx, so I'm thinking of using something like Authentik.
Any recommendations?
Backstory: this morning, someone tried to break into my home, so I would like to install security cameras. My questions start here: for some time I have been thinking about building a home server with sensors (smoke, or maybe others) and, if possible, data storage and sharing on the local network, maybe also music streaming, etc. Now I would like to add CCTV functionality on top of all that. Is it possible to do all of this on one server, and which OS should I use?
Hi, as the title says, I'm looking for a good and inexpensive case for my new 3.5" HDD. I can only use USB-A 3.0 on my Intel NUC.
I ordered 4 different cases on Amazon, but all of them had buttons to turn on the drive. So when the server needs to restart, I always have to turn the HDD on manually. This is really annoying.
Do you guys have a recommendation?
https://github.com/deskangel/DaRemote
This is THE best fucking SSH/monitoring/Docker dashboard app I have found yet: individual core stats, Docker container controls and stats, SSH, system overview, storage overview including mounted subsystems/drives/RAID.
I 100% recommend it any chance I get, and I purchased Pro.
To the maker/team:
You are amazing. Thank you.
Hi all, I have tried creating a tunnel with the purpose of building an API that I can access from anywhere. I have the tunnel set up as shown. This is my config for the tunnel running on my Raspberry Pi, at /root/.cloudflared/config.yml.
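The config contents didn't survive the post, but for reference, a typical config.yml routing a public hostname to a local service looks like this (the tunnel ID and file paths are placeholders, not the poster's actual values):

```yaml
tunnel: <tunnel-id>
credentials-file: /root/.cloudflared/<tunnel-id>.json

ingress:
  - hostname: api.mydomainname.com
    service: http://localhost:5000   # the local API port
  - service: http_status:404         # required catch-all rule
```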
I am running sudo cloudflared tunnel run tunnelid. I have code that gets data with SELECT * FROM test_db, and it works when I go to 192.168.0.168:5000 on my local network. However, when I go to api.mydomainname.com I get an error 1033. I'm in desperate need of help, please.
Dear all, I'm setting up my first home server. I've bought a Beelink SER5 Pro mini PC, and it runs an AMD Ryzen 7 5800H, 32GB RAM, a 512GB NVMe SSD and a 2TB SATA SSD. Additionally, I've picked up two external 5TB mechanical HDDs to use as a rotated backup repository.

My main focus is to have my personal data and media always online (about 20 years of photos and a few hundred GBs of videos). I use Plex/Jellyfin, I'm planning to give Nextcloud, Immich and Home Assistant a try, and I would like to virtualize the preinstalled Windows 11 with GPU passthrough for some light gaming (I've read it works quite well with Proxmox). My actual situation is the one below, and everything is running pretty well.

I'm struggling to choose the best way to share files on my network (mainly Windows notebooks) and between VMs/LXCs, and how to back it all up. The two SSDs are formatted this way: once I've virtualized the current Win11 installation, I'm planning to make the whole NVMe drive ext4, while the SATA SSD is LVM and holds the 1TB virtual drive I'm sharing.

Can you help me figure out the best configuration for this? My main focus is making sure my personal files are always correctly backed up to the two external drives. Maybe I should use Syncthing instead of PBS to achieve this more easily? Any help is appreciated.
Hi, how do you guys keep up with the latest tech news and stuff?
I'm generally interested in news surrounding everything in IT.
I've heard good things about the TL;DR newsletter, but after subscribing for a month plus, it feels like it's just featuring OpenAI EVERY day, with some Apple sprinkled here and there and Elon/SpaceX; it feels like only those topics get written about. I'd love a newsletter with more varied content, occasionally covering things like cybersecurity...
Thanks for any suggestions and your opinions on the TL;DR newsletter.
This is probably a really frequently asked question, so PLEASE link another post answering it (if there is one). I couldn't find any, although that may just be Reddit search or DuckDuckGo.
Server specs: i7-7700, 32GB DDR4 RAM, 1TB NVMe, 8TB HDD.
That's it, thanks so much in advance!
As an indiehacker, I know what an indiehacker needs. So I created Tianji.

Tianji is not built around a single feature, but around a complete user profile: indiehackers and SMBs. I know what users want, so I put everything into Tianji. Here are my ideas:

- Website Analytics lets users quickly see information about their website's status
- Monitor checks service availability and sends alerts
- Server, compared to Monitor, monitors from the server level
- Page shows users a page displaying service availability
- Telemetry collects visits from third-party sites in a more anonymous way, such as emails, GitHub profiles and readme pages
- Survey is an open channel for users to submit feedback. Compared to the conventional low-code approach of providing ready-made pages, I know that indiehackers want a pure API, because people will implement their own forms in their own way
- Finally, Feeds connects everything. Do you know Logsnag? This is Logsnag in Tianji.

In addition, everything is API-first, and all functions have corresponding OpenAPIs available. I also know that open source code gives all users peace of mind, and I hope everyone can work together to improve it, so Tianji is also an open source project.
I'm not sure what part I did wrong, but I can't access my Wekan (snap) or Memos (Docker).
I'm a complete newbie; I studied biotech and have no coding background.
The illustrated site names for memos and wekan are memos.XXXXX.com and wekan.XXXXX.com.
What I Did
:80 --> return 301 https --> :443 --> got all SSL certs/keys from Let's Encrypt, well configured, with each site's certs/keys in their own folders --> proxy_pass memos to http://localhost:5230 and wekan to http://localhost:3001
* I used literally "localhost" in the proxy_pass, since I'm not sure if it should be my external IP/domain or my internal IP.
Note that I tried different setups, like combining domains together in one block, but now I'm using a separate server {} block for each http-to-https redirect and each proxy_pass+SSL config. That means in my case I have four server {} blocks: (wekan 80->443), (wekan 443->3001), (memos 80->443), (memos 443->5230).
Below is the Nginx config example. It is in my /etc/nginx/sites-available/
server {
    listen 80;
    server_name wekan.XXXXXX.com;
    return 301 https://$host$request_uri;
}

#memos site port 5230
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    ssl_certificate /etc/letsencrypt/live/memos.XXXXX.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/memos.XXXXX.com/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    server_name memos.XXXXX.com;

    location / {
        proxy_pass http://localhost:5230;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $http_x_forwarded_proto;
    }
}
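For comparison, a matching wekan block following the same pattern would look something like the following. This is a sketch using the post's placeholder domain, not the poster's actual config:

```nginx
#wekan site port 3001
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    ssl_certificate /etc/letsencrypt/live/wekan.XXXXXX.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/wekan.XXXXXX.com/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    server_name wekan.XXXXXX.com;

    location / {
        proxy_pass http://localhost:3001;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $http_x_forwarded_proto;
    }
}
```

proxy_pass to localhost is correct here, since nginx and the service run on the same host.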
Memos
I did sudo snap set wekan root-url="wekan.XXXXX.com"
and sudo snap set wekan port='3001'
I successfully accessed my memos when I just use "http" + modem port forwarding.
docker ps -a showed the Memos status as "Exited (0) 26 hours ago". I started Memos again, but it said it was already started.
Wekan
At first I followed someone's guide and put its dedicated config in /etc/nginx/conf.d/wekan.conf.
It worked when using only http + modem port forwarding.
I then set up the whole thing, but it doesn't work. I rm'd wekan.conf and put the settings in /etc/nginx/sites-available/default.
A few weeks ago I presented my self-hosted recipe management app "FlavorMate". Since then I've implemented a few new features, and the next one I want to address is deep linking. I know how to set up "universal links" or "app links" for a static URL, but with self-hosting everyone has a different URL.
I could host a little page that does nothing but open the app, or redirect to the self-hosted web app if the app isn't installed on the device. The downsides are that I could potentially see your requests (although turning off logging would fix this) and it would be dependent on my service. If I shut this page down, deep linking would be broken.
Maybe someone here had the same problem and can explain how they did it.
I'm looking for recommendations for a self-hosted app similar to archive.md that lets you bypass article paywalls (e.g. Medium) on the fly. Bonus points if it can also store articles for future reference.
From an initial search, Linkwarden looks like it'll do what I'm after, but I'm curious what others recommend.
Is there a cheap way to do virtual GPUs? I hear Intel GPUs have virtualization features on the enterprise versions, and that you might be able to flash a consumer model to the enterprise/server BIOS. I also hear there's something you can do with some NVIDIA cards.