• 19 Posts
  • 141 Comments
Joined 2 years ago
Cake day: December 12th, 2023












  • I use rsync for many of the reasons covered in the video. It’s widely available and has a long history. To me that feels important because it’s had time to become stable and reliable. Using Linux is a hobby for me so my needs are quite low. It’s nice to have a tool that just works.

    I use it for all my backups and for moving my backups to off-network locations, as well as for file/folder transfers on my own network.

    I even made my own tool (https://codeberg.org/taters/rTransfer) to simplify all my rsync commands into readable files, because rsync commands can get quite long and overwhelming. It’s especially useful for chaining multiple rsync commands together to run under a single command.

    I’ve tried other backup and syncing programs and I’ve had bad experiences with all of them. Other backup programs have failed to restore my system. Syncing programs constantly stop working and I got tired of always troubleshooting. Rsync, when set up properly, has given me far fewer headaches.


  • I don’t have root access on my phone but I still copy backups of my media and apps that export data to accessible files.

    I keep my process very simple, using Termux with the rsync, openssh and termux-services packages.

    I created a dedicated folder on my phone called sync for syncing between my phone and computer, but you can change this for your needs.

    From a fresh Termux install, the setup should look something like the following:

    # Update package list and packages
    pkg update && pkg upgrade
    # Install required packages
    pkg install rsync openssh termux-services
    # Setup Termux's access to your phone's files
    termux-setup-storage
    # Make the required folder
    mkdir ~/storage/shared/sync/
    cd ~/storage/shared/sync/
    # Automatically start your SSH server when you open Termux
    sv-enable sshd
    
    • Get your phone’s username:
    ~ $ whoami
    u0_a205
    
    • Optional: Setup a password with the command passwd (I can’t remember if this step is important)

    A quick note: Termux on Android has a file system quite different from a computer’s, so file and directory paths can get quite long. The pwd command shows /data/data/com.termux/files/home/storage/shared/sync for my sync folder.

    This can be made simpler by using the realpath command. realpath /data/data/com.termux/files/home/storage/shared/sync then shows /storage/emulated/0/sync as a result. If you’re using CLI, this may make your commands easier to read.

    Now you can start to build your rsync command to transfer your files. When setting up an rsync command, ALWAYS use the --dry-run option first. This performs a “transfer” without any files actually being moved.

    • From my computer (data transfer direction: Phone -> Computer):
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' u0_a205@192.168.40.210:/storage/emulated/0/sync/ /home/computer_username/backup/
    
    • From my phone (data transfer direction: Phone -> Computer):
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress /storage/emulated/0/sync/ computer_username@192.168.40.205:/home/computer_username/backup/
    

    Explanation:

    • --archive preserves several file attributes
    • --verbose --human-readable --partial --progress creates a readable output to see what is happening
    • --compress compresses the data during the actual transfer (good for over a network)
    • -e 'ssh -p 8022' tells rsync to connect over SSH; sshd on Termux listens on port 8022 instead of the usual 22
    • u0_a205@192.168.40.210:/storage/emulated/0/sync/ and computer_username@192.168.40.205:/home/computer_username/backup/ are how rsync identifies remote folders. Basic format is <username>@<remote IP address>:/path/to/folder/
    • /home/computer_username/backup/ and /storage/emulated/0/sync/ are the local folders, relative to what machine the rsync command is being run from.

    To reverse the direction of a command relative to the machine you are running it on, simply swap the remote folder and local folder in the command. Example, running both from my computer:

    # Direction: Phone -> Computer
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' u0_a205@192.168.40.210:/storage/emulated/0/sync/ /home/computer_username/backup/
    
    # Direction: Computer -> Phone
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' /home/computer_username/backup/ u0_a205@192.168.40.210:/storage/emulated/0/sync/
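
    Since the two directions only differ in argument order, the whole thing can be wrapped in a small script, roughly the kind of thing rTransfer automates. This is just a sketch: the host, username, and paths are the example values from above, and the function only prints the command so you can review it before pasting and running it yourself.

```shell
#!/bin/sh
# Sketch: print the rsync command for a given direction so it can be
# reviewed before running. Host, user, and paths are example values.
PHONE="u0_a205@192.168.40.210"
PHONE_DIR="/storage/emulated/0/sync/"
PC_DIR="/home/computer_username/backup/"
OPTS="--dry-run --archive --verbose --human-readable --partial --progress --compress"

build_cmd() {
    case "$1" in
        # Phone -> Computer
        pull) echo "rsync $OPTS -e 'ssh -p 8022' $PHONE:$PHONE_DIR $PC_DIR" ;;
        # Computer -> Phone
        push) echo "rsync $OPTS -e 'ssh -p 8022' $PC_DIR $PHONE:$PHONE_DIR" ;;
        *)    echo "usage: build_cmd pull|push" >&2; return 1 ;;
    esac
}

build_cmd pull
build_cmd push
```

    Once a printed command looks right, you can run it directly (and eventually drop --dry-run from OPTS).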
    

    In order to actually transfer files, remove the --dry-run option from the previous rsync commands. The output in your terminal will show additional information regarding transfer status.

    Additionally, you can add the --delete option to the rsync command. This makes the destination folder mirror the source folder, file by file: any file in the destination folder that does not exist in the source folder gets deleted.

    A command WITHOUT --dry-run and WITH --delete would look like the following (CAUTION: THIS CAN DELETE FILES IF UNTESTED):

    rsync --delete --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' u0_a205@192.168.40.210:/storage/emulated/0/sync/ /home/computer_username/backup/
    
    

    I personally manually transfer my backups into an encrypted external drive which I manually decrypt. /u/[email protected] has a suggestion for automated encrypted backups if that’s more to your needs.


  • Yeah, a few weeks ago I achieved my state of “secure” for my server. I just happened to notice a dramatic decrease in activity, and that’s what prompted this question that’s been sitting in the back of my mind for weeks now.

    I do think it’s important to talk about it though because there seems to be a lack of talk about security in general for self hosting. So many guides focus on getting services up and running as fast as possible but don’t give security much thought.

    I just so happened to have gained an interest in the security aspect of self hosting over hosting actual services. My risk from self hosting is extremely low, so I’ve reached a point of diminishing returns on security, but the mind is still curious and wants to know more.

    I might write up a guide/walkthrough of my setup in the future but that’s low priority. I have some other not self hosting related things I want to focus on first.


  • I think I am already doing that. My Kiwix docker container port is set to 127.0.0.1:8080:8080 and my reverse proxy is only open on port 12345, but it will redirect kiwi.example.com:12345 to port 8080 on the local machine.

    I’ve learned that Docker likes to manipulate iptables without any notice to other programs like UFW. I have to be specific in making sure docker containers announce themselves to the local machine only.
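
    For anyone replicating this, binding a container to loopback only is a matter of prefixing the published port with 127.0.0.1 (the image name below is a placeholder, not my actual setup):

```shell
# Publish the container's port on loopback only; Docker's iptables rules then
# won't expose it beyond the local machine. <image> is a placeholder.
docker run -d -p 127.0.0.1:8080:8080 <image>
```

    The equivalent in a compose file is the same string in the ports: list.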

    I’ve also used this guide to harden Caddy and adjusted it to my needs. I took the advice from another user and use wildcard domain certs instead of issuing certs for each subdomain, so that only the wildcard domain is visible when I look it up at https://crt.sh/ . That way I’m not advertising the subdomains I’m using.



  • My ISP blocks incoming data on common ports unless you get a business account. That’s why I used Cloudflare’s tunnel service initially. My plans changed for the domain name I currently own, and I don’t feel comfortable giving more power and data to an American tech company, so this is my alternative path.

    I use Caddy as my reverse proxy so I only have one uncommon port open. My plans changed from many people accessing my site to just me and very few select friends of mine which does not need a business account.


  • I get that.

    I was generally (in my head) speaking about all my devices. If someone stole my computer, the full disk encryption is more of a deterrence than the idea of my data being fully secured. My hope is that the third party is more likely to delete than to access. If I catch the attention of someone that actually wants my data, I have bigger issues to worry about than security of my electronic devices.


  • I agree with the last point, I only mentioned that because I don’t really know what other setting in my SSHD config is hiding my SSH port from nmap scans. That just happened to be the last change I remember doing before running an nmap scan again and finding my SSH port no longer showed up.

    Accessing SSH still works as expected with my keys and for my use case, I don’t believe I need an additional passphrase. Self hosting is just a hobby for me and I am very intentional with what I place on my web facing server.

    I want to be secure enough but I’m also very willing to unplug and walk away if I happen to catch unwanted attention.


  • Thanks for the insight. It’s useful to know what tools are out there and what they can do. I was only aware of nmap before which I use to make sure the only ports open are the ports I want open.

    My web-facing device only serves static sites and a file server with non-identifiable data I feel indifferent about being on the internet. No databases, and no stress if it gets targeted or goes down.

    Even then, I still like to know how things work. Technology today is built on so many layers of abstraction, it all feels like an infinite rabbit hole now. It’s hard to look at any piece of technology as secure these days.


  • I use a different port for SSH, and I also use authorized keys. My SSHD is set up to only accept keys, with no passwords and no keyboard input. Also, when I run nmap on my server, the SSH port does not show up. I’ve never been too sure how hidden the SSH port is beyond the nmap scan, but just assumed it would be discovered somehow if someone was determined enough.

    In the past month I did rename my devices and account names to things less obvious. I also took the suggestion from someone in this community and set up my TLS to use wildcard domain certs. That way my subdomains aren’t being advertised in the public logs used by Certificate Authorities. I simply don’t use the base domain name anymore.
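
    For reference, the sshd side of that (key-only, no passwords, no keyboard-interactive prompts) comes down to a few standard OpenSSH directives in sshd_config; the port number here is just an example, not my real one:

```
# sshd_config (excerpt)
Port 2222                          # non-default SSH port (example value)
PubkeyAuthentication yes           # allow key-based logins
PasswordAuthentication no          # keys only, no passwords
KbdInteractiveAuthentication no    # no keyboard-interactive prompts
```
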


  • Early when I was learning self hosting, I lost my work and progress a lot. Through all that I learned how to make a really solid backup/restore system that works consistently.

    Each device I own has its own local backup. I copy those backups to a partition on my computer dedicated to backups, and that partition gets copied again to an external SSD which can be disconnected. Restoring from the external SSD to my computer’s backup partition to each device all works to my liking. I feel quite confident with my setup; it took a lot of failure to gain that confidence.

    I also spent time hardening my system. I went through this Linux hardening guide and applied what I thought would be appropriate for my web-facing server. Since the guide seems aimed more at a personal computer (I think), the majority of it didn’t apply to my use case. I also use Alpine Linux, so there was even less I could do for my system, but it was still helpful in understanding how much effort it takes to secure a computer.