SayCyberOnceMore

  • 3 Posts
  • 194 Comments
Joined 2 years ago
Cake day: June 17th, 2023

  • It seems you need more USB ports than the Green / Yellow support… and USB hubs aren’t great… so, unless you’re purchasing the hardware to support the foundation, other options are better.

    As many others have said, getting an SSD for the Pi is probably best - if your hardware has enough RAM. No hardware is wasted, small cost, etc. IIRC, you’ll need to take a full backup (all options enabled), do a full clean re-install and then restore your backup. Then select the option to move your data (rough sketch of the CLI side after this comment).

    The benefit of going to another device (ie a passively cooled N1xx) is that the storage and memory allow you to expand what you’re doing with HA - more add-ons, better graphs (ie Grafana), longer-term history, etc. If you’re not interested in all that, stick with the Pi.
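
    If you do go the SSD / rebuild route, here’s a rough, hedged sketch of what it looks like with the Home Assistant OS ha CLI (the same steps exist in the UI; flags may differ between versions, so check ha backups --help and ha os datadisk --help first):

        # Hedged sketch - assumes Home Assistant OS with the `ha` CLI (e.g. via the SSH add-on)
        # 1. Take a full backup with everything enabled
        ha backups new --name "pre-rebuild"

        # 2. Copy the backup file somewhere off the Pi before the clean re-install
        scp /backup/*.tar user@laptop:/somewhere/safe/

        # 3. After re-installing, copy it back, reload the backup list and restore
        ha backups reload
        ha backups restore <backup-slug>

        # 4. Move the data partition onto the attached SSD (double-check the device name!)
        ha os datadisk list
        ha os datadisk move /dev/sda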



  • Not really.

    I keep my data backups (docs, photos, etc) separate from the OS backups.

    So, depending on what you’re using to do the backup, the tool can often simulate a restore and check that the backup isn’t corrupted. Not really a restore, but at least you know it’s not trash.

    If you’ve backed up your data with a simple copy / sync (ie not a “backup” program), then you can restore your data somewhere else (maybe even just a part of it) and do a compare (see the sketch at the end of this comment).

    But, yeah, if you’re restoring the OS, then it might be ok restoring it in a VM to check it…

    I’m slowly moving towards no OS backups and using Ansible to recreate the system(s) from scratch… of course I need to back up the Ansible files too 😉
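
    For the “restore a part of it and compare” idea, a minimal sketch - the paths are placeholders, and it assumes the backup copy is mounted or reachable locally:

        # Restore a slice of the backup to a scratch location
        mkdir -p /tmp/restore-test
        rsync -a /mnt/backup/photos/2023/ /tmp/restore-test/

        # Byte-for-byte comparison against the live data
        diff -r /srv/data/photos/2023 /tmp/restore-test && echo "backup looks good"

        # Or checksum both sides (useful if they sit on different machines)
        (cd /srv/data/photos/2023 && find . -type f -exec sha256sum {} + | sort -k2) > /tmp/live.sum
        (cd /tmp/restore-test && find . -type f -exec sha256sum {} + | sort -k2) > /tmp/test.sum
        diff /tmp/live.sum /tmp/test.sum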


  • Write things down

    You will break something - and that’s good, it’s the best way to learn - but you’ll want to make a note of what you did / what went wrong / how you fixed it.

    Future you will still break things and be grateful that you wrote that thing down.

    You’ll buy something and find next year it was the wrong thing (too small, too large, too old, too new), so just get second hand stuff until you know what you need.

    Cabled networks are so much better than wireless, but then you’ll need switches and cables and shelves and stuff… so using today’s wifi is fine, but know where you’re heading.

    You need to store your stuff - that’ll be in a NAS

    You need something to run services on - that’ll be your server

    These might be the same physical metal lump (your 2nd laptop?), or they might be separate… play around, break something and work out what feels right for you… and then put your data on there

    … and that’ll break too.

    Just be aware… if you sync files between devices, that’s not a backup. (Consider: you’ve deleted / corrupted something - it’s now replicated everywhere.)

    Having a NAS with 10 drives in a RAID6 array is not a backup. It’s just really robust against a drive failure, but a deleted file is still a deleted file.

    Take a full copy of your data off your system - then restore it somewhere else.

    Did it work? If so, that’s a backup.
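
    If you want a concrete version of that test, a hedged example with nothing but rsync (hostnames and paths are made up):

        # 1. Copy the data off the system onto something completely separate
        rsync -a --info=progress2 /srv/data/ backupbox:/backups/data/

        # 2. Later, "restore" it to a fresh location
        rsync -a backupbox:/backups/data/ /tmp/restore-check/

        # 3. Sanity-check it: open a few files, compare sizes and file counts
        du -sh /srv/data /tmp/restore-check
        find /srv/data -type f | wc -l
        find /tmp/restore-check -type f | wc -l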


  • Are you trying to solve one problem, only to find the next?

    If you solve the problem of the media selection (which looks like some form of database replication), then what happens if both parents select media they don’t have at the same / similar time? You’ve stated elsewhere that this is the bottleneck.

    I think it’s going to be much simpler to just replicate the media and let them work locally.

    I don’t know what your script with rsync is doing, but syncthing can limit bandwidth and use exclusion patterns, so perhaps one set of parents doesn’t want anything from GenreA and you could exclude that (see the sketch at the end of this comment).

    Similarly, if both parents like GenreC and you don’t, then just sync that between their systems and save your storage.
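
    For the exclusion part, syncthing reads ignore patterns from a .stignore file in the root of the shared folder on the device that shouldn’t receive those files (the bandwidth limits live in the GUI settings). A hedged sketch - the genre folder names and media path are obviously made up:

        # On the device that should NOT receive GenreA, drop ignore patterns
        # into the root of the synced media folder
        printf '%s\n' '/GenreA' '(?i)/genrea' > /srv/media/.stignore
        #   '/GenreA'    -> ignore the GenreA directory at the folder root
        #   '(?i)/genrea' -> case-insensitive variant, in case naming is inconsistent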


  • Depends on which functions of NC you’re using.

    Personally, I found that no-one used the gallery, calendar or contacts apps in NC, so I replaced it all with radicale and syncthing (minimal radicale sketch at the end of this comment).

    But if you’re using all the collaboration stuff, then you’ll need to look into it a bit more.

    For me, NC was way overkill, a nightmare to maintain, and an extra layer of software (ie vulnerabilities) exposed to the interwebs that I didn’t need.
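
    If anyone fancies the same swap: radicale is a small CalDAV/CardDAV server and the quick-start really is just a couple of commands. A hedged sketch from memory of its docs - sort out authentication and TLS before exposing it to anything:

        # Install and run radicale; calendars/contacts are stored as plain files
        python3 -m pip install --upgrade radicale
        python3 -m radicale --storage-filesystem-folder=~/.var/lib/radicale/collections
        # then point clients (e.g. DAVx5) at http://localhost:5232 - and read the
        # radicale docs for auth/TLS before letting it anywhere near the interwebs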




    One thing I forgot to mention: rsync has an option to preserve file timestamps (-t, which is also included in -a), so if that’s important for your files, then that might also be useful… without checking, the other commands probably have that feature, but I don’t recall at the moment.

    rsync -Prvt <source> <destination> might be something to try - let it run for a minute, stop it and run it again… that’ll prove it’s all working (the -P keeps partial files and shows progress).

    Oh… and make sure you get the source and destination paths right with the trailing / (or not): a trailing / on the source means “copy the contents of this directory”, while no trailing / copies the directory itself, so you can end up with all your files in an extra subfolder.
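
    In concrete terms (paths made up):

        # Trailing slash on the source = "the contents of photos"
        rsync -Prvt /data/photos/ nas:/backup/photos/
        #   -> ends up as /backup/photos/IMG_0001.jpg

        # No trailing slash = "the photos directory itself"
        rsync -Prvt /data/photos nas:/backup/photos/
        #   -> ends up as /backup/photos/photos/IMG_0001.jpg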




  • It depends

    rsync is fine, but to clarify a little further…

    If you think you’ll stop the transfer and want it to resume (and some data might have changed), then yep, rsync is best.

    But, if you’re just doing a 1-off bulk transfer in a single run, then you could use other tools like scp (or xcopy on Windows) or - if you’ve mounted the remote NAS at a local mount point - just plain old cp

    The reason for that is that rsync has to work out what’s at the other end for each file, so it’s doing some back-and-forth communication each time which, as someone else pointed out, can load the CPU and reduce throughput.

    (From memory, I think Raspberry Pis don’t handle large transfers over scp well… I seem to recall a buffer gets saturated and the throughput drops off after a minute or so)

    Also, on a local network, there’s probably no point in using encryption or compression options - especially for photos / videos / music… you’re just loading the CPU again for it to work out that the data can’t be compressed any further.
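
    Putting those options side by side (NAS paths and mount points are placeholders):

        # One-off bulk copy from a NAS that's mounted locally - no per-file negotiation
        cp -a /mnt/nas/media /srv/

        # Or over SSH in a single run (skip -C compression on a LAN - photos/videos
        # are already compressed, so it just burns CPU)
        scp -r user@nas:/volume1/media /srv/

        # Resumable / incremental transfer - where rsync earns its keep.
        # Leave -z off on a local network for the same reason.
        rsync -av --partial --info=progress2 user@nas:/volume1/media/ /srv/media/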