I don't actively push QetriX, because it's still not as ready as I'd want it to be, but despite that I'm happy to announce it's currently used by 8 customers from both the private and public sectors, in applications and websites created by 4 different companies.
It's mostly QedyJS, as I want to prove and evolve the framework on the frontend. The server-side QetriX Framework has already been proven; the C# and PHP implementations are currently in active use. In some cases there's a custom backend for QedyJS created by another party, or even open-source software accessed by QedyJS via its API.
I really like that I don't need a custom backend to access a server-side API. It sometimes requires a CORS proxy, but QedyJS remains a universal piece of software and all the business logic stays in JavaScript.
For rapid development purposes I created a simple API backend, acting basically as a REST-to-SQL converter. There's no model in the API; everything is translated directly from/to tables and columns in the database.
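To illustrate the idea (the endpoint, table and column below are made up for the example, not the actual API), a request maps straight to a query:

    rem Hypothetical mapping in such a REST-to-SQL backend:
    rem GET /api/person?age=30  ->  SELECT * FROM person WHERE age = 30
    curl "http://localhost/api/person?age=30"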
To modify the model I just use HeidiSQL, as I'm used to it. I know many devs prefer an ORM or some other abstraction over SQL, but that's not my case; I'm more old-school and I learned to think that way.
As for delivering updates, some of them use Deployer in FTP mode, some use GitLab and its CI, and in one case there's a batch file that creates a directory with the changed files, which are manually copy-pasted via Remote Desktop to the Windows Server, where another batch file stops IIS, copies the pasted files and starts IIS again (see the sketch below). I really like that one; it works surprisingly well, reliably and with only negligible downtime.
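A minimal sketch of what such a server-side script can look like (the paths here are placeholders, not the actual deployment):

    rem Stop IIS, copy the pasted files over the app, start IIS again
    iisreset /stop
    xcopy "C:\Deploy\Incoming" "C:\inetpub\wwwroot\MyApp" /e /y
    iisreset /start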
I still use the BAT file to copy the most important files to my other drive and to an on-line service, so I want to share a few useful commands:
Writes a timestamped line to _backup.log, e.g. “Backup started at 01.01.2020 10:00:00,00”.
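A likely reconstruction (the exact date/time format depends on the system locale):

    echo Backup started at %date% %time% >>_backup.log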
The /cdirsy parameters are for silent copy of newer files and all non-empty subdirectories. Basically “auto-copy everything updated, no questions asked”.
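In context, the copy command looks something like this (the source and destination paths are placeholders):

    rem /c continue on errors, /d only newer files, /i treat target as a directory,
    rem /r overwrite read-only files, /s non-empty subdirectories, /y no prompts
    xcopy "C:\Projects" "D:\Backup\Projects" /cdirsy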
Dumps database “db” to a “db.sql” file using the “mysqldump” username and password, then compresses the .sql file to a ZIP archive using 7-Zip, and finally deletes the .sql file.
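A reconstruction of that sequence (assuming 7z.exe is on the PATH; the password is a placeholder):

    mysqldump -u mysqldump -pPASSWORD db > db.sql
    7z a db.zip db.sql
    del db.sql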
Prompts you to press “Y” (+Enter) to do something, or enter anything else or nothing (+Enter) to skip it.
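A minimal pattern for such a prompt (the variable name and messages are illustrative):

    set "ANSWER="
    set /p ANSWER=Dump the database? [Y = yes, anything else skips]
    if /i "%ANSWER%"=="Y" (
        echo Running the dump...
    ) else (
        echo Skipped.
    )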
Like many others, I underrated good backups for a long time, despite having lost several precious things, like my early source code in Basic and Pascal, or my early music attempts in FL Studio.
I used to create backup floppy disks, later CDs and DVDs, and finally external hard drives, but it was too much hassle, so I did it only once in a while. I copied everything manually, and the backup therefore quickly became obsolete.
I was aware of that and tried to find a comfortable solution that would make me back up more often. My recent ThinkPads had/have a secondary HDD in the UltraBay, currently a 1 TB HDD alongside the internal 0.5 TB SSD, so a simple copy between those drives could be considered a backup, but it won't work against my current biggest dread: ransomware.
I tried to mitigate this issue by using Dropbox with a BAT file for copying changed files, but the increasing backup size and my decreasing satisfaction with the on-line service made this solution suboptimal once again.
One day I got an idea for an ultimate solution: purchase an external drive and just do a periodic backup of my entire storage. This way I wouldn't have to care which files go into the backup, so the entire backup process could be automated/unattended. Well, except for plugging the drive in and out, of course, because I don't want it to stay connected (ransomware often encrypts connected external drives as well).
Because most of the data I need to back up is work related, I asked my boss for the drive and he agreed. He probably also didn't want me to bother him with upgrades later, because instead of the requested 2 TB I got a 4 TB WD My Passport :-)
It's brilliant that I can keep the high-speed micro USB 3.0 cable permanently in my ThinkPad dock, so I just need to plug the drive in and initiate a backup.
The next step was to find proper software. I wanted open source, or at least freeware, and the most frequently cited option was Cobian Backup, so I gave it a try. It's exactly what I wanted! I created a task in Task Scheduler to launch it once a week; the rest I process manually when I find it sitting in the SysTray.
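Such a task can also be created from the command line; a sketch with an example schedule and a placeholder path:

    schtasks /create /tn "Weekly backup" /tr "C:\Backup\backup.bat" /sc weekly /d SUN /st 09:00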
This is the amount of effort I'm willing to put in, so this solution works nicely.
However, after a few months I noticed some directories were missing from the backup. I found a post describing the same issue, but with no remedy, and with the source code of Cobian Backup sold to another entity I don't see it getting fixed. So I decided to try to find another open-source backup tool.
This time I chose FreeFileSync. It claimed to be the fastest of the lot. While Cobian Backup needed around 7 hours to back up both drives, FreeFileSync needs just around 1h 15m for a regular sync (weekly it's usually around 25 GB out of 1500 GB in total). It takes ~35 minutes to build the list of files to be synced and another ~40 minutes to do the actual sync. Impressive!
I'd love to know where the difference comes from, because I assume both programs do pretty much the same thing; Cobian just backs up while traversing the directory tree, while FreeFileSync traverses first and backs up later (and is therefore quite memory hungry, peaking slightly over 2 GB of RAM).