Spent my spare time over the past 2 days doing some cleanup. Specifically, trying to fix translations:
- Look for orphaned translations texts (defined in JSON file but no longer used in code)
- Look for translations that exist in one language, but not another
- Fix translation typos in code
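The first two checks boil down to set arithmetic on key names. Here is a minimal sketch of that idea, assuming (hypothetically) that translations live in per-language JSON files and that code references keys through a call like `t("some.key")` — the file names, the `t(...)` pattern, and the function names are all illustrative assumptions, not the author's actual script.

```python
import json
import re
from pathlib import Path

def load_keys(path):
    """Return the set of translation keys defined in a JSON file."""
    with open(path, encoding="utf-8") as f:
        return set(json.load(f).keys())

def keys_used_in_code(root, pattern=r't\("([^"]+)"\)'):
    """Scan source files under root and collect every referenced key.

    The t("...") call pattern is an assumption for this sketch; a real
    project would adapt the regex to its own lookup convention.
    """
    regex = re.compile(pattern)
    used = set()
    for src in Path(root).rglob("*.py"):
        used.update(regex.findall(src.read_text(encoding="utf-8")))
    return used

def check(en_path, de_path, code_root):
    """Report orphaned, missing, and undefined translation keys."""
    en, de = load_keys(en_path), load_keys(de_path)
    used = keys_used_in_code(code_root)
    defined = en | de
    return {
        "orphaned": sorted(defined - used),   # defined but never referenced
        "missing_in_de": sorted(en - de),     # present in en, absent in de
        "missing_in_en": sorted(de - en),
        "undefined": sorted(used - defined),  # referenced but never defined
    }
```

Because everything is reduced to hash-set operations plus one regex pass over the sources, a check like this stays fast even on large trees.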
Took about 3 hours to write the script. So far so good: fixed some typos, found 131 orphaned translations, and 281 keys that exist in one language but not the other.
It will take a couple of days to clean all this up.
The good thing is that this Python script will now be incorporated into the pre-commit stage, so problems like this will be automatically detected and the developer will be forced to fix them.
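If the pre-commit framework (pre-commit.com) is what is being used here, wiring such a script in could look roughly like this — the hook id and script name are assumptions for illustration:

```yaml
# Sketch of a local hook in .pre-commit-config.yaml,
# assuming the checker is called check_translations.py.
repos:
  - repo: local
    hooks:
      - id: check-translations
        name: Check translation files
        entry: python check_translations.py
        language: system
        pass_filenames: false
```

With `pass_filenames: false` the script runs once over the whole tree rather than per changed file, which suits a global consistency check like this.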
What amazes me most is the time taken to complete this check:
real 0m0.085s
user 0m0.074s
sys 0m0.004s
I did not attempt to optimise this code, but I did not expect it to run in such a short time (this is on the Ryzen; the Pi is about 2x slower).
To be honest, I'm somewhat amazed at how fast computers are these days, and how efficient regular expression searches can be. Doing this on older hardware, and without regular expressions, would have taken much longer.
Anyway, that's it for tonight. The rest of the week will probably be spent cleaning up the translations. Then I will start to clear off all the unfinished tasks in 1.2.2, fix some display issues, and release the next alpha.