You should try to keep your diary on an operating system that does not have telemetry and whose encryption can't easily be broken (whether through hacks or brute force).
Here is some background on what I just said, in case you are semi-tech-illiterate:
Operating system (OS)
The software that runs a computer or phone and manages hardware and applications. Examples: Windows, macOS, Android, Linux.
Telemetry
Automatic collection and transmission of device data to company servers, often presented as "help improve your experience" to get users to accept it.
How telemetry can leak passwords — examples
Crash dumps, diagnostic logs, autofill sync, or misconfigured uploads can capture and send credentials. Windows often nudges users to sign in with a Microsoft account, which can cause recovery secrets or synced credentials to be stored in Microsoft's cloud.
BitLocker and Microsoft storage of recovery keys
If you enable BitLocker and sign in with a Microsoft account or accept default prompts, Windows will back up the BitLocker recovery key to your Microsoft account. Microsoft then holds that key; anyone who can compel Microsoft (law enforcement with a warrant or similar authority) or who gains access to the account can decrypt the drive without local brute force.
How Windows tricks users into giving up passwords or keys
Windows presents cloud sign-in, backup, and recovery as "recommended" with default accept buttons during setup and app prompts, making the cloud-backed option the easiest choice.
Brute force attacks — how they work and who can do them, with costs
Brute force tries many passwords until one works. Attackers use tools like Hashcat on GPUs for offline hashes (very fast) or scripts against online logins (slower because of rate limits). A lone criminal can crack weak passwords with a single consumer GPU (hundreds of dollars). Criminal gangs rent or run GPU farms (tens to hundreds of dollars per day). Governments use large clusters or specialized hardware at much higher cost. Renting a few cloud GPUs costs roughly $50–$500 per day; large targeted operations can cost thousands to millions.
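To put rough numbers on this, here is a minimal Python sketch that estimates how long an exhaustive search would take for a given password length and character set. The guesses-per-second figures are illustrative assumptions (real speeds depend heavily on the hash algorithm and hardware), not benchmarks:

```python
# Rough brute-force time estimator.
# The guesses-per-second figures below are illustrative assumptions,
# not measurements; real speeds vary enormously by hash algorithm.

ATTACKER_SPEEDS = {
    "single GPU (lone criminal)": 1e10,   # ~10 billion guesses/sec on a fast hash
    "GPU farm (criminal gang)":   1e12,   # ~1 trillion guesses/sec
    "state-scale cluster":        1e15,   # ~1 quadrillion guesses/sec
}

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def seconds_to_exhaust(charset_size: int, length: int, guesses_per_sec: float) -> float:
    """Time to try every password of `length` symbols drawn from a `charset_size` alphabet."""
    keyspace = charset_size ** length
    return keyspace / guesses_per_sec

for name, speed in ATTACKER_SPEEDS.items():
    # 94 printable ASCII characters, 12-character random password
    years = seconds_to_exhaust(94, 12, speed) / SECONDS_PER_YEAR
    print(f"{name}: ~{years:,.0f} years to exhaust 94^12")
```

Each extra character multiplies the keyspace by the alphabet size, which is why the length recommendations below scale up with attacker capability.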
Hacks (other ways passwords are obtained)
Phishing, malware/keyloggers, exploiting software bugs, or stealing backups capture plaintext without breaking encryption.
Diary (digital journal)
Personal entries saved as files or in an app. Keep diaries encrypted, avoid untrusted cloud sync, and avoid apps that send telemetry by default.
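If you keep your diary as plain files, here is a minimal sketch of passphrase-based encryption using the third-party Python `cryptography` package (install with pip). The passphrase, entry text, and iteration count are illustrative assumptions, not recommendations:

```python
# Minimal passphrase-based encryption for a diary entry, using the
# third-party `cryptography` package (pip install cryptography).
# The passphrase, entry text, and KDF parameters are illustrative.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    # Derive a 32-byte key from the passphrase; the salt is not secret,
    # but it must be stored alongside the ciphertext for decryption.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=1_200_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

salt = os.urandom(16)
key = key_from_passphrase(b"correct horse battery staple", salt)

entry = b"Dear diary: today I switched to Linux."
ciphertext = Fernet(key).encrypt(entry)

# Later, re-derive the key from the same passphrase and salt to decrypt:
same_key = key_from_passphrase(b"correct horse battery staple", salt)
assert Fernet(same_key).decrypt(ciphertext) == entry
```

A dedicated encrypted-journal app or full-disk encryption does this for you; the point of the sketch is that the passphrase, not the software, is what ultimately protects the entries.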
Open-source OSes with minimal telemetry — examples
Desktop: Debian, Fedora Silverblue, PureOS, Parabola.
Mobile: LineageOS, /e/ OS, GrapheneOS.
Virtual machine (VM) option and privacy limitations
Run a privacy-focused OS in VirtualBox, QEMU, or VMware and encrypt the guest disk. The host OS can still access the VM's disk images, memory, and network traffic, so avoid shared folders and a shared clipboard, and minimize host telemetry.
Passphrase examples by attacker capability
Single criminal with one GPU — explanation and example
Use ~16–20 characters: 3–4 uncommon words, mixed case, two digits, two symbols. Example: BlueFence7!paper%9
Criminal gang with many GPUs — explanation and example
Use ~24–32 characters: 4–6 unrelated words, mixed case, several digits, multiple symbols. Example: River-apple7-Silver!horse%21
Government-scale cracking — explanation and example
Use 35+ characters: a long sentence-style passphrase or 6–8 unrelated words, mixed case, several digits, multiple symbols; combine it with MFA for critical accounts. Example: MyGrandpaLoved2Paint-BlueWalls!1984@
(These are just examples; if you are serious, there is software that estimates how long cracking a given password would take. A sketch of a word-based passphrase generator follows below.)
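Here is a minimal Python sketch of such a word-based generator, using the standard-library `secrets` module. The tiny wordlist is only a placeholder; a real generator should draw from a large published list such as the EFF diceware list (7,776 words):

```python
# Sketch of a random word-based passphrase generator.
# The wordlist below is a tiny placeholder; for real security, use a
# large published wordlist (e.g. the EFF diceware list, 7776 words).
import secrets

WORDS = ["river", "apple", "silver", "horse", "fence", "paint",
         "cloud", "marble", "tunnel", "garlic", "velvet", "anchor"]
SYMBOLS = "!@#$%^&*-"

def make_passphrase(num_words: int) -> str:
    parts = []
    for _ in range(num_words):
        word = secrets.choice(WORDS)
        # Randomly capitalize some words for mixed case.
        if secrets.randbelow(2):
            word = word.capitalize()
        parts.append(word)
    # Join with a random symbol separator and append up to two digits.
    sep = secrets.choice(SYMBOLS)
    return sep.join(parts) + str(secrets.randbelow(100))

print(make_passphrase(4))   # gang-resistant tier; use 6+ words against governments
```

Using `secrets` rather than `random` matters: `random` is predictable and unsuitable for anything security-related.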
Open-source vs closed-source — simple explanation
What is code / source code?
Code is the instructions people write to make software do things. Source code is the human-readable version of those instructions (what programmers write).
Source code is written in programming languages designed to be human-readable, which makes it easy for developers to understand, modify, and collaborate on software. Before a computer can run it, source code must be compiled into a binary, which is not human-readable. With closed-source software, users receive only the binaries and have no visibility into the underlying code, so they cannot verify how the software actually operates. Open-source projects publish both the source code and cryptographic hashes (unique fingerprints) of their official builds, so users can compare the hash of a downloaded binary against the published one to confirm it has not been tampered with. (Strictly proving that a binary really corresponds to the published source requires a reproducible build, where independent people compile the same source and get a byte-identical binary.)
What is open source?
Open source means the source code is published so anyone can read it, check it, and build the software themselves. Because the code is visible, independent reviewers and security researchers can look for bugs or hidden backdoors. You can also verify a downloaded build matches the published one by comparing a cryptographic hash or signature — a short fixed string that changes if the program is altered.
What is a hash?
A hash is like a fingerprint for a file: a short string of letters and numbers produced from the file's exact contents. If the file changes at all, the hash changes. Projects publish hashes so users can confirm they received the exact file the project released.
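As a concrete illustration, here is a minimal Python sketch that computes a file's SHA-256 fingerprint and compares it to a published value. The filename and the expected-hash string are placeholders:

```python
# Compute a file's SHA-256 hash and compare it to the hash the
# project published. The filename and expected hash are placeholders.
import hashlib

def sha256_of_file(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large installers don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

published = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"  # placeholder
actual = sha256_of_file("downloaded-installer.iso")
print("MATCH" if actual == published else "MISMATCH - do not install")
```

Even a one-byte change to the file produces a completely different hash, which is what makes this a useful tamper check.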
What is closed source?
Closed source means the company does not publish the source code. You get only the compiled program (the version the computer runs), which isn't human-readable. Examples include Windows and macOS. Because the source isn't public, independent parties can't easily inspect it for hidden code.
Why backdoors are almost certain in closed-source corporate software
Closed-source software means the company does not publish the human-readable instructions (source code) that make the program run. Because the public can't read that code, independent people cannot reliably confirm the software has no hidden access mechanisms. Governments routinely seek access for investigations and national security; courts and laws (including secret orders from national security courts) can compel companies to cooperate. Companies have commercial incentives to comply rather than risk penalties, and technical access features (recovery keys, remote management, debugging hooks, telemetry channels) are often built in for support and convenience.
The combination of legal compulsion, business pressure, and built-in remote-access features makes it extremely unlikely that no closed-source corporate products contain backdoors or undisclosed ways for authorities (or attackers who gain access to those channels) to get in. Open-source projects, by contrast, publish their source code so anyone can inspect it. That transparency makes undisclosed backdoors far harder to hide: independent reviewers can examine the code and the builds, technically skilled users can compile the code themselves and hash the resulting binaries, and average users can compare the hash of a downloaded binary against the hashes the developers publish. When the hashes match (and the build is reproducible, as noted above), users know the binary they downloaded was built from the published source code, so a government demanding a hidden backdoor would risk it being spotted by anyone who reads that code.