Sunday, May 21, 2017

Security is a threat-driven exercise (ComSec in the age of Big Brother follow up) from The Saker

Dear friends,
The following is a follow-up to my recent post about communication security (ComSec).  I decided to write it after reading the comments on the original post, which made it clear to me that there is a dire need for even basic information about ComSec.  I am going to try to keep it very, very basic, so please bear with me.
First and foremost – security is a threat-driven exercise.  You cannot protect against “anything”.  You cannot protect against something diffuse like “they” or “the powers that be” or even “the US government”.  You can only protect against a specific threat.  Let’s take an example: as soon as we discuss the protection of our computers, we think of the NSA.  This is normal, since the NSA is the arch-villain of the IT world and the US government is the number one “rogue state” on the planet.
However, what is missed here is that the NSA has no interest in most of us.  But the US IRS (revenue service) might.  What you have to realize here is that the NSA has means which the IRS does not, and that the NSA has absolutely no intention of sharing any information with the IRS.  In fact, the US IRS also probably does not care about you.  The folks most likely to spy on you are your bosses, your colleagues, your family and your friends (sorry! don’t get offended; it’s more or less the same list for those most likely to murder you too).  In fact, some people close to you might even want to report you to the IRS in order to get you in trouble.  Once you understand that, you can also conclude the following:
  1. All security planning must begin with the question “what is the threat?” (a short sketch after these conclusions illustrates the idea).
  2. Giving up on ComSec because the NSA can probably beat you is plain stupid, unless you are somebody really important to the NSA.
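To make the “threat-driven” idea a little more concrete, here is a minimal, purely illustrative sketch in Python of what a first-pass threat model can look like.  The adversaries, assets and countermeasures listed are made-up examples of mine, not anything prescriptive:

```python
# A minimal, purely illustrative threat-model sketch.
# The adversaries, assets and countermeasures below are made-up examples.

THREAT_MODEL = [
    {
        "adversary": "nosy employer",
        "assets_at_risk": ["private email", "browsing habits"],
        "capabilities": ["reads traffic on the office network"],
        "countermeasures": ["never use personal accounts at work"],
    },
    {
        "adversary": "tax authority",
        "assets_at_risk": ["financial records"],
        "capabilities": ["can subpoena your email provider"],
        "countermeasures": ["end-to-end encryption", "data minimisation"],
    },
]

def plan_for(adversary_name: str) -> list[str]:
    """Return the countermeasures relevant to one specific adversary."""
    for entry in THREAT_MODEL:
        if entry["adversary"] == adversary_name:
            return entry["countermeasures"]
    return []  # nothing to plan for if the adversary is not in your model

if __name__ == "__main__":
    print(plan_for("nosy employer"))
```

The point of the exercise is not the code, of course, but the discipline: you only get a sensible list of countermeasures once you have named a specific adversary.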
Second, both spying and ComSec are cost-driven.  Yes, even the NSA has a limited (if huge) budget.  And yes, even the NSA has to prioritize its efforts: shall they use their supercomputers, translators, analysts, senior officers, etc. to spy on, say, the girlfriend of a senior Chinese diplomat, or to spy on you?  It is true that all our communications are intercepted and recorded.  This is especially true of the ‘metadata’ (who contacted whom, when, how and how often; the small sketch after the conclusions below shows how much this alone reveals), but it is also true of our more or less ‘secure’ communications, be they protected by a very weak encryption algorithm or a military-grade encryption system.  Once that data is stored, the NSA has to parse it (mostly looking at the metadata) and decide how many resources it is willing to allocate to your specific case.  No offense intended, but if you are a small pot grower with a history of political activism who emigrated to the USA from, say, Turkey 10 years ago and you are emailing your friends in Antalya, the NSA would first need to decrypt your email.  That would take them less than 1 millisecond, but somebody needs to authorize it.  Then they would have to get a machine translation from Turkish into English which will hopefully be good enough (I am quite sure that the few Turkish-language translators they have will not be allocated to you; sorry, you are just not that important).  Then some analyst must read that text and decide to pass it on to his boss for follow-up.  If the analyst finds your email boring, he will simply send it all into a virtual trash bin.  Conclusions:
  1. For the bad guys, spying on you must be worth their time as expressed in dollars and cents, including opportunity costs (time spent *not* going after somebody more important).
  2. It is exceedingly unlikely that the NSA will put their best and brightest on your case so don’t assume they will.
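To show how much the metadata alone gives away, here is a small, purely illustrative Python sketch.  The records below contain no message content whatsoever, only (sender, recipient, timestamp), and that is already enough to reconstruct who talks to whom, how often, and at what odd hours:

```python
from collections import Counter

# Purely illustrative metadata records: (sender, recipient, timestamp).
# No message content is needed for this kind of analysis.
metadata = [
    ("alice", "bob",   "2017-05-01T09:10"),
    ("alice", "bob",   "2017-05-01T23:45"),
    ("alice", "carol", "2017-05-02T10:00"),
    ("bob",   "alice", "2017-05-02T11:30"),
    ("alice", "bob",   "2017-05-03T02:15"),
]

# Who talks to whom, and how often: the social graph falls out immediately.
link_counts = Counter((s, r) for s, r, _ in metadata)
for (sender, recipient), n in link_counts.most_common():
    print(f"{sender} -> {recipient}: {n} messages")

# Even the timestamps alone leak habits (late-night contacts, time zones, ...).
late_night = [t for _, _, t in metadata if int(t[11:13]) < 5 or int(t[11:13]) >= 23]
print("messages sent late at night:", len(late_night))
```

That is why encrypting the message body, while necessary, is never the whole story.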
Third, security flaws are like bugs.  Okay, this is crucial.  Please read up on the so-called “Linus’ Law” which states: “given enough eyeballs, all bugs are shallow”.  This “law” has been paraphrased in Wikipedia as such: “Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.”  I would paraphrase it as such: the most effective manner to detect and eliminate bugs (such as security flaws) in software code or mathematical algorithms is to have them available for review by experts and to ensure that a maximal number of experts have a strong stake in very carefully reviewing them.  Now, before proceeding, I need to debunk a huge myth: that the US government has more means than anybody else on the planet.  That is absolutely false.  Think of it: to work for the NSA, you not only need to have a US passport, but a high-level security clearance.  Right there, you have rejected almost all Chinese, Indian or Russian candidates (along with millions of US Americans!).  You might reply that the NSA has more money.  Wrong again!  Take a look at this article which begins with the following absolutely true, if amazing, statement:
The total development cost represented in a typical Linux distribution was $1.2 billion. We’ve used his tools and method to update these findings. Using the same tools, we estimate that it would take approximately $10.8 billion to build the Fedora 9 distribution in today’s dollars, with today’s software development costs. Additionally, it would take $1.4 billion to develop the Linux kernel alone. This paper outlines our technique and highlights the latest costs of developing Linux.  The Linux operating system is the most popular open source operating system in computing today, representing a $25 billion ecosystem in 2008.
Let me ask you this: did you ever think that the free software community, using a de-centralized development model, would be able to produce a product which the corporate world or a government would need to spend more than TEN BILLION dollars to develop?  Let me give you another example: Debian, which is the “ultimate” GNU/Linux distribution, has over 1000 developers and package maintainers worldwide (including Chinese, Indians, Russians and Americans without a security clearance) who are selected by showing the Debian community that they are the best at what they do.  Do you really believe that the US government could hire that number of top-level coders and then manage them?  I remind you that the NSA is an “agency”, meaning that it is a bureaucracy, run by people who have risen to their level of incompetence according to the “Peter Principle”.  Such agencies are slow to adopt new technologies or methods, they are inherently corrupt (due to their secrecy), they are permeated with the “where I sit is where I stand” mindset which leads to a strong opposition to progress (since if you are used to doing X you will lose your job or have to re-train if Y is introduced), and they are hopelessly politicized.  Buck for buck, brain for brain, the free software community is vastly more effective than this gargantuan mega-agency.
And then there is academia.  There are superb technical institutes worldwide, many in China and India, by the way, which are filled with the best and brightest mathematicians and cryptologists who are not only competing against each other, but also against all their colleagues worldwide.  The “eyeballs” of these people are focused with great attention on any new encryption algorithm developed anywhere on the planet, and the first thing they look for is flaws, simply because being the guy (or group) who found a security flaw in a previously assumed flawless algorithm is a guaranteed claim to fame and professional success.  Most of these folks are far more driven than the bureaucrats at Fort Meade!  But for them to be able to do their job it is absolutely crucial that the code of the encryption application and the actual encryption algorithm be made public.  All of it.  If the source code and encryption algorithm are kept secret, then very FEW “eyeballs” can review them for flaws.  The conclusions from that are:
  1. The assumption that the NSA is miles ahead of everybody else is plain false.
  2. Placing your trust in peer-reviewed software and encryption algorithms is the safest possible option (a short sketch after this list shows what that looks like in practice).
  3. The worldwide hacker and academic communities have superior means (in money and brainpower) to any government agency.
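As one concrete way of “placing your trust in peer-reviewed software and encryption algorithms”, here is a minimal sketch of mine (an illustration, not an endorsement of any single product) using PyNaCl, the Python binding to the publicly reviewed open source libsodium/NaCl library, instead of any home-grown cipher:

```python
# Minimal sketch: symmetric encryption with PyNaCl (pip install pynacl),
# a Python binding to the publicly reviewed libsodium/NaCl library.
import nacl.secret
import nacl.utils

# Generate a random 32-byte key and build a SecretBox around it.
key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
box = nacl.secret.SecretBox(key)

# Encrypt: a random nonce is generated and prepended automatically.
ciphertext = box.encrypt(b"meet me at the usual place")

# Decrypt: anyone holding the same key (and only them) can read it.
plaintext = box.decrypt(ciphertext)
assert plaintext == b"meet me at the usual place"
print("round trip OK; ciphertext length:", len(ciphertext))
```

The whole point is that both the algorithm and the implementation are public, so the “eyeballs” described above have had years to look for flaws in them.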
Using sophisticated ComSec technologies only draws unwanted attention to you.  This one was very true and is still partially true.  But the trend is in the right direction.  What this argument says is that in a culture where most people use postcards to communicate, using a letter in a sealed envelope makes you look suspicious.  Okay, true, but only to the extent that few people are using envelopes.  What has changed in the past, say, 20-30 years is that nowadays everybody expects some degree of security and protection.  For example, many of you might remember that in the past most Internet addresses began with HTTP, whereas now they mostly begin with HTTPS: that “s” at the end stands for “secure” (a short sketch after this section’s conclusions shows what that secure layer actually negotiates).  Even very mainstream applications like Skype or WhatsApp use a technology very similar to the one justifying the “s” at the end of HTTPS.  We now live in a world where the number of users of sealed envelopes is growing while the use of postcards is in free fall.  Still, it IS true that in some instances the use of a top-of-the-line encryption scheme will draw somebody’s attention to you.
[Sidebar: I have personally experienced that.  In the late 1990s I used to use PGP encryption for email exchanges with my Godson.  Sure enough, one day my boss called me into his office, presented me with the printout of one of my encrypted emails and asked me what it was.  My reply was “an encrypted message”.  He then proceeded to ask me why I was encrypting my emails.  I replied that I did that to “make sure that only my correspondent could read the contents”.  He gave me a long hard look, then told me to leave.  This incident probably contributed greatly to my eventual termination from that job.  And this was in “democratic” Switzerland…]
My advice is simple: never use any form of encryption while at work or on the clock.  If your email address is something like $fdJ&3asd@protonmail.com, your employer won’t even know that you are the one behind that ProtonMail account.  Just keep a reasonably low profile.  For public consumption, I also recommend using Google’s Gmail.  Not only does it work very well, but using Gmail makes you look “legit” in the eyes of the idiots.  So why not use it?  Conclusions:
  1. Using advanced ComSec technologies is now safe in most countries.
  2. The more ComSec-conscious private users and the industry become (and they are becoming so), the safer it will be to use such technologies.
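As a small illustration of that “s” in HTTPS, the following sketch (standard-library Python only; www.example.com is just a placeholder host) opens a TLS connection and prints which protocol version and cipher were negotiated, i.e. the “sealed envelope” your browser puts around every HTTPS page:

```python
# Minimal sketch: peek at the TLS layer behind the "s" in HTTPS.
# Uses only the Python standard library; www.example.com is a placeholder host.
import socket
import ssl

host = "www.example.com"
context = ssl.create_default_context()  # sensible defaults, certificate checking on

with socket.create_connection((host, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print("negotiated protocol:", tls_sock.version())   # e.g. 'TLSv1.3'
        print("negotiated cipher:  ", tls_sock.cipher())     # (name, protocol, bits)
        cert = tls_sock.getpeercert()
        print("certificate subject:", cert.get("subject"))
```

Nothing exotic is happening here; this same handshake runs silently millions of times a day, which is exactly why using encryption no longer makes you stand out.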
The weakest link in a chain determines the strength of the chain.  The US government has many ways to spy on you.  You can use the most advanced encryption schemes, but if your computer is running Windows you are *begging* for a backdoor and, in fact, you probably already have many of them in your machine.  But even if your operating system is really secure like, say, OpenBSD or SEL-Debian, the NSA can spy on you through your CPU, or through the radiation of your computer screen, or even by installing a key-logger in your keyboard or a simple camera in your room.  Most so-called “hacks” (a misnomer, it should be “cracks”) are traceable to a human action, not pure technology.  So you should not just blindly trust some advanced encryption scheme, but look at the full “chain”.  However, while it costs Uncle Sam exactly *zero* dollars to use a backdoor preinstalled with Windows, it would cost him a lot more to direct a crew of humans to install a camera in your room.  So fearing that the NSA will use any and all of its tools to spy on you is also plain stupid.  Chances are, they won’t.  You are just not that important (sorry!).  The conclusions here are:
  1. Your ComSec depends on its weakest link, and you need to identify that link.
  2. To do so, acquire enough knowledge to understand how the full chain functions, and do not rely on any one, even very cool, gadget or app.
Trust is always relative but, when carefully granted, beats distrust.  I hear a lot of sweeping and nonsensical statements like “I will never trust any technology or corporation” or “I will never trust any encryption scheme”.  These sound reasonable, but they are anything but.  In reality, we don’t have the option of “not trusting” any more.  We all use cars, computers, RFID chips, smartphones, GPS devices, the Internet, credit cards, etc.  Those who say that they don’t ever trust anybody are just lying to themselves.  The real question is not “trust vs distrust” but how to best allocate our trust.  To go with open source code and public encryption algorithms is far more rational than to refuse to use any ComSec at all (we all know that the post office, and many other people, can open our mail and read it – yet we still mostly use sealed envelopes and not postcards!).  If ComSec is important to you, you really ought to ditch your Windows or Mac/Apple machines.  They, like anything Google, are basically subsidiaries of the NSA.  If you use remote servers to provide you with “software as a service”, try to use those who have a stake in being peer-reviewed and who only use open source technologies (Silent Circle’s Silent Phone is an example).  There are public interest and “watchdog” type organizations out there who will help you make the right choices, such as the Electronic Frontier Foundation.  Conclusions:
  1. We live in a complex and high-technology world.  While you can reject it all and refuse to use advanced technologies, you thereby also make yourself the ideal passive sheep which the powers that be want you to be.  What the powers that be are terrified of are the cyberpunks/cypherpunks, free software hackers, folks like Assange or Snowden and institutions like Wikileaks.  They are so terrified of them that they *reassure* themselves by claiming that these are all “Russian agents” rather than look at the terrifying reality that these are the natural and inevitable reaction to the worldwide violation of human and civil rights by the AngloZionist Empire.  It is your choice whether to educate yourself about these issues or not, but if you choose to remain ignorant while paranoid, the powers that be will give you a standing ovation.
  2. Placing your trust in X, Y or Z does not have to be a ‘yes or no’ thing.  Place as much trust in, say, open source software as you deem it to deserve, but remain prudent and cautious.  Always think of the consequences of having your ComSec compromised: what would that really do to you, your family, your friends or your business?  It is a dynamic and fast-moving game out there, so keep yourself well informed, and if you do not understand an issue, decide whom, amongst those who do understand it, you would trust.  Delegating trust to trustworthy experts is a very reasonable and rational choice.
The real cost of security will always be convenience: the painful reality is that good security is always inconvenient.  In theory, security does not need to harm convenience, but in reality it always, always does.  For example, to become more or less proficient in ComSec you need to educate yourself, and that takes time and energy.  Using a key to enter a home takes more time than opening an unlocked door.  A retinal scan takes even more time (and costs a lot more).  You may also spend a great deal of time trying to convince your friends to adopt your practices, and they will reject your advice for many more or less valid reasons.  The key question here is “is it worth it?”, and that is a personal decision for you to make.  You will also need to factor in the costs of not using high-tech.  You can email a friend or meet him face to face.  But in the latter case, you need to ask yourself how much time and money it will take for you two to meet, how easy it will be for the bad guys to eavesdrop on your whispered conversation, how fast you could transmit any information by such means, or whether physically carrying sensitive information to such a meeting is a good idea in the first place.  Conclusions:
  1. Going low-tech might be far more costly and less safe than intelligently using high-tech solutions.
  2. “No tech” at all is usually the worst choice, if only because it is delusional in the first place.
Conclusions:
I tried to debunk some of the many myths and urban legends about ComSec in general and an agency like the NSA in particular.  I had the time to do that once, but since this topic is not a priority for this blog, I won’t be able to repeat this exercise in the future.  I hope that this has been useful and interesting; if not, I apologize.
Starting next week, we will return to our more traditional topics.
Hugs and cheers,
The Saker
