Paper28b.
# How a self-evident "safeguard" structure provides pragmatic results across the board
Paper28 part-b explains how to use a pragmatic safeguard structure.
> "100% Secure"? It is simple once you know how.
Status: Finalised.
Filed: 4th Quarter, 2012
Most organisations are now dependent on the information flows handled by computer systems. Protecting those information flows is therefore crucial, but most organisations (including security companies and government security agencies) fail to get their act together.
Different cultures have different habits, but since most regions seem to follow the international 'main stream' when it comes to office and IT culture, the observations seem to apply across the board.
‣ Response
‣ Attitude
‣ Gap between hardware and software makers
Note: An absolute root cause often turns out, initially, to be a taboo subject among those involved in the structure.
The absolute root cause analysis results provided the needed insights to research effective solutions and put those into a logical methodology: ASCS.
The majority of the methodology verification testing has been
done discretely within various operational structures to
minimise change resistance.
Some organisations are aware of this Advanced Self Correcting
Structure, others still cling to the old mainstream habits. It
takes some people a bit more courage to get off the known main
road, to avoid traffic jams and other predictable obstacles.
There is even a marvelous way to simply pass most obstacles,
if you want to.
Note: Before continuing to read, one needs to be aware of a fundamental brain mechanism called self-justification. As a natural protection mechanism, a brain will often automatically first try to dismiss evidence if it contradicts the embedded belief structure. Secondly, the brain's 'security' mechanisms create a feeling of anger from frustration if it cannot easily dismiss such bothersome new information. That feeling of anger is often used as a convenient excuse to justify dismissal of the whole subject, in order to continue as if the unpleasant evidence never existed. For some, this planet is and always will be a disk instead of a globe, even after contradicting evidence is observed with their own eyes.
"If we choose, we can live in a world of comforting illusion." ~ Noam Chomsky
Let's set the pace by stating that 'Security' is a mythological beast.
"Security is a mere desire. Protection is done by preparation." ~ JH '96
Security: from se- 'without' + cura 'care'
Protection: from pro- 'in front' + tegere 'to cover'
Occurrence: How widespread it is at the moment.
Detection: Effort needed to detect the existence of the described situation.
Solution: Which method in Paper28b will eliminate or mitigate the issue.
Cost: How much money the solution will normally cost.
Effort: How much effort it normally takes to deal with the described issue. Scale: ▁ ▃ ▅ ▇
10 ~ Blocking top-down structure
Most organisations are hierarchical and also strictly divided
into departments.
Many people want to be open-minded, but when tested, the upper layer turns out to be not as approachable or flexible as it likes to proclaim.
The consequence is a social iron curtain preventing crucial information from moving up the ladder to those who need to act upon it. This creates a partly detached management layer which is unaware of some things going wrong, until.. they have developed into a crippling issue so blatant that it is also noticed by either management or the news media.
Occurrence: The norm
Detection: Easy
Solution: #S-06 + #S-01
Cost: Insignificant
Effort: ▃
11 ~ Academic reality bending, mind the gap
[restricted information]
Occurrence: Very common
Detection: A little effort
Solution: #S-02 + #S-03L + #S-05
Cost: Depends on how deeply rooted
Effort: " "
12 ~ The Standards deception
Part 1:
Organisations often opt for adopting a security standard, instead of just becoming compliant with regulations. So, instead of only changing what needs to be changed, they start doing everything exactly as described in the chosen guidelines.
This is mostly done because it seems easier, but it forces an organisation's structure into an unnatural shape. It also creates a vulnerable mono-culture, by removing uniqueness and diversity. The foreshadowed end-result is eagerly abused by attackers and espionage.
Part 2:
The currently popular official 'security' standards have a
few inherent drawbacks:
Occurrence: The norm
Detection: Advertised by the organisation
Solution: #S-07 + #S-01
Cost: Depends on how deeply already rooted
Effort: " "
13 ~ Certification mono-culture
Plus, there are several fundamental flaws in the ISO/IEC/ISACA/(ISC)2/SANS/... doctrine.
Occurrence: The norm
Detection: Easy, check for the known standard mistakes made
by all those who rely upon the doctrine.
Solution: #S-01 + #S-05
Cost: Insignificant
Effort: ▅
14 ~ Over-simplification / (The 3 robot laws)
When policies are translated into work procedures, those
procedures are then normally made obligatory.
So, deviating from The Procedures is not an encouraged option.
Translating policies into procedures is usually done by just
writing down what is normally done to get the desired
result. This is a tunnel vision approach, which doesn't take
exceptions or change into account.
Another issue is that procedures are often written in a so-called 'fool-proof' format. In theory, any co-worker should then be able to follow such a procedure to get the same result.
The resulting over-simplification creates a nasty catch-22
situation when an exception pops up. The person who has to
deal with that situation is bound by the procedures
which do not provide a solution.
A common situation: Someone behind a shop
counter tells you that something, which is perfectly
reasonable, is "not possible!".
You both know that it is possible, if.. the person
would be allowed to act outside the restrictive procedures.
..Like for example, simply asking a manager who is allowed
to act in the best interest of the organisation, to permit such
sensible exceptions from "the rules".
Occurrence: Common
Detection: A test case
Solution: #S-01 + #S-06
Cost: Cost reduction
Effort: ▃
15 ~ Conflict of interests
Vendors have an interest in selling more and more: products break just outside the calculated warranty scope, or 'extra' features, new versions and support are sold in order to 'help' the customer deal with various known shortcomings of the product.
Just a few examples from a very long list:
‣ "Built-in breakdown"
equipment from Sony, Samsung, HP, Philips and many other
companies;
‣ The locked-out features in Apple, Tesla, and so many other products;
‣ Too complicated
systems;
‣ Forced updates which
remove functionality;
‣ Etc..
‣ Two decades ago, specific printers from one company suddenly seemed to be broken, just after the warranty period. The company informed people who wanted to get them fixed that it would be cheaper to buy a new one. But a German technician found out that by holding a specific button combination while turning on the power, the 'built-in breakdown' function is reset and the printer works just fine again.
Sadly, many other equipment manufacturers have adopted that nasty sales method.
‣ Some companies like to disable or exclude
functionality in their devices, and enable or include it in
a following 'New!' version. So people are enticed to toss
away their just one or two years old product to keep buying
the latest (soon also outdated) version.
‣ Oracle, a company which started out with a database product, likes to sell not only expensive software but also even more expensive consultancy with it. Various gaps were introduced so that consultancy is needed to get and keep the product functional. Some other database products simply do what is needed out of the box, without creating extra complexity, insufficient documentation and missing functionality.
‣ A currently popular (due to very aggressive marketing, not because of its quality) "next generation firewall" from a self-proclaimed "global cybersecurity leader" company communicates with the company's on-line services in a way which lets them enable/disable/change functionality. It also automatically extracts sensitive information from the customers' networks. (= a backdoor)
Those are rather obvious examples of incapacitating products
which find their way into your organisation. There are also
various less obvious forms of trickery hidden in not only
products but also some of the commercial
certifications and training/education methods.
Occurrence: Widespread
Detection: Have someone with the required know-how
scrutinize your chosen products, solutions and methods.
Also
search for publications about detected shortcomings in
various products, solutions and methods from those chosen
vendors.
Solution: #S-01 & #S-05
Cost: Reduction in expenditure
Effort: ▃ .. ▇
Depends partly on the learning curve of those
responsible for selecting decent options.
Normally that only takes a few days to get the
hang of it.
16 ~ Outsourced does not mean: don't worry!
A self-induced misconception is that responsibility for protection can be delegated by outsourcing.
The consequence is that when such an outsourced service fails or gets hacked, the organisation suffers the pain while the service provider merely points at a disclaimer in the service delivery contract.
A second issue is that service providers generally prefer to keep quiet about failures, and thus often do not inform their customers when their systems get hacked or data breaches happen, if they think it has not been noticed.
Typical example: A service hosted by a run-of-the-mill hosting/cloud service provider gets hacked on a rainy Sunday morning. After it gets noticed a few days later, the service provider can first try to mitigate the issue without customers noticing, but can also argue that it could not have been prevented because it was supposedly "a very advanced attack", or use some other common excuse.
You can then only count your
losses...
Occurrence: Common
Detection: Easy, from outside
Solution: #S-02 + #S-01
Cost: Slight increase
Effort: ▃ .. ▅
17 ~ Nerds make nerd toys, not good products
Technology nerds inherently tend to lose themselves in their
own realm. They have little interest in how user friendly
their creations really are.
Consequence is that many products are counter-intuitive;
far too complex; bloated with hidden/unused
functionality; unstable/unfinished; or even clearly too
dysfunctional to use.
Example: after more than 30 years of 'progress', computer software products still suddenly stop/crash without any message or obvious reason, or throw useless messages at the baffled user every now and then, like "unexpected error!", "Error: #1234", "Application is not responding", ...
Because of all that included extra functionality/complexity, most commercial 'security' products are by default filled with various exploitable parts waiting to be discovered.
Occurrence: Widespread
Detection: Depends on the type of product
But an educated guess can be made
Solution: #S-05 for mitigation; #S-01 for solution
Cost: Depends on current overall organisation structure.
Effort: " "
Occurrence: The norm
Detection: Needs skill
Solution: #S-03L + #S-05
Cost: Depends on number of programmers & engineers
and their age.
Effort: ▇.
Result also makes their work far more enjoyable.
19 ~ Skipping the basics, to play with exciting toys
To create a solid foundation
to enable a strong overall structure, a few simple
rules need to be enforced.
The work that needs to be done to create a solid
infrastructure protection foundation based on the
fundamental rules/guidelines isn't glamorous.⌑
So it takes some self-discipline to put in that work, when there are tempting options[2][3][4][12][16] to ignore those tactical rules. That is one of 3 reasons why most organisations can't even manage to factually adhere to a few essential principles which are also described within most security guidelines, like ISO 2700x, etc.
Many people prefer fashionable or fancy
looking options.⌑
The false expression "Can't stop progress!" is often abused as a hollow excuse to justify it, without checking whether something is merely a hype rather than actual progress.
There is an endless stream of such exciting 'new' gadgets which are eagerly adopted. Many turn out to create more direct and/or indirect problems than one can justify as acceptable. With more and more such gimmicks stacking up over the years, the overall structure degeneration becomes a major risk factor.
Occurrence: Common
Detection: 1 minute from outside
Solution: #S-05
Cost: Reduction in expenditure
Effort: ▁
20 ~ Losing sight of the weakest link
Being aware of the weakest-link principle is one of several key factors for effective infrastructure protection.
The weakest-link principle says: a chain is only as strong as its weakest link. It means there is always a part of the process, a person or a technology which weakens the overall structure.
When the complete chain overview is missing, weak links are no
longer visible to those who are responsible for the protection
of the infrastructure.
With excessive complexity one can no longer identify the weak links by just looking at each individual link, because there are too many; they differ too much in structure; are mostly invisible; are layered; and are subject to frequent change. Overcomplexity makes it impossible to be in control, or even to classify all links correctly within the context of the complete structure.
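The weakest-link principle can be sketched in a few lines of code. This is a toy model, not from the paper: the link names and strength scores are invented for illustration. The point it demonstrates is that the effective strength of a chain is the minimum over its links, not the average, so adding strong links never compensates for one weak one.

```python
# Toy model of the weakest-link principle.
# Link names and strength scores (0-10) are illustrative only.
chain = {
    "firewall": 9,
    "patching": 8,
    "passwords": 2,   # the weak link
    "backups": 7,
}

average = sum(chain.values()) / len(chain)
effective = min(chain.values())        # chain strength = weakest link
weakest = min(chain, key=chain.get)    # which link it is

print(f"average score:      {average:.1f}")  # looks reassuring: 6.5
print(f"effective strength: {effective}")    # reality: 2
print(f"weakest link:       {weakest}")
```

Averaged scores (as many audit scorecards produce) look reassuring here, while the effective strength is dictated entirely by the weakest entry; and without the complete chain overview, that entry may not even be in the dictionary.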
Another consequence is that various lower-ranking people get blamed each time one or more links fail due to an accidental or unforeseen event, or are targeted by hostile actors. Such top-down blaming is counter-productive and unfairly shifts the blame from those initially responsible for creating the uncontrollable overall structure to the individual participants, who only deal with small parts of the structure and thus can't do much to prevent it from happening again.
..Thus, frustration levels go up, overall quality goes down, and
a top-down state of denial
creates a taboo around the actual root cause of the structural
failure.
Occurrence: Widespread
Detection: Relatively easy
Short term:
Solution: #S-08, by reducing the significance of the weakest-link factor in the overall structure.
Cost: Reduction
Effort: ▁
+
Long term:
Solution: #S-01
Initial Cost: ±1.5 FTE
Effort: ▃ ▁ , reducing over time to ±0.2 FTE
21 ~ Asking 'hackers' to do 'security'
To check whether it is true that 'hackers' are better at setting up good protection, as the news media likes to suggest from time to time, we did a survey of 40 such prominent hackers who also sell security consultancy.
When asked why their own systems are just as poorly protected as those of their hacking targets and consultancy customers, they responded with excuses similar to those of their clients [*see quotes 1, 2, .. and recurring news media articles about such incidents].
This is logical, because their attention is focused on the weakest links, not at all on creating a solid protection structure.
22 ~ Stubborn misconceptions
In this paper we describe the most significant of the
elementary misconceptions, fundamental flaws in used logic and
myths in the popular 'Security' doctrine.
There are also related minor misconceptions; most need more than a few lines to be put into the context in which they are found and the impact they radiate out into the overall structure. There is no need to list them here; they will pop up soon enough when the major ones are adequately dealt with.
To ensure decent infrastructure protection, the focus must shift from haphazardly adding technology, bureaucracy and/or external service providers, to a "keep it simple" natural work environment.
After all, computers were created to make our lives better, ...not more complex, odd and restrictive, as is now the case within the mainstream.
Positive change takes time when opposed; raising awareness about the issues seems a good way to start moving forward.
You may have noticed that many sections also apply to other structures, not just ICT and 'Cyber-Security'. The solutions for each issue, described in Paper28b, largely also apply to various other areas. For instance, even the media regularly reports on healthcare, energy, banking and government organisations suffering from the same type of structural failure, ...due to the same causes.
The following quotes popped up during our conversations with those people during the years of research. These quotes are from those who promote the failing mainstream 'security' approach, and further illustrate the structural awkwardness of that approach.
Notes: