The 'Security' Paradox

A more sensible look at infrastructure protection

Paper28a.
# Why the standard information technology 'security' methods can only fail to deliver.

Status: RFC
Version: 1.4, Dec 2020
First Publication: 2012

Paper28b.
# How a self-evident "safeguard" structure provides pragmatic results across the board.

Paper28 part-b explains how to use a pragmatic safeguard structure.

> "100% Secure": it is simple once you know how.

Status: Finalised.
Filed: 4th Quarter, 2012

"If you do not change direction, you may end up where you are heading." Siddhartha Gautama Buddha

Summary:

At present, popular information technology 'security' methods are partly based on stubborn misconceptions and conflicts of interest.
In this paper we explain in layman's terms what is causing the widespread 'security' failure.
This information can enable a better grip on information protection, by providing insight into the major factors for failure that result from the quirky mainstream 'security' approach.


Introduction:

Most organisations are now dependent on the information flows handled by computer systems. Protecting those information flows is therefore crucial, but most organisations (including security companies and government security agencies) fail to get their act together.

Just a seemingly small alteration of a public-facing web-site can put an organisation's reputation at stake. A disruption of the main transaction systems which takes only a few seconds can take days to recover from, putting the whole organisation at a complete standstill during that time. And... 'hacking' has become rather popular, not only with bored schoolkids and gangsters, but also among governments and rival companies.


Research shows that banks, hospitals, police and other critical infrastructure participants can become compliant with the relevant information protection laws and regulations only on paper - in reality they are not even close. They are thereby also breaking the laws which make decent protection obligatory.
Even organisations whose primary focus is 'security' - like secret service agencies, academic research groups and well-known 'security' companies - turn out to neglect several basic protection principles.

Information Technology Security people tend to use the standard circular-logic excuse "nothing can be 100% secure!" to justify continuing as before and wasting budget on yet more very expensive add-on products, standards and procedures, instead of dealing with what is actually causing that structural failure.


So... the "But why!?" question arose, and a long-term research project came to life [1997-2020].
The project reached its initial goal in 2011. This dual paper is the result of completely independent field research by project members, involving a wide variety of organisations around the globe.

Different cultures have different habits, but since most regions seem to follow the international mainstream when it comes to office and IT culture, the observations seem to apply across the board.

"How wonderful that we have met with a paradox. Now we have some hope of making progress." Niels Bohr


              Root cause
  1. Complexity overload

    Response

  2. State of denial

  3. Retreating into blind faith

    Attitude

  4. King of the hill syndrome

  5. King of the castle

  6. Macho IT culture

     Type of people
  7. Hiring process is often inherently self-destructive

  8. Unqualified to qualify

  9. Pleasant colleague, may be the wrong choice

          Mind the gap
  10. Blocking top-down structure

  11. Academic reality bending

  12. The Standards deception

  13. Certification mono-culture

  14. Over simplification

  15. Conflict of interests

  16. Outsourced, doesn't mean: don't worry

  17. Nerds make nerd toys, not good user products

  18. Gap between hardware and software makers

    Thinking outside the box
  19. Skipping the basics, to play with exciting toys

  20. Losing sight of the weakest link

  21. Asking 'hackers' to do 'security', side effects

  22. Stubborn misconceptions

"Every great and deep difficulty bears in itself its own solution. It forces us to change our thinking in order to find it."
Niels Bohr

Approach to analysis:

Standard Root Cause Analysis (RCA) is a bureaucratic, management-driven process. It identifies something as "the root cause" the moment it suits those conducting the investigation. The outcomes of such narrow-path RCA projects are inherently a few layers above the actual root cause of the problem: they focus on corrective measures for what is best described as "sub-layer symptoms" instead of dealing with the actual root cause.
We have used Absolute Root Cause Analysis (ARCA), a more sophisticated approach, to dig down to the actual bare bones of a problem.
The label "root cause" is only used when it satisfies the curiosity after extremely repetitive "but why!?" questioning from three angles: logical, structural and motive.

Note: Often an absolute root cause turns out to be initially a taboo subject with those who are involved in the structure.

The absolute root cause analysis results provided the insights needed to research effective solutions and put them into a logical methodology: ASCS.
The majority of the methodology verification testing has been done discreetly within various operational structures, to minimise change resistance.
Some organisations are aware of this Advanced Self Correcting Structure; others still cling to the old mainstream habits. It takes some people a bit more courage to get off the known main road, to avoid traffic jams and other predictable obstacles.
There is even a marvellous way to simply pass most obstacles, if you want to.

Note: Before continuing, one needs to be aware of a fundamental brain mechanism called self-justification.
As a natural protection mechanism, a brain will often automatically first try to dismiss evidence if it contradicts the embedded belief structure. Secondly, the brain's 'security' mechanisms create a feeling of anger from frustration if such bothersome new information cannot easily be dismissed. That feeling of anger is often used as a convenient excuse to dismiss the whole subject, in order to continue as if the unpleasant evidence never existed.
For some, this planet is and always will be a disk instead of a globe, even after contradicting evidence is observed with their own eyes.

"If we choose, we can live in a world of comforting illusion."
Noam Chomsky

Let's set the pace by stating that: 'Security' is a mythological beast.
The word Security itself creates misleading expectations with severe consequences.
To get a solid foundation to build on, one needs to eliminate that underestimated beast.

" Security is a mere desire..  Protection is done by preparation " ~ JH '96
Security: from se- 'without' + cura 'care'
Protection: from pro- 'in front' + tegere 'to cover'
"Arrange whatever pieces come your way." Virginia Woolf


Occurrence:  How widespread it is at the moment.
Detection:  Effort needed to detect the existence of the described situation.
Solution:  Which method in Paper28b will eliminate or mitigate the issue.
Cost:  How much money the solution will normally cost.
Effort:  How much effort it normally takes to deal with the described issue.
Effort scale: ▁ ▃ ▅ ▇ (from minimal to major)


What is causing the structural failure:

1 ~ Complexity overload
The root cause of most of the following issues can be found in a natural brain overload defence mechanism.
A person doing a job is supposed to be 'in control' of the tasks that come with it. But with the extremely complex (sub-)structures which have become the standard these days, it is no longer possible to comprehend it all.

If the mind is not trained to deal with such dynamic complexity (there seems to be no mainstream educational institute which teaches this rather useful skill), the brain gets into a state where automatic reactions start hampering healthy reasoning.

One of those brain reactions is choosing to simply ignore crucial new information.


Occurrence: Common
Detection: 1 test question
Solution: #S-01
Cost: Insignificant
Effort:




2 ~ State of denial
A side effect of the overload issue is often a state of denial about the inability to be in control.
Children can stop paying attention when it's all too much, but most adults perceive group pressure to act as if they can keep up all the time.
A person in such a complexity-overload position can go to great lengths to conceal that fact.
Endlessly blaming various factors presumably outside their control is the most common tactic - which is circular reasoning.
When directly confronted with inescapable evidence of the inability to get in control, a classic reaction is to repeat "That is just an opinion!" in response to each piece of evidence.

Occurrence: Common
Detection: 1 test question
Solution: #S-02
Cost: Insignificant
Effort:



3 ~ Retreating into blind faith
Many simply pick the seemingly most popular "solution", without even looking at which other options are available.
If one feels generally unable to deal with the overload of options, an easy way out seems to be to just follow a crowd and hope things will be fine.
If it turns out not to be fine..., one can still take some comfort in stating that the choice was based on 'a standard'.
Some are aware of this capitulation and are openly prepared to suffer the consequences.
Others (usually within the large organisations) are not, and slide into a very particular state of denial: one in which they mistake their choice for 'the only option!'. This is followed by extensive effort to get the consequences of a counter-productive initial choice under control, instead of just taking a few hours to learn how to cut through information overload and become able to make a sensible choice.


Occurrence: Common
Detection: 1 test question
Solution: #S-01
Cost: Reduction
Effort:

"One accurate measurement is worth a thousand expert opinions." Grace Hopper


4 ~ King of the hill syndrome
A person who doesn't feel in control of the job will often go to great lengths to keep out any form of criticism or competition.
One consequence is that he/she will try to make sure that only less qualified people are hired, so the overall situation gradually worsens.
Such a king of the hill will also try to prevent external verification of the current status, like a vulnerability test. If such a test is performed anyway and unpleasant results are reported, a king of the hill will try hard to hide the results by misrepresenting them and shifting blame to "circumstances outside their control", like lack of budget, service providers, etc. When in fact it has been their lack of taking control which caused the situation, and their choice to hide facts and shift blame. All that, instead of just dealing with the status quo, for example by using such a report to get the required budget for training/help to get in control.


Occurrence: Common
Detection: Easy
Solution: #S-03
Cost: Insignificant
Effort:
▁ / ▅  Depends on age and stubbornness, and otherwise on leadership's willingness to remove some power from the person, if too deeply locked into such a state.


5 ~ King of the castle
A person who owns or manages part of an organisation and feels too authoritative can start making up arbitrary rules without checking whether they are realistic.
This is normally combined with not tolerating criticism. The situation is created by the complexity overload issue in combination with a power trip.
A consequence is that people working under the 'King of the castle' must do things which they know are counter-productive in order to keep their jobs.
Most often they will focus on making everything look good on paper and ignore/hide the actual state for as long as possible.

Occurrence: Common
Detection: With a trick question.
Solution: #S-04
Cost: Little
Effort:



6 ~ Macho IT culture
It is no secret that the Information Technology playground has a macho culture, for both men and women.
One of the consequences is that a problem isn't tackled with the most pragmatic solution, but is seen as an opportunity to show off with a fancy 'solution'.
Over time this results in a far too complex structure (a Frankenstein's monster). That complexity then triggers the King of the hill syndrome and other unpleasant side effects.

Occurrence: The norm
Detection: Takes a minute to see from the outside
Solution: #S-05
Cost: Reduction
Effort:
"People who think they know everything
are a great annoyance to those of us who do"
 Isaac Asimov ;-)



7 ~ Hiring process is often inherently self-destructive for an organisation
The standard hiring process format dates from a time when the archive was still a room full of paper maintained by office clerks, instead of the current digital data stores maintained by technicians.
The personnel department has partly distanced itself from its co-workers by calling itself "HRM" (Human Resource Management), but has not yet made a complete transition into the digital era. It still uses the same old basic selection-criteria method, which is not suited to getting the proper people, and thus still favours the bureaucratic office-clerk type.

Occurrence: The norm
Detection: Easy
Solution: #S-04
Cost: Reduction
Effort:



8 ~ Unqualified to qualify
Most HRM people and recruiters who create the job-requirements description and do the initial CV selection simply can't fully understand all CVs, and don't know enough about what is actually required for each position to be effective.
So... they primarily copy and paste bits and pieces of descriptions from the internet for the requirements, and then try to find those keywords in CVs.
Which means that they unknowingly overlook the more qualified people.


A typical example: (no need to understand all those acronyms)

The CEO asked for a "security officer", in order to comply with government regulations.
HRM has a quick chat with ICT, with the following (misguided) result for the requirements:
    Job title: Security Officer.
    Requirements: Master's degree in computer science, +CISSP, +experience with AIX.
 
~ John's resume states: "15 years as security consultant for Banks and Telco's, Certified Senior Unix Engineer, 1995: Polytechnic microelectronics diploma."

~ Harry's resume states: "2015: University degree in computer science. Research project with AIX v4.2. 
Certified (CISSP) from (ISC)² in 2019. "

The HRM department will put John's resume in the rubbish bin without hesitation, and proudly show Harry's to the CEO.

The CEO, made blind by the meddling HRM department, will have to pick Harry as the new Security Officer: a guy who has briefly worked with the requested operating system at university. He simply doesn't have the practical knowledge and experience needed to do the job, but on paper seems a perfect match.

The consequence is that organisations which use such HRM or outsourced screening generally get stuck with the "fake it 'til you make it" type.
Which triggers the King of the hill syndrome right from the start of the new job.
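The screening failure described above can be sketched as a naive keyword filter. This is a minimal, hypothetical illustration: the keyword set and CV texts mirror the example in this paper, but the code itself is not from any real HRM tool.

```python
# Hypothetical sketch of keyword-based CV screening, as described above.
# The required-keyword set and CV texts are illustrative assumptions.

REQUIRED_KEYWORDS = {"computer science", "cissp", "aix"}

def keyword_screen(cv_text: str) -> bool:
    """Naive screening: pass only if every required keyword appears verbatim."""
    text = cv_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

john = ("15 years as security consultant for Banks and Telco's, "
        "Certified Senior Unix Engineer, 1995: Polytechnic microelectronics diploma.")
harry = ("2015: University degree in computer science. Research project with AIX v4.2. "
         "Certified (CISSP) from (ISC)2 in 2019.")

print(keyword_screen(john))   # False: 15 years of relevant experience, zero keyword hits
print(keyword_screen(harry))  # True: matches every keyword on paper
```

The filter never measures competence; it only measures vocabulary overlap with a copy-pasted requirements list, which is exactly why the more qualified candidate is discarded.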

Occurrence: The example doesn't show an exception but the norm.
Detection: Easy
Solution: #S-03 + #S-05
Cost: Reduction
Effort:



9 ~ Pleasant colleague, may be the wrong choice
Normally, people who are looking for someone to strengthen their organisation make "pleasant colleague" one of the primary selection criteria.
Some drawbacks of that somewhat thoughtless but understandable choice are:

 ‣ It excludes healthy counterweight perspectives.
 ‣ It excludes the "tough cop" type, which is crucial for the security officer's position.
 ‣ It will, over time, create a fragile mono-culture.
 ‣ It will, over time, create an increasingly conservative, introverted, inflexible organisation structure.

The consequence is that most organisations:

 ‣ Can't handle several types of exceptions.
 ‣ Prefer to dismiss justified criticism and advice.
 ‣ Find it too difficult to keep up with a fast-changing world.
 ‣ Are not in control of adequate information & infrastructure protection; their risk management.

And, even so, seem genuinely surprised every time they have yet another predictable incident or crisis.


Occurrence: The norm
Detection: Clearly visible on the outside
Solution: #S-05
Cost: Reduction
Effort:
▁ / Insignificant

10 ~ Blocking top-down structure
Most organisations are hierarchical and also strictly divided into departments.
Many people want to be open-minded, but when tested, the upper layer turns out not to be as approachable or flexible as it likes to proclaim.
The consequence is a social iron curtain preventing crucial information from moving up the ladder to those who need to act upon it.
This creates a partly detached management layer which is unaware of things going wrong, until... an issue has developed into something so blatant and crippling that it is noticed by either management or the news media.

Occurrence: The norm
Detection: Easy
Solution: #S-06 + #S-01
Cost: Insignificant
Effort:



11 ~ Academic reality bending, mind the gap

[restricted information]

Occurrence: Very common
Detection: A little effort
Solution: #S-02 + #S-03L + #S-05
Cost: Depends on how deeply rooted
Effort:
" "

12 ~ The Standards deception
Part 1:

Organisations often opt for adopting a security standard, instead of just becoming compliant with regulations.
So, instead of changing only what needs to be changed, they start doing everything exactly as described in the chosen guidelines.
This is mostly done because it seems easier, but it forces an organisation structure into an unnatural shape.
It also creates a vulnerable mono-culture, by removing uniqueness and diversity.
The foreshadowed end result is eagerly abused by attackers and espionage operations.
Part 2:
The currently popular official 'security' standards have a few inherent drawbacks:

  1. They increase complexity, the number-one cause of structural failure.
  2. They are outdated by default: it takes the bureaucrats about 4 to 6 office years to create or revise a standard.
  3. They mostly come from dusty academic theory and copy & paste bits from older documents, instead of being a collection of extracted field experience. ISO 27001 is derived from BS 7799, which was derived from the Orange Book published in 1983.
  4. By their nature they impose a conservative structure and thus restrict the required evolution.
  5. [restricted information]
  6. The accompanying procedures for implementation and for certification are commonly over-simplified.
     This not only removes essential insight into how the interdependent parts have to fit into the overall structure, but also removes the flexibility needed to handle unforeseen situations.
     Thus, they fail to deal with many types of threats, incidents, exceptions, ...

Occurrence: The norm
Detection: Advertised by the organisation
Solution: #S-07 + #S-01
Cost: Depends on how deeply already rooted

Effort: " "

"The only thing worse than being blind is having sight but no vision." Helen Keller

13 ~ Certification mono-culture
When organisations only use people who are all committed to just one and the same dogma, they expose themselves to predictable failure.
Plus, there are several fundamental major flaws in the ISO/IEC/ISACA/(ISC)²/SANS/... doctrine.
[*Evidence on request]

Occurrence: The norm
Detection: Easy, check for the known standard mistakes made by all those who rely upon the doctrine.
Solution: #S-01+ #S-05
Cost: Insignificant
Effort: 


14 ~ Over simplification  / (The 3 robot laws)
When policies are translated into work procedures, those procedures are then normally made obligatory.
So, deviating from The Procedures is not an encouraged option.

Translating policies into procedures is usually done by just writing down what is normally done to get the desired result. This is a tunnel-vision approach, which does not take exceptions or change into account.

Another issue is that procedures are often written in a so-called "fool-proof" format. In theory, any co-worker should then be able to follow such a procedure and get the same result.
The resulting over-simplification creates a nasty catch-22 situation when an exception pops up: the person who has to deal with that situation is bound by procedures which do not provide a solution.

A common situation: someone behind a shop counter tells you that something which is perfectly reasonable is "not possible!".
You both know that it is possible, if... the person were allowed to act outside the restrictive procedures; for example, by simply asking a manager who is allowed to act in the best interest of the organisation to permit such sensible exceptions to "the rules".

Occurrence: Common
Detection: A test case
Solution: #S-01 + #S-06
Cost: Cost reduction
Effort: 


15 ~ Conflict of interests
In recent years it has become 'normal' for companies to create half-baked products: either to sell more when the product breaks just outside the calculated warranty scope, or to sell 'extra' features, new versions and support... in order to 'help' the customer deal with various known shortcomings of the product.

Just a few examples from a very long list:

 ‣ "Built-in breakdown" equipment from Sony, Samsung, HP, Philips and many other companies;
 ‣ The locked-out features in Apple Inc., Tesla, and so many other products;
 ‣ Too-complicated systems;
 ‣ Forced updates which remove functionality;
 ‣ Etc.

Two decades ago, specific printers from one company suddenly seemed to be broken, just after the warranty period. The company informed people who wanted to get them fixed that it would be cheaper to buy a new one. But a German technician found out that by holding a specific button combination while turning on the power, the 'built-in breakdown' function is reset and the printer works just fine again.
Sadly, many other equipment manufacturers have adopted that nasty sales method.

Some companies like to disable or exclude functionality in their devices, and enable or include it in a following 'New!' version. So people are enticed to toss away a product that is just one or two years old, to keep buying the latest (soon also outdated) version.

Oracle, a company which started out with a database product, likes to sell not only expensive software but also even more expensive consultancy with it.
They introduced various gaps, so that consultancy is needed to get and keep the product functional. Some other database products simply do what is needed out of the box, without creating extra complexity, insufficient documentation and missing functionality.

A currently popular "next generation firewall" (popular due to very aggressive marketing, not because of its quality) from a self-proclaimed "global cybersecurity leader" communicates with the company's on-line services in a way which lets them enable/disable/change functionality. It also automatically extracts sensitive information from the customers' networks (in effect, a backdoor).

Those are rather obvious examples of incapacitating products which find their way into your organisation. There are also various less obvious forms of trickery hidden in not only products but also some of the commercial certifications and training/education methods.

Occurrence: Widespread 
Detection: Have someone with the required know-how scrutinize your chosen products, solutions and methods.
                 Also search for publications about detected shortcomings in various products, solutions and methods from those chosen vendors.
Solution: #S-01 & #S-05
Cost: Reduction in expenditure
Effort: 
▃ .. ▇           
   Depends partly on the learning curve of those responsible for selecting decent options.
   Normally that only takes a few days to get the hang of it.

"From then on, when anything went wrong with a computer, we said it had bugs in it." Grace Hopper

16 ~ Outsourced, does not mean: don't worry!
A self-induced misconception is that responsibility for protection can be delegated by outsourcing.
The consequence is that when such an outsourced service fails or gets hacked, the organisation suffers the pain, while the service provider merely points at a disclaimer in the service-delivery contract.
A second issue is that service providers generally prefer to keep quiet about failures, and thus often do not inform their customers when their systems get hacked or data breaches happen, if they think it has not been noticed.

Typical example: a service hosted by a run-of-the-mill hosting/cloud service provider gets hacked on a rainy Sunday morning. After it gets noticed a few days later, the service provider can first try to mitigate the issue without customers noticing, but can also argue that it could not have been prevented because it was supposedly "a very advanced attack", or some other commonly used excuse.
You can then only count your losses...


Occurrence: Common
Detection: Easy, from outside
Solution: #S-02 + #S-01
Cost: Slight increase
Effort: 
▃ ..



17 ~ Nerds make nerd toys, not good products
Technology nerds inherently tend to lose themselves in their own realm. They have little interest in how user-friendly their creations really are.
The consequence is that many products are counter-intuitive; far too complex; bloated with hidden/unused functionality; unstable/unfinished; or even clearly too dysfunctional to use.

Example: after more than 30 years of 'progress', computer software products still suddenly stop/crash without any message or obvious reason, or throw useless messages at the baffled user every now and then, like "unexpected error!", "Error: #1234", "Application is not responding", ...

Because of all that included extra functionality/complexity, most commercial 'security' products are also by default filled with various exploitable parts waiting to be discovered.

Occurrence: Widespread
Detection: Depends on the type of product, but an educated guess can be made
Solution : #S-05 , for mitigation
                #S-01 , for solution
Cost: Depends on current overall organisation structure.
Effort: " "


18 ~ Gap between hardware and software makers
It turns out that software programmers are generally a rather different type of person from hardware development engineers.
People who do both normally have one of those skills predominant, because the different skills require fundamentally different thought processes.
This difference in personality wouldn't be an issue if both were able to appreciate the difference and work well together on projects requiring both skills.
The consequence is that products which combine microelectronics and software normally have a noticeable gap between the hardware and software sections, which can be exploited without either section even being able to raise an alarm.

There are many examples where very expensive security devices can be bypassed with a clever trick targeted at that gap.
For example: a Trusted Platform Module, various biometric and other types of physical access devices, etc..

Occurrence: The norm
Detection: Needs skill
Solution: #S-03L + #S-05
Cost: Depends on number of programmers & engineers and their age.
Effort: 
The result also makes their work far more enjoyable.

"I don't know who has the (2nd) laser pointer, but..." Stephanie Wehner

"If you want to see what children can do, you must stop giving them things." Norman Douglas

19 ~ Skipping the basics, to play with exciting toys
To create a solid foundation to enable a strong overall structure, a few simple rules need to be enforced.

The work that needs to be done to create a solid infrastructure-protection foundation based on the fundamental rules/guidelines isn't glamorous.
So it takes some self-discipline to put in that work, when there are tempting options[2][3][4][12][16] to ignore those tactical rules.
That is one of three reasons why most organisations can't even manage to factually adhere to a few essential principles which are also described within most security guidelines, like ISO 2700x, etc.

Many people prefer fashionable or fancy-looking options.
The hollow expression "Can't stop progress!" is often abused as an excuse to justify them, without checking whether something is merely hype rather than actual progress.
There is an endless stream of such exciting 'new' gadgets which are eagerly adopted. Many turn out to create more direct and/or indirect problems than one can justify as acceptable. With more and more such gimmicks stacking up over the years, the overall structural degeneration becomes a major risk factor.

Occurrence: Common
Detection: 1 minute from outside
Solution: #S-05
Cost: Reduction in expenditure
Effort:


20 ~ Losing sight of the weakest link
Being aware of the weakest-link principle is one of several key factors for effective infrastructure protection.
The weakest-link principle says: a chain is only as strong as its weakest link. It means there is always a part of the process, a person or a technology which weakens the overall structure.
When the complete chain overview is missing, weak links are no longer visible to those who are responsible for the protection of the infrastructure.

With excessive complexity, one can no longer identify the weak links by just looking at each individual link: there are too many; they differ too much in structure; are mostly invisible; are layered; and are subject to frequent change. Over-complexity makes it impossible to be in control, or even to classify all links correctly within the context of the complete structure.
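The weakest-link principle above can be expressed in a few lines: overall protection is bounded by the minimum, not the average, of the individual link strengths. The link names and scores below are purely hypothetical, chosen only to illustrate the idea.

```python
# Minimal sketch of the weakest-link principle.
# Links and their strength scores (0..1) are hypothetical examples.

links = {
    "firewall": 0.9,
    "patching process": 0.8,
    "staff phishing awareness": 0.3,  # the weakest link
    "backup procedure": 0.7,
}

# The chain's strength is set by its weakest link, regardless of how
# strong the other links are.
weakest = min(links, key=links.get)
chain_strength = links[weakest]

print(f"Chain strength: {chain_strength} (limited by '{weakest}')")
```

Note that raising the firewall score from 0.9 to 1.0 changes nothing here; only strengthening the weakest link improves the chain, which is why losing sight of it is so costly.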

Another consequence is that various lower-ranking people get blamed each time one or more links fail through an accidental or unforeseen event, or are targeted by hostile actors. Such top-down blaming is counter-productive: it unfairly shifts the blame from those initially responsible for creating the uncontrollable overall structure to the individual participants, who deal with only small parts of the structure and thus can't do much to prevent it from happening again.
...Thus, frustration levels go up, overall quality goes down, and a top-down state of denial creates a taboo around the actual root cause of the structural failure.

Occurrence: Widespread
Detection: Relatively easy

Short term.
Solution: #S-08 , by reducing the significance of the weakest link factor in the overall structure.
Cost: Reduction
Effort:

  +
Long term.
Solution: #S-01
Initial Cost: ±1.5 FTE
Effort: 
▃  , Reducing over time to ±0.2 FTE


21 ~ Asking 'hackers' to do 'security'

To check whether it is true that 'hackers' are better at setting up good protection, as the news media likes to suggest from time to time, we did a survey of 40 such prominent hackers who also sell security consultancy.
When asked why their own systems are just as poorly protected as those of their hacking targets and consultancy customers, they responded with excuses similar to those of their clients [*see quotes 1, 2, ... and recurring news media articles about such incidents].
Which is logical, because their attention is focused on the weakest links, not at all on creating a solid protection structure.

22 ~ Stubborn misconceptions
In this paper we describe the most significant of the elementary misconceptions, fundamental flaws in logic and myths in the popular 'Security' doctrine.
There are related minor misconceptions; most need more than a few lines to be put within the context in which they are found and the impact they radiate into the overall structure.
There is no need to list them here: they will pop up soon enough when the major ones are adequately dealt with.

"You grow up the day you have your first real laugh at yourself." Ethel Barrymore

Conclusion:

To ensure decent infrastructure protection, the focus must shift
 from
haphazardly adding technology, bureaucracy and/or external service providers,
 to a "keep it simple" natural work environment.

After all, computers were created to make our lives better,   ...not more complex, odd and restrictive, as is the case within the mainstream now.

Positive change takes time when opposed; raising awareness about the issues seems a good way to start moving forward.

"The most dangerous phrase in the language is, 'We've always done it this way.'"
Grace Hopper

You may have noticed that many sections also apply to other structures, not just ICT and 'Cyber-Security'.
The solutions for each issue, described in Paper28b, largely also apply to various other areas.
For instance, even the media regularly reports about healthcare, energy, banking and government organisations suffering from the same type of structural failure, due to the same causes.





Addendum:

The following quotes surfaced during our conversations with these people over the years of research.
They are from those who promote the failing mainstream 'security' approach,
and further illustrate the structural awkwardness of that approach.

[ The identities of individuals were somewhat obscured in Paper28a v1.0 & v1.1;
  although statements have to be verifiable, no one should be singled out for misguided beliefs.
]




Linda Wogelius,  Research Coordinator (ISACA.org)

 
"To remain in compliance, a CISA must just report CPE and pay the maintenance fee."

(ISC)²

SANS
"In order to stay certified you must demonstrate ongoing competency in the Information Assurance field. There are a variety of options for accomplishing this goal."


 
Dr. Udo Helmbrecht, Executive Director of the European Network and Information Security Agency (ENISA) & Professor Jose Cabeza Gonzalez.
"Sure, there are a number of important topics that we do not have in our study curriculum. (I am part of an EU project on Critical Infra Protection).
Our IT Security master programme is a result of many compromises (fighting for space in the programme against colleagues with pet projects of their own).
I am afraid that this does not sound optimistic."

dr. Damiano Bolzoni & prof. dr. Pieter H. Hartel (Distributed and Embedded Security Research Group)
"research is not driven by business factors, therefore researchers focused on topics that are challenging."
"Some ideas are easier to 'port' into the real world, where the assumptions one uses in his/her research do not always hold."


Kai Hansen, Department Manager Automation Networks [Critical Infra Protection development]
"We are not talking about ABB infrastructure. What we talk about is the deliveries to US critical infrastructure and fulfilling the cyber security requirements."
"Truly, we do want practical guys, but only practical guys with a PhD."
"I have a Ph.D. in Chaos Theory - not much direct use in industry - but it shows dedication. "


Rich Mogull. Analyst & CEO, Securosis "Information Security Research & Analysis"
"Actually, our *own* security isn't very pragmatic because it is too complex to scale, but we assume we are a bigger target and take extra steps."

Ronald Prins. CEO, Fox-IT, "For a more secure society "
"
I use a phone to read my mail. That F*!#* thing has no option to disable things for better security."
"We only do regular
internal audits our selves, and the results have been providing quite a lot of work already.
  So I will leave it as is, no external 3rd party verification tests."

"No, I don't understand my husband's theory of relativity, but I know my husband and I know he can be trusted. " Elsa Einstein


Chris Hadnagy (social-engineer.org)
"Thank you for those tips. I just moved server.... So it was some stuff i overlooked"
"We went through and removed some mods added some other sec, hopefully ll good"

James Adair & Adam Geller. Vice President Enterprise and Government, CISSP & CISM (Verisign)
"We are very concerned with our security, and have
internal departments handling all aspects."
"Thank you for accepting our conference call invitation. I'm sorry, but we did not look into the details of the security breach reported to us a few days ago."
"Our software is very well tested, it is not possible to gain access to our secure certificate k-end database. So we are not interested in any alleged specific details about a faulty gateway for the ARM corporation. Thank you for your time, goodbye."
- "Subject: RDP, Information protection flaws.
Your request has been sent to the relevant representative within ARM and has been assigned reference number 10321"
- "Produced By Microsoft Exchange V6.5: cam-exch2.emea.arm.com [10.1.255.58]: Your message was deleted without being read on Monday, February 10, 10:11 AM (GMT)"

Ernst J. Oud. Hired security officer at EZP
(which has one Nuclear power plant)
'I am CISA / CISSP, with 30 years of security experience, university lecturer. According to others an authority in the field.'
'For customers we use all technical resources available to our profession, we have experienced security architects and we let our systems regularly be tested by the best ethical hackers there are in NL and abroad!'
'We are always looking for professionals who keep us on our toes, so surprise me with your knowledge and expertise!'
'You can't see how we do our security just by looking at my website, which is of a much lower standard.'
'Your bias I can not understand, nor why someone kicks against the profession.'


Dr. Wim Hafkamp, CISSP LL.M, Head Info Sec. Strategies & Policies, Chair FI-ISAC.
Bill Munroe (Verdasys) & Dan Geer (Verdasys) / Chief Information Security Officer (In-Q-Tel)
"I am on the road but I did send this to IT with a note that we need to clean up any holes in our own infrastructure. I am most concerned about our own internal infrastructure that protects our software code and QA testing lab."
"I had a discussion with our IT folks and they have contacted a local company that the head of IT has worked with in the past. They were not surprised and were aware of many issues and had it on their agenda. They are going to start reporting it at our executive staff meetings on Monday to show progress so I can follow along. I thank you for the alert so I could move it up the priority chain of items they were working on."


Drs. Marco Plas. CISA CISSP, Chief Architect Risk & Security (ING), Author, Lecturer, Jericho Forum advocate.
"The Advanced Self Correcting Structuredocument does have a general appeal. As a member of the upper management, I would be inclined to invite the author to share a little-bit more about that adaptive security solution."
"Most organizations know, or at least have a gut feeling that their security does not provide 100 percent protection agains unlawful access. One of the biggest design flaws is the simple fact that most organizations have no idea what the value of their information is. Next to that, most security technology was designed for a specific context. That doesn't mean that these solutions will work well in every context, nor where the products designed to interact with products from other vendors. Most organizations have a multitude of point-solutions. Not the good makings of a protective shield (which was in most cases our design idea)."
'Send from my iPhone' via mail.google.com with iPhone Mail (7E18)

A.    Head of Network Security Section (Norwegian Secret Service, National Security Authority)

"First - thank you for the info you provided. We verified the info and did some further investigations, and then brought it to our IT dept.
As far as I know, corrective measures has been made to most of the issues.
But as I said to you - this is outside the scope of my responsibilities, so my knowledge in this matter is somewhat limited.
Anyway - your input has set some much needed focus on the importance of improving information/IT security procedures.
I'd like to thank you for handling this in a professional manner."

"Never mistake knowledge for wisdom.
One helps you make a living; the other helps you make a life."

Notes:




















We are not perfect, and thus perfectly capable of making mistakes.
If you notice anything that seems incorrect on these pages, then please do let us know.