Real hackers do not randomly find a flaw in a system. There is a systematic approach to hacking a system!
Regardless of the size and type of an online entity and its presence, whether a giant company with ten thousand employees or a home user of the Net, the only reason a system may not have been hacked, or tasted the bitter side of the internet, is that it has not been targeted!
Hack-proof, resistant... are you kidding? Most systems are vulnerable even to basic penetration testing! A system mature enough to resist targeted attacks is literally a piece of the "art of security management" rather than a collection of sophisticated security tools or staff.
Only once you are targeted can you truly measure the strength of your security measures, and experience shows that we can barely stay safe after being actively targeted. The targeting process may take years, but once a malicious actor puts your name on the list, you need to find ways to respond and faster ways to recover.
I am not even considering persistent threats, where you may have been hacked for years before you realize it. Remember, not all malware activity is supposed to be noisy and obvious. Targets can remain totally open to adversaries for months or years before they detect an anomaly, so let's save the hidden side of being victimized, and APTs, for later.
What does it mean to be targeted?
It is not as simple as it sounds, but briefly, it means an adversary profiles you or your business over a relatively long time and uses every aspect of your online presence to build what I call a BB, a brighter blueprint of a cyber entity.
The attacker creates an enhanced "vision" of the cyber target, using every single direct or indirect object available to picture it. By the end of this process, or somewhere in the middle of it, the attacker knows your system far better than you do, and that is where the attack lands. The result depends on purpose and motivation: it could be destructive, or hidden with minimal impact, which is scarier, because then they nest somewhere within your system for as long as they need.
The truth is, if a system has not been hacked, that is not because it has a solid security posture; it is only because it has not been targeted.
Real issues are not going to be solved by any of those well-known internet applications!
Currently it does not, but technology could solve our problems if two factors were considered:
Definition of ‘Problem’
Justifying a practical 'Application'
The former seems obvious, but it is actually the root cause of why technology is not able to solve our problems. Look at how major aspects of technology are focused on 'things' that are neither a problem, nor an issue, nor even a basic consideration.
Is the internet really justified in solving our problems? The answer is no, because first we are misleading ourselves with unreal problems: things that are more wanted than needed, and more a matter of convenience than of reasonable living.
As an example, transportation, social media, advertising, even fast communication are not real issues, compared to the other side of the same stories: fossil fuels, introversion, and lack of communication. When you do not have transportation at all, focusing on Uber and Lyft is more insulting than funny. When people are growing more and more introverted, talking about your virtual friends is ridiculous. When delivering an important message to the right people is distracted by so many political and ethical issues, focusing so much on advertising is like ignoring the entire word 'humanity'.
the internet is not really justified in solving our problems
You see that definition matters, because if the problem were really how to load Facebook pages faster, or how to get more viewers on a YouTube video, then yes, all those internet trends and applications would frankly be working toward solving our issues. But the real deal is different, and that is why most internet applications are going in the wrong direction.
The real problems are food, drinking water, population, education, disease... not how many restaurants have online storefronts, how fast and conveniently you can order pizza online, or millions of recipes in a PDF. The problem is providing enough resources for the billion people lacking basic life resources.
Enough food for the one in nine people on earth, just to make sure they can function. Fighting the global water and waste crisis. Eliminating the risk of malaria for half of the world's population. Taking care of the millions of people with at least one dangerous addiction. You name it.
Once you believe in the real problems of today's humans on this earth, you will see how far technology is from helping us solve those issues. Thousands of scientists all over the world are committed to solving our real issues, but that number is dwarfed by the hundreds of millions of people focused on unreal aspects of life.
Conquering space in search of what we call inevitable for life, H2O, while we lack basic water treatment and a culture of consuming drinkable water, sounds very naive. Every day we make thousands of chemicals and medications for weight loss while a billion people do not have enough food to function. We develop all sorts of online applications, but it is as if we are all blind and deaf: we cannot see or hear what is going on before our eyes.
Technology is actually quite capable of resolving our real issues. Let's define, review, and digest today's issues and force technology to handle them for us. Not as a cybersecurity professional or a programmer, but as a human: my career is to help technology solve humanity's real problems with practical, cheap solutions.
Operations fail by focusing on tools rather than techniques!
In the context of information technology, with all the primary operations like system administration, patching and updating, backup and replication, malware protection, and all related sub-tasks, a focus on tools is an enemy of the process!
Defining, developing, or choosing a technique in advance is crucial to an IT operation. Only then should you find a tool to do whatever the technique dictates, not vice versa. Techniques are in turn backed up and rationalized by objectives and policies, but that is out of the scope of this article.
Techniques → Tools
That is the right flow: reaching for a device, gadget, program, software, application, or script only after knowing the method or routine. In other words, we need to define the way we want to do something (the process) and what is required (the features), and only then go shopping or write code to handle it.
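To make this flow concrete, here is a minimal Python sketch of technique-first tool selection. The required features and candidate tools are invented for illustration; the point is only the direction of the decision, features first, tools second:

```python
# Technique-first selection: the technique dictates the required features,
# and tools are filtered against them -- never the other way around.

def shortlist(required: set, candidates: dict) -> list:
    """Keep only the tools that cover every feature the technique requires."""
    return sorted(tool for tool, feats in candidates.items() if required <= feats)

# Hypothetical backup technique: incremental, encrypted, replicated off-site.
technique_needs = {"incremental", "encryption", "offsite-replication"}

# Hypothetical tools and the features they advertise.
tools = {
    "ToolA": {"incremental", "encryption"},
    "ToolB": {"incremental", "encryption", "offsite-replication", "dedup"},
    "ToolC": {"full-only", "encryption", "offsite-replication"},
}

fit = shortlist(technique_needs, tools)  # only tools that satisfy the technique
```

Starting from the tool list instead would invert the arrow: the surviving "process" becomes whatever the chosen tool happens to support.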
Many IT operations fail by doing this in reverse: finding a tool just by searching the subject, then refining a "forced" process based on what the tool dictates, not what we expected. Sometimes there is no expectation in the first place, which is a sign of an immature IT practice, but that too is beyond this short article.
Everybody is talking about the importance of physical exercise and routine workouts these days, which is of course the result of a 21st-century lifestyle forced on us through technology. But how about some technology exercises and routine practices that can reduce the pressure of the tech staff workload and lead us to a healthier IT environment?
Documentation can easily become a powerful tool and a supportive factor in the everyday dynamics of an IT environment, but only once we realize the application and purpose behind it, in addition to a few simple techniques.
Most of us see documentation as a hassle: an extra, useless job of writing some stuff on paper or in Word and Excel files, giving it a version or revision number, "controlling" it (whatever that means exactly), and then leaving it in a dusty chest, or even ending up with a nonconformity because it does not reflect our real-world processes. What is the purpose? This is why people see documentation as a negative workload: rather than us utilizing it, it utilizes our resources!
The reason the task of "documenting" is seen as a bother by most IT professionals, and even business analysts, is that we are doing it wrong, so no doubt it consumes resources without adding any value. The easiest way to describe right documentation is to explain what it is not. First you should ask:
What is going to be documented, and for what reason?
If the reason is "a management system", "a standard", or "a certification", then the answer is wrong and you are going the wrong way. You should justify it with reasons like "part of the manufacturing process", "describing the asset management system", "explaining why product X failed during evaluation", "documenting how an employee is hired", and so on. Never make the driver the reason.
Documentation is not complicated, but like any other skill, first we need to understand the concept, and then practice. Mastering this skill would not take more than 1% of your daily duties, so let's get to the heart of the matter:
Document the logic and purpose of a task or subject rather than describing its details. In other words, focus on the goal rather than the task. This saves a lot of time wasted on useless information in documents, which is also one of the main reasons users never refer to them later: we waste time creating documents and then force the audience to read them, but they won't, because the content is boring, confusing, and a waste of time, with no added value or even negative value.
Screenshots and step-by-step instructions are usually not what documentation is about. They might be useful in a user manual (and even there I have my doubts), but not in an IT guideline, procedure, work instruction, or policy. Here is an example:
Say you want to document your backup process, your disaster recovery plan, your malware response and handling procedure, how a node is set up and connected to a system in another network segment, how the anti-virus agent is deployed, or any of a thousand other scenarios. Would you open a Word file and start capturing screenshots of each step?!
If that sounds funny to you, you might be doing it the right way. But most IT personnel are so busy that they do not have time to step back and think about how things have been done wrong in the past, so they just repeat the same tools and techniques.
The benefits are endless, and the result, an agile environment, will be appealing!
Once documenting becomes a routine, a regular exercise, the benefits start to show their positive effects in the environment:
Less time spent on documentation, and more effective, useful documents!
Effective corporate communication and team collaboration
Compliance management in a controlled manner
Certainty and confidence in changes: a strong, original change management practice
Faster, more accurate, and more meaningful evaluation of future solutions; in other words, the rebirth of R&D within IT operations, which I believe has been totally forgotten in today's fast-paced tech world
Smooth transitions in staffing, team leadership, and general daily administration
The ability to audit and be audited at any time with zero nonconformities or noncompliance
Better understanding of current processes, and natural, automatic, constant training for tech staff and end users
A trustworthy IT team with full, reasonable support from top management
Smarter internal and external customer relations and interactions
Reduced, or nearly zero, anxiety among help desk staff and system administrators
Support for any future or ongoing management system and any framework that requires documentation: ISO standards, security management systems, and more
Do you need technical people to compile documents?
You need people who understand the logic of the document's subject, so you likely need technical expertise, but not necessarily a technical writer. Of course, technical writers can add value, but that value is not necessarily useful or in line with the purpose of documentation. Again, refer to the user manual example.
The moment you discover the power of documentation as an integral part of your IT management model, you will not let anything be done without it (I have seen this taken too far as well). But the real beauty of it is that it is a useful tool for both management and staff, something that is rare indeed. Stay tuned for an IT Documentation Workshop soon.
In practice, most anonymous internet services only expose your net identity in a different manner, even more obviously and only more noisily!
Long story short, if you are concerned about so-called 'privacy', do not rely on popular net-anonymity techniques and tools. For example, when you connect to a VPN service to hide your real IP address, whether free or paid, you simply enter a smaller, private part of the larger internet that is easier to monitor (which of course does not necessarily mean 'eavesdropping') and easier to track.
Do solutions like VPN really hide our net presence, or do they only make our internet footprint exposure exclusive?
That means tracing every single one of your online movements back to your identity (your unique online presence, your internet footprint or online signature) is much easier and even more precise. And that is without considering the fact that no one can stay 100% hidden forever, or is born hidden: you already have some (a lot of) footprint on the larger public internet, and by browsing via a private IP, for example, you just consciously connect the dots for data brokers. This is not fiction; it is happening in practice through browser cache, cookies, and many other server-side and client-side elements of your online activity.
Net anonymity services simply change the scope of your identity exposure to an exclusive, limited, and restricted environment, which leads to highly precise identification and a better, more realistic version of your internet footprint!
Real net anonymity is not achievable via popular online services. However, there are certain techniques that can be implemented with simple free tools on a paid dedicated host, within a small community: your family, for example, or your friends. In a nutshell, a practical and reasonably anonymous internet identity is technically possible only via a private entity with limited, restricted ownership:
Get a dedicated physical server online, or set one up with dynamic DNS on your current home internet connection; set up a few applications to mimic a fast HTTP and SOCKS proxy, then route your peers to the real world via VPN or P2P protocols. In this case, the footprint belongs only to that small community, and is retained and accessible only within that P2P network. With certain techniques, you can send and receive communications that are completely untraceable to individuals. This can be accomplished if you have an IT guru nearby.
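As a rough sketch of the relay idea, a few lines of Python can splice a peer's connection through a host you control. This is only a minimal TCP forwarder to illustrate the routing, with a made-up example host; a real private proxy for a small community would use hardened, established software and an encrypted transport:

```python
import socket
import threading

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until the source closes, then signal the peer."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    except OSError:
        pass
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def relay(listen_port: int, target_host: str, target_port: int) -> socket.socket:
    """Listen locally and splice each client connection to the target host."""
    listener = socket.socket()
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("127.0.0.1", listen_port))
    listener.listen(5)

    def accept_loop():
        while True:
            try:
                client, _ = listener.accept()
            except OSError:
                return  # listener was closed
            upstream = socket.create_connection((target_host, target_port))
            threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
            threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return listener

# Hypothetical usage: route community peers through your own box.
# relay(8080, "members-only-host.example", 443)
```

The point of the sketch is only that the outside world sees the relay host, not the peers behind it; everything else (authentication, encryption, logging discipline) is where the real work lives.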
How to utilize native Windows security features and go beyond all the tools in the market?!
Most of the time, 'extra tools' just do things in a different way, perhaps more conveniently, but not necessarily in a better, more effective, cheaper, or faster way, and Windows is no exception. Speaking of Windows security features, everything we need is already part of the operating system, either included initially or provided later by Microsoft. There are exceptions, but only when we are looking for a totally different structure, a very unique and extraordinary situation; that is where what we want is beyond Windows' native features and capabilities, and we have to add something to the kernel or extend the API.
Windows Firewall and the power of micro-segmentation, EFS and the power of Windows-native file-level encryption, basic access supervision via powerful kernel-native controls, Windows event monitoring and Sysmon, Group Policy and its world of unlimited capabilities, PowerShell and its unexpected security administration possibilities... many more unleashed Windows features are already there; you just need to utilize them before thinking of buying a new tool!
In the following articles I will explain how to unleash Windows' native security features before shopping for a tool. Even if tools are free, why add anything to Windows when it already comes packed with most of the necessities? Let's get through the basics briefly:
Windows Firewall provides all you need as the cheapest and fastest host-based firewall for Windows, whether the target machine is part of a corporate network, a small office, or a home computer. Most importantly, it is very easy to use as part of your micro-segmentation: with it, you can reach effective filtering and totally eliminate lateral propagation of malware in a large-scale network. If you ask me why administrators ignore Windows Firewall, I have no explanation other than admitting that the beauty of third-party firewalls totally blinds them!
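As a sketch of what firewall-based micro-segmentation can look like, the snippet below generates `netsh advfirewall` commands that block common lateral-movement ports between workstation subnets while still allowing a managed server segment through. The subnets, ports, and rule names are illustrative assumptions, not a drop-in policy:

```python
# Generate Windows Firewall rules for a basic micro-segmentation policy.
# Subnets and blocked ports below are made-up examples.

def segmentation_rules(peer_subnets, server_subnet,
                       blocked_ports=(135, 139, 445, 3389)):
    """Return netsh commands: block peer subnets, allow the server segment."""
    cmds = []
    for port in blocked_ports:
        for subnet in peer_subnets:
            cmds.append(
                'netsh advfirewall firewall add rule '
                f'name="Block lateral TCP {port} from {subnet}" '
                f'dir=in action=block protocol=TCP localport={port} '
                f'remoteip={subnet}'
            )
    # The managed server segment may still reach these ports.
    for port in blocked_ports:
        cmds.append(
            'netsh advfirewall firewall add rule '
            f'name="Allow server TCP {port}" '
            f'dir=in action=allow protocol=TCP localport={port} '
            f'remoteip={server_subnet}'
        )
    return cmds

rules = segmentation_rules(["10.0.1.0/24", "10.0.2.0/24"], "10.0.100.0/24")
```

Generating and reviewing the commands before applying them, for example via Group Policy or a deployment script, keeps the segmentation policy auditable.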
Encrypting File System (EFS) is a powerful file-encryption feature which, surprisingly, has been ignored by the new generation of IT administrators. Perhaps 'encryption' is scary enough that most IT staff decide to rely on colorful third-party tools, but I will show you later how to use EFS as an integral part of your ACLs and take your access supervision to the next level!
We will deep-dive into one of the most effective monitoring extensions for Windows, Sysmon, and see how a couple of extra megabytes can change the scope of the Windows event audit trail. Needless to say, the Windows event log is a quiet piece of intelligence that all those shiny system- and network-monitoring tools rely on, and with a little bit of AI on top, a free SIEM could evolve from it!
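As a small taste of how quiet event-log intelligence can be mined, here is a minimal Python sketch that counts Sysmon events by EventID from an XML export (for example, `wevtutil` output wrapped in a single root element). The sample events are fabricated for illustration:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Windows event XML uses this namespace for every exported event.
NS = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}

def count_event_ids(xml_text: str) -> Counter:
    """Count events per Sysmon EventID in a wrapped XML export."""
    root = ET.fromstring(xml_text)
    counts = Counter()
    for event in root.findall("e:Event", NS):
        eid = event.findtext("e:System/e:EventID", namespaces=NS)
        counts[int(eid)] += 1
    return counts

# Fabricated sample: two process-creation events (ID 1), one network event (ID 3).
sample = """<Events xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <Event><System><EventID>1</EventID></System></Event>
  <Event><System><EventID>1</EventID></System></Event>
  <Event><System><EventID>3</EventID></System></Event>
</Events>"""

counts = count_event_ids(sample)
```

A few dozen lines like this, scheduled against real exports, is already the seed of the "free SIEM" idea: collection, normalization, and counting come before any fancy correlation.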
The point is, Windows has enough native tools to touch almost anything you want in terms of security, and for some hidden tiny tweaks we can always go into the Registry. At the very least, we will not have to worry about the extra vulnerabilities that come with introducing new tools to the environment. So why not get more familiar with the operating system and get the maximum benefit from its native security features and capabilities? Then, if some day you have a very specific requirement that Windows cannot provide, you can consider third-party tools or even switching to a whole new operating system!
From struggling, or barely surviving, to a fully supervised and manageable security program...
Most companies struggle to run a smooth security program. No matter how much they spend, the difference is really not that much: from zero-budget shops to million-dollar security budgets, they still do not have enough trust in their security program. Regardless of how much they spend on security initiatives, they never really have confidence, and they do not expect to see a positive, reliable result from their investment.
Adversaries find new ways to hurt online businesses every single minute, while tech gurus create "solutions" to address today's challenges months, years, and sometimes decades after the fact!
We are simply trying to survive in cyberspace. What could we do to thrive instead, from a stronger position, where security is no longer a hassle, nor the heart of the matter? Let's look at some practical countermeasures:
* stick to a management system
For a moment, forget about technology and tools and get back to basics. A management system can fulfill whatever you need in terms of handling processes, freeing you from worrying about the foundation of your operation. You can reinvent the wheel, or you can choose from thousands of management systems; but first ask experts which system fits your needs, or bring a professional on board to implement a system fully customized to your workflow from scratch. Believe me: without a management system of some kind, after years of spending your precious time you will still be at the first step, in that "surviving" mode.
* set objectives
Any project has a set of measurable and achievable goals. Objectives are not things like "having a more secure network" or "setting up RDP filtering on firewalls"; they are more like "reducing the current number of entry points to the network" or "assessing current remote protocol usage". Objectives help you better understand what you are trying to get from your management system and where resources have to be focused. This topic is also related to the risk approach, which I think is the fundamental background of the whole program.
* constantly measure
These are checkpoints where you can tell precisely, with evidence, whether you are on track. The more you measure and automate the process, the closer you get to a proactive system, and the faster you accomplish each objective. Through measurement you can tell whether the direction is right or wrong, and what exactly is wrong or right.
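A measurement checkpoint can be as small as a comparison against the objective's target. The sketch below uses invented objectives and numbers purely for illustration:

```python
# A checkpoint: compare a measured value against its objective's target.
# The objective names and figures below are illustrative assumptions.

def checkpoint(objective: str, target: float, measured: float,
               lower_is_better: bool = True) -> dict:
    """Report whether a measurement shows the objective is on track."""
    on_track = measured <= target if lower_is_better else measured >= target
    return {"objective": objective, "target": target,
            "measured": measured, "on_track": on_track}

status = checkpoint("open RDP entry points to the network", target=2, measured=7)
```

Running such checkpoints on a schedule, and logging their results, is what turns measurement into evidence rather than opinion.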
* plan for corrective and preventive actions
With each measurement, you need to define corrective actions if the result is not as expected or the pace is slow. A plan to enforce those corrective actions is the key to a smooth security program; otherwise you will be struggling with past actions while new ones arrive.
* be responsive to facts, not fictions
The computer security industry is full of fictions, and we mostly spend time and money on things that are either not important or could be tackled at the root. Let me give you an example:
Taking Advil when you catch the flu is just a painkiller, a way to pass the time without suffering from flu symptoms, just to survive, because we do not know how to handle the flu virus in the 21st century (or maybe we know but we don't want to disclose it?!). That is similar to running a virus scan on your network when you get a virus infection!
Cybersecurity facts have not changed since the beginning of this subject in human history, so once you know the facts, you will see how easy it is to address them without Advil!
No doubt companies struggle with information security these days. Today they spend hundreds of thousands of dollars, some spend millions; tomorrow they realize they have done nothing! Security folks do not sleep peacefully at night, because they know what they built during the day could easily be compromised!
Setting aside the question of why we spend money without certainty or confidence in the expected outcome, why are solutions really getting more and more useless and ineffective? The answer is the hidden monster behind all the insecurities in information technology: the complexity beast!
Complex systems introduce complex workflows, which are prone to intensive security flaws!
Complicated systems (which are also prone to insufficiency) introduce complex workflows and a model naturally prone to more flaws: more attack surface and more attack vectors, with combined magnitude and even unexpected, newly evolved ways of attack. This is not fiction; this is the dynamic of today's cybersecurity trend. You hire, you purchase, you train, you consult... you do your best, and still you are not confident, because your neighbor company just had a breach, and you would be even more scared if you had pro-level visibility and could see how malicious actors are already in-house!
Tradition has already delivered its message of proven outcomes, although the market hesitates to listen, let alone follow!
Fancy systems are also more attractive to adversaries, and for good reason: they know the chance of finding a flaw is exponentially higher in a fancy, colorful IT infrastructure than in a clunky system. The worst part is that customers of that fancy information system do not necessarily get better services or goods, even though they pay more for them, and they also stand to lose more because of the complex backend; but that is another story with its own sad ending.
Complex software and hardware build complex systems
Complex systems are built around complex software, hardware, and literally a complex IT setup where a given goal is accomplished through a complicated workflow; this is either the result of poor design, or simply excess resources assigned where they are not needed at all. There are millions of examples around you, or better, start with your own business or the department you are managing:
Do you think all businesses need the Windows platform to run their applications?
Do you think you use even 20% of Outlook's features and capabilities?
Do you think most website owners need PHP rather than simple HTML?
Have you ever walked into your company's server room and asked your IT guy why things are set up that way?
Have you ever tried simpler software versus the one with more features?
Have you ever shopped based on what you need versus what has higher review scores?
Those are just goofy questions, meant to light the real flame inside you that makes you ask yourself: should I really, totally trust the people running my IT infrastructure, or could I use my common sense and just question why I need these complex systems? What workflow does my business really need, and what simple system is out there to support that workflow, regardless of what the market is pushing me to buy?
A complex system setup puts us in even more trouble when we start securing it with the same complicated mindset; that is how we can end up with more insecurity after spending on and relying on sophisticated security solutions. Experience has shown, and proven, that the simplest way to address security is to design and implement a simple system. A straightforward workflow is naturally secure, or at least easier to secure with free or cheap security solutions that are easier to maintain, manage, and run; the outcome is more secure, cheaper, more reliable, and more efficient.
Use the following checklist to make sure you are on the right track to choosing your first or next SIEM solution. The whole process takes one to four weeks, depending on your dedication and vendor availability. Remember, the worst thing is to rush any of these steps:
Write a plan
Write down all the steps you anticipate, and maintain documentation and track progress during all stages. Set rough deadlines and start communicating with stakeholders.
Justify the need
This will help you get a better understanding of your criteria later, but generally, I would recommend it for any type of information security project. This step assures you that you are not getting a SIEM just because it is out there, or just because you have some spare money and other resources to spend.
Ask yourself, your IT team, or even the manager who assigned you the task: why do we need a SIEM? What type of problem is it going to solve? What will our world look like after the SIEM? Is this for the sake of compliance, customer expectation, market urge, or as an enhancement to the visibility of your environment?
Scoping gives you more understanding of the environment. Specifically with a concept like SIEM, the moment you start thinking about scope, you realize how far behind you might be in preparing your environment.
Budget based on Risk
Budgeting based on your pocket is like deliberately overeating when you know it is bad for you. Budgeting without risk consideration will neutralize all the other steps. I consider this step so fundamental that ignoring it shows there is no understanding of the whole subject of information security within an organization.
A simple risk assessment can give you the right budget, but unfortunately that assessment usually does not exist, so you have to create something from scratch just to support SIEM budgeting. We need to assess the risk of not having enough visibility and detection in certain areas of IT operation and evaluate the risk factors. Once you start this process, you will realize how many SIEM solutions on the market right now are naive and designed with a narrow vision.
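To illustrate the direction, here is a naive Python sketch that splits a budget in proportion to expected loss (likelihood times impact). The risk items, probabilities, and dollar figures are fabricated assumptions; a real assessment would be far richer:

```python
# Naive risk-based budgeting: spend in proportion to expected loss.
# Every risk item, likelihood, and impact figure below is made up.

def allocate_budget(risks: dict, total_budget: float) -> dict:
    """Split total_budget proportionally to likelihood * impact per risk."""
    exposure = {name: lik * imp for name, (lik, imp) in risks.items()}
    total = sum(exposure.values())
    return {name: round(total_budget * e / total, 2)
            for name, e in exposure.items()}

risks = {
    "no visibility into endpoint activity": (0.6, 500_000),
    "unmonitored remote access":            (0.3, 800_000),
    "missing log retention":                (0.1, 200_000),
}
budget = allocate_budget(risks, 100_000)
```

Even this toy version forces the right conversation: the money follows the exposure, not the size of the wallet.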
During the previous steps you should have been able to compile a list of criteria. The more precise the criteria, the easier the initial vendor selection. Without criteria there is no point in even browsing a vendor's website; with criteria in hand, you can easily check them off in the next step.
Your list should cover things like: the primary objective (compliance, risk, or threat management), architectural questions (managed or self-hosted, on-premises or cloud), interface and performance, types of log and data collection, integration considerations, correlation capabilities, intelligence feeds, remediation and response, and so on.
Identify targeted platforms
First you need to list SIEM vendors; there are tons of them out there. Do not assume a good SIEM is a matter of how long a company has been in the business or how well the brand is known. Vendor reputation is a big factor and can be part of your criteria, but do not confuse reputation with brand: not all known brands are necessarily better. Research and learn from vendors, and read all the white papers they provide. If they are not willing to share them on their website, that is not a good sign, but don't be discouraged; go for a meetup. Here is a starting list of vendors/solutions:
…remember, all of them are good and all of them are bad; it depends on your criteria.
Meet up with the vendor
Nothing is better than a short call; if you get a good signal, go for a video presentation and have them run a demo. Never direct the vendor; let them manage the meeting and content, listen to their questions, and start your evaluation from the first call. Most vendors do not reveal anything alarming over email or regular phone calls, so insist on a demo and on meeting their technical team. Ask about your criteria, but in the meantime listen to what they reveal and how. Depending on your situation, you may be more focused on how they execute or help you set up and run on-premises.
Trials are the best time and tool for evaluation, and they are also a sign of how comfortable and confident a vendor is. I personally would not even consider a solution if the vendor is not willing to give me a chance to try it. Trials are not just for finding glitches; they are mainly for refining your criteria and turning expectations into real-world scenarios. Always let a vendor know if you go with a different one; you never know when you will need to call them again, so be professional and respect marketing manners.
Now it is time to evaluate: materials, meetings, and trials. Most of the time you get the answer within the first three or four days of a trial. Justify any compromise on predefined criteria, and never hesitate to redefine or refine new ones, but never skip the justification. Sometimes you have to re-assess a risk in order to revise your criteria.
Jumping into implementation without preparing your environment is not a good idea. Now is the time to go through all the details and technical requirements you should have planned for during scoping. Prepare VMs, cloud apps, and the smallest things like SNMP and Windows Event Forwarding; this is the time for your technical team to show off. You should not have any problems if the scoping was rational, but most companies hit multiple issues at this stage because of a lack of scoping early on.
Meet the deadline and kick off; this is going to be a big milestone for your IT security operation.
Noise is the nature of SIEM, so plan to tune it based on the size of the company and/or the scope of the system. This should be part of your baselining process anyway (if you have one); without proper baselines, your team will be confused and stressed for a long time.
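Baselining can start very simply: compare today's event volumes with the baseline averages and flag what deviates. The event names, counts, and the 3x threshold below are illustrative assumptions:

```python
# Flag event types whose daily count deviates from the baseline.
# Event names, counts, and the 3x factor are made-up examples.

def flag_anomalies(baseline: dict, today: dict, factor: float = 3.0) -> list:
    """Return event types that are new or exceed factor * baseline count."""
    flags = []
    for event, count in today.items():
        expected = baseline.get(event, 0)
        if expected == 0 or count > factor * expected:
            flags.append(event)
    return sorted(flags)

baseline = {"failed_logon": 120, "service_install": 2, "dns_nxdomain": 900}
today = {"failed_logon": 150, "service_install": 9, "powershell_encoded": 4}
suspicious = flag_anomalies(baseline, today)
```

The value is in the contrast: 150 failed logons can be routine while 9 service installs, or any never-before-seen event type, deserves a look.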
What happens after this is what you should have anticipated during your planning phase: whether your team is going to tackle other tasks alongside the SIEM, or the SIEM will run independently, or... it all depends on your plan. Never accept something from a solution or vendor as a 'want' or a nice-to-have unless there is an actual 'need' for it.
Stay tuned for an explanation of a fully native, free SIEM (security information and event management) system: a solution for 80% of environments!
Would you add more complex firewall rules when internal nodes are vulnerable due to an insecure default setup, or set up numerous security tools while deploying more and more insecure nodes at the same time?!
the mechanics and dynamics of hacking are blurry to the typical IT guru…
Are hackers ahead of the entire IT security industry?
Why was the balance between the two parties shifted so long ago?
What made such a big gap, when there was no such huge difference in the '90s?
It is a false hope to believe that the sunny side of cyberspace is controlling the cyber-planet! Malicious hackers are way ahead, and that is why we spend so much time on safety rather than focusing on the legitimate needs of cyber-society!
Many factors are involved in hackers' supremacy: knowledge (original or fake), intelligence, teamwork with a genuine sense of community, the nature of the operation, the goal and its outcome (destructive or constructive), originality of source code... but I have noticed one factor that matters most, something that took the hacker community to a totally different level of control and changed the balance between Jedi and Sith forever: commitment! Hackers are simply more committed to doing their job!
We send our top IT talent to learn hands-on hacking techniques, encourage IT administrators to deep-dive into the dark web, and have the whole company crew learn security essentials, and still it takes one man to bring an entire company's technology infrastructure to its knees, all because the mechanics and dynamics of hacking are blurry to the typical IT guru. Here is an analogy to the human body: consuming more and more vitamins in the hope of physiologically healthier cells while the body is creating cancerous cells. That is ignoring the root cause and chasing a fix without considering the symptoms! And the result is misleading, because even with cancer, vitamin C still has a positive effect on the patient!
focusing on defining more complex firewall rules, versus not setting up vulnerable nodes with insecure default configurations!
Software, every piece of code, is the foundation of any modern computerized system (basic, huh?), and that is where we have the problem: creating vulnerable code in the first place. That is where "commitment" enters the equation: the software community wants to release, in a rush, with limited-to-zero knowledge of security, dealing with very high-level and complex APIs, no testing, immature or illogical software development processes, no code review... but hackers are committed to reviewing developers' code for them, and they find those cancerous cells inside the body of the software!
Even worse, while hackers are committed to finding and exploiting those software flaws, developers are committed to releasing newer versions with more focus on functionality than on fixing the foundation. No doubt it is tedious and sometimes impossible: if the flaw is in the design, there is no time for the developer to step back and fix something natively insecure, to the point that developers sometimes prefer to leave the insecure code behind entirely and go for a brand-new baby codebase, where they fall into the same illogical development process, or even reuse boilerplate code from previous practice (most likely insecure artifacts).
focusing on setting up numerous security tools in an environment, versus no longer adding insecure nodes to the same environment!
Code review is the best way to get ahead of hackers, and of course it is the software developers' mission to culturize and popularize the practice at the earliest stage of coding. As for IT administrators, they need to fully understand the mechanics of the software they are using. Remember that today's IT crews are more like software operators, so it is reasonable to expect operators to be fully aware of the machine they are driving.