6. Security vs. Performance As firewall inspection moved higher up the OSI layers and tracked more connection characteristics, culminating in the proxy firewall, the performance penalty became significant.
7. A Requirement for a Second Layer of Security The need was clear for a new network security device (assembling fragmented packets, inspecting traffic for specific patterns, deeper inspection of the upper layers, …). That device would provide a second line of defense after the firewall's primary filtering, and it should also preserve optimal network performance.
8. Intrusion Detection Systems 1980: James Anderson published the first recognized IDS paper, "Computer Security Threat Monitoring and Surveillance". 1990: Commercial development of IDS began. 1998: Snort was released. 1998: Commercial development of IPS began.
12. Host-Based Advantages: Effective against end-to-end encrypted network communication. Deters insider attackers as well as detecting them. Easy to determine the number of hosts attacked. Disadvantages: Harder to collect data as the number of hosts increases.
13. Network-Based Advantages: Ability to analyze a wide range of application protocols. Strong response against outsider attacks. Disadvantages: Weak damage assessment capabilities. Weak detection of malicious insider activity.
17. Bibliography Salehi, Hamed; Shirazi, Hossein; Askari Moghadam, Reza. Increasing overall network security by integrating Signature-Based NIDS with Packet Filtering firewall. s.l.: International Joint Conference on Artificial Intelligence, 2009. Baskerville, Phil. Intrusion Prevention Systems: How do they prevent intrusion? s.l.: University of Otago, Dunedin, 2006. Innella, Paul. The Evolution of Intrusion Detection Systems. [Online] 2001. http://www.symantec.com/connect/articles/evolution-intrusion-detection-systems.
Speaker notes
Network attack tools are increasingly easy to use and widespread; with tutorials and videos available on the internet, it is very easy for an amateur to launch attacks (a few examples: nmap, Metasploit, and Fast-Track). For the professional and motivated hacker, who develops his own tools, the attacks can be that much more effective. There is therefore a growing need for organizations to implement security policies and network protection solutions.
The first generation, introduced in 1985 [1], is called packet filtering firewalls. It uses filtering rules to determine the validity of network traffic. The rules are defined on the basis of factors like source IP and port, and destination IP and port.

Circuit-level firewalls were the second generation, introduced around 1990. This kind of firewall is used for TCP connections. They examine each connection setup to ensure that it follows a legitimate "handshake" for the TCP connection being used. The circuit-level firewall then checks its records to make sure that the sender is allowed to send to the receiver and the receiver is allowed to receive from the sender.

The third generation was the application layer firewall, which is an intermediary between the internal network and the Internet [4]. It can act as a proxy server or a proxy client. The firewall takes the intended communication from the first computer and directs it to the second one [3]. Inbound connections are always made with the proxy client, while outbound connections are always made with the proxy server. There is no direct connection between the internal network and an insecure network.

The next development in firewall technology was the introduction of dynamic packet filtering firewalls, which were closely related to stateful inspection firewalls. Because of the cost and performance disadvantages of application layer firewalls, and the lack of security surrounding packet filtering, a new method was developed, called stateful inspection. It is important to note that these firewall generations overlapped in functionality over the course of this evolution.

The current firewall technology is classed as kernel proxy firewalls. This technology evaluates packets at multiple layers of the protocol stack in the proxy server, and it is similar to the application layer firewall in its use of proxy servers.
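The first-generation behavior described above can be sketched in a few lines. This is a minimal illustration, not any real firewall's implementation; the rule fields, wildcard convention, and default-deny policy are assumptions chosen for clarity.

```python
# Hypothetical sketch of first-generation packet filtering: rules match on
# source/destination IP and destination port, first match wins.
from dataclasses import dataclass

@dataclass
class Rule:
    src_ip: str      # "*" matches any address (illustrative convention)
    dst_ip: str
    dst_port: int    # -1 matches any port
    action: str      # "allow" or "deny"

def matches(rule: Rule, src_ip: str, dst_ip: str, dst_port: int) -> bool:
    return ((rule.src_ip in ("*", src_ip))
            and (rule.dst_ip in ("*", dst_ip))
            and (rule.dst_port in (-1, dst_port)))

def filter_packet(rules, src_ip, dst_ip, dst_port, default="deny"):
    # First matching rule decides; otherwise fall back to default-deny.
    for rule in rules:
        if matches(rule, src_ip, dst_ip, dst_port):
            return rule.action
    return default

rules = [
    Rule("*", "10.0.0.5", 80, "allow"),  # web server reachable from anywhere
    Rule("*", "*", 23, "deny"),          # block telnet outright
]
print(filter_packet(rules, "198.51.100.7", "10.0.0.5", 80))  # allow
print(filter_packet(rules, "198.51.100.7", "10.0.0.5", 23))  # deny
```

Note that every decision here looks at one packet in isolation; nothing in the rule table records connection state, which is exactly the gap the later circuit-level and stateful generations filled.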
Through the evolution of firewalls, we notice that they moved from examining the characteristics of a single packet to following entire conversations between client and server and inspecting the upper layers of the OSI model. Yet as that trend continued, it took a toll on performance.
Alerts from an IDPS fall into four categories: a true positive (an alarm raised for a real network attack), a false positive (an alarm raised for normal traffic), a false negative (no alarm for a real network attack), and a true negative (no alarm for normal network traffic).
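The four outcomes above can be captured in a tiny decision function; the name `classify_alert` and its boolean inputs are illustrative, not standard IDPS terminology.

```python
# Sketch of the four alert outcome categories described above.
def classify_alert(alarm_raised: bool, real_attack: bool) -> str:
    if alarm_raised and real_attack:
        return "true positive"    # alarm for a real attack
    if alarm_raised:
        return "false positive"   # alarm for normal traffic
    if real_attack:
        return "false negative"   # missed a real attack
    return "true negative"        # silent for normal traffic

print(classify_alert(True, False))  # false positive
```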
A signature is a pattern that corresponds to a known threat. Signature-based detection is the process of comparing signatures against observed events to identify possible incidents.5 Examples of signatures are as follows: a telnet attempt with a username of "root", which is a violation of an organization's security policy; an e-mail with a subject of "Free pictures!" and an attachment filename of "freepics.exe", which are characteristics of a known form of malware; an operating system log entry with a status code value of 645, which indicates that the host's auditing has been disabled.

Anomaly-based detection is the process of comparing definitions of what activity is considered normal against observed events to identify significant deviations. An IDPS using anomaly-based detection has profiles that represent the normal behavior of such things as users, hosts, network connections, or applications. The profiles are developed by monitoring the characteristics of typical activity over a period of time. For example, a profile for a network might show that Web activity comprises an average of 13% of network bandwidth at the Internet border during typical workday hours. The IDPS then uses statistical methods to compare the characteristics of current activity to thresholds related to the profile, such as detecting when Web activity comprises significantly more bandwidth than expected and alerting an administrator of the anomaly.

Stateful protocol analysis is the process of comparing predetermined profiles of generally accepted definitions of benign protocol activity for each protocol state against observed events to identify deviations.6 Unlike anomaly-based detection, which uses host- or network-specific profiles, stateful protocol analysis relies on vendor-developed universal profiles that specify how particular protocols should and should not be used.
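The first two detection styles above can be sketched side by side. This is a toy illustration under stated assumptions: the signatures, the regular expressions, the baseline figures, and the three-standard-deviation threshold are all made up for the example, and real engines such as Snort use far richer rule languages.

```python
# Sketch of signature-based vs. anomaly-based detection.
import re
from statistics import mean, stdev

# Signature-based: match observed data against known-bad patterns
# (patterns here are illustrative, echoing the examples in the text).
SIGNATURES = {
    "telnet root login": re.compile(r"login:\s*root"),
    "freepics malware":  re.compile(r"freepics\.exe"),
}

def signature_alerts(payload: str):
    return [name for name, pat in SIGNATURES.items() if pat.search(payload)]

# Anomaly-based: compare current activity against a learned baseline profile,
# flagging values more than k standard deviations above the baseline mean.
def is_anomalous(current: float, baseline: list, k: float = 3.0) -> bool:
    return current > mean(baseline) + k * stdev(baseline)

web_share = [12.0, 13.5, 13.0, 12.5, 14.0]  # % of bandwidth on typical days
print(signature_alerts("login: root"))       # ['telnet root login']
print(is_anomalous(45.0, web_share))         # True
print(is_anomalous(13.2, web_share))         # False
```

The contrast is the point of the sketch: the signature check can only flag threats someone has already written a pattern for, while the anomaly check can flag novel deviations but says nothing about what the deviation actually is.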