Friday, October 9, 2009
Journal 6 - Database Encryption
What is the value in encrypting your Data?
By Rich Mogull and Adrian Lane
This article discusses how to decide whether or not to encrypt data, illustrated with four cases.
Case 1 – In establishing a disaster recovery plan, server data is duplicated or backed up to an external location, raising the question of whether those copies should be encrypted.
Case 2 – Concerns about the loss of sensitive company information when lab equipment is stolen prompt consideration of encrypting data at rest.
Case 3 – Compliance with PCI-DSS guidelines, where customer billing, credit card, and password information is stored in a database.
Case 4 – Key management: the encryption keys themselves are often overlooked and left unprotected.
http://securosis.com/tag/database+encryption
Network Security Toolkit (NST) Live CD
The Network Security Toolkit (NST) is a bootable ISO live CD/DVD toolkit. It was designed to provide easy access to best-of-breed open source network security applications. I chose to write about NST because it bundles a comprehensive set of the top 100 security tools along with an advanced Web User Interface for system administration, navigation, automation, and configuration of network security applications. NST can transform most x86 machines into a system designed for network traffic analysis, intrusion detection, network packet generation, wireless network monitoring, virtual session serving, or sophisticated network/host scanning. It takes less than a minute to get going by simply booting the NST Live media. It is an excellent tool for crash recovery, troubleshooting scenarios, and diagnostics.
In September 2009, NST announced its latest release, v2.11.0, with a new design. NST Live can be installed to a USB device to create an NST Live USB disk. The NST project keeps looking for ways to improve the product and does not stand still when it comes to the latest technology. Since NST is an open source security application, there are various support organizations that can help users take advantage of all of its capabilities. It helps organizations configure technical security controls for prevention, detection, and overall security administration.
The NST toolkit can help support the mission of the organization by protecting its physical and financial resources, reputation, legal position, employees, and other tangible and intangible assets. It is very cost-effective, since it is open source and well known in the industry. The toolkit can help organizations support their policies for managing a computer security program, risk management, business continuity and disaster recovery planning, awareness training, and physical and environmental security, among others.
Some of the security tools that the NST toolkit provides are as follows:
Wireshark, Multi-Tap, Network Packet Capture, Nessus, Snort, Nmap, Ntop, Kismet, Netcat, Hping2, Tcpdump, Cain and Abel, John the Ripper, Ettercap, Nikto, THC Hydra
http://sourceforge.net/support/getsupport.php?group_id=85467
http://www.networksecuritytoolkit.org/nst/index.html
NIST SP 800-30 & NIST SP 800-14
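To give a flavor of the traffic analysis these tools perform, here is a minimal capture sketch in Python, assuming the third-party scapy library; it is an illustration of the idea, not a replacement for Wireshark or Tcpdump:

```python
# Minimal packet-capture sketch using scapy (a third-party library).
from collections import Counter

from scapy.all import IP, sniff

talkers = Counter()

def tally(pkt):
    """Count packets per source IP address as they are captured."""
    if IP in pkt:
        talkers[pkt[IP].src] += 1

# Capture 100 packets (requires root privileges), then report top talkers.
sniff(count=100, prn=tally, store=False)
for src, n in talkers.most_common(5):
    print(f"{src}: {n} packets")
```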
Friday, October 2, 2009
Increase Security through Open Source?
Is it not logical that closing the door on your security system would reduce risk and leave fewer vulnerabilities exposed? Would you not reduce the likelihood of a successful attack with less exposure in a closed software system? The question, when considering security risks in your system, is which is better to use: open or closed source?
There has to be an understanding of the relationship between the security of a system, the exposure of the system, and the risk associated with using it. Risk is defined as a combination of the likelihood of a successful attack and the damage resulting from it. The exposure of a system is not just whether hackers can get into the system, but whether they know its vulnerabilities and whether the system is a high-profile target. How secure the system is depends on the number and severity of its vulnerabilities.
Closed source denies the attacker easy access. However, it is well known that hackers take this as a challenge and do not stop until they get into a closed-source system, where they can create havoc. One of the major problems is that the producers of closed-source software are the only ones who can create patches for vulnerabilities that have been compromised, and it can take them weeks or months to release those patches. In the meantime, users remain vulnerable. Attackers share their findings with other hackers and with the public over the Internet, eventually creating even more damage for the victim.
An open-source system does expose its code to the public, which actually puts potential victims on guard: they know they must install preventive software patches to protect themselves. This is a good thing, because open-source users help each other by contributing patches to a central repository. There is a network effect: users can find more patches, faster, to resolve their problems quickly, and they can add extra security measures of their own. Evidence suggests that patches for open-source software are released almost twice as fast as for closed software, cutting the vulnerability window roughly in half. If users cannot patch a bug themselves, open source lets them communicate about bugs with developers more efficiently. As a side effect, openness stimulates research and development in new, improved tools for software development, testing, and evaluation. In the long run, openness of the source will increase its security.
Data Remanence - Journal 5
I selected the article “How can DRAM remanence compromise encryption keys?” by Michael Cobb at SearchSecurity.com because it discusses attacks on random access memory, the next step beyond data or disk remanence vulnerabilities. In disk remanence, the concern is data that has been erased but still exists on hard drives. This article discusses the concern that encryption keys linger in RAM after the computer is turned off. The magnitude of the risk is unknown because it is an emerging threat, and hackers will not publish their findings, at least not yet. The article also advises readers on basic defenses such as physical prevention and training in awareness of the latest risks in RAM and disk remanence.
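Tools that hunt for keys in memory images exploit the fact that key material looks random. As a crude, hedged illustration (published cold-boot tools instead verify AES key-schedule structure, which is far more precise), this sketch flags high-entropy regions of a memory dump; the filename is hypothetical:

```python
import math
from collections import Counter

def shannon_entropy(block: bytes) -> float:
    """Bits of entropy per byte of the block."""
    counts = Counter(block)
    n = len(block)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def candidate_key_regions(image: bytes, window=256, step=64, threshold=7.5):
    """Yield offsets whose window looks random enough to be key material.

    Random key bytes approach 8 bits/byte; code and text score much
    lower. This heuristic produces false positives (compressed or
    encrypted data also scores high) -- it only narrows the search.
    """
    for off in range(0, len(image) - window, step):
        if shannon_entropy(image[off:off + window]) >= threshold:
            yield off

# 'memory.dump' is a hypothetical RAM image acquired after a cold boot.
with open("memory.dump", "rb") as f:
    image = f.read()
for off in candidate_key_regions(image):
    print(f"possible key material near offset {off:#x}")
```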
Monday, September 28, 2009
Wireless Infidelity
1. What is War Driving?
Answer:
War driving is scanning for wireless networks, typically from a moving vehicle, with the unlawful or unethical intent of intruders exploiting them for gain or profit.
2. What is Wired Equivalent Privacy (WEP)?
Answer:
A security protocol for wireless local area networks defined in the 802.11b standard. WEP aims to provide security by encrypting data over radio waves so that it is protected as it is transmitted from one endpoint to another.
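The weakness behind WEP's reputation is concrete enough to sketch. Each packet's RC4 key is a 24-bit initialization vector (IV) prepended to the shared key, and with only 2^24 possible IVs they inevitably repeat; two packets encrypted under the same IV then leak the XOR of their plaintexts. A toy demonstration (keys and messages are made up):

```python
def rc4_keystream(key: bytes, n: int) -> bytes:
    """Generate n bytes of RC4 keystream for the given key."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # key scheduling (KSA)
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for _ in range(n):                         # output generation (PRGA)
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def wep_encrypt(iv: bytes, shared_key: bytes, plaintext: bytes) -> bytes:
    """WEP-style per-packet encryption: RC4 keyed with IV || shared key."""
    ks = rc4_keystream(iv + shared_key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

iv = b"\x01\x02\x03"            # 24-bit IV, sent in the clear
key = b"secret-wep-key"         # toy shared key (illustration only)
c1 = wep_encrypt(iv, key, b"transfer $100 to A")
c2 = wep_encrypt(iv, key, b"meet at the office")
# Same IV => same keystream, so XOR of the ciphertexts equals XOR of
# the plaintexts, and no key is needed to start recovering contents.
leak = bytes(a ^ b for a, b in zip(c1, c2))
print(leak)
```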
3. What was the lesson learned from War Driving?
Answer:
A person who is war driving has a very high possibility of being prosecuted, because judges are now willing to accept evidence of intent as sufficient to put someone behind bars.
Friday, September 25, 2009
Wireless Identity Thieves
The article “Wireless Identity Thieves” really made me realize how easy it is to access wireless networks, not just your neighbors' wireless devices but also wireless networks belonging to major corporations. Nationwide corporations like Lowe's, TJ Maxx, and Marshalls apparently had open wireless networks that allowed criminal hackers into networks that carried credit card information; the attackers eventually stole anywhere from 50 million to 200 million credit card numbers. Even though there is no guarantee of safety, it is recommended that you encrypt your wireless network, whether at home or at the office, and, if possible, restrict your wireless to only the MAC addresses of devices you own. Layering WEP, WPA, or WPA2 with MAC address permissions and hiding your SSID can further keep your home network safe (a small sketch of MAC filtering follows the links below).
http://reviews.cnet.com/4520-3513_7-6733602-1.html
http://www.wired.com/science/discoveries/news/2006/07/71358
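To make the MAC-address restriction concrete, here is a minimal sketch of the allow-list check an access point performs; the addresses are invented, and real access points implement this in firmware:

```python
# Hypothetical allow-list, as configured in a router's admin page.
ALLOWED_MACS = {
    "00:1a:2b:3c:4d:5e",   # home laptop (made-up address)
    "00:1a:2b:3c:4d:5f",   # desktop (made-up address)
}

def admit(client_mac: str) -> bool:
    """Return True if the client may associate with the access point."""
    return client_mac.lower() in ALLOWED_MACS

print(admit("00:1A:2B:3C:4D:5E"))  # True  -> known device
print(admit("66:77:88:99:aa:bb"))  # False -> association refused
```

MAC filtering stops casual association only; a determined attacker can observe a valid MAC address on the air and spoof it, which is why the article layers it with encryption.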
Taxonomy Attacks - Scary
Denial of Service and Distributed Denial of Service attacks may have one of two intentions: to exhaust the resources of the targeted host, or to exhaust the bandwidth of a particular link. The bandwidth attack is called a flooding attack; it works by sending a flood of packets down a link that was provisioned for only a certain amount of bandwidth according to the organization's information system needs.
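Rate-based detection is one common response to flooding. The sketch below is a simplified illustration only (the threshold and traffic are made up; real sensors do far more): it counts packets per source over a one-second window and flags sources that exceed a threshold.

```python
import time
from collections import Counter

WINDOW_SECONDS = 1.0
THRESHOLD = 1000          # packets/second per source; tune to the link

class FloodDetector:
    """Flag sources whose packet rate exceeds a per-window threshold."""

    def __init__(self):
        self.window_start = time.monotonic()
        self.counts = Counter()

    def observe(self, src_ip: str) -> bool:
        """Record one packet; return True if src_ip looks like a flooder."""
        now = time.monotonic()
        if now - self.window_start >= WINDOW_SECONDS:
            self.counts.clear()             # start a new window
            self.window_start = now
        self.counts[src_ip] += 1
        return self.counts[src_ip] > THRESHOLD

detector = FloodDetector()
# Feed in a simulated burst from a single source.
for _ in range(1500):
    flooding = detector.observe("203.0.113.9")
if flooding:
    print("203.0.113.9 exceeded the per-second packet threshold")
```

Note that a truly distributed attack spreads traffic across thousands of sources, defeating per-source counting, which is exactly why the article concludes that tracing and reporting, rather than prevention, is often the best a victim can do.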
Boycott Novell was hit with DDoS attacks in May of 2009. What was interesting about this article is that it states that, whatever you do, you will not be able to fully protect yourself from these attacks. The best you can do is try to trace where they are coming from and report them to the authorities. That is pretty scary for businesses that do all of their business through their web pages and the Internet.
http://blogs.computerworld.com/burying_the_truth_boycott_novell_hit_by_denial_of_service_attack?page=1
http://practical-tech.com/network/brace-yourself-ddos-attacks-ahead/
Recommended Security Controls for Federal Information Systems and Organizations
The fundamental concepts associated with security control selection and specification are: the structure and organization of the security controls, security control baselines, the identification and use of common security controls, security controls in external environments, security control assurance, and future revisions to the security controls. The process of selecting and specifying security controls for an organizational information system includes applying the organization's approach to managing risk, categorizing the information system and determining the system impact level, selecting the baseline, and assessing the security controls as part of a comprehensive continuous monitoring process.
THE FUNDAMENTALS
Security Control Organization and Structure - In the security control selection and specification process, controls are organized into seventeen families. Table 1.1 of Special Publication 800-53 [2.1] (linked below) lists the identifier, family name, and class for each.
http://csrc.nist.gov/publications/nistpubs/800-53-Rev2/sp800-53-rev2-final.pdf
Security Control Baselines - To assist organizations in making the appropriate selection of security controls for an information system, the concept of baseline controls is introduced, in accordance with FIPS 199 and FIPS 200 (see the URLs below).
http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.pdf
http://csrc.nist.gov/publications/fips/fips200/FIPS-200-final-march.pdf
Common Controls - The organization assigns responsibility for common controls to appropriate organizational officials and coordinates the development, implementation, assessment, authorization, and monitoring of the controls.
Security Controls in External Environments - Organizations are responsible and accountable for the risk incurred by use of services provided by external providers and address this risk by implementing compensating controls when the risk is greater than the authorizing official or the organization is willing to accept.
Security Control Assurance - Actions taken by security control assessors to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system.
Revisions and Extensions - The security controls in the security control catalog are expected to change over time, as controls are withdrawn, revised, and added. A stable, yet flexible and technically rigorous set of security controls will be maintained in the security control catalog.
THE PROCESS
Managing Risk - The management of risk is a key element in the organization’s information security program and provides an effective framework for selecting the appropriate security controls for an information system—the security controls necessary to protect individuals and the operations and assets of the organization.
Categorizing the Information System - The security controls applied to a particular information system are commensurate with the potential adverse impact on organizational operations, organizational assets, individuals, other organizations, and the Nation should there be a loss of confidentiality, integrity, or availability. FIPS 199 requires organizations to categorize their information systems as low-impact, moderate-impact, or high-impact for the security objectives of confidentiality, integrity, and availability.
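The categorization step has a simple mechanical core, the FIPS 200 "high water mark": the overall system impact level is the highest impact assigned to any of the three security objectives. A minimal sketch (the example system is hypothetical):

```python
LEVELS = ("low", "moderate", "high")

def system_impact_level(confidentiality: str, integrity: str,
                        availability: str) -> str:
    """FIPS 200 high-water mark: the max impact across the objectives."""
    objectives = (confidentiality, integrity, availability)
    for value in objectives:
        if value not in LEVELS:
            raise ValueError(f"impact must be one of {LEVELS}: {value!r}")
    return max(objectives, key=LEVELS.index)

# Hypothetical example: a public web server where availability matters most.
print(system_impact_level("low", "moderate", "high"))  # -> "high"
```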
Selecting Security Controls - There are three steps in the control selection process, carried out sequentially: selecting the initial set of baseline security controls, tailoring the baseline security controls, and supplementing the tailored baseline.
Monitoring Security Controls - The continuous monitoring program includes an ongoing assessment of security control effectiveness to determine whether there is a need to modify or update the currently deployed set of security controls based on changes in the information system or its environment of operation.
Generally Accepted System Security Principles
In September 1996, the National Institute of Standards and Technology, in the Technology Administration of the U.S. Department of Commerce, published Generally Accepted Principles and Practices for Securing Information Technology Systems. The document provides a baseline that organizations can use to establish or review their IT security programs. The eight principles address computer security from a high-level viewpoint, and the fourteen practices guide organizations on the types of controls, objectives, and procedures that comprise an effective IT security program.
The eight principles in the following list provide an anchor or guide when creating new systems, practices, or policies. The United States endorsed the international OECD Guidelines that were developed to provide a foundation from which governments and the private sector could construct a framework for securing IT systems. The principles were based on guidelines as documented in the NIST Special Publication (SP) 800-14. http://csrc.nist.gov/publications/nistpubs/800-14/800-14.pdf
Generally Accepted Principles
1. Computer Security Supports the Mission of the Organization
Security helps the organization's mission by protecting its physical and financial resources, reputation, legal position, employees, and other tangible and intangible assets.
2. Computer Security is an Integral Element of Sound Management
Organization managers have to decide what level of risk they are willing to accept, taking into account the cost of security controls.
3. Computer Security Should Be Cost-Effective
Security should be appropriate and proportionate to the value of and degree of reliance on the IT systems and to the severity, probability, and extent of potential harm.
4. Systems Owners Have Security Responsibilities Outside Their Own Organizations
If a system has external users, its owners have a responsibility to share appropriate knowledge about the existence and general extent of security measures so that other users can be confident that the system is adequately secure.
5. Computer Security Responsibilities and Accountability Should Be Made Explicit
This principle states that people and other entities (such as corporations or governments) have responsibilities and accountability related to IT systems, and that these may be shared.
6. Computer Security Requires a Comprehensive and Integrated Approach
Managers should recognize how computer security relates to other areas of systems and organizational management. Many other important interdependencies may exist that are often unique to the organization or system environment.
7. Computer Security Should Be Periodically Reassessed
Changes in the system or the environment can create new vulnerabilities. Strict adherence to procedures is rare, and procedures become outdated over time. These issues make it necessary to reassess the security of IT systems periodically.
8. Computer Security is Constrained by Societal Factors
The flow of information, especially between a government and its citizens, is a situation where security may need to be modified to support a societal goal. In addition, some authentication measures may be considered invasive in some environments and cultures.
Generally Accepted Practices
The practices serve as a companion to the NIST Special Publication, 800-12, An Introduction to Computer Security: The NIST Handbook. The following lists practices currently employed in an effective computer security program. http://csrc.nist.gov/publications/nistpubs/800-12/handbook.pdf
1. Policy
Senior management directives that create a computer security program, establish its goals, and assign responsibilities.
2. Program Management
Program management of computer security at multiple levels is important because it contributes to the overall program utilizing different types of expertise, authority, and resources.
3. Risk Management
Requires the analysis of risk relative to potential benefits, consideration of alternatives, and implementation of what management determines to be the appropriate course of action.
4. Life Cycle Planning
Most IT system life cycle models contain five basic phases: initiation, development/acquisition, implementation, operation, and disposal.
5. Personnel/User Issues
No IT system can be secured without properly addressing a broad range of security issues related to how individuals interact with computers and the access and authorities they need to do their jobs.
6. Preparing for Contingencies and Disasters
Contingency planning addresses how to keep an organization’s critical functions operating in the event of disruptions, both large and small.
7. Computer Security Incident Handling
An incident handling capability may be viewed as a component of contingency planning, because it provides the ability to react quickly and efficiently to disruptions in normal processing.
8. Awareness Training
An effective computer security awareness and training program requires proper planning, implementation, maintenance, and periodic evaluation.
9. Security Considerations in Computer Support and Operations
Failure to consider security as part of the support and operations of IT systems is a significant weakness. Security must be included in user and software support; configuration management, backups, and media controls are just some of the areas that must take security into consideration.
10. Physical and Environmental Security
Physical and environmental security controls are implemented to protect the facility housing systems resources, the system resources themselves, and the facilities used to support their operations.
11. Identification and Authentication
A critical building block of computer security since it is the basis for most types of access control and for establishing user accountability.
12. Logical Access Control
Organizations should implement logical access control based on policy made by a management official responsible for a particular system, application, subsystem, or group of systems.
13. Audit Trails
Audit trails can provide a means to help accomplish several security related objectives, including individual accountability, reconstruction of events, intrusion detection, and problem identification.
14. Cryptology
Cryptography provides an important tool for protecting information and is used throughout computer security. Several important issues should be considered when designing, implementing, and integrating cryptography in an IT system. A small sketch combining this practice with identification and authentication follows.
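Practices 11 (Identification and Authentication) and 14 (Cryptography) intersect in password handling. As a hedged sketch, and a modern stand-in for the salted hashing of the document's era, here is salted password verification using Python's standard hashlib.pbkdf2_hmac; the iteration count is illustrative, not a recommendation:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor; tune for your hardware

def enroll(password: str) -> tuple[bytes, bytes]:
    """Store a random salt and the derived key, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the key and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)

salt, stored = enroll("correct horse battery staple")
print(verify("correct horse battery staple", salt, stored))  # True
print(verify("guess", salt, stored))                         # False
```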
Thursday, September 17, 2009
RFC 1135 “Helminthiasis of the Internet Worm”
1. What was the cause of the first Internet Worm? Specifically, what vulnerabilities did the worm take advantage of in order to spread through the Internet?
Answer:
It was a program developed specifically to target flawed utility programs on Unix systems; it infected, in particular, Sun Microsystems Sun-3 systems and VAX computers running variants of 4 BSD UNIX. The systems' vulnerabilities gave the worm a free ride to attach itself to vector programs, establish itself as a shell, and proceed by one of three routes: rsh, fingerd, or sendmail. Before attempting these infection methods, it would first try to establish a connection on the telnet or rexec ports to check that the target was reachable.
This first Internet worm was traced to a twenty-three-year-old Cornell University graduate student named Robert Tappan Morris, Jr. He launched it by infecting a machine at MIT from his terminal in Ithaca, New York. The worm identified other nearby computers on the Internet by rifling through various electronic address books found on the MIT machine. Its purpose was simple: to transmit a copy of itself to those machines, where it would run alongside existing software and repeat the cycle.
When asked why he unleashed the worm, Morris said he wanted to count how many machines were connected to the Internet.
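The propagation cycle described above (find neighboring hosts, copy yourself over, repeat) is essentially a graph traversal. Here is a harmless, self-contained simulation of that pattern; the host names and links are entirely made up:

```python
from collections import deque

# Hypothetical network: host -> hosts found in its "address books".
NEIGHBORS = {
    "mit-a": ["cornell-1", "berkeley-2"],
    "cornell-1": ["mit-a", "utah-5"],
    "berkeley-2": ["utah-5", "purdue-3"],
    "utah-5": [],
    "purdue-3": ["mit-a"],
}

def simulate_spread(start: str) -> list[str]:
    """Breadth-first 'infection': each host copies the worm to neighbors."""
    infected, queue, order = {start}, deque([start]), [start]
    while queue:
        host = queue.popleft()
        for neighbor in NEIGHBORS.get(host, []):
            # This simulation infects each host once; the real worm's
            # faulty re-infection check spawned many copies per host,
            # which is what actually overloaded the machines.
            if neighbor not in infected:
                infected.add(neighbor)
                queue.append(neighbor)
                order.append(neighbor)
    return order

print(simulate_spread("mit-a"))
# ['mit-a', 'cornell-1', 'berkeley-2', 'utah-5', 'purdue-3']
```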
2. Are those vulnerabilities still present?
Answer:
The vulnerabilities in the Unix operating system are not the same today, just as the malware infestations are not the same. In fact, another Unix-like operating system has since evolved: Linux. Vulnerabilities still exist for both operating systems, but they are different ones, since technology has advanced so much and anti-worm and anti-virus tools are constantly being developed. (See the ZDNet article below.)
http://homes.cerias.purdue.edu/~spaf/tech-reps/823.pdf
http://yupnet.org/zittrain/archives/11
http://www.zdnet.com.au/insight/soa/Linux-Unix-viruses-demand-special-attention/0,139023731,120275738,00.htm
Wednesday, September 16, 2009
Is it Time to Supplement Desktop Security Protections?
The article “Is it Time to Supplement Desktop Security Protections?”, posted April 20, 2009, caught my attention because Bruce Van Nice goes further than just giving his perspective on safety for Internet users through current protections. He proposes that a lot more can be done to help the user beyond desktop protection software. He is aware of how Internet users struggle to get the best protection they can without having the expertise to know whether they are actually getting the anti-malware protection they need. Almost all users assume that the only things they can do are run desktop software and stay aware of the different types of malware threats, such as viruses, worms, and phishing. He argues that this is obviously not enough, because the past few months have seen a dramatic increase in Internet-based attacks. He targets the service providers because they are in a position to deliver network-based protections that would benefit Internet users tremendously. He believes that network-based protections can complement and enhance existing desktop software.
I think the fact that he is asking service providers to take the initiative to help Internet users is very important, and something that should be conveyed to all users, since they can pose the question to service providers as they shop for the best service.
http://www.circleid.com/posts/20090420_time_to_supplement_desktop_security_protections/
Friday, September 4, 2009
Information Assurance Model
The McCumber model provides a concise representation of the Information Systems Security discipline. The objective was to integrate separate disciplines such as personnel security, computer security, communications security, and operational security into a cohesive Information Assurance model. The model is viewed as both multidisciplinary and multidimensional. The four dimensions of the model are Information States, Security Services, Security Countermeasures, and Time.
McCumber INFOSEC Model
mekabay.com/courses/academic/norwich/.../lectures/01_INTRODUCTION.pdf
The interaction of the components is more important than the individual components themselves. The model is a framework for all who are seeking to understand Information Assurance, its dynamic components and how it will protect information in various states.
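Because the model is easiest to picture as a cube of three axes (with time cutting across all of them), a short sketch can enumerate its cells. The axis labels below follow common presentations of the model; treat the exact wording as an assumption:

```python
from itertools import product

# The three axes of the McCumber cube (time cuts across all of them).
INFORMATION_STATES = ("transmission", "storage", "processing")
SECURITY_SERVICES = ("confidentiality", "integrity", "availability")
COUNTERMEASURES = ("technology", "policy and practice", "people")

# Each of the 27 cells pairs a state and a service with a countermeasure,
# turning the model into a checklist of questions an assessor would ask.
for state, service, measure in product(
        INFORMATION_STATES, SECURITY_SERVICES, COUNTERMEASURES):
    print(f"How does {measure} protect {service} of data in {state}?")
```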
What is a Live CD?
A Live CD is a bootable compact disc that contains its own operating system and lets a user handle major issues or major changes on a desktop or laptop computer in several ways. It can be used when there are security issues or when there is a need to try a different operating system. It doesn't install anything on your hard disk, and it makes no changes to the computer's existing operating system, hard drives, or files. You just insert the CD into your CD drive and restart the computer. It boots from the CD, and you can start using the operating system software on your machine right away. In fact, you can even run a Live CD on a computer that has no hard disk; all programs run directly from the CD. It doesn't alter your original software, so it's convenient for testing or demonstration purposes. The system returns to its previous operating system when the computer is rebooted without the Live CD (Kayne, Pillay, User-ful, Borgohain).
Security Perspective
Consider the scenario of purchasing a top-of-the-line computer (laptop or desktop) from an online store. You have chosen the best combination of hardware for your money, and you have it shipped. When you receive the new machine, you run a Live CD and use it as your operating system, with the hard disk (if any) serving only as your storage medium. When you log off, you remove the Live CD and put it away, knowing that if your machine is ever stolen, the data on the internal hard disk is useless to anyone, since the entire drive is encrypted with your private key. You could also save your information on a USB mass storage device. From a corporate security standpoint, the CD itself cannot be tampered with or infected, since it is read-only (Pillay, Schaumann).
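To make the "everything on the disk is encrypted" idea concrete, here is a minimal sketch in Python, assuming the third-party cryptography package. Real Live CD setups would more likely use full-disk encryption at the block level, so treat this as an illustration of the principle only:

```python
from cryptography.fernet import Fernet

# Generate the key once and keep it with the Live CD or on a separate
# USB stick -- never on the machine's internal disk.
key = Fernet.generate_key()
f = Fernet(key)

# Anything persisted to the internal drive is ciphertext.
token = f.encrypt(b"quarterly figures, contacts, saved passwords")
with open("vault.bin", "wb") as out:          # hypothetical file on the HDD
    out.write(token)

# At the next session, boot the Live CD, present the key, and decrypt.
with open("vault.bin", "rb") as inp:
    print(f.decrypt(inp.read()))
```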
The origin of the live CD was not a CD at all, but a bootable floppy disk. Many operating system vendors, hardware manufacturers, and anti-virus developers produced bootable floppy disks with a base operating system to perform functions that were not always possible with an operating system already running. For example, hard disk manufacturers distributed bootable floppy disks to let users test their hardware products without an operating system in the way, and for consistency in testing configurations. Anti-virus software developers provided bootable floppy and CD-ROM disks to let users boot a system in a known safe condition, so that any virus infection on the machine would not interfere with the virus-testing software. Initially, write-protected floppies were used to prevent infection from spreading from the tested system, but before long live CD media took over because it was cheaper to produce and considerably faster for testing. For the security-conscious user, live CDs are also useful for, among other things, using untrusted hardware such as public-use PCs at coffee shops, analyzing computers that may have been compromised, recovering data from systems that no longer boot, and running software you'd prefer not to install on your hard disk.
Knoppix
Knoppix is a Debian-based Linux distribution and one of the first Linux live CDs that was available. While the Knoppix distribution is packed with open-source goodness, one of the most popular uses for Knoppix is recovering files from damaged drives. To that end Knoppix is packed with open-source applications for testing disk integrity, recovering files, reading corrupted drives, and more. There are a total of 2,000 programs packed into the disc covering everything from disc recovery to media playback. New technologies make the Knoppix Live CD very versatile and flexible, and there are many things you can do with the Live CD without having to resort to a full-blown hard disk install. You can work mobile. All you need to carry around is a CD and a small USB flash drive to store your settings and configurations. You can start Knoppix from almost anywhere with the same data, settings and even your own installed programs. A hard disk installation on the other hand, will tie you to the disk where you installed it, plus all the possible problems that come with a hard disk install (Knoppix, Fitzpatrick).
Windows Live CD
Microsoft has released a tool for system administrators and PC professionals: the Windows Preinstallation Environment (WinPE). It is a trimmed-down version of the operating system based on the Windows XP kernel. WinPE lets you boot a personal computer into a graphical user interface (GUI) and, at the same time, control the configuration of the whole system. You can format a partition, access files on your hard disk or on other computers in your local area network, or use external devices, if you have the correct drivers (CDR-INF).
Drawbacks
One of the biggest drawbacks of using a Live CD is speed. Remember, you are running the system from the CD and RAM, so the amount of RAM the machine has (as well as the speed of the CD drive) determines how fast your Live CD distribution will run; a machine with little RAM will run poorly. This isn't such an issue if you are planning to install immediately, but using a Live CD on a low-RAM machine will be painfully slow. The other drawback was already mentioned: unless you are using a flash-drive-based live distribution, you cannot save data. If you are only testing the distribution to see if you like it, that's not a problem (Wallen).
Final Thoughts
Live CDs are here to stay. They have many uses and few drawbacks. If you are hesitant to use a Live CD because you don't want to lose data, you shouldn't worry about that (unless you accidentally click the installation button and then click through all of the steps to install the operating system) (Wallen).
1. Kayne, R. (2009) http://www.wisegeek.com/what-is-a-live-cd.htm
2. Pillay, Harish. (2005) http://www.freesoftwaremagazine.com/articles/live_cds
3. Schaumann, Jan. (2006) http://www.netbsd.org/~jschauma/nblivecds.pdf
4. TECH FAQ. http://www.tech-faq.com/live-cd.shtml
5. User-ful. (2008) http://support.userful.com/wiki/index.php/FAQs/Live_CD
6. Borgohain, Bolin. http://blogs.siliconindia.com/bolinborgohain/What_is_a_Live_CD-bid-axPw6Er43009378.html
7. Knoppix. (2008) http://www.knoppix.net/wiki/Live_CD_Tips
8. CDR-INF. (2005) http://www.cdrinfo.com/Sections/Reviews/Specific.aspx?ArticleId=15113
9. Wallen, Jack. (2009) http://www.ghacks.net/2009/02/18/get-to-know-linux-live-cd/
10. Bauer, Mick. (2008) http://www.linuxjournal.com/article/10038
11. Fitzpatrick, Jason. (2009) http://lifehacker.com/5157811/five-best-live-cds
Journal Assignment One - ID Theft
I selected my article because it is a current event that should not have happened. It is important because corporations are still too lax and complacent when it comes to information systems security. The things that really intrigued me about the article “Army of 950 identity thieves marched through Manhattan; 2 commanders indicted” were the involvement of so many people in the theft, how well it was organized, how they were able to avoid law enforcement, the amount of the theft ($2 million), and that they got away with it for almost 16 months. They were able to steal right under the eyes of nearly every bank in Manhattan; they even stole from the New York Police Department. How can these major banks expect to win our trust with our money when they are not capable of taking identity theft seriously? They are not doing enough to implement strong security measures that can prevent identity theft, or at least catch it as it is happening.
How did so many people get involved with this crime? Did they believe the myth that they were not stealing from people but from big businesses, which in their minds made it all right? Did they think it was a white-collar crime and they would get off easy? What is the harm in bank tellers stealing account information and giving it to someone else? Information is intangible and not of any value to most people. Banks deal with billions of dollars, so who is going to miss a couple of million?
These thefts by this very well organized army of thieves happened between February 2007 and February 2009. We are in the 21st century and have advanced considerably in technology, so what happened? Back in the '60s, when the infamous Frank Abagnale (depicted in the movie “Catch Me If You Can”) passed bad checks totaling $2.5 million, it took law enforcement a while to capture him. At the time of Abagnale's great bad-check spree, the technology to help identify and track this type of fraud was nonexistent, so it was understandable that someone could get away with it. Today, because of so many technological advances, we have developed many security tools that can be deployed to stop identity theft and bank fraud. From my perspective, and from what I have learned so far, it is management and the human factor in these organizations that are not taking it seriously. They are not doing their homework, they are not seeking the right help, and they are certainly not paying attention to their employees.