Chapter 8 Common Forensic Tools: Overview. In this chapter, you... (christinemaritza)
This document discusses common forensic tools used for disk imaging and validation. It describes several open source and commercial tools, including dd (a UNIX utility), DriveSpy (a DOS-based tool), EnCase, Forensic Replicator, FTK Imager, Norton Ghost, ProDiscover, SAW, and SMART. These tools are used to create forensic copies or images of storage drives and media that can be analyzed while preserving the original evidence. The document provides an overview of each tool's imaging capabilities and validation features.
This document discusses the digital forensics toolkit DEFT. It provides an overview of DEFT's components, including GNU/Linux and DART. It describes how DEFT can be used for digital forensics by ensuring the integrity of file structures without altering data. Methods for installing DEFT include overwriting a hard drive, using a USB installer, or running in a virtual environment. Analysis tools, hashing tools, imaging tools, password recovery tools, and reporting tools included in DEFT are listed. Commands for managing storage devices, such as fdisk and mount, are provided. The Autopsy forensic browser allows managing forensic investigations through a GUI. Methods for acquiring storage media include dd, ddrescue, and dcfldd. Foremost can recover deleted files by carving them from raw data.
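The acquisition tools named above (dd, ddrescue, dcfldd) all aim at a bit-for-bit copy that tolerates read errors. A minimal sketch with plain dd follows, using a scratch file in place of a real device; the file names are hypothetical stand-ins for a write-blocked device node such as /dev/sdb.

```shell
# Scratch "device" so the sketch runs anywhere; a real acquisition
# would read from a write-blocked evidence drive instead.
dd if=/dev/zero of=device.bin bs=512 count=32 2>/dev/null

# conv=noerror keeps the copy going past read failures; sync pads
# short reads with zeros so the image stays sector-aligned with
# the source (this is what ddrescue and dcfldd improve upon).
dd if=device.bin of=acquired.raw bs=512 conv=noerror,sync 2>/dev/null

# The copy must hash identically to the source.
sha256sum device.bin acquired.raw
```

On a clean source the two hashes printed at the end match; on a failing drive, dedicated tools such as ddrescue are preferable because they retry and log bad regions rather than silently zero-filling.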
Data Compass is a computer forensic solution that provides three main capabilities:
1. It can extract, analyze, and validate data from malfunctioning hard disk drives that other forensic software cannot access.
2. It can fully and effectively recover data from unstable and bad sector defective hard disk drives with less risk of further damage.
3. It allows access and analysis of data from unrecognized hard disk drives through its emulation technology.
This document discusses computer forensic tools and how to evaluate them. It covers the major tasks performed by forensic tools, including acquisition, validation, extraction, reconstruction, and reporting. Acquisition involves making a copy of the original drive, while validation ensures the integrity of copied data. Extraction recovers data through viewing, searching, decompressing, and other methods. Reconstruction recreates a suspect drive. Reporting generates logs and reports on the examination process and findings. The document examines both software and hardware tools, as well as command-line and graphical user interface options. Maintaining and selecting appropriate tools is important for effective computer investigations.
This document outlines the steps for imaging and processing born-digital materials, including: running virus checks, creating disk/logical images using FTK Imager, partitioning disks, identifying file systems, capturing images to evidence folders with case numbers, processing images in AccessData FTK including hashing and indexing files, and generating a collection summary report. The workflow is intended to thoroughly capture and document born-digital content for long-term preservation and access.
Watching the Detectives: Using digital forensics techniques to investigate th... (GarethKnight)
This document discusses digital forensics techniques used by law enforcement and researchers. It describes how digital forensics emerged in response to criminal use of electronic devices and emphasizes scientifically valid methods. Key techniques discussed include imaging media to obtain evidence, using hashing to filter known files, and data carving to recover deleted information. Challenges include analyzing increasing digital data and addressing ethical issues when recovering deleted files.
This document provides an overview of computer forensics, including key terminology, how data is stored and retrieved from hard drives, and the process of acquiring forensic images. It discusses the difference between visible and latent data, and explains that visible data is what the operating system is aware of, like documents, while latent data includes things like file slack, RAM, and deleted files. It emphasizes that a forensic examiner must acquire data in a way that does not alter any bits and uses techniques like hashing to prove the integrity of acquired images.
I am giving you the entire process of forensic duplication; the response.pdf (mukhtaransarcloth)
I am giving you the entire process of forensic duplication.
The response strategy for forensic duplication:
The decision of when to perform a forensic duplication is based, in part, on the existing response strategy for the situation at hand.
For example, many organizations have a policy of creating forensic hard-drive duplicates of all PCs used by executives who leave the organization.
Forensic Duplicates as Admissible Evidence
Existing legal standards define the minimum criteria for an item to be admitted into evidence.
The collection process is usually under scrutiny as well.
Federal Rules of Evidence
Federal Rule of Evidence (FRE) 1002 states that the item or information presented in court must be the original.
Exceptions: Definitions and Duplicates
If data are stored on a computer or similar device, any printout or other output readable by sight, shown to reflect the data accurately, is an original.
Admissibility of Duplicates
A duplicate is admissible to the same extent as an original unless a genuine question is raised as to the authenticity of the original, or, in the circumstances, it would be unfair to admit the duplicate in lieu of the original.
First of all, what is a forensic duplicate?
A file that contains every bit of information from the source in a raw bitstream format.
Tools that create forensic duplicates:
1. dd
2. FTK Imager, Access Data
3. dcfldd, the US DoD Computer Forensics Lab version of the dd command
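As a sketch of what these tools do, a raw bitstream duplicate can be produced with dd alone, hashing the stream in the same pass (roughly what dcfldd's hash= option automates). The file names below are hypothetical; in practice if= would point at a write-blocked device.

```shell
# Create a small stand-in "source drive" so the sketch is self-contained.
dd if=/dev/urandom of=source.bin bs=512 count=64 2>/dev/null

# Bit-for-bit copy: tee writes the image while sha256sum hashes the
# identical byte stream, so the recorded hash covers exactly the
# bytes that were written to disk.
dd if=source.bin bs=4096 2>/dev/null | tee image.raw | sha256sum | awk '{print $1}' > image.raw.sha256

cat image.raw.sha256
```

Hashing the stream inline avoids a second full read of a possibly fragile source drive, which is the main reason dcfldd added this feature to dd.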
Qualified Forensic Duplicate?
A file that contains every bit of information from the source, but which may be stored in an altered form.
Tools that create qualified forensic duplicate output files:
1. SafeBack
2. EnCase
3. FTK Imager
Restored Image
A restored image is what you get when you restore a forensic duplicate or a qualified forensic
duplicate to another storage medium.
Mismatched drive geometries can cause problems.
HD Development
When hard drives grew beyond 512 MB, the PC BIOS needed to be updated to recognize larger drives. Software drive overlays emulated a modern BIOS: the overlay pushed all of the real data on the drive down one sector and stored its own program and information in the first sector, so the real partition table would be at cylinder 0, head 0, sector 2.
SafeBack, EnCase, FTK Imager, and dd will create a restored image from a qualified forensic duplicate. EnCase and dd images may not need to be restored: many tools treat images as virtual disks, eliminating the need for restoration.
Note that FTK Imager can create images in the EnCase format.
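When an image is treated as a virtual disk, a partition inside a raw image can be mounted read-only at its byte offset (start sector times sector size). The start sector of 2048 below is an assumed example; the real value would come from running fdisk -l or mmls against the image.

```shell
# Byte offset of a partition inside a raw image. The values 2048
# and 512 are illustrative assumptions, not universal constants.
START_SECTOR=2048
SECTOR_SIZE=512
OFFSET=$((START_SECTOR * SECTOR_SIZE))
echo "$OFFSET"

# The mount itself needs root and a real image, so it is only
# echoed here rather than executed:
echo "mount -o ro,loop,offset=$OFFSET image.raw /mnt/evidence"
```

The ro flag matters: mounting evidence read-write would update filesystem metadata and break hash verification of the image.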
Mirror Image
Created with hardware that does a bit-for-bit copy from one hard drive to another.
Requires two identical hard drives.
This doesn't happen very often.
Tool Requirements: Forensic Duplication
The tool must:
Create a forensic duplicate or mirror image of the original.
Handle read errors in a robust and graceful manner.
Make no changes to the source medium.
Be capable of withstanding scientific and peer review.
Produce results that are third-party repeatable and verifiable.
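The repeatability requirement is usually met by recording a cryptographic hash at acquisition time so that any third party can recompute it later. A minimal sketch with sha256sum, using hypothetical file names:

```shell
# Stand-in image file; in practice this is the acquired image.
printf 'example evidence bytes' > case001.raw

# At acquisition: record the hash alongside the image.
sha256sum case001.raw > case001.raw.sha256

# Later, by anyone: recompute and compare. Exit status 0 means the
# image is bit-identical to what was originally hashed.
sha256sum -c case001.raw.sha256
```

Because the checksum file travels with the image, a second examiner needs no special tooling to independently verify that the evidence has not changed.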
Legal Issues
Tools used for forensic duplication must pass the legal tests for r.
Debian Linux as a Forensic Workstation (Vipin George)
This document discusses using Debian GNU/Linux as a forensic workstation. It begins with an introduction to digital forensics and defines it as the gathering and analysis of digital information for use as legal evidence. It then discusses why Debian is suitable as a forensic workstation due to its stability, large set of forensic tools, and ability to avoid infecting evidence. The rest of the document outlines the stages of a forensic investigation and various tools that can be used at each stage, including acquiring disk images, examining disk images, collecting volatile memory data, and network forensics.
This document provides instructions for conducting computer forensics using free and open source tools in Linux. It discusses preparing the imaging system by installing the F.I.R.E. boot CD and NASA enhanced loopback drivers on the analysis system. Steps covered include identifying the evidence and image drives, gathering metadata before imaging using hdparm and sfdisk, imaging the drive using dcfldd for hashing and error handling, and preparing the analysis system to mount disk images using the NASA drivers. The document aims to demonstrate how to perform forensics at no cost using open source Linux tools.
This document discusses using Linux to recover data from failed Windows systems by creating disk images that can be mounted and accessed. Key steps include using dd to create an image file, mounting the image to access files, and copying recovered files to another location or burning to CD. More advanced tools like The Coroner's Toolkit can help if the filesystem is damaged. Safety precautions are advised when dealing with potentially failing drives.
Digital evidence acquisitions can be stored in raw, proprietary, or AFF formats. The main acquisition methods are disk-to-image files, disk-to-disk copies, logical disk acquisitions, and sparse copies. Proper planning includes validating acquisitions, making copies, and considering contingencies. Linux live CDs provide useful forensic tools like dcfldd for acquiring evidence while write-blocking disks. RAID acquisitions require determining the type of RAID and suitable acquisition tool. Remote network tools involve installing an agent on the suspect computer.
Digital evidence acquisitions can be stored in raw, proprietary, or Advanced Forensics Format (AFF). The document discusses various acquisition methods and tools for disk-to-image, disk-to-disk, logical, and sparse acquisitions. It emphasizes the importance of validation, contingency planning, and minimizing alteration of evidence during the acquisition process. Special considerations are given for acquiring data from RAID systems and using Linux tools or remote network tools.
Computer Forensics chap 3+4 (maxinesmith73660)
Computer Forensics chap 3+4/6810chap03.doc
Computer Forensics
Chapter 3 Data Acquisition
(Chapter 4 in 4th Edition)
1. What are the three data acquisition methods & when should each one be used?
2. Explain how the investigator's target drive can be smaller than the suspect's drive in a disk-to-image copy.
3. Why are hash algorithms used when image files are created?
4. Differentiate between absolute and relative sectors.
5. What are some typical drawbacks to Windows data acquisition tools?
6. What is RAID? What challenges do RAID systems present to computer forensic investigators?
7. What is the difference between a Static Acquisition and a Live Acquisition?
8. What drawbacks can be encountered when performing a Remote Acquisition?
Computer Forensics chap 3+4/6810chap04.doc
Computer Forensics
Chapter 4 Processing Crime & Incident Scenes
(Chapter 5 in 4th Edition)
1. What different activities do investigators perform with digital evidence?
2. Even though digital evidence is considered to be physical, it differs from other types of physical evidence. What are these differences and what issues do they create for analysts?
3. What are FOIA laws and why do they exist?
4. What are Corporate Policy Statements and Warning Banners? How do they impact an employer's rights related to corporate computer investigations?
5. Explain how a corporate employee could jeopardize the suspect's Fourth Amendment protection by gathering evidence in a private-sector investigation.
6. What is probable cause and what criteria must be met to establish probable cause?
7. What is the plain view doctrine and how does it apply to the search and seizure of digital evidence?
8. Why is seizing a computer (and analyzing it in a computer forensics lab) preferred over analysis at the crime scene? What conditions might prevent an investigator from seizing a computer?
9. What is a technical advisor and what roles do they play at an incident or crime scene?
10. What various media exist for storing digital evidence? What are the pros and cons of each?
11. What steps are outlined in the text for processing & handling digital evidence?
12. Who has the responsibility for setting evidence retention standards in a corporate (private) environment? What is the exception to this rule?
Computer Forensics chap 3+4/nelson03.ppt
Chapter 3
Data Acquisition
Guide to Computer Forensics and Investigations, Fifth Edition
© Cengage Learning 2015
Objectives:
List digital evidence storage formats
Explain ways to determine the best acquisition method
Describe contingency planning for data acquisitions
Explain how to use acquisiti.
This document discusses various types of forensic duplication including simple duplication that copies selected data versus forensic duplication that retains every bit on the source drive including deleted files. It covers requirements for forensic duplication including the need to act as admissible evidence. It describes different forensic image formats including complete disk, partition, and logical images and details scenarios for each type. Key aspects of forensic duplication covered include recovering deleted files, non-standard data types, ensuring image integrity with hashes, and traditional duplication methods like using hardware write blockers or live DVDs.
The document discusses the key aspects of computer forensics, including goals, processes, rules, software, and reporting. The goal of computer forensics is to perform a structured investigation while maintaining evidence, to determine what happened on a computer and who was responsible. The typical stages are preparation, search and seizure, acquisition, analysis, and reporting. Key rules include never mishandling evidence, never trusting the subject system, documenting everything, and never working on original evidence. Common software used includes FTK Imager, Stegsuite, and Helix. Important files that can be analyzed include SAM, SYSTEM32, index, and NTLDR. The sites listed provide access to forensics tools, software, and resources.
This document provides information about performing Linux forensics. It discusses analyzing floppy disks and hard disks using tools like dd, mount, and strings. It describes creating forensic images and obtaining hash values for verification. The document also outlines collecting data from a compromised system using a forensic toolkit, including gathering information on running processes, open ports, loaded kernel modules, and physical memory.
This document discusses forensic imaging. It describes how forensic imaging creates an exact copy of a hard drive or other media that can be used as digital evidence. It outlines different types of forensic imaging like physical, logical, and targeted collection. It also lists several tools that are commonly used for forensic imaging like FTK Imager, DriveImage XML, and EnCase forensic imager. Finally, it provides guidance on initial response when encountering shut down machines or live machines at a crime scene.
This document provides an excerpt from a study guide for the CIS 562 Week 11 Final Exam at Strayer University. It includes multiple choice, true/false, and other types of questions about computer forensics tools, file systems, analysis and validation. Some key topics covered include forensics tools categories, Macintosh and Linux boot processes/file systems, scope creep in investigations, and techniques for hiding data such as steganography and bad cluster marking.
CIS 562 Week 11 Final Exam Strayer New (LillieDickey)
This document provides an excerpt from a study guide for the CIS 562 Week 11 Final Exam from Strayer University. It includes multiple choice, true/false, and completion questions about computer forensics tools, file systems, analysis and validation. Some key topics covered include common computer forensics tools and their functions, the Macintosh and Linux boot processes and file systems, scope creep in investigations, and techniques for hiding and recovering hidden data.
Blind Spots in AI and Formulation Science Knowledge Pyramid (Updated Perspect...Ajaz Hussain
油
This presentation delves into the systemic blind spots within pharmaceutical science and regulatory systems, emphasizing the significance of "inactive ingredients" and their influence on therapeutic equivalence. These blind spots, indicative of normalized systemic failures, go beyond mere chance occurrences and are ingrained deeply enough to compromise decision-making processes and erode trust.
Historical instances like the 1938 FD&C Act and the Generic Drug Scandals underscore how crisis-triggered reforms often fail to address the fundamental issues, perpetuating inefficiencies and hazards.
The narrative advocates a shift from reactive crisis management to proactive, adaptable systems prioritizing continuous enhancement. Key hurdles involve challenging outdated assumptions regarding bioavailability, inadequately funded research ventures, and the impact of vague language in regulatory frameworks.
The rise of large language models (LLMs) presents promising solutions, albeit with accompanying risks necessitating thorough validation and seamless integration.
Tackling these blind spots demands a holistic approach, embracing adaptive learning and a steadfast commitment to self-improvement. By nurturing curiosity, refining regulatory terminology, and judiciously harnessing new technologies, the pharmaceutical sector can progress towards better public health service delivery and ensure the safety, efficacy, and real-world impact of drug products.
Useful environment methods in Odoo 18 - Odoo 際際滷sCeline George
油
In this slide well discuss on the useful environment methods in Odoo 18. In Odoo 18, environment methods play a crucial role in simplifying model interactions and enhancing data processing within the ORM framework.
How to Configure Flexible Working Schedule in Odoo 18 EmployeeCeline George
油
In this slide, well discuss on how to configure flexible working schedule in Odoo 18 Employee module. In Odoo 18, the Employee module offers powerful tools to configure and manage flexible working schedules tailored to your organization's needs.
Computer Application in Business (commerce)Sudar Sudar
油
The main objectives
1. To introduce the concept of computer and its various parts. 2. To explain the concept of data base management system and Management information system.
3. To provide insight about networking and basics of internet
Recall various terms of computer and its part
Understand the meaning of software, operating system, programming language and its features
Comparing Data Vs Information and its management system Understanding about various concepts of management information system
Explain about networking and elements based on internet
1. Recall the various concepts relating to computer and its various parts
2 Understand the meaning of softwares, operating system etc
3 Understanding the meaning and utility of database management system
4 Evaluate the various aspects of management information system
5 Generating more ideas regarding the use of internet for business purpose
APM People Interest Network Conference 2025
- Autonomy, Teams and Tension
- Oliver Randall & David Bovis
- Own Your Autonomy
Oliver Randall
Consultant, Tribe365
Oliver is a career project professional since 2011 and started volunteering with APM in 2016 and has since chaired the People Interest Network and the North East Regional Network. Oliver has been consulting in culture, leadership and behaviours since 2019 and co-developed HPTM速an off the shelf high performance framework for teams and organisations and is currently working with SAS (Stellenbosch Academy for Sport) developing the culture, leadership and behaviours framework for future elite sportspeople whilst also holding down work as a project manager in the NHS at North Tees and Hartlepool Foundation Trust.
David Bovis
Consultant, Duxinaroe
A Leadership and Culture Change expert, David is the originator of BTFA and The Dux Model.
With a Masters in Applied Neuroscience from the Institute of Organisational Neuroscience, he is widely regarded as the Go-To expert in the field, recognised as an inspiring keynote speaker and change strategist.
He has an industrial engineering background, majoring in TPS / Lean. David worked his way up from his apprenticeship to earn his seat at the C-suite table. His career spans several industries, including Automotive, Aerospace, Defence, Space, Heavy Industries and Elec-Mech / polymer contract manufacture.
Published in Londons Evening Standard quarterly business supplement, James Caans Your business Magazine, Quality World, the Lean Management Journal and Cambridge Universities PMA, he works as comfortably with leaders from FTSE and Fortune 100 companies as he does owner-managers in SMEs. He is passionate about helping leaders understand the neurological root cause of a high-performance culture and sustainable change, in business.
Session | Own Your Autonomy The Importance of Autonomy in Project Management
#OwnYourAutonomy is aiming to be a global APM initiative to position everyone to take a more conscious role in their decision making process leading to increased outcomes for everyone and contribute to a world in which all projects succeed.
We want everyone to join the journey.
#OwnYourAutonomy is the culmination of 3 years of collaborative exploration within the Leadership Focus Group which is part of the APM People Interest Network. The work has been pulled together using the 5 HPTM速 Systems and the BTFA neuroscience leadership programme.
https://www.linkedin.com/showcase/apm-people-network/about/
APM People Interest Network Conference 2025
-Autonomy, Teams and Tension: Projects under stress
-Tim Lyons
-The neurological levels of
team-working: Harmony and tensions
With a background in projects spanning more than 40 years, Tim Lyons specialised in the delivery of large, complex, multi-disciplinary programmes for clients including Crossrail, Network Rail, ExxonMobil, Siemens and in patent development. His first career was in broadcasting, where he designed and built commercial radio station studios in Manchester, Cardiff and Bristol, also working as a presenter and programme producer. Tim now writes and presents extensively on matters relating to the human and neurological aspects of projects, including communication, ethics and coaching. He holds a Masters degree in NLP, is an NLP Master Practitioner and International Coach. He is the Deputy Lead for APMs People Interest Network.
Session | The Neurological Levels of Team-working: Harmony and Tensions
Understanding how teams really work at conscious and unconscious levels is critical to a harmonious workplace. This session uncovers what those levels are, how to use them to detect and avoid tensions and how to smooth the management of change by checking you have considered all of them.
Research & Research Methods: Basic Concepts and Types.pptxDr. Sarita Anand
油
This ppt has been made for the students pursuing PG in social science and humanities like M.Ed., M.A. (Education), Ph.D. Scholars. It will be also beneficial for the teachers and other faculty members interested in research and teaching research concepts.
Prelims of Rass MELAI : a Music, Entertainment, Literature, Arts and Internet Culture Quiz organized by Conquiztadors, the Quiz society of Sri Venkateswara College under their annual quizzing fest El Dorado 2025.
How to Modify Existing Web Pages in Odoo 18Celine George
油
In this slide, well discuss on how to modify existing web pages in Odoo 18. Web pages in Odoo 18 can also gather user data through user-friendly forms, encourage interaction through engaging features.
How to use Init Hooks in Odoo 18 - Odoo 際際滷sCeline George
油
In this slide, well discuss on how to use Init Hooks in Odoo 18. In Odoo, Init Hooks are essential functions specified as strings in the __init__ file of a module.
The Constitution, Government and Law making bodies .saanidhyapatel09
油
This PowerPoint presentation provides an insightful overview of the Constitution, covering its key principles, features, and significance. It explains the fundamental rights, duties, structure of government, and the importance of constitutional law in governance. Ideal for students, educators, and anyone interested in understanding the foundation of a nations legal framework.
2. Storage Formats for Digital Evidence
Raw Format
In the past, there was only one practical way of copying data for the purpose of evidence preservation and
examination. Examiners performed a bit-by-bit copy from one disk to another disk the same size or larger.
As a practical way to preserve digital evidence, vendors (and some OS utilities, such as the Linux/UNIX dd
command) made it possible to write bitstream data to files. This copy technique creates simple sequential flat files
of a suspect drive or data set. The output of these flat files is referred to as a raw format. This format has unique
advantages and disadvantages to consider when selecting an acquisition format.
The advantages of the raw format are fast data transfers and the capability to ignore minor data read errors on
the source drive. In addition, most forensics tools can read the raw format, making it a universal acquisition
format for most tools.
One disadvantage of the raw format is that it requires as much storage space as the original disk or data set.
Another disadvantage is that some raw format tools, typically freeware versions, might not collect marginal (bad)
sectors on the source drive, meaning they have a low threshold of retry reads on weak media spots on a drive.
Many commercial tools have a much higher threshold of retry reads to ensure that all data is collected.
Several commercial acquisition tools can produce raw format acquisitions and typically perform a validation
check by using Cyclic Redundancy Check (CRC32), Message Digest 5 (MD5), and Secure Hash Algorithm
(SHA-1 or later) hashing functions. These validation checks, however, usually create a separate file containing the hash
value.
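As a sketch of that workflow, the commands below acquire a raw image and keep its hash in a separate validation file; the device name /dev/sdb and the /evidence paths are illustrative assumptions, not values from the text:

```shell
# Acquire a raw image of the suspect drive (assumed here to be /dev/sdb);
# conv=noerror,sync continues past read errors and pads unreadable blocks
dd if=/dev/sdb of=/evidence/suspect.raw bs=4k conv=noerror,sync

# Store the hash in a separate validation file, as described above
md5sum /evidence/suspect.raw > /evidence/suspect.raw.md5

# Later, confirm the image still matches its recorded hash
md5sum -c /evidence/suspect.raw.md5
```

Substituting sha1sum or sha256sum for md5sum gives a stronger digest with the same workflow.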
3. Proprietary Formats
Most commercial forensics tools have their own formats for collecting digital
evidence.
Proprietary formats typically offer several features that complement the vendor's
analysis tool, such as the following:
The option to compress or not compress image files of a suspect drive, thus saving space on the target
drive
The capability to split an image into smaller segmented files for archiving purposes, such as to CDs or
DVDs, with data integrity checks integrated into each segment
The capability to integrate metadata into the image file, such as date and time of the acquisition, hash
value (for self-authentication) of the original disk or medium, investigator or examiner name, and
comments or case details
4. Advanced Forensic Format
Dr. Simson L. Garfinkel developed an open-source acquisition format called
Advanced Forensic Format (AFF). This format has the following design goals:
Capable of producing compressed or uncompressed image files
No size restriction for disk-to-image files
Space in the image file or segmented files for metadata
Simple design with extensibility
Open source for multiple computing platforms and OSs
Internal consistency checks for self-authentication
5. In digital forensics, there are two types of acquisitions:
Static Acquisition: the preferred way to collect digital evidence when a computer is seized during a
police raid.
Live Acquisition: the way to collect digital evidence when a computer is powered on and the suspect is
logged on. This type is preferred when the hard disk is encrypted with a password.
For both types, there are 4 methods of collecting data:
1. Creating a disk-to-image file: the most common method of collecting data. It allows the investigator to create
one or many bit-for-bit replications of the original drive. Images created this way can be read by forensics
tools such as ProDiscover, EnCase, FTK, X-Ways, ILook, SMART, and Sleuth Kit.
2. Creating a disk-to-disk copy: used when disk-to-image copying faces hardware or software errors due to
incompatibilities. It copies the entire disk to a newer disk by using a forensics tool such as EnCase or
SafeBack. These tools can adjust the target disk's geometry to match the original drive.
3. Creating a logical disk-to-disk or disk-to-data file: the preferred method with large data storage systems
such as RAID servers. It captures only specific files or file types of interest to the case and is used
when time is limited.
Best Acquisition Method
6. 4. Creating a sparse copy of a folder or file: similar to a logical acquisition, but it also collects
deleted data (unallocated space). This method is used when an investigator doesn't need to examine
the whole drive.
To determine the appropriate acquisition method, the investigator must consider the following:
The size of the source disk.
Can you retain the source disk as evidence, or must you return it to the owner?
Time to perform the acquisition.
Location of the evidence.
7. Contingency Planning for Image Acquisitions
As a standard practice, make at least two images of the digital evidence you collect. If you have
more than one imaging tool, such as FTK Imager Lite and X-Ways Forensics, make the first copy
with one tool and the second copy with the other tool. Different acquisition tools use different
methods to copy data, and one tool might, for example, make more attempts to copy corrupted
areas of a drive. So using more than one tool can be helpful in making sure data has been copied
correctly.
If you have only one tool, however, consider making two images of the drive with the same
tool, especially for critical investigations. With many tools, you can make one copy with no
compression and compress the other copy. Remember that Murphy's Law applies to digital
forensics, too: if anything can go wrong, it will.
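One way to follow this advice with a single tool is to write both copies in one pass of the drive; the device name and output paths below are illustrative assumptions:

```shell
# One read of the suspect drive (assumed /dev/sdb) produces two copies:
# tee writes the uncompressed image while gzip writes the compressed one
dd if=/dev/sdb bs=4k conv=noerror,sync | tee /evidence/copy1.raw | gzip > /evidence/copy2.raw.gz

# Hash both copies; matching digests show the two images hold the same data
md5sum /evidence/copy1.raw
zcat /evidence/copy2.raw.gz | md5sum
```

Reading the source drive only once is also gentler on failing media than two full passes.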
Some acquisition tools don't copy data in the host protected area (HPA) of a disk drive. Check
the vendor's documentation to see whether its tool can copy a drive's HPA. If not, consider using a
hardware acquisition tool that can access the drive at the BIOS level, such as Belkasoft or ILookIX
IXImager, with a write-blocker, Image MASSter Solo, or X-Ways Replica. These tools can read a
disk's HPA.
8. Microsoft has added whole disk encryption with BitLocker to its newer operating systems, such as
Windows Vista, 7, 8, and 10, which makes performing static acquisitions more difficult. As part of
contingency planning, you must be prepared to deal with encrypted drives.
A static acquisition on most whole disk-encrypted drives currently involves decrypting the drives,
which requires the user's cooperation in providing the decryption key.
Most whole disk encryption tools at least have a manual process for decrypting data, which is
converting the encrypted disk to an unencrypted disk. This process can take several hours,
depending on the disk size. One good thing about encryption is that data isn't altered: free and
slack space aren't changed.
The biggest concern with whole disk encryption is getting the decryption key, that is, the password
or code used to access encrypted data. If you can recover the whole disk key with a tool such as
Elcomsoft Forensic Disk Decryptor, mentioned previously, you need to learn how to use it to decrypt
the drive.
In criminal investigations, this might be impossible: if a disk contains evidence supporting the
crime, a suspect has a strong motivation not to supply the decryption key.
9. Validating Data Acquisitions
Probably the most critical aspect of computer forensics is validating digital evidence. The weakest
point of any digital investigation is the integrity of the data you collect, so validation is essential.
In this section, you learn how to use several tools to validate data acquisitions.
Validating digital evidence requires using a hashing algorithm utility, which is designed to create
a binary or hexadecimal number that represents the uniqueness of a data set, such as a file or disk
drive. This unique number is referred to as a digital fingerprint. With a few exceptions, making
any alteration in one of the files, even changing one letter from uppercase to lowercase,
produces a completely different hash value.
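This property is easy to demonstrate with the md5sum utility discussed later in the chapter:

```shell
# Changing a single letter from uppercase to lowercase
# produces a completely different digest
printf 'Evidence' | md5sum
printf 'evidence' | md5sum
```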
These exceptions, known as collisions, have been found to occur in a small number of files with
MD5, and SHA-1 might also be subject to collisions. For forensic examinations of data files on a
disk drive, however, collisions are of little concern. If two files with different content have the
same MD5 hash value, a comparison of each byte of a file can be done to see the differences.
Currently, several tools can do a byte-by-byte comparison of files.
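The standard Linux cmp utility is one such byte-by-byte comparison tool; the file names here are made up for the demonstration:

```shell
# Create two files that differ in exactly one byte
printf 'forensics' > fileA.txt
printf 'forensicS' > fileB.txt

# cmp -l prints each differing byte: its 1-based offset and
# the octal value of that byte in each file
cmp -l fileA.txt fileB.txt

rm fileA.txt fileB.txt
```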
10. Linux Validation Methods
Linux is rich in commands and functions. The two Linux shell commands shown earlier in this
chapter, dd and dcfldd, have several options that can be combined with other commands to
validate data. The dcfldd command has other options that validate data collected from an
acquisition. Validating acquired data with the dd command requires using other shell commands.
Current distributions of Linux include two hashing algorithm utilities: md5sum and sha1sum. Both
utilities can compute hashes of a single file, multiple files, individual or multiple disk partitions, or
an entire disk drive.
Validating dd-Acquired Data
As shown earlier, dd can produce segmented volumes of the /dev/sdb drive, with
each segmented volume named image_sdb and an incrementing extension of .aa, .ab, .ac, and so
on. To validate all segmented volumes of a suspect drive with the md5sum utility, you use the
Linux shell commands shown in the following steps. For the saved images, remember to change to
the directory where the data was saved, or list the exact path for the saved images. To use sha1sum
instead of md5sum, just replace all md5sum references in commands with sha1sum. The drive
should still be connected to your acquisition workstation.
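A sketch of the segmented acquisition and its md5sum validation follows; the device name /dev/sdb and the image_sdb prefix match the description above, but treat the exact segment size and command form as assumptions, since the original listing is not reproduced here:

```shell
# Write segmented volumes image_sdb.aa, image_sdb.ab, image_sdb.ac, ...
# from the suspect drive (split's default suffixes increment aa, ab, ac)
dd if=/dev/sdb bs=4k conv=noerror,sync | split -b 650m - image_sdb.

# Hash the original drive, then the concatenated segments;
# the two digests should be identical
md5sum /dev/sdb
cat image_sdb.* | md5sum
```

Because the shell expands image_sdb.* in alphabetical order, concatenating the segments reassembles the image in the order it was written.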
11. Windows Validation Methods
Unlike Linux, Windows has no built-in hashing algorithm tools for digital forensics. However, many Windows third-party
programs do have a variety of built-in tools. These third-party programs range from hexadecimal editors, such as X-Ways
WinHex or Breakpoint Software Hex Workshop, to forensics programs, such as OSForensics, Autopsy, EnCase, and FTK.
Commercial forensics programs also have built-in validation features. Each program has its own validation technique used
with acquisition data in its proprietary format. For example, Autopsy uses MD5 to validate an image. It reads the metadata
in Expert Witness Compression or AFF image files to get the original hash. If the hashes don't match, Autopsy notifies
you that the acquisition is corrupt and can't be considered reliable evidence. In Autopsy and many other forensics tools,
however, raw format image files don't contain metadata. As mentioned, a separate manual validation is recommended for
all raw acquisitions at the time of analysis.
The previously generated validation file for raw format acquisitions is essential to the integrity of digital evidence. The
saved validation file can be used later to check whether the acquisition file is still good. In FTK Imager Lite, when you
select the Expert Witness Compression (.e01) or the SMART (.s01) format, additional options for validation are displayed.
This validation report also lists the MD5 and SHA-1 hash values. The MD5 hash value is added to the proprietary format
image or segmented files. When this image is loaded into FTK, SMART, or X-Ways Forensics (which can read only .e01
and raw files), the MD5 hash is read and compared with the image to verify whether the acquisition is correct.
12. Performing RAID Data Acquisitions
Acquisitions of RAID drives can be challenging and frustrating for digital forensics examiners because of
how RAID systems are designed, configured, and sized. Size is the biggest concern because many RAID
systems are now pushing into exabytes or more of data. The following sections review common RAID
configurations and discuss ways to acquire data on these large storage devices.
Understanding RAID
Redundant array of independent disks (RAID) is a computer configuration involving two or more physical
disks. Originally, RAID was developed as a data-redundancy measure to minimize data loss caused by a disk
failure. As technology improved, RAID also provided increased storage capabilities.
Several levels of RAID can be implemented through software (known as software RAID) or special
hardware controllers (known as hardware RAID). Software RAID is typically implemented from the host
computer's OS. Hardware RAID uses its own controller as well as a processor and memory connected to the
host computer.
13. The following are concepts that are often associated with
RAID:
Striping: Data is split across many drives.
Mirroring: Data is replicated between multiple drives.
Parity: a computed value used to rebuild lost data
mathematically.
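Parity in block-parity RAID levels is computed with XOR; this toy byte-level sketch (values made up) shows how a lost data byte is rebuilt from the parity and the surviving byte:

```shell
# Two data bytes striped across two drives (hypothetical values)
d1=$(( 0xA5 ))
d2=$(( 0x3C ))

# The parity drive stores their XOR
parity=$(( d1 ^ d2 ))

# If the drive holding d1 fails, XOR of parity and d2 rebuilds it
rebuilt=$(( parity ^ d2 ))
printf 'parity=0x%02X rebuilt_d1=0x%02X\n' "$parity" "$rebuilt"
# prints parity=0x99 rebuilt_d1=0xA5
```

Real arrays apply the same XOR per block across all member drives, which is why losing any single drive is recoverable.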
14. Different RAID Levels
RAID-0 (Striping)
RAID-1 (Mirroring)
RAID-2 (Bit-Level Striping with Dedicated Parity)
RAID-3 (Byte-Level Striping with Dedicated Parity)
RAID-4 (Block-Level Striping with Dedicated Parity)
RAID-5 (Block-Level Striping with Distributed Parity)
RAID-6 (Block-Level Striping with Two Parity Blocks)
15. Acquiring RAID Disks
There's no simple method for getting an image of a RAID server's disks. You need to address the following
concerns:
How much data storage is needed to acquire all data for a forensics image?
What type of RAID is used? Is it Windows RAID 0 or 1 or an integrated hardware firmware
vendor's RAID 5, 10, or 15? Is it another unknown configuration or OS? If it's a RAID 1, 10,
or 15 server, do you need to have all drives connected so that the OS sees their contents?
Some older RAID 1 systems required connecting both drives to make the data readable, which
might also apply to RAID 10 and 15.
Do you have an acquisition tool capable of copying the data correctly?
Can the tool read a forensic copy of a RAID image?
Can the tool read split data saves of each RAID disk, and then combine all images of each
disk into one RAID virtual drive for analysis?
16. With the larger disks now available, copying small RAID systems to one large disk is possible,
similar to the way non-RAID suspect drives are copied. For example, a small server running eight
36 GB SCSI drives in a RAID 0 tower requires about a 300 GB SATA or IDE (PATA) drive.
Less data storage is needed if a proprietary format acquisition is used with compression applied.
All forensics analysis tools can analyze an image because they see the acquired data as one large
drive, not eight separate drives.
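The storage estimate is simple arithmetic, quickly checked in the shell:

```shell
# Eight 36 GB drives striped as RAID 0 hold at most 8 * 36 GB,
# so a roughly 300 GB target drive fits the combined image
echo $(( 8 * 36 ))   # prints 288
```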
Several forensics vendors have added RAID recovery features. These vendors typically specialize in
one or two types of RAID formats. The following are some vendors offering RAID acquisition
functions:
Guidance Software EnCase
X-Ways Forensics
AccessData FTK
Runtime Software
R-Tools Technologies