
feat(handler): Add support for EWF format #582

Closed
nyuware wants to merge 2 commits into main from EWF-format

Conversation

@nyuware (Contributor) commented on May 17, 2023

Temporary fix for #419.
Expert Witness Format (EWF) is used by multiple forensics tools such as Encase, FTK, and Linen; they mostly share the same specification.
The format is divided into multiple sections; each section contains its own size and the offset of the next section.
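
For illustration, walking this section chain could look roughly like the sketch below (field names, field sizes, and the descriptor layout are assumptions for the sketch, not this handler's code):

import struct

def iter_sections(file):
    # Assumed descriptor layout: a 16-byte type string, then a 64-bit absolute
    # offset of the next section and a 64-bit section size, both little-endian.
    # The last section is assumed to point back to itself (or to offset 0).
    offset = file.tell()
    while True:
        file.seek(offset)
        header = file.read(32)
        section_type = header[:16].rstrip(b"\x00")
        next_offset, section_size = struct.unpack_from("<QQ", header, 16)
        yield section_type, offset, section_size
        if next_offset in (0, offset):
            break
        offset = next_offset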
Right now this handler works for the following formats:

  • Linen 5-6
  • Encase 2-7
  • EWFX
  • FTK

This handler currently doesn't support the following formats:

  • Encase 1
  • SMART
  • EWF

It also doesn't support zlib compression.

This is only a temporary solution. The proper solution would be to use the "table" section, which holds the offset of each chunk as well as its compression state.
The table is laid out like this, but it can change depending on the version used (a rough decoding sketch follows below):

TABLE HEADER -> how many offsets are in the table
ARRAY OF ENTRIES -> holds the relative offset of each chunk; the most significant bit indicates whether the chunk is compressed

Some generous people made the documentation of the format here
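
For illustration only, here is a rough sketch of how such table entries could be decoded, based purely on the description above; the entry width, masks, and names are assumptions, not the handler's actual code:

import struct

COMPRESSED_FLAG = 0x80000000  # most significant bit of a 32-bit table entry
OFFSET_MASK = 0x7FFFFFFF      # remaining 31 bits: chunk offset relative to a base

def parse_table_entries(table_data: bytes, entry_count: int, base_offset: int):
    # Assumed layout: each entry is a little-endian 32-bit value whose MSB
    # flags a zlib-compressed chunk and whose remaining bits are the offset
    # of the chunk relative to base_offset.
    entries = []
    for i in range(entry_count):
        (raw,) = struct.unpack_from("<I", table_data, i * 4)
        compressed = bool(raw & COMPRESSED_FLAG)
        chunk_offset = base_offset + (raw & OFFSET_MASK)
        entries.append((chunk_offset, compressed))
    return entries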

@qkaiser (Contributor) commented on Oct 4, 2023

@nyuware please squash your commits and rebase on main

@nyuware (Contributor, Author) commented on Oct 4, 2023

The pushed code works for zlib-compressed chunks.

The remaining problems are:

  1. Adler-32 checksum verification is not implemented (a rough sketch of such a check is at the end of this comment).
  2. When the file is not zlib-compressed, sectors_per_chunk can be bigger than the remaining size of the section, meaning that outfile.write(chunk) will write more data than it is supposed to.

In the past, I was using this method to calculate the remaining size of the section, but it seems that the section_size is incorrect.

if volume_descriptor is not None:
    # Size of the section payload, minus the descriptor itself.
    remaining_size = data_descriptor.section_size - len(data_descriptor)
    while remaining_size > 0:
        # Never read past the end of the section.
        chunk_size = min(sectors_per_chunk, remaining_size)
        for chunk in iterate_file(file, file.tell(), chunk_size):
            # Drop the last 4 bytes of each chunk (presumably its checksum).
            outfile.write(chunk[:-4])
        remaining_size -= chunk_size
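
For reference, a minimal sketch of what an Adler-32 check on a chunk might look like, assuming the last 4 bytes of each stored chunk are a little-endian Adler-32 of the preceding data (this layout is an assumption, not confirmed here):

import zlib

def chunk_checksum_ok(chunk: bytes) -> bool:
    # Assumes the chunk ends with a 4-byte little-endian Adler-32 checksum
    # computed over the data that precedes it.
    data, stored = chunk[:-4], chunk[-4:]
    return zlib.adler32(data) == int.from_bytes(stored, "little")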

@@ -0,0 +1,202 @@
import io
Review comment from a Contributor:

Your integration tests are not well structured. You've placed them in tests/integration/archive/ewf/ewf but your handlers are named ewfl and ewfe.

You must have these two directories:

  • tests/integration/archive/ewf/ewfe
  • tests/integration/archive/ewf/ewfl

These directories must contain integration test files for both cases.

Reply from nyuware (Contributor, Author):

I created 2 samples for each format, one in cleartext and one zlib-compressed.

Comment on lines 151 to 161
if data_descriptor.definition == Definition.VOLUME.value:
    volume_descriptor = self._struct_parser.parse(
        "volume_descriptor_t", file, Endian.LITTLE
    )
    sectors_per_chunk = find_chunk_size(volume_descriptor)

if data_descriptor.definition == Definition.SECTORS.value:
    position = file.tell()

if data_descriptor.definition == Definition.TABLE.value:
    self.table_descriptor(file, position, outdir, sectors_per_chunk)
Review comment from a Contributor:

Can you explain how EWF is structured? Can we observe VOLUME, SECTORS, and TABLE in random order? Do they follow a strict ordering? Is it possible to have multiple VOLUMEs? What about the others?

output_file.write(compressed_chunk)
output_file.write(chunk)

def extract(self, inpath: Path, outdir: Path):
Review comment from a Contributor:

There is a logic/naming issue here. The extraction is performed in table_descriptor while the parsing is performed in extract. Please adjust your functions so they reflect what they are actually doing. Ideally you would parse the EWF structure and return data that can be used by extract in order to create files in outdir.
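
For instance, a minimal sketch of that split might look like this (names such as parse_ewf and ChunkInfo are hypothetical illustrations, not part of unblob's API):

import zlib
from dataclasses import dataclass
from pathlib import Path
from typing import List

@dataclass
class ChunkInfo:
    offset: int        # absolute offset of the chunk in the input file
    size: int          # size of the stored chunk in bytes
    compressed: bool   # taken from the MSB of the table entry

def parse_ewf(file) -> List[ChunkInfo]:
    # Walk the section chain and the table section, returning chunk
    # descriptors instead of writing anything. (Parsing details omitted.)
    ...

def extract(inpath: Path, outdir: Path):
    outdir.mkdir(parents=True, exist_ok=True)
    with inpath.open("rb") as file, (outdir / inpath.stem).open("wb") as outfile:
        for chunk in parse_ewf(file):
            file.seek(chunk.offset)
            data = file.read(chunk.size)
            outfile.write(zlib.decompress(data) if chunk.compressed else data)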

@nyuware nyuware marked this pull request as draft November 8, 2023 12:49
@nyuware nyuware closed this Feb 7, 2024
@nyuware nyuware deleted the EWF-format branch February 7, 2024 13:10