feat(handler): Add support for EWF format #582
Conversation
@nyuware please squash your commits and rebase on main
The code pushed works for the zlib compressed chunks. The remaining problems are:

In the past, I was using this method to calculate the remaining size of the section, but it seems that the section_size is incorrect:

```python
if volume_descriptor is not None:
    remaining_size = data_descriptor.section_size - len(data_descriptor)
    while remaining_size > 0:
        chunk_size = min(sectors_per_chunk, remaining_size)
        for chunk in iterate_file(file, file.tell(), chunk_size):
            outfile.write(chunk[:-4])
        remaining_size -= chunk_size
```
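For what it's worth, a minimal sketch of how the two chunk flavours could be decoded; the `decode_chunk` helper and the trailing Adler-32 assumption for uncompressed chunks (the apparent reason for `chunk[:-4]` above) are mine, not part of the PR:

```python
import zlib


def decode_chunk(chunk: bytes, compressed: bool) -> bytes:
    """Return the raw sector data for one chunk.

    Assumption: compressed chunks are plain zlib streams, while
    uncompressed chunks carry a trailing 4-byte checksum that
    must be stripped, as in the snippet above.
    """
    if compressed:
        return zlib.decompress(chunk)
    return chunk[:-4]  # drop the trailing 4-byte checksum


payload = b"\x00" * 512
assert decode_chunk(zlib.compress(payload), compressed=True) == payload
assert decode_chunk(payload + zlib.adler32(payload).to_bytes(4, "big"), compressed=False) == payload
```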
Your integration tests are not well structured. You've placed them in `tests/integration/archive/ewf/ewf`, but your handlers are named `ewfl` and `ewfe`.

You must have these two directories:

`tests/integration/archive/ewf/ewfe`
`tests/integration/archive/ewf/ewfl`

These directories must contain integration test files for both cases.
I created two samples for each format, one in clear text and one zlib compressed.
unblob/handlers/archive/ewf/ewf.py
Outdated
```python
if data_descriptor.definition == Definition.VOLUME.value:
    volume_descriptor = self._struct_parser.parse(
        "volume_descriptor_t", file, Endian.LITTLE
    )
    sectors_per_chunk = find_chunk_size(volume_descriptor)

if data_descriptor.definition == Definition.SECTORS.value:
    position = file.tell()

if data_descriptor.definition == Definition.TABLE.value:
    self.table_descriptor(file, position, outdir, sectors_per_chunk)
```
Can you explain how EWF is structured? Can we observe VOLUME, SECTORS, and TABLE in random orders? Do they follow a strict ordering? Is it possible to have multiple VOLUMEs? What about the others?
```python
output_file.write(compressed_chunk)
output_file.write(chunk)


def extract(self, inpath: Path, outdir: Path):
```
There is a logic/naming issue here. The extraction is performed in `table_descriptor` while the parsing is performed in `extract`. Please adjust your functions so they reflect what they're actually doing. Ideally you would parse the EWF structure and return data that can be used by `extract` in order to create files in `outdir`.
Temporary fix for #419
Expert Witness Format (EWF) works with multiple forensics tools such as Encase, FTK, and Linen; they mostly share the same specification.
The format is divided into multiple sections; each section contains its own size and the offset of the next section.
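The chained-section layout described above can be sketched as follows; the 32-byte descriptor layout used here (16-byte type string, u64 next offset, u64 size) is a simplified assumption for illustration, not the exact on-disk format:

```python
import io
import struct

# Hypothetical 32-byte descriptor: 16-byte type, u64 next offset, u64 size.
DESCRIPTOR = struct.Struct("<16sQQ")


def walk_sections(stream):
    """Follow each section's next-offset until the chain stops advancing."""
    sections, offset = [], 0
    while True:
        stream.seek(offset)
        type_, next_offset, size = DESCRIPTOR.unpack(stream.read(DESCRIPTOR.size))
        sections.append((type_.rstrip(b"\x00").decode(), size))
        if next_offset <= offset:  # terminal section points back at itself
            break
        offset = next_offset
    return sections


# Build a three-section chain in memory to exercise the walk.
buf = bytearray(96)
buf[0:32] = DESCRIPTOR.pack(b"header", 32, 32)
buf[32:64] = DESCRIPTOR.pack(b"volume", 64, 32)
buf[64:96] = DESCRIPTOR.pack(b"done", 64, 32)
sections = walk_sections(io.BytesIO(bytes(buf)))
```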
Right now this handler works for the following formats:
This handler currently doesn't support the following formats:
It also doesn't support zlib compression.
This is only a temporary solution; the proper solution would be to use the "table" section, which holds the offset of each chunk as well as the compression level of the chunk itself.
The table is built like this, but can change depending on the version used:
TABLE HEADER -> how many offsets are in the table
ARRAY OF ENTRIES -> holds the relative offset of each chunk; the most significant bit indicates whether the chunk is compressed.
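A minimal sketch of decoding such table entries, assuming 32-bit little-endian values whose top bit is the compression flag (the helper name is mine):

```python
import struct

MSB = 0x80000000  # most significant bit of a 32-bit table entry


def parse_table_entries(data: bytes):
    """Split each 32-bit entry into (relative_offset, is_compressed)."""
    entries = struct.unpack(f"<{len(data) // 4}I", data)
    return [(e & ~MSB, bool(e & MSB)) for e in entries]


raw = struct.pack("<3I", 0x100, 0x80000300, 0x600)
assert parse_table_entries(raw) == [(0x100, False), (0x300, True), (0x600, False)]
```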
Some generous people have documented the format here.