
HOTO? #44
Open
Rut-B opened this issue Mar 18, 2024 · 5 comments

Comments


Rut-B commented Mar 18, 2024

I'm trying to figure out how to run the shapy model on my own RGB images, without needing any additional files like keypoints or vertices.

I would greatly appreciate it if you could explain how to run the code with my RGB images only, to get the body measurements.

Rut-B (Author) commented Mar 18, 2024

I succeeded in running this command in the Colab notebook:

%%shell
cd /content/shapy/regressor
python demo.py \
  --save-vis true \
  --save-params true \
  --save-mesh true \
  --split test \
  --datasets openpose \
  --output-folder samples/shapy_fit_ruti/ \
  --exp-cfg configs/b2a_expose_hrnet_demo.yaml \
  --exp-opts output_folder=../data/trained_models/shapy/SHAPY_A \
    part_key=pose \
    datasets.pose.openpose.data_folder=../samples \
    datasets.pose.openpose.img_folder=image \
    datasets.pose.openpose.keyp_folder=openpose \
    datasets.batch_size=1 \
    datasets.pose_shape_ratio=1.0
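
For context, my reading of the datasets.pose.openpose.* flags (not confirmed against the shapy docs) is that data_folder=../samples is resolved relative to the regressor directory and should contain an image subfolder with the pictures and an openpose subfolder with one OpenPose keypoint JSON per picture. A small sanity-check sketch under that assumption; the "*_keypoints.json" naming is the OpenPose default, not something required by shapy:

```python
import json
from pathlib import Path

# Hypothetical layout check mirroring the flags used above
# (data_folder=../samples, img_folder=image, keyp_folder=openpose).
data_folder = Path("../samples")
img_folder = data_folder / "image"
keyp_folder = data_folder / "openpose"

for img_path in sorted(img_folder.glob("*")):
    if img_path.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
        continue
    # OpenPose names its output "<image stem>_keypoints.json" by default;
    # adjust this if your keypoint files follow a different naming scheme.
    keyp_path = keyp_folder / f"{img_path.stem}_keypoints.json"
    if not keyp_path.exists():
        print(f"missing keypoints for {img_path.name}")
        continue
    with open(keyp_path) as f:
        n_people = len(json.load(f).get("people", []))
    print(f"{img_path.name}: {n_people} detected person(s)")
```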

muelea (Owner) commented Mar 18, 2024

You need to adapt the data processing script to do that. We use the keypoint file to crop the person in the picture. If you have a picture that is already cropped, you can skip this step in the data loader.
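
To make "skip this step" concrete: the sketch below is not shapy's actual data loader, just a hypothetical illustration of the idea. The bounding box normally derived from the keypoints falls back to the full image extent when no keypoint file exists, so an already-cropped photo passes through unchanged; in the real loader the equivalent change goes wherever the keypoints are turned into a crop.

```python
import numpy as np
from PIL import Image

def get_person_bbox(img_path, keypoints=None, conf_thresh=0.3):
    """Hypothetical helper (not shapy code): bounding box from 2D keypoints
    when they exist, otherwise the full image for an already-cropped photo."""
    width, height = Image.open(img_path).size
    if keypoints is None:
        # No keypoint file: treat the whole frame as the crop region.
        return np.array([0.0, 0.0, width, height], dtype=np.float32)
    kpts = np.asarray(keypoints, dtype=np.float32).reshape(-1, 3)
    valid = kpts[kpts[:, 2] > conf_thresh, :2]
    if len(valid) == 0:  # nothing confidently detected: same fallback
        return np.array([0.0, 0.0, width, height], dtype=np.float32)
    (x0, y0), (x1, y1) = valid.min(axis=0), valid.max(axis=0)
    return np.array([x0, y0, x1, y1], dtype=np.float32)
```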

Rut-B (Author) commented Mar 19, 2024

@muelea thanks for your response!!
If I just have an RGB image that was not cropped, should I prepare anything?
What is the command to run in order to use SHAPY to get the body measurements from an in-the-wild RGB image?

python demo.py \
  --save-vis true \
  --save-params true \
  --save-mesh true \
  --split test \
  --datasets openpose \
  --output-folder ../samples/shapy_fit/ \
  --exp-cfg configs/b2a_expose_hrnet_demo.yaml \
  --exp-opts output_folder=../data/trained_models/shapy/SHAPY_A \
    part_key=pose \
    datasets.pose.openpose.data_folder=../samples \
    datasets.pose.openpose.img_folder=images \
    datasets.pose.openpose.keyp_folder=openpose \
    datasets.batch_size=1 \
    datasets.pose_shape_ratio=1.0

What are these parameters?
datasets.pose.openpose.data_folder=../samples
datasets.pose.openpose.img_folder=images
datasets.pose.openpose.keyp_folder=openpose

Should they be specific to my images?
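
In case it matters for the answer: if the openpose loader reads standard OpenPose JSON files from keyp_folder (an assumption on my side, I have not checked the dataset code), I could presumably write keypoints from any 2D pose detector in that format, one file per image, something like:

```python
import json
from pathlib import Path

def write_openpose_json(out_path, body_keypoints):
    """Write keypoints in OpenPose's JSON layout, assuming the BODY_25 model:
    25 joints, each given as (x, y, confidence). Face/hand arrays left empty."""
    flat = [float(v) for xyc in body_keypoints for v in xyc]
    assert len(flat) == 25 * 3, "expected 25 (x, y, confidence) triplets"
    data = {
        "version": 1.3,
        "people": [{
            "person_id": [-1],
            "pose_keypoints_2d": flat,
            "face_keypoints_2d": [],
            "hand_left_keypoints_2d": [],
            "hand_right_keypoints_2d": [],
        }],
    }
    Path(out_path).parent.mkdir(parents=True, exist_ok=True)
    with open(out_path, "w") as f:
        json.dump(data, f)

# e.g. write_openpose_json("../samples/openpose/my_photo_keypoints.json", kpts)
# where kpts is a list of 25 (x, y, confidence) triplets from a 2D detector.
```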

Rut-B (Author) commented Mar 20, 2024

@muelea I would greatly appreciate it if you could provide an explanation.
