Pedestrian Action Anticipation using Contextual Feature Fusion in Stacked RNNs

SF-GRU


This is the Python implementation of the paper A. Rasouli, I. Kotseruba, and J. K. Tsotsos, "Pedestrian Action Anticipation using Contextual Feature Fusion in Stacked RNNs," BMVC 2019.


Dependencies

The interface is written and tested with Python 3.5 and requires the following external libraries:

  • tensorflow (tested with 1.9 and 1.14)
  • keras (tested with 2.1 and 2.2)
  • scikit-learn
  • numpy
  • pillow
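Assuming a pip-based setup, the list above can be installed in one command. The version pins below simply mirror the tested ranges noted above; they are a suggestion, not a hard requirement of the repository:

```shell
# Illustrative install command; pins reflect the versions the README says were tested.
pip install "tensorflow>=1.9,<1.15" "keras>=2.1,<2.3" scikit-learn numpy pillow
```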

Datasets

The code is trained and tested with the PIE and JAAD datasets. We used the keras implementation of open-pose to generate poses for the PIE dataset; they can be found at data/features/pie/poses.

Train

A sample training script is provided below:

from sf_gru import SFGRU
from pie_data import PIE

data_opts = { 'seq_type': 'crossing',
              'data_split_type': 'random',
               ... }
imdb = PIE(data_path=<path_to_pie>)

model_opts = {'obs_input_type': ['local_box', 'local_context', 'pose', 'box', 'speed'],
              ...}

method_class = SFGRU()
beh_seq_train = imdb.generate_data_trajectory_sequence('train', **data_opts)
saved_files_path = method_class.train(beh_seq_train)

from pie_data import PIE imports the data interface. Download the interface from the corresponding annotation repository.
data_opts = { 'seq_type': 'crossing', ... } specifies the data generation parameters from the dataset. Make sure that seq_type is set to 'crossing'. Refer to the generate_data_trajectory_sequence() method in the corresponding interface for more information.
model_opts = {...} specifies how the training data should be prepared for the model. Refer to sf_gru.py:get_data() for more information on how to set the parameters.
method_class = SFGRU() instantiates an object of type SFGRU.
imdb.generate_data_trajectory_sequence() generates data sequences from the dataset interface.
method_class.train() trains the model and returns the path to the folder where model and data processing parameters are saved.

A sample of training code can be found in train_test.py. All the default parameters in the script replicate the conditions in which the model was trained for the paper. Note that since 'random' split data is used, the model may yield different performance at test time.

Test

A sample test script is provided below:

from sf_gru import SFGRU
from pie_data import PIE

data_opts = { 'seq_type': 'crossing',
              'data_split_type': 'random',
               ... }
imdb = PIE(data_path=<path_to_pie>)

method_class = SFGRU()
beh_seq_test = imdb.generate_data_trajectory_sequence('test', **data_opts)
saved_files_path = <path_to_model_folder>
acc, auc, f1, precision, recall = method_class.test(beh_seq_test, saved_files_path)

The procedure is similar to training, except that there is no need to specify model_opts, as these are saved in the model folder at training time.
If only testing is run (without training), saved_files_path should be specified; it should point to the folder where the model and training parameters are saved. The final model used for the paper can be found at data/models/pie/sf-rnn. Note that if testing follows training, the path is returned by the train() function.
method_class.test() evaluates the model and returns the results using the following 5 metrics: acc (accuracy), auc (area under curve), f1, precision, and recall. A sample of test code can be found in train_test.py.
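For reference, the five metrics returned by test() can be reproduced with scikit-learn on toy labels. This is only an illustration of the metric definitions on made-up data, not the repository's internal evaluation code:

```python
# Toy demonstration of the 5 metrics (acc, auc, f1, precision, recall)
# using scikit-learn on hypothetical crossing labels and model scores.
from sklearn.metrics import (accuracy_score, roc_auc_score, f1_score,
                             precision_score, recall_score)

y_true = [1, 0, 1, 1, 0, 1]               # ground-truth crossing labels (made up)
y_prob = [0.9, 0.2, 0.7, 0.4, 0.1, 0.8]   # model scores (made up)
y_pred = [int(p >= 0.5) for p in y_prob]  # thresholded binary predictions

acc = accuracy_score(y_true, y_pred)        # fraction of correct predictions
auc = roc_auc_score(y_true, y_prob)         # area under the ROC curve (uses scores)
f1 = f1_score(y_true, y_pred)               # harmonic mean of precision and recall
precision = precision_score(y_true, y_pred)
recall = recall_score(y_true, y_pred)
```

Note that auc is computed from the continuous scores, while the other four metrics use the thresholded predictions.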

Citation

If you use our work, please cite:

@inproceedings{rasouli2017they,
  title={Pedestrian Action Anticipation using Contextual Feature Fusion in Stacked RNNs},
  author={Rasouli, Amir and Kotseruba, Iuliia and Tsotsos, John K},
  booktitle={BMVC},
  year={2019}
}

Authors

Please send email to arasouli.ai@gmail.com or yulia_k@eecs.yorku.ca if there are any problems with downloading or using the data.

License

This project is licensed under the MIT License - see the LICENSE file for details.
