
omar-mohamed/Transformer-Arabic-To-English


Arabic to English Machine Translation with Google Transformer Model

This is an implementation of Machine Translation from Arabic to English using the Transformer Model.

This repo is based on the code provided by the authors.

We train the model on the OpenSubtitles v2018 Arabic-English parallel dataset.

Preprocessing

Before training, we strip all tashkeel (Arabic diacritics) from the Arabic sentences in the OpenSubtitles dataset using pyarabic.
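The same effect can be sketched with a plain regex over the Arabic diacritic block (a minimal illustration only; the repo itself uses pyarabic for this step, and this regex is not its actual code):

```python
import re

# Arabic tashkeel (short vowels, tanween, shadda, sukun) occupy U+064B-U+0652.
TASHKEEL = re.compile(r"[\u064B-\u0652]")

def strip_tashkeel(text: str) -> str:
    """Remove Arabic diacritics, leaving the base letters intact."""
    return TASHKEEL.sub("", text)

print(strip_tashkeel("مَرْحَبًا"))  # مرحبا
```

Stripping tashkeel shrinks the effective vocabulary, since the same word with and without diacritics collapses to one token.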

Training

For now, we have only trained a slightly modified version of the tiny parameter set, with the following hyperparameters:

num_hidden_layers=6,
hidden_size=64,
num_heads=4,
filter_size=256,

layer_postprocess_dropout=0.1,
attention_dropout=0.1,
relu_dropout=0.1,

optimizer_adam_beta1=0.9,
optimizer_adam_beta2=0.997,
optimizer_adam_epsilon=1e-09
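Collected into a plain Python dict purely for illustration (the key names mirror the list above; this is not the repo's actual config object):

```python
# Hypothetical container for the hyperparameters listed above.
tiny_params = {
    "num_hidden_layers": 6,
    "hidden_size": 64,
    "num_heads": 4,
    "filter_size": 256,
    "layer_postprocess_dropout": 0.1,
    "attention_dropout": 0.1,
    "relu_dropout": 0.1,
    "optimizer_adam_beta1": 0.9,
    "optimizer_adam_beta2": 0.997,
    "optimizer_adam_epsilon": 1e-9,
}

# hidden_size must divide evenly across the attention heads:
# here each of the 4 heads attends over a 16-dimensional slice.
assert tiny_params["hidden_size"] % tiny_params["num_heads"] == 0
```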

[Figure: training loss curve]

Time

The model was trained for ~20 hours: 10 epochs at ~2 hours/epoch.

Evaluation results

We sampled a 1000-sentence portion from the OpenSubtitles v2018 training set for evaluation. Below are the case-insensitive BLEU scores after 10 epochs.

Param set    BLEU (case-insensitive)
tiny         26.54

[Figure: evaluation BLEU]

Sample Translations

Arabic (in)                                              English (out)
دائما لشخص واحد Always for one person.
وهذا لن يشكل فارق، فأنا أقود سيارتي بهذا الطريق اسبوعيا And that won't be a difference, I'm driving my car this way a week.
أنا لا أبحث عن الرجل المناسب I'm not looking for the right guy.
اعتقد أنني بدأت أعجب بها وهي أيضا تبادلني نفس الشعور I think I'm starting to like her, and she also makes me the same feeling.
ماذا لو ان هذه هي آخر فرصة لي للتحدث؟ What if this is the last chance to talk to me?

Train your own Model

Below are the commands for running the Transformer model. See the detailed instructions in the official TensorFlow models repo for more on running the model.

cd /path/to/models/official/transformer

# Ensure that PYTHONPATH is correctly defined as described in
# https://github.com/tensorflow/models/tree/master/official#requirements
# export PYTHONPATH="$PYTHONPATH:/path/to/models"

# Export variables (this repo trained the tiny parameter set)
PARAM_SET=tiny
DATA_DIR=$HOME/transformer/data
MODEL_DIR=$HOME/transformer/model_$PARAM_SET
VOCAB_FILE=$DATA_DIR/vocab.ende.32768

# Download training/evaluation datasets
python data_download.py --data_dir=$DATA_DIR

# Train the model for 10 epochs, and evaluate after every epoch.
python transformer_main.py --data_dir=$DATA_DIR --model_dir=$MODEL_DIR \
    --vocab_file=$VOCAB_FILE --param_set=$PARAM_SET \
    --bleu_source=test_data/dev.ar --bleu_ref=test_data/dev.en

# Run during training in a separate process to get continuous updates,
# or after training is complete.
tensorboard --logdir=$MODEL_DIR

# Translate some text using the trained model (input is Arabic)
python translate.py --model_dir=$MODEL_DIR --vocab_file=$VOCAB_FILE \
    --param_set=$PARAM_SET --text="مرحبا بالعالم"

# Compute the model's BLEU score on the Arabic dev set.
python translate.py --model_dir=$MODEL_DIR --vocab_file=$VOCAB_FILE \
    --param_set=$PARAM_SET --file=test_data/dev.ar --file_out=translation.en
python compute_bleu.py --translation=translation.en --reference=test_data/dev.en

About

Arabic-to-English translation using Transformer neural nets.
