Rank-consistent Ordinal Regression for Neural Networks

This repository contains the PyTorch model code for the paper

  • Wenzhi Cao, Vahid Mirjalili, Sebastian Raschka (2020): Rank Consistent Ordinal Regression for Neural Networks with Application to Age Estimation. Pattern Recognition Letters.
    [Journal Paper] [ArXiv Preprint]
    [PyTorch Package] [Keras Port]
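At its core, CORAL casts ordinal regression over K classes as K-1 binary classification tasks with shared weights. As a quick illustration of the extended binary label encoding used for these tasks (the function name here is ours, not from the repo):

```python
def label_to_levels(label, num_classes):
    """Encode an ordinal label as K-1 binary 'levels'.

    For label y, the first y entries are 1 ("the rank exceeds this
    threshold") and the remaining entries are 0.
    """
    return [1 if k < label else 0 for k in range(num_classes - 1)]

# Example: with 5 age classes, label 2 becomes four binary targets.
print(label_to_levels(2, 5))  # → [1, 1, 0, 0]
```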

This GitHub repository contains the code files and training logs used in the paper. If you are primarily interested in using CORAL, a PyTorch library with tutorials can be found here:

PyTorch Model Code

Note that the model code is identical across the different datasets; however, we hard-coded the file paths to the datasets at the top of each file and use dataloaders specific to the corresponding dataset organization. If you wish to run the code, you will likely need to change the file paths in the scripts depending on where you saved the image datasets and label files.
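For illustration, a dataset-specific loader along these lines might look as follows; the paths and CSV column names here are hypothetical placeholders, not the ones used in the repo, and a real PyTorch version would subclass `torch.utils.data.Dataset` and load the image in `__getitem__`:

```python
import csv
import os

# Hypothetical hard-coded paths, as in the training scripts; adjust to
# wherever you stored the images and the label CSV files.
IMAGE_DIR = "/path/to/afad/images"
CSV_PATH = "/path/to/afad/train.csv"


class FaceAgeDataset:
    """Minimal duck-typed dataset: returns (image path, age label) pairs."""

    def __init__(self, csv_path, image_dir):
        self.image_dir = image_dir
        with open(csv_path, newline="") as f:
            reader = csv.DictReader(f)
            # Hypothetical column names: "file" and "label".
            self.rows = [(r["file"], int(r["label"])) for r in reader]

    def __len__(self):
        return len(self.rows)

    def __getitem__(self, idx):
        fname, label = self.rows[idx]
        return os.path.join(self.image_dir, fname), label
```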

All code was run with PyTorch 1.5 and Python 3.7; we do not guarantee compatibility with older or newer PyTorch and Python versions.

The model code can be found in the corresponding subdirectory, and the code files are labeled using a dataset-method naming scheme:

  • the dataset part of the file name refers to either AFAD, MORPH-2, or CACD;
  • the method part refers to either CORAL, ordinal regression as in Niu et al., or cross-entropy.


The following code trains one of the models:

python --seed 1 --cuda 0 --outpath afad-model1
  • --seed: Integer for the random seed; used for training-set shuffling and the model weight initialization (note that CUDA convolutions are not fully deterministic).
  • --cuda: The CUDA device number of the GPU to be used for training (--cuda 0 refers to the 1st GPU).
  • --outpath: Path for saving the training log and the parameters of the trained model.
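The flags above can be parsed with a few lines of argparse; the following is a sketch consistent with the flags shown, not the repo's exact parser:

```python
import argparse


def make_parser():
    parser = argparse.ArgumentParser()
    parser.add_argument("--seed", type=int, default=-1,
                        help="random seed for shuffling and weight init")
    parser.add_argument("--cuda", type=int, default=-1,
                        help="CUDA device index; 0 selects the 1st GPU")
    parser.add_argument("--outpath", type=str, required=True,
                        help="path for the training log and model parameters")
    return parser


args = make_parser().parse_args(
    ["--seed", "1", "--cuda", "0", "--outpath", "afad-model1"])
print(args.seed, args.cuda, args.outpath)  # → 1 0 afad-model1
```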

Here is an overview of the differences between a regular CNN and a CORAL-CNN:
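In code terms, the difference boils down to the output layer: a regular CNN ends in a fully connected layer with one logit per class, while a CORAL-CNN shares a single weight vector across the K-1 binary classifiers and learns only independent bias (threshold) terms. A minimal sketch of the two heads in plain PyTorch (module and variable names are ours, not the repo's):

```python
import torch
import torch.nn as nn

NUM_CLASSES = 10    # hypothetical number of age classes
NUM_FEATURES = 512  # size of the penultimate feature vector

# Regular CNN head: one logit per class, trained with cross-entropy.
ce_head = nn.Linear(NUM_FEATURES, NUM_CLASSES)


class CoralHead(nn.Module):
    """CORAL head: one shared weight vector, K-1 independent biases."""

    def __init__(self, num_features, num_classes):
        super().__init__()
        self.fc = nn.Linear(num_features, 1, bias=False)
        self.biases = nn.Parameter(torch.zeros(num_classes - 1))

    def forward(self, x):
        # Shared logit plus per-threshold bias → K-1 binary-task logits.
        return self.fc(x) + self.biases


features = torch.randn(4, NUM_FEATURES)
print(ce_head(features).shape)                                # torch.Size([4, 10])
print(CoralHead(NUM_FEATURES, NUM_CLASSES)(features).shape)   # torch.Size([4, 9])
```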


Training Logs and Trained Models from the Paper

We share all training logs in this GitHub repository under the ./experiment-logs subdirectory. Due to the large file size (~85 MB per model), we could not share the trained models on GitHub; however, all trained models can be downloaded from Google Drive via the following link:

Image files

The image files of the face image datasets are available from the following websites:

  • CACD:

  • AFAD:

  • MORPH-2:

Data preprocessing code

We provide the dataset preprocessing code that we used to prepare the CACD and MORPH-2 datasets as described in the paper. The code is located in a separate subdirectory. AFAD did not require further preprocessing.

Labels and train/test splits

We provide the age labels (obtained from the original dataset resources) and the train/test splits we used in CSV format, located in a separate subdirectory:

  • CACD: labels 0-48 correspond to ages 14-62
  • AFAD: labels 0-25 correspond to ages 15-40
  • MORPH-2: labels 0-54 correspond to ages 16-70
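The label ranges above are simple offsets of the raw ages; a small helper (ours, for illustration) converts between labels and ages:

```python
# Minimum age per dataset; label = age - offset (offsets taken from
# the label ranges listed above).
AGE_OFFSETS = {"cacd": 14, "afad": 15, "morph2": 16}


def label_to_age(dataset, label):
    return label + AGE_OFFSETS[dataset]


def age_to_label(dataset, age):
    return age - AGE_OFFSETS[dataset]


print(label_to_age("cacd", 0))     # → 14
print(label_to_age("afad", 25))    # → 40
print(age_to_label("morph2", 70))  # → 54
```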

Using Trained Models

We share the pre-trained models from the paper that can be used to make predictions on AFAD, MORPH-2, or CACD images. Please see the README in the single-image-prediction__w-pretrained-models subdirectory for details.
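For reference, CORAL turns the K-1 binary logits into a rank prediction by counting how many binary tasks fire (predicted probability above 0.5, i.e., logit above 0); rank consistency guarantees the task probabilities are monotonically decreasing, so this count is the predicted label. In plain Python:

```python
def predict_label(logits):
    """Predicted ordinal label = number of binary-task logits > 0.

    sigmoid(l) > 0.5 exactly when l > 0, so no sigmoid call is needed
    for the hard prediction.
    """
    return sum(1 for l in logits if l > 0.0)


# Four thresholds (5 classes): the first two tasks fire → label 2.
print(predict_label([2.3, 0.8, -0.4, -3.1]))  # → 2
```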

Implementations for Other Deep Learning Frameworks

Porting Guide

Our models were originally implemented in PyTorch 1.5. A recipe for porting the code is provided in coral-implementation-recipe.ipynb. Also see the file-diff comparing CORAL with a regular CNN.
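When porting to another framework, the two main pieces are the shared-weight output layer and the loss: a sum of binary cross-entropy terms over the K-1 tasks (the paper also allows per-task importance weights, omitted here). A framework-agnostic sketch of the unweighted loss for a single example:

```python
import math


def coral_loss(logits, levels):
    """Sum of binary cross-entropy terms over the K-1 binary tasks.

    logits: K-1 raw outputs of the CORAL layer for one example.
    levels: K-1 extended binary labels (a run of 1s followed by 0s).
    Not numerically hardened for large negative logits; a framework's
    built-in logsigmoid should be used in practice.
    """
    total = 0.0
    for g, y in zip(logits, levels):
        log_sig = -math.log1p(math.exp(-g))          # log sigmoid(g)
        # log(1 - sigmoid(g)) = log sigmoid(-g) = log_sig - g
        total += y * log_sig + (1 - y) * (log_sig - g)
    return -total


# With zero logits, each task contributes ln 2:
print(coral_loss([0.0, 0.0], [1, 0]))  # → 1.3862943611198906 (= 2·ln 2)
```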


A Keras port of this code was recently developed and made available at
