create_tfrecords

by kwotsin

A simpler way of preparing a large-scale image dataset by generalizing functions from TensorFlow-slim

MIT License



Requirements

  1. Python 2.7.x
  2. TensorFlow >= 0.12

NOTE: If you want to run this program on Python 3, clone the repository and check out the Python 3 branch instead:

git checkout python-3.0

Usage

$ python create_tfrecord.py --dataset_dir=/path/to/dataset/ --tfrecord_filename=dataset_name

# Example:
# python create_tfrecord.py --dataset_dir=/path/to/flowers --tfrecord_filename=flowers
# Note that dataset_dir should be the folder that contains the root directory,
# not the root directory itself.
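As a rough sketch of the command-line interface described above, the expected arguments can be modeled with `argparse`. Note this is only an illustration of the interface, not the script's actual flag parsing (a TF-slim-style script would typically use `tf.app.flags`), and the defaults shown are assumptions:

```python
import argparse

def build_parser():
    """Sketch of the CLI described in this README (defaults are assumptions)."""
    parser = argparse.ArgumentParser(
        description="Convert an image dataset into TFRecord files.")
    # Required arguments
    parser.add_argument("--dataset_dir", required=True,
                        help="Folder that contains the dataset's root directory.")
    parser.add_argument("--tfrecord_filename", required=True,
                        help="Base name for the output TFRecord files.")
    # Optional arguments
    parser.add_argument("--validation_size", type=float, default=0.1,
                        help="Fraction of the dataset held out for evaluation.")
    parser.add_argument("--num_shards", type=int, default=2,
                        help="Number of shards to split the TFRecord files into.")
    parser.add_argument("--random_seed", type=int, default=0,
                        help="Seed for the shuffle, for repeatability.")
    return parser

# Parsing the example command from above:
args = build_parser().parse_args(
    ["--dataset_dir=/path/to/flowers", "--tfrecord_filename=flowers"])
```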

Arguments

Required arguments:

  • dataset_dir (string): The directory of your dataset, arranged in a structured way where each subdirectory holds one class of your images.

For example:

flowers/
    flower_photos/
        tulips/
            ....jpg
            ....jpg
            ....jpg
        sunflowers/
            ....jpg
        roses/
            ....jpg
        dandelion/
            ....jpg
        daisy/
            ....jpg

  Note: Your dataset_dir should be /path/to/flowers and not /path/to/flowers/flower_photos

  • tfrecord_filename (string): The output name of your TFRecord files.
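A layout like the one above is typically scanned by walking the root directory and treating each subdirectory name as a class label. The following is a minimal sketch of that idea, not the repository's actual code (which adapts TF-slim's dataset utilities); the function name `list_images_and_classes` is hypothetical:

```python
import os

def list_images_and_classes(dataset_dir):
    """Return (image paths, class names) for a dataset_dir/<root>/<class>/*.jpg layout.

    Minimal sketch only; the repository's TF-slim-derived implementation may differ.
    """
    # dataset_dir contains a single root folder (e.g. flower_photos) ...
    roots = [d for d in sorted(os.listdir(dataset_dir))
             if os.path.isdir(os.path.join(dataset_dir, d))]
    photo_root = os.path.join(dataset_dir, roots[0])

    # ... whose subdirectories are the class names.
    class_names = []
    filenames = []
    for class_name in sorted(os.listdir(photo_root)):
        class_dir = os.path.join(photo_root, class_name)
        if not os.path.isdir(class_dir):
            continue
        class_names.append(class_name)
        for fname in sorted(os.listdir(class_dir)):
            filenames.append(os.path.join(class_dir, fname))
    return filenames, class_names
```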

Optional Arguments

  • validation_size (float): The proportion of the dataset to be used for evaluation.

  • num_shards (int): The number of shards to split your TFRecord files into.

  • random_seed (int): The random seed number for repeatability.
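Together, these three options control a deterministic shuffle, a train/validation split, and the number of output files per split. The sketch below illustrates how they interact; the shard-name pattern (e.g. flowers_train_00001-of-00002.tfrecord) is an assumption borrowed from TF-slim's conventions, and the function itself is hypothetical rather than part of this repository:

```python
import random

def split_and_name_shards(filenames, tfrecord_filename,
                          validation_size=0.1, num_shards=2, random_seed=0):
    """Sketch of how validation_size, num_shards, and random_seed interact."""
    # Shuffle deterministically so the split is repeatable across runs.
    random.seed(random_seed)
    shuffled = list(filenames)
    random.shuffle(shuffled)

    # Hold out the first validation_size fraction for evaluation.
    num_validation = int(validation_size * len(shuffled))
    splits = {
        "validation": shuffled[:num_validation],
        "train": shuffled[num_validation:],
    }

    # TF-slim-style shard names (the exact pattern here is an assumption).
    shard_names = {
        split: ["%s_%s_%05d-of-%05d.tfrecord"
                % (tfrecord_filename, split, i + 1, num_shards)
                for i in range(num_shards)]
        for split in splits
    }
    return splits, shard_names
```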

Complete Guide

For a complete guide, please visit here.
