create_tfrecords

A simpler way of preparing large-scale image dataset by generalizing functions from TensorFlow-slim.

Requirements

  1. Python 2.7.x
  2. TensorFlow >= 0.12

NOTE: If you want to run this program on Python 3, clone the repository and run

git checkout python-3.0

to switch to the Python 3 branch instead.
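
For reference, the full sequence might look like the lines below, assuming the repository is hosted at github.com/kwotsin/create_tfrecords (substitute your own fork or mirror otherwise):

$ git clone https://github.com/kwotsin/create_tfrecords.git
$ cd create_tfrecords
$ git checkout python-3.0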

Usage

$ python create_tfrecord.py --dataset_dir=/path/to/dataset/ --tfrecord_filename=dataset_name

# Example: python create_tfrecord.py --dataset_dir=/path/to/flowers --tfrecord_filename=flowers
# Note that dataset_dir should be the folder that contains the root directory and not the root directory itself.
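
Once the shards are written, you may want to verify them. Below is a minimal sketch of reading the records back with plain TensorFlow, assuming the TF-slim style feature keys ('image/encoded', 'image/class/label') that this script is based on; the inspect_tfrecords helper and the shard name pattern are illustrative and not part of this repository.

import glob
import tensorflow as tf

def inspect_tfrecords(pattern):
    """Print the label and encoded image size of every example in the matching shards."""
    for shard in sorted(glob.glob(pattern)):
        for raw_record in tf.python_io.tf_record_iterator(shard):
            example = tf.train.Example()
            example.ParseFromString(raw_record)
            feats = example.features.feature
            # Assumed feature keys; check them against your own output.
            label = feats['image/class/label'].int64_list.value[0]
            encoded = feats['image/encoded'].bytes_list.value[0]
            print('%s: label=%d, image bytes=%d' % (shard, label, len(encoded)))

# Illustrative shard pattern; match it to the files written into your dataset_dir.
inspect_tfrecords('/path/to/flowers/flowers_train_*.tfrecord')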

Arguments

Required arguments:

  • dataset_dir (string): The directory of your dataset, arranged so that each subdirectory of the root folder holds the images of one class.

For example:

flowers\
    flower_photos\
        tulips\
            ....jpg
            ....jpg
            ....jpg
        sunflowers\
            ....jpg
        roses\
            ....jpg
        dandelion\
            ....jpg
        daisy\
            ....jpg

  Note: Your dataset_dir should be /path/to/flowers and not /path/to/flowers/flower_photos. A quick layout check is sketched after this argument list.

  • tfrecord_filename (string): The output name of your TFRecord files.
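
Before running the script, it can help to confirm that your folder layout matches the structure above. The snippet below is a small sketch under that assumption (a single root folder inside dataset_dir, one subfolder per class); the list_classes helper and the example path are illustrative only.

import os

def list_classes(dataset_dir):
    """Return the class subdirectories found inside the single root folder."""
    # Assumes the layout described above: dataset_dir/<root>/<class>/<images>.
    roots = [d for d in os.listdir(dataset_dir)
             if os.path.isdir(os.path.join(dataset_dir, d))]
    assert len(roots) == 1, 'dataset_dir should contain exactly one root folder'
    root = os.path.join(dataset_dir, roots[0])
    return sorted(d for d in os.listdir(root)
                  if os.path.isdir(os.path.join(root, d)))

print(list_classes('/path/to/flowers'))  # e.g. ['daisy', 'dandelion', 'roses', ...]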

Optional Arguments

  • validation_size (float): The proportion of the dataset to be used for evaluation.

  • num_shards (int): The number of shards to split your TFRecord files into.

  • random_seed (int): The random seed number for repeatability.
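
Putting the optional arguments together, a full invocation might look like the line below; the values shown are illustrative, not the script's defaults.

$ python create_tfrecord.py --dataset_dir=/path/to/flowers --tfrecord_filename=flowers --validation_size=0.2 --num_shards=2 --random_seed=1234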

Complete Guide

For a complete guide, please visit here.
