How Should I customize data for Custom Dataset Training? #94

Open

PIjarihd opened this issue Dec 30, 2021 · 2 comments

@PIjarihd commented Dec 30, 2021

I am trying to train on a custom dataset, but I am quite confused by the process. What exact changes should I make? Can you please guide me step by step? I studied #62, #8 and #5, but I still could not figure out how I should prepare the data. Do I need to make the folders exactly like this?
[screenshot: KakaoTalk_20211229_194244828]
Which code should I run to downscale the images? How should I prepare the caches?
Please guide me or point me to a tutorial link. Thanks in advance.
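
A generic way to produce the low-resolution inputs, independent of whatever script this repository ships, is to bicubically downscale the HR images. Below is a minimal TensorFlow sketch; the directory paths and SCALE factor are placeholders, not values from this project:

import glob
import os
import tensorflow as tf

HR_DIR = "path/to/hr_images"   # hypothetical input folder of HR PNGs
LR_DIR = "path/to/lr_images"   # hypothetical output folder for LR PNGs
SCALE = 4                      # downscaling factor; x4 is a common SR setting

os.makedirs(LR_DIR, exist_ok=True)

for hr_path in sorted(glob.glob(os.path.join(HR_DIR, "*.png"))):
    hr = tf.image.decode_png(tf.io.read_file(hr_path), channels=3)
    h, w = hr.shape[0], hr.shape[1]
    # Bicubic downsampling is the usual way SR training pairs are generated.
    lr = tf.image.resize(hr, (h // SCALE, w // SCALE), method="bicubic")
    lr = tf.cast(tf.clip_by_value(lr, 0.0, 255.0), tf.uint8)
    tf.io.write_file(os.path.join(LR_DIR, os.path.basename(hr_path)),
                     tf.io.encode_png(lr))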

@huchi00057 commented

Same problem.

@Gooda97 commented May 16, 2023

[quotes the original question above]

import glob
import os
import tensorflow as tf

lr_images_path = r'C:\Users\dell\Downloads\super_resolution\input'
hr_images_path = r'C:\Users\dell\Downloads\super_resolution\output'

# Collect the LR/HR file lists, sorted so the pairs line up by filename.
file_paths_1 = sorted(glob.glob(os.path.join(lr_images_path, "*.png")))
file_paths_2 = sorted(glob.glob(os.path.join(hr_images_path, "*.png")))

lr_dataset = tf.data.Dataset.from_tensor_slices(file_paths_1)
hr_dataset = tf.data.Dataset.from_tensor_slices(file_paths_2)

def load_and_preprocess_image(file_path):
    # Read the PNG from disk, decode to RGB, and scale to [0, 1] floats.
    image = tf.io.read_file(file_path)
    image = tf.image.decode_png(image, channels=3)
    image = tf.image.convert_image_dtype(image, tf.float32)
    return image

lr_dataset = lr_dataset.map(load_and_preprocess_image)
hr_dataset = hr_dataset.map(load_and_preprocess_image)

# Pair each LR image with its HR counterpart.
combined_dataset = tf.data.Dataset.zip((lr_dataset, hr_dataset))

batched_dataset = combined_dataset.batch(1)

prefetched_dataset = batched_dataset.prefetch(tf.data.AUTOTUNE)

# from_tensor_slices gives the dataset a known cardinality, so len() works here.
DATASET_SIZE = len(prefetched_dataset)

train_size = int(0.97 * DATASET_SIZE)
test_size = DATASET_SIZE - train_size

train_dataset = prefetched_dataset.take(train_size)
test_dataset = prefetched_dataset.skip(train_size)
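
On the caching question from the original post: with tf.data, one generic option is Dataset.cache(); a small sketch on top of the pipeline above (the on-disk file name is a placeholder, and this is independent of whatever cache format this repository itself expects):

# Cache the decoded (lr, hr) pairs after the expensive map() work so that
# epochs after the first read from the cache instead of re-decoding PNGs.
combined_dataset = tf.data.Dataset.zip((lr_dataset, hr_dataset)).cache()
# If the images do not fit in RAM, cache to disk instead:
# combined_dataset = tf.data.Dataset.zip((lr_dataset, hr_dataset)).cache("sr_cache")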
