Stanford CS231n k-Nearest Neighbor (kNN) exercise


I tried assignment1 from the Stanford University cs231n GitHub repository. To be precise, I mostly just copy-pasted the code, but as the saying goes, the first step in learning machine learning is "copy, paste & learn" — studying other people's code is said to be the quickest shortcut.


k-Nearest Neighbor (kNN) exercise

The kNN classifier consists of two stages:

  1. During training, the classifier takes the training data and simply remembers it.
  2. During testing, kNN classifies every test image by comparing it to all training images and transferring the labels of the k most similar training examples.

The value of k is cross-validated.

In this exercise we will implement these steps, get a feel for the basic image classification pipeline and cross-validation, and gain proficiency in writing efficient, vectorized code.
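Before running the assignment code, here is a minimal, self-contained sketch of the two stages above (training = memorization, prediction = label transfer by majority vote). The class name and method signatures are illustrative only, not the actual cs231n KNearestNeighbor implementation:

import numpy as np

class MinimalKNN:
    # Toy kNN sketch: train() memorizes the data, predict() transfers labels.
    def train(self, X, y):
        # "Training" is nothing more than remembering the training set.
        self.X_train = X
        self.y_train = y

    def predict(self, X, k=1):
        y_pred = np.zeros(X.shape[0], dtype=self.y_train.dtype)
        for i in range(X.shape[0]):
            # L2 distances from test example i to every training example.
            dists = np.sqrt(np.sum((self.X_train - X[i]) ** 2, axis=1))
            # Take the k closest training examples and majority-vote their labels.
            closest = np.argsort(dists)[:k]
            y_pred[i] = np.bincount(self.y_train[closest]).argmax()
        return y_pred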

# Run some setup code for this notebook.

from __future__ import print_function

import random
import numpy as np
from cs231n.data_utils import load_CIFAR10
import matplotlib.pyplot as plt

# This is a bit of magic to make matplotlib figures appear inline in the notebook
# rather than in a new window.
%matplotlib inline
plt.rcParams['figure.figsize'] = (10.0, 8.0) # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

# Some more magic so that the notebook will reload external python modules;
# see http://stackoverflow.com/questions/1907993/autoreload-of-modules-in-ipython
%load_ext autoreload
%autoreload 2
# Load the raw CIFAR-10 data.
cifar10_dir = 'cs231n/datasets/cifar-10-batches-py'

# Cleaning up variables to prevent loading data multiple times (which may cause memory issues)
try:
    del X_train, y_train
    del X_test, y_test
    print('Clear previously loaded data.')
except NameError:
    pass

X_train, y_train, X_test, y_test = load_CIFAR10(cifar10_dir)

# As a sanity check, we print out the size of the training and test data.
print('Training data shape: ', X_train.shape)
print('Training labels shape: ', y_train.shape)
print('Test data shape: ', X_test.shape)
print('Test labels shape: ', y_test.shape)
Training data shape:  (50000, 32, 32, 3)
Training labels shape:  (50000,)
Test data shape:  (10000, 32, 32, 3)
Test labels shape:  (10000,)
# Visualize some examples from the dataset.
# We show a few examples of training images from each class.
classes = ['plane', 'car', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
num_classes = len(classes)
samples_per_class = 7
for y, cls in enumerate(classes):
    idxs = np.flatnonzero(y_train == y)
    idxs = np.random.choice(idxs, samples_per_class, replace=False)
    for i, idx in enumerate(idxs):
        plt_idx = i * num_classes + y + 1
        plt.subplot(samples_per_class, num_classes, plt_idx)
        plt.imshow(X_train[idx].astype('uint8'))
        plt.axis('off')
        if i == 0:
            plt.title(cls)
plt.show()
# Subsample the data for more efficient code execution in this exercise
num_training = 5000
mask = list(range(num_training))
X_train = X_train[mask]
y_train = y_train[mask]

num_test = 500
mask = list(range(num_test))
X_test = X_test[mask]
y_test = y_test[mask]
# Reshape the image data into rows
X_train = np.reshape(X_train, (X_train.shape[0], -1))
X_test = np.reshape(X_test, (X_test.shape[0], -1))
print(X_train.shape, X_test.shape)
(5000, 3072) (500, 3072)
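Each 32x32x3 image is flattened into a single row of 32 * 32 * 3 = 3072 values, which is where the second dimension in the shapes above comes from.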
from cs231n.classifiers import KNearestNeighbor

# Create a kNN classifier instance. 
# Remember that training a kNN classifier is a noop: 
# the Classifier simply remembers the data and does no further processing 
classifier = KNearestNeighbor()
classifier.train(X_train, y_train)

Two loops implementation

Next, we classify the test images with the kNN classifier. This process breaks down into two steps:

  1. First, compute the distances between all test examples and all training examples.
  2. Given these distances, for each test example find its k nearest training examples and have them vote for the label.

We start by computing the distances between all training and test examples using a double loop. With Ntr training examples and Nte test examples, this first step produces an Nte x Ntr matrix where each element (i,j) is the distance between the i-th test example and the j-th training example.
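For reference, a sketch of what compute_distances_two_loops could look like inside cs231n/classifiers/k_nearest_neighbor.py; the loop skeleton is assumed from the assignment template, and the inner line is the first of the three variants timed below:

def compute_distances_two_loops(self, X):
    num_test = X.shape[0]
    num_train = self.X_train.shape[0]
    dists = np.zeros((num_test, num_train))
    for i in range(num_test):
        for j in range(num_train):
            # Euclidean (L2) distance between test example i and training example j.
            dists[i][j] = np.sqrt(np.sum((X[i] - self.X_train[j]) ** 2))
    return dists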

import time
# Open cs231n/classifiers/k_nearest_neighbor.py and implement
# compute_distances_two_loops.
start_time = time.time()
# Variant 1 plugged into compute_distances_two_loops:
#dists[i][j] = np.sqrt(np.sum((X[i] - self.X_train[j]) ** 2))

# Test your implementation:
dists = classifier.compute_distances_two_loops(X_test)
print(dists.shape)
elapsed_time = time.time() - start_time
print("Elapsed time: {}".format((elapsed_time)))
(500, 5000)
Elapsed time: 27.9486083984375
import time
# Open cs231n/classifiers/k_nearest_neighbor.py and implement
# compute_distances_two_loops.
start_time = time.time()
# Variant 2 plugged into compute_distances_two_loops:
#dists[i,j] = np.sqrt(np.sum(np.square(X[i,:]-self.X_train[j,:])))

# Test your implementation:
dists = classifier.compute_distances_two_loops(X_test)
print(dists.shape)
elapsed_time = time.time() - start_time
print("Elapsed time: {}".format((elapsed_time)))
(500, 5000)
Elapsed time: 27.34689235687256
import time
# Open cs231n/classifiers/k_nearest_neighbor.py and implement
# compute_distances_two_loops.
start_time = time.time()
# Variant 3 plugged into compute_distances_two_loops:
#dists[i][j] = np.linalg.norm(X[i]-self.X_train[j])

# Test your implementation:
dists = classifier.compute_distances_two_loops(X_test)
print(dists.shape)
elapsed_time = time.time() - start_time
print("Elapsed time: {}".format((elapsed_time)))
(500, 5000)
Elapsed time: 19.66613459587097
# We can visualize the distance matrix: each row is a single test example and
# its distances to training examples
import matplotlib.pylab as pylab
pylab.rcParams['figure.figsize'] = 30, 40
pylab.rcParams["font.size"] = "30"
plt.imshow(dists, interpolation='none' )
plt.show()

Next time, I would like to take a closer look at the dists[i][j] = np.linalg.norm(X[i]-self.X_train[j]) line, which was the fastest of the three variants above.
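For reference, all three variants compute the same Euclidean distance; a quick illustrative check (not part of the assignment):

import numpy as np

a = np.random.rand(3072)
b = np.random.rand(3072)
d1 = np.sqrt(np.sum((a - b) ** 2))       # variant 1
d2 = np.sqrt(np.sum(np.square(a - b)))   # variant 2
d3 = np.linalg.norm(a - b)               # variant 3
print(np.allclose([d1, d2], d3))         # -> True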

Reference: https://github.com/
