Category: Tutorial
How to classify your illustrations using Hierarchical Clustering
Hierarchical Clustering is a simple, intuitive yet powerful Machine Learning method that lets you build models that recognize similar patterns in a dataset of images or text. This can help you better understand the mood, balance and palette of your artworks.
First we need to load our dataset. I've done this by uploading a CSV file containing the URL path of every image I want to classify, together with its features. The best part is that we can customize the features based on the artwork. (E.g.: we can define a greyscale value that ranges from 0 to 1, where 0 means total black and 1 total white.)
imageClassifier.csv
Image_ID,Color_R,Color_G,Color_B,ColourDogde_UP1,ColourDogde_UP2,ColourDogde_UP3,ColourDogde_UP4,Hue,Saturation,composition_Level....
elevation.jpg,0.23,0.45,0.12,0.56,0.72,0.41,0.33,0.29,0.75,0.34....
image2.jpg,0.56,0.32,0.78,0.44,0.38,0.89,0.65,0.21,0.54,0.42....
image3.jpg,0.11,0.67,0.34,0.61,0.49,0.73,0.77,0.43,0.48,0.23....
image4.jpg,0.88,0.54,0.27,0.39,0.21,0.60,0.53,0.62,0.35,0.4....
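As a hedged illustration of how one such custom feature could be computed, here is a minimal sketch of the 0-to-1 greyscale value mentioned above. The function name is hypothetical and the weights are the standard Rec. 709 luminance coefficients, which are my own assumption, not part of the original article:

```python
# Hypothetical helper: derive a 0-to-1 greyscale level from normalized
# RGB values (0 = total black, 1 = total white).
def greyscale_level(r, g, b):
    # Rec. 709 luminance weights (assumed, not from the original dataset)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Using the Color_R/Color_G/Color_B values of elevation.jpg from the CSV:
print(round(greyscale_level(0.23, 0.45, 0.12), 3))  # a fairly dark image
```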
We then train a simple clustering model on Google Colab:
from sklearn.cluster import AgglomerativeClustering
from scipy.cluster.hierarchy import dendrogram, linkage
import matplotlib.pyplot as plt
import pandas as pd

# Importing the image dataset
df = pd.read_csv('imageClassifier.csv')

# Keep only the numeric feature columns (drop the image identifier)
features_list = df.drop(columns=['Image_ID']).values

# Build the linkage matrix with Ward's method
Z = linkage(features_list, method='ward')

# Dendrogram
plt.figure(figsize=(10, 7))
dendrogram(Z)
plt.show()

# Hierarchical Clustering (ward linkage implies Euclidean distance)
cluster = AgglomerativeClustering(n_clusters=5, linkage='ward')
cluster_labels = cluster.fit_predict(features_list)
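Once `fit_predict` has run, the labels can be joined back to the image names to see which artworks ended up grouped together. A minimal sketch, using the image names from the CSV above and made-up labels standing in for the model's real output:

```python
import pandas as pd

# Illustrative data: image names from the CSV, hypothetical labels
df = pd.DataFrame({'Image_ID': ['elevation.jpg', 'image2.jpg',
                                'image3.jpg', 'image4.jpg']})
cluster_labels = [0, 1, 1, 0]  # stand-in for cluster.fit_predict(...)

# Attach each label to its image and list the members of each cluster
df['cluster'] = cluster_labels
for label, group in df.groupby('cluster'):
    print(label, list(group['Image_ID']))
```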
Example Output:
The speedpainting (elevation.jpg) was very bright compared to the other images in the dataset. Its Colour Dodge value was 0.82, very high compared to the others (which were below 0.60).
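To check a claim like this programmatically, you could compare one image's feature against the rest of the column. A sketch with illustrative numbers (the column name comes from the CSV above; the values here are stand-ins, not the real dataset):

```python
import pandas as pd

# Stand-in values: a high score for elevation.jpg, lower for the rest
df = pd.DataFrame({
    'Image_ID': ['elevation.jpg', 'image2.jpg', 'image3.jpg', 'image4.jpg'],
    'ColourDogde_UP1': [0.82, 0.44, 0.61, 0.39],
})
target = df.loc[df['Image_ID'] == 'elevation.jpg', 'ColourDogde_UP1'].iloc[0]
others = df.loc[df['Image_ID'] != 'elevation.jpg', 'ColourDogde_UP1']
print(target > others.max())  # elevation.jpg scores highest
```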
Published: 14 September 2024