Development of Fashion Product Retrieval and Recommendations Model Based on Deep Learning
- Authors
- Jo, Jaechoon; Lee, Seolhwa; Lee, Chanhee; Lee, Dongyub; Lim, Heuiseok
- Issue Date
- March 2020
- Publisher
- MDPI
- Keywords
- deep learning; convolutional neural network (CNN); generative adversarial network (GAN); Image2Vec; fashion recommendation
- Citation
- ELECTRONICS, v.9, no.3
- Indexed
- SCIE; SCOPUS
- Journal Title
- ELECTRONICS
- Volume
- 9
- Number
- 3
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/57455
- DOI
- 10.3390/electronics9030508
- ISSN
- 2079-9292
- Abstract
- The digitization of the fashion industry has diversified consumer segments, and consumers now have broader choices with shorter production cycles; digital technology in the fashion industry is attracting consumer attention. A system that efficiently supports product search and recommendation is therefore becoming increasingly important. However, text-based search has limitations in the fashion industry, where design is a very important factor. We therefore developed an intelligent, deep-learning-based fashion technique for efficient fashion product search and recommendation, consisting of a Sketch-Product fashion retrieval model and a vector-based user-preference fashion recommendation model. The "Precision at 5" of the image-based similar-product retrieval model was 0.774, and that of the sketch-based similar-product retrieval model was 0.445. The vector-based preference fashion recommendation model also showed positive performance. This system is expected to enhance consumer satisfaction by supporting users in searching for fashion products more effectively or by recommending fashion products before they begin a search.
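For reference, the "Precision at 5" metric reported in the abstract is the fraction of relevant items among the top five retrieved results. A minimal sketch of the computation (the function name and the example data are illustrative, not from the paper):

```python
def precision_at_k(retrieved, relevant, k=5):
    """Return the fraction of the top-k retrieved items that are relevant."""
    top_k = retrieved[:k]
    hits = sum(1 for item in top_k if item in relevant)
    return hits / k

# Hypothetical query: 4 of the top 5 retrieved product IDs are relevant.
score = precision_at_k(["p1", "p2", "p3", "p4", "p5"], {"p1", "p2", "p4", "p5"})
print(score)  # 0.8
```

A reported Precision at 5 of 0.774 would mean that, averaged over test queries, roughly 3.9 of the top five retrieved products were relevant.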
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- Graduate School > Department of Computer Science and Engineering > 1. Journal Articles