Predicting pairwise relations with neural similarity encoders
- Authors
- Horn, F.; Mueller, K.-R.
- Issue Date
- Dec-2018
- Publisher
- Polish Academy of Sciences (Polska Akademia Nauk), Division IV: Technical Sciences
- Keywords
- neural networks; kernel PCA; dimensionality reduction; matrix factorization; SVD; similarity preserving embeddings
- Citation
- BULLETIN OF THE POLISH ACADEMY OF SCIENCES-TECHNICAL SCIENCES, v.66, no.6
- Indexed
- SCIE; SCOPUS
- Journal Title
- BULLETIN OF THE POLISH ACADEMY OF SCIENCES-TECHNICAL SCIENCES
- Volume
- 66
- Number
- 6
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/71301
- DOI
- 10.24425/bpas.2018.125929
- ISSN
- 0239-7528
- Abstract
- Matrix factorization is at the heart of many machine learning algorithms, for example, dimensionality reduction (e.g. kernel PCA) or recommender systems relying on collaborative filtering. Understanding a singular value decomposition (SVD) of a matrix as a neural network optimization problem enables us to decompose large matrices efficiently while dealing naturally with missing values in the given matrix. But most importantly, it allows us to learn the connection between data points' feature vectors and the matrix containing information about their pairwise relations. In this paper we introduce a novel neural network architecture termed similarity encoder (SimEc), which is designed to simultaneously factorize a given target matrix while also learning the mapping to project the data points' feature vectors into a similarity preserving embedding space. This makes it possible to, for example, easily compute out-of-sample solutions for new data points. Additionally, we demonstrate that SimEc can preserve non-metric similarities and even predict multiple pairwise relations between data points at once.
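The abstract describes SimEc as an encoder network that maps each data point's feature vector into a low-dimensional, similarity-preserving embedding, followed by a linear output layer that reconstructs that point's row of the target similarity matrix from the embedding. The code below is a minimal sketch of this idea, assuming PyTorch; the layer sizes, optimizer, training loop, and masking of missing values are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal similarity-encoder (SimEc) sketch, assuming PyTorch.
# Architecture and hyperparameters are illustrative, not the published implementation.
import torch
import torch.nn as nn


class SimEc(nn.Module):
    def __init__(self, n_features, embed_dim, n_targets, hidden=100):
        super().__init__()
        # encoder: maps a feature vector to a low-dimensional embedding
        self.encoder = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.Tanh(),
            nn.Linear(hidden, embed_dim),
        )
        # linear output layer: maps the embedding to predicted similarities
        # with the n_targets reference points (one row of the target matrix)
        self.decoder = nn.Linear(embed_dim, n_targets, bias=False)

    def forward(self, x):
        y = self.encoder(x)      # similarity-preserving embedding
        s_hat = self.decoder(y)  # reconstructed pairwise similarities
        return y, s_hat


def train_simec(X, S, embed_dim=2, epochs=500, lr=1e-3):
    """X: (n, n_features) feature matrix; S: (n, n) target similarity matrix.
    NaN entries in S are treated as missing values and masked out of the loss."""
    X = torch.as_tensor(X, dtype=torch.float32)
    S = torch.as_tensor(S, dtype=torch.float32)
    mask = ~torch.isnan(S)
    S = torch.nan_to_num(S)

    model = SimEc(X.shape[1], embed_dim, S.shape[0])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        _, s_hat = model(X)
        # mean squared reconstruction error over observed entries only
        loss = ((s_hat - S)[mask] ** 2).mean()
        loss.backward()
        opt.step()
    return model


# Out-of-sample embeddings come from simply applying the learned encoder
# to new feature vectors, e.g.:
#   with torch.no_grad():
#       y_new, _ = model(torch.as_tensor(X_new, dtype=torch.float32))
```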
- Appears in Collections
- Graduate School > Department of Artificial Intelligence > 1. Journal Articles