This document discusses using semantic indexing to link entities. It explains how word-embedding techniques such as Word2Vec represent words as vectors in a continuous space, so that the similarity between two words can be computed from the distance between their vectors and used to link related entities. The choice of embedding model and training corpus affects the resulting representations, so different models and corpora can yield different links. Training separate vector spaces on chronological slices of a corpus makes it possible to track entities over time and analyse concept drift. Finally, semantic indexing of free-text metadata is identified as a potential source of additional links between entities.
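
The core mechanism can be illustrated with a minimal sketch: given vector representations of terms, nearby vectors suggest candidate links. The embedding values and terms below are invented for illustration; a real system would obtain the vectors from a model such as Word2Vec trained on a large corpus.

```python
import numpy as np

# Toy 4-dimensional embeddings (hypothetical values for illustration;
# in practice these would come from a trained embedding model).
embeddings = {
    "violin": np.array([0.9, 0.1, 0.0, 0.2]),
    "fiddle": np.array([0.85, 0.15, 0.05, 0.25]),
    "tractor": np.array([0.1, 0.9, 0.3, 0.0]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(term, table):
    """Return the entry in the table (other than `term`) with the highest similarity."""
    candidates = {k: cosine_similarity(table[term], v)
                  for k, v in table.items() if k != term}
    return max(candidates, key=candidates.get)

print(most_similar("violin", embeddings))  # -> "fiddle"
```

In an entity-linking setting, a high cosine similarity between two terms' vectors would be treated as evidence that the corresponding entities are related and could be linked.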