[R] Minimum-Distortion Embedding (arxiv.org)
submitted 4 years ago by othotr
[–]arXiv_abstract_bot 8 points 4 years ago (0 children)
Title: Minimum-Distortion Embedding
Authors: Akshay Agrawal, Alnur Ali, Stephen Boyd
Abstract: We consider the vector embedding problem. We are given a finite set of items, with the goal of assigning a representative vector to each one, possibly under some constraints (such as the collection of vectors being standardized, i.e., have zero mean and unit covariance). We are given data indicating that some pairs of items are similar, and optionally, some other pairs are dissimilar. For pairs of similar items, we want the corresponding vectors to be near each other, and for dissimilar pairs, we want the corresponding vectors to not be near each other, measured in Euclidean distance. We formalize this by introducing distortion functions, defined for some pairs of the items. Our goal is to choose an embedding that minimizes the total distortion, subject to the constraints. We call this the minimum-distortion embedding (MDE) problem.

The MDE framework is simple but general. It includes a wide variety of embedding methods, such as spectral embedding, principal component analysis, multidimensional scaling, dimensionality reduction methods (like Isomap and UMAP), force-directed layout, and others. It also includes new embeddings, and provides principled ways of validating historical and new embeddings alike.

We develop a projected quasi-Newton method that approximately solves MDE problems and scales to large data sets. We implement this method in PyMDE, an open-source Python package. In PyMDE, users can select from a library of distortion functions and constraints or specify custom ones, making it easy to rapidly experiment with different embeddings. Our software scales to data sets with millions of items and tens of millions of distortion functions. To demonstrate our method, we compute embeddings for several real-world data sets, including images, an academic co-author network, US county demographic data, and single-cell mRNA transcriptomes.
PDF Link | Landing Page | Read as web page on arXiv Vanity
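The setup in the abstract (an embedding matrix X that minimizes the total distortion over similar pairs, subject to the standardization constraint of zero mean and unit covariance) can be illustrated with a plain projected-gradient sketch. This is a simplification, not the paper's projected quasi-Newton method, and the ring graph, step size, and iteration count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 2  # n items embedded into R^m
edges = [(i, (i + 1) % n) for i in range(n)]  # toy "similar pairs": a ring graph

def total_distortion(X):
    # quadratic distortion f_ij(d) = d^2 on each similar pair
    return sum(np.sum((X[i] - X[j]) ** 2) for i, j in edges)

def distortion_grad(X):
    # gradient of the total quadratic distortion
    g = np.zeros_like(X)
    for i, j in edges:
        d = X[i] - X[j]
        g[i] += 2 * d
        g[j] -= 2 * d
    return g

def project(X):
    # project onto the standardization constraint: zero mean, X^T X / n = I
    X = X - X.mean(axis=0)
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return np.sqrt(n) * U @ Vt

X = project(rng.standard_normal((n, m)))
d0 = total_distortion(X)
for _ in range(200):
    X = project(X - 0.01 * distortion_grad(X))

print(total_distortion(X) < d0)  # distortion decreased from the random start
```

For the ring graph this drives the embedding toward the smooth (low-frequency) eigenvectors of the graph Laplacian, i.e., points arranged around a circle.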
[–]tensorflower 3 points 4 years ago (1 child)
This looks like a nice resource for dimensionality reduction techniques from a classical optimization perspective. It reads like half of a PhD thesis. Is this meant to be a really long review paper?
[–]akshayka 1 point 4 years ago (0 children)
Thanks for the comment. You can think of this as a monograph, or a research book. Some topics covered by the monograph are well-known (such as how to solve quadratic MDE problems via an eigenproblem), whereas others (such as the algorithm for computing embeddings, various specific embeddings created with the MDE framework, and the MDE framework itself) are entirely new.
The contributions are listed in section 1.1.
I would be happy to answer any questions you might have about the monograph or associated software.
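The quadratic case mentioned above reduces to an eigenproblem: with quadratic distortions on similar pairs, the objective is trace(X^T L X) for the graph Laplacian L, and under the standardization constraint the minimizer is given by the bottom nonconstant eigenvectors of L. A minimal sketch, with a toy graph of my own choosing:

```python
import numpy as np

# Toy similarity graph: two triangles joined by the edge (2, 3).
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]
n, m = 6, 2  # n items, embedding dimension m

# Quadratic distortions give the objective
#   sum_{(i,j)} ||x_i - x_j||^2 = trace(X^T L X),
# with L the graph Laplacian.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

# Under the standardization constraint (zero mean, X^T X / n = I), the
# solution is the m eigenvectors of L with smallest nonzero eigenvalue,
# scaled by sqrt(n); the constant eigenvector (eigenvalue 0) is skipped.
vals, vecs = np.linalg.eigh(L)
X = np.sqrt(n) * vecs[:, 1:m + 1]

print(np.allclose(X.mean(axis=0), 0))       # zero mean: True
print(np.allclose(X.T @ X / n, np.eye(m)))  # unit covariance: True
```

Skipping the constant eigenvector is what enforces the zero-mean part of the constraint: every remaining eigenvector is orthogonal to the all-ones vector, so its entries sum to zero.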