In distributed computing and geometric graph theory, greedy embedding is a process of assigning coordinates to the nodes of a telecommunications network in order to allow greedy geographic routing to be used to route messages within the network. Although greedy embedding has been proposed for use in wireless sensor networks, in which the nodes already have positions in physical space, these existing positions may differ from the positions given to them by greedy embedding, which may in some cases be points in a virtual space of a higher dimension, or in a non-Euclidean geometry. In this sense, greedy embedding may be viewed as a form of graph drawing, in which an abstract graph (the communications network) is embedded into a geometric space. The idea of performing geographic routing using coordinates in a virtual space, instead of using physical coordinates, is due to Rao et al. Subsequent developments have shown that every network has a greedy embedding with succinct vertex coordinates in the hyperbolic plane, that certain graphs including the polyhedral graphs have greedy embeddings in the Euclidean plane, and that unit disk graphs have greedy embeddings in Euclidean spaces of moderate dimensions with low stretch factors. (Wikipedia).
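The routing scheme the embedding enables can be sketched in a few lines: each node forwards a message to whichever neighbor is closest (in the assigned coordinates) to the destination, and fails if no neighbor is strictly closer. The graph, coordinates, and function names below are illustrative, not taken from any particular system.

```python
import math

def greedy_route(graph, coords, source, target):
    """Greedy geographic routing: at each node, hop to the neighbor
    closest to the target; return None at a local minimum (no neighbor
    is strictly closer). A greedy embedding guarantees this never fails."""
    def dist(a, b):
        return math.dist(coords[a], coords[b])
    path = [source]
    current = source
    while current != target:
        best = min(graph[current], key=lambda n: dist(n, target))
        if dist(best, target) >= dist(current, target):
            return None  # stuck: greedy routing fails from here
        path.append(best)
        current = best
    return path

# Toy network: a path a-b-c embedded along a line.
graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
coords = {"a": (0, 0), "b": (1, 0), "c": (2, 0)}
print(greedy_route(graph, coords, "a", "c"))  # ['a', 'b', 'c']
```

An embedding is greedy precisely when this loop succeeds for every source–target pair; the results cited above show such coordinates always exist in the hyperbolic plane.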
Word embeddings are one of the coolest things you can do with Machine Learning right now. Try the web app: https://embeddings.macheads101.com Word2vec paper: https://arxiv.org/abs/1301.3781 GloVe paper: https://nlp.stanford.edu/pubs/glove.pdf GloVe webpage: https://nlp.stanford.edu/proje
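The core idea behind word2vec and GloVe is that words become vectors whose geometry reflects meaning, so relatedness can be measured with cosine similarity. A minimal sketch with hypothetical hand-picked 4-dimensional vectors (a real model learns hundreds of dimensions from a large corpus):

```python
import numpy as np

# Hypothetical toy word vectors for illustration only.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.3]),
    "queen": np.array([0.9, 0.7, 0.2, 0.9]),
    "apple": np.array([0.1, 0.2, 0.9, 0.4]),
}

def cosine(u, v):
    # Cosine similarity: 1.0 for identical directions, near 0 for unrelated.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vectors["king"], vectors["queen"]))  # higher: related words
print(cosine(vectors["king"], vectors["apple"]))  # lower: unrelated words
```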
From playlist Machine Learning
Embedding Graphs with Deep Learning
This video explains how to Embed Graphs with Deep Learning. This includes showing the difference between Matrix Decomposition and Deep learning methods as well. Thanks for watching! www.henryailabs.com
From playlist Deep Learning on Graphs
First look at Knowledge Graph Embedding (w/ simple Jupyter NB dgl-ke)
Knowledge Graph Embedding and its advantages for answering search queries. Simple explanation of Knowledge Graph Embedding and its use case. Tech to answer your (Siri) questions is basically a Deep Graph Knowledge Embedding Library (DGL-KE), a knowledge graph (KG) embeddings library built
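One of the models DGL-KE supports is TransE, which scores a triple (head, relation, tail) by how well head + relation ≈ tail holds in vector space. A minimal NumPy sketch of the scoring function, with randomly initialised toy embeddings (a real library learns these from training triples; the entity and relation names are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8
# Hypothetical toy knowledge graph embeddings (random here, learned in practice).
entities = {name: rng.normal(size=dim)
            for name in ["paris", "france", "berlin", "germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(head, rel, tail):
    # TransE models a true triple as h + r ≈ t, so a LOWER distance
    # means the fact is more plausible under the learned embedding.
    return float(np.linalg.norm(entities[head] + relations[rel] - entities[tail]))

print(transe_score("paris", "capital_of", "france"))
```

Answering a search query then amounts to ranking candidate tails by this score, which is why embedding the graph makes question answering fast.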
From playlist Learn Graph Neural Networks: code, examples and theory
Graph Neural Networks, Session 6: DeepWalk and Node2Vec
What are Node Embeddings? Overview of DeepWalk. Overview of Node2vec.
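DeepWalk's first step is to generate truncated random walks over the graph; the resulting "sentences" of node IDs are then fed to a word2vec-style skip-gram model to learn node embeddings (Node2vec biases the walks instead of sampling uniformly). A sketch of the walk-generation step only, with an illustrative toy graph; the skip-gram training would be delegated to a library such as gensim:

```python
import random

def random_walks(graph, walk_length=5, walks_per_node=2, seed=42):
    """Uniform truncated random walks from every node (DeepWalk step 1).
    Each walk is a 'sentence' of node IDs for skip-gram training."""
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for start in graph:
            walk = [start]
            while len(walk) < walk_length:
                neighbors = graph[walk[-1]]
                if not neighbors:
                    break  # dead end: truncate the walk early
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# Toy graph: node "a" connected to "b" and "c".
graph = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}
for walk in random_walks(graph):
    print(walk)
```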
From playlist Graph Neural Networks (Hands-on)
Understanding the Origins of Bias in Word Embeddings
Toronto Deep Learning Series Author Speaking For more details, visit https://tdls.a-i.science/events/2019-03-18/ Speaker: Marc Etienne Brunet (author) Facilitator: Waseem Gharbieh Abstract: The power of machine learning systems not only promises great technical progress, but risks soc
From playlist Natural Language Processing
Jasna Urbančič (11/03/21): Optimizing Embedding using Persistence
Title: Optimizing Embedding using Persistence Abstract: We look to optimize Takens-type embeddings of a time series using persistent (co)homology. Such an embedding carries information about the topology and geometry of the dynamics of the time series. Assuming that the input time series
From playlist AATRN 2021
Rasa Algorithm Whiteboard - Understanding Word Embeddings 1: Just Letters
We're making a few videos that highlight word embeddings. Before training word embeddings, we figured it might help the intuition if we first trained some letter embeddings. It might surprise you, but the idea of an embedding can also be demonstrated with letters as opposed to words. We're
From playlist Algorithm Whiteboard
How to Embed a Twitter Feed to a Google Sites Page
How to embed a Twitter feed (widget) directly into a Google Site page. Duane Habecker: https://about.me/duanehabecker Twitter: @dhabecker
From playlist EdTech Stuff
TransCoder: Unsupervised Translation of Programming Languages (Paper Explained)
Code migration between languages is an expensive and laborious task. To translate from one language to the other, one needs to be an expert at both. Current automatic tools often produce illegible and complicated code. This paper applies unsupervised neural machine translation to source co
From playlist Papers Explained
Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 15 – Natural Language Generation
For more information about Stanford’s Artificial Intelligence professional and graduate programs, visit: https://stanford.io/3Cfhyya Professor Christopher Manning & PhD Candidate Abigail See, Stanford University http://onlinehub.stanford.edu/ Professor Christopher Manning Thomas M. Sieb
From playlist Stanford CS224N: Natural Language Processing with Deep Learning Course | Winter 2019
Le Song: "A Framework For Differentiable Discovery Of Graph Algorithms"
Deep Learning and Combinatorial Optimization 2021 "A Framework For Differentiable Discovery Of Graph Algorithms" Le Song - Georgia Institute of Technology Abstract: Recently there is a surge of interests in using graph neural networks (GNNs) to learn algorithms. However, these works focu
From playlist Deep Learning and Combinatorial Optimization 2021
Hamiltonicity of Cayley graphs and Gray codes: open problems by Elena Konstantinova
Date & Time: 05 November 2016 to 14 November 2016. Venue: Ramanujan Lecture Hall, ICTS Bangalore. Computational techniques are of great help in dealing with substantial, otherwise intractable examples, possibly leading to further structural insights and the detection of patterns in many abstra
From playlist Group Theory and Computational Methods
Peter Frazier: "Accelerating Scientific Discovery through Interpretable Machine Learning and Int..."
Machine Learning for Physics and the Physics of Learning 2019 Workshop II: Interpretable Learning in Physical Sciences "Accelerating Scientific Discovery through Interpretable Machine Learning and Intelligent Experimentation" Peter Frazier, Cornell University Abstract: Historically, the
From playlist Machine Learning for Physics and the Physics of Learning 2019
The Pythagorean Siphon Inside Your Washing Machine
For a limited time, use this link to get a free trial of Skillshare Premium Membership: https://skl.sh/stevemould09201 There's a greedy cup siphon in your washing machine fabric softener tray. Also called a Pythagorean cup, it's also used in urinals and novelty drinking receptacles. It's a
From playlist Best of
Anthony Nouy: Adaptive low-rank approximations for stochastic and parametric equations [...]
Find this video and other talks given by worldwide mathematicians on CIRM's Audiovisual Mathematics Library: http://library.cirm-math.fr. And discover all its functionalities: - Chapter markers and keywords to watch the parts of your choice in the video - Videos enriched with abstracts, b
From playlist Numerical Analysis and Scientific Computing
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 12 - Natural Language Generation
For more information about Stanford's Artificial Intelligence professional and graduate programs visit: https://stanford.io/3nKbk4p To learn more about this course visit: https://online.stanford.edu/courses/cs224n-natural-language-processing-deep-learning To follow along with the course
From playlist Stanford CS224N: Natural Language Processing with Deep Learning | Winter 2021
Computing Homology Cycles with Certified Geometry - Tamal Dey
Computing Homology Cycles with Certified Geometry. Tamal Dey, Ohio State University. March 7, 2012.
From playlist Members Seminar
Hibernate Tutorial 08 - Value Types and Embedding Objects
We'll learn the difference between Entity type objects and Value type objects. We'll use the @Embeddable annotation to embed a value type object into our Entity class.
From playlist Hibernate
Panorama of Mathematics: Alfio Quarteroni
Panorama of Mathematics To celebrate the tenth year of successful progression of our cluster of excellence we organized the conference "Panorama of Mathematics" from October 21-23, 2015. It outlined new trends, results, and challenges in mathematical sciences. Alfio Quarteroni: "Reduced
From playlist Panorama of Mathematics