# Neural Structured Learning: Training Neural Networks with Structured Signals

Hands-on tutorial at [KDD 2020](https://www.kdd.org/kdd2020/).

## Organizers

* Allan Heydon (Google Research)
* Arjun Gopalan (Google Research)

## Outline

Below is the outline of our tutorial.

### Introduction to NSL

We will begin the tutorial with an overview of the Neural Structured Learning
framework and motivate the advantages of training neural networks with
structured signals.

### Data preprocessing in NSL

This part of the tutorial will include a presentation discussing:

- Augmenting training data for graph-based regularization in NSL (see the
  sketch below)
- Related tools in the NSL framework

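To make the augmentation step concrete, here is a minimal sketch using NSL's
`nsl.tools.pack_nbrs` utility to join labeled training examples with their
graph neighbors. The file paths and parameter values are placeholders for
illustration, not the exact lab setup.

```python
import neural_structured_learning as nsl

# Join each labeled training example with features from its graph neighbors.
# Inputs are TFRecord files of `tf.train.Example` protos plus a TSV file of
# (source, target, weight) graph edges; all paths here are placeholders.
nsl.tools.pack_nbrs(
    '/tmp/train_examples.tfr',   # Labeled training examples.
    '',                          # No separate unlabeled examples in this sketch.
    '/tmp/graph.tsv',            # Similarity graph in TSV format.
    '/tmp/nsl_train_data.tfr',   # Output: neighbor-augmented training data.
    add_undirected_edges=True,   # Treat each edge as bidirectional.
    max_nbrs=3)                  # Keep at most 3 neighbors per example.
```
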
### Graph regularization using natural graphs (Lab 1)

Graph regularization [2] encourages neural networks to learn similar
predictions and representations for entities that are connected to each other
in a similarity graph. *Natural graphs*, or *organic graphs*, are sets of data
points that have an inherent relationship with each other. In a hands-on
exercise, we will demonstrate the use of natural graphs for graph
regularization to classify the veracity of public message posts.

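The core of this exercise can be sketched with the NSL Keras API as follows;
the base model architecture, feature shapes, and hyperparameter values are
illustrative placeholders rather than the exact lab code.

```python
import neural_structured_learning as nsl
import tensorflow as tf

# A small stand-in base model; the lab uses its own architecture and features.
base_model = tf.keras.Sequential([
    tf.keras.Input(shape=(256,), name='features'),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(2, activation='softmax'),
])

# Penalize divergence between each example's output and the outputs of up to
# 3 of its graph neighbors, weighted by `multiplier`.
graph_reg_config = nsl.configs.make_graph_reg_config(
    max_neighbors=3,
    multiplier=0.1,
    distance_type=nsl.configs.DistanceType.L2,
    sum_over_axis=-1)
graph_reg_model = nsl.keras.GraphRegularization(base_model, graph_reg_config)

graph_reg_model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy'])
# `train_dataset` is assumed to yield feature dictionaries that include
# neighbor features, e.g. as produced by `nsl.tools.pack_nbrs` above.
# graph_reg_model.fit(train_dataset, epochs=5)
```
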
### Graph regularization using synthesized graphs (Lab 2)

Input data may not always be represented as a graph. However, one can often
infer similarity relationships between entities and use them to build a
similarity graph. In a hands-on exercise, we will demonstrate graph building
and subsequent graph regularization for text classification. While graphs can
be built in many ways, we will build the graph from text embeddings in this
tutorial.

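As an illustration of the graph-building step, the sketch below uses NSL's
graph builder to connect examples whose precomputed text embeddings have
cosine similarity above a threshold; the file paths and threshold value are
placeholders.

```python
import neural_structured_learning as nsl

# Build a similarity graph from precomputed text embeddings. The input is a
# TFRecord file of `tf.train.Example` protos containing an 'id' feature and an
# 'embedding' feature; the output is a TSV file of weighted edges.
nsl.tools.build_graph(
    ['/tmp/text_embeddings.tfr'],   # Embedding file(s).
    '/tmp/similarity_graph.tsv',    # Output graph.
    similarity_threshold=0.99)      # Keep edges with cosine similarity >= 0.99.
```

The resulting graph can then be combined with the training data (for example,
via the neighbor-augmentation step shown earlier) before applying graph
regularization.
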
### Adversarial regularization (Lab 3)

Adversarial learning has been shown to improve both the accuracy of a model
and its robustness to adversarial attacks. In a hands-on exercise, we will
demonstrate adversarial learning techniques [3,4] such as the *Fast Gradient
Sign Method* (FGSM) and *Projected Gradient Descent* (PGD) for image
classification.

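A minimal sketch of adversarial regularization with the NSL Keras API is shown
below; the base model and hyperparameter values are placeholders. A
single-step perturbation under the infinity norm, as configured here,
corresponds to an FGSM-style attack; multi-step PGD variants are also covered
in the lab.

```python
import neural_structured_learning as nsl
import tensorflow as tf

# A small stand-in image classifier used only for illustration.
base_model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1), name='image'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# A single-step perturbation along the sign of the gradient (infinity norm)
# yields FGSM-style adversarial examples during training.
adv_config = nsl.configs.make_adv_reg_config(
    multiplier=0.2,        # Weight of the adversarial loss term.
    adv_step_size=0.05,    # Perturbation magnitude.
    adv_grad_norm='infinity')
adv_model = nsl.keras.AdversarialRegularization(
    base_model, label_keys=['label'], adv_config=adv_config)

adv_model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy'])
# The wrapped model expects dictionary inputs that include the label, e.g.:
# adv_model.fit({'image': x_train, 'label': y_train}, batch_size=32, epochs=5)
```

The wrapper augments the base model's training objective with a loss computed
on perturbed versions of each input batch.
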
### NSL in TensorFlow Extended (TFX)

- Presentation on how Neural Structured Learning can be integrated with
  [TFX](https://www.tensorflow.org/tfx) pipelines.

### Research and Future Directions

- Presentation discussing:
  - Recent research related to NSL
  - Future directions for NSL research and development
  - Academic and industrial collaboration opportunities
- Prototype showing how NSL can be used with the
  [Graph Nets](https://github.com/deepmind/graph_nets) [9] library.

### Conclusion

We will conclude our tutorial with a summary of the entire session, provide
links to various NSL resources, and share a link to a brief survey to get
feedback on the NSL framework and the hands-on tutorial.

## References

8.  Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, P. Yu, “A Comprehensive Survey on
    Graph Neural Networks,” arXiv 2019.
9.  Graph Nets library: https://github.com/deepmind/graph_nets