ICLR 2023 Knowledge Graph Citations
Should we all be making a Knowledge Graph part of our organisation's AI strategy?
ICLR is now recognised as one of the top conferences in deep learning, so a good way to get a feel for the current hot topics in machine learning is to look at its submissions.
For those of you who don’t know, submissions include citations where the authors reference related papers. These citations form a network, and Nomic have visualised that network for ICLR 2023 so that papers that cite each other are clustered together.
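To make the idea concrete, here is a minimal sketch of a citation network in plain Python. The paper names are made up, and grouping by connected components is only a crude stand-in for the layout-based clustering that Nomic's visualisation performs:

```python
# Toy citation data (hypothetical papers, not real ICLR submissions):
# each paper maps to the papers it cites.
citations = {
    "gnn_survey": ["gcn", "gat"],
    "gcn": ["gat"],
    "gat": [],
    "vit": ["transformer"],
    "transformer": [],
}

def clusters(citations):
    """Group papers into connected components of the undirected citation
    graph -- a crude proxy for the clusters seen in the visualisation."""
    # Build an undirected adjacency list from the directed citations.
    adj = {}
    for paper, refs in citations.items():
        adj.setdefault(paper, set())
        for ref in refs:
            adj[paper].add(ref)
            adj.setdefault(ref, set()).add(paper)
    seen, groups = set(), []
    for start in adj:
        if start in seen:
            continue
        # Depth-first traversal to collect one component.
        stack, comp = [start], set()
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(adj[node] - comp)
        seen |= comp
        groups.append(comp)
    return groups

print(clusters(citations))
# Papers that cite each other end up in the same group; unrelated
# strands of the literature fall into separate groups.
```

In this toy example the GNN papers form one group and the transformer papers another, which is exactly the kind of structure the real visualisation surfaces at scale.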
Here I use the connectivity in that network to draw some conclusions about the current state of Knowledge Graphs in machine learning:
1/ Graph Neural Networks now form a distinct cluster within the wider field of deep learning, from which I conclude that GNNs are rising to become a significant force within deep learning and should be on everyone's AI radar.
2/ Knowledge Graph citations are quite diffuse within the wider GNN cluster, from which I conclude that Knowledge Graphs have broad application across GNN research.
3/ Lastly, I note with personal interest that ViT (Vision Transformer) models are clustered closely with GNNs. ViT models have migrated the transformer architecture from language to images. The proximity of ViT to GNNs in the citation network may hint at deeper (and very exciting!) unifying geometric principles underlying the two architectures.
I hope you find these observations useful. I also hope they demonstrate the power of network analysis and, in a pleasing symmetry, give you a high-level intuition for how Graph Neural Networks themselves work.