workshop/COLM_2025/README.md

# Reasoning, Attention & Memory: RAM 2 – 10 Years On
Ten years ago, in Montreal in 2015, the RAM workshop brought together the burgeoning field covering the "interplay of reasoning, attention and memory" – just before Transformers were invented, but when many of the components needed to get there had just been published and were in place. The workshop included many speakers who are still prominent in pushing these directions today: Yoshua Bengio, Kyunghyun Cho, Jürgen Schmidhuber, Sainbayar Sukhbaatar, Ilya Sutskever, and more. See the historical website for more details.
Ten years later, we are hosting RAM 2 in the same location in Montreal, with a two-fold purpose. Firstly, it is a retrospective and analysis of what has happened in the last 10 years; to this end, we are inviting presenters from the first workshop to share their current perspectives. Secondly, and more importantly, we will bring the field together to discuss new trends and future directions for the next 10 years, further enabled by inviting new speakers, panelists and poster presenters to discuss these fresh ideas.
Why does this make sense? The RAM topic is as important as ever, and has gone on to dominate the field. These new directions include:
(R): New reasoning methods, both token-based and those that use continuous vectors, and how they combine with memory.
(A): New attention methods that enable better reasoning and use of short- and long-term memory.
(M): Architectural changes to LLMs to improve memory and reasoning capabilities.
Overall, the workshop is most concerned with methods that explore the interplay between these three aspects.
## Call for Papers
We will host paper submissions on OpenReview; a link will be provided later. We invite researchers and practitioners to submit their work to the COLM 2025 Workshop on Reasoning, Attention & Memory 2 (RAM2@COLM25).
* **Submission Deadline:** June 23, 2025
* **Author Notification Deadline:** July 24, 2025
* **Submission Details:** Submissions should follow the [general guide for the COLM conference](https://colmweb.org/cfp.html). Papers can be up to 9 pages (not including references) and must be anonymized. All submissions must be in PDF format; please use the [LaTeX style files provided by the organizers](https://github.com/COLM-org/Template/archive/refs/tags/2025.zip).
## Invited Speakers
+ Yoshua Bengio, Univ. of Montreal
+ Kyunghyun Cho, NYU & Prescient Design
+ Yejin Choi, Stanford & NVIDIA
+ Azalia Mirhoseini, Stanford
+ Juergen Schmidhuber, KAUST
+ Sainbayar Sukhbaatar, Meta
+ Jason Wei, OpenAI
## Tentative Schedule
TBD
## Organizing Committee
+ Ilia Kulikov
+ Jason Weston
+ Jing Xu
+ Olga Golovneva
+ Swarnadeep Saha
+ Marjan Ghazvininejad
+ Ping Yu
