I agree. Since your article 5 years ago, we have built massive high-fidelity neural encodings of text and images. And transformers can draw from the state at any point in a sequence. But we need more work toward learning/encoding logical and probabilistic
So far, Altmetric has seen 968 X posts from 768 X users, with an upper bound of 5,520,226 followers.
RT @cabitzaf: @curtlanglotz And context. In a JAMA piece (2017) we basically made the Cassandra point that we face the risk of a "demise of…
@curtlanglotz And context. In a JAMA piece (2017) we basically made the Cassandra point that we face the risk of a "demise of context" in medical reasoning, if we think that everything needed to make a diagnosis can be traced/reduced to text. And LLMs see