Research interests: Complexity Sciences, Matrix Decomposition, Clustering, Manifold Learning, Networks, Synthetic (numerical) data, Portfolio optimization. 🇨🇮🇿🇦
Now ICLR authors say their peer reviews were churned out by AI: reviews citing content that doesn't exist and offering vague, useless feedback.
Combining contrastive learning and message passing markedly improves the features learned by graph embeddings, and the approach scales to huge graphs.
It taught us a lot about graph feature learning 👇
1/10
#neurips2025
www.nature.com/articles/s41...
📆 Nov 25, 2025
arxiv.org/abs/2511.17247
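To make the two ingredients concrete, here is a toy numpy sketch (my own illustrative assumptions, not the paper's actual architecture): one round of mean-aggregation message passing over a small graph, plus an InfoNCE-style contrastive score between two feature-dropout "views" of the same nodes.

```python
import numpy as np

def message_pass(A, X):
    """One mean-aggregation step: each node averages itself and its neighbors."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)
    return (A_hat @ X) / deg

def info_nce(Z1, Z2, tau=0.5):
    """Contrastive loss: node i in view 1 should match node i in view 2."""
    Z1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
    Z2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
    logits = Z1 @ Z2.T / tau                       # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # positives sit on the diagonal

# 4-node path graph, random features, feature dropout as a crude augmentation
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 8))
Z1 = message_pass(A, X * (rng.random(X.shape) > 0.2))  # view 1
Z2 = message_pass(A, X * (rng.random(X.shape) > 0.2))  # view 2
loss = info_nce(Z1, Z2)
```

In a real model the encoder is trained to drive this loss down, so that a node's embedding matches its own augmented copy rather than other nodes'.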
Grok is sloppy about it.
Other companies are subtle about it.
The only difference is competence, not intent.
My guess is that this will get worse as AI tech improves. For instance, fake videos of minorities committing crimes.
Gradient descent and backpropagation have a lot of problems; alignment becomes a nightmare. Evolutionary algorithms fix this, but they don't scale.
A recent paper, EGGROLL, makes them computationally feasible at scale.
www.alphaxiv.org/abs/2511.16652
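For context, here is a minimal evolution-strategies sketch in numpy (illustrative only; EGGROLL's contribution is making this family of methods scale, which this toy version does not attempt). No gradients are computed: we sample Gaussian perturbations of the parameters, weight them by fitness, and step in the fitness-weighted direction.

```python
import numpy as np

def evolution_strategies(f, w, sigma=0.1, lr=0.05, pop=50, steps=300, seed=0):
    """Gradient-free maximization of f via fitness-weighted Gaussian perturbations."""
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        eps = rng.standard_normal((pop, w.size))            # population of perturbations
        fitness = np.array([f(w + sigma * e) for e in eps]) # evaluate each candidate
        fitness = (fitness - fitness.mean()) / (fitness.std() + 1e-8)  # normalize
        w = w + lr / (pop * sigma) * eps.T @ fitness        # ES update step
    return w

# Toy objective: maximize f(w) = -||w - target||^2 (optimum at target)
target = np.array([1.0, -2.0, 0.5])
f = lambda w: -np.sum((w - target) ** 2)
w_opt = evolution_strategies(f, np.zeros(3))
```

The update only needs fitness evaluations, which is why ES parallelizes well; the cost is many evaluations per step, which is the scaling problem papers like EGGROLL target.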
Sinong Geng, houssam nassif, Zhaobin Kuang, Anders Max Reppen, K. Ronnie Sircar
Action editor: Reza Babanezhad Harikandeh
https://openreview.net/forum?id=KLOJUGusVE
#portfolio #finance #financial
This post is a good example of that.
Once you accept that the origin is special, then….