Maximilian Weber
@max-web.bsky.social
@jbgruber.bsky.social presenting rollama, an R package that brings gLLMs to your local R setup. Exciting possibilities for reproducible computational text analysis! #comptext
April 26, 2025 at 8:06 AM
Reposted by Maximilian Weber
@max-web.bsky.social presenting our first attempt at using LLMs ({rollama}) to extract topics from a large corpus: the public reaction to the releases of Galactica and ChatGPT within weeks of each other in 2022 (which couldn't have been more different!) #comptext2025
April 25, 2025 at 10:15 AM
Reposted by Maximilian Weber
New Course Alert! 🚀

Generative AI for Social Science Research

This course explores how to apply AI in social science research, from basic interactions to fine-tuning models for large-scale data analysis.

Find out more & apply: bit.ly/4l3nnGI

#ESS2025 #GenerativeAI
March 25, 2025 at 7:14 PM
Reposted by Maximilian Weber
DeepSeek is an extremely impressive open-weights LLM. But the guardrails are absolutely wild 🤨
January 27, 2025 at 1:55 PM
Running Phi-4 and other LLMs in R using Google Colab for free through the rollama R package medium.com/@weber.aca/r...
Learn how to harness the power of LLMs using R and Google Colab’s free GPU resources.
January 10, 2025 at 1:54 PM
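The post above describes running models like Phi-4 locally from R via rollama. A minimal sketch, assuming a running Ollama server and that the model tag "phi4" is available in the Ollama registry (check ollama.com/library for current tags):

```r
library(rollama)

# check that the local Ollama server is reachable
ping_ollama()

# download the model weights (only needed once; "phi4" is an assumed tag)
pull_model("phi4")

# send a single prompt to the local model
query("Name three R packages for text analysis.", model = "phi4")
```

This runs entirely on your own machine (or a Colab GPU instance, as in the linked post), so no data leaves your environment.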
Reposted by Maximilian Weber
Submission for #COMPTEXT2025 is still open until January 15! We are looking forward to receiving your proposals for papers, panels, and data presentations! We also appreciate it if you can spread the word and circulate shorturl.at/AmocX! See you at the University of Vienna on 24-26 April!
January 6, 2025 at 2:15 PM
Reposted by Maximilian Weber
I'll get straight to the point.

We trained 2 new models. Like BERT, but modern. ModernBERT.

Not some hypey GenAI thing, but a proper workhorse model, for retrieval, classification, etc. Real practical stuff.

It's much faster, more accurate, longer context, and more useful. 🧵
December 19, 2024 at 4:45 PM
Reposted by Maximilian Weber
Nice paper showing just *how* irreproducible research with proprietary generative LLMs is. Luckily there are open-source alternatives (and they are very easy to use too!)
Pleased to share the latest version of my paper with Arthur Spirling and @lexipalmer.bsky.social on replication using LMs

We show:

1. current applications of LMs in political science research *don't* meet basic standards of reproducibility...
December 19, 2024 at 7:23 AM
Reposted by Maximilian Weber
Call for abstracts for a Special Volume on Computational Social Science (of the journal "Soziale Welt")!
@marklutter345.bsky.social and I are looking for contributions presenting new research, discussing CSS methods, giving overviews of applications or ongoing research projects. Please circulate!
Call for abstracts: Special Volume Computational Social Science
December 12, 2024 at 9:10 AM
Reposted by Maximilian Weber
I read this paper a while back, and I liked it. But now it also feels relevant.

www.nber.org/papers/w29724
December 10, 2024 at 2:35 AM
Run LLMs locally in R with the rollama R package. The latest update simplifies query generation (prompting) and adds support for Hugging Face models (@jbgruber.bsky.social) Check it out:
rollama Update: Easy query generation and Hugging Face model support
This post highlights updates to the Rollama R package, which wraps the Ollama API to enable local execution of Generative Language Models…
December 9, 2024 at 2:16 PM
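The update announced above adds a query-generation helper for annotation-style tasks. A hedged sketch of how that workflow might look, assuming rollama's `make_query()` accepts `text`, `prompt`, and `system` arguments (argument names are from my reading of the 0.2.0 release and may differ) and that a model has already been pulled:

```r
library(rollama)

# build one structured query per document to annotate
queries <- make_query(
  text   = c("this update is fantastic", "the guardrails are frustrating"),
  prompt = "Classify the sentiment of the text as positive or negative.",
  system = "You are a careful annotator. Answer with a single word."
)

# send the batch to a local model; screen = FALSE suppresses console printing
query(queries, model = "llama3.2", screen = FALSE)
```

Separating query construction from execution like this keeps the annotation prompt reproducible and easy to version alongside the analysis code.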
Reposted by Maximilian Weber
New #rstats 📦 version: {rollama} 0.2.0
What's new?🧵👇

https://cran.r-project.org/web/packages/rollama/index.html
December 9, 2024 at 11:21 AM
Reposted by Maximilian Weber
If you find yourself with too much free time over the (long) weekend / holidays, I have a ~3h Building an LLM from the Ground Up workshop on YouTube that may come in handy: m.youtube.com/watch?v=quh7...
Building LLMs from the Ground Up: A 3-hour Coding Workshop
YouTube video by Sebastian Raschka
November 27, 2024 at 4:39 AM