Erkan Karabulut
erkankarabulut.bsky.social
PhD Student at the University of Amsterdam | Neurosymbolic AI.
https://erkankarabulut.github.io/
Another question came to my mind about this definition: can such consciousness be transferred to a new body?

Or if one’s body were cryopreserved, would it retain the same consciousness—would it still be the same person?

Practically, we’d want consciousness to be transferable.
October 26, 2025 at 10:29 AM
The more aware you are (of yourself, of others, and of the connections between them, I'd guess), the higher your level of consciousness.

So a baby (or an AI) learns more about itself, others, and its surroundings over time, and by doing so becomes more conscious.
October 26, 2025 at 10:29 AM
One particularly interesting view I learned was from psychologist Markus Ofner. Summarizing in my own words:

Consciousness is in the air and in us. We don't own it, but we become conscious. We are all individuals, yet also part of a whole, all interconnected.
October 26, 2025 at 10:29 AM
Similar to the question "do numbers (or math) exist in nature, or did we invent them?": does consciousness exist in nature, or do we have to invent it?

Assuming we can't define it, and hence can't fully understand what it is, can't we come up with a definition that works well for us, as we did with math?
October 26, 2025 at 10:02 AM
If you're attending and interested in knowledge discovery, interpretable machine learning, or XAI, say hi!

📜 arxiv.org/pdf/2509.20113
🐍 tinyurl.com/3z8cmuhw
🐍 Python Library: github.com/DiTEC-projec...

Co-authored with @dfdazac.bsky.social, @p-groth.bsky.social, and @vdegeler.bsky.social.

🧵2/2
October 22, 2025 at 2:25 PM
🧠 Neurosymbolic rule learning methods mark a paradigm shift in knowledge discovery: they allow us to utilize prior knowledge while discovering new knowledge!

Together with @dfdazac.bsky.social, @p-groth.bsky.social, and @vdegeler.bsky.social.

🧵8/8
September 26, 2025 at 3:09 PM
📊 On 5 real-world gene expression datasets with 18K+ columns and <100 rows, both methods led to significantly higher-quality association rules in terms of confidence and association strength, with only a limited increase in execution time (1.2–2× at most).
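For readers new to these metrics, here is a stdlib-only toy example (made-up transactions, not the paper's data) of how rule confidence is computed, with lift as one common measure of association strength:

```python
# Illustrative rule-quality metrics on a toy transaction table.

def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """P(consequent | antecedent) = supp(A ∪ C) / supp(A)."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

def lift(antecedent, consequent, transactions):
    """One common association-strength measure: confidence divided by
    the consequent's baseline support (1.0 means no association)."""
    return confidence(antecedent, consequent, transactions) / support(consequent, transactions)

transactions = [
    {"geneA_high", "geneB_high", "label_pos"},
    {"geneA_high", "geneB_high", "label_pos"},
    {"geneA_high", "label_neg"},
    {"geneB_low", "label_neg"},
]

rule = ({"geneA_high", "geneB_high"}, {"label_pos"})
print(confidence(*rule, transactions))  # 1.0
print(lift(*rule, transactions))        # 2.0
```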

🧵7/8
September 26, 2025 at 3:09 PM
1️⃣ Aerial+WI. Weight initialization of Aerial+ based on tabular data embeddings from a foundation model, using a projection encoder.

2️⃣ Aerial+DL. Tabular embeddings are aligned with Aerial+'s reconstructions via a projection encoder and a joint loss, ensuring better semantic alignment across columns.
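A minimal sketch of the weight-initialization idea in 1️⃣, with made-up dimensions and random stand-ins for both the foundation-model embeddings and the projection encoder (none of this is the paper's actual code):

```python
import random

random.seed(0)

# Hypothetical dimensions; the real models differ.
n_columns, fm_dim, hidden_dim = 4, 8, 3

# Pretend these came from a tabular foundation model (e.g., TabPFN):
# one embedding vector per table column.
column_embeddings = [[random.gauss(0, 1) for _ in range(fm_dim)]
                     for _ in range(n_columns)]

# A random linear map standing in for the projection encoder:
# it maps each fm_dim embedding to a hidden_dim vector.
projection = [[random.gauss(0, 1 / fm_dim ** 0.5) for _ in range(hidden_dim)]
              for _ in range(fm_dim)]

def project(vec, matrix):
    """vec (fm_dim) @ matrix (fm_dim x hidden_dim) -> hidden_dim vector."""
    return [sum(v * row[j] for v, row in zip(vec, matrix))
            for j in range(hidden_dim)]

# Aerial+WI idea: use the projected embeddings as the initial weights of the
# autoencoder's first (encoder) layer, so each input column starts out
# positioned according to its foundation-model semantics, instead of randomly.
encoder_init = [project(e, projection) for e in column_embeddings]
print(len(encoder_init), len(encoder_init[0]))  # 4 3
```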

🧵6/8
September 26, 2025 at 3:09 PM
💡 Within the scope of knowledge discovery via rule learning, we adapted and evaluated two transfer learning methods that utilize table embeddings from a tabular foundation model, TabPFN.

🧵5/8
September 26, 2025 at 3:09 PM
🔍 Therefore, we introduce the problem of discovering association rules in high-dimensional small tabular data for the first time.

📗Tabular foundation models have addressed this issue by pre-training on large datasets and transfer learning to small datasets for predictive tasks.

🧵4/8
September 26, 2025 at 3:09 PM
❓ However, neurosymbolic methods carry the limitations of neural networks over into rule learning, particularly reduced performance in low-data regimes. In rule learning, this translates into failing to capture high-quality patterns in the data.

🧵3/8
September 26, 2025 at 3:09 PM
🐍 Library: github.com/DiTEC-projec...

📊 We show that knowledge discovery from high-dimensional tables, as in gene expression datasets (~18K columns), is scalable with Aerial+, a neurosymbolic rule learning method we proposed earlier (arxiv.org/pdf/2504.19354, presented at NeSy 2025).

🧵2/8
September 26, 2025 at 3:09 PM
PyAerial runs 1–2 orders of magnitude faster than major rule mining libraries in C/C++, R, and Python.

Using an under-complete autoencoder, it avoids non-informative patterns (rule explosion) and captures the most significant associations, with higher confidence, stronger links, and full coverage.

🧵8/8
September 14, 2025 at 12:21 PM
It can also be integrated with rule visualization software such as NiaARM.

Coming soon: PyAerial will support transfer learning from a tabular foundation model (such as TabPFN), e.g., by reusing model weights or by semantically aligning to a given set of table embeddings.

🧵7/8
September 14, 2025 at 12:21 PM
Rules of different forms, e.g., classification rules and rules with item constraints, can be learned through the exposed Aerial+ interfaces.

PyAerial can run on parallel threads and on a GPU, and scales on high-dimensional tables (1K+ columns).

🧵6/8
September 14, 2025 at 12:21 PM
🎯 What can it do (technical)?

In just 2 lines of Python code, PyAerial can learn a concise set of high-quality association rules from a table in pandas DataFrame form, utilizing an under-complete denoising autoencoder!

🧵5/8
September 14, 2025 at 12:21 PM
For high-stakes decisions (e.g., recidivism), PyAerial learns associations between features and labels as classification rules.

Rather than scanning the entire dataset, PyAerial lets users specify items of interest, improving runtime and the interpretability of discovered patterns.
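A toy sketch of what restricting to items of interest yields, shaped as classification rules (feature -> label). All names are hypothetical, and PyAerial constrains the search itself rather than filtering afterwards; this only illustrates the resulting rule shape:

```python
# Hypothetical candidate rules as (antecedent, consequent) set pairs.
candidate_rules = [
    ({"age_gt_30", "priors_many"}, {"recidivism_yes"}),
    ({"priors_many"}, {"age_gt_30"}),          # feature -> feature
    ({"employed_no"}, {"recidivism_yes"}),
]

def classification_rules(rules, label_items):
    """Keep only rules whose consequent lies within the class labels."""
    return [r for r in rules if r[1] <= label_items]

labels = {"recidivism_yes", "recidivism_no"}
kept = classification_rules(candidate_rules, labels)
print(len(kept))  # 2: the feature -> feature rule is dropped
```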

🧵4/8
September 14, 2025 at 12:21 PM
On high-dimensional tables (1K+ columns) like gene expression datasets, PyAerial runs 1–2 orders of magnitude faster than leading rule mining libraries.

In low-data regimes (e.g., rare diseases), PyAerial can leverage tabular foundation models (TabPFN) to boost performance (paper out soon!).

🧵3/8
September 14, 2025 at 12:21 PM
🔧 Practical implications

PyAerial can extract high-quality, significant patterns from any given table, without causing the well-known rule explosion problem (many redundant, non-informative patterns).

🧵2/8
September 14, 2025 at 12:21 PM
Tomorrow at the Neurosymbolic Learning and Reasoning (NeSy2025) conference, I will be presenting our novel, scalable association rule 'learning' method for tabular data.

Say hi if you are interested in knowledge discovery and/or interpretable ML!

📜 tinyurl.com/48fmu3eh
🐍 tinyurl.com/45s75r6w
September 9, 2025 at 6:36 PM