AI enabler. I help people use AI to the fullest. English/Spanish speaker living in Las Palmas de Gran Canaria. CTO at https://secret-source.eu. Founder of #GofiGeeks (https://gofigeeks.org).
You said, "Jest may not be picking up the override properly" but this is impossible. Jest does exactly what it is told to do. Jest is heavily tested so it is unlikely it is not doing something properly.
(prompt continues below)
October 16, 2025 at 10:14 AM
You said, "Jest may not be picking up the override properly" but this is impossible. Jest does exactly what it is told to do. Jest is heavily tested so it is unlikely it is not doing something properly.
Clearly defining the problem is half the solution. When you say "starved for feedback on how our work impacts the world", what does that feedback look like? Is it citations in other research, lower poverty rates, higher overall GDP, or something else? Honest question.
July 12, 2025 at 10:02 PM
It also seems reasonable to expect AIs to eventually stop writing human-readable code and just write raw bits and bytes, on the assumption that humans will never need to intervene. Should we consider legislation against this, even if only in certain contexts?
April 25, 2025 at 8:01 AM
Writing code that is difficult to maintain and modify is irresponsible, but unless the operator (me) _knows_ what unmaintainable code looks like, there's nothing to stop the AI from producing it. It is easy to imagine a future in which codebases are so complex that only AIs can understand them.
April 25, 2025 at 8:01 AM
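A small, hypothetical illustration of what the difference can look like, as a sketch only: the first function mixes parsing, tax, and formatting in one place, so any change risks breaking all three concerns; the second splits the same behaviour into pieces that can be changed and tested independently.

```ts
// Hard to modify: parsing, validation, pricing, and formatting tangled together.
function processOrderHardToChange(raw: string): string {
  const parts = raw.split(',');
  if (parts.length !== 2 || isNaN(Number(parts[1]))) throw new Error('bad input');
  return `${parts[0].trim().toUpperCase()}: ${(Number(parts[1]) * 1.07).toFixed(2)}`;
}

// Easier to maintain: each concern is a small, independently testable function.
const parseOrder = (raw: string): { item: string; price: number } => {
  const [item, price] = raw.split(',');
  return { item: item.trim(), price: Number(price) };
};
const withTax = (price: number, rate = 0.07): number => price * (1 + rate);
const formatLine = (item: string, total: number): string =>
  `${item.toUpperCase()}: ${total.toFixed(2)}`;

const processOrder = (raw: string): string => {
  const { item, price } = parseOrder(raw);
  return formatLine(item, withTax(price));
};
```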
Well-organized code also tends to have fewer bugs, because it is compartmentalized and therefore easy to test. AI and AI-enabled coding tools are great at producing functioning code, but they never include tests unless explicitly told to do so. It's as if they have no concept of responsibility.
April 25, 2025 at 8:01 AM
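A minimal sketch of the point about compartmentalization, as a single hypothetical Jest test file (the slugify function is made up for illustration): a small, pure function is trivial to test, but a test like this only exists if someone asks for it.

```ts
// slugify.test.ts — a sketch; the function under test is hypothetical.
// A small, self-contained unit: easy to reason about, easy to test.
const slugify = (title: string): string =>
  title.trim().toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-|-$/g, '');

test('slugify produces a URL-safe slug', () => {
  expect(slugify('  Hello, World!  ')).toBe('hello-world');
});
```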