appenz.bsky.social
@appenz.bsky.social
Running out of GPU capacity? NVIDIA hates this one simple trick 😁 Seriously though, congratulations to Luma AI for their new model.
February 11, 2025 at 8:25 PM
And this is a full fine-tune of Flux dev on Replicate with two LoRAs and a good amount of parameter optimization. It's a different style, but also a whole different level of quality.
January 20, 2025 at 6:29 AM
For comparison, here is what Krea AI generates with a photo and a style to give it the animated-movie look. This does look a little like me.
January 20, 2025 at 6:29 AM
1/3 The latest iOS beta has an image generator. It can create AI images of you based on a photo - except they look nothing like you.

This is Apple Playground on iOS 18.3 beta.
January 20, 2025 at 6:29 AM
6/6 Full text of the "Ensuring U.S. Security and Economic Strength in the Age of Artificial Intelligence" rule below. I recommend loading it into an LLM and using it to find stuff.

Full text: public-inspection.federalregister.gov/2025-00636.pdf

Just don't trust the LLM to do the math. This is GPT-4o.
January 13, 2025 at 6:23 PM
5/6 No license is required for: Australia, Belgium, Canada, Denmark, Finland, France, Germany, Ireland, Italy, Japan, the Netherlands, New Zealand, Norway, Republic of Korea, Spain, Sweden, Taiwan, the
United Kingdom.

Singapore, Switzerland, and Israel are notably missing.
January 13, 2025 at 6:23 PM
3/6 There are no restrictions on open-weight models. This will certainly help open-source models, as they are now much easier to handle.
January 13, 2025 at 6:23 PM
5/5 One more thing that surprised me: all of the top quant trading firms from Wall Street had large recruiting booths at NeurIPS. Crazy times.
December 17, 2024 at 6:59 PM
2/5 Our assumption since early 2023 has been that pre-training of LLMs would stall as we run out of data. Ilya's Test-of-Time acceptance speech may end the debate. LLM performance will converge, which likely helps OSS models. The emphasis shifts to higher layers including...
December 17, 2024 at 6:59 PM
🧵1/5 NeurIPS was a blast. Looking back, my top three takeaways are:
1. Pre-training has topped out (at least for LLMs)
2. Inference-time compute and workflow are the new frontier
3. Auto-regressive vs. diffusion is still open
Detailed thoughts below.

[photo @venturetwins]
December 17, 2024 at 6:59 PM