Neil Fitzgerald
@nfitzger.glammr.us.ap.brid.gy
Head of #DigitalResearch #BritishLibrary #DigitalScholarship | Executive Committee member: #iiif | #impactocr | #ai4lam | opinions my own.

[bridged from https://glammr.us/@nfitzger on the fediverse by https://fed.brid.gy/ ]
RE: https://techhub.social/@BL_DigiSchol/115668034290546774

Thank you all for making it such a great experience, looking forward to #ff2026!
techhub.social
December 5, 2025 at 10:13 PM
Reposted by Neil Fitzgerald
i shoulda gone to #ff2025
December 5, 2025 at 12:54 PM
Reposted by Neil Fitzgerald
Fantastic new wildflower plates and descriptions shared as a benchmark task from Smithsonian folks - github.com/Smithsonian/... with comparison of Gemini VLM models and Qwen models on it #ff2025
December 5, 2025 at 12:22 PM
Reposted by Neil Fitzgerald
Interesting game experiment in Bristol Museum — making Egyptian god characters into personalities with quests for visitors to solve, “CultureQuest”.
December 5, 2025 at 10:42 AM
Reposted by Neil Fitzgerald
Mind blown by fashion recreation of arctic cultural clothing in 3d using PS, Illustrator, and Clo3d by this Smithsonian work #ff2025
December 4, 2025 at 5:01 PM
My semi regular refrain, #ocr is not a solved problem #ff2025 #AI4LAM
December 4, 2025 at 4:31 PM
Reposted by Neil Fitzgerald
Great insights from the National Library of Sweden's attempts at AI-assisted workflows - if you have a long complex pipeline, it’s hard to debug or find the location of a decision; and sometimes when an AI method struggles, it would have been hard for humans too. #ff2025
December 4, 2025 at 2:40 PM
Reposted by Neil Fitzgerald
Yale Library talk on implementing MCP call to Yale catalogue in Claude — with great results. Boosted user confidence in searches. #ff2025
December 4, 2025 at 2:11 PM
Reposted by Neil Fitzgerald
TIL allmaps.org - lots of great SVG annotations on historical maps, and help with labeling
December 4, 2025 at 11:44 AM
Reposted by Neil Fitzgerald
Fantastically interesting use of many many vlm models to augment metadata of artworks at Harvard Art Museum - Jeff Steward pointing out that AI descriptions may extend curator views / eyes to make art more searchable and accessible #ff2025
December 4, 2025 at 12:43 PM
Reposted by Neil Fitzgerald
Fab from French Ministry of Culture - https://comparia.beta.gouv.fr for people to compare LLM responses blindly and then learn about them. Developed by a solid multidisciplinary team. #ff2025
December 4, 2025 at 10:30 AM
Reposted by Neil Fitzgerald
Wow, folks at the Bodleian using Gemini, NotebookLM, ChatGPT (OpenAI research funding!), CoPilot… #ff2025
December 4, 2025 at 10:15 AM
Reposted by Neil Fitzgerald
Hmm, Rachel Coldicutt - “AI is an austerity policy” - if you don’t have enough money, throw it to AI. And “FOMO is not a strategy.” Makes good points, although I am a fan of the creepy & weird and she isn’t 😆 #ff2025
December 4, 2025 at 9:24 AM
Reposted by Neil Fitzgerald
London! I am in you! #ff2025
December 2, 2025 at 9:28 PM
Reposted by Neil Fitzgerald
If you want to hear about what I've been doing with the State Library of Victoria's place-based collections over the last few months, come along to my talk at the Library next Wednesday at 1.00pm. It's free and unticketed, just rock up! All welcome! […]
Original post on hcommons.social
hcommons.social
November 27, 2025 at 12:26 AM
Reposted by Neil Fitzgerald
Want to join a UK based Trade Union for technology workers?
Get 3 months' free membership - use code UW3MF
https://prospect.org.uk/join/

Mention my name when you join and I get a tenner (entirely optional).

#tradeunion
Join Prospect today
Get support with problems at work, and join 157,000 others to fight for better pay, conditions, and job protection.
prospect.org.uk
November 19, 2025 at 12:51 PM
Reposted by Neil Fitzgerald
glammr.us
November 13, 2025 at 5:38 PM
Reposted by Neil Fitzgerald
Do you love Open Source?
We are looking for ambassadors for @openuk

We offer benefits to our ambassadors, like free invites to our exclusive events, including our annual Awards, and an opportunity to network with key figures and contributors to global Open […]

[Original post on mastodon.social]
November 13, 2025 at 12:52 PM
Reposted by Neil Fitzgerald
I'll be Zooming into this @IIIF workshop at Sydney Uni on Monday to talk about some of my experiments, including the latest work on SLV maps. It's free and there's still a few places left, so come along if you're IIIF-curious […]
Original post on hcommons.social
hcommons.social
November 5, 2025 at 6:28 AM
Reposted by Neil Fitzgerald
Hey UK GLAM friends, I'm super keen to get to the GLAM Labs conference in Edinburgh next June, but I need to find some funding. https://www.glamlabs.io/events/glam-labs-futures-26 Is there anything I could come and do for you around June next year that could help pay my way?
International GLAM Labs Community - GLAM Labs Futures 26
GLAM Labs Futures ● 25-26 June 2026 ● Scotland
www.glamlabs.io
November 4, 2025 at 7:36 AM
Reposted by Neil Fitzgerald
I've written a blog post with a few more details about the GLAM data plumbing involved in hooking the SLV's digitised maps up to Allmaps via @IIIF for georeferencing: https://updates.timsherratt.org/2025/11/04/turning-the-slvs-maps-into.html #maps #glam #digitalhumanities
I often describe what I do as GLAM data plumbing. Most of the time I’m not creating new tools, I’m figuring out what data is available and how I can connect it up to _existing_ tools. It’s rarely straightforward, but if I can get all the pipes connected and data flowing in the right direction, suddenly new things become possible. **Things like turning all the State Library of Victoria’s digitised maps into data.**

I’ve just created a workflow that uses Allmaps and IIIF to georeference the SLV’s digitised maps. There are some technical details below, but the idea is pretty simple. A userscript links the SLV image viewer to Allmaps – so you just click on a button, and the digitised map opens, ready for georeferencing.

Why is this useful? Georeferencing relates a digitised map to real-world geography. It describes the map’s position and extent using geospatial coordinates – turning historic documents into geospatial data that can be indexed, visualised and manipulated. Georeferencing opens digitised maps to new research uses.

So, how many maps can we georeference before my residency finishes in December? Hundreds? Thousands? If you like maps and want to help, head to the documentation page to find out how to get started. And if you want to see how things are progressing, have a look at the project dashboard.

A few technical details follow…

Early on in my time as Creative Technologist-in-Residence at the State Library of Victoria, I started playing around with Allmaps for georeferencing digitised maps. It’s a great tool (really a suite of tools and standards) because instead of constructing a whole new platform it integrates with existing IIIF services. The SLV provides digitised images through IIIF, so I thought it should be possible to use Allmaps to georeference the SLV’s map collection. But I struck a problem that took some time to unravel: the IIIF URLs in the SLV manifests include port numbers, and that confused Allmaps.
The manifests also sometimes contained references to image formats that weren’t actually accessible, generating errors when they were loaded. Hopefully these problems will be fixed by the SLV, but in the meantime I’ve created a proxy service that edits the manifest on the fly. The proxied URLs can be loaded into the Allmaps Editor without errors. Pipes fixed, data flowing!

Using the manifest proxy

To generate a link to a proxied manifest, first grab the item's `IE` identifier from the URL of the digitised item viewer. For example, the identifier in this URL `https://viewer.slv.vic.gov.au/?entity=IE15485265&mode=browse` is `IE15485265`. Once you have the identifier, add it to the end of the URL `https://wraggelabs.com/slv_iiif/`. For example, https://wraggelabs.com/slv_iiif/IE15485265. You can then supply this URL to the Allmaps Editor.

But having to fiddle around with proxies didn’t make for a great user experience. I needed some way of integrating the two services, so that a user could just click a button in the SLV website and start editing in Allmaps. Userscripts to the rescue! I wrote recently about hacking GLAM collection interfaces using userscripts. Since I started my residency at the SLV, I’ve also created a userscript to display the IIIF manifest URL in the SLV image viewer, and run a Code Club workshop where we played around with an assortment of SLV website hacks. As in a number of these examples, the georeferencing userscript adds new features to the SLV website, but there’s a fair bit more going on under the hood.
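The proxy-link recipe above can be sketched in a few lines of Python. The `proxied_manifest_url` helper is hypothetical, written here just to illustrate how the identifier and proxy base combine:

```python
from urllib.parse import urlparse, parse_qs

# Base URL of the manifest proxy described above.
PROXY_BASE = "https://wraggelabs.com/slv_iiif/"

def proxied_manifest_url(viewer_url: str) -> str:
    """Pull the IE identifier out of an SLV viewer URL and
    append it to the proxy base to get a loadable manifest URL."""
    params = parse_qs(urlparse(viewer_url).query)
    ie_id = params["entity"][0]  # e.g. 'IE15485265'
    return PROXY_BASE + ie_id

print(proxied_manifest_url(
    "https://viewer.slv.vic.gov.au/?entity=IE15485265&mode=browse"
))
# https://wraggelabs.com/slv_iiif/IE15485265
```

The resulting URL is what you paste into the Allmaps Editor in place of the original manifest URL.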
It runs automatically every time you load the SLV image viewer, and then:

* it checks the metadata of the digitised item to see if it’s a map (or something that contains maps, like an atlas or street directory)
* if it looks like a map, it generates an Allmaps identifier using the item’s IIIF manifest URL and checks with Allmaps to see whether the item has already been georeferenced
* it adds a ‘Georeferencing’ section to the page, with a button to georeference the item (or edit the existing georeferencing)
* if the item has already been georeferenced, it adds a button to view the item in the Allmaps Viewer, and embeds a live preview

Accessing metadata

The userscript gets the item metadata from a JSON file that's loaded by the image viewer. The JSON file includes a lot of extra, useful information about the digitised item. To access the JSON file, you just construct a URL like this: `https://viewerapi.slv.vic.gov.au/?entity=[IE identifier]&dc_arrays=1`. The IE identifier is in the URL of the image viewer.

Allmaps identifiers

Allmaps creates its identifiers by hash encoding the IIIF URLs. The userscript borrows some code from the Allmaps id module to generate the ids, then sends a HEAD request to the Allmaps API to see whether an entry for the current manifest exists.

[Image: an item that hasn't been georeferenced yet]
[Image: an item that has been georeferenced, displaying an embedded version of the Allmaps viewer]

I’ve also created a GitHub repository to save copies of the data. Every two hours this notebook is run to query the Allmaps API for newly georeferenced maps. These are added to a dataset which is saved in three formats:

* a CSV file
* a CSV file that includes thumbnails and links for viewing in Datasette-Lite
* a GeoJSON file, which can be viewed in services like geojson.io

At the same time, the data for each individual map is downloaded and saved as IIIF annotations (in JSON) and GeoJSON.
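As a rough Python sketch of that "hash the manifest URL, then HEAD the API" check: note the hash function below is a stand-in, not the actual encoding used by the Allmaps id module, and `api_base` is a placeholder you would point at the real Allmaps API endpoint.

```python
import hashlib
import urllib.error
import urllib.request

def manifest_id(iiif_url: str) -> str:
    # Stand-in for the Allmaps id module: a truncated hex digest
    # of the manifest URL. The real module's scheme may differ.
    return hashlib.sha1(iiif_url.encode("utf-8")).hexdigest()[:16]

def is_georeferenced(iiif_url: str, api_base: str) -> bool:
    """HEAD the (placeholder) API URL for this manifest's id;
    a 2xx response means an entry already exists."""
    req = urllib.request.Request(api_base + manifest_id(iiif_url), method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return 200 <= resp.status < 300
    except urllib.error.HTTPError:
        return False
```

A HEAD request is the right fit here because the userscript only needs to know whether the entry exists, not its contents.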
Finally, this notebook is run to generate a dashboard that provides an overview of the project’s progress. The project dashboard is updated every two hours.

One of the Allmaps developers described all my plumbing and workarounds as a ‘very cool lofi example of how you can set this up with little means’, and I think that’s pretty apt. It’s really just an experiment to demonstrate the possibilities, but by connecting up existing services it’s generating real data of long-term value.
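The post doesn't include the notebook code, but saving a harvested dataset of georeferenced maps in both CSV and GeoJSON could look roughly like this. The record fields (`id`, `title`, `geometry`) are made up for illustration:

```python
import csv
import json

def save_dataset(maps, csv_path, geojson_path):
    """Write a list of map records (hypothetical 'id', 'title',
    'geometry' fields) as a CSV file and a GeoJSON FeatureCollection."""
    # CSV: one row per map, tabular fields only.
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "title"])
        writer.writeheader()
        for m in maps:
            writer.writerow({"id": m["id"], "title": m["title"]})
    # GeoJSON: same records, with geometry, viewable in e.g. geojson.io.
    collection = {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "properties": {"id": m["id"], "title": m["title"]},
                "geometry": m["geometry"],
            }
            for m in maps
        ],
    }
    with open(geojson_path, "w") as f:
        json.dump(collection, f)
```

Keeping the tabular and spatial views of the same dataset in sync like this is what lets the one harvest feed both Datasette-Lite and map viewers.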
updates.timsherratt.org
November 4, 2025 at 4:15 AM
Reposted by Neil Fitzgerald
Help me turn the State Library Victoria's digitised maps into data!

As part of my residency at the SLV LAB, I've been experimenting with using Allmaps and @IIIF to georeference the Library's maps. Georeferencing relates a digitised map to real world […]

[Original post on hcommons.social]
October 30, 2025 at 11:51 PM
Reposted by Neil Fitzgerald
Help me turn the State Library of Victoria's digitised maps into data! https://wragge.github.io/slv-allmaps/ (bit of a soft launch, so let me know if you see any errors in the documentation...)
October 30, 2025 at 5:47 AM