Fondue night was a hit last night, but a consequence I hadn’t considered is that Charlie may refuse to eat anything for the next couple of days if he can’t poke it with a fondue fork first 🫕 🤷‍♂️
-
The core thing to remember with coffee is that freshly roasted beans matter more than anything else. No fancy equipment can overcome not having fresh beans.
-
14 years with a WordPress.com account. I’m pretty sure I initially signed up to use Akismet, and I had no idea I’d end up working there years later!
-
Most search models in use these days are based on lexical similarity, or how many important words overlap. A more accurate model is based on semantic similarity, or how much abstract meaning overlaps.
Semantic similarity relies on transformer models, a type of deep learning model that creates an embedding representing each document’s semantic meaning.
Lexical similarity is great if you know the exact keywords you are searching for, but brittle if you do not. Take two examples:
- Obama speaks to the media in Illinois.
- The President greets the press in Chicago.
For anyone familiar with US politics, the first example essentially says the same thing as the second – in other words, they are semantically (conceptually) similar. However, the important words in the two examples don’t match up, so a lexical similarity search won’t treat them as equivalent.
A semantic similarity approach matches up the words Obama and President, media and press, Illinois and Chicago, and speaks and greets.
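To make this concrete, here’s a quick sketch of both approaches on the two examples above. The library (sentence-transformers) and model (all-MiniLM-L6-v2) are my picks for illustration, not necessarily what any given search engine uses:

```python
# Lexical similarity: overlap of surface tokens (Jaccard).
# Semantic similarity: cosine similarity between transformer embeddings.
from sentence_transformers import SentenceTransformer, util

a = "Obama speaks to the media in Illinois."
b = "The President greets the press in Chicago."

# Lexical: only filler words like "the" and "in" overlap, so the score is low.
tokens_a, tokens_b = set(a.lower().split()), set(b.lower().split())
jaccard = len(tokens_a & tokens_b) / len(tokens_a | tokens_b)

# Semantic: embed each sentence, then compare the embeddings.
model = SentenceTransformer("all-MiniLM-L6-v2")
emb_a, emb_b = model.encode([a, b])
cosine = util.cos_sim(emb_a, emb_b).item()

print(f"lexical (Jaccard): {jaccard:.2f}")
print(f"semantic (cosine): {cosine:.2f}")  # noticeably higher than the lexical score
```

On sentences like these, the lexical score stays low while the embedding similarity comes out high, which is exactly the Obama/President effect described above.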
I hadn’t considered that those models are good at measuring similarity between two words, but not between two sets of words (documents, in the parlance):
As it turns out, current state-of-the-art language models are good at measuring the similarity between two words, but not great at measuring the similarity between two documents. We had to perform a considerable amount of R&D work to develop a transformer model that could create document embeddings—we hope to go into the gory details of this work in future technical posts.
One essential trick was to use word mover distance to create labels for pairs of documents in an unsupervised manner—so that our model could learn how to map a document’s word embeddings into a single document embedding. But, for now, the example above gives you the high-level idea behind our approach.
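Out of curiosity, here’s roughly what the word mover’s distance part could look like. This is my own sketch using gensim and small GloVe vectors, not the authors’ actual pipeline:

```python
# Word mover's distance: how far one document's word embeddings must "travel"
# to match the other's. A low distance can serve as an unsupervised label that
# two documents are similar. Requires gensim plus the POT package.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # small pretrained word vectors

doc1 = "obama speaks to the media in illinois".split()
doc2 = "the president greets the press in chicago".split()

distance = vectors.wmdistance(doc1, doc2)  # lower = more similar
print(f"word mover's distance: {distance:.4f}")
```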
-
Something I need to look into: How to automate posting starred posts from my feed reader as webmention-style likes on my website. Probably a WP cron job that fetches starred posts from Feedbin’s API, loops through them, marks them up appropriately, and publishes them.
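The Feedbin half could look something like this rough Python prototype. The real version would probably be a PHP cron callback, and the endpoints here are from my reading of Feedbin’s v2 API docs, so treat the details as assumptions:

```python
# Sketch: pull starred entries from Feedbin so they can be republished as
# webmention-style likes. FEEDBIN_USER / FEEDBIN_PASS are placeholders.
import requests

API = "https://api.feedbin.com/v2"
AUTH = ("FEEDBIN_USER", "FEEDBIN_PASS")  # Feedbin uses HTTP Basic auth

# Starred entries come back as a flat list of entry IDs.
ids = requests.get(f"{API}/starred_entries.json", auth=AUTH).json()

# Fetch the full entries in batches (the API caps how many ids per request).
for i in range(0, len(ids), 100):
    batch = ",".join(str(x) for x in ids[i : i + 100])
    entries = requests.get(f"{API}/entries.json", params={"ids": batch}, auth=AUTH).json()
    for entry in entries:
        # This is where the WP side would mark the entry up as a like and publish it.
        print(entry["title"], entry["url"])
```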
-
Bookmarked https://www.fedibblety.com/recipe/peppernuts/.
A type of Christmas cookie from the Dibble-Felty family that I’d like to make next year.
-
From Yes Plz Weekly Dispatch 212, which came with this week’s coffee:
Most mornings I make coffee either with a small single-cup Kalita filter or a larger Chemex pourover. I could much more easily make coffee every bit as tasty in a fancy automatic drip machine, but I enjoy my morning ritual. I like having a pre-caffeinated moment of focusing on just a single task with no other distractions. It’s not quite meditation or anything, but I think it gives grounding to my mornings.
Tonx, Yes Plz Weekly Dispatch 212

I relate to this a lot. I’ve made coffee manually since 2008 and ground my beans with a hand grinder since 2014. I like the ritual of slowing down and making it all by hand, and I especially like that grinding beans by hand is quiet, unlike every electric coffee grinder I’ve encountered. That kind of noise is offensive in the morning.
That said, these days I can no longer focus on a single task like making coffee in the morning… Charlie is usually nearby, and I need to keep an eye on him. Sometimes he likes to help crank the grinder, which takes longer but is sweet.
-
I installed this and tested it out. After resolving a conflict with another plugin and flushing my permalinks, things started working as expected. I sent out a few likes to some friends, and it looks like they went through! Excited to use this.
I might rewrite the slug from “notes” to “micro” or “short” so it doesn’t get confused with my digital garden, notes.cagrimmett.com.