Dynamically Typed

Facebook's improved automatic alternative text

Facebook has launched a significantly improved version of its automatic alternative text (AAT) feature, which helps blind or visually impaired people understand the contents of images in their Facebook feed. As explained in Facebook’s tech blog post, this new version of AAT can recognize over 1,200 distinct concepts. Interestingly, the model was trained using weak supervision, with the hashtags on billions of public Instagram images serving as labels. So if you’ve ever posted a picture of your latte and tagged it #latte on Instagram, you may have had a tiny impact on this feature. The blog post also details the user research that went into improving AAT — something I think we usually don’t hear enough about (or do enough of!) in productized AI — so make sure to give it a read. (I wish I could credit the person who wrote this post, but sadly Facebook keeps these posts anonymous, which seems a bit out of character for the company.)
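The hashtags-as-labels idea is simple enough to sketch: map each post's hashtags onto a fixed concept vocabulary to get (noisy, incomplete) multi-hot training targets. This is a minimal illustration, not Facebook's actual pipeline — the function name and the tiny vocabulary are invented for the example:

```python
# Weak supervision via hashtags-as-labels: a post's hashtags become a
# multi-hot label vector over a fixed concept vocabulary. Hashtags outside
# the vocabulary are silently dropped, which is part of what makes the
# supervision "weak" -- labels are noisy and incomplete.

CONCEPTS = ["latte", "dog", "beach", "pizza"]  # tiny stand-in vocabulary
CONCEPT_INDEX = {c: i for i, c in enumerate(CONCEPTS)}

def hashtags_to_labels(hashtags):
    """Convert a list of hashtags into a multi-hot label vector."""
    labels = [0] * len(CONCEPTS)
    for tag in hashtags:
        concept = tag.lstrip("#").lower()
        if concept in CONCEPT_INDEX:
            labels[CONCEPT_INDEX[concept]] = 1
    return labels

# A post tagged #latte and #brunch yields a label only for "latte":
print(hashtags_to_labels(["#latte", "#brunch"]))  # [1, 0, 0, 0]
```

At the scale of billions of images, the label noise averages out well enough for the model to learn the 1,200+ concepts the blog post describes.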