The Lindahl Letter

All the future AI features

Longtime readers of my work know that within my normative bias I tend to break things down into form, function, assumptions, and structure (FFAS). Instead of taking that path with this Substack-based letter format, each week my commentary drifts into a more ongoing narrative about the patterns, traditions, and concerns that rise to the forefront of my thoughts. All right, let’s jump into that narrative at the deep end of the things being considered. Maybe the theme of today is thinking globally and building locally. That trope remains popular and will continue to be popular moving forward. Zooming out to the global view, a number of ongoing narratives abound these days amid a sea of digital content being created. A great flood of information has been intensifying, and I would argue that even the best collections of knowledge are going to start breaking down as that flood intensifies. We are now seeing the script get flipped: things are going from macro flooding to incredibly local models that are unique to individual computer operating systems. We are probably going to need some sort of defense against actors trying to federate all the local models into a larger system of trading.

Apple executives are reasoned and measured in the deployment of products. Like many of you, I was seriously curious to see what Apple would do as AI hype reached a crescendo. Earnings calls and forecasts seem to be triangulated on what AI will do for a company. Apparently, Apple Intelligence is going to do a lot of things [1]. It’s going to do so many things within the Apple ecosystem that endless hours of speculation have been devoted to it. Google had the opportunity to really bring all the data within their ecosystem together in a very local way, but for some reason they just did not deliver on that potential. We are seeing Microsoft bring forward a feature called Recall, which uses continuous screenshots with a local model. We are also starting to see the arrival of Copilot+ PCs [2]. That means both Apple and Microsoft are going to provide very local, personalized AI experiences. It’s unlikely that Apple executives will try to capture local user data and use it for federated LLM training. However, our friends at Microsoft will probably call this anonymized continuous learning a feature that enhances the model.

Over the last couple of years I have had numerous conversations with people about understanding ROI related to technology projects. During those chats I try to explain that AI or ML is not really the product of the future for most companies. Telling people that their company is probably not the one that will become infinitely rich off of AI is always dicey. I think the underlying technology, models, and methods will become commoditized. Whatever company emerges at the top may have a brief advantage, but it will fade quickly because no moat exists for a repeatable idea. Generally speaking, most companies will end up using AI/ML to augment, automate, or add features to products they already have or are considering building. We are starting to see major players like Apple, Google, and others explain that AI will power features and delivery in core products. It’s happening now in terms of announcements, and we are waiting for all the future AI features to launch. Maybe at some point along the way, at the true intersection of technology and modernity, we will see an AGI event or something on that level.

All the future AI features are about to get a lot more coverage as people realize just how much has been spent to get to where we are right now. Nvidia recently had a stock split and now holds the largest market cap after beating out both Apple and Microsoft. A mind-boggling amount of money has been spent training models, both on cloud compute and on hardware. We are starting to see more specialized hardware showing up to the party. For some companies the GPU was the coin of the realm. Over at Hugging Face you can take a look at the open LLM leaderboard to see what model is currently king of the hill [3]. You will see from that sightseeing tour of the Hugging Face leaderboards that a lot of different LLMs exist right now, and a lot of them are bunched up in terms of rankings. What is interesting about that is that you can download the models. Some of them even run locally and are pretty accessible in terms of deployment and usability.
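To make that "bunched up" observation concrete, here is a minimal sketch of how you might group models whose average benchmark scores sit within a point of the group leader. The model names and scores below are hypothetical placeholders, not actual leaderboard values:

```python
# Hypothetical average benchmark scores (not real leaderboard data).
scores = {
    "model-a": 72.4,
    "model-b": 72.1,
    "model-c": 71.9,
    "model-d": 65.0,
}

# Rank models from best to worst average score.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Cluster models that fall within 1.0 point of the current group's
# leader -- one simple way to see how tightly rankings are bunched.
groups = []
for name, score in ranked:
    if groups and groups[-1][0][1] - score <= 1.0:
        groups[-1].append((name, score))
    else:
        groups.append([(name, score)])

for group in groups:
    print([name for name, _ in group])
# → ['model-a', 'model-b', 'model-c']
# → ['model-d']
```

With the hypothetical numbers above, the top three models land in one cluster while the fourth trails on its own, which mirrors the pattern of tightly packed scores you see when browsing the leaderboard.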

Footnotes:

[1] https://www.apple.com/apple-intelligence/ 

[2] https://blogs.microsoft.com/blog/2024/05/20/introducing-copilot-pcs/ 

[3] https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard 

What’s next for The Lindahl Letter? 

  • Week 159: The next level of featurization

  • Week 160: Increasingly problematic knowledge graph updates

  • Week 161: Structuring really large knowledge graphs

  • Week 162: Indexing facts vs. graphing knowledge

If you enjoyed this content, then please take a moment and share it with a friend. If you are new to The Lindahl Letter, then please consider subscribing. Stay curious, stay informed, and enjoy the week ahead!

Thoughts about technology (AI/ML) in newsletter form every Friday