Microsoft's Copilot AI service will soon be able to run locally on PCs, thanks to built-in neural processing units (NPUs) capable of over 40 trillion operations per second (TOPS). Running more of Copilot locally should reduce lag and may improve performance and privacy. Today, Copilot runs primarily in the cloud, which causes delays even for small tasks. Intel's Lunar Lake chips, shipping in 2025, will triple the NPU speed of its current chips. Microsoft is also expanding Copilot's capabilities in Teams, allowing it to pull insights from both meeting chat and call transcripts, help rewrite messages, and generate new messages based on chat context. Microsoft is also introducing features to improve hybrid meetings, such as individual video feeds for each in-room attendee and automatic camera switching for the best view. These updates will roll out in the coming months.
--- Send in a voice message: https://podcasters.spotify.com/pod/show/tonyphoang/message