Apple’s annual “AI Summit” is imminent. In contrast to the public announcements by its competitors Microsoft and Google, Apple’s event on the subject of “artificial intelligence” will be restricted to its own employees. For the first time, this year’s AI Summit will take place in person at the Steve Jobs Theater in Apple Park, which the company otherwise uses for product announcements and keynotes. The event is reportedly also being streamed internally, notes Bloomberg journalist Mark Gurman.
Apple optimizations for Stable Diffusion
No information about the agenda of Apple’s “AI Summit” has emerged yet. Unlike Microsoft and Google, Apple has not publicly jumped on the AI hype surrounding chatbots and text-to-image generators powered by large language models. Especially given the limited conversational skills of its Siri voice assistant, the company is likely coming under increasing pressure.
Apple has used machine learning as a foundation for years, and recently the company even adapted its operating systems unexpectedly quickly for Stable Diffusion: optimizations in Core ML in iOS 16 and macOS 13 are intended to speed up the text-to-image generator on Apple’s own M- and A-series chips. These chips are equipped with a “Neural Engine” designed for machine-learning tasks and continuously optimized by the manufacturer. The Neural Engine is an “investment in the future,” an Apple executive told Mac & i in September.
Tim Cook: AI affects all products
At the end of last week, CEO Tim Cook left unanswered a financial analyst’s question about whether Apple intends to further expand its services business through artificial intelligence. But AI is a “focus,” Cook said, pointing to features like crash detection in the iPhone 14 series. Apple sees “enormous potential” in the area, which ultimately affects all products and services.
(lbe)