Tech companies are really excited about artificial intelligence, often to the point of creating unhelpful AI-related services and features just to prove they’re doing something with the technology. Underneath the vapid marketing and useless functionality, there are some genuinely impressive AI features that will make a difference in your day-to-day life.
Qualcomm invited me to its headquarters in San Diego, California this week to show off its ongoing work on AI technology. You might not have heard of Qualcomm, but it’s the company that builds the core chipsets for countless phones and tablets, from high-end devices like the Galaxy S24 Ultra to budget models like the Moto G 5G. The company’s modems are found in most iPhones, and it’s building VR and AR hardware for use in the Meta Quest and other headsets. Most recently, Qualcomm started building high-end System-on-a-Chip (SoC) designs for Windows laptops, in direct competition with CPUs from Intel and AMD.
Qualcomm is developing a lot of AI hardware and software, built up from the company’s experience with mobile image processing and other earlier implementations of on-device machine learning. The new Snapdragon X chipsets for PC laptops have a dedicated neural processing unit (NPU) for on-device AI tasks. The company’s newer mobile chips, like the Snapdragon 8s Gen 3, can handle some large language models (LLMs) without help from an external server over an internet connection. Qualcomm isn’t alone here, to be clear—the latest laptop CPUs from AMD and Intel also have NPUs, and consumer Nvidia GPUs can also handle many on-device AI workloads.
I know what you’re thinking. You’re tired of hearing every tech company ramble about AI like it’s the magical solution to all the world’s problems. You’re sick of the AI features popping up in your favorite apps. Maybe you’re an artist, writer, or some other creator who heard OpenAI’s CTO say that AI could kill some creative jobs that “shouldn’t have been there in the first place,” and you’re ready to burn it all down. I get that, and I agree most implementations of “AI” right now are solutions in search of a problem, or actively harmful.
Underneath the AI hype cycle nonsense, and the executives excited about replacing countless workers with cheaper automation, there are some genuinely useful features that have only become feasible with recent hardware from Qualcomm, Intel, AMD, Nvidia, and other companies.
Cephable, a company that builds a camera-based input tool for people with disabilities, showed off an updated version of its software running on a Snapdragon X Elite laptop. It uses a webcam to monitor head movements and facial expressions, translating them into key presses or other actions for desktop software (for example, turning your head to change slides in a PowerPoint presentation). The new version for Snapdragon laptops runs all machine learning software on the dedicated NPU, reducing battery usage, improving processing speed and accuracy, and freeing up CPU and GPU resources for your other applications. There was another demonstration of djay Pro, which could split songs into multiple instrumental and vocal tracks for real-time DJ mixing—something that is only practical with on-device AI. The latest Logic Pro update on Mac and iPad has similar functionality for audio production.
The ability to run large language models on a more typical smartphone, tablet, or PC opens up other interesting use cases. For example, the upcoming “Apple Intelligence” on iPhones, iPads, and Macs will use on-device AI to sort notifications and better understand spoken language in Siri. There are some features that are harder to build and scale when they require a powerful datacenter somewhere, and that’s what hardware makers are trying to change right now.
There aren’t many applications and services that use on-device AI right now, because they can behave differently across different devices and operating systems, and not everyone has a phone or PC with the required processing power. Newer developer tools, like NVIDIA’s TensorRT-LLM and Qualcomm’s AI Engine Direct SDK, are slowly making that part more accessible for software developers. Eventually, adding a feature that requires a powerful LLM won’t be much more complex than adding a feature that needs any other system function, and I expect that’s when we’ll see more apps adding useful features.
These advances point to a future where more on-device AI features become possible, implemented just like any other feature in your favorite apps. The current wave of annoying AI chatbots and AI-suggested replies on social media posts will (hopefully) eventually fade, but we’ll be left with the features that are actually useful. That’s the real AI revolution: not a giant Copilot button in Microsoft Edge, but your apps and devices becoming smarter and completing specific tasks faster and more efficiently.
The term “AI” has lost most of its meaning over the past year or two, much like “crypto” became meaningless during the last cryptocurrency bubble. It might refer to large language models that require expensive data centers, or to the image-processing algorithms used when you take a photo on a modern smartphone. There are also plenty of AI-branded devices that clearly use the term purely for hype value, like rice cookers with “AI smart cooking technology.”
“AI” is also frequently used to describe the same features that were called “machine learning” a few years ago, such as recognizing objects in photos or translating text between languages. Many of those machine learning features are useful, like Google Photos adding the ability to search your photo library for a specific person or pet, or using Google Lens to figure out what kind of bug you just found. Most of those features were never as annoying and intrusive as many modern AI features, and many of them don’t require large, expensive servers.
The real AI revolution won’t be annoying pop-ups and chatbots everywhere, and it won’t be ugly AI-generated images flooding social media. It will be just another step in the decades-long evolution of software, making your devices more useful. That’s the AI I’m excited about.
Disclosure: My trip to San Diego, California for Qualcomm’s AI analyst and media workshop was paid for by Qualcomm, including travel and lodging. Qualcomm did not review this article before it went live.