
Depth Anything V2 from ByteDance's large model team selected among Apple's latest Core ML models

王林 | Original | 2024-06-28 22:40:06

Recently, Apple released 20 new Core ML models and 4 datasets on Hugging Face, and the monocular depth estimation model Depth Anything V2 from ByteDance's large model team was among them.


Core ML
  1. Core ML is Apple's machine learning framework for integrating machine learning models so that they run efficiently on devices such as iPhone (iOS) and Mac (macOS).
  2. Models run on-device, so complex AI tasks can be performed without an internet connection, which enhances user privacy and reduces latency.
  3. Apple developers can use these models to build intelligent, privacy-preserving AI applications (a minimal loading sketch follows this list).
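Core ML models are typically consumed from Swift via Xcode-generated classes, but as a minimal sketch they can also be loaded and queried from Python with the coremltools package on macOS. The file name, input name, and resolution below are assumptions for illustration, not values stated in the article.

```python
import coremltools as ct
from PIL import Image

# Load a Core ML model package; the file name is a placeholder for whichever
# .mlpackage you have downloaded or converted.
model = ct.models.MLModel("DepthAnythingV2SmallF16.mlpackage")

# Input names and shapes are declared in the model spec; "image" is an assumed
# input name here, and 518x518 is an assumed input resolution.
img = Image.open("photo.jpg").resize((518, 518))
prediction = model.predict({"image": img})
print(prediction.keys())  # output feature names defined by the model
```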

Depth Anything V2

  1. A monocular depth estimation model developed by ByteDance's large model team.
  2. Compared with V1, the V2 version delivers finer detail, stronger robustness, and a significant speedup.
  3. The family includes models ranging from 25M to 1.3B parameters.
  4. The Core ML version released by Apple was optimized by Hugging Face's engineering team; it uses the smallest 25M-parameter model and reaches an inference time of 31.1 ms on an iPhone 12 Pro Max.
  5. It can be applied to autonomous driving, 3D modeling, augmented reality, security monitoring, and spatial computing (a usage sketch follows this list).
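On the PyTorch side, a hedged sketch of running Depth Anything V2 through the Hugging Face transformers depth-estimation pipeline is shown below; the checkpoint id is an assumption based on the hub's naming convention and is not stated in the article.

```python
from transformers import pipeline
from PIL import Image

# Checkpoint id is an assumption; swap in whichever Depth Anything V2 size you need.
depth_estimator = pipeline(
    task="depth-estimation",
    model="depth-anything/Depth-Anything-V2-Small-hf",
)

image = Image.open("street.jpg")
result = depth_estimator(image)

# The pipeline returns the raw depth tensor ("predicted_depth") and a
# visualizable PIL image ("depth").
result["depth"].save("street_depth.png")
```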

Core ML Models

  1. Apple's newly released Core ML models cover multiple fields, from natural language processing to image recognition.
  2. Developers can use the coremltools package to convert models trained in frameworks such as TensorFlow or PyTorch into the Core ML format (see the conversion sketch after this list).
  3. On-device, Core ML schedules work across the CPU, GPU, and Neural Engine to optimize performance while minimizing memory footprint and power consumption.
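As a hedged sketch of that coremltools workflow, the snippet below traces a small PyTorch model (a stand-in for whatever network you actually trained) and converts it to an .mlpackage; the model choice, input name, and shape are illustrative.

```python
import torch
import torchvision
import coremltools as ct

# Any traced PyTorch module can be converted; a small torchvision model
# stands in here for a real, trained network.
model = torchvision.models.mobilenet_v3_small(weights="DEFAULT").eval()
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

# Convert to an ML Program; compute_units lets Core ML schedule work across
# the CPU, GPU, and Neural Engine at runtime.
mlmodel = ct.convert(
    traced,
    convert_to="mlprogram",
    inputs=[ct.TensorType(name="image", shape=example_input.shape)],
    compute_units=ct.ComputeUnit.ALL,
)
mlmodel.save("ConvertedModel.mlpackage")
```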


